TechCrunch is reporting—but not describing in detail—a vulnerability in a series of stalkerware apps that exposes personal information of the victims. The vulnerability isn’t in the apps installed on the victims’ phones, but in the website the stalker goes to view the information the app collects. The article is worth reading, less for the description of the vulnerability and more for the shadowy string of companies behind these stalkerware apps.
A good lesson in reading the fine print:
Cignpost Diagnostics, which trades as ExpressTest and offers £35 tests for holidaymakers, said it holds the right to analyse samples from swabs to “learn more about human health”—and sell information on to third parties.
Individuals are required to give informed consent for their sensitive medical data to be used, but customers’ consent for their DNA to be sold is buried in Cignpost’s online documents.
Of course, no one ever reads the fine print.
EDITED TO ADD (3/12): The original story.
A Berlin-based company has developed an AirTag clone that bypasses Apple’s anti-stalker security systems. Source code for these AirTag clones is available online.
So now we have several problems with the system. Apple’s anti-stalker security only works with iPhones. (Apple wrote an Android app that can detect AirTags, but how many people are going to download it?) And now non-AirTags can piggyback on Apple’s system without triggering the alarms.
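The evasion itself is conceptually simple: Apple’s unwanted-tracking alerts look for the same unknown tag traveling with you over time, so a clone that rotates through a large pool of pre-generated broadcast keys never looks like one persistent device to the detector, while its owner can still query location reports for every key. Here is a minimal sketch of that idea; the function names, key-pool size, and timing are illustrative assumptions, not the clone’s actual code.

```python
import itertools
import random
import time

# Illustrative sketch only, not the clone's actual code. Tracking alerts flag
# a single unknown tag repeatedly seen near you, so a device that rotates
# through many pre-generated broadcast keys never appears to the scanner as
# one persistent tracker.

KEY_POOL = [f"pubkey-{i:04d}" for i in range(2000)]  # pre-generated key pool

def broadcast_find_my_advert(public_key: str) -> None:
    """Stand-in for sending a BLE advertisement carrying this public key."""
    print(f"advertising location payload under {public_key}")

for public_key in itertools.cycle(random.sample(KEY_POOL, len(KEY_POOL))):
    broadcast_find_my_advert(public_key)  # each sighting shows a different key
    time.sleep(30)                        # rotate well inside detection windows
```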
Apple didn’t think this through nearly as well as it claims to have. I think the general problem is one that I have written about before: designers just don’t have intimate threats in mind when building these systems.
A reporter interviews a Uyghur human-rights advocate, and uses the Otter.ai transcription app.
The next day, I received an odd note from Otter.ai, the automated transcription app that I had used to record the interview. It read: “Hey Phelim, to help us improve your Otter’s experience, what was the purpose of this particular recording with titled ‘Mustafa Aksu’ created at ‘2021-11-08 11:02:41’?”
Customer service or Chinese surveillance? Turns out it’s hard to tell.
EDITED TO ADD (3/12): Another article.
Two US senators claim that the CIA has been running an unregulated—and almost certainly illegal—mass surveillance program on Americans.
No real details yet.
Senators have reintroduced the EARN IT Act, requiring social media companies (among others) to administer a massive surveillance operation on their users:
A group of lawmakers led by Sen. Richard Blumenthal (D-CT) and Sen. Lindsey Graham (R-SC) have re-introduced the EARN IT Act, an incredibly unpopular bill from 2020 that was dropped in the face of overwhelming opposition. Let’s be clear: the new EARN IT Act would pave the way for a massive new surveillance system, run by private companies, that would roll back some of the most important privacy and security features in technology used by people around the globe. It’s a framework for private actors to scan every message sent online and report violations to law enforcement. And it might not stop there. The EARN IT Act could ensure that anything hosted online—backups, websites, cloud photos, and more—is scanned.
There are two bills working their way through Congress that would force companies like Apple to allow competitive app stores. Apple hates this, since it would break its monopoly, and it’s making a variety of security arguments to bolster its position. I have written a rebuttal:
I would like to address some of the unfounded security concerns raised about these bills. It’s simply not true that this legislation puts user privacy and security at risk. In fact, it’s fairer to say that this legislation puts those companies’ extractive business-models at risk. Their claims about risks to privacy and security are both false and disingenuous, and motivated by their own self-interest and not the public interest. App store monopolies cannot protect users from every risk, and they frequently prevent the distribution of important tools that actually enhance security. Furthermore, the alleged risks of third-party app stores and “side-loading” apps pale in comparison to their benefits. These bills will encourage competition, prevent monopolist extortion, and guarantee users a new right to digital self-determination.
Matt Stoller has also written about this.
China is mandating that athletes download and use a health and travel app when they attend the Winter Olympics next month. Citizen Lab examined the app and found it riddled with security holes.
- MY2022, an app mandated for use by all attendees of the 2022 Olympic Games in Beijing, has a simple but devastating flaw where encryption protecting users’ voice audio and file transfers can be trivially sidestepped. Health customs forms which transmit passport details, demographic information, and medical and travel history are also vulnerable. Server responses can also be spoofed, allowing an attacker to display fake instructions to users.
- MY2022 is fairly straightforward about the types of data it collects from users in its public-facing documents. However, as the app collects a range of highly sensitive medical information, it is unclear with whom or which organization(s) it shares this information.
- MY2022 includes features that allow users to report “politically sensitive” content. The app also includes a censorship keyword list, which, while presently inactive, targets a variety of political topics including domestic issues such as Xinjiang and Tibet as well as references to Chinese government agencies.
- While the vendor did not respond to our security disclosure, we find that the app’s security deficits may not only violate Google’s Unwanted Software Policy and Apple’s App Store guidelines but also China’s own laws and national standards pertaining to privacy protection, providing potential avenues for future redress.
It’s not clear whether the security flaws were intentional or not, but the report speculated that proper encryption might interfere with some of China’s ubiquitous online surveillance tools, especially systems that allow local authorities to snoop on phones using public wireless networks or internet cafes. Still, the researchers added that the flaws were probably unintentional, because the government will already be receiving data from the app, so there wouldn’t be a need to intercept the data as it was being transferred.
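The report’s description—encryption that can be trivially sidestepped and server responses that can be spoofed—is consistent with a client that doesn’t validate the certificates of the servers it talks to. As a hedged illustration of that class of flaw (this is not code from MY2022, and the URL is a placeholder):

```python
import ssl
import urllib.request

# Hedged illustration of the flaw class described: a client that disables
# certificate validation. Anyone on the network path can then impersonate the
# server, read the supposedly protected traffic, and spoof responses.

insecure_ctx = ssl.create_default_context()
insecure_ctx.check_hostname = False
insecure_ctx.verify_mode = ssl.CERT_NONE  # accept any certificate at all

# This request completes even against a man-in-the-middle presenting a
# self-signed certificate, which is exactly what proper validation prevents.
with urllib.request.urlopen("https://example.com/", context=insecure_ctx) as resp:
    print(resp.status)
```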
The app also included a list of 2,422 political keywords, described within the code as “illegalwords.txt,” that worked as a keyword censorship list, according to Citizen Lab. The researchers said the list appeared to be a latent function that the app’s chat and file transfer function was not actively using.
The US government has already advised athletes to leave their personal phones and laptops home and bring burners.
Last summer, the San Francisco police illegally used surveillance cameras at the George Floyd protests. The EFF is suing the police:
This surveillance invaded the privacy of protesters, targeted people of color, and chills and deters participation and organizing for future protests. The SFPD also violated San Francisco’s new Surveillance Technology Ordinance. It prohibits city agencies like the SFPD from acquiring, borrowing, or using surveillance technology, without prior approval from the city’s Board of Supervisors, following an open process that includes public participation. Here, the SFPD went through no such process before spying on protesters with this network of surveillance cameras.
It feels like a pretty easy case. There’s a law, and the SF police didn’t follow it.
I wouldn’t be writing about this op-ed at all except that Chris Larsen is a board member of EPIC, and used his EPIC affiliation in the op-ed to bolster his own credentials. (Bizarrely, he linked to an EPIC page that directly contradicts his position.) In his op-ed, he mischaracterized the EFF’s actions and the facts of the lawsuit. It’s a mess.
One of the fundamental principles that underlies EPIC’s work (and the work of many other groups) on surveillance oversight is that individuals should have the power to decide whether surveillance tools are used in their communities and to impose limits on their use. We have fought for years to shed light on the development, procurement, and deployment of such technologies and have worked to ensure that they are subject to independent oversight through hearings, legal challenges, petitions, and other public forums. The CCOPS model, which was developed by ACLU affiliates and other coalition partners in California and implemented through the San Francisco ordinance, is a powerful mechanism to enable public oversight of dangerous surveillance tools. The access, retention, and use policies put in place by the neighborhood business associations operating these networks provide necessary, but not sufficient, protections against abuse. Strict oversight is essential to promote both privacy and community safety, which includes freedom from arbitrary police action and the freedom to assemble.
So far, EPIC has not done anything about Larsen still being on its board. (Others have criticized them for keeping him on.) I don’t know if I have an opinion on this. Larsen has done good work on financial privacy regulations, which is a good thing. But he seems to be funding all these surveillance cameras in San Francisco, which is really bad.
Rolling Stone is reporting that the UK government has hired the M&C Saatchi advertising agency to launch an anti-encryption advertising campaign. Presumably they’ll lean heavily on the “think of the children!” rhetoric we’re seeing in this current wave of the crypto wars. The technical eavesdropping mechanisms have shifted to client-side scanning, which won’t actually help—but since that’s not really the point, it’s not argued on its merits.
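For readers unfamiliar with the term, “client-side scanning” means the matching happens on your device, before encryption is applied, which is why it undermines end-to-end guarantees regardless of how it is marketed. A minimal sketch of the concept, not any vendor’s actual design; real proposals use perceptual hashes rather than SHA-256, and the database here is a placeholder:

```python
import hashlib

# Minimal sketch of the client-side-scanning concept: content is checked
# against a database of flagged hashes on the device itself, before
# end-to-end encryption is applied.

FLAGGED_HASHES: set[str] = set()  # placeholder for a provider-supplied list

def report_match(digest: str) -> None:
    """Stand-in for the on-device reporting channel back to the provider."""
    print(f"match reported: {digest}")

def scan_before_send(payload: bytes) -> bool:
    """Return True if the message may be encrypted and sent."""
    digest = hashlib.sha256(payload).hexdigest()
    if digest in FLAGGED_HASHES:
        report_match(digest)
        return False
    return True

# The scan runs before encryption, which is why it bypasses the protections
# end-to-end encryption is supposed to provide.
print(scan_before_send(b"hello"))
```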