TechCrunch is reporting—but not describing in detail—a vulnerability in a series of stalkerware apps that exposes the personal information of victims. The vulnerability isn’t in the apps installed on the victims’ phones, but in the website the stalker uses to view the information the apps collect. The article is worth reading, less for the description of the vulnerability and more for the shadowy string of companies behind these stalkerware apps.
Entries Tagged "surveillance"
A Berlin-based company has developed an AirTag clone that bypasses Apple’s anti-stalker security systems. Source code for these AirTag clones is available online.
So now we have several problems with the system. Apple’s anti-stalker security only works with iPhones. (Apple wrote an Android app that can detect AirTags, but how many people are going to download it?) And now non-AirTags can piggyback on Apple’s system without triggering the alarms.
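The bypass reportedly works by pre-generating a large list of broadcast keys and cycling through them, so no single identifier is ever seen often enough to trip Apple’s unknown-tracker alert. Here’s a minimal toy model of that idea (all names are hypothetical, and random byte strings stand in for the real elliptic-curve keys):

```python
import secrets
from collections import Counter

# Toy model of the reported bypass: a clone advertises a different
# pre-generated identifier on each broadcast interval, so detection
# logic keyed to one persistent identifier never fires.

def make_keys(n: int) -> list[bytes]:
    """Pre-generate n broadcast identifiers (stand-ins for public keys)."""
    return [secrets.token_bytes(28) for _ in range(n)]

def broadcasts(keys: list[bytes], count: int):
    """Yield the identifier advertised at each interval, cycling the list."""
    for i in range(count):
        yield keys[i % len(keys)]

def stalker_alert(seen: list[bytes], threshold: int = 10) -> bool:
    """Crude stand-in for the detector: alert only if one identifier repeats."""
    return any(c >= threshold for c in Counter(seen).values())
```

With a single fixed key (a genuine AirTag), the toy detector alerts; with a few thousand rotating keys, each identifier appears at most once and the detector never triggers. The real detection logic is more sophisticated, but the structural problem is the same.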
Apple didn’t think this through nearly as well as it claims to have. I think the general problem is one that I have written about before: designers just don’t have intimate threats in mind when building these systems.
A reporter interviews a Uyghur human-rights advocate and uses the Otter.ai transcription app.
The next day, I received an odd note from Otter.ai, the automated transcription app that I had used to record the interview. It read: “Hey Phelim, to help us improve your Otter’s experience, what was the purpose of this particular recording with titled ‘Mustafa Aksu’ created at ‘2021-11-08 11:02:41’?”
Customer service or Chinese surveillance? Turns out it’s hard to tell.
EDITED TO ADD (3/12): Another article.
Two US senators claim that the CIA has been running an unregulated—and almost certainly illegal—mass surveillance program on Americans.
No real details yet.
Senators have reintroduced the EARN IT Act, requiring social media companies (among others) to administer a massive surveillance operation on their users:
A group of lawmakers led by Sen. Richard Blumenthal (D-CT) and Sen. Lindsey Graham (R-SC) have re-introduced the EARN IT Act, an incredibly unpopular bill from 2020 that was dropped in the face of overwhelming opposition. Let’s be clear: the new EARN IT Act would pave the way for a massive new surveillance system, run by private companies, that would roll back some of the most important privacy and security features in technology used by people around the globe. It’s a framework for private actors to scan every message sent online and report violations to law enforcement. And it might not stop there. The EARN IT Act could ensure that anything hosted online—backups, websites, cloud photos, and more—is scanned.
China is mandating that athletes download and use a health and travel app when they attend the Winter Olympics next month. Citizen Lab examined the app and found it riddled with security holes.
- MY2022, an app mandated for use by all attendees of the 2022 Olympic Games in Beijing, has a simple but devastating flaw where encryption protecting users’ voice audio and file transfers can be trivially sidestepped. Health customs forms which transmit passport details, demographic information, and medical and travel history are also vulnerable. Server responses can also be spoofed, allowing an attacker to display fake instructions to users.
- MY2022 is fairly straightforward about the types of data it collects from users in its public-facing documents. However, as the app collects a range of highly sensitive medical information, it is unclear with whom or which organization(s) it shares this information.
- MY2022 includes features that allow users to report “politically sensitive” content. The app also includes a censorship keyword list, which, while presently inactive, targets a variety of political topics including domestic issues such as Xinjiang and Tibet as well as references to Chinese government agencies.
- While the vendor did not respond to our security disclosure, we find that the app’s security deficits may not only violate Google’s Unwanted Software Policy and Apple’s App Store guidelines but also China’s own laws and national standards pertaining to privacy protection, providing potential avenues for future redress.
It’s not clear whether the security flaws were intentional or not, but the report speculated that proper encryption might interfere with some of China’s ubiquitous online surveillance tools, especially systems that allow local authorities to snoop on phones using public wireless networks or internet cafes. Still, the researchers added that the flaws were probably unintentional, because the government will already be receiving data from the app, so there wouldn’t be a need to intercept the data as it was being transferred.
The app also included a list of 2,422 political keywords, described within the code as “illegalwords.txt,” that worked as a keyword censorship list, according to Citizen Lab. The researchers said the list appeared to be a latent function that the app’s chat and file transfer function was not actively using.
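The class of flaw Citizen Lab describes (spoofable server responses, trivially sidestepped transport encryption) typically comes down to a TLS client that skips certificate validation. The snippet below is a generic Python illustration of that bug class, not MY2022’s actual code:

```python
import ssl

# A client that disables certificate validation will complete a TLS
# handshake with ANY server, so an on-path attacker can impersonate the
# backend, read sensitive uploads, and forge responses to the user.

def insecure_context() -> ssl.SSLContext:
    """What a vulnerable client effectively does: trust any certificate."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False          # hostname no longer checked
    ctx.verify_mode = ssl.CERT_NONE     # certificate chain not verified
    return ctx

def secure_context() -> ssl.SSLContext:
    """The stdlib default: verify the chain and the hostname."""
    return ssl.create_default_context()
```

The fix is simply to not override the defaults: `ssl.create_default_context()` already requires a valid certificate chain and a matching hostname.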
The US government has already advised athletes to leave their personal phones and laptops home and bring burners.
Last summer, the San Francisco police illegally used surveillance cameras at the George Floyd protests. The EFF is suing the police:
This surveillance invaded the privacy of protesters, targeted people of color, and chills and deters participation and organizing for future protests. The SFPD also violated San Francisco’s new Surveillance Technology Ordinance. It prohibits city agencies like the SFPD from acquiring, borrowing, or using surveillance technology, without prior approval from the city’s Board of Supervisors, following an open process that includes public participation. Here, the SFPD went through no such process before spying on protesters with this network of surveillance cameras.
It feels like a pretty easy case. There’s a law, and the SF police didn’t follow it.
I wouldn’t be writing about this at all except that Chris Larsen is a board member of EPIC, and used his EPIC affiliation in the op-ed to bolster his own credentials. (Bizarrely, he linked to an EPIC page that directly contradicts his position.) In his op-ed, he mischaracterized the EFF’s actions and the facts of the lawsuit. It’s a mess.
One of the fundamental principles that underlies EPIC’s work (and the work of many other groups) on surveillance oversight is that individuals should have the power to decide whether surveillance tools are used in their communities and to impose limits on their use. We have fought for years to shed light on the development, procurement, and deployment of such technologies and have worked to ensure that they are subject to independent oversight through hearings, legal challenges, petitions, and other public forums. The CCOPS model, which was developed by ACLU affiliates and other coalition partners in California and implemented through the San Francisco ordinance, is a powerful mechanism to enable public oversight of dangerous surveillance tools. The access, retention, and use policies put in place by the neighborhood business associations operating these networks provide necessary, but not sufficient, protections against abuse. Strict oversight is essential to promote both privacy and community safety, which includes freedom from arbitrary police action and the freedom to assemble.
So far, EPIC has not done anything about Larsen still being on its board. (Others have criticized them for keeping him on.) I don’t know if I have an opinion on this. Larsen has done good work on financial privacy regulations, which is a good thing. But he seems to be funding all these surveillance cameras in San Francisco, which is really bad.
Remember when the US and Australian police surreptitiously owned and operated the encrypted cell phone app ANOM? They arrested 800 people in 2021 based on that operation.
New documents received by Motherboard show that over 100 of those phones were shipped to users in the US, far more than previously believed.
What’s most interesting to me about this new information is how the US used the Australians to get around domestic spying laws:
For legal reasons, the FBI did not monitor outgoing messages from Anom devices determined to be inside the U.S. Instead, the Australian Federal Police (AFP) monitored them on behalf of the FBI, according to previously published court records. In those court records unsealed shortly before the announcement of the Anom operation, FBI Special Agent Nicholas Cheviron wrote that the FBI received Anom user data three times a week, which contained the messages of all of the users of Anom with some exceptions, including “the messages of approximately 15 Anom users in the U.S. sent to any other Anom device.”
Stewart Baker, partner at Steptoe & Johnson LLP, and Bryce Klehm, associate editor of Lawfare, previously wrote that “The ‘threat to life’ standard echoes the provision of U.S. law that allows communications providers to share user data with law enforcement without legal process under 18 U.S.C. § 2702. Whether the AFP was relying on this provision of U.S. law or a more general moral imperative to take action to prevent imminent threats is not clear.” That section of law discusses the voluntary disclosure of customer communications or records.
When asked about the practice of Australian law enforcement monitoring devices inside the U.S. on behalf of the FBI, Senator Ron Wyden told Motherboard in a statement “Multiple intelligence community officials have confirmed to me, in writing, that intelligence agencies cannot ask foreign partners to conduct surveillance that the U.S. would be legally prohibited from doing itself. The FBI should follow this same standard. Allegations that the FBI outsourced warrantless surveillance of Americans to a foreign government raise troubling questions about the Justice Department’s oversight of these practices.”
I and others have long suspected that the NSA uses foreign nationals to get around restrictions that prevent it from spying on Americans. It is interesting to see the FBI using the same trick.
This development surprises no one who has been paying attention:
Researchers now believe AirTags, which are equipped with Bluetooth technology, could be revealing a more widespread problem of tech-enabled tracking. They emit a digital signal that can be detected by devices running Apple’s mobile operating system. Those devices then report where an AirTag has last been seen. Unlike similar tracking products from competitors such as Tile, Apple added features to prevent abuse, including notifications like the one Ms. Estrada received and automatic beeping. (Tile plans to release a feature to prevent the tracking of people next year, a spokeswoman for that company said.)
A person who doesn’t own an iPhone might have a harder time detecting an unwanted AirTag. AirTags aren’t compatible with Android smartphones. Earlier this month, Apple released an Android app that can scan for AirTags—but you have to be vigilant enough to download it and proactively use it.
Apple declined to say if it was working with Google on technology that would allow Android phones to automatically detect its trackers.
People who said they have been tracked have called Apple’s safeguards insufficient. Ms. Estrada said she was notified four hours after her phone first noticed the rogue gadget. Others said it took days before they were made aware of an unknown AirTag. According to Apple, the timing of the alerts can vary depending on the iPhone’s operating system and location settings.
EDITED TO ADD (11/12): My mistake. It was not a leak:
Ryan Shapiro, executive director of nonprofit organization Property of the People, shared the document with Motherboard after obtaining it through a public record act request. Property of the People focuses on obtaining and publishing government records.