Entries Tagged "obscurity"

Page 1 of 3

Obfuscation as a Privacy Tool

This essay discusses the futility of opting out of surveillance, and suggests data obfuscation as an alternative.

We can apply obfuscation in our own lives by using practices and technologies that make use of it, including:

  • The secure browser Tor, which (among other anti-surveillance technologies) muddles our Internet activity with that of other Tor users, concealing our trail in that of many others.
  • The browser plugins TrackMeNot and AdNauseam, which explore obfuscation techniques by issuing many fake search requests and loading and clicking every ad, respectively.
  • The browser extension Go Rando, which randomly chooses your emotional “reactions” on Facebook, interfering with their emotional profiling and analysis.
  • Playful experiments like Adam Harvey’s “HyperFace” project, finding patterns on textiles that fool facial recognition systems ­ not by hiding your face, but by creating the illusion of many faces.

I am generally skeptical about obfuscation tools. I think of this basically as a signal-to-noise problem, and that adding random noise doesn’t do much to obfuscate the signal. But against broad systems of financially motivated corporate surveillance, it might be enough.

Posted on November 5, 2019 at 6:15 AMView Comments

Federal Trade Commissioner Julie Brill on Obscurity

I think this is good:

Obscurity means that personal information isn’t readily available to just anyone. It doesn’t mean that information is wiped out or even locked up; rather, it means that some combination of factors makes certain types of information relatively hard to find.

Obscurity has always been an important component of privacy. It is a helpful concept because it encapsulates how a broad range of social, economic, and technological changes affects norms and consumer expectations.

Posted on April 24, 2015 at 12:42 PMView Comments

The Simple Trick that Will Keep You Secure from Government Spies

Last week, the German government arrested someone and charged him with spying for the US. Buried in one of the stories was a little bit of tradecraft. The US gave him an encryption program embedded in a—presumably common—weather app. When you select the weather for New York, it automatically opens a crypto program. I assume this is a custom modification for the agent, and probably other agents as well. No idea how well this program was hidden. Was the modified weather app the same size as the original? Would it pass an integrity checker?

Related: there is an undocumented encryption feature in my own Password Safe program. From the command line, type: pwsafe -e filename

Posted on July 7, 2014 at 1:51 PMView Comments

The Insecurity of Secret IT Systems

We now know a lot about the security of the Rapiscan 522 B x-ray system used to scan carry-on baggage in airports worldwide. Billy Rios, director of threat intelligence at Qualys, got himself one and analyzed it. And he presented his results at the Kaspersky Security Analyst Summit this week.

It’s worse than you might have expected:

It runs on the outdated Windows 98 operating system, stores user credentials in plain text, and includes a feature called Threat Image Projection used to train screeners by injecting .bmp images of contraband, such as a gun or knife, into a passenger carry-on in order to test the screener’s reaction during training sessions. The weak logins could allow a bad guy to project phony images on the X-ray display.

While this is all surprising, it shouldn’t be. These are the same sort of problems we saw in proprietary electronic voting machines, or computerized medical equipment, or computers in automobiles. Basically, whenever an IT system is designed and used in secret – either actual secret or simply away from public scrutiny – the results are pretty awful.

I used to decry secret security systems as “security by obscurity.” I now say it more strongly: “obscurity means insecurity.”

Security is a process. For software, that process is iterative. It involves defenders trying to build a secure system, attackers—criminals, hackers, and researchers—defeating the security, and defenders improving their system. This is how all mass-market software improves its security. It’s the best system we have. And for systems that are kept out of the hands of the public, that process stalls. The result looks like the Rapiscan 522 B x-ray system.

Smart security engineers open their systems to public scrutiny, because that’s how they improve. The truly awful engineers will not only hide their bad designs behind secrecy, but try to belittle any negative security results. Get ready for Rapiscan to claim that the researchers had old software, and the new software has fixed all these problems. Or that they’re only theoretical. Or that the researchers themselves are the problem. We’ve seen it all before.

Posted on February 14, 2014 at 6:50 AMView Comments

A New Postal Privacy Product

The idea is basically to use indirection to hide physical addresses. You would get a random number to give to your correspondents, and the post office would use that number to determine your real address. No security against government surveillance, but potentially valuable nonetheless.

Here are a bunch of documents.

I honestly have no idea what’s going on. It seems to be something the US government is considering, but it was not proposed by the US Postal Service. This guy is proposing the service.

EDITED TO ADD (10/11): Sai has contacted me and asked that people refrain from linking to or writing about this for now, until he posts some more/better information. I’ll update this post with a new link when he sends it to me.

EDITED TO ADD (10/17): Sai has again contacted me, saying that he has posted the more/better information, and that the one true link for the proposal is here.

Posted on October 9, 2013 at 1:08 PMView Comments

WhoIs Privacy and Proxy Service Abuse

ICANN has a draft study that looks at abuse of the Whois database.

This study, conducted by the National Physical Laboratory (NPL) in the United Kingdom, analyzes gTLD domain names to measure whether the percentage of privacy/proxy use among domains engaged in illegal or harmful Internet activities is significantly greater than among domain names used for lawful Internet activities. Furthermore, this study compares these privacy/proxy percentages to other methods used to obscure identity ­ notably, Whois phone numbers that are invalid.

Richard Clayton, the primary author of the report, has a blog post:

However, it’s more interesting to ask whether this percentage is somewhat higher than the usage of privacy or proxy services for entirely lawful and harmless Internet activities? This turned out NOT to be the case ­ for example banks use privacy and proxy services almost as often as the registrants of domains used in the hosting of child sexual abuse images; and the registrants of domains used to host (legal) adult pornography use privacy and proxy services more often than most (but not all) of the different types of malicious activity that we studied.

Richard has been telling me about this work for a while. It’s nice to see it finally published.

Posted on October 1, 2013 at 9:09 AMView Comments

Thinking About Obscurity

This essay is worth reading:

Obscurity is the idea that when information is hard to obtain or understand, it is, to some degree, safe. Safety, here, doesn’t mean inaccessible. Competent and determined data hunters armed with the right tools can always find a way to get it. Less committed folks, however, experience great effort as a deterrent.

Online, obscurity is created through a combination of factors. Being invisible to search engines increases obscurity. So does using privacy settings and pseudonyms. Disclosing information in coded ways that only a limited audience will grasp enhances obscurity, too. Since few online disclosures are truly confidential or highly publicized, the lion’s share of communication on the social web falls along the expansive continuum of obscurity: a range that runs from completely hidden to totally obvious.

[…]

Many contemporary privacy disputes are probably better classified as concern over losing obscurity. Consider the recent debate over whether a newspaper violated the privacy rights of gun owners by publishing a map comprised of information gleaned from public records. The situation left many scratching their heads. After all, how can public records be considered private? What obscurity draws our attention to, is that while the records were accessible to any member of the public prior to the rise of big data, more effort was required to obtain, aggregate, and publish them. In that prior context, technological constraints implicitly protected privacy interests. Now, in an attempt to keep pace with diminishing structural barriers, New York is considering excepting gun owners from “public records laws that normally allow newspapers or private citizens access to certain information the government collects.”

The essay is about Facebook’s new Graph search tool, and how its harm is best thought of as reducing obscurity.

Posted on January 22, 2013 at 5:23 AMView Comments

Indian OS

India is writing its own operating system so it doesn’t have to rely on Western technology:

India’s Defence Research and Development Organisation (DRDO) wants to build an OS, primarily so India can own the source code and architecture. That will mean the country won’t have to rely on Western operating systems that it thinks aren’t up to the job of thwarting cyber attacks. The DRDO specifically wants to design and develop its own OS that is hack-proof to prevent sensitive data from being stolen.

On the one hand, this is great. We could use more competition in the OS market—as more and more applications move into the cloud and are only accessed via an Internet browser, OS compatible matters less and less—and an OS that brands itself as “more secure” can only help. But this security by obscurity thinking just isn’t true:

“The only way to protect it is to have a home-grown system, the complete architecture … source code is with you and then nobody knows what’s that,” he added.

The only way to protect it is to design and implement it securely. Keeping control of your source code didn’t magically make Windows secure, and it won’t make this Indian OS secure.

Posted on October 15, 2010 at 3:12 AMView Comments

1 2 3

Sidebar photo of Bruce Schneier by Joe MacInnis.