Entries Tagged "data protection"


New Rules on Data Privacy for Non-US Citizens

Last week, President Trump signed an executive order affecting the privacy rights of non-US citizens with respect to data residing in the US.

Here’s the relevant text:

Privacy Act. Agencies shall, to the extent consistent with applicable law, ensure that their privacy policies exclude persons who are not United States citizens or lawful permanent residents from the protections of the Privacy Act regarding personally identifiable information.

At issue is the EU-US Privacy Shield, which is the voluntary agreement among the US government, US companies, and the EU that makes it possible for US companies to store Europeans’ data without having to follow all EU privacy requirements.

Interpretations of what this means are all over the place: from extremely serious, to more measured, to “don’t worry; we still have PPD-28.”

This is clearly still in flux. And, like pretty much everything so far in the Trump administration, we have no idea where this is headed.

Posted on January 30, 2017 at 6:04 AM

Indiana's Voter Registration Data Is Frighteningly Insecure

You can edit anyone’s information you want:

The question, boiled down, was haunting: Want to see how easy it would be to get into someone’s voter registration and make changes to it? The offer from Steve Klink — a Lafayette-based public consultant who works mainly with Indiana public school districts — was to use my voter registration record as a case study.

Only with my permission, of course.

“I will not require any information from you,” he texted. “Which is the problem.”

Turns out he didn’t need anything from me. He sent screenshots of every step along the way, as he navigated from the “Update My Voter Registration” tab at the Indiana Statewide Voter Registration System maintained since 2010 at www.indianavoters.com to the blank screen that cleared the way for changes to my name, address, age and more.

The only magic involved was my driver’s license number, one of two log-in options to make changes online. And that was contained in a copy of every county’s voter database, a public record already in the hands of political parties, campaigns, media and, according to Indiana open access laws, just about anyone who wants the beefy spreadsheet.
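The design flaw is generic: knowledge-based login only authenticates if the knowledge is actually secret, and a driver’s license number sitting in a public spreadsheet is not. A minimal sketch of the difference, in Python, with invented names and data (an illustration of the pattern, not Indiana’s actual system):

# Hypothetical sketch; identifiers and data are invented.
voter_roll = {"IN123456789": {"name": "Jane Doe", "address": "1 Main St"}}

def login_broken(dl_number):
    # Roughly the scheme described above: the "secret" is the driver's
    # license number, which ships in the public voter database, so
    # anyone holding the public record can impersonate the voter.
    return dl_number in voter_roll

def login_better(dl_number, supplied_pin, pin_db):
    # An authenticator must be something only the voter holds, for
    # example a one-time PIN mailed to the registered address.
    return pin_db.get(dl_number) == supplied_pin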

Posted on October 11, 2016 at 2:04 PM

France Rejects Backdoors in Encryption Products

For the right reasons, too:

Axelle Lemaire, the Euro nation’s digital affairs minister, shot down the amendment during the committee stage of the forthcoming omnibus digital bill, saying it would be counterproductive and would leave personal data unprotected.

“Recent events show how deliberately introducing flaws at the request of — sometimes even without the knowledge of — the intelligence agencies has an effect that harms the whole community,” she said, according to Numerama.

“Even if the intention [to empower the police] is laudable, it also opens the door to the players who have less laudable intentions, not to mention the potential for economic damage to the credibility of companies planning these flaws. You are right to fuel the debate, but this is not the right solution according to the Government’s opinion.”

France joins the Netherlands on this issue.

And Apple’s Tim Cook is going after the Obama administration on the issue.

EDITED TO ADD (1/20): In related news, Congress will introduce a bill to establish a commission to study the issue. This is what kicking the can down the road looks like.

Posted on January 20, 2016 at 5:02 AM

New Pew Research Report on Americans' Attitudes on Privacy, Security, and Surveillance

This is interesting:

The surveys find that Americans feel privacy is important in their daily lives in a number of essential ways. Yet, they have a pervasive sense that they are under surveillance when in public and very few feel they have a great deal of control over the data that is collected about them and how it is used. Adding to earlier Pew Research reports that have documented low levels of trust in sectors that Americans associate with data collection and monitoring, the new findings show Americans also have exceedingly low levels of confidence in the privacy and security of the records that are maintained by a variety of institutions in the digital age.

While some Americans have taken modest steps to stem the tide of data collection, few have adopted advanced privacy-enhancing measures. However, majorities of Americans expect that a wide array of organizations should have limits on the length of time that they can retain records of their activities and communications. At the same time, Americans continue to express the belief that there should be greater limits on government surveillance programs. Additionally, they say it is important to preserve the ability to be anonymous for certain online activities.

Lots of detail in the reports.

Posted on May 21, 2015 at 1:05 PM

The Security of Data Deletion

Thousands of articles have called the December attack against Sony Pictures a wake-up call to industry. Regardless of whether the attacker was the North Korean government, a disgruntled former employee, or a group of random hackers, the attack showed how vulnerable a large organization can be and how devastating the publication of its private correspondence, proprietary data, and intellectual property can be.

But while companies are supposed to learn that they need to improve their security against attack, there’s another equally important but much less discussed lesson here: companies should have an aggressive deletion policy.

One of the social trends of the computerization of our business and social communications tools is the loss of the ephemeral. Things we used to say in person or on the phone we now say in e-mail, by text message, or on social networking platforms. Memos we used to read and then throw away now remain in our digital archives. Big data initiatives mean that we’re saving everything we can about our customers on the remote chance that it might be useful later.

Everything is now digital, and storage is cheap — why not save it all?

Sony illustrates the reason why not. The hackers published old e-mails from company executives that caused enormous public embarrassment to the company. They published old e-mails by employees that caused less-newsworthy personal embarrassment to those employees, and these messages are resulting in class-action lawsuits against the company. They published old documents. They published everything they got their hands on.

Saving data, especially e-mail and informal chats, is a liability.

It’s also a security risk: the risk of exposure. The exposure could be accidental. It could be the result of data theft, as happened to Sony. Or it could be the result of litigation. Whatever the reason, the best security against these eventualities is not to have the data in the first place.

If Sony had had an aggressive data deletion policy, much of what was leaked couldn’t have been stolen and wouldn’t have been published.

An organization-wide deletion policy makes sense. Customer data should be deleted as soon as it isn’t immediately useful. Internal e-mails can probably be deleted after a few months, IM chats even more quickly, and other documents in one to two years. There are exceptions, of course, but they should be exceptions. Individuals should need to deliberately flag documents and correspondence for longer retention. But unless there are laws requiring an organization to save a particular type of data for a prescribed length of time, deletion should be the norm.
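A schedule like this is straightforward to express as machine-enforceable policy rather than a memo nobody follows. A minimal sketch, using the retention periods suggested above (the code is illustrative; the record types and the legal-hold mechanism are my assumptions):

from datetime import datetime, timedelta

RETENTION = {
    "chat": timedelta(days=30),       # IM: deleted quickly
    "email": timedelta(days=90),      # internal e-mail: a few months
    "document": timedelta(days=365),  # other documents: one to two years
}

def should_delete(record_type, created, flagged_until=None, now=None):
    # Deletion is the norm; retention is the deliberate exception.
    now = now or datetime.utcnow()
    if flagged_until is not None and now < flagged_until:
        return False  # individually flagged, or under a legal hold
    return now - created > RETENTION.get(record_type, timedelta(days=730))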

This has always been true, but many organizations have forgotten it in the age of big data. In the wake of the devastating leak of terabytes of sensitive Sony data, I hope we’ll all remember it now.

This essay previously appeared on ArsTechnica.com, which has comments from people who strongly object to this idea.

Slashdot thread.

Posted on January 15, 2015 at 6:12 AM

Corporations Misusing Our Data

In the Internet age, we have no choice but to entrust our data to private companies: e-mail providers, service providers, retailers, and so on.

We realize that this data is at risk from hackers. But there’s another risk as well: the employees of the companies who are holding our data for us.

In the early years of Facebook, employees had a master password that enabled them to view anything they wanted in any account. NSA employees occasionally snoop on their friends and partners. The agency even has a name for it: LOVEINT. And well before the Internet, people with access to police or medical records occasionally used that power to look up either famous people or people they knew.

The latest company accused of allowing this sort of thing is Uber, the Internet car-ride service. The company is under investigation for spying on riders without their permission. Using an internal tool called the “God View,” some Uber employees are able to see who is using the service and where they’re going — and used it at least once in 2011 as a party trick to show off the service. A senior executive also suggested the company should hire people to dig up dirt on its critics, making its database of people’s rides even more “useful.”

None of us wants to be stalked — whether it’s from looking at our location data, our medical data, our emails and texts, or anything else — by friends or strangers who have access due to their jobs. Unfortunately, there are few rules protecting us.

Government employees are prohibited from looking at our data, although none of the NSA LOVEINT creeps were ever prosecuted. The HIPAA law protects the privacy of our medical records, but we have nothing to protect most of our other information.

Your Facebook and Uber data are protected only by company culture. Nothing in the license agreements you clicked “agree” to but never read prevents those companies from violating your privacy.

This needs to change. Corporate databases containing our data should be secured from everyone who doesn’t need access for their work. Voyeurs who peek at our data without a legitimate reason should be punished.

There are audit technologies that can detect this sort of thing, and they should be required. As long as we have to give our data to companies and government agencies, we need assurances that our privacy will be protected.
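At minimum, such an audit technology joins access logs against documented work reasons and flags the difference for human review. A hedged sketch, where the log schema and the ticket linkage are my assumptions, not any vendor’s actual interface:

def suspicious_accesses(access_log, open_tickets):
    # access_log: (employee, customer_id) pairs from the database layer.
    # open_tickets: (employee, customer_id) pairs with a documented
    # work reason. Anything not covered gets human review.
    return [pair for pair in access_log if pair not in open_tickets]

log = [("alice", "c42"), ("bob", "c7")]
tickets = {("alice", "c42")}
print(suspicious_accesses(log, tickets))  # [('bob', 'c7')]: review this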

This essay previously appeared on CNN.com.

Posted on December 5, 2014 at 6:45 AM

Dan Geer Explains the Government Surveillance Mentality

This talk by Dan Geer explains the NSA mindset of “collect everything”:

I previously worked for a data protection company. Our product was, and I believe still is, the most thorough on the market. By “thorough” I mean the dictionary definition, “careful about doing something in an accurate and exact way.” To this end, installing our product instrumented every system call on the target machine. Data did not and could not move in any sense of the word “move” without detection. Every data operation was caught and monitored. It was total surveillance data protection. Its customers were companies that don’t accept half-measures. What made this product stick out was that very thoroughness, but here is the point: Unless you fully instrument your data handling, it is not possible for you to say what did not happen. With total surveillance, and total surveillance alone, it is possible to treat the absence of evidence as the evidence of absence. Only when you know everything that *did* happen with your data can you say what did *not* happen with your data.

The alternative to total surveillance of data handling is to answer more narrow questions, questions like “Can the user steal data with a USB stick?” or “Does this outbound e-mail have a Social Security Number in it?” Answering direct questions is exactly what a defensive mindset says you must do, and that is “never make the same mistake twice.” In other words, if someone has lost data because of misuse of some facility on the computer, then you either disable that facility or you wrap it in some kind of perimeter. Lather, rinse, and repeat. This extends all the way to such trivial matters as timer-based screen locking.

The difficulty with the defensive mindset is that it leaves in place the fundamental strategic asymmetry of cybersecurity, namely that while the workfactor for the offender is the price of finding a new method of attack, the workfactor for the defender is the cumulative cost of forever defending against all attack methods yet discovered. Over time, the curve for the cost of finding a new attack and the curve for the cost of defending against all attacks to date cross. Once those curves cross, the offender never has to worry about being out of the money. I believe that that crossing occurred some time ago.

The total surveillance strategy is, to my mind, an offensive strategy used for defensive purposes. It says “I don’t know what the opposition is going to try, so everything is forbidden unless we know it is good.” In that sense, it is like whitelisting applications. Taking either the application whitelisting or the total data surveillance approach is saying “That which is not permitted is forbidden.”

[…]

We all know the truism, that knowledge is power. We all know that there is a subtle yet important distinction between information and knowledge. We all know that a negative declaration like “X did not happen” can only be proven true if you have the enumeration of *everything* that did happen and can show that X is not in it. We all know that when a President says “Never again” he is asking for the kind of outcome for which proving a negative, lots of negatives, is categorically essential. Proving a negative requires omniscience. Omniscience requires god-like powers.
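Geer’s “narrow question” example (does this outbound e-mail contain a Social Security number?) is the kind of point defense that is trivial to sketch, and the sketch itself demonstrates the asymmetry he describes. The regex below is just the conventional SSN format; real data-loss-prevention products do more, but the shape of the problem is the same:

import re

# One narrow question, answered: match the conventional ddd-dd-dddd
# SSN format in outbound mail, and nothing else.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def block_outbound(body):
    return bool(SSN.search(body))

print(block_outbound("my SSN is 078-05-1120"))  # True: caught
print(block_outbound("SSN: 078 05 1120"))       # False: evades the rule

The defender has to extend the rule for every evasion discovered; the attacker only has to find one formatting trick the rule missed.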

The whole essay is well worth reading.

Posted on November 11, 2013 at 6:21 AM

Risks of Data Portability

Peter Swire and Yianni Lagos have pre-published a law journal article on the risks of data portability. It specifically addresses an EU data protection regulation, but the security discussion is more general.

…Article 18 poses serious risks to a long-established E.U. fundamental right of data protection, the right to security of a person’s data. Previous access requests by individuals were limited in scope and format. By contrast, when an individual’s lifetime of data must be exported ‘without hindrance,’ then one moment of identity fraud can turn into a lifetime breach of personal data.

They have a point. If you’re going to allow users to download all of their data with one command, you might want to double- and triple-check that command. Otherwise it’s going to become an attack vector for identity theft and other malfeasance.
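In practice, “double- and triple-check” means treating a one-command bulk export as a high-risk action that requires step-up verification, not just a live session. A minimal sketch, where the specific checks are my assumptions about what a sensible gate would include:

from dataclasses import dataclass

@dataclass
class Session:
    reauthenticated_recently: bool  # fresh password or 2FA prompt
    second_factor_verified: bool    # not just a stolen session cookie
    new_device: bool                # unfamiliar device: impose a delay

def allow_full_export(s):
    # A bulk export is a prime identity-fraud target, so gate it
    # behind step-up checks rather than session state alone.
    return (s.reauthenticated_recently
            and s.second_factor_verified
            and not s.new_device)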

Posted on October 24, 2012 at 1:27 PM
