Entries Tagged "laws"

Identity-Theft Disclosure Laws

California was the first state to pass a law requiring companies that keep personal data to disclose when that data is lost or stolen. Since then, many states have followed suit. Now Congress is debating federal legislation that would do the same thing nationwide.

Except that it won’t do the same thing: The federal bill has become so watered down that it won’t be very effective. I would still be in favor of it—a poor federal law is better than none—if it didn’t also pre-empt more-effective state laws, which makes it a net loss.

Identity theft is the fastest-growing area of crime. It’s badly named—your identity is the one thing that cannot be stolen—and is better thought of as fraud by impersonation. A criminal collects enough personal information about you to be able to impersonate you to banks, credit card companies, brokerage houses, etc. Posing as you, he steals your money, or takes a destructive joyride on your good credit.

Many companies keep large databases of personal data that is useful to these fraudsters. But because the companies don’t shoulder the cost of the fraud, they’re not economically motivated to secure those databases very well. In fact, if your personal data is stolen from their databases, they would much rather not even tell you: Why deal with the bad publicity?

Disclosure laws force companies to make these security breaches public. This is a good idea for three reasons. One, it is good security practice to notify potential identity theft victims that their personal information has been lost or stolen. Two, statistics on actual data thefts are valuable for research purposes. And three, the potential cost of the notification and the associated bad publicity naturally leads companies to spend more money on protecting personal information—or to refrain from collecting it in the first place.

Think of it as public shaming. Companies will spend money to avoid the PR costs of this shaming, and security will improve. In economic terms, the law reduces the externalities and forces companies to deal with the true costs of these data breaches.

This public shaming needs the cooperation of the press and, unfortunately, there’s an attenuation effect going on. The first major breach after California passed its disclosure law—SB1386—was in February 2005, when ChoicePoint sold personal data on 145,000 people to criminals. The event was all over the news, and ChoicePoint was shamed into improving its security.

Then LexisNexis exposed personal data on 300,000 individuals. And Citigroup lost data on 3.9 million individuals. SB1386 worked; the only reason we knew about these security breaches was because of the law. But the breaches came in increasing numbers, and in larger quantities. After a while, it was no longer news. And when the press stopped reporting, the “cost” of these breaches to the companies declined.

Today, the only real cost that remains is the cost of notifying customers and issuing replacement cards. It costs banks about $10 to issue a new card, and that’s money they would much rather not have to spend. This is the agenda they brought to the federal bill, cleverly titled the Data Accountability and Trust Act, or DATA.

Lobbyists attacked the legislation in two ways. First, they went after the definition of personal information. Only the exposure of very specific information requires disclosure. For example, the theft of a database that contained people’s first initial, middle name, last name, Social Security number, bank account number, address, phone number, date of birth, mother’s maiden name and password would not have to be disclosed, because “personal information” is defined as “an individual’s first and last name in combination with …” certain other personal data.

Second, lobbyists went after the definition of “breach of security.” The latest version of the bill reads: “The term ‘breach of security’ means the unauthorized acquisition of data in electronic form containing personal information that establishes a reasonable basis to conclude that there is a significant risk of identity theft to the individuals to whom the personal information relates.”

Get that? If a company loses a backup tape containing millions of individuals’ personal information, it doesn’t have to disclose if it believes there is no “significant risk of identity theft.” If it leaves a database exposed, and has absolutely no audit logs of who accessed that database, it could claim it has no “reasonable basis” to conclude there is a significant risk. Actually, the company could point to a study that showed the probability of fraud to someone who has been the victim of this kind of data loss to be less than 1 in 1,000—which is not a “significant risk”—and then not disclose the data breach at all.

Even worse, this federal law pre-empts the 23 existing state laws—and others being considered—many of which contain stronger individual protections. So while DATA might look like a law protecting consumers nationwide, it is actually a law protecting companies with large databases from state laws protecting consumers.

So in its current form, this legislation would make things worse, not better.

Of course, things are in flux. They’re always in flux. The language of the bill has changed regularly over the past year, as various committees got their hands on it. There’s also another bill, HR3997, which is even worse. And even if something passes, it has to be reconciled with whatever the Senate passes, and then voted on again. So no one really knows what the final language will look like.

But the devil is in the details, and the only way to protect us from lobbyists tinkering with the details is to ensure that the federal bill does not pre-empt any state bills: that the federal law is a minimum, but that states can require more.

That said, disclosure is important, but it’s not going to solve identity theft. As I’ve written previously, the reason theft of personal information is so common is that the data is so valuable. The way to mitigate the risk of fraud due to impersonation is not to make personal information harder to steal, it’s to make it harder to use.

Disclosure laws only deal with the economic externality of data brokers protecting your personal information. What we really need are laws prohibiting credit card companies and other financial institutions from granting credit to someone using your name with only a minimum of authentication.

But until that happens, we can at least hope that Congress will refrain from passing bad bills that override good state laws—and helping criminals in the process.

This essay originally appeared on Wired.com.

EDITED TO ADD (4/20): Here’s a comparison of state disclosure laws.

Posted on April 20, 2006 at 8:11 AM

No-Buy List

You’ve all heard of the “No Fly List.” Did you know that there’s a “No-Buy List” as well?

The so-called “Bad Guy List” is hardly a secret. The U.S. Treasury’s Office of Foreign Assets Control maintains its “Specially Designated Nationals and Blocked Persons List” on its public Web site, where anyone can read it.

Wanna see it? Sure you do. Just key OFAC into your Web browser, and you’ll find the 224-page document of the names of individuals, organizations, corporations and Web sites the feds suspect of terrorist or criminal activities and associations.

You might think Osama bin Laden should be at the top of The List, but it’s alphabetized, so Public Enemy No. 1 is on Page 59 with a string of akas and spelling derivations filling most of the first column. If you’re the brother, daughter, son or sister-in-law of Yugoslavian ex-president Slobodan Milosevic (who died in custody recently), you’re named, too, so probably forget about picking up that lovely new Humvee on this side of the Atlantic. Same for Charles “Chuckie” Taylor, son of the recently arrested former president of Liberia (along with the deposed prez’s wife and ex-wife).

The Bad Guy List’s relevance to the average American consumer? What’s not widely known about it is that by federal law, sellers are supposed to check it even in the most common and mundane marketplace transactions.

“The OFAC requirements apply to all U.S. citizens. The law prohibits anyone, not just car dealers, from doing business with anyone whose name appears on the Office of Foreign Assets Control’s Specially Designated Nationals list,” says Thomas B. Hudson, senior partner at Hudson Cook LLP, a law firm in Hanover, Md., and publisher of Carlaw and Spot Delivery, legal-compliance newsletters and services for car dealers and finance companies.

Hudson says that, according to the law, supermarkets, restaurants, pawnbrokers, real estate agents, everyone, even The Washington Post, is prohibited from doing business with anyone named on the list. “There is no minimum amount for the transactions covered by the OFAC requirement, so everyone The Post sells a paper to or a want ad to whose name appears on the SDN list is a violation,” says Hudson, whose new book, “Carlaw—A Southern Attorney Delivers Humorous Practical Legal Advice on Car Sales and Financing,” comes out this month. “The law applies to you personally, as well.”

But The Bad Guy List law (which predates the controversial Patriot Act) not only is “perfectly ridiculous,” it’s impractical, says Hudson. “I understand that 95 percent of the people whose names are on the list are not even in the United States. And if you were a bad guy planning bad acts, and you knew that your name was on a publicly available list that people were required to check in order to avoid violating the law, how dumb would you have to be to use your own name?”

Compliance is also a big problem. Think eBay sellers are checking the list for auction winners? Or that the supermarket checkout person is thanking you by name while scanning a copy of The List under the counter? Not likely.

Posted on April 10, 2006 at 6:23 AM

Evading Copyright Through XOR

Monolith is an open-source program that can XOR two files together to create a third file, and—of course—can XOR that third file with one of the original two to create the other original file.
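The XOR trick Monolith relies on is simple to demonstrate. Here is a minimal sketch (the function and variable names are mine, not Monolith's): because XOR is its own inverse, combining the "Mono" file with the "Basis" file reconstructs the original "Element" file exactly.

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings together, byte by byte."""
    return bytes(x ^ y for x, y in zip(a, b))

# Two files of the same length (real tools pad or cycle the shorter one).
basis = b"freely distributable data"    # the freely distributable file
element = b"copyrighted file contents"  # the copyrighted file

# "Munging" the two produces a third file that looks like noise...
mono = xor_bytes(basis, element)

# ...but XOR-ing it with the basis recovers the copyrighted file.
recovered = xor_bytes(mono, basis)
assert recovered == element
```

Note that the Mono file by itself carries no usable content; it only becomes the copyrighted work when paired with the specific Basis file it was made from.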

The website wonders about the copyright implications of all of this:

Things get interesting when you apply Monolith to copyrighted files. For example, munging two copyrighted files will produce a completely new file that, in most cases, contains no information from either file. In other words, the resulting Mono file is not “owned” by the original copyright holders (if owned at all, it would be owned by the person who did the munging). Given that the Mono file can be combined with either of the original, copyrighted files to reconstruct the other copyrighted file, this lack of Mono ownership may seem hard to believe.

The website then postulates this as a mechanism to get around copyright law:

What does this mean? This means that Mono files can be freely distributed.

So what? Mono files are useless without their corresponding Basis files, right? And the Basis files are copyrighted too, so they cannot be freely distributed, right? There is one more twist to this idea. What happens when we use Basis files that are freely distributable? For example, we could use a Basis file that is in the public domain or one that is licensed for free distribution. Now we are getting somewhere.

None of the aforementioned properties of Mono files change when we use freely distributable Basis files, since the same arguments hold. Mono files are still not copyrighted by the people who hold the copyrights over the corresponding Element files. Now we can freely distribute Mono files and Basis files.

Interesting? Not really. But what you can do with these files, in the privacy of your own home, might be interesting, depending on your proclivities. For example, you can use the Mono files and the Basis files to reconstruct the Element files.

Clever, but it won’t hold up in court. In general, technical hair splitting is not an effective way to get around the law. My guess is that anyone who distributes that third file—they call it a “Mono” file—along with instructions on how to recover the copyrighted file is going to be found guilty of copyright violation.

The correct way to solve this problem is through law, not technology.

Posted on March 30, 2006 at 8:07 AM

Fighting Misuse of the Patriot Act

I like this idea:

I had to sign a tedious business contract the other day. They wanted my corporation number—fair enough—plus my Social Security number—well, if you insist—and also my driver’s license number—hang on, what’s the deal with that?

Well, we e-mailed over a query and they e-mailed back that it was a requirement of the Patriot Act. So we asked where exactly in the Patriot Act could this particular requirement be found and, after a bit of a delay, we got an answer.

And on discovering that there was no mention of driver’s licenses in that particular subsection, I wrote back that we have a policy of reporting all erroneous invocations of the Patriot Act to the Department of Homeland Security on the grounds that such invocations weaken the rationale for the act, and thereby undermine public support for genuine anti-terrorism measures and thus constitute a threat to America’s national security.

And about 10 minutes after that the guy sent back an e-mail saying he didn’t need the driver’s license number after all.

Posted on March 8, 2006 at 7:17 AM

The Future of Privacy

Over the past 20 years, there’s been a sea change in the battle for personal privacy.

The pervasiveness of computers has resulted in the almost constant surveillance of everyone, with profound implications for our society and our freedoms. Corporations and the police are both using this new trove of surveillance data. We as a society need to understand the technological trends and discuss their implications. If we ignore the problem and leave it to the “market,” we’ll all find that we have almost no privacy left.

Most people think of surveillance in terms of police procedure: Follow that car, watch that person, listen in on his phone conversations. This kind of surveillance still occurs. But today’s surveillance is more like the NSA’s model, recently turned against Americans: Eavesdrop on every phone call, listening for certain keywords. It’s still surveillance, but it’s wholesale surveillance.

Wholesale surveillance is a whole new world. It’s not “follow that car,” it’s “follow every car.” The National Security Agency can eavesdrop on every phone call, looking for patterns of communication or keywords that might indicate a conversation between terrorists. Many airports collect the license plates of every car in their parking lots, and can use that database to locate suspicious or abandoned cars. Several cities have stationary or car-mounted license-plate scanners that keep records of every car that passes, and save that data for later analysis.

More and more, we leave a trail of electronic footprints as we go through our daily lives. We used to walk into a bookstore, browse, and buy a book with cash. Now we visit Amazon, and all of our browsing and purchases are recorded. We used to throw a quarter in a toll booth; now EZ Pass records the date and time our car passed through the booth. Data about us are collected when we make a phone call, send an e-mail message, make a purchase with our credit card, or visit a website.

Much has been written about RFID chips and how they can be used to track people. People can also be tracked by their cell phones, their Bluetooth devices, and their WiFi-enabled computers. In some cities, video cameras capture our image hundreds of times a day.

The common thread here is computers. Computers are involved more and more in our transactions, and data are byproducts of these transactions. As computer memory becomes cheaper, more and more of these electronic footprints are being saved. And as processing becomes cheaper, more and more of it is being cross-indexed and correlated, and then used for secondary purposes.

Information about us has value. It has value to the police, but it also has value to corporations. The Justice Department wants details of Google searches, so they can look for patterns that might help find child pornographers. Google uses that same data so it can deliver context-sensitive advertising messages. The city of Baltimore uses aerial photography to surveil every house, looking for building permit violations. A national lawn-care company uses the same data to better market its services. The phone company keeps detailed call records for billing purposes; the police use them to catch bad guys.

In the dot-com bust, the customer database was often the only salable asset a company had. Companies like Experian and Acxiom are in the business of buying and reselling this sort of data, and their customers are both corporate and government.

Computers are getting smaller and cheaper every year, and these trends will continue. Here’s just one example of the digital footprints we leave:

It would take about 100 megabytes of storage to record everything the fastest typist inputs to his computer in a year. That’s a single flash memory chip today, and one could imagine computer manufacturers offering this as a reliability feature. Recording everything the average user does on the Internet requires more memory: 4 to 8 gigabytes a year. That’s a lot, but “record everything” is Gmail’s model, and it’s probably only a few years before ISPs offer this service.

The typical person uses 500 cell phone minutes a month; that translates to 5 gigabytes a year to save it all. My iPod can store 12 times that data. A “life recorder” you can wear on your lapel that constantly records is still a few generations off: 200 gigabytes/year for audio and 700 gigabytes/year for video. It’ll be sold as a security device, so that no one can attack you without being recorded. When that happens, will not wearing a life recorder be used as evidence that someone is up to no good, just as prosecutors today use the fact that someone left his cell phone at home as evidence that he didn’t want to be tracked?
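The cell phone figure above is easy to sanity-check. A quick sketch, assuming a voice-quality audio bitrate of roughly 14 kilobytes per second (my assumption; the essay gives only the yearly total):

```python
# Back-of-envelope check: 500 cell phone minutes a month, recorded at a
# voice-quality bitrate, should come to about 5 gigabytes a year.

SECONDS_PER_MINUTE = 60
minutes_per_year = 500 * 12             # 500 minutes/month
bitrate_bytes_per_sec = 14_000          # assumed ~14 KB/s voice audio

total_bytes = minutes_per_year * SECONDS_PER_MINUTE * bitrate_bytes_per_sec
print(f"{total_bytes / 1e9:.1f} GB/year")  # prints "5.0 GB/year"
```

The exact bitrate matters little here; the point is that a year of phone calls fits comfortably on a consumer device.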

In a sense, we’re living in a unique time in history. Identification checks are common, but they still require us to whip out our ID. Soon it’ll happen automatically, either through an RFID chip in our wallet or face-recognition from cameras. And those cameras, now visible, will shrink to the point where we won’t even see them.

We’re never going to stop the march of technology, but we can enact legislation to protect our privacy: comprehensive laws regulating what can be done with personal information about us, and more privacy protection from the police. Today, personal information about you is not yours; it’s owned by the collector. There are laws protecting specific pieces of personal data—videotape rental records, health care information—but nothing like the broad privacy protection laws you find in European countries. That’s really the only solution; leaving the market to sort this out will result in even more invasive wholesale surveillance.

Most of us are happy to give out personal information in exchange for specific services. What we object to is the surreptitious collection of personal information, and the secondary use of information once it’s collected: the buying and selling of our information behind our back.

In some ways, this tidal wave of data is the pollution problem of the information age. All information processes produce it. If we ignore the problem, it will stay around forever. And the only way to successfully deal with it is to pass laws regulating its generation, use and eventual disposal.

This essay was originally published in the Minneapolis Star-Tribune.

Posted on March 6, 2006 at 5:41 AM

Unfortunate Court Ruling Regarding Gramm-Leach-Bliley

“A Federal Court Rules That A Financial Institution Has No Duty To Encrypt A Customer Database”:

In a legal decision that could have broad implications for financial institutions, a court has ruled recently that a student loan company was not negligent and did not have a duty under the Gramm-Leach-Bliley statute to encrypt a customer database on a laptop computer that fell into the wrong hands.

Basically, an employee of Brazos Higher Education Service Corporation, Inc., had customer information on a laptop computer he was using at home. The computer was stolen, and a customer sued Brazos.

The judge dismissed the lawsuit. And then he went further:

Significantly, while recognizing that Gramm-Leach-Bliley does require financial institutions to protect against unauthorized access to customer records, Judge Kyle held that the statute “does not prohibit someone from working with sensitive data on a laptop computer in a home office,” and does not require that “any nonpublic personal information stored on a laptop computer should be encrypted.”

I know nothing of the legal merits of the case, nor do I have an opinion about whether Gramm-Leach-Bliley does or does not require financial companies to encrypt the personal data in their purview. But I do know that we as a society need to force companies to encrypt personal data about us. Companies won’t do it on their own—the market just doesn’t encourage this behavior—so legislation or liability is the only available mechanism. If this law doesn’t do it, we need another one.

EDITED TO ADD (2/22): Some commentary here.

Posted on February 21, 2006 at 1:34 PM

A Model Regime of Privacy Protection

Last year I blogged about an article by Daniel J. Solove and Chris Hoofnagle titled “A Model Regime of Privacy Protection.”

The paper has been revised a few times based on comments—some of them from readers of this blog and Crypto-Gram—and the final version has been published.

Abstract:
A series of major security breaches at companies with sensitive personal information has sparked significant attention to the problems with privacy protection in the United States. Currently, the privacy protections in the United States are riddled with gaps and weak spots. Although most industrialized nations have comprehensive data protection laws, the United States has maintained a sectoral approach where certain industries are covered and others are not. In particular, emerging companies known as “commercial data brokers” have frequently slipped through the cracks of U.S. privacy law. In this article, the authors propose a Model Privacy Regime to address the problems in the privacy protection in the United States, with a particular focus on commercial data brokers. Since the United States is unlikely to shift radically from its sectoral approach to a comprehensive data protection regime, the Model Regime aims to patch up the holes in existing privacy regulation and improve and extend it. In other words, the goal of the Model Regime is to build upon the existing foundation of U.S. privacy law, not to propose an alternative foundation. The authors believe that the sectoral approach in the United States can be improved by applying the Fair Information Practices—principles that require the entities that collect personal data to extend certain rights to data subjects. The Fair Information Practices are very general principles, and they are often spoken about in a rather abstract manner. In contrast, the Model Regime demonstrates specific ways that they can be incorporated into privacy regulation in the United States.

Definitely worth reading.

Posted on February 6, 2006 at 12:21 PM

REAL ID Harder Than Legislators Thought

According to the Associated Press:

State motor vehicle officials nationwide who will have to carry out the Real ID Act say its authors grossly underestimated its logistical, technological and financial demands.

In a comprehensive survey obtained by The Associated Press and in follow-up interviews, officials cast doubt on the states’ ability to comply with the law on time and fretted that it will be a budget buster.

I’ve already written about REAL ID, including the obscene costs:

REAL ID is expensive. It’s an unfunded mandate: the federal government is forcing the states to spend their own money to comply with the act. I’ve seen estimates that the cost to the states of complying with REAL ID will be $120 million. That’s $120 million that can’t be spent on actual security.

According to the AP, I was way off:

Pennsylvania alone estimated a hit of up to $85 million. Washington state projected at least $46 million annually in the first several years.

Separately, a December report to Virginia’s governor pegged the potential price tag for that state as high as $169 million, with $63 million annually in successive years. Of the initial cost, $33 million would be just to redesign computing systems.

Remember, security is a trade-off. REAL ID is a bad idea primarily because the security gained is not worth the enormous expense.

See also the ACLU’s site on REAL ID.

Posted on January 13, 2006 at 1:23 PM
