Entries Tagged "Amazon"


Feudal Security

It’s a feudal world out there.

Some of us have pledged our allegiance to Google: We have Gmail accounts, we use Google Calendar and Google Docs, and we have Android phones. Others have pledged allegiance to Apple: We have Macintosh laptops, iPhones, and iPads; and we let iCloud automatically synchronize and back up everything. Still others of us let Microsoft do it all. Or we buy our music and e-books from Amazon, which keeps records of what we own and allows downloading to a Kindle, computer, or phone. Some of us have pretty much abandoned e-mail altogether … for Facebook.

These vendors are becoming our feudal lords, and we are becoming their vassals. We might refuse to pledge allegiance to all of them — or to a particular one we don’t like. Or we can spread our allegiance around. But either way, it’s becoming increasingly difficult to not pledge allegiance to at least one of them.

Feudalism provides security. Classical medieval feudalism depended on overlapping, complex, hierarchical relationships. There were oaths and obligations: a series of rights and privileges. A critical aspect of this system was protection: vassals would pledge their allegiance to a lord, and in return, that lord would protect them from harm.

Of course, I’m romanticizing here; European history was never this simple, and the description is based on stories of that time, but that’s the general model.

And it’s this model that’s starting to permeate computer security today.

I Pledge Allegiance to the United States of Convenience

Traditional computer security centered around users. Users had to purchase and install anti-virus software and firewalls, ensure their operating system and network were configured properly, update their software, and generally manage their own security.

This model is breaking, largely due to two developments:

  1. New Internet-enabled devices where the vendor maintains more control over the hardware and software than we do — like the iPhone and Kindle; and
  2. Services where the host maintains our data for us — like Flickr and Hotmail.

Now, we users must trust the security of these hardware manufacturers, software vendors, and cloud providers.

We choose to do it because of the convenience, redundancy, automation, and shareability. We like it when we can access our e-mail anywhere, from any computer. We like it when we can restore our contact lists after we’ve lost our phones. We want our calendar entries to automatically appear on all of our devices. These cloud storage sites do a better job of backing up our photos and files than we would manage by ourselves; Apple does a great job keeping malware out of its iPhone app store.

In this new world of computing, we give up a certain amount of control, and in exchange we trust that our lords will both treat us well and protect us from harm. Not only will our software be continually updated with the newest and coolest functionality, but we trust it will happen without our being overtaxed by fees and required upgrades. We trust that our data and devices won’t be exposed to hackers, criminals, and malware. We trust that governments won’t be allowed to illegally spy on us.

Trust is our only option. In this system, we have no control over the security provided by our feudal lords. We don’t know what sort of security methods they’re using, or how they’re configured. We mostly can’t install our own security products on iPhones or Android phones; we certainly can’t install them on Facebook, Gmail, or Twitter. Sometimes we have control over whether or not to accept the automatically flagged updates — iPhone, for example — but we rarely know what they’re about or whether they’ll break anything else. (On the Kindle, we don’t even have that freedom.)

The Good, the Bad, and the Ugly

I’m not saying that feudal security is all bad. For the average user, giving up control is largely a good thing. These software vendors and cloud providers do a much better job of security than the average computer user would. Automatic cloud backup saves a lot of data; automatic updates prevent a lot of malware. The network security at any of these providers is better than that of most home users.

Feudalism is good for the individual, for small startups, and for medium-sized businesses that can’t afford to hire their own in-house or specialized expertise. Being a vassal has its advantages, after all.

For large organizations, however, it’s more of a mixed bag. These organizations are used to trusting other companies with critical corporate functions: They’ve been outsourcing their payroll, tax preparation, and legal services for decades. But IT regulations often require audits. Our lords don’t allow vassals to audit them, even if those vassals are themselves large and powerful.

Yet feudal security isn’t without its risks.

Our lords can make mistakes with security, as recently happened with Apple, Facebook, and Photobucket. They can act arbitrarily and capriciously, as Amazon did when it cut off a Kindle user for living in the wrong country. They tether us like serfs; just try to take data from one digital lord to another.

Ultimately, they will always act in their own self-interest, as companies do when they mine our data in order to sell more advertising and make more money. These companies own us, so they can sell us off — again, like serfs — to rival lords…or turn us in to the authorities.

Historically, early feudal arrangements were ad hoc, and the more powerful party would often simply renege on his part of the bargain. Eventually, the arrangements were formalized and standardized: both parties had rights and privileges (things they could do) as well as protections (things they couldn’t do to each other).

Today’s internet feudalism, however, is ad hoc and one-sided. We give companies our data and trust them with our security, but we receive very few assurances of protection in return, and those companies have very few restrictions on what they can do.

This needs to change. There should be limitations on what cloud vendors can do with our data; rights, like the requirement that they delete our data when we want them to; and liabilities when vendors mishandle our data.

Like everything else in security, it’s a trade-off. We need to balance that trade-off. In Europe, it was the rise of the centralized state and the rule of law that undermined the ad hoc feudal system; it provided more security and stability for both lords and vassals. But these days, government has largely abdicated its role in cyberspace, and the result is a return to the feudal relationships of yore.

Perhaps instead of hoping that our Internet-era lords will be sufficiently clever and benevolent — or putting our faith in the Robin Hoods who block phone surveillance and circumvent DRM systems — it’s time we step in, in our role as governments (both national and international), to create the regulatory environments that protect us vassals (and the lords as well). Otherwise, we really are just serfs.

A version of this essay was originally published on Wired.com.

Posted on December 3, 2012 at 7:24 AM

Ebook Fraud

Interesting post — and discussion — on Making Light about ebook fraud. Currently there are two types of fraud. The first is content farming, discussed in these two interesting blog posts. People are creating automatically generated content, web-collected content, or fake content, turning it into a book, and selling it on an ebook site like Amazon.com. Then they use multiple identities to give it good reviews. (If it gets a bad review, the scammer just relists the same content under a new name.) That second blog post contains a screen shot of something called “Autopilot Kindle Cash,” which promises to teach people how to post dozens of ebooks to Amazon.com per day.

The second type of fraud is stealing a book and selling it as an ebook. So someone could scan a real book and sell it on an ebook site, even though he doesn’t own the copyright. It could be a book that isn’t already available as an ebook, or it could be a “low cost” version of a book that is already available. Amazon doesn’t seem particularly motivated to deal with this sort of fraud. And it too is suitable for automation.

Broadly speaking, there’s nothing new here. All complex ecosystems have parasites, and every open communications system we’ve ever built gets overrun by scammers and spammers. Far from making editors superfluous, systems that democratize publishing have an even greater need for editors. The solutions are not new, either: reputation-based systems, trusted recommenders, white lists, takedown notices. Google has implemented a bunch of security countermeasures against content farming; ebook sellers should implement them as well. It’ll be interesting to see what particular sort of mix works in this case.
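
To make one of those countermeasures concrete, here is a rough sketch of a reputation-based review filter. The weighting scheme and the reviewer attributes are invented for illustration — this is not how Amazon actually scores reviews — but the idea is simply that reviews from throwaway accounts with no purchase history should count for much less, which blunts the multiple-identities trick described above.

    from dataclasses import dataclass

    @dataclass
    class Review:
        rating: int              # 1-5 stars
        reviewer_age_days: int   # age of the reviewer's account
        verified_purchase: bool  # did the reviewer actually buy the book?

    def reviewer_weight(r: Review) -> float:
        """Discount reviews from young or unverified accounts."""
        weight = min(r.reviewer_age_days / 365.0, 1.0)  # ramps up over a year
        if not r.verified_purchase:
            weight *= 0.25
        return weight

    def weighted_score(reviews: list[Review]) -> float:
        """Reputation-weighted average rating; 0.0 if nothing is trustworthy."""
        total = sum(reviewer_weight(r) for r in reviews)
        if total == 0:
            return 0.0
        return sum(r.rating * reviewer_weight(r) for r in reviews) / total

A scammer can still manufacture accounts, of course; the point is only that raising the cost of a convincing fake review changes the economics of content farming.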

Posted on April 4, 2011 at 9:18 AM

Fake Amazon Receipt Generators

They can be used to scam Amazon Marketplace merchants:

What happens once our scammer is armed with his fake receipt? Well, many sellers on Amazon will ask you to send them a copy of your receipt should you run into trouble, have orders go missing, lose your license key for a piece of software and so on. The gag here is that the scammer is relying on the seller not checking the details and accepting the printout at face value. After all, how many sellers would be aware somebody went to the trouble of creating a fake receipt generator in the first place?

They’re also useful if you want to defraud your employer on expense reimbursement forms.

Posted on December 17, 2010 at 6:28 AM

File Deletion

File deletion is all about control. This used not to be an issue. Your data was on your computer, and you decided when and how to delete a file. You could use the delete function if you didn’t care whether the file could be recovered, and a file erase program — I use BCWipe for Windows — if you wanted to ensure no one could ever recover the file.
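
For the curious, the core idea behind a file erase program is just to overwrite the file’s contents before unlinking it. Here is a bare-bones sketch in Python; note that on SSDs and journaling filesystems overwriting in place offers no real guarantee, so treat this as an illustration of the concept rather than a substitute for a proper wiping tool like BCWipe.

    import os

    def wipe_file(path: str, passes: int = 3) -> None:
        """Overwrite a file with random bytes several times, then delete it."""
        size = os.path.getsize(path)
        with open(path, "r+b") as f:
            for _ in range(passes):
                f.seek(0)
                f.write(os.urandom(size))
                f.flush()
                os.fsync(f.fileno())  # force the overwrite out to the device
        os.remove(path)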

As we move more of our data onto cloud computing platforms such as Gmail and Facebook, and closed proprietary platforms such as the Kindle and the iPhone, deleting data is much harder.

You have to trust that these companies will delete your data when you ask them to, but they’re generally not interested in doing so. Sites like these are more likely to make your data inaccessible than they are to physically delete it. Facebook is a known culprit: actually deleting your data from its servers requires a complicated procedure that may or may not work. And even if you do manage to delete your data, copies are certain to remain in the companies’ backup systems. Gmail explicitly says this in its privacy notice.

Online backups, SMS messages, photos on photo sharing sites, smartphone applications that store your data in the network: you have no idea what really happens when you delete pieces of data or your entire account, because you’re not in control of the computers that are storing the data.

This notion of control also explains how Amazon was able to delete a book that people had previously purchased on their Kindle e-book readers. The legalities are debatable, but Amazon had the technical ability to delete the file because it controls all Kindles. It has designed the Kindle so that it determines when to update the software, whether people are allowed to buy Kindle books, and when to turn off people’s Kindles entirely.

Vanish is a research project by Roxana Geambasu and colleagues at the University of Washington. They designed a prototype system that automatically deletes data after a set time interval. So you can send an email, create a Google Doc, post an update to Facebook, or upload a photo to Flickr, all designed to disappear after a set period of time. And after it disappears, no one — not anyone who downloaded the data, not the site that hosted the data, not anyone who intercepted the data in transit, not even you — will be able to read it. If the police arrive at Facebook or Google or Flickr with a warrant, they won’t be able to read it.

The details are complicated, but Vanish breaks the data’s decryption key into a bunch of pieces and scatters them around the web using a peer-to-peer network. Then it uses the natural turnover in these networks — machines constantly join and leave — to make the data disappear. Unlike previous programs that supported file deletion, this one doesn’t require you to trust any company, organisation, or website. It just happens.
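
To make the mechanism concrete, here is a toy version of the key-splitting step. Vanish itself uses a threshold scheme (Shamir secret sharing), so the key remains recoverable until enough of the DHT nodes holding shares have churned away; this simplified XOR split is all-or-nothing, but it shows how a key can be scattered so that losing shares makes the data unreadable.

    import os
    from functools import reduce

    def xor_bytes(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def split_key(key: bytes, n: int) -> list[bytes]:
        """Split key into n shares; all n are needed to reconstruct it."""
        shares = [os.urandom(len(key)) for _ in range(n - 1)]
        shares.append(reduce(xor_bytes, shares, key))  # key ^ r1 ^ ... ^ r(n-1)
        return shares

    def recover_key(shares: list[bytes]) -> bytes:
        """XOR every share together to get the key back."""
        return reduce(xor_bytes, shares)

    key = os.urandom(16)                # encrypt the message with this key
    pieces = split_key(key, 10)         # scatter these across DHT nodes
    assert recover_key(pieces) == key   # once any piece is gone, so is the key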

Of course, Vanish doesn’t prevent the recipient of an email or the reader of a Facebook page from copying the data and pasting it into another file, just as Kindle’s deletion feature doesn’t prevent people from copying a book’s files and saving them on their computers. Vanish is just a prototype at this point, and it only works if all the people who read your Facebook entries or view your Flickr pictures have it installed on their computers as well; but it’s a good demonstration of how control affects file deletion. And while it’s a step in the right direction, it’s also new and therefore deserves further security analysis before being adopted on a wide scale.

We’ve lost control of the data on some of the computers we own, and we’ve lost control of our data in the cloud. We’re not going to stop using Facebook and Twitter just because they’re not going to delete our data when we ask them to, and we’re not going to stop using Kindles and iPhones because they may delete our data when we don’t want them to. But we need to take back control of data in the cloud, and projects like Vanish show us how we can.

Now we need something that will protect our data when a large corporation decides to delete it.

This essay originally appeared in The Guardian.

EDITED TO ADD (9/30): Vanish has been broken, paper here.

Posted on September 10, 2009 at 6:08 AM

Data Mining and Amazon Wishlists

Data Mining 101: Finding Subversives with Amazon Wishlists.

Now, imagine the false alarms and abuses that are possible if you have lots more data, and lots more computers to slice and dice it.

Of course, there are applications where this sort of data mining makes a whole lot of sense. But finding terrorists isn’t one of them. It’s a needle-in-a-haystack problem, and piling on more hay doesn’t help matters much.
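
Some back-of-the-envelope arithmetic, with made-up numbers, shows why: even a screening system with a 99% hit rate and a 0.1% false-alarm rate buries its handful of real hits under hundreds of thousands of false ones.

    population  = 300_000_000  # people screened (hypothetical)
    terrorists  = 1_000        # actual needles in the haystack (hypothetical)
    hit_rate    = 0.99         # P(flagged | terrorist)
    false_alarm = 0.001        # P(flagged | innocent)

    true_hits  = terrorists * hit_rate                    # ~990
    false_hits = (population - terrorists) * false_alarm  # ~300,000
    precision  = true_hits / (true_hits + false_hits)     # ~0.3%

    print(f"Innocents flagged: {false_hits:,.0f}")
    print(f"Chance a flagged person is a real terrorist: {precision:.2%}")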

Posted on January 5, 2006 at 6:15 AM
