Data Is a Toxic Asset

Thefts of personal information aren't unusual. Every week, thieves break into networks and steal data about people, often tens of millions at a time. Most of the time it's information that's needed to commit fraud, as happened in 2015 to Experian and the IRS.

Sometimes it's stolen for purposes of embarrassment or coercion, as in the 2015 cases of Ashley Madison and the US Office of Personnel Management. The latter exposed highly sensitive personal data that affects the security of millions of government employees, probably to the Chinese government. It's always personal information about us, information that we shared with the expectation that the recipients would keep it secret. And in every case, they did not.

The telecommunications company TalkTalk admitted that its data breach last year resulted in criminals using customer information to commit fraud. This was more bad news for a company that's been hacked three times in the past 12 months, and has already seen some disastrous effects from losing customer data, including £60 million (about $83 million) in damages and over 100,000 lost customers. Its stock price took a pummeling as well.

People have been writing about 2015 as the year of data theft. I'm not sure if more personal records were stolen last year than in other recent years, but it certainly was a year for big stories about data thefts. I also think it was the year that industry started to realize that data is a toxic asset.

The phrase "big data" refers to the idea that large databases of seemingly random data about people are valuable. Retailers save our purchasing habits. Cell phone companies and app providers save our location information.

Telecommunications providers, social networks, and many other types of companies save information about who we talk to and share things with. Data brokers save everything about us they can get their hands on. This data is saved and analyzed, bought and sold, and used for marketing and other persuasive purposes.

And because saving all this data is so cheap, there's no reason not to save as much as possible, and save it all forever. Figuring out what isn't worth saving is hard. And because someday the companies might figure out how to turn the data into money, until recently there was absolutely no downside to saving everything. That changed this past year.

What all these data breaches are teaching us is that data is a toxic asset and saving it is dangerous.

Saving it is dangerous because it's highly personal. Location data reveals where we live, where we work, and how we spend our time. If we all have a location tracker like a smartphone, correlating data reveals who we spend our time with -- including who we spend the night with.

Our Internet search data reveals what's important to us, including our hopes, fears, desires and secrets. Communications data reveals who our intimates are, and what we talk about with them. I could go on. Our reading habits, or purchasing data, or data from sensors as diverse as cameras and fitness trackers: All of it can be intimate.

Saving it is dangerous because many people want it. Of course companies want it; that's why they collect it in the first place. But governments want it, too. In the United States, the National Security Agency and FBI use secret deals, coercion, threats and legal compulsion to get at the data. Foreign governments just come in and steal it. When a company with personal data goes bankrupt, it's one of the assets that gets sold.

Saving it is dangerous because it's hard for companies to secure. For a lot of reasons, computer and network security is very difficult. Attackers have an inherent advantage over defenders, and a sufficiently skilled, funded and motivated attacker will always get in.

And saving it is dangerous because failing to secure it is damaging. It will reduce a company's profits, reduce its market share, hurt its stock price, cause it public embarrassment, and -- in some cases -- result in expensive lawsuits and, occasionally, criminal charges.

All this makes data a toxic asset, and it continues to be toxic as long as it sits in a company's computers and networks. The data is vulnerable, and the company is vulnerable. It's vulnerable to hackers and governments. It's vulnerable to employee error. And when there's a toxic data spill, millions of people can be affected. The 2015 Anthem Health data breach affected 80 million people. The 2013 Target Corp. breach affected 110 million.

This toxic data can sit in organizational databases for a long time. Some of the stolen Office of Personnel Management data was decades old. Do you have any idea which companies still have your earliest e-mails, or your earliest posts on that now-defunct social network?

If data is toxic, why do organizations save it?

There are three reasons. The first is that we're in the middle of the hype cycle of big data. Companies and governments are still punch-drunk on data, and have believed the wildest of promises on how valuable that data is. The research showing that more data isn't necessarily better, and that there are serious diminishing returns when adding additional data to processes like personalized advertising, is just starting to come out.

The second is that many organizations are still downplaying the risks. Some simply don't realize just how damaging a data breach would be. Some believe they can completely protect themselves against a data breach, or at least that their legal and public relations teams can minimize the damage if they fail. And while there's certainly a lot that companies can do technically to better secure the data they hold about all of us, there's no better security than deleting the data.

The last reason is that some organizations understand both the first two reasons and are saving the data anyway. The culture of venture-capital-funded start-up companies is one of extreme risk taking. These are companies that are always running out of money, that always know their impending death date.

They are so far from profitability that their only hope for surviving is to get even more money, which means they need to demonstrate rapid growth or increasing value. This motivates those companies to take risks that larger, more established, companies would never take. They might take extreme chances with our data, even flout regulations, because they literally have nothing to lose. And often, the most profitable business models are the most risky and dangerous ones.

We can be smarter than this. We need to regulate what corporations can do with our data at every stage: collection, storage, use, resale and disposal. We can make corporate executives personally liable so they know there's a downside to taking chances. We can make the business models that involve massively surveilling people the less compelling ones, simply by making certain business practices illegal.

The Ashley Madison data breach was such a disaster for the company because it saved its customers' real names and credit card numbers. It didn't have to do it this way. It could have processed the credit card information, given the user access, and then deleted all identifying information.

To be sure, it would have been a different company. It would have had less revenue, because it couldn't charge users a monthly recurring fee. Users who lost their password would have had more trouble re-accessing their account. But it would have been safer for its customers.
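
A minimal sketch of that safer design, in Python (all names are hypothetical, and the payment step is assumed to happen out-of-band): after payment is verified, the account store keeps only a pseudonymous handle and a salted password hash, never the real name or card number.

```python
import hashlib
import os

def create_account(handle, password, real_name, card_number):
    """Process the signup: after payment is verified (not shown),
    keep only a pseudonymous handle and a salted password hash.
    The identifying fields are discarded, never stored."""
    salt = os.urandom(16)
    pw_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    del real_name, card_number  # deliberately dropped after payment
    return {"handle": handle, "salt": salt, "pw_hash": pw_hash}

def check_login(account, password):
    # Re-derive the hash from the stored salt; no PII is needed to authenticate.
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                    account["salt"], 100_000)
    return candidate == account["pw_hash"]

# The stored record contains nothing that identifies the person.
acct = create_account("user_8675309", "correct horse", "J. Doe", "4111111111111111")
```

A breach of this account store leaks pseudonyms and password hashes, not names and card numbers; the trade-off, as noted above, is no recurring billing and harder account recovery.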

Similarly, the Office of Personnel Management didn't have to store everyone's information online and accessible. It could have taken older records offline, or at least onto a separate network with more secure access controls. Yes, it wouldn't be immediately available to government employees doing research, but it would have been much more secure.

Data is a toxic asset. We need to start thinking about it as such, and treat it as we would any other source of toxicity. To do anything else is to risk our security and privacy.

This essay previously appeared on CNN.com.

Posted on March 4, 2016 at 5:32 AM • 45 Comments

Comments

Nils • March 4, 2016 6:31 AM

The big problem is that the direct risk is for the person the data is about, not for the organisation storing the data.

To have some self regulation there needs to be a cost for storing personal data. (Just the technical storage cost is not enough anymore.)

I find interesting the idea of requiring anyone storing personal information to inform the person the data is about, once per year by physical mail, including all stored information.

If some company sees value in storing all the websites I visit, they should have to send me a letter listing all those sites with every visit (everything they store) every year. If they have to send 100 pages of paper per user per year of stored data, they might consider deleting old stuff.

Halley • March 4, 2016 6:51 AM

/\ "We need to regulate what corporations can do with our data.... We can make corporate executives personally liable ... We can make certain business practices illegal."


Ahhh yes -- the knee-jerk authoritarian approach to problem solving --
"There Oughta be a Law !! "

Commissar Schneier readily discerns that government politicians & bureaucrats must forcefully step in to fix this data problem -- they are so good at problem solving generally... and especially good at protecting government data and citizen privacy. What could go wrong.

Apparently the hundreds-of-thousands of laws/regulations now in force are insufficient for U.S. society-- a few more data protection laws will surely do the trick. Just ratchet up the police state another notch.
Risk of unintended consequences is of no consequence.

zzz • March 4, 2016 7:05 AM

@Nils - Maybe requiring data collectors (companies/government/etc...) to share all collected data with the user whenever the user requests it would be enough...

Wm • March 4, 2016 7:12 AM

I always look to myself for security. I have no faith in companies or governments doing anything right. The rampant greed of people today will prevent them from thinking or caring about the other guy. All they are really concerned about is doing just enough to keep from being sued. Thus, my cell phone is never on unless I need to make a call. This is probably impractical for most. A pager will work so people can contact you. All my online credit card transactions are made through generating a one-time credit card number using the Bank of America ShopSafe system. I pay cash for everything else. I found a special legal way to give a stealth address for my domain names that do not qualify for WhoisGuard.

Paul Renault • March 4, 2016 7:40 AM

Thanks, Bruce for the CNN link.

For years now, I keep asking some hotels I stay at "Why do you hang on to my credit card information? Once the cleaning staff has been in there and can confirm that I haven't stolen the mattress and the furniture, what use is it to you?" I then remind them (actually, it's almost NEVER the case that the hotel manager has heard about it) of the TJX, Target, and so many other breaches, and the huge payouts these companies have had to, er, pay out.

(It's not just Trump who seems to be totally unaware of what's going on in the world, eh?)

I know that if I pointed them to this blog, they wouldn't care. But on CNN, they might listen. [sigh]

paul • March 4, 2016 7:59 AM

If you look at how fast other things that can torpedo a business have percolated into general understanding, we probably have another 5-10 years of this stuff, or until there's some kind of even-more-visible disaster and everyone stands around claiming no one could possibly have known it was coming.

Years and years ago, when I was reporting on aviation safety, some of the people in the field claimed that no amount of reports and warnings was enough, and that only crash reports motivated the FAA and the air transport industry to make changes. Let's hope that isn't true for data protection.

Winter • March 4, 2016 8:01 AM

@Halley
"Ahhh yes -- the knee-jerk authoritarian approach to problem solving --
"There Oughta be a Law !! ""

Maybe personal data should remain the property of the person it is about? Then the law would only be protecting the "property" of the public. That is more or less the viewpoint in Europe.

Stolen data harms people. Those who are not up to the task of preventing this harm should not be allowed to store data.

Hans • March 4, 2016 8:56 AM

Why not mandate that any organization that leaks PII, PCI or HIPAA records must pay each affected user $350 per record? That creates a clear incentive to destroy or protect the data. I know it won't happen but that could avoid the trouble of specifying security standards in an ever-changing environment which so far has been a great way to avoid responsibility. "We met the security requirements" will no longer be an excuse.

Clive Robinson • March 4, 2016 9:09 AM

@ Bruce,

You might want to put the missing word "lost" at the end of,

... including £60 million (about $83 million) in damages and over 100,000 customers.

de La Boetie • March 4, 2016 9:11 AM

I've found it useful in discussion with the unwashed to analogise data storage to a radioactive waste dump - I think that better engages people's visceral reaction to it, so that it more closely aligns with the true nature of it all.

jb • March 4, 2016 9:17 AM

zzz,

"@Nils - Maybe requiring data collectors (companies/government/etc...) to share all collected data with the user whenever the user request it would be enough..."


How would we ever possibly enforce that?

blake • March 4, 2016 9:22 AM

How about a definition of Big Data: "Big data is a data set which, if leaked, destroys your company".

TimH • March 4, 2016 9:49 AM

@Nils @zzz Great, so I have to keep all those institutions aware of my current address so they don't spray my records to where I'm not.

I'll give you two unmentioned issues:

1. A lot of the capture is not "you", it's a semi-correlated, inferred "you". FB tracking non-FB users, for example.

2. I don't care that much to know what Experian, State Farm, Blue Shield, FBI, TSA have collected about me. Sure, some of it may be wrong, and in edge cases it may affect me. What I want is the analysis that these institutions have made about me. That's far more potentially damaging. No credit, no insurance, dawn raid, no-fly are the consequences of bad inferences and correlations, and all of that is secret.

Clive Robinson • March 4, 2016 9:54 AM

@ Bruce,

It could have taken older records offline, or at least onto a separate network with more secure access controls.

Not only is "taking offline" -- especially if encrypted -- a more secure option, it's also a cheaper and safer way. The down side is that stored data ages by decay and the storage methods quickly become obsolete. Which, from a data subject's point of view, is desirable, because the cost of transfer to new media quickly gets seen as an "undesirable cost" with little or no ROI, and thus gets axed.

Taking the data onto another network is the equivalent of the old "Data Warehousing" idea from the early to mid 1990s. One advantage, if done correctly, is that aging data becomes visible by the lack of pull requests for it; thus it can be put into older data sets and eventually drop into the bit bucket of mag tape in Iron Mountain or wherever.

One idea from the EU back in the 1980s was a "data tax", which immediately puts a negative value on any data. If combined with an inverse taper tax, keeping old data would reach the point where it was not economically viable no matter how far media and technology costs fell.

But... if you talk to historians, the one type of data they really want to analyze history with is "day to day" records such as accounts, factory and other business records and similar, which really are not available until the 19th-20th centuries, and even then only occasionally. The reason is that the cost of keeping records such as hand-written ledgers was high, especially when there was no requirement to keep them.

The one reason "big data" got going was the ambiguity over how long business and other records legally had to be kept. As data storage got less expensive, more data was kept, and the cost of keeping it fell further. However, as with any cost, a business wants to make a return on it if it can. Thus people started to look at ways to capitalize on data they felt they had to keep to avoid legal risk. The result, fairly quickly, was the "numerati" who could mine business value out of such records, and as history teaches, once a market comes into being it has an incentive to evolve -- hence "big data".

The question few ask is what the confidence is in the algorithms used on big data. Many algorithms come from analysing smaller historical data sets. Previous studies have shown that up to 30% of data can be incorrect, with something like 80% of records affected, and because of the way data is bought and sold, the older the data the more likely it is to be in error. With that sort of garbage in, what is the output like?

TimH • March 4, 2016 9:57 AM

@blake
I'd prefer "Big data is a data set which, if leaked, brings mandatory jail time for the CTO -- whom we are required by law to employ directly in the US and may not subcontract to 3rd-party orgs".

I believe that Germany's toxic waste disposal law is still that if you sub your disposal to a 3rd party and they screw up, you are still directly liable.

Winter • March 4, 2016 10:40 AM

I know that anonymization is impossible in principle, but the cost of de-anonymization can be racked up pretty high.

If you really "must" store big data for a long time, why not compartmentalize it into separate containers, each of which needs a separate decoding list to link records to pseudonymous identities? Only access to all the decoding lists allows the full linking of all records together and to a name.

This is only sensible if access to the decoding lists can be controlled independently of the data records.
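
A minimal sketch of that compartment scheme (all names hypothetical): each field of a record is stored under a fresh random pseudonym, and the pseudonym-to-person mapping lives in a separate decoding list per compartment, so re-linking a full profile requires every list.

```python
import secrets

def store(person_id, record, compartments, decoding_lists):
    """Scatter each field of `record` into its own compartment under a
    random pseudonym; the pseudonym -> person mapping goes into that
    compartment's separate decoding list."""
    for field, value in record.items():
        pseudonym = secrets.token_hex(16)
        compartments.setdefault(field, {})[pseudonym] = value
        decoding_lists.setdefault(field, {})[pseudonym] = person_id

def relink(person_id, compartments, decoding_lists):
    """Rebuilding a full profile needs *all* the decoding lists."""
    profile = {}
    for field, dec in decoding_lists.items():
        for pseudonym, pid in dec.items():
            if pid == person_id:
                profile[field] = compartments[field][pseudonym]
    return profile

compartments, decoding_lists = {}, {}
store("alice", {"location": "Paris", "search": "jobs"},
      compartments, decoding_lists)

# A thief who steals the compartments alone holds only unlinkable
# pseudonymous values; the full profile needs every decoding list.
full = relink("alice", compartments, decoding_lists)
```

As the comment says, this only helps if the decoding lists are held and controlled independently of the data records -- stolen together, the scheme adds nothing.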

ianf • March 4, 2016 10:53 AM


@ de la Boetie […] in discussion with the unwashed equates big data storage to "a radioactive waste dump."

That's pretty good, but not good enough. Most people have no mental concept/ image of such a ((R))-rated dump, and wouldn't recognize one if they saw one, much less take precautions not to become contaminated [somewhat atypical example, but still… people smeared stolen Cesium-137 powder onto their bodies as "glowing glitter"]. In fact, designing a pictographically unambiguous signage method for warning future -- possibly "Mad Max"-ish -- generations of the DANGER associated with nuclear waste dumps that stay lethal for 100,000 years occupies the minds of not a few scientists. Our civilization's oldest written language is a mere 5,000 years old.

    For that unfamiliarity reason alone, it might be better to compare big data storage to compost or garbage dumps, which most (urban) people at least have some inkling of. "Would you store your birth certificate in a dumpster?" Both compost heaps and trash cans are unpleasant, uninviting, and known to be frequented by rats, animal and human.

John • March 4, 2016 10:58 AM

Over the past couple of years I have worked with a couple of companies, and the problem with securing data is that it's always an afterthought...

Companies want so badly to sell their products and offer bleeding-edge services. There is always a group in every company that sabotages IT every chance it gets with its clueless ideas. Management always funds this group, and its programs always run amok. I had a school district buy 4,000 iPads to allow the kids to watch Adobe Flash... Problem is that iPads do not support Adobe Flash... OK IT, fix this!!!

The example above is typical of every company that I have worked with. You try to secure things, but it's almost impossible since another group is steering your ship into doom every time. Management never steps in, since sales are what drive a company...

PCI compliance has helped a lot, since it forces an organization to watch the specific traffic that carries credit cards... But then again, this same group goes out and buys iPads to enter credit card information over wireless with no application security... Nightmare...

Business leaders have not signed off on being responsible for security if they allow their own employees to sabotage it. Business leaders are still the problem with security, even in 2016.

chris l • March 4, 2016 11:08 AM

Note that some of us did sue to try to prevent the US government from overcollecting data, and suffered at least two data breaches after we finally relented to keep our jobs: HSPD-12. We were assured that OPM would keep our data safe. If you read down, there was a second, smaller data breach when a NASA employee had a laptop with a bunch of unencrypted PII stolen from a car in 2012.

One problem with respect to the US gov't is that the Privacy Act is extremely weak and requires demonstration of actual harm traceable to the release, which is a difficult threshold when that harm could come years later or in a way that's hard to directly detect. Someone who has access to the stolen OPM data could quietly make all sorts of employment decisions about people without there being any evidence of it whatsoever. Creating real penalties for such improper data release would help, as would taking data systems offline if they fail security audits. OPM had years of notice that its network and data security were weak and out of compliance, but kept operating with everything online anyway because of the rush to clear huge numbers of people to support the continued physical-security anxiety that came out of 9/11.

A second possible improvement would, as Clive suggests, be keeping inactive data offline and requiring a "sneakernet" transfer to make it available over the network. This is particularly appropriate for something like the OPM data, which are only active during the rebadging/reclearance process and otherwise shouldn't need to be accessed (or only a limited subset needs to be accessed for things like payroll).

A third and more substantial improvement would be to revise the clearance process-- I've seen no evidence that it's based on any sort of research into what makes a person reliable. Instead it has the marks of J. Edgar Hoover's FBI all over it-- "we want to collect everything we can find so we can hold it over your head and *we* can blackmail you". Simply collecting less information, or destroying it after a clearance decision is made, would reduce the impact of data breaches. You can't release what you didn't retain. An argument that will be used against this is that the clearance process compares responses from the past to check if someone is changing their history. With the OPM data breach, something we don't know is if whoever copied all the data also contaminated the database by changing entries. That could have as much (or more!) impact as using the data that were stolen from the database.

@Nils - as far as making data available to the subject on request, implementation of that seems like it would generally increase the chances of large data breaches, because the data would have to be readily available to comply with requests.

sidd • March 4, 2016 12:18 PM

I second the calls to encrypt data, age it off, and the "data tax." These large repositories of personal data are too attractive, irresistible targets for all manner of criminals and governments.

MikeA • March 4, 2016 12:19 PM

Making all data held accessible to the user is likely to be followed quite quickly by making the data available to whoever hacked the user's Gmail/Facebook/etc. account.

As for making the CTO/CEO personally liable, unless the rules on third-party "waste disposal" are made absolutely perfect, I assume most companies would follow the practice of the well-known home builder that built my first house. They set up a shell company with essentially no assets (once the development was sold) and some nobody as the "owner", so any judgement becomes a bloodless turnip. Isn't this one of the purposes of "cloud storage"?

Erik • March 4, 2016 12:30 PM

Keep It Simple, Stupid. I think that with one tweak we can neatly sidestep the authoritarian vs. libertarian debate on these matters.

IMHO, the root cause of data carelessness is that the default value of PII has been set, by the government court system, at $5 - $10 (legislatures tend to be busier with important things like what people do with their private parts and fighting over whose donors should be more enriched). In reality, the value should be a few orders of magnitude higher. If the costs are set appropriately, then the rest of the problem sorts itself out. Companies that don't secure their data properly will find themselves open to armageddon-level liabilities, and therefore uninsurable, unable to get financing, etc.

This also spares us the inevitable mess of what specific regulations would need to be passed, and how they would stay up to date (let's be honest - they'd be obsolete before they're enacted), what stupid exemptions the lobbyists would carve out, what stupidly impossible overreaches the technically illiterate legislators and regulators try to incorporate, and so on.

Outside of that notion that I think most people would agree on, I have a few less broadly palatable thoughts:

1) The "corporate veil" needs to go. Management should have unlimited liability, and if they're wiped out then the shareholders should pick up the rest of the damages. This was actually how things worked until the mid 19th century. The moral hazard of the existing "one-way revolving door of cash" is probably responsible for the overwhelming majority of bad corporate behavior.

2) This would have to be accompanied by some serious tort reform. Loser pays, plus additional damages for frivolous cases.

Bob Paddock • March 4, 2016 12:37 PM

"Federal Rules of Civil Procedure TITLE V. DISCLOSURES AND DISCOVERY
Rule 26. Duty to Disclose; General Provisions Governing Discovery"

Found at Cornell University Law School Legal Information Institute.

(1) Initial Disclosure.

(A) In General. Except as exempted by Rule 26(a)(1)(B) or as otherwise stipulated or ordered by the court, a party must, without awaiting a discovery request, provide to the other parties:

(i) the name and, if known, the address and telephone number of each individual likely to have discoverable information—along with the subjects of that information—that the disclosing party may use to support its claims or defenses, unless the use would be solely for impeachment;

(ii) a copy—or a description by category and location—of all documents, electronically stored information, and tangible things that the disclosing party has in its possession, custody, or control and may use to support its claims or defenses, unless the use would be solely for impeachment; ...

It comes down to: if it exists, it must be disclosed to the government.

"2028--A Dystopian Story" by Jack Ganssle, April 28, 2008, explains the absurdities that this leads to in "record nothing. Remember nothing".

edge • March 4, 2016 12:37 PM

It's not toxic enough. Financially, the worst they generally suffer is having to buy a bit of credit monitoring for people. And with the immunity that can come with CISA, the toxicity is even lower.

bcs • March 4, 2016 2:18 PM

Up until "We need to regulate what corporations can do with our data ..." I was all in on this. I'd be all for legislation that makes the holders of data personally liable for what happens if it gets out due to mismanagement (even with rather wide definitions of mismanagement). But I really can't get behind a law that bars an action just because it *might* be a security risk, which smacks of legislative security theater. The phrase "prior restraint" comes to mind.

I'd advocate a regime of transferring risk to people in a position to take action and make choices, as well as wide-scale mitigation and resilience efforts. Make it risky enough for the actors and incidents will be rare enough that we can figure out how to live with them.

As an aside, I find it interesting that for this security risk, Mr Schneier is advocating prevention, but for terrorism he advocates mitigation and resilience:
https://www.schneier.com/blog/archives/2012/08/preventive_vs_r.html

k15 • March 4, 2016 2:28 PM

Flip side: those for whom valid data are damaging will have some really nasty incentives.

k15 • March 4, 2016 2:52 PM

If there are orchestras from hell in play and in some, people are being stepped on at particular times to play their note, is there a way this could be unearthed?

Kenneth Cramer • March 4, 2016 3:18 PM

I'm not exactly for making another government regulation. What about simply holding companies that keep data not 100% necessary to conduct their core business financially responsible if that data is stolen?

That means that if a company keeps your credit card data, and it isn't a credit card company, it is 100% responsible for all of the costs related to that data being stolen, including any and all identity theft, and 100% of the total cost to all financial institutions of safeguarding their customers' financial assets, etc.

The entire thing could be handled civilly without the need for massive legislation or yet another round of government expansion or regulation.

If you want to make companies responsible for their actions, you don't need more regulations, you just need to make incompetence more costly.

tyr • March 4, 2016 3:48 PM


One thing that seems to be overlooked is the track record of how we got to the current mess. Businesses and governments were always record keepers, with some justification, no matter how lame or insidious, for their activities. Then you introduce the personal computer into every office in the land, and instead of the paperless office you get a flood of printout no one bothers to read, piled in stacks around the average manager to the point of being a crushing hazard if it topples. But the computer tech improves and disk space increases at a rapid pace. Now the same incompetents who never figured out what a computer was for are suddenly able to keep everything stored. The mad rush was on to collect and never throw anything away. The only thing that had changed was the technology; the mindset of the clerking was still the same. The next leap was to connect everything to another barely understood technology, the Net. Now the tedious incompetents in management, sales, and PR were loosed into a worldwide field where their ignorance became a worldwide danger to themselves and the innocent bystander.

That's the history without all of the sales-glossy pitch of the brave new world.

Trying to maintain security and some small level of sanity in the seas of looniness that surround the holy sacred cows of the computer-illiterate has been a real fun ride. Many here can give testimonials to the trench warfare of trying to stem the tide.

The problem is simply that society has adopted all of this shiny new tech without any appropriate changes in behavior. Until the behaviors change, the problems are not going away.

Bob • March 4, 2016 3:54 PM

"To be sure, it would have been a different company. It would have had less revenue, because it couldn't charge users a monthly recurring fee."

The CC # does not need to be stored for recurring transactions; only the transaction code and related info.
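
A sketch of that pattern (the gateway class and `tok_` token format here are invented stand-ins for a real payment processor's API): the merchant keeps only an opaque token and uses it for the recurring charge, so the card number lives solely in the processor's vault.

```python
import secrets

class PaymentGateway:
    """Stand-in for a payment processor that vaults card numbers."""
    def __init__(self):
        self._vault = {}  # token -> card number, held by the processor

    def tokenize(self, card_number):
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = card_number
        return token

    def charge(self, token, amount):
        card = self._vault[token]  # only the gateway can resolve this
        # (a real gateway would now submit `card` to the card network)
        return {"status": "ok", "amount": amount}

gateway = PaymentGateway()

# At signup: tokenize once, then throw the card number away.
token = gateway.tokenize("4111111111111111")
merchant_db = {"user42": {"billing_token": token}}  # no card number here

# Each billing cycle: charge the token, never the card.
receipt = gateway.charge(merchant_db["user42"]["billing_token"], 19.95)
```

A breach of `merchant_db` then yields tokens that are useless outside the merchant's own gateway account.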

. • March 4, 2016 4:56 PM

I don't agree with the assessment that start-ups are more likely than larger companies to do this.
I think all companies are equally likely, and it is driven by a pure profit motive that applies to every company in a capitalist society.
Besides, it is the larger companies that have bigger budgets, bigger data collection, and bigger risks.
The only aspect that comes into play with start-ups is the regulation.
As far as willingness, motive, and desire, it is the same for all.
Just my 2 cents.

jim • March 5, 2016 4:32 AM

When it comes to data about themselves, corporations normally have some sort of data retention policy. Can the same incentives that make them adopt these policies for their own data make them adopt similar policies for all data?

targetdrone • March 6, 2016 3:01 PM

Similar to @de la Boetie, for the last few years I've been using the analogy of "Radioactive Gold". The data is incredibly valuable, it looks shiny and desirable, well-equipped thieves covet it, and you don't recognize that it's far too dangerous to keep anywhere near you -- until it's too late.

Poohbah • March 6, 2016 6:20 PM

I have always made a distinction in my company between information "assets" (information that has value to the company) and information "liabilities" (information that is required to conduct business [e.g. employee PII & SSNs, cardholder data] and must be protected but has no business value).

My goal has always been to reduce the latter, within the constraints of compliance, to a bare minimum. Since I work in a retail business, the goal is to put cardholder data completely out of the company's reach via point-to-point encryption (P2PE) and tokenization.

Mike Gerwitz • March 7, 2016 12:26 PM

The best way to keep data out of the hands of others is to never provide it to begin with. I was disappointed to see no suggestion of using programs on your own computer rather than Service as a Software Substitute (SaaSS).

When you use online services instead of software on your own computer---many of which are so-called "cloud" services---you put your personal information at risk.

There's a common saying: "there is no cloud; it's just someone else's computer".

Christian Campbell • March 7, 2016 3:23 PM

I recommend using the term 'hoarding' rather than 'saving.'

fajensen • March 8, 2016 10:01 AM

Recent example: DISQUS has been hacked by a group called "Researchgruppen", which is really AFA (Antifascistisk Aktion), part of the militant left. In this case it is "racists*" being outed in their homes, with pictures in newspapers.

The yet-to-be-realised bigger problem is that *every* extremist of *any* brand and flavour, not just the ones "we like", can do the exact same thing!

The moral of the sad story is that everything you do or say via the internet will be recorded, stored forever and used (out of context) to build a case against you (or inspire a lynch mob, as the case may be).

http://www.expressen.se/nyheter/expressen-avslojar/expressen-avpixlar-anonyma-skribenter/

http://computersweden.idg.se/2.2683/1.538270/expressen-artiklar-far-disqus-att-uppdatera

*) In Sweden the term "racist" covers everything from people who are merely sceptical of immigration policies to full-on nazis - except, of course, when the racists are immigrants hating other groups of immigrants. That is ignored because, it is understood, those poor disadvantaged people don't know any better; they are simply not properly Civilised like Us yet (Of course, the Swedes do not get the contradictions in this way of thinking).

Cyborg2237 • March 8, 2016 12:06 PM

@fajensen

In Sweden the term "racist" covers everything from people who are merely sceptical of immigration policies to full-on nazis - except, of course, when the racists are immigrants hating other groups of immigrants. That is ignored because, it is understood, those poor disadvantaged people don't know any better; they are simply not properly Civilised like Us yet (Of course, the Swedes do not get the contradictions in this way of thinking).

That is how it is across the whole West.

In the US, I am not against the general sentiment; I think it is good to be tolerant of "strangers in strange lands". However, when people take this too far and pretend that extreme bigoted sentiments are not there among immigrants, they make a mockery of common sense.

You can sometimes see that here even on this forum.

It is simply a difficult problem.

On one hand, you want to be kind to immigrants in difficult situations. It is too easy for majorities to gather against minorities, and viciously so. But, on the other hand, Europe, and to a lesser degree, the US is swamped with immigrants from extremely bigoted areas of the world the likes of which they have not the slightest idea of.

Sadly, that raises a sort of error condition. The real problem is people are too polarized and operate too much on instinct, too little on reasoning.

Dirk Praet • March 8, 2016 6:55 PM

@ fajensen

In Sweden the term "racist" covers everything from people who are merely sceptical of immigration policies to full-on nazis - except, of course, when the racists are immigrants hating other groups of immigrants. That is ignored because, it is understood, those poor disadvantaged people don't know any better

It has nothing to do with understanding, but everything to do with a phenomenon widespread in Western and Northern Europe called "political correctness" (PC). PC makes it virtually impossible to discuss awkward immigrant-related issues that are just as real as the racism and discrimination directed against them. But I guess this is not entirely the right thread to elaborate on it.

R cox • March 9, 2016 12:43 PM

I know we are talking about data that can be stolen by remotely connecting to a server, but the toxicity of data was well demonstrated in 2002, when Arthur Andersen LLP ceased to exist because it destroyed data it was no longer legally allowed to destroy. This is part of why keeping data is dangerous: once the law requests it, there is great legal risk in destroying it. Beyond the difficulty of keeping data secure, there is the issue of it being used against its owner in legal proceedings.

I also think that end users' expectations are a problem. We should demand to be allowed to set when our texts and tweets and emails are destroyed, be it a month, a year, or seven years, but we don't. As a taxpayer I am morally outraged that the US government has wasted public funds recording calls to my family outside the US, but to think that, possibly after I am gone, it will still be wasting money storing those calls is infuriating. And it is scary that kids are going to have to defend stuff they posted when they were 15 to employers when they are 25.

Dirk Praet • March 10, 2016 7:33 AM

In a somewhat amusing turn of events, the recently much-plagued Islamic State has just experienced its very own Snowden moment, with a defector delivering a memory stick containing personal information on about 22,000 foreign Syria fighters to Sky News and the German federal police.

Data can be a very toxic asset indeed.

Samat Galimov • March 12, 2016 7:15 PM

As far as I know, it is possible to do recurring credit card charges without storing any user-identifying data.

When retailers and processing companies talk to each other, recurring charges are referenced by a unique ID, not by name-and-card-data tuples.

The "less data, less profits" reasoning has little merit when it comes to credit card charges.

Ollie Jones • March 12, 2016 8:03 PM

You said that the Ashley Madison service wouldn't have been able to accept recurring subscriptions had they not retained their customers' payment card information.

That's not technologically true. There are several payment card processing services out there (Stripe, Zuora, etc.) that hold users' confidential payment card data on behalf of various subscription-based software-as-a-service outfits and make the charges for them.

Of course, they hold the sensitive data, and if they get penetrated, the damage could be serious. On the other hand, they have every business incentive to maintain security, and they have the budget and clout to staff highly skilled infosec departments. So there's competent "ice," in William Gibson's word, protecting that sensitive data.

That means their customers, various SaaS outfits, don't have to re-invent the flat tire in the area of payment security.

However, the Ashley Madison company cannot use most of those payment services, because their line of business is considered high-risk.

Paull Ferris • March 13, 2016 2:51 PM

There is another way, although it turns the 'traditional' security model on its head.

There are a number of excellent companies building identity services for the individual, aiming to replace the myriad of service providers all holding a version of your identity attributes. All well and good.

However, who is to say that these companies should be trusted with your personal data (although, since they depend on trust, they at least have to specialise in protecting it)? Moreover, why should providers of services to you trust the data held by these third parties?

If they could, everyone would benefit.

So, operating somewhat under the radar, there is an open source collaboration between some important stakeholders: regulated companies, regulators, NGOs and (of course) individuals.
It's called ObjectChain Collab, and it has built a cooperative architecture where trust is established via any suitable blockchain.

Since an identity owned by an individual must be recognised in law as your own, it is also necessary to have a competitive environment of companies looking after your identity. Building the rails on which this competitive environment can develop and thrive is actually critical to providing the whole solution.

It's interesting, and incredibly rewarding, work. Ultimately, when an alternative exists to each of your service providers holding a subset of your information, we should all be in a position to insist that they divest themselves of the responsibility, and, as Bruce indicates, they will all sigh with relief when they can do this.


Photo of Bruce Schneier by Per Ervland.

Schneier on Security is a personal website. Opinions expressed are not necessarily those of IBM Resilient.