Security in 2020: Revisited

Ten years ago, I wrote an essay: "Security in 2020." Well, it's finally 2020. I think I did pretty well. Here's what I said back then:

There's really no such thing as security in the abstract. Security can only be defined in relation to something else. You're secure from something or against something. In the next 10 years, the traditional definition of IT security -- that it protects you from hackers, criminals, and other bad guys -- will undergo a radical shift. Instead of protecting you from the bad guys, it will increasingly protect businesses and their business models from you.

Ten years ago, the big conceptual change in IT security was deperimeterization. A wordlike grouping of 18 letters with both a prefix and a suffix, it has to be the ugliest word our industry invented. The concept, though -- the dissolution of the strict boundaries between the internal and external network -- was both real and important.

There's more deperimeterization today than there ever was. Customer and partner access, guest access, outsourced e-mail, VPNs; to the extent there is an organizational network boundary, it's so full of holes that it's sometimes easier to pretend it isn't there. The most important change, though, is conceptual. We used to think of a network as a fortress, with the good guys on the inside and the bad guys on the outside, and walls and gates and guards to ensure that only the good guys got inside. Modern networks are more like cities, dynamic and complex entities with many different boundaries within them. The access, authorization, and trust relationships are even more complicated.

Today, two other conceptual changes matter. The first is consumerization. Another ponderous invented word, it's the idea that consumers get the cool new gadgets first, and demand to do their work on them. Employees already have their laptops configured just the way they like them, and they don't want another one just for getting through the corporate VPN. They're already reading their mail on their BlackBerrys or iPads. They already have a home computer, and it's cooler than the standard issue IT department machine. Network administrators are increasingly losing control over clients.

This trend will only increase. Consumer devices will become trendier, cheaper, and more integrated; and younger people are already used to using their own stuff on their school networks. It's a recapitulation of the PC revolution. The centralized computer center concept was shaken by people buying PCs to run VisiCalc; now it's iPads and Android smartphones.

The second conceptual change comes from cloud computing: our increasing tendency to store our data elsewhere. Call it decentralization: our email, photos, books, music, and documents are stored somewhere, and accessible to us through our consumer devices. The younger you are, the more you expect to get your digital stuff on the closest screen available. This is an important trend, because it signals the end of the hardware and operating system battles we've all lived with. Windows vs. Mac doesn't matter when all you need is a web browser. Computers become temporary; user backup becomes irrelevant. It's all out there somewhere -- and users are increasingly losing control over their data.

During the next 10 years, three new conceptual changes will emerge, two of which we can already see the beginnings of. The first I'll call deconcentration. The general-purpose computer is dying and being replaced by special-purpose devices. Some of them, like the iPhone, seem general purpose but are strictly controlled by their providers. Others, like Internet-enabled game machines or digital cameras, are truly special purpose. In 10 years, most computers will be small, specialized, and ubiquitous.

Even on what are ostensibly general-purpose devices, we're seeing more special-purpose applications. Sure, you could use the iPhone's web browser to access the New York Times website, but it's much easier to use the NYT's special iPhone app. As computers become smaller and cheaper, this trend will only continue. It'll be easier to use special-purpose hardware and software. And companies, wanting more control over their users' experience, will push this trend.

The second is decustomerization -- now I get to invent the really ugly words -- the idea that we get more of our IT functionality without any business relationship. We're all part of this trend: every search engine gives away its services in exchange for the ability to advertise. It's not just Google and Bing; most webmail and social networking sites offer free basic service in exchange for advertising, possibly with premium services for money. Most websites, even useful ones that take the place of client software, are free; they are either run altruistically or to facilitate advertising.

Soon it will be hardware. In 1999, Internet startup FreePC tried to make money by giving away computers in exchange for the ability to monitor users' surfing and purchasing habits. The company failed, but computers have only gotten cheaper since then. It won't be long before giving away netbooks in exchange for advertising will be a viable business. Or giving away digital cameras. Already there are companies that give away long-distance minutes in exchange for advertising. Free cell phones aren't far off. Of course, not all IT hardware will be free. Some of the new cool hardware will cost too much to be free, and there will always be a need for concentrated computing power close to the user -- game systems are an obvious example -- but those will be the exception. Where the hardware costs too much to just give away, however, we'll see free or highly subsidized hardware in exchange for locked-in service; that's already the way cell phones are sold.

This is important because it destroys what's left of the normal business relationship between IT companies and their users. We're not Google's customers; we're Google's product that they sell to their customers. It's a three-way relationship: us, the IT service provider, and the advertiser or data buyer. And as these noncustomer IT relationships proliferate, we'll see more IT companies treating us as products. If I buy a Dell computer, then I'm obviously a Dell customer; but if I get a Dell computer for free in exchange for access to my life, it's much less obvious whom I'm entering a business relationship with. Facebook's continual ratcheting down of user privacy in order to satisfy its actual customers -- the advertisers -- and enhance its revenue is just a hint of what's to come.

The third conceptual change I've termed depersonization: computing that removes the user, either partially or entirely. Expect to see more software agents: programs that do things on your behalf, such as prioritize your email based on your observed preferences or send you personalized sales announcements based on your past behavior. The "people who liked this also liked" feature on many retail websites is just the beginning. A website that alerts you if a plane ticket to your favorite destination drops below a certain price is simplistic but useful, and some sites already offer this functionality. Ten years won't be enough time to solve the serious artificial intelligence problems required to fully realize intelligent agents, but the agents of that time will be both sophisticated and commonplace, and they'll need less direct input from you.

Similarly, connecting objects to the Internet will soon be cheap enough to be viable. There's already considerable research into Internet-enabled medical devices, smart power grids that communicate with smart phones, and networked automobiles. Nike sneakers can already communicate with your iPhone. Your phone already tells the network where you are. Internet-enabled appliances are already in limited use, but soon they will be the norm. Businesses will acquire smart HVAC units, smart elevators, and smart inventory systems. And, as short-range communications -- like RFID and Bluetooth -- become cheaper, everything becomes smart.

The "Internet of things" won't need you to communicate. The smart appliances in your smart home will talk directly to the power company. Your smart car will talk to road sensors and, eventually, other cars. Your clothes will talk to your dry cleaner. Your phone will talk to vending machines; they already do in some countries. The ramifications of this are hard to imagine; it's likely to be weirder and less orderly than the contemporary press describes it. But certainly smart objects will be talking about you, and you probably won't have much control over what they're saying.

One old trend: deperimeterization. Two current trends: consumerization and decentralization. Three future trends: deconcentration, decustomerization, and depersonization. That's IT in 2020 -- it's not under your control, it's doing things without your knowledge and consent, and it's not necessarily acting in your best interests. And this is how things will be when they're working as they're intended to work; I haven't even started talking about the bad guys yet.

That's because IT security in 2020 will be less about protecting you from traditional bad guys, and more about protecting corporate business models from you. Deperimeterization assumes everyone is untrusted until proven otherwise. Consumerization requires networks to assume all user devices are untrustworthy until proven otherwise. Decentralization and deconcentration won't work if you're able to hack the devices to run unauthorized software or access unauthorized data. Decustomerization won't be viable unless you're unable to bypass the ads, or whatever the vendor uses to monetize you. And depersonization requires the autonomous devices to be, well, autonomous.

In 2020 -- 10 years from now -- Moore's Law predicts that computers will be 100 times more powerful. That'll change things in ways we can't know, but we do know that human nature never changes. Cory Doctorow rightly pointed out that all complex ecosystems have parasites. Society's traditional parasites are criminals, but a broader definition makes more sense here. As we users lose control of those systems and IT providers gain control for their own purposes, the definition of "parasite" will shift. Whether they're criminals trying to drain your bank account, movie watchers trying to bypass whatever copy protection studios are using to protect their profits, or Facebook users trying to use the service without giving up their privacy or being forced to watch ads, parasites will continue to try to take advantage of IT systems. They'll exist, just as they always have existed, and -- like today -- security is going to have a hard time keeping up with them.

Welcome to the future. Companies will use technical security measures, backed up by legal security measures, to protect their business models. And unless you're a model user, the parasite will be you.

My only real complaint with the essay is that I used "decentralization" in a nonstandard manner, and didn't explain it well. I meant that our personal data will become decentralized; instead of it all being on our own computers, it will be on the computers of various cloud providers. But that causes a massive centralization of all of our data. I should have explicitly called out the risks of that.

Otherwise, I'm happy with what I wrote ten years ago.

Posted on February 7, 2020 at 12:50 PM • 20 Comments

Comments

Clive Robinson • February 7, 2020 5:08 PM

@ All,

Sometimes you have to remember,

    To look forward you have to first look back, to know the future you have to know the past.

The reason is many ideas "are well before their time" as @Bruce notes,

    In 1999, Internet startup FreePC tried to make money by giving away computers in exchange for the ability to monitor users' surfing and purchasing habits. The company failed, but computers have only gotten cheaper since then. It won't be long before giving away netbooks in exchange for advertising will be a viable business.

We may not really be there yet, as hardware, unlike software, suffers from the "distance costs" metric, but we do get sent what to many look like useful freebies, such as "USB thumb drives" given away free with marketing logos on them... Oh, and "software" we really don't know about (like bitcoin mining). People don't learn the lesson of more than two millennia: "Look a gift horse in the mouth."

But how about really looking back in ICT history, further even than ten years, even further than most readers here have been alive, and possibly further back than any have ever touched an electronic keyboard. That is, going back half a century or more,

https://fosdem.org/2020/schedule/event/early_unix/

    Many of today's fads, like microkernels, hypervisors, multiprocessing and user mode execution actually happened early on in Unix's history, long [before] they were today's fads. "What's old is new again" has never been so apt.

That "What's old is new again" underlies my prediction for the future, especially in ICT security. Not just next year, nor five years, nor a decade but for as long as development out paces memory.

There are two things to really note about history and ICT Sec,

1, Any physical world trick has an information world equivalent.

2, We don't remember or learn from ICT history.

So even the oldest of tricks, like an arthritic dog balancing a ball on its nose whilst on its hind legs, can appear as a new trick to most, again and again and...

Why do I say trick rather than crime? Well, the simple truth is that many physical world crimes have not yet crossed over to the information world. Thus, as nobody has done it that way yet, it's not yet a crime... Even though legislators have written legislation for ICT about as broadly as they can, to try and "catch all", the reality is that man is inventive, and people will find things that the legislators have not covered.

As one of the more hated men in this century effectively observed,

1, There are known knowns.
2, There are known unknowns.
3, There are unknown unknowns.

Taleb calls the last of these a "Black Swan", and wrote a book about it, which has no doubt "locked up a large amount of carbon" in the process ;-)

But the point is for some reason ICT Sec stumbles over "Known Knowns" almost all of the time, whilst pretending that they are "Unknown Unknowns"...

Sadly, for many they really are "Unknown Unknowns", because they did not study ICT Sec history, or worse, probably nobody offered to teach them it.

Which is another reason for my prediction: if we don't teach people history, then they will be unpleasantly surprised way more often than they should be.

Lawrence D’Oliveiro • February 7, 2020 6:29 PM

Regarding deconcentration -- just about all those special devices are in fact built around general-purpose computers. And they are increasingly running general-purpose OSes like Linux. So all they are doing is hiding this general-purpose programmability behind a limited interface. And there have always been hackers trying to probe behind this interface to unlock the computing potential behind it.

Trouble is, quite a few of the companies making these products have a business model that depends rather crucially on their products remaining locked down. And so they manage to get laws passed to make it illegal to open them up, even if it is the owner of the device doing so.

And so the age-old question re-emerges: if I buy something, do I own it?

Trurl • February 7, 2020 10:34 PM

Over the next 10 years, I predict (or hope) the decentralization will reverse. It is just too difficult to be in charge of your own data. Maybe Microsoft builds another phone OS. Something to break the proprietary model. They failed before because they could not compete with an already established App Store. But they didn't understand that users want an open phone. That was their strength and they didn't use it. If they could load Windows 10 on a phone they'd win.

I don't like decentralization. I must have gotten lost in the process, because I have no way but email and a USB cord to move things from phone to tablet to computer. In 10 years phones have given us great communication tools, but content creation and control of the device have suffered.

me • February 8, 2020 5:34 AM

d-p14n
c13n
d-c12n
d-c11n
d-c13n
d-p11n
= f10n of:
fancyisation
deperimeterization
consumerization
decentralization
deconcentration
decustomerization
depersonization

all these d-c1xn are hard. maybe we should call them instead:

dc|2n
dc|1n
dc|3n

which also provides more l6s l33tness
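The abbreviations above follow the same rule as i18n/l10n: first letter, count of interior letters, last letter. A quick sketch of that rule (the function names are mine, and the "de-" prefix split mirrors the comment's own convention):

```python
def numeronym(word: str) -> str:
    """Abbreviate a word as first letter + interior-letter count + last letter,
    e.g. "internationalization" -> "i18n"."""
    if len(word) < 4:
        return word
    return f"{word[0]}{len(word) - 2}{word[-1]}"

def de_numeronym(word: str) -> str:
    """Split off a leading "de-" before abbreviating, as the comment does."""
    if word.startswith("de"):
        return "d-" + numeronym(word[2:])
    return numeronym(word)

for w in ["deperimeterization", "consumerization", "decentralization",
          "deconcentration", "decustomerization", "depersonization"]:
    print(de_numeronym(w))  # d-p14n, c13n, d-c12n, d-c11n, d-c13n, d-p11n
```

This reproduces the comment's list exactly, which is a decent sanity check on its letter-counting.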

Jakub Narębski • February 9, 2020 5:34 AM

    Some of the new cool hardware will cost too much to be free, and there will always be a need for concentrated computing power close to the user -- game systems are an obvious example -- but those will be the exception.

Actually, there now exist game streaming services, where the actual game runs in the cloud. Examples include Nvidia's just-launched GeForce Now (which connects to existing game stores like Steam or GOG), Google Stadia, which arrived in November, Sony's PlayStation Now, and Microsoft's upcoming Project xCloud (in beta).

Clive Robinson • February 9, 2020 6:09 AM

@ Jakub Narębski, ALL,

    Actually, there are now exists game streaming services, where the actual game runs on the cloud.

Whilst that is true, they cannot get around the laws of nature, which means "latency" is always an issue, irrespective of bandwidth or data coding methods.

It's the reason the likes of "High Frequency Traders" will pay millions one way or another to have the shortest path possible to the "exchange server".

Think • February 9, 2020 11:01 AM

From my perspective, I see Microsoft's or Google's cloud services hiccup, and thousands of small and large enterprises lose their productivity software while internal support costs skyrocket, only because the end users don't know why all their stuff doesn't work -- they all call their internal IT for help: "I can't get my email or spreadsheet or chat program or my data off the cloud."

A class failure, like Bruce wrote about in a recent book.

Lots of for-profit cloud-based applications running within 'free' suboptimal browsers. Those browsers running enterprise-critical applications as add-ins.

The browsers or the antivirus or the O/S are updated, and a business-lethal mix of incompatibility takes down your browser-based mission-critical app. Ouch.

Never mind the constant updates of cloud-based SaaS button and menu placement changes that confound the end user to no end. Changes that used to happen every few years now happen every few weeks.

This is a clear example of software companies' profit models trumping interoperability.

MikeA • February 9, 2020 11:06 AM

@Clive in re latency.

I suspect you are dismissing "frog boiling". Back when I was designing arcade video games, we strove for 1-frame (16-20 millisecond) latency from control change to visible on screen. Sometimes we had to settle for 2-frame, but were mightily ashamed to hit 3 (50-60 msec). Today's monitors often inject that much, even before the game hardware gets its contribution in. But early on, PC games often ran at a 10 Hz frame rate, and folks "just got used to it". The pipelines have gotten more complex (and flaky) as time goes by. Even prerecorded video can have audio a second or two before (or after) the corresponding video, like a badly dubbed B-movie. Apparently, there are few enough people who care to make a difference to the bottom line. Pretty much the same as the erosion of privacy and security (and fitness for purpose) across computing in general. Time to see if you can get out of the hot tub.
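For readers counting along, the frame-to-millisecond arithmetic here is just frame count divided by refresh rate; a trivial sketch (the refresh rates are illustrative):

```python
def frames_to_ms(frames: int, hz: float) -> float:
    """Latency contributed by a whole number of frames at a given refresh rate."""
    return frames * 1000.0 / hz

print(frames_to_ms(1, 60))  # one frame at 60 Hz: ~16.7 ms
print(frames_to_ms(1, 50))  # one frame at 50 Hz (PAL): 20.0 ms
print(frames_to_ms(3, 60))  # three frames at 60 Hz: 50.0 ms
print(frames_to_ms(1, 10))  # an early 10 Hz PC game: 100 ms per frame
```

That last figure is why a whole second of end-to-end lag in a modern pipeline represents dozens of frames, not one or two.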

Someone recently asked me about the biggest difference between computers today and 30 years ago. It took me very little time to come up with my answer: they weren't so overtly hostile to their users back then. Companies used to spend resources on developing products that mostly worked. Now they spend them on newer dark patterns. Better ROI.

Clive Robinson • February 9, 2020 2:31 PM

@ MikeA,

    Someone recently asked me about the biggest difference between computers today and 30 years ago.

People occasionally wonder why I still use MS DOS 5 and WordStar 4, or another WordStar-compatible editor or IDE. The answer almost always surprises them: "lack of latency". Also surprisingly, my Apple ][ from the 1970s, with a 1MHz CPU and 64k of RAM (with language card), actually has a faster keyboard-to-screen response in an editor than any computer I've used since, and it "makes for a comfortable experience", unlike the pain of anything beyond MS Win XP and Office from the 1990s. Even though it's an odd-looking beast, my Amstrad PPC640[1] from a decade later, with an 8088-compatible NEC CPU running at 8MHz and a green/yellow LCD screen, runs better -- way better -- using the Mirror II modem software with its built-in WordStar-key-bindings editor than any MS Windows word processor/editor program I've had the misfortune to use (oh, and the PPC640 could also run an early Borland IDE/compiler).

    ... and folks "just got used to it"

Not this old greybeard; to me latency is important, to others maybe not so much. But the point remains that the latency of the electronic round-trip journey from London to Washington State, USA is very, very noticeable, and that's just using a vt220-compatible SSH terminal to access a high-end *nix box... Oh, and much more irritating than the delay on phones, but... not quite as bad as a seven-hour flight over the perpetually snowy wastes of the north with a 24-hour layover in Redmond in between ;-)

[1] https://en.m.wikipedia.org/wiki/PPC_512

justina • February 9, 2020 4:50 PM

Re: "dissolution of the strict boundaries between the internal and external network"

Somewhat on topic. https://https.cio.gov/ I kind of have to give the link as it is, sorry about the autoloading.

There's a corporate C*O-type boss out in left field somewhere on Capitol Hill in Washington, DC.

It's some sort of long-lived super-cookie intended to improve the security of the web consumer's browsing experience on compliant websites, in an environment of cache-control issues with the Google Chrome browser and of transparent proxies on the internet backbone that play tricks such as grabbing https pages and serving them to consumers as http, without the security.

Also known as "HSTS" which has a certain vulgar double entendre.
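For anyone unfamiliar with it: HSTS (HTTP Strict Transport Security, RFC 6797) is delivered as a response header that tells the browser to refuse plain-http connections to that host for a stated period -- the browser-cached flag is the "super-cookie" the comment refers to. A typical policy header looks like this (the max-age value here, one year in seconds, is illustrative):

```http
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
```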

Jesse Thompson • February 9, 2020 5:22 PM

@MikeA

Lower latency means more predictable headshots.

Fortnite didn't become a billion-dollar juggernaut in just 14 days or so by offering half a second of control-to-screen latency. Nor half a second from shot to "are they really dead", or from ducking behind cover to "am I really dead".

Your 16-20 ms latency matters for single-player timed puzzle games, shoot-em-ups, and multiplayer same-screen games. Today that would include Flappy Bird, Cuphead, bullet hell, Shovel Knight, etc.

Half-second lag may be tolerable for turn-based puzzle games, JRPGs, etc. Candy Crush, Angry Birds, Plants vs. Zombies -- lag means little to these casual titles. Old-school Final Fantasy, Golden Sun, Chrono Trigger, (Paper) Mario RPG -- those can survive some heavy lag as well before players will really become frustrated.

So streaming gaming services will be quite limited to these non-timed sorts of games, which incidentally do NOT benefit from the kinds of rich graphics that would require a streaming gaming service to run in the first place.

FPS is the responsiveness middle child, though: 100 ms latency may satisfy most casual players, and only the competitive esports folk can really tell the difference between ping times shorter than that.

So I don't think this is frog-boiling at all, it's just shifts in expectation when the game is no longer housed in a closed cabinet with a quarter-timer ticking relentlessly away. ;)
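The tiers in this comment can be sketched as a rough lookup; the thresholds below are my paraphrase of the comment's numbers, not measurements:

```python
# Approximate tolerable input-to-screen latency by genre, per the comment above.
LATENCY_BUDGET_MS = {
    "twitch/arcade": 20,    # shoot-em-ups, bullet hell, Cuphead: ~1 frame
    "competitive fps": 50,  # esports players notice anything beyond this
    "casual fps": 100,      # the "responsiveness middle child"
    "turn-based": 500,      # JRPGs, puzzle games, Candy Crush
}

def streamable(genre: str, round_trip_ms: float) -> bool:
    """Crude check: does a streaming service's round trip fit the genre's budget?"""
    return round_trip_ms <= LATENCY_BUDGET_MS[genre]

print(streamable("casual fps", 80))     # an 80 ms round trip is fine here
print(streamable("twitch/arcade", 80))  # but blows the arcade budget
```

The point of the sketch is the asymmetry: the genres that tolerate streaming latency are mostly the ones that least need streamed GPU horsepower.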

name.withheld.for.obvious.reasons • February 9, 2020 11:02 PM

@ Clive

    People wonder occasionaly why I still use MS DOS 5 and WordStar 4 or other WordStar compatible editor or IDE.

    ...

    Also eyebrow raising surprisingly my Apple ][ from the 1970's using a 1MHz CPU and 64k RAM (with language card)

Hardly an eyebrow here, though I have moved on to 6.22.

Hardware terminals: my VTs work just great and I have spare CRTs. The tactile response and firmness make all the difference. The hardware is more modern, an i486/33 with 16M of RAM and an Orchid video card for running X apps locally. It is tasked as a server and does just fine, on an AMT board, quite beautiful in fabrication, with quality components all around. Ever ogled long and hard at an SGI motherboard and platform? How pretty.

So ya got me beat, Clive, and I am glad there is someone who can don a hat of foil construction with honour and pride. Me, the pocket protector keeps me safe.

Clive Robinson • February 10, 2020 5:00 AM

@ name.withheld...,

    Me, the pocket protector keeps me safe.

Is that to store an HP programmable calculator in? ;-)

All jokes etc aside, the simple fact is old hardware was better made and more reliable because it was "business", not "consumer", grade design and manufacturing. Made before the likes of Michael Dell started saturation-bombing businesses with, at best, consumer internals in business cases. I'll give him his due though: he had realised that Moore's law would mean upgrades around the time the warranty ended, so why bother to design for any longer lifetime.

But there is the security aspect of "keeping safe": old hardware is way less complex and way harder to hide things in. Because there was no spare silicon or magnetic "real estate", and nothing was "too small to see", even trying to build something in would have been easily found. So they did not backdoor that part of the supply chain.

Thus old hardware is easier to secure if you wish to be secure.

The problem I have these days is that CPUs and high-end microcontrollers are all too easy to hide hardware backdoors in, by design or accident, as Meltdown etc have shown. And System-on-a-Chip (SoC) devices destined for Internet of Things (IoT) or "smart" products are so cheap that they are displacing older, less complex microcontrollers, which are getting harder to get hold of.

Therefore, as the modern high-complexity SoCs are destined for IoT / smart devices, I assume that if there are deliberate hardware backdoors being put in chips, these are the ones it will happen to. But with even clothes irons having SoCs with Bluetooth/WiFi comms capability in them, designing secure systems is getting to the point of "box bashing", "reclocking I/O", and stopping stuff coming "back up the output pipes" like the proverbial rat :-(

It's got to the point now where, when something odd appears on the "goods inwards" test jig for even the cheapest of transistors, you don't automatically assume "duff batch"...

Adrian • February 10, 2020 12:21 PM

What you call "decentralization," I've always called "re-centralization." The PC revolution allowed users to pull their data off the corporate mainframes so they could control it and process it however they desired. "The cloud" is not a fluffy, widely distributed, amorphous repository, but a collection of about three virtual datacenters tightly controlled by companies interested in concentrating everyone's data as much as they can.

I was hopeful peer-to-peer techniques would solve the any-screen issue, but there's no business model there. The cloud miners have told us again and again that centralization of our data is the only way to provide these features.

Tim • February 11, 2020 10:57 AM

@Matt
    Computers didn't get anywhere near 100 times as powerful.

That's one thing that sounded odd to me in the essay. Would Moore's law actually have predicted that to happen?

Clive Robinson • February 12, 2020 2:57 AM

@ Tim, Matt,

    Would Moore's law actually have predicted that to happen?

Well yes and no. Intel have morphed Moore's Law quite a few times.

But the rough estimate is "doubles every 18 months", or every year and a half. So for a decade, a quick calculation of,

2^(10/1.5) = 101.593667...

Which I would say is "fairly close" ;-)
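The back-of-envelope checks out; in Python:

```python
# Doubling every 18 months over ten years: 10 / 1.5 ≈ 6.67 doublings.
doublings = 10 / 1.5
factor = 2 ** doublings
print(factor)  # ~101.59, matching the figure above
```

Whether transistor density actually translated into a 100x increase in useful computing power over 2010-2020 is, as Tim and Matt note, a separate question.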

Drive-By Idealogue • February 13, 2020 2:18 PM

@adr

    I was hopeful peer-to-peer techniques would solve the any-screen issue, but there's no business model there. The cloud miners have told us again and again that centralization of our data is the only way to provide these features.

My (debatably paranoid) assessment has long been a strong suspicion of a kind of loose/grand/whatever conspiracy/consensus/whatever by the big e$tablishment companies to (relatively) persecute home server utilizers precisely because such persecution leads to the effective stifling of that set of competing business models.

@bs

    My only real complaint with the essay is that I used "decentralization" in a nonstandard manner, and didn't explain it well. I meant that our personal data will become decentralized; instead of it all being on our own computers, it will be on the computers of various cloud providers. But that causes a massive centralization of all of our data. I should have explicitly called out the risks of that.

    Otherwise, I'm happy with what I wrote ten years ago.

I (perhaps mistakenly) had always interpreted Bruce Schneier's position here as implying that the decentralization of that data would happen to "a cloud" of computers, but with enough competitors in that "cloud space" that the end result need not be characterizable as "a massive centralization of all of our data". The reason for my interpretation was a simple view of, e.g., text/image/multimedia publishing completely under user control -- as far as access control (public/direct-to-individual/subsets) lists, data replication (load balancing/redundancy), etc. Specifically with data replication: the ability to have 100 friends with computers on this 'internet' thing, and choose 10 that you trust enough to serve your data from their computers (servers/workstations/Raspberry Pis/etc), and maybe another 10 that you trust enough to hold fully encrypted offsite backups. And thus, with those 20, to be able to publish on the internet without the assistance of any commercial third party. And guessing that general liberty and evolution would lead most to eventually choose free and open source server software to accomplish this, though no doubt with closed-source server solutions competing in the same space.

Many years have passed, and I am prepared to meet the grave seeing that what I saw as inevitable did not come to pass in my lifetime. But it still seems obvious and inevitable to me. But a lot of other weird stuff is at play -- POTUS Trump, etc. I also wonder about Bruce Schneier's participation in the InternetOfThings meme.

Denis Goddard • March 9, 2020 1:33 PM

I just wanted to say, reading all you folks’ comments gives a warm feeling to this grizzled old programmer. Back in the early oughts I was complaining that computers were faster but UIs slower, and that they would rip my VT100 and fetishized TRS-80 Model III keyboard from my cold, dead, extremely quick and dexterous hands.
