Cybersecurity for the Public Interest

The Crypto Wars have been raging off and on for a quarter-century. On one side is law enforcement, which wants to be able to break encryption to access the devices and communications of terrorists and criminals. On the other is almost every cryptographer and computer security expert, repeatedly explaining that there's no way to provide this capability without also weakening the security of every user of those devices and communications systems.

It's an impassioned debate, acrimonious at times, but there are real technologies that can be brought to bear on the problem: key-escrow technologies, code obfuscation technologies, and backdoors with different properties. Pervasive surveillance capitalism -- as practiced by the Internet companies that are already spying on everyone -- matters. So do society's underlying security needs. There is a security benefit to giving access to law enforcement, even though it would inevitably and invariably also give that access to others. However, there is also a security benefit of having these systems protected from all attackers, including law enforcement. These benefits are mutually exclusive. Which is more important, and to what degree?

The problem is that almost no policymakers are discussing this policy issue from a technologically informed perspective, and very few technologists truly understand the policy contours of the debate. The result is both sides consistently talking past each other, and policy proposals -- that occasionally become law -- that are technological disasters.

This isn't sustainable, either for this issue or any of the other policy issues surrounding Internet security. We need policymakers who understand technology, but we also need cybersecurity technologists who understand -- and are involved in -- policy. We need public-interest technologists.

Let's pause at that term. The Ford Foundation defines public-interest technologists as "technology practitioners who focus on social justice, the common good, and/or the public interest." A group of academics recently wrote that public-interest technologists are people who "study the application of technology expertise to advance the public interest, generate public benefits, or promote the public good." Tim Berners-Lee has called them "philosophical engineers." I think of public-interest technologists as people who combine their technological expertise with a public-interest focus: by working on tech policy, by working on a tech project with a public benefit, or by working as a traditional technologist for an organization with a public benefit. Maybe it's not the best term -- and I know not everyone likes it -- but it's a decent umbrella term that can encompass all these roles.

We need public-interest technologists in policy discussions. We need them on congressional staff, in federal agencies, at non-governmental organizations (NGOs), in academia, inside companies, and as part of the press. In our field, we need them to get involved in not only the Crypto Wars, but everywhere cybersecurity and policy touch each other: the vulnerability equities debate, election security, cryptocurrency policy, Internet of Things safety and security, big data, algorithmic fairness, adversarial machine learning, critical infrastructure, and national security. When you broaden the definition of Internet security, many additional areas fall within the intersection of cybersecurity and policy. Our particular expertise and way of looking at the world is critical for understanding a great many technological issues, such as net neutrality and the regulation of critical infrastructure. I wouldn't want to formulate public policy about artificial intelligence and robotics without a security technologist involved.

Public-interest technology isn't new. Many organizations are working in this area, from older organizations like EFF and EPIC to newer ones like Verified Voting and Access Now. Many academic classes and programs combine technology and public policy. My cybersecurity policy class at the Harvard Kennedy School is just one example. Media startups like The Markup are doing technology-driven journalism. There are even programs and initiatives related to public-interest technology inside for-profit corporations.

This might all seem like a lot, but it's really not. There aren't enough people doing it, there aren't enough people who know it needs to be done, and there aren't enough places to do it. We need to build a world where there is a viable career path for public-interest technologists.

There are many barriers. There's a report titled A Pivotal Moment that includes this quote: "While we cite individual instances of visionary leadership and successful deployment of technology skill for the public interest, there was a consensus that a stubborn cycle of inadequate supply, misarticulated demand, and an inefficient marketplace stymie progress."

That quote speaks to the three places for intervention. One: the supply side. There just isn't enough talent to meet the eventual demand. This is especially acute in cybersecurity, which has a talent problem across the field. Public-interest technologists are a diverse and multidisciplinary group of people. Their backgrounds come from technology, policy, and law. We also need to foster diversity within public-interest technology; the populations using the technology must be represented in the groups that shape the technology. We need a variety of ways for people to engage in this sphere: ways people can do it on the side, for a couple of years between more traditional technology jobs, or as a full-time rewarding career. We need public-interest technology to be part of every core computer-science curriculum, with "clinics" at universities where students can get a taste of public-interest work. We need technology companies to give people sabbaticals to do this work, and then value what they've learned and done.

Two: the demand side. This is our biggest problem right now; not enough organizations understand that they need technologists doing public-interest work. We need jobs to be funded across a wide variety of NGOs. We need staff positions throughout the government: executive, legislative, and judiciary branches. President Obama's US Digital Service should be expanded and replicated; so should Code for America. We need more press organizations that perform this kind of work.

Three: the marketplace. We need job boards, conferences, and skills exchanges -- places where people on the supply side can learn about the demand.

Major foundations are starting to provide funding in this space: the Ford and MacArthur Foundations in particular, but others as well.

This problem in our field has an interesting parallel with the field of public-interest law. In the 1960s, there was no such thing as public-interest law. The field was deliberately created, funded by organizations like the Ford Foundation. They financed legal aid clinics at universities, so students could learn housing, discrimination, or immigration law. They funded fellowships at organizations like the ACLU and the NAACP. They created a world where public-interest law is valued, where all the partners at major law firms are expected to have done some public-interest work. Today, when the ACLU advertises for a staff attorney, paying one-third to one-tenth normal salary, it gets hundreds of applicants. Today, 20% of Harvard Law School graduates go into public-interest law, and the school has soul-searching seminars because that percentage is so low. Meanwhile, the percentage of computer-science graduates going into public-interest work is basically zero.

This is bigger than computer security. Technology now permeates society in a way it didn't just a couple of decades ago, and governments move too slowly to take this into account. That means technologists now are relevant to all sorts of areas that they had no traditional connection to: climate change, food safety, future of work, public health, bioengineering.

More generally, technologists need to understand the policy ramifications of their work. There's a pervasive myth in Silicon Valley that technology is politically neutral. It's not, and I hope most people reading this today know that. We built a world where programmers felt they had an inherent right to code the world as they saw fit. We were allowed to do this because, until recently, it didn't matter. Now, too many issues are being decided in an unregulated capitalist environment where significant social costs are too often not taken into account.

This is where the core issues of society lie. The defining political question of the 20th century was: "What should be governed by the state, and what should be governed by the market?" This defined the difference between East and West, and the difference between political parties within countries. The defining political question of the first half of the 21st century is: "How much of our lives should be governed by technology, and under what terms?" In the last century, economists drove public policy. In this century, it will be technologists.

The future is coming faster than our current set of policy tools can deal with. The only way to fix this is to develop a new set of policy tools with the help of technologists. We need to be in all aspects of public-interest work, from informing policy, to creating tools, to building the future. The world needs all of our help.

This essay previously appeared in the January/February 2019 issue of IEEE Security & Privacy. I maintain a public-interest tech resources page here.

Posted on May 3, 2019 at 4:33 AM

Comments

Alejandro May 3, 2019 6:39 AM

Brilliant discussion and synopsis.

But also scary, because Bruce is light years ahead of our governmental representatives who are quite comfortable being technologically ignorant and/or on the payroll of evil, predatory corporations and persons.

I'll just say it, I am a big fan of personal geo-fencing, tribalization and local distributed networks controlled by the techno-peasantry. Skype everything.

Petre Peter May 3, 2019 7:57 AM

In Click Here To Kill Everybody Mr. Schneier also talks about crossing the streams between technology and policy. This divide has been growing, unfortunately, to the point where we have policy makers bragging about the fact that they do not know how the technology they are supposed to regulate works, and technologists who believe that code is law. I do not know how much of my life should be controlled by technology because I am constantly being bombarded with jingles chanting that what I want is what I need. I am not sure of the need to transform technology from a luxury to a necessity.

Jon May 3, 2019 8:09 AM

At what point do the desires of the criminals, who also want backdoors and access to everything, become indistinguishable from the desires of law enforcement? J.

Alyer Babtu May 3, 2019 8:17 AM

At first I thought, like the Surgeon General, but for computers. But then, medicine relates to a given in nature whereas computerology is more a choice. Maybe we have made bad choices and tech has a disproportionate presence. There are things that only can be done with computers, but we use them far beyond those “necessary” contexts, and are less human for that.

Faustus May 3, 2019 9:36 AM

This post makes me sad. Bruce's support for encryption and privacy is slowly walking the plank:

"There is a security benefit to giving access to law enforcement, even though it would inevitably and invariably also give that access to others."

No, there probably isn't a security benefit to giving law enforcement access. It simply creates a larger threat surface. Giving someone a badge does not turn them into a "good guy". It just gives them the ability to be a super-powered malefactor.

We know that people with access use private information to stalk people, to pursue personal grudges and to search down romantic interests. That people who monitor secure feeds enjoy ridiculing people based on the private information that they access. That security services abuse the law and then erase the evidence. With impunity. And much more. We see example after example of such abuse but we still pretend otherwise.

We don't hear about security theater any more on this blog. What new attacks should cause us to reverse our stance on privacy? If any, how many of these were not perpetrated by our own governments, or initiated by our governments, or at least allowed to proceed despite the government's advance knowledge?

Do such security threats harm even 1/100 of the people that removing the 55mph speed limit in the US did? We have a very selective interest in security. Law enforcement wants to read all our mail but still keep rape allegations against police officers secret. We have almost zero government transparency, and zero police transparency, but these same people demand we trust them with all our private data. I say that this is ridiculous.

There has been righteous talk about freedom and rights on this blog. The privileged have other ways to protect their privacy. Access to private information invariably harms the vulnerable the most. Do we really care about the vulnerable or is our care for our power and pocketbooks?

What was a security guy probably doing in his teens? Criminal hacking. And many security companies have been implicated in hacks and DDOS attacks. But we mostly have to go to Krebs to read about that.

The security industry is problematic. It is served by an increase in attacks. It is served by the fear of largely secret "enemies". It is partially staffed by ex-criminals and some current criminals. It is all about money.

To say that security companies need to be involved in public policy is simply saying that the security industry wants more power. And it appears that our personal privacy is a bargaining chip.

I wonder what John Perry Barlow would say. Is this the direction that the EFF is going? If so, I would say that it has been put up for sale and it is going, going, GONE.

I am not saying the security industry is evil. Nor the police nor the government. They have important roles and many dedicated people. But power corrupts and it will continue to do so. We should have a right to be left in peace. And we have the right to control these organizations. They are supposed to be serving us, not their hunger for power.

JohnnyS May 3, 2019 9:36 AM

One of the big issues here is that you can have all the backdoors you want, with fancy key escrow and all sorts of infrastructure in place so that everything can be read by law enforcement with no problems if they have all the correct warrants, subpoenas, and whatever Star Chamber permission stamps you desire to try to control the access BUT:

1. The basic mathematics of encryption that is (currently) impossible to break in a time frame that would provide law enforcement useful access is well known and published in free-to-access information sources.

2. Any villain can learn how (or pay someone) to program that math into a useful and virtually unbreakable software tool they can use to communicate. Or just use whatever is out there already.

Example: If I have installed the government-approved backdoored communication application called "ObedientCitizenMessenger", and I want to communicate securely to my buddy, I can just use PGP to encrypt my message before sending it. If "They" think I'm a villain and "They" decide to come after me, "They" may be able to use their backdoors to remove the first level of encryption in the app, but all they'll get is the ciphertext that I created. Since my buddy and I never gave anyone our private keys, our communications are perfectly secure even using the ObedientCitizenMessenger application.
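The layering in this example can be sketched in a few lines of Python. This is a toy illustration only: XOR with a random same-length key stands in for both the hypothetical "ObedientCitizenMessenger" transport layer and the users' PGP layer, and `inner_key`/`outer_key` are made-up names, not real APIs. The point it demonstrates is that stripping the backdoored outer layer yields only the users' own ciphertext:

```python
import secrets

def xor_layer(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for one encryption layer: XOR with a same-length key."""
    return bytes(a ^ b for a, b in zip(data, key))

message = b"meet at noon"
inner_key = secrets.token_bytes(len(message))  # held only by the two users (the "PGP" layer)
outer_key = secrets.token_bytes(len(message))  # held by the app -- and, via backdoor, by "Them"

inner_ct = xor_layer(message, inner_key)   # user-applied encryption, before the app sees it
wire_ct = xor_layer(inner_ct, outer_key)   # app-applied transport encryption

# A backdoor yields the outer key, so the transport layer can be stripped...
recovered = xor_layer(wire_ct, outer_key)
assert recovered == inner_ct
# ...but without inner_key, all that remains is the users' ciphertext, not the message.
```

Real PGP uses public-key cryptography rather than a shared pad, but the structural argument is the same: the backdoor only reaches the layer the app controls.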

So what's the point? I'm sure lots of people will get rich selling this backdoored "creepware" and the infrastructure to manage key escrow and such, and they'll make even more money selling the access to whoever wants to read the private messages of law-abiding citizens, but this will do NOTHING to help with accessing the communications among real villains.

More money for the software peddlers, more costs to the public and no more protection from villains. Security theatre at its finest!!

1&1~=Umm May 3, 2019 9:51 AM

@:

"I am not sure of the need to transform technology from a luxury to a necessity."

If a 'luxury' makes life easier then it is the nature of humans to turn it into a necessity one way or another.

For example the motor vehicle in the US went from a luxury to an almost out right necessity in all but a few cities in quite a bit less than a century.

Likewise electronic communications, from telegraph key through to Smart phone that does voice, short messages via SMS, long messages and video and much else besides by Internet connectivity.

If you take a look at the various histories they tend to follow,

1) Luxury for the rich,
2) Toys of BigCorp management,
3) Business tool,
4) Middle class status symbol,
5) Every day usage,
6) Social necessity,
7) Enforced necessity,
8) Replacement by newer technology.

Step 8 is the interesting one, as it tends to start around step 4.

Also as you look down the list if a new technology does not go through stages 2 or 3 then it also tends to stop at stage 4 and in effect remains a luxury item only.

Impossibly Stupid May 3, 2019 10:11 AM

Everyone should note that this is a bump/dup from 2 months ago, with the addition of the new, dedicated web site. My comments from the old post still stand.

@Jon

The better question is whether the cops themselves ever commit criminal acts. There is ample evidence that they do, of course, and that alone is reason enough to deny their desire to go on limitless fishing expeditions, with data or anything else. Only once they excise their own corruption and erase the "thin blue line" can they begin to think about discussing the idea that they should have the ability to backdoor all forms of encryption. But, again, that will remain a laughable request in the face of an OTP and basic math.

1&1~=Umm May 3, 2019 10:27 AM

@JohnnyS:

"2. Any villain can learn how (or pay someone) to program that math into a useful and virtually unbreakable software tool they can use to communicate. Or just use whatever is out there already."

The simplest example has been given on this blog a number of times and that is the One Time Pad (OTP). Most people can learn to use one reliably in half a morning of tuition and the rest of the day reinforcing practice. The resulting ciphertext can be used over any communications channel.
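As a minimal sketch of the idea (Python standard library only, a toy for illustration rather than operational advice), a one-time pad is just XOR against a truly random, never-reused pad at least as long as the message:

```python
import secrets

def otp_encrypt(plaintext: bytes, pad: bytes) -> bytes:
    """One-time pad: XOR each byte with a pad byte. The pad must be truly
    random, at least as long as the message, and used exactly once."""
    if len(pad) < len(plaintext):
        raise ValueError("pad must be at least as long as the message")
    return bytes(p ^ k for p, k in zip(plaintext, pad))

otp_decrypt = otp_encrypt  # XOR is its own inverse

msg = b"attack at dawn"
pad = secrets.token_bytes(len(msg))  # in practice, distributing the pad is the hard part
ct = otp_encrypt(msg, pad)
assert otp_decrypt(ct, pad) == msg
```

The code is trivial on purpose: as the comment above notes, the cipher itself is easy; everything hard about the OTP lives outside the code, in handling the pad material.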

However, easy as it is to use a One Time Pad, it has its issues with the production, transportation, destruction, and control of the Pad Material. This is made worse by the fact that it can be inordinately large, because you have to securely send, in advance, sufficient Pad Material to account for the largest sizes and numbers of messages you need to send in any given time period.

There are ways that you can cheat on production and transportation. However, you lose the 'forever secure' nature of the OTP by effectively reducing it to a Stream Cipher. Thus the security rests on the strength of the Crypto Secure Digital Random Bit Generator (CS-DRBG), which may or may not have weaknesses or backdoors of its own.

Likewise any CS-DRBG, be it a Stream Cipher or Block Cipher, needs some kind of starting point or 'seed' that is truly randomly selected. The usual high water mark for generating 'seed material' is the True Random Number Generator or TRNG. These use "physical randomness", which in theory makes them impossible to predict. The problem is most actually generate very little true randomness and are fragile in operation. Likewise they are easy to influence in many ways. Thus even the TRNG can be backdoored or predicted by a sufficiently well resourced adversary.
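The chain being described -- a physical entropy source seeding a deterministic generator -- can be sketched with a toy hash-based DRBG. This is an illustrative construction, not a vetted one (real systems should use a standardized DRBG, e.g. from NIST SP 800-90A), and `ToyHashDRBG` is a made-up name:

```python
import hashlib
import os

class ToyHashDRBG:
    """Toy deterministic random bit generator: hash-chain a secret seed.
    Every byte it emits is only as unpredictable as that seed."""
    def __init__(self, seed: bytes):
        self.state = hashlib.sha256(seed).digest()

    def random_bytes(self, n: int) -> bytes:
        out = b""
        while len(out) < n:
            self.state = hashlib.sha256(self.state + b"next").digest()
            out += hashlib.sha256(self.state + b"out").digest()
        return out[:n]

seed = os.urandom(32)  # stands in for a TRNG; the weak link if it is biased or backdoored
drbg = ToyHashDRBG(seed)
keystream = drbg.random_bytes(64)

# Determinism: the same seed reproduces the same keystream -- which is exactly
# why a predictable or backdoored seed compromises everything built on it.
assert ToyHashDRBG(seed).random_bytes(64) == keystream
```

The final assertion is the whole point: the generator is deterministic by design, so an adversary who can predict or influence the seed gets the entire keystream for free.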

Thus there are a lot of steps that have to be got right, many of which are not easy to find out about. Such as remembering to have a 'null character' in the alphabet you use, and how to use it properly.

justinacolmena May 3, 2019 10:39 AM

law enforcement wants to be able to break encryption,

There is a certain color of law to that, gun and a badge on the political left.

to access devices and communications of terrorists and criminals

That would be … we the people. Yes, that's us. You and I. They just haven't read us our rights yet.

It's an impassioned debate, acrimonious at times,

It's the emotions of overprivileged, bitter women who press arbitrary, preëmptive criminal and civil charges in court.

We are the people. They don't want us anywhere near their women and children.

Tim Berners-Lee has called them "philosophical engineers." I think of public-interest technologists as people who combine their technological expertise with a public-interest focus

Once again, government is telling us how to think, and enforcing it under the guise of mental health, social hygiene, and behavioral health.

The so-called “public interest” specifically excludes the interest of the defendant in such proceedings at court, and we the people have all been systematically reduced to the status of defendant under the New World Order.

JohnnyS May 3, 2019 11:26 AM

@1&1~=Umm

Agreed entirely. It reinforces the point that "practically" unbreakable encryption is fairly easy to implement, even over channels that authorities may think they can access easily. The advantage of PGP over OTP is that if authorities get one copy of the OTP, they can read all its messages, while with PGP they need multiple private keys to read all the traffic.

When strong encryption is outlawed, only outlaws will have strong encryption.

AL May 3, 2019 4:37 PM

I think encrypted speech is covered under the 1st amendment, and that the first amendment would need to be modified as follows: Congress shall make no law ... abridging the freedom of speech so long as that speech can be understood by the government...

We can talk about what's nice, helpful, good, or bad, but I think it is moot. We need to talk about what is legal. And I think that under the current version of the 1st amendment, Americans have a legal right to speak in code, which is what encryption is.
https://www.mtsu.edu/first-amendment/article/948/encryption

parabarbarian May 3, 2019 5:36 PM

@AL

"Americans have a legal right to speak in code, which is what encryption is"

Yes they do but then again...

The 3D data files to print firearms or parts of firearms are also code. However, the State of Massachusetts is threatening to prosecute anyone who posts them if the files are accessible in Massachusetts. That doesn't invalidate your argument but it does illustrate that the issue is more complicated than just code.

Tõnis May 3, 2019 5:45 PM

@Faustus, LOL seems like a repeat of an article I already replied to. Maybe it's some kind of a warrant canary function for the site ...

Jesse Thompson May 3, 2019 7:01 PM

I think the part of the discussion about Public Interest Technologists that is still really missing here is the concept that no human can reliably tell you what is actually in the public interest.

The founders of this country (The US) relied on a method of checks and balances not only to help ensure that corruption was kept out of public interest, but to define what public interest even meant. The entire goal of democracy is to ensure that all voices are heard and are weighed in some fashion prior to a decision being arrived at.

If we still believe in the democratic process, then we cannot presume to create some layer of field-experts embedded into policy making who will just keep everyone else honest by way of their magical access to some higher moral understanding. We still have to create some process (and if we're being democratic about it, some adversarial process) whereby what each party thinks is moral (including the corrupt ones only seeking to line their own pockets or increase their own influence) get to compete in some arena where the product of compromise is more likely to actually reflect the public interest than any one person's utopian fancies.

Satoshi reached precisely this kind of balance when (t)he(y) created the world's first globally accessible notarization ledger that requires trust in no single authority to participate in or to verify.

In short, we cannot hope to simply dominate myopic or greedy actors by sheer force of will.. or even by sheer force. Our only hope is to devise ever better systems that use this self-interest as fuel in competitions that drive more public interests.

Of course that's not easy, but it's certainly not any harder than trying to wage war of direct resistance against actors who do things against what we personally perceive to be moral.

Erdem Memisyazici May 4, 2019 3:08 AM

Well said. Security needs to be an open and a well taught concept. We seem to have decided to make security professionals into quiet secretive groups and sort of hold their knowledge to their chest since the industry is designed to be for profit. There needs to be the open side of that coin as well where security experts are publicly and actively involved in policy. No methodology in security should be a big secret, or at least if it is, it's more likely to be not very secure at all. Take the infamous copyright enforcement algorithms at hand, most hackers already know how they work so that being a company secret only makes that knowledge more profitable so in turn it receives more interest from hackers. Anybody remember the commercials where Apple claimed to have no viruses on their platform? How long did it take for that to change? When you make an encryption algorithm for example, the idea is to make the method public, the key is what provides the entropy to protect that data. That's real security. If you build a weakness into that algorithm and just keep that quiet, you've just given the whole thing a deadline to be absolutely useless.

Impossibly Stupid May 4, 2019 11:23 AM

@Jesse Thompson

I think the part of the discussion about Public Interest Technologists that is still really missing here is the concept that no human can reliably tell you what is actually in the public interest.

That's not entirely true. The biggest problem is that older systems of government, including democracy, are all pre-scientific, and they haven't really been updated to follow an evidence-based process. Great progress could be made in not just establishing a sensible notion of "public interest", but in all areas, if only politicians accepted reality rather than trying to use the laws to enforce some self-interested fantasy they have.

If we still believe in the democratic process

I'd rather follow the scientific process (which, really, is more along the lines of the "adversarial process" you describe). But it, too, requires an educated population in order to function properly. That's where the work needs to be done, but the systems that are promoted or already in place seem geared to keep most people in the dark.

Rob Braxman May 4, 2019 12:17 PM

I'm one of these technologists and a privacy evangelist. It's frustrating because as much as I speak out in public, the biggest stumbling block is that most of the people are sheep and don't care.

If the sheep don't care, then politicians aren't going to care. Someday I will be heard by many more. In the meantime, Zuckerberg has a permanent watcher on my broadcasts, making sure he knows what I'm saying.

Jonathan Wilson May 4, 2019 5:58 PM

I have yet to see a real-world threat (even some of the worst terrorist attacks and mass murders out there or a rogue state like North Korea building nuclear weapons) that (in my opinion) is serious enough to justify allowing law enforcement and intelligence agencies to gain backdoor access to encrypted communications if such access would in any way compromise the security of people other than the intended target.

Sed Contra May 4, 2019 6:02 PM

@Impossibly Stupid

follow an evidence-based process

Recommended for your perusal in regard to the dystopian side of that approach

Ben Winters, Golden State
Dave Eggers, The Circle

Or, just Francis Bacon The New Atlantis, or Plato Republic.

Impossibly Stupid May 5, 2019 5:57 PM

@Sed Contra

Recommended for your perusal in regard to the dystopian side of that approach

I'm not sure what part of any of those fictional works is relevant when it comes to realistic efforts towards adding modern scientific approaches to government policies; please state directly what issues/warnings you think are applicable.

I would agree, though, that science alone is not the only tool that should be used to establish our values and principles as a society. Even the pre-scientific philosophies of Plato can shape how we use science, but it should at least be clear by now that empirical methods are going to yield better results for everyone than purely magical thinking. There are plenty of low-hanging fruit, like using it to extend the system of checks and balances to the laws themselves, that should be something that could be considered without a great deal of controversy.

Sed Contra May 5, 2019 8:22 PM

@Impossibly Stupid

realistic efforts towards adding modern scientific approaches

Is there anything modern about science, especially if “realistic” is a criterion? “Modern” science is only scientific in the restricted sphere of investigation through quantity. Even here there are questions whether it is science, since it is mostly model based and does not work through nature, that is, uses arbitrary and hence non-scientific starting points.

Politics, justice, the good for the other, and for all, is not quantitative in its nature. “Evidence based” today seems to mean quantitative modelling. Plato, Aristotle etc. were more truly evidence based, and as far as the subject matter allows scientific, because their approach was to start with and look at real things in terms of their natures. The pre-scientism and ”magical thinking” is on the modern side rather. As Aristotle says, each kind of knowing has an exactitude appropriate to it that has to be honored. Politics, ethics etc. are only capable of being grasped at the level of good opinion and are not the subject of a science.

The imaginative literary accounts I referred to above speculate on the tendency to tyranny that arises when inappropriate evidentiary methods are enforced.

If somehow a marvelously good political system were to be hit on, copying it and using it as a prescriptive set of rules would become tyranny. The continual active exercise by people of their political virtue and prudence is the only way to maintain freedom.

mike May 6, 2019 5:35 AM

@Jesse Thompson wrote, "The founders of this country (The US) relied on a method of checks and balances not only to help ensure that corruption was kept out of public interest, but to define what public interest even meant. The entire goal of democracy is to ensure that all voices are heard and are weighed in some fashion prior to a decision being arrived at."

Quite evidently they have failed.

Impossibly Stupid May 6, 2019 4:32 PM

@Sed Contra

Is there is anything modern about science especially if “realistic” is a criterion?

Anyone who isn't an anti-science wingnut need only look back at the history of progress to see that the last few hundred years with scientific advancement have been remarkably better for remarkably more people than the millennia before without it. And that's especially true for the last hundred years.

“Modern” science is only scientific in the restricted sphere of investigation through quantity.

If that's what you think, then your science education has been very soft. Even just limiting myself to the topic of discussion here, good science in cybersecurity and law would involve well-understood mechanisms of action, provable causation, and graceful degradation/exception handling. You worry about quantities only after you get the fundamentals right.

"Plato, Aristotle etc. were more truly evidence based, and as far as the subject matter allows scientific, because their approach was to start with and look at real things in terms of their natures."

Utter rubbish. They did a lot more navel-gazing philosophy than objective observation of nature. They championed nonsense like humors for medicine; earth, water, air, and fire for physics; and a geocentric Universe. Given the amount of ready evidence against your position, I have to assume you're just trolling at this point.

Ancient "wisdom" is nothing to revere. Even early scientific thinking involved taking a lot of shots in the dark. But that's what's great about science. Notions that are correct gather more evidence and greater support. Notions that are wrong are, and this is a critical component, accepted as wrong. A lot of society's ills continue to be caused by people who are objectively wrong, and yet refuse to accept that simple fact.

"The imaginative literary accounts I referred to above speculate on the tendency to tyranny that arises when inappropriate evidentiary methods are enforced."

Again, you'll have to be more explicit on what you mean, because nobody rational is going to be swayed by a literary device adopted by an author simply as a means to tell a scary story. Yes, tyrants will seek to undermine scientific findings that contradict their self-centered agendas. But that's nothing new; it's already happening every day. It's exactly why I'm proposing to strengthen the system to make it less vulnerable to their petty whims.

"If somehow a marvelously good political system were to be hit on, copying it and using it as a prescriptive set of rules would become tyranny. The continual active exercise by people of their political virtue and prudence is the only way to maintain freedom."

And I say that, if the only way to water the tree of liberty is with blood (as Jefferson posited), your political system is insufficiently advanced. Just as Einstein built on and corrected Newton, I think it's high time we took a thoughtful look at doing the same to Jefferson's notions of democracy. Doing so in accordance with 21st Century science will get us a lot farther than rejecting it.

michael a • May 6, 2019 5:30 PM

"There's a pervasive myth in Silicon Valley that technology is politically neutral. It's not, and I hope most people reading this today knows that. We built a world where programmers felt they had an inherent right to code the world as they saw fit. We were allowed to do this because, until recently, it didn't matter. Now, too many issues are being decided in an unregulated capitalist environment where significant social costs are too often not taken into account."
This is an impressive capture of the present 'root problem'. I wish I had thought of it.

Sed Contra • May 7, 2019 9:09 AM

@Impossibly Stupid

"more navel-gazing philosophy than objective observations of nature"

I used to have pretty much the same opinion. But closer acquaintance and reading the writings of Plato and especially Aristotle changed my view. The erroneous ideas you mention don’t really vitiate Aristotle’s approach. And we doubtless have similarly bad ideas today which later ages will laugh at.

John Beattie • May 7, 2019 1:58 PM

"On one side is law enforcement, which wants to be able to break encryption, to access devices and communications of terrorists and criminals."

Has anyone actually asked the security services if this is so?

Suppose I turn up with super technology such that the security services could break the crypto of terrorists and criminals but not that of law-abiding citizens. Suppose further that we work through some practical use-cases, remembering such principles as innocent until proved guilty and so on: would the representatives of the security services really say that that was adequate?

On the contrary, I think that what is actually wanted is to be able to break _any_ crypto but with some procedural assurances to restrict when it would be done.

Further, and perhaps more importantly, the requirement includes an ask for some assurance that no-one _else_ could break any crypto.

Samuel Johnson • May 9, 2019 10:59 PM

Maybe I’m wrong, but with regards to encryption, isn’t the cat out of the bag? The math is already published. The working code for the big modern algorithms is on GitHub. Like PZ publishing the code for PGP in the eleventh hour before the criminalization of strongish cryptography. Governments can mandate that all providers hand over a key to all messages, but steganography makes under-the-radar communication possible for those who care.

Maybe not all malefactors will use best-practice tradecraft. Joaquin “Chapo” Guzman shouldn’t have relied on the cell phone network and BlackBerry. If he had used a VPN with Signal, or posted pretty pictures to Instagram with the ciphertext hidden in the low-order bit of each pixel, he would have stayed out of prison longer. “Stubby” hacked groups of botnet hackers because they used poor password hygiene. Diligence and a technical tour de force will still be required to hide the metadata, but I think steganography has real possibilities.
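The hidden-in-the-pixels idea the comment gestures at is least-significant-bit steganography. A minimal sketch, assuming a toy grayscale "image" represented as a flat list of 0–255 pixel values (the function names here are illustrative, not any real tool's API):

```python
# LSB steganography sketch: each message bit replaces the low-order bit
# of one pixel, so no pixel value changes by more than 1.

def embed(pixels, message):
    """Return a copy of pixels with the message's bits in their LSBs."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for message")
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit   # clear the LSB, then set it
    return out

def extract(pixels, length):
    """Recover `length` bytes from the low-order bits of pixels."""
    data = bytearray()
    for b in range(length):
        byte = 0
        for i in range(8):
            byte |= (pixels[b * 8 + i] & 1) << i
        data.append(byte)
    return bytes(data)

cover = list(range(256)) * 4            # stand-in for grayscale pixels
stego = embed(cover, b"meet at dawn")
assert extract(stego, 12) == b"meet at dawn"
assert all(abs(a - b) <= 1 for a, b in zip(stego, cover))
```

The point the commenter makes survives the toy: the cover image is visually indistinguishable from the original, and only someone who knows where to look can recover the message (encrypting it first hides even that).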

If the Department of Justice or the Judicial branch is listening: I expect privacy when I communicate electronically, and I want the Fourth Amendment to apply.

1&1~=Umm • May 10, 2019 10:03 AM

@Samuel Johnson:

"Maybe I’m wrong, but with regards to encryption, isn’t the cat out of the bag? The math is already published. The working code for the big modern algorithms is on GitHub."

Not wrong, just not seeing the larger security issues.

There is an old saying about something 'being true in theory but not in practice'. AES as an algorithm is, as far as we currently know, secure. But as a bare algorithm it is nigh on useless to all but the few who make and break algorithms, where the numbers still currently favour the SigInt agencies of superpowers and the wealthier nations that can afford what is, in effect, a luxury.

To be of use to most people, the AES algorithm has to be implemented in practical systems, where it forms only a very small part. It is used within other algorithms that give us crypto modes, and within yet further algorithms that we more normally call protocols and standards.
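A toy illustration of why the mode wrapped around the cipher matters as much as the cipher itself (this is an assumed example, with a keyed hash standing in for AES so it runs with only the standard library): a perfectly strong block cipher used in ECB mode still leaks plaintext structure.

```python
# ECB-mode weakness demo: each 16-byte block is encrypted independently,
# so identical plaintext blocks yield identical ciphertext blocks.
import hashlib

KEY = b"sixteen byte key"

def toy_block_encrypt(key, block):
    # Stand-in for a strong block cipher (NOT real AES): keyed hash,
    # truncated to one 16-byte ciphertext block.
    return hashlib.sha256(key + block).digest()[:16]

def ecb_encrypt(key, plaintext):
    assert len(plaintext) % 16 == 0, "toy demo: no padding implemented"
    return b"".join(toy_block_encrypt(key, plaintext[i:i + 16])
                    for i in range(0, len(plaintext), 16))

pt = b"ATTACK AT DAWN!!" * 2 + b"RETREAT AT DUSK!"
ct = ecb_encrypt(KEY, pt)
blocks = [ct[i:i + 16] for i in range(0, len(ct), 16)]

# An eavesdropper sees the repetition without knowing the key:
assert blocks[0] == blocks[1] and blocks[0] != blocks[2]
```

The "cipher" here is never broken; the information leaks from the layer wrapped around it, which is exactly the Matryoshka point the comment goes on to make.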

Thus AES is like the tiny Matryoshka doll hidden away inside its larger siblings: to be secure, all the siblings that enclose AES must be secure, and all too often they are not. As with the weakest link in a chain under load, the security of a system under attack rests on the weakest layer as the attacker sees it.

Can the designer of a secure system see all the layers as an attacker does? The answer is simply 'no', for a whole variety of reasons. Thus it's fairly safe to say that any system implementing AES, or any other security algorithm for that matter, outside of the simplest, is virtually guaranteed to have flaws unknown to the system designers. These flaws may or may not have been known to others before or after the designers built the system.

The AES algorithm, when implemented for maximum throughput, is generally not at all secure, as such implementations open up a whole heap of timing and other side channels through which information can and does leak fairly drastically.

As a system designer you can only "design out" weaknesses you are aware of, either as individual instances or as classes of weaknesses.

It should be clear to most who care to look that we actually know less about weaknesses, as instances and classes, than we should. That is, more new types become known each year than the year before. Or, to put it another way, the number of new vulnerabilities is rising, not falling...

As a system designer you quickly realise that standard development methods cannot in any way cope with that problem, because the number of patches required in any given time is, as a consequence, rising as well. Such a situation is unsustainable at best.

Thus other methods have to be used. As has been pointed out on this blog in the past, you have to 'manage complexity' as well as 'mitigate risk'. After a little thought you realise that "segregation and gapping" are the way to go. It's what the NSA used to do with all their secure systems, before politicians issued mandates about using the 'Commercial Off-The-Shelf' suppliers of their 'friends'.

Whilst segregation has issues, and does not readily lend itself to compact systems, all of the methods brought up on this blog actually require it to have even a remote chance of being secure.

The problem, as those making faux claims of 'going dark' well know, is that most people are so addicted to the convenience of compact devices that they have given up nearly every opportunity they had for security.

Thus the depressing reality is that, as citizens, we have in general sold our birthright for a few grains of useless glitter and glitz.

TIARA GNOME • May 14, 2019 3:12 AM

At this moment the world is relatively stable. Thousands of people have good jobs hacking away and all is well.

But, God forbid, one big terror event or a war between major states, and we can kiss our data security and encryption goodbye, and say hello to Gestapo 2.0.

What happens with data security will be determined by future events. Even if there are no major wars or Nagasaki-style terror events, the decline of the West is going to make Western policies less important.



Schneier on Security is a personal website. Opinions expressed are not necessarily those of IBM Resilient.