The Moral Dimension of Cryptography

Phil Rogaway has written an excellent paper titled "The Moral Character of Cryptography Work." In it, he exhorts cryptographers to consider the morality of their research, and to build systems that enhance privacy rather than diminish it.

It is very much worth reading.

EDITED TO ADD (12/15): Good interview with Rogaway.

Posted on December 3, 2015 at 2:49 PM • 28 Comments

Comments

L. W. Smiley • December 3, 2015 6:38 PM

I've only made a very small inroad into this paper. I wanted to use the word modicum; it was in my mind before starting, and there it appeared on the 1st or 2nd page. Who should have access to strong encryption? Ordinary individuals, or should it be the sole province of powerful organizations and governments? Transparency for our lives, and veils for corporations and national security cloaking the sins of government. Science and technology can always cut both ways. I think it matters more for whom mathematicians, scientists, and engineers choose to do their work than what type of work it is. Choosing an employer is always a moral and economic dilemma, and a trade-off between those two considerations. We enable our masters. I think they'll be happy to hamstring encryption products just enough to keep them difficult to use, and out of the hands of the majority of everyday people.

Finishing this paper is on my reading list, but I've got several in the queue. I finally just started reading Applied Cryptography, which an acquaintance highly recommended a few years ago — but those protocols, and that Mallory is a bad, bad person. At the top of the list, though, I'm gonna teach myself quantum logic (thanks, von Neumann, who hated the Russians and had no qualms about whom he worked for; I don't think he joined the Russell-Einstein anti-war crew).

¬□A = ◇¬A
¬◇A = □¬A
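The duality above (¬□A ≡ ◇¬A, and its mirror) is the modal analogue of De Morgan's laws, and it can be brute-force checked over small Kripke frames. A quick illustrative Python sketch (my own, not from the thread — the helper names are made up for the example):

```python
from itertools import product

def box(world, rel, val):
    """'Necessarily A' at world: A holds in every accessible world."""
    return all(val[w] for w in rel[world])

def diamond(world, rel, val):
    """'Possibly A' at world: A holds in some accessible world."""
    return any(val[w] for w in rel[world])

worlds = [0, 1, 2]
# Enumerate every accessibility relation (9 bits) and valuation (3 bits)
for rel_bits in product([False, True], repeat=9):
    rel = {w: [v for v in worlds if rel_bits[3 * w + v]] for w in worlds}
    for val_bits in product([False, True], repeat=3):
        val = dict(zip(worlds, val_bits))
        neg = {u: not val[u] for u in worlds}  # valuation of ¬A
        for w in worlds:
            assert (not box(w, rel, val)) == diamond(w, rel, neg)
            assert (not diamond(w, rel, val)) == box(w, rel, neg)
print("duality holds on all 3-world Kripke models")
```

This only exhaustively checks 3-world models, of course; the general proof is a two-line unfolding of the quantifiers (¬∀ = ∃¬).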

tyr • December 3, 2015 7:09 PM


I think you'll find John a lot more complicated than
any superficial view gives him credit for if you dig.

L. W. Smiley • December 3, 2015 8:02 PM

@tyr - With a mind that deep, I'm sure he was complex.

Einstein went to work in the patent office, while Fritz Haber went to work making chemical weapons. Choices choices.

Justin • December 3, 2015 9:07 PM

@tyr, L.W. Smiley

I don't know John from Adam. If that's his name, he'll have to introduce himself here. But I guess that isn't likely to happen, because Johnny wants his privacy.

@all

And why is privacy always the moral direction when it comes to cryptography? The other side of the coin is cryptanalysis, and without cryptanalysis there is nothing but a false sense of security.

This guy is promoting cryptographic systems that purport to protect privacy, and he preaches as if it were a sin to cryptanalyze such systems. In other words, "my system is secure because it would be immoral to break it, and none of my colleagues will help you do that."

Do you really believe this guy, or is it perhaps a better moral imperative simply to research cryptography (like any other subject) to better discover the truth and let the chips fall where they may? Especially given that others who study it (in secret no doubt) will have no moral qualms about the direction the research will lead them?

L. W. Smiley • December 4, 2015 2:55 AM

@Justin

Thanks for the symbols. I was looking for them. When are they going to make LaTeX easy on the web? You don't know JvN from Adam?

"Anyone who considers arithmetical methods of producing random digits is, of course, in a state of sin."
-John von Neumann
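Von Neumann's quip is about determinism: any purely arithmetical generator is a fixed function of its seed, so its output is only pseudo-random. A minimal sketch of the classic linear congruential generator illustrates this; the constants are the well-known Numerical Recipes values, chosen here purely for illustration:

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m  # scale to [0, 1)

# Same seed, same "random" digits: the sequence is fully deterministic.
gen1, gen2 = lcg(42), lcg(42)
first = [next(gen1) for _ in range(5)]
second = [next(gen2) for _ in range(5)]
assert first == second
```

That reproducibility is exactly why such generators are useful for simulation and useless, on their own, as a source of true randomness — hence the "state of sin."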

Marcos • December 4, 2015 6:20 AM

I'm pretty sure von Neumann cared a lot about whom he worked for. He just had a moral compass that didn't align with yours (or mine).

He also did not seem to hate the Russians; it's just that they are made of atoms that... oops, sorry, that explanation is for another rational agent.

Scott • December 4, 2015 9:42 AM

@Justin

I don't think the essay says anything about cryptanalysis being bad. You can do cryptanalysis for the common good (e.g., improving standards), as opposed to doing it so the government can snoop on its citizens.

AJWM • December 4, 2015 10:44 AM

as opposed to doing it so the government can snoop on its citizens.

While it may not apply in much of the world, here in the US at least that should (but in fact may not) read "so the government can snoop on its employers".

Justin • December 4, 2015 12:52 PM

@Scott

I don't think the essay says anything about cryptanalysis being bad. You can do cryptanalysis for the common good (e.g., improving standards), as opposed to doing it so the government can snoop on its citizens.

I'm all for making university research publicly available. But once you publish your research in such a technical field, your political opinions are irrelevant, because anyone who reads it can use it for any purpose they see fit.

You almost seem to be advocating that researchers should take a position of, "We know how the government can snoop on its citizens, but we're not going to tell you." Sure. Many such researchers work for the government. But meanwhile organized criminals, advertisers, foreign spooks, and pimps are conducting their own cryptanalysis, thereby gaining a great deal of power and control over the world, and other branches of the government among other legitimate users are depending on that cryptography at their own peril.

There is no such thing as crypto privacy heaven. When you pierce the veil of it, or even lift up the corner a little bit, you see nothing but the seamy side of life.

Justout • December 4, 2015 2:32 PM

@Justin

and he preaches as if it were a sin to cryptanalyze such systems.

I can't see where he preaches that in the paper. Can you provide any pointer?

In other words, "my system is secure because it would be immoral to break it, and none of my colleagues will help you do that."

https://en.wikipedia.org/wiki/Straw_man

Jan W • December 4, 2015 2:51 PM

This is one of the more important papers at the moment. It is a pity that it contains a lot of technical material that makes it less accessible to the non-technical reader. But his main message — that privacy is important not only for the individual but especially for society as well — should be made more public. I do hope this article will be read more widely, also outside the realm of the cypherpunks or the cryptographers.

Curious • December 4, 2015 3:33 PM

I can't help thinking that this "international" scope of things is really a scandalous thing to attack, so to speak, yet it is a big problem that is inherently difficult to resolve, given that there just isn't any international entity — only various state actors that seem to do as they please.

The "international scope" that I had in mind, are the kind of overreaching state power that meddle with people as they go about their business on the internet, and then there are corporate power that meddle with this in turn, tracking users and siphoning personalized data to be sold to some third party as I understand it.

As for state actors, I am tempted to sketch out one basic problem with having a global internet community in which the internet has been, or will be, weaponized by state actors and their supporters (corporations, I guess). Any debate involving such interests and their supporters (governments) should imo be barred from happening, because nation states on a general basis could be thought of as inherently abusive and utterly untrustworthy. Last I heard, UK's GCHQ has given/gotten themselves a license to hack and intrude on everyone and everywhere on a global scale. Not to mention documents from Snowden that iirc show GCHQ having strategic goals of interfering with people's lives ("make something happen in the real world").

As for corporations and their influence on a global community of people on the internet, their trustworthiness is imo low: they very likely have for-profit goals, while having possibly NO altruistic motives beyond making money or increasing their market power. I find it interesting to think that if only corporations could be dealt with in such a way as to provide products without a play of superficial politics (which imo can't be trusted), then perhaps we could get somewhere with corporations. The question then is: do corporations have what people want? Presumably, people should not be left to sit around and let Google, or any browser maker, or any ISP for that sake, just make cosmetic changes to their products.

In addition to having an attitude, I think having all this knowledge about how the internet works and what is happening to it is really important.

I'd like to think that not-for-profit organizations are better at handling and organizing larger efforts that can counteract things or get things going, assuming their efforts aren't subverted by state actors.

Clive Robinson • December 4, 2015 6:03 PM

@ Curious,

The question then is: do corporations have what people want?

The answer is no, and that is not going to change any time soon.

The reason is "the problem space complexity", that is what is possible and conversly what is not possible is beyond by far the greater percentage of computer literate people's ability to grasp. That is not to say that any particular part or possability is difficult to grasp, in general it's not, it's the vastness of the number of possibilities and more importantly their interactions that can not be easily grasped.

Thus most people get "drip fed" information by "interested parties" that is the likes of Corporates and Governments make available information biased in the direction these entities want to take people.

It's one of the reasons why for instance browser security is so lousy and is not likely to change for the better.

However there is a secondary battle that few appear to realise is going on: the use of "IP Rights" for censorship. One of the ideals of the Internet was the free flow of information and ideas "for the common good". This is anathema not just to corporates but to governments as well, and they do what they can to prevent information being freely available.

The problem western governments have is their hypocrisy over censorship. They want censorship with a greed most cannot understand. In the past, with few if any small publishers of information, censorship was fairly easily achieved (see UK D or DORA Notices; they were just "requests" but generally treated by publishers as "commandments from god").

That "pact" did not survive the commercial preasure small publishers brought to bear against their larger brethren. Thus governments are comming up with new pacts based on IP legislation. One of recent times is the idea to "copyright links", that is you can be prevented from linking to an article in various ways, including on the spurious argument that the article contains even a tiny fraction of what a large legal dept considers infringes there enployers IP. Thus any article you might chose to put up can quickly be asigned to "the dark web" but the renediation can be both protracted and expensive. Perversely this actually favours the likes of Google and gives them greater control of what most mortals can see. Further the asymmetry of the process makes "might is right" appear the primary process driver, and as most goverments have close relationships with the "might" out of self interest a hint or suggestion may not even be required to the might with aspirations to be "King Makers".

However it gets worse, to the point where "Public Health" may be threatened. Common foodstuffs contain poisons, many of which are easily dealt with if and only if you have the required knowledge — for instance red kidney beans, cassava, the "green" patches on some root vegetables, and the leaves of others such as rhubarb. Because "terrorists" are assumed to want to know about such poisons --see "green potato and nicotine"-- the likes of David Cameron's cohort in the UK want to make access to such material either difficult or illegal... Thus what my granny and mother regarded as useful knowledge is now going to be grounds for suspicion of being a terrorist... It's already happened with one TV chef giving out a recipe that would result in you ingesting a poison --albeit at a low level-- that could result in organ failure. It's enough to make George Orwell seem like an eternal optimist.

L. W. Smiley • December 4, 2015 6:41 PM

@ Clive Robinson, all

Is it true that you can go to jail for a long time just for linking to WikiLeaks? I just read that about Barrett Brown in The Intercept and his Wikipedia bio. I guess it goes to information censorship — many avenues. What next, making linking to a journalist's article a federal crime? We're skiing down that muddy slope real fast now.

Mark Mayer • December 5, 2015 6:29 PM

@Justin
I'm not sure if you're intentionally lying about the paper, didn't bother to read it, or (if you did read it, but you aren't intentionally lying) just don't have the ability to understand it.

If you're not lying, give it a go. Skimming won't help you much; you need to engage with this material. Making shit up about what you think the author is saying or what you want others to think he is saying isn't cutting it.

Justin • December 5, 2015 8:03 PM

@Mark Mayer

I will ignore your accusations of lying. From the paper:

I call for a community-wide effort to develop more effective means to resist mass surveillance.

That's fine. He is free to do that. And others who publish research in cryptography will inevitably be of assistance in that or any such related goal, no matter what their political opinions are. Red teams as well as blue teams.

But that's really a red herring. He's calling for such a community-wide effort among cryptographers. Cryptographically, most of the hard problems seem to be solved already in favor of Alice and Bob's privacy and security, although, theoretically, that could change at any time due to new mathematical discoveries, etc.

But the real elephant in the room, the thing that prevents the public from having such effective means of resistance against mass surveillance, is the perennially poor general quality and security of software, operating systems, and hardware. Posters such as Nick P have tried to bring up so-called "high assurance" methodologies, and there are apparently current, real-life, even open-source, viable examples of such, but programmers in general do not seem interested in making such radical changes (which would admittedly necessitate a great deal of additional time and effort) to improve the quality of their work.

So we are stuck with the sorry state of software security, and consequently mass surveillance (whether by governments, pimps, law enforcement, spies, advertisers, drug dealers, organized crime bosses, insurance companies, stalkers, PIs, or any other consumers of "big data") has, and will retain, the upper hand for the foreseeable future.

Luke • December 5, 2015 9:10 PM

@ Justin

There really should be two parts to "big data": the app and the data (and then there is throughput, which is really part of "the app"). By data I mean data at rest, which includes the massive amount of metadata that we generate about ourselves thanks to apps like WhatsApp, Facebook mobile, and even plain browsers. But really the metadata I'm talking about is what we post about ourselves to various social media sites, plus the passive metadata that gets generated on the go.

So the two parts suggest that blaming soft issues on the apps is only part of the solution. The real problem is the massive amount of metadata that we willingly, and sometimes knowingly, generate about ourselves and that gets housed.

Justin • December 5, 2015 9:55 PM

So the two parts suggest that blaming soft issues on the apps is only part of the solution. The real problem is the massive amount of metadata that we willingly, and sometimes knowingly, generate about ourselves and that gets housed.

So I want to read a book on topic XYZ without criminal agency PQR drawing a dubious conclusion that I'm a member of class LMN whom they like to rip off. They don't sell it in the brick-and-mortar bookstore. What should I do? Go buy a prepaid debit card with cash somewhere, use TOR, set up a fake identity with a mail drop somewhere, retrieve the package when it comes. All of that raises suspicions among various legitimate and not-so-legitimate authorities, so it will get investigated, and all the more data will be generated, which in turn will not be kept private.

Sure. Use cash for day-to-day transactions, but automatic license-plate recognition, facial recognition, and RFID are spreading everywhere now. Various levels of government collect this information but don't know or care how to keep it out of the hands of thieves, stalkers, pimps, hookers, drug dealers, PIs etc. etc.

How do you interact with modern society without generating massive amounts of data? Just consume the same stuff that other people consume, do and say the same things that other people do and say, and try not to draw attention to yourself? Massively limiting to one's freedom.

Without secure apps built on a secure system with secure hardware (na ga happen) there is no effective way we can limit the massive amount of metadata that we "willingly, and sometimes knowingly" generate.

Mark Mayer • December 6, 2015 12:04 AM

@Justin
I didn't accuse you of lying, I suggested it as one of three possibilities.

I also suggested that you make a closer read of the material instead of skimming or stopping partway. Because you haven't, I think I can reasonably accuse you of being sloppy and slapdash. Don't take it personally. I'm often that way, too.

ianf • December 6, 2015 12:42 AM


@ Justin: […] “the real elephant in the room, that prevents the public from having effective means of resistance against mass surveillance, is the perennially poor general quality and security of software, operating systems, and hardware.”

Undeniably that plays a rôle, but, surely you meant to write that that elephant's first, middle and last names are »Lack of awareness and understanding of cumulative dangers of unbridled surveillance leading to the citizenry not waking up until the police state is a done deal.« (I know, some people go gushy-gaga when it comes to baby elephants' names! — not as cute as cats, but their #memetime will come.)

Mark • December 6, 2015 2:36 AM


Loved the paper, and the slide shows were a big hit for Xmas.

Started reading Data and Goliath....

It's just over 114 days till our DECO-DSGL penalties come into effect, covering encryption here in Australia.

CallMeLateForSupper • December 6, 2015 10:37 AM

@Mark
Thanks for the heads-up re: Australia's DECO-DSGL brain-f__t. It looks very bad. I can't explain how it evaded my personal radar.

Justin • December 6, 2015 6:29 PM

@ ianf

»Lack of awareness and understanding of cumulative dangers of unbridled surveillance leading to the citizenry not waking up until the police state is a done deal.«

You could also say:

“Fascism should more appropriately be called Corporatism because it is a merger of state and corporate power”
― Benito Mussolini

The provenance of this quote is extremely doubtful, but there is a worldwide tendency toward what it describes. Big Data is available on us, and anyone with sufficient money or power has access to it.

Mark • December 7, 2015 12:12 AM

@CallMeLateForSupper

That's OK, it's only up to 10 years in jail and half a million in fines if you're late with the permit.

And the permit process is full of ambiguities, just like the "chicken and egg" scenario.

Which comes first?

The thought, the concept, the thought, the method, the thought, the math, the thought, the code, the thought, the permit, the thought?


Photo of Bruce Schneier by Per Ervland.

Schneier on Security is a personal website. Opinions expressed are not necessarily those of IBM Resilient.