Programmers Who Don't Understand Security Are Poor at Security

A university study confirmed the obvious: if you pay a random bunch of freelance programmers a small amount of money to write security software, they're not going to do a very good job at it.

In an experiment that involved 43 programmers hired via the Freelancer.com platform, University of Bonn academics have discovered that developers tend to take the easy way out and write code that stores user passwords in an unsafe manner.

For their study, the German academics asked a group of 260 Java programmers to write a user registration system for a fake social network.

Of the 260 developers, only 43 took up the job, which involved using technologies such as Java, JSF, Hibernate, and PostgreSQL to create the user registration component.

Of the 43, academics paid half of the group with €100, and the other half with €200, to determine if higher pay made a difference in the implementation of password security features.

Further, they divided the developer group a second time, prompting half of the developers to store passwords in a secure manner and leaving the other half to store passwords in their preferred method, forming four quarters: developers paid €100 and prompted to use a secure password storage method (P100), developers paid €200 and prompted (P200), developers paid €100 and not prompted (N100), and developers paid €200 and not prompted (N200).

I don't know why anyone would expect this group of people to implement a good secure password system. Look at how they were hired. Look at the scope of the project. Look at what they were paid. I'm sure they grabbed the first thing they found on GitHub that did the job.

I'm not very impressed with the study or its conclusions.

Posted on March 27, 2019 at 6:37 AM • 55 Comments

Comments

Tatütata • March 27, 2019 7:22 AM

... showing that developers don't inherently think about security when writing code.

Isn't this rather a demonstration of management failure? I.e., the specification is incomplete or non-existent, but the "Auftraggeber" (the client) implicitly expected the underling to make up for this.

Hex encoding is still not the worst possible option.

At a place I worked at, a critical application logged all mainframe interactions in a file visible to all users on the workstation. This included user credentials (why???), which were synchronised with the general Microsoft authentication system. Whoever wrote this garbage knew it wasn't quite a good idea, as the password was "encrypted": Caesar cipher with an offset of one, which isn't quite nearly as secure as ROT13.

Management was full of themselves with so-called PRINCE2 methodology, and shelves of binders full of charts and procedures, but in the end, the code grinder in Bucharest or Mumbai was on his own, and the finished product was the real specification. I didn't have access to the source code, and reading decompiled Java byte code only made me dumber. I did have a copy of the holy grail, a sheet of paper photocopied by generations of monks from a medieval manuscript describing the fundamental business logic and transaction validation rules.

I don't think that paying a big management consultancy a 5 or 6 digit amount would have produced a much better result than the €100 or €200 charity raffle.

I'm interested in a new title from MIT Press, Adam Barr's The Problem with Software: Why Smart Engineers Write Bad Code, but from the blurb and the few available excerpts I'm afraid I might be disappointed.

Tatütata • March 27, 2019 7:33 AM

Several years ago I began writing RESTful applications, for which I had to deal with the OAuth2 authorization mechanism. Trying to understand what I had to do, and why, I looked it up, and found scathing criticisms of this "standard", and of how it had allegedly been watered down by industry from the first version. Basically, industry didn't want anything too complicated.

IIRC, the main gripe with OAuth2 was that it implicitly used TLS for a purpose for which it wasn't designed. The exchange of access credentials, which are sent en clair over the link, is directly dependent on the presence of transport security, i.e., a valid certificate chain.

So the fish rots from the head, as they say.

Cigaes • March 27, 2019 7:35 AM

For me, this article mostly shows a spectacular failure at science and statistics by the authors of the study: they dare publish something with a sample size of less than eleven!

Me • March 27, 2019 8:51 AM

Seriously? They expected someone to put forth the effort of security for $200?

If I were a freelancer I doubt I'd care enough to do it right either (unless I got asked enough that I had a ready-made solution).

sari yono • March 27, 2019 8:55 AM

This is what happens when a company goes "I can get this software for €100; you are crazy quoting me €10,000 for something this simple." In-house developers have actual risks of losing their jobs; freelancers do not. The faster they can ship something out, the more money they can make.

Petre Peter • March 27, 2019 9:20 AM

Unlike doctors, programmers do not have to take an oath. Therefore, there is a strong chance that programmers will do the simplest thing that will work and blame the rest on requirements.

scot • March 27, 2019 9:21 AM

I just got back from the University of Tulsa's cybersecurity conference, and the main theme there seemed to be that you _must_ seriously push security at all levels, all the time, or bad things will happen. While the general speakers were fairly optimistic about things (ironically, this included FB's CSO, days after they leaked 200 million plaintext passwords), the speakers on the technical track (the USAF's head of security, several former CIA) were a lot more pessimistic about the current state of things. Clearly someone in the IDF read "Ender's Game", because they seem to be taking the "find them as young as possible, and train them aggressively and early" approach. Not quite pre-school level, but more than one speaker did mention that to truly prepare users for information security, you have to start teaching them as early as possible, so that the security mindset is fully internalized. The insurance industry might end up being the ones to really start pushing security standards, since they will want to ensure that those they insure against security related losses are actually following good standards, and they will be the ones with the most to lose.

Alex • March 27, 2019 10:19 AM

What a horrible study. Kinda makes a mockery of actual scientific research. I understand the push for more security inclined programming off the bat, but this seems like a heavy-handed way to deliver that message.

Duck and Cover • March 27, 2019 10:58 AM

What,

I'm sure they grabbed the first thing they found on GitHub that did the job.

What "Codecut"?

Is that not plagiarism at the very least, if not the crime of "passing off"?

Surely not.

1&1~=Umm • March 27, 2019 11:17 AM

@Tatütata:

"Caesar cipher with an offset of one, which isn't quite nearly as secure as ROT13."

My goodness no, my manager told me Caesar is way more secure than ROT13, and they couldn't be wrong: they get paid squillions more than me.

They said, you have to do Caesar 26 times in a row to get back to the 'plaintext' with ROT13 only twice.

Oh, that and apparently Microsoft uses ROT13 as the default in the registry on every Windows OS installed as well...

Joking aside, that last bit is true,

https://www.forensicfocus.com/Forums/viewtopic/t=7641/

And some really cannot believe it,

https://blog.didierstevens.com/2006/07/24/rot13-is-used-in-windows-you’re-joking/

So he went ahead and wrote a program to demonstrate it.
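The joke above is easy to check in a few lines. This is my own illustrative sketch (the `caesar` helper is mine, not from the thread): a Caesar shift of one takes 26 applications to come back around, while ROT13 undoes itself after two.

```python
import codecs

def caesar(text, shift):
    """Shift alphabetic characters by `shift` places, wrapping around the alphabet."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return ''.join(out)

msg = "UserAssist"  # e.g. the Windows registry key family whose values are ROT13'd
# ROT13 is an involution: applying it twice restores the plaintext.
assert codecs.decode(codecs.encode(msg, "rot13"), "rot13") == msg
# A Caesar shift of one needs 26 applications to get back to the plaintext.
s = caesar(msg, 1)
count = 1
while s != msg:
    s = caesar(s, 1)
    count += 1
assert count == 26
```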

Phaete • March 27, 2019 11:35 AM

No real world sense yet for most university students.
Head full of theory and idealism and seriously lacking in commercial reality.
Everyone who ever had to train fresh students from university for an IT job would concur.

zdd • March 27, 2019 11:44 AM

Those who are focusing on the low pay and Freelancer.com are missing the point. The issue is systemic. I have seen it several times in-house, where programmers are making over $100K and fear being fired.

If not defined in the requirements, most will store in plaintext. If the expectations are not set, most will use MD5 or SHA1.

This is industry knowledge not typically taught in most Computer Science programs; not all companies will pay $6k for a SANS class, governments are not regulating products based on these factors, and most startups will continue to use these services. Which means those that make it big, whether into a component the industry adopts or through a sale to a major competitor, came from these shaky foundations.

Food for thought.

Gilbert Fernandes • March 27, 2019 12:00 PM

As a developer myself, I am not really satisfied with how this was conducted.
Programming properly requires skill and experience.
Only people with 5+ years can start to learn how to write code with decent security. And it takes years of refactoring and programming defensively to be able to somewhat get "good" quality (keeping in mind in principle that the question is not if you will be hacked, but that you will be hacked, and you must have layers upon layers of security and be able to act while the attack is going through one of those layers).

Seriously. You grab people that are, from my point of view, not even skilled enough as programmers working outside of security code. And you ask them to write secure code? What do you think is going to happen?

Because of this, I cannot take this thing seriously. Much less take anything of value from it.

If you want to know what it takes to write code that secures things, go interview Theo de Raadt or the OpenBSD developers. They will show you that most security holes are first bugs, not security holes. Bugs that become exploited as security holes. But they are bugs, most of them.

Then, that it takes a LOT of time to read code to look for vulnerabilities. When OpenBSD forked from NetBSD, they chose to focus on security, and they had to invest a heckton of time working on reading code, fixing bugs, and refactoring stuff. That took months, years, and we're only talking about the operating system base code, not all the crap you are going to install over that system, where you can bet that 99.99999% of people will not go check the sources for trojans or bugs. No way.

This is why they worked hard to invent mechanisms to give the system better security: Niels Provos's work on the PRNG, the canary work on the stack, the development of the amazing OpenBSD packet filter software...

Seriously. Go talk with them and give those questions a proper answer and point of view. Because if someone knows what they're talking about on this field of knowledge, it's the OpenBSD people.

Q • March 27, 2019 12:04 PM

Nahh. I gotta harp on the money/time thing here. Money is Time. €100 is, what, $132 U.S.? A decent programmer can pull down a salary of, what, over $48/hr? So you're talking about 2.5 hours of work. Of course freelancers will charge more per hour to cover their overhead, so less than 2 hours of work. Perhaps only 1 hour.

How many homework assignments, coding projects that is, did you get in college that could be completed in 2 hours?

Even full-time employees with six-figure salaries have a long list of things they have to do, with managers breathing down their necks. Time is money. Spend your time on something that's not required, and find yourself out of a job.

It all comes down to spend the money on security, or don't. And that's decided at the top.

Come to think of it, so is blaming the guys on the bottom for what the guys at the top decided.

1&1~=Umm • March 27, 2019 1:12 PM

@ALL:

A question for you all,

'What are your in house coding rules?'

Firstly do you have any that are actually meaningful?

Also how do you deal with 'errors and exceptions'? Likewise is your 'input validation' at the input or distributed in with the actual business logic from which it is derived?

How about correct use of 'Design by Contract' or 'Defensive Development': when should you use one over the other?

How do your base in house rules compare with say,

https://en.m.wikipedia.org/wiki/The_Power_of_10:_Rules_for_Developing_Safety-Critical_Code

Or the MISRA ruleset and DO-178B compliance requirements?

Do you think you could actually write code that does not use pointer math or recursion, has a process memory size fixed at load time, and, importantly, a call depth of no more than four? Just as 8-bit micro developers had to do all the time.
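One of those constraints, "no recursion", in a toy sketch of my own (the tree encoding is an illustration, nothing from the linked standards): the same traversal written recursively and then with an explicit work list, so the call stack stays at a constant depth no matter how deep the input is.

```python
# A tree here is a (value, children) tuple, e.g. (1, [(2, []), (3, [])]).

def tree_sum_recursive(node):
    """Sum all values; stack depth grows with tree depth (banned by the rules)."""
    value, children = node
    return value + sum(tree_sum_recursive(c) for c in children)

def tree_sum_iterative(node):
    """Same result, but with an explicit work list and constant call depth."""
    total, work = 0, [node]
    while work:
        value, children = work.pop()
        total += value
        work.extend(children)
    return total

tree = (1, [(2, []), (3, [(4, [])])])
assert tree_sum_recursive(tree) == tree_sum_iterative(tree) == 10
```

The iterative form trades a call stack for a heap-allocated list, which can be given a fixed upper bound up front, which is exactly what safety-critical rulesets are after.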

Have a read of,

https://coder.today/tech/2017-11-09_nasa-coding-standards-defensive-programming-and-reliability-a-postmortem-static-analysis./

Oh and if you've got the time do watch the videos, they are quite watchable.

Ross Snider • March 27, 2019 3:11 PM

Isn't the summary of this that programmers who aren't properly incentivized to deliver secure software are bad at security software?

It seems to me that if you ran the same experiment but with the secure implementation specifics being the focal point / key deliverable or otherwise a contingency for getting paid you would get better results.

Jeremy • March 27, 2019 4:38 PM

Bruce says: "Programmers Who Don't Understand Security Are Poor at Security"

In order to reach that interpretation of the study, you first need to know that "most freelance programmers don't understand security". I'm not sure that's currently known as widely as one might hope.

I also agree with the point that the low wages may be skewing things. 83% of freelance programmers that they approached refused the job entirely; I wouldn't be surprised if a significant number of those refused because the payment offered was too low for a competent execution of the task.

Danny • March 27, 2019 5:10 PM

200 euros? Really? Who did they expect to hire?
Freelancer.com? Does that site still exist? I only looked at it once, around 2005, and that was all. Extremely unfriendly to developers; no wonder they got crap programmers there. If they wanted freelancers they should've gone with upwork.com.

This test proves only one thing, and that's that the pay should match the requirement. Or the corollary: if you pay crap, you get crap back.
Next time they should hire actual programmers, not wannabes.

Sed Contra • March 27, 2019 5:51 PM

So, no room any more now for the honour of the regiment, staying late and doing it on your own time because it's right, smoothly producing it when done with a spiel that makes it part of the boss's intention, spoken and unspoken? The kind of thing that carves your character and contributes to that invisible aura of a hope of a way forward that is somehow sensed by those that need or deserve to know? I.e., what the Italians call cuore?

Barnacle • March 27, 2019 9:29 PM

Obvious stuff, but it is *good* that even things that seem obvious get tested in research. You never know when you are in for a surprise.

Some may have the opposite problem: a basic know-how about security, but a boss or constituent that doesn't. I once got ridiculed at a workplace for storing passcodes as salted hashes. "We are not trying to build Fort Knox" was among the condescending comments I got. My boss thought I wasted time and resources by implementing the bare minimum of password security, and I had a hard time fighting off 1234 as a default passcode. This was a couple of years ago, and of course it was an "internet of things" thingy, but I don't know if the term had been coined then.

I managed in the end to use hashes and salts and refused to set 1234 as the default passcode. I guess I should have used an "in house" XOR-crypto instead! ;)

1&1~=Umm • March 28, 2019 12:06 AM

@Barnacle:

"I guess I should have used an "in house" XOR-crypto instead! ;)"

Why not indeed, after all it was good enough for Adobe last century ;-)

And surprise surprise it still is... Tucked away on this page,

https://helpx.adobe.com/coldfusion/cfml-reference/coldfusion-functions/functions-e-g/encrypt.html

You will find,

"'The default algorithm, which is the same one used in ColdFusion 5 and ColdFusion MX, uses an XOR-based algorithm that uses a pseudo-random 32-bit key, based on a seed passed by the user as a function parameter. This algorithm is less secure than the other available algorithms.'"

Please note 'The default algorithm', so that makes it still 'the manufacturer's recommended setting'. Or you could say 'Adobe still loves XOR encryption' despite knowing better...

Because they have previously been ridiculed for using it, very publicly, with the help of both the FBI and the world computer press...

For those with long memories, back nearly two decades ago, you might remember when the US arrested a Russian, the company for whom he had worked revealing that Adobe used XOR encryption for their e-book reader, with the 'pseudo-random' key being the word 'Adobe' used repeatedly,

For those who are a bit younger,

https://www.cnet.com/news/russian-crypto-expert-arrested-at-def-con/

"'Plus ca change, plus c'est la meme chose'"

-- Jean-Baptiste Alphonse Karr 1848.

1&1~=Umm • March 28, 2019 12:46 AM

I forgot to add a link to Dmitry Sklyarov’s original presentation.

This version is converted from ancient PowerPoint to HTML, so everybody should be able to read it ;-)

https://www.cs.cmu.edu/~dst/Adobe/Gallery/ds-defcon2/ds-defcon.html

You can read more about the case, which, predictably to the many who saw that 'there had been no crime committed', eventually completely failed, here:

https://en.m.wikipedia.org/wiki/United_States_v._Elcom_Ltd.

And yes, you will recognise at least one name in there. The US DoJ prosecutor who failed to drop the case, or win it, was Robert Mueller. Please do not pile in on this fact as an excuse to make more contemporary comment.

Leon • March 28, 2019 1:01 AM

Take the other viewpoint: for a small app, first release, plaintext passwords in a database managed by one trusted person on an internal server can very well be good enough.
It very much depends on the situational risks and the money and time you are willing to invest to mitigate them.

1&1~=Umm • March 28, 2019 2:11 AM

@Leon:

"a small app, first release, plaintext passwords in a database managed by one trusted person on a internal server can very well be good enough."

But then it grows in scope, as all such things tend to do, until it breaks.

The mantra of the software industry is 'code reuse' in various forms, with the result that insecure code hangs around, at best like the smell of cheap perfume and stale cigars in an out-of-town motel, but more often like fermenting fish in a tin can*.

The software industry needs to learn that, as far as security is concerned, you need to 'throw out the garbage' before you find yourself on top of a festering midden of rotting code beneath your feet that nobody wants to touch, let alone dive into...

Anyway it's late thus my brain needs sleep before it can do the subject of insecure legacy code justice.

* According to a Japanese study, a newly opened can of surströmming has one of the most putrid food smells in the world, stronger than similarly fermented fish dishes such as the Korean hongeohoe or Japanese kusaya. Because of the smell and pressure in the can, it is normally opened in a bucket of water some distance from any building. It is likewise mostly eaten outdoors, served with strong condiments, and strong spirits are often consumed before, during and after its consumption, for obvious reasons. Apparently archaeologists have found the 'delicacy' was made and consumed over nine millennia ago, even without the assistance of barrels or cans. Among its finer top notes is an astringent vinegar smell, with base notes including that of rancid butter/cream, and briny middle notes. Therefore, having attacked the olfactory senses like a battering ram, caused tears to flow in the young and maidens to flee in horror, it must qualify as a delicacy to be celebrated, and it is, with its own festival...

https://en.m.wikipedia.org/wiki/Surströmming

Ismar • March 28, 2019 3:13 AM

I think everything here points to the fact that you get what you pay for and that good programmers cost money. I would argue that this is not the whole story and that, as is the case with any other product, following proper processes is the key to ensuring quality. This is even more so when dealing with unforgiving areas like security, where the cost of potential failure can outstrip development costs by orders of magnitude. This fact, I am afraid, is something sales people and project managers in their short-term mindsets are still unwilling to accept.

outadoc • March 28, 2019 4:40 AM

Of the secure password storage systems developers chose to implement for this study, only the last two, PBKDF2 and Bcrypt, are considered secure.

8 - Base64
10 - MD5
1 - SHA-1
3 - 3DES
3 - AES
5 - SHA-256
1 - HMAC/SHA1
5 - PBKDF2
7 - Bcrypt

The first, Base64, isn't even an encryption algorithm, but an encoding function, something that the participating developers didn't seem to know. Similarly for MD5, which is a hashing function.

...did I not read this correctly? SHA-256 is considered insecure? I thought they might be referring to salting, but that comes up later.

And MD5 is a hashing function (they don't mention SHA for some reason?), which makes it irrelevant how?

Indigo • March 28, 2019 5:26 AM

@outadoc

It's not that SHA256 is insecure per se, but it's not what you want to use for password hashing.

See, SHA256, and any standard hashing function, from md5 to SHA3/Keccak, are designed to be fast.

This is not what you want for password hashing. You need hashing functions that are deliberately slow.

Bcrypt, PBKDF2, Argon2 are slow hashing functions, and are considered secure for password hashing.

Also, a slow hashing function will usually come with some parameters to be able to tune the time it takes to compute the hash (e.g. the cost factor in BCrypt).

Also, these functions will usually handle the salting for you, which make them inherently more secure by default.

So the takeaway here is: use the right tool for the job.
Password hashing? Slow hashing functions (Bcrypt, Argon2, PBKDF2).
File hashing? Fast hashing functions with enough bits for your needs (SHA-1, SHA-2 or SHA-3).
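A minimal sketch of the slow-salted-tunable recipe described above, using the standard library's PBKDF2 (bcrypt and Argon2 need third-party packages; the iteration count here is illustrative, not a recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password, iterations=200_000):
    """Derive a stored record: fresh random salt + tunable-cost PBKDF2 digest."""
    salt = os.urandom(16)  # per-user random salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, digest  # store all three

def verify_password(password, salt, iterations, digest):
    """Re-derive with the stored salt/cost and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)

salt, n, stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, n, stored)
assert not verify_password("Tr0ub4dor&3", salt, n, stored)
```

Storing the iteration count alongside the salt is what lets you raise the cost factor later as hardware gets faster, re-hashing each user's password at their next login.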

outadoc • March 28, 2019 5:38 AM

@Indigo

Well I learned something today, thank you. Makes me feel a bit uneasy about what I was taught in school...

Ismar • March 28, 2019 6:06 AM

This may warrant a blog post on its own but including it here as it is related to code quality

"The technology that repelled the hackers was a style of software programming known as formal verification. Unlike most computer code, which is written informally and evaluated based mainly on whether it works, formally verified software reads like a mathematical proof: Each statement follows logically from the preceding one. An entire program can be tested with the same certainty that mathematicians prove theorems."

https://www.quantamagazine.org/formal-verification-creates-hacker-proof-code-20160920/

Bob • March 28, 2019 8:43 AM

@Tatütata "Isn't this rather a demonstration of management failure? I.e., the specification is incomplete or non-existant, but the "Auftraggeber" implicitly expected the underling to make up for this."

No. Salted hash is the standard for password storage. That's like saying "It's my boss's fault I got a ticket. He told me to take a deposit to the bank, but he didn't tell me I had to wear my seatbelt and obey the speed limit."
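Why "salted" matters, as a sketch of my own (not the study's code, and not a complete scheme): without a salt, every user with the same password gets the same stored hash, so one rainbow-table lookup or one crack breaks all of them at once.

```python
import hashlib
import os

pw = "hunter2"
# Unsalted: deterministic, so identical for every user with this password.
unsalted = hashlib.sha256(pw.encode()).hexdigest()
assert unsalted == hashlib.sha256(pw.encode()).hexdigest()

def store_salted(pw):
    """Store a per-user random salt alongside the hash of salt + password."""
    salt = os.urandom(16)
    return salt, hashlib.sha256(salt + pw.encode()).digest()

s1, h1 = store_salted(pw)
s2, h2 = store_salted(pw)
assert h1 != h2  # same password, two different stored records
# (A production system would also use a deliberately slow KDF on top of
# the salt, as discussed elsewhere in this thread.)
```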

Bob • March 28, 2019 8:44 AM

@Duck and Cover

Implementing open source code into a project is in no way, shape, or form "plagiarism."

1&1~=Umm • March 28, 2019 8:58 AM

@Ismar:

"I think all here is pointing to the fact that you get what you pay for and that good programmers cost money"

You left out 'experienced' ;-)

But for others, experience is probably the real factor you are looking for. It's often said it takes 10,000 hours to train someone in their twenties to become competent at a mechanical skill, and four times that to become skilled. So if you get paid for 2,000 hours a year, you can easily work out what you are going to have to pay in dues to become competent in five years.

That said, there is an age factor: it can take less than a third of those times if you start around ten, and twice as long if you start at forty.

That said, though, there is a lot more to writing software than bashing it out on the keyboard as lines of code a day (which some managers in the past have thought). Certain mindsets have been found to produce ten times the average rate of not just 'accepted' code but, more importantly, lines that don't need to be changed in maintenance. This is even though their lines-of-code-a-day numbers might well appear average.

There are two points that can be immediately drawn from this,

1, It is more a cerebral than manual skill.

2, It is also a skill that is not best suited to many minds.

Whilst as business owners you want the best the chances are you are going to get average at best.

It also suggests a strategy for some software writers who feel they are below par. Which, mad as it might seem, is 'Always work on code that you know will be cut as the specification changes'. That way you can churn out lots of so-so code knowing it's not going to get really tested, if tested at all. Thus your commits are high, but they get cut for reasons that are not your fault.

But you also need to keep an eye on 'work required' skills, and by that I don't mean the 'job required' skills* like languages, toolchains or development methods (though you should keep an eye on those as standard).

Back when I got going, the only high-level language you would find outside a university was Cobol or Basic; the rest was CPU-specific assembler in very limited resources. RAM was around US$0.10 a byte and EPROM was more expensive still. You had to know some quite arcane things, like on certain versions of the Z80 CPU it was faster to fetch from RAM than the registers in some instructions. Other tricks, like 'XOR A with A' on CPUs that did not have a 'CLRA' instruction, took less space in ROM but traded off against a load with zero, so there were trade-offs that no compiler could make then, or even now without help. But the average programmer is not going to write assembler these days; the closest they will get will be C or later derivations. More importantly, for most of them resource issues will not be something they ever need think about, let alone have to out-think the compiler over the resource issue uppermost in the problem list, or even have to effectively invent an interpreted programming language that gives better instruction-to-byte density than mainstream languages.

In fact it's now undesirable to write more than one instruction per line or even play 'language tricks', because you generally have the resources, and it also has a significant effect on downstream costs such as upgrading, maintaining and supporting.

Thus these days a good 'practical' 'work required' knowledge that is independent of programming methods/styles, languages, or hardware is where you should be aiming. That is, algorithms and abstract data type usage are more likely to be a useful investment of time, as they won't age. Something that requires rather less keyboard time than book time. Likewise other 'foundation' aspects that will give you a better understanding of things, such as formal methods.

But the most important skill you need to develop would once have been called 'thinking hinky'; it's defensive programming on a different plane. Like designing crypto algorithms, you first have to understand how your code can be attacked and how to mitigate it. Whilst at first it might appear to be a language-feature issue, it's actually far more general, as there are significant attacks against algorithms, protocols and standards. But importantly, remember that an algorithm, protocol or standard that is theoretically secure is very often not implementation secure; time-based side channels are just one attack class they can and do fail against. This is 'work required' knowledge again, independent of the 'job required' highly volatile language, toolchain and development methodologies.

* For those that are not sure of the difference between 'work required' and 'job required', think of it broadly as 'Does it appear in job application listings?' If it does, then it's a 'job requirement', not a 'work requirement'.

Duck and Cover • March 28, 2019 10:24 AM

@ Bob,

Implementing open source code into a project is in no way, shape, or form "plagiarism."

Actually that rather depends on the licence.

I've seen code with a very specific licence requirement be put into commercial code very much against that requirement.

I suspect anyone who just,

grabbed the first thing they found on GitHub that did the job.

Is highly unlikely to declare that's what they did to their boss for a number of reasons. So the licence terms won't get followed either.

I've seen conversations in commercial shops where some bright young spark has suggested open source usage. It got stamped on hard, very fast, because it would then in effect be a conspiracy, with those in the room becoming complicit, and so on up the management chain. Even when it's the 'use of a tool', many commercial shops get very cautious even after some corporate lawyer has OK'd it, etc. Because of the old 'No 24x7 maintenance' or 'No help line' or the best one, 'Nobody to pay'...

So yes, a quick 'cut-n-paste' might save somebody time, but it's legally very questionable even under 'fair use' rules, as acknowledgment is unlikely to happen.

People tend to forget that all code written, no matter how good or bad, complete or incomplete, has a copyright on it; yes, even that 'hello world' program. Likewise the way dictionary words are re-purposed in a programming language is a 'work', thus you get interesting copyright statements in standards etc., likewise in compiler documentation and licences, all the way down.

Some people are very strict about such things, especially in the various open source movements where 'free' means a lot of different things to a lot of different people.

Faustus • March 28, 2019 12:29 PM

There is nothing inherently insecure about XOR in encryption. But the key has to be a stream of truly random (or cryptographically strong pseudorandom) bits. XOR is used extensively in encryption.

But XOR is not as easily applicable as the principal part of the hashing process.
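Faustus's point, in a sketch of my own: XOR with a fresh, truly random keystream is a one-time pad, and decryption is the same XOR. But reuse the keystream once and the key cancels out of the two ciphertexts entirely, leaking plaintext XOR plaintext.

```python
import os

def xor(data, key):
    """XOR two equal-length byte strings together."""
    return bytes(d ^ k for d, k in zip(data, key))

p1, p2 = b"attack at dawn", b"retreat now!!!"
key = os.urandom(len(p1))   # fresh random keystream: so far, a one-time pad
c1 = xor(p1, key)
c2 = xor(p2, key)           # ...but here the same keystream is reused
assert xor(c1, key) == p1   # decryption is just the same XOR again
# Key reuse: XOR of the two ciphertexts equals XOR of the two plaintexts,
# with the key gone, which is the classic two-time-pad break.
assert xor(c1, c2) == xor(p1, p2)
```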

1&1~=Umm • March 28, 2019 1:09 PM

@Faustus:

"There is nothing inherently insecure about xor in encryption."

No, nor is any other simple bidirectional function such as ADD/SUB or MUL/DIV in integer fields.

"But the key has to be a stream of truly pseudo random bits."

That is the resulting problem, as with nearly all stream ciphers. The simple bidirectional function allows for a range of attacks that don't really work on block ciphers.

Block ciphers generally use one way functions as part of a mixing process with the plaintext. This is done over a number of rounds and makes the reversing of individual rounds to find the key information very much more difficult. In part it's why you can increment either the data in or key in when implementing a CTR mode to make a stream generator.

The reason block ciphers became preferred outside of Europe is that, in general, when used in fairly easy-to-develop modes they don't have the major key-reuse weaknesses of stream ciphers. Especially when you consider just how difficult it actually is to prevent 'key reuse' without making key selection predictable.

Whilst there is little wrong with a properly implemented stream cipher and use mode, coming up with a 'properly implemented' stream cipher is actually quite hard.

The fact that Adobe does not appear to have managed it with the XOR function in over two decades gives you an indication of why stream ciphers are not for general use. Especially for tasks like secure password verification storage, or for bulk storage devices such as hard drives, where similar but not quite identical plaintext is to be expected.

Indigo • March 28, 2019 1:17 PM

@Faustus

But then again, what do you do with the key? Where do you store it?

Any form of encryption is just wrong for passwords.

Bill Paxton • March 28, 2019 4:53 PM

@Faustus If a programmer explicitly using an XOR operation is part of a security-sensitive process (rather than making calls to a well-reviewed crypto library), there is a problem.

Bruce -- this survey is not ideal. But the freelance Indians hired in this survey are likely no less competent than those they would have gotten had they hired similar programmers on an H-1B salary. And likely no less security-conscious than the vast majority of white Western business programmers.

Derek • March 28, 2019 6:49 PM

Microsoft paid their programmers well, and yet we still got NTLM. I think it's just lazy programmers.

Barnacle • March 29, 2019 3:47 AM

@Faustus:
We all know that. But that is not the point. Whenever someone knowledgeable about security says "XOR encryption", they generally mean the insecure Vigenère cipher in a modern form. Of course, used in a one-time-pad scheme it's as secure as anything else, and many encryption algorithms considered secure (even if they are cryptologically broken) use XOR in some fashion (but not XORing directly with a repeat of the password, as in a Vigenère cipher).

Besides, secure password storage should not and cannot be based on encryption. It must rely on a one-way function. Otherwise you just have an encryption key to store somewhere in plain. Or encrypt it and put that key in plain. And encrypt that one and...
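The one-way approach described above needs no stored key at all. A minimal sketch using the JDK's built-in PBKDF2 (the salt size and iteration count here are illustrative, not recommendations):

```java
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
import java.security.SecureRandom;

public class PasswordStore {
    // Derive a 256-bit one-way hash from password + per-user random salt.
    // PBKDF2WithHmacSHA256 ships with the JDK; pick the iteration count
    // to make each derivation deliberately slow for an attacker.
    static byte[] hash(char[] password, byte[] salt, int iterations) {
        try {
            PBEKeySpec spec = new PBEKeySpec(password, salt, iterations, 256);
            return SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256")
                                   .generateSecret(spec).getEncoded();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    // A fresh random salt per user defeats precomputed (rainbow) tables.
    static byte[] newSalt() {
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt);
        return salt;
    }
}
```

What gets stored per user is the salt, the iteration count, and the derived hash; there is no key to hide anywhere, which is exactly the regress the comment above describes.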

Robert • March 29, 2019 4:18 AM

@Tatütata wrote,

"... showing that developers don't inherently think about security when writing code."

the easy answer is this isn't part of the job description.

You can't ask people to complete jobs that weren't part of their job requirements.

Hello • March 29, 2019 6:12 AM

I think the funniest thing in the whole research is the article itself. "Catalin Cimpanu is a security reporter at ZDNet", stating this:

The first, Base64, isn't even an encryption algorithm [...]


Similarly for MD5, which is a hashing function [...]


[...] a process through which the encrypted password [...]

He is implying that you should encrypt your password. Apparently with PBKDF2 and Bcrypt, which are both, TAN TAN TAAAAAN, hashing functions. Isn't the whole situation hilarious?

"Many participants used hashing and encryption as synonyms," the team of academics said in their research paper.
One should add that many security reporters do too.

1&1~=Umm • March 29, 2019 8:30 AM

@Robert:

"the easy answer is this isn't part of the job description."

Do you understand the difference between 'job requirements' and 'work requirements'?

A 'job requirement' is specific to an instance of employment, thus the things you find in 'job listings' and 'project specifications/plans'.

However, 'work requirements' are in effect what you expect a suitable employee to know as part of their training. Thus they know not to put in screws with a hammer, or try to drill a hole with the drive set in reverse. That is, the things you should have learned during your progress as a 'journeyman' through your training to 'qualified practitioner'.

The issues with password storage have been known about as an ongoing problem since at least the 1950s or 1960s, which means that there should be no 'qualified practitioners' out there who are not aware of the issues.

Thus what you are actually claiming is that these persons' training or mentality has made them either not qualified, or unfit to claim to be qualified, to carry out the job.

Which, as you have chosen to excuse their being unprofessional, you should realise reflects back on your own level of professionalism, not exactly favourably.

I know the software industry embraces 'free market ideology' more than most, part of which is seeking to avoid all liability for work that is in most cases 'unfit for market'. But those behaviours are now so bad that, as software gains 'physical agency', people will die in much greater numbers than they already have (the Boeing 737 MAX being just the latest, with 370-odd fatalities). Which will eventually lead to regulation being put in place, and practitioners having to be licensed for competence by professional bodies, as is currently the case with the practice of law, medicine, and other forms of construction/engineering.

Importantly, though, as time goes on, to rise in any of the construction/engineering fields of endeavour you will have to be not just multidisciplinary but multi-licensed. This means not only considerably more time in formal education but also high-level, lifelong professional 'tested learning', something that came as a bit of a shock to many in the medical profession a decade or so ago. As this will have a significant effect on people's remuneration, it will come as a bit of a shock to the 'rent seekers', who will have to pay significantly higher rates for what they would regard as lower productivity; hence their strong opposition to cleaning up their very unsafe, insecure, and thus very dangerous behaviours.

1&1~=Umm • March 29, 2019 9:27 AM

@Bill Paxton:

"If a programmer explicitly using an XOR operation is part of a security-sensitive process (rather than making calls to a well-reviewed crypto library), there is a problem."

Yes there is.

However using a library is not enough and in some respects might actually be detrimental in the long term.

As anyone who has knocked around the industry for a while knows, libraries are just snippets of code, and all code has issues. Thus, inescapably, libraries have not just coding issues; they bring along a few more of their own as well.

Firstly, and quite importantly, libraries become a single point of failure for many products, which makes an attacker's life far easier (as do failings in protocols or standards).

Secondly, libraries are never perfect; the Standard C library, for instance, has had its security issues. Likewise, a number of crypto libraries just 'code cut' the AES code from the competition entries. Mostly those entries were coded for something other than security, like fast execution. Which is what the NSA was hoping for: most early crypto libraries, and all the products built on them, many of which are still in use, have significant and not-too-difficult-to-exploit security vulnerabilities, such as timing channels that leak key bits to the network, even through routers.

Thirdly, few ever look inside libraries. Commercial libraries are generally 'black boxes', and even open-source libraries are frequently treated the same way. This often leads to issues from either insufficient documentation or insufficient understanding of it by the programmers who use the library. A couple of examples that come to mind are the use of seeds/IVs and crypto nonces. Using crypto libraries often requires knowledge that few programmers possess, and won't get by not looking in the box (even though it might look like the handiwork of Pandora).

Fourthly, libraries are built by programmers implementing algorithms. The use of libraries does not encourage programmers to learn about cryptography or other safety or security subjects; they frequently, as a matter of expediency, take other people's examples and drop them into their own code 'sight unseen'. As a result, testing will be at best deficient, or absent altogether, in the areas that matter (safety/security).

Fifthly, by definition libraries are a reflection of the past. That is, they are built with out-of-date knowledge, and they age, often rapidly, where security and safety are concerned. This causes all sorts of issues in other areas of security and safety, which can easily become mind-blowingly expensive as far as recertification is concerned. Which actively discourages proactive updating of safety- or security-critical software.

I could go on, but I hope the point is made that we need a better way than either option. And before you ask: no, I've only vague notions of how to do better, all of which come with a big old price tag.

But in the real, physical world we have come to accept that some things have to be done in the 'capitalist with a small c' way, whilst others have to be done in the 'socialist with a small s' way and paid for by all, unlike the capitalist way, where the cost falls on actual consumers.

That is, things that form the foundations of society, which everybody needs and uses, have to be done 'for the good of all', 'by all', such as infrastructure like roads, paid for by taxation. Other things, such as end products, have to have aspects of market competition to drive technology and efficiency, and thus society, forward, but should be paid for by those who choose to benefit from them. And yes, there is a very big fat grey zone in the middle, usually made fatter than it should be by self-interested parties.

I'm of the mind that cryptography is a 'public good', like law. Thus, like any foundation of society, it should be treated as infrastructure, and should be not just paid for by all but developed for all. In what manner this social good might be achieved is really a subject for a thread of its own, but I certainly think it should be over and above any industry-based regulation or taxation.

1&1~=Umm • March 29, 2019 9:51 AM

@Tatütata:

"Most of those 100€ rent-a-coders apparently did a better job in a couple of hours than Facebook in 15 years..."

Ahh, but they, unlike FaceCrook, had little or nothing further to gain from their work, secure or insecure as it might be.

We know what the Manky Psychoburg has done with the one 2FA security aspect from which money could be made -- i.e. the telephone numbers -- he has ruthlessly monetized behind people's backs, without consent or even informing them. In the process making fools of many who recommended that type of 2FA as a security improvement, rather than just another ruthless exploitation tactic.

Thus, based on that past behaviour alone, it would be fair to assume that the Manky Psychoburg was fully aware of the storing of plaintext passwords, and probably put it in place in the first place, with the specific intent of somehow making money from the data at some point in the future (if he has not already done so). Again, without informing anyone or asking for consent. Don't be fooled by the 'internal review'; that just means that somebody realised it was going to go bad, very bad, in the near future, and wanted to brush it all under the carpet as a risk not worth its potential reward in the current political climate.

And people wonder why I don't, in my private life, do social media, instant messaging, or even email these days. I guess others, but by no means all people, will catch up as the penny bounces off of their toe or head whilst dropping from a very great height...

ts • March 29, 2019 10:58 AM

For €100, I don't know what they were expecting. That's barely an hour's pay for a proper freelance coder.

Jeremy • March 29, 2019 3:28 PM

@Leon: "Take the other viewpoint: from a small app, first release, plaintext passwords in a database managed by one trusted person on a internal server can very well be good enough."

If by "good enough" you mean "responsible practice", then NO, it is absolutely not.

People reuse passwords all the time. Reusing a high-security password in a low-security application is a bad idea, and you might not want your users to do it, but it is head-in-the-sand irresponsible to rely on your customers to not do it.

Therefore, ANY public-facing system should store salted hashes instead of plaintext passwords, no matter how trivial the application. Only systems that are not accessible to the general public could reasonably consider doing less.
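A minimal sketch of that store-salted-hashes approach in Java, purely illustrative: a single SHA-256 over salt plus password stands in here for the deliberately slow KDF (bcrypt, PBKDF2, etc.) a real system should use, but the verification flow is the same.

```java
import java.security.MessageDigest;

public class PasswordVerify {
    // Illustrative salted hash: SHA-256 over salt || password bytes.
    // (A production system would substitute a deliberately slow KDF.)
    static byte[] saltedHash(byte[] salt, byte[] password) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            md.update(salt);
            md.update(password);
            return md.digest();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    // Recompute the hash from the stored per-user salt and compare with
    // MessageDigest.isEqual, which runs in constant time, so the
    // comparison itself doesn't leak how many bytes matched.
    static boolean verify(byte[] salt, byte[] storedHash, byte[] attempt) {
        return MessageDigest.isEqual(saltedHash(salt, attempt), storedHash);
    }
}
```

The database ends up holding only salts and hashes, so a breach reveals nothing directly reusable on other sites, which is the whole point of the argument above.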

Of course, if by "good enough" you mean "the lazy developer will probably not be penalized for knowingly screwing their customers", then you are likely correct.

KathyRo • March 30, 2019 11:03 AM

"Of the 43, academics paid half of the group with €100, and the other half with €200, to determine if higher pay made a difference in the implementation of password security features."

I hope these academics didn't congratulate themselves that they removed the element of pay from their experiment. The only people you're going to attract with such a low fee are programmers with so little experience and/or skill that it invalidates whatever conclusion you draw from it.

I hope the researchers didn't get paid more than €200 for this ridiculous waste of time.

Robert • April 1, 2019 4:25 AM

@1&1~=Umm wrote, "Which will lead eventually to regulation being put in place and practitioners having to be licensed for competence by professional bodies, as is currently the case with the practice of law, medicine and other forms of construction/engineering."

I believe you've underestimated the scope of the problem. It will require more than better-educated "professionals" on top of stringent government-issued codes. Using your construction-engineering example: local governments have extensive building codes in place to ensure a building is properly constructed. It takes more than hired labourers who know how to drive a screw the right way. The inspection process is multi-layered. Every work/job (your choice of vocab) must not only be properly done by licensed pros but also checked by qualified inspectors.

This is done only inadequately, if at all, in the "software engineering" process.

Jörn Franke • April 8, 2019 4:46 PM

I do not think the study is unrealistic. When I watch the IT market, and especially the market for outsourcing to the Middle East, this is exactly what you get: they implement the first thing they find on GitHub, with some sprinkles from Stack Overflow.
And this is also the state of security for most applications in major enterprises whose core business is not IT.

Andrew Breza • April 15, 2019 7:27 AM

I think people are being too harsh toward the study authors. Most published research is incremental. Researchers develop a reasonable hypothesis and test it. This study won't exactly rock the world of computer science, but it validates the idea that paying unknown developers small amounts of money results in insecure code.

Missy • April 26, 2019 10:12 AM

I think it's not that easy to write code for a password system or other cybersecurity stuff, so the job should be paid accordingly. It's not like copywriting, where you can always consult the Internet if you don't know something. And ordinary people hardly know such stuff as Triple DES, Blowfish, AES, or RSA, as this is very specific information; usually only programmers are familiar with it, since they are trained to deal with these kinds of things. By the way, I learned about these things quite accidentally when I was surfing the Academia portal: https://www.academia.edu/37474878/Encryption_Security_in_Different_Industries. So, before rendering judgment, try to look for the required information and understand the concept.


Sidebar photo of Bruce Schneier by Joe MacInnis.

Schneier on Security is a personal website. Opinions expressed are not necessarily those of IBM Security.