Software Developers and Security

According to a survey: “68% of the security professionals surveyed believe it’s a programmer’s job to write secure code, but they also think less than half of developers can spot security holes.” And that’s a problem.

Nearly half of security pros surveyed, 49%, said they struggle to get developers to make remediation of vulnerabilities a priority. Worse still, 68% of security professionals feel fewer than half of developers can spot security vulnerabilities later in the life cycle. Roughly half of security professionals said they most often found bugs after code is merged in a test environment.

At the same time, nearly 70% of developers said that while they are expected to write secure code, they get little guidance or help. One disgruntled programmer said, “It’s a mess, no standardization, most of my work has never had a security scan.”

Another problem is that many companies don’t seem to take security seriously enough. Nearly 44% of those surveyed reported that they’re not judged on their security vulnerabilities.

Posted on July 25, 2019 at 6:17 AM • 52 Comments


RealFakeNews July 25, 2019 12:53 AM

Part of the problem is:

  • Developers think about security far too late (after they’ve already written code)
  • They wouldn’t know WHY many functions are security threats even if you asked them

To fix this problem, developers need to actually THINK before they write, and actually need to THINK about what it is they’re writing.

Far too many developers write code without any regard for what the end-goal or use actually is, and many things are after-thoughts, if they ever get thought about at all.

Far too many people are writing software who shouldn’t be. It’s too easy to do but requires much understanding and thought to do it properly. Most fail at this basic task.

Joe July 25, 2019 1:27 AM

Matter of pay grade.

The average “developer” or coder doesn’t design the software they write code for. It’s usually designed by software architects who themselves write very little of its code, if any at all.

The burden of security is usually placed on the designers to come up with cleaner interfaces that segregate coding errors from the underlying platform so as to minimize security risk. Java was essentially created to tackle this type of issue as one of its advertised pros.

It’s easy to blame the developers for coding errors when the code execution security architecture was poorly designed from the get-go.

Clive Robinson July 25, 2019 5:38 AM

@ Bruce,

I would say those figures are optimistic, and the situation is way way worse. As you know it’s something I’ve been keeping my eye on for a couple of decades or so.

@ ALL,

I’ve been saying for at least a decade now “Security is a Quality Process” and like any Quality Process it should begin before the project and be a key driver throughout, not just during development and maintenance but out beyond End of Life for the product. This means “Security is everybody’s problem”, from the most senior through to the most junior.

There should be no finger pointing saying that’s not my job or at my pay grade.

Anyone who writes code and does not address “errors and exceptions” correctly, which is by far the majority of those producing code, is behaving detrimentally to security.

Likewise those that do not correctly address all input are behaving detrimentally to security.

Likewise those that do not correctly address all fault conditions are behaving detrimentally to security.

Likewise with the internal states of the program.
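[Ed.: to make the point concrete, here is a minimal Python sketch of my own, not from any particular codebase, of treating the error path as deliberately as the happy path: validate the untrusted input strictly, and land every failure in a defined state.]

```python
def parse_port(raw: str) -> int:
    """Strictly validate an untrusted input before using it."""
    if not raw.isdigit():                 # reject anything but plain digits
        raise ValueError(f"not a number: {raw!r}")
    port = int(raw)
    if not 1 <= port <= 65535:            # enforce the valid range, not just the type
        raise ValueError(f"port out of range: {port}")
    return port

def read_port(raw: str, fallback: int = 8080) -> int:
    """Handle the failure case explicitly, ending in a known state."""
    try:
        return parse_port(raw)
    except ValueError:
        # A deliberate, documented fallback state -- not a bare `except: pass`
        # that leaves the program's internal state undefined.
        return fallback
```

The names and the fallback value are illustrative; the point is that the error path gets the same design attention as the success path.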

None of this is news, it’s been known in various ways since the 1960s. The “limited resource issues” that were used as an excuse for the next forty years really no longer apply. If you think they do, then cut out a few fairly useless “bells and whistles” features marketing’s cat has dragged in to stink the place up.

But the real issue, at the end of the day, is “customers”: if you buy crap in the market, then the market will respond by making more crap available for you. That’s been known since before Smith put pen to paper for a decade in Kirkcaldy High Street a quarter of a millennium ago.

The problem is producing a pile of crap is easy: you just keep shoveling it on as fast as possible, and eventually it will pile up as high as you want. However, to correctly plan, design and construct a tower of the same height requires knowledge, skills, and careful testing. All of which take time, which is costly…

So as a customer you should be aware that if you want a Quality Product as compared to a Crap Product then you have to pay a price both in terms of money and time.

What we have is the expected result of a “free market” without regulation. Most other consumer markets are at least regulated in terms of “safety”[1] and “fitness” as well as “merchantability”. The consumer software market is not regulated in those respects; in fact you don’t actually buy a product[2]. Thus, not unexpectedly, you have a race to the bottom driven by the need to make as much profit on the journey.

Which in short is pretty much all you need to know about not just how we got where we are but how to start winding it all back.

[1] As I’ve mentioned before, some languages do not have separate words for “safety” and “security”; they have just the one. Which brings a significantly different point of view.

[2] The software industry is not about tangible goods, but intangible services. Thus the service provider has rights that a provider of goods does not. Which is another reason we have such abusive EULAs that cannot in any honesty be called legal contracts.

ismar July 25, 2019 6:37 AM

As a long time software developer I am going to argue here that it is not actually as bad as it looks, with the caveat that we are talking about a “standard” web app.
For example, if I am writing an ASP.NET application that is going to be hosted by one of the major cloud providers such as Azure, then I already get a lot of out-of-the-box security features from the .NET framework (no buffer overflows), ASP.NET’s support for easy implementation of authentication and authorisation, while the hosting environment ensures things like web application firewalls and defence against DDoS attacks are in place by default (to name just a few).
This is why you don’t hear much in the news about these kinds of applications being compromised very often.
Having said all of this, it is true that not enough attention is given to the security side of the application dev process, but things are starting to improve, especially in highly regulated industries such as medical, banking or defence, and from there these better security practices are slowly permeating the rest of the software (and hardware) industry.
I would therefore argue that despite all the news, we are now in a better position than say 10-20 years ago.
The most vulnerable to security threats are the relatively new technologies such as IoT which is why they are often used as staging posts for attacks against the rest of the connected systems.
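[Ed.: the framework protections ismar describes mostly work by removing whole bug classes. A hedged sketch of the same principle in Python’s stdlib `sqlite3` (not ASP.NET, but the idea is identical): a parameterized query hands untrusted input to the driver as data, so it can never rewrite the SQL itself.]

```python
import sqlite3

def find_role(conn, username: str):
    # The `?` placeholder passes the untrusted value as data, never as SQL.
    row = conn.execute(
        "SELECT role FROM users WHERE name = ?", (username,)
    ).fetchone()
    return row[0] if row else None

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

print(find_role(conn, "alice"))              # admin
print(find_role(conn, "alice' OR '1'='1"))   # None -- the injection attempt is inert
```

The classic injection payload matches no user, because the driver looks for a user literally named `alice' OR '1'='1`.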

InsecureAtAnyClockSpeed July 25, 2019 7:39 AM

Agree with Clive: security and quality often go hand in hand.

Digging into the referenced article is this bit …

Nearly 44% of those surveyed reported that they’re not judged on their security vulnerabilities.

So, if you’re under the gun to get the code out and you know no one’s paying attention to security, what would you do? That’s right. Hang security.

Which reminded me of a recent twitter thread discussing really basic vulnerabilities uncovered in products from security industry leaders.

How does this happen? They can talk about the security function their product delivers but did they deliver a secure product?

Another public security company I’m familiar with started to roll out security training. And while I think that is a good idea in principle, many employees view it as just another bit of mandatory training and find ways to tick the checkbox without engaging with it. A more traditional, lecture-based training was chosen over more hands-on and applied training; cost was a factor.

My own experience is similar. As an engineer I can do a thorough job with my work, checking for unexpected behaviors, doing static analysis, and fuzzing my code — but if I’m asked to bring my timeline in, what’s going to get cut? Security. Quality. Are you going to keep my completed feature turned off for the next two or three sprints while I finish testing? No.

Delivering a potentially vulnerable feature this quarter or waiting to gain confidence? Not even on the table — I haven’t been able to find a clear enough near-term incentive or make a compelling business case to justify it.

I’ve had this discussion with engineering and middle managers and responses have ranged from apathetic, to incredulous, to hostile.

Until the incentives change, I don’t see how the culture will change. Until the culture changes, I don’t see how secure development practices become the norm.

Doug July 25, 2019 8:44 AM

I want to start with the humility that, yes, we developers have plenty of failures, myself included. And there are plenty of smart & wise security professionals, for sure. I’m regularly impressed with some of my colleagues in that area.

But it also seems to me a decent number of security professionals only know black-and-white audit rules and have no concept of trade-offs or relative risks themselves. Just the liability of, “this practice can possibly be a vector, so you have to stop it, universally, no matter what impact it has on cost, schedule, maintainability, usability, etc. If you don’t, it’s your fault, not mine.”

IMO it’s neither fair nor helpful for security professionals to throw everyone else under the bus and wash their hands of responsibility because they’ve pointed out every possible vector. A security professional who has had to be responsible for actual implementation and operations and tradeoffs, or even better, who still is, is a valuable commodity.

(One telling anecdote for me was when the security auditors on site went to lunch and left their workstations powered and unlocked.)

SwashbucklingCowboy July 25, 2019 8:49 AM

Michael Hayden, former head of the US CIA and NSA, said it best:

The geography we have created is all about speed, convenience and scale; security is an afterthought.

Brandon July 25, 2019 9:31 AM

I’m in the middle of the spectrum on this argument, as I’m a security engineer. I design security checks around cloud services as used by our firm for hundreds of contracts. It’s all basic stuff like checking for external access, trying to enforce least privilege on roles, and checking for open ports that shouldn’t be (prior to me starting 3 years ago, about 90% of our environments had ports 22 and 3389 open to the internet).

The problem I see, as others have put it, is that security is an afterthought. Every time we take over a contract, we do an audit of the environment. Over 70% have admin ports open. 50% use a full-admin role for their web applications (so if you find a flaw, you can take over their entire account). When we interview the developers who chose to stay with the contract rather than their previous employer, it always comes down to “well we needed it to work – we planned to fix that later.” Sometimes the web apps have been in production for 2+ years and “later” just hasn’t come up yet.

But that’s not the only problem. The other major issue is that they are being thrown into platforms where they have no knowledge or experience. I end up explaining how services on AWS interact (and how permissions models are applied) at least 10x a week. They regularly put developers with 0 experience on AWS in charge of major parts of their apps and then get mad at us when we report on the vulnerabilities. I’ve had others tell me that the audit logs were “wrong” because “I know my people wouldn’t do that” when it showed them making really dumb configuration changes (storing your database credentials in S3 is dumb – making the bucket public is really dumb). If they knew better, there are actually two services which leverage IAM credentials and encryption specifically to secure secrets that your app needs.

AWS hasn’t exactly helped by removing most of the focus on Identity and Access Management from their introductory certs. I recertified this year and 70% of the questions were about how to scale something with almost no questions on “how would you secure it?”

MikeA July 25, 2019 9:31 AM

I agree with Clive, especially about Security being Quality, but the idea that Customers “choose” crappy products got my attention.

Pretty much all my employed life was in the development and manufacture of what I’d generalize as “Capital Equipment”. The “Customer” for this stuff, the person who chooses one machine/software platform/standards ecosystem is not typically the “User”. Many people spend 8 hours a day with dreadful equipment (and software) chosen by some other person (generally above them in the hierarchy), who has no interest at all in the problems it causes unless the widgets/week metric drops too low.

My only recourse (also chosen by many other engineers) was “guerrilla workmanship”: spending unpaid overtime to make something “right” rather than “as the boss spec’d it” (and of course the “traitorous” practice of giving correct estimates, including a “cleaning up the mess” allowance). Eventually that wears thin.

If only there were a software equivalent of a law to force explosives factory owners to live on premises, with their families.

Ross Snider July 25, 2019 10:13 AM

This rhymes with my professional experience.

Part of the reason security professionals can’t give guidance is that code just isn’t amenable to that kind of guidance. That’s the “physics of code” – what makes it powerful is that it can express any possible kind of application or logic. I don’t have a proof, but I’m pretty sure a single piece of guidance that separates “secure code” from “insecure code” would solve the halting problem (i.e. is provably impossible).

Security developers can (and do) tell developers to “strictly check all untrusted inputs”, but that’s entirely too high level: the standard developer can’t tell you what input is trusted or not, or what strictly checking it would mean. The only way to provide the right kind of guidance that’s specific and contextual enough is to put security professionals closer to the code and the design as it is being formed. In some settings, with very narrow and predictable kinds of applications and business logic being implemented, that might become more feasible, but in general security professionals can’t provide guidance for each specific programming problem because the number of security professionals is almost always dwarfed by the needed size of the development base.

In terms of guidance: there are more standards and guidance systems than you could ever use (PCI, HIPAA, OWASP, FedRAMP, SOC, ICO, etc.). They can’t be used because, despite being as specific as they can be, they fall short of being specific and contextual enough. Moreover, developers expect (reasonably) that they can remain in “development flow” without needing complex, latent, or otherwise out-of-band processes to make each and every decision.

Basically: developers generally assume this problem could be solved with guidance, but that’s completely wrong; no such guidance will ever exist and no security professional could make it. Security professionals generally assume that if developers just tried harder they could develop more secure code, but that’s not true either: developers are practicing a completely different discipline.

In this business setting (where developers greatly outnumber security practitioners) there are some ways to scale security.

One of them is to develop libraries that take care of the security-pertinent decisions up front, and allow developers to write code without having to think very much about security implications of small code differences (an example here might be “React”). However, time to market and business priorities tend to override resource allocation decisions that would put large parts of a work force (partnered with security professionals) to develop and maintain this kind of infrastructure investment.

Another solution is to require that all developer interviews include security professionals who have veto power, all developer teams are required to staff a certain percentage of security professionals (interviewed by the security staff) and that compensation directly reflect security development and operational practices. Of course, that’s not a viable solution for businesses seeking to grow quickly, or who treat those requirements as rubber stamps.

Anonymous July 25, 2019 11:17 AM

This will not change until businesses prioritize security. Developers are just doing what the business managers demand.

And businesses will not prioritize security until regulations force them to do it.

That won’t happen until we have GDPR+ in place in the U.S.

And we’ll never have that as long as corporate lobbies are so powerful.

Corporate lobbies will be too powerful until the 14th amendment is either repealed or rewritten to specifically exclude non-human “persons” from equal protection of the laws.

TL;DR This will never be fixed until we have a new constitution entirely. And that will not happen while the United States remains intact, because we no longer have a coherent polity and have not had one for decades, if not a century or more.

Bob Easton July 25, 2019 11:46 AM


I am NOT a security professional, but for many years was an ACCESSIBILITY professional, practicing the techniques of making the web accessible to people of various (dis)abilities.

While never seeing a survey to express the exact numbers, the arguments were exactly the same as in this article and comments. Simply replace the word “security” with “accessibility” and it is the identical situation. It was that way 15 years ago, 10 years ago, 5 years ago, and still is.

The only difference is that instead of leaving a site open to thieves and crooks of various sorts, the non-accessible sites are only turning away 15-20% of potential customers and their associated revenue. … and just as here, much of the responsibility goes to management not understanding the true values of either security or accessibility.

Deja-vu all over again!

justinacolmena July 25, 2019 1:11 PM


security professionals

You more or less get a gun and a badge with that title, code is proprietary, vulnerabilities are trade secrets, and your only job is to check receipts before letting consumers walk out the door.

it’s a programmer’s job to write secure code, but … less than half of developers can spot security holes.

Common programming pitfalls such as race conditions and deadlocks “exist”, and you have to work as a “team player” and learn to live with those vulnerabilities if you want to keep your job in Silicon Valley. Never call out a senior developer for a programming error unless you want a corporate lackey 9/11 hacker to call out a SWAT team to your home on a trumped-up federal felony warrant.

Petre Peter July 25, 2019 1:53 PM

This survey is testimony to the triumph of quantity over quality. Working code seems to be the norm, not quality code. From this we can deduce that developers are rewarded on the number of features, not the quality of the features. My guess is that this happens because, in the software industry, the average client [since software is a service] cannot distinguish between good and bad software; it’s much more difficult to sell quality software than to sell software based on quantity. The markets won’t fix this because on the surface the software works, and when it doesn’t it’s not catastrophic as it is with avionics and medical devices. Legislation is my main hope for quality software. It’s all explained in Bruce’s latest book and it makes sense once it gets pointed out.

Frank Wilhoit July 25, 2019 2:04 PM

The problem is training. Developers are “trained” (which is to say, they learn half a language and one-tenth of a platform, and nothing about quality or about security, either as an aspect of quality or as a thing in its own right) before being hired. They cannot be trained after being hired, because capital budget cannot be spent on training. This is a much larger problem and security is only one aspect of it.

John Carter July 25, 2019 4:36 PM

As one of the Linux kernel devs said: every bug is a potential security flaw if looked at from that perspective.

ie. The diff between the latest greatest and whatever is in production is a rich source of places to go looking.

ie. Security guidance zero. Stop writing bugs.

We haven’t got to level zero yet, so don’t hold your breath for much more.

The next thing to note: one of the best tools for producing CVEs these days seems to be fuzzing. ie. Throwing random shit at the wall, using heuristics and coverage to guide your million monkeys with malice aforethought.

I’m not sure what training you can give a dev beyond the instruction “assume your users are a million smart malicious monkeys”.

Drone July 25, 2019 4:43 PM

“68% of the security professionals surveyed believe it’s a programmer’s job to write secure code, but they also think less than half of developers can spot security holes.”

As a programmer, you can’t “spot security holes” when you have a Greedy Marketing Goon hovering over your head repeating “Is it done yet? Is it done yet?”

Clive Robinson July 25, 2019 5:08 PM

@ Frank Wilhoit,

This is a much larger problem and security is only one aspect of it.

What is called “lifelong learning” is expected in all true professions and several other Want-2B-professions.

The real difference is that in true professions it is required as part of a “Certificate to Practice” renewal, driven not just by professional indemnity but quite simply by “No Cert, No Practice”.

Want-2B-professions, which the more modern technical subjects are, are not constrained by either legislation or professional indemnity.

The result: in the case of true professions the employer carries a legal obligation to ensure their staff remain certified, and an employee could successfully sue an employer if due allowance was not made.

In Want-2B-professions the employer carries no liability other than the fairly normal “due diligence”. Thus an employee has no right of redress for not being given due allowance to remain current.

As has been frequently seen in the software industry, this turns employees effectively into serfs in a captive market. As they have no individual bargaining power, training and certification carries a premium price but is effectively useless as far as practice is concerned, and the automated testing is easily cheated by “cramming houses” that charge even more exorbitant rates.

Thus the reality is the certs are only of use to the likes of the Human Resources Dept, which uses them in a brain-dead fashion to filter candidates, but they are a great advantage to abusive employers. This faux certification process also ends up being “age discriminatory”, as well as discriminatory against regularly employed staff, who are in effect held in serfdom by employers who are even more incentivised not to supply training, and who push further against staff whose qualifications are aged, knowing their opportunity to move to another job is reduced by not having the latest qualifications.

In many respects the software industry would make significant strides forward if those working in it had by law to be currently certified practitioners[1].

However, where a faux market exists, those involved will just create a new faux market if legislation renders the old one not as profitable… After all, in the modern world discrimination is just as bad as it always was; it’s just that the way the targets are aimed at changes to appear “unbiased”. The worst of course are the “positive discrimination plans”, which are far from meritorious.

[1] The down side is it carries a very significant formal entry process with some real oddities. If you look in the UK, part of the entry process for some professions is to attend between three and six Association/Society-organised social events, where you are required to attend with a partner and pay effectively a week’s full income for the tickets. At these events will be “Members and Officers” of the Association/Society judging you and in effect marking your “networking” score card, on which your chances of getting a job on qualification will greatly depend…

Erdem Memisyazici July 25, 2019 5:25 PM

Just about everything you need to know in regards to secure development practices as a software engineer can be found on the OWASP Wiki.

Frank Wilhoit July 25, 2019 6:12 PM


My point was that even where companies understand the value of training, they have a built-in counterincentive due to the fact that all training has to come out of the operating budget rather than the capital budget. Most recently, this came into play with the Boeing 737 MAX debacle, where, for that reason, the customers’ willingness to buy the product was contingent upon no training being required.

Some companies, in some lines of business, understand that certain training is absolutely necessary and they budget for it; most others allow training requirements to take them by surprise, and then they are at the mercy of whether there is any discretionary money left over in the operating budget, which typically dries up long before the end of the fiscal year.

Joe July 25, 2019 10:26 PM

@Clive Robinson wrote, “There should be no finger pointing saying that’s not my job or at my pay grade.”

I’m not trying to dispute this point, but let’s take shoes for a comparison counter argument.

If you designed a pair of shoes, is it feasible to tell people who wear your shoes to walk or run in a certain limited fashion so as to not make the shoe break?

The onus of security should lie with the designer of a platform or product rather than its users, who simply wrote “codes” that compiled without error on such a platform.

This is like MS blaming its users for causing BSODs.

Alyer Babtu July 26, 2019 1:01 AM

Probably security is always a hopeless task, as it is not the subject of a real science, since it is not defined; rather it is delimited as a negative, e.g., “unauthorized access not allowed”. Somewhat analogous to “nonlinear”, which is an inchoate field in mathematics, as opposed to “linear”, which has a beautiful theory, partly because it is well defined. Perhaps it can be hoped that there are positively defined things that can be done, from which better “security” would emerge indirectly. Those things would be more straightforward and everyone would do them as a matter of course.

Otter July 26, 2019 1:07 AM

“If you designed a pair of shoes, is it feasible … ?”

Sure. Just don’t say it very loudly. Just say loudly that it has (or has not) a flashy decal of Betsy Ross’ flag. Just type it in all caps near the end of the EULA or safety instructions.

Better though to say nothing, and bribe a few lawmakers, bureaucrats, and opinion leaders to say that is the way it should be.

Next time you are near a street, look for all the people leashed to Big Brother’s spyphone. Notice how many eyes, fingers, and ears are glued to the bling. Do you need to ask what they think of a bunch of cranky conspiracy theorists?

Vimal July 26, 2019 1:28 AM

It is the responsibility of the board of directors to enforce security. And everyone is responsible for maintaining security and complying with it. Period.

Gunter Königsmann July 26, 2019 1:32 AM

Perhaps I am on a lucky island. But when I write code at work, this code is for an embedded system, and most of the discussions we have about the code are about how to make it secure.
That means… …perhaps it depends on the application, as when I write GUIs in my spare time the focus is on making it work…

Clive Robinson July 26, 2019 4:15 AM

@ Gunter Königsmann,

But when I write code at work, this code is for an embedded system, and most of the discussions we have about the code are about how to make it secure.

Yes there used to be a significant difference between writing “embedded” code and “application” code[1].

Part of the reason is that those involved with writing embedded code were selling a “good” not a “service”.

That is, the product had to meet the requirements of merchantability and fitness that were regulated.

For some reason the real story behind Quality Assurance is not known by many but it should be.

Back in the 1960s and 70s the world moved from valve/tube electronics on its metal chassis inside wooden cases in consumer goods, to transistors mounted on PCBs in plastic cases. It more or less killed domestic production of radios, gramophones and televisions that did not jump to the new technology. They did not jump for various reasons, many to do with not being able to re-tool.

The thing about transistors, apart from being small, light, low power and less expensive, was they were reliable, not just physically but electronically. Thus you could design for “no repair”, which made manufacturing a lot cheaper.

Which was all well and good, but to get them from the Far East to the West incurred considerable shipping costs. However, as the items were small and light, packaging needed to be a lot less, and they could be packaged up in pallets of 50–100 units.

This made the cost of transport very asymmetric. So much so that returning an item for repair would incur shipping charges greater than the product had cost to build and ship.

Things went well until complexity rose above a certain point; then the defect rate went up (just as it has in software). The cost of the “return rate” could rob a product of all profit, and could kill a company.

It turns out this was not an unknown problem: in the 50s a similar problem arose in British manufacturing. The British Standards Institute spent time looking into the problem and how to solve it. Whilst the standards produced little change in the UK, over in the Far East they grabbed them and implemented them, and due to buy-in from the most senior of directors all the way down to even the cleaning staff, the process worked.

At the moment software is still a “Home Industry”, but that is changing; currently the defect rates are still slightly worse from foreign software suppliers than domestic ones, but that is changing too.

Part of that change is more man hours spent in quality processes on less complex parts.

[1] In the past I’ve made the comment that the software industry is more “art than engineering”; that is, recognised engineering techniques from other disciplines were not seen in the majority of software development[2]. The exceptions being embedded code for industrial control, avionics and a limited number of consumer items. If this survey is to be believed then software writers are now actually waking up to the fact.

[2] One reason for this is that engineering processes are very slow compared to software, especially in the development phases. Few people realise just how much engineering and science go into basic items like bolts, to know what their characteristics are under all conditions. But then when you consider there might be as few as six bolts holding a wing onto an air frame… people’s perspective might change. The recent issues with the Boeing MAX aircraft should give people a clue as to why more time should be spent on “engineering software”, not “slapping code up on a screen”. Yes, it’s harsh of me to say that, but as I’ve indicated before, I do have a prototype satellite sitting on my work bench, and the “return cost for repair” on that would be quite literally astronomical.

Clive Robinson July 26, 2019 5:24 AM

@ Erdem Memisyazici,

Just about everything you need to know in regards to secure development practices as a software engineer can be found on the OWASP Wiki.

Err no not in the slightest.

Whilst it might be of help to some web developers, its use is limited.

In fact it highlights another aspect of the “Software Industry” which is a belief in “Unicorn Products” that magically solve all problems.

As history shows, whilst there have been no Unicorns there have been buckets and buckets of snake oil sold to people at a managerial level who are not experienced enough to know what the issues really are. Thus that rainbow-coloured marketing presentation, with its promised crock of gold at the end, just like "fairy gold" quickly turns into near-worthless dead weight.

Whilst security is a Quality issue, it's also a moving target. Whilst some tools can find incorrectly handled input, and others a missing state, they cannot tell you whether these are errors or not.

And none of them yet deal with the issues of bubbling up attacks where the likes of hardware implementation issues can change states in memory below the CPU ISA level in the computing stack.

These are issues I've been dealing with for over a decade now, and it appears few others dare enter. I had hoped that Meltdown and Spectre would goad a bit more activity, but it would appear not.

65535 July 26, 2019 6:41 AM

@ ismar

Although .NET is supposedly a stronger framework, I really wonder how strong it is in the big picture. Is it strong enough to justify MS's attempt to monopolise the C# area and its compilers?

@ Joe

“This is like MS blaming its users for causing BSODs.”

I hear you on MS products. I am still un-snarling the residual negative effects of the MS patches pushed out in July [I can't wait for Patch Tuesday in August / cough]. Yes, a few people still depend on outside coders to enhance MS OSes, MS Office and other offerings.

To All

Aside from low-quality coding – how much "bad" consumer coding is purposely/maliciously spat out due to the likes of three-letter agencies strong-arming or scamming companies and IT standards bodies? Remember the elliptic curve episode?

InsecureAtAnyClockSpeed July 26, 2019 7:07 AM

Just about everything you need to know in regards to secure development practices as a software engineer can be found on the OWASP Wiki.

There is good stuff in and about standards and frameworks like OWASP, CERT, MISRA, and BSIMM. But these do not help me justify the extra time and expense needed to even minimally implement the practices.

Yes there used to be a significant difference between writing “embedded” code and “application” code

I’m sure there are parts of industry I’ve missed in 20 years but I’ve seen “secure” government software being built and seen software built by the security industry. It doesn’t encourage me. And I’m concerned events like 737-MAX suggest the bar is slipping down even in the storied ivory towers like avionics.

Denton Scratch July 26, 2019 7:32 AM


“…and like any Quality Process it should begin befor the project and be a key driver through out not just development and maintenance out and beyond End of Life for the product.”

I once (just once) worked on a major project in which the Project Leader was the head of QA (we were nearly all contractors, including that head of QA – the permies were few and far between). The project terminated after four years' effort, when the company in question was forced to merge as a result of the 2008 banking crisis. (No, don't bother trying to guess.)

I liked working in a quality-driven environment. I have been there once since – as hired help on a third-party project.

Andy July 26, 2019 7:46 AM


I’ve been saying for atleast a decade now “Security is a Quality Process” and like any Quality Process it should begin befor the project and be a key driver through out not just development and maintenance out and beyond End of Life for the product.

I’ve heard Gary McGraw tell developers something like: (paraphrase) The most important thing to remember about security bugs is that they are bugs, pure and simple. Who thinks it’s OK to ship buggy code?

It’s also good to point out to developers that long before code gets exploited it’s probably going to crash more than once from people poking at it. As you said, security and quality go hand in hand.

Denton Scratch July 26, 2019 7:59 AM


“What is called “Life long Learning” is expected in all true proffessions and several other Want-2B-professions.”

Well, I don't consider software development to be a "profession", in the sense of a practical discipline founded on a basis of sound academic research. "Life-long Learning", for most software developers, just means trying to keep up with the torrent of new languages, libraries and platforms that seems to have accelerated greatly in the last ten years.

In my experience, it takes a good six months (full-time production work) to become a competent programmer in a new language. Knowing the libraries associated with the language is tougher; for example, the libraries considered ‘standard’ with Java were a moving target, with important new libraries introduced every 2-3 months, and different libraries adopting different API conventions (because the libraries were written by companies with completely different cultures).

Modern web-development seems to be all about Javascript – and about ridiculously-complicated Javascript libraries imported from 3rd-party sites. This is partly driven by the new predominance of mobile devices among clients; you more-or-less have to use JS if you want to build a site that works as well (or better) on a mobile device, and 3rd-party libraries (e.g. the awful JQuery Mobile) are taken to be the least-costly way of achieving that.

Of course, accessing such sites is often a pain in the neck for someone like me, who runs something like NoScript. You have to guess which 3rd-party script is the one that's preventing the site from rendering (and of course, the developers wouldn't dream of hosting the script on their own site; they don't QA 3rd-party scripts anyway, so they may as well just import the latest version from the 3rd-party site, or the 3rd-party's chosen CDN).

Don’t get me started on Node.js, which positively discourages QA. How can you possibly QA a system that has several hundred dependencies, any of which could change at any time?

Thank DOG that I’m now out of website development.

cat July 26, 2019 11:41 AM


“I’ve been saying for atleast a decade now “Security is a Quality Process” ”

NOT agreed. Quality is all about testing.
How do you test against something you don't know yet?
How do you test against an as-yet-unknown attack method
you never even dreamt about?

You remember when the IBM XT introduced memory testing at boot?
Truth is, if you want to test EVERY memory cell for EVERY
POSSIBLE flaw, it takes YEARS. Are you ready to wait that long
on every boot?

Then there are tons of examples where the source code is perfect
but the compiler itself introduces a vulnerability. One example:

We are all humans and can't foresee all the possible problems,
billions of combinations all at once. Too many variables. And then
there's the timeframe. There are managers and marketing people who
have already set the deadline when the product MUST be ready – otherwise
competitors take the market and there's no payday for you, no matter
how much you'd like to test your product.

Also there's another fundamental problem – the Von Neumann architecture.
Data and instructions must be separated. Most current problems are due to the fact that some data is interpreted as instructions. Buffer overflow is one example. No bulletproof solutions exist yet – watch the Pwn2Own competitions and see how easily different safeguards like ASLR or DEP are bypassed, like child's play.

There are also numerous cases where a vulnerability has been sitting there for YEARS and no one ever noticed.

And I remind you of Stuxnet – none of the big AV companies detected it. It was detected by VirusBlokAda, a small Belarusian company nobody had even heard of. The big AV companies were shamed for not detecting Stuxnet for years; search for Mikko Hyppönen's speech.

How do you test against the unknown?

A Nonny Bunny July 26, 2019 3:03 PM

@Ross Snider

Part of the reason security professionals can’t give guidance is that code just isn’t amenable to that kind of guidance. […] I don’t have a proof, but I’m pretty sure a single piece of guidance that categorizes “secure code” from “insecure code” would solve the halting problem (i.e. is provably impossible to do).

A rule of thumb that gets you halfway is: complex = insecure, simple = secure. Obviously you can write simple and highly insecure code, but people will be able to properly review it if it’s understandable. So make code as simple as possible (or practical), but not simpler.
Likewise, the halting problem is not a problem any programmer is ever likely to suffer from in practice. It’s easy to write algorithms that are guaranteed to halt, and, if need be, to prove that they will.

Marv Waschke July 26, 2019 4:58 PM

Software security is a complicated problem. Having been a software architect and developer since the 1970s, I’ve seen a lot of security issues close up. My overall impression is that software is more secure today than it used to be, but assaults on software and the consequences of software malfunction and subversion have all intensified. The result is a much higher rate and seriousness of security breaches.

I take issue with those who say security is an afterthought. My experience is that security is often considered and well thought out in the initial project planning phases, then, as the project progresses, given less and less emphasis. Security features seldom appear on the marquee. Developers who complete a stunning feature that the product managers place at the top of their presos get praise and bonuses. The guy who takes the edge off performance with methodical security checks, not so much. Finally, somebody in QA gets up the nerve to try a penetration test (probably after hours on his own time) and triggers a security crisis, but by then, band-aids are the only remedy.

I’ve seen this scenario changing some. I think security is being treated more seriously, no longer relegated to clueless guys with flattop haircuts and ramrod postures, but it still does not have the cachet of a feature at the top of the current hype-cycle, and probably never will.

But this is no different from other professions whose practices have grave consequences, such as physicians and civil engineers. Celebrity neurosurgeons are still subject to malpractice charges.

I see regulation and certification of software engineers as a solution. I’d like to see regulations that require the participation and endorsements of certified software engineers on some types of projects, just like you can’t build bridges or oil refineries without civil and chemical engineers. I would also like codes of ethics, professional governing bodies, and statutes that would punish software engineering malpractice with liabilities and penalties as severe as those inflicted on civil engineers and neurosurgeons.

Software is serious today and should be treated like the serious work it is.

Clive Robinson July 26, 2019 5:26 PM

@ cat,

NOT agreed. Quality is all about testing.

It appears you understand neither the Quality process nor the testing process…

The testing process is very generalised and is the basis of the scientific method. The way it is supposed to work is that you come up with a functional hypothesis, which you then test against for variance. If you find no variance then you’ve actually not learnt anything other than “as expected”. Thus part of testing is to change things to see what the tolerance is on the system you are testing. The way things start to go wrong, if you have the time to reflect on it, tells you a lot about the limitations of the way the system has been built, and thus can drive you forward to find the weak parts of the system and fix them.

The quality process is about how you go about the entire creative process from beginning to end, across the entire product life and beyond. Testing forms a very small part of this.

A question for you,

    Do you know what a history file is? And why they are so important?

Marv Waschke July 26, 2019 8:34 PM

You’re right. Securing software is difficult. There are many variables and absolute security is not possible. There are more lines of code in Windows 10 than parts in a Boeing 737 Max (or there were the last time I looked), but that does not mean more secure systems are impossible.

There are well-known ways for reducing security risks. Risk analysis is one example. Testing is another. The problem is that we engineers have not held ourselves to a high enough standard.

For example, a better job of risk analysis of conditional execution might have revealed the Spectre and Meltdown chip flaws much earlier. I’d like to see a professional malpractice review of the chip design processes to determine if the chip designers were following best practices as they were designing. Such a review is not possible today because there are no codes of chip design best practices and no legal requirements for chip design engineers, at least none that I am aware of. There could be, and we could up our game.

Another example: I’d like to see software engineers personally responsible for reviewing the test plans for the software they develop. Not a company requirement, but a legal responsibility for licensed engineers. Competent engineers have vital knowledge of the vulnerabilities of their code. If the test plan is inadequate in the opinion of their peers, the engineer should be liable along with the test engineers who have their own insight into the potential weaknesses of the system. In other areas of engineering, that’s a reasonable expectation.

I am tired of hearing that software engineering and software engineers are special. Of course we are. We can do so much. But we can also practice responsibly. We owe it to ourselves.

aw, c'mon! July 27, 2019 4:55 PM

I feel that the golden era of computer programming (some still call it “coding” rather than even “scripting”) is at an end.

To build an entire empire or civilization upon the delicate flow of electrons and volatile memory seems reckless to me. And I used to want to become a computer programmer(!)

For most of recorded history, and also for non-recorded histories, there have been alternatives to electronics and computers. This is not only because they didn’t seem to exist yet. It’s also because ecological and material and social techniques existed.

Those techniques still exist.

It’s still a good time to learn and implement security techniques.
Electronics and their derivative inventions and accessories and wares are not likely to be the cornerstones of reduced stress security techniques.


InsecureAtAnyClockSpeed July 28, 2019 9:15 AM

For those trying to disentangle security and quality, turn it upside down — would you try to claim security on a low quality product?

Would you claim security only based on a robust design?

Could you have assurance without testing? What assumptions would you need to rely on for that to stay true?

Anders July 28, 2019 10:49 AM


There’s an old saying – you can prove that software is not secure,
but not vice versa: you can’t prove that there are no vulnerabilities 🙂
If you don’t find them, someone else most certainly will!

Clive Robinson July 28, 2019 1:20 PM

@ Erdem Memisyazici,

Those principles apply to multiple disciplines, not just the web. If you understand the basics, you’ll do fine.

Did you read my post at the top of the page?

@ InsecureAtAnyClockSpeed,

You ask three questions,

would you try to claim security on a low quality product?

First define what you mean by both “security” and “low quality”.

Let’s say a low-quality thermostat: a microcontroller, a couple of thermistors, a few membrane buttons, a very low-cost LCD, a transistor to drive a relay, and a capacitive divider and bridge rectifier as the PSU. But no other connections, inputs or outputs. It’s certainly secure from remote access, control, or the loading of malware etc. Even the front-panel controls can be made secure with a four-to-six-digit entry code.

However, if a person has front-panel access they only need a small screwdriver and a small double-ended croc-clip lead to short the base of the transistor high or low, to turn the relay on or keep it off. So even the modest security of a four-digit lock code is probably enough.

Would you claim security only based on a robust design?

Well, again it depends on what you mean by “secure”, but yes, you can design software in a way that makes it immune to certain types of security fault. For instance, tampering with system memory can be protected against using encryption built in behind the bus drivers in the CPU chip, plus an appropriately sized checksum for each byte.

Could you have assurance without testing? What assumptions would you need to rely on for that to stay true?

The answer to that compound question is a qualified yes. If you take the same steps engineers and architects do when designing bridges and buildings, and the work is correctly supervised, then yes.

If, for instance, you get in 100 25mm bolts, you have three choices: trust the manufacturer and use them as-is; over-design so that three bolts are used where a single correctly made bolt would do; or test a random percentage of the bolts to destruction.

You do the same in electronics when designing “intrinsically safe” systems: not only do you over-specify the components, you make double or triple circuits in series or parallel, such that there have to be two or three failures before the compound circuit reduces to a fail-safe mode.

Contrary to what many people think, exactly the same techniques that improve the “availability” of a system can be used to improve its security.

If you look back on this blog far enough you will see that I and one or two others have had extensive conversations on doing exactly this.

But finally before you ask there are known techniques to build secure systems from known to be insecure parts. It’s not as difficult as you might think.

Wael July 28, 2019 1:40 PM

@Clive Robinson,

It’s certainly secure from remote access control

S-s-s-seriously? I’m surprised this came from you. Be honest: if this comment came from another person, wouldn’t you have had a field-day with it? 🙂

I can almost see your reply, which would have started by something along these lines:

Yes and no. With a little hinky (spellchecker suggested “kinky”, but I caught it in time. Not that it’s wrong) thinking, one can … Ummm: define “remote”

Clive Robinson July 30, 2019 6:48 AM

This got passed my way the other day,

Just remember when you “measure” it only works fairly when there are real “measurands”.

The problem is if you cast a skeptical eye over it you can easily see how both sides could “game” any “measure” because the software industry does not have any real “measurands”.

The oldest joker in this pack used to be “lines of code a day”, there were so many ways to “fritz it” that it was only dishonest people that won on it.

I remember a friend showing me, years ago, the equivalent of a “higher-level language” – in reality a “macro expander” – that he had developed. On average it generated six lines of C source for every macro he wrote and used. Also, the use of such stock macros made the C code longer, as all such translation systems do, because the macros, being an incomplete higher-level language, force a certain coding style requiring more “glue”. The fact that it also reduced the number of “key slip” and similar annoyance errors further improved his productivity…

rick July 30, 2019 9:24 AM

@Erdem Memisyazici
Just about everything you need to know in regards to secure development practices as a software engineer can be found on the OWASP Wiki.

Crappy security is, IMHO, also a developer-training issue.

And the OWASP wiki is a good source for security-related info when developing web applications.

But what if a user is not developing a web app, but e.g. some cryptocurrency, a smartphone app, or a stand-alone application? Any good resources or books to consult for these cases?

Phil August 2, 2019 4:20 AM

My current client doesn’t want security. At all. Really.

They gave us root credentials to all their environments (testing, pre-production and production). We told them several times we don’t need so much power and they should give us lower credentials. They refuse. They do things that way with all their contractors and they won’t change.

They have an internally developed SSO system that is very cleverly designed and packaged to be installed and configured on any application server by themselves without us or any other 3rd party participating anywhere in the process and having access to the private keys and certificates. But they insist that we must be the ones executing that procedure in their place and sending them back the generated secrets via email. I’m sure the person who created this tool would get a heart attack if they knew the current procedure completely defeats the initial design.

That’s the kind of safety-aware person I’m currently dealing with. I want to write secure code that I could be proud of. But I’m explicitly forbidden to do so, because they don’t want to hear about the subject. They only want things done fast and cheap. Security is just a cost to them.

We’ve tried several times to educate them on security with no success. In the end we just go with the flow. My employer gets his contract money and I get my paycheck. I simply limit the risks to my best effort with the available time and resources.

Clive Robinson August 2, 2019 9:58 AM

@ Phil,

They have an internally developed SSO system that is very cleverly designed and packaged to be installed and configured on any application server by themselves without us or any other 3rd party participating anywhere in the process and having access to the private keys and certificates. But they insist that we must be the ones executing that procedure in their place and sending them back the generated secrets via email. I’m sure the person who created this tool would get a heart attack if they knew the current procedure completely defeats the initial design.

I would not be too sure on that…

If what you say is true, it sounds like you are being set up to take some or all of the blame should there be a security breach etc.

In essence by spreading the risk out to a sub contractor they are in effect “externalizing the risk” onto your organisation and your organizations insurers.

It’s the old “hot potato game” where the last person holding it is the one who gets burnt.

So the defence is to pass the potato on to someone else, just be sure it’s not you who gets left holding it.

Have a look at,

And take note of this paragraph,

    worked for Amazon [as] a systems engineer in 2015 and 2016. It is worth noting that Capital One’s server was hosted on Amazon Web Services which [Amazon] has been lately making headlines for exposing medical, influencers, social media, personal, financial and military data online

I’ll let you guess who’s going to be used to externalize and mitigate Capital One’s issues; with just a quick stirring of the paddle, that water is going to get real muddy real quick.

Whilst it might eventually settle down, by that time Jo(e) Public is going to have seen Amazon’s name plastered all over it and not so much Capital One’s, which will keep the shareholders and lawyers happy and the executive bonuses more or less intact.

As has been observed, “When supping with the devil use a long spoon” and “When swiming with sharks, don’t think they are your chums, remember they just see you as chum…”

Rick Leir August 16, 2019 9:36 AM

@InsecureAtAnyClockSpeed You hit the nail on the head. As long as the developer’s performance is graded on delivering quickly with no care for security, the product will be insecure (and security-minded developers will be unhappy in their work).

