Comments

Anura April 4, 2014 4:46 PM

On the lighter side, a five-year-old kid finds security flaw in Xbox One:

http://www.bbc.com/news/technology-26879185

The boy worked out that entering the wrong password into the log-in screen would bring up a second password verification screen.

Kristoffer discovered that if he simply pressed the space bar to fill up the password field, the system would let him in to his dad’s account.

RODRIGUEZ April 4, 2014 5:39 PM

Intel Releases $99 “Minnowboard Max,” An Open-Source Single-Board Computer

http://techcrunch.com/2014/04/03/intel-releases-99-minnowboard-max-an-open-source-single-board-computer/

Not to be outflanked by rivals, Intel has released the $99 Minnowboard Max, a tiny single-board computer that runs Linux and Android. It is completely open source – you can check out the firmware and software here(1) – and runs a 1.91GHz Atom E3845 processor.

(1) http://www.minnowboard.org/

beat me like an egg April 4, 2014 10:21 PM

isn’t DDG based in the US?

why not use startpage instead for searches via the form on this site?

DB April 5, 2014 3:39 AM

Something I noticed this week: Mylar, a javascript framework that does real end-to-end encryption for web apps… Also note the links for the paper about it, and another one describing its search engine.

I am enthused that lots of people seem to have been working on things like this lately… maybe someday soon it will become the standard way of doing things.

Jacob April 5, 2014 6:36 AM

@ beat me like an egg

DDG is a US company, and as such they cannot promise that they are not under a real-time intercept order that compels them to accumulate your search history. They do need to publish a warrant canary, at the least.
And that’s the differentiating point marketed by IXQUICK, which is a Dutch-based outfit.

The DDG search box on this site uses the DDG engine to search only this site, unlike the StartPage form, which searches the whole web.
However, the search terms you enter above are transmitted to DDG for processing by their engine.

name.withheld.for.obvious.reasons April 5, 2014 7:11 AM

The Big [BANG] Picture
In engineering, the focus is on delivering to customers a product or service and understanding what they want…not an especially simple process. It requires actually listening to the customer's requirements, perceptions, expectations, and experiences with other platforms. Over the years I have learned to recognize when a client is not clear on their needs, or when a vendor is more interested in what they are selling; my job then is to pull back the stick, hit the throttle, and plan another approach for a clean landing. This is not a common approach-others might see it as a chance to "tell" the customer what they want, and attempt to deliver a product that the customer may not understand or want and that is often sold using the phrase "includes new technology". The technology is not a product so much as a regulatory instrument whose subjects, unaware of its nature, behavior, use, or application, cannot perceive its presence or effect in their lives. In addition, little is understood about its "future" effects on society, let alone on each individual. This, I believe, is where the greatest damage is done and where organizations should be focusing more of their efforts. We MUST stop blindly embracing technological "solutions" and instead concentrate on solving problems.

Social Networks won’t Feed [the/your] Kitty

The path future generations may be heading down leads away from very valuable core lessons and methods that have provided resilience and performance in society. An example is the cyber security insurance model for risk mitigation, or the displacement of "care"; no reasonable person would sign a contract that resembles a typical software license/EULA agreement. What would make more sense would be a "net positive" development, innovation, and business model instead of an "extraction model". Instead, suppliers seek cover behind a contract irrespective of product flaws and failures; customers continue to experience constant maintenance requirements, costs, and undue risk to business operations that may be supplemented by insurance (yet another additional cost). Quality can provide value and must be the first consideration, instead of changing the expectations of the customer/client/consumer. One only has to review the periodic vulnerability reports issued by US-CERT to understand the extent of the problem.

I Know Nothing
In addition, relying on the willful and unknowing ignorance of the customer is not an opportunity. It is potentially an immoral act; it says to others that how companies and engineers get to market is not as important as the arrival (by whatever means necessary). Taking advantage of individuals without an "informed" process is also plainly unethical, and I would argue unlawful. The recent spat surrounding the authority, rules, management, and processes of the intelligence community is endemic and is a reflection of any number of other systems and institutions. Education, both higher and continuing, fails to produce the requisite critical thinking that provides the necessary feedback when addressing flawed processes or procedures within any number of institutions and society at large. The National Security Agency's behavior as documented in the FISC Memorandum of 11 April 2011 demonstrates that the agency asks for forgiveness, not permission, when looking for approval from its oversight body, the FISC. I am also reminded of the culture that grew out of NASA by the 1980s and the Challenger space shuttle investigation-the internal focus (historically the institution supported a culture of excellence) changed to focus more on the "things" and "costs" and not on the quality and safety of its processes and procedures.

We’ve Pwned your Data–Pay Us Peasant
Commercial data warehouse companies, credit reporting agencies, government agencies, and retail businesses stand on one side of the financial, and non-financial, credit market (Big Data). On the other side, without representation or recourse, is both the victim and the customer. Using the information that has been gleaned (aggregating most forms of behavior and public and private transactions, while informing the customer that they are being responsible), these entities claim to represent your interests; they are not being intellectually honest, as they provide nothing but your data as a service to others. Protecting the class of persons that can be described as finance, commerce, and government is important-individuals/customers/citizens are not. The credit reporting agencies collect, store, and sell credit reports ("credit report" is an inaccurate description-it is a dossier) to anyone willing to pay. Collection is simple: there are no restraints on the addition of records to a dossier, and no requirement to vet and verify the veracity or completeness of the information-the victim/customer is held liable for a system that they are unable to attend to in any meaningful fashion. Individual credit reports have "redacted" information about you that you cannot be privy to-a modern day scarlet letter.
In addition, these agencies have no compunction or sense of responsibility with data that they "collect" but do not "create". The Fair Credit Reporting Act is legislation that defines a relationship between the data broker, reporting entities, and the customer/client (credit report data). It is a one-way relationship wherein the customer is subservient to the businesses and agencies that hold information about them. In agreements where the business claims to "not disclose information to third parties", another clause holds them harmless if the data is "inadvertently" disclosed. Not only have these agencies promised to be discreet; the fact is that this data is SOLD to other companies that do more than just analyze it-they aggregate data from many sources and provide clearinghouses for information on persons who aren't even aware that the data exists. The problem here is the broad scope of record sets, the number of persons for which data is recorded, and the fact that proper discovery in a court case is not possible. This gets much worse if we consider what the United States government is capable of; the vast collection of personal data represents the single largest threat to liberty in the history of the country. I do not believe that technology is providing a service to humanity-we need to rethink our role and responsibility.

Yes, I’m in charge, but that is not my Job
The individuals responsible for corporate direction (rewarded for success) have little to do with the processes that are necessary when failure occurs, and find others (punished for failure) to hold accountable. Whether it is trade, policy, financial and commercial monocultures, or highly consolidated centers of power, the blatantly dishonest and dishonorable statements made by top-level representatives of an organization go unquestioned by those tasked with vetting the integrity of these organizations. When British Petroleum, and/or their agents, spill millions of gallons of oil in the ocean, how is it that not one person is held criminally culpable? If I accidentally dropped a dixie cup on the sidewalk I could be fined one thousand dollars, and possibly charged with a felony if a spotted owl were present at the scene. But if I have a business with the financial means to influence the outcome-I can.
Fairness in our society is mostly conceptual; the institutions that feign fairness are many. The courts, large monopolies (telephone and communications companies are a classic example), and the medical and insurance industries are a few examples of organizations that post placards to fairness on the wall but fail to meet the criteria. If I have money, I can access these institutions and expect the outcome to be positive and in my favor-without money the opposite is true. In other words, the worst part of a capitalist system is considered the most favorable-by a few. From an intellectually rational perspective, it appears that persons without money can EXPECT to be treated unfairly irrespective of their circumstances, character, or intent.

It is interesting that larger formal events and issues parallel the need to do the same within the context of society (irrespective of the scope of community). Engineering today is about conformance–and not to excellence, results, or innovation. Corporate consolidation is narrowing the opportunities for other participants in the technology market, and the focus on acquiring technology is replacing the development of technology that is useful to society. Benefits accrue to a few and the responsibility or burden is held by the wider public. This does not represent a net positive outcome of the use of technology-it may appear to work in specific enclaves, but in the wider context the risk side of the equation is ignored or dismissed.

Going to Fail
What follows is an anecdotal description, generalized for brevity, of the combination of effects that result in institutional failure (scores of researchers have performed exhaustive analysis of the problems identified).

Defense industry contractors are an example of "guided completion", where the vendors and management control the processes and the "customer" is tied to a chair until payments are made. Pentagon program managers may begin with an initial requirement (the ability of the DoD to perform useful projections of future requirements is not laudable) and solicit bids from domestic sources (there are source/content requirements, 60/40). The contracting space is "owned" by a select number of manufacturers, and thus the DoD procurement process is a captive market. Everyone knows the rules and understands that government contractors suck from the tit that is the taxpayer, but all turn a blind eye to the fact that this is a broken system.
Pentagon procurement itself is a significant drag on DoD performance by almost every measure. Why does this never get fixed? The taxpayer is the "customer" and is never present or informed about their purchase, so to speak. Program managers may engage a program with a company that initially promises to meet specific contract performance metrics-the most frequent result is slipped schedules and cost overruns. Congressional oversight is ineffectual-congress members' elections depend on contributions from defense contractors, and they are not willing to hold the industry accountable. Ultimately, the result is the misappropriation of hundreds of billions of dollars and the unnecessary growth of budgets where cost containment is anything but possible. Over a decade, multiple trillion-dollar losses in efficiency can be identified, but the courage to fix this problem remains elusive.

  1. Multiple missions and mission creep; clarity of purpose, respect for historic roles and responsibilities, and external/independent review are lacking.
  2. Estimating future requirements is an exercise in reading crystal balls, cloudy crystal balls. The Pentagon embraces an asymmetric operational model that supports big programs over multiple generations with limited value.
  3. The procurement process and the industry are largely inefficient; fixed budget-cycle time scales, the expenditure calendar, program and requirement specification and management, cost and production management, and the supplier/acquisition market structure all work against cost containment.
  4. Industry participants know the problem; contractors, Pentagon officials, congress, and most secondary supply chain vendors are cognizant of it too, and none are incentivized to resolve it.
  5. Elected officials cannot respond to the problem; fear and perception rule the day. The fear of losing campaign funds drives the way congresspersons vote and how effectively they perform oversight.
  6. Taxpayers are effectively held hostage to a budget and process that they cannot resolve. The public is left out of their representative government by this process. Thus, the customer (citizen) is not able to participate in a system where they are financially responsible, and at the same time sends their sons and daughters to serve a government that does not serve them.
  7. Structural economic issues: the increase in productivity and commercial and industrial automation, the off-shoring of labor intensive tasks, and the movement of services to foreign entities to reduce costs and increase corporate profits. Our society has been headlong in shedding jobs, as labor costs are the largest inputs to a capital enterprise. The false dichotomy is that we can expand the economy while at the same time reducing the number of available positions. Efforts to address the macroeconomic impacts of a shrinking labor market in nearly every sector continue to go unidentified and unaddressed. The EXPERTS don't want to be the messenger; it's better to make something up than directly address the issue and get fired–goto fail.

nobodyspecial April 5, 2014 9:28 AM

DuckDuckGo isn’t meant to protect your searches against the NSA, even if the search engine was based in the Netherlands there is no guarantee that the secret forces of the MMB aren’t tapping the cables.

DDG is good because it doesn't store all your searches and pass them on to all its advertisers. I'm only marginally concerned that a foreign intelligence agency knows about my posts on schneier – I'm more worried that Amazon puts up the price on a book I want because Google told them I was searching for it.

DB April 5, 2014 4:16 PM

sweet! thanks Nick P!

As I mentioned above, I'm not so thrilled about this one particular implementation of an end-to-end encryption system so much as I'm thrilled that more and more people are thinking about these types of things and trying… The more people who see end-to-end encryption as important, the better off we all will be eventually. My hope is that eventually it becomes so commonplace that everything and everyone is using it for every communication channel possible. That's not likely to happen fully in my lifetime, I know, but a slow but steady movement in that direction would be nice.

kashmarek April 5, 2014 5:34 PM

At this web page:

http://yro.slashdot.org/story/14/04/05/1738226/more-on-the-cuban-twitter-scam

I found this interesting tidbit…

"One previously undisclosed top-secret document–prepared by GCHQ for the 2010 annual "SIGDEV" gathering of the "Five Eyes" surveillance alliance comprising the UK, Canada, New Zealand, Australia, and the U.S.–explicitly discusses ways to exploit Twitter, Facebook, YouTube, and other social media as secret platforms for propaganda."

The above seems to relate a lot to:

Why no one trusts FaceBook to power the future

http://search.slashdot.org/story/14/04/05/1631222/why-no-one-trusts-facebook-to-power-the-future

“… Facebook intentionally makes it hard to leave. Even if you delete your account, your ghost remains—even when you die, Facebook can still make money off you. And that’s not behavior fit for a company that’s poised to take over the future.”

That is the Hotel California theme, "…you can check out anytime you want but you can never leave." Incidentally, that was also the AOL model.

Nick P April 5, 2014 6:07 PM

@ DB

You’re welcome!

“I’m not so thrilled about this one particular implementation of an end-to-end encryption system so much as I’m thrilled that more and more people are thinking about these types of things and trying… ”

I agree. It’s the sole reason I’ve put so much technical data and so many paper listings on this site over time. The good news is, as those papers show, there’s been a steady amount of work across the entire security spectrum for the past 10 years. I hoped the NSA revelations would inspire more capable people to get in on it en masse. Yet I haven’t seen that: mostly well-intentioned amateurs who lack expertise.

That’s why I like the various clean slate efforts, like DARPA’s and the Ethos project, that make systems safe from the ground up. The easier it is for amateurs to code a safe/secure app, the more likely it is to happen. Then easy programming tools can be built on them with security embedded, and mainstream programmers can convert more easily. That’s my vision of it, anyway.

DB April 5, 2014 7:07 PM

@Nick P

“Well-intentioned amateurs” can eventually become experts, if they have a good head on their shoulders and put in the energy over a long period of time. So I see a big influx of “well-intentioned amateurs” due to the NSA revelations as a good thing for the distant future, just not the present per se. I consider myself in this “amateur” category, for example, and only time will tell which sub-category I’m in (whether I eventually become an expert or not–and what is an expert anyway? Isn’t it more of a relative thing; the man with one eye is king among the blind, and all that).

Something else I’d like to mention: the way I see it, no end-to-end encryption is really designed to thwart targeted attacks, since obviously when it’s properly done the weak points are the endpoints! It’s designed to thwart super duper easy centralized mass attacks only, and to make mass attacks much more expensive by forcing the attacker to attack every endpoint worldwide (not NSA-impossible, just significantly more expensive/ambitious). So no end-to-end encryption can ever claim to be “NSA proof” really, and measuring any of them against some ideal “NSA proof-ness” might be a bit overzealous. That said, you do raise some interesting points in your analysis of that one, and I appreciate that.

The biggest hole I saw in Mylar right from the start was that server-side searching is available… I was thinking, “what? that’s going to be interesting how that doesn’t simply punch a big hole in end-to-end encryption…” IMO a better approach might be to make it cheaper (less resource intensive) for clients to do the searching themselves (of private data anyway; public data is another story)… That way the server is entirely cut out of the picture and doesn’t need a complicated system to prop it up. It still doesn’t touch attacking the endpoints, of course.
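To sketch what I mean (a toy Go example of my own, with nothing to do with Mylar's actual code): the client decrypts its own documents locally and builds a tiny inverted index, so the server never sees plaintext, query terms, or per-term access patterns.

package main

import (
	"fmt"
	"strings"
)

// buildIndex runs entirely on the client, over already-decrypted documents.
func buildIndex(docs map[string]string) map[string][]string {
	idx := make(map[string][]string)
	for id, text := range docs {
		seen := map[string]bool{}
		for _, w := range strings.Fields(strings.ToLower(text)) {
			if !seen[w] {
				idx[w] = append(idx[w], id)
				seen[w] = true
			}
		}
	}
	return idx
}

func main() {
	// Pretend these are the plaintexts after local decryption.
	docs := map[string]string{
		"msg1": "meet at noon",
		"msg2": "lunch at noon tomorrow",
	}
	idx := buildIndex(docs)
	fmt.Println(idx["noon"]) // both message ids (order varies with map iteration)
}

The cost is shipping or caching the ciphertexts client-side, but for mailbox-sized private data that seems a tolerable trade for cutting the server out entirely.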

Nick P April 5, 2014 9:13 PM

@ DB

“”Well-intentioned amateurs” can eventually become experts, if they have a good head on their shoulders and put in the energy over a long period of time.”

Good point. Building stuff that doesn’t get the job done will make them better or make them quit. This separates the wheat from the chaff. What we really need, though, is an effective knowledge transfer of INFOSEC to new generations. We need to give them the right mental framework to securely design, build, deploy and maintain systems. More tools would be nice, too. If we don’t have these, then the development of the necessary expertise on the pro-liberty side will continue to happen at a snail’s pace.

Actually, I think snails have covered more ground than the high assurance security community. Embarrassing loss, eh?

“Something else I’d like to mention: the way I see it, no end-to-end encryption is really designed to thwart targeted attacks, since obviously when it’s properly done the weak points are the endpoints!”

I agree, if only because it’s outside the scope of the end-to-end scheme, unless it’s an appliance. The workable e2e secure phone system I posted here a while back was a briefcase with several tiny computers in it. It explicitly figured the endpoint would get attacked, so the middle end did data validation and ensured transparent encryption if a secure call was taking place. The middle end could also be implemented with safe hardware, a separation kernel, etc. So, in this scheme, you had endpoint security and end-to-end encryption of voice.

Most don’t try to secure the endpoint as you pointed out, though.

“It’s designed to thwart super duper easy centralized mass attacks only, and to make mass attacks much more expensive by forcing the attacker to attack every endpoint worldwide”

It might. That’s certainly the idea behind the push for encryption. I’m concerned that NSA capabilities such as QUANTUM, combined with massive endpoint insecurity, mean that they might still do mass collection despite uptake of e2e crypto. My hypothesis is essentially that any massive uptake will be done by a limited number of tools. So they keep existing mass exploit tools, leverage their existing framework for running injected code, and simply add plugins for the popular e2e frameworks.

As far as I can see, the widespread deployment of end to end does force them to attack endpoints, but doesn’t imply a real obstacle to mass collection. I think the only way widespread E2E will do something is if the implementations are too numerous and diverse for the NSA to target. For instance, each platform or framework might use one of a few high quality libraries for crypto primitives, but implement its own spec. The strange thing about INFOSEC is that, while hampering widespread collection via complexity at the NSA end, the amateur efforts would make targeted attacks easier. Perhaps a dual layer scheme, with the outer layer being custom (diversity) and the inner layer being a strong standard, could up resistance on both ends. Others are researching automated code diversity tools.

I’m not sure what direction to take. I’m admittedly brainstorming.
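To at least make the dual layer idea concrete, here's a toy Go sketch (my own construction, purely illustrative, not vetted crypto): the inner layer is the strong standard (AES-256-GCM from the standard library), and the outer layer is a per-deployment keyed mask that buys nothing but wire-format diversity.

package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"crypto/sha256"
	"fmt"
)

// innerSeal is the strong standard layer: AES-256-GCM.
// (Errors elided: aes.NewCipher cannot fail on a 32-byte key.)
func innerSeal(key [32]byte, pt []byte) []byte {
	block, _ := aes.NewCipher(key[:])
	gcm, _ := cipher.NewGCM(block)
	nonce := make([]byte, gcm.NonceSize())
	rand.Read(nonce)
	return gcm.Seal(nonce, nonce, pt, nil) // nonce || ciphertext || tag
}

func innerOpen(key [32]byte, sealed []byte) ([]byte, error) {
	block, _ := aes.NewCipher(key[:])
	gcm, _ := cipher.NewGCM(block)
	n := gcm.NonceSize()
	return gcm.Open(nil, sealed[:n], sealed[n:], nil)
}

// outerMask is the "diversity" layer: XOR against a keyed hash-chain
// keystream. It adds no real strength over the inner AEAD; it only makes
// the wire format non-standard, so a generic plugin for the inner protocol
// fails to parse it. XOR makes it its own inverse.
func outerMask(key [32]byte, data []byte) []byte {
	out := make([]byte, len(data))
	pad := sha256.Sum256(key[:])
	for i, b := range data {
		if i > 0 && i%32 == 0 {
			pad = sha256.Sum256(pad[:]) // advance the keystream
		}
		out[i] = b ^ pad[i%32]
	}
	return out
}

func main() {
	var inner, outer [32]byte
	rand.Read(inner[:])
	rand.Read(outer[:])

	wire := outerMask(outer, innerSeal(inner, []byte("attack at dawn")))
	pt, err := innerOpen(inner, outerMask(outer, wire))
	fmt.Println(string(pt), err) // attack at dawn <nil>
}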

“The biggest hole I saw in Mylar right from the start was that server-side searching is available… I was thinking, “what? that’s going to be interesting how that doesn’t simply punch a big hole in end-to-end encryption…” ”

Maybe. They have a special construction for encrypted search that supposedly works with a compromised server. I’m not qualified to review it. The part that jumped out at me was that the server had the private key and the program that security depended on. So, while they said the server was “untrusted,” their actual operation seems to make the server “trusted.” And in a way that both black hats and the NSA can target.

functionary April 5, 2014 10:00 PM

@Jacob:
ixquick being based in the Netherlands would seem to make it a better bet than the US-based DDG, but last time I looked ixquick also had servers in the USA. Now why would they do that?

Nick P April 5, 2014 10:43 PM

@ functionary

https://ixquick.com/eng/company-background.html

Their history might give us an answer. It was started by a New Yorker. It has a privacy focus. It was later acquired by a company in the Netherlands. Now, according to their privacy Q&A, they have servers in both places, with one site used for American traffic and one for European. Performance and cost are the likely drivers behind this decision.

They also say they’ve never given a “byte of data” to the government, and that privacy protections in their base country mean they don’t have to start spying. I’m unconvinced about that guarantee for operations here, as they might be compelled to cooperate and lie about it. Or the NSA might hit their local networks and servers. I wouldn’t trust that they don’t cooperate with the US govt.

However, such a service would still be valuable, if only because their security measures might block many threats besides the US government. This is the same justification I used in saying the Lavabit shutdown was a bad idea. There are many more immediate and serious concerns for online users than NSA’s dragnet. So it might still be a decent service, and various groups in Europe back their claims. The Netherlands servers might have a solid amount of privacy. However, it’s best to assume that, if you’re located here, you’re at the mercy of the NSA and maybe other LEOs/TLAs.

DB April 6, 2014 12:14 AM

@ Nick P

More knowledgeable people contributing to blogs like this and newcomers to the field reading it is a form of knowledge transfer. Keep it up! It is appreciated.

The issue is that everything in between one physical human being and another has to be secure, for truly private/secure communication to occur. If you make one point strong, but there are many other equally weak points that are just as easy to attack, you just shift where it gets attacked, and the same result happens as if you did nothing at all: the whole message is spied upon, every time, all the time, by everyone who can. Right now, we have an electronic sieve. It’s so full of holes that nobody knows where to start really, and lots more people are realizing this, and it’s kind of disheartening… So what can I really do as an amateur, other than keep plunking away at the little areas I’m most familiar with? Hope that it will layer well and integrate with other people’s work at the boundaries someday? Keep watching for and learning how this can happen?

The QUANTUM thing does concern me too, in exactly the same way you mention. Wouldn’t fixing that entail layering any web-based so-called e2e stuff inside well-done SSL/TLS, with browsers that you knew hadn’t been compromised, on operating systems that also hadn’t been compromised, on hardware that hadn’t been compromised, in a soundproof faraday cage that hasn’t been bugged on the inside, etc.? The list just seems to keep going on and on; hence the whole briefcase thing you mention, I think. And if you think my going all the way to a faraday cage is ridiculous, it may not be in the near future, if we keep going headlong with the “collect it all” mentality.

Keep in mind that I’m not actually interested in keeping the police from catching criminals; I’m interested in trying to prevent the police from committing mass human rights violations while doing it, since our incredibly stupid lawmakers and court systems don’t have the balls to do so and instead keep flushing the Bill of Rights and the UDHR down the toilet (cue: our good buddy skeptical saying contracts prevent that lol).

@ functionary and others talking about DDG/ixquick/etc

The main problem I have with all the non-Google search engines I’ve ever tried is that they NEVER seem to give me the results I want at the top of the list. It may be great and private and all, but if I can’t find what I’m looking for, it’s utterly useless. Remember back when Google search first came out in the late 90s? This was the problem they solved better than anyone else, and it’s what made them a company. And so far, my experience is teaching me that the others aren’t catching up in this respect…

Clive Robinson April 6, 2014 1:16 AM

@ Name.Withheld…,

Many of the problems you have identified above have been identified before and will be again, over and over, and some have been discussed on this blog before, off and on.

The reason they happen and are allowed to happen is a function of the society we live in and the complexity of the world it has created.

I call it the “King Game” and it’s fairly easy to see how it works.

In essence humans in general can be viewed as “lazy in outlook” and thus stick to what they know, and this produces a couple of problems: “lowest common denominator thinking” and “abdication of responsibility”. That is, broadly we neither care nor care to know what is going on outside of our immediate reach, and thus regard it as “somebody else’s problem”…

That is, we see the world the way others tell us it is –not how it is– and we only listen to those we can touch –those we have common ground with– by choice or circumstance.

There are reasons for this, part of which is the make up of our brains, which are the result of surviving in a very hostile world where our physical attributes are almost always less in some respect than those that see us as lunch. This is as much true for viruses as it is for apex predators; whilst “strength in numbers” provides protection against the latter, it unfortunately provides opportunity for the former (viruses, bacteria, fleas etc up to the size of a rat). Thus we find protection in moderately sized groups of like individuals, but if the group is too small it dies out from the macro attackers, and if too large, from the micro attackers. Add to this that as individuals our brains are designed to be most responsive to the immediate –threat environment– and we develop a common outlook to threat, build trust in the group we are in, and distrust other groups. As groups grow, members develop trust into reliance because it’s more efficient to specialise, and specialisation allows groups to grow larger, which becomes self reinforcing. As a group grows it forms a hierarchy around specialists; “knowledge and skill” become desirable traits, and individuals with them are given more trust than others (ie what we call “respect” and “authority”).

But specialisation has its own set of problems, which turns reliance into dependency, and that in turn allows for exploitation. To stop this we have morals/ethics by which we expect others to behave, with “do unto others…”. Unfortunately, what holds for one group might not hold for another, and the necessity of trade brought about by specialisation gives rise to the codification of morals, which gave rise to a need for “guardians” to ensure the common morals were adhered to. Via the nature of conflict and need, two sets of overlapping guardians evolved: one for the group’s protection, or “state”, and one for the individual’s mental and spiritual health, which became the “church” (both have further broadened out with time). Each tended to work not just as a counterweight for the other but also to limit the excesses of the other in a mostly unseen battle for dominance. Part of the king game power sharing was keeping the common people not just in ignorance but in either “blissful” or “fearful” ignorance via various methods. One of which is the notion of the “all powerful” entity that handed down commands to either the state or church leader as “the messenger on earth”, which gives rise to the “Divine Right of Kings” in monarchies or its equivalent for the church leader in theocracies (which in turn give rise to parliaments and ecclesiocracies as power becomes shared).

Unfortunately the belief in “Divine Right” and the attendant “god” has several disadvantages, one of which is a belief in the “all seeing and all knowing”, which scares us because society requires us to wear a “social mask” behind which we hide. Thus we have a hidden fear of being “unmasked” (it’s why the expression “If you have nothing to hide you have nothing to fear” has such a chilling effect).

However the “god” can take many forms, and whenever you hear people talking about an agency’s “mission” you can reasonably be assured that within the agency “the mission is god”, and the leader of such an agency has divine right and must not be questioned. All rules, morals and ethics can be sacrificed for the sake of the mission, and the leader, by dint of their position of trust, is trusted to be free from taint and to act only for the necessity of the mission when doing so…

And as long as we believe such agency leaders are “free from taint” we let them abuse us in any way they choose, on the assumption it’s necessary for a mission that is important for society, not for their personal gain…

Thus the majority of us, with our “somebody else’s problem” attitude, gift psycho/sociopaths with a “god given right” to abuse us, as long as they don’t tell us the truth and we have “trust” in the lies we are told…

There is no democracy, only a “fearocracy” where “those in the know” use the fear in our minds to control us. They care not about “elected leaders” because the elected are mostly puppets who fear losing the “image and trappings of power” and thus can be controlled by those who know how to take it away from them. The only thing that scares the self appointed leaders of a fearocracy is those without fear or vice, and if they cannot be “bought off” then they have to be destroyed in some way.

George April 6, 2014 10:26 AM

Does this new High Performance Trading hack look a little bit like QUANTUM

or

Is my tin foil hat particularly shiny today? 🙂

-cg

Nick P April 6, 2014 10:48 AM

@ George

It actually does. Yet the high performance traders hired the best IT guys they could years ago to come up with those systems. Many have been IDed and were totally private sector. If anything, I’d bet the NSA copied THEM.

David Oftedal April 6, 2014 11:10 AM

“Unbreakable” security codes inspired by nature
http://www.lancaster.ac.uk/news/articles/2014/unbreakable-security-codes-inspired-by-nature-/

New ‘nearly unbreakable’ encryption scheme is inspired by human biology
http://www.sdtimes.com/content/article.aspx?ArticleID=69025&page=1

Coupling Functions Enable Secure Communications
http://journals.aps.org/prx/abstract/10.1103/PhysRevX.4.011026

“Unlike all earlier encryption procedures, this cipher makes use of the coupling functions between interacting dynamical systems. It results in an unbounded number of encryption key possibilities, allows the transmission or reception of more than one signal simultaneously, and is robust against external noise. Thus, the information signals are encrypted as the time variations of linearly independent coupling functions.”

I’ve no idea what to think. I wonder if the snippet above is actually ciphertext.

name.withheld.for.obvious.reasons April 7, 2014 1:50 AM

    Analysis of Congressional Research Service Report R43459 published 1 Apr 2014:

Overview of Constitutional Challenges to NSA Collection Activities and Recent Developments
Section 215 records are stored in a database, the “Collection Store”, and queries executed against the bulk database are stored in another database called the “Corporate Store”. This confirms my suspicion that separate databases are created and are likely considered immune from the restrictions specified for the “Collection Store”. The “Corporate Store” data and its control, maintenance, management, and retention information must be part of any disclosure, and clarifying this should be considered a requisite task of the Intelligence Committee. Departments of the federal government that use secondary methods to skirt statute must be considered to be acting deliberately, which at a minimum represents contempt for the sovereign.

The rationale vacating the challenge to the Smith v Maryland pen register test was answered as a response to the “SCOPE” of the 4th amendment–this reasoning is fundamentally flawed.

“where one individual does not have a Fourth Amendment interest, grouping together a large number of similarly-situated individuals cannot result in a Fourth Amendment interest springing into existence ex nihilo.”

How did the judge reach this conclusion? So as long as the violation of privacy extends beyond the individual, there can be no expectation? Really? Someone on the bench needs to reconsider their grasp of the law; if only one person believes the earth is round–it’s really flat? In addition, the FISC ruled that as the bulk collection did not constitute a search, there was no violation. Again, the court must be populated with persons needing long vacations and a serious dose of “WTF”. So, if the police just barge through the door, collecting my letters, papers, and other tangible things–there is no violation? Oh, it’s okay, as they are barging into your neighbor’s house as well. I completely disagree with the court’s reasoning–or lack thereof. Additionally, Judge Pauley rejected any 4th amendment challenge, asserting that lower courts are bound to apply Smith unless it has been explicitly overruled by the Supreme Court itself. Upon reading this I came to believe that I could find a PhD in law inside one of my Cracker Jack boxes.

From the D.C. circuit court decision, the assertion that “special needs” can be applied to section 215 activity suggests to me that the court should find a new line of work…there is no exception to the 4th amendment. This argument has been used to undermine the fourth amendment repeatedly. The 4th amendment was written to limit the government’s power, yet the government interprets its authority so as to impeach the 4th amendment: “where the needs of the government are superior”, then it is okay to nullify the text.
WHERE THE HELL IS THAT IN THE CONSTITUTION?
There is only one exception articulated in Article III regarding enumerated rights, and it is not to the fourth amendment. Which suggests that, unless constitutional authority is given to the court to suspend the 4th amendment, there is no inherent right of the government to exceed its authority.

Autolykos April 7, 2014 3:52 AM

@David Oftedal: I’m really skeptical about this one. The problem with amateurs designing ciphers is that anyone can build a cipher they can’t break themselves – and these guys have never proven that they are any good at breaking stuff.
Best case is that their method requires thinking of new ways to attack it (and I suspect that wouldn’t take too long for a sufficiently motivated professional).
Also, the stuff that’s currently in use is already unbreakable (to the best of our knowledge) if implemented correctly, and has seen a lot more serious testing by really bright guys. So there is no point in switching to a new system that was never really analyzed by anyone who knows what they’re doing.

Autolykos April 7, 2014 4:10 AM

After skimming through the paper, I’d like to say something more specific. Still, I might have missed something critical, so please take it with a large bag of salt:
Their method mostly seems to obfuscate and mix the signals, and does not actually encrypt anything. Plus, the receiver doesn’t have any information the attacker can’t also get (what could be called the “key” seems to be derived solely from communication history). And I even suspect the additional workload for the attacker is the same (or at least scales the same way) as for the receiver, and that the method for de-obfuscating the message is neither as computationally efficient nor as readily implemented in hardware as common encryption methods (it seems to involve a lot of memory use, quite a bit of floating-point math with non-trivial operations, and might even be inherently “fuzzy” and error-prone).

Autolykos April 7, 2014 4:16 AM

EDIT: Sorry, they do seem to have a “real” key that still needs to be exchanged in addition to their obfuscation scheme – but I don’t see any demonstrable advantage over exchanging a symmetric key long enough to be secure and not wasting time and processor cycles on obfuscation at all.

Autolykos April 7, 2014 4:31 AM

Some wild brainstorming on a possible attack: with common encryption methods, getting just one bit of the key wrong will result in an unintelligible mess of pseudo-random data. But their method looks somewhat “fuzzy” to me. Getting the parameters just slightly wrong will probably still result in your system dropping out of sync rather quickly, but it may still be measurably more stable than with completely wrong parameters. Depending on how well that works, you might be able to home in on the correct key, given enough data, while searching through only a tiny fraction of the key space.
To counter this, they’d need a lot more key material (and it might not even help at all), which would make the communication use even more resources.
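To illustrate the "homing" part, a toy Go sketch where a fake stability() oracle stands in for the real sync measurement (everything here is assumed for illustration, not taken from their paper): if stability degrades smoothly with distance from the true key instead of being flat, simple hill climbing finds the key while touching only a tiny corner of the key space.

package main

import (
	"fmt"
	"math/bits"
	"math/rand"
)

const keyBits = 64

var trueKey = rand.Uint64() // the secret we pretend not to know

// stability fakes the attacker's measurement of how long the reconstruction
// stays in sync. Here it degrades smoothly with the Hamming distance to the
// true key; against a sound cipher it would be flat everywhere but the key.
func stability(key uint64) int {
	return keyBits - bits.OnesCount64(key^trueKey)
}

func main() {
	guess := rand.Uint64()
	trials := 0
	for improved := true; improved; {
		improved = false
		for b := 0; b < keyBits; b++ {
			flipped := guess ^ (1 << uint(b))
			trials++
			if stability(flipped) > stability(guess) {
				guess, improved = flipped, true
			}
		}
	}
	// A smooth landscape yields the key in O(keyBits^2) oracle calls
	// instead of the 2^64 a flat landscape would force.
	fmt.Printf("recovered=%v trials=%d\n", guess == trueKey, trials)
}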

Mike the goat April 7, 2014 10:24 AM

Namewithheld: the Constitution? Since when does the US govt care about that? In a community-run elementary school a reporter documented that the children were taught that the Constitution was “archaic and no longer relevant in today’s world”. Great civics class, eh?

Czerno April 7, 2014 10:53 AM

@Mr Schneier: appreciated the new site layout. A suggestion for the index page, if possible: have the date/time (at least the date) of the newest comment displayed alongside each post summary. It should be auto-updated by the server, of course. Would the software allow this?

David Oftedal April 7, 2014 2:49 PM

Thank you for the very interesting commentary, Autolykos!

It seems like further proof of this method’s feasibility would be needed before it can be used to encrypt anything but pictures of giant squids.

name.withheld.for.obvious.reasons April 7, 2014 5:10 PM

@ Mike the goat

…irrelevant in today’s world”. Great civics class eh?

Tis tragic; I wonder what an Econ or sociology class consists of…I’m sure the treatment of these subjects doesn’t fail to inject the correct amount of political correctness that keeps a child immune from truth. Encyclopedias are passé, libraries are so 1960’s man, and conversations with your local citizenry amount to nothing more than “where did you get that fab sweater?” Methinks they protest too much!

Good to hear from you. The board has been relatively quiet. Still railing against the windmills–can’t get my ass, donkey, moving any faster. Attempting to stay safe but my mouth runneth over–I think I can see my foot in there.

Anura April 7, 2014 5:30 PM

@name.withheld.for.obvious.reasons

I think what you learn in school these days is irrelevant, as people unlearn everything rather quickly when confronted with massive amounts of propaganda. Economics today is generally reduced to arguments that can fit in a twitter post. “Regulation is killing productivity!”, “Taxes are killing investment!” vs “Well, yes, investment is good, and yes, excessive regulation is bad, but if you look at the data you will see no evidence that either of these” – oops, message cut off.

Nick P April 7, 2014 8:12 PM

@ Clive Robinson

I thought I posted the verified ML here already (or meant to anyway). I didn’t know about their hardware project, so thanks for the link. That’s neat specifically because it allows a more ground-up correctness claim. If you recall, I was working a long time ago on an ML or VLISP based system that was proven correct from the processor up. This work is trying to do something similar. The key difference is that I wanted one of the existing works ported to the VAMP processor, which is formally verified and FPGA-tested. Whatever ML-related stuff they do on VAMP would have a much higher chance of reuse than a custom stack machine built just for the project.

Might give them an email suggesting it.

Figureitout April 7, 2014 8:57 PM

yesme
–F*ck me sideways, two OSs I downloaded are on that list. And you have to upgrade using an already vulnerable connection…

Funny timing: I had a C bug (fixed the bug, which caused another bug) in a little school project, and these memory checks can be tricky. Not a big fan of “strncmp” right now, but it was likely my error.

Interesting programming discussion on hacker news:

https://news.ycombinator.com/item?id=7548991

I’d rather not trash my C-language for “a more secure language”.

Anura April 7, 2014 11:32 PM

If the size of the array were stored with the array as an immutable property, then I question whether the actual cost of bounds checking would be noticeable. Take this C# code:

int[] a = new int[256];

for (int i = 0; i < a.Length; i++)
{
    a[i] = i;
}

All the extra bounds checking can be optimized out at compile time because the index is already being checked by the loop condition in the first place.

So that leaves you with a handful of cases where it can’t be determined at compile time, and if you can’t determine the bounds at compile time, then there absolutely should be extra bounds checking.

Just that little change could be made to C, and it would probably stop the majority of vulnerabilities. Granted, there are still wild pointers and dangling pointers that can cause problems, which you can’t fix with a small change.
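Incidentally, that little change is more or less what Go slices give you already: the length travels with the value, every index is checked, and (as I understand the gc compiler's bounds-check elimination) checks it can prove redundant are removed. A quick sketch:

package main

import "fmt"

func main() {
	a := make([]int, 256) // the length travels with the slice header

	for i := 0; i < len(a); i++ {
		a[i] = i // provably in range, so the runtime check can be elided
	}
	fmt.Println(a[255])

	j := len(a)
	fmt.Println(a[j]) // not provable at compile time: checked, panics at runtime
}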

Figureitout April 8, 2014 1:38 AM

Anura
–Spelling doesn’t matter unless it’s being compiled by a machine. 🙂 Check out some studies on this. Just disconnect that WinXP box from the internet; then you’ve got powerline comms and hidden wifi/bluetooth 0days to worry about lol…I still have a copy and an infected copy just to play one of my all time favorite games haha.

yesme April 8, 2014 1:55 AM

@Anura

C has more problems. The funny thing is that all these vulnerabilities of C have been exploited with the latest three OpenSSL and GnuTLS bugs (bounds check, dubious returns and goto / dubious if-then-else). The only thing missing is the #ifdef / m4 / GNU Autotools bug, but give it some time. The year is still young.

C can probably be fixed, but the main problem is backward compatibility. If you get rid of #ifdef that would be wonderful, but changing all the libraries… The same with RAII and GC.

One of the original creators of C, Ken Thompson, did work on a C without the pitfalls of C, and together with Rob Pike and Robert Griesemer they created Go. I hear lots of people say that Rust is the future, but I am afraid it will end up a bit messy. Go to me is as simple as C, only better.

About OpenSSL itself: I said something about it in last week’s Friday squid post. I think that OpenSSL is one of these things that are “too big to fail”, and that’s exactly why we should replace it with something better. Fixing OpenSSL is simply impossible.

Figureitout April 8, 2014 2:10 AM

I think that OpenSSL is one of these things that are “too big to fail”
yesme
–You shouldn’t use that terminology b/c in the US that means bailing failures out to the tune of a publicly stated $700 billion. Nothing is too big to fail. I say let it burn, or initiate the fire; if such a trivial error could ruin the entire security then that heaping mass of ___ needs to be entirely re-evaluated.

RE:Go
–Why can’t they just keep the exact same syntax as pure C? Better yet, why can’t we just teach secure C? B/c if we can’t teach secure C, then secure ASM is a joke, then secure machine language is more of a joke, and secure processor design is where all computers can be owned and almost no one will know…

yesme April 8, 2014 2:28 AM

@Figureitout

I mentioned “too big to fail” deliberately. OpenSSL is one of these libraries that is used everywhere, but the library itself is a piece of shit. That’s a very strange situation. But as I said before, fixing OpenSSL is simply impossible, so we should start to think about a migration plan.

About Go:

“Why can’t they just keep the exact same syntax as pure C? Better yet, why can’t we just teach secure C? B/c if we can’t teach secure C, then secure ASM is a joke, then secure machine language is more of a joke, and secure processor design is where all computers can be owned and almost no one will know…”

The motivation for their syntax has been discussed at their site. I think it makes sense.

Writing secure C is very possible. But one look in a regular GNU library tells you that it just doesn’t work that way in practice. The OpenBSD guys got it right: think beforehand and don’t mess up. The majority however use the GNU stuff…

Also, one downside of C: it doesn’t scale very well with multiprocessing. Go has some of the best concurrency technology, if not the best.
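A small taste of what I mean, using nothing but the built-in goroutines and channels (a sketch, not a benchmark): fan work out to a few workers and collect the results, with no explicit locks around the handoff.

package main

import (
	"fmt"
	"sync"
)

func main() {
	jobs := make(chan int)
	results := make(chan int)
	var wg sync.WaitGroup

	// Four workers. A goroutine costs a few KB of stack, not an OS thread.
	for w := 0; w < 4; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for n := range jobs {
				results <- n * n
			}
		}()
	}

	// Close results once every worker has drained the jobs channel.
	go func() {
		wg.Wait()
		close(results)
	}()

	go func() {
		for i := 1; i <= 10; i++ {
			jobs <- i
		}
		close(jobs)
	}()

	for r := range results {
		fmt.Println(r)
	}
}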

yesme April 8, 2014 5:01 AM

A quick scan with http://filippo.io/Heartbleed

Not vulnerable (or already fixed):
google.com
microsoft.com
twitter.com
apple.com
ibm.com
freebsd.org
wikipedia.org
gnu.org
debian.org
ubuntu.org
theintercept.org
wikileaks.org
NSA.gov

Vulnerable:
schneier.com
facebook.com
yahoo.com
duckduckgo.com
openssl.org
apache.org
github.com
fsf.org
CIA.gov
FBI.gov

I bet there are a lot of angry people today.

At tweakers.net, a Dutch tech news website, someone suggested the Hiawatha webserver. Of course he gets laughed at, but it makes sense to me. No OpenSSL (it uses PolarSSL), simple, and designed with security in mind.

Clive Robinson April 8, 2014 7:07 AM

@ yesme,

It’s funny you should mention GO; it’s the language I’m going to be ‘learning’ this summer (I subscribe to the idea of learning a new language each year, with the proviso that you have to write a real world app that works across the computing stack). All I have to do is sort out a real world app to do it with (I’m thinking a secure low bit rate voice link might give it a stretch).

One of the things I like “in the ads” is that GO defaults to static builds and does reasonable garbage collection, along with reasonable support for parallel / concurrent behaviour.

One of my pet hates with C was the difficulty of writing “non blocking” “real time code” to do the likes of terminal IO with very high bit rate, moderate data “burst” transfers with very low duty cycles (say 5Kbits at 5Mbit/sec with rest periods between 100sec and 100Ksec). It usually meant getting down and dirty or running semi-independent threads. It’s going to be interesting to see if GO is any better, and how well it turns out code to put in embedded systems.

If anybody has real world experience with GO tool chains and embedded code, can they drop a comment on the squid thread?

Mike the goat April 8, 2014 7:14 AM

Namewithheld: thanks! Have been busy with personal and project commitments – have to keep the wolf from the door somehow. Glad to hear you are all well. You know where to find me if you wish to chat.

yesme April 8, 2014 7:24 AM

@Clive Robinson,

I think it’s best to ask these questions on the golang-nuts newsgroup.

Keep in mind that the current Go implementation isn’t designed for real-time or embedded systems, but they are getting better at the GC (which is the main bottleneck). That said, the PayPal Beacon runs on Go.
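For the multiplexed terminal IO you describe, the idiomatic Go shape would be roughly this (a sketch, untested on embedded targets; the serialPort in the comment is hypothetical): do the blocking reads in one goroutine per stream, and select over the fan-in channel with a timeout so the main loop never blocks longer than you want.

package main

import (
	"bufio"
	"fmt"
	"io"
	"os"
	"time"
)

// pump does blocking reads on one stream in its own goroutine; the Go
// scheduler multiplexes all the blocked readers for you.
func pump(name string, r io.Reader, out chan<- string) {
	s := bufio.NewScanner(r)
	for s.Scan() {
		out <- name + ": " + s.Text()
	}
}

func main() {
	lines := make(chan string)
	go pump("stdin", os.Stdin, lines)
	// go pump("serial", serialPort, lines) // any other io.Reader plugs in the same way

	for {
		select {
		case l := <-lines:
			fmt.Println(l)
		case <-time.After(5 * time.Second):
			fmt.Println("idle: no data for 5s") // the non-blocking part
		}
	}
}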

Nick P April 8, 2014 7:43 AM

@ yesme, figureitout, Clive

Go vs Brand X
http://cowlark.com/2009-11-15-go/

Very enlightening read. Just goes to show that C is garbage, we had a better systems language already, and Go is a knockoff of it with good tooling. Fun, fun.

Btw, remember that C isn’t the only game in town in systems programming. There were a number of languages. C took off mainly because it was portable to all the weak machines UNIX was being translated to. Alternatives with history include the ALGOLs, PL/I, Ada, Pascal, and Oberon. Each has had OSs and plenty of low level software written in it. Each is superior to C in how readable and reliable it is. Yet people keep using C when attempting robust systems. (shakes head)

yesme April 8, 2014 8:21 AM

@Nick P

That read is from 2009 and very pre-1.0. Go looks different now. He is also comparing it with Algol 68, a language I am not familiar with.

Let me be clear: Go is NOT a language with fancy features. It’s the opposite (but so is C). If you want a handbag of fancy features, there are lots of other languages, and some of those are good too; just think about C# and, in the near future, Rust.

And because Go doesn’t have the fancy features, you sometimes have to write more code. People coming from C#/Ruby/Python could have some problems with that. I don’t have that background and have always liked C. Yet I dislike the pitfalls of C, and that’s why Go attracts me. I also don’t care about IDEs. A text editor and the command line are good enough for me. It’s fast and doesn’t have nasty hooks.

Clive Robinson April 8, 2014 11:13 AM

@ Nick P,

As for the alternative languages you list, the only one I got to grips with by choice was Pascal, which has a simple elegance but some darn awkward issues when it comes to writing low level code (especially efficiently).

C has an advantage many high level languages don’t have: if you use it properly, it’s –nearly– as good as assembler at doing metal level coding efficiently. But… you have to accept a lot of responsibility by ditching C’s overly complex pointer system, using simple (void or CPU address bus width) pointers, and doing the pointer math and type conversion yourself where required. But like all high level languages C has the usual “integer carry” issue (ie the CPU has an out of band flag that you cannot see/test/use) that makes doing a lot of things more of a pain than they should be, but at least it allows you to cut down to assembler where required to do it efficiently.

As I indicated, the big pain from my point of view is how C interfaces to the OS and the resources it controls. For instance I/O: in C you have hoops to jump through to do simple polling of multiple character streams, and how you solve it is often not portable in any way, whereas in assembler you simply test a hardware or memory flag or compare buffer pointers. Then there is memory management… If you have never had the fun of writing system allocators (brk(2)) and the application allocators (alloca(3), malloc(3) and friends) that use them, or appropriate garbage collectors, then you really don’t know what you have had the luck to avoid…

C has advantages in that its limited abstraction allows portable code to be written, which is its big advantage for the bulk of OS code. However, the abstraction is overly burdensome for some OS / embedded / maths programming, whilst also insufficient for most application level coding, where the code cutter really needs their hand held and in all honesty should be using a much higher level language (lisp / scripting) which is very strongly typed and checked, or type mitigated, and has mem / IO resources abstracted way, way beyond the code cutter’s ability to trip up on.

Part of C’s prevalence is that it was in effect “free to education” across many otherwise disparate systems, a situation you can blame on the anti-trust suit filed in 1949, which led in 1956 to a consent decree, signed by AT&T and the Department of Justice and filed in court, whereby AT&T agreed to restrict its activities to the regulated business of the national telephone system and government work. Which changed again under Judge Harold Greene in the 1980s with the culmination of another anti-trust suit, filed in 1974, shortly after C’s birth…

Anura April 8, 2014 12:06 PM

@yesme

“C has more problems. The funny thing is that all these vulnerabilities of C have been exploited with the latest three OpenSSL and GnuTLS bugs (bounds check, dubious returns and goto / dubious if-then-else). The only thing missing is the #ifdef / m4 / GNU Autotools bug, but give it some time. The year is still young.”

I realize that, but Figureitout’s link was mainly discussing memory vulnerabilities.

The problem with fixing everything in C is that once you start fixing it, it stops being C, and once it stops being C you just want to start adding all sorts of stuff on top of it.

“Well, what about RTTI? C doesn’t have strong enough type checking.”
“Well, we need some way to have cleanup code always run… Well, since we are already adding code that is executed as the stack unwinds, why not add RAII?”
“We need a better exception handling system!”
“Macros are horrible… Why not templates?”

I mean, it can be avoided, but I think what we need to recognize is that a lot of that is just bloat when it comes to systems programming, while most of the stuff you develop (games, tools, desktop environments, web browsers, servers, etc.) isn’t so performance critical; for that, a well designed language that puts safety over performance and is easier to program in is much more suitable than a language designed with systems compatibility in mind. Of course, you need to take care that it doesn’t just become C++, which is like C but bigger and with more things that can go wrong.

I think what you need to do is sit down with a group of language experts (which I am not) and systems developers (which I am not), classify the vulnerabilities that have occurred over the past 20 years in applications written in C/C++ – whether buffer/stack overflow, assignment in a condition, indentation failure, etc. – and figure out what C could have done differently to fix just those problems without getting too bloated.

For a low-level language, I would probably go mostly with C syntax just because I think it’s the most readable, but with Python-like indenting instead of braces, bounds checking, a proper boolean type without implicit casts from other types (preventing accidental assignment in conditions), some sort of equivalent to a finally block to allow for cleanup, a stronger type system, encapsulation, and no way to accidentally leave off a break in a case.
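To make two of those concrete, here is legal C that compiles (a minimal sketch; a compiler will warn about the first only if you ask it to):

    #include <stdio.h>

    void classify(int code)
    {
        int err = 0;

        /* Pitfall 1: assignment where comparison was intended. This
           assigns code to err and then tests the assigned value. */
        if (err = code)            /* meant: if (err == code) */
            printf("error %d\n", err);

        /* Pitfall 2: missing break -- case 1 falls through into case 2. */
        switch (code) {
        case 1:
            printf("one\n");       /* no break: falls through */
        case 2:
            printf("two\n");
            break;
        default:
            break;
        }
    }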

Jacob April 8, 2014 12:16 PM

From the Guardian:
“The European court of justice has declared the data retention directive illegal, torpedoing UK government schemes for the so-called “snooper’s charter” of wide-ranging collection of phone and internet data.”

My question is this: since the US does collect and retain such data, now declared illegal, it is clear that the US is engaged in illegal activities in Europe.
So why don’t the European non-UK governments either penalize the US or at least clearly demand a stoppage to such practices?
When Microsoft or Google acted against European laws, the demands and the penalties against these two were clearly stated and enforced. Why not against the US government? Is it because the European security services secretly condone such acts?

http://www.theguardian.com/technology/2014/apr/08/eu-court-overturns-law-snoopers-charter-data-phones-isps

0xUNKNOWN April 8, 2014 12:57 PM

@Jacob
“So why don’t the European non-UK governments either penalize the US or at least clearly demand a stoppage to such practices?”

All the western countries are in cahoots in this.

NSA is just the main collector and processor of it all. It’s not just American residents’ profiles at the NSA.

Anura April 8, 2014 1:17 PM

@yesme

But… But… But… I looked at the syntax one time and it was icky looking.

Seriously, though, I think I’ll give it a try sometime – I haven’t really paid much attention to it. I also have Haskell, D and Java on my list of languages to learn, although Java is primarily for professional reasons.

yesme April 8, 2014 1:22 PM

@Anura,

Please follow (or scroll through) the Go Tour. Then you’ll get a good feel for the syntax. Of course it is a bit different from C. It’s more a combination of Pascal and C. But it’s very clear.

Anura April 8, 2014 3:34 PM

I will take a look when I get home from work. One thing that I’ve noticed is that every single language does something wrong, so I think eventually I will just have to learn language design and make my own.

Anura April 8, 2014 5:14 PM

Speaking of insecurities in languages, SQL is another language that needs to be improved. There are so many stupid vulnerabilities due to dynamically building SQL strings. There are alternatives, but either they are entire paradigm changes, like NoSQL, or they are solutions that sit on top of SQL, like ORMs.

You can use parameterized queries or stored procedures, and they help a lot, but the problem is that the language itself isn’t flexible enough. So a lot of the time I see people use a nice, “safe” stored procedure that then builds the SQL in a string inside the stored procedure and calls EXEC. Other alternatives tend to cause performance issues because of the static nature of SQL.
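For contrast, the parameterized path through one real C API (SQLite here, purely as an example; other client libraries differ in detail) keeps the query text and the data separate, so user input is never parsed as SQL:

    #include <sqlite3.h>

    /* Look up a user by name with a bound parameter: the value of
       "name" never becomes part of the SQL text, so it cannot change
       the statement's structure. */
    int find_user(sqlite3 *db, const char *name)
    {
        sqlite3_stmt *stmt;
        int id = -1;

        if (sqlite3_prepare_v2(db, "SELECT id FROM users WHERE name = ?;",
                               -1, &stmt, NULL) != SQLITE_OK)
            return -1;

        sqlite3_bind_text(stmt, 1, name, -1, SQLITE_TRANSIENT);
        if (sqlite3_step(stmt) == SQLITE_ROW)
            id = sqlite3_column_int(stmt, 0);

        sqlite3_finalize(stmt);
        return id;
    }

Which is exactly why the EXEC-inside-a-stored-procedure habit is so maddening: it takes this safe path and reintroduces string splicing on the other side of the trust boundary.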

The other problem is that it is one interface for both data access and administrative tasks, potentially giving even more power to the attacker. There needs to be a new language for relational databases that is not only designed to be dynamic, but that separates administration from programming. Just like it’s generally considered bad practice to invoke the command shell from an application, you shouldn’t have admin commands available to an application.

I am planning on starting a project for a replacement. I basically want to design it to create a data abstraction layer. To access that abstraction layer, there would be a type-safe API; your application would never pass integers as strings: an int32 is 4 bytes, an int64 is 8 bytes, with a flag for endianness, etc. A string has a length prefix, etc. You save bandwidth and CPU cycles, and reduce the risk of type-safety issues. You never, ever have this code embedded in strings in your application.
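As a sketch of the kind of encoding I mean (type tags, layout and names all invented for illustration, not a spec):

    #include <stdint.h>
    #include <string.h>

    /* Hypothetical wire format: one type-tag byte, then the value.
       Integers are fixed-width; strings are length-prefixed, so no
       quoting or escaping is ever needed. */
    enum wire_type { WT_INT32 = 1, WT_INT64 = 2, WT_STRING = 3 };

    size_t put_int32(uint8_t *buf, int32_t v)
    {
        buf[0] = WT_INT32;
        memcpy(buf + 1, &v, 4);     /* 4 bytes; a real encoder would
                                       also normalize byte order */
        return 5;
    }

    size_t put_string(uint8_t *buf, const char *s, uint32_t len)
    {
        buf[0] = WT_STRING;
        memcpy(buf + 1, &len, 4);   /* length prefix */
        memcpy(buf + 5, s, len);    /* raw bytes, never parsed as SQL */
        return 5 + (size_t)len;
    }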

The language itself would still be set-based, but with lazy evaluation to allow you to improve performance. You essentially dynamically pass and manipulate views. By allowing duck typing, you can maximize code reuse without having to resort to string manipulation and EXEC (which I would probably not include in the language).

There are other things involved, like properly representing parent-child data and defining relationships at the top of the query instead of the middle (e.g. from, where, group, having, select instead of select, from, where, group, having, with subqueries here and there), but that’s more on the usability side than the security side.

Hopefully it would be implemented as a native language, essentially creating the execution plans directly in the abstraction layer. I suppose it could also be implemented as a SQL generator itself, but that is not ideal.

name.withheld.for.obvious.reasons April 8, 2014 5:21 PM

What I consider most problematic about OpenSSL, and Swan, is their prevalence in embedded systems: lots of control systems use some firmware-based Linux distro and some web-based wrapper as a management interface. It’s a wonder that applications (industrial or utility) don’t get the critical analysis or research funds to determine the robustness and completeness of the systems engineering. Probably because most vendors are providing a component or sub-system into a system-of-systems; in other words, viewing a product from “code completion and compiler integrity” down to the interface between the factory-floor CNC system linked via serial-port translation to 802.3. Ignorance provides only a limited amount of cover for a limited amount of time.

Nick P April 8, 2014 7:51 PM

@ yesme

“Go looks different now. ”

I plan to check it out again. As a language, it offers less than some others, and I’m also concerned it might have security issues similar to Java. It’s so new, with so little attacker focus so far, that I’m waiting it out before any commercial deployment. The language is adequate for systems programming, though.

“He is also comparing it with Algol-68, a language I am not familiar with.”

The comparison is actually the whole point of his post. ALGOL is considered the grandfather of most imperative languages in widespread use. One line of succession went to the very inferior BCPL, which got enhanced & transformed into C. ALGOL itself also kept evolving. People ditched it for C’s tiny size, fast speed, and assembler-like properties. Fast forward: Go’s designers think long and hard, inventing a C/C++ replacement that is… nearly equivalent to ALGOL of 1968. What a leap forward. 😉

In author’s words:

“And that’s really depressing. In 2009 we’re getting excited about a language that is less good than one that was released 41 years earlier. Have we learned nothing in the meantime? Well, yes, we have; programming theory has come on leaps and bounds, but it’s largely been restricted to the functional programming world. New technology has been very slow to migrate over to the procedural programming world, which is why we’re only just discovering the joys of generics and type inference.”

“Which brings me back to Go. There’s a lot of work there. Looking at aspects other than the language semantics itself, it appears well-written and effective using modern engineering techniques. This is a good thing. Writing compilers is hard, and the Go developers have put a lot of effort into it.”

“The Go team is obviously well-funded, has plenty of expertise with excellent pedigrees, and has been assigned the task of producing a useful language. This would have been the perfect opportunity to go and find all those new techniques and turn them into a well-designed, orthogonal, expressive, and useful programming language! But instead they fluffed it, and what we have is Go.”

“Imagine what they could have done if they’d not put all that effort into modernizing Algol-68!”

His point is that there are way better language techniques out there than ALGOL’s (and therefore Go’s). He’d rather see truly innovative language techniques adopted, integrated, and widely deployed. Then we’d have a cutting-edge language with all modern improvements for systems programming [1]. If Go gets massive adoption, then a ton of effort is put into 41-year-old (ALGOL68) technology to replace some 30-ish-year-old technology (C/C++), and then we have to go through all that effort again to replace Go with something modern. Something about this seems… wrong.

Of course, the author also points out that Go has plenty of funding, development, tools, libraries, community, etc., so it’s a useful language. And there’s no disputing it’s a better option than C/C++. It’s just that with the tiniest bit of imagination I could see them having made it much better with a different strategy. Many hobbyists and academics have designed better languages with awesome features (eg LISP’s macros, Limbo’s safe concurrency, SPIN/Modula-3 live code update w/ typesafe linking). And Go’s well-funded designers put a ton of effort into modernizing a 60’s-era imperative language instead. (sighs)

[1] At least DECA and Julia are attempting this, with Julia already outperforming Go on benchmarks. There’s also improvements to Haskell, ML, and Scheme to make them good enough for fast, low-level programming. Those are scattered in many projects, though, so Julia seems furthest ahead (albeit targeted for a niche) with DECA being the one I’m rooting for.

Keeping open mind: counterpoint to myself
(Clive might find this interesting)

It just dawned on me that this might be a good thing in one way. Clive will recall one path of my secure systems research focuses on only using 20+ year old hardware and software features to prevent patent trolls from shutting it down. That almost every feature in Go was invented before 1970 should make a compelling argument if it is ever challenged. With modern tools and libraries, it might make a nice systems language for whatever I design. Modula or Oberon, though, I’m leaning toward just because implementing a secure compiler should be easier.

Might also map an Ada subset to Modula or Oberon where I use Ada/SPARK tools for great static checking, then Modula/Oberon compiler for verified compilation. Might be able to autogenerate the Ada from main language, run checks, and produce a report. Basically, a stage in the nightly build system. Can build better static checking tools for main language over time. One of the Modula’s also had a standard library mathematically verified for correctness in certain respects.

@ Clive Robinson

“the only one I got to grips with by choice was Pascal which has a simple elegance but some darn awkward issues when it comes to writing low level code (especialy efficiently)”

Now or in the past? Certain Pascal implementations have sucked, esp in the old days. People do a lot with the modern ones, so I’m not sure; they might still have those issues. I always coded that stuff in unsafe code, then wrapped it in type-checked functions. Gotta experiment, as hardware specs and what hardware actually does are often different. Yet it worked well enough for me.

“Part of C’s prevalence is that it was in effect “free to education” across many otherwise disparate systems, a situation you can blame on the anti-trust suit filed in 1949,”

Makes sense.

Btw, Pascal didn’t die off or become stagnant. Lazarus + Free Pascal are a Delphi-compatible line that’s been updated regularly. If you recall, those Delphi apps tended to be way more reliable than C/C++ apps on Windows which I figured existed solely to test core dump facilities.

Far as more innovative goes, the final evolution of the line was from Oberon-2 to Component Pascal (incompatible w/ regular Pascal). The latter, when combined with BlackBox Component Builder, was apparently so good at making applications that there was a whole community begging them to keep it on life support. Don’t know if they’re still around. However, the language was quite flexible while its grammar only took up a page or so, in typical Wirth style.

@ Anura

“For a low-level language, I would probably go mostly with C-syntax just because I think it’s the most readable, but Python-like indenting instead of braces, bounds checking, a proper boolean type without implicit casts from other types (preventing accidental assignment in conditions), some-sort of equivalent to a finally block to allow for cleanup, a stronger type system, encapsulation, and make it so that you can’t accidentally leave off a break in a case.”

You just described a subset of Ada in mid-80’s. 😉 It’s been updated 3 times since then, with a GPL version, and can do a lot more now.

Someone (maybe you?) mentioned systematically looking at vulnerability types in languages and dealing with them in a new language. Funny thing is that this was done at least twice where Ada and C/C++ got compared directly. One was NASA on safety-critical software, the other I forgot. Ada had some issues left in it, but ended up being way better than the others (esp in code injection risk). I could try to dig them up for you if you want.

re SQL replacement

The SQL issues are largely a solved problem where the solutions just aren’t getting used and expanded. Two links you may find interesting: a parse-tree-based SQL protection; a survey of attacks and defences. What you describe sounds close to what’s mentioned in the survey’s “New Query Development Paradigms.” Essentially, a few systems just create a way to build queries that makes them safe every time. Such an approach will usually integrate into the programming language, with proper SQL autogenerated.

No replacement is really necessary: just better API’s, frameworks or middleware. This preserves all the engineering effort that went into the COTS database, along with annual upgrades, while allowing you to extend its security. A similar approach was taken by some MLS secure databases of the past whereby they just put an A1-class system in front of an untrusted database, with a guard and/or query transformation system leveraging A1 security kernel’s protections. One also was based on views, I think, so good call there.

Mike the goat April 8, 2014 8:52 PM

Namewithheld: exactly. I wrote a blog post pretty much echoing your sentiments: OpenSSL is fscking everywhere, and there are far better implementations, with fewer lines of code and a less checkered history, available under reasonably free licenses.

Anura April 8, 2014 9:01 PM

@Nick P
“re SQL (…) No replacement is really necessary: just better API’s, frameworks or middleware.”

Well, a better API is one thing I’m trying to provide. I do agree to an extent that security can be solved today; however, as someone who has coded and still codes database applications that are almost entirely written in SQL, I disagree entirely that a replacement isn’t necessary in general (i.e. taking into account more than just security). I go home and cry myself to sleep thinking about the horror of the code I had to deal with that day. I mean, Brian’s Song is uplifting compared to SQL code.

Part of the problem from a security standpoint is that people are still learning to code in SQL, and when people learn to code in SQL it usually means they are not using good APIs, frameworks, or middleware. My goal is for SQL to die a slow and painful death, partially just for my own pleasure, and partially because I want people using a better language.

As for Ada, well, it makes you wonder why C is so popular. I guess part of it is that if you write a library in C, you can use it with so many other languages. I wonder if we need some kind of framework, similar to .NET CLR for interoperability of low-level languages; it probably won’t be compatible with standard C itself, but you could make a close relative for easy portability, and then start porting other languages or coming up with a new one as I mentioned before.

Nick P April 8, 2014 9:24 PM

@ Anura

“I go home and cry myself to sleep thinking about the horror of the code I had to deal with that day. I mean, Brian’s Song is uplifting compared to SQL code.”

I hear you. There were always options, although each leaves me wanting something better. The best at dealing with databases, though, were the 4GL’s. Done right, a 4GL gives you high-level programming, safety, performance, portability across platforms, portability across databases, powerful data manipulation, and integration with code on standard platforms. Maybe just build a better 4GL that can target app embedding or a database server, with options for various security features as well.

(Note: There’s quite a few open source versions of stuff on that list. That’s already a start on the work.)

re song: You mean Blink 182’s Adam’s Song? If so, then lol yeah legacy SQL is worse than Adam’s fate.

“I wonder if we need some kind of framework, similar to .NET CLR for interoperability of low-level languages”

OpenVMS actually had that. The OS standardized its calling conventions and some other things specifically for easier cross-language development. The VMS OS itself is written in assembler, BLISS, C, and maybe something else. Ada is also designed for cross-language development of applications, which is the normal way it’s done in Ada land. Another with potential is LLVM. Then, there’s the possibility of subsetting something like CLR to take out much complexity. Hey, wait, a 4GL + C/Ada/FreePascal/Go/C# on a simplified CLR/LLVM… Might be onto something.

Anura April 8, 2014 10:55 PM

The one potential issue I can think of when it comes to interoperability is the possibility of exceptions being thrown. If you are porting C, you won’t want exceptions, but if you are porting C++ you might (although not necessarily). For this reason you need two runtimes: one that allows exceptions, and one that does not. If you are going to do that, you also want two standard libraries – one designed to not use exceptions, and one designed to use exceptions. The code that allows exceptions should be able to call code that does not, but it should not be the other way around.

I have a love-hate relationship with exceptions. Sometimes it’s nice to not have to check every return value, but exceptions tend to result in code that’s more likely to fail in the first place. I don’t think there’s an ideal solution here.

Figureitout April 9, 2014 12:21 AM

yesme
The motivation for their syntax has been discussed at their site. I think it makes sense.
–Perhaps I can get used to it; the more I read about it, the better it seems. There are just some tiny quirks I would do differently myself, but meh; that’s just me lol… I saw some syntax that reminded me of javascript and I freaked out…

But only one look in a regular GNU library tells you that it just doesn’t work that way.
–Yeah I have, just for a simple bash command; holy crap.

Nick P
Just goes to show that C is garbage
–Then why is it so popular and so handy? I think you have a vendetta against C. To make a comparison, Newton’s work is still relevant and it’s from the 1700’s; and it’s so goddamn elegant, genius, and powerful. C is here to stay for at least another 50 years, I’m predicting. You fck up C, it’s just like fcking up physics: it’s your fault. I actually think it’s healthy for people to have their egos beaten down a little; otherwise, make a language as portable as C. I don’t think you should disrespect the work on C; it’s crappy programmers who can’t implement and code correctly that are making C insecure. I’m one of them, as I made such a dumb error (like an algebra error on a math test) that I’m not contributing to any open source security project besides my own for a while.

Clive Robinson
–You’re not wrong on people needing “training wheels”. More information needs to be continually learned and the brain can’t just expand like technology. But consider the heaping mass of information forced on people today compared to the ’80s. You could focus better compared to now, you didn’t have that smart phone in your navy-blue sweatpants pocket like you do now. You also had a lot more freedom back then compared to now, where children grow up in schools w/ cameras looking at them 24/7. You can’t even take a restroom break w/o a “hall pass” in high schools; all of this is totally different to my dad’s experience (who has some patents and products, some of which are working for the military surprisingly) and he could explore w/o being creepily spied on or risk being thrown in jail for just exploring. It slapped me in the face when I made a comment about someone in his hometown selling peaches on the road, my first thought was “I hope that person has a legal permit”. He just laughed and said, “A permit? Haha”. It’s little things like that which add up to freedom which don’t bog you down w/ garbage that detracts from science. I can legally do so many more radio experiments in TN but not IN; even “illegally” but no one gives a sht b/c it won’t even affect your frequency. Just frustrating when older people expect young people to be creative when all the rules and systems in place make it not so; it’s not our fcking fault. It’s the stupid systems that we can’t change that are a problem to consider too.

Anura April 9, 2014 1:10 AM

@Figureitout

What Newton did was science (math being a really pure science); science is a search for truth, and it is superseded only by more precise truths. C is a tool, not a science. It’s an old tool, too. Sure it works, but it’s not the best for the job. All programming languages are tools:

Assembly is rock. You can use it to hammer things, grind grains to make flour, shape other rocks into tools, and from there you can do basically anything, but it’s slow and time consuming.

C is fire. You can now refine minerals into copper and steel, forge hammers and knives. Again, you can build anything, but you have to be careful or you burn yourself.

C++ is a toolbox. It’s got screwdrivers, hammers, a saw. You don’t have to worry about setting yourself on fire anymore, but you are probably going to smash your thumb every once in a while.

C# is a set of power tools. You can now build everything you can with C and C++, but significantly faster, and without having to worry about smashing your hand (okay, the analogy kind of breaks down when you realize how much easier it is to cut your arm off with a band saw than a hacksaw).

Sure, C still works, it’s used by a lot of people, but not for a really good reason, and we pay for it in vulnerabilities. I love coding in C for small things; it’s simple, it’s clean, lightweight, but when you start getting into major applications, then it becomes frustrating. I don’t know any developer who doesn’t make mistakes, but in C those mistakes can easily lead to serious security vulnerabilities. Better languages can both minimize those mistakes, and minimize the impact of those mistakes.

Figureitout April 9, 2014 1:41 AM

Anura
–I can see your arguments, really I do. I just don’t get how what Ritchie and Thompson did wasn’t a scientific advancement; not merely a tool. This is easy to see as you delve down into hardware and blah blah electrical circuits (quite cool in my view). Math is a tool in my view (probably the most important one in known history). Thompson even had an epic hack that scares the living sht out of me; that infamous compiler hack. You have the tools to climb the rope or hang yourself. People do dumb sht until the education is easy and clear enough to prevent that; I don’t think the education really exists well enough yet. And wouldn’t it be a lot more efficient to just get known secure machine language chunks? Processor-dependent but diversity is of course dying unless a market develops.

And what about Python, isn’t that ultimately written in C? What happens when other languages are written in C?

Figureitout April 9, 2014 1:54 AM

Anura
–Goddamnit, another buffer overflow; and vulnerabilities that are built into the “type safe” language you use are what really scare me. Let me f*ck up is all I’m saying; then you have no one to blame but yourself.

Anura April 9, 2014 2:06 AM

That’s the thing, it’s in one of the modules itself. If the module was written in a memory-safe language, you probably wouldn’t have a problem. It’s easier to have buffer checks all handled in one place, the compiler/interpreter, rather than in every single piece of code. Yes, there are always potentials for issues in the implementation, but by using a safer language you reduce the number of vulnerabilities that are in the wild. By implementing a compiler or interpreter in a language designed to reduce stupid mistakes, you reduce the bugs and vulnerabilities in the implementation as well.
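A sketch of what “in one place” means (illustrative C, not any particular runtime’s implementation); a memory-safe language effectively wraps every index operation in the equivalent of this accessor:

    #include <stdbool.h>
    #include <stddef.h>

    /* The bounds test lives in one accessor instead of being
       re-remembered (or forgotten) at every call site. */
    struct checked_buf {
        unsigned char *data;
        size_t len;
    };

    bool checked_get(const struct checked_buf *b, size_t i,
                     unsigned char *out)
    {
        if (i >= b->len)            /* the one central bounds check */
            return false;
        *out = b->data[i];
        return true;
    }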

There’s no magical solution to fix all the problems, no, and there never will be, but a better language can significantly reduce the problems. It’s no substitute to good coding practices and standards, unit testing, and thorough review and testing, but it makes for a pretty good complement.

Figureitout April 9, 2014 2:19 AM

Anura
–I agree, for anything you’re selling or running on the internet. In the modules, meaning you aren’t going to check it. Like you said, .NET, Java, and SQL are needed to make money; and then try to code securely and your dumbass manager says “Hurry up idiot!”. B/c they don’t know anything about coding or even coding securely which takes like 10-times as long. Otherwise, ASM offline I feel way more comfortable; in a faraday cage, followed by noise room, w/in a faraday cage; underground at least 20ft.

SDR makes Tempest a real security threat now, and I really hope I’m able to read and transmit a keypress from an SDR on newer Dell computers (to prove a massive hole that I sense); anyway I really digress.

yesme April 9, 2014 2:43 AM

@Nick P

“Imagine what they could have done if they’d not put all that effort into modernizing Algol-68!”

Some of the core Go developers came from Plan 9, an OS written in a subset of C that didn’t suck. Plan 9 did have its own small TLS library (wink wink). Just download it, look at the code and compare it with GNU.

The Go tools are still Ken Thompson’s C compiler toolchain. And it works because it is well engineered.

Imagining things isn’t bad. But at some point you have to make it work. Otherwise it will always be “what if” and “imagine”. That’s my issue with Rust. When will it be ready?

The Go guys did take a conservative approach. They also deliberately avoided things that could go wrong in the long run. Take a look at the less is exponentially more blog post from Rob Pike. That explains the rationale of Go.

These guys have experienced at first hand what can go wrong if mistakes are being interpreted as “features”.

That’s what you see with the GNU Autotools in combination with m4. With the GNU Autotools it is so easy to work around problems created by other projects that the original problems never get fixed, ever. So they have created a world of #ifdef and m4 and a ridiculously complex build system. The Plan 9 guys didn’t have that. In fact, the original UNIX didn’t have that either.

So there is a reason for being conventional.

That said, when I code in Go, I am solving the problems. A long, long time ago I played with C++, but most of the time I was figuring out how to organise things. You know, thinking about the big picture and drowning in all the details.

That’s my problem with “clever” languages. I am too stupid for them. Or maybe it’s because I am Dutch and like the ideas of Dijkstra.

Go does make you productive. I can guarantee you that. But I can understand that you like more advanced languages. There is nothing wrong with that.

ATM Go lacks a GUI in the standard toolchain. Go QML is looking very good and has the pedigree to become the default GUI.

Wael April 9, 2014 2:58 AM

@Figureitout
That was the polite version 😉

@Anura,

A good carpenter invests in good tools.

That’s the thing! “Tools” — plural. Different languages for different purposes. PHP, eh… not my cup of tea. Yes, popularity isn’t necessarily a good indicator of “goodness”…

Anura April 9, 2014 3:04 AM

Sure, you do have multiple tools, but the problem is we are using a screwdriver for everything we do, and the end result is an obvious pun.

Wael April 9, 2014 3:45 AM

@ Anura,
Funny! I was having a similar discussion today with someone. I understand that this is a language comparison/contrast; however, it’s but one aspect of the problem. Suppose you have a perfect “C” programmer working on a project… Mastery of the language isn’t a guarantee the code will be bug-free. There are other aspects that need to be well understood in order to create a more secure, less buggy product. These would include the technology (Bluetooth, or cryptography, or whatever the project is about). The developer would also need to understand the limitations and capabilities of the hardware. Now if, say, the BT protocol has some weaknesses, then bugs will rear their heads at some unfortunate point in time.

I do realize you are talking about language characteristics that make it easy for people to “create” bugs, and you’d be justified arguing the converse by supposing a developer with perfect knowledge of technology, platform, etc., and then asking which language is more prone to bugs or security holes; that, of course, implies less-than-perfect mastery of the programming languages.

My comments are simplistic because they ignored the effects of large teams collaborating on the project with varying skill-sets, etc, etc,…

Mike the goat April 9, 2014 5:29 AM

As a refreshing aside, this morning I rang a small hosting company a client uses (despite having a 100mbit connection direct to Verizon) to explain the heartbleed bug and to note that the VPS they are using is vulnerable. The response I got was great: “we know, we are rolling out a patch in about an hour.”

Sure… Days late but still not bad for a VPS company.

Benni April 9, 2014 5:55 AM

Today the chairman of the NSA-Untersuchungsausschuss, the commission of the German parliament that aims to investigate the NSA affair, resigned from his position after one day:

http://www.spiegel.de/politik/deutschland/nsa-ausschuss-vorsitzender-tritt-wegen-snowden-zurueck-a-963405.html#js-article-comments-box-pager

He is saying that the opposition wants to question Snowden, and this would lead to a conflict of interest with him being head of the parliamentary commission that oversees the German intelligence agencies.

I wonder what Snowden might additionally have? And what it is that the NSA does together with the BND that is so important a secret to keep that any questioning of Snowden would raise a conflict of interest for the parliamentary oversight commission of the BND.

Randalf April 9, 2014 6:26 AM

@Benni
He is saying that the opposition wants to question Snowden, and this would lead to a conflict of interest with him being head of the parliamentary commission that oversees the German intelligence agencies.

Maybe he is actually concerned about getting into hot water.

Benni April 9, 2014 7:32 AM

@Randalf: certainly. The question is, what is this hot water? I mean, what answers could Snowden give that would lead to a conflict of interest between the man who questions Snowden and the chief of the oversight committee of the BND?

Clive Robinson April 9, 2014 8:36 AM

@ Anura, Nick P,

    I do agree to an extent that security can be solved today…

Nagh, no, not a chance 🙁

Firstly, as I’ve said a number of times before, “Security is a ‘Quality Assurance issue’, and like quality it has to be fully in place on day zero.”

Secondly, as I’ve likewise said numerous times, there is the “Known Knowns, Unknown Knowns, Unknown Unknowns” issue.

Thirdly, there is the now well acknowledged “low hanging fruit” problem, along with the “tiger and shoe laces” issue.

If you take a look at point one you might think “day zero” means you could start tomorrow; sorry, no you cannot. Because it also applies to the tools you use and everything else downwards on the computing stack to device physics, as well as upwards past layer 9 (management) through to international treaties. So at best we are in for the long haul.

Even then you have the specific vectors in classes of vectors that the Knowns and Unknowns apply to. Currently we are not even close to working out how to deal with the specific Known vectors in known classes of vectors. I’ve made suggestions in the past, but until the industry gets a grip on responding not to individual specific vectors but to classes of vectors, we will fall steadily further and further behind even the least skilled of attackers.

Which brings us onto the “low hanging fruit” issue: the least skilled of attackers are making gains simply because as an industry we are failing to do things even remotely right. As Nick P has pointed out on several occasions, we have not learned from, let alone implemented, quite easy security solutions from nearly half a century ago, and we appear to suffer the fate of those who don’t learn from history, in that we appear condemned to relive it over and over like some sick “groundhog day”.

But whilst we scramble around after the least skilled attackers, those with even moderate skills are in our systems worse than a plague of ticks, and we are so drained we don’t even have the energy left to scratch, let alone get the insecticide to help keep them down.

And one reason for that, as some of us have been saying for many years, is the likes of the NSA, GCHQ et al. They have done just about everything they can to make sure that the only systems we can get are “thin skinned” and the insecticides made of sugar water.

Ross J. Anderson was recently asked a question by a journalist, to which his response was to make laws to get rid of MI5, and whilst our host has not said the same of the FBI, I suspect he is broadly in agreement. Hoover might be dead and buried, but his spirit of abuse is alive and well and appears unable to be killed, so rather than call upon a priest to exorcise it, perhaps we should call in the demolition team to pull the house down and burn the remains to molten metal and dust.

But whatever we choose to do, when and if we decide –instead of prevaricating– it will take upwards of twenty years to get to where we could have been a third of a century ago.

Mike the goat April 9, 2014 9:22 AM

Clive: I like to use the aircraft manufacturing industry as a collective that is doing things right. A plane crashes or a defect is found, the NTSB (or international equivalent) issues a report, and problems are fixed so that the issue doesn’t recur. Even issues that aren’t strictly mechanical have been studied (human factors) and mitigation attempted (Crew Resource Management). Of course it isn’t perfect, but when you look at our industry it just makes me shake my head.

All I know is that we will always have work in this field, and thanks to crummy software written in crummy languages executing on crummy operating systems ad infinitum, our work is in more demand than it should be.

Nick P April 9, 2014 12:38 PM

@ Figureitout

re C

It is garbage from a design & engineering viewpoint. It’s a cross-platform assembler with certain extra attributes. The year is 2014. In the early 70’s, they had languages that were equal in power, performance and reliability to Go. In the 80’s-90’s, more systems programming languages appeared with C-like speed, yet without any of its problems. The incredibly weak & extremely diverse hardware that justified the creation of C no longer applies to most users’ systems. So, with that backdrop, it’s obvious which language an engineer wouldn’t choose to build robust systems. Almost all chose it anyway. (sighs)

The next step is to eliminate C like they should’ve done in the 90’s. The safer and more secure languages already performed very well back then. On today’s hardware, they’re blazing fast. And investing in them instead of C would make them even more powerful. That Oberon, for example, was targeted at a microcontroller shows that even for small hardware we don’t need C, although with less runtime support we get less safety. Tradeoffs still exist with safer languages, yet C constantly trades away your safety with every step. You have to fight it constantly to ensure it works properly. That’s hard to justify as an engineer, as all that effort could be spent writing more useful code instead.

@ Anura

“Sure, Python is written in C; if it wasn’t, this probably wouldn’t have happened:”

LOL. Epic burn. Btw, Python has been written in Python (PyPy), too, and a JVM was written in Oberon. Both worked very well. One team, the Juice project, even tried to replace Java applets with Oberon applets. They cleverly sent a compressed version of the abstract syntax tree instead of source, allowing a type check for safety, super-fast compiles, native execution speed, and low bandwidth use. The world chose Java instead, and now we gotta do a lot of downloads and patches. (sighs again)

Here’s my new headline for C: “Giving black hats job security since 1973.”

re Exceptions

They could be an issue. They were solved in a few previous works that did similar things to what we’re discussing. I don’t have anything specific to add on the issue because the specifics depend on what you end up designing.

@ Wael

“I’m still breastfeeding” (BASIC programmers) LOL.

Hey, hey, hey, I used to write killer code in BASIC. My first language and first favorite. All my UI’s for hacking tools were VB6, with low level code being a separate console BASIC with great FFI. My stuff was readable, got coded quick, rarely crashed, and none of my buddies hit me with a code injection [on my BASIC code at least]. First time I was forced to work in C++ I said “F*** this!” and just wrote a BASIC compiler that generated C++. In BASIC. In one day*. 😛

* Dealing with all the quirks and BS of the MS VCC compiler took way more than a day.

Later coded a version of BASIC in LISP so I could use interactive development, macros, live updates, cross platform, etc with a command to autogen actual BASIC code that rarely failed a compile. It could easily interface to VB6 GUI’s. I could do a whole small, networked application in the time C++ guys were debugging one big module. So all you BASIC haters can suck on THAT workflow. 😉

Side note: I’ve actually considered rebuilding this tool. I’m quite a bit older now with less energy, so I’m not sure I have it in me. Cool enough, though, there’s a project on the internet somewhere that did a C-like subset of LISP that compiles to C. They do it so they get C without its development workflow. I’ve got to top that, so I might do the unthinkable: write Ada 2012 in LISP. The most powerful, flexible, out-of-control language married to the most restrictive, straight-jacket-for-programmers, control-freak language, with the result being better than both. (evil, villainous laughter)

@ yesme

“Plan-9 did have it’s own small TLS library (wink wink). Just download it, look at the code and compare it with GNU.”

I think I’ll do that. I do like looking at the work of smart people and comparing old to new. One can learn a lot doing so.

“That’s what you see with the GNU Autotools in combination with m4.”

No, no, no, don’t bring that up. That is one of the biggest disasters in the field of software development. I’m talking about language features that are already in real languages. Interestingly enough, the Limbo safe concurrency I mentioned was co-developed by Pike and deployed in real products. Should be easy to add, eh? Macros are definitely riskier, yet Julia has already deployed them. Type-safe linking and updates could be tricky, yet a few college students threw it together in Modula-3.

So, I’m not talking about band-aids, far out ideas, etc. I’m talking about the current level of capabilities in modern programming languages. These capabilities have already been proven in production and often deployed by people with little experience. Expert, well-funded language designers doing a clean slate language should easily be able to do it. I don’t want every feature in the world, but I could imagine a 21st century effort should have capabilities better than 60-70’s, eh? That’s the only gripe with them.

“The Go guys did take a conservative approach. They also deliberately avoided things that could go wrong in the long run. Take a look at the less is exponentially more blog post from Rob Pike. That explains the rationale of Go.”

The “Aha” moment. Your article may have let me wrap my mind around it. They had their heads buried in C++, Java, etc. They also wanted to keep the new language as simple as possible. So they mentally started from the lineage and features of languages like ALGOL/C++/Java, then started cherry-picking features. I bet features like the ones I referenced never entered their minds. They end up with an ALGOL68-like language that’s indeed much simpler than C++ and a suitable replacement. It makes sense that they’d do it that way given that mental framework. Just too bad they limited themselves as such.

Note: Your hate of M4 might make you love this article. Makes me think the whole UNIX stack is garbage and needs a reboot. Article was a partial inspiration for me looking at other architectures.

@ Clive Robinson

Of course, you know my goal is simply to get rid of the low hanging fruit. Getting closer to that goal than ever. The safe languages are a big help there. I’m also specifically pushing languages with little to no runtime rather than heavyweights like CLR and JVM based languages.

Clive Robinson April 9, 2014 1:51 PM

@ Nick P,

Re :- Beginner’s All-purpose Symbolic Instruction Code

Yup, I cut my teeth on the original Dartmouth College dialect, written in Fortran on a computer with the OS written in Fortran (hence the string length limit of 60 chars, due to the Hollerith limit).

Mad as it might sound, somewhere I have a prototype Pascal compiler written in BASIC from around 1977-8 as a Uni graduate project, both in “Print out” and “Punch Paper Tape” from a KSR Teletype… (remember I did say I liked Pascal 😉

Later, with a couple of tweaks, it did run under MS QBasic and Borland Turbo Basic, and with a fiddle or two I got it to also produce P-Code to run on the Apple ][.

A later trick was to make a BASIC to P-Code converter, which has given me a thought for you to chew on: “a type safe / secure equivalent of a P/F/J-Code interpreter to be cross compiled for various CPUs”.
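To make the idea concrete, here is a toy sketch of the dispatch loop at the heart of any such interpreter (opcodes and layout invented purely for illustration). A “secure” variant would add stack-bounds and type-tag checks at each step, and a hardware version implements the same loop in logic:

    #include <stdint.h>

    enum op { OP_PUSH, OP_ADD, OP_MUL, OP_HALT };

    /* Minimal stack-machine interpreter: fetch, dispatch, execute. */
    int32_t run(const int32_t *code)
    {
        int32_t stack[64];
        int sp = 0, pc = 0;

        for (;;) {
            switch (code[pc++]) {
            case OP_PUSH: stack[sp++] = code[pc++]; break;
            case OP_ADD:  sp--; stack[sp - 1] += stack[sp]; break;
            case OP_MUL:  sp--; stack[sp - 1] *= stack[sp]; break;
            case OP_HALT: return stack[sp - 1];
            }
        }
    }

    /* (2 + 3) * 4 becomes: PUSH 2, PUSH 3, ADD, PUSH 4, MUL, HALT */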

Nick P April 9, 2014 2:47 PM

@ Clive Robinson

“A later trick was to make a BASIC to P-Code converter, which has given me a thought for you to chew on: “a type safe / secure equivalent of a P/F/J-Code interpreter to be cross compiled for various CPUs”.”

Great minds think alike: I was toying with that and Modula’s MCODE last night. Just a short brainstorm, though. Wirth customized his processor to be an MCODE interpreter. Then even OS code could be written in Modula-2 and receive some benefits. Maybe I need to look at P-code again, as there are excellent BASIC IDE’s today (eg RealBASIC, GAMBAS) with plenty of libraries and coders. A secure P-code machine might be easy to build, esp if we build a hardware interpreter. Maybe add segmentation or tags to force code/data separation, function/module-level compartmentalization, and IO protection. Probably do a dedicated I/O processor with all necessary protections so I/O doesn’t break the main machine abstraction. Plus, I still think mainframe Channel I/O is the shit.

You know, it might be good that you mentioned it because I just threw all that together off the top my head and it looks decent. I bet non-pro’s in academia could build it. Might be easier to do an EAL7-grade development on a simple machine, too. I’m having a tough time recently in my designs with the tradeoffs of hardware complexity: one direction gives more system protections far as features, but other direction gives more assurance of implementation. It’s driving me nuts. My approach so far is to do what I used to do here which is explore high level constructions that someone else can build, but using my intuition and experience to prune out assurance-killing choices.

Like I told RobertT, I wish I chose to master hardware design a long time ago instead of software. I’m [probably] too old to learn enough to do the whole thing myself. Had my fragmented memory included hardware design, I’d have already built 10 or 20 different solutions by now lol.

EDIT before submit:

Quick Google found Pascal MicroEngine. So, hardware version has already been done. They did I/O through BIOS, a strategy I’m not adopting as I won’t have BIOS. I also found a Wirth paper on history of Pascal where he pointed out P-code was a hack for quick porting to new architectures. However, it was a clever and effective one in that Pascal got ported to “80 architectures ranging from 8-bit to Cray” in six or so years. And by people who didn’t know compiler design. That’s pretty awesome.

Nick P April 9, 2014 6:17 PM

@ Clive

Found two more LISP machines

Thought you might find these interesting. The first is a simple LISP chip that basically just does those low level LISPy things like following pointers. It doesn’t even have an ALU. The second is a Scheme machine designed with four execution units and static RAM to maximize throughput, namely what gets done between memory requests. First was VLSI, second TTL. Both interesting.

LISP chip
ftp://publications.ai.mit.edu/ai-publications/pdf/AIM-514.pdf

Scheme86 machine
http://groups.csail.mit.edu/mac/users/mhwu/scheme86/sbt.ps

DB April 9, 2014 6:55 PM

A good carpenter doesn’t blame his tools… because he has the right tools for each task, and knows when to use which.

Clive Robinson April 9, 2014 7:29 PM

@ DB,

A point I have repeatedly made to “management types” over the years, but they never get it due to various lies they’ve been told 🙁

Another old saw which is perhaps appropriate for C is “Children and fools should not use sharp edged tools”, especially with many “code cutters” subscribing to “When you only have a hammer every problem looks like a nail” as they desperately try to hammer out their code…

@ Nick P,

You might find this of interest (especially if you know someone with an Apple ][ with a language card and UCSD Pascal and Fortran, or another language they ported to P-Code II.0):

http://ucsd-psystem-vm.sourceforge.net/

Thanks for the LISP links; I’ll have a look at them over the next couple of days.

Oh, a thought for you: “How come most of these VM interpreters such as P-Code are stack based?”

Anura April 9, 2014 10:03 PM

If there are two things that need to go away on the web, they are Flash and Java. If those can die, I will be a very happy person. They introduce so many security vulnerabilities and every time I update them on my windows box I have to opt-out of them installing crapware on my computer.

Figureitout April 10, 2014 12:37 AM

LOL. Epic burn.
Nick P
–Yeah whatever. Keep sucking those BASIC 8008135 and let me know when your tummy can handle a beer. I’ll change your diaper after too. :p

Wael April 10, 2014 12:49 AM

@Nick P,

Hey, hey, hey, I used to write killer code in BASIC.

I did too, and it was fun. The language syntax was simple enough to allow you more bandwidth to spend on the functionality of the program. Those were fun days… I was weaned a long time ago, though 😉 These days, by the time you master a language it’s almost obsolete…

Figureitout April 10, 2014 12:53 AM

OT
–Researchers at MIT are working on a “perching” ability for drones to park on a power line and charge their batteries, then go back out. It’s funny b/c I have some fantasies of being able to have a “whip-like” device that I sling over a power line and can either get massive amounts of power and blast out radio messages or just charge something. Some rather pleasurable physics and maths if you’re into that; neat paper, even though you obviously know your work is going towards a police state.

http://groups.csail.mit.edu/robotics-center/public_papers/Moore11.pdf

Wael April 10, 2014 1:44 AM

@Figureitout,

Nice paper. Technically innovative, in a way. Weak from a security perspective because the drone becomes a sitting duck. Perhaps there is a typo in the paper.

Notice that these two signal are 90 degrees out of phase with one another and together they represent the real and complex components of the message signal.

I think “complex” should be “imaginary”. But I could be wrong…. Real+Imaginary= complex.
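For reference, the usual quadrature convention (my notation, not the paper’s), with I the real part and Q the imaginary part of the complex baseband signal:

    s(t) = I(t)\cos(2\pi f_c t) - Q(t)\sin(2\pi f_c t)
         = \operatorname{Re}\!\left[\big(I(t) + jQ(t)\big)\, e^{j 2\pi f_c t}\right]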

The paper reminds me of one of my Electromagnetics profs. It reminds me of him because his tests were very interesting. One of the problems went something like this: a plane is flying at 35000 feet, ground speed 500 miles an hour, the fuselage is made of an alloy with a resistance of so and so Ohms… Calculate the current induced across the length of the plane due to moving through the magnetic field of the earth.
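For the curious, the back-of-envelope version (all numbers assumed for illustration: conductor length ~60 m, 500 mph ≈ 223 m/s, Earth’s field ≈ 50 µT, everything taken mutually perpendicular), using the motional-EMF formula:

    \mathcal{E} = B v L \approx (5 \times 10^{-5}\,\mathrm{T})(223\,\mathrm{m/s})(60\,\mathrm{m}) \approx 0.67\,\mathrm{V}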

On a related note, I read about some retailers who plan on utilizing drones to deliver goods you buy online to your address. Question is, when you see a drone flying by your house, do you run towards it to collect your stuff, or do you run away from it before it drops a tiny explosive on you?

Figureitout April 10, 2014 2:09 AM

Wael
–Yeah, think you’re right. At the conclusion: “It can be concluded that by using the real and imaginary components output”. There are typos everywhere; I made some in this thread even, lol. It’s most annoying in something like a nice book, like my nice hardback ARRL operator’s handbook.

Hmm, never thought of that really, lol. Like a plane moving through the magnetic field of the earth. So of course a plane really low and super fast will create some power.

Yeah, the company is Amazon. Very disturbing; that system will fail big time for a long time before it’s reliable. There was a recent incident in a triathlon (I used to do those, kind of want to pick it up again since I don’t do competitive sports anymore). Suspicions are it was hacked… If I see a little drone and can knock it out of the sky, I’m getting a drone lol.

http://www.runnersworld.com/general-interest/triathlete-hit-in-head-by-drone

Anura April 10, 2014 2:19 AM

As with every other problem in life, the solution to the drone problem is really powerful lasers.

Nick P April 10, 2014 11:23 AM

@ Clive Robinson

“Oh a thought for you “How come most of these VM interpreters such as P-Code are stack based?””

On the surface, it looks like a question leading to deep discussion. The truth is the only reason is legacy. In the old days, some believed stack machines’ more compact encodings would lead to faster and more memory-efficient systems. Similar in nature to the CISC vs RISC debate. Over time, hardware shifted to register machines as they proved to be faster, and many “stack”-based software constructs were actually emulated on registers. (Cached if nothing else.)

Now, why are so many done that way today? That’s anyone’s guess. My guess is the effects of legacy: so much stack work rolled over onto the next generation. They were taught to use stack-based designs, the prevalent processor (x86) leans heavily on a stack, most languages worked with stacks, textbooks taught about stacks, and so they write their VM’s/OS’s/chips as stack machines. Fortunately, what the majority was doing didn’t stop outliers among hardware, software, and language designers from coming up with very different options.

@ Wael

“Those were fun days… I was weaned a long time ago, though 😉 These days, by the time you master a language it’s almost obsolete…”

Weaned, haha… Funny thing is people say things are obsolete, but I don’t buy it much of the time. A useful tool is still a useful tool. It’s only obsolete to me if I have something so much better available that it’s painful to use the old one. Look at this feature list and code snippets. Would you be relieved or pained to work with it? I’d be relieved & get plenty of work done. Although it does seem a bit obsolete in that new languages have surpassed its capabilities. Python or Ruby, particularly, seem to be the modern BASIC. Maybe obsolete is relative and situational. Not sure.

Note: Another example was VMS or Alpha processors. People kept telling me how obsolete they were. Yet I still have nothing comparable to PALcode on Alpha, whose security benefits were trade secrets to me. And gone. (tears) Then later I see HP blow up a datacenter to test WAN cluster failover times with this result: Windows 120s, Linux 80s, NonStop 22s, VMS 18s. VMS came back in 18s, even faster than NonStop?! Of course, someone thinking of obsolescence would have predicted a different ordering. I think of capabilities, where VMS mastered reliability/clustering, so I expected a quick recovery and the No 1 or 2 spot. “Obsolete” mid-80’s clustering tech, barely maintained over time, outperforms modern alternatives. Lol.

“The paper reminds me of one of my Electromagnetics profs. It reminds me of him because his tests were very interesting. ”

That actually did sound interesting. My professors were more boring in comparison.

“Question is, when you see a drone flying by your house, do you run towards it to collect your stuff, or do you run away from it before it drops a tiny explosive on you?”

LOL. Never crossed my mind. Wael, we gotta go to DC now! We gotta tell them to get rid of the drones before innovation in online retail ceases to exist! Haha.

@ Anura

“I used to code professionally in VBScript+ASP… What do I win?”

My condolences and an invitation to ASPAnonymous to share your pain with others. I had to do that stuff for a short time. So painful.

@ Anura, Figureitout

re McAfee

You two might like this video by John McAfee. Dude is crazy, yet hilarious.

vas pup April 10, 2014 12:46 PM

@Anura:”Regulation is killing productivity!”, “Taxes are killing investment!”
I’d say “Overregulation is killing productivity”. Any policy should not be applied without taking current conditions into consideration. In Italy the “Italian strike” was invented, when workers (as I recall, railroad folks), deprived of the right to go on strike, simply started following all policies, manuals, regulations, etc. to the letter.
Everything was brought to the point of collapse, but with zero blame attachable to the workers’ action. All the workers’ reasonable needs were finally attentively listened to and satisfied. That is a good example of how to turn the tables on the subject of regulations. The point is that regulation basically should set up a list of actions prohibited under any circumstances (like no torture, period), but it should not take on a paradigm of micromanaging allowable actions (only my opinion).

Taxes have two legs: fiscal (collect $) and stimulative (provide incentive). When investments are in something good (real products & services, not bubbles; research, etc.), create good-paying jobs within one’s own country, and treat employees as human beings to a high standard (e.g. Google), then the fiscal component should be decreased for the sake of the stimulative component. I think profits out of investments in a casino versus in new high tech (e.g. $10 million) should NOT be taxed at the same flat rate based on profit amount only, but taking into consideration the other non-monetary components of the particular investments stated above. Same thing with education: if you need more engineers, IT folks, scientists, and doctors, you should provide financial incentives for their education (grants, zero-interest loans, etc.) but zero incentive for new lawyers and investment bankers, because of the pure law of supply and demand at the level of society. Meaning, grants take into consideration what field you are going into, not just the degree you want to obtain.

Anura April 10, 2014 1:26 PM

@vas pup

This is off topic, so I don’t really want to get too far into this discussion, but I do want to make this point.

If you take a look at productivity growth ranked by decade:

http://data.bls.gov/timeseries/PRS85006092

Total Productivity Growth
1950s 31.35%
1960s 31.33%
2000s 28.68%
1990s 22.53%
1970s 20.78%
1980s 15.59%

And then take a look at GDP growth ranked by decade:

http://www.bea.gov/national/xls/gdplev.xls

Total GDP Growth
1960s 55.47%
1950s 50.88%
1990s 37.53%
1970s 37.20%
1980s 35.88%
2000s 19.44%

Then you notice something that is very out of place. If regulations were the problem, then the 2000s should have been towards the bottom in productivity growth. Note that this is not due to the recession either. If we look at the annualized rates, with only the first 8 years of the 2000s (2000-2007), then we get the same picture:

Annual Productivity Growth
1950s 2.76%
1960s 2.76%
2000s 2.71%
1990s 2.05%
1970s 1.91%
1980s 1.46%

Annual GDP Growth
1960s 4.51%
1950s 4.20%
1990s 3.24%
1970s 3.21%
1980s 3.11%
2000s 2.65%

As for taxes, you have to consider historical rates, which is problematic due to different rules and changing sources of income, but suffice it to say the 1980s and 2000s were the two decades with our lowest top marginal tax rates and long-term capital gains rates, but also the two worst decades for growth. Plus, if a lack of money for investment were the problem, then we wouldn’t also have the highest income inequality.

I can only speculate as to the causes of our weak economy after that point, of course, but I would suspect the fact that we have been cutting government investments (e.g. education and infrastructure), as well as that for the past few decades businesses have focused on cost-cutting, which has kept demand depressed, resulting in a vicious cycle where demand is low, so the way to grow profit is to keep salaries down and cut hours (which you can do if you increase productivity), but that reinforces the low demand, which means investing in expansion is going to be more difficult to profit from.

Back to my original point, as you can see, this took more than 140 characters.

vas pup April 10, 2014 4:00 PM

@Anura. I got your point, and agree in particular on: “as well as that for the past few decades businesses have focused on cost-cutting, which has kept demand depressed, resulting in a vicious cycle where demand is low, so the way to grow profit is to keep salaries down and cut hours…” That modus operandi is absolutely counterproductive, as the figures you provided confirm.

My vision (I have stated it more than once on this respectful forum) is that for any organization (business, government, or non-profit), the most valuable resource is the people working for it, with their skills, knowledge, loyalty, and commitment. Investing in all aspects of workforce development (please do your research on Google’s attitude toward its employees), rather than treating people as (pardon my comparison) disposables (“used condoms”), generates substantially more revenue in the long run, and it simultaneously creates a middle class with a sense of dignity and self-respect, which is the fundamental basis for tranquility and stability in society as a whole. The more businesses adopt such an attitude, the better for those businesses first of all.

One sleepless night several years ago, I watched a C-SPAN panel of businessmen (billionaires only). One of them ran a consulting business; another man on the same panel had come to him several years earlier saying he wanted to get rid of all his employees (several thousand). The consultant was puzzled: really? Yes, the billionaire said, I want all of them to become co-owners/partners (not just be given one share and told they are co-owners), to dramatically change their attitude toward their place of employment. The required plan was developed and implemented. As a result, asset growth increased dramatically within several years (I don’t remember the exact figures, but something like from $8 billion to $14 billion), and that was not a hedge fund or a bank. Thanks again for your posting.

Wael April 11, 2014 1:10 AM

@ Nick P,
Regarding Alpha, I share your viewpoint. I also was a fan, and worked on it for about a year.

    Then, later I see HP blow up a datacenter to test WAN cluster failover times with this result: Windows 120s, Linux 80s, NonStop 22s, VMS 18s.

I wasn’t aware of this benchmark! But I have two comments that may explain the results: 1) Alpha was a processor optimized for VMS, as far as I remember. 2) NonStop has a lot of redundancy, which may come at a cost.

Re BASIC: Xojo looks awfully close to VB.NET.

Nick P, I’m afraid I’ll pass on the DC trip. Not into politics 😉 Politics is a dirty business that should have been featured on one of these episodes:
http://www.discovery.com/tv-shows/dirty-jobs

Clive Robinson April 11, 2014 4:36 AM

@ Wael,

    politics is a dirty business that should have been featured on one of these episodes…

When I was still at school my father told me to be careful about reading “modern politics”, for three basic reasons.

The first is that “history is written by the victors”, and that “the victorious politicians will hide anything that makes them look anything other than glorious”. His second point was that the fact we lived in a “representational democracy” meant that politicians were dirty. His third point was that “any sensible mammal, such as a pig or a baboon, when given enough space will choose not to live in its own or any other animal’s mire, unlike politicians, who positively revel in wallowing in it”.

He went on to point out that, no matter how small the space, a wise person does not need to live in a toilet, and that I should always give myself space, if not physically then at least mentally, to ensure I kept out of the mire.

Nick P April 11, 2014 8:50 AM

@ Clive Robinson

Thanks for the link. It was an interesting article. The person had a forged photo and everything. The service provider had shoddy security, either not enforcing technical controls or being easily conned. And then they claim French law is an obstacle to their cooperation (which I agree with, but for different reasons). That situation just sucks all over.

And people who’ve been on the blog a long time will note I’ve regularly mentioned the importance of choosing the jurisdiction of an organization. You want a real legal system to exist, but minimal risk to the business, with more benefits (privacy, I.P. protection) if possible. France isn’t quite the country I’d choose to depend on, especially considering they’re prolific at economic espionage as well. Looking at options recently, I noticed Zug, Switzerland is very popular among the big companies for its laws. Ireland is the other one, mainly for taxes. I’m not sure how good or bad the other laws are over there.

Btw, their privacy policy is actually interesting. Seems more reassuring than many over here. Knowing Australia is a Five Eyes country, though, I wonder how much is true and how much is BS. That I must wonder says much about the kind of world we live in today.

Nick P April 11, 2014 8:55 AM

@ Wael

“1) Alpha was a processor optimized for VMS as far as I remember. 2) Nonstop has a lot of redundancies that may come at a cost. ”

Decent theory on Alpha and VMS. NonStop has a local failover time of about a second, though, so I doubt it’s the redundancy. You get a grade of 50%. 😛

“Not into politics 😉 politics is a dirty business that should have been featured on one of these episodes:”

Lol. I couldn’t bring myself to watch too many episodes of that show. Too nasty. I respect dude for doing the jobs, though. That entertainer works for his paycheck.

Clive Robinson April 11, 2014 9:03 AM

@ Bruce,

You might want to have a look at this biometric piece,

http://gcn.com/blogs/cybereye/2014/03/iris-scans-aging.aspx?admgarea=TC_SecCybersSec

It points out that iris scanning (which the USG, with NIST’s help, is trying to turn into a universal authenticator) may well not be reliable at all, with accusations that NIST’s methodology is wrong, etc., making it “quite a bun fight at the chimps’ tea party”, the trouble being that at least one chimp is a proverbial 600lb gorilla…

Wael April 11, 2014 10:15 AM

@Clive Robinson,

    He went on to point out that, no matter how small the space, a wise person does not need to live in a toilet, and that I should always give myself space, if not physically then at least mentally, to ensure I kept out of the mire.

Your father is a wise man, and you got his DNA. I used to believe that mediocre security is a result of technical reasons, but that is not always the case. “Mire” has got to be factored in as well.

Anura April 11, 2014 12:04 PM

@Clive Robinson

You might find this of interest, and a cautionary note as to what can happen out there:

http://blog.fastmail.fm/2014/04/10/when-two-factor-authentication-is-not-enough/

Here’s my proposal: In-Person authentication. Set up an authentication company; that company has offices around the world. In order to initially establish identity, you have to show up to one of their offices and do the following:

Be photographed
Present sufficient documentation on your identity
Be fingerprinted

Every time you want to modify agreed-upon account information, you have to show up and go through the same routine, which is then compared against your previous visit. As long as there is no way to forge documents, facial features, or fingerprints, you are good.
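
A minimal sketch of the record comparison on a repeat visit (everything here is a hypothetical placeholder; real biometric matchers compare templates and return fuzzy similarity scores, not the exact-equality stand-ins used to keep this runnable):

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class EnrollmentRecord:
        face_template: bytes        # biometric capture taken in the office
        document_ids: List[str]     # identifiers of the documents presented
        fingerprint_template: bytes

    # Stand-ins for real matchers; exact equality keeps the sketch runnable.
    def faces_match(a: bytes, b: bytes) -> bool:
        return a == b

    def fingerprints_match(a: bytes, b: bytes) -> bool:
        return a == b

    def verify_visit(previous: EnrollmentRecord, current: EnrollmentRecord) -> bool:
        """Permit an account change only if this in-person capture matches
        what was recorded on the previous visit."""
        return (faces_match(previous.face_template, current.face_template)
                and bool(set(previous.document_ids) & set(current.document_ids))
                and fingerprints_match(previous.fingerprint_template,
                                       current.fingerprint_template))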

There are ways to forge all of those, aren’t there? Plus, you can compromise the system to change the documents. You know, this is hard. But I digress; the point is that not everything needs to be instant, and maybe for critical things (e.g. establishing credit, or changing account controls for corporate domain accounts) we need a more solid set of procedures to verify identity, even if that means you can’t do it from your hotel room while on vacation. It’s something to think about.

At least until we all have chips implanted in our brains that can generate and permanently store 4096-bit Diffie-Hellman keys.

tinfoil April 12, 2014 11:44 AM

@Bruce

Why does schneier.com not support ECDHE for SSL/TLS? And at the same time you allow RC4 to be used. Why not disable RC4?
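
One way to see what a server will actually negotiate is to offer only ECDHE suites while excluding RC4, then inspect the agreed cipher. A rough sketch using Python’s ssl module (the cipher string is standard OpenSSL syntax; the hostname is just an example):

    import socket
    import ssl

    # Offer ECDHE suites and explicitly exclude RC4; a handshake failure
    # suggests the server supports none of the offered ECDHE suites.
    ctx = ssl.create_default_context()
    ctx.set_ciphers("ECDHE:!RC4:!aNULL")

    host = "www.schneier.com"  # example target
    with socket.create_connection((host, 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            print(tls.cipher())  # (cipher name, protocol version, secret bits)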
