Building Smarter Ransomware

Matthew Green and his students speculate on what a truly well-designed ransomware system could look like:

Most modern ransomware employs a cryptocurrency like Bitcoin to enable the payments that make the ransom possible. This is perhaps not the strongest argument for systems like Bitcoin -- and yet it seems unlikely that Bitcoin is going away anytime soon. If we can't solve the problem of Bitcoin, maybe it's possible to use Bitcoin to make "more reliable" ransomware.


Recall that in the final step of the ransom process, the ransomware operator must deliver a decryption key to the victim. This step is the most fraught for operators, since it requires them to manage keys and respond to queries on the Internet. Wouldn't it be better for operators if they could eliminate this step altogether?
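To make the key-management burden concrete, here is a toy Python sketch of the hybrid scheme most ransomware uses. The tiny RSA parameters and the XOR "cipher" are stand-ins for illustration only, not real cryptography. The structural point: the victim's machine can encrypt the per-victim file key to the operator's public key, but only the operator's private key can release it -- so the operator cannot avoid the delivery step.

```python
# Toy sketch of the hybrid scheme most ransomware uses (NOT real crypto:
# textbook RSA with tiny primes and an XOR "cipher", for illustration only).

# Operator's RSA keypair (p, q are absurdly small; real keys are 2048+ bits).
p, q = 61, 53
n = p * q                          # public modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent, held only by operator

def xor_encrypt(data: bytes, key: int) -> bytes:
    # Stand-in for a real symmetric cipher: XOR every byte with a one-byte key.
    return bytes(b ^ key for b in data)

# --- On the victim's machine ---
file_key = 42                       # per-victim symmetric key (random in reality)
ciphertext = xor_encrypt(b"victim's files", file_key)
wrapped_key = pow(file_key, e, n)   # file key encrypted to operator's PUBLIC key

# The victim now holds ciphertext + wrapped_key, but cannot recover file_key.

# --- On the operator's side, after payment ---
recovered_key = pow(wrapped_key, d, n)   # requires the PRIVATE exponent d
plaintext = xor_encrypt(ciphertext, recovered_key)
print(plaintext)  # b"victim's files"
```

The delivery step exists precisely because `d` never leaves the operator; eliminating the step means finding some mechanism that releases the key automatically on payment.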


At least in theory it might be possible to develop a DAO that's funded entirely by ransomware payments -- and in turn mindlessly contracts real human beings to develop better ransomware, deploy it against human targets, and...rinse, repeat. It's unlikely that such a system would be stable in the long run -- humans are clever and good at destroying dumb things -- but it might get a good run.

One of the reasons society hasn't destroyed itself is that people with intelligence and skills tend to not be criminals for a living. If it ever became a viable career path, we're doomed.

Posted on March 7, 2017 at 8:15 AM • 22 Comments


JG4 • March 7, 2017 8:18 AM

"If it ever became a viable career path, we're doomed."

It came to pass in post-war Germany, with the economy in ruins, that some of the brightest turned their minds to crime. I don't have the cites handy, but there were some brilliant capers. Perhaps Clive knows some of the folklore.

K.S. • March 7, 2017 8:21 AM

"One of the reasons society hasn't destroyed itself is that people with intelligence and skills tend to not be criminals for a living."

A much better explanation is game theory (cooperators vs. exploiters). I think the predisposition is genetic in nature; that is, it is unlikely that under normal circumstances sufficient incentives could exist to turn a cooperator into an exploiter.

SoWhatDidYouExpect • March 7, 2017 8:36 AM

Don't look now, but the process you claim is NOT happening is happening, based on all the recent events in DC. What used to be cooperators are now turning to exploitation.

Winter • March 7, 2017 8:37 AM

"If it ever became a viable career path, we're doomed."

It is an old joke, but still true:
Stupid criminals rob banks, smart criminals own banks.

The 2008 financial crisis showed us, again, how true this joke still is. The people who pocketed billions, at a cost to society of trillions, got a handshake and a thank you.

Double F • March 7, 2017 8:48 AM

"people with intelligence and skills tend to not be criminals for a living."
In the long term, behavior unbound by a moral or ethical code tends to converge on self-destructive patterns. This applies to individuals, but probably even more so to groups. Look at the chaotic White House, where everybody regards everybody else with suspicion. Higher degrees of efficiency require higher levels of order and coherence. Human values and virtues (honored across geographies and times) permit such long-term coherence in groups.

keiner • March 7, 2017 8:54 AM


ahh, you mean Enron. And Goldman Sachs! Ooops, wrong continent, wrong time. But the USA is always "post-war", isn't it? Or pre-war? Disturbing times...

My Info • March 7, 2017 9:03 AM

@Bruce Schneier

One of the reasons society hasn't destroyed itself is that people with intelligence and skills tend to not be criminals for a living. If it ever became a viable career path, we're doomed.

No. It isn't quite that bad. Those who commit crime for a living make themselves enemies who will never rest until their criminal way of life is destroyed. On the other hand, those who make an honest living and leave others in peace will be left in peace. I am one of the enemies those criminals have made themselves.

keiner • March 7, 2017 9:20 AM

@My Info

...but you have heard of something called the Mafia? Or the drug cartels from around the world? I think they sleep quite well, after all.

Legalizing narcotics would push all these guys toward other "business models", most likely including online fraud on completely new scales.

My Info • March 7, 2017 10:27 AM


Mafia? Or the drug cartels from all around the world?

Yes, I have. Narcotics already did go legal. Way too legal.

This is the cartel. No one else really matters. Combined market cap (total worth on stock market) of these companies is just over $1,500,000,000,000.00.

Combined gross revenues are almost $400,000,000,000.00 per year with a combined net profit of $57,000,000,000.00 at a nice reasonable net profit margin of 14.5%.

The remainder of that money (the 85.5%) is advertising, lobbying for civil commitment laws, protection rackets, etc.

My Info • March 7, 2017 10:30 AM

Oh, yeah. They need money to do R&D, bribe FDA, buy out or stamp out upstart biotechs, and all that other stuff that goes along with maintaining a cartel. Puts Sinaloa and Sicilian Mafia to shame.

My Info • March 7, 2017 10:38 AM

Oh, that's right, taxes, too. The gummint is definitely part of that racket.

Vesselin Bontchev • March 7, 2017 11:41 AM

LOL, Bruce, Matthew wrote that article as a joke. No ransomware author with a brain would waste their time on anything like that -- and those without a brain wouldn't be able to implement these ideas anyway.

@HGM, @keiner, "DAO" here is a "Decentralized Autonomous Organization", supposedly implemented via smart contracts (e.g., in Ethereum, or some other cryptocurrency that supports them; not sure if there are any others). The idea is that you write the rules of the contract in a JavaScript-like programming language, the contract is published on a blockchain, and it is enforced automatically, without human intervention.
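As a loose analogy (Python here, not an actual Solidity contract), the essential property is that the rules are ordinary code, visible to everyone and executed mechanically, with no human in the loop. The `RansomContract` class and its numbers below are invented purely for illustration:

```python
# Loose Python analogy for a smart contract: the rules are just code,
# published for all to see and executed mechanically. (Invented example;
# a real contract would live on-chain and be written in Solidity.)
class RansomContract:
    """Releases the decryption key automatically once the ransom arrives."""

    def __init__(self, developer: str, price: int):
        self.developer = developer   # address paid for writing the malware
        self.price = price           # ransom demanded, in tokens
        self.balance = 0
        self.key_released = False

    def pay(self, amount: int) -> None:
        self.balance += amount
        # The rule is enforced by the code itself, not by a human operator:
        if self.balance >= self.price:
            self.key_released = True

contract = RansomContract(developer="0xdeadbeef", price=100)
contract.pay(60)
assert not contract.key_released   # partial payment releases nothing
contract.pay(40)
assert contract.key_released       # full payment triggers release automatically
```

This is what would let the hypothetical ransomware DAO "mindlessly" pay developers and release keys with no operator to arrest.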

There was a DAO called "TheDAO" implemented in Ethereum, but there were serious programming bugs in its contracts, which made it possible for an unknown attacker to siphon off most of its "money" (i.e., Ethereum tokens), worth millions at the time. Since the creator of Ethereum was a major investor in TheDAO, he hard-forked the currency to invalidate the attacker's tokens. It's a fascinating and funny story.
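The bug class behind the TheDAO drain was reentrancy: the contract made its external transfer before updating its own bookkeeping, so a malicious recipient could call back into the withdrawal function mid-transfer. A toy Python model of the pattern (all names and amounts invented for illustration):

```python
# Toy model of the reentrancy bug pattern that drained TheDAO: the external
# call happens BEFORE the balance is zeroed, so a malicious recipient can
# re-enter withdraw() and be paid repeatedly. (Illustrative Python, not EVM.)
class VulnerableDAO:
    def __init__(self):
        self.balances = {"attacker": 10}
        self.total_funds = 100           # includes other members' deposits

    def withdraw(self, who, receive):
        amount = self.balances.get(who, 0)
        if amount > 0 and self.total_funds >= amount:
            self.total_funds -= amount
            receive(amount)              # external call happens FIRST...
            self.balances[who] = 0       # ...balance is zeroed too late

dao = VulnerableDAO()
stolen = []

def malicious_receive(amount):
    stolen.append(amount)
    if len(stolen) < 5:                  # re-enter withdraw() mid-transfer
        dao.withdraw("attacker", malicious_receive)

dao.withdraw("attacker", malicious_receive)
print(sum(stolen), dao.total_funds)      # 50 50 -- attacker got 50, not 10
```

The standard fix is to update the balance before making the external call (the "checks-effects-interactions" ordering), which is exactly what TheDAO's contract failed to do.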

GregW • March 7, 2017 12:01 PM

@My Info

I have a bit of pharmaceutical industry knowledge, and the dirty little secret is that it has been delivering improvements with diminishing returns, the inverse of Moore's law, sometimes called "Eroom's law".

The FDA actually requires new drugs to be measurably/verifiably better than old ones, which exposes the above situation. I sure as heck hope that Trump and the Republicans don't succeed in gutting that basic requirement as they "streamline" new drug approval at the FDA.

albert • March 7, 2017 1:32 PM


Both links now lead to the wiki page. (as they should have)


Serves me right, I guess.

. .. . .. --- ....

anon100 • March 8, 2017 5:23 AM

Smart people usually don't become criminals because almost no one wants to "be evil", and smart people can usually do well in life without resorting to antisocial means to get by. This can break in two ways.

Option one is that the bar for who gets to contribute positively to society gets too high, and smarter and smarter people are left out of the system, which increases their likelihood of becoming criminals.

Option two is that society degrades so much that people no longer feel that contributing to it counts as good. This can happen as more and more of the activity defined as economically useful becomes some kind of zero-sum game or antisocial "corporate bullshit". If there is no moral difference between getting a normal job and running an internet scam, smart people choose whichever is easiest.

vas pup • March 10, 2017 9:40 AM

Technology is now at the "root" of all serious criminality, says Europe's police agency.
The returns generated by document fraud, money laundering and online trade in illegal goods helps to pay for other damaging crimes, said Europol.
The wider use of technology by criminal gangs poses the "greatest challenge" to police forces, it said in a study.
It revealed that Europol is currently tracking 5,000 separate international organized crime groups.
Many gangs were turning to technology to help make well-established crimes more lucrative.
For instance, said the report, drones were now being used to transport drugs, and many burglars now track social media posts to work out when people are away from their homes.
The steady increase in the number of reported burglaries across Europe was a "particular concern" for many nations, it said.
Smart machines v hackers: How cyber warfare is escalating.
There is a gaping hole in the digital defences that companies use to keep out cyber thieves.
The hole is the global shortage of skilled staff that keeps security hardware running, analyses threats and kicks out intruders.
Currently, the global security industry is lacking about one million trained workers, suggests research by ISC2 - the industry body for security professionals. The deficit looks set to grow to 1.8 million within five years, it believes.
The shortfall is widely recognized and gives rise to other problems, says Ian Glover, head of Crest - the UK body that certifies the skills of ethical hackers.
Help has to come from another source: machines.
"If you look at the increase in automation of attack tools then you need to have an increase in automation in the tools we use to defend ourselves," he says.
'Drowning' in data
That move towards more automation is already under way, says Peter Woollacott, founder and chief executive of Sydney-based Huntsman Security, adding that the change was long overdue.
For too long, security has been a "hand-rolled" exercise, he says.
That is a problem when the analysts expected to defend companies are "drowning" in data generated by firewalls, PCs, intrusion detection systems and all the other appliances they have bought and installed, he says.
Automation is nothing new, says Oliver Tavakoli, chief technology officer at security firm Vectra Networks - early uses helped antivirus software spot novel malicious programmes.
"Machine learning is more understandable and more simplistic than AI [artificial intelligence]," says Mr Tavakoli, but that doesn't mean it can only handle simple problems.
The analytical power of machine learning derives from the development of algorithms that can take in huge amounts of data and pick out anomalies or significant trends. Increased computing power has also made this possible.
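As a minimal sketch of that idea (a simple z-score detector in standard-library Python; real security products use far richer models, and the traffic numbers here are invented), the principle is to flag points that deviate sharply from the baseline:

```python
# Minimal anomaly-detection sketch: flag values that deviate sharply from
# the baseline. Real systems use far richer models; the principle is the same.
import statistics

def find_anomalies(values, threshold=2.5):
    """Return values more than `threshold` population standard deviations
    from the mean. (2.5 rather than the textbook 3.0, because with tiny
    samples the achievable z-score is capped at roughly sqrt(n - 1).)"""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []
    return [v for v in values if abs(v - mean) / stdev > threshold]

# e.g. login attempts per minute; the burst stands out from the baseline
traffic = [12, 11, 13, 12, 14, 11, 13, 240, 12, 13]
print(find_anomalies(traffic))  # [240]
```

The "drowning in data" problem is then reduced to reviewing only the flagged points, which is the workflow the article describes.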
These "deep learning" algorithms come in many different flavours.
Some, such as OpenAI, are available to anyone, but most are owned by the companies that developed them. So larger security firms have been snapping up smaller, smarter start-ups in an effort to bolster their defences quickly.
At the Def Con hacker gathering last year, Darpa, the US military research agency, ran a competition that let seven smart computer programs attack each other to see which was the best at defending itself.
The winner, called Mayhem, is now being adapted so that it can spot and fix flaws in code that could be exploited by malicious hackers.
So now cybersecurity analysts can sit back and let the machine-learning systems crunch all the data and pick out evidence of serious attacks that really deserve human attention.
"It's like the surgeons who just do the cutting," says Mr Tavakoli. "They do not prep the patient, they are just there to operate and they do it very well."

Clive Robinson • March 12, 2017 1:37 PM

@ vas pup,

Technology is now at the "root" of all serious criminality, says Europe's police agency.

That argument is an old one that gets trotted out over and over, and it actually says more about the inability of the police than about the ability of the criminals.

If you look at the crimes there is method and there is mechanism. For example you see the same old "con-artist" method used on the new "Internet" mechanism.

All the criminals have really done is "up sticks" from a place whose landscape the Police knew well and policed somewhat effectively, to a place where the Police's map said "Here be dragons" -- a place of which they knew nothing and into which they did not venture.

The thing most do not realise is that the "assumed rules" of the tangible physical world do not apply in the intangible information world.

Take the basic notion of "locality": a standard physical-world assumption is "the criminal visits the scene of the crime". Even though it does not have to happen that way in the physical world --think remote surgery-- it's by far the norm. The opposite is how the intangible information world works, with the criminal easily capable of being as far away on the earth as it's possible to get. Obviously the physical-world notion of jurisdiction falls apart, because the crime is in one place and the criminal in another. Thus both the Police and legislative bodies were caught wrong-footed by criminals who cared not for abiding by the rules "Plod & Co" understood. A third of a century later there has actually been very little "catch-up" by the legislators, no matter how hard the Police push.

The minute you realise "locality" is not an issue, you immediately start thinking about being in multiple places committing multiple crimes. That's not possible in the physical world, but in the information world it's almost trivial. Thus you could be running a con not just against one mark but against tens, hundreds, thousands of them, all at the same time. That is one component of "an army of one".

The other element you need for an army of one is multiple savants and their corresponding "force multiplier" needs. In the physical world, force multipliers --savants or not-- need considerable resources to build and operate, and thus come at high cost. In the information world you are using other people's resources against them, and the cost of duplicating savants and the like is very, very small; once developed, an attack method costs the originator nothing to duplicate.

These are difficult concepts for quite a few technical people to think about, and by and large neither the legislators nor the police are technical, so their understanding is a good deal less. It's one of the reasons we have such bad technical legislation.

