Suckfly

Suckfly seems to be another Chinese nation-state espionage tool, first stealing South Korean certificates and now attacking Indian networks.

Symantec has done a good job of explaining how Suckfly works, and there's a lot of good detail in the blog posts. My only complaint is its reluctance to disclose who the targets are. It doesn't name the South Korean companies whose certificates were stolen, and it doesn't name the Indian companies that were hacked:

Many of the targets we identified were well known commercial organizations located in India. These organizations included:

  • One of India's largest financial organizations
  • A large e-commerce company
  • The e-commerce company's primary shipping vendor
  • One of India's top five IT firms
  • A United States healthcare provider's Indian business unit
  • Two government organizations

Suckfly spent more time attacking the government networks compared to all but one of the commercial targets. Additionally, one of the two government organizations had the highest infection rate of the Indian targets.

My guess is that Symantec can't disclose those names, because those are all customers and Symantec has confidentiality obligations towards them. But by leaving this information out, Symantec is harming us all. We have to make decisions on the Internet all the time about who to trust and who to rely on. The more information we have, the better we can make those decisions. And the more companies are publicly called out when their security fails, the more they will try to make security better.

Symantec's motivation in releasing information about Suckfly is marketing, and that's fine. There, its interests and the interests of the research community are aligned. But here, the interests diverge, and this is the value of mandatory disclosure laws.

Posted on May 26, 2016 at 6:31 AM • 29 Comments

Comments

WmMay 26, 2016 7:04 AM

One thing for certain. I will be sure to not do any business with any companies in India or any U.S. healthcare provider that has an Indian business unit.

GrauhutMay 26, 2016 7:39 AM

@Bruce: Does it really work like that?

"But by leaving this information out, Symantec is harming us all. We have to make decisions on the Internet all the time about who to trust and who to rely on. The more information we have, the better we can make those decisions."

Right, but this ignores the belly of the iceberg. To make informed decisions we would also need a list of all the potential victims Symantec knows about that were lucky enough not to become targets in this attack wave.

Big .coms get big rebates in markets based on their buying volume; smaller companies don't have these rebates and need to save on service (and security) quality in order to compete with the big fish in the world-market price aquarium.

"Everything is automagically better than those p0wned victims" would imho be a snake-oil security decision.

SankarMay 26, 2016 7:52 AM

I am guessing the e-commerce company and its shipping partner are Flipkart and WS-Retail respectively.

ZMay 26, 2016 8:41 AM

As far as I know, there is no blanket legal obligation to disclose security incidents, so I'm not sure why we should expect Symantec to do so. Also, just because a company has been the target of this specific malware doesn't mean it is suddenly “less trustable”. Big companies face security incidents all the time.

ScottMay 26, 2016 8:57 AM

Good luck on not doing business with US Healthcare Providers that don't have "an Indian Business Unit".


Ever hear of IBM? I heard they're some big IT outsourcing provider with extensive operations in India.

Bumble BeeMay 26, 2016 9:55 AM

Hmm. "Suckfly." Sounds like the Blandford fly from Great Britain. They have doctors to apply leeches if you aren't feeling well after being bitten by all these flies.

Ugh, yuck. Let's just invade North Korea instead, execute all Kim Jong-Un's barbers, rob (I mean appropriate) all his money and bank accounts, dismantle all his nuke toys, and put all his people on food stamps. They'll get jobs as soon as they figure out they can't buy liquor with food stamps.

blakeMay 26, 2016 10:02 AM

> nation-state espionage tool

> We have to make decisions on the Internet all the time about who to trust

Once you're talking nation-state level conflicts and trust, if it really comes down to it, any government that doesn't recognize you as a citizen is going to have other principal interests, namely their own citizens.

> The more information we have, the better we can make those decisions.

This comes no less than a day after a post about how having all the data isn't all it's talked up to be. If you find that company X has been hit by targeted malware, are you going to ditch them in favor of competitor Y for which you haven't heard of any malware being detected?

From a Register article:

> Symantec only uncovered the attacks two years after most of them had taken place and only then after it knew what to look for.

Horse, barn door, etc.

rMay 26, 2016 10:53 AM

@bumble bee,

You don't live in the hood do you?

To wit: one can certainly buy the components to make liquor with food stamps, then you can sell your wine to wine-o's for more stamps to make larger and larger quantities of hooch... grain... wine...

Clive says beer is a little more costly, those are the low hanging fruit.

Oh! Hand sanitizer.

WaelMay 26, 2016 2:54 PM

Suckfly! What an ugly name...

While we became initially curious because the hacktool was signed, we became more suspicious when we realized a mobile software developer had signed it, since this is not the type of software typically associated with a mobile application.

Well, the OS should not trust signed code for everything! Mobile productivity code signed by a gaming company shouldn't be trusted. A cert should be valid for a specific category of applications on specific platforms: signed code for mobile devices shouldn't be valid on a desktop. There'll be some hurdles, but that's the cost!

either misused it or it had been stolen from them...

How about bribed out of them? A disgruntled employee selling a signing cert? An HSM should be used to avoid cert leakages. No one should have access to the singing cert. You send the binary to the HSM, and it signs it for you. If the HSM doesn't allow the cert to be extracted even by legitimate users, then malware wouldn't have the ability to "steal" the certificate. (Ex-)employees won't have access to a cert to sell, either.
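The pattern Wael describes can be sketched in a few lines: callers submit a binary and get back a signature, while the signing secret never leaves the device. This is a toy stand-in, not a real HSM interface (a real one would hold an RSA/ECDSA key and speak PKCS#11; `ToyHsm` and HMAC are illustrative assumptions):

```python
import hashlib
import hmac
import os

class ToyHsm:
    """Toy stand-in for an HSM: the signing secret lives only inside this
    object and is never returned to callers."""
    def __init__(self):
        self._secret = os.urandom(32)  # generated inside the "device", never exported

    def sign(self, binary: bytes) -> bytes:
        # Caller sends the binary; only a signature comes back out.
        digest = hashlib.sha256(binary).digest()
        return hmac.new(self._secret, digest, hashlib.sha256).digest()

    def verify(self, binary: bytes, sig: bytes) -> bool:
        return hmac.compare_digest(self.sign(binary), sig)

hsm = ToyHsm()
release = b"\x7fELF...release build bytes..."
signature = hsm.sign(release)
assert hsm.verify(release, signature)                  # untouched binary verifies
assert not hsm.verify(release + b"tampered", signature)  # tampering is detected
```

Stealing the cert file from a build machine gains nothing here, because the secret only ever exists inside the signer.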

In addition to the traffic originating from Chengdu, we identified a selection of hacktools and malware signed using nine stolen certificates.

How about weak certificates and untrusted third parties? Who issued these certificates, was it a common CA?

the most likely scenario was that the companies were breached with malware that had the ability to search for and extract certificates from within the organization.

This assumes a dreadful security posture of the "victim" companies.

When a certificate is revoked, the computer displays a window explaining that the certificate cannot be verified and should not be trusted before asking the user if they want to continue with the installation.

Yea, users read everything before they click "Ok, give it to me".

Explorer, which can allow the attacker to execute code with the same privileges as the currently logged-in user.

That's why one needs to browse from a virtual machine that gets torn down after the session concludes. Still login with an unprivileged account on both the host OS and the virtual machine.

From a cert perspective, certificates should only be valid for a class or narrow category of applications. Not a fool proof method, but better than nothing. I have a feeling I said something stooopid there, but not sure what. Will wait for the flame storm :)
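Wael's category idea can be modeled as an extra check at install time: the cert carries a platform and category, and the loader rejects anything outside that scope. The field names below are illustrative, not from any real PKI profile:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Cert:
    subject: str
    category: str   # e.g. "mobile-game", "mobile-productivity" (hypothetical values)
    platform: str   # e.g. "mobile", "desktop"

def may_install(cert: Cert, target_platform: str, expected_category: str) -> bool:
    # Reject signed code whose cert doesn't match both the platform and category.
    return cert.platform == target_platform and cert.category == expected_category

game_cert = Cert("Acme Games Ltd", "mobile-game", "mobile")
# A desktop hacktool signed with a mobile gaming cert would be refused:
assert not may_install(game_cert, "desktop", "hacktool")
# ...as would a mobile app outside the cert's category:
assert not may_install(game_cert, "mobile", "mobile-productivity")
assert may_install(game_cert, "mobile", "mobile-game")
```

This is roughly what X.509 extended key usage tries to do for broad purposes like code signing; the sketch just narrows it further.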

woody weaverMay 26, 2016 3:58 PM

perhaps I'm missing something, but can't the impacted developers just revoke their cert? Or is this an issue that the routines that check signed code don't properly implement certificate validation?

rMay 26, 2016 4:09 PM

I'm not sure disclosure laws apply to them, partly because of any NDA/confidentiality agreement in their service contracts, and additionally because of the international and criminal implications. Other countries, given the various data-protection and privacy rules abroad, may have stronger or weaker requirements. Symantec shouldn't be obligated to out these companies (it would hurt Symantec's business), but the companies involved probably should be held to disclosure.

Requiring immediate disclosure would most certainly tighten things up, as shareholders and CEOs wouldn't like to watch their stocks dump... (because they would)

An impromptu disclosure by anyone other than the affected company itself can introduce liability, especially when the hackers do it: I'm sure you could take advantage of trades the way they were reading acquisition and merger emails.

So, in sum: it's a complex issue, and I'm sure if you've got Obamacare like I do, you can expect another OPM letter.

WaelMay 26, 2016 4:10 PM

@woody weaver,

perhaps I'm missing something...

Apparently the owner of the cert wasn't aware of its misuse. They can revoke the cert after they find out.

NateMay 26, 2016 5:24 PM

@Wael: "Disgruntled employee selling a signing cert? An HSM should be used to avoid cert leakages. No one should have access to the singing cert. You send the binary to the HSM, and it signs it for you."

That's a very interesting usage scenario and I wonder exactly what percentage of software developers signing code 1) even have an HSM and 2) have it set up to sign in such a manner.

Maybe I'm wrong, but I have the gut feeling that the percentage of developers today happily signing code on, eg, an actual computer - you know, the same one that runs their IDE, their source code management system, their filesystem, their databases - is probably close to 90%.

And of those, the percentage who run that IDE and source-control system on some kind of 'cloud' platform giving the owner of the Cloud read access to their signing certificates?

I'm guessing maybe 70-80%? And the cloud companies would want it to be 100%?

I hope I'm wrong. But outside of banks and credit card companies... who even knows what an HSM is?

I am sure that mass certificate and private key leakage * from Cloud Computing is absolutely going to bite us all in the near future. But by the time we ask 'hey, why again did we give Jeff Bezos root access to every computer on the planet?' it will be a little late.


* And by 'leakage' I don't mean 'random hackers will get your passwords', I mean 'a tiny elite core of the military-Internet complex will hold all your passwords, and you will never know who they are or what they can read, and it will probably be legally defined as an act of terrorism to ask.'

Frankly that scares me a lot more than 'hackers randomly get stuff' but most people will consider it 'best-practices security'.

AMay 26, 2016 5:35 PM

It seems like the Indian Govt. is trying to publicize its tech-savviness (probably forcing the term). The term "Suckfly," when translated into Hindi (an Indian language), is an exaggerated adjective for a miserly businessman. Some businessmen from the Indian PM's home state consider it a badge of honor in casual talk. In other words, a novel thought.

65535May 26, 2016 7:56 PM

The lesson learned:

If you start a cyber war, know that your foe may be equally as strong as you are.

The Chinese have basic control of Taiwan, where most of the hardware is made. It's not a big jump for them to implant rootkit viruses in CPU chips or video chips that will be very difficult to detect. The Chinese are clever, ruthless, well-manned, and spread across Asia.

This doesn't even touch upon the NSA/FBI's and the Chinese zero-day stockpiles. The cyber war will end badly.

WaelMay 26, 2016 7:57 PM

@Nate,

Maybe I'm wrong, but I have the gut feeling that the percentage of developers today happily signing code on, eg, an actual computer...

You're wrong... about being wrong! I've seen it with my eyes.

Inside Threat ModelMay 27, 2016 1:09 AM

@Wael
The signing of suckfly stood out as a prominent point for me.
I spend a great deal of time talking to customers and prospective customers about the proliferation of software installs with dodgy or non-existent signatures - most don't get it.

WaelMay 27, 2016 1:55 AM

@Inside Threat Model,

installs with dodgy or non-existent signatures - most don't get it.

Perhaps if you illustrate with physical examples, your customers will get it. Something along the lines of using a stolen ID, or conning a person into signing a blank check (by putting carbon paper underneath a legitimate paper to sign). I wouldn't recommend you install a properly signed rogue app for demonstration.

ThothMay 27, 2016 5:40 AM

@Wael, all
re:Musical/Singing Cert

"No one should have access to the singing cert."

It would be interesting to use audio to verify certs...lol.

re:HSM

Most of them are bulky and expensive, but you can either buy a smartcard to do your PKI signing or buy a smartcard-HSM (a full-fledged personal pocket HSM).

Of course, the general excuses most people (devs) give about software keystores are the following:

- I don't know what this private key or code signing stuff is and I am just following some web instructions.

- Using HSM is too paranoid and I have nothing to hide nor valuable to steal.

- HSMs are only for the elites and low level peasants get nothing of the expensive sort...

- Can I trust the HSM ?

For the issue of trust, just buy a Javacard-enabled smartcard supporting RSA-2048 and download the following applets to turn the smartcard into an HSM.

Otherwise just purchase a cheap smartcard-HSM that includes commercial and dev support with PKCS#11 and PKCS#15 support (oh, I forget, they don't know what those PKCS thingies are :( )

Links:
- www.cardomatic.de
- http://www.ftsafe.com/product/smartcard/pkicard
- http://www.ftsafe.com/product/epass
- https://github.com/philipWendland/IsoApplet

FOOTNOTE: PUT IT SIMPLY, NEVER EVER TRUST SIGNED SOFTWARE, AS IT MEANS NOTHING, AND ALWAYS RUN THE SOFTWARE IN A VM/MICROKERNEL OR ON ANOTHER SACRIFICIAL CD-ROM-BOOTED PC.

MikeMay 27, 2016 6:29 AM

Somebody save the universe from Suckfly. On a serious note, I would not make haste in forming any opinion before I get strong evidence about Suckfly.


Thanks
Mike

GrauhutMay 28, 2016 1:43 PM

@65535: We have already seen this kind of stuff
http://www.information-age.com/technology/security/2105468/security-backdoor-found-in-china-made-us-military-chip


@Mike: SEO spam doesn't work very well if rel=nofollow is set. And you are playing against a hardware-assisted AI now that kicks stuff like this faster than you can say "beep"... :)

<a href="http://www.thedrum.com/profile/seoexpertsindiacom-review" rel="nofollow">Mike</a>

https://en.wikipedia.org/wiki/RankBrain

Packet Sniff No EvilMay 29, 2016 12:42 AM

@Z

As far as I know, there are no blanket legal obligation to disclose security incidents,

I'd always imagined that basic things like fraud, due diligence, and misrepresentation of products (fraud) were theoretically relevant. I.e., it would seem fraudulent if Google knew of Chinese hackers pilfering customers' private Gmail information on a daily basis for years and never mentioned it. Of course there is an obvious route of turning a blind eye to security so that 'what you don't know can't hurt you legally', but at some point that turns into neglect of a scale that equates to fraud as well, IMO. Of course the ethical waters get even murkier if, in such a hypothetical situation, Google could legitimately claim that inaction by the NSA was the root cause of the intrusions.

Benjamin LimMay 29, 2016 7:02 AM

What level of security is appropriate for signing keys then? Should we keep it on an airgapped machine and transfer the source code over whenever we have a new release?

anonymousJune 1, 2016 7:20 AM

It would be good to know if this is a Symantec problem or a customer security problem. Symantec also provides the root certificate for embedded SIMs in machines; if that is compromised, the impact would be... bad...
https://www.symantec.com/products/information-protection/device-certificate-service
"The GSMA’s Embedded eSIM specification provides a mechanism for remote ‘over the air’ provisioning and management of Embedded SIMs in machine-to-machine (M2M) devices.

In the M2M market, this specification allows mobile network operators to provide scalable, reliable and secure connectivity for M2M connected devices; by allowing a specific non-removable SIM embedded into the M2M device at the point of manufacture. The eSIM can later be remotely provisioned with the subscription profile of the operator providing the connectivity, which can be subsequently changed or modified over the air. This eliminates the need by the consumer or the service provider to intervene and replace SIM cards over the lifetime of each M2M product, thereby reducing ongoing operational and logistical costs.

Symantec hosts the Single global Root of Trust for GSMA eSIM in its military grade datacenters and offers certificate-based authentication services to operators and service providers wishing to be part of the GSMA eSIM ecosystem."

NateJune 6, 2016 5:17 PM

@Benjamin Lim: "What level of security is appropriate for signing keys then? Should we keep it on an airgapped machine and transfer the source code over whenever we have a new release?"

Probably! Assuming you trust a USB stick to be an 'air gap'; that's how Stuxnet got onto machines, after all, and the code for several USB firmware viruses is out there.

But my main concern at the moment is much simpler: cloud computing nodes.

If you at least have a _physical_ machine in a space that you physically control (running a reputable open-source operating system like Debian Linux) then it's at least theoretically _possible_ for you to protect any private keys stored on that machine. You might fail, but you've at least got a shot.

But if that machine is a VM running on a hypervisor on an Amazon or Microsoft public cloud in a data centre somewhere in the world which might not even be your country...

... then you're 100% exposed. You need to understand that Jeff Bezos has as fine-grained root access to your RAM and CPU registers as he wants, at any time that he wants it. And it would probably be a five-minute job to set up a hypervisor script to 'scan RAM for known Linux or Windows private keys, and file them'.

I mean, that's what I would do if I were hosting a public cloud! Because ISIS / Silk Road / whoever the enemy of the month is, are certainly running VM nodes on my cloud system, and the FBI / CIA / NSA / etc are equally certainly asking me how can they get root access to those VMs and all those keys. And as a cloud operator, I will either answer 'yes, sir, here's all the data you'd like' or I will be very quickly looking for another career.

And now that I've set up a system to respond to requests from the 'authorised legal authorities' to get random cloud VM root passwords and private keys, I might as well make use of it to take care of my commercial or political rivals too. I mean, what's the downside for me? That they find out? How are they going to do that, if my organisation has a top-secret clearance and I enforce it properly?

Maarten BodewesJune 18, 2016 6:20 AM

All the while, this talks about stealing certificates. Only at the very bottom does it say something about "their certificates and corresponding keys". I presume this malware is stealing PKCS#8-encoded private keys and the accompanying certificates, but from the current articles by both Bruce and Symantec it is hard to be sure.
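The distinction Maarten draws is easy to see locally: the sensitive artifact is the PKCS#8 private key file, not the certificate (which carries only the public half). A quick sketch, assuming a stock OpenSSL install; the file name is made up:

```shell
# Generate a throwaway RSA signing key; genpkey writes it in PKCS#8 PEM form.
openssl genpkey -algorithm RSA -out signing-key.pem 2>/dev/null

# The PEM header should read "-----BEGIN PRIVATE KEY-----" (PKCS#8) --
# this file is what an attacker needs to exfiltrate to sign malware.
head -1 signing-key.pem

# Dumping the key confirms it contains the full private material.
openssl pkey -in signing-key.pem -noout -text | head -1
```

So "stealing a certificate" in these write-ups almost certainly means stealing a file like this alongside the public cert.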

Leave a comment

Allowed HTML: <a href="URL"> • <em> <cite> <i> • <strong> <b> • <sub> <sup> • <ul> <ol> <li> • <blockquote> <pre>

Photo of Bruce Schneier by Per Ervland.

Schneier on Security is a personal website. Opinions expressed are not necessarily those of IBM Resilient.