Leaked Signing Keys Are Being Used to Sign Malware

A bunch of Android OEM signing keys have been leaked or stolen, and they are actively being used to sign malware.

Łukasz Siewierski, a member of Google’s Android Security Team, has a post on the Android Partner Vulnerability Initiative (APVI) issue tracker detailing leaked platform certificate keys that are actively being used to sign malware. The post is just a list of the keys, but running each one through APKMirror or Google’s VirusTotal site will put names to some of the compromised keys: Samsung, LG, and MediaTek are the heavy hitters on the list of leaked keys, along with some smaller OEMs like Revoview and Szroco, which makes Walmart’s Onn tablets.

This is a huge problem. The whole system of authentication rests on the assumption that signing keys are kept secret by the legitimate signers. Once that assumption is broken, all bets are off:

Samsung’s compromised key is used for everything: Samsung Pay, Bixby, Samsung Account, the phone app, and a million other things you can find on the 101 pages of results for that key. It would be possible to craft a malicious update for any one of these apps, and Android would be happy to install it overtop of the real app. Some of the updates are from today, indicating Samsung has still not changed the key.
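The trust model described above can be sketched in a few lines. This is a toy illustration, not Android’s actual verifier (the real check operates on APK Signing Block structures); the certificate bytes here are hypothetical placeholders:

```python
import hashlib

def cert_fingerprint(cert_der: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded signing certificate."""
    return hashlib.sha256(cert_der).hexdigest()

def update_accepted(installed_cert: bytes, update_cert: bytes) -> bool:
    # Android installs an update over an existing app only if it is signed
    # with the same certificate -- which is exactly why a leaked platform
    # key lets malware impersonate system apps.
    return cert_fingerprint(installed_cert) == cert_fingerprint(update_cert)

# Hypothetical certificate bytes, for illustration only.
platform_cert = b"...platform certificate..."
malicious_update_cert = platform_cert  # attacker signs with the leaked key
assert update_accepted(platform_cert, malicious_update_cert)
```

The check itself is sound; the failure is entirely in the premise that only the legitimate vendor can produce something matching the trusted certificate.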

Posted on December 8, 2022 at 7:08 AM • 17 Comments


Clive Robinson December 8, 2022 8:56 AM

@ Bruce,

“leaked or stolen”

Are not the only options…

There is also “Hanlon’s Razor”[1] of,

“Never attribute to malice that which is adequately explained by stupidity.”

There is also “Murphy’s Law”[2],

“The perversity of inanimate objects, or what can go wrong, will go wrong in the worst possible way at the worst possible time and place.”

Both of which predate the perversion that is “code signing”.

We’ve known for a long time “code signing” is actually a very fragile and mostly pointless thing, and we should have replaced it a long time ago.

But heck even Stuxnet did not wake people up…

The big problems that nobody talks about are,

1, Tolkien’s rule.
2, Non-separation of roles and duties.

Tolkien’s “One ring to rule them all” adequately describes the most obvious problem of,

“The further up the hierarchy you go the more power you get”

We’ve seen this problem so badly with CAs it’s almost a standing joke: the validity time on CS certs is now so short that new keys will have to be replaced before you get them…

Just about every human system follows this hierarchy power issue and as far as I’m aware they all fail. Primarily because of the second problem.

But the simple fact is this was guaranteed to happen by design from day zero with “code signing”, and the fact it’s happened yet again means we will do the same thing to fix the problems as previous times, which is basically “flap and do nothing”[3].

[1] Fun fact: Robert J. Hanlon’s profession was “computer programmer”. The description was worded to follow the principle of Occam’s Razor, which technically was named after a place (Ockham), a small, insignificant but pleasant spot in the English countryside.

[2] Murphy’s law goes by a number of names and variations; one is known by soldiers as “Deadman’s click”, or “The sound you hear when your gun does not fire when your heavily armed enemy is just in front of you”. I can assure you that once experienced, and if you are lucky enough to survive it, this version tends to haunt you for a very long time…

[3] Yes there is a law for this but I’ll let people look it up…

Andy December 8, 2022 10:32 AM

Loss or compromise of private keys is as old as PKI infrastructure. And so is the solution:

  1. time-stamps: Code signatures are also co-signed by a trusted (!) time-stamping authority
  2. key revocation: This requires that public keys are checked against a cached list or preferably online (Android’s master?) which is expected to be up to date.
  3. liability: code signers should suffer some penalty if they don’t report in time that the key is compromised. After all, they were supposed to keep it safe.
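Points 1 and 2 interact: a trusted timestamp is what lets revocation be surgical rather than breaking every signature the key ever made. A minimal sketch of that logic, with a hypothetical revocation list and key id:

```python
from datetime import datetime, timezone

# Hypothetical revocation list: key id -> time the key was reported compromised.
revoked = {"oem-platform-2011": datetime(2022, 5, 1, tzinfo=timezone.utc)}

def signature_trusted(key_id: str, timestamped_at: datetime) -> bool:
    """A trusted co-signed timestamp lets old, legitimate signatures survive
    revocation: only signatures time-stamped after the compromise are rejected."""
    revoked_at = revoked.get(key_id)
    if revoked_at is None:
        return True  # key was never revoked
    return timestamped_at < revoked_at

# Signed and time-stamped before the leak: still trusted.
assert signature_trusted("oem-platform-2011", datetime(2021, 1, 1, tzinfo=timezone.utc))
# Signed after the key leaked: rejected.
assert not signature_trusted("oem-platform-2011", datetime(2022, 6, 1, tzinfo=timezone.utc))
```

Without the timestamp authority, revocation is all-or-nothing; without up-to-date revocation data on the device, neither mechanism helps at all.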

lurker December 8, 2022 11:30 AM


[3] requires that they know when they are violated. Some are so big and complicated (e.g. Samsung) it can take quite a while to (a) discover a breach, then (b) move the message to the right place to get action.

Some are so small they suffer the Alfred E. Neuman syndrome, “Wot, me compromised?”

Jakob December 8, 2022 1:24 PM

Do I understand it correctly that it is “only” the App signing key but not the firmware/secure boot signing key? In that case the affected vendors could just generate a new signing key and ship a firmware upgrade (possibly combined with the monthly security upgrade) to migrate to the new key.
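If it is indeed only the app-signing key, rotation can be sketched roughly as below. This toy uses HMAC as a stand-in for real asymmetric signatures, and the key names are hypothetical; the shape loosely follows APK Signature Scheme v3’s proof-of-rotation lineage:

```python
import hashlib
import hmac

def sign(key: bytes, message: bytes) -> bytes:
    # Toy stand-in for a signature: HMAC with the (secret) signing key.
    return hmac.new(key, message, hashlib.sha256).digest()

old_key = b"old-platform-key"  # hypothetical compromised key
new_key = b"new-platform-key"  # hypothetical replacement key

# The vendor ships a rotation record: the old key endorses the new key's id.
new_key_id = hashlib.sha256(new_key).digest()
rotation_record = sign(old_key, new_key_id)

def accept_rotation(trusted_key: bytes, candidate_key: bytes, record: bytes) -> bool:
    """A device that trusts trusted_key verifies the endorsement, then
    extends its trust to candidate_key."""
    expected = sign(trusted_key, hashlib.sha256(candidate_key).digest())
    return hmac.compare_digest(expected, record)

assert accept_rotation(old_key, new_key, rotation_record)
```

The catch is that once the old key has leaked, an attacker can forge rotation records too, which is why the rotation really needs to ride on a channel the attacker does not hold, such as the firmware/secure-boot key Jakob mentions.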

Ted December 8, 2022 6:17 PM


Any useful advice for those of us who are not security experts?

I am not an expert, but according to a Chainguard post, users would have to sideload a malicious app to be affected. (Does this seem accurate to you?)

Also, only specific phone models were affected, though there’s not a public list yet. And OEMs can push over-the-air updates to rotate the keys.

Going up the supply chain, Zack Newman had some additional suggestions, including a transparency log for Android binaries.

I feel wary about how the leaked keys have been used. (So many leaked OEM keys revealed at one time?) However, Newman also noted that security professionals are good at excessive pessimism. So rah rah anticipating a compromise and planning accordingly.





Jonathan Wilson December 9, 2022 5:30 AM

Critical code signing keys like this should be stored in hardware security modules (HSMs) that would prevent signing keys from being obtained by hackers, leaked by rogue employees, or accidentally shared by people who don’t know any better.

Salach December 9, 2022 9:27 AM

@Jonathan: Putting the signing key in an HSM is not enough. An HSM is decent protection against extraction, but the key can still be (ab)used if the HSM is accessible. The security of code signing keys is a question of access control more than anything else.
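The distinction between extraction resistance and use control can be made concrete with a toy signing service. HMAC stands in for the real signing primitive, and all names are hypothetical:

```python
import hashlib
import hmac

class ToyHSM:
    """The key never leaves the object (extraction resistance), but anyone
    who can call sign() gets valid signatures -- access control to the
    signing interface matters as much as protecting the key bytes."""

    def __init__(self, key: bytes, authorized: set[str]):
        self._key = key  # private; no getter is exposed
        self._authorized = authorized

    def sign(self, caller: str, payload: bytes) -> bytes:
        if caller not in self._authorized:
            raise PermissionError(f"{caller} may not use the signing key")
        return hmac.new(self._key, payload, hashlib.sha256).digest()

hsm = ToyHSM(b"platform-key", authorized={"release-pipeline"})
sig = hsm.sign("release-pipeline", b"legit-update.apk")  # allowed
# A compromised build host holding pipeline credentials signs malware just
# as easily -- the HSM protects the key bytes, not their use.
malware_sig = hsm.sign("release-pipeline", b"malware.apk")
```

In other words, an HSM converts “steal the key once” into “keep access to the signing service”, which is an improvement, but only if access to that service is itself tightly controlled and audited.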

Clive Robinson December 9, 2022 10:51 AM

@ Salach,

Re : Non information security

“The security of code signing keys is a question of access control more than anything else.”

It’s “access control” not just “physically” to the HSM, but also to its internal “informational” systems.

It’s both physical and informational access control at all levels of the Computing Stack, from the quantum physics level up through human user/organisational policy, national regulation and legislation, through to international treaties.

Sometimes the only way you can have “security” is by intentionally working against the higher levels of the stack and if caught ask for forgiveness later after you’ve ensured the security.

When you think about it this has been considered by some of the finest minds of their time. Consider the actual implications of the Benjamin Franklin quote,

“Three can keep a secret, if two of them are dead.”

In effect it says “murder is acceptable” if the “security of the secret” is sufficiently warranted…

But as a saying it is now either no longer true, or close to not being true, hence the failure of “Multi-factor” authentication of “something you have” (tokens) or “something you are” (biometrics).

Leaving “something you know” (knowledge), which can be extracted from people unless some other extra factors are involved.

Hence, as I’ve said before, we need to consider:

1, Something you know prior to a point in time.
2, Something you know such as a geo-location.

Such that if the knowledge can not be extracted from you before it “times out” then it remains “locked” forever from that attack.

Or there is something about a geo-location out of jurisdiction that requires not just to be in the right location, but be there in person, with the right device, at the right time.

Such precautions are never easy, thus can go wrong in ways that are not realised and mitigated.

NotMuch December 9, 2022 12:34 PM

“The whole system of authentication rests on the assumption that signing keys are kept secret by the legitimate signers.”

Same problem as OTP, key distribution. No matter how good the fundamentals of the authentication or encryption system, keys are always difficult to secure. Impossible to secure when widely distributed. Trusted insiders have access, and the possibilities for outsiders to gain access quickly multiply with the number of trusted key holders.

Salach December 9, 2022 1:01 PM

Yes, access control covers all aspects, logical and physical. Code signing is far from perfect, but we don’t have too many alternatives, so we need to optimize what we have. I don’t expect small companies to apply strict and serious security around their signing keys, but I do expect it from the big companies. They can pay an expert and formulate decent policies for key management. With such a leak they look like a bunch of clowns.

Greg December 9, 2022 7:42 PM

Android. The joke system by Google. Just look at how many cases of malware were found in the Google Play store during the last year.

Apparently those guys do not know security.

Clive Robinson December 9, 2022 9:25 PM

@ Salach

Re : Signing Key security

“I don’t expect small companies to apply strict and serious security around their signing key but I do expect it from the big companies.”

I expect key security to be good for all.

The reason, look at it this way,

A tiny two developer Chinese company writes a driver for a bit of hardware they have designed.

It obviously needs to be code signed to be usable by the OS kernel. Unfortunately a driver can and has compromised major consumer OSs.

Being a tiny company they would be an easy target for a “black-bag” job by an intelligence agency or slightly smarter hackers. Just find the office, “pick the lock”, find and copy “the disk”, shut the door behind you on the way out; job done and nobody the wiser.

So it’s not the size of the companies that count, but the damage the loss of their keys can do that does.

Salach December 10, 2022 2:26 PM

I understand your reasoning and agree with it, with one issue. The situation you describe is something that may happen, but I would like to see the big company managing the incident, rather than a “food chain” that ends at the tiny company with a lot of dependencies along the way (availability of the tiny company, their competence for disaster recovery, their capabilities to step up security and so on).
I have seen this in one market – some car manufacturers take over code signing from their suppliers, so they can push updates without dependencies on suppliers’ code signing. There are obviously disadvantages to this approach but it does help in the situation you describe.

Lig Bubowski December 16, 2022 4:54 AM

Big fish, Little fish

They all carry diseases

Who wants some freeze dried shrimp?

Tiny company has less surface area
