Entries Tagged "snake oil"


On the Poisoning of LLMs

Interesting essay on the poisoning of LLMs—ChatGPT in particular:

Given that we’ve known about model poisoning for years, and given the strong incentives the black-hat SEO crowd has to manipulate results, it’s entirely possible that bad actors have been poisoning ChatGPT for months. We don’t know because OpenAI doesn’t talk about their processes, how they validate the prompts they use for training, how they vet their training data set, or how they fine-tune ChatGPT. Their secrecy means we don’t know if ChatGPT has been safely managed.

They’ll also have to update their training data set at some point. They can’t leave their models stuck in 2021 forever.

Once they do update it, we only have their word—pinky-swear promises—that they’ve done a good enough job of filtering out keyword manipulations and other training data attacks, something that the AI researcher El Mahdi El Mhamdi posited is mathematically impossible in a paper he worked on while he was at Google.

Posted on May 25, 2023 at 7:05 AM

Bogus Security Technology: An Anti-5G USB Stick

The 5GBioShield sells for £339.60, and the description sounds like snake oil:

…its website, which describes it as a USB key that “provides protection for your home and family, thanks to the wearable holographic nano-layer catalyser, which can be worn or placed near to a smartphone or any other electrical, radiation or EMF [electromagnetic field] emitting device”.

“Through a process of quantum oscillation, the 5GBioShield USB key balances and re-harmonises the disturbing frequencies arising from the electric fog induced by devices, such as laptops, cordless phones, wi-fi, tablets, et cetera,” it adds.

Turns out that it’s just a regular USB stick.

Posted on May 29, 2020 at 12:02 PM

Crown Sterling Claims to Factor RSA Keylengths First Factored Twenty Years Ago

Earlier this month, I made fun of a company called Crown Sterling, for…for…for being a company that deserves being made fun of.

This morning, the company announced that they “decrypted two 256-bit asymmetric public keys in approximately 50 seconds from a standard laptop computer.” Really. They did. This keylength is so small it has never been considered secure. It was too small to be part of the RSA Factoring Challenge when it was introduced in 1991. In 1977, when Ron Rivest, Adi Shamir, and Len Adleman first described RSA, they included a challenge with a 426-bit key. (It was factored in 1994.)

The press release goes on: “Crown Sterling also announced the consistent decryption of 512-bit asymmetric public key in as little as five hours also using standard computing.” They didn’t demonstrate it, but if they’re right they’ve matched a factoring record set in 1999. Five hours is significantly less than the 5.2 months it took in 1999, but slower than would be expected if Crown Sterling just used the 1999 techniques with modern CPUs and networks.
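For perspective, here is a minimal sketch (assuming Python 3 with the third-party sympy package) of building and factoring a small RSA-style modulus. The demo uses 32-bit primes so it finishes instantly in pure Python; a genuine 256-bit modulus, the size Crown Sterling “decrypted,” is a job of seconds to minutes for free dedicated tools such as msieve or CADO-NFS on an ordinary laptop.

```python
# Minimal sketch (assumes Python 3 and the third-party sympy package).
# PRIME_BITS = 32 keeps the pure-Python demo instant; two 128-bit primes
# would give a true 256-bit modulus, which dedicated free tools (msieve,
# CADO-NFS) factor in seconds to minutes on a laptop.
import secrets
from sympy import nextprime, factorint

PRIME_BITS = 32

p = nextprime(secrets.randbits(PRIME_BITS))
q = nextprime(secrets.randbits(PRIME_BITS))
n = p * q
print(f"modulus n = {n} ({n.bit_length()} bits)")

# Generic methods (trial division, Pollard rho, Pollard p-1) suffice here.
factors = factorint(n)
print("recovered factors:", factors)
assert set(factors) == {p, q}
```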

Is anyone taking this company seriously anymore? I honestly wouldn’t be surprised if this was a hoax press release. It’s not currently on the company’s website. (And, if it is a hoax, I apologize to Crown Sterling. I’ll post a retraction as soon as I hear from you.)

EDITED TO ADD: First, the press release is real. And second, I forgot to include the quote from CEO Robert Grant: “Today’s decryptions demonstrate the vulnerabilities associated with the current encryption paradigm. We have clearly demonstrated the problem which also extends to larger keys.”

People, this isn’t hard. Find an RSA Factoring Challenge number that hasn’t been factored yet and factor it. Once you do, the entire world will take you seriously. Until you do, no one will. And, bonus, you won’t have to reveal your super-secret world-destabilizing cryptanalytic techniques.

EDITED TO ADD (9/21): Others are laughing at this, too.

EDITED TO ADD (9/24): More commentary.

EDITED TO ADD (10/9): There’s video of the “demo.” And some history of Crown Sterling’s CEO Robert Grant.

Posted on September 20, 2019 at 12:50 PM

The Doghouse: Crown Sterling

A decade ago, the Doghouse was a regular feature in both my email newsletter Crypto-Gram and my blog. In it, I would call out particularly egregious—and amusing—examples of cryptographic “snake oil.”

I dropped it both because it stopped being fun and because almost everyone converged on standard cryptographic libraries, which meant standard non-snake-oil cryptography. But every so often, a new company comes along that is so ridiculous, so nonsensical, so bizarre, that there is nothing to do but call it out.

Crown Sterling is complete and utter snake oil. The company sells “TIME AI,” “the world’s first dynamic ‘non-factor’ based quantum AI encryption software,” “utilizing multi-dimensional encryption technology, including time, music’s infinite variability, artificial intelligence, and most notably mathematical constancies to generate entangled key pairs.” Those sentence fragments tick three of my snake-oil warning signs—from 1999!—right there: pseudo-math gobbledygook (warning sign #1), new mathematics (warning sign #2), and extreme cluelessness (warning sign #4).

More: “In March of 2019, Grant identified the first Infinite Prime Number prediction pattern, where the discovery was published on Cornell University’s www.arXiv.org titled: ‘Accurate and Infinite Prime Number Prediction from Novel Quasi-Prime Analytical Methodology.’ The paper was co-authored by Physicist and Number Theorist Talal Ghannam PhD. The discovery challenges today’s current encryption framework by enabling the accurate prediction of prime numbers.” Note the attempt to leverage Cornell’s reputation, even though the preprint server is not peer-reviewed and allows anyone to upload anything. (That should be another warning sign: undeserved appeals to authority.) PhD student Mark Carney took the time to refute it. Most of it is wrong, and what’s right isn’t new.

I first encountered the company earlier this year. In January, Tom Yemington from the company emailed me, asking to talk. “The founder and CEO, Robert Grant is a successful healthcare CEO and amateur mathematician that has discovered a method for cracking asymmetric encryption methods that are based on the difficulty of finding the prime factors of a large quasi-prime numbers. Thankfully the newly discovered math also provides us with much a stronger approach to encryption based on entangled-pairs of keys.” Sounds like complete snake-oil, right? I responded as I usually do when companies contact me, which is to tell them that I’m too busy.

In April, a colleague at IBM suggested I talk with the company. I poked around at the website, and sent back: “That screams ‘snake oil.’ Bet you a gazillion dollars they have absolutely nothing of value—and that none of their tech people have any cryptography expertise.” But I thought this might be an amusing conversation to have. I wrote back to Yemington. I never heard back—LinkedIn suggests he left in April—and forgot about the company completely until it surfaced at Black Hat this year.

Robert Grant, president of Crown Sterling, gave a sponsored talk: “The 2019 Discovery of Quasi-Prime Numbers: What Does This Mean For Encryption?” I didn’t see it, but it was widely criticized and heckled. Black Hat was so embarrassed that it removed the presentation from the conference website. (Parts of it remain on the Internet. Here’s a short video from the company, if you want to laugh along with everyone else at terms like “infinite wave conjugations” and “quantum AI encryption.” Or you can read the company’s press release about what happened at Black Hat, or Grant’s Twitter feed.)

Grant has no cryptographic credentials. His bio—on the website of something called the “Resonance Science Foundation”—is all over the place: “He holds several patents in the fields of photonics, electromagnetism, genetic combinatorics, DNA and phenotypic expression, and cybernetic implant technologies. Mr. Grant published and confirmed the existence of quasi-prime numbers (a new classification of prime numbers) and their infinite pattern inherent to icositetragonal geometry.”

Grant’s bio on the Crown Sterling website contains this sentence, absolutely beautiful in its nonsensical use of mathematical terms: “He has multiple publications in unified mathematics and physics related to his discoveries of quasi-prime numbers (a new classification for prime numbers), the world’s first predictive algorithm determining infinite prime numbers, and a unification wave-based theory connecting and correlating fundamental mathematical constants such as Pi, Euler, Alpha, Gamma and Phi.” (Quasi-primes are real, and they’re not new. They’re numbers with only large prime factors, like RSA moduli.)

Near as I can tell, Grant’s coauthor is the mathematician of the company: “Talal Ghannam—a physicist who has self-published a book called The Mystery of Numbers: Revealed through their Digital Root as well as a comic book called The Chronicles of Maroof the Knight: The Byzantine.” Nothing about cryptography.

There seems to be another technical person. Ars Technica writes: “Alan Green (who, according to the Resonance Foundation website, is a research team member and adjunct faculty for the Resonance Academy) is a consultant to the Crown Sterling team, according to a company spokesperson. Until earlier this month, Green—a musician who was ‘musical director for Davy Jones of The Monkees’—was listed on the Crown Sterling website as Director of Cryptography. Green has written books and a musical about hidden codes in the sonnets of William Shakespeare.”

None of these people have demonstrated any cryptographic credentials. No papers, no research, no nothing. (And, no, self-publishing doesn’t count.)

After the Black Hat talk, Grant—and maybe some of those others—sat down with Ars Technica and spun more snake oil. They claimed that the patterns they found in prime numbers allow them to break RSA. They’re not publishing their results “because Crown Sterling’s team felt it would be irresponsible to disclose discoveries that would break encryption.” (Snake-oil warning sign #7: unsubstantiated claims.) They also claim to have “some very, very strong advisors to the company” who are “experts in the field of cryptography, truly experts.” The only one they name is Larry Ponemon, who is a privacy researcher and not a cryptographer at all.

Enough of this. All of us can create ciphers that we cannot break ourselves, which means that amateur cryptographers regularly produce amateur cryptography. These guys are amateurs. Their math is amateurish. Their claims are nonsensical. Run away. Run, far, far, away.

But be careful how loudly you laugh when you do. Not only is the company ridiculous, it’s litigious as well. It has sued ten unnamed “John Doe” defendants for booing the Black Hat talk. (It also sued Black Hat, which may have more merit. The company paid $115K to have its talk presented amongst actual peer-reviewed talks. For Black Hat to remove its nonsense may very well be a breach of contract.)

Maybe Crown Sterling can file a meritless lawsuit against me instead for this post. I’m sure it would think it’d result in all sorts of positive press coverage. (Although any press is good press, so maybe it’s right.) But if I can prevent others from getting taken in by this stuff, it would be a good thing.

Posted on September 5, 2019 at 5:58 AM

Unhackable Cryptography?

A recent article overhyped the release of EverCrypt, a cryptography library created using formal methods to prove security against specific attacks.

The Quanta magazine article sets off a series of “snake-oil” alarm bells. The author’s GitHub README is more measured and accurate, and illustrates what a cool project this really is. But it’s not “hacker-proof cryptographic code.”

Posted on April 5, 2019 at 9:31 AM

Amateurs Produce Amateur Cryptography

Anyone can design a cipher that he himself cannot break. This is why you should uniformly distrust amateur cryptography, and why you should only use published algorithms that have withstood broad cryptanalysis. All cryptographers know this, but non-cryptographers do not. And this is why we repeatedly see bad amateur cryptography in fielded systems.
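As a concrete contrast, here is a minimal sketch of what using published, vetted cryptography looks like in practice. It assumes Python with the third-party cryptography package, whose AESGCM class exposes standard AES-256-GCM authenticated encryption; nothing here is home-brewed.

```python
# Minimal sketch (assumes Python 3 and the third-party "cryptography" package).
# Standard, well-analyzed authenticated encryption instead of a home-brewed
# cipher-plus-MAC construction.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # random 256-bit key
aead = AESGCM(key)

nonce = os.urandom(12)  # 96-bit nonce; must never repeat under the same key
ciphertext = aead.encrypt(nonce, b"meter reading: 42 kWh", b"header")

# Decryption verifies the authentication tag; any tampering with the
# ciphertext or header raises cryptography.exceptions.InvalidTag.
plaintext = aead.decrypt(nonce, ciphertext, b"header")
assert plaintext == b"meter reading: 42 kWh"
```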

The latest is the cryptography in the Open Smart Grid Protocol, which is so bad as to be laughable. From the paper:

Dumb Crypto in Smart Grids: Practical Cryptanalysis of the Open Smart Grid Protocol

Philipp Jovanovic and Samuel Neves

Abstract: This paper analyses the cryptography used in the Open Smart Grid Protocol (OSGP). The authenticated encryption (AE) scheme deployed by OSGP is a non-standard composition of RC4 and a home-brewed MAC, the “OMA digest.”

We present several practical key-recovery attacks against the OMA digest. The first and basic variant can achieve this with a mere 13 queries to an OMA digest oracle and negligible time complexity. A more sophisticated version breaks the OMA digest with only 4 queries and a time complexity of about 2^25 simple operations. A different approach only requires one arbitrary valid plaintext-tag pair, and recovers the key in an average of 144 message verification queries, or one ciphertext-tag pair and 168 ciphertext verification queries.

Since the encryption key is derived from the key used by the OMA digest, our attacks break both confidentiality and authenticity of OSGP.
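The paper has the real attacks. As a purely illustrative toy (this is not the OMA digest, and not the attacks above), the sketch below shows the underlying failure mode of linear, home-brewed MACs: when the tag is a linear function of the key and message, one query to a tagging oracle is enough to forge tags for other messages without ever learning the key.

```python
# Toy illustration only -- NOT the OMA digest or the attacks from the paper.
# A (deliberately bad) linear MAC: the tag is the XOR of all key and message
# bytes, so tag differences depend only on the messages, not the key.
import secrets

KEY = secrets.token_bytes(16)  # secret, held by the tagging "oracle"

def toy_mac(key: bytes, msg: bytes) -> int:
    tag = 0
    for b in key + msg:
        tag ^= b
    return tag

# Attacker: one oracle query on a chosen message...
m1 = b"open valve 1"
t1 = toy_mac(KEY, m1)  # oracle call

# ...then forge a valid tag for a different message, with no knowledge of KEY.
m2 = b"open valve 2"
xor_diff = 0
for b in m1 + m2:
    xor_diff ^= b
forged_tag = t1 ^ xor_diff

assert toy_mac(KEY, m2) == forged_tag  # the forgery verifies
```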

My still-relevant 1998 essay: “Memo to the Amateur Cipher Designer.” And my 1999 essay on cryptographic snake oil.

ThreatPost article. BoingBoing post.

Note: That first sentence has been called “Schneier’s Law,” although the sentiment is much older.

Posted on May 12, 2015 at 5:41 AM

Nice Essay on Security Snake Oil

This is good:

Just as “data” is being sold as “intelligence”, a lot of security technologies are being sold as “security solutions” rather than what they for the most part are, namely very narrow focused appliances that as a best case can be part of your broader security effort.

Too many of these appliances do unfortunately not easily integrate with other appliances or with the rest of your security portfolio, or with your policies and procedures. Instead, they are created to work and be operated as completely stand-alone devices. This really is not what we need. To quote Alex Stamos, we need platforms. Reusable platforms that easily integrate with whatever else we decide to put into our security effort.

Slashdot thread.

Posted on April 28, 2015 at 6:21 AM
