Entries Tagged "business of security"


Book Review: The Business of Secrets

The Business of Secrets: Adventures in Selling Encryption Around the World by Fred Kinch (May 24, 2024)

From the vantage point of today, it’s surreal reading about the commercial cryptography business in the 1970s. Nobody knew anything. The manufacturers didn’t know whether the cryptography they sold was any good. The customers didn’t know whether the crypto they bought was any good. Everyone pretended to know, thought they knew, or knew better than to even try to know.

The Business of Secrets is the self-published memoir of Fred Kinch. He was a founder and vice president—mostly of sales—at a US cryptographic hardware company called Datotek, from the company’s founding in 1969 until 1982. It’s mostly a disjointed collection of stories about the difficulties of selling to governments worldwide, along with descriptions of the highs and (mostly) lows of foreign airlines, foreign hotels, and foreign travel in general. But it’s also about encryption.

Datotek sold cryptographic equipment in the era after rotor machines and before modern academic cryptography. The company initially marketed computer-file encryption, but pivoted to link encryption—low-speed data, voice, fax—because that’s what the market wanted.

These were the years when the NSA hired anyone promising in the field, and routinely classified—and thereby blocked—publication of academic mathematics papers by those they didn’t hire. They controlled the fielding of strong cryptography by aggressively using the International Traffic in Arms Regulations. Kinch talks about the difficulties in getting an export license for Datotek’s products; he didn’t know that the only reason he ever got that license was because the NSA was able to break his company’s stuff. He had no idea that his largest competitor, the Swiss company Crypto AG, was owned and controlled by the CIA and its West German equivalent. “Wouldn’t that have made our life easier if we had known that back in the 1970s?” Yes, it would. But no one knew.

Glimmers of the clandestine world peek out of the book. Countries like France ask detailed tech questions, borrow or buy a couple of units for “evaluation,” and then disappear again. Did they break the encryption? Did they just want to see what their adversaries were using? No one at Datotek knew.

Kinch “carried the key generator logic diagrams and schematics” with him—even today, it’s good practice not to rely on the secrecy of a design for security—but the details seem laughably insecure: four linear feedback shift registers of 29, 23, 13, and 7 bits, variable stepping, and a small nonlinear final transformation. The NSA probably used this as a challenge to its new hires. But Datotek didn’t know that, at the time.
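For a sense of scale, here is a toy Python sketch of a keystream generator in that general style. Only the register lengths (29, 23, 13, and 7 bits) come from the book; the tap positions, clocking rule, and nonlinear combiner below are invented for illustration—Datotek’s actual design is not public at that level of detail.

```python
# Toy LFSR-based keystream generator, loosely in the style described above.
# Register lengths are from the book; taps, stepping, and combiner are
# hypothetical.

class LFSR:
    def __init__(self, length, taps, seed):
        self.length = length
        self.taps = taps                      # bit positions XORed for feedback
        self.state = seed & ((1 << length) - 1)
        assert self.state != 0                # an all-zero LFSR never advances

    def step(self):
        fb = 0
        for t in self.taps:
            fb ^= (self.state >> t) & 1
        out = self.state & 1
        self.state = (self.state >> 1) | (fb << (self.length - 1))
        return out

def keystream(lfsrs, nbits):
    """'Variable stepping': register i advances only when the low bit of the
    next register is set (a made-up clocking rule). The combiner is a small
    nonlinear function: majority of three outputs, XORed with the fourth."""
    out = []
    for _ in range(nbits):
        ctrl = [r.state & 1 for r in lfsrs]   # sample clock-control bits first
        bits = []
        for i, r in enumerate(lfsrs):
            if ctrl[(i + 1) % len(lfsrs)]:
                bits.append(r.step())
            else:
                bits.append(r.state & 1)      # held registers repeat their bit
        a, b, c, d = bits
        maj = (a & b) | (a & c) | (b & c)     # nonlinear majority function
        out.append(maj ^ d)
    return out

regs = [LFSR(29, (0, 2), 0x1234567), LFSR(23, (0, 5), 0xABCDE),
        LFSR(13, (0, 3, 4), 0x1F2), LFSR(7, (0, 1), 0x5B)]
ks = keystream(regs, 16)
```

The total internal state is only about 72 bits, with mostly linear structure—exactly the sort of thing that makes a design trivial to attack by modern standards, and plausibly a warm-up exercise for 1970s NSA cryptanalysts.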

Kinch writes: “The strength of the cryptography had to be accepted on trust and only on trust.” Yes, but it’s so, so weird to read about it in practice. Kinch demonstrated the security of his telephone encryptors by hooking a pair of them up and having people listen to the encrypted voice. It’s rather like demonstrating the safety of a food additive by showing that someone doesn’t immediately fall over dead after eating it. (In one absolutely bizarre anecdote, an Argentine sergeant with a “hearing defect” could understand the scrambled analog voice. Datotek fixed its security, but only offered the upgrade to the Argentines, because no one else complained. As I said, no one knew anything.)

In his postscript, he writes that even if the NSA could break Datotek’s products, they were “vastly superior to what [his customers] had used previously.” Given that the previous devices were electromechanical rotor machines, and that his primary competition was a CIA-run operation, he’s probably right. But even today, we know nothing about any other country’s cryptanalytic capabilities during those decades.

A lot of this book has a “you had to be there” vibe. And it’s mostly tone-deaf. There is no real acknowledgment of the human-rights-abusing countries on Datotek’s customer list, and how their products might have assisted those governments. But it’s a fascinating artifact of an era before commercial cryptography went mainstream, before academic cryptography became approved for US classified data, before those of us outside the triple fences of the NSA understood the mathematics of cryptography.

This book review originally appeared in AFIO.

Posted on November 13, 2025 at 7:09 AM

Texas Sues GM for Collecting Driving Data without Consent

Texas is suing General Motors for collecting driver data without consent and then selling it to insurance companies:

From CNN:

In car models from 2015 and later, the Detroit-based car manufacturer allegedly used technology to “collect, record, analyze, and transmit highly detailed driving data about each time a driver used their vehicle,” according to the AG’s statement.

General Motors sold this information to several other companies, including to at least two companies for the purpose of generating “Driving Scores” about GM’s customers, the AG alleged. The suit said those two companies then sold these scores to insurance companies.

Insurance companies can use data to see how many times people exceeded a speed limit or obeyed other traffic laws. Some insurance firms ask customers if they want to voluntarily opt-in to such programs, promising lower rates for safer drivers.

But the attorney general’s office claimed GM “deceived” its Texan customers by encouraging them to enroll in programs such as OnStar Smart Driver. By agreeing to join these programs, customers also unknowingly agreed to the collection and sale of their data, the attorney general’s office said.

Press release. Court filing. Slashdot thread.

Posted on August 14, 2024 at 12:48 PM

The DarkSide Ransomware Gang

The New York Times has a long story on the DarkSide ransomware gang.

A glimpse into DarkSide’s secret communications in the months leading up to the Colonial Pipeline attack reveals a criminal operation on the rise, pulling in millions of dollars in ransom payments each month.

DarkSide offers what is known as “ransomware as a service,” in which a malware developer charges a user fee to so-called affiliates like Woris, who may not have the technical skills to actually create ransomware but are still capable of breaking into a victim’s computer systems.

DarkSide’s services include providing technical support for hackers, negotiating with targets like the publishing company, processing payments, and devising tailored pressure campaigns through blackmail and other means, such as secondary hacks to crash websites. DarkSide’s user fees operated on a sliding scale: 25 percent for any ransoms less than $500,000 down to 10 percent for ransoms over $5 million, according to the computer security firm, FireEye.
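FireEye’s sliding scale can be made concrete with a short sketch. Only the two endpoints are sourced (25 percent below $500,000, 10 percent above $5 million); the linear slide between them is an assumption for illustration, since the intermediate tiers aren’t public.

```python
# Hedged sketch of DarkSide's affiliate fee schedule as described by FireEye.
# Endpoints are sourced; the linear interpolation between them is assumed.

def affiliate_fee(ransom_usd: float) -> float:
    """Return DarkSide's cut of a ransom under the sliding scale."""
    low, high = 500_000, 5_000_000
    if ransom_usd <= low:
        rate = 0.25
    elif ransom_usd >= high:
        rate = 0.10
    else:
        # assumed linear slide from 25% down to 10% between the thresholds
        rate = 0.25 - 0.15 * (ransom_usd - low) / (high - low)
    return ransom_usd * rate

print(affiliate_fee(400_000))     # 100000.0 (25% tier)
print(affiliate_fee(10_000_000))  # 1000000.0 (10% tier)
```

Whatever the exact intermediate rates, the structure is familiar from legitimate platform businesses: volume discounts for the biggest affiliates.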

Posted on June 2, 2021 at 9:09 AM

Amazon Has Trucks Filled with Hard Drives and an Armed Guard

From an interview with an Amazon Web Services security engineer:

So when you use AWS, part of what you’re paying for is security.

Right; it’s part of what we sell. Let’s say a prospective customer comes to AWS. They say, “I like pay-as-you-go pricing. Tell me more about that.” We say, “Okay, here’s how much you can use at peak capacity. Here are the savings we can see in your case.”

Then the company says, “How do I know that I’m secure on AWS?” And this is where the heat turns up. This is where we get them. We say, “Well, let’s take a look at what you’re doing right now and see if we can offer a comparable level of security.” So they tell us about the setup of their data centers.

We say, “Oh my! It seems like we have level five security and your data center has level three security. Are you really comfortable staying where you are?” The customer figures, not only am I going to save money by going with AWS, I also just became aware that I’m not nearly as secure as I thought.

Plus, we make it easy to migrate and difficult to leave. If you have a ton of data in your data center and you want to move it to AWS but you don’t want to send it over the internet, we’ll send an eighteen-wheeler to you filled with hard drives, plug it into your data center with a fiber optic cable, and then drive it across the country to us after loading it up with your data.

What? How do you do that?

We have a product called Snowmobile. It’s a gas-guzzling truck. There are no public pictures of the inside, but it’s pretty cool. It’s like a modular datacenter on wheels. And customers rightly expect that if they load a truck with all their data, they want security for that truck. So there’s an armed guard in it at all times.

It’s a pretty easy sell. If a customer looks at that option, they say, yeah, of course I want the giant truck and the guy with a gun to move my data, not some crappy system that I develop on my own.
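The appeal is easy to check with back-of-the-envelope arithmetic. AWS advertised Snowmobile at up to 100 PB per truck; the 1 Gb/s network link assumed below is illustrative, not from the interview.

```python
# Why ship drives instead of using the network: transfer time over a
# typical enterprise uplink vs. a truck. Capacity per AWS's public
# Snowmobile figure; the link speed is an assumed example.

PETABYTE = 10**15
capacity_bytes = 100 * PETABYTE       # one Snowmobile truck
link_bps = 10**9                      # 1 Gb/s sustained, assumed

seconds = capacity_bytes * 8 / link_bps
years = seconds / (365 * 24 * 3600)
print(f"{years:.0f} years")           # ~25 years over the wire vs. days by truck
```

At that scale, the old joke about the bandwidth of a station wagon full of tapes is simply true.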

Lots more about how AWS views security, and Keith Alexander’s position on Amazon’s board of directors, in the interview.

Found on Slashdot.

Posted on January 4, 2021 at 6:11 AM

The Legal Risks of Security Research

Sunoo Park and Kendra Albert have published “A Researcher’s Guide to Some Legal Risks of Security Research.”

From a summary:

Such risk extends beyond anti-hacking laws, implicating copyright law and anti-circumvention provisions (DMCA §1201), electronic privacy law (ECPA), and cryptography export controls, as well as broader legal areas such as contract and trade secret law.

Our Guide gives the most comprehensive presentation to date of this landscape of legal risks, with an eye to both legal and technical nuance. Aimed at researchers, the public, and technology lawyers alike, it aims both to provide pragmatic guidance to those navigating today’s uncertain legal landscape, and to provoke public debate towards future reform.

Comprehensive, and well worth reading.

Here’s a Twitter thread by Kendra.

Posted on October 30, 2020 at 9:14 AM

Reforming CDA 230

There’s a serious debate on reforming Section 230 of the Communications Decency Act. I am in the process of figuring out what I believe, and this is more a place to put resources and listen to people’s comments.

The EFF has written extensively on why Section 230 is so important and why dismantling it would be catastrophic for the Internet. Danielle Citron disagrees. (There’s also this law journal article by Citron and Ben Wittes.) Sarah Jeong’s op-ed. Another op-ed. Another paper.

Here are some good news articles.

Reading all of this, I am reminded of this decade-old quote by Dan Geer. He’s addressing Internet service providers:

Hello, Uncle Sam here.

You can charge whatever you like based on the contents of what you are carrying, but you are responsible for that content if it is illegal; inspecting brings with it a responsibility for what you learn.

-or-

You can enjoy common carrier protections at all times, but you can neither inspect nor act on the contents of what you are carrying and can only charge for carriage itself. Bits are bits.

Choose wisely. No refunds or exchanges at this window.

We can revise this choice for the social-media age:

Hi Facebook/Twitter/YouTube/everyone else:

You can build a communications business based on inspecting user content and presenting it as you want, but that business model also conveys responsibility for that content.

-or-

You can be a communications service and enjoy the protections of CDA 230, in which case you cannot inspect or control the content you deliver.

Facebook would be an example of the former. WhatsApp would be an example of the latter.

I am honestly undecided about all of this. I want CDA 230 to protect things like the commenting section of this blog. But I don’t think it should protect dating apps when they are used as a conduit for abuse. And I really don’t want society to pay the cost for all the externalities inherent in Facebook’s business model.

Posted on December 10, 2019 at 6:16 AM

Prices for Zero-Day Exploits Are Rising

Companies are willing to pay ever-increasing amounts for good zero-day exploits against hard-to-break computers and applications:

On Monday, market-leading exploit broker Zerodium said it would pay up to $2 million for zero-click jailbreaks of Apple’s iOS, $1.5 million for one-click iOS jailbreaks, and $1 million for exploits that take over secure messaging apps WhatsApp and iMessage. Previously, Zerodium was offering $1.5 million, $1 million, and $500,000 for the same types of exploits respectively. The steeper prices indicate not only that the demand for these exploits continues to grow, but also that reliably compromising these targets is becoming increasingly hard.

Note that these prices are for offensive uses of the exploit. Zerodium—and others—sell exploits to companies who make surveillance tools and cyber-weapons for governments. Many companies have bug bounty programs for those who want the exploit used for defensive purposes—i.e., fixed—but they pay orders of magnitude less. This is a problem.

Back in 2014, Dan Geer said that the US should corner the market on software vulnerabilities:

“There is no doubt that the U.S. Government could openly corner the world vulnerability market,” said Geer, “that is, we buy them all and we make them all public. Simply announce ‘Show us a competing bid, and we’ll give you [10 times more].’ Sure, there are some who will say ‘I hate Americans; I sell only to Ukrainians,’ but because vulnerability finding is increasingly automation-assisted, the seller who won’t sell to the Americans knows that his vulns can be rediscovered in due course by someone who will sell to the Americans who will tell everybody, thus his need to sell his product before it outdates is irresistible.”

I don’t know about the 10x, but in theory he’s right. There’s no other way to solve this.

Posted on January 17, 2019 at 6:33 AM

