A Hacker’s Mind Is Now Published

Tuesday was the official publication date of A Hacker’s Mind: How the Powerful Bend Society’s Rules, and How to Bend Them Back. It broke into the 2000s on the Amazon best-seller list.

Reviews in the New York Times, Cory Doctorow’s blog, Science, and the Associated Press.

I wrote essays related to the book for CNN and John Scalzi’s blog.

Two podcast interviews: Keen On and Lawfare. And a written interview for the Ash Center at the Harvard Kennedy School.

Lots more coming, I believe. Get your copy here.

And—last request—right now there’s one Amazon review, and it’s not a good one. If people here could leave reviews, I would appreciate it.

Posted on February 10, 2023 at 3:03 PM • 9 Comments


Ismar February 11, 2023 4:34 AM

Started reading the book and one definition of hacking that comes to mind would be

“Taking advantage of a system’s imperfections” (and every system has some)

Clive Robinson February 12, 2023 2:51 AM

@ Ismar, ALL,

“Taking advantage of a system’s imperfections”

Whilst correct, it does not quite cover it…

Consider the question,

“When is an imperfection, not an imperfection but a consideration or similar?”

There are many times as a designer I’ve had to make trade-offs.

For instance, the lighter I make a design, the less impact energy it has when it hits the ground after being dropped from, say, 2m. That means I can make the case thinner, and thus lighter, and you get into a kind of spiral: although the case will survive being used, it won’t be strong enough to survive a “firm” finger press on the touch screen in cold weather, when a user wearing a glove does not realise that the glove stops the touch screen from working.

Which is the “imperfection” if any, the thinner case, the touchscreen or both?

Designers go through these sorts of thought processes every day, hundreds of times in any moderately complex or more complex design, such as anything containing electronics or small or relatively small moving parts.

It sometimes leads to what appears to be very strange design choices.

For instance, few these days actually know that early on there was no distinction between hard drives and floppy drives; thanks to the design teams at IBM and elsewhere, they were all in effect removable storage. That is, the “platter” coated with magnetic material was made of a metal like aluminium and was structurally self-supporting. And even though “magnetic tape” removable storage preceded disks, floppy drives came much later in the game.

But… These days the platters in hard drives are made of what most would consider very fragile thin glass… As one designer put it,

“When you thin down aluminium enough you get ‘Bacofoil’, thin glass gives you fiber glass stronger than steel”.

At some point there was a “transition zone” where neither material had clear advantages over the other.

This is true in some way for all things: you have transition zones, and it is here that the playground of hacking, be it good or bad, exists.

The laws of nature demand such transition zones of all physical objects; you can see them in the plotted graphs. And we try to hide the curves from our thinking with piecewise curve fitting of straight lines. The reality is that very few things in nature follow straight lines in any way, and we even find that impossibly awkward exponential curve lurking beneath the turning of a wheel.

Information objects are no different from physical objects in this respect. They too have their own “laws of nature” –that in some cases we are only just discovering– and they too are curves on a graph that we try, and often fail, to linearize by piecewise curve fitting…

So the penultimate question is,

“Are the imperfections in us or the objects?”

From which the “ultimate question” is yet to arise.

Ballermann3000 February 12, 2023 12:57 PM

I don’t have the book, yet.

My definition of hacking would be to make use of something for a purpose it is not designed for.

Also there is this weird machine interpretation, where your system has an intended part and an unintended part, the weird machine. So the system as a whole can do more than fulfilling its purpose. The weird machine exists precisely where the system is not prepared for unintended or unexpected situations, i.e. unexpected input. Attackers (of some kind) can use the weird machine to their advantage, i.e. by passing input (weird instructions) into the system that it’s not prepared for.
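A toy sketch of that “weird machine” idea (the interpreter and its commands are entirely hypothetical, for illustration only): the intended part handles “inc” and “dec”, but the unchecked else-branch quietly gives unexpected input its own hidden state to drive.

```python
def run(commands):
    """A toy interpreter: the intended machine counts, nothing more."""
    counter = 0
    memo = {}
    for cmd in commands:
        if cmd == "inc":
            counter += 1
        elif cmd == "dec":
            counter -= 1
        else:
            # Unintended: any unrecognised token silently mutates state
            # the designer never specified. This is the "weird machine".
            memo[cmd] = memo.get(cmd, 0) + 1
    return counter, memo

# Intended use:
print(run(["inc", "inc", "dec"]))  # (1, {})
# Unexpected input reaches the unintended part:
print(run(["inc", "__secret__"]))  # (1, {'__secret__': 1})
```

The point is that the “weird” behaviour was never written deliberately; it falls out of what the code fails to reject.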

Clive Robinson February 12, 2023 6:17 PM

@ Ballermann3000,

“My definition of hacking would be to make use of something for a purpose it is not designed for.”

That is only a part of it.

Originally a “hack” was an improvement, which could mean new functionality, or making existing functionality somehow better, like automating process steps or making the steps use fewer resources or run more efficiently.

“Also there is this weird machine interpretation where your system has an intended part and an unintended part, the weird machine”

Ultimately all computers are “state machines” with few people actually understanding the implications of this.

Thus programmers mostly write programs where only very few of the many, many possible states are considered in the “business logic”, let alone in the rest of the program.

A state machine has two basic actions,

1, Not change state.
2, Change state based on current state and input.

It can thus be “halted” or changing state. The changes of state can be,

1, Without loopback.
2, With loopback.

The first is a chain where no state is ever returned to; by definition it must at some point run out of states and halt.

The second has the potential to loop endlessly, thus not halt. Deciding which is going to happen is in some cases either not possible, or only possible by exhaustion. In essence this is what Turing’s result on the “Halting problem” proved.
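The two kinds of state change can be sketched in a few lines of Python (the transition tables here are made-up examples): a chain without loopback must run out of states, while one with loopback may never halt, and we only find out by exhaustion.

```python
def step_until_halt(transitions, state, max_steps=100):
    """Follow a transition table until there is no next state,
    or give up after max_steps (deciding "by exhaustion")."""
    for n in range(max_steps):
        if state not in transitions:
            return state, n  # halted: the chain ran out of states
        state = transitions[state]
    return state, max_steps  # possibly looping forever

# 1. Without loopback: a chain that must halt.
chain = {"A": "B", "B": "C"}        # C has no successor
print(step_until_halt(chain, "A"))  # ('C', 2)

# 2. With loopback: A -> B -> A -> ... never halts;
# we only detect this by running out of patience.
loop = {"A": "B", "B": "A"}
print(step_until_halt(loop, "A"))
```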

What you describe as a “weird machine” is caused by incorrect implementation of state transitions, such that execution can get into states that were either not implemented, or were implemented incorrectly for a different purpose.

The thing about state machines is that you can design them in a way where the states are correctly controlled, with only minor changes in the way a programmer writes programs.

One method is by using “design by contract” correctly and designing the interfaces such that state transition is restricted.
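A minimal sketch of that idea (the `Door` class and its states are hypothetical): the only way to change state is through an interface that enforces an explicit contract, so execution can never wander into a transition the designer did not list.

```python
class Door:
    # The "contract": the complete set of legal (from, to) transitions.
    ALLOWED = {
        ("closed", "open"),
        ("open", "closed"),
        ("closed", "locked"),
        ("locked", "closed"),
    }

    def __init__(self):
        self.state = "closed"

    def transition(self, new_state):
        # Precondition check: refuse any move not in the contract.
        if (self.state, new_state) not in self.ALLOWED:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state

d = Door()
d.transition("open")
d.transition("closed")
d.transition("locked")
# d.transition("open")  # would raise: locked -> open is not in the contract
```

The table is the single place where legal state transitions live, which is what makes the restriction auditable.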

This is not liked, as “code re-use” can get hammered, which many managers find unacceptable, even though it gives good code reliability and low maintenance.

Thus any “weird machine” is a “major red flag” for bad design and implementation practices, and thus for a code shop that should where possible be avoided, with its product at the very least strongly mitigated.

Mary Freeman February 12, 2023 7:21 PM

Are there any plans to publish on Kobo Books? I don’t support Amazon’s management policy (where possible to avoid).

Chris Johnston February 12, 2023 7:40 PM

I would argue that intentionally building in [ Backdoors ] is mostly different from hacking.
And the backdoor is something you see a lot: people use complexity to slip things into legislation and hide things.
In my opinion, political systems that try to control people’s behavior have a poor record.
Other than publicizing legislation in advance, I’m not sure how you could stop it. Although, it is not clear that is sufficient.
Of course, not having the legislation at all or having simple (less complex) bills, could be effective.

Ismar February 19, 2023 5:56 AM

“ From which the “ultimate question” is yet to arise.”
I agree and what I was trying to say was that we are incapable of creating a perfect system.
So then, it is only a question of whether there are people smarter than the designers to find the flaws.
I think you would agree that finding flaws is always easier than designing systems.

Clive Robinson February 19, 2023 2:26 PM

@ Ismar, ALL,

“what I was trying to say was that we are incapable of creating a perfect system.”

Logically there never can be a “perfect system” for two simple reasons,

1, Perfection is a Point of View of society which is always in continuous flux.
2, No two people have the same Point of View, nor can they.

Which is why we have the old saw of,

“You can please some of the people all of the time, and all of the people some of the time, but you can not please all of the people all of the time.”

“So then, it is only a question of if there are people smarter than the designers to find the flaws.”

They don’t need to be even close to as smart as the designers.

Back when I was still at school, and slightly later at college, I was finding flaws in mainframe and mini-computer OS’s designed and implemented by people way smarter, more knowledgeable, and more experienced than I was.

My advantage was “focus”, supported by tenacity and curiosity.

I was needle sharp, they were making armour that due to resource limitations was effectively made of chain mail not plate.

Finding a small hole was comparatively easy.

“I think you would agree that finding flaws is always easier than designing systems”

Definitely, when you design you have a very large area to cover with a commensurately large border to defend in depth. The more “usable” or “user friendly” a system is, the larger that area is and the more like swiss cheese it is.

An attacker has their choice of place to attack, where they are strongest. A defender has no choice: they have to defend all points, irrespective of whether they are strong in that area or not.

In physical systems as opposed to information systems, the defenders have three advantages,

1, Their systems can be unique.
2, Their attackers have to be local.
3, Their attackers have at best limited force multipliers.

This means that they can use delays, detection, and effective response to deal with the threat, with the effective response taking the attackers into custody, etc.

Information systems are generally in no way unique, being Consumer/Commercial Off The Shelf (COTS), so an attacker can set up their own systems to practice on. When they attack, the attackers can be half a world away and well hidden, thus not subject to physical limitations. And as the attacks are carried out not by the attackers themselves but by scripts/executables put on the defenders’ systems, they can mount thousands or millions of simultaneous attacks.

Thus the attacker of information systems has several clear advantages over the attackers of physical systems.

Worse, nearly all physical system attacks can be directly migrated to equivalent information system attacks with little or no change in base methodology. Thus there are thousands of years of well developed physical system attacks that most information system defenders have no experience of, just popping up like moles in the lawn…

The reality is,

1, There is no effective defence.
2, There are too few attackers.
3, Thus easy attacks go unattacked.
4, Effectively you get attacked by the probability curve of a target rich environment.

Other than “Don’t connect”, your best defence is to be “Dull, uninteresting, and worthless looking”. It’s a defence stratagem actually used in high-density housing in cities. If you look like you don’t have two pennies to rub together, but your neighbour has flashy gear visible, you can guess which one a criminal is more likely to go for.

It’s what the “HOA Police” often don’t get. Hence the need for the “bunker mentality” that turns such areas into gated communities with a border force to keep out strangers, etc. Such communities become targets simply because, in effect, “it pays to advertise” that you have something to steal.

The same applies to the Internet, only worse, a lot worse…
