Going Meta: A Conversation and AMA with Bruce Schneier
Listen to the Audio on TheCyberWire.com
In this episode, Perry Carpenter interviews cybersecurity guru Bruce Schneier. Perry and Bruce explore how cybersecurity is about so much more than technology: it’s about people, so we benefit from taking a multidisciplinary approach.
In preparing for this interview, Perry solicited his LinkedIn network to see what questions people had for Bruce. This is a wide-ranging conversation covering everything from Bruce’s thoughts on cybersecurity’s “first principles” to the impact that the pandemic had on society to the need for regulation to help raise the overall standards for security and privacy.
Perry Carpenter: Hi, I’m Perry Carpenter, and you’re listening to 8th Layer Insights. This is a cybersecurity show about people and human nature and how we can’t afford to ignore human nature in the way that we design security and in the way that we build our programs. Today’s episode is going to be a little bit different than previous episodes but, if I were to summarize the theme of today’s episode and use one word to describe it, that word would be synthesis.
Perry Carpenter: Now, if this was a nature documentary, I might open with some type of imagery depicting a chemical combination that has this visually stunning reaction, or I might show an animation of molecules combining to form a new molecular structure. But, this is a podcast and our virtual paintbrushes are sounds and ideas, and those introductory establishing shots can be really hard, especially when talking about an abstract topic like synthesis. So, let’s jump ahead and get straight to the point, because you’re probably asking yourself “what is all of this talk about synthesis?” and “what do we mean in this context?” Let me explain.
Perry Carpenter: Synthesis is really what 8th Layer Insights is all about. And by that, I mean the heart of this podcast, the intent behind it, is to bring ideas and concepts together from several different disciplines, covering a wide swath of topics, all coming together to shed light on the human condition as it relates to security and risk. In the couple of decades that I’ve been involved in cybersecurity, I’ve seen the industry mature quite a bit, but one of the stumbling blocks that we seem to face again and again and again is that we tend to believe that somehow cybersecurity is different than everything else. It’s its own unique animal. It’s so unique, so new, that we have to figure things out ourselves. And that can have the effect of creating tunnel vision and echo chambers, where we don’t think to learn from the past or look at how other disciplines have grappled with similar problems, or even ask ourselves how our newfangled security controls will collide with human nature.
Perry Carpenter: Sometimes we focus on the individual trees so much that we forget about the forest and the entire ecosystem that drives and sustains the forest. And that reminds me of a story. You may have heard it before but, even if that’s true, go ahead and listen again with fresh ears. This is the story of the blind men and the elephant.
Female Speaker: The blind men and the elephant. The earliest versions of the parable of the blind men and the elephant are found in Buddhist, Hindu, and Jain texts, as they discuss the limits of perception and the importance of complete context. The parable has several Indian variations, but it basically goes like this. A group of blind men heard that a strange animal, called an elephant, had been brought to the town, but none of them were aware of its shape and form. Out of curiosity, they said, “we must inspect and know it by touch.” So, they started out and found it. The first person, whose hand landed on the trunk, said, “this being is like a thick snake.” For another one, whose hand reached its ear, it seemed like a kind of fan. Another person, whose hand was upon its leg, said, “the elephant is a pillar, like a tree trunk.” The blind man who placed his hand upon its side said, “the elephant is a wall.” Another, who felt its tail, described it as a rope. The last felt its tusk, stating, “the elephant is that which is hard, smooth, and like a spear.” There you have it. Sometimes we don’t perceive things correctly, because we are too close. We lack context, and that lack of context means that we make mistakes. Back to you, Perry.
Perry Carpenter: So, we make mistakes when we lack context, and that’s where big-picture thinking becomes so critical, and big-picture thinking and synthesis and meta-analysis is what today’s guest is known for. If you’re a cybersecurity professional, it’s very likely that you’ve heard the name Bruce Schneier before, and it’s hard to overstate Bruce’s impact in the field of cybersecurity. This is so much so that most intros for him end up defaulting to words like “guru” and “luminary”. He made a big splash in the computer security world as a cryptographer when he published Applied Cryptography back in 1994, but most people don’t know that Bruce’s earliest training was actually as a physicist. He earned a bachelor’s degree in physics in 1984 before moving on to study computer science for his master’s degree, which he earned in 1988. Bruce created a few popular cryptographic ciphers, which you’ve probably heard of: Blowfish, Twofish, and Threefish, among others. In 1999, he invented a cryptographic algorithm called Solitaire, which is designed to manually encrypt data using a deck of cards, and this algorithm was a key plot point in Neal Stephenson’s book Cryptonomicon. Bruce even wrote an afterword to that book describing the cipher.
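For listeners curious how a deck of cards can encrypt anything, here is a rough Python sketch of Solitaire’s keystream generator, based on Schneier’s published description of the algorithm. It starts from an unkeyed, ordered deck (the real cipher first keys the deck by shuffling it according to a passphrase) and is for illustration only, not something to rely on for real secrecy.

```python
# A compact sketch of the Solitaire keystream generator, based on
# Schneier's published description. Cards are numbered 1-52; the two
# jokers are 53 (A) and 54 (B), and both count as 53 when used as values.

def value(card):
    """A card's numeric value; both jokers count as 53."""
    return 53 if card >= 53 else card

def move_joker(deck, joker, steps):
    """Move a joker down `steps` positions, wrapping below the top card."""
    i = deck.index(joker)
    deck.pop(i)
    j = i + steps
    if j > 53:
        j -= 53  # the deck is circular, but a joker never lands on top
    deck.insert(j, joker)

def next_keystream_value(deck):
    """Advance the deck state and return the next keystream number (1-53)."""
    while True:
        move_joker(deck, 53, 1)          # step 1: joker A down one
        move_joker(deck, 54, 2)          # step 2: joker B down two
        a, b = deck.index(53), deck.index(54)
        lo, hi = min(a, b), max(a, b)
        # step 3: triple cut - swap everything above the first joker
        # with everything below the second
        deck[:] = deck[hi + 1:] + deck[lo:hi + 1] + deck[:lo]
        # step 4: count cut using the bottom card's value
        v = value(deck[-1])
        deck[:] = deck[v:-1] + deck[:v] + deck[-1:]
        # step 5: output the card found by counting down from the top
        out = deck[value(deck[0])]
        if out < 53:                     # jokers produce no output; repeat
            return value(out)

def encrypt(plaintext, deck):
    """Encrypt A-Z plaintext by adding keystream values mod 26."""
    result = []
    for ch in plaintext:
        p = ord(ch) - ord('A') + 1
        c = (p + next_keystream_value(deck) - 1) % 26 + 1
        result.append(chr(c - 1 + ord('A')))
    return ''.join(result)

# With an unkeyed deck (cards in order), "AAAAAAAAAA" encrypts to
# Schneier's published null-key test vector "EXKYIZSGEH".
deck = list(range(1, 55))
print(encrypt('AAAAAAAAAA', deck))  # EXKYIZSGEH
```

Every output step permutes the deck, so the keystream never settles into a short cycle; the security of the real cipher rests entirely on keeping the initial deck order secret.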
Perry Carpenter: Here’s where the big-picture thinking comes in. Despite all of Bruce’s success on the cryptography side of things, he realized something. He realized that cryptography and technology alone will never solve security issues. They face the same issues that the blind men faced when trying to describe the elephant. Over the past couple of decades and change, Bruce has been focusing on security at more of a macro level. He’s been taking the 30,000-foot-and-above, big-picture view, and he’ll be the first to tell you that we need to approach security in a multidisciplinary manner.
Bruce Schneier: Security is inherently about people. It involves a lot of technology, but security has to take people into account, and it’s often economics, psychology, sociology, non-security topics, that explain how security works and fails. Security theater is a phrase I invented post-9/11 to describe security measures that look good but don’t accomplish anything. This is regulation: it gets a bad name, but actually it keeps us alive. There’s a lot of technologies we have for authentication that aren’t being used because the market doesn’t reward it. All men are not angels; all people are not angels. Security is a tax on the honest, and something we have to pay for, even though we get nothing for it, but we get everything for it. I’m Bruce Schneier, and I work at the intersection of security, technology, and people.
Perry Carpenter: We’ll be touching on several topics with Bruce today but, first, let’s cue the intro.
Perry Carpenter: Hi there, my name is Perry Carpenter. Join me for a deep dive into what cybersecurity professionals refer to as the 8th Layer of security: humans. This podcast is a multidisciplinary exploration into the complexities of human nature and how those complexities impact everything from why we think the things that we think, to why we do the things that we do. And how we can all make better decisions every day. Welcome to 8th Layer Insights. I’m your host, Perry Carpenter. We’ll be right back after this message.
Female Speaker 2: So, what’s a con game? It’s a fraud that works by getting the victim to misplace their confidence in the con artist. In the world of security, we call confidence tricks social engineering. And as our sponsors at KnowBe4 can tell you, human error is how most organizations get compromised. What are some of the ways organizations are victimized by social engineering? We’ll find out later in the show.
Perry Carpenter: Welcome back. I mentioned in the opening section that today’s episode is a little bit different than most of the other shows, and that’s because today we have one guest, and that one guest is driving the agenda. Back before I officially launched the podcast, I solicited LinkedIn with a request for topics and potential guests, and Bruce Schneier’s name came up. Thankfully, he agreed to be interviewed, and because that suggestion came to me from LinkedIn, I reached back out and asked people what kind of questions they would like to have Bruce answer. This is a little bit of a guided AMA session, an ask-me-anything session, with Bruce. And I think this comes at a great time, because most of us have not been going to conferences this year, and so many of us who had become habituated to hearing Bruce speak from the stage every year have not had that. As a result, you might be feeling like his voice has been missing from some of the conversations, and I think that’s what makes today’s episode special. What I’ll be doing is playing clips from the interview and giving some context or some commentary before or after some of these questions, but these are largely Bruce’s words and Bruce’s thoughts, unfiltered, being brought directly to you. Thank you so much to those of you who submitted questions for Bruce. I hope that you enjoy this interview as much as I did. With that, let’s go to the interview.
Bruce Schneier: I’m Bruce Schneier. I am a security technologist. I do a lot of things. I teach at the Harvard Kennedy School; I teach security to public policy students. I have a company, Inrupt, that is trying to commercialize Tim Berners-Lee’s Solid distributed data ownership technology. I write, I speak, I have a website. I’m a security technologist. I work at the intersection of security, technology, and people.
Perry Carpenter: Fantastic. You know, Bruce, one of the things that I think people really appreciate about you is that you’re extremely multi-faceted in the way that you think about security and in the way that you opine on security. I want to go philosophical for just a couple of minutes. In philosophy, a first principle is a basic proposition or assumption that cannot be deduced from any other basic proposition or assumption. For you, what would be a couple of first principles for security?
Bruce Schneier: Security is inherently about people. It involves a lot of technology, but security has to take people into account, and it’s often economics, psychology, sociology, non-security topics, that explain how security works and fails. Security is a way of incenting people to behave in a certain manner, and we need to understand that. That might be my first principle: that security actually is rarely about security, it’s about other things.
Perry Carpenter: Okay, if you don’t mind, let’s drill into that for a minute. You’ve got a pretty famous quote. I think it was in the preface to Secrets and Lies, where you said, “if you think technology can solve your security problems, then you don’t understand the problems and you don’t understand technology.” But then, a bit later, in a 2013 blog post, you also mentioned that, at the time, you felt like training users was a little bit of a waste of time and that the money could be better spent elsewhere. What is the through line between those two opinions? One being that technology can’t solve the problem, so we need to focus on humanity; and the other being that training isn’t necessarily the answer to everything either.
Bruce Schneier: I’m going to pull that apart and give you two separate answers you can use. That’s not even my quote. I believe that’s Roger Needham at Cambridge University who first said that. It is something that we in security say a lot, and I think I popularized it, but I won’t take credit for it.
Perry Carpenter: Okay, quick note: after that reminder from Bruce, I went and looked at the preface to Secrets and Lies again, and here is how he phrases it. He says, “a few years ago, I heard a quotation and I’m going to modify it here.” Then he goes on to say, “if you think technology can solve your security problems, then you don’t understand the problems and you don’t understand the technology.” What I did after that is I went on a search for the original quote and, of course, I used some of those key words and I used Roger Needham’s name, and I eventually found it. It looks like this is attributed to both Roger Needham and also, in a lot of cases, to Peter G. Neumann. The original quote is “if you think cryptography will solve your problem, you either don’t understand cryptography or you don’t understand your problem.” Okay, back to Bruce for the second part of his answer.
Bruce Schneier: Let me do the second now. Our society is so complex that we’ve built it in a way that you don’t have to be an expert in things to take advantage of them. I can get on an airplane – wow, remember airplanes – without knowing anything about aircraft safety, or maintenance, or pilot training, or any of that. Right, there’s an entire infrastructure that I can safely ignore and know that I’m walking onto a plane and it’s not going to crash. The same thing with cars, pharmaceuticals, restaurants. Society has my back with a lot of expertise, so I don’t need it, and computers and networks need to be the same. It can’t be that everyone has to be an information security expert to use a computer securely, just as it can’t be that everyone needs to be a food hygiene and sanitation expert in order to eat at a restaurant or buy at a grocery store. When I think about user training, that’s what I think about. Why are we training the user? Why don’t we train the experts and have them decide what’s right, like we do everywhere else? This is regulation: it gets a bad name, but actually it keeps us alive.
Bruce Schneier: A lot of user education covers for bad security design. Think about the sort of advice we try to give people. “Don’t plug strange USB sticks into computers.” It’s a USB stick. What are you supposed to do with it? “Don’t click on URLs you don’t know.” It’s a URL; you’re supposed to click on it. “Don’t open attachments.” It’s an attachment. In every one of those cases, there’s a pretty serious design problem. Why is it that a malicious URL can hack a computer? Or an attachment? Or a USB stick? Those are design issues. Those are failures in how the computers were built, and blaming the user for not doing the obvious thing doesn’t make any sense. So, I don’t like user education, because it goes hand in hand with user blaming, and because it implies a level of expertise that we shouldn’t expect.
Perry Carpenter: Yes, that’s an insightful perspective because it recognizes that the user behavior problem is actually a technology failure. It’s a failure of design, but unless and until those fundamental design issues are addressed, the user ends up bearing the brunt of dealing with the issue. It does seem like every year some vendor says that they’ve solved this issue and user behavior won’t be something that we have to think about anymore. But, to date, none of those vendor promises or technologies stand up to the ultimate test of reality, where persistent attackers continue to prevail by manipulating the technologies in unexpected ways, or by breaking the tech outright, or by bypassing the technology through social engineering. Can you give some of your thoughts on how we get to secure behavior by design, or on how we deal with these technology-based holes that haven’t been plugged in decades?
Bruce Schneier: We in society deal with these things through government. We’ve lived through decades where food wasn’t safe, where cars and airplanes crashed all the time, and we changed that through government regulation. When you stare at a security hole or vulnerability that hasn’t been fixed in 40 years, you have to realize that the system incented it not to be fixed. We’ve designed a market where not fixing it is more profitable, so it’s never going to get fixed. If we want more security, we need to require it, and we need to require it collectively as citizens; that means government, and then we need to enforce those regulations. I mean, this shouldn’t be a surprise. It’s what we do everywhere else, but somehow in the computer world, we have this weird belief that the market will solve problems even if the market doesn’t reward those solutions.
Perry Carpenter: Yes, I think, when it comes down to that, there are still some complex issues that we’ve been grappling with for a long time. I mean, one of the main things that we grapple with is authentication. Why haven’t we solved authentication yet?
Bruce Schneier: I’m not sure what “solve authentication yet” means. So, I’ll ask you: what does “solve authentication” mean? My bank works pretty well; it feels solved. I go home for Thanksgiving; I know everybody. That all seems solved. What do you mean by “solve authentication”?
Perry Carpenter: Yes, to where people can’t easily socially engineer passwords or reroute two-factor authentication tokens or something like that; to where, if there’s a data breach, or if I know something about somebody, I can’t trick that person into giving up the keys to the kingdom.
Bruce Schneier: So, you’re saying, why haven’t we made humans smarter? We’ve been trying to do that for centuries. That’s hard. My guess is genetic engineering just isn’t good enough yet.
Perry Carpenter: I mean, I think that comes down to part of the problem, though. There are places where training can’t fix it, because we’re dealing with very complex systems and the user can’t be expected to handle that. So how do we get to a situation where the technology is secure by design in the very basic ways that we interact with our systems?
Bruce Schneier: There’s a lot there. One, again, my bank works pretty well. So, I think we have it. Where you don’t have it, it’s because it is more profitable for the companies to have lousy security than to have good security. The reason your phone is so easy to hijack with SIM swapping, all those tricks where you call up the phone company and pretend to be somebody else and then hijack their phone, is that the phone company likes it that way. They make more money because the authentication is bad. Banks make less money; that’s why bank authentication is better. It really comes down to economics. Facebook makes a lot more money spying on people and hosting misinformation than they do fixing any of those problems. We have a lot of authentication systems, and some work better, some work worse, and companies will use the one that is most profitable for them, not the one that is in the user’s best interest. You want to fix that? Pass a law.
Bruce Schneier: Some of this is inherent. Remote authentication will always be harder than in person. If you and I meet in person, it will be harder for someone else to impersonate you, than it will be if we meet on the telephone, or we meet, I don’t know, in a text app. That’s always going to be true. There’s a lot of technologies we have for authentication that aren’t being used, because the market doesn’t reward it.
Perry Carpenter: Then, it comes back to basic economic incentives and the fact that the regulation isn’t necessarily in place for some of the advances that we need to make all of this more ubiquitous. From your perspective, we’ve been in various forms of lockdown around the world for the past year. What have we learned about ourselves and security during this time? Has anything stood out to you?
Bruce Schneier: I think we’ve learned that a lot of things are possible remotely that we never imagined or never allowed: remote learning, remote medicine. And we’re not going to go back to before; we’re going to go back to some hybrid. We’ve learned that organizations can thrive even without people sitting in an office together for prescribed hours, and we’re not going back there. And we’ve learned that our network infrastructure is incredibly important to society, and that’s not going back. What it means for security is that more things are going to happen remotely, without the same social lubricant that gives us so much security in in-person settings, and that’s going to be more dangerous from a personal, corporate, and national security standpoint. We do need to start facing these challenges. We’re pretty good at muddling through and making things sort of work. We might have to up our game, because, looking around, the threat actors certainly have upped theirs.
Perry Carpenter: Can you go a bit deeper on that? How are you seeing threat actors up their game and take advantage of this more disconnected form of society that we’re in right now? Where does remote work and everything else fit in with that?
Bruce Schneier: Supply chain is the new attack vector. We are seeing the Russians, the Chinese – I’m sure the US is doing the same thing – and criminals all going after the supply chain in different ways, whether it’s code libraries that end up in a piece of software, update mechanisms, or distribution mechanisms. It’s all being attacked, and sometimes the attacks are very fruitful. The Russians’ SolarWinds campaign was really impressive. The Chinese attack against Microsoft Exchange was very successful. We are seeing different actors go after the supply chain, so it’s less disconnected and more interconnected. Things are massively interconnected, massively global, in a way that makes us more vulnerable. I don’t know whether the pandemic fits into this or if it was just a coincidence. The pandemic made us more vulnerable by changing everything so fast. Criminals especially took advantage of the chaos and confusion in those first few months after the shutdowns in mid-March 2020, taking advantage of people not knowing what was going on or how to do things to trick them into doing things they might not want to do. I think we’re better about that now, but we are still accessing our networks via VPNs, and our data is all across the internet at various cloud providers. That brings with it new vulnerabilities, and we’re slowly dealing with those.
Perry Carpenter: Fantastic. Let’s go ahead and transition to some of the questions that came in over LinkedIn. I’m going to give you a bunch of rapid fire questions. If you could just give some of your first thoughts that come to mind. The first one is, if you could put one thing on a billboard for everyone to see related to security, what would that be?
Bruce Schneier: Oh wow, what an odd question. It’s so funny; my first thought is security isn’t free. Because it’s not: you have to pay for security, and we all expect it to be free. The more esoteric way of saying that is security is a tax on the honest. Security is something that the honest have to pay for in order for systems to work. It was, I think, James Madison who said, “if men were angels, no government would be necessary”, and if all men were angels, no security would be necessary. But all men are not angels, all people are not angels, and security is a tax on the honest and something we have to pay for, even though we get nothing for it, but we get everything for it. Alright, it’s a long billboard and pretty philosophical.
Perry Carpenter: Yeah, long billboard, but it speaks to the fracturing of the social contract. So, I think that that’s super, super important to recognize. Alright, if you could change three things about our industry today, what would they be?
Bruce Schneier: I think what’s missing in our industry is government regulation. Right now the security industry is designed with profit in mind. That works well as far as it goes, but there are some enormous market failures that lead to bad outcomes because people are making individual decisions and no one’s making collective societal decisions. What has been missing for decades in cybersecurity, is government, and I think we need to fix that, and that is the first thing we need to fix. Let me think about number two.
Bruce Schneier: I think we need more understanding of human psychology, because security is inherently about people. Too much security has been tech-focused and not people-focused, and that really means putting social scientists on security development teams. That’s psychologists, sociologists, anthropologists, poli-sci people, people who know how people work, and I think that will help us build a lot better security systems.
Bruce Schneier: The third is, I think we need way more diversity in our industry. We fall into a lot of bad rabbit holes because there has been one voice thinking about and designing security. Watching AI, I’ve seen a new area of security spring up that showed the problems of a monoculture. But the diversity of voices that are now talking about AI security really shows the promise of new ways of thinking and new solutions, and I’d like to clone that for the rest of cybersecurity.
Perry Carpenter: Okay, so this is a divergence from the LinkedIn questions, but since you’ve mentioned regulation several times now, let me ask you this. You believe that regulation is a chunk of the answer, but we’ve seen regulators try to get involved with things like encryption from, let’s say, an overly US-centric perspective, where they want to essentially break security by adding things like back doors to preserve what they believe – what the regulators or the intelligence community believe – is going to be within the national interest, or what is best for them. How do we deal with that when it comes to regulation?
Bruce Schneier: Well, this is a problem, right? Regulation should be for security and safety, not for insecurity. The problem with the encryption debate is that it’s not regulation for defense; it’s regulation for attack, for offense. Yes, if the Department of Justice is in charge, if the NSA is in charge, you’re going to get lousy security, because everyone likes to attack and no one likes to defend. I see the internet as part of our critical infrastructure, and protecting it is paramount. That is a challenge we have: we really don’t have any agency that is wholly on the side of defense. I mean, in the US, I suppose, we have CISA, but they are still really finding their way, and you do have the offensive side of cyber very much in the ascendant. I just finished Nicole Perlroth’s new book, a great book on offensive cyber, with the awesome title “This Is How They Tell Me the World Ends.” Her titles are better than mine, and I write great titles, but this is a fantastic title. And it’s something that internationally we need to figure out. You’re right that that kind of US-focused, offense-focused, anti-security regulation is not going to get us anywhere.
Perry Carpenter: Back to questions from LinkedIn. You’ve been outspoken, especially post-9/11, about security theater. Do you have any examples of security theater that you’re seeing today? If you want to expand on that, can you give some ideas on where those might be helpful psychologically, harmful, or benign?
Bruce Schneier: Security theater is a phrase I invented post-9/11 to describe security measures that look good but don’t accomplish anything. Random ID checks in buildings, National Guard troops in airports holding guns with no bullets; there are a lot of examples of that, of making people feel better even though they don’t actually do anything. Now, that’s not necessarily bad. We want people to have a feeling of security that is in line with the reality. Post-9/11, everyone was scared. The risk didn’t actually increase that much, so a little theater went a long way. In the past year, we’ve really seen a lot of the same thing; I’ve heard it called hygiene theater. The scrubbing down of surfaces for Covid did absolutely nothing, but it made people feel better. I know people who would wipe down their groceries when they brought them home or leave them outside for three days. You know, complete nonsense; it made no sense given the science even back then, and now we know it was a waste of time. But that was a piece of theater.
Bruce Schneier: Even today, some of the measures people take outdoors are security theater. We know how the virus spreads: it spreads through the air, it spreads indoors, it spreads where airflow is poor. I’m not worried about people at the beach; I’m worried about people in crowded bars in the evening. We saw a lot of that kind of theater in the last year with Covid, and it was interesting to watch what worked, what didn’t, and what people did anyway. Some of it is social signaling. I think, right now, we wear masks even though we’re vaccinated to signal to others, who don’t know whether we’re vaccinated. I walk into a restaurant, of course I’m going to wear a mask, even if I’m vaccinated, because we’re indoors and we don’t know who else is, and those restaurant workers are really at great risk. So, there’s a little theater there, and that’s a place where theater is valuable.
Perry Carpenter: You talked a little bit about some of the hygiene theater that you saw, and potentially some things that may even erode trust in authorities. How do we deal with these fractures in the view of authority in society when you’ve got probably hundreds of millions of people who may not trust the experts as much as they used to, when, in the long run, it’s probably better to trust the experts? How do we start to get some of that trust back?
Bruce Schneier: That’s an area that I really don’t understand. It’s so far from my field; it’s not security, it’s really psychology. But you’re right, we’re now living in a world where science is disputed. I think of it as the counter-Enlightenment: there is a group of people, pretty skewed along political lines, who don’t trust science, who don’t trust math, who don’t trust experts, who have their own answers and damn anybody that contradicts them, even if they have the facts. It’s something we as a society have to deal with, and I don’t know how. It’s exacerbated by the press, by social media, by a world where everyone can be a publisher. Right, there’s value in that, but there’s risk in that.
Perry Carpenter: Yeah. Do you have any thoughts then on the rampant spread of disinformation? Is that accelerating the way that I feel like it might be? Or is it more the same that it’s always been, but we are just more aware of it now?
Bruce Schneier: It’s hard to measure. We know that more disinformation is spreading, but we don’t know the effects of it. We have a lot of data, but no real good analysis of what does what. Take this very broadly: you go back a few hundred years, being a publisher was hard, and being a recipient of publications was hard. You had a world of information that didn’t really flow at all. You move into the world of the printing press and increased literacy, and you had broadcast: publishing was hard, but being the recipient was easier. That moved into radio and then television, and, again, one-to-many was the way it all worked. Now the other half has changed: it’s very easy to publish, and it’s very easy to receive information. Now we’re in this world where everybody is speaking. For most of history, nobody was speaking and nobody was listening; 100 years ago, a few people were speaking and everyone else was listening; today, everyone is speaking and everyone is listening. This is new, and I think we don’t fully understand the ramifications of the world we’re in. This is something that people way smarter than me are studying, and it’s very outside my area, but I think I gave you my philosophy on it.
Perry Carpenter: Philosophy welcomed. This is a meta type of interview. I’d love to get your thoughts on how technology might be increasing polarization or some of the social issues that we are seeing today.
Bruce Schneier: It’s pretty clear that technology is increasing polarization just by allowing more discrimination. By that word, I mean the ability of people to segment themselves and to be segmented. You can live a lot of your life now and not come into contact with an idea you disagree with, except in a divisive way. That is something that just wasn’t true before modern internet technologies, so that is affecting things. We’re not sure how, but it obviously is.
Perry Carpenter: What about the intentional or unintentional algorithmic encouragement of that?
Bruce Schneier: What about it?
Perry Carpenter: Do you have any thoughts on whether that should be addressed? It does seem like social media is intentionally or unintentionally encouraging that polarization through its engagement models.
Bruce Schneier: This gets back to regulation. It seems odd that we would organize our societal political discourse around the short-term financial benefit of a handful of tech billionaires. That seems a really weird way to organize our political system and to organize politics. Yes, I would like to see government regulation here, because the for-profit model of political speech isn’t serving our country very well.
Perry Carpenter: We’ll be right back after the break.
Female Speaker 2: And now we return to our sponsor’s question about forms of social engineering. KnowBe4 will tell you that where there’s human contact, there can be con games. It’s important to build the kind of security culture in which your employees are enabled to make smart security decisions. To do that, they need new-school security awareness training. See how your security culture stacks up against KnowBe4’s free phishing test. Get it at knowbe4.com/phishingtest. That’s knowbe4.com/phishingtest.
Perry Carpenter: Welcome back to our discussion with Bruce Schneier. Okay, so back in your wheelhouse with cryptography for a minute: we had multiple questions on LinkedIn from people asking about quantum computing and when quantum computing will begin to threaten existing commercial encryption. And, of course, when our currently trusted encryption models will no longer be trustworthy because of that.
Bruce Schneier: Quantum computing is a new way of doing computing, and it does threaten some cryptography, but not all of it. I wrote an essay on this, and I urge everyone to go find it. The title is “Cryptography After the Aliens Land.” Just type that into your favorite search engine along with my name and it’ll pop up.
Perry Carpenter: If you want to check out that essay, go ahead and check our show notes. I’ve included a link to the essay there.
Bruce Schneier: I go through the various promises and perils of quantum computing and what it means. The long and short of it is that we’ll be okay: it will break some encryption, but not all of it, and we are already working on post-quantum algorithms. We don’t actually know if quantum computing will work at all. We know it is hard, but we don’t know if it is put-a-person-on-the-moon hard, or put-a-person-on-the-sun hard. And I mean that really truly. If it’s put-a-person-on-the-moon hard, we should have breakthroughs over the coming years, and in a decade or so we’ll have working quantum computers actually solving problems. If it’s put-a-person-on-the-sun hard, it’s going to be centuries and we’re not going to solve it. It’s a very speculative technology right now. My money is on put-a-person-on-the-moon hard, if we’re going to have a betting pool, but cryptography is going to be okay. We have a lot of things we can do that are quantum-resistant, even theoretically, and we’ll be fine.
Perry Carpenter: Great, thanks. In this last section I want to be a little bit reflective and speculative. Over the past few decades as you look back, do you see any critical tipping point moments with respect to security that stand out to you?
Bruce Schneier: Nothing comes to mind. It’s not that I’m convinced there aren’t any; had I thought about it a lot more, I might come up with some, but I can’t think of any huge moments. Certainly the 9/11 terrorist attacks were a moment where a lot of things changed. I’m not sure things changed in cryptography or internet security in the same way. We might want to talk about some of the famous worms or malware. But, again, all of this feels like trends to me, and I can’t think of any flashbulb moments where everything shifted. Maybe that itself is interesting: that things haven’t shifted, that it’s been the same stuff we’ve been dealing with for decades.
Perry Carpenter: Yeah. So, as you consider this, there could potentially be some small inflection points, like this year, when digital transformation had to speed up a bit and people embraced new technology, but it’s more of a continuation than a complete break or shift. Then, I guess, let’s pivot to the future. As we look ahead, what are you most excited about or worried about?
Bruce Schneier: I mean, to me the future is the continuation of the present. I worry about the threat actors; I worry about nation states; I worry about criminals. I worry about all of the legal security threats that come from companies following the law and doing things that are bad for our security: the Googles, the Facebooks, surveillance capitalism. I worry about the Internet of Things, and the computerization and networkization of everything, and how that will change the threat landscape. I think there is promise in governments getting involved. Not the US, we’re way too dysfunctional, but the EU is the regulatory superpower on the planet. They’ve passed comprehensive privacy laws, and they’re looking at the Internet of Things, at vulnerability disclosure, at AI security. They are large enough that they will move the planet in ways that are good, so that’s really where I’m looking.
Perry Carpenter: Alright, so these last few questions that came in from LinkedIn are really just about you. You’ve been at this for a long time now. How do you maintain such a prolific level of output and where does the passion come from to sustain that?
Bruce Schneier: For me, writing is understanding, and also writing is a way I can channel my energies. Often I write about a topic because I want to figure it out, and the act of writing is how I figure it out. Putting my thoughts in coherent essay form or book form helps me explore the issue. If something happens that pisses me off, the way for me to calm down is to write something. Writing is not hard, writing is easy, and it actually helps me know what’s going on, and I want to affect the debate. It’s how I can talk about the issues in ways that people understand, and maybe I can move a political needle.
Perry Carpenter: Do you ever deal with something like imposter syndrome, as a cybersecurity expert speaking into all these other areas that have to do with human dynamics and psychology, sociology, and economics? Do you ever feel like you’re speaking into areas where you really don’t deserve a platform? Or do people accuse you of speaking into areas where you don’t have a platform?
Bruce Schneier: I try to stay in my lane. The question you asked me about misinformation is a good example. I could pontificate about it, but I know there are people other than me who are really researching this, and it’s enough outside my area that I don’t want to opine. In a lot of ways, I’m a synthesist; I’m a meta person, and I work in the ideas of security that apply in all domains. I actually have opinions on, I don’t know, stadium security, or security at rock concerts, based on some of the systemic things I know about security, even though I’m not a domain expert, even though I know nothing about how stadiums work. I try to maintain an interest and a humility, and know where I am speaking and what I can and can’t speak about, and then be honest about it. I can tell you what I think and then couch that with saying “and there are people who are better at this than me, so if they contradict me, believe them and not me.” I think that’s all part of maintaining what you know. In our world, generalists have it hard, because specialists almost always trump them, even though I think generalists have important things to say.
Perry Carpenter: Fantastic. Last question is probably the easiest and simultaneously most important question. Is there any question that you wish that I had asked that I didn’t think to ask? Or is there a last word or thought that you want people to be thinking about and having discussions about?
Bruce Schneier: In the past couple of years, I’ve started thinking about the role of technology in public policy, and the importance of what we’re coming to call public interest technologists: people who bridge the gap between technology and public policy. I am one of those people. I know many of those people, but there aren’t nearly enough of them. We need a career path in public interest tech. We need the ability for lots of people to move into the space that requires expertise in both camps, because all of the important societal political problems of this century are fundamentally technological, like climate change, the future of work, robotics, and medicine. We have a lot of bad policy coming from people who don’t understand the tech, and a lot of bad tech from people who don’t understand the policy. We need a cadre of people who understand both, and that’s what I’m trying to advocate for. It’s been on hold during the pandemic, because everything’s been on hold, but it’s something I feel passionate about and want to get back to.
Perry Carpenter: Well, that was Bruce’s last question and last response, and that brings us to the end of today’s show. Bruce’s perspective is always fascinating. As he mentioned, it’s important to recognize that security is about so much more than just technology. It’s about people, and taking a meta-level approach and seeking a synthesis of ideas across multiple disciplines is ultimately how we can evolve security and reduce risk. Bruce mentioned fields like psychology, sociology, economics, and political science as being critical to improving our security posture. He also didn’t shy away from mentioning the need for regulation as a critical step. His analogy to how regulation elevated safety and reliability in other industries, like the automotive, pharmaceutical, food, and finance industries, is compelling.
Perry Carpenter: The reason that regulation works in these circumstances is that most organizations within a specific vertical don’t want to bear the brunt of being the first, or the only one, to do something. If they act alone, they have to pass on the cost and the friction to their customers. The competitors that aren’t making the change may even benefit, because customers don’t understand why the new requirements exist, and if they can go somewhere else, they may. Regulation deals with that by leveling the playing field: it puts the economics in their place and ensures that all organizations within that industry are held to the same standards. That’s been proven to work.
Perry Carpenter: Well, I hope that you enjoyed this interview with Bruce. Next time we’ll be back in our regular format, and I’ve got several great guests lined up for that show. Thanks so much for listening, and thank you to my guest, Bruce Schneier. I’ve loaded up the show notes with links to all the relevant topics from today’s discussion, including Bruce’s books and much more. Be sure to check those out. If you’ve been enjoying 8th Layer Insights, please take a couple of seconds to head over to Apple Podcasts, rate the show, and consider leaving a review. That does so much to help.
Perry Carpenter: You can also help by posting about it on social media, recommending it within your network, and, heck, maybe even referring an episode to a friend or a family member. If you haven’t, go ahead and subscribe or follow, wherever you like to get your podcasts. Lastly, if you want to connect with me, feel free to reach out on LinkedIn or Twitter. I also participate in a group on Clubhouse. We meet once a week on Friday. It’s called the Human Layer Club, and you can just search for it on Clubhouse and find it pretty easily. Well, until next time, thank you so much. I’m Perry Carpenter signing off.