Keeping Sensitive Information Out of the Hands of Terrorists Through Self-Restraint

In my forthcoming book (available February 2012), I talk about various mechanisms for societal security: how we as a group protect ourselves from the “dishonest minority” within us. I have four types of societal security systems:

  • moral systems—any internal rewards and punishments;
  • reputational systems—any informal external rewards and punishments;
  • rule-based systems—any formal system of rewards and punishments (mostly punishments); laws, mostly;
  • technological systems—everything like walls, door locks, cameras, and so on.

We spend most of our effort in the third and fourth categories. I am spending a lot of time researching how the first two categories work.

Given that, I was very interested to see an article by Dallas Boyd in Homeland Security Affairs: “Protecting Sensitive Information: The Virtue of Self-Restraint,” in which he basically says that, out of moral responsibility (he calls it “civic duty”), people should not publish information that terrorists could use. Ignore for a moment the debate about whether publishing information that could give terrorists ideas is actually a bad idea—I think it’s not—because what Boyd is proposing is actually very interesting. He specifically says that censorship is bad and won’t work, and wants to see voluntary self-restraint along with public shaming of offenders.

As an alternative to formal restrictions on communication, professional societies and influential figures should promote voluntary self-censorship as a civic duty. As this practice is already accepted among many scientists, it may be transferrable to members of other professions. As part of this effort, formal channels should be established in which citizens can alert the government to vulnerabilities and other sensitive information without exposing it to a wide audience. Concurrent with this campaign should be the stigmatization of those who recklessly disseminate sensitive information. This censure would be aided by the fact that many such people are unattractive figures whose writings betray their intellectual vanity. The public should be quick to furnish the opprobrium that presently escapes these individuals.

I don’t think it will work, and I don’t even think it’s possible in this international day and age, but it’s interesting to read the proposal.

Slashdot thread on the paper. Another article.

Posted on May 31, 2011 at 6:34 AM • 47 Comments

Comments

Alan May 31, 2011 7:16 AM

Might work in a system with progressive disclosure, i.e., information is first published to a small audience, who can give the publisher feedback; then published to a wider audience, who can again give the publisher feedback; and only after some amount of time, published to the world.

Mike T. May 31, 2011 7:33 AM

That article presupposes that an individual has enough information to make a rational choice, and is someone who will actually make a rational choice.

In practice, the average human doesn’t know all the many and varied ways that information could be used. Defining non-disclosure as a moral or reputational imperative will have people taking the safe route in all cases, leading to a massive amount of censorship.

Follow the logic:
1. I get a piece of information.
2. I consider disclosing it, but may not have enough information to definitively classify it ‘good’ or ‘bad’.
3. As I am punished by society if I release ‘bad’ information, I will tend to classify the vast majority of my information as ‘bad’.
4. Scaled to millions of people, there soon is less new publicly available information in the world, which leads to even less capability by people to classify it ‘good’ or ‘bad’.
5. Everything is eventually secret.

What would be better is for governments, businesses, and individuals to build structures that do not rely on secrecy for their security.
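To make that chain of logic concrete, here is a minimal sketch in Python of the feedback loop in steps 3 and 4; every number in it is made up for illustration, not empirical. An asymmetric penalty for releasing “bad” information pushes holders toward withholding, and as less information circulates, their ability to classify the next batch degrades further.

```python
# Toy model of the over-classification feedback loop sketched above.
# All parameters are illustrative, not empirical.

import random

def simulate(rounds=25, holders=1000, reward=1.0, penalty=5.0, seed=0):
    rng = random.Random(seed)
    accuracy = 0.9            # chance a holder can confidently judge an item
    shares = []
    for _ in range(rounds):
        published = 0
        for _ in range(holders):
            if rng.random() < accuracy:
                expected = reward            # confident the item is 'good'
            else:
                expected = reward - penalty  # uncertain: assume the worst (step 3)
            if expected > 0:
                published += 1
        share = published / holders
        shares.append(share)
        # Less new public information makes the next judgment harder (step 4).
        accuracy = max(0.05, accuracy - 0.2 * (1 - share))
    return shares

if __name__ == "__main__":
    for rnd, share in enumerate(simulate(), start=1):
        print(f"round {rnd:2d}: {share:.0%} of items published")
```

Under these toy assumptions the published share decays slowly at first and then collapses toward the floor, which is step 5 in caricature.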

Mike T
Private Citizen

Another Kevin May 31, 2011 8:14 AM

The comment by Mike T shows that a shame-based secrecy system can have an intolerable human cost. In the end it leads to the dystopic vision where “nice weather we’re having, isn’t it?” becomes “disclosing to possible enemies of the State information that is vital to our national agricultural productivity.”

Moreover, such a system is ultimately futile. It is inevitable that shame will fall on the hearer as well as the teller. If, as Boyd suggests, formal channels to alert the government to sensitive information are established, it is a near certainty that the tale-bearers will be subjected to scrutiny: “How did you learn that secret? From whom? When? Whom else among your associates might have learnt it?” and eventually punished for knowing too much. The government’s own acquisition of intelligence will fail, precisely because it will be punishing its own informers.

vwm May 31, 2011 8:15 AM

Might work, if a majority agrees on what should be published and what should not.

But I find it rather difficult to claim the moral high ground while discrediting dissidents and whistle-blowers as “unattractive figures whose writings betray their intellectual vanity.”

Dwayne Phillips May 31, 2011 8:34 AM

There are a lot of us out here right now who are already abiding by self-censorship. The “shaming others” part is a bad idea. If I shame someone for writing something, I am signaling that what they have written is correct and a lot of help to people who want to do bad things to other people.

Harald May 31, 2011 8:43 AM

I completely reject his conclusion: “Yet in our reverence for free expression, communication has been tolerated that carries considerable security risks while delivering questionable benefits to society.”

In several places he rejects the idea that full disclosure is acceptable when the attackers already know about, or can easily find, sensitive information. Put bluntly, we know better.

Mr. Boyd needs to study the early days of the computer security field, when vendors would do nothing about vulnerabilities until they were shamed into acting by public disclosure. He needs to investigate the world of physical security, where lock vulnerabilities are routinely known to both locksmiths and lockpickers, but not to the owners of the locked items.

The fact that he does not understand the benefit to society of full disclosure invalidates his entire argument, IMO.

TimH May 31, 2011 8:54 AM

Boyd operates on the premise that ‘the authorities’ who are protected by the proposed self-censorship are benign and serve the interests of the people. I suggest naivety there. The Obama administration’s persecution of NSA whistleblower Thomas Drake is a clue… Drake exposed massive waste, excess and perhaps illegality in numerous NSA programs. Secrecy is often used as a shield to hide embarrassing situations.

Soosan May 31, 2011 8:57 AM

Perhaps the idea is sound in principle, but it needs to be expanded a bit.

Make anonymous disclosure possible and make sure that after two months the information gets disclosed publicly. That way, as with software disclosure, there is (ample?) time to plug the hole. There can be no punishment, and since the motivation is moral, there will be no reward other than feeling good either.

The disclosure will make plugging holes a necessity, and nothing will get swept under the carpet.

As with any system though, who will operate it? Who will control its workings?

James May 31, 2011 9:14 AM

I just finished reading Daniel Pink’s book “Drive,” and I think you might find the research (also covered in communications classes) on Theory X versus Theory Y interesting. Theory X holds that people are naturally lazy and will not work unless they are rewarded (and monitored/punished), whereas Theory Y holds that people naturally want to work and that monitoring/punishment systems impede performance. More info at http://en.wikipedia.org/wiki/Theory_X_and_theory_Y

Anyway your moral and reputational systems seem to rely on Theory Y, which may make them more effective.

Danny Moules May 31, 2011 9:16 AM

“where he basically says that people should not publish information that terrorists could use”

There goes chemistry, physics and maths then.

Oliver Holloway May 31, 2011 9:20 AM

This article about self-restraint is merely another review of the idea of “security through obscurity,” decorated with hubris. The author might as well be saying, “Hey, team, if we don’t share our knowledge, the other team will never figure out how to beat us!” Because, you know, we’re just so gosh-darned exceptional that no opponent could ever come up with anything better.

Kudos to the author on writing style and meme direction. He took the time to travel from examples where transparency helped, to examples where transparency helped but was embarrassing, to examples where obfuscation clearly did nothing to prevent terrorism and of course the embarrassment that that can cause, to a conclusion that we just can’t have that kind of embarrassment. Anyone else see the problem with this meme-vector?

Clive Robinson May 31, 2011 10:18 AM

I’m assuming most people have read the various works of George Orwell?

Well, he identified this sort of division of information, and its withholding by a clique, as a very, very powerful method of state control.

One of the things that history teaches us over and over again is that withholding information from society in general does considerably more harm than good, even in the short term.

Thus one has to ask questions about Mr Boyd and his intentions, and whether he is really as naive as he comes across.

R Cox May 31, 2011 10:40 AM

I think that the structures we have set up as a society help keep us honest. We see that following clear rules and expectations tends to help us be successful, and so most of us tend to behave in the expected manner. Kids, who have not had the experience to learn profitable behavior, tend to be more likely to behave outside these norms: joyriding in cars, stealing phones, etc.

This idea is supported by the behavior of people in the US up to the mid-20th century. It certainly was not right to own slaves, but it was profitable. It certainly was not right to treat people differently based on skin color, but from a commercial point of view it was profitable. Game theory tells us that tit for tat is the best strategy, unless a society can isolate a clearly identifiable minority that can then be exploited. In such cases it is not a situation of a minority element, but a societal breakdown.
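For readers who haven’t seen the game-theory reference, here is a minimal sketch (purely illustrative, using the textbook prisoner’s-dilemma payoffs) of the tit-for-tat strategy the comment mentions: cooperate on the first move, then mirror whatever the other player did last.

```python
# Minimal iterated prisoner's dilemma illustrating tit-for-tat.
# Payoff values are the textbook defaults; 'C' = cooperate, 'D' = defect.

PAYOFF = {  # (my move, their move) -> my points
    ('C', 'C'): 3, ('C', 'D'): 0,
    ('D', 'C'): 5, ('D', 'D'): 1,
}

def tit_for_tat(my_history, their_history):
    return 'C' if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return 'D'

def always_cooperate(my_history, their_history):
    return 'C'

def play(strategy_a, strategy_b, rounds=100):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

if __name__ == "__main__":
    for name, opponent in [("always_cooperate", always_cooperate),
                           ("always_defect", always_defect),
                           ("tit_for_tat", tit_for_tat)]:
        print(f"tit_for_tat vs {name}: {play(tit_for_tat, opponent)}")
```

Against a constant defector it loses only the first round; against a cooperator it cooperates throughout, which is why it does so well in repeated play.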

So what does this mean in terms of keeping secrets on a slippery moral slope? If individuals within the government are spying on people beyond what one believes is legal, is it one’s civic duty to report it or not? If one knows that certain people in a government are participating in the illegal drug trade to raise funds, is it one’s civic duty to report it or not? Is civic duty meant to protect individuals in a government, or to ensure that immoral behavior resulting from societal breakdown, like allowing children to go hungry while others grow wealthy, is not allowed to continue? The reason this will not work is that it is premised on the idea that civic duty is defined by a central government committee of individuals, not by the people.

NobodySpecial May 31, 2011 10:41 AM

In the case of security, obscurity has pretty much been proven to be bad.

But there is a case to be made for a ‘duty of care’ attitude. I write on a science Q+A site where we have an unwritten policy of not answering questions that are going to be too dangerous to the user (how do I make a railgun/gunpowder, etc.).

This information is available – if you put a little work into finding it – but once a carefully explained answer is out there on the net, it’s going to be found by a kid who doesn’t have the background knowledge to be safe.

TimH May 31, 2011 11:13 AM

On ‘dangerous’ chemistry and physics… try an old encyclopedia or Victorian ‘receipts’ books for info on gunpowder/cotton, lead azide etc…

John Campbell May 31, 2011 11:18 AM

Punishments– the government has gotten very good at this since they’ve not read enough of B.F. Skinner’s behaviorism.

When the “system” – be it peer pressure or not – only provides punishment, there is little or no encouragement toward “right-thinking”.

Mind you, who defines what “right thinking” is? And would you trust any of the folks who may claim to be “in charge” to define what you are allowed to think about, perceive, deduce or create?

This mindset – transitioning from an “open” society w/o castes to a “closed” society where innovation is strictly “managed” – is a loss for the human race as a whole. If enough people accept this, then someone should push the button now and render us extinct.

rnsanchez May 31, 2011 11:45 AM

Even paranoia has to be “taken” with moderation. Creating the instinct that anyone is spying on you to feed a terrorist chain is going to have terrible social effects (think of it as “FUD zombies”).

Public shaming as a societal system already exists and is widespread: bullying, which basically boils down to “if you’re not tough, you’re not worthy.” In my understanding, it nurtures a rebellious instinct that will eventually lash back at people, sometimes with very real actions (e.g., someone taking a machine gun to a (possibly public) place and venting).

The moment such a new system is put in practice, abuse will start. It is kinda out there already, by the way — smear campaigns, especially during election times.

Plus, going from public shaming to public stoning (or the equivalent) is a small step, and soon we will be in the medieval era again, where simply having a Facebook account will be enough to bring on all sorts of nasty things that the FUD zombies want to enforce/regulate (think witch-hunt).

pbnj May 31, 2011 11:46 AM

Self-restraint in this context is a two-way street. And those preaching it are practicing darned little of it.

Not just the spying. The uncertainty of careers, prospects, healthcare and the other basics of life.

You can flap your gums all you want. When you stop and look at your actions, and their consequences — that’s when, maybe, you begin to get a clue.

Brandioch Conner May 31, 2011 11:47 AM

@Mike T.
@Oliver Holloway

Excellent points. And an additional problem … Bruce’s “movie plot threats”.

Who determines what is “dangerous,” when just about everything can be fit into some kind of “movie plot threat” that the terrorists COULD perform?

We cannot list the locations of our malls on Google because a terrorist might find them.

Gary H. May 31, 2011 1:20 PM

Without a transcendent basis to form a moral consensus, shaming will not be effective. Moral relativism trumps shame.

Pete May 31, 2011 1:36 PM

What does your research into the ‘dishonest minority’, and moral and reputational implications thereof, tell you about the crooks in BT responsible for the BT.Phorm affair?

Or are you (still) too embarrassed to comment on it?

jacob May 31, 2011 3:01 PM

@clive: Yep. George Orwell was ahead of his time. This gentleman is being paid (I assume) for putting out this piece of fantasy.

I used to laugh at the conspiracy nuts in the ’60s and ’70s. Not so much now. Yes, I think most government workers really couldn’t care less what people do. However, free speech is inhibited even if people merely know that someone is watching, leaving us with the up-is-down changed syntax and meaning that Orwell spoke of in 1984. PC certainly plays into that.

I am tired of the “unexpectedly” news being bandied about. PBS hacked? Man, the bad guys are even picking on Big Bird.

Dirk Praet May 31, 2011 4:27 PM

Self-censorship is as naive a notion as self-regulation.

First of all, I know few individuals for whom ethics and morals prevail over more earthly matters such as ego, money, sex and power. It’s the stuff only idealists and holy (wo)men are made of. Personally, I think I’d rat out anything and anyone for something as simple as a date with Jessica Alba. Self-restraint, civic duty: it’s like preaching abstinence to hormone-crazed teenagers in a pr0nified media landscape while keeping several mistresses yourself. It doesn’t work because it’s simply against human nature.

The second part of the equation is that I don’t need preachers or governments to set up channels for me to talk to God or disclose inconvenient truths, and neither should anyone else, for the simple reason that, throughout history, both have done nothing but betray people’s trust for their own goals. Whether forced or self-imposed, censorship has never served any other purpose than to preserve secrecy and to perpetuate the status quo.

And in the end, freedom of speech may just be one of the last things the commoner has to make a difference in a world of authoritarian regimes and pseudo-democracies. I guess, and as mentioned in a previous post, that is exactly what the First Amendment was made to protect.

Harry Johnston May 31, 2011 5:56 PM

Out of curiosity, has anybody heard it suggested that 9/11 was inspired by Stephen King’s “The Running Man”?

(The movie, by the way, bears no resemblance to the book except that some of the names are the same.)

Richard Steven Hack May 31, 2011 7:32 PM

It’s hard to know where to begin on how muddled and stupid this entire argument is. Fortunately, the posters above have dealt with most of it.

I’ll just add that it’s just another attempt to evade the pure physical and social fact that there is no security.

And my suspicion is that it’s deliberately evading it in order to push the notion that persecution of whistleblowers is justified. This is a current issue with the Obama administration, which has prosecuted more whistleblowers, more harshly, than any other President in decades. It was one of the first campaign promises Obama reneged on (and there have been SO many!). I said during his campaign that Obama was just “Bush Lite,” and it has proven to be so.

As an aside, of Bruce’s four societal security systems, the first two are jokes, the third produces INsecurity as a byproduct by definition, and the fourth can always be evaded by new technology or by vulnerabilities in the current technology.

So once again, read my lips: There is no security. Suck it up.

asd May 31, 2011 8:05 PM

Is the author of the document (see the Slashdot thread on the paper, and the other article) saying that if more people show personal restraint, the government won’t have to bother doing the negative things it does? :->

Dinah May 31, 2011 8:07 PM

Freedom not as indulgence but coupled with responsibility. What’ll they think of next.

Trichinosis USA May 31, 2011 8:24 PM

It’s actually encouraging that this “Boris is listening” mentality is being touted publicly. It means that the real bad guys are nervous about something – perhaps Wikileaks put more of a hurt on them than they’d like to admit. Or perhaps they’re getting ready to deal with a possible mistrial for Bradley Manning, which would certainly cause its share of sour grapes, and possibly encourage more whistleblowing.

It’s just psyops when you get right down to it. Ego is one of the reasons people get into espionage, part of the LIFE acronym. Putting down and trying to shame a person who’s revealing secrets out of a real or imagined need for egoboo is easy when that person isn’t working for you and yours; but such statements can be a two edged sword, as the same motives can just as often be ascribed to one’s own operatives.

Gabriel May 31, 2011 8:39 PM

I would propose that it is our patriotic duty to the state and civilization to transform our language so that it is impossible to accidentally convey bad knowledge, which is crimethink. If no bad thoughts can be conveyed, then we can rest assured that no one can inadvertently help terrorists and criminals because of their fallacious egos. This new language should be known as Newspeak, and I believe we can institute it by 2050. Remember, it is for the children.

Richard May 31, 2011 8:54 PM

Even during a declared war, the ramifications of censoring information may be trickier than they might initially seem. The Slate article “Censorship’s trial balloons” [ http://www.slate.com/id/2102499/ ] may be of interest in this regard.

During World War II, there was a Japanese operation that involved sending off a huge number of “balloon bombs” (incendiary bombs and explosives attached to hydrogen balloons) to drift over the Pacific Ocean and to land within the American mainland. The media stayed silent upon official request from the Office of Censorship, and this may have helped to discourage the Japanese (who were monitoring American media) with regard to the “balloon bomb” campaign. Later on, however, the government changed course and said that revealing certain information would be worthwhile if it would allow “the possible saving of even one American life through precautionary measures.” This was shortly after an incident where one of the “balloon bombs” came down in Oregon, with the resulting explosion causing six fatalities.

Another issue mentioned in the article is when future generations (as opposed to the current generation) simply do not learn about things that happened in the past.

Vles May 31, 2011 11:17 PM

Comments page for the HSAJ article:
http://www.hsaj.org/?comments=7.1.10

And I’d like to offer this 20minutes TED on the power of vulnerability, for it might be helpful in your research Bruce:
http://www.ted.com/talks/brene_brown_on_vulnerability.html

When I was 18 I asked my chemistry teacher in our class how to produce nitroglycerin. She said it was easy, showed us the molecular structure but refused to explain the production process, as it was not appropriate, due to her duty-of-care. While I’m not a terrorist, I’m sure you can understand a youthful desire to try dangerous things at home and her refusal to share such knowledge.
However twelve years later this knowledge is trivial to obtain: (google) http://www.google.com.au/search?hl=en&q=how+to+make+nitroglycerin
I no longer have any desire or internal motivation to try chemical experiments, but in the context of national security the author wishes to point out that nothing beats insider knowledge. Put differently: what you know about how your system or environment operates, and about the best and most efficient way to tear it to shreds or do it harm, is not immediately apparent to an outsider. You take that knowledge for granted! Exploiting one vulnerability may not make a dent (and vulnerabilities still need to be disclosed and fixed), but stringing several together in the right order can make for a violent recipe if they take more time to fix than an assault takes to plan.
I suppose it’s not all bad. You can make those systems more resilient if you know how they fall apart, and the author fails to take into account that by disclosing such information you’re not only giving the terrorists a “road map”, you are also teaching your fellow citizens how to better care for and look out for their systems while they are vulnerable (“wide scope, high impact, low-hanging fruit”). The scenario and odds of mitigation are much better if 15 determined terrorists are trying to tear a certain something to pieces and 308 million Americans are on standby…

This is homeland security stuff. But this self-restraint is not so much about morals (combating changing societal mores with a higher ethical standard in handling sensitive information) as it is about ethics, methinks, and that’s a whole other topic.

Is Mr. Boyd around to discuss?

Otter May 31, 2011 11:24 PM

If attitudes expressed by “unattractive figures whose writings betray their intellectual vanity” are as widespread as Boyd’s belief that he can say it out loud suggests, Homeland Security is hopelessly incompetent.

The Imp June 1, 2011 5:08 AM

The fundamental problem with his idea is that the leaks, according to the inquiry done afterwards, leaked nothing that poses any threat to anyone.

Which means that there’s not a lot of chance to shame someone for releasing what should never have been concealed.

paul June 1, 2011 9:10 AM

This relies incredibly strongly on the public consensus about which disclosures should be shamed and which shouldn’t. As people have noted, that consensus is unlikely to be very accurate. In fact, except in a few cases at either end of the gamut, it seems plausible that public consensus — vulnerable as it is to manipulation by people with money and/or access to media echo chambers — is likely to be the opposite of accurate. People whose malfeasance can be exposed by leaks have a huge incentive to use covert channels to promote shaming; people who gain from leaks (even at the expense of the general public) have the same incentive to use covert channels to damp down shaming. Much cheaper in both cases than either getting information legitimately or correcting the malfeasance.

Iasa June 1, 2011 3:33 PM

rule-based systems — any formal system of rewards and punishments (mostly punishments); laws, ostly;

What does “ostly” mean? I assume it’s a typo, but I’m having a hard time determining what it should have been.

Imperfect Citizen June 1, 2011 8:02 PM

Shaming didn’t stop/hasn’t stopped slavery, genocide, or little Tommy Cunningham from letting the air out of my bicycle tires in second grade.

R. Hamilton June 2, 2011 9:25 AM

Within the narrow context of professional associations, this might actually make sense (there’s some leverage if a person needs to belong in order to practice their profession), and might even do some good, because it’s dealing with people likely to be knowledgeable insiders.

Aside from that, I just don’t see why so many commenters go nuts at the idea of encouraging self-restraint. It would be wonderful if those doing the encouraging weren’t hypocrites, but regardless of that, if enough people started to think before shooting off their mouths, that would only be a good thing, right?

Mind you, I don’t think it’s more than a small part of the problem…and I can’t think of a non-provocative way of saying the rest of what I’m thinking on that.

paul June 2, 2011 10:35 AM

R. Hamilton:

What drives a lot of people nuts is the implicit notion that researchers and others don’t already think long and hard about what to disclose and when and how to disclose it.

Clive Robinson June 2, 2011 11:14 AM

@ R.Hamilton,

“Within the narrow context of professional associations, this might actually make sense (there’s some leverage if a person needs to belong in order to practice their profession), and might even do some good, because it’s dealing with people likely to be knowledgeable insiders.”

Actually probably the exact opposite.

Have a look at the history of lock picking over the past 15 years or so (and even further back).

Matt Blaze, a researcher and lecturer (www.crypto.com), found out by simple deduction that locksmiths had been lying to the likes of the hotel and entertainment industries about five-pin “master key” locks for many, many years (possibly over 100 years).
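The attack Blaze published is simple enough to sketch. Below is a toy simulation (hypothetical pin depths and bittings, not code from his paper): with one change key and access to a single lock, you probe each pin position in turn with keys that differ from the change key in only that position, and any second depth that opens the lock is the master cut for that position.

```python
# Toy simulation of the master-key "rights amplification" attack described
# by Matt Blaze: recover the master bitting from one change key and one lock.
# Pin depths and key bittings below are made up for illustration.

DEPTHS = range(10)          # possible cut depths per pin position

class MasterKeyedLock:
    """Each pin opens at either the change-key depth or the master-key depth."""
    def __init__(self, change_key, master_key):
        self.shear_lines = [{c, m} for c, m in zip(change_key, master_key)]

    def opens(self, key):
        return all(cut in line for cut, line in zip(key, self.shear_lines))

def recover_master(lock, change_key):
    master = []
    for pos, change_cut in enumerate(change_key):
        found = change_cut                      # default: no split pin here
        for depth in DEPTHS:
            if depth == change_cut:
                continue
            probe = list(change_key)
            probe[pos] = depth                  # vary only one position at a time
            if lock.opens(probe):
                found = depth                   # second shear line = master cut
                break
        master.append(found)
    return master

if __name__ == "__main__":
    change, master = [2, 5, 1, 7, 3], [6, 5, 9, 2, 3]   # hypothetical bittings
    lock = MasterKeyedLock(change, master)
    print("recovered master bitting:", recover_master(lock, change))
```

In the worst case that is positions × (depths − 1) single-position probes rather than a brute-force search over the whole keyspace, which is why the finding so upset the trade.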

The result was a concerted attack on the research by various “expert members” of the various “locksmith” associations.

The only thing that keeps people honest is the fear of being caught and punished. The moment you remove either of those, the incentive to remain honest disappears like a thief in the night.

The only sensible solution, where security is reasonably improvable, is a researcher publicly disclosing the weakness 60 or so days after notifying the designers of the system.

The level of the disclosure should be sufficient to give not just warning, but reasonable steps to mitigate the issue until the designers of the system have come up with an appropriate fix.

However, I’m personally not in favour of full disclosure after the first 60 days. That is to say, I am actually in favour of full disclosure by the researcher, but they should allow sufficient time for fixes, not just workarounds, to be produced (say another 30 or 60 days).
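As a concrete illustration of the schedule described here (the 60-day and 30-to-60-day windows come from the comment; the dates, names and function are made up), the staged timeline is trivial to compute:

```python
# Sketch of the staged-disclosure timeline described above.
# Interval lengths follow the comment; everything else is illustrative.

from datetime import date, timedelta

def disclosure_schedule(vendor_notified, advisory_after=60, full_after_extra=30):
    """Milestones: notify -> public advisory with mitigations -> full details."""
    advisory = vendor_notified + timedelta(days=advisory_after)
    full = advisory + timedelta(days=full_after_extra)
    return [
        ("vendor/designer notified", vendor_notified),
        ("public advisory with mitigations", advisory),
        ("full disclosure of details", full),
    ]

if __name__ == "__main__":
    for label, when in disclosure_schedule(date(2011, 5, 31)):
        print(f"{when:%Y-%m-%d}  {label}")
```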

Further, the researcher should receive the protection of the law in the same way that whistleblowers etc. are, provided they have followed a reasonable path of action on the disclosure.

The reason for this is that researchers earn a reputation with each security fault they find. They are entitled to this recognition, the improvement in their standing within the community, and the secondary fiscal benefits this can bring.

Otherwise they will either not do the research or be tempted to find a high-paying customer for their work; either option only brings significant harm to the whole community.

melior June 5, 2011 8:20 PM

If this catches on, we could move to adopt a ‘stigmatization’ of the ‘unattractive’ types who tend to steal, kill, and rape next.

R. Hamilton June 6, 2011 3:36 AM

@Clive Robinson:

I read the Matt Blaze report years ago. I didn’t have a problem with it, given how it was said to have been handled.

Nor am I talking about cases where people have been sitting on a problem and doing nothing about it.

I don’t have a problem with responsible disclosure. I acknowledge that many (if by no means all) professionals that make disclosures seek to do so responsibly.

However, while secrecy does not promote security in general, I’m simply not prepared to assume that disclosure is always the best answer (which means I’m not inclined to trot out a list of examples). And, while it’s not really on topic here, there are matters not necessarily involving vulnerabilities that might create vulnerabilities if they were disclosed. There’s also the concept of operations security (OPSEC – see Wikipedia), where one tries to avoid disclosing patterns of individually insignificant information that, taken together, could reveal sensitive information. That concept is perhaps not applicable to most people, but it is applicable to those working with sensitive information, or even those in regular contact or proximity to them.

The basic call for responsible behavior does not seem to me unreasonable. Clearly it’s affected by the behavior of others (it would hardly be responsible to refrain from disclosure simply to give someone else the opportunity to keep ignoring a problem). However, mere hypocrisy by someone else doesn’t excuse irresponsible behavior on one’s own part. And I don’t see a problem with self-censorship so long as it’s not reflexive and unthinking. We do that all the time, so as not to give offense when it would serve no useful purpose, or perhaps because there’s some content that we don’t wish to encourage by becoming a paid subscriber.

This doesn’t mean that I too don’t get frustrated. There’s plenty that’s wrong, and a lot of times more communication (disclosure, if you like) would help. But disclosure as a reflex isn’t much better than secrecy as a reflex. Either way, some thought should take place first. To reduce the incentive for calls for it to be centrally imposed (if for no other reason), it should be self-imposed.

Clive Robinson June 6, 2011 8:24 PM

@ R. Hamilton,

“However, while secrecy does not promote security in general, I’m simply not prepared to assume that disclosure is always the best answer…”

It isn’t in every case; it’s a question of harms and their resolution at the end of the day, and with software vulnerabilities we have yet to come up with a better method of getting them fixed.

When talking about harms and resolution you have to start thinking in different ways for different goods and scenarios.

Firstly, there is a big difference between tangible physical goods and intangible software goods, because the fault resolution methods are different. Tangible goods usually involve a “product recall,” whilst intangible goods usually involve the “release of a patch” or a “change of process.” And some tangible goods cannot even be recalled.

However, the manufacturers of tangible goods should not carry a “known to be deficient” idea on into new products or, if it is dangerous, carry on producing the good in that way.

Thus it falls on the producer to behave “responsibly” after being informed of defects in their goods which cause harm.

The history of the software industry was “knock it out quick and dirty,” because the “first to market” was at one point the market winner. Thus the software producers asked the “what’s in it for me?” question, with the result that they did not behave as responsibly as they could, or in some cases should, have.

In more recent times some software manufacturers have become more responsible, but many have not. Full disclosure is touted as being the cause of this increased responsibility, and this may well be true.

History has shown us that with tangible goods there are three ways to get responsible behaviour:

1, legislation.
2, loss of reputation.
3, loss of business.

However, when you are effectively the “only kid on the block,” you can use your monopolistic position to make even legislation ineffective.

When it comes to intangible goods, these are usually not even sold, only licensed, which is why the software industry can get away with more than any producer of tangible goods.

And it was the attitude of the supply industry that caused full disclosure to happen.
