Security and Compliance

  • Bruce Schneier
  • IEEE Security & Privacy
  • July/August 2004

It's been said that all business-to-business sales are motivated by either fear or greed. Traditionally, security products and services have been a fear sell: fear of burglars, murderers, kidnappers, and -- more recently -- hackers. Despite repeated attempts by the computer security industry to position itself as a greed sell -- "better Internet security will make your company more profitable because you can better manage your risks" -- fear remains the primary motivator for the purchase of network security products and services.

The problem is that many security risks are not borne by the organization making the purchasing decision. An organization might be perfectly rational about securing its own networks against threats like theft of proprietary information and business interruption. But the adverse effects of privacy loss are borne more by those whose privacy has been breached. In economics, this is known as an "externality": an effect of an organizational decision whose costs are not borne by the organization making it.

This is the proper backdrop to understand the recent spate of privacy laws. The goal of these laws is to bring those externalities into the decision process by adding an additional fear motivator: fear of lawsuit, fear of criminal penalties, fear of being out of compliance.

Privacy laws have existed for years in Europe, but the United States has been seeing an increase in network security compliance laws in recent years. The litany includes the Sarbanes-Oxley (SOX) Act of 2002, the California Database Protection Act (SB 1386) of 2002, the Gramm-Leach-Bliley (GLB) Act of 1999, and the Health Insurance Portability and Accountability Act (HIPAA) of 1996, whose privacy and security rules took effect in 2003.

The creation of these laws generally follows three steps:

  1. Weak systems call people to action. Media headlines about data security penetrations fuel public outcry and calls for regulation.
  2. Regulators mandate policy and technology solutions -- typically vague and untested stopgaps. This is particularly true of SB 1386, but it also applies to SOX and GLBA. HIPAA is the most explicit of the bunch.
  3. Industry develops a response: policies, processes, and solutions it can live with economically, as negotiated with regulators and the courts.

Notice how the entire process is motivated by fear, and then in turn uses fear to deal with the externality. In Step (1), the public realizes that it has more to lose from these penetrations than the victim organizations do. In other words, the organizations have some fear of attack and some security measures, but there are additional vulnerabilities -- externalities -- that affect the public.

In Step (2), regulators attempt to bring those externalities into the organization. By passing laws and mandating compliance, they introduce an additional fear to the organizations and motivate them to implement additional security measures.

Each of these laws imposes strict requirements on enterprises to establish or identify, document, test, and monitor "internal control" processes. Since most, if not all, of these processes are supported by information technology, these laws have an enormous impact on a company's network security decisions. The four laws specify varying requirements, but all share the following common mandates for organizations:

  • Security Policies: Well-defined policies for data privacy and protection discourage the government from imposing its own standards -- the least desirable of all situations.
  • Security Processes: Demonstrating policy in action with people using technology in a predictable manner to protect data from attackers.
  • Robust Audit Trail: The foundation of a mature process; regulators require evidence of what happened so an organization can justify why events need not be reported.
  • Preventive Measures: Encryption, digital signing, and real-time detection of attacks all serve to pre-empt attacks on data.

But like most legislative actions, these laws are vague and imperfect. Some provisions are simply too much for organizations to implement; others are just poorly worded. So the end result is Step (3): a negotiation between the organizations affected by the laws and the regulators entrusted with enforcing them.

The result is improvement. More important than the specific list of countermeasures is a process of continual security improvement. An organization needs to be able to defend itself against a lawsuit by saying: "We're implementing these industry-standard processes. We're improving. We may not be perfect today, but we're continually getting better."

As a security professional, I applaud these kinds of regulations. Just as I advocate software liability as a way to bring the externalities of insecure software into the decision-making process of the software manufacturer, I see these data privacy laws as a way to force organizations to take the protection of personal data seriously.

Categories: Economics of Security, Laws and Regulations, National Security Policy