Failing badly

From Wikipedia, the free encyclopedia

Failing badly and failing well are concepts in systems security and network security describing how a system reacts to failure. The terms have been popularized by Bruce Schneier, a cryptographer and security consultant.[1]

A system that fails badly is one in which a failure, once it occurs, is catastrophic. A single point of failure can thus bring down the whole system. Examples include:

  • Databases (such as credit card databases) protected only by a password. Once this security is breached, all data can be stolen (see the first sketch after this list).
  • Buildings depending on a single column or truss, whose removal would cause a chain reaction collapse under normal loads.
  • Security checks that concentrate on establishing identity rather than intent (thus allowing, for example, suicide attackers to pass).
  • Internet access provided by a single service provider. If the provider's network fails, all Internet connectivity is lost.
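
The database example above can be illustrated with a short sketch. The names, data, and password here are hypothetical and serve only to show the single point of failure; this is not an implementation described in the source.

```python
# A minimal sketch (hypothetical names and data) of a store that fails badly:
# a single shared password is the only control, so one breach exposes everything.

SECRET = "hunter2"                                     # the lone credential
RECORDS = {i: f"card-{i:04d}" for i in range(10_000)}  # stand-in for card data

def export_all(password: str) -> dict:
    """Return the entire database if the single password matches."""
    if password != SECRET:
        raise PermissionError("wrong password")
    return dict(RECORDS)                               # one breach == total loss

# An attacker who learns SECRET obtains every record in a single call:
# stolen = export_all("hunter2")
```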

A system that fails well is one that compartmentalizes or contains failure. Examples include:

  • Databases that do not allow downloads of all data in one attempt, limiting the amount of compromised data (see the sketch after this list).
  • Structurally redundant buildings designed to resist loads beyond those expected under normal circumstances, or to resist loads when the structure is damaged.
  • Concrete structures, which show fractures long before breaking under load, thus giving early warning.
  • Armoured cockpit doors on airplanes, which confine a potential hijacker within the cabin.
  • Internet connectivity provided by more than one vendor or discrete path, known as multihoming.
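
The first example in this list can likewise be sketched in code. The names and the quota value are hypothetical; the point is only that per-caller limits bound how much data a single compromised credential can expose, so the failure is contained rather than total.

```python
# A minimal sketch (hypothetical names) of the same store redesigned to fail well:
# each caller has its own key and a hard per-key quota, so a leaked key
# compromises only a bounded slice of the data.

RECORDS = {i: f"card-{i:04d}" for i in range(10_000)}
QUOTA = 100                              # maximum records ever served per key
_served: dict = {}                       # records already served, per key

def export_page(api_key: str, start: int, count: int) -> dict:
    """Return a small page of records, never exceeding QUOTA per key."""
    used = _served.get(api_key, 0)
    allowed = min(count, QUOTA - used)
    if allowed <= 0:
        raise PermissionError("quota exhausted for this key")
    page = {i: RECORDS[i] for i in range(start, start + allowed) if i in RECORDS}
    _served[api_key] = used + len(page)
    return page                          # a stolen key leaks at most QUOTA rows
```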


References

  1. ^ "Homeland Insecurity", The Atlantic Monthly, September 2002