Parkerian Hexad
From Wikipedia, the free encyclopedia
The Parkerian Hexad, created by Donn B. Parker, adds three atomic, non-overlapping attributes of information to the three classic security attributes of the CIA triad (confidentiality, integrity, and availability).
The Parkerian Hexad attributes are the following:
- Confidentiality
- Possession or Control
- Integrity
- Authenticity
- Availability
- Utility
These attributes of information are atomic in that they cannot be broken down into further constituents, and non-overlapping in that each refers to a unique aspect of information. Any information security breach can be described as affecting one or more of these fundamental attributes of information.
Confidentiality refers to limits on who can get what kind of information. For example, executives are concerned about protecting their enterprise’s strategic plans from competitors, and individuals are concerned about unauthorized access to their financial records.
Possession or Control: Suppose a thief steals a sealed envelope containing a bank debit card and (foolishly) its personal identification number. Even if the thief never opens the envelope, the victim would legitimately be concerned that the thief could do so at any time, outside the owner's control. That situation illustrates a loss of possession or control of information without any breach of confidentiality.
Integrity refers to being correct or consistent with the intended state of information. Any unauthorized modification of data, whether deliberate or accidental, is a breach of data integrity. For example, data stored on disk are expected to be stable – they are not supposed to be changed at random by problems with the disk controllers. Similarly, application programs are supposed to record information correctly and not introduce deviations from the intended values.
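The hexad itself prescribes no mechanism, but integrity breaches of the kind described above are commonly detected by comparing cryptographic checksums. A minimal illustrative sketch in Python (the record and values are hypothetical):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest that fingerprints the data's exact contents."""
    return hashlib.sha256(data).hexdigest()

record = b"salary=52000"          # hypothetical stored record
baseline = fingerprint(record)    # digest recorded when the data was known good

# Any modification, deliberate or accidental, changes the digest.
tampered = b"salary=99000"
print(fingerprint(tampered) == baseline)  # False: integrity breach detected
```

A checksum only detects deviation from the intended state; preventing the modification in the first place requires separate access controls.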
Authenticity refers to correct labeling or attribution of information. For example, if a criminal forges e-mail headers to make it look as if an innocent person is sending threatening e-mail messages, there has been no breach of confidentiality (the thief uses his or her own e-mail account), possession (no information has been taken out of the control of the victim), or integrity (the e-mail messages are exactly as intended by the criminal). What is breached is authenticity: the e-mail is incorrectly attributed to someone else. Similarly, misusing a field in a database to store information that is incorrectly labeled is a breach of authenticity; e.g., storing a merchant's tax code in a field labeled as the merchant's ZIP code would violate the authenticity of the information.
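One common technical countermeasure for the attribution problem above is a message authentication code, which ties a message to a key held by the genuine sender. This is an illustrative sketch, not part of Parker's formulation; the key and messages are hypothetical:

```python
import hashlib
import hmac

SECRET = b"shared-key"  # hypothetical key known only to the genuine sender

def sign(message: bytes) -> str:
    """Tag a message so its origin can later be verified."""
    return hmac.new(SECRET, message, hashlib.sha256).hexdigest()

def is_authentic(message: bytes, tag: str) -> bool:
    """Check that the tag matches, i.e. the attribution is correct."""
    return hmac.compare_digest(sign(message), tag)

genuine = b"From: alice@example.com\nMeet at noon."
tag = sign(genuine)
forged = b"From: alice@example.com\nThreatening message."
print(is_authentic(genuine, tag), is_authentic(forged, tag))  # True False
```

A forged message carries no valid tag, so the false attribution is detectable even though confidentiality, possession, and integrity are all intact.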
Availability means having timely access to information. For example, a disk crash and a denial-of-service attack both cause a breach of availability. Any delay that exceeds the expected service levels for a system can be described as a breach of availability.
Utility means usefulness. For example, suppose someone encrypted data on disk to prevent unauthorized access or undetected modifications – and then lost the decryption key: that would be a breach of utility. The data would be confidential, controlled, integral, authentic, and available – they just wouldn’t be useful in that form. Similarly, conversion of salary data from one currency into an inappropriate currency would be a breach of utility, as would the storage of data in a format inappropriate for a specific computer architecture; e.g., EBCDIC instead of ASCII or 9-track magnetic tape instead of DVD-ROM. A tabular representation of data substituted for a graph could be described as a breach of utility if the substitution made it more difficult to interpret the data. Utility is often confused with availability because breaches such as those described in these examples may also require time to work around the change in data format or presentation. However, the concept of usefulness is distinct from that of availability.
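The EBCDIC-versus-ASCII example can be made concrete in a few lines of Python (cp500 is one of the standard EBCDIC code pages): the bytes remain intact and available, but they are useless to a reader applying the wrong convention — a utility problem, not an integrity or availability problem.

```python
text = "PAYROLL"
ebcdic = text.encode("cp500")   # cp500 is an EBCDIC code page

# Read with the right convention, the data is useful again.
print(ebcdic.decode("cp500"))   # PAYROLL

# Read as ASCII, the same intact bytes are gibberish: a breach of utility.
print(ebcdic.decode("ascii", errors="replace"))
```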
Further reading
- Donn B. Parker, “Toward a New Framework for Information Security,” in Seymour Bosworth and M. E. Kabay (eds.), The Computer Security Handbook, 4th ed. (New York, 2002)