Human reliability

Human reliability is related to the field of human factors engineering and refers to the reliability of humans in fields such as manufacturing, transportation, the military, and medicine. Human performance can be affected by many factors, including age, circadian rhythms, state of mind, physical health, attitude, emotions, and a propensity for certain common mistakes, errors, and cognitive biases.

Human reliability is very important because of the contributions humans make to the resilience of systems and the possible adverse consequences of human errors or oversights, especially when humans are crucial parts of the large socio-technical systems that are common today. User-centered design and error-tolerant design are just two of many terms used to describe efforts to make technology better suited to operation by humans.

Human Reliability Analysis Techniques

  • Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) (Gertman et al., 2005); an illustrative calculation is sketched after this list
  • Technique for Human Error Rate Prediction (THERP) (Swain and Guttman, 1983)
  • The HRA methodology described in Kirwan, 1994
  • HRA and the Contextual Control Model (Hollnagel, 1993)
  • Cognitive Reliability and Error Analysis Method (CREAM) (Hollnagel, 1998)
  • Work Safety Analysis (WSA) (Kirwan and Ainsworth, 1992)
  • Related techniques are also used in safety engineering and reliability engineering.
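
As a rough illustration of the kind of quantification several of these methods perform, the sketch below shows how a SPAR-H-style human error probability (HEP) might be adjusted by performance shaping factor (PSF) multipliers. It is a minimal sketch under stated assumptions: the nominal HEP, the multiplier values, and the function name are hypothetical and are not taken from Gertman et al. (2005), which defines the actual PSF categories, worksheets, and values.

    # Illustrative sketch (not code from NUREG/CR-6883): adjusting a nominal
    # human error probability (HEP) by a composite of performance shaping
    # factor (PSF) multipliers, in the spirit of SPAR-H.

    def adjusted_hep(nominal_hep, psf_multipliers):
        """Return a nominal HEP adjusted by PSF multipliers.

        The correction HEP = NHEP * PSFc / (NHEP * (PSFc - 1) + 1) keeps the
        result bounded by 1.0 when the composite multiplier PSFc grows large.
        """
        psf_composite = 1.0
        for multiplier in psf_multipliers:
            psf_composite *= multiplier
        return (nominal_hep * psf_composite) / (
            nominal_hep * (psf_composite - 1.0) + 1.0)

    # Hypothetical diagnosis task: nominal HEP of 0.01, degraded by high
    # stress (x2) and barely adequate time (x10); remaining PSFs nominal (x1).
    print(adjusted_hep(0.01, [2.0, 10.0, 1.0, 1.0]))  # ~0.17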

Human Error

Human error has been cited as a cause of, or contributing factor in, disasters and accidents in industries as diverse as nuclear power (e.g., the Three Mile Island accident), aviation (see Pilot error), space exploration (e.g., the Space Shuttle Challenger disaster), and medicine (see Medical error). It is also important to stress that "human error" mechanisms are the same as "human performance" mechanisms; performance is categorized as 'error' only in hindsight (Reason, 1990; Woods, 1990). Human error is thus part of the ordinary spectrum of behaviour. More recently, work on human error has been recast in terms of resilience, emphasizing the positive contributions that humans make to the operation of technical systems (see Hollnagel, Woods and Leveson, 2006).

There are many ways to categorize human error (see Jones, 1999).

  • exogenous versus endogenous (i.e., originating outside versus inside the individual) (Senders and Moray, 1991)
  • situation assessment versus response planning (e.g., Roth et al., 1994) and related distinctions in:
    • errors in problem detection (see also signal detection theory; an illustrative sketch follows this list)
    • errors in problem diagnosis (see also problem solving)
    • errors in action planning and execution (Sage, 1992) (for example: slips or errors of execution versus mistakes or errors of intention; see Norman, 1988; Reason, 1990)
  • by level of analysis; for example, perceptual (e.g., optical illusions) versus cognitive versus communication versus organizational
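
As one concrete illustration of the signal detection view of problem-detection errors mentioned above, the sketch below treats misses and false alarms as two distinct kinds of detection error and summarizes raw detection counts as sensitivity (d') and response bias (the criterion c). The counts and the function name are hypothetical and are not drawn from any of the cited studies.

    # Illustrative sketch: a signal detection theory summary of problem-
    # detection performance.  Misses and false alarms are two distinct kinds
    # of detection error; the counts used below are hypothetical.

    from statistics import NormalDist

    def sensitivity_and_bias(hits, misses, false_alarms, correct_rejections):
        """Return (d', criterion c) computed from raw detection counts.

        Hit or false-alarm rates of exactly 0 or 1 would need a correction
        (e.g., adding 0.5 to every cell) before taking the inverse normal.
        """
        z = NormalDist().inv_cdf
        hit_rate = hits / (hits + misses)
        fa_rate = false_alarms / (false_alarms + correct_rejections)
        d_prime = z(hit_rate) - z(fa_rate)              # sensitivity
        criterion = -0.5 * (z(hit_rate) + z(fa_rate))   # response bias
        return d_prime, criterion

    # Hypothetical operator monitoring an alarm panel over 200 trials.
    print(sensitivity_and_bias(hits=80, misses=20,
                               false_alarms=10, correct_rejections=90))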

The cognitive study of human error is a very active research field, including work related to limits of memory and attention and also to decision making strategies such as the availability heuristic and other biases in judgment. Such heuristics and biases are strategies that are useful and often correct, but can lead to systematic patterns of error.

Misunderstandings as a topic in human communication have been studied in conversation analysis, for example through the examination of violations of the cooperative principle and Gricean maxims.

Organizational studies of error or dysfunction have included studies of safety culture. One technique for organizational analysis is the Management Oversight and Risk Tree (MORT) (Kirwan and Ainsworth, 1992; see also MORT on the FAA Human Factors Workbench).

References

  • Gertman, D. L. and Blackman, H. S. (2001). Human reliability and safety analysis data handbook. Wiley. 
  • Gertman, D., Blackman, H., Marble, J., Byers, J. and Smith, C. (2005). The SPAR-H human reliability analysis method. NUREG/CR-6883. Idaho National Laboratory, prepared for the U.S. Nuclear Regulatory Commission.
  • Hollnagel, E. (1993). Human reliability analysis: Context and control. Academic Press. 
  • Hollnagel, E. (1998). Cognitive reliability and error analysis method: CREAM. Elsevier. 
  • Hollnagel, E., Woods, D. D., and Leveson, N. (Eds.) (2006). Resilience engineering: Concepts and precepts. Ashgate. 
  • Jones, P. M. (1999). Human error and its amelioration. In Handbook of Systems Engineering and Management (A. P. Sage and W. B. Rouse, eds.), 687-702. Wiley. 
  • Kirwan, B. (1994). A practical guide to human reliability assessment. Taylor & Francis. 
  • Kirwan, B. and Ainsworth, L. (Eds.) (1992). A guide to task analysis. Taylor & Francis. 
  • Norman, D. (1988). The psychology of everyday things. Basic Books. 
  • Reason, J. (1990). Human error. Cambridge University Press. 
  • Roth, E., et al. (1994). An empirical investigation of operator performance in cognitively demanding simulated emergencies. NUREG/CR-6208. Westinghouse Science and Technology Center, prepared for the U.S. Nuclear Regulatory Commission.
  • Sage, A. P. (1992). Systems engineering. Wiley. 
  • Senders, J. and Moray, N. (1991). Human error: Cause, prediction, and reduction. Lawrence Erlbaum Associates. 
  • Swain, A. D. and Guttman, H. E. (1983). Handbook of human reliability analysis with emphasis on nuclear power plant applications. NUREG/CR-1278 (Washington, D.C.).
  • Woods, D. D. (1990). Modeling and predicting human error. In J. Elkind, S. Card, J. Hochberg, and B. Huey (Eds.), Human performance models for computer-aided engineering (248-274). Academic Press. 

Further reading

  • Dismukes, R. K., Berman, B. A., and Loukopoulos, L. D. (2007). The limits of expertise: Rethinking pilot error and the causes of airline accidents. Ashgate. 
  • Goodstein, L. P., Andersen, H. B., and Olsen, S. E. (Eds.) (1988). Tasks, errors, and mental models. Taylor and Francis. 
  • Grabowski, M. and Roberts, K. H. (1996). Human and organizational error in large scale systems. IEEE Transactions on Systems, Man, and Cybernetics, Volume 26, No. 1, January 1996, 2-16. 
  • Greenbaum, J. and Kyng, M. (Eds.) (1991). Design at work: Cooperative design of computer systems. Lawrence Erlbaum Associates. 
  • Hollnagel, E. (1991). The phenotype of erroneous actions: Implications for HCI design. In G. W. R. Weir and J. L. Alty (Eds.), Human-computer interaction and complex systems. Academic Press. 
  • Hutchins, E. (1995). Cognition in the wild. MIT Press. 
  • Kahneman, D., Slovic, P. and Tversky, A. (Eds.) (1982). Judgment under uncertainty: Heuristics and biases. Cambridge University Press. 
  • Leveson, N. (1995). Safeware: System safety and computers. Addison-Wesley. 
  • Morgan, G. (1986). Images of organization. Sage. 
  • Mura, S. S. (1983). Licensing violations: Legitimate violations of Grice's conversational principle. In R. Craig and K. Tracy (Eds.), Conversational coherence: Form, structure, and strategy (101-115). Sage. 
  • Perrow, C. (1984). Normal accidents: Living with high-risk technologies. Basic Books. 
  • Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257-267. 
  • Rasmussen, J. (1986). Information processing and human-machine interaction: An approach to cognitive engineering. Wiley. 
  • Silverman, B. (1992). Critiquing human error: A knowledge-based human-computer collaboration approach. Academic Press. 
  • Swets, J. (1996). Signal detection theory and ROC analysis in psychology and diagnostics: Collected papers. Lawrence Erlbaum Associates. 
  • Tversky, A. and Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131. 
  • Vaughan, D. (1996). The Challenger launch decision: Risky technology, culture, and deviance at NASA. University of Chicago Press. 
  • Woods, D. D., Johannesen, L., Cook, R., and Sarter, N. (1994). Behind human error: Cognitive systems, computers, and hindsight. CSERIAC SOAR Report 94-01. Crew Systems Ergonomics Information Analysis Center, Wright-Patterson Air Force Base, Ohio. 
