Human reliability

Human reliability is related to the field of human factors engineering and refers to the reliability of humans in fields such as manufacturing, transportation, the military, and medicine. Human performance can be affected by many factors, including age, circadian rhythms, state of mind, physical health, attitude, emotions, and propensity for certain common mistakes, errors, and cognitive biases.

Human reliability is important because of the possible adverse consequences of human errors or oversights, especially when the human is a crucial part of a large socio-technical system, as is common today. User-centered design and error-tolerant design are just two of many terms used to describe efforts to make technology better suited to operation by humans.


Human error: the adverse outcome of an action, or of a failure to act.
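For illustration, quantitative human reliability assessment commonly expresses reliability through a human error probability (HEP), estimated as the number of observed errors divided by the number of opportunities for error. The following is a minimal sketch of that calculation, assuming this standard HEP definition; the function and variable names are illustrative, not drawn from this article.

    # Hypothetical sketch of the human error probability (HEP) measure
    # used in quantitative human reliability assessment.
    # HEP = errors observed / opportunities for error.

    def human_error_probability(errors_observed: int, opportunities: int) -> float:
        """Estimate HEP as the ratio of observed errors to opportunities."""
        if opportunities <= 0:
            raise ValueError("opportunities must be positive")
        return errors_observed / opportunities

    # Example: 3 slips observed in 1,000 repetitions of a routine task.
    hep = human_error_probability(3, 1000)
    reliability = 1 - hep  # probability the task is performed without error
    print(f"HEP = {hep:.3f}, human reliability = {reliability:.3f}")

In this framing, "human reliability" for a task is simply the complement of its error probability, which is why reducing the frequency of errors directly increases reliability.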

Human error is part of the ordinary spectrum of behaviour and can result from normal brain function. A useful distinction can be made between errors of intention and errors of execution. Whereas mistaken intentions can be made less likely by increasing competence, trying to be more attentive, and other conscious strategies, this approach is generally less successful for slips and lapses of execution, which arise in familiar activities that are usually performed almost automatically.

Human errors are most often described by their consequences, such as 'spilt milk' or 'train crash', although these labels may give little clue as to their causes. Indeed, the severity of the consequences is related to the circumstances of occurrence rather than to the root cause: identical errors can result in disaster or in a trivial outcome, depending on when and where they happen.

Although human errors result from vulnerable aspects of the brain's normal functioning, most of the time we manage well enough to avoid them by acquiring and developing natural defensive strategies ('look before you leap'). However, when adverse influences build up, such as fatigue, competing distractions, or stressors, the likelihood of error increases.

Individuals are often blamed for their errors, even when those errors arose from adverse influences beyond their direct control. The social and organisational consequences of such blame tend to make people defensive, and this in turn can make recurrence harder to avoid by cutting off frank, open sharing of information.

From a developmental perspective, human beings are virtually unchanged from our ancestors, who lived much simpler lives in stable, familiar surroundings. However, our society is changing at an accelerating rate that increasingly creates new opportunities for error. If the benefits of change and innovation are not to be undermined by a cumulative increase in everyday errors, developing an understanding of this issue needs to take its place among the challenging priorities that society chooses to address. Failure to do so would indeed be a mistake.
