Human-computer interaction

From Wikipedia, the free encyclopedia

Human–computer interaction (HCI) is the study of interaction between people (users) and computers. It is often regarded as the intersection of computer science, the behavioral sciences, design, and several other fields of study. Interaction between users and computers occurs at the user interface (or simply interface), which includes both software and hardware; examples range from general-purpose computer peripherals to large-scale mechanical systems such as aircraft and power plants. The following definition is given by the Association for Computing Machinery:[1]

"Human-computer interaction is a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them."

Because human-computer interaction studies a human and a machine in conjunction, it draws from supporting knowledge on both the machine and the human side. On the machine side, techniques in computer graphics, operating systems, programming languages, and development environments are relevant. On the human side, communication theory, graphic and industrial design disciplines, linguistics, social sciences, cognitive psychology, and human performance are relevant. Engineering and design methods are also relevant.
HCI is also sometimes referred to as man–machine interaction (MMI) or computer–human interaction (CHI).

Goals

A basic goal of HCI is to improve the interactions between users and computers by making computers more usable and receptive to the user's needs. Specifically, HCI is concerned with:

  • methodologies and processes for designing interfaces (i.e., given a task and a class of users, design the best possible interface within given constraints, optimizing for a desired property such as learnability or efficiency of use)
  • methods for implementing interfaces (e.g. software toolkits and libraries; efficient algorithms)
  • techniques for evaluating and comparing interfaces
  • developing new interfaces and interaction techniques
  • developing descriptive and predictive models and theories of interaction

A long term goal of HCI is to design systems that minimize the barrier between the human's cognitive model of what they want to accomplish and the computer's understanding of the user's task.
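
One of the best-known predictive models of interaction is Fitts' law, which estimates the time needed to point at a target from its distance and width. The sketch below (in Python) is only an illustration of the formula; the constants a and b are hypothetical placeholder values that would normally be fitted to measured data for a particular device and user population.

    import math

    def fitts_movement_time(distance, width, a=0.1, b=0.15):
        """Predict pointing time in seconds with Fitts' law: MT = a + b * log2(2D/W).

        distance: distance to the target's center
        width: target width along the axis of motion
        a, b: empirically fitted constants (hypothetical placeholder values here)
        """
        index_of_difficulty = math.log2(2 * distance / width)  # in bits
        return a + b * index_of_difficulty

    # A large, nearby button is predicted to be faster to hit than a small, distant one.
    print(fitts_movement_time(distance=100, width=50))  # low index of difficulty
    print(fitts_movement_time(distance=800, width=10))  # high index of difficulty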

Professional practitioners in HCI are usually designers concerned with the practical application of design methodologies to real-world problems. Their work often revolves around designing graphical user interfaces and web interfaces.

Researchers in HCI are interested in developing new design methodologies, experimenting with new hardware devices, prototyping new software systems, exploring new paradigms for interaction, and developing models and theories of interaction.

Differences with related fields

HCI differs from human factors in that it focuses more on users working with computers rather than with other kinds of machines or designed artifacts, and it places an additional focus on how to implement the (software and hardware) mechanisms behind computers that support human-computer interaction. HCI also differs from ergonomics in that it places less focus on repetitive work-oriented tasks and procedures, and much less emphasis on physical stress and on the physical form or industrial design of the user interface, such as the shape of keyboards and mice. More discussion of the nuances between these fields can be found at [2].

Design Principles

When evaluating a current user interface, or designing a new user interface, it is important to keep in mind the following experimental design principles:

  • Early focus on user(s) and task(s): Establish how many users are needed to perform the task(s) and determine who the appropriate users should be; someone who has never used the interface, and will never use it in the future, is most likely not a valid user. In addition, define the task(s) the users will be performing and how often the task(s) need to be performed.
  • Empirical measurement: Test the interface early on with real users who will come in contact with the interface on an everyday basis. Keep in mind that results may be skewed if the participants' performance is not representative of real human-computer interaction. Establish quantitative usability specifics such as the number of users performing the task(s), the time to complete the task(s), and the number of errors made during the task(s).
  • Iterative design: After determining the users, tasks, and empirical measurements to include, perform the following iterative design steps:
  1. Design the user interface
  2. Test
  3. Analyze results
  4. Repeat

Repeat the iterative design process until a sensible, user-friendly interface is created.[2]
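
As a rough illustration of how empirical measurement and iterative design fit together, the sketch below (in Python) loops through test, analysis, and redesign until simple quantitative targets are met. The functions run_usability_test, meets_targets, and redesign are hypothetical stand-ins for real activities (observing users, comparing results against the usability specifics, revising the interface); they are not part of any standard toolkit.

    def run_usability_test(interface, users):
        """Stand-in for an empirical test session: returns task time (s) and error count per user."""
        return [{"user": u, "task_time": 42.0, "errors": 1} for u in users]

    def meets_targets(results, max_time=60.0, max_errors=2):
        """Analyze results against quantitative usability targets for time and errors."""
        return all(r["task_time"] <= max_time and r["errors"] <= max_errors for r in results)

    def redesign(interface, results):
        """Stand-in for revising the interface based on the analyzed results."""
        return interface + 1  # e.g. the next design revision

    interface, users = 1, ["user A", "user B", "user C"]
    for iteration in range(10):                   # bound the loop; real projects stop on time or budget
        results = run_usability_test(interface, users)
        if meets_targets(results):                # analyze results
            break                                 # a sensible, user-friendly interface has been reached
        interface = redesign(interface, results)  # repeat with a revised design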

Design Methodologies

A number of diverse methodologies outlining techniques for human–computer interaction design have emerged since the rise of the field in the 1980s. Most design methodologies stem from a model for how users, designers, and technical systems interact. Early methodologies, for example, treated users' cognitive processes as predictable and quantifiable and encouraged design practitioners to look to cognitive science results in areas such as memory and attention when designing user interfaces. Modern models tend to focus on a constant feedback and conversation between users, designers, and engineers and push for technical systems to be wrapped around the types of experiences users want to have, rather than wrapping user experience around a completed system.

  • User-centered design: user-centered design (UCD) is a modern, widely practiced design philosophy rooted in the idea that users must take center-stage in the design of any computer system. Users, designers and technical practitioners work together to articulate the wants, needs and limitations of the user and create a system that addresses these elements. Often, user-centered design projects are informed by ethnographic studies of the environments in which users will be interacting with the system.
  • Principles of User Interface Design: these are seven principles that may be considered at any time during the design of a user interface in any order, namely Tolerance, Simplicity, Visibility, Affordance, Consistency, Structure and Feedback.[3]

Display Design

Displays are human-made artifacts designed to support the perception of relevant system variables and to facilitate further processing of that information. Before a display is designed, the task that the display is intended to support must be defined (e.g. navigating, controlling, decision making, learning, entertaining, etc.). A user or operator must be able to process whatever information a system generates and displays; therefore, the information must be displayed according to principles in a manner that will support perception, situation awareness, and understanding.

Thirteen Principles of Display Design[4]

These principles of human perception and information processing can be utilized to create an effective display design. A reduction in errors, a reduction in required training time, an increase in efficiency, and an increase in user satisfaction are a few of the many potential benefits that can be achieved through utilization of these principles.

Certain principles may not be applicable to every display or situation. Some principles may also appear to conflict, and there is no simple rule for deciding which principle is more important than another. The principles may be tailored to a specific design or situation. Striking a functional balance among the principles is critical for an effective design.[5]

Perceptual Principles

1. Make displays legible (or audible)

A display’s legibility is critical and necessary for designing a usable display. If the characters or objects being displayed are not discernible, then the operator cannot make effective use of them.

2. Avoid absolute judgment limits

Do not ask the user to judge the level of a variable on the basis of a single sensory variable (e.g. color, size, loudness). Although such sensory variables can take on many possible levels, people can reliably distinguish only a small number of them by absolute judgment.

3. Top-down processing

Signals are likely to be perceived and interpreted in accordance with what is expected based on a user’s past experience. If a signal is presented contrary to the user’s expectation, more physical evidence of that signal may need to be presented to ensure that it is understood correctly.

4. Redundancy gain

If a signal is presented more than once, it is more likely that it will be understood correctly. This can be done by presenting the signal in alternative physical forms (e.g. color and shape, voice and print, etc.), as redundancy does not imply repetition. A traffic light is a good example of redundancy, as color and position are redundant.

5. Similarity causes confusion: Use discriminable elements

Signals that appear similar are likely to be confused. The degree of similarity is determined by the ratio of similar features to different features: for example, A423B9 is more similar to A423B8 than 92 is to 93. Unnecessarily similar features should be removed and distinguishing features should be highlighted.

Mental Model Principles

6. Principle of pictorial realism

A display should look like the variable that it represents (e.g. high temperature on a thermometer shown as a higher vertical level). If there are multiple elements, they can be configured in a manner that looks like they would be arranged in the environment being represented.

7. Principle of the moving part

Moving elements should move in a pattern and direction compatible with the user’s mental model of how they actually move in the system. For example, the moving element on an altimeter should move upward with increasing altitude.

Principles Based on Attention

8. Minimizing information access cost

When the user’s attention is averted from one location to another to access necessary information, there is an associated cost in time or effort. A display design should minimize this cost by allowing for frequently accessed sources to be located at the nearest possible position. However, adequate legibility should not be sacrificed to reduce this cost.

9. Proximity compatibility principle

Divided attention between two information sources may be necessary for the completion of one task. These sources must be mentally integrated and are defined to have close mental proximity. Information access costs should be low, which can be achieved in many ways (e.g. close proximity, linkage by common colors, patterns, shapes, etc.). However, close display proximity can be harmful by causing too much clutter.

10. Principle of multiple resources

A user can more easily process information across different resources. For example, visual and auditory information can be presented simultaneously rather than presenting all visual or all auditory information.

Memory Principles

11. Replace memory with visual information: knowledge in the world

A user should not need to retain important information solely in working memory or to retrieve it from long-term memory. A menu, checklist, or another display can aid the user by easing the demands on memory. However, relying on memory may sometimes benefit the user more than requiring reference to some type of knowledge in the world (e.g. an expert computer operator may prefer direct commands recalled from memory over looking them up in a manual). The use of knowledge in a user’s head and knowledge in the world must be balanced for an effective design.

12. Principle of predictive aiding

Proactive actions are usually more effective than reactive actions. A display should attempt to eliminate resource-demanding cognitive tasks and replace them with simpler perceptual tasks to reduce the use of the user’s mental resources. This allows the user not only to focus on current conditions, but also to think about possible future conditions. An example of a predictive aid is a road sign displaying the distance to a certain destination.

13. Principle of consistency

Old habits from other displays will easily transfer to support processing of new displays if they are designed in a consistent manner. A user’s long-term memory will trigger actions that are expected to be appropriate. A design must accept this fact and utilize consistency among different displays.

Future Developments in HCI[1]

The means by which humans interact with computers continues to evolve rapidly. Human-computer interaction is affected by the forces shaping the nature of future computing. These forces include:


  • Decreasing hardware costs leading to larger memories and faster systems
  • Miniaturization of hardware leading to portability
  • Reduction in power requirements leading to portability
  • New display technologies leading to the packaging of computational devices in new forms
  • Specialized hardware leading to new functions
  • Increased development of network communication and distributed computing
  • Increasingly widespread use of computers, especially by people who are outside of the computing profession
  • Increasing innovation in input techniques (e.g., voice, gesture, pen), combined with lowering costs, leading to rapid computerization by people previously left out of the "computer revolution"
  • Wider social concerns leading to improved access to computers by currently disadvantaged groups

The future for HCI is expected to include the following characteristics:

Ubiquitous communication: Computers will communicate through high speed local networks, nationally over wide-area networks, and portably via infrared, ultrasonic, cellular, and other technologies. Data and computational services will be portably accessible from many if not most locations to which a user travels.

High functionality systems: Systems will have large numbers of functions associated with them. There will be so many systems that most users, technical or non-technical, will not have time to learn them in the traditional way (e.g., through thick manuals).

Mass availability of computer graphics: Computer graphics capabilities such as image processing, graphics transformations, rendering, and interactive animation will become widespread as inexpensive chips become available for inclusion in general workstations.

Mixed media: Systems will handle images, voice, sounds, video, text, and formatted data. These will be exchangeable over communication links among users. The separate worlds of consumer electronics (e.g., stereo sets, VCRs, televisions) and computers will partially merge. Computer and print worlds will continue to cross-assimilate each other.

High-bandwidth interaction: The rate at which humans and machines interact will increase substantially due to the changes in speed, computer graphics, new media, and new input/output devices. This will lead to some qualitatively different interfaces, such as virtual reality or computational video.

Large and thin displays: New display technologies will finally mature, enabling very large displays and also displays that are thin, lightweight, and have low power consumption. This will have large effects on portability and will enable the development of paper-like, pen-based computer interaction systems very different in feel from desktop workstations of the present.

Embedded computation: Computation will pass beyond desktop computers into every object for which uses can be found. The environment will be alive with little computations from computerized cooking appliances to lighting and plumbing fixtures to window blinds to automobile braking systems to greeting cards. To some extent, this development is already taking place. The difference in the future is the addition of networked communications that will allow many of these embedded computations to coordinate with each other and with the user. Human interfaces to these embedded devices will in many cases be very different from those appropriate to workstations.

Group interfaces: Interfaces to allow groups of people to coordinate will be common (e.g., for meetings, for engineering projects, for authoring joint documents). These will have major impacts on the nature of organizations and on the division of labor. Models of the group design process will be embedded in systems and will cause increased rationalization of design.

User Tailorability: Ordinary users will routinely tailor applications to their own use and will use this power to invent new applications based on their understanding of their own domains. Users, with their deeper knowledge of their own knowledge domains, will increasingly be important sources of new applications at the expense of generic systems programmers (with systems expertise but low domain expertise).

Information Utilities: Public information utilities (such as home banking and shopping) and specialized industry services (e.g., weather for pilots) will continue to proliferate. The rate of proliferation will accelerate with the introduction of high-bandwidth interaction and the improvement in quality of interfaces.

Some Notes on Terminology

  • HCI vs MMI. MMI has been used to refer to any man–machine interaction, including, but not limited to, interaction with computers. The term was used early on in control room design for anything operated on or observed by an operator, e.g. dials, switches, knobs and gauges.
  • HCI vs CHI. The acronym CHI (pronounced kai), for computer–human interaction, has been used to refer to this field, perhaps more frequently in the past than now. However, researchers and practitioners now refer to their field of study as HCI (pronounced as an initialism), which perhaps rose in popularity partly because of the notion that the human, and the human's needs and time, should be considered first, and are more important than the machine's. This notion became increasingly relevant towards the end of the 20th century as computers became increasingly inexpensive (as did CPU time), small, and powerful. Since the turn of the millennium, the field of human-centered computing has emerged with an even more pronounced focus on understanding human beings as actors within socio–technical systems.
  • Usability vs Usefulness. Design methodologies in HCI aim to create user interfaces that are usable, i.e. that can be operated with ease and efficiency. However, an even more basic requirement is that the user interface be useful, i.e. that it allows the user to complete relevant tasks.
  • Intuitive and Natural. Software products are often touted by marketers as being "intuitive" and "natural" to use, often simply because they have a graphical user interface. Many researchers in HCI view such claims as unfounded (e.g. a poorly designed GUI may be very unusable), and some object to the use of the words intuitive and natural as vague and/or misleading, since these are very context-dependent terms. See [6] for more discussion.

Human Computer Interface

The human–computer interface can be described as the point of communication between the human user and the computer. The flow of information between the human and the computer is defined as the loop of interaction. The loop of interaction has several aspects to it, including the following (a minimal sketch of the loop appears after the list):

  • Task Environment: The conditions and goals set upon the user.
  • Machine Environment: The environment to which the computer is connected, e.g. a laptop in a college student's dorm room.
  • Areas of the Interface: Non-overlapping areas involve processes of the human and computer not pertaining to their interaction, while the overlapping areas concern only the processes pertaining to their interaction.
  • Input Flow: Begins in the task environment as the user has some task that requires using their computer.
  • Output: The flow of information that originates in the machine environment.
  • Feedback: Loops through the interface that evaluate, moderate, and confirm processes as they pass from the human through the interface to the computer and back.
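
The sketch below (in Python) illustrates this loop of interaction in its simplest form: input flows from the user's task, the machine environment computes a result, and the output is presented back to the user, closing the feedback loop. The function names are hypothetical and do not come from any real interface toolkit.

    def read_user_input():
        """Input flow: the user acts on the interface to advance a task."""
        return input("command> ")

    def machine_process(command):
        """Machine environment: compute a response to the user's input."""
        return f"result of '{command}'"

    def present_output(result):
        """Output flow: information originating in the machine environment is displayed."""
        print(result)

    while True:
        command = read_user_input()
        if command == "quit":           # the task environment sets the goal; the user decides to stop
            break
        result = machine_process(command)
        present_output(result)          # feedback closes the loop back to the user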

Academic conferences

One of the top academic conferences for new research in human-computer interaction, especially within computer science, is the ACM's annual Conference on Human Factors in Computing Systems, usually referred to by its short name CHI (pronounced kai, or khai). CHI is organized by ACM SIGCHI, the ACM Special Interest Group on Computer–Human Interaction. CHI is a large, highly competitive conference with thousands of attendees, and it is quite broad in scope.

There are also dozens of other smaller, regional or specialized HCI-related conferences held around the world each year, the most important of which include:

Special Purpose

  • UIST: ACM Symposium on User Interface Software and Technology.
  • CSCW: ACM conference on Computer Supported Cooperative Work.
  • ECSCW: European Conference on Computer-Supported Cooperative Work. Alternates yearly with CSCW.
  • ICMI: International Conference on Multimodal Interfaces.
  • MobileHCI: International Conference on Human-Computer Interaction with Mobile Devices and Services.
  • DIS: ACM conference on Designing Interactive Systems.
  • NIME: International Conference on New Interfaces for Musical Expression.
  • HRI: ACM/IEEE International Conference on Human-Robot Interaction.
  • IUI: International Conference on Intelligent User Interfaces.

Regional and General HCI

  • INTERACT: IFIP TC13 International Conference on Human-Computer Interaction. Biennial, alternating years with AVI.
  • AVI: International Working Conference on Advanced Visual Interfaces. Held biennially in Italy, alternating years with INTERACT.
  • HCI International: International Conference on Human-Computer Interaction.
  • HCI: British HCI Conference.
  • OZCHI: Australasian HCI Conference.
  • IHM: Annual French-speaking HCI Conference.
  • Graphics Interface: Annual Canadian computer graphics and HCI conference. The oldest regularly scheduled conference for graphics and human-computer interaction.
  • NordiCHI: Nordic Conference on Human-Computer Interaction. Biennial.

Footnotes

  1. ^ a b ACM SIGCHI Curricula for Human-Computer Interaction
  2. ^ Green, Paul (2008). Iterative Design. Lecture presented in Industrial and Operations Engineering 436 (Human Factors in Computer Systems), University of Michigan, Ann Arbor, MI, February 4, 2008.
  3. ^ Pattern Language
  4. ^ Wickens, Christopher D., John D. Lee, Yili Liu, and Sallie E. Gordon Becker. An Introduction to Human Factors Engineering. 2nd ed. Upper Saddle River, NJ: Pearson Prentice Hall, 2004. pp. 185–193.
  5. ^ Brown, C. Marlin. Human-Computer Interface Design Guidelines. Intellect Books, 1998. pp. 2–3.
  6. ^ Raskin, Jef. "Intuitive Equals Familiar". Communications of the ACM 37, no. 9 (September 1994): 17–18.
