User interface

From Wikipedia, the free encyclopedia

In the industrial design field of human-machine interaction, the user interface is the space where interaction between humans and machines occurs. The goal of this interaction is effective operation and control of the machine, and feedback from the machine which aids the operator in making operational decisions. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls, and process controls. The design considerations applicable when creating user interfaces are related to, or involve, such disciplines as ergonomics and psychology.

A user interface is the system by which people (users) interact with a machine. User interfaces exist for various systems, and provide a means of:

  • Input, allowing the users to manipulate a system, and/or
  • Output, allowing the system to indicate the effects of the users' manipulation.
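These two roles can be sketched in code. The following is a minimal illustration, not any real API; the `Thermostat` class and `handle_command` function are invented for this example:

```python
class Thermostat:
    """The underlying machine: it holds state the user wants to control."""
    def __init__(self):
        self.target = 20

def handle_command(machine, command):
    """Input: the user manipulates the system through a command string.
    Output: the returned text indicates the effect of that manipulation."""
    if command.startswith("set "):
        machine.target = int(command.split()[1])
        return f"Target temperature set to {machine.target} C"
    elif command == "status":
        return f"Target temperature is {machine.target} C"
    return "Unknown command"

thermostat = Thermostat()
print(handle_command(thermostat, "set 22"))   # input changes the system
print(handle_command(thermostat, "status"))   # output reflects the change
```

The interface layer (`handle_command`) is deliberately separate from the machine itself, mirroring the "layer between human and machine" idea developed later in the article.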

Generally, the goal of human-machine interaction engineering is to produce a user interface which makes it easy, efficient, and enjoyable to operate a machine in the way that produces the desired result. This generally means that the operator needs to provide minimal input to achieve the desired output, and also that the machine minimizes undesired outputs to the human.

With the increased use of personal computers and the relative decline in societal awareness of heavy machinery, the term user interface has taken on overtones of the (graphical) user interface, while industrial control panel and machinery control design discussions more commonly refer to human-machine interfaces.

Other terms for user interface include human-computer interface (HCI) and man-machine interface (MMI).



To work with a system, users have to be able to control and assess the state of the system. For example, when driving an automobile, the driver uses the steering wheel to control the direction of the vehicle, and the accelerator pedal, brake pedal and gearstick to control the speed of the vehicle. The driver perceives the position of the vehicle by looking through the windshield and exact speed of the vehicle by reading the speedometer. The user interface of the automobile is on the whole composed of the instruments the driver can use to accomplish the tasks of driving and maintaining the automobile.


There is a distinct difference between a user interface and an operator interface or human-machine interface (HMI).

  • The term user interface is often used in the context of (personal) computer systems and electronic devices, where a network of equipment or computers is interlinked through an MES (Manufacturing Execution System) or host.
    • An HMI is typically local to one machine or piece of equipment, and is the interface method between the human and the equipment/machine. An operator interface is the interface method by which multiple pieces of equipment that are linked by a host control system are accessed or controlled.
    • The system may expose several user interfaces to serve different kinds of users. For example, a computerized library database might provide two user interfaces, one for library patrons (limited set of functions, optimized for ease of use) and the other for library personnel (wide set of functions, optimized for efficiency).
  • The user interface of a mechanical system, a vehicle or an industrial installation is sometimes referred to as the human-machine interface (HMI). HMI is a modification of the original term MMI (man-machine interface). In practice, the abbreviation MMI is still frequently used, although some may claim that MMI stands for something different now. Another abbreviation is HCI, but this more commonly refers to human-computer interaction than to human-computer interface. Other terms used are operator interface console (OIC) and operator interface terminal (OIT). However it is abbreviated, the terms refer to the 'layer' that separates a human who is operating a machine from the machine itself.
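The library-database example above, one system exposing different interfaces to different kinds of users, can be sketched as follows. The class names are hypothetical, chosen only to mirror the example:

```python
class LibraryCatalog:
    """The shared underlying system that both interfaces operate on."""
    def __init__(self):
        self.books = {}

    def find(self, isbn):
        return self.books.get(isbn)

    def add(self, isbn, title):
        self.books[isbn] = title

class PatronInterface:
    """Limited set of functions, optimized for ease of use."""
    def __init__(self, catalog):
        self._catalog = catalog

    def search(self, isbn):
        title = self._catalog.find(isbn)
        return title if title else "Not found - ask a librarian."

class StaffInterface(PatronInterface):
    """Wider set of functions, optimized for efficiency."""
    def add_book(self, isbn, title):
        self._catalog.add(isbn, title)

catalog = LibraryCatalog()
patron, staff = PatronInterface(catalog), StaffInterface(catalog)
staff.add_book("978-1", "New Arrival")
print(patron.search("978-1"))  # both interfaces see the same system
```

The patron-facing object simply lacks the staff functions, so the reduced function set is enforced by the interface itself rather than by the underlying system.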

In science fiction, HMI is sometimes used to refer to what is better described as direct neural interface. However, this latter usage is seeing increasing application in the real-life use of (medical) prostheses—the artificial extension that replaces a missing body part (e.g., cochlear implants).

In some circumstance computers might observe the user, and react according to their actions without specific commands. A means of tracking parts of the body is required, and sensors noting the position of the head, direction of gaze and so on have been used experimentally. This is particularly relevant to immersive interfaces.


User interfaces are considered by some authors to be a prime ingredient of computer user satisfaction.

The design of a user interface affects the amount of effort the user must expend to provide input for the system and to interpret the output of the system, and how much effort it takes to learn how to do this. Usability is the degree to which the design of a particular user interface takes into account the human psychology and physiology of the users, and makes the process of using the system effective, efficient and satisfying.

Usability is mainly a characteristic of the user interface, but is also associated with the functionalities of the product and the process to design it. It describes how well a product can be used for its intended purpose by its target users with efficiency, effectiveness, and satisfaction, also taking into account the requirements from its context of use.

See also: mental model, human action cycle, usability testing, ergonomics, and the list of human-computer interaction topics.

User interfaces in computing

In computer science and human-computer interaction, the user interface (of a computer program) refers to the graphical, textual and auditory information the program presents to the user, and the control sequences (such as keystrokes with the computer keyboard, movements of the computer mouse, and selections with the touchscreen) the user employs to control the program.



Currently (as of 2009) the following types of user interface are the most common:

  • Graphical user interfaces (GUI) accept input via devices such as computer keyboard and mouse and provide articulated graphical output on the computer monitor. There are at least two different principles widely used in GUI design: Object-oriented user interfaces (OOUIs) and application oriented interfaces.
  • Web-based user interfaces or web user interfaces (WUI) accept input and provide output by generating web pages which are transmitted via the Internet and viewed by the user using a web browser program. Newer implementations utilize Java, AJAX, Adobe Flex, Microsoft .NET, or similar technologies to provide real-time control in a separate program, eliminating the need to refresh a traditional HTML-based web browser. Administrative web interfaces for web servers, servers, and networked computers are often called control panels.

User interfaces that are common in various fields outside desktop computing:

  • Command line interfaces, where the user provides the input by typing a command string with the computer keyboard and the system provides output by printing text on the computer monitor. Used by programmers and system administrators, in engineering and scientific environments, and by technically advanced personal computer users.
  • Tactile interfaces supplement or replace other forms of output with haptic feedback methods. Used in computerized simulators etc.
  • Touch user interfaces are graphical user interfaces that use a touchscreen display as a combined input and output device. Used in many types of point-of-sale systems, industrial processes and machines, self-service machines, etc.
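The command-line style can be sketched in a few lines: typed command strings in, printed text out. The commands below are invented for illustration, not drawn from any real shell:

```python
def run_command(line):
    """Parse one typed command string and return the text to print."""
    parts = line.split()
    if not parts:
        return ""
    cmd, args = parts[0], parts[1:]
    if cmd == "echo":
        return " ".join(args)      # print the arguments back
    if cmd == "add":
        return str(sum(int(a) for a in args))  # sum numeric arguments
    return f"{cmd}: command not found"

# A read-eval-print loop would call run_command on each line the user types:
for line in ["echo hello world", "add 2 3 4", "quit"]:
    print(run_command(line))
```

A real command-line interface wraps this same parse/execute/print cycle in a loop reading from the keyboard, which is what makes it efficient for the technically advanced users mentioned above.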

Other types of user interfaces:

  • Attentive user interfaces manage the user's attention, deciding when to interrupt the user, the kind of warnings to give, and the level of detail of the messages presented to the user.
  • Batch interfaces are non-interactive user interfaces, where the user specifies all the details of the batch job in advance of batch processing, and receives the output when all the processing is done. The computer does not prompt for further input after the processing has started.
  • Conversational Interface Agents attempt to personify the computer interface in the form of an animated person, robot, or other character (such as Microsoft's Clippy the paperclip), and present interactions in a conversational form.
  • Crossing-based interfaces are graphical user interfaces in which the primary task consists in crossing boundaries instead of pointing.
  • Gesture interfaces are graphical user interfaces which accept input in the form of hand gestures, or mouse gestures sketched with a computer mouse or a stylus.
  • Intelligent user interfaces are human-machine interfaces that aim to improve the efficiency, effectiveness, and naturalness of human-machine interaction by representing, reasoning, and acting on models of the user, domain, task, discourse, and media (e.g., graphics, natural language, gesture).
  • Motion tracking interfaces monitor the user's body motions and translate them into commands, currently being developed by Apple.[1]
  • Multi-screen interfaces employ multiple displays to provide a more flexible interaction. This is often employed in computer game interaction in both the commercial arcades and more recently the handheld markets.
  • Noncommand user interfaces, which observe the user to infer his or her needs and intentions, without requiring that he or she formulate explicit commands.
  • Object-oriented user interface (OOUI)
  • Reflexive user interfaces where the users control and redefine the entire system via the user interface alone, for instance to change its command verbs. Typically this is only possible with very rich graphic user interfaces.
  • Tangible user interfaces, which place a greater emphasis on touch and physical environment or its element.
  • Task-focused interfaces are user interfaces which address the information overload problem of the desktop metaphor by making tasks, not files, the primary unit of interaction.
  • Text user interfaces are user interfaces which output text, but accept other forms of input in addition to or in place of typed command strings.
  • Voice user interfaces, which accept input and provide output by generating voice prompts. The user input is made by pressing keys or buttons, or responding verbally to the interface.
  • Natural-language interfaces are used for search engines and on webpages. The user types in a question and waits for a response.
  • Zero-Input interfaces get inputs from a set of sensors instead of querying the user with input dialogs.
  • Zooming user interfaces are graphical user interfaces in which information objects are represented at different levels of scale and detail, and where the user can change the scale of the viewed area in order to show more detail.

See also:

  • Archy, a keyboard-driven user interface by Jef Raskin, arguably more efficient than mouse-driven user interfaces for document editing and programming.


The history of user interfaces can be divided into the following phases according to the dominant type of user interface:

  • Batch interface, 1945-1968
  • Command-line user interface, 1969 to present
  • Graphical user interface, 1981 to present — see History of the GUI for a detailed look

Modalities and modes

A modality is a path of communication employed by the user interface to carry input and output. Examples of modalities:

  • Input — computer keyboard allows the user to enter typed text, digitizing tablet allows the user to create free-form drawing
  • Output — computer monitor allows the system to display text and graphics (vision modality), loudspeaker allows the system to produce sound (auditory modality)

The user interface may employ several redundant input modalities and output modalities, allowing the user to choose which ones to use for interaction.

A mode is a distinct method of operation within a computer program, in which the same input can produce different perceived results depending on the state of the computer program. Heavy use of modes often reduces the usability of a user interface, as the user must expend effort to remember current mode states, and switch between mode states as necessary.
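A sketch in the style of a modal text editor shows why modes demand this extra effort: the same keystroke produces different results depending on the current mode. The class and key bindings are illustrative only:

```python
class ModalEditor:
    def __init__(self):
        self.mode = "command"
        self.text = ""

    def press(self, key):
        if self.mode == "command":
            if key == "i":        # in command mode, 'i' switches modes...
                self.mode = "insert"
            elif key == "x":      # ...and 'x' deletes the last character
                self.text = self.text[:-1]
        else:                     # insert mode
            if key == "\x1b":     # Escape returns to command mode
                self.mode = "command"
            else:                 # any other key is inserted as text
                self.text += key

editor = ModalEditor()
for key in "iab\x1bx":  # enter insert mode, type "ab", Escape, delete one char
    editor.press(key)
print(editor.text)
```

Pressing `x` deletes text in one mode and types a letter in the other; a user who misremembers the current mode gets the wrong result from the same key, which is exactly the usability cost described above.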

See also


  1. ^


  • Torsten Stapelkamp: Screen- und Interfacedesign. Springer Science+Business Media, Berlin 2007, ISBN 3-540-32949-8

Simple English

A user interface allows a user to interact with a machine. User interfaces mainly provide two things:

  • Input: the user can change things; he or she can change how the machine works, or give more information to the machine.
  • Output: after the user has given some input, the machine will do something, and then provide some output.

Many machines can be very dangerous. A machine should have a user interface that can be handled easily, even if the person operating the machine has panicked. The user interface should therefore be intuitive, and simple to use. An example of such a user interface is that of the kill switch. A kill switch must shut off the machine at all costs - the idea is to avoid injury or harm to people. This is very different from shutting off the machine at the end of the shift, or when it is no longer needed.
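The difference between a kill switch and a normal shutdown can be sketched as a tiny state machine. All names here are invented for illustration; a real emergency stop would act on hardware, not software flags:

```python
class Machine:
    def __init__(self):
        self.running = False
        self.job_in_progress = False

    def start(self):
        self.running = True

    def shutdown(self):
        """Normal end-of-shift stop: refused while a job is still running."""
        if self.job_in_progress:
            return "refused: job still running"
        self.running = False
        return "stopped"

    def kill(self):
        """Kill switch: stops at all costs, regardless of machine state."""
        self.running = False
        self.job_in_progress = False
        return "emergency stop"

machine = Machine()
machine.start()
machine.job_in_progress = True
print(machine.shutdown())  # refused: job still running
print(machine.kill())      # emergency stop
```

The ordinary stop is conditional; the kill switch is not, which is what "shut off the machine at all costs" means in practice.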

According to EN ISO 13850, the kill switch has to be red on a yellow background.

The colors used to mark different states are close to those used by road traffic signals.

Color Meaning Notes
  Red Danger Alerts of possible danger or of states which make it very important to act immediately.
  Yellow Something is not normal If nothing is done, the situation may become dangerous.
  Blue Something needs to be done The person operating the machine needs to do something.
  Green Everything is normal Used to show safe conditions; also used to start a new process.
  White Neutral Confirmation; also used for things that cannot be expressed by red, yellow, blue or green.
Operating panel
Color Meaning What it does Notes
  Red Operate in an emergency Kill switch, stop; also used for fighting fire Must not be used for starting/putting the machine into operation
  Yellow Something needs to be done to get back to normal Re-start; operation to avoid an abnormal condition or unwanted change Must not be used for either starting or stopping a machine
  Blue Start something new Start, reset
  Green Start the usual/common procedure Start from a safe state Must not be used for stopping/switching off
  White Meaning undetermined Start/On (preferred), Stop/Off
  Grey Meaning undetermined Start/On, Stop/Off
  Black Meaning undetermined Stop/Off (preferred), sometimes Start/On

There may be additional symbols, for example:

Symbol What it does
  | Start
  ○ Stop

In many cases, such symbols are better, because some people are color-blind. Like warnings, though, they need to be explained.

