A head-up display (HUD) is any transparent display that presents data without requiring users to look away from their usual viewpoints. The name stems from the user being able to view information with the head "up" and looking forward, instead of angled down at lower instruments.
Although they were initially developed for military aviation, HUDs are now used in commercial aircraft, automobiles, and other applications.
The first HUDs were essentially advancements of static gunsight technology for military fighter aircraft. Rudimentary HUDs simply projected a "pipper" to aid aircraft gun aiming. As HUDs advanced, more (and more complex) information was added. HUDs soon displayed computed gunnery solutions, using aircraft information such as airspeed and angle of attack, thus greatly increasing the accuracy pilots could achieve in air-to-air combat. An early example of what would now be termed a head-up display was the Projector System of the British AI Mk VIII air interception radar fitted to some de Havilland Mosquito night fighters, where the radar display was projected onto the aircraft's windscreen along with the artificial horizon, allowing the pilot to perform an interception without taking his eyes from the windscreen.
In June 1952, the Royal Navy released Naval Staff Requirement NA.39, calling for a long-range carrier-borne strike aircraft capable of flying under enemy radar cover to strike enemy shipping or ports with a nuclear weapon. Blackburn Aircraft won the tender with the design that became the Buccaneer, whose prototype first flew on 30 April 1958. The aircraft specification called for an attack sight giving navigation and weapon-release information for the low-level attack mode. There was fierce competition between supporters of the new HUD design and the familiar electro-mechanical gunsight, the HUD being cited as a radical, even foolhardy, option. The Air Arm branch of the Ministry sponsored the development of a strike sight. The Royal Aircraft Establishment (RAE) designed the equipment, which was built by Cintel, and the system was first integrated in 1958. The Cintel HUD business was taken over by Elliott Flight Automation, and the Buccaneer HUD was manufactured and further developed, continuing up to a Mark III version, with a total of 375 systems made. The Royal Navy described it as "fit and forget", and it was still in service nearly 25 years later. BAE Systems thus has a claim to the world's first head-up display in operational service.
In the United Kingdom, it was soon noted that pilots flying with the new gunsights were becoming better at piloting their aircraft. At this point, the HUD expanded beyond a weapon-aiming instrument into a piloting tool. In the 1960s, French test pilot Gilbert Klopfstein created the first modern HUD and a standardized system of HUD symbols, so that pilots would only have to learn one system and could more easily transition between aircraft. The modern HUD used in instrument flight rules approaches to landing was developed in 1975. Klopfstein pioneered HUD technology in military fighter jets and helicopters, aiming to centralize critical flight data within the pilot's field of vision. This approach sought to increase the pilot's scan efficiency and reduce "task saturation" and information overload.
In the 1970s, the HUD was introduced to commercial aviation.
Until a few years ago, the Embraer 190 and Boeing 737 Next Generation aircraft (737-600, -700, -800, and -900 series) were the only commercial passenger aircraft available with an optional HUD. The technology is now becoming more common, with aircraft such as the Canadair RJ, Airbus A318, and several business jets featuring the device. The HUD has become standard equipment on the Boeing 787. Furthermore, the Airbus A320, A330, A340 and A380 families are currently undergoing the certification process for a HUD.
There are two types of HUD. A fixed HUD requires the user to look through a display element attached to the airframe or vehicle chassis. The system determines the image to be presented depending solely on the orientation of the vehicle. Most aircraft HUDs are fixed.
Helmet-mounted displays (HMD) are technically a form of HUD, the distinction being that they feature a display element that moves with the orientation of the user's head relative to the airframe.
Many modern fighters (such as F/A-18, F-22, Eurofighter) use both a HUD and an HMD concurrently. The F-35 Lightning II was designed without a HUD, relying solely on the HMD, making it the first modern military fighter not to have a fixed HUD.
HUDs are split into three generations, reflecting the technology used to generate the images.
There are several factors that engineers must consider when designing a HUD:
A typical HUD contains three primary components: a combiner, a projector unit, and a video generation computer.
The combiner is the part of the unit which is located directly in front of the pilot. It is the surface onto which the information is projected so that the pilot can view and use it. On some aircraft the combiner is concave in shape and on others it is flat. It has a special coating that reflects the monochromatic light projected onto it from the Projector Unit while allowing all other wavelengths of light to pass through. On some aircraft it is easily removable (or can be rotated out of the way) by aircrew.
The projector unit projects the image onto the combiner for the pilot to view. In the early days of HUDs this was done through refraction, although modern HUDs use reflection. The projector unit uses a cathode-ray tube, light-emitting diode, or liquid-crystal display to project the image. Projector units can be mounted either below the combiner (as with most fighter aircraft) or above it (as with transport/commercial aircraft).
The computer is usually located with the other avionics equipment and provides the interface between the HUD (i.e. the projection unit) and the systems/data to be displayed. On aircraft, these computers are typically dual independent redundant systems. They receive input directly from the sensors (pitot-static, gyroscopic, navigation, etc.) aboard the aircraft and do their own computations rather than receiving previously computed data from the flight computers. Computers are integrated with the aircraft's systems and allow connectivity onto several different data buses such as the ARINC 429, ARINC 629, and MIL-STD-1553.
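The bus connectivity mentioned above can be illustrated with a small sketch. The following hypothetical decoder splits a 32-bit ARINC 429 word into its standard fields (label, SDI, data, SSM, odd parity). It is a simplified illustration, not flight software, and it ignores the bit-reversed label encoding used on the physical wire.

```python
def decode_arinc429(word: int) -> dict:
    """Split a 32-bit ARINC 429 word into its fields (simplified layout)."""
    label = word & 0xFF             # bits 1-8: label, conventionally read in octal
    sdi = (word >> 8) & 0x3         # bits 9-10: source/destination identifier
    data = (word >> 10) & 0x7FFFF   # bits 11-29: data field
    ssm = (word >> 29) & 0x3        # bits 30-31: sign/status matrix
    # Bit 32 is the parity bit; ARINC 429 uses odd parity over all 32 bits.
    parity_ok = bin(word).count("1") % 2 == 1
    return {"label": oct(label), "sdi": sdi, "data": data,
            "ssm": ssm, "parity_ok": parity_ok}
```

A receiving computer would use the label to decide how to scale and display the data field (for example, as airspeed or altitude).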
Other symbols and data are also available in some HUDs:
Since being introduced on HUDs, both the FPV and acceleration symbols have become standard on head-down displays (HDD). The actual form of the FPV symbol on an HDD is not standardized, but is usually a simple aircraft drawing, such as a circle with two short angled lines (180° ± 30°) and "wings" on the ends of the descending lines. Keeping the FPV on the horizon allows the pilot to fly level turns at various angles of bank.
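The relationship between the FPV and the horizon can be made concrete: the angle the FPV sits above or below the horizon is simply the flight-path angle, the ratio of vertical speed to ground speed. A minimal sketch, with function name and unit choices of my own:

```python
import math

def flight_path_angle_deg(vertical_speed_fpm: float, ground_speed_kt: float) -> float:
    """Angle of the flight path relative to the horizon, as the FPV displays it."""
    vs_fps = vertical_speed_fpm / 60.0   # feet per minute -> feet per second
    gs_fps = ground_speed_kt * 1.68781   # knots -> feet per second
    return math.degrees(math.atan2(vs_fps, gs_fps))
```

With zero vertical speed the angle is zero, which is why holding the FPV on the horizon line produces level flight regardless of bank.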
In addition to the generic information described above, military applications include weapons system and sensor data, such as:
During the 1980s, the military tested the use of HUDs in vertical take off and landings (VTOL) and short take off and landing (STOL) aircraft. A HUD format was developed at NASA Ames Research Center to provide pilots of V/STOL aircraft with complete flight guidance and control information for Category-IIIC terminal-area flight operations. These flight operations cover a large spectrum, from STOL operations on land-based runways to VTOL operations on aircraft carriers. The principal features of this display format are the integration of the flightpath and pursuit guidance information into a narrow field of view, easily assimilated by the pilot with a single glance, and the superposition of vertical and horizontal situation information. The display is a derivative of a successful design developed for conventional transport aircraft.
The use of head-up displays allows commercial aircraft substantial flexibility in their operations. Systems have been approved which allow reduced-visibility takeoffs and landings, as well as full Category IIIc landings. Studies have shown that the use of a HUD during landings decreases the lateral deviation from centerline in all landing conditions although the touchdown point along the centerline is not changed.
The image to the right, of a HUD in a NASA Gulfstream V, shows several different HUD elements, including the combiner in front of the pilot. The green 'glare' in the lower right corner of the combiner is a result of backscatter of off-axis light from the projection unit, as well as reflection from ambient light in the flight deck. Because the combiner has a pronounced vertical and horizontal curve to help focus the image, compensation is applied to the display symbols so they appear flat when projected onto the curved surface. When not in use, this combiner can swing up and lock in a stowed position.
The Projector Unit in the Gulfstream GV image would be directly above the pilot's head. In smaller aircraft the design of the projection unit can present interesting spacing and placement issues, as room has to be left for the pilot not only when normally seated but during turbulence and when getting in and out of the seat.
In more advanced systems, such as the FAA-labeled Enhanced Flight Vision System, a real-world visual image can be overlaid onto the combiner. Typically an infrared camera (either single or multi-band) is installed in the nose of the aircraft to display a conformed image to the pilot. In one EVS installation ("Enhanced Vision System" is an industry-accepted term which the FAA decided not to use because it believes it "would be confused with the system definition and operational concept found in 91.175(l) and (m)"), the camera is actually installed at the top of the vertical stabilizer rather than "as close as practical to the pilots eye position". When used with a HUD, however, the camera must be mounted as close as possible to the pilot's eye point, as the image is expected to "overlay" the real world as the pilot looks through the combiner.
"Registration" or the accurate overlay of the EVS image with the real world image is one feature closely examined by the authorities prior to approval of a HUD based EVS. When the pilot is coming in for a landing and "sees" the runway and runway lights through the EVS display, it is really a good thing when they come out under the clouds and the real world runway is right where the camera said it was as the pilot has a very short period of time to:
There are typically five hazard categories:
Where a hazard does occur, the design community anticipates that the flight crew will be able to take appropriate action. The pilot may choose to initiate a missed approach (climbing immediately and then deciding what to do, since altitude and speed buy time when dealing with unexpected events) or perhaps to immediately blank the HUD/EVS display (typically there is a thumb switch on the control column for exactly this circumstance) and continue the landing using what can be seen through the window.
While the EVS display can greatly help, the FAA has only "relaxed" operating regulations to the extent that an aircraft with EVS operating can perform a Category I approach to Category II minimums. In all other cases the flight crew must comply with all "unaided" visual restrictions. For example, if runway visibility is restricted by fog, it is not appropriate (or legal) to maneuver the aircraft below 100 ft AGL using only the EVS, even though EVS may provide a clear visual image.
In the SVS image to the right, immediately visible indicators include the airspeed tape on the left, altitude tape on the right, and turn/bank/slip/skid displays at the top center. The boresight symbol (-\/-) is in the center, and directly below it is the flight path vector symbol (the circle with short wings and a vertical stabilizer). The horizon line is visible going across the display with a break at the center; directly to its left are numbers at ±10 degrees, with a short line at ±5 degrees (the +5-degree line is easier to see), which, along with the horizon line, show the pitch of the aircraft.
The aircraft in the image is wings-level (i.e. the flight path vector symbol sits on the horizon line and the turn/bank indicator shows zero roll). Airspeed is 140 knots, altitude is 9,450 feet, and heading is 343 degrees (the number below the turn/bank indicator). Close inspection of the image shows a small purple circle displaced slightly to the lower right of the flight path vector. This is the guidance cue coming from the Flight Guidance System; when stabilized on the approach, this purple symbol should be centered within the FPV.
The terrain is entirely computer generated from a high resolution terrain database.
In some systems, the SVS will calculate the aircraft's current flight path, or possible flight path (based on an aircraft performance model, the aircraft's current energy, and surrounding terrain) and then turn any obstructions red to alert the flight crew. Such a system could have prevented the crash of American Airlines Flight 965 in 1995.
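The obstruction-alerting logic described above can be sketched as a simple projection of the current flight path over a terrain profile ahead of the aircraft. All names and the clearance threshold below are hypothetical; a real SVS would use a full aircraft performance model rather than a straight-line projection.

```python
def flag_terrain_conflicts(altitude_ft, vs_fpm, gs_kt,
                           terrain_profile_ft, spacing_nm, clearance_ft=500):
    """Return indices of terrain samples the projected path fails to clear.

    terrain_profile_ft: elevations at evenly spaced points along the track,
    spacing_nm apart. Flagged samples would be drawn red on the display.
    """
    flagged = []
    for i, elevation in enumerate(terrain_profile_ft, start=1):
        minutes_to_reach = (i * spacing_nm) / gs_kt * 60.0
        predicted_alt = altitude_ft + vs_fpm * minutes_to_reach
        if predicted_alt - elevation < clearance_ft:
            flagged.append(i - 1)
    return flagged
```

For instance, an aircraft level at 5,000 ft approaching a 4,800 ft ridge would see that ridge flagged well before reaching it.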
On the left side of the display is an SVS-unique symbol, which looks like a purple, diminishing sideways ladder, and which continues on the right of the display. The two together define a "tunnel in the sky". This symbol defines the desired trajectory of the aircraft in three dimensions. For example, if the pilot had selected an airport to the left, then this symbol would curve off to the left and down. The pilot keeps the flight path vector alongside the trajectory symbol and so will fly the optimum path. This path would be based on information stored in the Flight Management System's database and would show the FAA-approved approach for that airport.
The Tunnel In The Sky can also greatly assist the pilot when more precise four dimensional flying is required, such as the decreased vertical or horizontal clearance requirements of RNP. Under such conditions the pilot is given a graphical depiction of where the aircraft should be and where it should be going rather than the pilot having to mentally integrate altitude, airspeed, heading, energy AND longitude and latitude to correctly fly the aircraft.
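As an illustration of the tunnel concept, checking whether the aircraft is inside one "gate" of the tunnel reduces to comparing lateral and vertical offsets against the gate's half-dimensions. The coordinate convention and names below are hypothetical, chosen only for this sketch:

```python
def tunnel_deviation(aircraft, gate_center, half_width_ft, half_height_ft):
    """Offsets from a tunnel gate and whether the aircraft is inside it.

    aircraft, gate_center: (cross_track_ft, altitude_ft) pairs at the
    gate's along-track position.
    """
    lateral = aircraft[0] - gate_center[0]    # positive = right of center
    vertical = aircraft[1] - gate_center[1]   # positive = above center
    inside = abs(lateral) <= half_width_ft and abs(vertical) <= half_height_ft
    return lateral, vertical, inside
```

The display would draw the ladder rung from `gate_center` and the half-dimensions, while the pilot simply keeps the FPV inside successive rungs.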
General Motors began using head-up displays in 1988, with the first color display appearing in 2001 on the Corvette. In 2003, BMW became the first European manufacturer to offer a HUD. The displays are becoming increasingly available in production cars, and usually offer speedometer, tachometer, and navigation system displays. Night vision information is also displayed via HUD on certain General Motors, Honda, Toyota and Lexus vehicles. Other manufacturers, such as Citroën and Nissan, currently offer some form of HUD system. Motorcycle helmet HUDs are also commercially available.
Add-on HUD systems also exist, projecting the display onto a glass combiner mounted on the windshield. These systems have been marketed to police agencies for use with in-vehicle computers.
HUDs have been proposed or are being experimentally developed for a number of other applications. In the military, a HUD can be used to overlay tactical information such as the output of a laser rangefinder or squadmate locations to infantrymen. A prototype HUD has also been developed that displays information on the inside of a swimmer's goggles. A group of Electrical Engineering students from the University of Massachusetts Amherst are integrating technologies in order to develop an affordable Personal Head-Up Display.
In video games, the heads-up display (HUD) is the on-screen visual display that relays important information to the player. This can include health, weapons, ammunition, and time remaining, among other game-specific information. Some games have tried to limit or hide the HUD so that the player's immersion is not broken.
Games like Metroid Prime and Metroid Prime 2: Echoes have HUDs notable for their realism and attention to detail. Because these games display the HUD on the inside of Samus's visor, which can be affected by rain or fog, and present information in the way a robotic suit would be expected to present it, the effect is strongly immersive.
However, having a detailed HUD is not the only way to immerse players. Games like Peter Jackson's King Kong feature no HUD at all when playing a human, giving no clues to health or ammo, but still make it a realistic first-person experience.
A HUD is different from an interface in that it is not interactive, but only displays (sometimes irrelevant) information. In Kingdom Hearts, there is no HUD showing the targeted opponent's health; however, the player can later learn the ability "Scan", which automatically displays the opponent's health at the cost of ability points. Most players equip Scan and give up the ability points, even though knowing the opponent's health in no way benefits the player in combat.
HUDs are often necessary to help the player know whether they are losing or making progress, and are sometimes viewed as more important than realism. Small, unobtrusive, and well-designed HUDs can be easily forgotten by the player and need not affect immersion at all.
One of the focuses of the Nintendo DS system was to allow developers to put the HUD on the second screen, so that the player could focus on one screen and not see the HUD at all, while still having it at their disposal whenever they chose to look at it. Many games for the DS use the second screen to display the map or ammunition and health meters.