Article - Issue 20, August/September 2004

Tactile display technology

Heidi Castle and Trevor Dobbins


A brief overview of its benefits over visual and audio displays

Platform performance and safety depend on operator situational awareness. This is traditionally achieved through visual and audio displays – but an intuitive alternative is the tactile display. Castle and Dobbins discuss how the familiar vibrating alert on a mobile phone can be developed into more complex tactile display applications for fixed and rotary wing aircraft, high-speed boats, diving, and assisting the visually impaired. They explain how tactile displays can be used to enhance performance and safety during orientation, navigation, and communication.

The safety and performance of the operator of any platform (aircraft, land vehicle, ship/boat and submarine) is dependent on the situational information available to them. This is normally achieved through visual and audio displays. Although visual displays are usually easy to use, computer-driven displays can present such vast quantities of information that they overload the operator’s visual processing capacity. This overload can lead to reduced performance and safety. The potential for problems has been recognised, and design standards such as ISO 13407 attempt to address it. Audio displays are ideal for specific information, but there are disadvantages to their use both in noisy environments and in environments requiring minimum noise. Tactile displays are an effective alternative to visual and audio displays. They can be used to offload the visual system and are suitable in both high and low noise environments.

We are all familiar with visual and audio displays (seeing and hearing information). However, providing information via the ‘little used’ sense of touch – more specifically referred to as the haptic sense – is an area that can provide communication through stimulation of the skin surface (tactile sensation), pressures felt within the muscles and deeper tissues of the body (proprioception), temperature changes (thermoreception), or pain (nociception). To provide tactile cues, the most common method of stimulating the skin is vibration. The actuators produced to provide these vibrations are known as tactors. The ideal tactor is lightweight, with low power consumption, and high output (i.e. vibration). Vibration output can vary in terms of intensity, frequency and rhythm. Psychophysical factors include the location of the tactor on the body, the receptor type being stimulated and the size of the tactor in relation to the receptor field.

Tactile display applications

Tactile displays have been in existence for many years, but it is only recently that their potential has begun to be exploited. A familiar tactile display is the vibrating alert function on the mobile phone or pager, which is typically used when an audio cue is unacceptable (for example, in meetings or at the theatre). Even basic tactile cues can be both intuitive and informative. A tap on the shoulder instinctively tells you that someone is behind you, in what direction, and that they want your attention.

There are currently three main applications for which tactile displays are considered to have great potential: orientation, navigation, and communication.


The development of tactile displays for orientation was driven in part by the knowledge that over 25 per cent of US military aircraft losses are directly attributed to the pilot’s loss of spatial orientation and situation awareness. Spatial orientation is the ability of an individual to correctly know where they are oriented (positioned) in space, normally in relation to the direction down.

For example, a ground direction cue applied to the torso, via the Tactile Situation Awareness System (TSAS), developed by the US Naval Aerospace Medical Research Laboratory (NAMRL) (see Figure 1), enables pilots to maintain spatial orientation and perform aerobatic manoeuvres without external cues (blindfolded) or internal instrument displays.

Extreme disorientation can occur under water and in conditions of microgravity, which would be encountered by divers and astronauts respectively. Research has been conducted into the use of tactile displays in microgravity using parabolic flights by NASA and further study into this environment is being carried out on the International Space Station by researchers from TNO (NL).

The development and implementation of tactile displays is supported by the conclusion made at the NATO Research & Technology Organisation symposium – ‘Spatial disorientation in military vehicles: causes, consequences and cures’ (La Coruna, Spain, 2002). It was stated that:

The most important advance of recent years with the potential to combat spatial disorientation has been the use of tactile stimuli to give information on spatial orientation.


Navigation through space is a task that must be performed in many different environments and tactile devices can provide directional cues for this purpose. Tactile cues could, for example, inform a pilot of the bearing of a missile that is locked onto an aircraft or the direction of an emergency rendezvous location.

The tactile navigation display may be designed to give information on course errors and course-correction instructions.
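A common way to present such directional cues is a belt of tactors spaced around the torso, with the course error deciding which tactor fires. The following is a minimal illustrative sketch of that idea, assuming a hypothetical eight-tactor belt with tactor 0 at the navel and indices increasing clockwise; it is not the scheme of any specific system described here.

```python
# Illustrative sketch: mapping a signed course error to one tactor in a
# hypothetical eight-tactor torso belt. Tactor 0 sits at the navel
# (dead ahead); indices increase clockwise in 45-degree steps.

def tactor_for_course_error(error_deg: float, n_tactors: int = 8) -> int:
    """Return the index of the tactor nearest the desired heading.

    error_deg: signed course error in degrees (positive = turn right).
    """
    sector = 360.0 / n_tactors
    # Normalise to [0, 360) and pick the nearest tactor position.
    bearing = error_deg % 360.0
    return int(round(bearing / sector)) % n_tactors
```

With this mapping, a 90-degree error to the right fires the tactor on the right hip, so the cue itself points in the direction to turn.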

Waypoint navigation using tactile cues (see Figures 2 and 3) has been demonstrated in a variety of environments. These include divers operating underwater at NAMRL, high-speed boats at QinetiQ, automobiles at TNO, and aircraft at NAMRL and at TNO.


Communication of information other than orientation or navigation information can take place on a very simple or very complex level. It is generally considered that for a display to be highly intuitive it should only convey very simple information. The vibrating alert on a mobile phone communicates simply that someone wishes to speak with the telephone’s user. However, with this application the potential exists to provide more information, such as whether the call is of a business or personal nature, or if the telephone has received a text message.

Therefore, it is necessary to differentiate between different tactile cues. From an audio perspective this is easily achieved by having different ring tones. The tactile equivalent of audio ring tones might be different vibration rhythms. This technology will require the construction of intuitive ‘tactile melodies’, or ‘tactile-icons’, which can be immediately recognised with little cognitive processing required.
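One simple way to realise such ‘tactile-icons’ is to encode each cue as a rhythm – a sequence of on/off pulse durations driven on a single tactor. The sketch below illustrates the idea; the cue names and timings are purely hypothetical assumptions, not taken from any fielded system.

```python
# Illustrative sketch of 'tactile-icons': each cue is a rhythm, i.e. a
# sequence of (on_ms, off_ms) pulse pairs driven on a single tactor.
# The names and timings below are hypothetical.

TACTILE_ICONS = {
    "business_call": [(500, 250), (500, 250)],            # two long pulses
    "personal_call": [(150, 100)] * 4,                    # four short pulses
    "text_message":  [(150, 100), (150, 100), (500, 0)],  # short-short-long
}

def total_duration_ms(rhythm):
    """Total play time of a rhythm, useful e.g. to avoid overlapping cues."""
    return sum(on + off for on, off in rhythm)
```

The design question the article raises – how many such rhythms a user can reliably discriminate with little training – would determine how large a table like this can usefully grow.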

The potential of tactile cues for communication is far-reaching, and they are particularly suited to environments where sound and light cues are either unavailable or undesirable.

Assisting the blind and visually impaired

Blind and visually impaired individuals rely heavily on touch feedback, whilst individuals who are both blind and deaf are totally dependent on their sense of touch. One example of how tactile technology can enhance the capabilities of the visually impaired is by improving navigational autonomy. A tactile waypoint navigation system, developed by QinetiQ, was used to assist in setting the first world blind water speed record, at a speed of ~73 mph.

The increasingly successful exploitation of tactile displays can be seen in the recently developed blind walking cane by Sound Foresight Ltd, which provides tactile feedback to the user from ultrasonic transducers in the cane. This means that users can be informed of both the direction of, and distance to, objects in their vicinity. The use of the tactile display means that they receive this information both intuitively and discreetly.
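The underlying principle of such a cane – range an obstacle by timing an ultrasonic echo, then convert distance into a tactile signal – can be sketched as follows. This is a hypothetical illustration only: the distance-to-pulse-rate curve and range figures are assumptions, not the behaviour of the Sound Foresight cane.

```python
# Illustrative sketch of ultrasonic ranging mapped to a tactile pulse
# rate: nearer obstacles pulse faster. The curve and range values are
# hypothetical assumptions.

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 C

def echo_to_distance_m(round_trip_s: float) -> float:
    """Obstacle distance from an ultrasonic echo's round-trip time."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

def pulse_rate_hz(distance_m: float, max_range_m: float = 4.0) -> float:
    """Pulse faster as the obstacle gets closer; silent beyond max range."""
    if distance_m >= max_range_m:
        return 0.0
    return 10.0 * (1.0 - distance_m / max_range_m)
```

The appeal of this mapping is that it stays intuitive: the urgency of the vibration rises smoothly as the obstacle approaches, with no learned code required.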

Tactile display systems

The complexity of a tactile display depends on the intended usage. A simple communication device may have one tactor, whereas a navigation system may have two or more tactors. However, an orientation system, particularly for 3D applications, could potentially have in excess of 100 tactors. The number of tactors incorporated into a device will obviously have implications for its overall size, weight, power supply and control.

The greatest limitation of the tactile display is the requirement for the tactor to be in contact with the body. This requires a method of retaining the tactor against the skin, which in multiple tactor arrays can be a significant technical challenge, particularly when the complex contours of the body must be considered. If any tactors lose contact with the body, the system may be ineffective depending on the amount of redundancy that has been factored into the design.
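One simple form such redundancy could take in a circular tactor array is to fall back on a failed tactor’s immediate neighbours, preserving the directional cue approximately. The sketch below is a hypothetical illustration of that idea, not a description of any system mentioned in the article.

```python
# Illustrative sketch of simple redundancy in a circular tactor array:
# if the target tactor has lost skin contact, drive its two immediate
# neighbours instead, so the directional cue is approximately preserved.

def tactors_to_fire(target: int, failed: set, n_tactors: int = 8) -> list:
    """Return the tactor indices to drive for a cue aimed at `target`."""
    if target not in failed:
        return [target]
    neighbours = [(target - 1) % n_tactors, (target + 1) % n_tactors]
    return [t for t in neighbours if t not in failed]
```

If both neighbours have also failed, the function returns an empty list – the point at which, as the article notes, the display becomes ineffective.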

Further research and development

As well as the continuing development of tactors and methods of mounting them on the body, there are a number of fundamental human factors that require ongoing investigation to ensure the future success of tactile displays.

To make tactors more effective and to ensure they are used to their greatest advantage on the body, there is an ongoing need for research into the characteristics of human tactile information processing. At BAE Systems, Heidi Castle is investigating the workload capacity limitations of the tactile sense, so that they are not exceeded when the tactile channel is used to offload the visual sense. Trevor Dobbins has been awarded a DTI R&D grant to develop a tactile height display for light aircraft.

As tactile displays may be deployed on platforms that already have inherent vibration, there is the risk that tactile display cues will be indistinguishable from the background vibration. An example of this was provided during the development of the navigation system used in setting the world blind water speed record (see Figure 4). During trials on a high-speed boat, small vibrating tactile devices could not be perceived: the vibration transmitted from the water’s surface and the boat’s large engines rendered the system ineffective. This was resolved by mounting the tactors on body locations away from where the coxswain held onto the boat structure; for example, moving the tactors from the hands/arms to the torso.

Further work is required to establish which tactile cues are perceived most intuitively and to answer questions such as: how do you provide a turn-right instruction? How do you show how far to turn right? And how do you indicate an over-correction in course?

More research is also required on the user’s ability to distinguish different tactile ‘melodies’. The scenario may be analogous to Morse code: it is easy to discriminate between two or three letters, but it takes a large investment in training to discriminate between 26 letters accurately and at high speed.

To help facilitate the development of tactile displays a Technical Group has been established by the NATO Research and Technology Agency. The group is currently composed of representatives from the UK, NL, US and Canada. Its aims include:

  • identifying current tactile systems and applications

  • identifying related standards and guidelines for tactile systems

  • compiling a database of published scientific knowledge on tactile perception

  • detailing experimental methods for undertaking tactile research

  • detailing experimental methods for producing reference tactile conditions

  • identifying safety issues related to tactile systems

  • assisting in networking tactile research groups, agencies and specialists

  • disseminating information about tactile systems.

Tactile displays are a relatively new competitor in the display marketplace. They have the potential to enhance both performance and safety in many environments from underwater to microgravity. Further work is required on the development of tactor design with respect to the requirements of specific applications. Research into the human factors of tactile perception will help to optimise tactile display design and enhance their effectiveness for improved operator performance and safety.

Heidi Castle

BAE Systems, Advanced Technology Centre, Filton, Bristol, UK

Trevor Dobbins

Human Sciences & Engineering Ltd, Chichester, UK

Heidi Castle is a scientist in the Human Factors Department at BAE Systems’ Advanced Technology Centre, Bristol. Her field is multi-sensory interfaces and she specialises in the haptic sense; covering all aspects of touch, movement and position feedback as well as temperature and pain sensation. She is presently also doing a PhD at Cranfield University in Haptic Devices in the Cockpit.

Dr Trevor Dobbins is the Director of the consultancy Human Sciences & Engineering Ltd. He is a Science Advisor to the MOD on human factors issues relating to maritime operations and is a visiting Research Fellow of University College Chichester. He previously worked as a Senior Scientist/Project Manager for the DERA/QinetiQ Centre for Human Sciences, where he initiated and led the MOD’s tactile displays research programme in collaboration with the US Naval Aerospace Medical Research Laboratory. He is also the co-chair of the NATO RTO Technical Group on Tactile Displays.
