In 1983 Hewlett-Packard released the HP-150, one of the first commercially available touchscreen PCs. A grid of infrared beams across the front of the monitor enabled it to detect a finger touching the screen.
Over the subsequent decades the technology steadily advanced as the screens were put to varied uses, including airport check-in kiosks and point-of-sale terminals. It was the release of the iPhone in 2007, with its multi-point detection, that brought touchscreen interfaces into the mainstream.
There are three main elements used in touchscreen technology: a touch sensor, a controller and a software driver. The touch sensor is a clear glass touch-responsive panel that sits over a display screen. There are different types of sensors, with capacitive sensors generally used in consumer devices and resistive sensors in industrial systems.
The capacitive system generates an electric field across the screen and detects a change in charge at a point on a grid, caused by the conductivity of the user's finger. The resistive system has two conducting layers spaced slightly apart, which are pressed together at a point when touched, allowing current to flow across particular grid elements. The grid location can then be passed to the software driver for processing.
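As a rough illustration of the capacitive approach, the sketch below locates a touch by comparing a grid of sensor readings against a no-touch baseline and picking the cell with the largest change in charge. The grid size, raw values and threshold are invented for the example; a real driver would also filter noise and interpolate between cells.

```python
# Illustrative sketch, not a real touchscreen driver: find the grid
# cell whose measured charge changed most relative to a baseline.

def locate_touch(baseline, reading, threshold=5):
    """Return the (row, col) of the strongest charge change, or None.

    baseline and reading are 2-D lists of raw sensor values; the
    threshold screens out small fluctuations that are just noise.
    """
    best, best_delta = None, threshold
    for r, (b_row, m_row) in enumerate(zip(baseline, reading)):
        for c, (b, m) in enumerate(zip(b_row, m_row)):
            delta = abs(m - b)
            if delta > best_delta:
                best, best_delta = (r, c), delta
    return best

baseline = [[10, 10, 10], [10, 10, 10], [10, 10, 10]]
reading  = [[10, 11, 10], [10, 10, 32], [10, 10, 11]]
print(locate_touch(baseline, reading))  # -> (1, 2), the touched cell
```

A resistive controller would arrive at a grid location differently (by measuring voltage across the pressed-together layers), but the hand-off to software is the same: a coordinate pair per touch.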
With the ability to recognise two or more simultaneous points of contact, screens can now implement advanced gestures such as pinch-and-stretch to zoom. The future holds the possibility of flexible screens, and of more tactile displays with temporary buttons raised by fluid or gas for specific applications.
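The pinch-and-stretch gesture mentioned above can be reduced to simple geometry: the zoom factor is the ratio of the current distance between the two contact points to the distance when the gesture began. A minimal sketch, assuming each touch point arrives as an (x, y) pixel coordinate:

```python
import math

# Illustrative sketch of pinch-to-zoom: the scale factor is the ratio
# of the current finger separation to the initial separation.

def pinch_scale(initial, current):
    """initial and current are each a pair of (x, y) touch points."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return dist(*current) / dist(*initial)

# Fingers start 100 px apart and spread to 200 px: a 2x zoom in.
print(pinch_scale(((0, 0), (100, 0)), ((0, 0), (200, 0))))  # -> 2.0
```

A value above 1 zooms in, below 1 zooms out; real gesture handlers apply this factor continuously as new touch coordinates stream in from the controller.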