For decades the keyboard and the mouse have been our traditional tools for operating computers. With the arrival of mobile devices such as smartphones and tablets, new techniques, such as touchscreens and speech recognition, were required to communicate efficiently and intuitively with these mini-computers which have neither a keyboard nor a mouse.
Today, devices can receive commands via touchscreens and hear what users want through their built-in microphones. But the amount of electronic equipment we want to communicate with will increase still further as the Internet of Things becomes commonplace.
With the aid of eye-tracking systems, devices would also be able to detect what users are looking at and anticipate what they want to do next, opening up a whole series of new possibilities for intuitive interaction between man and machine.
In many cases the hardware for eye tracking is already available. Eye tracking systems detect a person’s eye movements and the direction in which they are looking. Originally they were developed for market research, behaviour analysis and usability studies. And they have also been in use for some time to help people who no longer have the use of their hands to operate computers.
Many of these systems use infrared light to illuminate the user’s eyes, take a picture with a camera and calculate eye movement from the image data. Such systems need special high-quality cameras, light sources and software, and sometimes hardware accelerators are added to process the huge amount of graphic data.
Today, extremely powerful chips, compact camera sensors and modern high-power LEDs enable eye-tracking functionality to be integrated in consumer devices such as smartphones.
What’s more, in many devices the camera sensor and the infrared light source are already being used for other functions such as facial recognition or iris identification. All that is then needed is appropriate software to integrate eye tracking as an additional feature.
How eye-tracking works
Modern eye-tracking systems are based on infrared LEDs (IREDs) for illuminating the eyes and a high-resolution camera sensor that registers the light reflected by the eyes. Image processing algorithms take this raw data and calculate the positions of the pupils.
Using information about the position of a reference object, such as the screen, special software is able to determine where exactly the user is looking. Infrared illumination guarantees the necessary contrast between the iris and the pupil, whatever the eye color, particularly in the dark or if the screen background is very bright.
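The final step described above — turning detected pupil positions into a point on the screen — can be sketched in a few lines. The following is a simplified, hypothetical illustration: it assumes the image-processing stage has already located the pupil centre and the corneal glint (the IRED's reflection on the cornea) in camera coordinates, and that an affine mapping has been fitted during a calibration routine where the user looks at known on-screen targets. The function name and the calibration values are invented for illustration.

```python
# Hypothetical sketch: map a pupil-glint vector to screen coordinates
# via an affine calibration. Real eye-tracking software uses more
# elaborate models, but the principle is the same.

def gaze_point(pupil, glint, calib):
    """Map a pupil-glint vector (camera pixels) to screen coordinates."""
    dx = pupil[0] - glint[0]
    dy = pupil[1] - glint[1]
    # Affine mapping: screen = A * vector + b, with coefficients
    # fitted while the user fixates known calibration targets.
    ax, bx, cx, ay, by, cy = calib
    screen_x = ax * dx + bx * dy + cx
    screen_y = ay * dx + by * dy + cy
    return screen_x, screen_y

# Example with made-up calibration coefficients for a 1920x1080 screen:
calib = (40.0, 0.0, 960.0, 0.0, 40.0, 540.0)
x, y = gaze_point(pupil=(312.0, 248.0), glint=(300.0, 240.0), calib=calib)
print(round(x), round(y))
```

In practice the mapping also has to compensate for head movement, which is one reason the glint is used as a reference rather than the pupil position alone.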
Such systems currently have a range of up to one metre. For smartphones and tablets the typical working distance is around 30 cm, for desktop computers around 60 cm. The achievable gaze resolution on the screen — the smallest area the system can reliably distinguish as a gaze target — is about 1 cm for tablets and about 2 cm for desktop computers.
The number of IREDs used and the specific arrangement of emitters and camera depend on the type of application, in other words on the working distance and the size of the area to be covered. The setup can also vary with the eye-tracking software used because the geometry of the design depends also on the ability of the algorithms to reliably detect the orientation of the pupils.
Generally speaking, the emitter and camera sensor have to be arranged at a certain angle and at a certain distance with respect to one another to avoid glare from spectacles or direct reflections from the eyes to the sensor. The greater this distance, the better the signal quality and the more flexible the choice of the optimum distance between the user and the device.
Ideally, both eyes should be within the capture area of the camera sensor. It is important for the entire eye to be evenly illuminated. The amount of infrared light that is needed depends on the working distance and may be several watts, even for mobile devices. To keep thermal output due to the high operating currents as low as possible, the emitters are operated in pulsed mode.
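The benefit of pulsed operation can be shown with a back-of-the-envelope calculation. The numbers below are illustrative assumptions, not figures from the article: a pulse matched to the camera's exposure time at a typical video frame rate keeps the average power — and therefore the heat — far below the peak power needed for good image contrast.

```python
# Illustrative duty-cycle calculation (all values assumed):
# pulsing the IRED in sync with the camera exposure reduces the
# average power dissipated, easing thermal management.

peak_power_w = 3.0   # assumed drive power during a pulse
pulse_ms = 2.0       # assumed pulse length, matched to camera exposure
frame_ms = 33.3      # one frame at roughly 30 fps

duty_cycle = pulse_ms / frame_ms
avg_power_w = peak_power_w * duty_cycle
print(f"duty cycle: {duty_cycle:.1%}, average power: {avg_power_w:.2f} W")
```

With these assumed values the average dissipation is only a few percent of the peak, which is why pulsed drive is standard practice in compact devices.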
Despite this, thermal management is an important aspect of the design, particularly in ever lighter and thinner smartphones and tablets. In this connection the efficiency of the IRED is a major factor in addition to the optical output. The greater the efficiency, the less heat is generated.
Like any application featuring infrared light sources, eye-tracking systems must comply with eye safety standards. The amount of infrared radiation that reaches a normal user is relatively low.
However, precautions must be taken for situations where a technician, for example, may be at risk of looking at the infrared light source from close up. A proximity sensor linked to the eye-tracking system ensures that in such cases the IRED is switched off.
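The interlock described above amounts to a simple rule: if the proximity sensor reports an object closer than a minimum safe distance, the IRED drive is disabled. A minimal sketch, with an invented threshold value (real products derive this limit from the applicable eye-safety standard):

```python
# Minimal sketch of a proximity-based eye-safety interlock.
# The threshold is an assumption for illustration only.

SAFE_DISTANCE_CM = 10.0  # assumed minimum safe viewing distance

def ired_enabled(proximity_cm):
    """Allow the IRED to fire only if nothing is closer than the safe distance."""
    return proximity_cm >= SAFE_DISTANCE_CM

print(ired_enabled(35.0))  # typical smartphone working distance
print(ired_enabled(4.0))   # object very close to the emitter
```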
All eyes on the consumer
Concepts that enable eye tracking to serve as an additional man/machine interface are currently being developed in various areas. Smartphones and tablets with eye-tracking functionality have already been demonstrated in which eye contact is used to activate an icon or move a character on screen.
Gaming computers with eye tracking give gamers a sense of being more involved in the action. Systems have been presented in which a gamer can use eye movements to control the viewpoint of an on-screen character instead of using a mouse or trackpad.
And eye tracking can be used in much the same way for computers – for example by using your eyes to scroll through a document. In the smart home sector, too, there are ways of using eye contact to communicate with a wide range of devices. Smart TVs with eye tracking have already been demonstrated, for example.
Possible applications for these systems have also been proposed in the automotive sector, like driver activity assistants that can monitor the driver’s eyes to detect signs of fatigue.
An eye-tracking function could also be used to detect the direction in which the driver is looking and determine whether they are paying attention to the road ahead or being distracted, helping to avoid critical situations on the road.
We are surrounded by so many complex electronic devices that new additional techniques are needed for intuitive interaction between man and machine. The combination of infrared illumination and camera sensors provides the basis for a wide range of interactive techniques in which a device can see its users and interpret their intentions.
Sourced from Dr. Christoph Goeltner, Product Manager, Osram Opto Semiconductors