Summer Science Exhibition 2006

Seeing through touch

The Royal Society, London, 6-9 Carlton House Terrace, London, SW1Y 5AG


The PHANTOM Omni haptic device. Like a 3D mouse, the user moves the pen; the device can resist this motion, allowing the user to ‘feel’ virtual objects.

Imagine trying to use your computer blindfolded. This might seem an impossible task, but for the two million visually impaired people in the UK it is an everyday reality. Currently available technologies can read text from the screen or convert it into Braille, but these systems are of little use for reading graphical information. 'The whole basis of graphs and charts is visual,' says David McGookin of the University of Glasgow. 'They are extremely useful for the sighted as they provide a visual overview of large amounts of data, but are of little use to visually impaired people.' David and colleagues in the Department of Computing Science at the University of Glasgow are developing new ways to navigate information displayed on computer screens using touch and sound.

Haptic devices are commercially available desktop systems that allow you to interact with your computer through the sensation of touch. The device consists of a pen, held by the user, attached to a robotic arm with motors in its base. By moving the pen, you move about the computer screen. When 'contact' is made with an object, information is sent back to the device and the motors resist the pen's motion, simulating the feeling of actually touching the object.
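The contact simulation described above can be sketched in a few lines. This is an illustrative model only, not the PHANTOM Omni's actual API: when the pen position penetrates a virtual surface, a spring force proportional to the penetration depth pushes back (Hooke's law), and the motors render that force. The surface position and stiffness value are hypothetical.

```python
WALL_POSITION = 0.0   # virtual surface at x = 0; free space is x > 0
STIFFNESS = 800.0     # spring constant in N/m, chosen for illustration

def contact_force(pen_x: float) -> float:
    """Resistive force (N) the motors should apply to the pen.

    Zero in free space; proportional to penetration depth once
    the pen crosses the virtual surface.
    """
    penetration = WALL_POSITION - pen_x
    if penetration <= 0:
        return 0.0                      # pen in free space: no resistance
    return STIFFNESS * penetration      # spring pushes back on the pen

# A pen 5 mm past the surface feels a 4 N restoring force.
print(contact_force(-0.005))  # 4.0
print(contact_force(0.01))    # 0.0 (free space)
```

A real haptic loop runs this calculation around a thousand times per second so the rendered surface feels stiff rather than spongy; the single function above just shows the core force law.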

These haptic devices, combined with 'tactile' bar charts and sounds, are providing the tools for visually impaired users to read and draw graphs. 'Bar charts on the computer are constructed with touch in mind,' explains David. 'The bars contain V-shaped grooves and the axes are cylindrical to distinguish them from the bars. Users basically feel their way around the bar chart.' To draw a bar chart, the user switches from browse to build mode with a single keystroke, which produces a confirmatory sound. Sounds are then used to indicate the height of the bar, with a note of a scale played as the user moves the bar up or down each unit on the y-axis. The system has been tested on blindfolded sighted users and visually impaired users, with good results. 'Over 90% of the graphs were constructed accurately,' says David.
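The sonification idea above can be sketched as a simple mapping from bar height to pitch. The choice of a C-major scale starting at middle C is an assumption for illustration, not the Glasgow group's actual design:

```python
# Hypothetical mapping: one note of the C-major scale per y-axis unit,
# starting at middle C. Dragging a bar up or down plays successive notes,
# so the user hears the bar's height change.
C_MAJOR_HZ = [261.63, 293.66, 329.63, 349.23, 392.00, 440.00, 493.88, 523.25]

def pitch_for_height(units: int) -> float:
    """Frequency (Hz) played when the bar reaches `units` on the y-axis."""
    if units < 1 or units > len(C_MAJOR_HZ):
        raise ValueError("height outside the sonified range")
    return C_MAJOR_HZ[units - 1]

def raise_bar(from_units: int, to_units: int) -> list[float]:
    """Pitches heard as the user drags a bar from one height to another."""
    step = 1 if to_units >= from_units else -1
    return [pitch_for_height(u)
            for u in range(from_units + step, to_units + step, step)]

# Dragging a bar from 0 up to 3 units plays an ascending scale fragment.
print(raise_bar(0, 3))  # [261.63, 293.66, 329.63]
```

Playing relative pitch rather than announcing numbers lets the user judge a bar's height quickly by ear, in the same way a sighted user judges it at a glance.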

By combining a mouse and a haptic device, David and his colleagues are also developing new ways for visually impaired people to navigate a computer screen. The two-handed system has been tested using computer mazes, with encouraging results. A haptic device in the dominant hand is used to move around the maze, with force feedback letting users know when they touch the walls and keeping them within the maze itself. In the other hand, a moving pad of pins mounted on a mouse provides directional information, indicating whether the user needs to move up, down, left or right. Future research will look at ways of providing more information, such as the time to the next change in direction. 'This study used a virtual maze, but the techniques have wider applications,' explains Andrew Crossan, also of the University of Glasgow. 'We plan to add sound to the system to provide an even more sophisticated way to browse and navigate data non-visually.'
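The directional cue described above can be sketched as a small function. This is an assumed design for illustration, not the Glasgow system's actual logic: given the user's current cell in a grid maze and the next cell on the solution path, choose which direction the pin pad should present.

```python
def direction_cue(current: tuple[int, int], target: tuple[int, int]) -> str:
    """Return the cue ('up', 'down', 'left', 'right' or 'stay') for the
    tactile pin pad.

    Coordinates are (column, row) grid cells, with row increasing
    downwards as on a screen. Horizontal moves are checked first, a
    hypothetical tie-breaking choice.
    """
    cx, cy = current
    tx, ty = target
    if tx > cx:
        return "right"
    if tx < cx:
        return "left"
    if ty > cy:
        return "down"
    if ty < cy:
        return "up"
    return "stay"   # already at the target cell

print(direction_cue((2, 2), (3, 2)))  # right
print(direction_cue((2, 2), (2, 1)))  # up
```

In a full system this cue would update continuously as the user moves, and could later be augmented with sound, as the researchers plan, to convey extra information such as distance to the next turn.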
