BBC Future

The end for keyboards and mice?

  • Tap and go
    Touchscreen technology, popularised by Apple’s iPhone and a host of other devices, has begun to erode the dominance of traditional keyboards. (Copyright: Getty Images)
  • Look ahead
    But there are a host of other devices on the horizon, including Google Glass – an ‘augmented reality’ headset that overlays the world with graphics. (Copyright: Getty Images)
  • Hand control
    Other systems that augment the real world include Sixth Sense, a prototype developed at the Massachusetts Institute of Technology (MIT) that responds to gestures. (Copyright: SPL)
  • Leap forward
    The idea of tracking gestures is also at the heart of the Leap Motion, which works in a similar way to Microsoft’s Kinect system and will go on sale soon. (Copyright: Leap Motion)
  • Mind control
    Other futuristic interfaces are already on the market, including a number of headsets that interpret the brain's electrical activity to control a cursor on screen. (Copyright: SPL)
  • Point and click
    More complex brain interfaces are used to help paralysed patients control computers, allowing them to type by highlighting letters on a grid, for example. (Copyright: G-Tec)
  • Floating future
    Other research explores 'tangible computing' - using physical objects to control on-screen action. MIT’s ZeroN uses a ball suspended in a magnetic field. (Copyright: Jinha Lee/MIT)


Apple's iPhone and its rivals may have introduced touchscreens to the masses, but now a raft of technologies promise to change the way we interact with computers forever.

You're in fast-moving traffic on a busy motorway approaching a complicated junction with just seconds to get into the right lane. Your phone, sensing that now is not the moment to disturb you, diverts an incoming call straight to voicemail. Later, when you are in a more relaxed state, it plays the message back and offers to return the call.

Even if you are packing an iPhone 5 or the latest Samsung, it is fair to say that your phone is still a long way from doing this. Despite the impressive array of features offered by today's handsets – including voice commands – most people still interact with their phones by pressing buttons, prodding a screen, or with the occasional swipe or pinch.

It is a similar story with computers. Take Microsoft’s new Windows 8 operating system, due to be launched later this week. Its colourful, tile-laden start screen may look startlingly different to older versions of Windows, but beneath the eye candy it's still heavily reliant on the keyboard and mouse.

In fact, with one or two notable exceptions, it is striking just how little the way we interact with computers has changed in the last few decades.

"The keyboard and mouse are certainly a hard act to follow," says George Fitzmaurice, head of user interface research for US software maker Autodesk. But, despite an apparent lack of apparent novelty in the majority of interfaces of today's mass market devices, there are plenty of ideas in the pipeline.

Take, for instance, technology that can monitor your stress levels. One technique being developed is functional near-infrared spectroscopy (fNIRS), which monitors oxygen levels in the blood, and therefore activity, in the front of the brain. "It measures short term memory workload, which is a rough estimate of how 'busy' you are," says Robert Jacob, professor of computer science at Tufts University, near Boston, Massachusetts.

The technology is currently used in medical settings, but it could one day help filter phone calls, for example. Today fNIRS works via a sensor placed on the forehead, but it could also be built into baseball caps or headbands, allowing the wearer to accept only important calls. "You could tell your phone only to accept calls from your wife if you get busy beyond a certain gradation of brain activity," adds Jacob. Perhaps more immediately, the same measure could help organisations assign workloads efficiently. "If a machine can tell how busy you are it can tell if you can take on an additional workload, or it could take away some of your work and assign it to someone else if you are over-stretched."
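
In outline, the filtering logic Jacob describes is straightforward. The sketch below is purely illustrative: the read_workload() function is a hypothetical stand-in for whatever an fNIRS headset actually reports, and the threshold and caller list are made up for the example.

```python
# A minimal sketch of brain-activity-based call filtering, assuming a
# hypothetical read_workload() that returns a 0.0-1.0 estimate of mental
# workload. Names and thresholds are illustrative, not any real API.

BUSY_THRESHOLD = 0.7            # above this, only priority callers get through
PRIORITY_CALLERS = {"Wife", "Boss"}

def read_workload() -> float:
    """Placeholder for a sensor reading of prefrontal-cortex activity."""
    raise NotImplementedError("depends on the headset's own software")

def handle_incoming_call(caller: str) -> str:
    """Decide whether to ring through or divert to voicemail."""
    workload = read_workload()
    if workload > BUSY_THRESHOLD and caller not in PRIORITY_CALLERS:
        return "voicemail"      # user is busy: divert and replay later
    return "ring"               # user has spare capacity: put the call through
```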

Other forms of "brain-computer interface" are already being used and developed for a growing number of applications. Electroencephalography (EEG) picks up electrical signals generated by brain cell interactions. It has long been used to diagnose comas, epilepsy and brain death in hospitals and in neuroscience research. The variation of frequencies of signals generated can be used to determine different emotions and other brain states. Recent years have seen the launch of simplified EEG headsets that sell for as little as $100.

For example, a British company called Myndplay makes interactive short films, games and sports-training software that users control via these brainwave-measuring headsets. Those who can successfully focus their minds or mentally relax sufficiently when required can influence film plots and progress to higher levels in games.

Similar technologies are increasingly being used to help the disabled. Two years ago an Austrian company called Guger Technologies released a system designed to help paralysed patients type: letters on a grid are highlighted one by one until the EEG response associated with the desired letter is detected, at which point that letter is selected. Spanish researchers have developed EEG-controlled wheelchairs and are working on using the same method to control prosthetic arms.
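
The grid speller's logic is, in outline, a simple loop. In the sketch below, detect_response() is a hypothetical stand-in for the EEG classifier that recognises the brain's reaction when the wanted letter lights up; no real vendor's interface is shown.

```python
# A heavily simplified sketch of a grid speller: letters are highlighted
# one at a time, and a hypothetical detect_response() callback represents
# the EEG classifier that fires when the desired letter is highlighted.
import string

GRID = list(string.ascii_uppercase + " ")   # letters presented one by one

def detect_response(letter: str) -> bool:
    """Placeholder for the EEG classifier's yes/no decision."""
    raise NotImplementedError("depends on the brain-computer interface used")

def spell_one_letter() -> str:
    """Cycle through the grid until the classifier fires for a letter."""
    while True:
        for letter in GRID:
            # highlight `letter` on screen here, then read the EEG response
            if detect_response(letter):
                return letter
```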

