IBM recently released a list of innovations that have the potential to change the way people work, live and interact in the next five years. The list, based on market and societal trends and emerging technologies from IBM's R&D labs, says using the five senses - touch, sight, hearing, taste and smell - to improve interaction with machines is the next big thing in computing.
"We have already seen the benefits of cognitive systems for advancing numerous aspects of the human experience, from healthcare to weather forecasting. We envision a day when computers make sense of the world around them just like a human brain interacts with the world using multiple senses," says Ramesh Gopinath, Director, India Research Lab and Chief Technology Officer, IBM India/South Asia.
We take a look at what IBM and others are doing to bring cognitive senses to your computer or phone. Almost all of these technologies are at advanced stages of testing.
So, while IBM says they will be available for consumers by 2017, don't be surprised if some of these are available in devices that will be up for sale in the next couple of years. Here is how the senses are finding their way into our digital lives:
By the end of this decade, computers will not only be able to recognise the content of visual data (say, a picture), but will also make sense of the pixels much as a human sees and interprets a photograph. For example, future computers will know that a red light means stop and will be able to interpret road signage.
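To make the idea concrete, here is a deliberately tiny, invented sketch of what "mapping pixels to meaning" looks like at its simplest: classifying a traffic light by whichever colour channel dominates its pixels. Real vision systems learn such mappings from millions of examples rather than hand-written rules.

```python
def light_meaning(pixels):
    """pixels: iterable of (r, g, b) tuples; return the signal's meaning."""
    # Sum each colour channel across all pixels.
    totals = [sum(channel) for channel in zip(*pixels)]
    dominant = totals.index(max(totals))  # 0 = red, 1 = green, 2 = blue
    return {0: "stop", 1: "go"}.get(dominant, "unknown")

red_blob = [(220, 30, 25)] * 100   # a patch of mostly-red pixels
print(light_meaning(red_blob))     # -> stop
```

A rule this crude breaks the moment lighting or background changes, which is exactly why the systems IBM describes infer meaning from context rather than raw colour counts.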
A precursor to this is the Google Goggles app, which recognises products from photographs and gives you information on them. IBM says these capabilities will be put to work in healthcare within five years to analyse massive volumes of medical information. For instance, computers will be able to differentiate healthy tissue from diseased tissue.
Another use could be the use of cameras as 'body scanners', to identify which outfit would fit a person. The apparel industry is already experimenting with this technology with an eye on how it could be used to bring in more online buyers.
Scientists have been trying to replicate touch and feel on mechanical devices for decades. Success is very close. We could have mobile devices that allow you to touch and feel products, thus redefining retail businesses. IBM says its scientists are developing applications for retail and healthcare sectors, among others, using haptic (tactile feedback), infrared and pressure-sensitive technologies to simulate touch, such as the texture and weave of a fabric. This technology will use vibration capabilities of a phone, assigning a unique set of vibration patterns that recreate the texture of each article.
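The texture-to-vibration idea above can be sketched in a few lines. Everything here is hypothetical (the fabric names, pulse values and the assumed phone API that plays `(duration_ms, intensity)` pulse sequences are all invented for illustration): coarse weaves get short, strong pulses; smooth fabrics get long, gentle ones.

```python
# Each texture is approximated by a repeating pulse pattern of
# (duration_ms, intensity) pairs -- values are invented for illustration.
TEXTURE_PATTERNS = {
    "denim":    [(40, 0.9), (20, 0.0)] * 4,   # coarse: strong, choppy pulses
    "silk":     [(120, 0.2), (80, 0.1)] * 2,  # smooth: long, gentle pulses
    "corduroy": [(30, 0.8), (30, 0.3)] * 5,   # ridged: alternating strengths
}

def vibration_pattern(fabric):
    """Return the pulse sequence a phone's motor would play for a fabric."""
    return TEXTURE_PATTERNS.get(fabric, [(100, 0.5)])  # default: plain buzz

print(vibration_pattern("denim")[:2])  # first two pulses of the denim pattern
```

On a real handset the pattern would be handed to the platform's vibration API (for example, a timings-and-amplitudes waveform on Android) rather than printed.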
In the next five years, sensors in your computer or phone will be able to detect if you are coming down with a cold. By analysing odours, biomarkers and various molecules in someone's breath, doctors will have help diagnosing and monitoring the onset of ailments, even ones such as liver and kidney disorders, asthma, diabetes and epilepsy.
Companies such as DigiScents and TriSenx are developing devices that recreate smells when connected to a computer. Imagine being able to smell the food you're seeing on screen. DigiScents has indexed thousands of smells based on chemical structure and place on the scent spectrum, before digitising them into a file that can be embedded in web content.
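One way to picture "indexing smells by chemical structure and place on the scent spectrum" is as vectors of chemical-family features, with an unknown sample matched to its nearest indexed scent. The sketch below is a toy with invented data and feature names, not DigiScents' actual scheme.

```python
import math

# Invented feature order: (citrus, floral, woody, sulfurous)
SCENT_INDEX = {
    "orange": (0.9, 0.1, 0.0, 0.0),
    "rose":   (0.1, 0.9, 0.1, 0.0),
    "cedar":  (0.0, 0.1, 0.9, 0.0),
}

def closest_scent(sample):
    """Return the indexed scent nearest (Euclidean) to the sample vector."""
    return min(SCENT_INDEX, key=lambda name: math.dist(sample, SCENT_INDEX[name]))

print(closest_scent((0.8, 0.2, 0.1, 0.0)))  # -> orange
```

A vector like this is exactly the kind of compact digital file that could be embedded in web content and rendered by a scent-emitting device at the other end.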
Meanwhile, IBM is developing technology to 'smell' surfaces for disinfectants to determine if rooms have been sanitised. Using novel wireless 'mesh' networks, data on various chemicals will be gathered by sensors. The technology can also learn and adapt to new smells. IBM is already using such technology to analyse environmental conditions and gases to preserve works of art.
IBM is also developing a computing system that experiences flavour. It works by breaking down ingredients to their molecular level, then combining that with the psychology of which flavours and smells humans prefer. By comparing millions of recipes, the system will be able to create new flavour combinations.
Most of these computers use algorithms to determine the precise chemical structure of food and why people like certain tastes. For example, at the University of Tsukuba in Japan, researchers are working on a food simulator that can mimic a food item's taste and 'mouthfeel' (the sensation created by food or drink in the mouth, a term used in food testing and evaluation).
These algorithms have been written to examine how various chemicals interact with each other, the molecular complexity of flavour compounds and their bonding structure. The system then uses that information, together with models of perception, to predict the appeal of any particular flavour.
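A common rule of thumb in computational flavour pairing holds that ingredients sharing aroma compounds tend to go well together. The toy sketch below scores a combination by counting shared compounds; the compound lists are illustrative stand-ins, far simpler than the molecular analysis described above.

```python
# Simplified aroma-compound sets per ingredient (illustrative only).
COMPOUNDS = {
    "chocolate":  {"pyrazine", "vanillin", "furaneol"},
    "strawberry": {"furaneol", "linalool"},
    "coffee":     {"pyrazine", "furaneol", "guaiacol"},
    "cucumber":   {"nonadienal"},
}

def pairing_score(a, b):
    """Number of aroma compounds the two ingredients share."""
    return len(COMPOUNDS[a] & COMPOUNDS[b])

print(pairing_score("chocolate", "coffee"))    # shares pyrazine, furaneol -> 2
print(pairing_score("chocolate", "cucumber"))  # no shared compounds -> 0
```

A system comparing millions of recipes would weight such scores with learned models of human preference, rather than treating raw compound overlap as the whole story.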
When computers start hearing, a distributed system of clever sensors will detect elements of sound - sound pressure, vibrations and waves at different frequencies - and interpret the information. A computer could then predict when a landslide might occur, gauge the mood of a speaker, or analyse whether he or she is lying. Such systems will analyse pitch, tone, hesitancy and other aspects of a conversation to enable more productive dialogues, smoother interaction across cultures and better customer call-centre interactions. Scientists are already studying underwater noise levels to understand the impact of 'wave energy conversion machines' on sea life.
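Two of the simplest acoustic features such a listening system might extract can be computed with no special libraries: loudness as RMS energy, and a crude pitch estimate from the zero-crossing rate. The sketch below runs on a synthetic 440 Hz tone; real systems use far richer features (spectrograms, learned embeddings) than this.

```python
import math

RATE = 8000  # samples per second (assumed for this toy example)
# One second of a pure 440 Hz sine tone.
signal = [math.sin(2 * math.pi * 440 * t / RATE) for t in range(RATE)]

# Loudness: root-mean-square energy (a pure sine gives ~0.71).
rms = math.sqrt(sum(s * s for s in signal) / len(signal))

# Crude pitch: count positive-going zero crossings per second (~440 here).
pitch_hz = sum(1 for a, b in zip(signal, signal[1:]) if a < 0 <= b)

print(round(rms, 2), pitch_hz)
```

Features like these feed the higher-level judgments the article describes: sustained shifts in pitch and energy are raw material for inferring a speaker's mood or hesitancy.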