IBM: In Five Years Your Phone Will Touch, See, Hear, Taste, and Smell [Video]

Posted December 17, 2012 at 6:48pm by iClarified
IBM predicts that in 5 years, machines will emulate human senses, each in their own special way.

In the era of cognitive computing, systems learn instead of passively relying on programming. As a result, emerging technologies will continue to push the boundaries of human limitations to enhance and augment our senses with machine learning, artificial intelligence (AI), advanced speech recognition and more. No need to call for Superman when we have real super senses at hand.

Touch:
In the 1970s, when a telephone company encouraged us to "reach out and touch someone," it had no idea that a few decades later that could be more than a metaphor. Infrared and haptic technologies will enable a smartphone's touchscreen and vibration capabilities to simulate the physical sensation of touching something. So you could experience the silkiness of that catalog's Egyptian cotton sheets instead of just relying on some copywriter to convince you.

Sight:
Recognition systems can pinpoint a face in a crowd. In the future, computer vision might save a life by analyzing patterns to make sense of visuals in the context of big data. In industries as varied as healthcare, retail and agriculture, a system could gather information and detect anomalies specific to the task—such as spotting a tiny area of diseased tissue in an MRI and correlating it with the patient's medical history for faster, more accurate diagnosis and treatment.

Hearing:
Before the tree fell in the forest, did anyone hear it? Sensors that pick up sound patterns and frequency changes will be able to predict weakness in a bridge before it buckles, the deeper meaning of your baby's cry or, yes, a tree breaking down internally before it falls. By analyzing vocal traits and combining them with other sensory information, machine hearing and speech recognition could even be sensitive enough to advance dialogue across languages and cultures.

Taste:
The challenge of providing food—whether it's for impoverished populations, people on restricted diets or picky kids—is in finding a way to meet both nutritional needs and personal preferences. In the works: a way to compute "perfect" meals using an algorithmic recipe of favorite flavors and optimal nutrition. No more need for substitute foods when you can have a personalized menu that satisfies both the calorie count and the palate.

Smell:
When you call a friend to say how you're doing, your phone will know the full story. Soon, sensors will detect and distinguish odors: a chemical, a biomarker, even molecules in the breath that affect personal health. The same smell technology, combined with deep learning systems, could troubleshoot operating-room hygiene, crops' soil conditions or a city's sanitation system before the human nose knows there's a problem.

Take a look at their video below...

[via CultofMac]