IBM: In Five Years Your Phone Will Touch, See, Hear, Taste, and Smell [Video]

Posted December 17, 2012 at 6:48pm by iClarified

IBM predicts that in 5 years, machines will emulate human senses, each in their own special way.

In the era of cognitive computing, systems learn instead of passively relying on programming. As a result, emerging technologies will continue to push the boundaries of human limitations to enhance and augment our senses with machine learning, artificial intelligence (AI), advanced speech recognition and more. No need to call for Superman when we have real super senses at hand.

Touch:
In the 1970s, when a telephone company encouraged us to "reach out and touch someone," it had no idea that a few decades later that could be more than a metaphor. Infrared and haptic technologies will enable a smartphone's touchscreen and vibration capabilities to simulate the physical sensation of touching something. So you could experience the silkiness of that catalog's Egyptian cotton sheets instead of just relying on some copywriter to convince you.

Sight:
Recognition systems can pinpoint a face in a crowd. In the future, computer vision might save a life by analyzing patterns to make sense of visuals in the context of big data. In industries as varied as healthcare, retail and agriculture, a system could gather information and detect anomalies specific to the task—such as spotting a tiny area of diseased tissue in an MRI and combining that finding with the patient's medical history for faster, more accurate diagnosis and treatment.
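IBM doesn't spell out how such anomaly detection would work, and real medical-imaging systems are far more sophisticated. As a loose illustration of the underlying idea, here is a toy z-score outlier check on a list of hypothetical tissue-density readings (all names, numbers and the threshold are invented for this sketch):

```python
import statistics

def flag_anomalies(readings, threshold=2.5):
    """Flag readings far from the baseline distribution (simple z-score test).
    A toy stand-in for the much richer pattern analysis real systems use."""
    mean = statistics.mean(readings)
    stdev = statistics.stdev(readings)
    return [i for i, r in enumerate(readings)
            if abs(r - mean) / stdev > threshold]

# Invented example: mostly uniform tissue readings with one outlier region.
scan = [101, 99, 100, 102, 98, 100, 101, 160, 99, 100]
print(flag_anomalies(scan))  # → [7], the region worth a closer look
```

The point is only the shape of the task: establish a statistical baseline, then surface the regions that deviate from it for a human expert to review.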

Hearing:
Before the tree fell in the forest, did anyone hear it? Sensors that pick up sound patterns and frequency changes will be able to predict weakness in a bridge before it buckles, the deeper meaning of your baby's cry or, yes, a tree breaking down internally before it falls. By analyzing verbal traits and incorporating multi-sensory information, machine hearing and speech recognition could even be sensitive enough to advance dialogue across languages and cultures.

Taste:
The challenge of providing food—whether it's for impoverished populations, people on restricted diets or picky kids—is in finding a way to meet both nutritional needs and personal preferences. In the works: a way to compute "perfect" meals using an algorithmic recipe of favorite flavors and optimal nutrition. No more need for substitute foods when you can have a personalized menu that satisfies both the calorie count and the palate.
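The "algorithmic recipe" idea boils down to a constrained choice: maximize flavor preference subject to a nutrition budget. A minimal sketch, with entirely invented dishes, scores and calorie figures:

```python
# Hypothetical data: each candidate dish has a flavor-preference score (0-1)
# and a calorie count. Names and numbers are invented for illustration.
dishes = [
    {"name": "lentil curry",   "flavor_match": 0.90, "calories": 550},
    {"name": "grilled salmon", "flavor_match": 0.70, "calories": 480},
    {"name": "mac and cheese", "flavor_match": 0.95, "calories": 900},
]

def plan_meal(dishes, calorie_limit):
    """Pick the best-liked dish that still fits the calorie budget."""
    allowed = [d for d in dishes if d["calories"] <= calorie_limit]
    if not allowed:
        return None
    return max(allowed, key=lambda d: d["flavor_match"])

print(plan_meal(dishes, calorie_limit=600)["name"])  # → lentil curry
```

Note how the best-liked dish overall (mac and cheese) loses to the best-liked dish *within the budget*; satisfying "both the calorie count and the palate" is exactly this kind of trade-off, just with far richer nutritional models in practice.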

Smell:
When you call a friend to say how you're doing, your phone will know the full story. Soon, sensors will detect and distinguish odors: a chemical, a biomarker, even molecules in the breath that affect personal health. The same smell technology, combined with deep learning systems, could troubleshoot operating-room hygiene, crops' soil conditions or a city's sanitation system before the human nose knows there's a problem.

Take a look at their video below...

Read More [via CultofMac]



Paresh Panchal - December 18, 2012 at 7:04am
Phone do every thing ? Thinking ? Even do the Sex ?
Lov3boyz - December 18, 2012 at 2:25am
They forget Thinking :)
nick - December 18, 2012 at 1:52am
5years is too long. Its all a matter of vibrations.
culua - December 17, 2012 at 7:39pm
This reminds me of those terminator movies...
ilaw - December 17, 2012 at 7:21pm
Careful IBM, Apple will "patent" the technology and say they invented it, then they will sue...