Archive for December 18th, 2012
IBM released its annual “5 in 5” list yesterday, the seventh consecutive year in which IBM scientists have identified a list of innovations that have the potential to change the way people work, live and interact during the next five years.
The IBM 5 in 5 is based on market and societal trends, as well as emerging technologies from IBM’s R&D labs around the world. This year, the 5 explores innovations that will be underpinnings of the next era of computing, what IBM has described as “the era of cognitive systems.”
This next generation of machines will learn, adapt, sense, and begin to experience the world as it really is, and this year’s predictions focus on one element of this new era: the ability of computers to mimic the human senses — in their own manner, to see, smell, touch, taste and hear.
But before you try and spoon-feed your iPad some vanilla yogurt, let’s get more practical.
These new sensing capabilities will help us become more aware, productive, and help us think — but not do our thinking for us.
Rather, cognitive systems will help us see through and navigate complexity, keep up with the speed of information, make more informed decisions, improve our health and standard of living, and break down all kinds of barriers — geographical, language, cost, even accessibility.
Now, on to our five senses.
1) Touch: You will be able to touch through your phone. Imagine using your smartphone to shop for your wedding dress and being able to feel the satin or silk of the gown, or the lace on the veil, from the surface of the screen. Or to feel the beading and weave of a blanket made by a local artisan halfway around the world. In five years, industries like retail will be transformed by the ability to “touch” a product through your mobile device.
IBM scientists are developing applications for retail, healthcare and other sectors that use haptic, infrared and pressure-sensitive technologies to simulate touch — such as the texture and weave of a fabric — as a shopper brushes her finger over the image of the item on a device screen. Utilizing the vibration capabilities of the phone, every object will have a unique set of vibration patterns that represents the touch experience: short, fast patterns, or longer and stronger strings of vibrations. The vibration pattern will differentiate silk from linen or cotton, helping simulate the physical sensation of actually touching the material.
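To make the idea concrete, here is a minimal sketch of what such a fabric-to-vibration mapping might look like. Everything here — the fabric names, the pattern values, the default — is an illustrative assumption, not IBM’s actual design:

```python
# Hypothetical mapping from fabric types to vibration patterns,
# expressed as (on_ms, off_ms) pulse pairs. All values are invented
# for illustration; a real haptic system would tune these empirically.

VIBRATION_PATTERNS = {
    "silk":   [(20, 80)] * 3,        # short, fast pulses for a smooth feel
    "linen":  [(60, 40), (60, 40)],  # coarser, more insistent buzzes
    "cotton": [(40, 60)] * 2,        # something in between
}

def pattern_for(fabric: str) -> list:
    """Return the on/off vibration pattern for a fabric, or a neutral default."""
    return VIBRATION_PATTERNS.get(fabric.lower(), [(30, 70)])
```

A phone’s haptic engine would then play the returned pulse train as the shopper’s finger crosses the on-screen fabric.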
2) Sight: A pixel will be worth a thousand words. We take some 500 billion photos a year, and 72 hours of video is uploaded to YouTube every minute. But computers today only understand pictures by the text we use to tag or title them; the majority of the information — the actual content of the image — is a mystery.
In the next five years, systems will not only be able to look at and recognize the contents of images and visual data, they will turn the pixels into meaning, making sense of it much the way a human views and interprets a photograph. In the future, “brain-like” capabilities will let computers analyze features such as color, texture patterns or edge information and extract insights from visual media, with a potentially huge impact on industries ranging from healthcare to retail to agriculture.
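The edge information mentioned above is one of the simplest such features. As a toy illustration — not IBM’s actual method — here is how a system might compute a horizontal-edge map from a grayscale image represented as a 2D list of brightness values:

```python
# Toy sketch: find horizontal edges by taking the absolute brightness
# difference between neighboring pixels in each row. Real vision systems
# use far richer features, but the principle starts here.

def edge_map(image):
    """Absolute horizontal brightness gradient at each pixel (0-255 input)."""
    return [
        [abs(row[x + 1] - row[x]) for x in range(len(row) - 1)]
        for row in image
    ]

image = [
    [0, 0, 255, 255],  # each row has a sharp edge in the middle
    [0, 0, 255, 255],
]
# edge_map(image) lights up only the column where brightness jumps
```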
But please, no Escher drawings, at least for now…that’s just plain mean.
3) Hearing: Computers will hear what matters. Ever wish you could make sense of all the sounds around you and be able to understand what’s not being said? Within five years, distributed systems of clever sensors will detect elements of sound such as sound pressure, vibrations and sound waves at different frequencies.
These systems will interpret such inputs to predict when trees will fall in a forest or when a landslide is imminent. Such a system will “listen” to our surroundings and measure movements, or the stress in a material, to warn us if danger lies ahead.
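One simple way such a warning could work is anomaly detection against a baseline. The sketch below is a made-up illustration of the idea, with invented readings and an arbitrary threshold factor:

```python
# Hedged sketch: flag a new vibration reading that exceeds the baseline
# mean by some factor. The readings and the factor of 3 are illustrative
# placeholders, not values from any real sensor deployment.

from statistics import mean

def is_anomalous(readings, new_value, factor=3.0):
    """True if new_value exceeds the baseline mean by more than `factor`x."""
    baseline = mean(readings)
    return new_value > baseline * factor

quiet_forest = [0.8, 1.1, 0.9, 1.0, 1.2]  # normal vibration levels
```

A reading of 1.3 against this baseline would pass quietly; a reading of 9.0 would trip the alert.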
I’m ever hopeful such systems will be able to “listen” to my golf swing and help me course correct so I can play more target golf!
4) Taste: Digital taste buds will help you to eat smarter. What if we could make healthy foods taste delicious using a different kind of computing system built for creativity? IBM researchers are developing a computing system that actually experiences flavor and will work alongside chefs to create the tastiest, most novel recipes. It will break down ingredients to their molecular level and blend the chemistry of food compounds with the psychology behind what flavors and smells humans prefer.
By comparing this with millions of recipes, the system will be able to create new flavor combinations that pair, for example, roasted chestnuts with other foods such as cooked beetroot, fresh caviar, and dry-cured ham.
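A toy way to picture the underlying idea is the “food pairing” heuristic: score ingredient pairs by how many flavor compounds they share. The compound lists below are invented for the example, not real chemistry data, and this is not IBM’s actual algorithm:

```python
# Illustrative compound-based pairing: ingredients that share more
# flavor compounds score higher as candidate combinations.
# Compound sets are fabricated placeholders for demonstration only.

FLAVOR_COMPOUNDS = {
    "roasted chestnut": {"furaneol", "pyrazine", "maltol"},
    "cooked beetroot":  {"geosmin", "pyrazine", "furaneol"},
    "fresh caviar":     {"trimethylamine", "pyrazine"},
}

def pairing_score(a, b):
    """Number of flavor compounds two ingredients have in common."""
    return len(FLAVOR_COMPOUNDS[a] & FLAVOR_COMPOUNDS[b])
```

Under these made-up data, chestnut pairs more strongly with beetroot than with caviar — the kind of ranking a recipe-generating system could build on.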
“Top Tasting Computer Chefs,” anyone?
5) Smell: Computers will have a sense of smell. During the next five years, tiny sensors embedded in your computer or cell phone will detect if you’re coming down with a cold or other illness. By analyzing odors, biomarkers and thousands of molecules in someone’s breath, doctors will have help diagnosing and monitoring the onset of ailments such as liver and kidney disorders, asthma, diabetes, and epilepsy by detecting which odors are normal and which are not.
Already, IBM scientists are sensing environmental conditions to preserve works of art, and this innovation is starting to be applied to clinical hygiene, one of the biggest healthcare challenges today. In the next five years, IBM technology will “smell” surfaces for disinfectants to determine whether rooms have been sanitized. Using novel wireless mesh networks, sensors will gather and measure data on various chemicals, continuously learning and adapting to new smells over time.
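As a rough sketch of how a sample might be matched against known odors, consider a nearest-profile comparison over chemical concentrations. The chemical names, concentrations and profile labels below are all hypothetical placeholders:

```python
# Hypothetical odor classifier: pick the known profile whose chemical
# concentrations are closest (by summed absolute difference) to the
# sample's sensor readings. Profiles and values are invented.

ODOR_PROFILES = {
    "sanitized room": {"ethanol": 0.8, "acetone": 0.1},
    "unsanitized":    {"ethanol": 0.1, "acetone": 0.2},
}

def classify(sample):
    """Return the name of the closest-matching odor profile."""
    def distance(profile):
        return sum(abs(sample.get(k, 0.0) - v) for k, v in profile.items())
    return min(ODOR_PROFILES, key=lambda name: distance(ODOR_PROFILES[name]))
```

A reading heavy in ethanol would classify as a sanitized room; a real system would of course learn these profiles from data rather than hard-code them.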
Watch the video below to listen to IBM scientists describe some of these new innovations and their potential impact on our world.