
The Sundaytimes Sri Lanka

IBM: Computers will see, hear, taste, smell and touch in 5 years


By Pete Pachal, Mashable

Today’s PCs and smartphones can do a lot — from telling you the weather in Zimbabwe in milliseconds, to buying your morning coffee. But ask them to show you what a piece of fabric feels like, or to detect the odour of a great-smelling soup, and they’re lost.

That will change in the next five years, says IBM. Computers at that time will be much more aware of the world around them, and be able to understand it. The company’s annual “5 in 5” list, in which IBM predicts the five trends in computing that will arrive in five years’ time, reads exactly like a list of the five human senses — predicting computers with sight, hearing, taste, smell and touch.

The five senses are really all part of one grand concept: cognitive computing, which involves machines experiencing the world more like a human would. For example, a cognizant computer wouldn’t see a painting as merely a set of data points describing color, pigment and brush stroke; rather, it would truly see the object holistically as a painting, and understand what that means.

Cognitive advantages

“That’s a foundationally different way of thinking of computing,” Bernie Meyerson, IBM’s vice president of innovation, told Mashable in an interview. “You have to change how you think about absorbing data. You can’t just take a picture and file the picture. You have to treat the picture as an entity at a very high level, as opposed to just a bunch o’ bits.”

“(Cognitive computing) makes for some very interesting shifts in capability,” he adds. “That’s a rather profound sort of driver.”

One of the key differences between a cognizant computer and a traditional one is the idea of training. A cognitive system won’t just continue to give the same wrong or unhelpful answer; if it arrives at the wrong conclusion, it can change its approach and try again.

“In a cognitive machine, you set it up and run it, but it observes,” Meyerson says. “And that’s very different because it statistically calculates an end result. However, if that answer is incorrect and you tell it, it’ll actually re-weight those probabilities that led it to get the wrong answer and eventually get to the right answer.”
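
The feedback-driven re-weighting Meyerson describes can be pictured with a toy example. The sketch below is not IBM’s system; the FeedbackGuesser class, its candidate labels and the penalty factor are hypothetical names chosen only to illustrate how down-weighting a wrong answer and renormalising the remaining probabilities can eventually surface the right one.

```python
# Minimal sketch (assumed example, not IBM's implementation) of "training by
# feedback": the machine keeps weighted guesses, and when a human flags an
# answer as wrong, the weight behind it is reduced so a better answer wins.

def normalise(weights):
    # Rescale the weights so they sum to 1 and can be read as probabilities.
    total = sum(weights.values())
    return {label: w / total for label, w in weights.items()}

class FeedbackGuesser:
    def __init__(self, labels):
        # Start with equal belief in every candidate answer.
        self.weights = {label: 1.0 for label in labels}

    def answer(self):
        # Return the statistically most likely answer under current weights.
        probs = normalise(self.weights)
        return max(probs, key=probs.get)

    def correct(self, wrong_label, penalty=0.5):
        # Human feedback: the answer was wrong, so down-weight it and
        # renormalise, shifting probability toward the alternatives.
        self.weights[wrong_label] *= penalty
        self.weights = normalise(self.weights)

guesser = FeedbackGuesser(["soup", "stew", "broth"])
target = "broth"                  # pretend this is the answer the human wants
guess = guesser.answer()
while guess != target:
    guesser.correct(guess)        # "that answer is incorrect"
    guess = guesser.answer()
print(guess)                      # -> broth
```

Each correction re-weights only the probabilities that led to the wrong answer, so the system converges on the right one without being reprogrammed — the training-versus-programming shift the article refers to.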

Cognition does not equal intelligence

Attributing human senses to machines can’t help but conjure images of androids or self-aware computers capable of independent thought and action. Meyerson says there’s a massive chasm separating cognitive computing and true artificial intelligence.

“This is really an assistive technology,” he explains. “It can’t go off on its own. It’s not designed to do that. What it’s designed to do, in fact, is respond to a human in an assistive manner. But by providing a human-style of input, it’s freed us from the task of programming and moved to the task of training. It simply has — not more intelligence — but more bandwidth, and there’s a huge difference between the two.”

What’s your take on cognitive computing? Is IBM on to something with PCs that can taste, smell, touch, hear and see? How would you use the technology?



