
Sunday Times 2

A robot butler to tend to your every need


Developed at Cornell University using Kinect sensors, 3D cameras and a database of videos to work out what ‘owners’ want

A beer-pouring robot that can read your body movements and anticipate when you want another drink has been developed by American students. Researchers from Cornell University used Microsoft Kinect sensors and 3D cameras to help the robot analyse its surroundings and identify its owner’s needs.

The robot then draws on a database of 120 videos of household tasks to identify nearby objects, generate a set of possible outcomes and choose which action to take – without being told. As the action unfolds, the robot constantly updates and refines its predictions.

As well as fetching drinks for thirsty owners, the robot can work out when its owner is hungry and put food in a microwave, tidy up, make cereal, fetch a toothbrush and toothpaste, open fridge doors and more. Ashutosh Saxena, Cornell professor of computer science and co-author of a new study tied to the research, said: ‘We extract the general principles of how people behave.

‘Drinking coffee is a big activity, but there are several parts to it. The robot builds a “vocabulary” of such small parts that it can put together in various ways to recognise a variety of big activities.’ The robot was initially programmed to refill a person’s cup when it was nearly empty.

To do this the robot had to plan its movements in advance and then follow that plan. But if a human sitting at the table happened to raise the cup and drink from it, the robot was thrown off and could end up pouring the drink into a cup that was no longer there. After extra programming, the robot was updated so that when it sees a human reaching for the cup, it can anticipate the action and avoid making a mistake.

During tests, the robot made correct predictions 82 per cent of the time when looking one second into the future, 71 per cent of the time for three seconds and 57 per cent for 10 seconds.

© Daily Mail, London

