Robot chef!

New Scientist is living in The Jetsons:

“If you want to interpret and understand everyday activities using vision data, it’s very complicated, error-prone, and resource intensive,” says Michael Beetz, who led the research. “If you do it with RFID tags, there is very little sensor information, but it’s highly correlated with the activities you are performing.”

As a result, the robot knows where everything is, and it can learn simple tasks just by watching how those objects move around.

“Setting the table is very easily recognised from cups and plates disappearing from the cupboard and appearing on the table, and cleaning up later is characterised by the same objects disappearing from the table and appearing in the dishwasher,” Beetz says.
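That "disappearing here, appearing there" idea is simple enough to sketch in a few lines. Here's a toy illustration in Python (the patterns, names, and voting scheme are my own invention, not the researchers' actual system):

```python
# Toy sketch: recognise an activity from RFID-style object-location events.
# Each event says an object vanished from one place and showed up in another;
# we guess the activity by matching those transitions against simple patterns.

from collections import Counter

# Hypothetical activity patterns: (from_location, to_location) -> activity name.
ACTIVITY_PATTERNS = {
    ("cupboard", "table"): "setting the table",
    ("table", "dishwasher"): "cleaning up",
}

def recognise_activity(events):
    """Guess the activity from a list of (object, from_loc, to_loc) events."""
    votes = Counter()
    for obj, from_loc, to_loc in events:
        activity = ACTIVITY_PATTERNS.get((from_loc, to_loc))
        if activity:
            votes[activity] += 1
    if not votes:
        return "unknown activity"
    # The activity supported by the most object movements wins.
    activity, _ = votes.most_common(1)[0]
    return activity

if __name__ == "__main__":
    observed = [
        ("cup", "cupboard", "table"),
        ("plate", "cupboard", "table"),
        ("fork", "drawer", "table"),  # no matching pattern, simply ignored
    ]
    print(recognise_activity(observed))  # -> "setting the table"
```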

Researchers at the Technical University of Munich are also looking at ways for the robot to connect to the internet and look things up for itself. "Oh," it'll say to itself. "That's what a coffee mug looks like."

Video at the link.