Tag Archives: PR2

RoboHow: the Wikipedia that teaches robots how to cook

Developing robots that behave like us is one of the holy grails of modern robotics. And although recent advances in AI, human mimicry and automation have brought us closer than ever to that goal and have given machines a better sense of how to navigate their surroundings, there is still a lot to improve in the way they work and interact with humans.

Image via today

But fear not! A European initiative launched in 2012, dubbed RoboHow, has taken up the challenge by creating systems that help robots learn and share information with each other (even using actual language), mimicking human learning processes. The aim of the platform is to do away with pre-programming our machines to perform certain tasks, and to teach them how to put information together, use it, and remember it for the future – to “program” themselves.

The PR2 robot – backed by the RoboHow team – is an example of a machine designed to take advantage of this new approach.

The PR2 is engineered to process written instructions from websites like WikiHow and then physically perform the associated tasks. After being tested in a bartending setting, the robot has now moved on to the kitchen – specifically, to pancakes. That may seem like a small task, but it requires an intricate framework of prior knowledge of micro-tasks that humans take for granted — such as the amount of pressure required to open a container.
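To give a rough feel for what “processing written instructions” involves, here is a minimal, purely illustrative sketch of turning numbered, WikiHow-style steps into primitive actions a robot planner could consume. The step texts, the verb-to-action vocabulary, and the output format are all invented for illustration – RoboHow’s actual knowledge-processing pipeline is far more sophisticated.

```python
# Hypothetical sketch: mapping written instructions to robot primitives.
# The action vocabulary below is an assumption, not RoboHow's real schema.

RECIPE = """
1. Open the container of pancake mix.
2. Pour the mix into a bowl.
3. Flip the pancake after two minutes.
"""

# Verb -> primitive robot action (invented names for illustration)
ACTION_VOCAB = {
    "open": "grasp_and_twist",
    "pour": "tilt_container",
    "flip": "spatula_flip",
}

def parse_steps(text):
    """Split numbered instruction lines into (instruction, primitive) records."""
    steps = []
    for line in text.strip().splitlines():
        body = line.split(".", 1)[1].strip()  # drop the leading "N." numbering
        verb = body.split()[0].lower()
        steps.append({
            "instruction": body,
            "primitive": ACTION_VOCAB.get(verb, "unknown"),
        })
    return steps

for step in parse_steps(RECIPE):
    print(step["primitive"], "<-", step["instruction"])
```

Even in a toy like this, the hard part is visible: the verb lookup only works for words the robot already “knows”, which is exactly the gap a shared knowledge base is meant to fill.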

We should have all known the Robot Revolution started when we saw this pancake.
Image via pancakeoftheweek

Ideally, the PR2 would gain that knowledge through experimenting, use it in its environment, and communicate what it has learned to an online database called OpenEase. This would create an open, easily accessible repository of growing knowledge for any robot to tap into and learn from.

According to MIT Technology Review, the researchers are also considering techniques that would allow robots to learn by observing humans at work. One such approach would be studying virtual-reality data recorded while humans perform tasks wearing tracking gloves.

The ultimate goal would be creating a set of robots that could adapt to changing environments and instructions and react in an appropriate manner. The biggest barrier is translating the semantics of natural language into algorithms. Bridging that gap would be a giant step forward in developing robots that learn and grow like humans.

MIT tackling more serious science: they program beer-delivering robots

The Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory is on the brink of revolutionizing relaxation with its recent breakthrough: it has programmed two robots that can deliver beverages.

What’s yer poison?
Image via wikimedia

The robots, PR2 units, have coolers attached to them and are programmed to roam around separate rooms asking people if they want a drink. Should a person say yes, the silicon-powered bartender wheels over to a larger robot that places a beer in the cooler, then returns to the customer with the drink.

While the task of drink-fetching may seem small and underwhelming for a robot, programming a unit that can successfully perform it is an incredible leap forward in robotics. The study remarks that one advantage of testing a robot on bartending is that the environment allows the researchers to rapidly develop the program that drives the little PR2s.

“As autonomous personal robots come of age, we expect certain applications to be executed with a high degree of repeatability and robustness. In order to explore these applications and their challenges, we need tools and strategies that allow us to develop them rapidly. Serving drinks (i.e., locating, fetching, and delivering), is one such application with well-defined environments for operation, requirements for human interfacing, and metrics for successful completion,” the study reads.

And while the applications PR2 can currently be employed in are rather limited, the team behind it feels that specialization, rather than generalization, of the tasks to be performed is the way to go for robotic progress. As such, they advocate the creation of an “app-store” of sorts – a database of specific, useful robotic behaviors that can be run to perform specific tasks. One app would allow the robot to act as a butler, another to clean, or sew, or cook, and so on.
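The “app-store” idea can be sketched as a simple registry of named, specialized behaviors that a robot installs and looks up by name. The names and structure below are hypothetical illustrations of the concept, not the MIT team’s actual framework.

```python
# Minimal sketch of a behavior "app-store": each app does one task well,
# and a robot runs whichever behavior has been installed for a given name.
# All behavior names and functions here are invented for illustration.

BEHAVIORS = {}

def behavior(name):
    """Decorator that registers a function as an installable behavior."""
    def register(fn):
        BEHAVIORS[name] = fn
        return fn
    return register

@behavior("deliver_drink")
def deliver_drink(customer):
    return f"delivered a drink to {customer}"

@behavior("clean")
def clean(room):
    return f"cleaned the {room}"

def run(name, *args):
    """Look up and execute an installed behavior."""
    if name not in BEHAVIORS:
        raise KeyError(f"no behavior installed for {name!r}")
    return BEHAVIORS[name](*args)

print(run("deliver_drink", "the customer"))
```

The design choice mirrors the article’s point: rather than one general-purpose program, each behavior is a small, self-contained unit that can be added, swapped, or removed independently.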

“This view of encapsulating particular functionality is gaining acceptance across the research community and is an important component for the near and long term visions of endowing personal robots with abilities that will better assist people.”

It can also point astonishingly well.
Image via popsci

Even within the relatively well-constrained bounds of a specific “application”, endowing a personal robot with autonomous capability requires integrating many complex subsystems; most robots need some facility in perception, motion planning, reasoning, navigation, and grasping. Each of these subsystems is well-studied and validated individually, but their seamless coordination has so far proven a tricky prize for roboticists.

“Specific challenges integrators face include coping with multiple points of failure from complicated subsystems, computational constraints that may only be encountered when running large sets of components, and reduced performance from subsystems during interoperation.”

There is also the issue of how robots integrate and coordinate with each other. I’ll let Ariel Anders, one of the MIT scientists working on PR2, explain in this video:

The MIT robots are considered groundbreaking (and thankfully not glass-shattering), and I personally feel this is a great leap forward – I can’t wait to have a robot butler of my own. The technology shows great promise, and engineers hope to eventually use it as a basis for more crucial missions. The creators say they hope to one day use the robots at emergency shelters to take orders for bottles and crackers.

You can read the full abstract here.