WILL ROBOTS EVER DO EXACTLY WHAT HUMANS DO?
The Berkeley Artificial Intelligence Research (BAIR) Lab has come up with its own take on the imitation game. Researchers at UC Berkeley have taught a robot to mimic human skills from a single demonstration, advancing machines another step toward global domination.
Learning a new skill by observing others is a “key part” of intelligence in humans and animals, the BAIR blog said.
"Such a capability would make it dramatically easier for us to communicate new goals to robots," according to undergrad Tianhe Yu and PhD student Chelsea Finn. "We could simply show robots what we want them to do, rather than teleoperating the robot or engineering a reward function."
Imitation learning of vision-based skills usually requires hundreds of expert demonstrations before a robot can complete a given task. "Our approach is to combine meta-learning with imitation learning to enable one-shot imitation learning," the researchers explained.
As described by Finn and Yu, the "core idea" is that, when provided with a single demonstration of a particular task, such as manipulating a certain object, the robot can quickly identify what the assignment is and solve it under different circumstances.
"If we want a physical robot to ... emulate humans and manipulate a variety of novel objects, we need to develop a new system that can learn to learn from demonstrations in the form of videos using a dataset that can be practically collected in the real world," the team said.
Soon robots might be able to use various tools or play assorted sports. In the meantime, you can see the robot in action in the video below.