Monday, May 20, 2019

Why Facebook is teaching a bug-like robot to walk - CNN

Daisy, which looks like a giant bug, is part of a robot science project inside Facebook's Artificial Intelligence Research (FAIR) group. Since last summer, scientists at FAIR have been helping robots teach themselves how to walk and grasp objects. The goal is for them to learn in a way that's similar to how people gain these skills: by exploring the world around them and using trial and error.
Facebook is training this robot, named Daisy, how to walk to help advance AI research.
Many people may not know the world's largest social network is also tinkering with robots. But this group's work isn't meant to show up, say, in your Facebook news feed. Rather, the hope is that the project will help AI researchers build artificial intelligence that can learn more independently. They also want robots to learn with less of the data that humans typically have to gather before AI can accomplish tasks.
In theory, this work could eventually help improve the kinds of AI activities that many tech companies (including Facebook) are working on, such as translating words from one language to another, or recognizing people and objects in images.
In addition to Daisy, Facebook's researchers are working with robots that consist of multi-jointed arms, and robot hands that have touch sensors on their fingertips. They're using a machine-learning technique called self-supervised learning, in which the robots must figure out how to do things — such as pick up a rubber duck — by attempting the task repeatedly, then using data from sensors (such as the tactile sensors on a robot's fingers) to get better and better.
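To make the idea concrete, here is a rough, hypothetical sketch of such a self-supervised loop in Python. None of these names come from Facebook's code; attempt_grasp and the sensor readings are stand-ins, and the real systems learn far richer models.

    import random

    class GraspModel:
        """Toy stand-in for a learned grasp model (hypothetical)."""
        def __init__(self):
            self.experience = []  # (action, tactile reading, success) triples

        def propose_action(self):
            # Early on, explore randomly; later, bias toward actions that worked.
            successes = [a for a, _, ok in self.experience if ok]
            if successes and random.random() > 0.3:
                base = random.choice(successes)
                return [x + random.gauss(0, 0.02) for x in base]  # small variation
            return [random.uniform(-1, 1) for _ in range(3)]      # pure exploration

        def update(self, action, tactile, success):
            # The "self-supervision": the label comes from the robot's own
            # sensors, not from a human annotator.
            self.experience.append((action, tactile, success))

    def attempt_grasp(action):
        """Placeholder for commanding the arm and reading fingertip sensors."""
        tactile = [random.random() for _ in range(4)]  # fake tactile readings
        success = sum(tactile) > 2.0                   # fake success signal
        return tactile, success

    model = GraspModel()
    for trial in range(200):
        action = model.propose_action()
        tactile, success = attempt_grasp(action)
        model.update(action, tactile, success)

The point is the shape of the loop: the robot supplies its own training labels by acting and then sensing the result, so no human has to annotate each attempt.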
The research is still in an early stage: Franziska Meier, a research scientist at FAIR, said the robots are just starting to reach out for objects but haven't yet figured out how to pick them up. Like babies, who must first learn to use their muscles and limbs before they can move, let alone push up to stand, the robots have to go through that same discovery process.
Why force a robot to figure out these kinds of tasks?
The robot needs to understand what the consequences of its actions are, Meier told CNN Business.
"As humans we are able to learn this, but we need to be able to teach a robot how to learn that," she said.
Additionally, she said, researchers were surprised to find that letting robots explore while figuring things out for themselves can speed up the learning process.
Some of the robots at Facebook that are teaching themselves to move and to grasp objects.
Daisy was operating in a demo mode when I saw it on a cloudy day last week, but it is learning how to walk via self-supervised learning. The six-legged robot, which the researchers bought, was chosen for its stability, research scientist Roberto Calandra said. It started out knowing nothing about the ground it was meant to walk on (which included smooth halls inside Facebook as well as other surfaces). Over time, it is learning how to move forward, using sensors on its legs to take into account things such as balance and how it is positioned.
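As a very rough illustration of that kind of walking loop (not FAIR's actual software), a trial-and-error search over gait parameters might look like the following Python sketch, where execute_gait stands in for a real walking trial and the leg-sensor readout:

    import random

    def execute_gait(params):
        """Placeholder: run one walking trial with these gait parameters and
        return what the leg sensors report (forward progress, tilt)."""
        forward = sum(params) * random.uniform(0.8, 1.2)  # fake odometry
        tilt = abs(random.gauss(0, 0.1))                  # fake balance reading
        return forward, tilt

    def score(forward, tilt):
        # Reward moving forward, penalize losing balance.
        return forward - 5.0 * tilt

    best_params = [random.uniform(0, 1) for _ in range(6)]  # one value per leg
    best_score = score(*execute_gait(best_params))

    for trial in range(100):
        # Perturb the current best gait; keep the change only if the sensors
        # say the robot walked farther while staying more stable.
        candidate = [p + random.gauss(0, 0.05) for p in best_params]
        candidate_score = score(*execute_gait(candidate))
        if candidate_score > best_score:
            best_params, best_score = candidate, candidate_score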
Researchers also gave another robot, which consists of a jointed arm with a pincer for grasping, the coordinates of a point in space that they wanted it to reach, and then it spent five hours getting there — making different movements each time that were increasingly informed by what it tried previously.
"Each time, basically, it tries something, it gets more data, it optimizes the model, but we're also exploring," Meier said.
Calandra said one reason for working on this kind of AI with robots, rather than using AI software on a computer, is because it forces the algorithms to use data efficiently. That is, they must figure out how to do tasks in days or hours, since they have to do so in real time, rather than in software simulations that can be sped up to imitate a longer time frame such as months or years.
"If you already know, 'Oh, I can just run more simulations, I can run 400 years of simulations' — this is an approach that, yes, would be very interesting scientifically, but it does not apply to the real world," he said.



https://www.cnn.com/2019/05/20/tech/facebook-robots-bugs-ai/index.html

