Monday, August 6, 2018

An AI-driven robot hand spent a hundred years teaching itself to rotate a cube

Dactyl is a robot hand that has taught itself to manipulate objects fluidly, almost like a human hand. To achieve this, researchers trained it in AI simulations, condensing roughly 100 years' worth of practice into a few days of real time.

Via MIT Technology Review:

The robotic system, dubbed Dactyl, was developed by researchers at OpenAI, a nonprofit based in Silicon Valley. It uses an off-the-shelf robotic hand from a UK company called Shadow, an ordinary camera, and an algorithm that’s already mastered a sprawling multiplayer video game, DotA, using the same self-teaching approach (see “A team of AI algorithms just crushed humans in a complex computer game”).

The algorithm uses a machine-learning technique known as reinforcement learning. Dactyl was given the task of maneuvering a cube so that a different face was upturned. It was left to figure out, through trial and error, which movements would produce the desired results.

Reinforcement learning is inspired by the way animals seem to learn through positive feedback. It was first proposed decades ago, but it has only proved practical in recent years thanks to advances involving artificial neural networks.
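The trial-and-error loop described above can be sketched with a toy example. This is not OpenAI's actual system (Dactyl uses deep neural networks and massive simulation); it's a minimal tabular Q-learning sketch on a made-up six-sided "cube" where the agent must learn which rotations bring a target face upturned. All names and parameters here are illustrative assumptions.

```python
import random

N_FACES = 6          # toy "cube": the state is which face is up
TARGET = 3           # the face we want upturned
ACTIONS = (-1, +1)   # rotate one step in either direction

def step(state, action):
    """Apply a rotation; reward 1 only when the target face comes up."""
    nxt = (state + action) % N_FACES
    return nxt, (1.0 if nxt == TARGET else 0.0), nxt == TARGET

def train(episodes=2000, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning: learn action values purely by trial and error."""
    rng = random.Random(seed)
    Q = [[0.0, 0.0] for _ in range(N_FACES)]  # Q[state][action index]
    for _ in range(episodes):
        s = rng.randrange(N_FACES)
        done = (s == TARGET)
        while not done:
            # epsilon-greedy: mostly exploit the best-known action, sometimes explore
            if rng.random() < eps:
                a = rng.randrange(2)
            else:
                a = max((0, 1), key=lambda i: Q[s][i])
            s2, r, done = step(s, ACTIONS[a])
            # update the value estimate toward reward plus discounted future value
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) * (not done) - Q[s][a])
            s = s2
    return Q

def greedy_steps(Q, s, limit=10):
    """Follow the learned policy; count rotations needed to reach the target."""
    n = 0
    while s != TARGET and n < limit:
        a = max((0, 1), key=lambda i: Q[s][i])
        s = (s + ACTIONS[a]) % N_FACES
        n += 1
    return n

Q = train()
```

After training, the greedy policy reaches the target face from every starting state in the minimum number of rotations, even though the agent was never told which moves were correct, only whether it succeeded.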

Learn more!
