An AI-driven robot hand spent a hundred years teaching itself to rotate a cube
AI researchers have demonstrated a self-teaching algorithm that gives a robot hand remarkable new dexterity.
The creation: The robotic system, dubbed Dactyl, was developed by researchers at OpenAI. It taught itself to manipulate a cube with uncanny skill by practicing for the equivalent of a hundred years inside a computer simulation (though only a few days in real time).
How it works: It uses an off-the-shelf robotic hand from a UK company called Shadow, an ordinary camera, and an algorithm that has already mastered the multiplayer video game Dota using the same self-teaching approach.
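The "self-teaching" here refers to reinforcement learning: the system practices in simulation, receives rewards for progress, and gradually improves its behavior. As a loose, much-simplified illustration (not OpenAI's actual method), the toy sketch below uses tabular Q-learning on a hypothetical cube with four discrete orientations, where the only actions are quarter turns left or right:

```python
import random

# Toy stand-in for "learning in simulation": a cube with 4 discrete
# orientations (0..3, quarter turns); actions rotate it either way.
# Hypothetical, much-simplified analogue of the self-teaching approach
# described above -- real systems learn continuous control from vision.

N_STATES = 4          # orientations 0..3
ACTIONS = (-1, +1)    # rotate one quarter turn left or right
TARGET = 0            # desired orientation

def step(state, action):
    """Simulated physics: apply the rotation, reward reaching the target."""
    nxt = (state + action) % N_STATES
    reward = 1.0 if nxt == TARGET else 0.0
    return nxt, reward

def train(episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning with epsilon-greedy exploration."""
    rng = random.Random(seed)
    Q = [[0.0, 0.0] for _ in range(N_STATES)]  # Q[state][action index]
    for _ in range(episodes):
        s = rng.randrange(N_STATES)
        for _ in range(10):  # short simulated rollout
            if rng.random() < eps:
                a = rng.randrange(2)                     # explore
            else:
                a = max((0, 1), key=lambda i: Q[s][i])   # exploit
            s2, r = step(s, ACTIONS[a])
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q

def policy(Q, s):
    """Greedy action from the learned Q-table."""
    return ACTIONS[max((0, 1), key=lambda i: Q[s][i])]

Q = train()
# The learned greedy policy rotates any start orientation to the target.
s = 2
for _ in range(3):
    if s == TARGET:
        break
    s, _ = step(s, policy(Q, s))
print(s)
```

The point of the analogy is only that the agent discovers the behavior through trial and error against a simulator, rather than being programmed with rotation rules; Dactyl's task (continuous finger control from camera input) is vastly harder than this four-state toy.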
Why it matters: As our own Will Knight explains, the robotic hand is still nowhere near as agile as a human one, and too clumsy to be deployed in a factory or warehouse. Even so, the research shows the potential for machine learning to unlock new robotic capabilities and suggests that some day robots might teach themselves new skills inside virtual worlds.
Source: MIT Technology Review