Tech Explained: Embodied AI
What embodied AI is: Traditional artificial intelligence (AI) takes a purely computational approach: algorithms analyse data and apply what they have ‘learned’ to new situations.
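To make that concrete, here is a minimal sketch of the data-driven approach: a toy nearest-neighbour classifier that ‘learns’ from labelled examples and applies them to a new input. All the data and names here are invented for illustration.

```python
# A toy illustration of the traditional, data-driven approach:
# learn patterns from labelled examples, then apply them to new input.

def nearest_neighbour(train, query):
    """Return the label of the training example closest to `query`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    features, label = min(train, key=lambda item: dist(item[0], query))
    return label

# Labelled examples: (feature vector, label) -- invented data
train = [((1.0, 1.0), "cat"), ((1.2, 0.9), "cat"),
         ((5.0, 5.2), "dog"), ((4.8, 5.1), "dog")]

# Apply what was 'learned' to a new, unseen situation.
print(nearest_neighbour(train, (4.9, 5.0)))  # -> dog
```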
This approach has a number of limitations, though, particularly when the AI is trying to interact with the real world. For example, computer vision powered by AI works very well in controlled conditions, such as factories, but it performs far less reliably when distances, lighting and orientation are constantly changing.
This algorithmic approach also has a hard time controlling complex movements and handling situations that require common-sense knowledge or knowledge of shared human experience. Why? Because this type of knowledge is grounded in our human embodiment – our relationship with our physical bodies. This has been described as “the idea that the mind is not only connected to the body but that the body influences the mind”. Embodied AI is an approach to computer learning that attempts to apply this relationship to artificial systems.
How embodied AI works: The notion of embodiment in AI involves working with real-world physical systems. In other words: robots.
Researchers who are developing embodied AI are moving away from a strictly algorithm-led approach. They instead attempt to first understand how biological systems work, then develop general principles of intelligent behaviour, and, finally, apply this knowledge to build artificial systems such as robots or intelligent devices.
The robots use AI to interact with the physical world and to learn from their interactions. In order to do this, they are equipped with sensors that can import data from the world around them, along with AI systems that can examine and ‘learn’ from this data.
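In rough pseudocode terms, that loop looks something like the following hypothetical sketch; `Sensor`, `Model` and `Actuator` are invented placeholders, not a real robotics API.

```python
import random

class Sensor:
    """Stand-in for a physical sensor importing data from the world."""
    def read(self):
        return [random.random() for _ in range(3)]

class Model:
    """Stand-in for the AI system that examines and learns from data."""
    def __init__(self):
        self.experience = []
    def decide(self, observation):
        return "grasp" if sum(observation) > 1.5 else "wait"
    def learn(self, observation, action, outcome):
        self.experience.append((observation, action, outcome))

class Actuator:
    """Stand-in for the robot's motors; reports whether the action worked."""
    def act(self, action):
        return random.random() > 0.5

sensor, model, actuator = Sensor(), Model(), Actuator()
for step in range(10):
    obs = sensor.read()                 # sense the physical world
    action = model.decide(obs)          # choose an action
    outcome = actuator.act(action)      # act on the world
    model.learn(obs, action, outcome)   # learn from the interaction
```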
For example, if you want to teach a robot to pick up a wrench, the algorithmic method would program it with the exact movements needed (e.g. open hand, move hand over wrench, close hand). An embodied AI approach, in contrast, might instead let the robot work by trial and error, learning from each failed attempt until it reaches its goal.
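The scripted version might look like this hypothetical sketch (the `Robot` class and its methods are invented); the trial-and-error alternative is what the next paragraph names.

```python
class Robot:
    """Hypothetical robot with hard-coded motion primitives."""
    def open_hand(self):           print("opening hand")
    def move_hand_over(self, obj): print(f"moving hand over {obj}")
    def close_hand(self):          print("closing hand")
    def lift(self):                print("lifting")

def pick_up_wrench(robot):
    # Every movement is spelled out in advance; nothing is learned.
    robot.open_hand()
    robot.move_hand_over("wrench")
    robot.close_hand()
    robot.lift()

pick_up_wrench(Robot())
```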
That trial-and-error approach, called reinforcement learning, was used by a lab at UC Berkeley to teach a robot to fold towels and to play games. In effect, the robot teaches itself.
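Here is a minimal sketch of one simple form of reinforcement learning, tabular Q-learning, in an invented five-cell world; this is not the Berkeley lab’s actual code. The agent starts at cell 0 and is rewarded only when it reaches the goal at cell 4, so it must discover the right moves by trial and error.

```python
import random

states, actions = range(5), (-1, +1)    # cells 0..4; move left or right
q = {(s, a): 0.0 for s in states for a in actions}
alpha, gamma, epsilon = 0.5, 0.9, 0.2   # learning rate, discount, exploration

for episode in range(200):
    s = 0
    while s != 4:
        # Explore occasionally; otherwise exploit the best-known action.
        if random.random() < epsilon:
            a = random.choice(actions)
        else:
            a = max(actions, key=lambda act: q[(s, act)])
        s_next = min(max(s + a, 0), 4)
        reward = 1.0 if s_next == 4 else 0.0
        # Learn from the attempt: nudge the value of (state, action)
        # towards the reward plus the best value available afterwards.
        best_next = max(q[(s_next, b)] for b in actions)
        q[(s, a)] += alpha * (reward + gamma * best_next - q[(s, a)])
        s = s_next

# After training, the greedy policy heads straight for the goal.
print([max(actions, key=lambda act: q[(s, act)]) for s in states])
```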
Benefits of embodied AI: This may all sound a bit like the beginning of the film The Terminator, but embodied AI is already being used in many different applications.
The latest versions of the Roomba robot vacuum cleaner use sensors and AI to learn and remember where objects are in a room. In fact, a number of companies are already working on ways to give embodied AI capabilities to a range of Internet of Things (IoT) devices.
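One way a robot vacuum might remember obstacles is an occupancy grid updated from bump-sensor events. The sketch below is purely illustrative; the grid size and sensor events are invented, and this is not iRobot’s actual mapping algorithm.

```python
ROWS, COLS = 4, 6
occupancy = [[0] * COLS for _ in range(ROWS)]  # 0 = free, 1 = obstacle

def record_bump(row, col):
    """Mark a cell as occupied after the bump sensor fires there."""
    occupancy[row][col] = 1

def is_safe(row, col):
    """Consult the learned map before planning the next move."""
    return occupancy[row][col] == 0

record_bump(1, 3)     # the robot hit a chair leg at cell (1, 3)
print(is_safe(1, 3))  # -> False: route around it next time
```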
John Deere has developed an autonomous robot that uses AI and computer vision to distinguish crop plants from weeds, so that it sprays weed killer only on the actual weeds. This dramatically reduces the amount of chemicals used.
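The decision loop might look something like this hypothetical sketch; `classify` stands in for a trained vision model, and the ‘greenness’ feature and threshold are invented (John Deere’s actual system is not public in this form).

```python
def classify(image_patch):
    """Placeholder for a trained crop-vs-weed vision model."""
    # A real system would run a neural network on camera images;
    # here we fake it with a rule on an invented 'greenness' feature.
    return "weed" if image_patch["greenness"] < 0.4 else "crop"

def spray(nozzle_id):
    print(f"spraying herbicide from nozzle {nozzle_id}")

patches = [{"greenness": 0.9}, {"greenness": 0.2}, {"greenness": 0.8}]
for nozzle_id, patch in enumerate(patches):
    if classify(patch) == "weed":   # spray only the actual weeds
        spray(nozzle_id)
```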
Startup Neuromation is also developing agricultural drones that can scan fields and learn which areas need fertiliser or water. Autonomous robots shaped like bees are also being developed to pollinate plants.
Healthcare start-up Embodied is one of many companies working to create AI-powered companion robots. In the future, this type of robot could be used to keep the elderly company, to monitor patients or even to administer medicines.
Embodied AI could also be combined with existing IoT devices. For example, instead of simply monitoring health, as today’s Fitbit-like devices do, an embodied AI device could make life-saving decisions on the spot and then administer the correct medication.
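As a purely illustrative sketch of the idea, and nothing like a real medical device, imagine a wearable that both reads a vital sign and acts on it. The threshold, drug and device functions below are all invented.

```python
GLUCOSE_LOW = 70  # mg/dL; an illustrative threshold, not medical advice

def read_glucose():
    """Stand-in for a continuous glucose sensor reading."""
    return 65

def deliver_dose(drug, units):
    """Stand-in for a hypothetical on-body delivery mechanism."""
    print(f"delivering {units} unit(s) of {drug}")

reading = read_glucose()          # monitor, as today's devices already do
if reading < GLUCOSE_LOW:         # decide on the spot...
    deliver_dose("glucagon", 1)   # ...and act: the embodied step
```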
What’s next for embodied AI: The area of greatest potential interest in embodied AI may be autonomous vehicles, in particular onboard systems that allow a vehicle to learn about its environment as it drives. The goal is for these vehicles to learn how to drive much as humans do, by sensing their environment and making decisions based on what they experience.
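A toy sketch of what ‘learning as you drive’ could mean: the vehicle refines its own estimate of braking distance from every stop it actually experiences. The linear model, learning rate and numbers are invented for illustration.

```python
class BrakingModel:
    """Online fit of stopping distance ~ k * speed**2 (invented model)."""
    def __init__(self, k=0.01, lr=1e-6):
        self.k, self.lr = k, lr

    def predict(self, speed):
        return self.k * speed ** 2

    def update(self, speed, observed_distance):
        # Nudge k to shrink the error on this real-world experience.
        error = observed_distance - self.predict(speed)
        self.k += self.lr * error * speed ** 2

model = BrakingModel()
for speed, distance in [(10, 1.2), (20, 4.6), (30, 10.1)]:  # experienced stops
    model.update(speed, distance)
print(round(model.predict(25), 3))  # refined estimate at a new speed
```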
While this type of highly functional embodied AI is still a long way off, many researchers are working in the area. At IBM, researchers have developed a do-it-yourself cardboard robot, powered by a Raspberry Pi and IBM’s Watson and trained through machine learning, that includes a servo-powered arm, a camera and a microphone. The robot is being made available to anyone who wants to experiment with embodied AI.
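Driving such a robot’s servo arm from a Raspberry Pi can be done in a few lines. The sketch below uses the gpiozero library and assumes a servo wired to GPIO pin 17; it is a hypothetical Python equivalent, not IBM’s own robot code, and it only runs on a Pi.

```python
from time import sleep
from gpiozero import Servo  # standard Raspberry Pi GPIO library

arm = Servo(17)        # assumes the servo's signal wire is on GPIO 17

for _ in range(3):     # wave the arm three times
    arm.min()          # swing fully one way
    sleep(0.5)
    arm.max()          # and fully back the other
    sleep(0.5)
arm.mid()              # return to the rest position
```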
As embodied AI moves into the mainstream, new questions — such as how to teach robots to understand context and whether humans and robots will understand concepts and categories in the same way — will likely come to the fore.