US researchers are building a next-level Alexa
Scientists at MIT have developed a system, built into a humanoid robot, that responds more naturally to commands.
Researchers in the US are developing a new “Alexa-like” robot that understands plain-language commands in much the same way as Amazon’s voice assistant.
The ComText system, created at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), can carry out nuanced commands that require prior “contextual knowledge” about objects.
The machine can work out what various objects are and understand simple commands such as “pick it up” – an ability that comes naturally to humans but has so far been beyond most robots.
Humans understand the world as a collection of objects and abstract concepts, but machines perceive it only as raw sensor data. “This semantic gap means that, for robots to understand what we want them to do, they need a much richer representation of what we do and say,” the CSAIL team explained.
The demonstration robot was built by pairing ComText with Baxter, a “two-armed, humanoid robot”.
The researchers designed the system with “two kinds of memory”: semantic memory and episodic memory.
Previous robots have relied mainly on semantic memory, which stores general facts such as the colour of the sky, whereas episodic memory records specific experiences, such as where a particular object was last seen.
The team hopes that by remembering an object’s size, shape, position and who it belongs to, the robot will be able to respond to commands that depend on this stored context, as sketched below.
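To make the idea concrete, here is a minimal sketch in Python of how a system might combine the two memory types to ground a command like “pick up my snack”. This is a hypothetical illustration, not MIT’s published ComText code; every class, method and object name here is invented for the example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Observation:
    """One episodic-memory entry: a specific, time-stamped event."""
    timestamp: float
    obj: str        # e.g. "box_3"
    attribute: str  # e.g. "label", "position", "owner"
    value: str      # e.g. "my snack", "on the table"

class ContextualMemory:
    """Toy model of the two memory types described in the article."""

    def __init__(self) -> None:
        # Semantic memory: general, timeless facts about object types.
        self.semantic = {
            "box": {"category": "container"},
            "apple": {"category": "food"},
        }
        # Episodic memory: an append-only log of specific observations.
        self.episodic: list[Observation] = []

    def observe(self, timestamp: float, obj: str,
                attribute: str, value: str) -> None:
        self.episodic.append(Observation(timestamp, obj, attribute, value))

    def resolve(self, attribute: str, value: str) -> Optional[str]:
        # Scan the log newest-first, so "my snack" grounds to whichever
        # object was most recently described that way.
        for event in reversed(self.episodic):
            if event.attribute == attribute and event.value == value:
                return event.obj
        return None

memory = ContextualMemory()
memory.observe(1.0, "box_3", "position", "on the table")
memory.observe(2.0, "box_3", "label", "my snack")  # "The box I am putting down is my snack."
print(memory.resolve("label", "my snack"))          # -> box_3
```

In this toy version, the episodic log is what lets a later command refer back to an earlier statement, which is the kind of contextual grounding the researchers describe.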
Researchers found that Baxter successfully carried out commands 90% of the time.
The team now want to train future robots to understand multi-step commands and “interact with objects more naturally”.
Such innovations could eventually find uses in self-driving cars and other robotic systems.