Researchers taught robots to understand spoken language

Researchers at the Massachusetts Institute of Technology (MIT) have taught robots to understand spoken commands. This brings them closer to being full-fledged assistants that people can talk to as naturally as they talk to one another.

The researchers explained that in the coming years robots will help people both at home and at work. Voice interaction will remain cumbersome, however, if the devices understand only rigid, explicitly structured instructions. Robots therefore need to accept commands in natural spoken language, so that people can talk to them the way they talk to other people.

To do this, they developed a planner trained on thousands of sample utterances, which learned the structure of spoken-language commands. The resulting system combines a deep neural network with a planner trained on this corpus.
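A rough sketch of such a pipeline might look like the following. This is an illustration of the general idea, not the MIT implementation: the class names, the toy command grammar, and the hand-written parsing rule are all assumptions standing in for a trained network.

```python
# Illustrative sketch only: a spoken command is parsed by a neural model
# into a structured intent, which a planner then expands into a sequence
# of robot actions. All names here are hypothetical.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Intent:
    action: str  # e.g. "fetch"
    target: str  # e.g. "cup"


class NeuralParser:
    """Stand-in for a deep network trained on sample utterances."""

    def parse(self, utterance: str) -> Optional[Intent]:
        # A real system would run a trained model; this toy version only
        # recognizes commands of the form "bring me the <object>".
        words = utterance.lower().split()
        if words[:3] == ["bring", "me", "the"] and len(words) == 4:
            return Intent(action="fetch", target=words[3])
        return None  # the command was not understood


class Planner:
    """Expands a parsed intent into low-level action steps."""

    def plan(self, intent: Intent) -> list[str]:
        if intent.action == "fetch":
            return [f"locate {intent.target}",
                    f"grasp {intent.target}",
                    "return to user",
                    f"hand over {intent.target}"]
        raise ValueError(f"no plan for action {intent.action!r}")


parser, planner = NeuralParser(), Planner()
intent = parser.parse("Bring me the cup")
if intent is not None:
    print(planner.plan(intent))
```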

“The main advantage of our planner is that it does not require big data,” the MIT researchers said. “Training the robot in the future should look like dog training.”

The planner can also collect data about commands and interactions the robot has not previously encountered. If the system gets confused, the planner records the case and flags the problem for an engineer; once it has been resolved, the robot will be able to interpret similar commands, as in the sketch below.
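Continuing the hypothetical sketch above, that fallback could be as simple as queuing any unparsed utterance for human review; the logging format and the review queue here are assumptions, not details from the MIT system.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("command_review")

# Utterances the parser could not interpret are queued so an engineer can
# label them later; once labeled, they extend the training samples.
review_queue: list[str] = []


def handle(utterance: str) -> None:
    intent = parser.parse(utterance)  # NeuralParser from the sketch above
    if intent is None:
        review_queue.append(utterance)
        log.info("unrecognized command queued for review: %r", utterance)
    else:
        for step in planner.plan(intent):  # Planner from the sketch above
            log.info("executing step: %s", step)


handle("Bring me the cup")       # understood: the steps are executed
handle("Do a backflip, please")  # not understood: queued for the engineer
```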

When a robot fails to complete a task, many existing machine learning models cannot explain what went wrong. With this system, researchers can see which problems prevented the action from succeeding in the past and adjust the architecture accordingly.
