How to Make Your Robots Obey Your Orders

William Elcock

Are you dreaming of a future where robots are used for a variety of activities so that we don’t have to do them ourselves?


Come on, think about it! Cleaning, cooking, and doing all of our other chores are just a few of the wondrous possibilities. Unfortunately, for the moment you will have to keep dreaming.

While some amazing robots already exist, they are not yet adaptable enough to carry out such a wide range of activities effectively. Moreover, although speech recognition technology has advanced by leaps and bounds, it is still not good enough for use with robots.

Your best bet for getting a hypothetical robot butler to follow your instructions would be to type out the instruction set.

Spoken Commands

The problem with spoken commands is that they vary widely in complexity, even when that isn't obvious from their wording.

Imagine telling your robot, “Pick up that box over there.” This seems simple enough but there is an issue. Your robot will have to break this down into a number of steps before completing the action. A possible scenario for carrying out this command is:

  • Turn on tracking system
  • Turn on walking motors
  • Change direction
  • Take the necessary steps
  • Rotate limbs
  • Clench the box
  • Lift the box

As you can see, the command is more complex than it first appeared. Now compare it to something like, “Turn on your tracking system.” Although the two commands use a similar number of words, their levels of complexity are worlds apart.
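To make the gap concrete, here is a minimal sketch of what that decomposition might look like in code. The function and step names are purely illustrative assumptions, not part of the Brown system:

```python
def decompose(command):
    """Map a spoken command to an ordered list of low-level actions.

    A toy lookup table stands in for the hard part: a real robot would
    have to infer this plan from the words alone.
    """
    plans = {
        # One short sentence hides seven separate steps.
        "pick up that box": [
            "turn on tracking system",
            "turn on walking motors",
            "change direction",
            "take necessary steps",
            "rotate limbs",
            "clench box",
            "lift box",
        ],
        # A similar-sounding command maps to a single step.
        "turn on your tracking system": ["turn on tracking system"],
    }
    # Unknown commands yield an empty plan.
    return plans.get(command.lower(), [])

print(len(decompose("Pick up that box")))              # 7 steps
print(len(decompose("Turn on your tracking system")))  # 1 step
```

Two commands of roughly equal length, yet one expands to seven actions and the other to just one; that mismatch is exactly what trips robots up.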

How can we solve this? As things stand, robots have trouble figuring out the differing complexity of spoken commands. Fear not: a team at Brown University has developed a system that improves the way robots handle them.

A System for Enabling Robots To Carry Out Spoken Commands Effectively

The team at Brown University tackled the problem of getting robots to carry out spoken commands with an ingenious approach. They used Amazon’s Mechanical Turk together with a tool called Virtual Cleanup World to develop their model.

Mechanical Turk is a marketplace for work that requires human intelligence. Although artificial intelligence is making impressive strides, there are many tasks which humans still do more effectively, such as identifying objects in a video.

Virtual Cleanup World is a virtual task domain. It consists of color-coded rooms, a virtual robot, and an object for the robot to carry out tasks with.
Virtual Cleanup World | Brown University

Volunteers on Mechanical Turk figured out which instruction sets led to particular actions in Virtual Cleanup World. First, they observed the robot as it carried out a variety of tasks.

They were then asked which instruction sets they thought would best produce those actions. The volunteers were asked to create high-level, mid-level, and low-level commands.

High-level commands were those such as instructing the robot to carry a chair to a room of a particular color. Low-level commands broke the same task down into several individual steps. Mid-level commands combined features of high- and low-level commands.
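The three levels above can be sketched as data. The phrasings below are invented examples for illustration, not commands taken from the study:

```python
# Illustrative sketch: the same task expressed at three levels of
# abstraction. These exact phrasings are assumptions, not study data.
ABSTRACTION_LEVELS = {
    # One sentence that implies many hidden subtasks.
    "high": [
        "Take the chair to the blue room.",
    ],
    # The task split into a few coarse steps.
    "mid": [
        "Go to the chair.",
        "Pick up the chair.",
        "Carry the chair to the blue room.",
    ],
    # Every movement spelled out individually.
    "low": [
        "Turn right.",
        "Walk three steps forward.",
        "Lower your gripper.",
        "Grip the chair.",
        "Lift the chair.",
        "Turn around.",
        "Walk five steps forward.",
        "Put the chair down.",
    ],
}

for level, commands in ABSTRACTION_LEVELS.items():
    print(level, len(commands))
```

Notice that the deeper you go, the more commands it takes to describe the same outcome; labeling examples at all three levels is what let the system learn to tell them apart.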

The researchers at Brown used the data they obtained to train their system to understand varying levels of complexity. The system was then able to gather what action needed to be carried out and understand the levels of complexity associated with different sentence structures.

Putting The System To The Test

After training their system, it was time to test the fruits of their labor. The researchers made use of Cleanup World once again, as well as a real robot operating in a physical space set up similarly to the virtual one. Based on its training, the system was able to devise an appropriate plan from the spoken commands it was given.

People give instructions at varying levels of abstraction — from the simple and straightforward (“Go north a bit.”) to more complex commands that imply a myriad of subtasks (“Take the block to the blue room.”). A new software system helps robots better deal with instructions whatever their level of abstraction. | Brown University

When the robots were able to figure out the desired end result and understand the complexity of a task, they completed it within one second 90 percent of the time.

However, when there was a breakdown in understanding the level of complexity, task completion took longer. In these cases, the robots required 20 or more seconds of planning to complete a task.

The researchers will need to find ways of minimizing these breakdowns to create a more efficient system.

Final Thoughts

Robots still have quite a way to go before they are mainstream. However, this work brings us closer to having robots which can easily understand the commands we dish out to them. Until then, go wash your own dishes.


Written By

William Elcock

William has been helping friends troubleshoot tech problems for several years and thus made the natural progression into tech blogging. In addition to consumer electronics William also has a vested interest in various renewable energy topics.