This robot can tidy a room without any help


Robots are good at certain tasks. They’re great at picking up and moving objects, for instance, and they’re even getting better at cooking.

But while robots can easily complete tasks like these in a laboratory, getting them to work in an unfamiliar environment where there’s little data available is a real challenge.

Now, a new system called OK-Robot could train robots to pick up and move objects in settings they haven’t encountered before. It’s an approach that might be able to plug the gap between rapidly improving AI models and actual robot capabilities, because it doesn’t require any additional costly, complex training.

To develop the system, researchers from New York University and Meta tested Stretch, a commercially available robot made by Hello Robot that consists of a wheeled unit, a tall pole, and a retractable arm, in a total of 10 rooms across five homes. 

While in a room with the robot, a researcher would scan their surroundings using Record3D, an iPhone app that uses the phone’s lidar system to take a 3D video to share with the robot. 

The OK-Robot system then ran an open-source AI object detection model over the video’s frames. This, together with other open-source models, helped the robot identify objects in that room, like a toy dragon, a tube of toothpaste, and a pack of playing cards, as well as locations around the room, including a chair, a table, and a trash can.
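To make that step concrete, here is a minimal sketch in Python of open-vocabulary object detection on a single scan frame. The choice of model (OWL-ViT), the checkpoint name, and the file name are illustrative assumptions; the article does not say which detector OK-Robot actually uses.

    # Minimal sketch: open-vocabulary detection on one frame of the room scan.
    # OWL-ViT, the checkpoint, and the file name are assumptions for
    # illustration; the article does not name the exact detector used.
    import torch
    from PIL import Image
    from transformers import OwlViTProcessor, OwlViTForObjectDetection

    processor = OwlViTProcessor.from_pretrained("google/owlvit-base-patch32")
    model = OwlViTForObjectDetection.from_pretrained("google/owlvit-base-patch32")

    frame = Image.open("scan_frame_0001.png")  # one frame from the 3D scan
    queries = [["a toy dragon", "a tube of toothpaste", "a trash can"]]

    inputs = processor(text=queries, images=frame, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Turn raw model outputs into labeled boxes in pixel coordinates.
    target_sizes = torch.tensor([frame.size[::-1]])  # (height, width)
    detections = processor.post_process_object_detection(
        outputs, threshold=0.2, target_sizes=target_sizes
    )[0]

    for score, label, box in zip(
        detections["scores"], detections["labels"], detections["boxes"]
    ):
        print(f"{queries[0][label]}: {score:.2f} at {box.tolist()}")

Running a loop like this over every frame of the scan would produce the kind of object and location inventory the article describes.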

The team then instructed the robot to pick up a specific item and move it to a new location. The robot’s pincer arm did this successfully in 58.5% of cases; the success rate rose to 82% in rooms that were less cluttered. (Their research has not yet been peer reviewed.)

The recent AI boom has led to enormous leaps in language and computer vision capabilities, giving robotics researchers access to open-source AI models and tools that didn’t exist even three years ago, says Matthias Minderer, a senior computer vision research scientist at Google DeepMind, who was not involved in the project.

“I’d say it’s quite unusual to be completely reliant on off-the-shelf models, and that it’s quite impressive to make them work,” he says.

“We’ve seen a revolution in machine learning that has made it possible to create models that work not only in laboratories, but in the open world,” he adds. “Seeing that this actually works in a real physical environment is very valuable information.”

Because the researchers’ system used models that weren’t fine-tuned for this particular project, when the robot couldn’t find the object it was instructed to look for, it simply stopped in its tracks instead of trying to work out a solution. That significant limitation is one reason the robot was more likely to succeed in tidier environments: fewer objects meant fewer chances for confusion, and a clearer space for navigation.

Using ready-made open-source models was both a blessing and a curse, says Lerrel Pinto, an assistant professor of computer science at New York University, who co-led the project. 

“On the positive side, you don’t have to give the robot any additional training data in the environment; it just works,” he says. “On the negative side, it can only pick an object up and drop it somewhere else. You can’t ask it to open a drawer, because it only knows how to do those two things.” 

Combining OK-Robot with voice recognition models could allow researchers to give instructions just by talking to the robot, making it easier for them to experiment with available datasets, says Mahi Shafiullah, a PhD student at New York University who co-led the research.

“There is a very pervasive feeling in the [robotics] community that homes are difficult, robots are difficult, and combining homes and robots is just completely impossible,” he says. “I think once people start believing home robots are possible, a lot more work will start happening in this space.”
