A robot that finds lost objects | MIT News

A busy commuter is ready to walk out the door, only to realize they have misplaced their keys and must search through piles of stuff to find them. Rapidly sifting through the clutter, they wish they could figure out which pile was hiding the keys.

Researchers at MIT have created a robotic system that can do just that. The system, RFusion, is a robotic arm with a camera and a radio frequency (RF) antenna attached to its gripper. It fuses signals from the antenna with visual input from the camera to locate and retrieve an object, even if the object is buried under a pile and completely out of sight.

The RFusion prototype developed by the researchers is based on RFID tags, which are inexpensive, battery-free tags that can be stuck to an object and reflect signals sent by an antenna. Since RF signals can pass through most surfaces (like the mound of dirty laundry that can obscure keys), RFusion is able to locate a tagged item in a stack.

Using machine learning, the robotic arm automatically zeroes in on the object's exact location, moves the items on top of it, grasps the object, and verifies that it has picked up the right thing. The camera, antenna, robotic arm, and AI are fully integrated, so RFusion can work in any environment without requiring any special setup.

While finding lost keys is useful, RFusion could have many broader applications in the future, such as sorting through piles to fulfill orders in a warehouse, identifying and installing components in an automotive manufacturing plant, or helping an elderly person perform daily tasks at home, although the current prototype is not yet fast enough for these uses.

“This idea of being able to find objects in a chaotic world is an open problem that we have been working on for a few years. Having robots that are able to search for things under a pile is a growing need in industry today. Right now, you can think of this as a Roomba on steroids, but in the near term, it could have many applications in manufacturing and warehouse environments,” said senior author Fadel Adib, associate professor in the Department of Electrical Engineering and Computer Science and director of the Signal Kinetics group at the MIT Media Lab.

Co-authors include research assistant Tara Boroushaki, the lead author; Isaac Perper, a graduate student in electrical engineering and computer science; research associate Mergen Nachin; and Alberto Rodriguez, the Class of 1957 Associate Professor in the Department of Mechanical Engineering. The research will be presented at the Association for Computing Machinery Conference on Embedded Networked Sensor Systems next month.

Sending signals

RFusion begins searching for an object using its antenna, which bounces signals off the RFID tag (like sunlight reflecting off a mirror) to identify a spherical area in which the tag is located. It combines that sphere with input from the camera, which narrows down the object's location; for instance, the item can't be located on an area of a table that is empty.
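The intersection of the RF-derived sphere with the camera's view can be sketched in a few lines. This is a hypothetical illustration, not the researchers' implementation: the function name, tolerance, and grid of camera-observed "occupied" cells are all assumptions for the sake of the example.

```python
import math

# Illustrative sketch: intersect an RF-derived distance sphere with
# camera evidence to narrow down where a tagged item could be.
# Names and numbers are assumptions, not taken from the paper.

def candidate_locations(antenna_pos, rf_distance, tolerance, occupied_cells):
    """Keep only camera-observed occupied cells whose distance to the
    antenna matches the RF range estimate (a spherical shell)."""
    keep = []
    for cell in occupied_cells:
        d = math.dist(cell, antenna_pos)
        if abs(d - rf_distance) <= tolerance:
            keep.append(cell)
    return keep

# The camera reports these table cells as containing clutter;
# empty cells are excluded up front, as the article describes.
occupied = [(0.5, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]

# RF measurement: the tag is about 1.0 m from the antenna at the origin.
print(candidate_locations((0.0, 0.0, 0.0), 1.0, 0.1, occupied))
# → [(1.0, 0.0, 0.0)]
```

Only cells that are both visibly occupied and on the RF sphere survive, which is how the two sensors jointly shrink the search region.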

But once the robot has a general idea of where the item is, it would need to swing its arm widely around the room, taking extra measurements to come up with the exact location, which is slow and inefficient.

The researchers used reinforcement learning to train a neural network that can optimize the robot’s path to the object. In reinforcement learning, the algorithm is trained through trial and error with a reward system.

“This is also how our brain learns. We get rewarded from our teachers, from our parents, from a computer game, etc. The same thing happens in reinforcement learning. We let the agent make mistakes or do something right, and then we punish or reward the network. This is how the network learns something that is really hard for it to model,” Boroushaki explains.

In the case of RFusion, the optimization algorithm was rewarded when it limited the number of moves it had to make to localize the item and the distance it had to travel to pick it up.
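A reward in this spirit can be written as a small function. The weights, success bonus, and functional form below are purely illustrative assumptions; the article only says the agent was rewarded for fewer moves and shorter travel distance.

```python
# Hypothetical reward shaping in the spirit the article describes:
# penalize extra measurement moves and arm travel, reward success.
# All weights here are assumptions for illustration.

def reward(num_moves, travel_distance, found_item,
           w_moves=0.5, w_dist=0.2, success_bonus=10.0):
    """Penalize each measurement move and each meter traveled;
    grant a bonus once the item is successfully localized."""
    r = -w_moves * num_moves - w_dist * travel_distance
    if found_item:
        r += success_bonus
    return r

# An efficient search scores higher than a wandering one:
print(reward(3, 1.5, True))   # few stops, short path
print(reward(10, 4.0, True))  # many stops, long path
```

Under this shaping, trajectories that localize the tag in fewer stops earn strictly more reward, which is the signal the reinforcement learner optimizes against.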

Once the system zeroes in on the exact right spot, the neural network uses combined RF and visual information to predict how the robotic arm should grip the object, including the angle of the hand and the width of the gripper, and whether it needs to remove other items first. It also scans the item's tag one last time to make sure it retrieved the correct item.

Cutting through clutter

The researchers tested RFusion in several different environments. They buried a keychain in a box full of clutter and hid a remote control under a pile of items on a couch.

But if they had fed all of the camera data and RF measurements to the reinforcement-learning algorithm, it would have overwhelmed the system. So, drawing on the method a GPS uses to consolidate data from satellites, they summarized the RF measurements and limited the visual data to the area right in front of the robot.
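One simple way to "summarize" a burst of noisy RF range readings is to reject outliers around the median and average the rest. This is a sketch under assumptions; the article does not specify the researchers' summarization method, and the function and threshold below are hypothetical.

```python
import statistics

# Illustrative sketch of condensing noisy RF range measurements into
# one estimate before handing it to the learning algorithm, loosely
# analogous to how GPS consolidates satellite data. The outlier rule
# and threshold are assumptions, not from the paper.

def summarize_ranges(measurements, max_dev=0.3):
    """Drop readings far from the median (e.g., multipath outliers),
    then average the remaining inliers into a single distance."""
    med = statistics.median(measurements)
    inliers = [m for m in measurements if abs(m - med) <= max_dev]
    return sum(inliers) / len(inliers)

# One multipath outlier (3.10 m) among consistent ~1 m readings:
print(summarize_ranges([0.98, 1.02, 1.00, 3.10, 0.99]))
```

The learner then sees one clean distance per antenna pose instead of every raw reading, which keeps the input small and robust.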

Their approach worked well: RFusion had a 96 percent success rate when retrieving objects that were fully hidden under a pile.

“Sometimes, if you only rely on RF measurements, there is going to be an outlier, and if you rely only on vision, there is sometimes going to be a mistake from the camera. But if you combine them, they are going to correct each other. That is what made the system so robust,” Boroushaki says.

In the future, the researchers hope to increase the speed of the system so it can move smoothly, rather than stopping periodically to take measurements. This would enable RFusion to be deployed in a fast-paced manufacturing or warehouse environment.

Beyond its potential industrial uses, a system like this could even be integrated into future smart homes to help people with a number of household chores, Boroushaki explains.

“Every year, billions of RFID tags are used to identify objects in today's complex supply chains, including clothing and lots of other consumer goods. The RFusion approach points the way to autonomous robots that can dig through a pile of mixed items and sort them out using the data stored in the RFID tags, much more efficiently than having to inspect each item individually, especially when the items look similar to a computer vision system,” says Matthew S. Reynolds, CoMotion Presidential Innovation Fellow and associate professor of electrical and computer engineering at the University of Washington, who was not involved in the research. “The RFusion approach is a great step forward for robotics operating in complex supply chains, where identifying and ‘picking’ the right item quickly and accurately is the key to getting orders fulfilled on time and keeping demanding customers happy.”

The research is sponsored by the National Science Foundation, a Sloan Fellowship, NTT DATA, Toppan, Toppan Forms, and the Abdul Latif Jameel Water and Food Systems Lab.

