RoboBees Could Pollinate the Future

It’s official: the future has arrived, and it looks like robotic bees shooting laser beams out of their eyes.

In response to the pernicious Colony Collapse Disorder (CCD) that has decimated commercial honey bee populations in the US, researchers have developed micro-robotic “insects” that could artificially pollinate future crops.

The research is a coordinated effort between investigators at Harvard's School of Engineering and Applied Sciences (SEAS), faculty from Harvard's Department of Organismic and Evolutionary Biology, and faculty from Northeastern University's Department of Biology.

The “bodies” of these flying, insect-sized robots (rather cheekily dubbed RoboBees) possess compact high-energy power sources. The “heads” consist of a collection of artificial smart sensors that control and monitor flight, sense objects, and coordinate the decision-making process.

The developers hope that one day, individual RoboBees will be able to mimic the behavior of an entire colony, communicating with one another and imitating the capacity of real bees to perform specialized tasks such as scouting and foraging. They are quick to emphasize, however, that these micro-insects will in no way serve as a substitute for natural bee populations.

“If robots were used for pollination (and we are at least 20 years away from that possibility), it would only be as a stop-gap measure while a solution to CCD is implemented to restore natural pollinators,” the researchers wrote on their site.

In addition to its ability to complete sophisticated aerial maneuvers, the RoboBee can transition seamlessly into an aquatic environment by adjusting the angle and speed of its wing movements, slowing from 120 flaps per second to only nine. The engineers at SEAS say they sought inspiration from puffins, the chubby, buoyant-beaked birds whose aptitude for both flying and swimming provides insight into wing propulsion mechanisms.

The RoboBee may be small, but it is the first micro-robot capable of executing both aerial and aquatic maneuvers, providing an important jumping-off point for the development of larger dual-environment robots.

“What is really exciting about this research is that our analysis of flapping-wing locomotion is not limited to insect-scaled vehicles,” said SEAS graduate student Kevin Chen. “From millimeter-scaled insects to meter-scaled fishes and birds, flapping locomotion spans a range of sizes. This strategy has the potential to be adapted to larger aerial-aquatic robots.”

One crucial mechanical setback remains: RoboBees lack depth perception, so they will have a tough time avoiding obstacles or landing on the flowers they were designed to pollinate.

Enter the future.

Computer vision and sensor experts Sanjeev Koppal and Huikai Xie at the University of Florida will cooperate with computer scientist Karthik Dantu of the University at Buffalo to develop radar-like light technology, called lidar, on a micro scale.

Whereas radar uses radio waves to calculate the size, shape, and distance of other objects based on the echoes they reflect back, lidar employs pulses of light.

In other words, RoboBees will read their environment by shooting tiny lasers out of their tiny robo-eyes.

“Lidar is basically exploiting the ‘echo’ of a light pulse,” Koppal explained. “You can imagine that the echo of a light pulse that leaves a sensor, bounces off an object, and returns is really fast. Detecting this quickly, but without complex circuitry and inside a small form factor, is one of the main challenges.”
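The arithmetic behind lidar is simple; the speed Koppal describes is the hard part. Here is a minimal sketch in Python of the time-of-flight calculation a lidar sensor performs (the echo time below is an illustrative value, not a RoboBee measurement):

```python
# Minimal time-of-flight sketch: how lidar turns a light echo into a distance.
# The echo time below is an illustrative value, not a RoboBee measurement.

SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_echo(round_trip_seconds: float) -> float:
    """Distance to the reflecting object, given the pulse's round-trip time."""
    # The pulse travels out and back, so halve the round trip.
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# An echo returning after ~6.7 nanoseconds implies an object about 1 meter away.
print(distance_from_echo(6.7e-9))  # ~1.0 (meters)
```

As the example suggests, an object just one meter away returns its echo in under seven nanoseconds, which is why detecting it without complex circuitry, inside a penny-sized package, is such a challenge.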

Other technologies that benefit from advanced sensor systems, such as driverless vehicles and the Microsoft Kinect, already implement lidar. Scaling the device down to the size of a penny, however, may take up to three years. According to Koppal, once the device is complete, it will weigh only two-thousandths of an ounce (about 57 milligrams).

The researchers hope the micro lidar system developed for RoboBee navigation will prove equally useful for improving the accuracy of user interfaces.

“With micro lidar, you can imagine doing natural user interfaces for wearable technologies like smart clothing and smartwatches,” Koppal said.
