Self-flying drone dips, darts and dives through trees at 30 mph

Algorithm allows for real-time object detection without lidar or Kinect sensors.

A researcher from MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) has developed an obstacle-detection system that allows a drone to autonomously dip, dart and dive through a tree-filled field at upwards of 30 miles per hour.

“Everyone is building drones these days, but nobody knows how to get them to stop running into things,” says CSAIL PhD student Andrew Barry, who developed the system as part of his thesis with MIT professor Russ Tedrake. “Sensors like lidar are too heavy to put on small aircraft, and creating maps of the environment in advance isn’t practical. If we want drones that can fly quickly and navigate in the real world, we need better, faster algorithms.”

Running 20 times faster than existing software, Barry’s stereo-vision algorithm allows the drone to detect objects and build a full map of its surroundings in real time. Operating at 120 frames per second, the software – which is open-source and available online – extracts depth information in just 8.3 milliseconds per frame.
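To see where that depth information comes from, it helps to look at the classic stereo-vision idea the system builds on: a feature seen by both wing cameras appears shifted horizontally between the two images, and the size of that shift (the disparity) is inversely proportional to the object's distance. The sketch below is a deliberately naive block-matching stereo matcher, not Barry's open-source algorithm – his runs orders of magnitude faster – and all names and parameters in it are illustrative only.

```python
import numpy as np

def block_matching_disparity(left, right, block=5, max_disp=16):
    """Estimate per-pixel disparity by sliding a small block from the
    left image along the same row of the right image and keeping the
    horizontal shift with the lowest sum of absolute differences (SAD).
    Naive O(h * w * max_disp * block^2) illustration, not optimized."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(np.int32)
            best_cost, best_d = np.inf, 0
            # Only shifts that keep the candidate block inside the image.
            for d in range(min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1].astype(np.int32)
                cost = np.abs(patch - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

# Synthetic check: the right image is the left image shifted by 3 pixels,
# so interior disparities should recover that simulated 3-pixel shift.
rng = np.random.default_rng(0)
left = rng.integers(0, 256, size=(20, 40), dtype=np.uint8)
right = np.roll(left, -3, axis=1)
disp = block_matching_disparity(left, right)
```

Searching every disparity at every pixel like this is exactly the cost that makes conventional stereo too slow for a fast-moving aircraft, which is why the 8.3-millisecond frame time quoted above required a fundamentally cheaper search.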

The drone, which weighs just over a pound and has a 34-inch wingspan, was made from off-the-shelf components costing about $1,700, including a camera on each wing and two processors no fancier than the ones you’d find on a cellphone.