Kinect-carrying drone automatically builds 3D maps of rooms

Google Street View, eat your heart out: an MIT-built quadrocopter uses a Microsoft Kinect and some smart odometry algorithms to fly around a room and spit out a 3D map of the environment.

The drone itself is a lightweight UAV with four rotors, an antenna, protective and stabilising shielding, and the guts of a Kinect sensor. It communicates with a nearby laptop, but all the sensing and computation needed to determine its position are done onboard by an internal computer.

The quadrocopter runs an MIT-developed real-time visual odometry algorithm to estimate its location and velocity, which helps stabilise the vehicle during fully autonomous 3D flight. Most small drones rely on GPS or pre-coded information about the area to avoid bumping into things; this little vehicle does all those calculations on the fly.

Odometry is the process of using data from some kind of sensor to figure out position, like measuring how far a robot's legs or wheels have moved to determine how far it has travelled. In this case, the algorithm looks at successive frames and depth estimates from the Kinect, and matches features across the images to work out how far, and in which direction, the vehicle has moved.
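To give a flavour of the idea, here is a minimal sketch of the core geometry behind visual odometry with a depth camera: once features have been matched between two frames and back-projected to 3D using the Kinect's depth readings, the motion between frames can be recovered as a rigid transform via a least-squares fit (the classic Kabsch/Umeyama method). This is an illustrative toy, not the MIT team's actual algorithm, and the point coordinates below are made up.

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t such that dst ~= R @ src + t,
    given matched 3D points (Kabsch/Umeyama least-squares fit)."""
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    # Cross-covariance of the centred point sets
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Toy example: four features seen in frame k, and the same features in
# frame k+1 after a pure translation of (0.1, 0, 0.05) metres.
prev = np.array([[0.0, 0.0, 2.0],
                 [0.5, 0.2, 2.5],
                 [-0.3, 0.4, 1.8],
                 [0.2, -0.1, 2.2]])
motion = np.array([0.1, 0.0, 0.05])
curr = prev + motion

R, t = rigid_transform(prev, curr)
print(np.round(t, 3))  # recovers the frame-to-frame displacement
```

Dividing the recovered displacement by the time between frames gives the velocity estimate the controller needs to hold the vehicle steady.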

The data is also streamed to a nearby computer, which collates the colour images and location data to build a neat 3D model of the space. The MIT team collaborated with Peter Henry and Mike Krainin from the robotics and state estimation lab at the University of Washington to use their RGBD-SLAM algorithms.

Krainin and Henry build 3D point clouds just by pointing a Kinect or other depth camera at an object or room, or even carrying a camera through a series of corridors and buildings. You can see some seriously cool videos of this process at their project's website.
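The basic building block of such point clouds is back-projection: each pixel of a depth image, together with the camera's intrinsic parameters, maps to a 3D point. Below is a minimal sketch of that step, assuming a simple pinhole camera model; the intrinsics (`FX`, `FY`, `CX`, `CY`) are made-up values for a tiny test image, and this is not the Washington team's actual code.

```python
import numpy as np

# Hypothetical pinhole intrinsics for an 8x6 test image (illustrative only)
FX, FY = 525.0, 525.0   # focal lengths, in pixels
CX, CY = 3.5, 2.5       # principal point (image centre)

def depth_to_points(depth):
    """Back-project a depth image (metres) into an N x 3 point cloud.

    A pixel (u, v) with depth z maps to:
        x = (u - CX) * z / FX
        y = (v - CY) * z / FY
    Pixels with zero depth (no sensor return) are dropped.
    """
    v, u = np.indices(depth.shape)
    z = depth.ravel()
    valid = z > 0
    x = (u.ravel() - CX) * z / FX
    y = (v.ravel() - CY) * z / FY
    return np.stack([x, y, z], axis=1)[valid]

depth = np.full((6, 8), 2.0)   # a flat wall two metres away
depth[0, 0] = 0.0              # one missing reading
cloud = depth_to_points(depth)
print(cloud.shape)  # one 3D point per valid pixel
```

A SLAM system then registers successive clouds like this one against each other (much as in the odometry step) and stitches them into a single model of the room.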

While the 3D maps are a nice touch, the primary goal of the project is to create drones that don't depend on pre-made maps or GPS. "In environments where GPS is noisy and maps are unavailable, such as indoors or in dense urban environments, a UAV runs the risk of becoming lost, operating in high threat regions, or colliding with obstacles," MIT's Robust Robotic Group writes on its research site.

"We are developing estimation and planning algorithms that allow MAVs to use environmental sensors such as laser range finders or cameras to estimate their position, build maps of the environment and fly safely and robustly." In the United States, both the Office of Naval Research and the Army Research Office have sponsored the project to help build better drones.

This article was originally published by WIRED UK