Robots Building Better Maps: For robots and other mechanical creatures


[Video: Nick and his Segway robot in action]

Nick Carlevaris-Bianco, a PhD student in Electrical Engineering: Systems, is using a robot equipped with highly sensitive 3D laser scanners and cameras to generate robust 3D maps. These maps could one day support autonomous navigation of vehicles and similar applications, as well as augmented reality on electronic devices.

“We collaborate with Ford and they have an autonomous F-250 truck with a very similar set of sensors,” explains Nick. “The Segway robot we are using allows us to work on similar algorithms on campus, both indoors and outdoors.”

Nick’s primary research involves creating finely tuned algorithms that interpret what the cameras and laser scanners detect in order to generate a map of a given environment. The algorithms process roughly a million data points per second generated by the laser scanners. Because the element of time (and, with time, a changing environment) is built into the map, it becomes, in effect, a 4D map.
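One way to picture a 4D map is as a set of 3D points that each carry a timestamp, so the map can be queried "as of" a given moment. The sketch below is a minimal illustration of that idea in Python; the class and method names (`TemporalMap`, `snapshot`) are hypothetical and are not taken from the PeRL codebase.

```python
from dataclasses import dataclass, field

@dataclass
class Observation:
    point: tuple       # (x, y, z) position reported by the laser scanner
    timestamp: float   # when the point was observed, in seconds

@dataclass
class TemporalMap:
    """Toy 4D map: 3D points plus the time at which each was observed."""
    observations: list = field(default_factory=list)

    def add(self, point, timestamp):
        self.observations.append(Observation(point, timestamp))

    def snapshot(self, t, horizon=3600.0):
        """Return the points observed within `horizon` seconds before time t,
        i.e. the map as the robot would have believed it at that moment."""
        return [o.point for o in self.observations
                if t - horizon <= o.timestamp <= t]

# A point seen in a morning scan has "aged out" by the afternoon query:
m = TemporalMap()
m.add((1.0, 2.0, 0.5), timestamp=0.0)       # morning observation
m.add((4.0, 0.0, 1.0), timestamp=5000.0)    # afternoon observation
print(m.snapshot(t=5000.0, horizon=3600.0))  # → [(4.0, 0.0, 1.0)]
```

A real system would of course store millions of points in a spatial index rather than a flat list, but the principle is the same: time is a first-class coordinate of the map, not metadata bolted on afterward.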

This map will be useful to autonomous robots and other mechanical devices because the basic environment can be known before they attempt to navigate the terrain and interact with transient objects such as humans or automobiles.


“The contribution of our work,” stated Nick, “is that we are specifically building an understanding that the environment changes over time into our map representation. This will allow the map to be used over much longer periods of time and to be continuously updated as new observations of the environment are made. Some examples of the changes we hope to capture include: furniture being moved and doors being opened and closed indoors, the changes in lighting throughout the day, weather, the changing of seasons, and even construction.”
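Continuously updating a map means weighing fresh observations against stale ones: a parked car seen yesterday should count for less than an empty space seen just now. A common way to express this is exponential forgetting of a cell's occupancy belief. The snippet below is a minimal sketch of that idea, not the actual update rule used in PeRL's maps; the function name and the half-life value are illustrative assumptions.

```python
def update_belief(prior, observation, dt, half_life=86400.0):
    """Blend a new occupancy observation (0.0 = free, 1.0 = occupied)
    into a map cell's belief, discounting the prior by how stale it is.
    `dt` is the time since the prior was last updated, in seconds;
    the prior's weight halves every `half_life` seconds."""
    decay = 0.5 ** (dt / half_life)             # weight left in the old belief
    return decay * prior + (1.0 - decay) * observation

# A cell mapped as occupied (say, a parked car) one day ago is now seen free:
belief = 1.0                                     # yesterday: occupied
belief = update_belief(belief, observation=0.0, dt=86400.0)
print(belief)  # → 0.5 : one half-life elapsed, so old and new weigh equally
```

Repeated free observations would drive the belief toward zero, so the map gradually "lets go" of the car without discarding the rest of what it has learned.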

Nick works in the Perceptual Robotics Laboratory (PeRL) under the direction of Prof. Ryan Eustice, an assistant professor of Naval Architecture and Marine Engineering, Electrical Engineering and Computer Science, and Mechanical Engineering.