Intelligent Autonomous Flying Robots Learn and Map Environment As They Fly
An anonymous reader writes with this story about a machine-learning project out of the UK's University of Sheffield: Using simple drones, the researchers have created automatic-control software that enables the "flying robot" to learn about its surroundings using a camera and an array of sensors. The robot starts with no information about its environment or the objects within it. But by overlaying different frames from the camera and selecting key reference points within the scene, it builds up a 3D map of the world around it. Other sensors pick up barometric and ultrasonic data, giving the robot additional clues about its environment. All this information is fed into autopilot software that lets the robot navigate safely, and also learn about nearby objects and navigate to specific items.
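The "overlay frames, pick reference points, build a 3D map" step can be sketched with a toy pinhole-stereo calculation: a keypoint matched across two overlapping frames (the drone having moved between them) gives a depth, which back-projects to a 3D map point. All numbers here are invented for illustration; this is not the Sheffield team's code.

```python
# Toy sketch: one matched keypoint across two overlapping frames -> one
# 3-D map point. Focal length, baseline, and image size are all assumed.

F = 500.0              # focal length in pixels (assumed)
BASELINE = 0.2         # distance the drone moved between frames, metres (assumed)
CX, CY = 320.0, 240.0  # principal point of a hypothetical 640x480 image

def keypoint_to_3d(x1, y1, x2):
    """Triangulate a keypoint seen at (x1, y1) in frame 1 and (x2, y1)
    in frame 2, assuming pure sideways motion between the frames."""
    disparity = x1 - x2            # pixel shift between the two frames
    z = F * BASELINE / disparity   # depth, from similar triangles
    x = (x1 - CX) * z / F          # back-project into camera coordinates
    y = (y1 - CY) * z / F
    return (x, y, z)

# A corner matched at x=340 in one frame and x=320 in the next:
print(keypoint_to_3d(340.0, 240.0, 320.0))  # (0.2, 0.0, 5.0)
```

Repeat this over many matched keypoints and many frame pairs and you get the point cloud the article calls a "3D map of the world".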
SLAM? (Score:5, Informative)
Doing this is called Simultaneous Localization and Mapping, or SLAM. There's been enormous progress in that in the last decade. The basic idea is to take a large number of images of the same scene, possibly with inaccurate data about where they were taken, and build up a 3D model. It sort of works most of the time. Some algorithms do well indoors, especially where there are lots of strong edges and corners. Those are easy features to lock onto. Outdoors is tougher, although outdoors you can usually use GPS. It's a basic capability robots need.
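The core of SLAM is that pose and map are estimated jointly: the robot doesn't know exactly where it is, so each landmark observation corrects both its own position estimate and the map. A toy 1-D linear-Kalman-filter version (a hedged sketch with invented numbers, not any of the packages discussed here) looks like:

```python
# Toy 1-D SLAM: jointly estimate robot position and one landmark's
# position with a linear Kalman filter. State mu = [robot_x, landmark_x].
# Purely illustrative; all noise values are invented.

def slam_step(mu, P, u, z, q=0.1, r=0.05):
    """One predict/update cycle.
    mu = [robot_x, landmark_x]; P = 2x2 covariance (list of lists);
    u = odometry (commanded move); z = measured range landmark - robot."""
    # Predict: the robot moves, the landmark stays put, and only the
    # robot's position uncertainty grows (by q).
    mu = [mu[0] + u, mu[1]]
    P = [[P[0][0] + q, P[0][1]],
         [P[1][0],     P[1][1]]]
    # Update with measurement model z = landmark_x - robot_x, i.e. H = [-1, 1].
    innov = z - (mu[1] - mu[0])
    s = P[0][0] - P[0][1] - P[1][0] + P[1][1] + r  # innovation variance H P H' + r
    k0 = (P[0][1] - P[0][0]) / s                   # Kalman gain K = P H' / s
    k1 = (P[1][1] - P[1][0]) / s
    mu = [mu[0] + k0 * innov, mu[1] + k1 * innov]
    # Covariance update P = (I - K H) P
    P = [[(1 + k0) * P[0][0] - k0 * P[1][0], (1 + k0) * P[0][1] - k0 * P[1][1]],
         [k1 * P[0][0] + (1 - k1) * P[1][0], k1 * P[0][1] + (1 - k1) * P[1][1]]]
    return mu, P

# Noise-free walkthrough: landmark truly at 5.0; the robot starts at 0.0
# and steps right by 1.0 three times, ranging the landmark each step.
mu = [0.0, 5.0]                   # landmark initialised from the first range
P = [[0.0, 0.0], [0.0, 100.0]]    # robot pose known, landmark very uncertain
for true_robot in (1.0, 2.0, 3.0):
    mu, P = slam_step(mu, P, u=1.0, z=5.0 - true_robot)
print([round(v, 3) for v in mu])  # [3.0, 5.0]
```

Real visual SLAM replaces the 1-D state with 6-DoF poses and thousands of 3D landmarks, and the filter with bundle adjustment or a factor graph, but the predict/observe/correct loop is the same.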
The video is frustrating. There's no comparison with previous work. Is this an advance, or did they just use known algorithms? [openslam.org]
off the shelf software (Score:4, Informative)
They are using the PTAM (Parallel Tracking and Mapping) package from the University of Oxford
http://www.robots.ox.ac.uk/~gk... [ox.ac.uk]
What's more, they are using the off-the-shelf ardrone-PTAM package
https://github.com/nymanjens/a... [github.com]
and replicating something done TWO YEARS AGO by Jens Nyman (from a Belgian university)
https://www.youtube.com/watch?... [youtube.com]
so W T F