EU Research Project


This project has received funding from the European Union’s Seventh Framework Programme for research, technological development and demonstration under grant agreement no. 266470

Scientific contact

Prof. Dr. Heinrich Bülthoff

Max Planck Institute for Biological Cybernetics

Further details


Download the myCopter flyers:
myCopter objectives flyer
myCopter results flyer

Parallel tracking and mapping

This work was performed at the Autonomous Systems Lab at the Eidgenössische Technische Hochschule Zürich and is part of the following publication (also see our publication database):
Achtelik M. W., Lynen S., Weiss S., Kneip L., Chli M. and Siegwart R. (2012). Visual-Inertial SLAM for a Small Helicopter in Large Outdoor Environments. IEEE/RSJ International Conference on Intelligent Robots and Systems 2012, 1-2.


Current out-of-the-box navigation solutions lack the robustness and flexibility to leave a controlled laboratory environment and navigate with small aerial vehicles. Truly autonomous flight in general environments is not possible without relying on unrealistic assumptions such as uninterrupted GPS signals, a perfect communication link to a ground station for data processing and control, or pose measurements from an external motion-capture system. Higher-level tasks, such as autonomous exploration, swarm operation and planning of long trajectories, can only be tackled once these issues are solved.
Computer vision techniques are commonly used in research for real-time tracking and navigation. High-performing stereo-based systems have demonstrated successful operation on ground vehicles. However, stereo setups are unsuitable for navigation with small aerial vehicles, as the stereo pair essentially reduces to a monocular image when the scene is viewed from a large distance or from very close by (e.g., during landing). Hence we focus our study on monocular methodologies.
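To illustrate why a fixed-baseline stereo rig degrades toward the monocular case at range, the sketch below evaluates the standard depth-from-disparity relation Z = f·B/d, whose depth uncertainty for a one-pixel disparity error grows quadratically with distance. The baseline and focal length are assumed values chosen to be plausible for a small aerial vehicle, not figures from the project.

```python
def stereo_depth_error(depth_m, baseline_m=0.12, focal_px=500.0, disp_err_px=1.0):
    """Approximate depth uncertainty of a stereo rig at a given depth.

    From Z = f*B/d, a disparity error of disp_err_px pixels maps to a
    depth error of roughly Z^2 * disp_err_px / (f * B).
    Baseline/focal values here are illustrative assumptions.
    """
    return depth_m ** 2 * disp_err_px / (focal_px * baseline_m)

# With a 12 cm baseline, the error explodes with distance:
for z in (1.0, 5.0, 20.0, 50.0):
    print(f"depth {z:5.1f} m -> error ~ {stereo_depth_error(z):6.2f} m")
```

At a few tens of metres the depth uncertainty becomes comparable to the depth itself, so the second camera adds little information and the setup behaves like a single camera.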
We have implemented a high-performance parallel tracking and mapping system to achieve monocular self-localisation and mapping on board a small aerial vehicle. Because of the long trajectories flown, the highly dynamic motion of the vehicle, and dynamic objects in the scene, we use additional sensors: an Inertial Measurement Unit featuring accelerometers and gyroscopes, a magnetometer, air-pressure data, and GPS information where available.
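The idea of complementing the vision pipeline with inertial data can be sketched as a simple Kalman filter: high-rate IMU accelerations drive the prediction step, while slower visual position fixes correct the accumulated drift. This is a minimal 1-D illustration under our own assumed noise parameters, not the project's actual multi-sensor filter.

```python
import numpy as np

def predict(x, P, accel, dt, q=0.1):
    """Propagate state [position, velocity] with a measured acceleration (IMU-driven step)."""
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])
    x = F @ x + np.array([0.5 * dt ** 2, dt]) * accel
    P = F @ P @ F.T + q * np.eye(2)  # process noise q is an assumed value
    return x, P

def update(x, P, pos_meas, r=0.05):
    """Correct the state with a (slower) visual position measurement."""
    H = np.array([[1.0, 0.0]])
    S = H @ P @ H.T + r              # innovation covariance; r is assumed
    K = P @ H.T / S                  # Kalman gain
    x = x + (K * (pos_meas - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Many fast IMU predictions, then one visual correction:
x, P = np.zeros(2), np.eye(2)
for _ in range(10):
    x, P = predict(x, P, accel=0.0, dt=0.01)
x, P = update(x, P, pos_meas=0.1)
print("position estimate:", x[0])
```

The visual update pulls the drifting inertial estimate toward the measured position and shrinks its covariance; the real system fuses full 6-DoF pose with additional magnetometer, pressure and GPS inputs.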
The framework that we have developed is publicly available online. The packages ethzasl_ptam, ethzasl_sensor_fusion, and asctec_mav_framework were developed partially within the myCopter project.