Autonomous Obstacle Avoidance with Nano-Drones and Novel Depth Sensors

Status: Available

Project Description

[Figure: The depth information provided by the ToF multi-zone sensor]

One reason for the high research interest in UAVs is their potential to navigate indoors autonomously while avoiding obstacles. This task involves several challenges, such as online perception, control, trajectory optimization, and localization. The small form-factor category is a particularly promising class of UAVs: these nano-UAVs measure a few centimeters in size, weigh a few tens of grams, and are considered ideal candidates for navigating very narrow indoor areas for monitoring and inspection purposes.

Vision-based perception algorithms used routinely on standard-size drones rely either on simultaneous localization and mapping (SLAM), a perception technique that builds a 3D local map of the environment, or on end-to-end convolutional neural networks (CNNs). Due to the large number of pixels in camera images, either approach requires a large number of computations per frame.

Novel sensors provide an alternative to vision-based solutions. One such sensor is the VL53L5CX from STMicroelectronics, a miniaturized and lightweight optical multi-zone time-of-flight (ToF) sensor targeted at indoor perception and autonomous navigation. The VL53L5CX features a matrix of 8x8 ToF zones in a compact integrated package that represents a negligible payload even for a nano-drone. Navigation algorithms intrinsically require depth information, which this sensor provides directly, so obstacle avoidance and navigation can be performed with a very small number of pixels (i.e., 64 pixels).
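
To make the sensor interface concrete, the following is a minimal sketch of how a 64-zone depth frame could be acquired on the STM32 through ST's VL53L5CX ULD driver (vl53l5cx_api.h). It is illustrative only: the I2C platform glue, the exact driver version, and the chosen 15 Hz ranging frequency are assumptions that must be adapted to the actual firmware used in this project.

  #include "vl53l5cx_api.h"   /* ST's ULD driver for the VL53L5CX */

  /* One-time sensor setup: 8x8 resolution at a modest frame rate.
   * The I2C platform layer inside dev is assumed to be set up elsewhere. */
  int tof_init(VL53L5CX_Configuration *dev)
  {
      if (vl53l5cx_init(dev))
          return -1;
      if (vl53l5cx_set_resolution(dev, VL53L5CX_RESOLUTION_8X8))
          return -1;
      if (vl53l5cx_set_ranging_frequency_hz(dev, 15))  /* 15 Hz is the maximum in 8x8 mode */
          return -1;
      return vl53l5cx_start_ranging(dev);
  }

  /* Acquire one 8x8 depth frame (distances in millimetres).
   * Returns 0 on success, -1 on a driver error. */
  int tof_read_frame(VL53L5CX_Configuration *dev, int16_t depth_mm[64])
  {
      VL53L5CX_ResultsData results;
      uint8_t ready = 0;

      /* Poll until a new ranging frame is available */
      do {
          if (vl53l5cx_check_data_ready(dev, &ready))
              return -1;
      } while (!ready);

      if (vl53l5cx_get_ranging_data(dev, &results))
          return -1;

      /* With one target per zone, distance_mm holds 64 values (row-major 8x8 grid) */
      for (int i = 0; i < 64; i++)
          depth_mm[i] = results.distance_mm[i];

      return 0;
  }

In practice, the per-zone target_status field reported by the driver should also be checked so that invalid or low-confidence zones are discarded before the frame is passed to the avoidance logic.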

Your goal is to develop an effective and robust obstacle avoidance algorithm that runs entirely on-board. The algorithm will acquire and interpret the depth frames and steer the drone so that it does not collide with obstacles. Since path planning is not the topic of this project, the drone can fly a random trajectory while avoiding obstacles. The physical system, featuring a commercial nano-drone (i.e., Crazyflie 2.1) and the ToF multi-zone sensor, has already been implemented, so this project focuses mostly on the development and validation of the avoidance algorithm.
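
As an illustration of what such an algorithm could look like, the sketch below shows one possible reactive policy (not the existing preliminary implementation): the 8x8 frame is split into a left and a right half, the closest obstacle in each half is found, and the drone yaws away from the nearer side, slowing down or stopping as the distance shrinks. The avoid_cmd_t setpoint structure, the thresholds, and the gains are hypothetical placeholders for whatever commander interface the Crazyflie firmware exposes, and the left/right column mapping depends on how the sensor is mounted.

  #include <stdint.h>

  /* Hypothetical velocity/yaw setpoint handed to the flight controller;
   * a positive yaw rate is assumed to turn the drone to the left. */
  typedef struct {
      float forward_m_s;      /* forward velocity (m/s) */
      float yaw_rate_deg_s;   /* yaw rate (deg/s) */
  } avoid_cmd_t;

  #define STOP_DIST_MM   500    /* closer than this: stop and turn in place */
  #define SLOW_DIST_MM  1200    /* closer than this: slow down and steer away */
  #define CRUISE_M_S     0.5f
  #define TURN_DEG_S    60.0f

  /* Minimum valid distance over one half of the frame (columns 0-3 or 4-7) */
  static int16_t half_min(const int16_t depth_mm[64], int right_half)
  {
      int16_t min = INT16_MAX;
      int col_start = right_half ? 4 : 0;
      for (int row = 0; row < 8; row++) {
          for (int col = col_start; col < col_start + 4; col++) {
              int16_t d = depth_mm[row * 8 + col];
              if (d > 0 && d < min)   /* ignore zones reported as 0 (no target) */
                  min = d;
          }
      }
      return min;
  }

  /* Map one depth frame to a steering command */
  avoid_cmd_t avoid_step(const int16_t depth_mm[64])
  {
      avoid_cmd_t cmd = { CRUISE_M_S, 0.0f };
      int16_t left    = half_min(depth_mm, 0);
      int16_t right   = half_min(depth_mm, 1);
      int16_t closest = (left < right) ? left : right;
      float away      = (left < right) ? -1.0f : 1.0f;  /* turn away from the nearer side */

      if (closest < STOP_DIST_MM) {
          cmd.forward_m_s    = 0.0f;                 /* too close: hover and turn in place */
          cmd.yaw_rate_deg_s = away * TURN_DEG_S;
      } else if (closest < SLOW_DIST_MM) {
          cmd.forward_m_s    = 0.5f * CRUISE_M_S;    /* obstacle ahead: slow down, steer away */
          cmd.yaw_rate_deg_s = away * 0.5f * TURN_DEG_S;
      }
      return cmd;
  }

On the real platform such a policy would typically run as a periodic FreeRTOS task that acquires a frame, computes the command, and forwards it to the flight controller; filtering across consecutive frames, handling of invalid zones, and in-field tuning of the thresholds are where most of the engineering effort lies.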

A preliminary version of such an algorithm has already been implemented.

Video: https://www.youtube.com/watch?v=cU40pqu24bw

Character

  • 30% Literature and algorithm development
  • 30% FreeRTOS C programming (STM32 Platform)
  • 40% In-field evaluation and testing

Prerequisites

  • Strong interest in embedded systems
  • Experience with data acquisition and analysis
  • Experience with low-level C programming

References