
Autonomous Mapping with Nano-Drones, UWB and Novel Depth Sensors

From iis-projects


Latest revision as of 14:09, 11 February 2022


Status: Available

Project Description

The depth information provided by the multi-zone ToF sensor

One reason for the high research interest in the field of UAVs is their potential to autonomously navigate indoors while avoiding obstacles. Performing this task includes several challenges, such as online perception, control, trajectory optimization, and localization. The small form-factor category represents an even more promising class of UAVs. The UAVs in this class (i.e., nano-UAVs) measure a few centimeters in size and weigh a few tens of grams. They are considered the ideal candidates for navigating in very narrow indoor areas for monitoring and inspection purposes.

Vision-based perception algorithms used routinely on standard-size drones are based on simultaneous localization and mapping (SLAM) – a perception technique that builds a 3D local map of the environment – or end-to-end convolutional neural networks (CNNs). However, because images typically contain a large number of pixels, these approaches still require a large number of computations per frame.

The Crazyflie 2.1 with Philip Wiese's semester project – a deck with UWB and GAP8; the camera can be exchanged with a ToF array sensor


However, novel sensors provide an alternative to vision-based solutions, such as the VL53L5CX from STMicroelectronics, a miniaturized, lightweight optical multi-zone time-of-flight (ToF) sensor targeted at indoor perception and autonomous navigation. The VL53L5CX features a matrix of 8x8 ToF elements in a compact integrated package that represents a negligible payload even for a nano-drone. Navigation algorithms intrinsically require depth information, which this sensor provides directly; obstacle avoidance and navigation can therefore be performed with a greatly reduced number of pixels (i.e., 64).

Your goal is to develop a lightweight 2D occupancy-grid mapping algorithm that fits onboard a nano-drone. For this, you will use ultra-wideband (UWB) localization to simplify the algorithm, as this is the very first time we attempt mapping onboard a nano-drone. Last semester, a student designed the hardware for this task: an extension deck for the Crazyflie featuring both a UWB module and an ultra-low-power multi-core processor, GAP8. Your tasks will include porting the driver for the ToF array sensor to GAP8, researching mapping algorithms, implementing a mapping algorithm, and in-field testing. If you are interested in this project as a master's thesis, trajectory planning on the generated map can be included.
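To give a flavour of what a lightweight 2D occupancy-grid update looks like, here is a minimal log-odds sketch: given the drone position from UWB and one obstacle hit from the ToF sensor, cells traversed by the ray are marked free and the end cell occupied. The grid size, 10 cm resolution, and log-odds increments are illustrative assumptions, not project specifications.

```c
#include <math.h>
#include <stdlib.h>

#define MAP_SIZE 100         /* 100 x 100 cells, drone starts at the centre */
#define CELL_M   0.10f       /* 10 cm resolution (assumed) */
#define L_OCC    0.85f       /* log-odds increments (assumed tuning values) */
#define L_FREE  -0.40f

static float logodds[MAP_SIZE][MAP_SIZE];  /* 0 = unknown */

static int to_cell(float m) { return (int)(m / CELL_M) + MAP_SIZE / 2; }

/* Integrate one range measurement: drone at (dx, dy) from UWB localization,
 * obstacle hit at (hx, hy) from a ToF zone, both in metres in the world
 * frame. Cells along the ray are stepped with Bresenham's line algorithm. */
static void grid_update(float dx, float dy, float hx, float hy)
{
    int x0 = to_cell(dx), y0 = to_cell(dy);
    int x1 = to_cell(hx), y1 = to_cell(hy);
    int sx = x0 < x1 ? 1 : -1, sy = y0 < y1 ? 1 : -1;
    int ex = abs(x1 - x0), ey = -abs(y1 - y0), err = ex + ey;

    while (x0 != x1 || y0 != y1) {
        if (x0 >= 0 && x0 < MAP_SIZE && y0 >= 0 && y0 < MAP_SIZE)
            logodds[x0][y0] += L_FREE;   /* cell traversed by the ray */
        int e2 = 2 * err;
        if (e2 >= ey) { err += ey; x0 += sx; }
        if (e2 <= ex) { err += ex; y0 += sy; }
    }
    if (x1 >= 0 && x1 < MAP_SIZE && y1 >= 0 && y1 < MAP_SIZE)
        logodds[x1][y1] += L_OCC;        /* end cell: the reflecting surface */
}
```

A 100x100 float grid is 40 kB, which already hints at the memory pressure on a nano-drone; part of the project is deciding on a representation (resolution, cell type) that fits the GAP8 memory budget.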


Character

  • 30% Literature and algorithm development
  • 50% C programming (PULP Platform)
  • 20% In-field evaluation and testing

Prerequisites

  • Strong interest in embedded systems
  • Experience with low-level C programming
