<!-- Creating Object Detection and Tracking on the Edge -->

[[File:Dvs_drones.png|200px|thumb|right|Example of Drone Detection, comparing DVS and RGB images]]
= Overview =
Dynamic Vision Sensors (DVS), also called event-based cameras, can detect fast-moving and small objects when mounted statically and open up many new possibilities for AI and tinyML. We are building a completely new system, consisting of an autonomous base station and distributed smart sensor nodes, to run cutting-edge AI algorithms and perform novel sensor fusion.
= Project description =
Object detection has shifted from classical approaches based on handcrafted image features to AI-based approaches. With the speedup delivered by modern GPUs, deep neural networks run fast and achieve detection accuracy comparable or even superior to humans. However, in scenes with very low or very high brightness, the limited dynamic range of standard CCD/CMOS cameras degrades image quality, and the networks start to fail. One approach to overcome this problem is a novel camera sensor, the Dynamic Vision Sensor (DVS). Instead of recording absolute pixel intensities, these sensors record intensity changes, similar to the human eye, and achieve a dynamic range of >120 dB, which is comparable to human vision. We aim to develop new object detection and tracking algorithms (for UAVs, cars, and people), use bio-inspired processing hardware, implement a new stereo-vision algorithm, attack existing algorithms with adversarial examples, and build up a new autonomous detection system. This can involve using and programming an MCU, an Nvidia Jetson Orin, a custom ASIC, or even neuromorphic computing platforms.
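As a rough illustration of how such an event stream could be fed to a standard detection network, the following Python sketch accumulates DVS events (timestamp, x, y, polarity) from a short time window into a two-channel histogram frame. The sensor resolution (346×260, as on a DAVIS346), the 10 ms window, and the function name are illustrative assumptions, not part of the actual project setup.

<pre>
import numpy as np

def events_to_frame(events, height=260, width=346, dt=10e-3):
    """Accumulate DVS events (t [s], x, y, polarity in {-1, +1})
    from one time window of length dt into a 2-channel histogram frame.
    Resolution and window length are placeholder values."""
    frame = np.zeros((2, height, width), dtype=np.float32)
    t0 = events[0, 0]
    window = events[events[:, 0] < t0 + dt]   # keep events inside the window
    for t, x, y, p in window:
        channel = 0 if p > 0 else 1           # split ON/OFF polarities
        frame[channel, int(y), int(x)] += 1   # count events per pixel
    return frame

# Toy usage with synthetic events (t, x, y, polarity)
rng = np.random.default_rng(0)
events = np.column_stack([
    np.sort(rng.uniform(0, 0.02, 1000)),      # timestamps in seconds
    rng.integers(0, 346, 1000),               # x coordinates
    rng.integers(0, 260, 1000),               # y coordinates
    rng.choice([-1, 1], 1000),                # polarity
])
frame = events_to_frame(events)
print(frame.shape)  # (2, 260, 346), ready as input to a CNN detector
</pre>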
Your task in this project will be one or several of the tasks listed below. Depending on the type of thesis (Semester or Master thesis), tasks will be assigned according to your interests and skills.
== Tasks: ==
* Event-based and Frame-based and/or 3D Imaging
* Parallel Programming from MCU to Nvidia Jetson Orin
* Adversarial Attacks (see the sketch after this list)
* New Object Detection Algorithms or Neuromorphic Algorithms
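To give an idea of what the adversarial-attack task involves, here is a minimal PyTorch sketch of the Fast Gradient Sign Method (FGSM), one of the simplest attacks on an existing network. The toy classifier, input size, and epsilon are placeholders, not the project's actual detector or attack setup; a real evaluation would target a trained detection model.

<pre>
import torch
import torch.nn.functional as F

def fgsm_attack(model, image, label, epsilon=8 / 255):
    """Fast Gradient Sign Method: perturb the input in the direction
    that increases the classification loss, bounded by epsilon."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    perturbed = image + epsilon * image.grad.sign()
    return perturbed.clamp(0.0, 1.0).detach()

# Toy usage with a randomly initialised classifier (placeholder model)
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
image = torch.rand(1, 3, 32, 32)          # one normalised RGB image
label = torch.tensor([3])                 # assumed ground-truth class
adversarial = fgsm_attack(model, image, label)
print((adversarial - image).abs().max())  # perturbation stays within epsilon
</pre>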
== Prerequisites (not all needed!) depending on the task ==
* Embedded Firmware Design and experience with FreeRTOS, Zephyr, etc.
* Experience in Machine Learning and/or neuromorphic computing
* Parallel programming
== Type of work ==
* 20% Literature study
* 60% Software and/or Hardware design
* 20% Measurements and validation
== Status: Available ==
* Type: Semester or Master Thesis (multiple students possible)
* Professor: [https://www.ee.ethz.ch/the-department/people-a-z/person-detail.html?persid=194234 Prof. Dr. Luca Benini]
* Supervisors:
{|
| style="padding: 5px" | [[File:Julian_Moosmann.jpg|frameless|left|75px]]
|
===[[:User:Julian | Julian Moosmann]]===
* '''e-mail''': [mailto:julian.moosmann@pbl.ee.ethz.ch julian.moosmann@pbl.ee.ethz.ch]
* '''phone''': +41 79 328 15 91
* '''skype''': mojulian_2
* '''Office''': ETF F 110
| style="padding: 5px" | [[File:philippmayer.jpg|frameless|left|75px]]
|
===[[:User:mayerph | Philipp Mayer]]===
* '''e-mail''': [mailto:mayerph@iis.ee.ethz.ch mayerph@iis.ee.ethz.ch]
* '''phone''': +41 44 63 242 68
* '''skype''': mayer.philipp1
* '''Office''': ETF F 108
|}

* Currently involved students:
** None
[[Category:Available]] [[Category:Digital]] [[Category:Event-Driven Computing]] [[Category:Deep Learning Projects]] [[Category:EmbeddedAI]] [[Category:SmartSensors]] [[Category:System Design]] [[Category:2023]] [[Category:Semester Thesis]] [[Category:Master Thesis]] [[Category:Julian]] [[Category:Hot]]
