Embedded Audio Source Localization Exploiting Coincidence Detection in Asynchronous Spike Streams

From iis-projects

Introduction

The goal of this project is to develop a software application running on a PULP Chip, enabling the extraction of meaningful content from an asynchronous data stream produced by a “spiking” sensor. The prototype system on which the source localization application will be implemented consists of the following blocks:

  • PULP Chip mounted on a prototype board
  • Dynamic Audio Sensor (DAS) Board
  • AER-SPI interface, implemented on an ULP FPGA development board (see [1])

The Dynamic Audio Sensor, which will be used to collect environmental audio data, is an asynchronous event-based silicon cochlea developed by the Institute of Neuroinformatics (INI) in Zurich. The custom chip asynchronously outputs a stream of address-events representing activity in different audio frequency ranges. PULP is an open, scalable hardware and software research platform developed as a joint project between the Integrated Systems Laboratory (IIS) of ETH Zurich and the Energy-efficient Embedded Systems (EEES) group of UNIBO.

The PULP platform is a multi-core platform achieving leading-edge energy efficiency and featuring widely tunable performance. The aim of PULP is to satisfy the computational demands of IoT applications requiring flexible processing of data streams generated by multiple sensors, such as accelerometers, low-resolution cameras, microphone arrays, and vital-signs monitors. As opposed to single-core MCUs, a parallel ultra-low-power programmable architecture makes it possible to meet the computational requirements of these applications without exceeding the power envelope of a few mW typical of miniaturized, battery-powered systems. Moreover, OpenMP, OpenCL and OpenVX are supported on PULP, enabling agile application porting, development, performance tuning and debugging.

The AER-SPI interface is a custom IP hosted on an ULP FPGA development board. It efficiently collects and buffers the data produced asynchronously by the DAS sensor; bursts of variable size are then transmitted to PULP through a common SPI interface.
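To make the data path concrete, the sketch below shows how a burst received over SPI could be unpacked into address-events on the PULP side. The 32-bit event layout (ear bit, channel field, timestamp field) is an assumption made for illustration, not the actual DAS or AER-SPI protocol:

```c
#include <stdint.h>

/* Hypothetical packed AER event word (an assumed layout, NOT the real
   DAS/AER-SPI protocol):
     bit  31    : ear (0 = left cochlea, 1 = right cochlea)
     bits 30-24 : frequency channel index
     bits 23-0  : timestamp in microseconds                          */
typedef struct {
    uint8_t  ear;
    uint8_t  channel;
    uint32_t timestamp;
} aer_event_t;

aer_event_t aer_decode(uint32_t word)
{
    aer_event_t ev;
    ev.ear       = (uint8_t)((word >> 31) & 0x1u);
    ev.channel   = (uint8_t)((word >> 24) & 0x7Fu);
    ev.timestamp = word & 0xFFFFFFu;
    return ev;
}

/* Unpack a variable-size SPI burst of n words into an event array. */
void aer_decode_burst(const uint32_t *burst, int n, aer_event_t *out)
{
    for (int i = 0; i < n; i++)
        out[i] = aer_decode(burst[i]);
}
```

Keeping the decode step separate from the SPI transfer means the burst can be DMA'd into memory and unpacked later, which fits the bursty, unpredictable arrival pattern described above.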

Project description

[Figure: Binaural cochlea]

One interesting application of the binaural DAS is audio source localization. This is possible by exploiting the inter-aural time difference (ITD): the difference in the arrival time of a sound wave at the two ears, which depends on the azimuth of the source. Similarly to what our auditory system does to detect the azimuthal direction of a sound, the direction of the audio source can be obtained by looking at the time differences between the left and right output channels of the sensor [Chan2007].

The use of this data representation is particularly efficient [3]: an online localization algorithm can extract the sound location with coarse resolution and lower latency than a cross-correlation algorithm running on the raw microphone outputs. Since the burst transmission is not easily predictable, both the algorithm and the power management policy of the system have to be engineered to maximize energy efficiency. The goal of this project is to implement such an application on an ULP microcontroller like PULP. The project can be divided into the following phases:

  • Study of the different algorithms to implement source localization (both the proposed one and new ideas)
  • Optimization of the algorithm and implementation on PULP
  • Evaluation of the energy efficiency of the system (comparison between the chosen solution and other possibilities)
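As a starting point for the algorithm-study phase, the spike-timing approach of [Finger2011] can be sketched as a Jeffress-style coincidence histogram: every left/right spike pair falling within a delay window votes for an ITD bin, and the most populated bin gives the estimate. Microsecond timestamps and the window/bin sizes below are placeholder assumptions:

```c
#include <stdint.h>

#define N_BINS     17   /* odd, so the center bin corresponds to zero ITD */
#define MAX_ITD_US 500  /* +/-500 us search window; placeholder value */

/* Coincidence detection over two spike streams (timestamps in us).
   Each left/right pair closer than MAX_ITD_US votes for its delay bin;
   the most populated bin yields the ITD estimate (bin center, in us). */
int32_t estimate_itd_us(const uint32_t *left,  int n_left,
                        const uint32_t *right, int n_right)
{
    int hist[N_BINS] = {0};
    const int32_t bin_w = (2 * MAX_ITD_US) / (N_BINS - 1);

    for (int i = 0; i < n_left; i++) {
        for (int j = 0; j < n_right; j++) {
            int32_t d = (int32_t)(left[i] - right[j]);
            if (d < -MAX_ITD_US || d > MAX_ITD_US)
                continue;
            hist[(d + MAX_ITD_US) / bin_w]++;
        }
    }

    int best = 0;
    for (int b = 1; b < N_BINS; b++)
        if (hist[b] > hist[best])
            best = b;
    return best * bin_w - MAX_ITD_US + bin_w / 2;
}
```

On PULP, the outer loop parallelizes naturally across cores (e.g. with an OpenMP worksharing loop and per-core partial histograms), and the histogram is small enough to reside in L1 memory; this is one of the design points the energy-efficiency evaluation could compare.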

Required Skills

To work on this project, you will need:

  • familiarity with embedded programming (C language, microcontrollers...)

Other skills that you might find useful include:

  • familiarity with a scripting language for numerical simulation (Python or Matlab or Lua…)
  • strong motivation for a difficult but super-cool project

If you want to work on this project but think that you do not match some of the required skills, we can give you some preliminary exercises to help you fill the gap.

Status: Available

Supervision: Alfio Di Mauro, Francesco Conti

Professor

Luca Benini


Practical Details

Meetings & Presentations

The students and advisor(s) agree on weekly meetings to discuss all relevant decisions and decide on how to proceed. Of course, additional meetings can be organized to address urgent issues.

Around the middle of the project there is a design review, where senior members of the lab review your work (bring all the relevant information, such as prelim. specifications, block diagrams, synthesis reports, testing strategy, ...) to make sure everything is on track and decide whether further support is necessary. They also make the definitive decision on whether the chip is actually manufactured (no reason to worry if the project is on track) and whether more chip area, a different package, ... is provided. For more details, refer to [2].

At the end of the project, you have to present/defend your work in a 15 min. presentation followed by 5 min. of discussion as part of the IIS colloquium.

Literature

  • [Liu2010] S.-C. Liu et al., Event-Based 64-Channel Binaural Silicon Cochlea with Q Enhancement Mechanisms [3]
  • [Finger2011] H. Finger et al., Estimating the Location of a Sound Source with a Spike-Timing Localization Algorithm [4]
  • [Chan2007] V. Chan et al., AER EAR: A Matched Silicon Cochlea Pair With Address Event Representation Interface [5]

Links

  • The EDA wiki with lots of information on the ETHZ ASIC design flow (internal only) [6]
  • The IIS/DZ coding guidelines [7]

