Hardware-Constrained Neural Architecture Search



Description

Designing good and efficient neural networks is challenging: most tasks require models that are highly accurate and robust while remaining compact, and deployed models often face additional constraints on energy usage, memory consumption, and latency. The resulting design space is combinatorially large, which makes manual design impractical. Neural architecture search (NAS) tackles this problem by automating the design process. Many flavors of NAS exist, including differentiable NAS (DNAS) [1], methods based on evolutionary algorithms, and methods based on reinforcement learning [2]. A particularly exciting feature of NAS is the ability to let hardware constraints, e.g., on power consumption or memory usage, guide the search toward state-of-the-art neural networks [3].
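
To make the hardware-aware search objective concrete, below is a minimal sketch of the core mechanism behind FBNet-style DNAS [1], written in PyTorch. It is an illustrative toy, not code from this project: the candidate operations, the latency numbers, and the trade-off weight 0.1 are all made up, and the latency penalty is simply added to the task loss for brevity, whereas FBNet itself uses a multiplicative log-latency term.

  import torch
  import torch.nn as nn
  import torch.nn.functional as F

  class MixedOp(nn.Module):
      """One searchable layer: a soft mixture over candidate operations."""
      def __init__(self, channels, latencies):
          super().__init__()
          self.ops = nn.ModuleList([
              nn.Conv2d(channels, channels, 3, padding=1),  # candidate 0: 3x3 conv
              nn.Conv2d(channels, channels, 5, padding=2),  # candidate 1: 5x5 conv
              nn.Identity(),                                # candidate 2: skip
          ])
          # Architecture parameters (logits), learned jointly with the weights.
          self.alpha = nn.Parameter(torch.zeros(len(self.ops)))
          # Estimated latency per candidate op (hypothetical numbers, e.g., ms).
          self.register_buffer("latencies", torch.tensor(latencies))

      def forward(self, x, tau=1.0):
          # Gumbel-softmax yields differentiable, near-one-hot op weights.
          w = F.gumbel_softmax(self.alpha, tau=tau)
          out = sum(wi * op(x) for wi, op in zip(w, self.ops))
          expected_latency = (w * self.latencies).sum()
          return out, expected_latency

  # Toy usage: task loss plus a weighted expected-latency term.
  layer = MixedOp(channels=8, latencies=[3.0, 7.0, 0.1])
  head = nn.Linear(8, 4)
  x, y = torch.randn(2, 8, 16, 16), torch.tensor([0, 1])
  feat, lat = layer(x)
  logits = head(feat.mean(dim=(2, 3)))
  loss = F.cross_entropy(logits, y) + 0.1 * lat  # 0.1 trades accuracy vs. latency
  loss.backward()  # gradients reach both network weights and alpha

After training, keeping the candidate op with the largest alpha in each layer yields a fixed network whose architecture was shaped by the hardware cost during the search.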


In this project, the student explores and evaluates a NAS method in which hardware constraints guide the search for state-of-the-art neural networks on brain-computer interface (BCI) related tasks.

Status: Available

Looking for a student for a Semester project.

Supervision: Thorir Mar Ingolfsson, Michael Hersche, Xiaying Wang

Prerequisites

  • Machine Learning
  • Python

Character

20% Theory
80% Implementation

Literature

  • [1] Wu et al., FBNet: Hardware-Aware Efficient ConvNet Design via Differentiable Neural Architecture Search
  • [2] Tan et al., MnasNet: Platform-Aware Neural Architecture Search for Mobile
  • [3] Vineeth et al., Hardware Aware Neural Network Architectures (using FBNet)


Professor

Luca Benini


