
Visualization of Neural Architecture Search Spaces

Figure: Differentiable Neural Architecture Search [1]

Description

Designing good and efficient neural networks is challenging: most tasks require models that are both highly accurate and robust, yet compact, and such models often face additional constraints on energy usage, memory consumption, and latency. As a consequence, the search space for manual design is combinatorially large. One way of tackling this problem is to replace manual design with a neural architecture search (NAS). Many flavors of NAS exist, such as differentiable NAS (DNAS) [1], NAS based on evolutionary algorithms, and NAS based on reinforcement learning [2]. An interesting and exciting feature of NAS is the ability to include hardware constraints, e.g., on power consumption and/or memory usage, and let them guide the search towards state-of-the-art neural networks [3].
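
To make the last point concrete: in a differentiable NAS such as FBNet [1], the hardware cost enters the training objective itself, e.g., the task loss is scaled by a differentiable latency estimate built from a per-operator lookup table. The PyTorch sketch below illustrates the idea under stated assumptions; the function names, the exponents alpha and beta, and the use of a Gumbel-softmax relaxation follow the general recipe of [1] but are not the paper's exact implementation.

 import torch
 import torch.nn.functional as F

 def expected_layer_latency(arch_logits: torch.Tensor,
                            op_latencies: torch.Tensor,
                            tau: float = 1.0) -> torch.Tensor:
     """Differentiable latency of one searchable layer: a Gumbel-softmax
     relaxation of the operator choice, weighted by a per-operator latency
     lookup table measured once on the target hardware."""
     probs = F.gumbel_softmax(arch_logits, tau=tau)  # soft one-hot over ops
     return (probs * op_latencies).sum()

 def hardware_aware_loss(ce_loss: torch.Tensor,
                         latency: torch.Tensor,
                         alpha: float = 0.2,
                         beta: float = 0.6) -> torch.Tensor:
     """FBNet-style objective: cross-entropy scaled by a latency term, so a
     single backward pass trades accuracy against hardware cost. alpha and
     beta are illustrative values; latency is assumed to be measured in
     units where it stays above 1, so its log is positive."""
     return ce_loss * alpha * torch.log(latency) ** beta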


In this project, the student will work with a tool engineered at the lab for the visualization of NAS spaces. The student will extend and formalize an API with the following capabilities (a hypothetical sketch of one possible shape for this interface is given after the list):

  • User can inject custom graph distance functions
  • User can inject custom proxy statistics
  • User can inject custom NAS space
  • User can inject custom filters to select a subspace (e.g., all the networks whose peak memory utilization does not exceed threshold X)
  • User can inject custom clustering algorithm (possibly with automated hyper-parameter tuning)
  • The tool should visualize network architectures as points in a low-dimensional space and color them with “heatmaps” determined by the chosen proxy statistic
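
To make these injection points concrete, the sketch below shows one possible shape for such an API. All names (NASVisualizer, GraphDistance, and so on) are hypothetical placeholders invented for this illustration; they are not the actual interface of the lab's tool.

 from typing import Callable, Iterable, List
 import networkx as nx
 import numpy as np

 # Hypothetical aliases for the injection points listed above.
 Network = nx.DiGraph                                  # architectures as DAGs
 GraphDistance = Callable[[Network, Network], float]   # custom graph distance
 ProxyStatistic = Callable[[Network], float]           # e.g., #params, FLOPs
 SpaceFilter = Callable[[Network], bool]               # subspace selection

 class NASVisualizer:
     """Hypothetical facade showing where user code plugs in; not the
     real interface of the lab's tool."""

     def __init__(self, space: Iterable[Network], distance: GraphDistance):
         self.space: List[Network] = list(space)
         self.distance = distance

     def select(self, keep: SpaceFilter) -> "NASVisualizer":
         """Restrict the space, e.g., to all networks whose peak memory
         utilization does not exceed a threshold."""
         return NASVisualizer([n for n in self.space if keep(n)],
                              self.distance)

     def embed_2d(self) -> np.ndarray:
         """Turn pairwise graph distances into 2-D coordinates with a stock
         dimensionality reduction (metric MDS here; t-SNE or UMAP would be
         drop-in alternatives)."""
         from sklearn.manifold import MDS
         n = len(self.space)
         d = np.zeros((n, n))
         for i in range(n):
             for j in range(i + 1, n):
                 d[i, j] = d[j, i] = self.distance(self.space[i],
                                                   self.space[j])
         return MDS(n_components=2,
                    dissimilarity="precomputed").fit_transform(d)

     def cluster(self, algorithm) -> np.ndarray:
         """Run a user-supplied, sklearn-style clustering algorithm
         (anything exposing fit_predict) on the embedded points."""
         return algorithm.fit_predict(self.embed_2d())

     def plot(self, statistic: ProxyStatistic) -> None:
         """Scatter the embedded architectures, colored by a heatmap of
         the chosen proxy statistic."""
         import matplotlib.pyplot as plt
         xy = self.embed_2d()
         colors = [statistic(n) for n in self.space]
         plt.scatter(xy[:, 0], xy[:, 1], c=colors, cmap="viridis")
         plt.colorbar(label="proxy statistic")
         plt.show()

For instance, a user could write viz.select(lambda net: net.graph.get("peak_mem", float("inf")) <= X).plot(statistic=lambda net: net.number_of_nodes()), where "peak_mem" is an assumed graph attribute and X is the memory threshold from the example above.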

Status: Available

Supervision: Thorir Mar Ingolfsson, Matteo Spallanzani

Prerequisites

  • Algorithms and data structures: directed acyclic graphs (DAGs)
  • Data science: clustering and dimensionality reduction
  • Deep learning fundamentals: backpropagation, CNNs
  • Programming: Python & PyTorch

Character

20% Theory
80% Implementation

Literature

  • [1] Wu et al., FBNet: Hardware-Aware Efficient ConvNet Design via Differentiable Neural Architecture Search
  • [2] Tan et al., MnasNet: Platform-Aware Neural Architecture Search for Mobile
  • [3] Vineeth et al., Hardware Aware Neural Network Architectures (using FBNet)


Professor

Luca Benini


