Deep Unfolding of Iterative Optimization Algorithms


Figure: Example of an algorithm unrolled using Deep Unfolding. The parameters to learn in this example are delta and tau.

Short Description

A great number of applications in signal processing, communications, and imaging use iterative algorithms to gradually refine an initial guess into an adequate solution. Often, these iterative algorithms have parameters (for example, the step size of gradient descent) that must be tuned for the algorithm to reach an adequate solution within a small number of iterations. In general, tuning these parameters is a time-consuming, manual effort that becomes exponentially harder as the number of parameters increases; this is particularly true if one considers non-fixed parameters that can take different values in different iterations. Furthermore, automated (online) parameter-selection methods (e.g., adaptive step-size selection) are computationally complex, often exceeding the complexity of the optimization problem itself.
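
For concreteness, here is a minimal NumPy sketch (not part of the original description; the least-squares problem and all names are illustrative assumptions) of plain gradient descent, where the step size is exactly the kind of parameter that must be tuned by hand:

    import numpy as np

    def gradient_descent(A, y, step_size, num_iters):
        """Minimize 0.5 * ||A x - y||^2 with plain gradient descent.

        `step_size` is the hand-tuned parameter: too small and many
        iterations are needed, too large and the iterates diverge.
        """
        x = np.zeros(A.shape[1])
        for _ in range(num_iters):
            grad = A.T @ (A @ x - y)  # gradient of 0.5 * ||A x - y||^2
            x = x - step_size * grad
        return x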

Recently, we have proposed the concept of “Deep Unfolding,” in which the idea is to unroll the computation graph of an iterative algorithm for a fixed number of iterations. We can then use deep learning tools to tune the algorithm parameters to the values that offer the best performance (under a user-defined, problem-specific cost function). We have shown that, when applying Deep Unfolding to a wireless communication algorithm, we are able to obtain the same solution quality with only half the number of iterations! [1]
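
As an illustrative sketch of this idea (again assuming the least-squares example above; none of this code is from [1]), unrolling the loop for a fixed number of iterations turns each iteration into a "layer" with its own step size, and these few scalars become the trainable parameters:

    import numpy as np

    def unfolded_gradient_descent(A, y, step_sizes):
        """Forward pass of gradient descent unrolled for
        len(step_sizes) iterations; each unrolled iteration
        ("layer") carries its own learnable step size."""
        x = np.zeros(A.shape[1])
        for tau in step_sizes:  # one layer per iteration
            x = x - tau * (A.T @ (A @ x - y))
        return x

The entries of step_sizes play the role of network weights: a deep learning framework can backpropagate a user-defined loss, e.g., the squared error to a known solution over training data, through the unrolled graph to tune them.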

While we have so far implemented Deep Unfolding using standard deep learning frameworks such as TensorFlow, this is possibly overkill, since we are only learning tens of scalar values, unlike the millions of parameters typically learned in neural networks. Hence, in this project we will look at exemplary iterative algorithms to which Deep Unfolding can be applied, and we will derive and simplify the corresponding operations required to learn the parameters. The objective of the project is then to implement Deep Unfolding on a resource-constrained system, such as a Raspberry Pi. This project requires familiarity with calculus as well as programming skills.
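
To give a flavor of what deriving these operations by hand could look like (a hypothetical sketch, not the project's prescribed method; the one-iteration setup, loss, and names are assumptions), consider a single unrolled iteration x1 = x0 - tau * g with g = A^T (A x0 - y) and loss L = ||x1 - x_true||^2. Then dL/dtau = -2 * g^T (x1 - x_true), which can be computed with a few vector operations and no deep learning framework:

    import numpy as np

    def learn_step_size(A, y, x_true, tau=0.0, lr=1e-3, epochs=200):
        """Tune a scalar step size for one unrolled iteration with a
        hand-derived gradient instead of a deep learning framework.

        x1 = x0 - tau * g,  g = A^T (A x0 - y),  L = ||x1 - x_true||^2
        =>  dL/dtau = -2 * g^T (x1 - x_true)
        """
        x0 = np.zeros(A.shape[1])
        for _ in range(epochs):
            g = A.T @ (A @ x0 - y)
            x1 = x0 - tau * g
            dL_dtau = -2.0 * g @ (x1 - x_true)  # hand-derived derivative
            tau -= lr * dL_dtau  # plain gradient step on tau (lr is illustrative)
        return tau

Only a handful of multiply-accumulate operations per update are needed, which is what makes an implementation on a device like a Raspberry Pi plausible.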

[1] A. Balatsoukas-Stimming, O. Castañeda, S. Jacobsson, G. Durisi, and C. Studer, "Neural-Network Optimized 1-bit Precoding for Massive MU-MIMO," in Proc. IEEE 20th International Workshop on Signal Processing Advances in Wireless Communications (SPAWC), July 2019.

Status: Available

Looking for 1-2 Semester/Master students
Contact: Oscar Castañeda

Prerequisites

Calculus
Programming skills

Character

10% Literature research
20% Mathematical derivations
50% Software implementation
20% Software implementation on a resource-constrained system

Professor

Christoph Studer


Detailed Task Description

Goals

Practical Details

Results

Links
