TCNs vs. LSTMs for Embedded Platforms

Description

In the last few decades, artificial neural networks have been used extensively to give machines human-like capabilities: making decisions, detecting patterns, and predicting actions. With the monumental growth of this field, artificial intelligence is now increasingly able to perceive the world much as humans do and to apply that knowledge to a multitude of tasks such as image and video recognition, audio detection, natural language processing, and disease prediction.

For time series applications, such as detecting abnormal ECG or EEG signals, Recurrent Neural Networks (RNNs) are widely used. In particular, Long Short-Term Memory networks, usually just called "LSTMs", are applied in many fields and have successfully solved a large variety of problems thanks to their ability to learn long-term dependencies. However, training an RNN is a difficult task. Another class of neural networks, Convolutional Neural Networks (CNNs), has achieved impressive results in visual tasks; a variation of them, Temporal Convolutional Networks (TCNs), has recently been used by deep learning practitioners to solve time series tasks with promising results.

The main goal of this master thesis is to compare the performance of LSTMs and TCNs on time series applications. The student will identify two time series tasks and compare the performance of the two network architectures on them. After identifying the better-performing one, the student will proceed with an embedded implementation and system integration in order to demonstrate a real-life application combining sensor acquisition and machine learning.
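To illustrate the two model families under comparison, the sketch below shows a minimal LSTM classifier and a minimal TCN built from dilated causal 1D convolutions, both written in PyTorch for a generic multichannel time-series input. It is not the project's actual code; the class names, layer sizes, and the EEG-like input shape in the usage example are placeholder assumptions chosen only to make the sketch self-contained.

<pre>
# Minimal sketch (hypothetical, not the thesis code): LSTM vs. TCN for sequence classification.
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    """Recurrent baseline: processes the sequence step by step and classifies from the final hidden state."""
    def __init__(self, in_channels, hidden_size, num_classes):
        super().__init__()
        self.lstm = nn.LSTM(in_channels, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):               # x: (batch, time, channels)
        _, (h_n, _) = self.lstm(x)      # h_n: (num_layers, batch, hidden)
        return self.fc(h_n[-1])

class TCNClassifier(nn.Module):
    """Convolutional alternative: stacked dilated causal 1D convolutions over the time axis."""
    def __init__(self, in_channels, hidden_channels, num_classes, levels=3, kernel_size=3):
        super().__init__()
        layers, ch = [], in_channels
        for i in range(levels):
            dilation = 2 ** i
            pad = (kernel_size - 1) * dilation          # left-pad so the convolution stays causal
            layers += [nn.ConstantPad1d((pad, 0), 0.0),
                       nn.Conv1d(ch, hidden_channels, kernel_size, dilation=dilation),
                       nn.ReLU()]
            ch = hidden_channels
        self.tcn = nn.Sequential(*layers)
        self.fc = nn.Linear(hidden_channels, num_classes)

    def forward(self, x):                    # x: (batch, time, channels)
        y = self.tcn(x.transpose(1, 2))      # Conv1d expects (batch, channels, time)
        return self.fc(y[:, :, -1])          # classify from the last time step

# Usage example with an assumed EEG-like window: batch of 8, 4 channels, 256 samples, 2 classes.
x = torch.randn(8, 256, 4)
print(LSTMClassifier(4, 64, 2)(x).shape)     # torch.Size([8, 2])
print(TCNClassifier(4, 64, 2)(x).shape)      # torch.Size([8, 2])
</pre>

The key practical difference this sketch highlights is that the LSTM must be evaluated sequentially over time, whereas the TCN applies its convolutions over the whole window at once, which is one of the trade-offs relevant for an embedded implementation.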

Status: In progress

Student: Thorir Ingolfsson

Supervision: Xiaying Wang, Lukas Cavigelli, Michael Hersche

Prerequisites

  • Deep Learning
  • Python (preferably PyTorch)
  • Embedded C


Character

20% Literature Research
80% Software Development

Professor

Luca Benini
