=What is Human Intranet?=

[[File:HI.png|thumb|right]]

The world around us is quickly getting a lot smarter: virtually every single component of our daily living environment is being equipped with sensors, actuators, processing, and a connection into a network that will soon count billions of nodes and trillions of sensors. These devices interact with the human only through the traditional input and output channels. Hence, they communicate with our brain only indirectly, through our five sense modalities, forming two separate computing systems: biological versus physical. This could be made a lot more effective if a direct, high-bandwidth link existed between the two systems, allowing them to truly collaborate with each other and offering opportunities for enhanced functionality that would otherwise be hard to accomplish. The emergence of miniaturized sense, compute, and actuate devices, as well as interfaces that are form-fitted to the human body, opens the door for a symbiotic convergence between biological function and physical computing.

The Human Intranet is an open, scalable platform that seamlessly integrates an ever-increasing number of sensor, actuation, computation, storage, communication, and energy nodes located on, in, or around the human body, acting in symbiosis with the functions provided by the body itself. The Human Intranet presents a system vision in which, for example, disease would be treated by chronically measuring biosignals deep in the body, or by providing targeted, therapeutic interventions that respond on demand and in situ. To gain a holistic view of a person's health, these sensors and actuators must communicate and collaborate with each other. Most such systems prototyped or envisioned today serve to address deficiencies in the human sensory or motor control systems due to birth defects, illnesses, or accidents (e.g., invasive brain-machine interfaces, cochlear implants, artificial retinas). While all these systems target defects, one can easily imagine that this could also lead to many kinds of enhancement and/or enable direct interaction with the environment: to make us humans smarter!

In our projects, we mainly focus on the sensor, computation, communication, and emerging storage aspects to develop very efficient closed-loop sense-interpret-actuate systems, enabling distributed autonomous behavior. For example, to design the brain of our physical computing (i.e., the compute/interpret component), we rely on computing with ultra-wide words (e.g., 10,000 bits), which eases interfacing with various sensor modalities and actuators. This novel computing paradigm, called hyperdimensional (HD) computing, is inspired by the very size of the biological brain's circuits: assuming 1 bit per synapse, the brain is made up of more than 24 billion such ultra-wide words. You can watch some of our demos:

You can also find a collection of completed projects with source code and datasets here:
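As a rough, self-contained illustration of the HD computing paradigm described above (a minimal NumPy sketch, not code from any of our projects; the bipolar representation and the toy two-channel record are chosen purely for readability), the snippet below shows the three basic operations on 10,000-dimensional hypervectors: binding, bundling, and similarity.

<syntaxhighlight lang="python">
# Minimal hyperdimensional (HD) computing sketch with bipolar {-1, +1} hypervectors.
# Illustrative only; real implementations typically use dense binary vectors and
# bit-wise operations, but the algebra is the same.
import numpy as np

D = 10000                        # hypervector dimensionality ("ultra-wide words")
rng = np.random.default_rng(0)

def random_hv():
    """Random bipolar hypervector; quasi-orthogonal to any other random one."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding (e.g., associate a channel with its value): element-wise product."""
    return a * b

def bundle(hvs):
    """Bundling (superposition of several items): element-wise majority via sign."""
    s = np.sum(hvs, axis=0)
    s[s == 0] = 1                # break ties towards +1
    return np.sign(s)

def similarity(a, b):
    """Normalized dot product in [-1, 1]; close to 0 for unrelated hypervectors."""
    return float(a @ b) / D

# Encode a toy record {channel_1: value_a, channel_2: value_b} as one hypervector
ch1, ch2, va, vb = (random_hv() for _ in range(4))
record = bundle([bind(ch1, va), bind(ch2, vb)])

print(similarity(bind(record, ch1), va))   # high: channel_1's value is recoverable
print(similarity(bind(record, ch1), vb))   # near zero: unrelated
</syntaxhighlight>

Because random hypervectors of this size are quasi-orthogonal, many bound channel-value pairs can be superimposed into a single ultra-wide word and still be queried individually, which is what makes the representation attractive for fusing data from many sensor modalities.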
  
 
==Prerequisites and Focus==

If you are a B.S. or M.S. student at ETH Zurich, there are typically no prerequisites. You can come and talk to us, and we will adapt the project to your skills. The scope and focus of the projects are broad. You can choose to work on:

* '''Efficient hardware architectures in emerging technologies''' (e.g., [https://www.zurich.ibm.com/sto/memory/ the IBM computational memory])
* '''Exploring new Human Intranet/IoT applications''' (high-level embedded programming)
* '''Algorithm design and optimizations''' (Python)
* '''System-level design and testing'''
* '''Sensory interfaces''' (analog and digital)
* '''FPGA prototyping, ASIC, and accelerators''' (SystemVerilog/VHDL)
=Useful Reading=

=Available Projects=

Below, we provide short descriptions of the related projects so that you can see the scope of our work. The direction and details of each project can be adapted to your interests and skills. Please do not hesitate to come and talk to us for more details.
  
=Brain-Machine Interfaces=

<!--  [[File:BCI.png|thumb|center]]
[[File:BCI-dryEEG.jpg|thumb|right]] -->
[[File:Emotiv-epoc-14-channel-mobile-eeg.jpg|thumb|right|200px]]

===Short Description===

Noninvasive brain–machine interfaces (BMIs) and neuroprostheses aim to provide a communication and control channel based on the recognition of the subject's intentions from spatiotemporal neural activity, typically recorded by EEG electrodes. What makes this particularly challenging, however, is the susceptibility to errors in the recognition of human intentions over time.

In this project, our goal is to develop efficient and fast learning algorithms that replace traditional signal processing and classification methods by operating directly on the raw data from the electrodes. Furthermore, we aim to deploy these algorithms efficiently on tightly resource-limited devices (e.g., microcontroller units) for near-sensor classification using artificial intelligence.
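To make near-sensor classification on microcontroller units a bit more concrete, here is a minimal NumPy sketch of symmetric 8-bit post-training quantization for a single fully-connected layer, in the spirit of the Q-EEGNet paper linked below; the layer sizes, the random weights, and the stand-in feature vector are hypothetical placeholders rather than part of any actual implementation.

<syntaxhighlight lang="python">
# Sketch: symmetric per-tensor 8-bit post-training quantization of one dense layer,
# followed by the integer multiply-accumulate an MCU kernel would execute.
# Toy sizes and random weights; not the actual Q-EEGNet implementation.
import numpy as np

rng = np.random.default_rng(0)
n_features, n_classes = 64, 4          # hypothetical feature/class counts
W = rng.normal(scale=0.1, size=(n_classes, n_features)).astype(np.float32)
x = rng.normal(size=n_features).astype(np.float32)   # stands in for extracted EEG features

def quantize_symmetric(t, bits=8):
    """Map a float tensor to signed integers with one scale factor per tensor."""
    qmax = 2 ** (bits - 1) - 1         # 127 for int8
    scale = np.max(np.abs(t)) / qmax
    q = np.clip(np.round(t / scale), -qmax, qmax).astype(np.int8)
    return q, scale

W_q, w_scale = quantize_symmetric(W)
x_q, x_scale = quantize_symmetric(x)

# int8 x int8 products accumulated in int32, then a single rescale at the end
acc = W_q.astype(np.int32) @ x_q.astype(np.int32)
logits_q = acc * (w_scale * x_scale)   # dequantized result of the integer pipeline

logits_fp = W @ x                      # float32 reference
print("float32 :", np.round(logits_fp, 3))
print("int8    :", np.round(logits_q, 3))
print("predicted class:", int(np.argmax(logits_q)))
</syntaxhighlight>

On a microcontroller, the int8 weights take roughly 4x less memory than float32 and the multiply-accumulate loop maps to integer (often SIMD) instructions, with a single rescaling by the two scale factors at the very end.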
 
  
 
===Links===

* [https://iis-people.ee.ethz.ch/~herschmi/EdgeDL20.pdf Q-EEGNet: an Energy-Efficient 8-bit Quantized Parallel EEGNet Implementation for Edge Motor-Imagery Brain–Machine Interfaces]
* [https://iis-people.ee.ethz.ch/~herschmi/MEMEA20.pdf An Accurate EEGNet-based Motor-Imagery Brain–Computer Interface for Low-Power Edge Computing]
* [https://iis-people.ee.ethz.ch/~arahimi/papers/EUSIPCO18.pdf Fast and Accurate Multiclass Inference for Motor Imagery BCIs Using Large Multiscale Temporal and Spectral Features]
* [https://iis-people.ee.ethz.ch/~arahimi/papers/MONET17.pdf Hyperdimensional Computing for Blind and One-Shot Classification of EEG Error-Related Potentials]
* [https://arxiv.org/abs/1812.05705 Exploring Embedding Methods in Binary Hyperdimensional Computing: A Case Study for Motor-Imagery based Brain-Computer Interfaces]

===Available Projects===

<DynamicPageList>
category = Available
category = Digital
category = BCI
</DynamicPageList>
 
  
 
=Epilepsy Seizure Detection Device=

[[File:Non-EEG Seizure.jpg|thumb|right]]
[[File:NeuroPace.jpg|thumb|right]]

===Short Description===

Seizure detection systems hold promise for improving the quality of life for patients with epilepsy, a condition that afflicts nearly 1% of the world's population.
In this project, our goal is to develop efficient techniques for EEG as well as non-EEG signals to detect an upcoming seizure on an ultra-low-power device. This covers a wide range of analog and digital techniques. The one-shot and online learning capabilities of hyperdimensional computing can come to the rescue here.
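To give a flavour of such one-shot and online learning, the sketch below keeps one prototype hypervector per class, initializes it from a single labelled segment, and refines it whenever a further labelled segment arrives; the random-projection encoder and the synthetic "seizure"/"normal" segments are placeholders for a real encoder of multi-channel EEG and non-EEG features, not part of any specific project.

<syntaxhighlight lang="python">
# Generic sketch of one-shot / online learning with one prototype hypervector per class.
# The random-projection "encoder" and the synthetic segments below are placeholders;
# a real system would encode multi-channel EEG / non-EEG features instead.
import numpy as np

D = 10000                                  # hypervector dimensionality
n_features = 32                            # hypothetical length of one feature window
rng = np.random.default_rng(1)
projection = rng.choice([-1, 1], size=(D, n_features))

def encode(features):
    """Map a real-valued feature window to a bipolar hypervector (placeholder encoder)."""
    return np.sign(projection @ features + 1e-9)

def similarity(a, b):
    """Cosine similarity between two hypervectors."""
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

class OnlinePrototypeClassifier:
    """One accumulator per class: one-shot initialization, online refinement."""
    def __init__(self):
        self.acc = {}                      # label -> accumulated hypervectors

    def update(self, features, label):
        """Add one labelled segment; the first call per label is the one-shot init."""
        self.acc[label] = self.acc.get(label, np.zeros(D)) + encode(features)

    def predict(self, features):
        """Nearest-prototype classification in the hyperdimensional space."""
        hv = encode(features)
        return max(self.acc, key=lambda c: similarity(np.sign(self.acc[c]), hv))

# Toy usage: one labelled segment per class is enough to start predicting,
# and any further labelled segments simply refine the prototypes online.
clf = OnlinePrototypeClassifier()
clf.update(rng.normal(loc=+1.0, size=n_features), "seizure")   # synthetic placeholder data
clf.update(rng.normal(loc=-1.0, size=n_features), "normal")
print(clf.predict(rng.normal(loc=+1.0, size=n_features)))      # expected: "seizure"
</syntaxhighlight>

Classification is a nearest-prototype search in the hyperdimensional space, so adding or refining a class never requires retraining the others.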
  
 
===Links===

* [https://www.youtube.com/watch?time_continue=87&v=ouyPXkEud40 Controlling tinnitus with neurofeedback]
  
===Available Projects===

<DynamicPageList>
category = Available
category = Digital
category = Epilepsy
</DynamicPageList>
  
 
  
<!--
=Extremely Resilient Hyperdimensional Processor=

[[File:BrainChip.jpg|thumb|left]]

=Flexible High-Density Sensors for Hand Gesture Recognition=

[[File:Hyperdimensional_EMG.png|thumb|center]]
[[File:FlexEMG.png|thumb|right|500px]]

===Short Description===

* [https://www.tu-chemnitz.de/etit/proaut/publications/IROS2016_neubert.pdf Learning Vector Symbolic Architectures for Reactive Robot Behaviours]
* [https://www.aaai.org/ocs/index.php/WS/AAAIW13/paper/download/7075/6578 Learning Behavior Hierarchies via High-Dimensional Sensor Projection (paper)]
-->
 
  
  
=Completed Projects=

These are projects that were recently completed:


=Where to find us=
* [https://iis-people.ee.ethz.ch/~herschmi/ Michael Hersche]
** '''e-mail''': [mailto:herschmi@iis.ee.ethz.ch herschmi@iis.ee.ethz.ch]
** ETZ J76.2
* [https://iis-people.ee.ethz.ch/~xiaywang/ Xiaying Wang]
** '''e-mail''': [mailto:xiaywang@iis.ee.ethz.ch xiaywang@iis.ee.ethz.ch]
** ETZ J68.2
* [https://iis-people.ee.ethz.ch/~arahimi/ Dr. Abbas Rahimi]
** '''e-mail''': [mailto:abbas@iis.ee.ethz.ch abbas@iis.ee.ethz.ch]
