[[File:HI.png|thumb|right|450px]]
__NOTOC__
 
 
=What is Human Intranet?=
The world around us is quickly getting a lot smarter: virtually every component of our daily living environment is being equipped with sensors, actuators, processing, and a connection into a network that will soon count billions of nodes and trillions of sensors. These devices interact with the human only through the traditional input and output channels. Hence, they communicate with our brain only indirectly, through our five sense modalities, leaving two separate computing systems: one biological and one physical. This interaction could be made far more effective if a direct, high-bandwidth link existed between the two systems, allowing them to truly collaborate with each other and offering opportunities for enhanced functionality that would otherwise be hard to accomplish. The emergence of miniaturized sense, compute, and actuate devices, as well as interfaces that are form-fitted to the human body, opens the door to a symbiotic convergence between biological function and physical computing.
  
 
Human Intranet is an open, scalable platform that seamlessly integrates an ever-increasing number of sensor, actuation, computation, storage, communication, and energy nodes located on, in, or around the human body, acting in symbiosis with the functions provided by the body itself. Human Intranet presents a system vision in which, for example, disease would be treated by chronically measuring biosignals deep in the body, or by providing targeted therapeutic interventions that respond on demand and in situ. To gain a holistic view of a person’s health, these sensors and actuators must communicate and collaborate with each other. Most such systems prototyped or envisioned today serve to address deficiencies in the human sensory or motor control systems due to birth defects, illnesses, or accidents (e.g., invasive brain-machine interfaces, cochlear implants, artificial retinas). While all of these systems target defects, one can easily imagine the same technology leading to many types of enhancement and/or enabling direct interaction with the environment: making us humans smarter!
  
Here, in our projects, we mainly focus on '''sensor, computation, communication, and emerging storage''' aspects to develop very efficient closed-loop sense-interpret-actuate systems, enabling distributed autonomous behavior.

<!--For example, to design the ''brain'' of our physical computing (i.e., the compute/interpret component), we rely on computing with ultra-wide words (e.g., 10,000 bits), which eases interfacing with various sensor modalities and actuators. This novel computing paradigm, called hyperdimensional (HD) computing, is inspired by the sheer size of the biological brain’s circuits: assuming 1 bit per synapse, the brain is made up of more than 24 billion such ultra-wide words. You can watch some of our demos:
* [https://bwrc.eecs.berkeley.edu/sites/default/files/files/u2630/flexemg_v2_lq.mp4#t=2 Video1]
* [https://www.youtube.com/watch?time_continue=9&v=vTQGMQ6QaJE Video2]
* [https://iis-people.ee.ethz.ch/~arahimi/papers/ISSCC18-Demo.pdf PDF]
You can also find a collection of completed projects with source code and datasets here:
* [https://github.com/HyperdimensionalComputing/collection Github link]
-->
 
==Prerequisites and Focus==
If you are a B.S. or M.S. student at ETHZ, there is typically no prerequisite. You can come and talk to us, and we will adapt the project to your skills. The scope and focus of the projects are wide. You can choose to work on:

<!-- * '''Efficient hardware architectures in emerging technologies''' (e.g., [https://www.zurich.ibm.com/sto/memory/ the IBM computational memory])-->
* '''Exploring new Human Intranet/IoT applications'''
* '''Algorithm design and optimizations''' (Python)
* '''System-level design and testing''' (Altium, C programming)
* '''Sensory interfaces''' (analog and digital)
  
  
 
===Useful Reading===
*[https://ieeexplore.ieee.org/document/7030200/ The Human Intranet--Where Swarms and Humans Meet]
*[https://ieeexplore.ieee.org/abstract/document/8490896 Efficient Biosignal Processing Using Hyperdimensional Computing: Network Templates for Combined Learning and Classification of ExG Signals]
*[https://iopscience.iop.org/article/10.1088/1741-2552/aab2f2/meta A review of classification algorithms for EEG-based brain–computer interfaces: a 10 year update]
 
  
 
=Available Projects=
Here we provide a short description of the related projects so you can see the scope of our work. The direction and details of each project can be adapted to your interests and skills. Please do not hesitate to come and talk to us for more details.
  
==Wearables for health and physiology==
[[File:Cardiorespiratory.JPG|thumb|right|200px]]

===Short Description===
In this research area, we develop wearable systems, algorithms, and applications for monitoring health- and physiology-related parameters in innovative ways. Examples include (but are not limited to) heart rate and respiration rate monitoring, blood pressure monitoring, bladder monitoring, drowsiness detection, and monitoring of muscle contractions and identification of innervations.

For wearables based on ultrasound, see also the dedicated [[Digital_Medical_Ultrasound_Imaging | '''Ultrasound section''']].
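
To give a flavour of the signal processing behind such wearables, the sketch below estimates heart rate from a single-channel cardiac waveform with simple peak detection. It is only an illustrative toy on a synthetic signal; the sampling rate and thresholds are assumptions, not the pipeline of any specific project.

<syntaxhighlight lang="python">
# Minimal, illustrative sketch: heart-rate estimation from a 1-D cardiac
# waveform via peak detection. Sampling rate and thresholds are assumptions.
import numpy as np
from scipy.signal import find_peaks

fs = 250                      # assumed sampling rate [Hz]
t = np.arange(0, 30, 1 / fs)  # 30 s of data

# Synthetic "ECG-like" signal: a ~72 bpm train of sharp pulses plus noise.
beat_period = 60 / 72
pulse_train = (np.mod(t, beat_period) < 0.02).astype(float)
signal = pulse_train + 0.05 * np.random.randn(t.size)

# Detect beats: peaks must be prominent and at least 0.4 s apart (< 150 bpm).
peaks, _ = find_peaks(signal, height=0.5, distance=int(0.4 * fs))

# Heart rate from the median inter-beat interval.
ibi = np.diff(peaks) / fs                  # inter-beat intervals [s]
heart_rate_bpm = 60.0 / np.median(ibi)
print(f"Estimated heart rate: {heart_rate_bpm:.1f} bpm")
</syntaxhighlight>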

===Available Projects===
<DynamicPageList>
category = Available
category = Digital
category = WearablesHealth
</DynamicPageList>
  
==Brain-Machine Interfaces and wearables==
<!-- [[File:BCI.png|thumb|center]]
[[File:BCI-dryEEG.jpg|thumb|right]] -->
[[File:Emotiv-epoc-14-channel-mobile-eeg.jpg|thumb|right|200px]]
[[File:In_ear_EEG.jpg|thumb|right|200px]]
  

===Short Description===
Noninvasive brain–machine interfaces (BMIs) and neuroprostheses aim to provide a communication and control channel based on the recognition of the subject’s intentions from spatiotemporal neural activity, typically recorded by EEG electrodes. BMIs are a special kind of human–machine interface (HMI), focused on the brain. What makes BMIs particularly challenging is their susceptibility to errors over time in the recognition of human intentions.
  
In these projects, our goal is to develop efficient and fast learning algorithms that replace traditional signal processing and classification methods by operating directly on raw electrode data. Furthermore, we aim to deploy those algorithms efficiently on tightly resource-constrained devices (e.g., microcontroller units) for near-sensor classification using artificial intelligence.
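
As an illustration of what operating directly on raw data can look like, here is a minimal sketch of a compact convolutional network for multi-channel EEG windows, loosely in the spirit of the EEGNet-based designs cited in the links below. PyTorch, the layer sizes, and the input format are assumptions made for this example only, not a reference design.

<syntaxhighlight lang="python">
# Minimal PyTorch sketch of a compact CNN that classifies raw multi-channel
# EEG windows. All layer sizes here are illustrative assumptions.
import torch
import torch.nn as nn

class TinyEEGNet(nn.Module):
    def __init__(self, n_channels=8, n_samples=500, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution over each electrode's time series.
            nn.Conv2d(1, 8, kernel_size=(1, 64), padding=(0, 32), bias=False),
            nn.BatchNorm2d(8),
            # Spatial (grouped) convolution across electrodes.
            nn.Conv2d(8, 16, kernel_size=(n_channels, 1), groups=8, bias=False),
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.AvgPool2d((1, 8)),
            nn.Dropout(0.25),
        )
        # Infer the flattened feature size from a dummy input.
        with torch.no_grad():
            n_feat = self.features(torch.zeros(1, 1, n_channels, n_samples)).numel()
        self.classifier = nn.Linear(n_feat, n_classes)

    def forward(self, x):          # x: (batch, 1, channels, samples)
        return self.classifier(self.features(x).flatten(1))

# Toy usage: a batch of 2-second windows, 8 electrodes sampled at 250 Hz.
model = TinyEEGNet()
logits = model(torch.randn(4, 1, 8, 500))
print(logits.shape)  # torch.Size([4, 4])
</syntaxhighlight>

A small model like this keeps the parameter count low enough that, after quantization, it can plausibly fit the memory and compute budget of a microcontroller-class device.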

===Links===
* '''Watch our demo''': [https://www.youtube.com/watch?v=3-DysFptdRI EEG headband controlling a drone]
* [https://iis-people.ee.ethz.ch/~herschmi/EdgeDL20.pdf Q-EEGNet: an Energy-Efficient 8-bit Quantized Parallel EEGNet Implementation for Edge Motor-Imagery Brain–Machine Interfaces]
* [https://iis-people.ee.ethz.ch/~herschmi/MEMEA20.pdf An Accurate EEGNet-based Motor-Imagery Brain–Computer Interface for Low-Power Edge Computing]
* [https://iis-people.ee.ethz.ch/~arahimi/papers/EUSIPCO18.pdf Fast and Accurate Multiclass Inference for Motor Imagery BCIs Using Large Multiscale Temporal and Spectral Features]
* [https://iis-people.ee.ethz.ch/~arahimi/papers/MONET17.pdf Hyperdimensional Computing for Blind and One-Shot Classification of EEG Error-Related Potentials]
* [https://arxiv.org/abs/1812.05705 Exploring Embedding Methods in Binary Hyperdimensional Computing: A Case Study for Motor-Imagery based Brain-Computer Interfaces]

===Available Projects===
<DynamicPageList>
category = Available
category = Digital
category = BCI
</DynamicPageList>
  
==Epilepsy Seizure Detection Device==
 
[[File:Non-EEG Seizure.jpg|border|text-top|300px]]
[[File:NeuroPace.jpg|border|text-top|400px]]
<!-- Seizure-prediction.png -->

===Short Description===
Epilepsy is a brain disease that affects more than 50 million people worldwide. Conventional treatments are primarily pharmacological, but drug-resistant subjects may require surgery or invasive neurostimulation. In these cases, personalized patient treatments are necessary and can be achieved with the help of long-term recording of brain activity. In this context, seizure detection systems hold promise for improving the quality of life of patients with epilepsy by providing non-stigmatizing and reliable continuous monitoring under real-life conditions. In this project, our goal is to develop efficient techniques for EEG as well as non-EEG signals to detect an upcoming seizure on an ultra-low-power device. This covers a wide range of analog and digital techniques.
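
As a hedged example of the kind of low-complexity processing that fits an ultra-low-power device, the sketch below computes the classic "line length" feature of short EEG windows and flags windows that exceed a per-recording threshold. The window length, threshold, and synthetic data are assumptions; practical detectors are typically patient-specific and combine several such features or a trained classifier.

<syntaxhighlight lang="python">
# Illustrative sketch (not a project deliverable): thresholding the per-window
# "line length" of multi-channel EEG, a cheap marker of high-amplitude activity.
import numpy as np

def line_length(window: np.ndarray) -> np.ndarray:
    """Sum of absolute sample-to-sample differences, one value per channel."""
    return np.abs(np.diff(window, axis=-1)).sum(axis=-1)

def detect(eeg: np.ndarray, fs: int = 256, win_s: float = 2.0, thresh: float = 3.0):
    """Flag windows whose line length exceeds `thresh` x the recording median."""
    win = int(win_s * fs)
    n_win = eeg.shape[-1] // win
    windows = eeg[..., : n_win * win].reshape(eeg.shape[0], n_win, win)
    ll = line_length(windows)                      # (channels, n_win)
    baseline = np.median(ll, axis=1, keepdims=True)
    return (ll > thresh * baseline).any(axis=0)    # True where any channel fires

# Toy usage: 18-channel background "EEG" with a high-amplitude burst at 60-70 s.
fs, eeg = 256, 30e-6 * np.random.randn(18, 90 * 256)
eeg[:, 60 * fs : 70 * fs] *= 8
print(np.nonzero(detect(eeg, fs))[0])  # indices of windows flagged as suspicious
</syntaxhighlight>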
  
 
===Links===
* [http://ieeg-swez.ethz.ch/ The SWEC-ETHZ iEEG Database and Algorithms]
* [https://www.wysscenter.ch/project/epilepsy-monitoring-seizure-forecasts/ Epilepsy monitoring and seizure forecasts at Wyss Center]
* [https://www.youtube.com/watch?time_continue=87&v=ouyPXkEud40 Controlling tinnitus with neurofeedback]
  
===Available Projects===
<DynamicPageList>
category = Available
category = Digital
category = Epilepsy
</DynamicPageList>


==Foundation models and LLMs for Health==
[[File:EEG_ECG.png|border|text-top|400px]]
[[File:LLM.png|border|text-top|400px]]
<!-- Seizure-prediction.png -->

===Short Description===
Foundation models and large language models (LLMs) are gaining significant traction within artificial intelligence, particularly because of their potential applications in the health sector. This project is dedicated to developing methodologies for applying foundation models and LLMs to health-related tasks, specifically the analysis of electroencephalogram (EEG) brain signals.

In healthcare and biomedical research, advanced computational models, notably foundation models and LLMs, are changing how intricate biosignals are understood and interpreted. We are exploring the capabilities of these models for the analysis and interpretation of critical biosignals, including electroencephalograms (EEG) and electrocardiograms (ECG).

Foundation models are robust, pre-trained models that are transforming our ability to process and interpret large datasets. Trained initially on extensive and diverse data, they can then be adapted to specific tasks with remarkable accuracy and efficiency. This adaptability makes them particularly attractive for biosignal analysis, where the intricacies of EEG and ECG data demand both precision and contextual understanding.

As a subset of foundation models, LLMs have proven effective at processing and generating human language. At IIS, we are extending this approach beyond textual data to biosignal interpretation: training models to read the "language" of biosignals and translate complex patterns into actionable insights.
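
A minimal sketch of this idea, assuming a patch-based tokenizer and a small transformer encoder in PyTorch (patch length, model width, and the pooled readout are all illustrative choices, not a published recipe):

<syntaxhighlight lang="python">
# Hedged sketch: chop a raw biosignal (e.g., one EEG channel) into fixed-length
# patches, embed each patch as a "token", and feed the token sequence to a
# small transformer encoder, analogously to how LLMs consume text tokens.
import torch
import torch.nn as nn

class BiosignalTokenizer(nn.Module):
    def __init__(self, patch_len=64, d_model=128):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)     # one token per patch

    def forward(self, x):                              # x: (batch, samples)
        b, n = x.shape
        n_patches = n // self.patch_len
        patches = x[:, : n_patches * self.patch_len].reshape(b, n_patches, self.patch_len)
        return self.embed(patches)                     # (batch, n_patches, d_model)

class BiosignalEncoder(nn.Module):
    def __init__(self, d_model=128, n_heads=4, n_layers=2, n_classes=2):
        super().__init__()
        self.tokenizer = BiosignalTokenizer(d_model=d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)      # e.g., a downstream task

    def forward(self, x):
        tokens = self.encoder(self.tokenizer(x))
        return self.head(tokens.mean(dim=1))           # mean-pool over tokens

# Toy usage: 4-second single-channel EEG windows sampled at 256 Hz.
model = BiosignalEncoder()
print(model(torch.randn(8, 1024)).shape)               # torch.Size([8, 2])
</syntaxhighlight>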
  
Our emphasis on EEG and ECG signals is motivated by the profound insights these biosignals provide into human health. EEGs, capturing brain activity, and ECGs, monitoring heart rhythms, are instrumental in diagnosing and managing a wide range of health conditions. By leveraging foundation models and LLMs, we aim to refine diagnostic accuracy, predict health outcomes, and personalize patient care.

IIS invites Master's students to get involved in this area. Our projects offer opportunities to work with state-of-the-art technologies, apply them to real-world health challenges, and contribute to a future where healthcare is more predictive, preventive, and personalized.
  
 
===Links===
* [https://braingpt.org/ BrainGPT]

===Available Projects===
<DynamicPageList>
category = Available
category = Digital
category = HealthGPT
</DynamicPageList>


<!--
 
=Extremely Resilient Hyperdimensional Processor=
[[File:BrainChip.jpg|thumb|left]]

=Flexible High-Density Sensors for Hand Gesture Recognition=
[[File:Hyperdimensional_EMG.png|thumb|center]]
[[File:FlexEMG.png|thumb|right|500px]]

===Short Description===

* [https://iis-people.ee.ethz.ch/~arahimi/papers/ISCAS2018.pdf Gesture Recognition System with Flexible High-Density Sensors]
* [https://iis-people.ee.ethz.ch/~arahimi/papers/papers/ICRC16.pdf Hyperdimensional Biosignal Processing: A Case Study for EMG-based Hand Gesture Recognition (paper)]
* [https://arxiv.org/abs/1901.00234 Adaptive EMG-based hand gesture recognition using hyperdimensional computing (paper)]
* [https://github.com/abbas-rahimi/HDC-EMG Related Matlab code]

==Robot Learning by Demonstration==

* [https://www.tu-chemnitz.de/etit/proaut/publications/IROS2016_neubert.pdf Learning Vector Symbolic Architectures for Reactive Robot Behaviours]
* [https://www.aaai.org/ocs/index.php/WS/AAAIW13/paper/download/7075/6578 Learning Behavior Hierarchies via High-Dimensional Sensor Projection (paper)]
-->

=Projects in Progress=
<DynamicPageList>
suppresserrors = true
category = In progress
category = Human Intranet
</DynamicPageList>
  
=Completed Projects=
These are projects that were recently completed:
<DynamicPageList>
category = Completed
category = Digital
category = Human Intranet
</DynamicPageList>
  
=Where to find us=
{|
| style="padding: 10px" | [[File:Thorir.jpg|frameless|left|100px]]
| style="padding: 10px" | [[File:SebiFrey.jpg|frameless|left|100px]]
| style="padding: 10px" |
| style="padding: 10px" | [[File:Andrea_Cossettini.jpg|frameless|left|100px]]
|-
| [[:User:Thoriri | Thorir Mar Ingolfsson]]
| Sebastian Frey
| [[:User:Xiaywang | Dr. Xiaying Wang]]
| [[:User:Cosandre | Dr. Andrea Cossettini]]
|-
| '''Office''': OAT U21
| '''Office''': ETZ J69.2
| '''Office''': OAT U24 / ETZ J68.2
| '''Office''': OAT U27 / ETZ J69.2
|-
| '''e-mail''': [mailto:thoriri@iis.ee.ethz.ch thoriri@iis.ee.ethz.ch]
| '''e-mail''': [mailto:sefrey@iis.ee.ethz.ch sefrey@iis.ee.ethz.ch]
| '''e-mail''': [mailto:xiaywang@iis.ee.ethz.ch xiaywang@iis.ee.ethz.ch]
| '''e-mail''': [mailto:cossettini.andrea@iis.ee.ethz.ch cossettini.andrea@iis.ee.ethz.ch]
|}


[[Category:Digital]]
[[Category:Human Intranet]]
[[Category:ASIC]]
[[Category:FPGA]]
[[Category:Semester Thesis]]
[[Category:Master Thesis]]
