Human Intranet
What is the Human Intranet?
The world around us is rapidly getting smarter: virtually every component of our daily living environment is being equipped with sensors, actuators, processing, and a connection into a network that will soon count billions of nodes and trillions of sensors. These devices interact with humans only through the traditional input and output channels. Hence, they communicate with our brain only indirectly, through our five sense modalities, forming two separate computing systems: biological versus physical. This interaction could be made far more effective if a direct, high-bandwidth link existed between the two systems, allowing them to truly collaborate and offering opportunities for enhanced functionality that would otherwise be hard to accomplish. The emergence of miniaturized sensing, computing, and actuation devices, as well as interfaces that are form-fitted to the human body, opens the door to a symbiotic convergence between biological function and physical computing.
The Human Intranet is an open, scalable platform that seamlessly integrates an ever-increasing number of sensing, actuation, computation, storage, communication, and energy nodes located on, in, or around the human body, acting in symbiosis with the functions provided by the body itself. It presents a system vision in which, for example, disease would be treated by chronically measuring biosignals deep in the body, or by providing targeted therapeutic interventions that respond on demand and in situ. To gain a holistic view of a person's health, these sensors and actuators must communicate and collaborate with each other. Most such systems prototyped or envisioned today serve to address deficiencies in the human sensory or motor-control systems due to birth defects, illnesses, or accidents (e.g., invasive brain-machine interfaces, cochlear implants, artificial retinas). While all these systems target defects, one can easily imagine that the same technology could enable many types of enhancement and direct interaction with the environment: to make us humans smarter!
Here, in our projects, we mainly focus on the sensing, computation, communication, and emerging storage aspects needed to develop very efficient closed-loop sense-interpret-actuate systems, enabling distributed autonomous behavior. For example, to design the brain of our physical computing (i.e., the compute/interpret component), we rely on computing with ultra-wide words (e.g., 10,000 bits), which eases interfacing with various sensor modalities and actuators. This novel computing paradigm, called hyperdimensional (HD) computing, is inspired by the sheer size of the biological brain's circuits: assuming 1 bit per synapse, the brain is made up of more than 24 billion such ultra-wide words. You can watch some of our demos:
You can also find a collection of completed projects with source code and datasets here:
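To make the paradigm concrete, here is a minimal Python sketch of the core HD operations on dense binary hypervectors (binding by XOR, bundling by majority, similarity by Hamming distance). The dimensionality and the role-filler example are illustrative only, not taken from any of our designs.

```python
import numpy as np

rng = np.random.default_rng(42)
D = 10_000  # dimensionality of the ultra-wide words (hypervectors)

def random_hv():
    """Draw a dense binary hypervector; any two are quasi-orthogonal."""
    return rng.integers(0, 2, size=D, dtype=np.uint8)

def bind(a, b):
    """Bind two hypervectors with elementwise XOR (self-inverse)."""
    return a ^ b

def bundle(hvs):
    """Bundle hypervectors by elementwise majority vote."""
    return (np.sum(hvs, axis=0, dtype=np.int64) > len(hvs) / 2).astype(np.uint8)

def similarity(a, b):
    """Normalized Hamming similarity in [0, 1]; ~0.5 means unrelated."""
    return 1.0 - np.count_nonzero(a != b) / D

# Encode a record of three role-filler pairs in a single hypervector.
roles = [random_hv() for _ in range(3)]
fillers = [random_hv() for _ in range(3)]
record = bundle([bind(r, f) for r, f in zip(roles, fillers)])

# Unbinding the record with a role recovers a noisy copy of its filler,
# which is still far more similar to it than to any other hypervector.
probe = bind(record, roles[0])
print(similarity(probe, fillers[0]))  # ~0.75
print(similarity(probe, fillers[1]))  # ~0.50
```

Because XOR is its own inverse and bundling preserves similarity to its inputs, a single 10,000-bit word can hold a whole data structure that is queried by these cheap bitwise operations.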
Prerequisites and Focus
If you are an M.S. student at ETH Zurich, there are typically no prerequisites. Come and talk to us, and we will adapt the projects to your skills. The scope and focus of the projects are wide. You can choose to work on:
- Efficient hardware architectures in emerging technologies (e.g., the IBM computational memory)
- System-level design and testing
- Sensory interfaces (analog and digital)
- FPGA prototyping, ASIC, and accelerators (SystemVerilog/VHDL)
- Exploring new Human Intranet/IoT applications (high-level embedded programming)
- Algorithm design and optimizations (MATLAB/Python)
Useful Reading
- The Human Intranet--Where Swarms and Humans Meet
- Hyperdimensional Computing: An Introduction to Computing in Distributed Representation with High-Dimensional Random Vectors
- Hyperdimensional Modulation for Robust Low-Power Communications
- High-dimensional Computing as a Nanoscalable Paradigm
- How to Build a Brain
- Pentti Kanerva. 1988. Sparse Distributed Memory. MIT Press, Cambridge, MA, USA
Available Projects
Here, we provide a short description of the related projects to give you a sense of the scope of our work. The direction and details of each project can be adapted to your interests and skills. Please do not hesitate to come and talk to us for more details.
Online Brain-Computer Interfaces
Short Description
Noninvasive brain–computer interfaces (BCIs) and neuroprostheses aim to provide a communication and control channel based on recognizing the subject's intentions from spatiotemporal neural activity, typically recorded by EEG electrodes. What makes this particularly challenging, however, is that the recognition of human intentions is susceptible to errors that accumulate over time.
In this project, your goal would be to develop an efficient, fast-learning hardware device that replaces traditional signal-processing and classification methods by operating directly on raw electrode data in an online fashion.
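As a rough illustration of online learning in this setting, the sketch below updates HD class prototypes one trial at a time and classifies by nearest Hamming distance. The "trials" here are synthetic stand-ins (random patterns plus bit-flip noise), not real EEG encodings, and the two-class setup is an assumption for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000

# Hypothetical stand-in for encoded EEG trials: each of two classes has
# an unknown "true" pattern; a trial is that pattern with 30% bit flips.
true_patterns = rng.integers(0, 2, size=(2, D), dtype=np.uint8)

def noisy_trial(cls, flip=0.30):
    mask = (rng.random(D) < flip).astype(np.uint8)
    return true_patterns[cls] ^ mask

# Online learning: accumulate bipolar (+1/-1) votes per class, one trial
# at a time; the binary prototype is the sign of the running accumulator.
acc = np.zeros((2, D), dtype=np.int32)

def learn(trial, cls):
    acc[cls] += 2 * trial.astype(np.int32) - 1  # map {0,1} to {-1,+1}

def classify(trial):
    protos = (acc > 0).astype(np.uint8)
    dists = [np.count_nonzero(trial ^ p) for p in protos]
    return int(np.argmin(dists))

for _ in range(20):                    # a short online stream of trials
    for cls in (0, 1):
        learn(noisy_trial(cls), cls)

correct = sum(classify(noisy_trial(c)) == c
              for c in (0, 1) for _ in range(50))
print(correct, "/ 100 test trials classified correctly")
```

Because each update is a simple vector addition and inference is a Hamming-distance comparison, this style of learner maps naturally onto low-power hardware and keeps adapting as new trials arrive.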
Links
- Fast and Accurate Multiclass Inference for Motor Imagery BCIs Using Large Multiscale Temporal and Spectral Features
- Hyperdimensional Computing for Blind and One-Shot Classification of EEG Error-Related Potentials
- Exploring Embedding Methods in Binary Hyperdimensional Computing: A Case Study for Motor-Imagery based Brain-Computer Interfaces
Related Projects
- Exploratory Development of a Unified Foundational Model for Multi Biosignal Analysis
- Deep Learning Based Anomaly Detection in ECG Signals Using Foundation Models
- Pretraining Foundational Models for EEG Signal Analysis Using Open Source Large Scale Datasets
- EEG-based drowsiness detection
- In-ear EEG signal acquisition
- EEG earbud
- Advanced EEG glasses
- Predict eye movement through brain activity
- BCI-controlled Drone
Epilepsy Seizure Detection Device
Short Description
Seizure-detection systems hold promise for improving the quality of life of patients with epilepsy, a condition that afflicts nearly 1% of the world's population. In this project, your goal would be to develop efficient techniques, for EEG as well as non-EEG signals, to detect an upcoming seizure on an ultra-low-power device. This covers a wide range of analog and digital techniques. The one-shot and online learning abilities of hyperdimensional computing can come to the rescue.
Links
- The SWEC-ETHZ iEEG Database and Algorithms
- Epilepsy monitoring and seizure forecasts at Wyss Center
- Controlling tinnitus with neurofeedback
Related Projects
- Advanced EEG glasses
- Self Aware Epilepsy Monitoring
- EEG artifact detection with machine learning
- EEG artifact detection for epilepsy monitoring
Extremely Resilient Hyperdimensional Processor
Short Description
The most important aspect of hyperdimensional (HD) computing for hardware realization is its robustness against noise and variations in the computing platform. The principles of HD computing allow the implementation of resilient controllers and state machines under extremely noisy conditions. Their tolerance of faulty components and low signal-to-noise ratio (SNR) conditions is achieved by brain-inspired properties of hypervectors: (pseudo)randomness, high dimensionality, and fully distributed holographic representations.
In this project, your goal would be to design and develop an end-to-end robust HD processor with an extremely resilient controller based on the principles of HD computing, and to measure its resilience against noisy environments and faulty components.
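A minimal sketch of why this resilience holds, assuming dense binary hypervectors: nearest-neighbor recall from an associative memory stays correct even when a large fraction of the queried bits are corrupted, because unrelated hypervectors sit roughly D/2 apart in Hamming distance. The 8-entry memory below is an illustrative assumption, not an actual controller design.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000

# Associative memory holding 8 quasi-orthogonal hypervectors, e.g. the
# states of a controller's finite state machine.
memory = rng.integers(0, 2, size=(8, D), dtype=np.uint8)

def recall(query):
    """Nearest-neighbor lookup by Hamming distance."""
    return int(np.argmin(np.count_nonzero(memory ^ query, axis=1)))

def corrupt(hv, fault_rate):
    """Model faulty cells / low SNR by flipping a random bit fraction."""
    flips = (rng.random(D) < fault_rate).astype(np.uint8)
    return hv ^ flips

# Even with 30% of all bits flipped, every lookup is still correct:
# the corrupted vector lies near 0.3*D from its original but near
# 0.5*D from every unrelated vector, a gap of thousands of bits.
for rate in (0.1, 0.2, 0.3):
    assert all(recall(corrupt(memory[i], rate)) == i for i in range(8))
print("all lookups correct up to 30% bit flips")
```

No single bit (or small group of bits) is critical, which is what the fully distributed holographic representation buys in hardware.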
Links
- A Robust and Energy-Efficient Classifier Using Brain-Inspired Hyperdimensional Computing
- PULP-HD: Accelerating Brain-Inspired High-Dimensional Computing on a Parallel Ultra-Low Power Platform
- Associative Synthesis of Finite State Automata Model of a Controlled Object with Hyperdimensional Computing
Flexible High-Density Sensors for Hand Gesture Recognition
Short Description
Surface electromyography (EMG) signals are the superposition of the electrical activity of the underlying muscles during contraction. Wearable surface EMG devices have a wide range of applications in controlling upper-limb prostheses and in hand-gesture recognition systems intended for consumer human-machine interaction. A high-density EMG electrode array covering the whole arm eases targeting the most desirable muscle locations and copes with sensor misplacement. For robust gesture recognition from such EMG sensors, we rely on brain-inspired HD computing.
In this project, your goal would be to develop new sensors and an RTL implementation of HD computing for one-shot gesture learning on an ultra-low-power device.
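As a toy illustration of such an encoder (the channel count, quantization levels, and encoding scheme here are illustrative assumptions, not our actual design), each channel's identity vector is bound to its quantized amplitude level, and all channels are bundled into one spatial hypervector:

```python
import numpy as np

rng = np.random.default_rng(7)
D = 10_000
N_CH, N_LEVELS = 64, 8  # hypothetical: 64 electrodes, 8 amplitude levels

# Item memories: a random hypervector per channel and per quantized level.
ch_hv = rng.integers(0, 2, size=(N_CH, D), dtype=np.uint8)
lvl_hv = rng.integers(0, 2, size=(N_LEVELS, D), dtype=np.uint8)

def encode(sample):
    """Encode one spatial EMG sample: bind each channel's identity with
    its quantized level (XOR), then bundle all channels by majority."""
    pairs = np.stack([ch_hv[c] ^ lvl_hv[sample[c]] for c in range(N_CH)])
    return (pairs.sum(axis=0, dtype=np.int64) > N_CH / 2).astype(np.uint8)

def hamming(a, b):
    return int(np.count_nonzero(a != b))

# Two similar samples (only a few channels differ) encode to nearby
# points; an unrelated sample encodes to a quasi-orthogonal point.
s1 = rng.integers(0, N_LEVELS, size=N_CH)
s2 = s1.copy()
s2[:4] = (s2[:4] + 1) % N_LEVELS       # perturb 4 of the 64 channels
s3 = rng.integers(0, N_LEVELS, size=N_CH)
print(hamming(encode(s1), encode(s2)), "<", hamming(encode(s1), encode(s3)))
```

Because the encoder is just XORs, popcounts, and a threshold, it lends itself to a compact RTL datapath next to the electrode array.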
Links
- Flexible EMG Demo
- Gesture Recognition System with Flexible High-Density Sensors
- Hyperdimensional Biosignal Processing: A Case Study for EMG-based Hand Gesture Recognition (paper)
- Adaptive EMG-based hand gesture recognition using hyperdimensional computing (paper)
- Related Matlab code
Robot Learning by Demonstration
Short Description
Robot learning from demonstration is a paradigm for enabling robots to autonomously perform new tasks. HD computing is a natural fit in this area, since it can model the relation between a robot's sensory inputs and actuator outputs by learning from only a few demonstrations. In this project, your goal would be to develop algorithms and implementations based on HD computing that enable a robot to learn from online demonstrations. Furthermore, such an HD computing paradigm can be coupled with a brain-computer interface device, enabling a robot to be controlled by EEG signals from the brain. This has a wonderful application in neuroprosthetics that learn from the patient (see this demonstration at EPFL).
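A minimal sketch of the idea, with hypothetical percept and action names: a whole reactive behavior is stored as a single hypervector, the bundle of bound percept-action pairs from a few demonstrations, and is queried at runtime by unbinding with the current percept and cleaning up against the action memory.

```python
import numpy as np

rng = np.random.default_rng(3)
D = 10_000

def hv():
    """A random dense binary hypervector."""
    return rng.integers(0, 2, size=D, dtype=np.uint8)

def bundle(hvs):
    """Elementwise majority vote over a list of hypervectors."""
    return (np.sum(hvs, axis=0, dtype=np.int64) > len(hvs) / 2).astype(np.uint8)

# Hypothetical demonstration: three percept -> action pairs shown to
# the robot (names and pairing are invented for this example).
percepts = {"wall_left": hv(), "wall_right": hv(), "clear": hv()}
actions = {"turn_right": hv(), "turn_left": hv(), "forward": hv()}
demo = [("wall_left", "turn_right"), ("wall_right", "turn_left"),
        ("clear", "forward")]

# Few-shot learning: the whole reactive behavior is one hypervector,
# the bundle of bound (percept XOR action) pairs.
program = bundle([percepts[p] ^ actions[a] for p, a in demo])

def act(percept_name):
    """Unbind the program with the percept, then clean up the noisy
    action hypervector against the action item memory."""
    noisy = program ^ percepts[percept_name]
    return min(actions, key=lambda a: np.count_nonzero(noisy ^ actions[a]))

print(act("wall_left"))   # recovers "turn_right"
```

Adding a new demonstration is just another pair folded into the bundle, which is what makes online, incremental learning cheap in this representation.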
Links
- When the neuroprosthetics learn from the patient
- Learning Vector Symbolic Architectures for Reactive Robot Behaviours
- Learning Behavior Hierarchies via High-Dimensional Sensor Projection (paper)
Smart Eyeglass for Drones
Short Description
This project plans to deploy and build upon a new breed of eyewear that allows you to look inside yourself, instead of just at what is in front of you. These insights help you see deep within yourself by showing shifts in your emotional state, your activity logs, and your health. The device currently provides 6-axis motion sensors as well as EOG sensors, which collectively allow the recognition of body movements, eye movements, and mental state. In this project, your goal is to interface with this device and extend it with other sensors to create novel machine-learning applications, e.g., controlling the movements of our nano-sized quadrotor.
Links
Hyperdimensional Affective Computing
Short Description
Affective computing (sometimes called artificial emotional intelligence) is the study and development of systems and devices that can recognize, interpret, process, and simulate human affect. We focus on emotion recognition and interpretation. An emotion is a subjective mental state caused by specific events, usually accompanied by characteristic behaviors and involuntary physiological changes. Multi-channel physiological signals (e.g., GSR, ECG, EEG, EOG) are therefore good inputs for emotion analysis, and they can be collected easily and continuously by wearable sensors. However, because a high-quality machine-learning model requires a huge amount of training data, energy-efficiency constraints and robustness issues become major performance bottlenecks, especially for wearable devices. Here, HD computing can come to the rescue by providing a low-energy, robust, and fast-learning computational paradigm.
In this project, your goal would be to develop an efficient and robust learning method based on hyperdimensional spaces to improve accuracy and reduce energy consumption.
Links
More Projects
- Exploratory Development of a Unified Foundational Model for Multi Biosignal Analysis
- Deep Learning Based Anomaly Detection in ECG Signals Using Foundation Models
- Pretraining Foundational Models for EEG Signal Analysis Using Open Source Large Scale Datasets
- EEG-based drowsiness detection
- In-ear EEG signal acquisition
- EEG earbud
- Advanced EEG glasses
- Predict eye movement through brain activity
- Self Aware Epilepsy Monitoring
- EEG artifact detection with machine learning
- EEG artifact detection for epilepsy monitoring
- BCI-controlled Drone
Completed Projects
These are projects that were recently completed:
- Ultrasound-EMG combined hand gesture recognition
- Smart e-glasses for concealed recording of EEG signals
- Wireless EEG Acquisition and Processing
- Ultrasound based hand gesture recognition
- Design of combined Ultrasound and Electromyography systems
- Ultra low power wearable ultrasound probe
- Hardware Constrained Neural Architecture Search
- Memory Augmented Neural Networks in Brain-Computer Interfaces
- Low Latency Brain-Machine Interfaces
- Deep Convolutional Autoencoder for iEEG Signals
- TCNs vs. LSTMs for Embedded Platforms
- An Energy Efficient Brain-Computer Interface using Mr.Wolf
- Exploring Algorithms for Early Seizure Detection
- Improving Resiliency of Hyperdimensional Computing
- Toward Superposition of Brain-Computer Interface Models
- FPGA Optimizations of Dense Binary Hyperdimensional Computing
- Fast and Accurate Multiclass Inference for Brain–Computer Interfaces
Where to find us
- Michael Hersche
- e-mail: herschmi@iis.ee.ethz.ch
- ETZ J76.2
- Xiaying Wang
- e-mail: xiaywang@iis.ee.ethz.ch
- ETZ J68.2
- Dr. Abbas Rahimi
- e-mail: abbas@iis.ee.ethz.ch
- ETZ J85
- Prof. Luca Benini
- e-mail: lbenini@iis.ee.ethz.ch
- ETZ J84