Human Intranet

Introduction

HI.png

The world around us is quickly getting a lot smarter: virtually every component of our daily living environment is being equipped with sensors, actuators, processing, and connectivity to a network that will soon count billions of nodes and trillions of sensors. These devices interact with humans only through traditional input and output channels. Hence, they communicate with our brain only indirectly, through our five sense modalities, forming two separate computing systems: biological and physical. This interaction could be made far more effective if a direct, high-bandwidth link existed between the two systems, allowing them to truly collaborate and offering opportunities for enhanced functionality that would otherwise be hard to accomplish. The emergence of miniaturized sense, compute, and actuate devices, as well as interfaces that are form-fitted to the human body, opens the door to a symbiotic convergence between biological function and physical computing.

Human Intranet is an open, scalable platform that seamlessly integrates an ever-increasing number of sensor, actuation, computation, storage, communication, and energy nodes located on, in, or around the human body, acting in symbiosis with the functions provided by the body itself. Human Intranet presents a system vision in which, for example, disease would be treated by chronically measuring biosignals deep in the body, or by providing targeted therapeutic interventions that respond on demand and in situ. To gain a holistic view of a person's health, these sensors and actuators must communicate and collaborate with each other. Most such systems prototyped or envisioned today address deficiencies in the human sensory or motor-control systems due to birth defects, illnesses, or accidents (e.g., invasive brain-machine interfaces, cochlear implants, and artificial retinas). While all these systems target defects, one can easily imagine them leading to many types of enhancement and/or enabling direct interaction with the environment: making us humans smarter!

In our projects, we mainly focus on the sensing, computation, and storage aspects to develop very efficient closed-loop sense-interpret-actuate systems, enabling distributed autonomous behavior. More specifically, to design our physical brain (the compute/interpret component), we rely on computing with ultra-wide words (e.g., 10,000 bits), which eases interfacing with various sensor modalities and actuators. This novel computing paradigm, called hyperdimensional (HD) computing, is inspired by the very size of the brain's circuits: assuming 1 bit per synapse, they constitute more than 24 billion such ultra-wide words. Overall, our projects cover algorithmic, hardware/software, and system-level design and development.
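
As a concrete illustration, the snippet below is a minimal Python sketch of the basic HD operations: drawing random hypervectors, binding, bundling, and similarity search in an item memory. The dimension, the symbol names, and the key-value record are illustrative assumptions, not taken from a specific project.

  # A minimal Python sketch of the basic HD operations; the dimension, the
  # symbol names, and the key-value record are illustrative assumptions.
  import numpy as np

  D = 10000                         # dimensionality of the hypervectors
  rng = np.random.default_rng(0)

  def random_hv():
      """Draw a (pseudo)random bipolar hypervector."""
      return rng.choice([-1, 1], size=D)

  def bind(a, b):
      """Binding: element-wise multiplication (reversible, dissimilar to inputs)."""
      return a * b

  def bundle(hvs):
      """Bundling: element-wise majority (result stays similar to each input)."""
      return np.sign(np.sum(hvs, axis=0))

  def similarity(a, b):
      """Normalized dot product in [-1, 1]; close to 0 for unrelated vectors."""
      return float(a @ b) / D

  # Item memory: one fixed random hypervector per symbol (e.g., sensor channel).
  channels = {name: random_hv() for name in ["ch1", "ch2", "ch3"]}
  values = {name: random_hv() for name in ["low", "mid", "high"]}

  # Encode a reading as the bundle of channel-value bindings (a holographic record).
  record = bundle([bind(channels["ch1"], values["high"]),
                   bind(channels["ch2"], values["low"]),
                   bind(channels["ch3"], values["mid"])])

  # Query: which value was bound to ch1? Unbind and search the item memory.
  query = bind(record, channels["ch1"])
  print(max(values, key=lambda v: similarity(query, values[v])))   # -> "high"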

Prerequisites and Focus

If you are an M.S. student, there are typically no special prerequisites. We can redefine and adapt a project based on your skills. The scope and focus of the projects are wide. You can choose to work on:

  • Theory of learning systems including HD computing, hidden Markov models (HMMs), and clustering algorithms
  • Exploring various embedded/IoT applications
  • Algorithmic design and optimization (Matlab/Python)
  • Hardware and digital architecture design
  • FPGA prototyping (SystemVerilog/VHDL)
  • ASIC chips and accelerators for low signal-to-noise-ratio conditions

Useful Reading

Available Projects

Here we provide a short list of related projects for your information. The directions and details of the projects can be adapted to your interests and skills. You can also take a look at the list of publications here to find other active projects we are working on. Please do not hesitate to contact us for more details.


Epilepsy Seizure Detection

Seizure-prediction.png

Short Description

Seizure-detection devices hold promise for improving the quality of life of patients with epilepsy, a condition that afflicts nearly 1% of the world's population. In this project, your goal would be to develop efficient hardware that detects seizures from EEG signals on an ultra-low-power device. The project focuses on designing and developing efficient techniques for the analog front end and the digital signal processing. The abilities of HD computing for one-shot and online learning can be exploited as well.
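
To give a flavor of how one-shot and online learning can look with HD computing, the Python sketch below builds class prototypes by accumulating encoded examples and refines them online with a perceptron-like update. The feature encoder and the synthetic data are placeholders (assumptions for illustration), not an actual EEG pipeline.

  # A minimal sketch of one-shot/online learning with HD class prototypes.
  # The feature encoder and the synthetic data are placeholders (assumptions),
  # not an actual EEG pipeline; only the prototype-update scheme is illustrated.
  import numpy as np

  D = 10000                                         # hypervector dimensionality
  N_FEAT, LEVELS = 8, 16                            # hypothetical features/levels
  rng = np.random.default_rng(1)
  feat_hv = rng.choice([-1, 1], size=(N_FEAT, D))   # random ID per feature
  lvl_hv = rng.choice([-1, 1], size=(LEVELS, D))    # random ID per level

  def encode(features):
      """Encode a feature vector (values in [0, 1]) into one hypervector."""
      idx = np.clip((features * (LEVELS - 1)).astype(int), 0, LEVELS - 1)
      return np.sign(np.sum(feat_hv * lvl_hv[idx], axis=0))

  class OnlineHDClassifier:
      def __init__(self, n_classes):
          self.proto = np.zeros((n_classes, D))     # accumulators per class

      def predict(self, hv):
          return int(np.argmax(np.sign(self.proto) @ hv))

      def fit_one(self, features, label):
          """One-shot: add the example to its class prototype; online
          refinement: if it was misclassified, subtract it from the wrong class."""
          hv = encode(features)
          pred = self.predict(hv)
          self.proto[label] += hv
          if pred != label:
              self.proto[pred] -= hv

  # Usage with synthetic two-class data (well-separated feature means).
  clf = OnlineHDClassifier(n_classes=2)
  for _ in range(50):
      y = int(rng.integers(0, 2))
      x = np.clip(rng.normal(0.2 + 0.6 * y, 0.05, size=N_FEAT), 0, 1)
      clf.fit_one(x, y)
  x_test = np.clip(rng.normal(0.8, 0.05, size=N_FEAT), 0, 1)
  print(clf.predict(encode(x_test)))                # expected (with high prob.): 1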

Links

Online Brain-Computer Interfaces

BCI.png

Short Description

Noninvasive brain–computer interfaces and neuroprostheses aim to provide a communication and control channel based on the recognition of the subject's intentions from spatiotemporal neural activity, typically recorded by EEG electrodes. What makes this particularly challenging, however, is the susceptibility to errors over time in recognizing human intentions.

In this project, your goal would be to develop an efficient and fast learning method based on HD computing that replaces traditional signal processing and classification methods by operating directly on the raw electrode data in an online fashion.

Links


Extremely Resilient HD Processor

BrainChip.jpg

Short Description

The most important aspect of HD computing for hardware realization is its robustness against noise and variations in the computing platform. The principles of HD computing make it possible to implement resilient controllers and state machines for extremely noisy conditions. Its tolerance of faulty components and low signal-to-noise-ratio (SNR) conditions is achieved through the brain-inspired properties of hypervectors: (pseudo)randomness, high dimensionality, and fully distributed holographic representations.

In this project, your goal would be to design and develop an end-to-end robust HD processor with an extremely resilient controller based on the principles of HD computing, and to measure its resiliency against noisy environments and faulty components.
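
The snippet below is a small, purely illustrative Python experiment of the robustness property described above: nearest-neighbor retrieval from an item memory of hypervectors still succeeds even after a large fraction of components are flipped, which is exactly the effect a resilient HD processor exploits.

  # A small, purely illustrative Python experiment: nearest-neighbor retrieval
  # from an item memory of hypervectors still succeeds after a large fraction
  # of components are flipped (simulated faulty storage or noisy wires).
  import numpy as np

  D = 10000
  rng = np.random.default_rng(2)
  item_memory = rng.choice([-1, 1], size=(32, D))    # 32 stored hypervectors

  def noisy_copy(hv, flip_fraction):
      """Flip a given fraction of the components of a hypervector."""
      flips = rng.random(D) < flip_fraction
      return np.where(flips, -hv, hv)

  def nearest(hv):
      """Index of the most similar stored hypervector (dot-product search)."""
      return int(np.argmax(item_memory @ hv))

  for flip_fraction in (0.10, 0.30, 0.45):
      recovered = sum(nearest(noisy_copy(item_memory[i], flip_fraction)) == i
                      for i in range(len(item_memory)))
      print(f"{flip_fraction:.0%} flipped: {recovered}/32 recovered")
  # Even with ~45% of the 10,000 components flipped, the remaining ~10% net
  # agreement is far above the cross-talk noise, so retrieval still succeeds.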

Links

Flexible High-Density EMG Hand Gesture Recognition

Hyperdimensional EMG.png

Short Description

Surface EMG signals are the superposition of the electrical activity of the underlying muscles during contractions. Wearable surface EMG devices have a wide range of applications in controlling upper-limb prostheses and in hand-gesture recognition systems intended for consumer human-machine interaction. A high-density EMG electrode array covering the whole arm eases targeting the desired muscle locations and copes with sensor misplacement. For robust gesture recognition from such EMG arrays, we rely on brain-inspired HD computing.

In this project, your goal would be to develop an RTL implementation of HD computing for one-shot gesture learning on an ultra-low-power device.
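
For orientation, the following Python sketch outlines one common way to encode a multi-channel EMG stream with HD computing: spatial encoding of each frame by binding electrode IDs to quantized amplitude levels, and temporal encoding of consecutive frames as a permuted n-gram. The channel count, quantization, and n-gram size are assumptions made for illustration; this shows the algorithmic idea, not the targeted RTL design.

  # A minimal sketch of spatiotemporal HD encoding for a multi-channel EMG
  # stream. Channel count, quantization, and n-gram size are assumptions made
  # for illustration; this shows the algorithmic idea, not the targeted RTL.
  import numpy as np

  D, N_CH, LEVELS, NGRAM = 10000, 64, 21, 4
  rng = np.random.default_rng(3)
  ch_hv = rng.choice([-1, 1], size=(N_CH, D))     # one ID hypervector per electrode
  lvl_hv = rng.choice([-1, 1], size=(LEVELS, D))  # one hypervector per amplitude level

  def bipolarize(x):
      """Map an integer vector to {-1, +1}, breaking ties toward +1."""
      return np.where(x >= 0, 1, -1)

  def encode_frame(sample):
      """Spatial encoding: bundle channel-ID x amplitude-level bindings."""
      idx = np.clip((sample * (LEVELS - 1)).astype(int), 0, LEVELS - 1)
      return bipolarize(np.sum(ch_hv * lvl_hv[idx], axis=0))

  def encode_window(frames):
      """Temporal encoding: bind NGRAM consecutive frames, each rotated by its
      position in the window (np.roll acts as an order-preserving permutation)."""
      gram = np.ones(D)
      for t, frame in enumerate(frames[-NGRAM:]):
          gram = gram * np.roll(encode_frame(frame), t)
      return gram

  # Usage: a gesture prototype is the bundle of the windows seen during training.
  frames = rng.random((20, N_CH))                        # fake normalized EMG
  prototype = bipolarize(sum(encode_window(frames[i:i + NGRAM])
                             for i in range(len(frames) - NGRAM)))
  test = encode_window(frames[5:5 + NGRAM])              # a demonstrated window
  rand = rng.choice([-1, 1], size=D)                     # an unrelated hypervector
  print(float(prototype @ test) / D, float(prototype @ rand) / D)
  # The demonstrated window is clearly more similar to the prototype than noise.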

Links


Robot Learning by Demonstration

Image source: Neubert et al., IROS 2016

Short Description

Robot learning from demonstration is a paradigm for enabling robots to autonomously perform new tasks. HD computing is a good fit in this area since it naturally models the relation between the sensory inputs and actuator outputs of a robot by learning from a few demonstrations. In this project, your goal would be to develop algorithms and implementations based on HD computing that enable a robot to learn from online demonstrations. Further, such an HD computing-based paradigm can be coupled to a brain-computer interface device, enabling a robot to be controlled by EEG signals from the brain. This has a compelling application in neuroprosthetics, where the system learns from a patient (see this demonstration at EPFL).
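
As a toy illustration of this idea, the Python sketch below stores demonstrated (sensor state, action) pairs as a single bundled "program" hypervector and retrieves the action for the current state by unbinding. The discrete states, actions, and obstacle scenario are invented for the example and only hint at the vector-symbolic approach referenced above.

  # A toy Python sketch (assumed discrete states/actions, invented scenario) of
  # learning a reactive sensor->action mapping as one "program" hypervector:
  # bundle the bindings of demonstrated pairs, then unbind at run time.
  import numpy as np

  D = 10000
  rng = np.random.default_rng(4)

  def random_hvs(names):
      """One fixed random bipolar hypervector per named symbol."""
      return {name: rng.choice([-1, 1], size=D) for name in names}

  states = random_hvs(["clear", "obstacle_left", "obstacle_right"])
  actions = random_hvs(["forward", "turn_right", "turn_left"])

  # Demonstrations: which action the teacher chose in which sensor state.
  demos = [("clear", "forward"),
           ("obstacle_left", "turn_right"),
           ("obstacle_right", "turn_left")]

  # The learned behavior is a single bundle of state-action bindings.
  program = np.sign(sum(states[s] * actions[a] for s, a in demos))

  def act(state_name):
      """Unbind the current state from the program; return the closest action."""
      query = program * states[state_name]
      return max(actions, key=lambda a: float(actions[a] @ query))

  print(act("obstacle_left"))   # expected: "turn_right"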

Links

Other Available Projects


Projects In Progress


Contact Information