Human Intranet

What is Human Intranet?

HI.png

The world around us is quickly getting a lot smarter: virtually every component of our daily living environment is being equipped with sensors, actuators, processing, and a connection into a network that will soon count billions of nodes and trillions of sensors. These devices interact with humans only through the traditional input and output channels. Hence, they communicate with our brain only indirectly, through our five sense modalities, forming two separate computing systems: biological versus physical. This interaction could be made far more effective if a direct high-bandwidth link existed between the two systems, allowing them to truly collaborate with each other and offering opportunities for enhanced functionality that would otherwise be hard to accomplish. The emergence of miniaturized sense, compute, and actuate devices, as well as interfaces that are form-fitted to the human body, opens the door for a symbiotic convergence between biological function and physical computing.

Human Intranet is an open, scalable platform that seamlessly integrates an ever-increasing number of sensor, actuation, computation, storage, communication, and energy nodes located on, in, or around the human body, acting in symbiosis with the functions provided by the body itself. Human Intranet presents a system vision in which, for example, disease would be treated by chronically measuring biosignals deep in the body, or by providing targeted, therapeutic interventions that respond on demand and in situ. To gain a holistic view of a person’s health, these sensors and actuators must communicate and collaborate with each other. Most such systems prototyped or envisioned today serve to address deficiencies in the human sensory or motor control systems due to birth defects, illnesses, or accidents (e.g., invasive brain-machine interfaces, cochlear implants, artificial retinas). While all these systems target defects, one can easily imagine extensions that enable many types of enhancement and/or direct interaction with the environment: to make us humans smarter!

Here, in our projects, we mainly focus on the sensor, computation, communication, and emerging storage aspects to develop very efficient closed-loop sense-interpret-actuate systems, enabling distributed autonomous behavior. For example, to design the brain of our physical computing (i.e., the compute/interpret component), we rely on computing with ultra-wide words (e.g., 10,000 bits), which eases interfacing with various sensor modalities and actuators. This novel computing paradigm, called hyperdimensional (HD) computing, is inspired by the very size of the biological brain’s circuits: assuming 1 bit per synapse, the brain is made up of more than 24 billion such ultra-wide words. You can watch some of our demos.
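
To make the paradigm concrete, here is a minimal Python sketch of the basic HD operations: random bipolar hypervectors, binding, bundling, and a similarity measure. The dimensionality and encoding are illustrative choices, not our actual implementations.

  import numpy as np

  D = 10000                      # hypervector width: the "ultra-wide words" above
  rng = np.random.default_rng(0)

  def random_hv():
      # a random bipolar hypervector; any two such vectors are quasi-orthogonal
      return rng.choice([-1, 1], size=D)

  def bind(a, b):
      # binding (element-wise multiplication) associates two hypervectors
      return a * b

  def bundle(hvs):
      # bundling (element-wise majority) superimposes a set of hypervectors
      return np.sign(np.sum(hvs, axis=0))

  def similarity(a, b):
      # normalized dot product: ~0 for unrelated vectors, 1 for identical ones
      return float(np.dot(a, b)) / D

  key, value = random_hv(), random_hv()
  record = bind(key, value)            # role-filler pair, e.g., sensor ID and reading
  recovered = bind(record, key)        # binding is its own inverse for bipolar vectors
  print(similarity(recovered, value))  # prints 1.0: the value is recovered exactly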

Prerequisites and Focus

If you are an M.S. student at ETHZ, there are typically no prerequisites. Come and talk to us, and we will adapt the project to your skills. The scope and focus of the projects are wide. You can choose to work on:

  • Efficient hardware architectures in emerging technologies (e.g., the IBM computational memory)
  • System-level design and testing
  • Sensory interfaces (analog and digital)
  • FPGA prototyping, ASIC, and accelerators (SystemVerilog/VHDL)
  • Exploring new Human Intranet/IoT applications (high-level embedded programming)
  • Algorithm design and optimizations (Matlab/Python)


Useful Reading

Available Projects

Here, we provide short descriptions of the related projects so you can see the scope of our work. The direction and details of each project can be adapted to your interests and skills. Please do not hesitate to come and talk to us for more details.


Smart Eyeglass for Drones

Jins 6axis.png
Jins EOG.png

Short Description

This project plans to deploy and build upon a new breed of eyewear that allows you to look inside yourself, instead of just at what is in front of you. These insights help you see deep within yourself by showing shifts in your emotional state, your activity logs, and your health. The device currently provides 6-axis motion sensors as well as EOG (electrooculography) sensors, which collectively allow it to recognize body movements, eye movements, and state of mind. In this project, your goal is to interface with this device and extend it with other sensors to create novel machine learning applications, e.g., controlling the movements of our nano-size quadrotor.
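
As a rough illustration of the intended pipeline, the sketch below reads the glasses' motion and EOG channels, maps them to a discrete command, and forwards it to the drone. The Sample fields, the threshold values, and the read_sample/send_command callbacks are hypothetical stand-ins for the real device and quadrotor interfaces; a real project would replace the thresholds with a learned classifier.

  from dataclasses import dataclass

  @dataclass
  class Sample:
      gyro_y: float   # head rotation rate (deg/s), from the 6-axis IMU
      eog_h: float    # horizontal EOG channel (a.u.)
      eog_v: float    # vertical EOG channel (a.u.)

  def classify(s: Sample) -> str:
      # naive threshold rules standing in for a learned model
      if s.eog_v > 200.0:
          return "ASCEND"       # strong upward eye movement
      if s.eog_v < -200.0:
          return "DESCEND"
      if s.eog_h > 200.0:
          return "YAW_RIGHT"    # gaze to the right
      if s.eog_h < -200.0:
          return "YAW_LEFT"
      return "HOVER"

  def control_loop(read_sample, send_command):
      # read_sample() and send_command() are hypothetical stand-ins for the
      # eyeglass driver and the quadrotor's command link
      while True:
          send_command(classify(read_sample()))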

Links

HD-Based Affective Computing

Emotion-recognition.jpg
Emotions-on-arousal-valence-space.jpg

Short Description

Affective computing (sometimes called artificial emotional intelligence) is the study and development of systems and devices that can recognize, interpret, process, and simulate human affects. We focus on emotion recognition and interpretation. Emotion is a subjective mental state caused by specific events, usually accompanied by characteristic behaviors and involuntary physiological changes. Multi-channel physiological signals (e.g., GSR, ECG, EEG, EOG) are therefore good inputs for emotion analysis, and they can be collected easily and continuously by wearable sensors. However, because high-quality machine learning models need huge amounts of training data, energy-efficiency constraints and robustness issues become major performance bottlenecks, especially on wearable devices. Here, HD computing can come to the rescue by providing a low-energy, robust, and fast-learning computational paradigm.

In this project, your goal would be to develop an efficient and robust HD-based learning method that improves accuracy and reduces energy consumption.
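
As one possible starting point, the sketch below encodes a multi-channel physiological reading into a single hypervector: each channel ID is bound to its quantized value, and the channels are bundled. The channel names follow the paragraph above; the dimensionality, number of quantization levels, and normalization to [0, 1) are illustrative assumptions.

  import numpy as np

  D, LEVELS = 10000, 16
  rng = np.random.default_rng(1)
  channels = ["GSR", "ECG", "EEG", "EOG"]
  id_hv = {c: rng.choice([-1, 1], size=D) for c in channels}      # one random ID per channel
  level_hv = [rng.choice([-1, 1], size=D) for _ in range(LEVELS)] # one vector per value level

  def encode(reading):
      # bind each channel ID with its quantized level, then bundle all channels
      acc = np.zeros(D)
      for c, v in reading.items():               # v assumed normalized to [0, 1)
          q = min(int(v * LEVELS), LEVELS - 1)
          acc += id_hv[c] * level_hv[q]
      return np.sign(acc)

  # class prototypes would simply be bundles of encoded training trials;
  # inference picks the prototype with the highest dot product
  hv = encode({"GSR": 0.7, "ECG": 0.3, "EEG": 0.5, "EOG": 0.1})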


Online Brain-Computer Interfaces

BCI.png

Short Description

Noninvasive brain–computer interfaces and neuroprostheses aim to provide a communication and control channel based on recognizing the subject’s intentions from spatiotemporal neural activity, typically recorded by EEG electrodes. What makes this particularly challenging, however, is its susceptibility to errors in recognizing human intentions over time.

In this project, your goal would be to develop efficient, fast-learning hardware that replaces traditional signal processing and classification methods by operating directly on raw electrode data in an online fashion.
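
One way the online part could look, as a sketch: keep one integer prototype per class and apply a perceptron-style update whenever a prediction is wrong. The encoder that turns a raw EEG window into a hypervector is left abstract here; any HD encoder, such as the one sketched earlier, could be plugged in.

  import numpy as np

  D = 10000

  class OnlineHDClassifier:
      # one integer prototype per class, updated from a stream of trials
      def __init__(self, classes):
          self.proto = {c: np.zeros(D) for c in classes}

      def predict(self, hv):
          # pick the class whose prototype best matches the encoded trial
          return max(self.proto, key=lambda c: float(np.dot(self.proto[c], hv)))

      def update(self, hv, label):
          # perceptron-style online update: learn only from mistakes, which
          # keeps the prototypes from saturating on easy trials
          pred = self.predict(hv)
          if pred != label:
              self.proto[label] += hv
              self.proto[pred] -= hv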

Links


Extremely Resilient HD Processor

BrainChip.jpg

Short Description

The most important aspect of HD computing for hardware realization is its robustness against noise and variations in the computing platform. The principles of HD computing allow implementing resilient controllers and state machines for extremely noisy conditions. Tolerance to faulty components and low signal-to-noise ratio (SNR) conditions is achieved through the brain-inspired properties of hypervectors: (pseudo)randomness, high dimensionality, and fully distributed holographic representations.

In this project, your goal would be to design and develop an end-to-end robust HD processor with an extremely resilient controller based on the principles of HD computing, and to measure its resiliency against noisy environments and faulty components.
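
The resilience claim is easy to probe in simulation: flipping a random fraction of a hypervector's components, emulating faulty memory cells or noisy wires, degrades similarity gracefully instead of destroying the information. A minimal sketch:

  import numpy as np

  D = 10000
  rng = np.random.default_rng(2)
  a = rng.choice([-1, 1], size=D)

  def flip(hv, fraction):
      # emulate hardware faults by flipping a random fraction of components
      out = hv.copy()
      idx = rng.choice(D, size=int(fraction * D), replace=False)
      out[idx] *= -1
      return out

  for fraction in (0.0, 0.1, 0.3):
      sim = float(np.dot(a, flip(a, fraction))) / D
      print(f"{fraction:.0%} faulty components -> similarity {sim:.2f}")
  # prints roughly 1.00, 0.80, and 0.40: no single component is critical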

Links

Flexible High-Density EMG Hand Gesture Recognition

Hyperdimensional EMG.png

Short Description

Surface EMG signals are the superposition of the electrical activity of the underlying muscles when contractions occur. Wearable surface EMG devices have a wide range of applications in controlling upper-limb prostheses and in hand gesture recognition systems intended for consumer human-machine interaction. A high-density EMG electrode array covering the whole arm eases targeting the most desirable muscle locations and copes with sensor misplacement. For robust gesture recognition from such EMG arrays, we rely on brain-inspired HD computing.

In this project, your goal would be to develop an RTL implementation of HD computing for one-shot gesture learning on an ultra-low-power device.
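
With binary hypervectors, one-shot learning reduces to storing the single encoded example as the class prototype and classifying by Hamming distance, i.e., XOR plus popcount, which maps directly to compact RTL. In the sketch below, random vectors stand in for actually encoded EMG gestures:

  import numpy as np

  D = 10000
  rng = np.random.default_rng(3)

  def hamming(a, b):
      # XOR followed by popcount: only a handful of gates per bit in RTL
      return int(np.count_nonzero(a ^ b))

  # one-shot learning: the associative memory stores one encoded example per
  # gesture (random vectors stand in for real HD-encoded EMG windows)
  memory = {g: rng.integers(0, 2, size=D, dtype=np.uint8)
            for g in ("fist", "open", "pinch")}

  query = memory["fist"].copy()
  query[: D // 10] ^= 1            # corrupt 10% of the bits, emulating noisy EMG
  print(min(memory, key=lambda g: hamming(memory[g], query)))  # prints "fist"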

Links


Robot Learning by Demonstration

Image source: Neubert et al., IROS 2016

Short Description

Robot learning from demonstration is a paradigm for enabling robots to autonomously perform new tasks. HD computing is a natural fit in this area, since it models the relation between a robot’s sensory inputs and actuator outputs by learning from only a few demonstrations. In this project, your goal would be to develop algorithms and implementations based on HD computing that enable a robot to learn from online demonstrations. Furthermore, such an HD computing-based paradigm can be coupled with a brain-computer interface device, enabling control of a robot through EEG signals from the brain. This has a wonderful application in neuroprosthetics: learning from a patient (see this demonstration at EPFL).
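
A minimal sketch of one way HD computing can model such sensor-actuator relations: bind each demonstrated sensor state to its action, bundle the bound pairs into a single "program" hypervector, and retrieve an action by unbinding with a sensor state. The random vectors stand in for properly encoded robot percepts and commands.

  import numpy as np

  D = 10000
  rng = np.random.default_rng(4)

  def hv():
      return rng.choice([-1, 1], size=D)

  # a few demonstrated (sensor state, action) pairs, as random stand-ins
  demos = [(hv(), hv()) for _ in range(3)]

  # the learned "program" is a bundle of the bound pairs
  program = np.sign(sum(s * a for s, a in demos))

  # recall: unbinding with a demonstrated sensor state retrieves its action
  s0, a0 = demos[0]
  recalled = program * s0
  print(float(np.dot(recalled, a0)) / D)  # ~0.5, far above the ~0.0 of chance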

Links

Other Available Projects


Projects In Progress


Where to find us