GUI development for an action-cam-based eye tracking device


[Image: EOGSetup.png]

Short Description

Vision is the primary human sense, and the analysis of eye movements reveals much about cognitive processes. Eye tracking has provided insights into the perception of art, the viewing behavior of novice and expert athletes, and the driving ability of the elderly. In such studies, it is imperative to record the participants' behavior under natural conditions outside of laboratory settings. Therefore, mobile eye tracking devices that also record the participant's view through a head-mounted scene camera are essential.

Together with the Department of Neurology of the University Hospital Zurich, the IIS is developing such a mobile eye tracker. Currently, the scene camera (an off-the-shelf action cam) is controlled via a proprietary app on a tablet, whereas the eye tracker is controlled with a custom, Android-based app from the IIS.
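Many consumer action cams expose their remote-control functions over Wi-Fi, often as simple HTTP requests sent by the companion app to the camera. As a rough illustration of how the custom app could take over this role, the sketch below sends hypothetical start/pause/stop commands; the base URL and the command paths are placeholders, not the actual protocol of the camera used here, which would have to come from the vendor SDK or from reverse-engineering the proprietary app.

<syntaxhighlight lang="java">
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

/**
 * Minimal sketch of a Wi-Fi camera controller. The base URL and the
 * command paths ("record/start", "record/pause", "record/stop") are
 * placeholders for the camera's real (vendor-specific) control protocol.
 */
public class ActionCamController {

    private final String baseUrl;

    public ActionCamController(String baseUrl) {
        // e.g. "http://192.168.1.1/" once the tablet is on the cam's Wi-Fi (assumed address)
        this.baseUrl = baseUrl;
    }

    /** Sends a single command and returns true if the camera answered with HTTP 200. */
    public boolean sendCommand(String command) {
        HttpURLConnection connection = null;
        try {
            URL url = new URL(baseUrl + command);
            connection = (HttpURLConnection) url.openConnection();
            connection.setConnectTimeout(2000);
            connection.setReadTimeout(2000);
            return connection.getResponseCode() == HttpURLConnection.HTTP_OK;
        } catch (IOException e) {
            return false; // camera unreachable or command rejected
        } finally {
            if (connection != null) {
                connection.disconnect();
            }
        }
    }

    public boolean startRecording() { return sendCommand("record/start"); }
    public boolean pauseRecording() { return sendCommand("record/pause"); }
    public boolean stopRecording()  { return sendCommand("record/stop"); }
}
</syntaxhighlight>

On Android, such network calls must run off the UI thread, e.g., on a background thread or executor.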

Your task is to integrate the functionality of the camera app into the IIS app. This includes interfacing with the action cam, reading its video stream, and controlling its recording status (start, pause, stop) as well as all of its settings. The live preview should be displayed in the app, and it should be possible to overlay the live eye position on this preview.
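One way to realize the live eye-position overlay is to stack a transparent custom View on top of the video preview (e.g., both as children of a FrameLayout) and redraw a marker whenever the eye tracker delivers a new gaze estimate. The following sketch shows such an overlay view; the class name, the normalized-coordinate convention, and the setGazePosition() method are illustrative assumptions, not part of the existing IIS app.

<syntaxhighlight lang="java">
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.util.AttributeSet;
import android.view.View;

/**
 * Transparent overlay meant to sit on top of the scene-camera preview.
 * The app pushes the latest gaze estimate in normalized scene-camera
 * coordinates (0..1) and the view draws a marker at the matching pixel.
 */
public class EyePositionOverlay extends View {

    private final Paint markerPaint = new Paint(Paint.ANTI_ALIAS_FLAG);
    private float gazeX = Float.NaN; // normalized x in [0, 1]
    private float gazeY = Float.NaN; // normalized y in [0, 1]

    public EyePositionOverlay(Context context, AttributeSet attrs) {
        super(context, attrs);
        markerPaint.setColor(Color.RED);
        markerPaint.setStyle(Paint.Style.STROKE);
        markerPaint.setStrokeWidth(6f);
    }

    /** Called by the eye-tracking pipeline from the UI thread. */
    public void setGazePosition(float normX, float normY) {
        gazeX = normX;
        gazeY = normY;
        invalidate(); // request a redraw with the new position
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        if (Float.isNaN(gazeX) || Float.isNaN(gazeY)) {
            return; // no gaze estimate yet
        }
        // Map normalized coordinates to the on-screen size of the preview.
        float px = gazeX * getWidth();
        float py = gazeY * getHeight();
        canvas.drawCircle(px, py, 25f, markerPaint);
    }
}
</syntaxhighlight>

If the gaze estimates arrive on a background thread, postInvalidate() should be used instead of invalidate().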

Status: In progress

Looking for 1-2 Semester/Master students
Supervisors: David J. Mack, Philipp Schönle

Character

70% Programming (Java/Android)
30% Image and Video Processing

Prerequisites

Good programming skills (ideally Java/Android)
Basic knowledge of image processing and GUI design

Professor

Qiuting Huang (http://www.iis.ee.ethz.ch/portrait/staff/huang.en.html)