== SR Research ==
 
[[image:SR_logo.jpg | right |150px]]
 
[[File:eyelink.jpg|thumb|200px|SR Research EyeLink II]]
 
=== EyeLink II ===
 
The EyeLink II system consists of three miniature cameras mounted on a padded headband. Two eye cameras allow binocular eye tracking or selection of the subject’s dominant eye.

An optical head-tracking camera integrated into the headband allows accurate tracking of the subject’s point of gaze without the need for a bite bar.
'''Specifications'''
 
{| class="wikitable"
|-
| Fast and simple participant setup, calibration, and validation
|}
=== EyeLink 1000 ===
[[image:Eyelink1000.jpg | right |150px]]

http://www.sr-research.com/eyelink1000.html

The core of the EyeLink 1000 eye tracker consists of a custom-designed high-speed camera connected to a dedicated Host computer. Running on a real-time operating system, the Host software provides fast eye sample access with low inter-sample variability, and can be accessed via a set of programming interfaces for multiple operating systems and programming languages.
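As an illustration of those programming interfaces, below is a minimal, untested Python sketch using SR Research's pylink bindings (part of the EyeLink Developers Kit). The Host PC address, file names and messages are placeholder assumptions; calibration and drift correction are omitted (see the Matlab calibration example linked below).

<syntaxhighlight lang="python">
# Minimal pylink sketch (assumptions: the EyeLink Developers Kit / pylink is
# installed and the Host PC is reachable at its default address 100.1.1.1).
import pylink

el = pylink.EyeLink("100.1.1.1")            # connect to the EyeLink Host PC
el.openDataFile("test.edf")                 # EDF file is created on the Host PC
el.sendMessage("EXPERIMENT_START")          # messages are written into the EDF

el.startRecording(1, 1, 1, 1)               # samples + events, to file and link
pylink.msecDelay(2000)                      # stand-in for an actual trial
el.sendMessage("TRIAL_RESULT 0")
el.stopRecording()

el.closeDataFile()
el.receiveDataFile("test.edf", "test.edf")  # copy the EDF to the Display PC
el.close()
</syntaxhighlight>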
'''Specifications'''

{| class="wikitable"
|-
! EyeLink 1000
|-
| Sampling Rate || Head Supported: 2000 Hz Monocular / 1000 Hz Binocular
Remote / Head Free: 500 Hz Monocular
|-
| Accuracy || Head Supported: 0.25º - 0.5º average accuracy
Remote / Head Free: 0.5º average accuracy
|-
| Real-time Data Access || Head Supported: 1.4 msec (SD < 0.4 msec) @ 2000 Hz
Remote / Head Free: 3 msec (SD < 1.2 msec) @ 500 Hz
|-
| Participant Setup || Very simple; typically 2-5 minutes
|-
| Resolution || Head Supported: 0.01º RMS, micro-saccade resolution of 0.05º
Remote / Head Free: 0.05º RMS, saccade resolution of 0.25º
|}
Full specifications: http://www.sr-research.com/pdf/techspec.pdf<br />

[[Eyelink_1000_calibration | Matlab calibration example]]

[[Eyelink1000plussetup|Eyelink 1000 plus setup]]
== Tobii ==
 
[[File:tobii_logo.jpg | right]]
 
[[File:Tobii.jpg|thumb|200px|Tobii T60]]
 
[[File:TX300.png|thumb|250px|Tobii TX300]]
 
===Tobii T60/T120===
 
There is a Tobii setup in the babylab, which is in a fixed lab setup.
 
There are also two mobile Tobii setups. Please contact [mailto:g.lange@psych.ru.nl Gero Lange] for booking requests. Please keep in mind that these mobile setups are booked ''without'' a room. If you plan to use these eyetrackers, make sure to also book a room that can be locked. You can use them in a cubicle, as long as the eyetracker is stored in a locked room at the end of the day.

'''Lab setup'''
 
*[[Media:TobiiEyetrackerExtension_1_1.zip|Full Documentation (zip)]]
 
*[http://www.visionspace.at/index.php?id=3&L=1 Neurobs Presentation plugin & documentation]
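Besides the Presentation plugin above, gaze data can also be read directly in Python with the Tobii Pro SDK (the tobii_research package). The sketch below is illustrative and untested; whether a given T60/T120 unit is supported by the current Pro SDK should be checked, since older units may require the legacy Tobii Analytics SDK instead.

<syntaxhighlight lang="python">
# Illustrative sketch using the Tobii Pro SDK (pip install tobii_research).
# Assumption: the connected tracker is supported by this SDK; older T60/T120
# units may need the legacy Tobii Analytics SDK instead.
import time
import tobii_research as tr

trackers = tr.find_all_eyetrackers()
if not trackers:
    raise RuntimeError("No Tobii eyetracker found")
tracker = trackers[0]

def on_gaze(gaze_data):
    # gaze_data is a dict; gaze points are in normalised display coordinates (0-1)
    print(gaze_data["left_gaze_point_on_display_area"],
          gaze_data["right_gaze_point_on_display_area"])

tracker.subscribe_to(tr.EYETRACKER_GAZE_DATA, on_gaze, as_dictionary=True)
time.sleep(5)                                # collect a few seconds of gaze data
tracker.unsubscribe_from(tr.EYETRACKER_GAZE_DATA, on_gaze)
</syntaxhighlight>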
 
      
== SensoMotoric Instruments (SMI) ==
 
[[image:SMI_logo.jpg |right]]
 
[[File:IViewXRED.jpg|thumb|200px|SMI RED500]]
 
[[file:eyeviewx.jpg |thumb|200px|SMI iView X]]
 
=== SMI RED500 ===
 
[[SMI RED 500 Eye Tracker (DCC)|How to connect SMI RED 500 eyetracker (Scheme)]]

'''Downloads'''
 
*[[Media:Eye-tracking_Presentation_iview.zip | Presentation Extension (zip)]]
 
*[[Media:SMI-Eyetracker-Example.zip|Example Presentation Script (zip, new script since 2018-06-05)]] An example script that connects to and communicates with the iView X program. It shows two pictures side by side, and the resulting data can be analysed with BeGaze. The files are extensively commented; the comments explain how to use an SMI eyetracker from Presentation.
Source code: [https://gitlab.socsci.ru.nl/h.voogd/SMI-Eyetracking-Example/blob/master/SMI-Eyetracker-Example.sce SMI-Eyetracker-Example.sce], [https://gitlab.socsci.ru.nl/h.voogd/SMI-Eyetracking-Example/blob/master/SMI-Eyetracker-ExampleINFO.pcl SMI-Eyetracker-ExampleINFO.pcl], [https://gitlab.socsci.ru.nl/h.voogd/SMI-Eyetracking-Example/blob/master/SMI-Eyetracker-ExamplePCL.pcl SMI-Eyetracker-ExamplePCL.pcl], [https://gitlab.socsci.ru.nl/h.voogd/SMI-Eyetracking-Example/blob/master/SMI-Eyetracker-ExampleSUBS.pcl SMI-Eyetracker-ExampleSUBS.pcl]

*[[Media:Python.zip|Python Script (zip)]]
 
=== iView X ===
 
*[[Media:Eye-tracking_Presentation_iview.zip | Presentation Extension (zip)]]
 
*[[Media:SMI-Eyetracker-Example.zip|Example Presentation Script (zip, new script since 2018-06-05)]] An example script that connects to and communicates with the iView X program. It shows two pictures side by side, and the resulting data can be analysed with BeGaze. The files are extensively commented; the comments explain how to use an SMI eyetracker from Presentation. A minimal Python sketch of the underlying iView X command interface is shown below, after the list of downloads.
Source code: [https://gitlab.socsci.ru.nl/h.voogd/SMI-Eyetracking-Example/blob/master/SMI-Eyetracker-Example.sce SMI-Eyetracker-Example.sce], [https://gitlab.socsci.ru.nl/h.voogd/SMI-Eyetracking-Example/blob/master/SMI-Eyetracker-ExampleINFO.pcl SMI-Eyetracker-ExampleINFO.pcl], [https://gitlab.socsci.ru.nl/h.voogd/SMI-Eyetracking-Example/blob/master/SMI-Eyetracker-ExamplePCL.pcl SMI-Eyetracker-ExamplePCL.pcl], [https://gitlab.socsci.ru.nl/h.voogd/SMI-Eyetracking-Example/blob/master/SMI-Eyetracker-ExampleSUBS.pcl SMI-Eyetracker-ExampleSUBS.pcl]

*[[Media:Python.zip|Python Script (zip)]]
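The Presentation examples above communicate with the iView X software through its network remote-control interface: short plain-text commands sent over UDP. The sketch below illustrates that idea in Python; the IP address, port number and the exact command strings (ET_REC, ET_REM, ET_STP) are assumptions that should be checked against the iView X manual for the local setup.

<syntaxhighlight lang="python">
# Rough sketch of the iView X remote-control idea: plain-text commands over UDP.
# Assumptions: iView X runs on 192.168.1.2 and listens on UDP port 4444; verify
# the command names (ET_REC, ET_REM, ET_STP) against the iView X manual.
import socket

IVIEWX_ADDRESS = ("192.168.1.2", 4444)        # hypothetical address of the iView X PC
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_command(command):
    # iView X expects newline-terminated ASCII commands
    sock.sendto((command + "\n").encode("ascii"), IVIEWX_ADDRESS)

send_command("ET_REC")                        # start recording
send_command("ET_REM start_trial_1")          # write a remark/marker into the data
# ... present the stimuli here ...
send_command("ET_STP")                        # stop recording
sock.close()
</syntaxhighlight>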
 