=Examples=
Webgazer is an open-source library for eye tracking using common webcams, licensed under [http://www.gnu.org/licenses/gpl-3.0.en.html GPLv3]. For more detailed information, including how to use it in your own application, please visit the authors' website: https://webgazer.cs.brown.edu. An example of how Webgazer can be used in your own jsPsych experiment is described below.<br>
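The core WebGazer API is small: you register a gaze listener and start the webcam-based tracker. A minimal sketch (following the usage shown on the WebGazer website; the sample-collecting helper is our own illustration, not part of the library):

```javascript
// Minimal WebGazer usage sketch: register a gaze listener and start the
// webcam tracker. The guard keeps the snippet inert outside a browser
// page that has loaded webgazer.js.
const gazeSamples = [];

function recordGaze(data, elapsedTime) {
  if (data === null) return;          // no gaze prediction available yet
  // keep only the predicted x,y screen coordinates, as in the example below
  gazeSamples.push({ x: data.x, y: data.y, t: elapsedTime });
}

if (typeof webgazer !== 'undefined') {
  webgazer.setGazeListener(recordGaze).begin();
}
```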
==jsPsych example==
The code is available here: https://gitlab.socsci.ru.nl/tsg/webgazer_calibration_example
The implementation is based on the demo example from the Brown HCI group (https://webgazer.cs.brown.edu/calibration.html). The demo has a few dependencies, listed below, that link to external servers; in this example, the corresponding tools have been downloaded and hosted locally on the server. These links may be restored later so that future implementations benefit from new developments.<br>
The experiment starts with a calibration and then runs a simple task: the participant is asked to look at a fixation cross in the middle of the screen and then to focus as quickly as possible on red spheres popping up at random edges or corners of the screen. At the end, all the corresponding gaze traces are shown in a graphic. Only gaze x,y-coordinates are saved to the server; no video data is stored. The calibration part and the presentation of an image (i.e., one trial), together with everything needed to save the eye-tracking data, are implemented as jsPsych plugins: webgazer-calibration-plugin.js and webgazer-trial-plugin.js. The jsPsych implementation of the experiment itself is put together in main.js.<br>
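In jsPsych (version 6 style, current when this example was written), such a setup could be assembled into a timeline roughly as follows. The plugin type names mirror the plugin file names above; the `sphere_position` parameter and the number of trials are illustrative assumptions, not the actual plugin API:

```javascript
// Hypothetical sketch of how main.js might assemble the timeline.
const timeline = [];

// 1. webcam-based calibration (webgazer-calibration-plugin.js)
timeline.push({ type: 'webgazer-calibration' });

// 2. a few trials: fixation cross, then a red sphere at a random
//    edge or corner of the screen (webgazer-trial-plugin.js)
const spherePositions = ['top-left', 'top-right', 'bottom-left', 'bottom-right'];
for (const position of spherePositions) {
  timeline.push({
    type: 'webgazer-trial',
    sphere_position: position   // assumed parameter name, for illustration
  });
}

// run the experiment when jsPsych is loaded (browser only)
if (typeof jsPsych !== 'undefined') {
  jsPsych.init({ timeline: timeline });
}
```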
For server-side data storage, the example uses the Radcloud Mini solution: https://www.socsci.ru.nl/wilberth/radcloud/index.html <br>
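Since only x,y-coordinates leave the browser, the upload step reduces to serializing the gaze samples and posting them. The endpoint and request shape below are hypothetical placeholders for illustration, not the actual Radcloud Mini API:

```javascript
// Strip gaze samples down to bare x,y coordinates before upload --
// no video frames or other webcam data are ever sent to the server.
function serializeGaze(samples) {
  return JSON.stringify(samples.map(s => ({ x: s.x, y: s.y })));
}

// Hypothetical upload call (browser only); the real storage goes through
// the Radcloud Mini solution linked above.
if (typeof window !== 'undefined' && typeof fetch !== 'undefined') {
  fetch('/data/upload', {               // placeholder endpoint, not Radcloud's
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: serializeGaze([{ x: 12, y: 34 }])   // example sample
  });
}
```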
To run this example: http://exp.socsci.ru.nl/webgazer/webgazer_calibration_example/jsp_webgazer.html
    
==Webgazer credits:==
     Model BlazeFace: https://tfhub.dev/tensorflow/tfjs-model/blazeface/1/default/1
     Model FaceMesh: https://tfhub.dev/mediapipe/tfjs-model/facemesh/1/default/1
==Other dependencies:==
 
localforage: https://raw.githubusercontent.com/localForage/localForage/master/dist/localforage.js<br>
 
 
bootstrap: https://getbootstrap.com/docs/4.3/getting-started/download/<br>
 
 
sweetalert: sweetalert.min.js was copied from the WebGazer demo, since it could no longer be found at https://github.com/t4t5/sweetalert<br>
