Revision as of 11:58, 27 August 2020

Examples

Webgazer is an open source library for eye tracking using common webcams, licensed under the GPLv3 (http://www.gnu.org/licenses/gpl-3.0.en.html). For more detailed information, please visit the authors' website, https://webgazer.cs.brown.edu, which explains how it can be used in your own application. An example of how Webgazer can be used in your own jsPsych experiment is described below.
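To give an idea of how the library is used, WebGazer delivers gaze predictions through a listener callback. The sketch below collects predictions into plain records; `samples` and `onGaze` are illustrative names, only `webgazer.setGazeListener` and `webgazer.begin` (shown in the comment) are the library's actual API.

```javascript
// Collect predicted gaze positions as plain {x, y, t} records.
// `samples` and `onGaze` are illustrative names, not part of WebGazer.
const samples = [];

function onGaze(data, elapsedTime) {
  // WebGazer passes null while it has no prediction (e.g. face not found).
  if (data === null) return;
  samples.push({ x: data.x, y: data.y, t: elapsedTime });
}

// In the browser the listener would be registered like this:
// webgazer.setGazeListener(onGaze).begin();
```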

jsPsych example

The code is available here: https://gitlab.socsci.ru.nl/tsg/webgazer_calibration_example

The implementation is based on the demo example from the Brown HCI group (https://webgazer.cs.brown.edu/calibration.html?). The demo has a few dependencies, listed below, that link to external servers. In this example, the corresponding tools have been downloaded and added locally to the server. These links may be restored later so that future implementations benefit from new developments.
The experiment starts with a calibration and then runs a simple task: the participant is asked to look at the fixation cross in the middle of the screen and then focus as quickly as possible on red spheres that pop up at random edges or corners of the screen. At the end, all the recorded gaze traces are shown in a graph. Only gaze x,y-coordinates are saved as data to the server; no video data is stored. The calibration part and the presentation of an image (i.e. one trial), together with everything needed to save the eye tracking data, are implemented as jsPsych plugins: webgazer-calibration-plugin.js and webgazer-trial-plugin.js. The jsPsych implementation of the experiment itself is put together in main.js.
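The sphere placement described above can be sketched as follows. All names here are hypothetical and not taken from the actual plugin code; the sketch assumes targets appear at the four corners and four edge midpoints of the screen, leaving the centre for the fixation cross.

```javascript
// Map an index 0..7 to one of eight target positions: the four corners
// and the midpoints of the four screen edges (all names are illustrative).
function targetPosition(index, width, height, margin) {
  const xs = [margin, width / 2, width - margin];   // left, centre, right
  const ys = [margin, height / 2, height - margin]; // top, middle, bottom
  // All nine grid cells except the centre, which holds the fixation cross.
  const cells = [
    [0, 0], [1, 0], [2, 0],
    [0, 1],         [2, 1],
    [0, 2], [1, 2], [2, 2],
  ];
  const [cx, cy] = cells[index % cells.length];
  return { x: xs[cx], y: ys[cy] };
}

// A trial would then draw the sphere at a randomly chosen index, e.g.:
// const pos = targetPosition(Math.floor(Math.random() * 8), 1920, 1080, 50);
```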
For server data storage, it uses the Radcloud Mini solution: https://www.socsci.ru.nl/wilberth/radcloud/index.html
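Since only x,y-coordinates reach the server, the data that leaves the participant's machine can be flattened into a simple text format. The sketch below is hypothetical; it does not show the actual Radcloud upload format.

```javascript
// Turn gaze samples into CSV rows containing only timestamps and rounded
// coordinates, so no video or image data is ever included in the upload.
function samplesToCsv(samples) {
  const header = 't,x,y';
  const rows = samples.map(s => `${s.t},${Math.round(s.x)},${Math.round(s.y)}`);
  return [header, ...rows].join('\n');
}
```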

To run this example: http://exp.socsci.ru.nl/webgazer/webgazer_calibration_example/jsp_webgazer.html

Webgazer credits:

If you use WebGazer.js please cite the following paper (https://jeffhuang.com/Final_WebGazer_IJCAI16.pdf):

@inproceedings{papoutsaki2016webgazer,
  author = {Alexandra Papoutsaki and Patsorn Sangkloy and James Laskey and Nediyana Daskalova and Jeff Huang and James Hays},
  title = {WebGazer: Scalable Webcam Eye Tracking Using User Interactions},
  booktitle = {Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI)},
  pages = {3839--3845},
  year = {2016},
  organization = {AAAI}
}

WebGazer download:

webgazer.js: https://webgazer.cs.brown.edu

WebGazer dependencies:

In webgazer.js, the original web links were changed to point to local copies of the models:

  // const BLAZEFACE_MODEL_URL="https://tfhub.dev/tensorflow/tfjs-model/blazeface/1/default/1";
     const BLAZEFACE_MODEL_URL="./model/tfjs/blazeface";
  // const FACEMESH_GRAPHMODEL_PATH = 'https://tfhub.dev/mediapipe/tfjs-model/facemesh/1/default/1';
     const FACEMESH_GRAPHMODEL_PATH = './model/tfjs/facemesh';
  models downloaded from:
     Model BlazeFace: https://tfhub.dev/tensorflow/tfjs-model/blazeface/1/default/1
     Model FaceMesh: https://tfhub.dev/mediapipe/tfjs-model/facemesh/1/default/1

Other dependencies:

localforage: https://raw.githubusercontent.com/localForage/localForage/master/dist/localforage.js
bootstrap: https://getbootstrap.com/docs/4.3/getting-started/download/
sweetalert: sweetalert.min.js was copied from the webgazer demo because it could not be found at https://github.com/t4t5/sweetalert