Webgazer eye tracking solution implemented as jsPsych plugin
Latest revision as of 11:42, 27 August 2020
Webgazer
Webgazer is an open source library for eye tracking using common webcams, licensed under GPLv3[1]. For more detailed information, please visit the authors' website (https://webgazer.cs.brown.edu), which explains how the library can be used in your own application. An example of how Webgazer can be used in your own jsPsych experiment is described below.
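As a minimal illustration of the usage pattern the WebGazer library follows (register a gaze listener, then start tracking), the sketch below uses a stub in place of the real library, since the actual webgazer object only runs in a browser with webcam access. Everything named `fakeWebgazer` is our stand-in, not part of WebGazer itself; only the `setGazeListener(...)`/`begin()` call shape mirrors the real API.

```javascript
// Sketch of the WebGazer usage pattern with a stub in place of the real library.
const samples = [];

const fakeWebgazer = {
  listener: null,
  setGazeListener(fn) { this.listener = fn; return this; }, // chainable, as in WebGazer
  begin() {
    // Simulate two gaze predictions arriving from the tracking loop.
    this.listener({ x: 100, y: 200 }, 16.7);
    this.listener(null, 33.3); // WebGazer passes null when there is no prediction
    return this;
  }
};

// Same call shape as the real API: webgazer.setGazeListener(...).begin()
fakeWebgazer.setGazeListener((data, elapsedTime) => {
  if (data == null) return; // skip frames without a prediction
  samples.push({ x: data.x, y: data.y, t: elapsedTime });
}).begin();

console.log(samples.length); // 1
```

In the browser the same two calls on the real `webgazer` object are enough to start receiving x,y predictions.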
jsPsych example
The code is available here: https://gitlab.socsci.ru.nl/tsg/webgazer_calibration_example
The implementation is based on the demo example from the Brown HCI group (https://webgazer.cs.brown.edu/calibration.html). In the demo example, a few dependencies (listed below) link to external servers. In this example, the corresponding tools have been downloaded and added locally to our Titus server. These links may be restored later so that future implementations benefit from new developments.
The experiment starts with a calibration and then runs a simple task in which the participant is asked to look at the fixation cross in the middle of the screen and then to focus as quickly as possible on red spheres that pop up at random positions on the edges or corners of the screen. At the end, all the corresponding eye-gaze traces are shown in a graphic. Only eye-gaze x,y-coordinates are saved as data to the server; no video data is stored. The calibration part and the presentation of an image (i.e. one trial), together with everything needed to save the eye tracking data, are implemented as jsPsych plugins: webgazer-calibration-plugin.js and webgazer-trial-plugin.js. The jsPsych implementation of the experiment itself is put together in main.js.
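The structure described above (one calibration, then trials) could be assembled in main.js roughly as sketched below. The timeline entries are plain objects, so the sketch runs anywhere; all parameter names other than `type` are our assumptions, not the plugins' actual interface.

```javascript
// Hedged sketch of a main.js timeline; parameters besides `type` are assumed names.
const fixationAndTarget = {
  type: 'webgazer-trial',               // webgazer-trial-plugin.js, one trial
  fixation_duration: 1000,              // ms at the central fixation cross (assumed name)
  target_positions: ['edge', 'corner']  // where the red spheres may appear (assumed name)
};

const timeline = [
  { type: 'webgazer-calibration' },     // webgazer-calibration-plugin.js
  fixationAndTarget
];

// In the browser, jsPsych.init({ timeline: timeline }) would start the experiment.
console.log(timeline.length); // 2
```

Keeping calibration and trial as separate plugins means the calibration block can be reused, or repeated mid-experiment, without touching the trial code.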
For server data storage, it uses the Radcloud Mini solution: https://www.socsci.ru.nl/wilberth/radcloud/index.html
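Because only x,y-coordinates (and no video) are stored, the payload sent to the server can be reduced to something like the sketch below. The helper `serializeGaze` and its field names are ours, purely illustrative; Radcloud Mini's actual upload interface may differ.

```javascript
// Strip gaze samples down to x, y and timestamp before upload;
// any other per-frame data never leaves the browser.
// serializeGaze is a hypothetical helper, not part of Radcloud Mini.
function serializeGaze(samples) {
  return JSON.stringify(samples.map(s => ({ x: s.x, y: s.y, t: s.t })));
}

const payload = serializeGaze([{ x: 512, y: 384, t: 120.5, frame: 'dropped' }]);
console.log(payload); // [{"x":512,"y":384,"t":120.5}]  — the `frame` field is stripped
```

Reducing the payload to bare coordinates is also what makes the privacy claim above checkable: the serialized string can be inspected to confirm nothing else is transmitted.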
To run this example: https://exp.socsci.ru.nl/webgazer/webgazer_calibration_example/jsp_webgazer.html
If you would like to create your own experiment using this jsPsych Webgazer plugin, please contact the TSG.
WebGazer download
webgazer.js: https://webgazer.cs.brown.edu
WebGazer dependencies:
In webgazer.js, the original web links were changed to point to local copies of the models:

 // const BLAZEFACE_MODEL_URL="https://tfhub.dev/tensorflow/tfjs-model/blazeface/1/default/1";
 const BLAZEFACE_MODEL_URL="./model/tfjs/blazeface";
 // const FACEMESH_GRAPHMODEL_PATH = 'https://tfhub.dev/mediapipe/tfjs-model/facemesh/1/default/1';
 const FACEMESH_GRAPHMODEL_PATH = './model/tfjs/facemesh';

The models were downloaded from:
Model BlazeFace: https://tfhub.dev/tensorflow/tfjs-model/blazeface/1/default/1
Model FaceMesh: https://tfhub.dev/mediapipe/tfjs-model/facemesh/1/default/1
Other dependencies:
localforage: https://raw.githubusercontent.com/localForage/localForage/master/dist/localforage.js
bootstrap: https://getbootstrap.com/docs/4.3/getting-started/download/
sweetalert: sweetalert.min.js was copied from the webgazer demo because it was not found here: https://github.com/t4t5/sweetalert
Publications
If you use WebGazer.js please cite:
@inproceedings{papoutsaki2016webgazer,
  author       = {Alexandra Papoutsaki and Patsorn Sangkloy and James Laskey and Nediyana Daskalova and Jeff Huang and James Hays},
  title        = {WebGazer: Scalable Webcam Eye Tracking Using User Interactions},
  booktitle    = {Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI)},
  pages        = {3839--3845},
  year         = {2016},
  organization = {AAAI}
}