# Movement Primitive Gaze Data

- Authors: Knopp, Benjamin; Auras, Daniel; Schütz, Alexander C.; Endres, Dominik
- Published: 2024-08-15
- Handle: https://tam-datahub.online.uni-marburg.de/handle/tam/27
- DOI: https://doi.org/10.60834/tam-datahub-5

This archive contains the data needed to recreate the paper "Reading Decisions from Gaze Direction during Graphics Turing Test of Gait Animation". The accompanying code and further information are available at https://gitlab.uni-marburg.de/knoppbe/movementprimitivegaze

## Files

- `eyedata/*.txt`: Eye-tracking data, most importantly the point of regard in pixel coordinates of the recorded videos (the videos themselves are not part of this archive)
- `keypoint_labels/*.h5`: Pixel positions of keypoints (stimulus-displaying monitor, avatar body landmarks)
- `postcalibration.json`: Offsets to post-calibrate the gaze data, provided here to enable analysis without the videos. Use `src/plot_postcal.py` to reproduce.
- `temp_exp1.json`: Decision data from the 2AFC experiment (see https://gitlab.uni-marburg.de/knoppbe/mp-identification.git for how to produce this file)
- `preprocessed_data.json`: Preprocessed decision data from the 2AFC experiment (see https://gitlab.uni-marburg.de/knoppbe/mp-identification.git for how to produce this file)

## Description

We investigated gaze direction during movement observation. The eye-movement data were collected during an experiment in which different models of movement production (based on movement primitives, MPs) were compared in a two-alternative forced-choice (2AFC) task. In each trial, participants observed a side-by-side presentation of two naturalistic 3D-rendered human movement videos: one video was based on a motion-captured gait sequence, while the other was generated by recombining machine-learned MPs to approximate the same movement. The participants' task was to discriminate between these movements while their eye movements were recorded.
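As a starting point, a minimal sketch of how the gaze files might be loaded and post-calibrated. The column layout of `eyedata/*.txt` (x and y in the first two whitespace-separated columns) and the schema of `postcalibration.json` (`{"<participant>": [dx, dy]}`) are assumptions for illustration; consult the linked repository for the actual formats.

```python
import json


def load_gaze(txt_path):
    """Load point-of-regard samples as (x, y) lists in pixel coordinates.

    ASSUMPTION: whitespace-separated columns with x and y first; the real
    eyedata/*.txt layout may include additional columns or a header.
    """
    xs, ys = [], []
    with open(txt_path) as f:
        for line in f:
            parts = line.split()
            if len(parts) >= 2:
                xs.append(float(parts[0]))
                ys.append(float(parts[1]))
    return xs, ys


def apply_postcalibration(xs, ys, postcal_path, participant):
    """Shift gaze samples by per-participant offsets.

    ASSUMPTION: postcalibration.json maps participant IDs to [dx, dy];
    the actual schema may differ.
    """
    with open(postcal_path) as f:
        offsets = json.load(f)
    dx, dy = offsets[participant]
    return [x + dx for x in xs], [y + dy for y in ys]
```

`src/plot_postcal.py` in the code repository shows how the offsets were actually derived and applied.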
We complement previous analyses of the binary decision data with eye-tracking data, investigating the role of gaze direction during task execution. We computed how much information is shared between gaze features extracted from the eye-tracking data and the participants' decisions, and between gaze features and the correct answers. We found that eye movements reflect the participants' decisions during the 2AFC task, but not the correct answer. This result is important for future experiments (possibly in virtual reality), which should take advantage of eye tracking to complement binary decision data.

## Metadata

- Format: tar archive
- Language: en
- License: https://creativecommons.org/licenses/by-sa/4.0/
- Subject: 110-02 Biological Psychology and Cognitive Neuroscience
- Title: Eyetracking data during observation of movement primitive generated motion
- Type: Research Data
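The shared-information analysis described above can be illustrated with a plug-in mutual-information estimate between a discretized gaze feature and the binary decisions. This is a generic sketch, not the paper's estimator: the actual gaze features, discretization, and any bias correction used in the study are documented in the linked code repository.

```python
from collections import Counter
from math import log2


def mutual_information(a, b):
    """Plug-in mutual information (in bits) between two discrete sequences.

    Illustrative only: e.g. a = discretized gaze feature per trial,
    b = participant decision per trial. The paper's actual estimator
    may apply bias correction and different features.
    """
    n = len(a)
    p_ab = Counter(zip(a, b))          # joint counts over observed pairs
    p_a = Counter(a)                   # marginal counts
    p_b = Counter(b)
    return sum(
        (c / n) * log2((c / n) / ((p_a[u] / n) * (p_b[v] / n)))
        for (u, v), c in p_ab.items()
    )
```

With perfectly aligned binary sequences this yields 1 bit; with independent sequences it yields 0, mirroring the paper's finding that gaze features carry information about decisions but not about correct answers.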