Publications in Scientific Journals:

H. Liao, W. Dong, H. Huang, G. Gartner, H. Liu:
"Inferring user tasks in pedestrian navigation from eye movement data in real-world environments";
International Journal of Geographical Information Science, 32 (2018).

English abstract:
Eye movement data convey a wealth of information that can be used
to probe human behaviour and cognitive processes. To date, eye
tracking studies have mainly focused on laboratory-based evaluations
of cartographic interfaces; in contrast, little attention has been
paid to eye movement data mining for real-world applications. In this
study, we propose using machine-learning methods to infer user
tasks from eye movement data in real-world pedestrian navigation
scenarios. We conducted a real-world pedestrian navigation experiment
in which we recorded eye movement data from38 participants.
We trained and cross-validated a random forest classifier for classifying
five common navigation tasks using five types of eye movement
features. The results show that the classifier can achieve an overall
accuracy of 67%. We found that statistical eye movement features
and saccade encoding features are more useful than the other
investigated feature types for distinguishing user tasks. We also
identified that the choice of classifier, the time window size and the
eye movement features considered are all important factors that
influence task inference performance. The results open the door
to innovative real-world applications, such as navigation systems
that adapt the information they provide to the task a user is
currently performing.
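The pipeline described in the abstract can be sketched in Python with scikit-learn. This is a hypothetical illustration, not the authors' code or data: the feature vectors and labels below are random placeholders standing in for the eye-movement features and the five navigation tasks, and the small injected signal exists only so the classifier has something to learn.

```python
# Hypothetical sketch of the task-inference approach: a random forest
# classifier cross-validated on eye-movement feature vectors.
# All data here are synthetic placeholders, not the study's dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Placeholders: 5 navigation tasks, 10-dimensional feature vectors
# standing in for statistical and saccade-encoding features.
n_samples, n_features, n_tasks = 200, 10, 5
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, n_tasks, size=n_samples)
X[:, 0] += y  # weak synthetic task signal

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

In the study itself, the choice of classifier, the time window over which features are aggregated, and the feature types used would all be tuned, since the abstract identifies these as key factors for inference performance.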

Keywords: Wayfinding; random forests; task inference; eye tracking; machine learning

"Official" electronic version of the publication (accessed through its Digital Object Identifier - DOI)

Created from the Publication Database of the Vienna University of Technology.