TY - JOUR
T1 - A multimodal dataset for authoring and editing multimedia content
T2 - The MAMEM project
AU - Nikolopoulos, Spiros
AU - Petrantonakis, Panagiotis C.
AU - Georgiadis, Kostas
AU - Kalaganis, Fotis
AU - Liaros, Georgios
AU - Lazarou, Ioulietta
AU - Adam, Katerina
AU - Papazoglou-Chalikias, Anastasios
AU - Chatzilari, Elisavet
AU - Oikonomou, Vangelis P.
AU - Kumar, Chandan
AU - Menges, Raphael
AU - Staab, Steffen
AU - Müller, Daniel
AU - Sengupta, Korok
AU - Bostantjopoulou, Sevasti
AU - Katsarou, Zoe
AU - Zeilig, Gabi
AU - Plotnik, Meir
AU - Gotlieb, Amihai
AU - Kizoni, Racheli
AU - Fountoukidou, Sofia
AU - Ham, Jaap
AU - Athanasiou, Dimitrios
AU - Mariakaki, Agnes
AU - Comanducci, Dario
AU - Sabatini, Edoardo
AU - Nistico, Walter
AU - Plank, Markus
AU - Kompatsiaris, Ioannis
N1 - Publisher Copyright:
© 2017 The Authors
PY - 2017/12
Y1 - 2017/12
N2 - We present a dataset that combines multimodal biosignals and eye-tracking information gathered under a human-computer interaction framework. The dataset was developed within the MAMEM project, which aims to endow people with motor disabilities with the ability to edit and author multimedia content through mental commands and gaze activity. The dataset includes EEG, eye-tracking, and physiological (GSR and heart rate) signals collected from 34 individuals (18 able-bodied and 16 motor-impaired). Data were collected during interaction with a specifically designed interface for web browsing and multimedia content manipulation, as well as during imaginary movement tasks. The presented dataset will contribute towards the development and evaluation of modern human-computer interaction systems that foster the reintegration of people with severe motor impairments into society.
AB - We present a dataset that combines multimodal biosignals and eye-tracking information gathered under a human-computer interaction framework. The dataset was developed within the MAMEM project, which aims to endow people with motor disabilities with the ability to edit and author multimedia content through mental commands and gaze activity. The dataset includes EEG, eye-tracking, and physiological (GSR and heart rate) signals collected from 34 individuals (18 able-bodied and 16 motor-impaired). Data were collected during interaction with a specifically designed interface for web browsing and multimedia content manipulation, as well as during imaginary movement tasks. The presented dataset will contribute towards the development and evaluation of modern human-computer interaction systems that foster the reintegration of people with severe motor impairments into society.
UR - http://www.scopus.com/inward/record.url?scp=85034082151&partnerID=8YFLogxK
U2 - 10.1016/j.dib.2017.10.072
DO - 10.1016/j.dib.2017.10.072
M3 - Article
AN - SCOPUS:85034082151
SN - 2352-3409
VL - 15
SP - 1048
EP - 1056
JO - Data in Brief
JF - Data in Brief
ER -