TY - JOUR
T1 - Automatic Identification of Facial Tics Using Selfie-Video
AU - Loewenstern, Yocheved
AU - Benaroya-Milshtein, Noa
AU - Belelovsky, Katya
AU - Bar-Gad, Izhar
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2025
Y1 - 2025
AB - The intrinsic nature of tic disorders, characterized by symptom variability and fluctuation, poses challenges in clinical evaluations. Currently, tic assessments predominantly rely on subjective questionnaires administered periodically during clinical visits, thus lacking continuous quantitative evaluation. This study aims to establish an automatic objective measure of tic expression in natural behavioral settings. A custom-developed smartphone application was used to record selfie-videos of children and adolescents with tic disorders exhibiting facial motor tics. Facial landmarks were utilized to extract tic-related features from video segments labeled as either "tic" or "non-tic". These features were then passed through a tandem of custom deep neural networks to learn spatial and temporal properties for tic classification of these segments according to their labels. The model achieved a mean accuracy of 95% when trained on data across all subjects, and consistently exceeded 90% accuracy in leave-one-session-out and leave-one-subject-out cross-validation training schemes. This automatic tic identification measure may provide a valuable tool for clinicians in facilitating diagnosis, patient follow-up, and treatment efficacy evaluation. Combining this measure with standard smartphone technology has the potential to revolutionize large-scale clinical studies, thereby expediting the development and testing of novel interventions.
KW - Automatic detection
KW - facial landmarks
KW - machine learning
KW - motor tics
KW - tic disorders
KW - Tourette syndrome
UR - http://www.scopus.com/inward/record.url?scp=85208265874&partnerID=8YFLogxK
U2 - 10.1109/JBHI.2024.3488285
DO - 10.1109/JBHI.2024.3488285
M3 - Article
C2 - 39475732
AN - SCOPUS:85208265874
SN - 2168-2194
VL - 29
SP - 409
EP - 419
JO - IEEE Journal of Biomedical and Health Informatics
JF - IEEE Journal of Biomedical and Health Informatics
IS - 1
ER -