Investigating the use of face video data to measure mental health
LE3 .A278 2022
Bachelor of Science
Due to the increase in mental distress and suicidality caused by COVID-19, as well as growing humanitarian crises around the world, the need for mental health screening and treatment is rising. A proactive mental health screening tool that uses face video data could help identify when individuals need help. This exploratory study investigates whether facial behavior, specifically Eye-Blink Rate (EBR) and facial affect, can be associated with measures of depression (CESD-R), attachment style (ECR-RS), mood (POMS), and empathy (EQ). Facial behavior data were collected remotely with the iMotions Home Access System, measured using the AFFDEX automated facial affect coding algorithm, and processed with an R script to compute the metrics of interest. The study used a within-subject design (N = 49). Each participant completed a positive condition, primed with a positive YouTube video, and a negative condition, primed with a news clip about COVID-19. In each condition, participants completed a Passive viewing task, a Narrative task, and three Motor tasks (COMMAND, IMITATE, and OPPOSITE). Key findings indicate that individuals exhibit more positive facial expressions during the positive condition (p < .001) and more facial expressions during the Narrative task than during Passive viewing (p < .001); that response times differ when imitating expressions compared with making OPPOSITE expressions (p < .001); and that EBR is significantly affected by task (p < .001).
The author retains copyright in this thesis. Any substantial copying or any other actions that exceed fair dealing or other exceptions in the Copyright Act require the permission of the author.