Scientists develop AI app to detect depression by facial expressions

Scientists have unveiled an AI-powered app designed to detect mood and early signs of depression, offering a potential early-warning tool for people managing mental health conditions.

Known as MoodCapture, the smartphone application, developed by Dartmouth researchers, uses artificial intelligence and facial-image processing software to analyse facial expressions and environmental cues during regular phone use.

The app's approach is to use the smartphone's front camera to capture the user's facial expressions and surroundings, then evaluate the images for clinical indicators associated with depression.

MoodCapture demonstrated 75 per cent accuracy in identifying early symptoms of depression in a study involving 177 individuals diagnosed with major depressive disorder.

Researchers employed image-analysis AI to correlate self-reports of feeling depressed with specific facial expressions, eye movements, head positioning, muscle rigidity, and environmental features like lighting, colors, photo locations, and the presence of others in the image.

The app operates in real time, analysing a sequence of images each time the user unlocks their phone. Over time, MoodCapture establishes personalised connections between expressions and background details, enabling it to identify user-specific features indicative of the onset of depression.
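The per-user pattern learning described above can be sketched in code. The following is a minimal illustration, not the Dartmouth team's actual implementation: it assumes image-derived features such as gaze direction, head tilt, ambient brightness, and whether others are present (all hypothetical names), and fits a small per-user logistic-regression model online, one update per unlock event, against the user's own self-reported mood.

```python
# Hypothetical sketch of per-user, online mood modelling -- NOT the
# MoodCapture implementation. Feature names and values are assumptions.
import math
import random

FEATURES = ["gaze_down", "head_tilt", "brightness", "alone"]  # assumed

class MoodModel:
    """Tiny per-user logistic regression, updated at each unlock."""

    def __init__(self, lr=0.1):
        self.lr = lr                      # learning rate for SGD
        self.w = [0.0] * len(FEATURES)    # one weight per feature
        self.b = 0.0                      # bias term

    def predict(self, x):
        """Probability this capture matches a depressed self-report."""
        z = self.b + sum(wi * xi for wi, xi in zip(self.w, x))
        return 1.0 / (1.0 + math.exp(-z))

    def update(self, x, label):
        """One SGD step against the user's self-report (0 or 1)."""
        err = self.predict(x) - label
        self.w = [wi - self.lr * err * xi for wi, xi in zip(self.w, x)]
        self.b -= self.lr * err

# Toy usage: simulate unlock events paired with self-reports.
random.seed(0)
model = MoodModel()
for _ in range(500):
    depressed = random.random() < 0.5
    # Assumption for the toy data: depressed captures show a lowered
    # gaze and dimmer lighting on average.
    x = [random.gauss(0.8 if depressed else 0.2, 0.1),  # gaze_down
         random.gauss(0.5, 0.1),                        # head_tilt
         random.gauss(0.3 if depressed else 0.7, 0.1),  # brightness
         1.0 if random.random() < 0.6 else 0.0]         # alone
    model.update(x, 1.0 if depressed else 0.0)

# A lowered-gaze, dim-light capture now scores higher than a bright,
# direct-gaze one -- the "user-specific features" idea in miniature.
high = model.predict([0.9, 0.5, 0.2, 1.0])
low = model.predict([0.1, 0.5, 0.8, 0.0])
```

Because the model is trained only on one user's own captures and self-reports, the learned weights are personal to that user, which mirrors the article's point that MoodCapture builds user-specific associations rather than a single universal detector.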

Ideally, an AI application like MoodCapture would recommend proactive steps, such as spending time outdoors or reaching out to a friend, rather than directly notifying an individual about the possibility of entering a depressive state.

These results suggest that, with further development, MoodCapture could be available to the public within the next five years.

The app's ability to proactively detect early signs of depression and monitor mental health over time could change how individuals manage and address depressive symptoms.

ePaper - Nawaiwaqt