We know smart glasses can play podcasts and put an AI assistant in your ear, but what if they knew what you were feeling? That’s the idea behind Emteq Labs’ Sense glasses. They’re not on the market yet, but the end goal is a lightweight pair of specs equipped with sensors that read minute changes in the wearer’s facial muscles, detecting real-time mood shifts to unlock insights into health, eating habits, and more.
Emteq is one of a growing number of companies in the field of “affective computing,” technology designed to recognize, interpret, process, and/or simulate human emotions. For good or ill, the future is likely to be packed with the stuff.
How do emotion-sensing glasses work?
The technology behind Emteq’s emotion-tracking glasses is sophisticated, but the concept is straightforward: the glasses’ inward-facing sensors monitor the electrical activity of your zygomaticus muscle group (the smiling muscles) and your corrugator supercilii group (the muscles that knit your brow), combine that information with heart-rate and head-movement data, and put it all together into a real-time record of your emotions you can access on your smartphone.
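Emteq hasn’t published the details of its processing pipeline, but the basic signal chain in facial-EMG research is well documented: estimate each muscle’s activation from its electrical signal (root-mean-square amplitude over a short window is a standard feature), then compare channels to infer the expression. Here’s a deliberately toy Python sketch of that idea; every name, number, and threshold below is hypothetical, not Emteq’s code:

```python
import numpy as np

def rms(window: np.ndarray) -> float:
    """Root-mean-square amplitude of one short EMG window,
    a standard measure of how active a muscle is."""
    return float(np.sqrt(np.mean(window ** 2)))

def estimate_valence(zygomaticus: np.ndarray, corrugator: np.ndarray) -> float:
    """Crude positive/negative score: smiling activity minus frowning activity.

    A real system would calibrate per wearer and fold in heart-rate and
    head-movement features; this shows only the shape of the computation.
    """
    return rms(zygomaticus) - rms(corrugator)

# Simulated 100-sample EMG windows standing in for a live sensor stream.
rng = np.random.default_rng(0)
smile_burst = rng.normal(0.0, 0.8, 100)   # busy smiling muscle
relaxed_brow = rng.normal(0.0, 0.1, 100)  # quiet brow muscle

print(estimate_valence(smile_burst, relaxed_brow))  # positive: smile-like
```

The hard part isn’t this arithmetic; it’s turning a stream of scores like that into labels a person would actually agree with.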
That’s the idea, anyway. Whether any machine can accurately interpret everyone’s emotions from facial muscle movement is a complex question. Research indicates that basic emotions like happiness, sadness, surprise, and disgust are expressed facially in similar ways across cultures, but cultural influences and individual differences affect how we display emotions. Some people have poker faces. Some people laugh when they’re scared. And anyone can smile when they’re feeling blue.
Use cases for emotion-sensing glasses
I recently spoke with Emteq CEO Steen Strand and saw a demo. The Sense prototype seems to work as advertised, and it all fits into a normal-looking pair of eyeglass frames. The eventual vision for the technology spans everything from virtual meetings to mental health monitoring to dietary tracking.
Making virtual meetings more “natural”
“When we’re in a conversation, you want to see my face, I want to see yours. We can react to each other,” Strand said. “If you want to do that virtually, you have to know what my face is doing.” The idea is that expression-sensing glasses could make avatars and virtual interactions more “real” by putting what’s on your real face onto your digital face.
For some kinds of virtual conversations, this would be amazing, but what if I want to not look bored during a meeting? Either way, existing VR technology can do something similar, but according to Strand, Emteq’s tech provides a better solution. “A lot of existing technology, particularly in VR, is just more heavy on power and computing,” Strand said. “We’re using these very lightweight, low-power sensors that just look at a little tiny part of your face, and from that we can infer what your whole face is doing.”
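Strand didn’t spell out how that inference works, but the claim maps onto a familiar pattern: a small, cheap model trained to predict full-face animation parameters (blendshape weights, in avatar terms) from a handful of sensor channels. A minimal sketch under that assumption follows; the linear mapping, channel count, and blendshape names are all made up for illustration, and a real model would be learned from per-user calibration data:

```python
import numpy as np

N_SENSORS = 6  # hypothetical number of channels around the frames
BLENDSHAPES = ["smile", "frown", "brow_raise", "jaw_open"]

# Stand-in for a learned sensor-to-blendshape mapping; random values
# here just make the sketch runnable.
rng = np.random.default_rng(1)
W = rng.uniform(-1.0, 1.0, size=(len(BLENDSHAPES), N_SENSORS))

def sensors_to_blendshapes(readings: np.ndarray) -> dict[str, float]:
    """Turn a few sensor readings into a weight for every avatar blendshape."""
    weights = np.clip(W @ readings, 0.0, 1.0)  # blendshape weights live in [0, 1]
    return {name: round(float(w), 2) for name, w in zip(BLENDSHAPES, weights)}

print(sensors_to_blendshapes(rng.uniform(0.0, 1.0, N_SENSORS)))
```

If the mapping really is that small, Strand’s power argument makes sense: one tiny matrix multiply per frame costs far less than running cameras and a computer-vision model.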
Mental health
A constant monitor of real-life emotions could provide an additional diagnostic tool for mental health professionals, according to Strand. “The gold standard for diagnosing depression right now is a questionnaire,” he said. “Not only does that have inherent bias, it’s also a sliver in time. How you feel at one moment could differ from how you feel an hour later.” A constant record of emotions, presumably, would reveal more about one’s mental state.
For people who have trouble knowing what emotion their face is displaying, whether because of a physical condition like facial paralysis or a neurodevelopmental difference like autism, emotion-sensing glasses could provide a window into a sense most of us take for granted.