“We can manage what we measure, but what we mostly measure are things like money or speed,” Nduka says. “What we can’t really measure is quality. And quality is about emotions. And emotions can be sensed most sensitively with expressions.”
AI Vision
People have long asked whether AI can truly know how we feel, and most of the answers come down to, well, probably not. Even without advanced cameras and AI smarts, reading emotions is tricky.
“Gauging emotion through facial expressions is kind of somewhat debatable,” says Andrew McStay, a professor and director of the Emotional AI Lab at Bangor University in the UK. McStay says that even if the company were using AI to “smooth out” the data collected by the sensors to make it more usable, he’s not convinced it can actually read emotions with accuracy. “I just think there are fundamental flaws and fundamental problems with it.”
Cultural differences also inform how people display emotion. One person’s smile might signal congeniality or joy, while another’s might be a nervous expression of fear. That kind of signaling can vary widely from culture to culture. How emotions register on the face can also fluctuate with neurodivergence, though Emteq says it wants to help neurodivergent users navigate those kinds of awkward social interactions.
Strand says Emteq is trying to take all of these factors into account, hence the pursuit of more and more data. Emteq is also adamant that its use cases will be wholly vetted and overseen by health care providers or practitioners. The idea is that the tech would be used by therapists, doctors, or dietary consultants, ensuring that all the data collected straight off your face isn’t used for nefarious purposes.
“You’ve got to be thoughtful about how you deliver information, which is why we have experts in the loop. At least right now,” Strand says. “The data is valuable regardless because it empowers whoever is making the assessment to give good advice. Then it’s a question of what is that advice, and what’s appropriate for that person in their journey. On the mental health side, that’s especially important.”
Strand envisions therapy sessions where instead of a patient coming in and being encouraged to share details about stressful situations or anxious moments, the therapist might already have a readout of their emotional state over the past week and be able to point out problem areas and inquire about them.
Nearsighted
Regardless of how good Emteq’s smart glasses are, they’re going to have to compete with the bigwigs already selling wearable tech with far wider use cases. People might not be interested in sporting a bulky-ish pair of glasses if all they can do is scan your face and look at your food. It’s not far-fetched to imagine these inward-facing sensors being incorporated into something more feature-rich, like Meta’s Ray-Ban smart glasses.
“This has always been kind of the way with these kinds of products,” McStay says. “These things often start with health, and then quickly they kind of get built out into something which is much more marketing oriented.”