Experts are wary of Apple’s AI research to detect mood

While Apple is reportedly working on AI technology capable of detecting mental health states and emotions, some are skeptical.

It is still unclear and unproven whether the AI is reliable enough to produce clear diagnoses, and uncertain how such an “emotional AI” would be used in the field, according to Jorge Barraza, assistant professor of the practice of psychology at the University of Southern California and technical director of Immersion, a technology vendor in the neuroscience field.

“When we infer things from emotional AI at the macro level – which means we tend to see patterns at the macro level – at the individual level, it starts to get a little more questionable,” said Barraza.

Outside of a social context, “we don’t know how much [emotion] can enable us to understand what people’s psychological experiences are,” he added. “Different types of expressions or emotions can have different meanings, whether in a social context or not.”

The research apparently grew out of an Apple-sponsored joint research project with UCLA that the university first made public in 2020, according to The Wall Street Journal.

Apple and UCLA researchers are looking to create algorithms that can use digital signals to detect depression or anxiety. The data points they use include facial recognition, sleep patterns, typing behavior, and vital signs.
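For illustration only (this is not the Apple-UCLA team’s code), the short Python sketch below shows how heterogeneous signals of this kind, such as sleep, typing speed, heart rate, and facial or voice scores, could be combined into a single feature vector for a simple screening model. Every feature, label rule, and number in it is an invented assumption.

```python
# Minimal sketch (not Apple's actual pipeline): combining multi-modal
# signals -- sleep, typing, heart rate, facial/voice scores -- into one
# feature vector for a screening classifier. All features and data are
# synthetic, illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500

# Hypothetical per-participant daily aggregates.
features = np.column_stack([
    rng.normal(7.0, 1.5, n),    # hours of sleep
    rng.normal(45.0, 10.0, n),  # typing speed (words per minute)
    rng.normal(70.0, 8.0, n),   # resting heart rate (bpm)
    rng.uniform(0, 1, n),       # facial-expression negativity score
    rng.uniform(0, 1, n),       # voice-tone negativity score
])
# Synthetic labels: 1 = elevated risk (toy rule, not a clinical criterion).
labels = ((features[:, 0] < 6.0) & (features[:, 3] > 0.5)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0, stratify=labels)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
probs = model.predict_proba(X_test)[:, 1]
print(f"ROC AUC on held-out data: {roc_auc_score(y_test, probs):.2f}")
```

In a real study, the labels would come from clinical assessments rather than a toy rule; the point is only that several device-derived signals can feed a single model.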

Mental health research

Researchers are using Apple devices, including the iPhone and Apple Watch, along with the Beddit Sleep Monitor. The project started with 150 participants in 2020 and is expected to involve around 3,000 people by the end of 2023.

Neither Apple nor UCLA responded to requests for comment on the research project.

Researchers are not just trying to understand a person’s mental health; they are also seeking to determine whether a person is suffering from anxiety or depression.

Research based on personal devices, while unproven, could yield useful tools, Barraza said.

“I see this technology as very promising,” he said. “Not in terms of diagnosing things like depression or anxiety, but at least serving in a directional way to give people an awareness of their day-to-day lives.”

Apple’s interest in emotional AI

Apple’s interest in emotional AI started in 2016, when it bought Emotient, a vendor that uses AI to read emotions.

Emotient is one of a growing number of vendors in the emotional AI field. Meanwhile, companies are using similar systems that rely on AI and machine learning to measure employee engagement and assess job candidates.

Apple’s use of the technology is different from what others have done before, as researchers focus on multiple data points, Barraza said. He said that usually AI researchers focus on facial recognition (capturing expressions such as smiles and frowns) or voice analysis (tone and words used). Instead, researchers working with Apple and UCLA are looking at both facial recognition and voice analysis, as well as heart rate, sleep patterns, and more.

“We are talking about a big data set,” he said. “We aren’t just relying on one piece of information to tell us how people experience [their emotions].”

Emotions differ depending on the social context

While technology can be useful in educating people about their emotional well-being, Barraza said the approach should always be viewed with skepticism, especially if the data is used to predict how one feels.

Despite the intentions of Apple or whoever owns the technology, it may not be used in the way intended. Instead, it could be used in a way that is detrimental to an employee, or the data could be misinterpreted.

Culture and emotions

Another challenge of emotional AI is how to deal with the way different emotions are viewed or perceived in different cultures.

“What might be different in certain cultures or certain subgroups or certain ages… this nuance is so difficult to detect,” said R “Ray” Wang, founder and principal analyst of Constellation Research.

Wang said the challenge for any business trying to develop emotional AI is knowing when the data is good enough.

Researchers need to determine the level of precision they want in order to avoid false positives and false negatives, he said. They should investigate where bias and spurious patterns might exist in the dataset. This could mean taking into account cultural differences, accents, or even racial differences that could affect how a person’s emotional well-being is interpreted.
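To make that kind of check concrete, here is a minimal Python sketch (with made-up labels and group names, not data from any real study) that compares false-positive and false-negative rates across two hypothetical demographic subgroups, the sort of disparity Wang says researchers need to look for.

```python
# Illustrative sketch of a subgroup error check: comparing false-positive
# and false-negative rates across groups (e.g., cultures or age bands).
# Group names, labels, and predictions are made-up placeholders.
import numpy as np

def error_rates(y_true, y_pred):
    """Return (false positive rate, false negative rate)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    negatives = np.sum(y_true == 0)
    positives = np.sum(y_true == 1)
    return fp / max(negatives, 1), fn / max(positives, 1)

# Hypothetical true labels and model outputs for two subgroups.
groups = {
    "group_a": ([0, 0, 1, 1, 0, 1, 0, 0], [0, 1, 1, 1, 0, 0, 0, 0]),
    "group_b": ([0, 1, 1, 0, 0, 0, 1, 1], [1, 1, 0, 1, 0, 0, 1, 1]),
}

for name, (y_true, y_pred) in groups.items():
    fpr, fnr = error_rates(y_true, y_pred)
    print(f"{name}: false positive rate={fpr:.2f}, false negative rate={fnr:.2f}")
```

A gap between the groups’ error rates, as in this toy example, would be the kind of signal that the model treats some populations less reliably than others.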

However, as one of the largest mobile device manufacturers in the world, Apple may stand a chance of making the technology work thanks to its vast user base.

“We are at the beginning of emotional AI,” Wang said. “It’s going to take off over time. But if you release it too soon and lose people’s trust, that’s the risk.”
