According to some experts, an unexpected new trend in how companies build artificial intelligence into their products may be the use of so-called "emotional AI" models, which could help newly developed bots better understand human emotions.
According to PitchBook's new Enterprise SaaS Emerging Tech Research report, the technology uses sensors to capture visual, audio, and other inputs, combining them with machine learning and psychology to identify human emotions.
Currently, some major cloud providers offer developers access to emotion-AI capabilities, such as the Emotion API in Microsoft Azure Cognitive Services or Amazon Rekognition on AWS.
Ways to use emotional artificial intelligence
PitchBook senior analyst Derek Hernandez writes in the report:
“With the proliferation of artificial intelligence assistants and fully automated human-machine interactions, emotional AI promises more human-like interpretations and responses. Cameras and microphones are integral parts of emotional AI hardware. They can be on a laptop or a phone, or placed individually in a physical space. Wearable hardware is also likely to provide another way to use emotional AI.”
The report argues that if businesses deploy AI assistants for their managers and employees, and AI chatbots become front-line customer-service representatives, the technology must be able to tell the difference between an angry "What do you mean?" and a confused one; otherwise, how can it function properly?
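To make the report's angry-versus-confused example concrete, here is a deliberately crude toy sketch. The function name and keyword rules are invented for illustration; real emotional AI products use machine-learning models over audio and visual signals, not hand-written text rules like these.

```python
# Toy sketch: guessing the emotion behind an utterance from surface text cues.
# Purely illustrative; not how any commercial emotion-AI service works.

def classify_tone(utterance: str) -> str:
    """Guess 'angry' vs 'confused' for an utterance using crude text cues."""
    text = utterance.strip()
    # All-caps text or exclamation marks are treated here as anger cues.
    if text.isupper() or "!" in text:
        return "angry"
    # Hedging words or a plain question mark are treated as confusion cues.
    if text.endswith("?") or any(w in text.lower() for w in ("sorry", "not sure")):
        return "confused"
    return "neutral"

print(classify_tone("WHAT DO YOU MEAN?!"))        # anger cues: caps + "!"
print(classify_tone("Sorry, what do you mean?"))  # confusion cues
```

The same input words yield different labels depending on tone markers, which is exactly the distinction the report says emotion-aware systems would need to make from richer signals such as voice and facial expression.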
However, as the report points out, AI regulations such as the EU's AI Act, which prohibits the use of emotion-recognition systems in contexts such as education, could stand in the way of this idea.
RCO NEWS