According to some experts, an unexpected new trend may be forming as companies test artificial intelligence technology in their products: the use of models called "emotional artificial intelligence," which would help newly developed robots better understand human emotions.
According to the new Enterprise SaaS Emerging Tech Research report from PitchBook, the technology uses sensors to capture visual, audio, and other content inputs, combined with machine learning and psychology, to identify human emotions.
Currently, some major cloud service providers offer services that give developers access to emotional AI capabilities, such as the Emotion API in Microsoft Azure's Cognitive Services or Amazon Web Services' Rekognition.
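To illustrate what these services return, here is a minimal sketch that extracts the dominant emotion from an Amazon Rekognition DetectFaces-style response. The response dict below is a mocked sample following Rekognition's documented shape; a real call would use the boto3 `rekognition` client with image bytes and AWS credentials.

```python
# Sketch: picking the highest-confidence emotion label from a
# Rekognition DetectFaces-style response. The sample data below is
# mocked; a real integration would call boto3's detect_faces with
# Attributes=["ALL"] and valid AWS credentials.

def dominant_emotion(face_details: list) -> str:
    """Return the highest-confidence emotion label across detected faces."""
    best_type, best_conf = "UNKNOWN", 0.0
    for face in face_details:
        for emotion in face.get("Emotions", []):
            if emotion["Confidence"] > best_conf:
                best_type, best_conf = emotion["Type"], emotion["Confidence"]
    return best_type

# Mocked response fragment in Rekognition's documented format.
sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "HAPPY", "Confidence": 92.1},
                {"Type": "CALM", "Confidence": 5.3},
                {"Type": "CONFUSED", "Confidence": 1.2},
            ]
        }
    ]
}

print(dominant_emotion(sample_response["FaceDetails"]))  # HAPPY
```

In practice such services return a full list of candidate emotions with confidence scores, leaving it to the application to decide how to act on them.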
Ways to use emotional artificial intelligence

PitchBook senior analyst Derek Hernandez writes in this report:
"With the proliferation of artificial intelligence assistants and fully automated human-machine interactions, emotional AI promises human-like interpretations and responses. Cameras and microphones are an integral part of emotional AI hardware. They can be on a laptop, on a phone, or placed individually in a physical space. Wearable hardware is also likely to provide another way to use emotional AI."
The report argues that if businesses deploy AI assistants for their managers and employees, and AI chatbots become corporate representatives in customer service, the technology will face a basic question: how can it function properly if it cannot tell the difference between an angry "What do you mean?" and a confused one?
However, as the report points out, artificial intelligence regulations such as the EU's AI Act, which prohibits the use of emotion-recognition systems in contexts such as education, may stand in the way of this idea.



