The app, built into a fitness tracker for this research, collects physical and speech data to analyze the overall tone of a story in real time. Using artificial intelligence, the app can also identify which parts of the conversation were happy or sad, tracking emotional changes in five-second intervals.
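The team's code isn't described in detail, but the five-second scoring it mentions suggests a simple windowed pipeline. As a rough sketch only: the `extract_features` and `classify` callables below are hypothetical stand-ins for the app's feature extractor and trained network, not the actual implementation.

```python
# Hypothetical sketch: score a recording in five-second windows,
# as the article describes. Not the research team's actual code.
WINDOW_SECONDS = 5

def score_conversation(audio, sample_rate, extract_features, classify):
    """Slice a recording into five-second windows and label each one.

    `extract_features` and `classify` stand in for the app's (unpublished)
    feature extractor and trained neural network; both are assumptions here.
    Returns (start_time_in_seconds, label) pairs.
    """
    window = WINDOW_SECONDS * sample_rate
    labels = []
    for start in range(0, len(audio) - window + 1, window):
        segment = audio[start:start + window]
        labels.append((start / sample_rate, classify(extract_features(segment))))
    return labels
```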
In the research, participants were asked to wear a Samsung Simband with the app installed and tell a story. The band also monitored physiological changes, such as increases in skin temperature and heart rate, along with movements like arm-waving or fidgeting. Overall, the neural networks were able to determine tone with 83 percent accuracy, though it is unclear whether the research has been peer-reviewed.
Generally, the AI labeled stretches of speech with long pauses or monotonous vocal tones as sad, while more varied speech patterns were categorized as happy. The team hopes to label more complex emotions soon.
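To make those two cues concrete, here is a toy version: long pauses detected as a high fraction of low-energy frames, and monotony as low pitch variance. The thresholds are illustrative guesses, not values from the study, and the real system used a trained neural network rather than hand-set rules.

```python
import numpy as np

def pause_ratio(energy, silence_threshold=0.01):
    """Fraction of frames whose energy falls below a silence threshold.

    `energy` is a per-frame loudness array; the threshold is an
    illustrative guess, not a value from the study.
    """
    return float(np.mean(energy < silence_threshold))

def is_monotonous(pitch_hz, variance_threshold=200.0):
    """True when voiced pitch varies little across the window (toy 'sad' cue)."""
    voiced = pitch_hz[pitch_hz > 0]  # ignore unvoiced/silent frames
    return voiced.size > 0 and float(np.var(voiced)) < variance_threshold

def toy_label(energy, pitch_hz):
    """Combine the two cues the article mentions into a crude happy/sad guess."""
    if pause_ratio(energy) > 0.3 or is_monotonous(pitch_hz):
        return "sad"
    return "happy"
```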
“Imagine if, at the end of a conversation, you could rewind it and see the moments when the people around you felt the most anxious,” said graduate student Tuka Alhanai, who is part of the research team. The product could be used to help those with anxiety or conditions like Asperger’s or autism. “Our work is a step in this direction, suggesting that we may not be that far away from a world where people can have an AI social coach right in their pocket.”
The research is an ongoing effort from MIT’s Computer Science and Artificial Intelligence Laboratory to study emotion detection. Last fall, the team built a device to identify human emotions using wireless signals.