When it comes to relationships, it’s not what you say, it’s how you say it — at least, according to one U professor’s research.
Brian Baucom, a professor of clinical psychology, helped create an algorithm that can predict a married couple’s relationship outcome based solely on the partners’ tones of voice. The algorithm correctly forecasts whether a relationship will improve or worsen nearly 79 percent of the time, beating the accuracy of human therapists. Baucom collaborated with project leaders Shrikanth Narayanan, Panayiotis Georgiou and others from the USC School of Engineering.
“Just the fact that you can go from these abstract vocal cues to really important psychological outcomes is promising,” Baucom said.
Researchers recorded couples’ conversations during marriage therapy sessions over two years and tracked each couple’s relationship status for five years.
“We find that the tone of voice is really important,” said Georgiou, who worked on the technical aspects of the study. “It’s not so much how people talk as individuals, but how what they say relates to what the other person is saying.”
Observing whether partners changed their vocal intensity, or responded to each other’s vocal changes, provided the most insight into a relationship’s outcome. The team designed a system that takes the recordings and breaks them down into acoustic categories, such as pitch and intensity.
“There are other things like jitter and shimmer — those are things we hear like the shakiness of someone’s voice,” Baucom said.
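The article doesn’t reproduce the team’s feature-extraction code, but the idea of breaking a recording into pitch and intensity tracks is straightforward to sketch. Below is a minimal, hypothetical illustration using the open-source librosa library; the file path, frequency bounds and summary statistics are assumptions, not the study’s actual pipeline.

```python
# Minimal sketch of per-speaker acoustic feature extraction using
# the open-source librosa library. File path, frequency bounds and
# summary statistics are illustrative assumptions, not the
# researchers' published pipeline.
import numpy as np
import librosa

def acoustic_features(wav_path):
    y, sr = librosa.load(wav_path, sr=None)  # load one speaker's audio
    # Pitch track: fundamental frequency per frame (NaN where unvoiced)
    f0, voiced_flag, voiced_prob = librosa.pyin(y, fmin=65.0, fmax=600.0, sr=sr)
    # Intensity track: root-mean-square energy per frame
    rms = librosa.feature.rms(y=y)[0]
    return {
        "pitch_mean": float(np.nanmean(f0)),
        "pitch_var": float(np.nanvar(f0)),    # how much pitch moves around
        "intensity_mean": float(rms.mean()),
        "intensity_var": float(rms.var()),
        # Jitter and shimmer (the cycle-to-cycle "shakiness" Baucom
        # mentions) need a dedicated tool such as Praat and are
        # omitted from this sketch.
    }
```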
Baucom said the acoustic frequencies themselves are not “inherently linked to theory of what makes relationships work or not work.” Rather, the team simply tracked all the acoustic information and let the program find whatever connections it could to relationship outcomes.
After the algorithm was refined, the researchers compared its findings to experts’ analyses of the same conversations using traditional relationship theory. The experts broke each conversation down into qualities, such as acceptance or criticism, and evaluated the couple’s relationship health from there. The acoustic system’s predictions proved more accurate than the experts’ assessments.
Baucom compared the system’s predictions to Pandora’s internet radio recommendations. Just as Pandora guesses which songs you’d like based on other listeners’ input, the algorithm guesses a couple’s future based on other couples with similar acoustic characteristics. Its predictions are usually accurate, but it isn’t quite ready to be a relationship fortune-teller.
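The Pandora analogy maps naturally onto similarity-based classification: score a new couple by looking at the couples with the most similar acoustic profiles. The sketch below uses a k-nearest-neighbors classifier on made-up data purely to illustrate that idea; it is not the team’s published model, and every number in it is a placeholder.

```python
# Illustration of similarity-based prediction with a k-nearest-
# neighbors classifier on toy data. This is NOT the team's
# published model; it only mirrors the Pandora-style analogy.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Toy stand-in: one row of acoustic features per couple (e.g., the
# pitch/intensity statistics sketched above); label 1 = improved,
# 0 = worsened.
X = rng.normal(size=(100, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
model.fit(X_train, y_train)
# Each prediction is a vote among the five most acoustically
# similar couples in the training set.
print(model.score(X_test, y_test))  # fraction of held-out couples classified correctly
```

Standardizing the features first matters here because nearest-neighbor distances are otherwise dominated by whichever feature happens to have the largest raw scale.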
The study was part of the group’s larger project to see what information machines can learn from humans. Previous studies looked at body language or tracked key words throughout conversations.
“Blame is highly related to people using the word ‘you,’” Georgiou said, as an example. “If somebody says to their partner, ‘it’s your fault,’ ‘you didn’t take the trash out,’ ‘you’re always late coming back,’ it throws the blame on the partner.”
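A toy version of that kind of keyword tracking might simply count second-person words in a transcript. The word list and scoring below are illustrative guesses, not the study’s actual lexicon or method.

```python
# Toy version of keyword tracking: the fraction of words in a
# transcript that point at the partner. The word list is an
# illustrative guess, not the study's lexicon.
import re

BLAME_WORDS = {"you", "your", "you're", "yours"}

def blame_score(transcript):
    words = re.findall(r"[a-z']+", transcript.lower())
    hits = sum(w in BLAME_WORDS for w in words)
    return hits / max(len(words), 1)

print(blame_score("It's your fault, you didn't take the trash out."))
# -> roughly 0.22: two of nine words target the partner
```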
The team plans to keep combining acoustic information, body language and other factors to make the predictions more accurate. Baucom is currently planning a study at the U for the U.S. Department of Defense that will use similar technology to detect an individual’s risk of suicide.
Researchers from UCLA and the University of Washington recruited more than 100 couples for the study, funded by a grant from the National Science Foundation. The group published their findings in Proceedings of Interspeech in September 2015.
@mbatman72