Support groups have long been a sanctuary for individuals seeking connection and reassurance from peers facing similar challenges. Recent research from the University of Kansas and the University of Southern California delves into the nonverbal behaviors that signal connections between participants in virtual support group settings. This study, presented at the 27th International Conference on Multimodal Interaction, analyzes how dyadic alliance—the bond between two individuals—can be measured through observed behaviors.
The study involved data from 18 support groups comprising a total of 96 participants. Researchers assessed the sense of connection participants felt through surveys administered before and after the group sessions. Using computational algorithms to track verbal and nonverbal communication, the team explored whether machine learning can measure the dynamics of these alliances, which may significantly influence mental health interventions.
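To make that approach concrete, the sketch below shows one way such an analysis could be set up: per-participant behavioral features are related to survey-based alliance scores with a simple regression. This is a minimal illustration rather than the authors' pipeline; the feature names and the synthetic data are placeholders.

```python
# Illustrative sketch only (not the study's actual code): relate aggregated
# behavioral features to a survey-based alliance score with ridge regression.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 96  # participants across the 18 groups

# Hypothetical per-participant features aggregated over a session:
# [head nods/min, brow raises/min, mean smile intensity, pitch variation (Hz)]
X = rng.normal(size=(n, 4))

# Hypothetical alliance rating from the post-session survey (synthetic here).
y = 0.5 * X[:, 0] + 0.3 * X[:, 3] + rng.normal(scale=0.5, size=n)

model = Ridge(alpha=1.0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"Cross-validated R^2: {scores.mean():.2f}")
```

In a setup like this, the size of the cross-validated fit indicates how much of the variation in self-reported alliance the observed behaviors can account for.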
Demand for mental health support has surged in recent years, driven in part by the lingering effects of the COVID-19 pandemic. As professionals struggle to meet this growing need, many have turned to artificial intelligence as a potential tool. According to Yunwen Wang, assistant professor of journalism and mass communications at the University of Kansas and one of the study’s authors, the project was initiated in response to increasing burnout among mental health professionals. Wang noted, “We kept talking about getting back to normal after COVID, but I felt like there is this lingering effect of COVID, and there are a lot of thoughts that we had about the opportunity of AI and how we can ethically leverage it.”
Participants in the study were recruited specifically for general anxiety support groups. Their mental health and emotional states were evaluated before and after the sessions, which were conducted via online video conferencing. A virtual conversational agent, embodied in a robot, facilitated these discussions; it was operated by a human who could intervene if necessary, creating a hybrid model of support.
Researchers analyzed recordings from the sessions, transcribing verbal exchanges and identifying nonverbal cues such as head nods, smiles, and other gestures. They also examined vocal characteristics such as pitch variation, along with the intensity of participants’ smiles. Findings indicated that both verbal and nonverbal behaviors play a critical role in establishing a sense of alliance. For instance, speakers felt more connected when listeners displayed frequent head nods and brow raises, while speakers who varied their pitch and gestures also reported a stronger sense of alliance.
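For readers curious how a vocal feature like pitch variation can be quantified, the sketch below tracks fundamental frequency with the pYIN algorithm (via the librosa library) and summarizes its spread. The synthetic tone stands in for a participant’s audio; none of this is drawn from the study’s actual tooling.

```python
# Illustrative sketch: quantify pitch variation for one stretch of audio.
import numpy as np
import librosa

sr = 16000
t = np.linspace(0, 3.0, 3 * sr, endpoint=False)

# Synthetic stand-in for speech: a tone whose pitch drifts around 150 Hz.
f0_true = 150 + 30 * np.sin(2 * np.pi * 0.5 * t)
y = np.sin(2 * np.pi * np.cumsum(f0_true) / sr).astype(np.float32)

# Track fundamental frequency with pYIN, then summarize its variability
# over the voiced frames as a single per-speaker feature.
f0, voiced, _ = librosa.pyin(y, fmin=80, fmax=300, sr=sr)
pitch_variation = np.nanstd(f0[voiced])
print(f"Pitch variation (Hz): {pitch_variation:.1f}")
```

Counts of cues such as head nods or brow raises would come from frame-level annotations of the video (manual or automated), and could be aggregated per participant in the same way before being related to the alliance surveys.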
These findings suggest that attending to both verbal and nonverbal communication could improve how alliance is measured in group settings. They build upon existing research indicating that strong alliances among participants can lead to improved overall group engagement and outcomes.
While the study opens the door for using AI to identify behavioral markers indicative of dyadic alliance, the researchers stress the importance of ethical considerations in mental health applications. Wang emphasizes that the study is not advocating for a complete replacement of human therapists with AI. Instead, it aims to understand how machine learning can augment human interactions: “The goal wasn’t to replace humans or to compare human versus AI-assisted mental health support facilitators.”
The ongoing research also tackles pressing questions regarding the ethical use of AI in mental health, particularly as the popularity of chatbots and AI-driven agents rises. Discussions about the regulatory landscape around AI therapy, especially in places like California, highlight concerns over trust, efficacy, and privacy. “We consider this work as one of the first steps to understand the boundaries of AI’s applications in mental health,” Wang stated.
As the team continues to investigate user trust in AI-driven support systems, they aim to clarify how much AI involvement users find acceptable in mental health contexts. Ultimately, this research may offer insights into improving mental health services by identifying when participants forge genuine connections in group settings. Wang concluded, “In support groups, it may be the human-to-human dynamic that’s really helpful for people to come together and share their experiences.”
For more information, refer to the work of Kevin Hyekang Joo and colleagues, titled “Multimodal Behavioral Characterization of Dyadic Alliance in Support Groups,” published in the Proceedings of the 27th International Conference on Multimodal Interaction.
