TY - GEN
T1 - How Can Client Motivational Language Inform Psychotherapy Agents?
AU - Hoang, Van
AU - Rogers, Eoin
AU - Ross, Robert
N1 - Publisher Copyright:
©2024 Association for Computational Linguistics.
PY - 2024
Y1 - 2024
N2 - Within Motivational Interviewing (MI), client utterances are coded as for or against a certain behaviour change, along with commitment strength; this coding is essential to ensure therapists soften, rather than persist in, goal-related actions in the face of resistance. Prior work on MI agents has been scripted or semi-scripted, limiting users’ natural language expression. With the aim of automating MI interactions, we propose and explore the task of automatically identifying client motivational language. Employing Large Language Models (LLMs), we compare in-context learning (ICL) and instruction fine-tuning (IFT) with varying training sizes for this identification task. Our experiments show that both approaches can learn under low-resource settings. Our results demonstrate that IFT, though cheaper, is more stable across prompt choices and yields better performance with more data. Given the detected motivation, we further present an approach to analysing therapists’ strategies for balancing rapport-building with clients against advancing the treatment plan. A framework for MI agents is developed using insights from the data and the psychotherapy literature.
UR - http://www.scopus.com/inward/record.url?scp=85189750032&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85189750032
T3 - CLPsych 2024 - 9th Workshop on Computational Linguistics and Clinical Psychology, Proceedings of the Workshop
SP - 23
EP - 40
BT - CLPsych 2024 - 9th Workshop on Computational Linguistics and Clinical Psychology, Proceedings of the Workshop
A2 - Yates, Andrew
A2 - Desmet, Bart
A2 - Prud'hommeaux, Emily
A2 - Zirikly, Ayah
A2 - Bedrick, Steven
A2 - MacAvaney, Sean
A2 - Bar, Kfir
A2 - Ireland, Molly
A2 - Ophir, Yaakov
PB - Association for Computational Linguistics (ACL)
T2 - 9th Workshop on Computational Linguistics and Clinical Psychology, CLPsych 2024
Y2 - 21 March 2024
ER -