Scale, Engage, or Both?: Potential and Perils of Applying Large Language Models in Interview and Conversation-Based Research
Author(s)
Hwang, Angel Hsing-Chi; Aubin Le Quéré, Marianne; Schroeder, Hope; Cuevas, Alejandro; Dow, Steven; Kapania, Shivani; Rho, Eugenia; et al.
Publisher Policy
Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Terms of use
Metadata
Abstract
An increasing number of studies apply tools powered by large language models (LLMs) to interview and conversation-based research, one of the most commonly used research methods in CSCW. This panel invites the CSCW community to critically debate the role of LLMs in reshaping interview-based methods. We aim to explore how these tools might (1) address persistent challenges in conversation-based research, such as limited scalability and participant engagement, (2) introduce novel methodological possibilities, and (3) surface additional practical, technical, and ethical concerns. The panel discussion will be grounded in the panelists’ prior experience applying LLMs to their own interview and conversation-based research. We ask whether LLMs offer unique advantages for enhancing interview research beyond automating certain aspects of the research process. Through this discussion, we encourage researchers to reflect on how applying LLM tools may require rethinking research design, conversational protocols, and ethical practices.
Description
CSCW Companion ’25, Bergen, Norway
Date issued
2025-10-17
Department
Program in Media Arts and Sciences (Massachusetts Institute of Technology)
Publisher
ACM|Companion of the Computer-Supported Cooperative Work and Social Computing
Citation
Angel Hsing-Chi Hwang, Marianne Aubin Le Quéré, Hope Schroeder, Alejandro Cuevas, Steven Dow, Shivani Kapania, Eugenia Rho, et al. 2025. Scale, Engage, or Both?: Potential and Perils of Applying Large Language Models in Interview and Conversation-Based Research. In Companion of the Computer-Supported Cooperative Work and Social Computing (CSCW Companion ’25), October 18–22, 2025, Bergen, Norway. ACM, New York, NY, USA, 4 pages.
Version: Final published version
ISBN
979-8-4007-1480-1