BACKGROUND
AI-driven medical chatbots allow patients to seek consultations without the constraints of time and location. Despite rapid advancements, and in some cases superior performance of AI medical chatbots compared with human physicians in specific domains, user hesitation persists. Furthermore, AI medical chatbots are still relatively new to patients in China. Understanding how patients' perceptions of AI chatbots versus human physicians influence the trust-building process is crucial for the broader adoption of this technology.
OBJECTIVE
This study aims to explore how users with different perceptions (robot-like vs human-like) build trust in AI medical chatbots. It also examines the moderating role of privacy concerns on the relationship between trust in technology and trust in AI.
METHODS
Partial least squares multigroup analysis (PLS-MGA) was conducted on data collected from 1547 participants, recruited both online and offline, to test the hypothesized model.
RESULTS
Perceived ease of use (β = .433, p < .001), privacy concerns (β = .079, p = .006), and brand reputation (β = .264, p < .001) were positively associated with trust in technology and trust in brand. Trust in brand (β = .151, p < .001) positively influenced trust in technology, and trust in technology significantly predicted trust in AI across all 3 dimensions: cognition (β = .20, p < .001), information (β = .19, p < .001), and behavior (β = .17, p < .001). Privacy concerns moderated the relationship between trust in technology and trust in AI in the cognition (β = .11, p = .001), information (β = .08, p < .001), and behavior (β = .09, p < .001) dimensions. No significant group differences were found for AI experience, health status, or perceived risk. However, the paths from perceived ease of use, trust in brand, and trust in technology (benevolence) were significantly stronger in the human-like perception group.
CONCLUSIONS
This study contributes both theoretically and practically by advancing the current understanding of the trust-building process in AI healthcare and by clarifying the moderating effect of privacy concerns on that process within the AI healthcare context.