Abstract
Background
Recent advances in large language models (LLMs) have created new ways to support radiological diagnostics. While both open-source and proprietary LLMs can address privacy concerns through local or cloud deployment, open-source models offer advantages in continuity of access, independence from commercial update cycles, and potentially lower costs.

Purpose
To evaluate the diagnostic performance of open-source LLMs on challenging radiological cases across multiple subspecialties.

Methods
We evaluated eleven state-of-the-art open-source LLMs using clinical and imaging descriptions from 4,049 case reports in the Eurorad library. Cases spanned all radiological subspecialties; cases that explicitly mentioned the correct diagnosis in the case description were excluded. LLMs provided differential diagnoses based on clinical history and imaging findings. A response was considered correct if the true diagnosis appeared among the model's top three suggestions. Llama-3-70B was used to judge LLM responses, and its judging accuracy was validated against radiologist ratings in a subset of cases (n = 140); confidence intervals were adjusted based on this validation. Models were further tested on 60 non-public brain MRI cases from a tertiary hospital to assess generalizability.

Results
Llama-3-70B demonstrated the best performance (75.1 ± 1.7% correct), followed by Gemma-2-27B (63.9 ± 1.8%) and Mixtral-8x-7B (61.5 ± 1.8%). Performance varied across subspecialties, with the highest mean accuracy across models in genital (female) imaging (59.7 ± 2.7%) and the lowest in musculoskeletal imaging (47.1 ± 1.5%). Llama-3-70B's judging accuracy was 87.8% (123/140; 95% CI: 82–93%) relative to radiologist ratings. Results were similar on the non-public dataset, where Llama-3-70B (71.7 ± 14.1%), Gemma-2-27B (53.3 ± 15.1%), and Mixtral-8x-7B (51.7 ± 15.1%) again emerged as the top models.

Conclusion
Several open-source LLMs showed promising performance in identifying the correct diagnosis from case descriptions in the Eurorad library, highlighting their potential as decision-support tools for radiological differential diagnosis in challenging, real-world cases.
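Purely as an illustration of the evaluation protocol summarized above (not the authors' code), the sketch below shows how a top-3 differential request and an LLM-as-judge check could be wired together. The helper query_llm, the prompts, and the case field names are all assumptions; in practice this would be connected to a local inference backend.

def query_llm(model: str, prompt: str) -> str:
    """Placeholder: send `prompt` to `model` and return its text response."""
    raise NotImplementedError("wire this to a local inference backend, e.g. a vLLM or llama.cpp server")

def get_top3_differentials(model: str, history: str, findings: str) -> str:
    # Ask the candidate model for its three most likely diagnoses.
    prompt = (
        "Clinical history:\n" + history + "\n\n"
        "Imaging findings:\n" + findings + "\n\n"
        "List the three most likely diagnoses, most likely first."
    )
    return query_llm(model, prompt)

def judged_correct(judge_model: str, candidate_answer: str, true_diagnosis: str) -> bool:
    # Ask the judge model whether any suggested diagnosis matches the reference.
    prompt = (
        "Reference diagnosis: " + true_diagnosis + "\n"
        "Candidate differential diagnoses:\n" + candidate_answer + "\n\n"
        "Does any candidate diagnosis match the reference diagnosis? Answer 'yes' or 'no'."
    )
    return query_llm(judge_model, prompt).strip().lower().startswith("yes")

def accuracy(model: str, judge_model: str, cases: list[dict]) -> float:
    # Each case dict is assumed to hold 'history', 'findings', and 'diagnosis'.
    hits = sum(
        judged_correct(
            judge_model,
            get_top3_differentials(model, c["history"], c["findings"]),
            c["diagnosis"],
        )
        for c in cases
    )
    return hits / len(cases)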
Publisher
Cold Spring Harbor Laboratory