Health Law Highlights

Can ChatGPT Be Trusted to Provide Medication Information to Patients?

Summary of article from Drug Topics, by Lauren Massaro:

A study published in the Journal of the American Pharmacists Association found that while the AI chatbot ChatGPT can provide correct answers to common medication questions, it does not always provide complete information. The study evaluated ChatGPT's responses to questions about the top 20 drugs and found that 92.5% of responses were completely correct, but only 80.8% were complete. The study also found inconsistencies in the chatbot's responses when the same questions were asked again two weeks later. Despite these limitations, the authors believe AI chatbots have potential as healthcare tools, but they stress the importance of patients validating information with healthcare professionals. They also highlight the need for robust regulatory frameworks, fact-checking mechanisms, and increased public education about health literacy and the responsible use of AI.