Applying ChatGPT in Healthcare

June 15, 2023



ChatGPT, a powerful language model developed by OpenAI, has been praised as one of the most important innovations in decades. ChatGPT, and large language models and generative AI more broadly, have shown remarkable results across industries, including medicine. This is the first post in a multi-part blog series in which we will explore potential applications of ChatGPT in healthcare, such as virtual health assistants, patient support, health education, and language translation. We will also discuss the limitations and ethical considerations associated with using language models in healthcare.

What is ChatGPT?

ChatGPT is a large language model (LLM) based on the GPT-3.5 architecture (GPT stands for “Generative Pre-trained Transformer”). It is fine-tuned for human-like text interaction and can understand inputs and generate outputs using the context and history of the conversation. During training, ChatGPT is exposed to a massive amount of text from the Internet, from which it learns grammar and syntax and absorbs information about the world.
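To make the “context and history” point concrete, here is a minimal sketch of a multi-turn call, assuming the OpenAI Python SDK and the gpt-3.5-turbo model; the prompts and placeholder API key are illustrative only. The key detail is that the application resends the accumulated message history with every request, which is what gives the model its conversational memory.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; never hard-code real keys

# The full conversation so far is sent with every request --
# this history is how the model "remembers" earlier turns.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What are common causes of sinus congestion?"},
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=messages,
)

reply = response["choices"][0]["message"]["content"]
print(reply)

# Append the assistant's reply and the next user turn, then call the API
# again to continue the conversation with full context.
messages.append({"role": "assistant", "content": reply})
messages.append({"role": "user", "content": "How long do symptoms usually last?"})
```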

ChatGPT, developed by OpenAI, has shown remarkable results in many areas, including medicine, where it has even passed medical licensing exams. Other companies have recently released their own language models, such as Google’s Bard and Meta’s LLaMA.

Some of the largest and most innovative companies in the US have been quick to adopt ChatGPT. Salesforce is introducing ChatGPT to Slack in the form of “Einstein,” which can help draft responses, summarize threads, or surface recent news pertaining to the company. Instacart is implementing the tool to answer user questions and suggest recipes and shopping lists.

What can ChatGPT do in healthcare?

ChatGPT’s conversational AI format makes it well-suited for impactful use cases in healthcare. At GYANT, we see four high-value applications for large language models in healthcare.

Virtual Health Assistants: ChatGPT augments virtual assistants to provide information, answer questions, and summarize basic medical advice to patients. It can also help with the conversational elements of symptom checking, self-care instructions, and general health education.

Patient Support and Engagement: ChatGPT can be used to engage with patients, provide support, and answer common questions. It can enhance appointment scheduling and medication reminders and provide general health guidance.

Health Education: ChatGPT can source educational content on a wide range of health topics. It can provide explanations, answer queries, and deliver personalized educational materials to patients and the general public.

Language Translation: ChatGPT’s language capabilities can be valuable in healthcare settings with diverse patient populations. It can assist in translating medical information and facilitating communication between healthcare providers and patients who speak different languages.
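As a concrete illustration of the translation use case, the sketch below shows how a system prompt could steer the model toward translating patient-facing instructions. It assumes the OpenAI Python SDK; the helper function, prompt wording, and safeguards are our own illustrative choices, not a GYANT or OpenAI feature.

```python
import openai

def translate_instructions(text: str, target_language: str) -> str:
    """Hypothetical helper: translate patient-facing text via a system prompt."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {
                "role": "system",
                "content": (
                    f"You translate patient instructions into {target_language}. "
                    "Preserve medication names and dosages exactly; do not add "
                    "or remove medical content."
                ),
            },
            {"role": "user", "content": text},
        ],
        temperature=0,  # favor consistent wording over creative phrasing
    )
    return response["choices"][0]["message"]["content"]

# Example usage (output should still be reviewed by a qualified interpreter):
print(translate_instructions("Take one tablet twice daily with food.", "Spanish"))
```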

Limitations and Ethics of GPT-3.5 in Healthcare

First, it is important to understand the limitations of ChatGPT. There are several challenges to relying on LLMs to power healthcare applications:

  • Accuracy: the model is not guaranteed to give correct information, and OpenAI’s usage policies explicitly prohibit relying on it for medical advice.
  • Compliance: sending potentially sensitive patient data to a non-HIPAA-compliant cloud API creates regulatory risk.
  • Bias: the model may be biased toward the demographics that are most prevalent in its training data.
  • Legal issues: the LLM’s licensing may restrict its use for commercial and/or medical applications.
  • Data ownership: ownership of generated content may be subject to dispute.

The most problematic shortcoming is that it does not explicitly represent facts or structured information. Rather, it is optimized for generating plausible-sounding text, not answers that are guaranteed or verified to be factually accurate. As a result, it can confidently produce false information, as it did when we asked about the residency program at OSF Healthcare (one of GYANT’s clients).

The generated response looks great on the surface but gives the wrong information about the residency specialties and includes an inactive URL.

Another example is triaging symptoms. GPT-3.5 gives good general advice for sinus care.

However, a medical professional might recognize that this advice is incomplete. If a patient has sinus pain and congestion lasting longer than ten days, they should be seen by a provider, as this may indicate a bacterial infection that requires antibiotics.

In the sinus congestion example, a user asks for advice and ChatGPT returns a well-written, albeit incomplete, answer. The answer lacks pertinent information that the user did not know to ask for. Thus, it would not be wise to let an LLM give medical advice on its own.
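One way to mitigate this kind of gap is to keep hard clinical escalation rules outside the model, so that an incomplete LLM answer cannot override them. The sketch below is purely illustrative: the ten-day red-flag rule is taken from the example above, but the function names and logic are assumptions, not GYANT’s actual triage implementation.

```python
def needs_provider_visit(symptom_days: int, has_sinus_pain: bool) -> bool:
    """Illustrative red-flag rule: sinus pain plus congestion lasting more than
    ten days may indicate a bacterial infection and should be escalated."""
    return has_sinus_pain and symptom_days > 10

def respond(llm_answer: str, symptom_days: int, has_sinus_pain: bool) -> str:
    # Deterministic rules run outside the model, so an incomplete LLM answer
    # cannot silently replace clinical escalation criteria.
    if needs_provider_visit(symptom_days, has_sinus_pain):
        return ("Your symptoms have lasted more than ten days. "
                "Please schedule a visit with a provider.")
    return llm_answer
```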

Additionally, there are ethical considerations to weigh. Regulations outline what can and cannot be done, but ultimately each organization and individual will have different principles on which they base their decisions. Here are a few questions to think about:

  • Can we rely on ChatGPT to provide medical advice without complete transparency regarding its training data and sources?
  • How can organizations balance the efficiency and accessibility offered by ChatGPT against the potential risks of misdiagnosis or misinterpretation of symptoms?
  • In what instances is it justifiable to use ChatGPT as a primary source of information for patients, and when should it be complemented with human expertise and oversight?

Conclusion

ChatGPT is a powerful language model with meaningful applications in healthcare. Its conversational format strengthens virtual health assistants, patient support and engagement, health education, and language translation. However, it is important to recognize the limitations of LLMs, including the potential for confidently presenting inaccurate information.

There is a lot that our healthcare system can gain from ChatGPT, and it is undoubtedly a breakthrough that will change the industry for the better. In the near term, we need to capitalize on the clean-cut use cases while we carefully consider the efficacy of LLMs in medical scenarios.
