AI chatbots can help pregnant women with opioid use disorder, new study finds

SSON researchers explore how GPT-4 can be fine-tuned to give trustworthy information to help those struggling with addiction.


For expectant mothers struggling with opioid use disorder, the stigma surrounding addiction can make it difficult to seek help.

Drew Herbert, a doctoral student at the University of Missouri Sinclair School of Nursing and a full-time nurse who has experience treating people with the disorder, has seen this firsthand.

“It can be a tough conversation for pregnant women to bring up to their clinician, family or friends in the first place, and if they do, they might be told simply, ‘You should stop,’ ‘Think about your baby’ or ‘Just quit,’” Herbert said. “Not only are these responses lacking empathy and compassion, but it can also be hard to quit cold turkey, as opioid withdrawals can be dangerous to both the mom and her unborn fetus.”

While online chatbots can sometimes be unreliable, Herbert knows that many people feel more comfortable asking the internet for medical advice anonymously, without fear of judgment. So, Herbert was curious whether GPT-4 could be trained to give safe, accurate, supportive and empathetic treatment guidance for pregnant women with opioid use disorder.

Turns out, it can, Herbert found.

A trusted and compassionate resource

In a recent proof-of-concept study, Herbert created “Jade,” a fictional woman who, at six weeks pregnant, told the fine-tuned version of GPT-4 she was ready to quit opioids but didn’t know how.

Rather than immediately telling Jade what to do, the chatbot first thanked Jade for sharing her journey and commended the strength it took to ask for help. Then, drawing from evidence-based guidelines from the American Society of Addiction Medicine and the American College of Obstetricians and Gynecologists, GPT-4 introduced Jade to buprenorphine, a medication that can help reduce cravings and prevent withdrawal symptoms, creating a safer environment for both Jade and her unborn baby.

“We instructed the chatbot to use motivational interviewing, a counseling approach where you help people overcome resistance to change,” Herbert said. “Rather than telling someone what to do, you empower them on their path toward change from a place of empathy and support. This approach is highly effective in substance use disorder spaces.”

When Jade asked for help finding a doctor in the Denver area, the chatbot responded with a list of local treatment centers, telemedicine options and reputable directories such as the Substance Abuse and Mental Health Services Administration’s treatment locator tool and the American Society of Addiction Medicine’s provider directory.

Two clinicians with experience treating opioid use disorder rated the chatbot’s responses as safe, accurate and relevant more than 96% of the time.

Just the beginning

Herbert sees significant potential from this initial study. He plans to further fine-tune the chatbot using feedback from clinicians and women who have experienced opioid use during pregnancy.

In the future, Herbert does not see chatbots as a replacement for clinicians, but as a supplemental source of trustworthy information that can be used in a variety of ways.

Initial online conversations with chatbots might encourage people to see a clinician once they feel more prepared. Evidence-based information from chatbots can be incorporated into smartphone applications and webpages with FAQs. Recorded therapy sessions can be transcribed so chatbots can identify best practices for addiction treatment professionals and areas for improvement.

“My overall goal is to disseminate accurate treatment information to as many people as possible for as little cost as possible,” Herbert said. “In the midst of a nationwide clinician shortage, not everyone has quick, easy and affordable access to health care. We are learning more about how these language models can be leveraged to help people become healthier, and once we build out these models further, they can potentially help a lot of people.”

“Generative AI-derived information about opioid use disorder treatment during pregnancy: an exploratory evaluation of GPT-4’s steerability for provision of trustworthy person-centered information” was published in the Journal of Studies on Alcohol and Drugs.


This article was originally published on May 1, 2025, by Show Me Mizzou.


Brian Consiglio
