Educating Dora: teaching conversational AI in healthcare to sound right
20 March 2026 | By: Dr Adam Brandt and Dr Spencer Hazel | 4 min read
Applied linguists from Newcastle University have been helping an artificial intelligence (AI) automated voice system to ‘sound right’ - not just technically, but conversationally.
Following on from their first funded project, Dr Adam Brandt and Dr Spencer Hazel from our School of Education, Communication and Language Sciences speak to us about Dora – a medically-regulated conversational AI agent designed to aid with patient aftercare.
Contents:
- What is Dora?
- Dora calling
- Why conversation is harder than it looks
- From scripts to prompting
- The empathy conundrum
- What next?
What is Dora?
Dora is an autonomous clinical assistant that carries out routine telephone consultations with patients – for example, checking in with patients pre- or post-surgery, or collecting standard information that helps a clinical pathway keep moving. In doing so, it helps NHS teams collect information, spot problems early, and free up clinician time for where it’s most needed.
Importantly, Dora isn’t intended to replace clinical judgement. Instead, it acts as a mediating system: it coordinates information flow between patients and healthcare teams. Where a patient flags something that needs attention, Dora can help ensure the case is escalated so a clinician follows up.
It’s also built for accessibility. Rather than asking patients to download an app or learn a new interface, Dora can deliver consultations by telephone - familiar technology for most people, including those who may be less comfortable with digital healthcare tools. Elizabeth O'Mahony, the NHS’s Director General, Finance, described Dora as a ‘no regrets’ technology that ‘we know work[s]’ and needs to be ‘prioritised’.
Dora calling
When Newcastle University applied linguists Adam Brandt and Spencer Hazel were approached by the Digital Health start-up Ufonia in late 2021, the brief sounded straightforward: older patients sometimes struggled to understand Dora’s speech in routine healthcare telephone consultations.
What followed was anything but straightforward.
Brandt and Hazel began working closely with Ufonia’s design and engineering teams, feeding the science of real-world conversation into the product development process. And as Dora’s use expanded across multiple NHS Trusts, the work evolved too - from tweaking wording and timing, to helping to shape the overall interaction so that routine healthcare consultations can be carried out safely, efficiently, and in a way that feels natural to patients.
‘Today, Dora has conducted more than 150,000 telephone consultations. That scale matters: it represents a large volume of routine interactions handled consistently, freeing clinical time for the patients who most need it.’
Why conversation is harder than it looks
Most of us think we understand conversation. We’ve all had phone calls; we know what it feels like when talk is flowing, and when it becomes awkward.
But that everyday competence hides an extraordinary amount of detail. Human conversation runs on finely timed, highly patterned practices: how we take turns; how we show we’re listening; how we fix misunderstandings; how we signal uncertainty; how we ask questions; how we close down a conversation. Much of this happens so fast, and so smoothly, that we hardly notice it.
A conversational AI system can’t rely on ‘common sense’. It has to be designed to cope with these interactional realities, including the moments where talk goes wrong. And when timing, phrasing, or tone is ‘off’, people notice immediately. This is where the qualitative linguistic method of Conversation Analysis comes in. It’s a way of studying real talk, moment-by-moment, to understand how conversation actually works, rather than how we imagine it works. By bringing that lens into product development, Brandt and Hazel could help Dora’s designers make the interaction feel more natural, and more user-friendly, for patients.
From scripts to prompting
In the early stages, improving Dora meant working on what many people would call the ‘script’. This covered the order of steps in the consultation, the specific design of Dora’s talk (such as the questions it asks), and the ways Dora handles common misunderstandings.
But clinical conversation design isn’t only about choosing the right wording. It’s also about sequence and timing:
- how does the consultation open and establish what will happen next?
- what counts as an ‘answer’ to a question?
- what happens if a patient hesitates, asks a side question, or misunderstands?
- how does the system move back on track without sounding brusque?
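These sequencing questions can be made concrete with a toy sketch. The code below is purely illustrative - the turn categories, phrasings, and routing rules are invented for this example and do not reflect how Dora is actually built - but it shows the kind of design decisions a conversation designer has to make explicit: what counts as an answer, and how to return to the task without sounding brusque.

```python
# Illustrative sketch only: a toy dialogue manager for one question in a
# routine consultation. All categories and wordings are hypothetical,
# not Dora's actual design.

def classify_turn(utterance: str) -> str:
    """Crudely route a patient's turn (a placeholder for real intent models)."""
    text = utterance.strip().lower()
    if text in {"sorry?", "pardon?", "what was that?"}:
        return "misunderstanding"   # patient didn't catch the question
    if text.endswith("?"):
        return "side_question"      # patient asks something off-script
    if text:
        return "answer"             # treat anything else as an answer
    return "hesitation"             # silence or an empty turn

def next_move(turn_type: str, question: str) -> str:
    """Decide how the system steers back on track without sounding brusque."""
    if turn_type == "side_question":
        # Acknowledge the question first, then return to the task.
        return f"That's a good question - I'll make a note of it. Now, {question}"
    if turn_type == "misunderstanding":
        return f"Of course. Let me repeat that: {question}"
    if turn_type == "hesitation":
        return f"Take your time. Whenever you're ready: {question}"
    return "Thank you. Let's move on."
```

For instance, `next_move(classify_turn("Will it hurt?"), "are you taking your eye drops?")` acknowledges the side question before re-asking the clinical one.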
As Large Language Model (LLM) functionality entered the picture, another shift happened. Instead of writing fixed scripts line-by-line (conversation designer as playwright), designers increasingly craft prompts and constraints that guide the system on how to conduct the consultation (conversation designer as director).
As a result, the task becomes: how do you ensure the system completes the clinical job safely and consistently, while still behaving in ways that patients experience as conversationally normal?
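The shift from playwright to director can be sketched in code. The consultation steps and safety constraints below are invented for illustration and are not Ufonia’s actual prompts; the point is that the designer now writes instructions and guardrails for the system to follow, rather than every line of dialogue.

```python
# Illustrative sketch only: assembling a "directing" prompt for an
# LLM-driven consultation. Step and constraint wordings are hypothetical.

CONSULTATION_STEPS = [
    "Confirm the patient's identity.",
    "Ask whether they have any pain in the treated eye.",
    "Ask whether their vision seems better, worse, or the same.",
    "Explain what happens next and close the call politely.",
]

SAFETY_CONSTRAINTS = [
    "Never give a diagnosis or change any medication.",
    "If the patient reports severe pain or sudden vision loss, say a clinician will call back urgently.",
    "If unsure how to respond, offer a follow-up from the clinical team.",
]

def build_system_prompt(steps, constraints) -> str:
    """Assemble the instructions the conversational system would receive."""
    lines = [
        "You are conducting a routine post-operative telephone consultation.",
        "Work through these steps in order:",
    ]
    lines += [f"{i}. {step}" for i, step in enumerate(steps, start=1)]
    lines.append("At all times observe these constraints:")
    lines += [f"- {c}" for c in constraints]
    return "\n".join(lines)
```

Here the designer’s craft lies in the steps and constraints, not the dialogue itself - the director’s notes, with the lines left to the system.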
Dora has already been used to help patients recovering from cataract surgery.
The empathy conundrum
As conversational systems become more widespread, expectations shift too. It’s no longer enough for a system to be merely ‘user-friendly’. In healthcare contexts, people also want it to feel appropriately caring.
That’s where the empathy question becomes tricky.
While patient feedback has been overwhelmingly positive on the whole, some early responses voiced concerns about machines not being able to show empathy, or offer the right kind of ‘reassurance and understanding of nuance’. So rather than asking ‘can AI feel empathy?’, a more useful design question is: ‘What does “empathy” look like in consultations, and how do people recognise it?’
To explore this, the team turned to recordings of equivalent consultations carried out by human clinicians (sourced from participants consenting in a formally registered research study), alongside real patient–Dora consultations and user feedback. By looking at what human clinicians do that is experienced by patients as empathic, the team can design similar communicative practices into Dora. The focus is not necessarily on explicit shows of empathy, but on the small interactional moves that matter: how concerns are acknowledged, how uncertainty is handled, how the consultation stays task-focused while still feeling attentive.
This work has fed into an Innovate UK–supported project, Encoding Empathy, which brings together partners including Ufonia, the University of York, and University College London Hospitals NHS Trust. The aim is to improve automated telephone consultations for longer-term monitoring, including by identifying effective practices in human consultations and using them to inform how LLM-driven systems conduct clinical talk.
What next?
Throughout this collaboration, Brandt and Hazel have been repeatedly surprised by how quickly conversational AI technology continues to develop; keeping up with that pace of change is a challenge in itself.
To support this, they have secured a British Academy Talent Development Award to upskill further, gaining expertise in contemporary conversation design and prompt engineering. They aim to use these new skills to help the design and engineering team continue improving Dora.
You might also like
- read the initial blog: Meet Dr Dora – the AI helping patients recover from cataract surgery
- discover the Innovate UK–supported project: Encoding Empathy
- explore how researchers are encoding empathy in conversational AI through Interactional AI
- learn more about the researchers involved in this study:
- Dr Adam Brandt, Senior Lecturer in Applied Linguistics
- Dr Spencer Hazel, Reader in Applied Linguistics Communication
- find out more about the research that makes a difference to people’s lives at our School of Education, Communication and Language Sciences
