A doctor friend remembers it like it was yesterday, even though the incident happened almost twenty years ago. She had just finished school and was out in the wild interacting with patients, and when she got some test results back, her heart sank. The patient's prognosis was grim, and she had to break the news -- and she had absolutely no idea how. She'd learned everything about the human body in medical school, but the emotional component of dealing with patients was barely covered. She stumbled through the conversation and cried in the ladies' room afterwards. Since then, she's gotten better at having tough conversations, but she still sees many medical professionals who struggle with empathetic communication.
Many medical schools would like to include more communication training, but there are limiting factors. Medical students are busy and curricula are packed, and hiring actors to play patients in a variety of situations is time-consuming and costly. In some places, there is the additional challenge of finding patient actors who can speak all the languages represented in the patient population. And even if schools can find actors, those actors go home at the end of the day, and their time with students is limited.
If you've ever read this newsletter, you can guess what I'm about to say next -- virtual reality and artificial intelligence can solve this problem. Here are three examples of cases I've worked on to prove this out:
1. Black maternal mortality
A hospital chain came to me with an astonishing and powerful case study. A Black physician had just given birth to her third child and was experiencing high blood pressure, and her doctor and colleague, a white man, put her on the standard medication to treat it. Except it didn't work, and her blood pressure kept climbing. She'd go back, he'd increase the dose, and things would only get worse. Frantic, one night the woman started searching online and discovered a Facebook group where women in similar situations proposed solutions. When she took it to her doctor, he dismissed her, and in theory she understood -- she was a medical professional as well and wary of "Doctor Google." But she went ahead and tried what she found online, and it worked. Afterwards, she and her doctor had a long conversation about the incident. They were colleagues and friends, and he genuinely liked and respected her and wanted to help -- but he had been trained a certain way, and had a well-founded skepticism of internet cures. What came out of this was a conversational VR piece where the user could interact with and embody this nuanced experience. The idea was not to villainize anyone, but to help young doctors develop a sense of empathy and a more open mind.
2. Honest conversations about medical misinformation
A few years later, I worked on the flip side of this piece -- helping doctors have empathetic conversations when patients presented misinformation. Working with a large global medical NGO, we built another conversational piece, only this time the doctors were talking with patients who believed incorrect information. Rather than dismissing it out of hand and alienating the patient, the user had to ask questions and gently challenge assumptions while remaining respectful.
3. AI-powered patient conversation training
Last year I worked with a medical school that was trying to solve several issues around teaching patient communication. One of their biggest was that they were based in a more rural community and had a large Spanish-speaking patient population, but lacked the resources to hire Spanish-speaking medical actors. The school used AI-powered virtual human avatars, which could be programmed to speak multiple languages, to build scenarios where students could practice interacting with Spanish-speaking patients. Additionally, with a bit of extra programming, the avatars could speak a mix of English and Spanish, a realistic situation in many communities.
All three pieces were successful insofar as users reported more confidence in their communication skills and higher levels of empathy. Additionally, the training cost substantially less than hiring human actors, and students felt more comfortable practicing in a headset than with another person. Not every doctor who goes through this training will come away with an impeccable bedside manner, but if they can soften the blow of bad news just a bit, that's worth it.