Sometimes annoying, often malicious, bots have gotten a bad rap. However, when deployed to work with humans—instead of replacing them—bots can help people behave, well, nicer than they might otherwise be, even on their best days.
For health plans and payers preparing for the annual open enrollment period this fall, bots programmed with machine learning compassion algorithms offer a compelling method to improve the experience for consumers as they select healthcare coverage for 2024.
Neither autonomous nor customer-facing, so-called “compassion bots” work behind the scenes to help contact center associates perform more productively, efficiently, and compassionately. Though machines cannot experience emotions, they can be programmed to process data in ways that incorporate compassion, kindness, and empathy, as Amit Ray describes in his seminal 2018 book, “Compassionate Artificial Intelligence: Frameworks and Algorithms.”
Ray describes how AI can alleviate human suffering, and, as anyone navigating the healthcare marketplace knows, it’s no picnic sorting through a dizzying array of coverage options.
It’s here that compassionate AI bots and contact center associates can drive healthcare reform forward, says Patrick Arcement, executive director, healthcare solutions, client success, TTEC.
“We don't have to wait for Congress. We don’t have to wait for the federal government,” Arcement said.
“We can reform healthcare by simply getting people’s needs taken care of with solutions we find for them as licensed agents,” he told Customer Strategist Journal. Knowledgeable associates, trained and coached by bots to listen more effectively and engage with deeper empathy, will ultimately lead members to the plan best suited to their needs.
Train. Practice. Then practice more
AI bots engage trainees in simulated member interactions before they take live calls. These role-playing exercises let associates experiment with different ways of handling calls and determine which work best within compliance guidelines. Because the bots provide real-time responses, feedback, and coaching, associates build skills and confidence without relying on scripts.
- Positivity reinforcement: Bots programmed to recognize language as compassionate or cold encourage associates to choose the kinder option. For example, in response to a member’s “Thank you,” bots prompt associates to say something like, “It’s been my pleasure to help you today,” rather than a careless “No problem.” (A minimal sketch of this kind of phrase check appears after this list.)
- Objection bots: This flavor of AI bot enters a role-playing exercise programmed to resist any suggestion associates offer. Here, trainees get exposed to common pushback such as, “No. I need to check with my wife before making this decision.”
Rather than end the call at that point, as many do, associates learn techniques to continue the conversation in a respectful, empathetic way, such as responding with, “I totally get that and I would do the same. Is she available to speak now?” or “Let’s schedule a call-back time because I want to be your licensed agent. I want to earn your business.”
Arcement said compassionate dialogue like this makes efficient use of both the member’s and associate’s time.
- Listen hard: AI bots train associates to listen in new, more compassionate ways by flagging cues that might otherwise get overlooked. For example, Arcement said, a caller may open a conversation with, “I went to the VA and they can’t provide the catheter I need. This one hurts me.”
AI can encourage a caring and respectful reply that acknowledges the member’s status as a military veteran such as, “First of all, I want to say on behalf of a grateful nation and myself, we thank you for your service. Secondly, you’re right. Nobody should have to deal with that pain of a catheter, so it'll be my absolute privilege to help you get the right catheter on your plan.”
- Grace under pressure: Given that some contact center associates are young adults, they may be ill-prepared to handle a difficult call that opens: “My wife just passed away and she used to handle all our healthcare matters.”
A standard reply in the call flow might be “What is your member number?” but Arcement said that’s not compassionate at all.
Instead, bots encourage a reply such as, “On behalf of my company and my co-workers, I’d like to offer our condolences and I know the last thing you need is a healthcare problem right now. We’ll take as much time as you need and I'm going to get this off your plate, answer all your questions, and make sure you leave this call more comfortable than you came in. Okay, so I'm here for you.”
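How might a bot tell a compassionate phrase from a cold one? The Python sketch below shows the general idea behind the phrase check referenced in the positivity reinforcement item above. The phrase list and suggested replacements are illustrative assumptions, not TTEC’s actual model; a production bot would rely on a trained language model rather than a hand-built lookup table.

```python
from typing import Optional

# Illustrative phrase-level "compassion check." The phrases and suggestions below
# are assumptions for demonstration purposes only.
COLD_TO_COMPASSIONATE = {
    "no problem": "It's been my pleasure to help you today.",
    "what is your member number": "I'm here for you. Whenever you're ready, could you share your member number?",
    "calm down": "I hear how frustrating this is. Let's work through it together.",
}

def coach_reply(draft_reply: str) -> Optional[str]:
    """Return a coaching prompt if the draft contains a cold phrase, else None."""
    lowered = draft_reply.lower()
    for cold_phrase, warmer in COLD_TO_COMPASSIONATE.items():
        if cold_phrase in lowered:
            return f'Consider replacing "{cold_phrase}" with: "{warmer}"'
    return None

# A trainee types "No problem" during a simulated call and gets a nudge.
print(coach_reply("No problem, let me pull that up."))
```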
Rich, relevant replies resonate
Many associates don’t realize they’re allowed such elaborate responses. Research has shown that going the extra mile in this way is appreciated and encourages members to share information more freely, which in turn better equips associates to enroll them in the plan that best fits their needs.
A recent study published in JAMA found that the lengthier responses generated by ChatGPT were preferred over physicians’ comparatively curt replies, and the bot’s responses were rated higher than those of live physicians on both quality and empathy.
Arcement said the use of training bots shortens calls, improves the experience, enhances accuracy, and reduces attrition because associates are more satisfied in their work when they know they helped someone overcome a challenge or enroll in the best plan for their specific needs.
AI bots as your open enrollment wingman
Compassionate bots bring trainees up to speed on best practices, and they play another role once associates move to live calls during open enrollment. Bots can monitor the speed at which an associate is speaking and alert them to slow down, as needed, so callers can better process information.
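A pacing alert like this can be reduced to a simple rate calculation on the transcribed audio. The Python sketch below assumes a transcript segment and its duration are already available from speech recognition; the 170 words-per-minute threshold is an illustrative assumption, not a published guideline.

```python
from typing import Optional

def words_per_minute(transcript: str, duration_seconds: float) -> float:
    """Rough speaking rate for one transcribed speech segment."""
    return len(transcript.split()) / (duration_seconds / 60.0)

def pacing_alert(transcript: str, duration_seconds: float,
                 max_wpm: float = 170.0) -> Optional[str]:
    """Nudge the associate to slow down if they exceed the (assumed) threshold."""
    wpm = words_per_minute(transcript, duration_seconds)
    if wpm > max_wpm:
        return f"You're speaking at about {wpm:.0f} words per minute. Slow down so the caller can follow."
    return None

# Example: a 6-second burst of benefits detail comes out too fast.
segment = ("Your plan includes a zero dollar premium with dental and vision coverage "
           "and your primary care copay is fifteen dollars per visit")
print(pacing_alert(segment, duration_seconds=6.0))
```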
Bots can also listen to a conversation and quickly retrieve relevant documents from a knowledgebase, sparing associates from fumbling to locate the information.
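The document assist works the same way in miniature: score the knowledgebase against what the caller just said and surface the closest matches. In the hedged sketch below, the document names and simple word-overlap scoring are assumptions for illustration; a real deployment would typically use semantic search over the plan’s actual knowledgebase.

```python
# Hypothetical mini-knowledgebase; filenames and descriptions are invented for illustration.
KNOWLEDGEBASE = {
    "catheter_coverage.pdf": "durable medical equipment catheter supplies coverage prior authorization",
    "va_benefits_coordination.pdf": "veterans affairs va benefits coordination eligibility",
    "dental_vision_rider.pdf": "dental vision hearing supplemental rider premium",
}

def suggest_documents(conversation: str, top_n: int = 2) -> list:
    """Rank documents by how many conversation words appear in their description."""
    convo_words = set(conversation.lower().split())
    scores = {
        doc: len(convo_words & set(text.split()))
        for doc, text in KNOWLEDGEBASE.items()
    }
    ranked = sorted(scores, key=scores.get, reverse=True)
    return [doc for doc in ranked if scores[doc] > 0][:top_n]

# The caller's opening line from the example above surfaces the two relevant documents.
print(suggest_documents("I went to the VA and they can't provide the catheter I need"))
```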
Speed to proficiency is among the big wins bots bring to healthcare plans. In tandem with proven training techniques, bots reduce training time and accelerate the learning curve with personalized coaching and real-time feedback.
“You still need people,” Arcement said. “There is still a people element. Bots just make the person more savvy.”