Personality, language hamper health chatbot implementations

When it comes to accessing healthcare, patients face a series of decisions: Do I take the dreaded trip to the emergency room or wait to see a primary care physician? How long can I wait to be seen?

Increasingly, medical centers are looking to ease this stress with chatbots. However, this tech comes with its own set of obstacles and opportunities. 

Lisa Mason, senior director of product design, user research and product analytics at Providence St. Joseph, will be speaking at the HIMSS20 Global Health Conference & Exhibition in Orlando, Florida, in March about her work implementing chatbots. Her talk will zero in on four areas: developing a personality for chatbots, designing the avatar icons that represent them, interaction methods, and tips and tricks around how patients enter data.

So far, Providence St. Joseph is using chatbots to answer frequently asked questions, as well as to help patients figure out which mode of care they should be accessing.

One of the major challenges has been creating a personality for the chatbots. 

“In terms of the personality, that is the really hard thing to make sure you do well. That is the face of the chatbot and the face of your organization with the patient,” Mason said. “There are certain things we have tried that have not really resonated with patients. For example, the use of emoticons and emojis. Oftentimes, people aren’t feeling well and we have tried to lighten it up, and that hasn’t gone over well with patients.”

Making the interactions more to the point goes over better with patients, who are often sick and want answers quickly, she said. 

“The tricks that have worked are being really short and succinct in what you are communicating, but to do it in a way that shows the right amount of empathy but don’t go overboard,” she said. “People realize it is a chatbot, and if it tries to display too many human characteristics people don’t like it.”

In addition to getting the personality right, the language can also be tricky. 

“Language is hard to do and hard to do well,” Mason said. “For example, we asked whether the patient was coughing up phlegm, and if yes, [the bot] would then say, ‘Yikes, I’m sorry to hear that.’ But the patients were interpreting ‘Yikes’ as ‘I have something serious,’ and would get anxious and scared.”
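The article does not describe Providence St. Joseph's actual implementation, but the pitfall Mason describes is easy to picture in code. The sketch below is a hypothetical illustration in Python of a single symptom-check turn whose acknowledgment copy follows the lessons above: short, lightly empathetic, emoji-free, and with no alarming interjections. The question text, the ACKNOWLEDGMENTS table and the symptom_check function are all invented for illustration.

```python
# Hypothetical sketch only -- not Providence St. Joseph's implementation.
# Shows acknowledgment copy that is short, neutral and emoji-free, and that
# avoids loaded exclamations ("Yikes") patients may read as a diagnosis.

# Acknowledgment copy keyed by the patient's yes/no answer. Each string is
# one brief sentence that acknowledges the answer and moves the flow along.
ACKNOWLEDGMENTS = {
    True:  ("I'm sorry to hear that. A few more questions will help me "
            "point you to the right kind of care."),
    False: "Thanks. Let's keep going.",
}

def symptom_check(answer_is_yes: bool) -> str:
    """Return the bot's reply to a yes/no symptom question.

    The reply acknowledges the answer without editorializing about how
    serious the symptom might be; triage judgment comes later in the flow.
    """
    return ACKNOWLEDGMENTS[answer_is_yes]

if __name__ == "__main__":
    # Example turn: "Are you coughing up phlegm?" -> patient answers yes.
    print(symptom_check(True))
```

The design point is simply that the acknowledgment string carries no clinical signal of its own, so the bot's tone cannot be mistaken for an assessment.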

Testing the chatbots brings its own complication: the people who actually use them are typically ill.

“The individuals that will use the chatbot are not healthy, and to test it we can’t bring in unhealthy people. That is not ethical. We can’t bring them into our facilities and say, ‘I want you to use the chatbot to figure out if you should go to the doctor,’” Mason said. “So, when we want to see if they are using the chatbot, we need to bring in healthy people. And that is not a real-world situation — people who are healthy are going to interact differently.”

While there are challenges to testing and implementing the bots, Mason said she has seen the technology improve care for patients. 

“What we are seeing so far is that we are getting people to the right mode of care and we are getting them to book appointments that keep our population healthy so they aren’t waiting until they are ill to go to the right provider,” she said. “It is better for health outcomes and better for patients.” 

Lisa Mason will share some of Providence St. Joseph's chatbot tricks and tools at HIMSS20 in a session titled “Best Practices for Helpful Healthcare Chatbots” on Thursday, March 12, from 11:30 a.m. to 12:15 p.m. in room W311A.
