"I'm here if you need to talk."
Now here's the one undeniable truth of this universe: we're all dying. Whether we want to think about it or not, everyone dies eventually; we just don't know exactly when.
Talking about end-of-life decisions isn't easy, especially with the people who will be most affected by them. But who else would you talk to about it, right?
Timothy Bickmore at Northeastern University in Boston, Massachusetts, noticed this problem and wanted to enable people to communicate with no holds barred. "People near the end of their lives sometimes don’t get the chance to have these important conversations before it’s too late," he said, as reported by New Scientist.
So, his team created a tablet-based chatbot to lend a non-judgemental ear and talk about almost everything that comes with death. The bot has two modes: a neutral mode and a pro-spiritual mode. In the first, the bot talks to you about funeral plans and wills but doesn't engage when the user raises religious concerns. The second, however, "takes active interest in the user's spirituality, tailors dialog to the user's religious orientation, demonstrates knowledge of the user's religious beliefs and orientation, and acts supportive," according to the paper. As of now, the bot supports Christianity, Judaism, Islam, Hinduism, Buddhism, and Sikhism.
The latter, slightly more souped-up mode can also guide users through meditation while conversing on a wide range of religious topics. You know, human things such as spirituality and the fear of death. And when the person actually starts getting ready to make decisions about their end-of-life plans, the bot alerts a family member or caregiver to follow up and formalise them.
The bot was tested with 44 people aged 55 and over in Boston. Fewer than half of them had some kind of chronic illness, and almost all had spent time with someone who was dying. After talking with the bot, most participants reported feeling more prepared for the inevitable and ready to complete their last will and testament.
It isn’t fully autonomous, though, for it has a fairly rigid script it has to stick to. This is because an unscripted system can easily “get into situations where the agent recommends things that are dangerous," Bickmore says.
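To get a feel for what "sticking to a script" means in practice, here is a minimal sketch in Python of a state-machine dialogue agent. This is an illustration, not Bickmore's actual system: the state names, prompts, and transitions are all invented for the example. The key property is that every prompt and every transition is predefined, so the agent can never improvise a recommendation.

```python
# Hypothetical scripted dialogue: each state has a fixed prompt and a
# fixed table of allowed replies. Anything off-script leaves the state
# unchanged, so the agent cannot wander into unsafe territory.
SCRIPT = {
    "start": {
        "prompt": "Would you like to talk about funeral plans or your will?",
        "options": {"funeral": "funeral", "will": "will"},
    },
    "funeral": {
        "prompt": "Have you shared your funeral preferences with family?",
        "options": {"yes": "wrap_up", "no": "wrap_up"},
    },
    "will": {
        "prompt": "Are you ready to start formalising your will?",
        "options": {"yes": "notify", "no": "wrap_up"},
    },
    "notify": {
        # In the real system, this is where a caregiver would be alerted.
        "prompt": "I'll let your caregiver know you're ready to follow up.",
        "options": {},
    },
    "wrap_up": {
        "prompt": "I'm here whenever you want to talk again.",
        "options": {},
    },
}

def step(state: str, user_reply: str) -> str:
    """Advance the script; unrecognised replies keep the current state."""
    return SCRIPT[state]["options"].get(user_reply, state)
```

Because the transition table is exhaustive, a designer can audit every path the conversation can take, which is exactly the safety argument for scripting over free-form generation.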
“It’s hard for humans to be non-judgemental when they’re having these kinds of conversations,” says Rosemary Lloyd from The Conversation Project. “So some people might find it easier to talk to a chatbot about their thoughts.”