Yet as web-savvy as many of these organisations are, when we think of chatbots, virtual assistants and other AI-driven conversational interfaces, the third sector may not be the first to spring to mind. We’ve been conditioned to think that chatbots are suited only to industries that can sustain high investment: retail, finance and banking, for example.
However, the reality is that chatbot development has crossed the chasm, and powerful solutions are more accessible than ever before. While the cost efficiencies that chatbots can generate are compelling for charities and non-profits, perhaps the more exciting value lies in the innovative new ways they can provide support and advice.
Target Audience 101
Many charities and non-profits are dedicated to raising awareness of key issues, but increasing engagement with target audiences is an ongoing task, with many non-profit boards requiring proof of year-on-year improvement.
Teenagers may not be the first age group that comes to mind for a chatbot audience, but they are one of Planned Parenthood’s top target audiences, and the use case is a great fit. As an established non-profit that has been around for the better part of a century, Planned Parenthood has always strived to provide accurate sexual and reproductive health information to teenagers. With its new chatbot Roo, it can now answer specific questions with personalised information at any time of day, seven days a week.
What’s cool about Roo, though certainly not unique in chatbot development, is that it has been developed with, and tested by, teenagers as its target audience group. It’s a wonderful example of one of the oldest non-profits making greater use of an information base it has already amassed.
It’s also a great resource for finding information that isn’t featured on the Planned Parenthood website, in a way that allows this age group to ask questions without fear of judgement or embarrassment.
Helping Via Immersive Stories
Chatbot interfaces are frequently question-based, but this doesn’t mean that engagement has to end with a simple answer to a direct question. rAInbow is ‘a smart companion’ designed to help people who are in abusive, controlling or unhealthy relationships.
While rAInbow can answer direct questions, it also provides a personalised, immersive story option to help those experiencing abuse, which can be a very isolating experience. One option rAInbow gives its users is ‘or, I can tell you the story of one of my friends,’ offering a relatable narrative in which users may recognise parallels with their own situation and so connect more readily with the advice being given.
It’s a great example of how storytelling can prompt realisation in its users, as well as of AI’s worth in delivering information 24/7 in complex and often highly sensitive situations.
Beyond this, rAInbow is part of an emerging trend in standalone chatbots. While many chatbots today are integral parts of website offerings, aligned with FAQs and volunteer helplines, the emergence of these standalone bot-based sites is perhaps proof of how comfortable people now feel turning to bots for basic advice as a first point of engagement when reaching out for help.
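To make the pattern concrete, here is a minimal, hypothetical sketch of how a story-branching turn like rAInbow’s might be structured. rAInbow’s actual implementation is not public, so the topics, example stories and matching logic below are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class Story:
    topic: str   # e.g. "financial control" or "isolation" (illustrative categories)
    text: str    # a short, anonymised, relatable narrative

# Hypothetical story library; a real service would curate these with experts.
STORIES = [
    Story("isolation", "My friend Amina slowly lost touch with her family and friends..."),
    Story("financial control", "My friend Sam had to account for every penny spent..."),
]

def detect_topic(message: str) -> str:
    # Placeholder topic detection; a real bot would use a trained intent classifier.
    return "financial control" if "money" in message.lower() else "isolation"

def answer_question(message: str) -> str:
    # Placeholder for the ordinary question-and-answer path.
    return "Here is some information that might help with that..."

def respond(user_message: str, wants_story: bool) -> str:
    """Either answer directly, or offer a relatable story matched to the topic raised."""
    if not wants_story:
        return answer_question(user_message)
    topic = detect_topic(user_message)
    story = next((s for s in STORIES if s.topic == topic), STORIES[0])
    return story.text + "\n\nDoes any of this feel familiar to you?"
```

The key design choice in a sketch like this is that the story path and the direct-answer path share the same entry point, so a user can move between asking questions and hearing a ‘friend’s story’ without restarting the conversation.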
Scaling Support for Victims of Cyber Crime
Like any business, scaling resources has traditionally been a challenge for charities and non-profits, many of which operate with a small, fixed staff supported by volunteers. The Cyber Helpline, the UK’s first nationwide cyber crime support service, is a great example of how chatbots can help an organisation scale to meet growing demand.
Let’s face it, cyber crime is not going away, and supporting its victims is not getting any simpler. Any non-profit dedicated to helping the victims of cyber crime has serious matters of scale to attend to. The Cyber Helpline helps individuals contain, recover from and learn from cyber attacks by connecting them with a cyber security chatbot and volunteer experts who provide relevant advice and guidance at any time, day or night.
It’s a great example of using a chatbot as the front line, with experienced cyber crime agents on hand to take on more complex questions and provide in-depth help. Like all successful chatbot projects, this one required deep user research, which revealed unique requirements: the organisation came to understand exactly how people report and describe these crimes, and quickly identified that the system had to let people describe what happened in as much detail as possible, in their own words, while treating privacy as paramount.
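As a rough illustration of this front-line pattern, here is a hypothetical sketch of a triage flow that takes a free-text description, offers immediate guidance where it can, and hands anything more complex to a human volunteer. The categories, advice strings and handoff mechanism are assumptions made for the example, not The Cyber Helpline’s actual system.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
import uuid

@dataclass
class IncidentReport:
    # A random case reference, rather than a name or email, keeps identifying
    # details to a minimum while the report is being triaged.
    description: str
    case_id: str = field(default_factory=lambda: uuid.uuid4().hex)
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    category: str = "unknown"

# Illustrative first-line advice for common, well-understood incident types.
SIMPLE_ADVICE = {
    "phishing": "Don't click any further links, change the affected password and turn on two-factor authentication.",
    "account_takeover": "Reset your password from a trusted device and review your recent account activity.",
}

def categorise(description: str) -> str:
    # Placeholder keyword triage; a production bot would use trained intent models.
    text = description.lower()
    if "suspicious email" in text or "phishing" in text:
        return "phishing"
    if "locked out" in text or "hacked" in text:
        return "account_takeover"
    return "complex"

def escalate_to_volunteer(report: IncidentReport) -> None:
    # Placeholder handoff; in practice this might raise a ticket or notify a rota.
    print(f"Escalating case {report.case_id} ({report.category}) to a volunteer expert.")

def handle_report(description: str) -> str:
    report = IncidentReport(description=description)
    report.category = categorise(description)
    if report.category in SIMPLE_ADVICE:
        return SIMPLE_ADVICE[report.category]
    escalate_to_volunteer(report)
    return ("Thanks for explaining that in your own words. Your case reference is "
            f"{report.case_id[:8]} and a volunteer expert will pick it up shortly.")
```

Note the privacy-minded choice of a random case reference rather than asking for a name or email up front; a real deployment would of course need proper data-protection review on top of this.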
In a sector wholly dedicated to providing help to those who need it, chatbots may be lowering the barrier to engagement by creating a completely non-judgemental, always-on entry point. It is encouraging to see how readily the technology can adapt to different non-profits’ needs and to highly complex human pain points, showing that, perhaps surprisingly, sometimes a chatbot is exactly what a person needs on the journey to asking for and receiving help.