Creating a mental health resource chatbot with a deep neural network.

Alisha Arora
Feb 7, 2021

It is without a doubt that mental health still carries a stigma. We can speak about it and post about it.

However, that is not enough, because we are still not educated about it.

Surveys suggest that over 46% of Americans feel unsure how to support a friend with mental illness, and over 60% of people around the globe are not educated on the types of mental illnesses or their symptoms. As a result, many people around the world have limited access to accurate information and appropriate treatment.

Education is key to creating change and it is about time that we start educating.

I built an educational mental health chatbot that uses NLP and a deep neural network. My chatbot’s name is Serenity (I chose the name because it means peaceful), and through my program you can chat with Serenity and build your knowledge of mental health disorders and of how to support peers living with mental illness.

The completed program looks something like this:

You: Hey Serenity, how are you?
Serenity: Good Alisha! How can I help you?
You: I want to learn more about Schizophrenia, what is Schizophrenia?
Serenity: Schizophrenia is a serious psychiatric disorder characterized by the person slowly losing contact with reality, often experiencing delusions or hallucinations.
You: I know someone who has Schizophrenia, what can I do to support them?
Serenity: Here are some tips Alisha; Maintain a low-stress environment, Use language that is positive and supportive instead of critical, Have one person speak at a time and keep voices down.

So let’s get down to how I built this model!

The Basics

For my program I used NLP and a deep neural network, both of which are subfields of Artificial Intelligence (AI).

If you want to learn more about AI, I have a YouTube series called #AIwithAlisha where I cover a range of AI topics, which I would highly suggest watching.

NLP

NLP is a field of AI that gives machines the ability to read, understand and derive meaning from human language.

Think of it like this: every line in a book or every tweet you read contains information that can be extracted. It is easy to derive meaning from a single sentence, but having to look through millions of tweets or text messages is simply not manageable.

This is unstructured data: data generated from conversations, declarations or even tweets.

When building algorithms we usually work with the traditional rows and columns of relational databases; all in all, “neat” data. Unstructured data, by contrast, is messy and very difficult to manipulate.

With NLP it is not just about interpreting text or speech based on its keywords, but about actually understanding the meaning behind those words. Using NLP we can perform sentiment analysis and also detect figures of speech like exaggeration; a quick sentiment example is sketched below.
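As a quick taste of what off-the-shelf NLP tools can do (this is not part of the chatbot; it uses NLTK’s VADER sentiment analyzer purely as an illustration):

import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download('vader_lexicon')  # one-time download of the sentiment lexicon
sia = SentimentIntensityAnalyzer()

print(sia.polarity_scores("I had a wonderful day today!"))
# returns negative/neutral/positive scores plus an overall 'compound' score
print(sia.polarity_scores("I feel hopeless and exhausted."))
# the compound score should come out negative for this sentence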

Deep Learning

There are three main types of layers in a deep learning neural network: an input layer, one or more hidden layers, and an output layer. The system must process the data through the layers between input and output to solve a task, which in our case is making a prediction. Inputs to a neuron can either be features from a training set or outputs from a previous layer’s neurons. A neuron takes a group of weighted inputs, applies an activation function, and returns an output.

For example, imagine using a neural network to convert American currency to Canadian currency. To understand the output, we have to understand that a weight tells us how important a particular connection is: the bigger the weight, the more important a role it plays in the neural network. In this example, the neurons that contribute to the Canadian amount are the ones we want to carry the greater weight, since that is what we want to know.
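To make weights and activation functions concrete, here is a minimal sketch of a single neuron in Python (the numbers are made up purely for illustration):

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# a single neuron: weighted sum of the inputs plus a bias, passed through an activation
inputs = np.array([1.0, 0.5])    # e.g. features from the training set
weights = np.array([0.8, -0.2])  # larger weights have more influence on the output
bias = 0.1

output = sigmoid(np.dot(inputs, weights) + bias)
print(output)  # a value between 0 and 1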

To better understand Deep Neural Networks I would highly suggest checking out this video that talks about what a DNN is and how it works.

Building a mental health educational chatbot

Essentially, what I have built is an ordinary chatbot with a twist: we use deep learning to make better predictions about which answer fits a given question.

Step 1: Initializing Chatbot Training

Our first step is to initialize the training for our chatbot.

The data I have used comes from a JSON file (a file of JavaScript-style objects listing different tags, each of which corresponds to different word patterns) where we store all of our “intents”. A basic JSON file looks something like this.
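The snippet below is an illustrative placeholder rather than my actual data file; the tags, patterns and responses are only there to show the structure.

{
  "intents": [
    {
      "tag": "greeting",
      "patterns": ["Hi", "Hey Serenity, how are you?", "Hello"],
      "responses": ["Hi! How can I help you?"]
    },
    {
      "tag": "schizophrenia_definition",
      "patterns": ["What is Schizophrenia?", "Tell me about Schizophrenia"],
      "responses": ["Schizophrenia is a serious psychiatric disorder characterized by the person slowly losing contact with reality, often experiencing delusions or hallucinations."]
    }
  ]
}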

A file like this will be our sample data; it helps the chatbot understand what we are saying and gives it sample answers to respond with. In my case, the data was related to mental illnesses.

Next, we take that list of words and lemmatize all of them. To lemmatize a word means to reduce it to its base form, or lemma. For instance, the words “eating”, “eats” and “ate” all share the same lemma, which is simply “eat”. By lemmatizing our words we reduce each word to its simplest form, which helps avoid unnecessary errors and wasted time. A quick sketch of what this looks like in code follows.
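As a minimal illustration, here is one common way to lemmatize with NLTK’s WordNet lemmatizer (my actual preprocessing may differ in the details):

import nltk
from nltk.stem import WordNetLemmatizer

nltk.download('wordnet')  # one-time download of the WordNet data
lemmatizer = WordNetLemmatizer()

for word in ["eating", "eats", "ate"]:
    # pos="v" tells the lemmatizer to treat each word as a verb
    print(word, "->", lemmatizer.lemmatize(word, pos="v"))

# eating -> eat
# eats -> eat
# ate -> eat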

The first step is completed and it’s time to build our deep learning model!

Step 2: Building the Deep Learning Model

The next step, now that we have our training and testing data, is to build the deep learning model. In the snippet below I use tflearn, a high-level wrapper around TensorFlow; Keras’s Sequential model is an equivalent way to define this kind of stacked network, and if you want to learn more, check out the documentation.

import tensorflow as tf
import tflearn
from tensorflow.python.framework import ops

# clear any existing TensorFlow graph before (re)building the network
ops.reset_default_graph()

# training and output are the numpy arrays built during preprocessing:
# an input layer sized to the bag-of-words vectors, two hidden layers,
# and a softmax output with one neuron per intent
net = tflearn.input_data(shape=[None, len(training[0])])
net = tflearn.fully_connected(net, 8)
net = tflearn.fully_connected(net, 8)
net = tflearn.fully_connected(net, len(output[0]), activation='softmax')
net = tflearn.regression(net)
model = tflearn.DNN(net)

try:
    # reuse a previously trained model if one has been saved
    model.load('model.tflearn')
except Exception:
    model.fit(training, output, n_epoch=1000, batch_size=8, show_metric=True)
    model.save('model.tflearn')

This network has an input layer sized to our bag-of-words vectors, two hidden layers (eight neurons each in the snippet above), and an output layer with one neuron per intent. The point of this network is to be able to predict which intent to choose for a given message.

The model is trained with stochastic gradient descent (I highly suggest watching my video on linear regression to get a basic understanding of gradient descent).
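For intuition, here is what a single gradient descent update looks like for one weight; this is a toy example fitting y = w * x to a single data point, not the chatbot’s actual training loop.

# toy example: one gradient descent step for a single weight
w = 0.5          # current weight
x, y = 2.0, 3.0  # one training example (input, target)
lr = 0.1         # learning rate

prediction = w * x
error = prediction - y      # how far off we are
gradient = 2 * error * x    # derivative of the squared error with respect to w
w = w - lr * gradient       # step against the gradient

print(w)  # the weight moves toward the value that reduces the error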

The training data itself is turned into numpy arrays before being fed to the network; a sketch of that preprocessing is shown below. We will use this model to power our chatbot interface: it predicts which intent a question belongs to, and the intent data supplies the answer.
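As an illustration of that preprocessing, here is a minimal bag_of_words helper of the kind the chat loop in Step 3 relies on (a simplified sketch; the real preprocessing also lemmatizes the vocabulary as described in Step 1):

import numpy

def bag_of_words(sentence, words):
    # words is the vocabulary built from the intents file
    tokens = [t.strip("?.,!") for t in sentence.lower().split()]
    bag = [0] * len(words)
    for token in tokens:
        for i, w in enumerate(words):
            if w == token:
                bag[i] = 1  # mark this vocabulary word as present in the sentence
    return numpy.array(bag)

words = ["hi", "what", "is", "schizophrenia", "depression"]
print(bag_of_words("What is schizophrenia?", words))  # [0 1 1 1 0]

Each training pattern becomes one of these 0/1 vectors, and the matching intent becomes a one-hot row in the output array.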

Step 3: Building the Chatbot

Now we assemble our chatbot!

Our chat() function takes in a message, turns it into a bag-of-words vector, has the model predict the most likely intent, and then outputs one of that intent’s responses. What we get is the foundation of our chatbot: we can now tell the bot something, and it will respond back.

import random
import numpy

# model, words, labels and data are the objects built in the previous steps
def chat():
    print("Start talking with the bot (type quit to stop)!")
    while True:
        inp = input('You: ')
        if inp.lower() == 'quit':
            break

        # turn the message into a bag-of-words vector and predict its intent
        results = model.predict([bag_of_words(inp, words)])
        results_index = numpy.argmax(results)
        tag = labels[results_index]

        # find the matching intent and reply with one of its responses
        for tg in data["intents"]:
            if tg['tag'] == tag:
                responses = tg['responses']
        print(random.choice(responses))

chat()

Results

After a lot of tweaking and editing, the chatbot is made. Let’s run the chatbot!

You: Hi Serenity  
Serenity: Hi Alisha, how can I help you?
You: What are some symptoms of depression?
Serenity: Here are some symptoms of depression: Hopeless outlook, Lost interest, Increased fatigue, Suicidal thoughts, and Anxiety

Now there are definitely some gaps, because if I were to ask an off-topic question like this:

You: What italian restaurants should I try in Toronto?

Then the chatbot won’t be able to answer that and says:

Serenity: Unfortunately, I do not have an answer for that.

But then again it’s a good thing it’s a mental health educational chatbot and not a restaurant locator!!

The Future

One of the biggest questions is: now that we have this chatbot, what are we going to do with it, and how do we make this technology accessible?

Now that we have built the chatbot, what is left is implementing it in the places where the users who need it the most can access it. A few groups that this chatbot would benefit are students in schools and universities. My next steps are to continue building the chatbot with deep learning and NLP techniques in order to better educate people on mental illnesses. From there, my plan is to bring my chatbot to school boards and universities. If you have any connections that would be useful, or if you come from an educational background, I would love to connect: alishaarora0526@gmail.com

If you enjoyed this article:

  • Share it with your network 🙏
  • Connect with me on LinkedIn to stay updated on my AI journey, and shoot me a message (I love meeting new people).
  • Subscribe to my newsletter, for monthly updates on what I’m working on!
