Mini Project
Citation: Beiley, Michael Rei. (2019). Mental Health and Wellness Chatbot
(Bachelor's thesis, University of Arizona, Tucson, USA).
By
____________________
Computer Science
MAY 2019
Approved by:
____________________________
Allison Obourn
Department of Computer Science
Abstract
This project is a mental health and wellness chatbot. Chatbots are computer applications
that exchange messages with a human user. There are many ways to access chatbots,
including Facebook, Twitter, smartphone apps, and messaging platforms such as Skype and
Discord. These chatbots all follow a general formula. The user can send messages to the
chatbot, and it responds with an appropriate message or action. This chatbot was built to run in
Discord, a popular text and audio communication platform. Discord supports development of
custom bots and was a natural choice for this project. This chatbot has several features,
including mood responses, mood graphing, and a therapist search. Each of these features is
designed to improve the general well-being of the chatbot's users. The target demographic is
adults who frequently use computers, since they are particularly vulnerable to mental health
issues such as depression (Madhav, 2017). A user study was conducted to examine the user
experience. Initial results indicate a positive user experience, but further studies are required to
draw stronger conclusions.
Introduction
There is a growing need for mental health care in the United States. According to the
Bureau of Health Workforce, over 106 million Americans live in areas with mental health care
shortages. It is critical to explore new, more accessible avenues for mental health care. One
potential solution is mental health applications (MHapps) such as chatbots. Chatbots are
computer applications that exchange messages with a human user. Chatbots all follow a similar
formula. The user can send messages to the chatbot, and the chatbot responds appropriately.
For example, if the user tells the chatbot, “I’m feeling sad”, the chatbot might reply, “I’m sorry to
hear that. My heart truly goes out to you.” These bots utilize a technique called natural language
processing to interpret the user's messages. Additionally, chatbots can store data on the user’s
past responses. This enables the bot to learn about a
specific user’s history. Then in the future, the bot can ask about any recurring issues the user
might be having.
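This history-tracking idea can be sketched in a few lines of Python. The names, data structure, and recurrence threshold below are illustrative assumptions, not the thesis's actual implementation:

```python
from collections import defaultdict

# Per-user mood history; in a real bot this would be persisted to disk or a
# database rather than kept in memory.
history = defaultdict(list)  # maps a user id to that user's reported moods

def record(user_id, mood):
    """Log one reported mood for a user."""
    history[user_id].append(mood)

def recurring_issue(user_id, threshold=3):
    """Return a mood the user has reported at least `threshold` times, if any.

    The bot could use this to ask about recurring issues in later sessions."""
    moods = history[user_id]
    for mood in set(moods):
        if moods.count(mood) >= threshold:
            return mood
    return None
```

With such a record, the bot can open a later conversation by asking about the mood the user reports most persistently.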
When first learning about MHapps, a common question is “How effective can an artificial
mental health care provider be?” It is reasonable to assume people would not be willing to use
MHapps and would prefer to interact with an actual person. However, there is strong evidence
to support people’s willingness to use MHapps. In a study done at the University of Southern
California in 2014, researchers set up an experiment to test how much participants would
disclose to a virtual human in clinical interviews. All participants spoke to a virtual human on a
computer screen. However, half of the participants were told the virtual human was a computer,
while the other half were told it was controlled by an actual human. The participants who
thought it was a computer reported lower fear of self-disclosure, displayed their sadness more
intensely and were rated by observers as more willing to disclose (Lucas, 2014). This study
shows chatbots can allow users to more fully examine their emotions and be more honest
about their experiences. This is a large advantage of MHapps over traditional mental health
care options.
This application is designed to improve the overall mental and physical health of its
users. The target demographic is computer users and frequent users of an application called
Discord. Discord is a free platform designed to send text messages, talk in chat rooms, and
share media. This platform is commonly used amongst video game players and everyday
computer users. Since Discord already supports independent development of chatbots, it was a
natural choice for this project.
This chatbot is designed to address several health issues caused by using computers for
long periods of time. For example, looking at a computer screen for hours can have
consequences, such as eyestrain and headaches. Fortunately, there are simple methods to
combat these issues. The application reminds the user to spend time looking away from their
screens. The application also has a number of features to help improve the mental health of its
users. The primary features are mood responses, mood graphing, and a therapist search tool.
There is more information on each of these tools in the Design section. Additionally, there are
regular reminders for the user to stand up from their computer and walk around. This feature is
especially important since computer users spend several consecutive hours sitting at their
desks. This is unhealthy by many objective standards. According to a recent study published in
the American Journal of Epidemiology, “prolonged leisure-time sitting (≥6 vs. <3 hours per day)
was associated with higher risk of mortality from all causes, cardiovascular disease, cancer,
diabetes, kidney disease, Alzheimer's, and several others” (Patel, 2018). Through these periodic
health reminders, the application could help improve the health of its users over a long period of
time.
Related Work
One of the most effective MHapps is called Woebot. This MHapp was developed by two
psychologists and uses cognitive behavioral therapy techniques. The chatbot sends the user a message every day and
tries to start a conversation. In a study done to test Woebot’s efficacy, seventy individuals aged
18-28 were recruited to either receive 2 weeks of self-help content from Woebot or be directed to an
information-only control group (Fitzpatrick, 2017). Both at the study’s onset and 2-3 weeks later,
all participants completed web-based versions of the 9-item Patient Health Questionnaire, the
7-item Generalized Anxiety Disorder scale, and the Positive and Negative Affect Scale. No
significant differences existed between the groups at baseline, but the results showed great
promise for Woebot and similar MHapps. A univariate analysis of covariance was performed
and determined that the Woebot group members significantly reduced their symptoms of
depression, whereas the informational control group members did not. Members of both groups
significantly reduced symptoms of anxiety. This study is one of the first to prove that chatbots
can be an effective and engaging way to reduce symptoms of mental illnesses. In my own
chatbot, I implement some of the same features that Woebot has, including mood tracking and
mood responses.
In early 2016, a literature review was published that highlighted the best features to
include in an effective MHapp. The review was performed by David Bakker at Monash
University and reviewed MHapp-related articles between March 1975 and March 2015. The
exact number of reviewed studies was not tracked. Ultimately, the author made 16
recommendations for creating the best MHapps. Bakker also reviewed the most popular
MHapps available in the iTunes App Store, looking for the 16 traits he endorsed. He asserted
that there were “some major gaps in their capabilities” when compared with the
recommendations, including “addressing both anxiety and low mood”, “designed for use by
nonclinical populations”, and “reporting of thoughts, feelings, or behaviors” (Bakker, 2016). The
recommendations I did not include were either not applicable or not feasible for my chatbot. If
the user reports feelings of anxiety or low mood, the bot gives acknowledgement of these
feelings as well as suggestions for overcoming them. Some suggestions include exercising
regularly or improving sleep habits by going to bed at roughly the same time each day.
Unfortunately, many mental health applications will label their users with a mental illness
diagnosis. There is a lot of research that suggests this labeling process can be harmful and
stigmatizing by making the user feel unable to change their condition (Moses, 2009). This
chatbot avoids this mistake. At no point during interaction with the chatbot is a label provided
for the user. If the user is displaying depressive symptoms, the chatbot may acknowledge that the
user could benefit from examining facts about their current situation instead of focusing on
worst-case scenarios. However, the chatbot never gives the user the label of “being depressed”
or “having depression”.
There are a few Discord bots dedicated to mental health and wellness. One such bot is
called Theodore. Theodore has mental health and online therapy resources as well as mental
health hotlines. This bot has several other resources, such as relaxing music, lighthearted
remarks, and drawings. There are several elements of this bot I have adopted. Specifically, the
links to therapy resources and mental health hotline numbers. These are very useful inclusions
and were easily implemented. My chatbot also has several other features, such as mood
responses and mood graphing.
Design
This application has several different features, including mood tracking, mood
responses, and therapist searching. Additionally, the application sends the user a short
reminder message every hour. The messages say things like “Get up and go for a walk! It will
feel good to stretch your legs.” and “Make sure to get plenty of rest tonight! Sleep is NOT for the
weak.” The idea behind this feature is to get users doing healthy things regularly. In future
iterations of the application, users will be able to set how often the reminders are sent and can
add or remove any reminders they want to. This way, users can tailor the application to fit their
personal needs.
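The hourly cadence can be sketched as a simple cycle over a reminder list; in a Discord bot this function would be called from a recurring background task. The texts and rotation rule below are assumptions for illustration, not the bot's exact code:

```python
# Illustrative reminder texts; the real bot's wording and rotation may differ.
REMINDERS = [
    "Get up and go for a walk! It will feel good to stretch your legs.",
    "Look away from your screen for a bit and give your eyes a rest.",
    "Make sure to get plenty of rest tonight! Sleep is NOT for the weak.",
]

def reminder_for_hour(hour: int) -> str:
    """Pick the reminder to send on a given hour, cycling through the list."""
    return REMINDERS[hour % len(REMINDERS)]
```

Letting users configure the interval and the list, as planned for future iterations, would amount to making both the list and the hour step per-user settings.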
Mood responses are a primary feature of the chatbot. Users are provided with 15 different
possible emotions, ranging from “!happy” and “!excited” to “!sad” and “!depressed”. The
exclamation point is used to indicate a command being issued. After entering their mood, users
are entered into a conversation with the bot. In Figure 1, there is an example conversation.
Figure 1. Example conversation
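The “!” command convention can be handled with a small parsing helper. The mood list here is only a subset of the bot's 15 commands, and the function names are illustrative:

```python
# A subset of the bot's 15 mood commands, for illustration.
MOOD_COMMANDS = {"happy", "excited", "sad", "depressed"}

def parse_command(message: str):
    """Return the command name if the message starts with '!', else None."""
    if not message.startswith("!"):
        return None
    name = message[1:].strip().lower()
    return name or None

def is_mood_command(message: str) -> bool:
    """True when the message is one of the recognized mood commands."""
    return parse_command(message) in MOOD_COMMANDS
```

A message that passes `is_mood_command` would then start the mood conversation shown in Figure 1; other commands, like “!therapist”, are routed to their own handlers.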
This feature is intended to act as a first line of defense for individuals without someone to
express their emotions to. This feature is not meant to replace human interaction. In fact,
several of the bot’s messages ask the user to seek a friend to talk to. However, if there is no
one for the user to reach, the bot continues speaking to the user. This is shown in Figure 2.
Figures 3, 4, and 5 show the different emotions and conversations users can have with the bot.
Figure 3. Legend - Mood response diagram
In future versions, there will be more emotions to choose from and more extensive
conversations. There are several nuances to the mood response section. After the user sends a
message, the bot displays a typing indicator before replying.
This is to make the user experience feel more like a human-to-human conversation. Additionally,
the bot waits several seconds before sending its reply. The longer the reply, the longer the bot
spends “typing”. Each emotion has a unique response. However, each response has at least 2
specific parts. The first part is acknowledgment. The bot affirms the emotion felt by the user.
This is done to make the user feel heard and know that their feelings are valid. The second part
is problem solving. If the user is feeling sad or depressed, the bot will offer some advice for
alleviating those feelings. If the user is feeling happy, the bot offers advice for being happy more
often. This part is designed to help the user cope with their emotions. The bot encourages
sharing emotions with other humans, since this is fundamental to human experience. However,
the bot remains available when no one else is.
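The length-dependent “typing” pause described above can be expressed as a small delay rule. The constants here are illustrative assumptions, not the thesis's actual values; in the bot, the returned delay would be passed to a sleep call while the typing indicator is shown:

```python
def typing_delay(reply: str, seconds_per_char: float = 0.05,
                 minimum: float = 2.0, maximum: float = 8.0) -> float:
    """Seconds the bot should appear to be typing before sending `reply`.

    Longer replies take longer, clamped so short replies still pause
    and long replies don't stall the conversation."""
    return max(minimum, min(maximum, len(reply) * seconds_per_char))
```

Clamping matters on both ends: without a minimum the bot answers inhumanly fast, and without a maximum a long response leaves the user staring at a typing indicator.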
Mood tracking and graphing are included as well. If a user is struggling with mental
health issues, it may be useful to have a record of their moods over time. This helps the user
track their progress as they work to improve their mental health. Alternatively, users may be
interested to see how their moods fluctuate based on other factors in their lives, such as an
upcoming exam. Every time a user uses one of the mood commands, such as “!happy” or
“!excited”, it is logged in the user’s account. Once a week has passed, the chatbot will send the
user a plot of their moods. The plot uses the valence-arousal emotion chart to assign values to
each of the moods. For example, moods such as “!depressed” are given a very low
score, such as 5 out of 100, while moods like “!very happy” are given 95 out of 100. An example
plot is included in the weekly message.
The user can also change how much time the graph covers. If a user wants to see their moods
over the past year, they simply use the “!mood” command to set the graph time to “1 year”. The
chatbot guides the user throughout the entire mood graphing experience and informs the user of
the available options.
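The mood-to-value mapping can be sketched as a lookup table. The scores below are illustrative stand-ins loosely following the valence-arousal chart, not the thesis's exact values:

```python
# Illustrative valence scores on a 0-100 scale; the bot's actual values,
# derived from the valence-arousal chart, may differ.
VALENCE = {
    "depressed": 5,
    "sad": 20,
    "happy": 80,
    "very happy": 95,
}

def mood_scores(mood_log):
    """Turn a chronological list of mood commands into plottable values,
    skipping any entry without an assigned score."""
    return [VALENCE[m] for m in mood_log if m in VALENCE]
```

The resulting list of values, paired with the timestamps of each command, is what a plotting library would render into the weekly mood graph.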
The last primary feature is the “Find a Therapist” feature, which helps users find suitable
therapists in their area. The user can initiate this feature by typing “!therapist”. The chatbot is
designed to never insinuate the user has a mental disorder or serious mental health issue. This
is because giving the user a label like “depression” has been shown to make the user feel like
they can’t change their condition (Moses, 2009). Thus, the bot only allows for voluntary use of
the “!therapist” command. After issuing the command, the user simply answers questions asked
by the bot. First, the user is asked what city they want their therapist to be in. Then, they’re
asked if there are any specific conditions they want the therapist to specialize in, such as
anxiety or depression. The user can then specify if they prefer a male or female therapist.
Finally, the bot asks what insurance the user has. The user is left with a list of possible
therapists perfectly suited to their needs. From this point, the user can read the individual
therapists’ descriptions and make their final decision. This feature is made possible through
Psychology Today’s website. This website helps users find therapists in the way just described.
The advantage of using the bot over the website is the more personalized experience for
the user. The bot provides encouraging messages throughout the whole process and is
designed to feel like a conversation instead of a mundane search exercise. Additionally, the
user never has to leave Discord to complete the search.
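The question flow and filtering step can be sketched as follows. Psychology Today's search fields are not a public API, so the field names, question wording, and matching logic here are all assumptions for illustration:

```python
# Hypothetical question flow; keys and wording are illustrative.
QUESTIONS = [
    ("city", "What city would you like your therapist to be in?"),
    ("specialty", "Is there a condition you want them to specialize in?"),
    ("gender", "Do you prefer a male or female therapist?"),
    ("insurance", "What insurance do you have?"),
]

def match_therapists(therapists, answers):
    """Keep only therapists that match every answered criterion.

    An empty answer means the user has no preference for that question."""
    def matches(t):
        return all(not value or t.get(key) == value
                   for key, value in answers.items())
    return [t for t in therapists if matches(t)]
```

In the bot, each entry in `QUESTIONS` would be sent as a message, the user's reply collected, and the accumulated answers applied to the therapist listings before presenting the final list.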
Analysis
In order to test the chatbot, a user study was performed. There were three participants,
all college students aged 20-22: two females and one male. I briefly explained that the bot was
designed for mental health and wellness and then told them about the “!help”
command, which displays all the possible commands. After showing them how to use this
command, I let them use the bot for several minutes. I sat next to them and answered any
questions they had. I also occasionally suggested they use a certain feature or try something
new.
There was some general confusion when first using the chatbot. Two of the users had
never used a chatbot before. These users weren’t familiar with using a key character, “!”, before
each command. Aside from this initial learning curve, users easily interacted with the bot. They
issued commands and responded to the bot without instruction from me. One critique was the
bot’s inability to respond to any user message. This user wanted to be able to say whatever
they wanted and have the bot respond accordingly. Finally, one user was concerned about how
consistently they would enter their moods and how this might poorly affect their mood graph.
Along with the critiques, there were several positive reactions as well. Two users
remarked the chatbot was fun to use. One user complimented how quickly the bot responded.
Another particularly liked the mood graphing feature and said it would be useful for long term
mood tracking. They thought users with mental health issues would especially benefit from this
feature. The command to set the mood graph time worked well too. A user who tried it noted it
was straightforward.
I noticed the users quickly learned how to use the bot. Once they understood that they
could issue commands and the bot expected responses, users had a smooth experience. The
users also seemed engaged with the chatbot and appeared to enjoy the experience.
In future studies, I would include many more subjects. It is not feasible to draw confident
conclusions from a study with three subjects. Also, the subjects were around the same age. It
would be better to have subjects of a variety of ages. I also knew the subjects and they could
have given a biased review of the application. There are also issues with the research
procedures. As the creator of the chatbot, I am a biased administrator of the study. To eliminate
this bias, I need a person who has no investment in my chatbot to run participants.
Future Work
In the future, the chatbot will undergo several revisions. The initial user experience
needs polishing such that anyone can easily interact with the bot. This means adding a short
introduction to the chatbot, with instructions on how to issue commands and respond to the
bot’s messages. The bot’s sending and receiving timings also need an update. The bot sends
messages much faster than an average person does. To compensate, the responses will be
sent more slowly to make the bot feel as human-like as possible. The bot will also wait longer for
a user response after it has asked the user a question. This time was shown to be too short in
the user study. Additionally, the bot will include more emotion commands and extend current
responses. There will also be multiple responses for the same emotions, so the user does not
see the same reply every time.
The mood graphing feature will be improved as well. Currently, the feature relies on
voluntary mood commands from the user. This is unreliable, since users might not share their
emotions consistently. To solve this problem, the chatbot will proactively ask the user how they
are feeling. This should increase the number of mood responses and provide more data points
for the mood graphs. The values assigned to each emotion will also be revisited. While
accepted metrics were used, such as the valence-arousal emotion chart, there is still room for
improvement. One possibility is to give surveys asking participants to assign a value from 0 to
100 to a list of emotions. The average values for each emotion could factor into the final value in
the mood graph feature. This would give a collective perception of emotions and perhaps a
more accurate mood graph.
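The proposed averaging is straightforward to sketch. The survey data here is hypothetical; no such survey was run for the thesis:

```python
from statistics import mean

def crowd_valence(surveys):
    """Average each emotion's 0-100 ratings across survey responses.

    Each response is a dict mapping an emotion name to one participant's
    rating; emotions missing from a response are simply not counted."""
    ratings = {}
    for response in surveys:
        for emotion, score in response.items():
            ratings.setdefault(emotion, []).append(score)
    return {emotion: mean(scores) for emotion, scores in ratings.items()}
```

The averaged values could then replace or be blended with the valence-arousal scores currently assigned to each mood command.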
Conclusion
This application is a mental health application designed to improve the mental health and
well-being of frequent computer users. Each of the features is intended to help users cope with
their emotions and build healthy habits. Whether it’s through mood tracking or finding a
therapist, the chatbot is a faithful servant to its users. Lots of people use computers frequently,
and it’s important for them to stay mindful of their health. The chatbot regularly sends health
reminders and encourages the user to stay active. On a platform such as Discord, this
application is particularly useful. Many Discord users spend countless hours on the computer,
but don’t necessarily socialize during that time. This bot will be a temporary substitute for users
in need of a friend or someone to confide in. With enough improvement and development, this
application could become a valuable resource for mental health and wellness.
References
Bakker, D., Kazantzis, N., Rickwood, D., & Rickard, N. (2016). Mental health smartphone apps:
Review and evidence-based recommendations for future developments. JMIR Mental
Health, 3(1), e7. doi:10.2196/mental.4984
Doward, J. (2016, November 05). Men much less likely to seek mental health help than women.
The Guardian.
Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to
young adults with symptoms of depression and anxiety using a fully automated
conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health,
4(2). doi:10.2196/mental.7785
Lucas, G. M., Gratch, J., King, A., & Morency, L. (2014). It’s only a computer: Virtual humans
increase willingness to disclose. Computers in Human Behavior, 37, 94–100.
doi:10.1016/j.chb.2014.04.043
Madhav, K. C., et al. (2017). Association between screen time and depression among US
adults. Preventive Medicine Reports, 8. doi:10.1016/j.pmedr.2017.08.005
Mental Health Care Health Professional Shortage Areas (HPSAs). (2018, April 3). Retrieved
from https://www.kff.org/other/state-indicator/mental-health-care-health-professional-
shortage-areas-hpsas/?currentTimeframe=0&sortModel=%7B%22colId%22%3A
%22Location%22%2C%22sort%22%3A%22asc%22%7D
Moses, T. (2009). Self-labeling and its effects among adolescents diagnosed with mental
disorders. Social Science & Medicine, 68(3). doi:10.1016/j.socscimed.2008.11.003
Patel, A. V., et al. (2018). Prolonged leisure time spent sitting in relation to cause-specific
mortality in a large US cohort. American Journal of Epidemiology, 187(10).
Wei, H., Chen, M., Huang, P., & Bai, Y. (2012). The association between online gaming, social
phobia, and depression: An internet survey. BMC Psychiatry, 12, 92. doi:10.1186/1471-
244x-12-92