What Data Can Tell Us About Mental Health

by John Sucich | Nov 17, 2022 | Healthcare

Reading Time: 5 minutes

Mental illnesses affect more than 52 million Americans, or about 20 percent of the U.S. adult population, according to the National Institute of Mental Health (NIMH). It would seem that something afflicting so many people would provide plenty of data to study, but mental health data comes with a tricky caveat.

The NIMH, which is part of the National Institutes of Health within the U.S. Department of Health and Human Services, estimates that only about half of people with mental illnesses receive treatment, and most of the data about mental health in the United States comes from people already in treatment. So there is data, but it skews heavily toward those who have already begun the process of seeking help.

Researchers are working to better understand mental health and to figure out how to intervene earlier. To do so, they are using the data they do have to inform artificial intelligence (AI) tools.

What mental health research using data looks like

Research in the field of mental health takes many shapes. Some of it focuses on identifying causes and symptoms, as well as finding the most effective treatments, which can in turn help develop preventive measures against mental illnesses. Data scientists have recently begun using machine learning to comb through larger data sets in the hope of finding patterns that reveal some of this information.

Other research uses data for treatment and intervention, such as working to understand when someone is most likely to harm themselves. A number of researchers take this approach, using data collected from crisis counseling hotlines or from apps that check in with people about how they are feeling throughout the day.

The benefits of the data

One of the obstacles preventing people from seeking treatment is accessibility. In 2018, Harvard Business Review reported that 40 percent of Americans live in areas designated by the federal government as having a shortage of mental health professionals. Even those who live in an area with access to services might not be able to afford them.

This is where technology offers another benefit. In cases like these, artificial intelligence can play a major role in providing some form of treatment. Chatbots have been designed to help patients work through depression or anxiety. The University of Southern California’s Institute for Creative Technologies developed a virtual therapist named Ellie. In addition to simulating a human conversation like other chatbots, Ellie can detect nonverbal cues and respond to them. Researchers say that patients see Ellie as less judgmental than a person and have been more willing to open up in those sessions.

Ellie gathers this type of data to inform specialists. As patients speak with Ellie, the system collects information such as their rate of speech, how long they pause before answering questions, and the movement and position of points on their faces. Ellie was funded to help treat veterans with PTSD, and through that analysis researchers discovered that patients with PTSD touch their heads and hands more frequently than patients without PTSD.
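To make that idea concrete, here is a minimal sketch of how signals like speech rate, pause length, and facial-point movement could be summarized for a specialist. It is not Ellie’s actual pipeline; the data structures and field names (such as "pause_before_s" and "landmarks") are invented for illustration.

```python
# Hypothetical sketch: reduce a recorded session to a few interpretable features.
from dataclasses import dataclass
from math import dist

@dataclass
class Utterance:
    text: str              # what the patient said in this turn
    duration_s: float      # how long they spoke
    pause_before_s: float  # silence before they began answering
    landmarks: list[tuple[float, float]]  # tracked facial points for this turn

def summarize_session(utterances: list[Utterance]) -> dict:
    """Summarize one session into features a specialist could review."""
    total_words = sum(len(u.text.split()) for u in utterances)
    total_speech_time = sum(u.duration_s for u in utterances) or 1.0

    # Average facial-point movement between consecutive turns.
    movement = 0.0
    for prev, curr in zip(utterances, utterances[1:]):
        movement += sum(dist(a, b) for a, b in zip(prev.landmarks, curr.landmarks))

    n = max(len(utterances), 1)
    return {
        "speech_rate_wpm": 60.0 * total_words / total_speech_time,
        "mean_pause_s": sum(u.pause_before_s for u in utterances) / n,
        "facial_movement_per_turn": movement / max(n - 1, 1),
    }
```

A real system would work from audio and video streams rather than hand-built records, but the end product is the same kind of small, interpretable summary described above.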

How data can be used

The Crisis Text Line provides free, 24/7 text-based mental health support and crisis intervention. In addition, it shares its data to empower journalists, researchers, school administrators, parents, and others to understand what their communities face in the hopes of working together to prevent future crises.

Since it began in August 2013, Crisis Text Line has handled more than seven million texts, and it uses the data gathered from those texts for a number of purposes. One is machine learning: by identifying from previous conversations which words and emojis appear in the highest-risk texts, the organization can prioritize high-risk texters when there are not enough volunteers available to respond to every message.
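As an illustration only, and not Crisis Text Line’s actual model, a simple version of that kind of triage might score each incoming text against a learned list of high-risk tokens and answer the highest-scoring message first. The token weights and example messages below are invented placeholders.

```python
# Hypothetical triage sketch: score texts by risky tokens, serve highest risk first.
import heapq
import itertools

RISK_WEIGHTS = {   # invented weights standing in for learned ones
    "pills": 3.0,
    "goodbye": 2.5,
    "alone": 1.0,
}

def risk_score(message: str) -> float:
    """Sum the weights of known high-risk tokens in the message."""
    return sum(RISK_WEIGHTS.get(tok, 0.0) for tok in message.lower().split())

class TriageQueue:
    """Pop waiting texters in order of estimated risk (highest first)."""
    def __init__(self) -> None:
        self._heap: list[tuple[float, int, str]] = []
        self._counter = itertools.count()  # tie-breaker: first come, first served

    def add(self, message: str) -> None:
        # heapq is a min-heap, so store the negated score.
        heapq.heappush(self._heap, (-risk_score(message), next(self._counter), message))

    def next_texter(self) -> str:
        return heapq.heappop(self._heap)[2]

queue = TriageQueue()
queue.add("just feeling a bit alone today")
queue.add("i took pills and want to say goodbye")
print(queue.next_texter())  # the higher-risk message is handled first
```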

As an example of the type of information someone can learn from the organization’s “Crisis Trends” site, selecting “Suicide” from the filter shows the issues that co-occur for texters who contacted the crisis line about suicidal thoughts.

Source: Crisis Text Line website, https://crisistrends.org/#faq

In addition, users can filter by crisis type to see the top words used in texts related to that issue, which may be helpful in identifying someone at risk.
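The aggregation behind views like these can be quite simple. The sketch below assumes a hypothetical list of anonymized records, each with an "issues" list and a "text" field; this is not Crisis Trends’ actual data format.

```python
# Hypothetical sketch: co-occurring issues and top words for a selected crisis type.
from collections import Counter

records = [
    {"issues": ["suicide", "isolation"], "text": "i feel so alone tonight"},
    {"issues": ["anxiety"], "text": "my heart is racing before the exam"},
    {"issues": ["suicide", "depression"], "text": "nothing feels worth it"},
]

def co_occurring_issues(records, selected: str) -> Counter:
    """Count which other issues appear alongside the selected one."""
    counts = Counter()
    for r in records:
        if selected in r["issues"]:
            counts.update(i for i in r["issues"] if i != selected)
    return counts

def top_words(records, selected: str, n: int = 5) -> list[tuple[str, int]]:
    """Most frequent words in texts tagged with the selected issue."""
    words = Counter()
    for r in records:
        if selected in r["issues"]:
            words.update(r["text"].split())
    return words.most_common(n)

print(co_occurring_issues(records, "suicide"))
print(top_words(records, "suicide"))
```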

As widespread as the numbers indicate mental illness is, the issue casts an even wider shadow when you consider everyone else it affects. One person with mental illness becomes many people dealing with the issue once you count family members, other loved ones, and even the people that person interacts with in a given day.

Data plays an important role in the work being done to improve treatment. One thing to remember, though, is that collaboration matters. Data scientists don’t have the subject-area expertise that mental health experts do, and the reverse is also true: most therapists, for example, don’t have the capacity to develop a machine learning tool. Working together, though, can produce the types of tools that reveal more than either group can on its own.

John Sucich