Machine-learning tech could help doctors predict suicide risk


When a person becomes suicidal, they don’t necessarily rush to tell friends, family, or even their doctor. They might feel ashamed of their thoughts and emotions, and wish instead for them to simply disappear. 

The stigma surrounding suicide is partly why it’s hard to predict and prevent. But doctors also don’t have great tools to diagnose whether someone is suicidal; patients can conceal self-harm and minimize their experiences when completing questionnaires designed to detect suicidal thinking.

This week, however, a group of researchers published a new study that demonstrates how a novel brain imaging technique can identify people who have suicidal thoughts, simply by presenting them with certain key words, asking them to reflect on their meaning, and using machine learning to analyze that brain activity. 


The results of the study, published in the journal Nature Human Behaviour, challenge the common stereotype that suicidal people could change their perspective if they just tried hard enough. In fact, the study suggests that suicidal feelings and thoughts are deeply intertwined with the way the brain processes information. 

“Suicidality isn’t that you can’t cope with life; it’s that you’ve somehow gotten into a pattern of thinking that leads you to consider suicide,” says Marcel Just, a cognitive neuroscientist who is the study’s lead author and a professor of psychology at Carnegie Mellon University. 

Just and his co-authors studied 34 young adults, half of whom had a history of suicidal thinking or past attempts and half of whom didn’t. The participants were placed in a functional magnetic resonance imaging (fMRI) machine, which measures brain activity by monitoring blood flow. The researchers then showed each person 30 words related to suicide and to positive and negative feelings, including “death,” “desperate,” “carefree,” “kindness,” “trouble,” and “worried.” 

To analyze the results, the researchers used machine learning to characterize people’s brain activity patterns; 91 percent of the time, the model correctly determined which participants had a history of suicidal thoughts and which didn’t. It also successfully identified which individuals had previously attempted suicide. 
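To make the approach concrete: the core idea is to hold out each participant in turn, train a classifier on everyone else’s activation patterns, and see whether the held-out person is assigned to the correct group. The sketch below illustrates that leave-one-out logic with synthetic stand-in feature vectors and a simple nearest-centroid rule; the study’s actual fMRI features and classifier differ, so treat this purely as an illustration of the evaluation scheme.

```python
# Illustrative sketch of leave-one-out classification, as used to evaluate
# whether per-participant activation patterns separate two groups.
# The feature vectors are SYNTHETIC stand-ins, not real fMRI data.
import math
import random

random.seed(0)

def make_pattern(center, dim=6, noise=0.5):
    """A synthetic 'activation pattern': a center value plus Gaussian noise."""
    return [center + random.gauss(0.0, noise) for _ in range(dim)]

# 17 participants per group, mirroring the study's group sizes.
ideators = [make_pattern(1.0) for _ in range(17)]    # label 1
controls = [make_pattern(-1.0) for _ in range(17)]   # label 0
data = [(v, 1) for v in ideators] + [(v, 0) for v in controls]

def centroid(vectors):
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Leave-one-out: hold out each participant, fit group centroids on the
# remaining 33, then assign the held-out pattern to the nearest centroid.
correct = 0
for i, (held_out, true_label) in enumerate(data):
    train = data[:i] + data[i + 1:]
    c1 = centroid([v for v, y in train if y == 1])
    c0 = centroid([v for v, y in train if y == 0])
    predicted = 1 if dist(held_out, c1) < dist(held_out, c0) else 0
    correct += (predicted == true_label)

accuracy = correct / len(data)
print(f"leave-one-out accuracy: {accuracy:.2f}")
```

Because each prediction is made for a participant the classifier never saw during training, the accuracy figure estimates how well the learned patterns generalize to new people, which is what makes the reported 91 percent meaningful for a small sample.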

In general, the analysis yielded critical information about which concepts led to the clearest distinctions between the groups. The brains of participants with suicidal thoughts and behavior responded very differently to the words “death,” “cruelty,” “trouble,” “carefree,” “good,” and “praise,” and most of those people showed high levels of self-reported depression that included a negative view of the self, the world, and the future.  

“Our research shows that suicidal ideation is exactly the way you think about things,” Just says. “Something changed the way your brain and mind work.”

Though the study is small, it demonstrates the promise of fMRI used in tandem with machine learning, a novel approach that resolves some of the challenges of relying on imaging to draw conclusions about brain activity. Machine learning makes it possible to detect statistically significant differences between patients and a control group, which has been difficult in the past. 

Just says that if the technique remains successful in larger studies, it could become an important tool in helping doctors assess suicide risk and develop targeted treatments. If a psychologist, for example, had better information about which concepts were altered in a suicidal patient, they could potentially tailor talk therapy or medication to positively change that person’s way of thinking. 

That technique, however, would be just one tool for diagnosis and treatment, as it couldn’t capture the full spectrum of experiences that put people at risk for suicide. 

“You’d want to find out a whole lot more besides this about a person,” Just says. 

While the technique is promising, fMRI machines aren’t easily accessible or cheap. That’s why Just and his colleagues are hoping to use an electroencephalogram (EEG) — a test that detects electrical activity in the brain — to similarly decode how the brain responds to key concepts related to suicide and positive and negative feelings. Just says that many psychiatric clinics have an EEG. 


The study’s results also raise complex questions about new technology that helps reveal what’s happening in our brains as we think. In a dystopian future, you could imagine the tool becoming a way to exclude people with suicidal thoughts or behavior from certain professional and private roles, including military service, political office, or even parenthood.

Just says the technology requires immense focus and participation from the subject, so it couldn’t be forced on people — yet. How people decide to subject their thoughts to examination and whether that information is shared publicly will eventually become the “ultimate privacy question,” adds Just.  

In the meantime, he’s hopeful that the technology, if proven successful, will give patients and their doctors meaningful ways to assess and prevent suicide risk. Just is optimistic about the human ability to influence and shape the brain with the right tools. 

“There’s no question that our brains are malleable,” he says. “They are the most powerful tool that mother nature gives us.” 

If you want to talk to someone or are experiencing suicidal thoughts, text the Crisis Text Line at 741-741 or call the National Suicide Prevention Lifeline at 1-800-273-8255. Here is a list of international resources.
