Machine Learning Algorithm Uses EHR Data to Calculate Suicide Attempt Risk

March 13, 2021

Using information found in electronic health records (EHRs), researchers have created a real-time predictive model of suicide attempt risk that can be used to screen patients in non-psychiatric specialty settings.

“We cannot screen every patient for suicide risk in every encounter -- nor should we,” said Colin G. Walsh, MD, Vanderbilt University Medical Center, Nashville, Tennessee. “But we know some individuals are never screened despite factors that might put them at higher risk. This risk model is a first pass at that screening and might suggest which patients to screen further in settings where suicidality is not often discussed.”

Over 11 consecutive months ending in April 2020, the model's predictions ran silently in the background as adult patients were seen at Vanderbilt University Medical Center. The algorithm, dubbed the Vanderbilt Suicide Attempt and Ideation Likelihood (VSAIL) model, uses routine information from EHRs to calculate a patient's 30-day risk of a return visit for a suicide attempt and, by extension, suicidal ideation.

When adult patients were stratified into 8 groups according to their algorithm-assigned risk scores, the top stratum alone accounted for more than one-third of all suicide attempts documented in the study and approximately half of all cases of suicidal ideation. As documented in the EHR, 1 in 23 individuals in this highest-risk group went on to report suicidal thoughts, and 1 in 271 went on to attempt suicide.
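The stratification described above can be sketched in a few lines of Python. This is a hedged illustration only: the scores below are randomly generated stand-ins, not VSAIL's actual outputs, and the study's real model and features are described in the published paper.

```python
import random

random.seed(0)

# Hypothetical risk scores in [0, 1) for a cohort of 80,000 patients.
# In the study, VSAIL computes such scores from routine EHR variables.
scores = [random.random() for _ in range(80000)]

# Rank patients by score and split them into 8 equal-sized risk strata;
# strata[-1] is the highest-risk octile.
ranked = sorted(range(len(scores)), key=lambda i: scores[i])
stratum_size = len(ranked) // 8
strata = [ranked[k * stratum_size:(k + 1) * stratum_size] for k in range(8)]

# Clinically, outcome rates (attempts, ideation) would then be tallied
# per stratum to see how concentrated risk is in the top group.
print(len(strata), len(strata[-1]))  # 8 strata of 10000 patients each
```

In the study, more than a third of documented suicide attempts fell into that single top octile, which is what makes targeted screening of the highest stratum attractive.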

Over the 11-month test, some 78,000 adult patients were seen in the hospital, emergency room, and surgical clinics at Vanderbilt University Medical Center. As subsequently documented in the EHR, 395 individuals in this group reported having suicidal thoughts and 85 lived through at least 1 suicide attempt, with 23 surviving repeated attempts.

“Here, for every 271 people identified in the highest predicted risk group, 1 returned for treatment for a suicide attempt,” said Dr. Walsh. “This number is on a par with numbers needed to screen for problems like abnormal cholesterol and certain cancers. We might feasibly ask hundreds or even thousands of individuals about suicidal thinking, but we cannot ask the millions who visit our medical center every year -- and not all patients need to be asked. Our results suggest artificial intelligence might help as one step in directing limited clinical resources to where they are most needed.”
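The "number needed to screen" comparison Dr. Walsh makes is simple arithmetic: it is the reciprocal of the event rate in the screened group. A minimal sketch using the rates reported above for the highest-risk stratum:

```python
# Event rates reported for the highest-risk group in the study.
ideation_rate = 1 / 23    # 1 in 23 reported suicidal thoughts
attempt_rate = 1 / 271    # 1 in 271 returned after a suicide attempt

# Number needed to screen (NNS) to find one case is the reciprocal
# of the event rate among those screened.
nns_ideation = 1 / ideation_rate
nns_attempt = 1 / attempt_rate

print(f"Screen ~{nns_ideation:.0f} people to find one with ideation, "
      f"~{nns_attempt:.0f} to find one who attempts suicide")
```

A lower NNS means screening effort is better targeted, which is why concentrating outcomes in one small stratum matters for resource-limited settings.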

The findings are published in JAMA Network Open.


SOURCE: Vanderbilt University Medical Center