A new study from Vanderbilt University Medical Center shows that clinical alerts driven by artificial intelligence (AI) can help doctors identify patients at risk of suicide, potentially improving prevention efforts in everyday medical settings.
A team led by Colin Walsh, M.D., associate professor of biomedical informatics, medicine, and psychiatry, tested whether an AI system called the Vanderbilt Suicide Attempt and Ideation Likelihood Model (VSAIL) could effectively prompt doctors in three VUMC neurology clinics to screen patients for suicide risk during routine clinic visits.
The study, reported in JAMA Network Open, compared two approaches: automated pop-up alerts that interrupt a doctor’s workflow, and a more passive system that simply displays risk information in a patient’s electronic health record.
The study found that interruptive alerts were far more effective: physicians performed suicide risk assessments following 42% of interruptive alerts, compared with just 4% of passive alerts.
“Most people who die by suicide have seen a health care provider in the year before their death, and the reasons are often unrelated to mental health,” Walsh said. “However, universal screening cannot be practiced in every setting. We developed VSAIL to identify high-risk patients and encourage focused screening conversations.”
Suicide rates have been rising in the United States for a generation; an estimated 14.2 per 100,000 Americans die by suicide each year, making it the nation's 11th leading cause of death. Research shows that 77% of people who die by suicide had contact with a primary care provider in the year before their death.
Calls for improved risk screening have led researchers to look for ways to identify the patients most in need of evaluation. The VSAIL model, developed by Walsh’s team at Vanderbilt, analyzes routine information from the electronic health record to estimate a patient’s risk of a suicide attempt within 30 days. In earlier prospective testing, in which VUMC patient records were flagged but no alerts were generated, the model proved effective at identifying high-risk patients: one in 23 people it flagged went on to report suicidal thoughts.
In the new study, when patients identified as high risk by VSAIL came to Vanderbilt’s neurology clinics for appointments, their physicians were randomly shown either an interruptive or a non-interruptive alert. The study focused on neurology clinics because certain neurological conditions are associated with increased suicide risk.
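To make that workflow concrete, here is a minimal sketch of how a risk-model-driven alert of this kind could be wired into a visit. It is illustrative only: the article does not describe VSAIL’s features, scoring method, or cutoff, so the function name `estimate_30_day_risk`, the feature names, the threshold value, and the alert labels are all hypothetical assumptions.

```python
import random
from typing import Optional

# Hypothetical illustration only: VSAIL's actual features, model, and
# threshold are not described in the article. Everything below is invented.

RISK_THRESHOLD = 0.7  # assumed cutoff; in practice it would be tuned so
                      # only a small share of visits (about 8% in the study)
                      # triggers an alert


def estimate_30_day_risk(ehr_features: dict) -> float:
    """Stand-in for a trained model scoring routine EHR data (0 to 1)."""
    # A real model would be trained on historical records; this placeholder
    # just combines a few made-up indicators into a score.
    score = 0.0
    score += 0.4 if ehr_features.get("prior_self_harm_code") else 0.0
    score += 0.3 if ehr_features.get("recent_psychiatric_dx") else 0.0
    score += 0.2 if ehr_features.get("neurologic_condition") else 0.0
    return min(score, 1.0)


def alert_for_visit(ehr_features: dict) -> Optional[str]:
    """Return the alert type to show the clinician, or None if not flagged."""
    if estimate_30_day_risk(ehr_features) < RISK_THRESHOLD:
        return None  # most visits: no alert at all
    # Flagged visits are randomized between the two designs compared in the
    # study: a pop-up that interrupts workflow vs. passive text in the chart.
    return random.choice(["interruptive_popup", "passive_chart_note"])


if __name__ == "__main__":
    visit = {"prior_self_harm_code": True, "recent_psychiatric_dx": True}
    print(alert_for_visit(visit))  # e.g. "interruptive_popup"
```

The key design point the study tested is in the last step: once the model flags a visit, the choice between an interruptive pop-up and a passive chart note is what was randomized and compared.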
The researchers suggested that similar systems could be tested in other medical settings.
“The automated system flagged only about 8% of all patient visits for screening,” Walsh said. “This selective approach makes it more feasible for busy clinics to implement suicide prevention efforts.”
The study included 7,732 patient visits over a six-month period, generating a total of 596 screening alerts. During the 30-day follow-up period, a review of VUMC health records found that no patients in either alert group experienced an episode of suicidal ideation or a suicide attempt. Although interruptive alerts were more effective at prompting screening, they can contribute to “alert fatigue,” in which physicians become overwhelmed by frequent automated notifications. The researchers noted that future studies should examine this concern.
“Health care systems need to balance the effectiveness of interruptive alerts with their potential downsides,” Walsh said. “But these results suggest that automated risk detection, combined with well-designed alerts, may help identify more patients in need of suicide prevention services.”
More information: Risk Model-Based Clinical Decision Support for Suicide Screening, JAMA Network Open (2025). DOI: 10.1001/jamanetworkopen.2024.52371
Provided by Vanderbilt University Medical Center
Citation: AI systems help doctors identify patients at risk of suicide (January 3, 2025). Retrieved January 3, 2025 from https://medicalxpress.com/news/2025-01-ai-doctors-patients-suicide.html
This document is subject to copyright. No part may be reproduced without written permission, except in fair dealing for personal study or research purposes. Content is provided for informational purposes only.