Clinicians rattling around at their hospital workstations know what those 1s and 0s humming in the background are doing, right?
In fact, doctors and health systems are often unaware of important details about the algorithms they rely on for purposes such as predicting the onset of dangerous medical conditions. But in what advocates call a step forward, federal regulators are now requiring electronic health record (EHR) companies to disclose extensive information to customers about the artificial intelligence tools within their software.
Since early January, clinicians have been able to find out what variables are included in a tool's predictions, whether the tool has been tested in the real world, and what its developers have done to address potential bias. They can view this information in a model card, sometimes called a "nutrition label," which also includes warnings about improper use.