Improving AI-enabled Healthcare in the US

During an online demonstration of the solution, a data scientist said that a toolkit capable of reducing algorithmic bias in Artificial Intelligence (AI) tools for the health industry would mean better care for everyone. The unique risk of algorithmic bias is that it allows biases to be automated systematically and repeatably, affecting people on a previously impossible scale. Designers may build their own assumptions into the technology, so those assumptions should be tested to ensure they are not automating harm.

“Called Diagnosing Bias, the toolkit contains resources to help government health care procurement officers incorporate best practices for algorithmic accountability. The toolkit’s two main elements are a procurement template generator tool with AI contract riders and the AI Model Checklist.” – Matthew Zhou, Tech Policy Fellow, Aspen Institute

The template generator provides procurement officers with a ready-made template for writing health care AI contracts that includes clauses addressing transparency, bias mitigation, security and privacy. This is not unlike the request-for-proposal templates common for many other purchases, such as property. The goal is to make these templates open source and freely available to procurement officers.

The second tool is the checklist, which provides a set of guiding questions and transparency artefacts that procurement officers can solicit from health care AI companies at each stage of the AI design process. Health care AI has the potential to vastly improve medicine – once bias is minimised. An article in the academic journal Science stated that health care algorithms overlooked 28% of Black patients compared with white patients with the same disease.

The National Institute for Health Care Management Foundation offers examples: The American Heart Association’s Heart Failure Risk Score assigns three additional points to patients identified as “nonblack,” which may raise the bar for hospital admission for Black patients. The STONE score, which predicts the likelihood of kidney stones in patients who arrive at the emergency room with flank pain, adds three points for “nonblack” patients, leading clinicians away from diagnosing the condition.
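To make the mechanism concrete, the sketch below shows in Python how a fixed race-based point adjustment like the one described above can push otherwise identical patients to opposite sides of a decision threshold. The function name, base score and admission cutoff are hypothetical illustrations, not the actual AHA scoring rules.

```python
# Illustrative sketch only: a fixed race-based point adjustment, like the
# "three additional points for nonblack patients" described above, can shift
# a clinical risk score across a decision threshold.
# The base score and threshold values here are hypothetical.

def heart_failure_risk_score(base_points: int, identified_nonblack: bool) -> int:
    """Return a toy risk score; the +3 adjustment mirrors the article's example."""
    adjustment = 3 if identified_nonblack else 0
    return base_points + adjustment

ADMISSION_THRESHOLD = 20  # hypothetical cutoff for recommending admission

for label, nonblack in [("nonblack patient", True), ("Black patient", False)]:
    score = heart_failure_risk_score(base_points=18, identified_nonblack=nonblack)
    admit = score >= ADMISSION_THRESHOLD
    print(f"{label}: score={score}, admit={admit}")

# With identical clinical inputs, only the race-adjusted patient crosses the
# threshold, illustrating how such an adjustment can raise the bar for
# hospital admission for Black patients.
```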

As AI is among the most important technological developments in the healthcare sector, numerous startups are developing AI-driven imaging and diagnostic solutions, which is driving the growth of the market. The US is emerging as a popular hub for healthcare innovations, and several start-ups have appeared in the last few years to automate the analysis of medical images. The COVID-19 outbreak has significantly promoted the adoption of remote health check-ups using digital tools, delivering clinical services to patients at a distance rather than in person.

A significant increase in the number of artificial intelligence startups in the healthcare sector is anticipated to positively influence North American market growth during the forecast period, and broader adoption of AI across the healthcare industry is expected to further fuel demand in the region.

As reported by OpenGov Asia, to help clinicians avoid treatments that may contribute to a patient’s death, researchers at MIT have developed a machine learning model that could be used to identify treatments that pose a higher risk than other options. Their model can also warn doctors when a septic patient is approaching a medical dead end — the point when the patient will most likely die no matter what treatment is used — so that they can intervene before it is too late.

When applied to a dataset of sepsis patients in a hospital intensive care unit, the researchers’ model showed that about 12% of the treatments given to patients who died were detrimental. The study also shows that about 3% of patients who did not survive had entered a medical dead end 48 hours before death. – OpenGov Asia
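The descriptions above suggest the core idea is to estimate, for each candidate treatment, whether the patient can still recover, and to warn when no option looks viable. The Python sketch below is a minimal conceptual illustration of that flagging logic, not MIT’s actual model; the probability estimates, margin and threshold values are all hypothetical.

```python
# A minimal conceptual sketch, assuming we already have (hypothetical)
# per-treatment estimates of the probability that a septic patient can
# still recover after receiving that treatment in the current state.

from typing import Dict

RISK_MARGIN = 0.15         # hypothetical: how much worse than the best option before flagging
DEAD_END_THRESHOLD = 0.05  # hypothetical: warn if even the best option offers <5% recovery chance

def flag_risky_treatments(recovery_prob: Dict[str, float]) -> Dict[str, bool]:
    """Flag treatments whose estimated recovery probability trails the best option."""
    best = max(recovery_prob.values())
    return {treatment: (best - p) > RISK_MARGIN for treatment, p in recovery_prob.items()}

def approaching_dead_end(recovery_prob: Dict[str, float]) -> bool:
    """Warn when no available treatment offers a meaningful chance of recovery."""
    return max(recovery_prob.values()) < DEAD_END_THRESHOLD

# Hypothetical estimates for one patient state
estimates = {"vasopressor_A": 0.40, "fluid_bolus": 0.38, "vasopressor_B": 0.10}
print(flag_risky_treatments(estimates))   # vasopressor_B flagged as higher-risk
print(approaching_dead_end(estimates))    # False: at least one option still looks viable
```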
