The Problem with an AI-Based Overdose Risk Score for Pain Medications

Being evaluated and judged in a wide variety of life situations is nothing new. School grades, credit scores for loans, and Tesla's Safety Score for drivers are familiar and newer examples of how we summarize complex relationships into simple numbers that grant access to further education, to loans, or to software features.

What is new, however, is that such assessments are spreading into other domains, are calculated by artificial intelligence, and can produce unexpected side effects. For example, software that assesses criminals' likelihood of recidivism drew the attention of activists when it became apparent that COMPAS, as the software is called, discriminated against minorities in particular: it measured the poverty the defendants came from more than their actual likelihood of reoffending. The consequences for those affected were dramatic: judges who relied on the software at sentencing handed these offenders longer prison sentences.

Example of NarxCare’s overdose risk score.

Now another such risk score has come under fire, and this one actually began with the best of intentions. In the U.S., certain painkillers had been heavily promoted as non-addictive by the pharmaceutical company behind them and generously prescribed by doctors; as it later turned out, the company had suppressed its own research findings on the drugs' addictive potential. The result was an epidemic of opioid addiction so severe that the U.S. now speaks of an "opioid crisis."

The company Appriss stepped in, tapping extensive patient and physician databases and using AI to calculate an Overdose Risk Score that indicates how high a patient's risk is of overdosing on prescribed pain medication. While Appriss' NarxCare database was initially intended only as an additional decision aid for prescribing physicians, it became increasingly important over time. Its use is now mandated in all but one U.S. state, and with media and political attention focused on the opioid crisis, physicians feel growing pressure to weigh the score heavily. Lawsuits against physicians who ignored the score are on the horizon, which pushes them to adopt it uncritically.

Several studies have shown that the Overdose Risk Score does not live up to its promise. The underlying data set is too small to be meaningful and contains a surplus of male patients; it fails to account, for example, for cases of sexual abuse in female patients, which typically lead to increased prescriptions of antidepressants; and factors with little obvious relation to overdose risk flow into the calculation. One of these has to do with so-called "doctor shopping": some patients try to game the system by not getting pain medication prescribed by just one doctor, but by visiting multiple doctors, sometimes even in other states, and obtaining prescriptions from all of them.

Detecting this, however, introduces literally painful errors into the score. Patients whose condition worsens, or who undergo surgery after an accident or another illness, suddenly have a number of additional doctors, the specialists, in their medical history, and each one counts against them under the "doctor shopping" category. Even veterinarians are counted. Wired magazine, for example, reports on a patient who adopted two elderly, sick dogs from a shelter and was given painkiller prescriptions for their treatment by the veterinarian. Because the veterinarian writes the prescription in the pet owner's name, not the dog's, both the veterinarian and the prescribed painkillers show up in her score.
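To make this failure mode concrete, here is a deliberately simplified, hypothetical sketch of how a "doctor shopping" feature might feed into such a score. NarxCare's actual algorithm is proprietary; the feature, the weights, and the names below are invented purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Prescription:
    prescriber: str  # may include a veterinarian prescribing in the owner's name
    pharmacy: str
    drug: str

def naive_risk_score(history: list[Prescription]) -> int:
    """Toy 'doctor shopping' feature: count distinct prescribers and
    pharmacies. This is NOT NarxCare's formula, only an illustration of
    the failure mode described above."""
    prescribers = {p.prescriber for p in history}
    pharmacies = {p.pharmacy for p in history}
    # A naive model treats every additional prescriber as evidence of
    # doctor shopping. It cannot tell an abuser from an accident victim
    # who legitimately sees several specialists.
    return 10 * len(prescribers) + 5 * len(pharmacies)

# A patient after a serious accident: surgeon, pain clinic, rehab doctor,
# plus a veterinarian whose prescription for her sick dog is filed under
# her own name.
history = [
    Prescription("Dr. Surgeon", "Main St Pharmacy", "oxycodone"),
    Prescription("Dr. PainClinic", "Main St Pharmacy", "oxycodone"),
    Prescription("Dr. Rehab", "Hospital Pharmacy", "tramadol"),
    Prescription("Veterinarian", "Main St Pharmacy", "tramadol"),
]
print(naive_risk_score(history))  # 50: four distinct prescribers inflate the score
```

The point of the sketch: a simple count of distinct prescribers rises with every legitimate specialist, and with the veterinarian, exactly as it would for an actual doctor shopper.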


The result can have serious consequences: doctors and pharmacists begin to suspect that they are dealing less with a patient and more with a drug addict. Their behavior toward the patient changes drastically, from genuinely caring to reproachful. The score now means that patients plagued by pain no longer receive pain medication. Some doctors even refused to continue treating patients they had cared for for years, and simply dropped them.

Analyses showed that while painkiller prescriptions fell by 10 percent after the introduction of the Overdose Risk Score, suicides and overdoses increased by one-third. Desperate patients tried to obtain painkillers from other sources and, without medical supervision, overdosed on drugs of dubious origin. Others could no longer stand the pain and saw no way out but to kill themselves.

Thus, what began as a good idea with the best of intentions, a tool to give physicians an overdose risk assessment, has been transformed by the current attention to the opioid crisis and by its adoption in nearly all states as a recommended, though not officially Food and Drug Administration (FDA) approved, database into an AI tool that physicians can ignore only at their own risk.

As in other cases, data bias and a poorly chosen composition of the factors that feed an AI-based score have serious consequences for the lives of those affected. With AI spreading into ever more aspects of our lives, the discussion about its proper use is becoming urgent.

A longer article appeared in Wired magazine in October 2021.
