Not all are equal before the algorithm

The corona vaccine is here; will everything be fine now? Even if the demonstrations of self-styled “lateral thinkers” suggest otherwise, for most people in this country the word of the Standing Committee on Vaccination is what counts: benevolent demigods in white who decide who may, or at least should, receive an injection, and when.

The USA is taking a different path. To ensure that every resident willing to be vaccinated receives the necessary dose, the US Department of Health and Human Services has entered into a partnership with the data-mining company Palantir. The company is developing a software platform that decides on the distribution of resources and on logistics. The system is supposed to be “simple, fair and just,” according to the project manager. How exactly it works is not known.

Almost any other company would probably have been a better fit. Palantir does not exactly have the best track record on social justice. Among other things, the company became known for helping US immigration authorities track down undocumented immigrants using legally questionable methods. The fear now is that the use of opaque algorithms could further entrench existing injustices in the health system.

Algorithms favor white patients over non-white patients

This skepticism is not merely theoretical. In a study published last year in the journal Science, the authors examined ten algorithms used in the US healthcare system. All of them favored white patients over patients of other skin colors. This is due not only to the data the systems are fed with, but also to wrong assumptions by the programmers and flawed modeling of the world. The developers assumed that people who spend more money on health care are automatically sicker and therefore need more support. In fact, the higher-spending group may simply spend more because it has more wealth, and is therefore more likely to be white. As a result, the algorithm allocates less additional care to Black patients than to white patients with the same illness.
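
The mechanism is easy to reproduce. The following is a minimal sketch of this proxy-label problem, with invented numbers and field names rather than the study's actual model or data: a “need” score derived from past spending ranks a poorer patient as less sick than an equally ill, wealthier one.

```python
# Minimal sketch of the proxy-label problem described above (illustrative
# numbers, not the study's actual model or data): a "need" score is based
# on past healthcare *spending*, so patients who spend less, for whatever
# reason, are ranked as less sick even when their illness is identical.

from dataclasses import dataclass

@dataclass
class Patient:
    name: str
    chronic_conditions: int   # direct measure of illness
    annual_spending: float    # proxy the algorithm is trained on

def need_score_via_spending(p: Patient) -> float:
    """The flawed design: spending stands in for sickness."""
    return p.annual_spending / 1000.0

def need_score_via_health(p: Patient) -> float:
    """A direct measure of illness, independent of wealth."""
    return float(p.chronic_conditions)

# Two equally sick patients; one has less money to spend on care.
a = Patient("patient_a", chronic_conditions=4, annual_spending=8000.0)
b = Patient("patient_b", chronic_conditions=4, annual_spending=3000.0)

for patient in (a, b):
    print(patient.name,
          "spending-based score:", need_score_via_spending(patient),
          "| health-based score:", need_score_via_health(patient))

# The spending-based score ranks patient_a far above patient_b,
# although both have the same number of chronic conditions.
```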

One of the main attractions of algorithms is that they allow those in power to black-box unpopular outcomes for which they would otherwise have to answer themselves, as tech critic Roger McNamee recently noted on Twitter. A system is called a black box when the decision-making processes inside it cannot be understood by outsiders. The supposedly neutral mathematics thus serves not only automated decision-making, but also the concealment of responsibility.

Stanford algorithm ignores doctors

In recent years there have been repeated reports of people being disadvantaged by an algorithm's decisions because of their origin, skin color, or sociodemographic status. Now they are expected to entrust their lives to an opaque formula, which could intensify the already rampant skepticism toward science. At Stanford University Hospital, too, an algorithm decided who would get the vaccine quickly and who would have to wait. Medical ethicists and epidemiologists worked on the program for weeks. Age and the frequency of Covid infections in an employee's field of activity played a role in the allocation.
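
Stanford never published the exact formula, but the reported failure mode can be illustrated. The sketch below is hypothetical throughout: the weights, the prevalence table, and the function names are assumptions, not Stanford's implementation. It shows how a score built from an age term and a department-level exposure term can deprioritize residents, who rotate between wards and therefore have no fixed department on record.

```python
# Hypothetical sketch of a rule-based vaccine-priority score. Weights,
# field names and the prevalence table are invented; Stanford's actual
# formula was not made public. The point: residents rotate through wards,
# have no fixed department on record, and so score zero on the exposure
# term, while the age term favors older, less exposed staff.

from typing import Optional

COVID_PREVALENCE_BY_DEPT = {   # assumed per-department infection rates
    "icu": 0.30,
    "emergency": 0.25,
    "administration": 0.02,
}

def priority_score(age: int, department: Optional[str]) -> float:
    """Toy score: half age term, half exposure term (assumed weights)."""
    age_term = age / 100.0     # older staff score higher
    exposure = COVID_PREVALENCE_BY_DEPT.get(department, 0.0)
    return 0.5 * age_term + 0.5 * exposure

# A 28-year-old resident rotating through Covid wards has no department
# on record and loses out to a 60-year-old administrator.
print("resident:     ", priority_score(28, None))              # 0.14
print("administrator:", priority_score(60, "administration"))  # 0.31
```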

Ironically, the doctors who were most frequently deployed in corona intensive care, and who thus had the greatest exposure to infected patients, were passed over by the supposedly fair and impartial program. Only seven out of a total of 1,300 were allotted one of the first 5,000 vaccine doses. A large share of the vaccine went instead to administrative employees and to medical professionals who hold their consultation hours mainly via Zoom. It was hardly surprising, then, that the doctors turned up at a press event, at which the vaccinations were to be presented to the public, carrying protest signs and chanting angry slogans like “F.ck the Algorithm!”
