Why We Need Accountable Algorithms

AI and machine learning algorithms are marketed as unbiased, objective tools. They are not. They are opaque mechanisms of bureaucracy and decision making in which old-fashioned racist, sexist, and classist biases hide behind sophisticated technology, usually without any system of appeal. As their influence in society grows, we face a choice: do we ignore their pernicious effects, or do we understand, regulate, and control the biases they exert? If we want them to deliver fairness, freedom, and consistency transparently and efficiently, we must find a way to hold them accountable.

What is an algorithm? For my purposes I simply mean a system trained on historical data and optimized to some definition of success. We even use informal algorithms, defined this way, in our own heads. The dinners I make for my family every day require the data of the ingredients in my kitchen and the amount of time I have to cook. The way I assess whether a meal was “successful” is to see, afterwards, whether my kids ate their vegetables. Note that I curate the data: I don’t include certain foods, like ramen noodles or sprinkles, in my ingredients list. I also have a different definition of success than my kids would. Over time, the succession of meals optimized to my definition of success diverges wildly from the succession my kids’ definition would have produced. Those are two obvious ways I have inserted my agenda into my algorithm, and indeed any algorithm builder does the same: they curate the data, and they define success and, likewise, the cost of failure.
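To make the point concrete, here is a minimal Python sketch of the dinner example. Everything in it is invented for illustration: the meals, their attributes, and the two success metrics. The same optimizer, given my curated data and my metric versus my kids’ uncurated data and their metric, picks very different dinners.

```python
# A toy version of the "dinner algorithm" above: one optimizer, run
# twice with different data curation and different success metrics.
# All names and numbers are made up for illustration.

# Historical "data": candidate meals and their attributes.
MEALS = [
    {"name": "stir-fry",   "veggies": 3, "sugar": 0, "prep_min": 30},
    {"name": "mac+cheese", "veggies": 0, "sugar": 1, "prep_min": 20},
    {"name": "ramen",      "veggies": 0, "sugar": 0, "prep_min": 5},
    {"name": "tacos",      "veggies": 2, "sugar": 0, "prep_min": 25},
    {"name": "pancakes",   "veggies": 0, "sugar": 3, "prep_min": 15},
]

# Step 1: curate the data. Ramen and anything sugary are quietly
# dropped before the "algorithm" ever sees them.
def parent_curation(meals):
    return [m for m in meals if m["name"] != "ramen" and m["sugar"] == 0]

# Step 2: define success. My metric rewards vegetables; the kids'
# metric rewards sugar.
def parent_success(meal):
    return meal["veggies"]

def kid_success(meal):
    return meal["sugar"]

def pick_dinner(meals, curate, success, time_budget=30):
    candidates = [m for m in curate(meals) if m["prep_min"] <= time_budget]
    return max(candidates, key=success)["name"]

print(pick_dinner(MEALS, parent_curation, parent_success))  # stir-fry
print(pick_dinner(MEALS, lambda m: m, kid_success))         # pancakes
```

Nothing in the optimizer itself is biased; the agenda lives entirely in the curation step and the success function.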

In general, people are intimidated by algorithms and don’t question them the way they should. Thousands of teachers have been told “it’s math, you wouldn’t understand it” when they ask about the statistical value-added models administrators use to evaluate them, even though their tenure or job status depends on the results. Criminal defendants likewise have no way to understand or protest the recidivism risk scores courts use to judge whether a defendant’s profile matches someone who can be expected to return to prison after release, even though a higher score can mean a longer sentence. The people targeted by these algorithms, usually in the form of scoring systems, have very little power and typically no recourse to interrogate their scores.

Algorithms don’t make things fair. They embed historical practices and patterns. When the medical school at St. George’s Hospital in London automated its application screening, the results turned out to be both sexist and xenophobic. That surprised the staff, who had expected that a computer couldn’t be discriminatory. But it happened, of course, because the historical data used to train the algorithm was itself sexist and xenophobic. The algorithm simply picked up on the pattern and propagated it.
     — Cathy O'Neil
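Below is a small sketch of the failure mode described in the last paragraph, with synthetic data and a deliberately simple scoring rule. None of this is the actual St. George’s system; the group labels, scores, and per-group admit-rate “model” are assumptions chosen only to expose the mechanism: a rule fit faithfully to biased historical decisions reproduces the bias.

```python
# Train a scoring rule on historical admissions decisions that were
# themselves biased, and the rule reproduces the bias. Synthetic data;
# the "model" is just per-cell admit rates.

from collections import defaultdict

# Historical records: (qualification score 0-10, group, admitted?).
# The past committee rejected group "B" applicants regardless of merit.
HISTORY = [
    (9, "A", True), (8, "A", True), (7, "A", True), (4, "A", False),
    (9, "B", False), (8, "B", False), (7, "B", True), (4, "B", False),
]

def bucket(qual):
    return "high" if qual >= 7 else "low"

# "Training": count admits and totals per (group, qualification bucket).
counts = defaultdict(lambda: [0, 0])
for qual, group, admitted in HISTORY:
    cell = (group, bucket(qual))
    counts[cell][1] += 1
    counts[cell][0] += admitted

def predicted_admit_rate(qual, group):
    admits, total = counts[(group, bucket(qual))]
    return admits / total

# Two equally qualified applicants get very different scores, because
# the model faithfully learned the historical pattern.
print(predicted_admit_rate(9, "A"))  # 1.0
print(predicted_admit_rate(9, "B"))  # ~0.33
```

The model never sees a rule that says “penalize group B”; it only sees past outcomes. That is exactly why training on historical decisions, rather than on some ground truth of merit, launders the old bias into a new, authoritative-looking score.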
