Remember Midas? Professor Stuart Russell did in his first BBC Reith Lecture about Artificial Intelligence last week. Midas had asked the gods that everything he touched be turned to gold. So it was, leaving Midas unable to eat or drink. Then there was the sorcerer’s apprentice. The apprentice asked the brooms to help him bring water but didn’t specify any limits. The water kept coming until he was drowning in it.
Russell’s point is that AI is like this. The issue isn’t its power. What matters most is the objectives we set for it. Wherever we set narrow objectives and ignore wider consequences, we destroy balance. The threat is with us now, and it isn’t really posed by AI. The true enemy is a focus by companies on maximising, not optimising.
‘(The) social media content-selection algorithms, that choose items for your newsfeed or the next video to watch aren’t particularly intelligent, but they have more power over people’s cognitive intake than any dictator in history. The designers thought, perhaps, that the algorithm would learn to send items that the user likes, but the algorithm had other ideas. Like any rational entity, it learns how to modify the state of its environment, in this case the user’s mind, in order to maximise its own reward, by making the user more predictable. A more predictable human can be fed items that they are more likely to click on, thereby generating more revenue. Users with more extreme preferences seem to be more predictable. And now we see the consequences of growing extremism all over the world’.
Companies can grow profits by modifying the state of the environment around them, including the minds of their customers and consumers. This may not, initially, be the result of a positive company decision: it may be the unintended consequence as digital marketing works through different platforms.
This makes the job of a director much harder. Are you a director of a company that is involved, directly or indirectly, in altering people’s minds to their detriment? How do you define detriment or monitor for it?
And what about addictive behaviours? According to Public Health England, there are more than 400 suicides a year associated with problem gambling, and one estimate suggests the figure may be as high as 600.
In a recent American court judgement, a Federal Court in Ohio has ruled that Walgreens, CVS and Walmart are responsible for ‘creating a public nuisance’ because they ‘helped create an oversupply of addictive opioid pills’.
The cost to the US taxpayer of opioid addiction was already estimated at $78.5 billion back in 2013. By 2019, 70,000 people were dying annually from drug overdoses, and more than half of these deaths were related to opioids. One 2019 estimate put the cumulative financial burden at $1.5 trillion by 2020.
The drug manufacturers are accused of aggressively marketing their products long after they became aware of the evidence of harm. The pharmacies and retailers who distributed the drugs, it is alleged, were aware of growing misuse of these drugs and failed to detect, stop, or report suspicious orders.
A traditional view of the role of a pharmacy company might be that, provided that the doctor has issued a prescription, the pharmacy merely complies with the request. Since there is a whole architecture of professional medicine, law and regulation around the prescribing of drugs, it is not for the pharmacy company to question the wisdom of the prescribing or its wider consequences when abuse emerges.
That architecture clearly failed in the USA. The opioid crisis represents yet one more rebuke for the disciples of Milton Friedman and for all those who hold a compartmentalised view of ethics – a view that used to say, crudely, ‘It’s up to regulators to stop abuse; my job is to make money’.
That primitive defence of maximising is largely discredited now. Unthinking statements such as those by our Prime Minister – when he attributed progress in developing new vaccines to ‘greed, my friends’ – sound anachronistic. In the UK, most directors now understand that Section 172 of the UK’s Companies Act is not an invitation to maximise short-term shareholder value regardless of consequences. In general, major businesses have accepted that their focus must be on optimising, not maximising. That makes decisions much harder: there are many different impacts to consider.
If we accept that a director must think through the systemic consequences of the company’s actions, where is the line to be drawn?
Here are some questions for a board to ponder.
What do we know about the algorithms which are used in our products and processes? Do we regularly catalogue them and monitor their impacts?
Do we deploy technologies that change customer behaviour to make it more predictable?
What ethical guidance do we give to those – outside as well as inside the company – who develop these algorithms?
Do our sales depend on products or services which have addictive qualities? How much addiction is acceptable to us?
How do we know that the algorithms used in our HR and recruitment processes are fair?
What do we do to inform ourselves about ways in which our products may be abused?
What would it take for us to exit altogether from a lucrative market which we concluded was altering people’s minds in dangerous ways?
I would love to hear from directors who have thought about these questions and have answers to them.