Thursday, July 16, 2020

'The Algorithm Made Me Do It': How Racist-Tech left an African-American man sleeping in a filthy cell

We've heard of Fintech, maybe even Regtech, but have we heard of Racist-Tech?

In the past few weeks, the US has seen the largest protests in its history. I am not referring to the protests where armed protestors showed up at state capitols without much reaction. Rather, these are the protests in response to the death of George Floyd, who died after a police officer knelt on his neck, hands in his pockets, for eight minutes and forty-six seconds. These protests, in contrast, have been met with a strong reaction.

A related incident occurred a few months before Mr. Floyd lost his life.

As reported by NPR, Robert Julian-Borchak Williams was picked up by police in January 2020. When he got to the station, he was surprised by the lack of resemblance between himself and the pictures of the suspect.

The officer's response? "So I guess the computer got it wrong, too." 

Regardless, "Williams was detained for 30 hours and then released on bail until a court hearing on the case, his lawyers say."

(For more on the story, check out this video)

The story is chilling, to say the least. The knee-jerk reaction is to think of Skynet and dark AI. But is that really what's happening here?

The social unrest speaks to how the desegregation struggles of the 1960s have not fully succeeded. The challenge is that racism is systemic: within the institutions that hold society together, the gothic systems of the 1950s somehow persist to this day. Sure, it's illegal for prosecutors, judges and cops to be racist. But then how do we explain the treatment of George Floyd and Robert Williams? Is there no overall monitoring provision to ensure that the desired equality is achieved? Good monitoring controls over a system would, for example, assess the outcomes to see whether the desired results are actually being achieved.

There was a case that tested exactly this idea. In McCleskey v. Kemp, the defence team presented Dr. Baldus's study, which showed statistically that defendants charged with killing white victims were 4.3 times more likely to receive the death penalty. (And yes, it controlled for 35 non-race variables.) The "big data" analysis was rejected, and Warren McCleskey was put to death by the state.

In other words, the data analysis showed there actually is a problem. The courts, however, essentially denied this reality and pretended everything is okay.
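For readers wondering what "controlled for 35 non-race variables" means in practice: studies like Baldus's typically fit a logistic regression that includes the race variable alongside the controls, and report the exponentiated coefficient as an odds ratio. The sketch below is purely illustrative, with fabricated data and two stand-in controls instead of 35, just to show the mechanic behind a "4.3 times more likely" figure.

```python
# Illustrative only: fabricated data, not the Baldus dataset.
# Shows how an odds ratio like "4.3 times more likely" is read off
# a logistic regression after holding control variables constant.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000

race = rng.integers(0, 2, n)        # the variable of interest (0/1)
control_1 = rng.normal(size=n)      # stand-ins for the 35 controls
control_2 = rng.normal(size=n)      # (aggravating factors, etc.)

# Simulate an outcome where the race variable carries a log-odds
# effect of 1.46, i.e. an odds ratio of about exp(1.46) ~= 4.3.
log_odds = -3 + 1.46 * race + 0.5 * control_1 + 0.5 * control_2
outcome = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

X = sm.add_constant(np.column_stack([race, control_1, control_2]))
fit = sm.Logit(outcome, X).fit(disp=False)

# exp(race coefficient): the odds ratio *after* controlling for the
# other variables -- this should land close to 4.3.
print(np.exp(fit.params[1]))
```

The point of the controls is that the disparity cannot be explained away by non-race factors; it survives after holding them constant.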

What does this have to do with Racist Tech?

It means that the systems and the data are biased, and Racist-Tech will naturally grow out of them. AI and predictive policing models that use data from the court system - also pretending everything is okay - will inevitably lead to people like Mr. Williams getting caught up in the criminal justice system. Unlike George Floyd, he "only" had to spend 30 hours in a filthy cell. But during that time, he would have had no idea whether it was going to be 30 hours or 30 months, given how long it takes to exonerate the innocent.
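To make that mechanism concrete, here is a minimal, hypothetical sketch (entirely synthetic data, not any real policing dataset) of how a model trained on enforcement records can "learn" a racial disparity that does not exist in the underlying behaviour:

```python
# Hypothetical sketch: biased data collection produces a biased model.
# All numbers are synthetic and chosen purely for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 10_000

group = rng.integers(0, 2, n)       # 0 = majority, 1 = minority
offended = rng.random(n) < 0.05     # identical 5% base rate for everyone

# Biased enforcement: the minority group is policed more heavily, so
# its offences are recorded far more often. The labels the model sees
# reflect enforcement intensity, not behaviour.
detection_rate = np.where(group == 1, 0.9, 0.3)
recorded = (offended & (rng.random(n) < detection_rate)).astype(int)

# Train on the biased labels, using group membership (or any proxy
# for it, such as neighbourhood) as a feature.
model = LogisticRegression().fit(group.reshape(-1, 1), recorded)

# The model now scores the minority group as roughly three times
# riskier, even though offending rates were identical by construction.
print(model.predict_proba([[0], [1]])[:, 1])
```

The model is doing exactly what it was asked to do; the bias was baked in before a single line of code ran. That is why "the algorithm made me do it" is no defence.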

I was once asked at a conference whether we can look forward to a future where AI takes over. My response was to point out that the real issue is with the humans who run the technology. If I had to answer that question today, I would simply ask them to call Mr. Williams, who knows that the nightmare scenario is already here.

Author: Malik Datardina, CPA, CA, CISA. Malik works at Auvenir as a GRC Strategist, working to transform the engagement experience for accounting firms and their clients. The opinions expressed here do not necessarily represent UWCISA, UW, Auvenir (or its affiliates), CPA Canada or anyone else.


