Abolition Algorithms
Dispatch

No Humans Involved

Yeshimabeit Milner

September 13, 2022

Why I am not speaking at the Global AI Summit



This morning (5:30 pm Riyadh time), I am scheduled to speak at the Saudi Global AI Summit, the exclusive gathering of AI leaders from around the world. My panel is entitled “AI Through the Lens of Equality,” where I will be joined, perhaps not coincidentally, by all-women co-panelists.

I initially agreed to participate months ago because I saw the event as an opportunity to speak honestly and candidly to an audience of more than 3,000 researchers, scientists, policymakers, and world leaders about algorithmic violence and D4BL’s work.

We also wanted to gather information on the ground to better grasp the geopolitical implications of Big Tech’s role in the Middle East. With Google opening two new headquarters in Saudi Arabia and the US Military’s announcement of the construction of a new Pentagon testing site in the Saudi desert, I wanted to investigate the implications of these developments on our ongoing work to end data weapons at home.

But tonight (this morning in Riyadh), I will not be speaking as planned. Nor will I be in attendance. On August 26, 2022, news broke that Salma al Shehab, a Saudi citizen and current Ph.D. student at Leeds University, was sentenced to 34 years in prison for speaking out on Twitter (which is co-owned by the Kingdom) about the treatment of women. When a coalition of activists led by SumofUs called for a boycott of the summit last week, we knew that the best way for us to influence the conversation was with our absence, not our presence.

Our decision to boycott the Saudi conference was galvanized by this call to action against widespread platform censorship and the criminalization of activists online but was informed by a larger shift in our thinking at D4BL: to totally renegotiate our relationship with corporations and government entities and to break free from dependence on major foundations to lead work that is truly unbought and unbossed at a time when it is needed the most.

Often, the conversation around AI at panels like this week’s summit focuses on what we at D4BL have come to see as a distraction from the real threats these technologies pose. Our activism has led us to worry less about whether AI can become ‘human’ and more about how AI will be used to determine who is human and justify state violence against those categorized as nonhuman.

The battle for the soul of AI is not about preventing tech from becoming human but about preventing AI from being used to define who is human and who is not.

The title of this open letter is from the seminal text, No Humans Involved: An Open Letter to My Colleagues, written by Jamaican theorist and philosopher Sylvia Wynter about the Los Angeles Police Department’s use of the internal code NHI, ‘No Humans Involved,’ to categorize poor Black and Brown victims and ensure that the crimes against them were never solved.

In the aftermath of the jury’s acquittal of the police officers who tortured and brutalized Rodney King, Wynter urged her colleagues in academia to consider their own responsibility, in the form of inaction and apathy, for dominant systems of racial categorization that benefit them but render others “outside the sanctified realm of human obligation.” Thirty years after Rodney King, her call to action is even more urgent, and one that we cannot ignore.

We use this boycott as an opportunity to elaborate on a position that we must take at this time. We cannot allow words like ‘human-centered’ and ‘human in the loop’ to disguise the weaponization of artificial intelligence systems against people that do not fall into rigid categories of whiteness, straightness, maleness, and middle-classness. We must reject all efforts to use AI to define who is considered human in the first place.

We refuse to sanitize or rebrand violence for the purpose of access, power, or profit. We cannot support AI convenings in a country that in 2017 became the first ever to grant legal personhood to a robot, before allowing human women to drive. When we say No More Data Weapons or No Fascist AI, and when we stand in solidarity with Google employees demanding No Tech for Apartheid, we mean it.

As young Black people from displaced and dispossessed communities, we may look foolish for refusing money and rejecting opportunities like this to scale our reach and work. But it was wisdom and diligence that got us here, and by wisdom and diligence, we will persist.

No platform is worth it if it costs us our integrity, our dignity, and our ability to stand fully in our truth. As we have learned with inflation, the value of money is fleeting, but the true human cost of losing sight of our core mission is incalculable.