Being anti-racist can be more than just disaggregating data, though that may be one piece of the process. It requires that we actively think about our motivations, bias in datasets, and what stories are being told by whom.
What follows is a list of questions for data visualization designers thinking about their role in the current movement. Some are adapted from the considerations shared around COVID-19 (which still apply here); others were shared by friends and colleagues, or drawn from past experience making my own mistakes — I’m still learning too.
Ultimately, the aim of getting inside an opponent’s OODA Loop is to break their rhythm and cause them to miss a beat. Speed undeniably has a role here, because a combatant must have the perception speed, or coup d’oeil, to observe an opportunity, the mental speed to process the evolving situation and available options, and the performance speed to exploit an opening. But, as Bruce Lee wrote, “speed in delivering a stroke will lose most of its effectiveness unless the stroke is properly timed.”
In practice, it is not enough to make decisions as fast as one can, because at a certain point this approach becomes divorced from one’s opponent and their actions. Instead, decisions and actions should ideally happen in a way that sets up an opponent and makes them vulnerable to having their rhythm broken.
A Rite Aid employee based in Detroit, a city whose population is more than 75 percent Black, told Reuters bluntly that the software the company started out using “doesn’t pick up Black people well.” The loss-prevention staffer added, “If your eyes are the same way, or if you’re wearing your headband like another person is wearing a headband, you’re going to get a hit.”
The American Civil Liberties Union recently filed a complaint against the Detroit police on behalf of a Michigan man who was arrested in January based on a false positive match generated by facial recognition software. In light of the complaint, Detroit’s police chief admitted, “If we were just to use the technology by itself, to identify someone, I would say 96 percent of the time it would misidentify.”
We’re not at a loss for in-depth accounts of the tech industry these days. Reporters, cultural critics, academic historians, and tech figures themselves have been busy trying to explain a social and economic paradigm shift that’s affected everything from our dating lives to the security of municipal infrastructure. Books like Alexandra Wolfe’s 2017 Valley of the Gods have fetishized Silicon Valley, offering portraits of tech as a culture apart, rising up to replace the moribund institutions that have failed society—academia, public transit, local news media, government. Other books, such as Zucked: Waking Up to the Facebook Catastrophe, by Roger McNamee, a venture capitalist and an early mentor of Mark Zuckerberg’s, have taken a far darker view. Where these accounts converge is in portraying tech as nothing less than the catalyst of a radically new social order.
Uncanny Valley is a different sort of Silicon Valley narrative, a literary-minded outsider’s insider account of an insulated world that isn’t as insular or distinctive as it and we assume. Wiener is our guide to a realm whose denizens have been as in thrall to a dizzying sense of momentum as consumers have been. Not unlike the rest of us, she learned, they have been distracted and self-deluded in embracing an ethos of efficiency, hyperproductivity, and seamless connectivity at any cost. Arrogant software developers, giddy investors, and exorbitantly paid employees—all have been chasing dreams of growth, profits, and personal wealth, without pausing to second-guess the feeling of being “on the glimmering edge of a brand-new world,” as Wiener puts it in the middle of her book.
For most of human history, though, that perspective has not been recorded. Going back to the theory of Man the Hunter, the lives of men have been taken to represent those of humans overall. When it comes to the other half of humanity, there is often nothing but silence. And these silences are everywhere. Films, news, literature, science, city planning, economics, the stories we tell ourselves about our past, present and future, are all marked – disfigured – by a female-shaped “absent presence”. This is the gender data gap.
These silences, these gaps, have consequences. They impact on women’s lives, every day. The impact can be relatively minor – struggling to reach a top shelf set at a male height norm, for example. Irritating, certainly. But not life-threatening. Not like crashing in a car whose safety tests don’t account for women’s measurements. Not like dying from a stab wound because your police body armour doesn’t fit you properly. For these women, the consequences of living in a world built around male data can be deadly.
When the Iowa attorney general’s office began investigating an unclaimed lottery ticket worth millions, an incredible string of unlikely winners came to light — and a trail that pointed to an inside job. (Listen for: the meticulous data science mindset of the prosecutor, how a simple algorithm with a couple of lines of code can have enormous impact, and how most white collar crimes are committed by modern-day Raskolnikovs — people who think they are smarter than the rest of society.)