In her new personal history of the internet, Lurking, Joanne McNeil argues that the mess Big Tech is in is largely due to its insistence on designing and making for users rather than for people. The problem is especially conspicuous with Black people. Computational narratives of existence, where metrics and nebulous data lead to monetization, don’t fit Black experiences.
Tech journalism exists, in theory, to address these problems by holding the industry accountable, through access and insight. Compounding the problem, however, is that Black people have historically been considered neither the authors nor the audience of tech journalism. For centuries, accounts of countless Black experiences with technology have been lost or poorly preserved. Black creation viewed as content and catalyst has been deeply incompatible with the conventions of tech and its discourse. That incompatibility, though nothing new, fails Black people by design, and that failure has damaged us all.
The constant surveillance of Black people today, as Simone Browne illustrates in Dark Matters, can be traced back to the lantern laws of the Slave Codes and the 18th-century theory of the panopticon, the all-seeing prison. Much of tech journalism has expanded the panopticon’s lens, especially in its “reporting” on interacting with Black people online. In 2010, Gizmodo offered a disturbing precedent for how Black Twitter would be covered: “Why I Stalk A Sexy Black Woman” and its follow-up, “So This Hipster Douche Stalks a Sexy Black Woman On Twitter.” Black social media use became a thing for tech companies and tech journalism alike to mine, voyeuristically. Opportunistic tech creators, never missing a chance to make the same mistake multiple times, have built tools like otherside.is and vicariously.io, which scrape and curate Twitter lists and accounts of those outside users’ traditional milieu. With articles and products like these, one can “diversify” and collect “new experiences” rather than hire Black writers or cover Black experiences in context. The message becomes: You need to be aware of what Black people are doing, but you don’t have to talk to them. New platforms like Clubhouse further this model, using the content of Black users while not investing in the infrastructure, with disturbing results.
Browne’s historical grounding makes the repetitive nature of these missteps all the more stark. “Toxic Twitter,” for example, is the title of an Amnesty International report on online abuse, from which Black women suffer disproportionately. One of the phrase’s earliest appearances is a 2014 cover story in The Nation that suggested Black women’s resistance to racism in feminism can be worse than the actual racism. Racist depictions of Black women throughout history, and glaringly on social media, repeat this failure of power analysis and historical context. The terminology changes (usually appropriated from Black cultural makers)—cancel culture, bad discourse, etc.—but the context-free arguments remain the same: the racism of making Black women synonymous with “toxic” and “cancel” carries over, along with unnamed power imbalances that equate admitted racists with people trying to stay alive. Last summer, a letter in Harper’s bemoaning “cancel culture” spawned multiple interviews and interventions for its millionaire signatories, who fear the end of “open debate.” Meanwhile, marginalized young journalists, specifically young Black women, are barely quoted anywhere in the media, even as they are pushed out of newsrooms at an alarming rate.
Because tech media is not accustomed to listening to Black women, tech users fail to realize when Black women online are not real. MSNBC recently revealed that Russian operatives had been posing as Black people to spread misinformation. Black women have been chronicling this fakery for years. Even though Black women were the first to expose the alt-right six years ago, their work and the alarms they’ve rung consistently remain uncredited and unheard. Platforms often take weeks to intercept targeted harassment of Black women, to the point that other users who simply describe the abuse and the delayed intervention become targets themselves.
The result is a kind of Overton window lock: Harmful behavior toward Black women isn’t enough to inspire change until others are harmed, but the original harms are often lost on the journalists tasked with covering tech. The power and rhetoric that went unchecked become commonplace. And the tactics used against Black women for “lulz” become weapons in the conspiracies destabilizing the very nature of truth, from the swarming of victims to posing as Black women to destabilizing communities (or countries). Defining the systemic abuse becomes a frustrating exercise in describing an empty space that no one believes is there. If we can follow, surveil, and automate everyone, how could we miss anything important? And if it is important, it is only important for how it changes the mythical “standard user,” no matter how many are hurt before.
Another way this plays out: While the radicalization of “standard users” (i.e., white men) who perform racist actions is considered an intrinsic part of tech’s story, the lens is rarely extended to the targets of that racism. After Demetria Hester was attacked by a man named Jeremy Christian on Portland’s TriMet transit system in 2017, Christian was not even questioned. Not long after, he was charged with murdering two men who were defending two Black teens from his racist, Islamophobic attacks. During the summer 2020 protests in Portland, Hester was arrested for organizing with Mothers United for Black Lives Matter. There are some 200 million Google results for Jeremy Christian, including taxonomies of his hate speech and denials by right-wing groups of their connections to him. There are 200,000 results for Demetria Hester. Not one appears to connect her experience to the abuse of Black women online, even after studies like Amnesty’s had come out. Hester, who went viral after Christian threatened her at his sentencing, spoke of being ignored when she tried to warn cops about Christian and others who were targeting her. Authorities were not listening when she needed help, but they were when they wanted her silent. Christian’s radicalization is much discussed and seen as inherently digital, yet the surveillance and real-life consequences for Hester are digital only as far as they are covered, a pattern repeated over and over.
Black women have been chronicling this phenomenon as long as they’ve been online. In the foreword to #HashtagActivism, Genie Lauren notes that hashtags were originally derided as “repeated resistance”: taking often myopic reporting of vital Black issues, raising attention to those issues, and grounding them in their impact on Black people. Lauren, whose work was instrumental in stopping a juror in the Trayvon Martin murder trial from making a macabre profit grab of a book deal, reveals she had her entire Twitter account deleted for a still unclear transgression. Even the term coined to describe this specific discrimination against Black women, misogynoir, is often used, but usually without mention of its creator, Moya Bailey.
In her essay “Venus In Two Acts,” Saidiya Hartman describes the practice of “critical fabulation,” in which archives and narrative are used to “make visible disposable lives.” Hartman’s focus on the mythological and historical Venus, the goddess of love and the derogatory pseudonym of Saartjie Baartman in colonial South Africa, offers a striking precedent for the reality of Black women navigating tech and media. Though rarely recognized, our voices are already active; we are imagining and reimagining archives, futures, and cybercultures in breathtaking ways, from digital humanities projects to record digitization to innovative release schedules. The taxonomy of the projects, their construction and placement, is so much window dressing. From ethics in AI to abuse on social media, the reality is that none of this can continue without examining the entire bedrock of not just tech but those who have charged themselves with covering it and the world it is in.
These missing connections—the result of tech journalism’s failures—lead to a disbelief that these circumstances continually shape the landscape of tech. Most recently, the departure of Timnit Gebru from Google exemplifies the schism between the lived experience of Black women and how tech journalism places them in the world. While publicly celebrating diversity—with doodles and Black voices and pictures in year-end roundups—Google quickly and clumsily separated from a leading Black voice it had praised only months before. Gebru can connect her experience to years and centuries of systemic abuse of Black women. Yet most coverage of her firing, including in WIRED, decenters Gebru by casting her story as another warning of the dangers of “Big Tech,” as if the occurrence were an aberration, not part of history. Gebru’s research is ultimately a straightforward declaration of the dangers of unexamined language bias in falsely portraying the world. She was fired in a day.
On Wednesday, January 6, rioters laid siege to the US Capitol, brandishing Nazi and Confederate flags. Videos of Black Capitol workers running from the insurrectionists and of cops shaking the violent intruders’ hands were shared for hours. The marginalized folks on my Twitter timeline spoke of begging and pleading with tech and tech media to truly, finally take heed of the history of white supremacy and abuse. As others wondered if anyone could have seen this coming, I thought of Demetria Hester and Timnit Gebru, arrested and fired for surviving and naming white supremacist violence. I thought of all the posts inciting violence that, as Gebru correctly noted, flowed through the AI filters, because “standard users” don’t arouse suspicion. I thought of critical fabulation, and how making Black women visible as people is one innovation that tech can’t seem to manage. It’s one it can no longer avoid.