I’m Ben and I write Everything in Moderation, a weekly newsletter about (no prizes for guessing) content moderation. Shane is taking a well-earned break and he’s invited me to host the Weekly Wrap until he’s back in the chair.
This week, we were introduced to the Disinformation Dozen, twelve personalities with large online followings reportedly responsible for a whopping 65% of anti-vaccine misinformation and conspiracy theories on Twitter and Facebook. The dozen — named in a report by the NGO Center for Countering Digital Hate — are useful in that they put a face to the otherwise faceless infodemic we’ve all found ourselves in since the pandemic began.
And yet, the report raises the same old questions: Why weren’t the policies that these users broke applied in a timely or consistent manner? Were efforts made to warn users about their false narratives and, if so, when? And what is stopping these dozen superspreaders from being deplatformed (which we know can work if done across the board)? The answers to these questions will inform the ongoing government vs platform showdown in the United States (see Recommended articles).
As my former school teachers will attest, I’m a very slow reader so I don’t have a book recommendation for you this week. But there’s a wealth of juicy reads to get through — hit reply and let me know what you make of the selection.
Domestic terrorism, the likes of which manifested itself at Capitol Hill earlier this year, has been on the rise for years but has seemingly never been deemed a governmental priority. That all changed on January 6, and the conversation has now turned to how enforcement agencies like the FBI can spot online signals that suggest planned violence. In its latest podcast, Tech Policy Press has a good interview with Clint Watts, author and research fellow at the Foreign Policy Research Institute, about the role that social media monitoring can play in efforts to combat violent terrorism.
Gizmodo: Company That Aims to Solve the 'Crisis of Toxicity Online' Makes Money From the Daily Caller and Ben Shapiro
It hadn’t occurred to me how crucial a role adtech vendors play in the business model of hate speech and misinformation until I read this piece. By partnering with right-wing sites that peddle falsehoods about vaccines and election fraud, these software companies funnel dollars to publishers so they can continue the cycle — all while pretending to be the answer to online toxicity. One to watch out for.
TIME: Joe Biden’s Fight With Facebook Is Just Beginning
It feels like an age ago, but it’s worth remembering that the week started with the President of the United States accusing the world’s largest social network of “killing people” by not confronting Covid-19 misinformation. Joe Biden may have rowed back his comments a few days later, but it’s clear that Facebook is in his administration’s sights. This piece lays out what we can expect over the coming months.
Rest of World: How the son of a homophobic politician in Nigeria became a queer OnlyFans star
I’m fascinated by OnlyFans and the way it has opened up opportunities for everyday people to express themselves and make money. This story is a heartwarming and brilliant example of that. But it’s also a reminder that platform users in conservative countries like Nigeria face untold hurdles to even be on what one OnlyFans user calls “a system that doesn’t favour you”.
Techdirt: The Eternal October: Bringing Back Tech Optimism, Without The Naivety
If you’re looking for something a bit more hopeful, have a read of Mike Masnick’s blogpost about how it's time to empower people to take control of their online lives. It is the perfect moment, he argues, to “recognize how technology and innovation have amazing potential for good without overlooking the fact that they can also be abused for nefarious purposes”. A so-called ‘Eternal October’.