Since the January 6 Capitol riot, we’ve seen a fracturing of momentum within far-right groups in the US. Thousands of QAnon accounts were removed from the larger social platforms, Trump left the White House, and the big question was where they would go from there.
There’s been a move to alternate platforms, with Telegram the primary winner. Parler is the biggest loser, Gab continues to do its thing, and Gettr has some traction.
But there’s also been a circling of wagons around anti-vaccine disinformation. For example, Insider reported this week on how militia groups like the Three Percenters and Oath Keepers were left in disarray after the events of January 6. Moving to platforms like Telegram and focusing on the anti-vaccine narrative has helped them rebuild and recruit.
There is still a real threat of militia groups "plotting attacks on government targets, both buildings and people" in the coming months. Anti-vaccine rhetoric is rarely confined to vaccines. Once a conspiracy theory takes root, people begin to believe all sorts of other theories. This is why we must be vigilant about the spread of disinformation, and stay alert to the evolving campaigns of deception that drive hate, both on and offline.
For Your Headphones This Weekend: Podcast Slot
A recent podcast series from The New York Times unravels a secret plot by right-wing extremists in Germany to assassinate politicians and pin the carnage on an asylum seeker. It’s all about engendering “Day X”: the resultant fury they hope will be enough to precipitate the end of democracy.
While meandering at times, the series does valuable work in bringing the problem of far-right extremism in modern-day Germany to an English-speaking audience. With Merkel’s time winding down, and critical elections in the country taking place in September, it’s a good moment to think about how we can protect democracy in an age of algorithmic, viral, incendiary disinformation.
As an aside, we’re already working with multiple clients preparing for the German election. For example, see a blog post from months ago about how anti-democratic forces were already copying Trump’s election fraud playbook. Contact email@example.com if you want to learn more.
Editor’s Pick: Book Slot
This week I’ve been reading Silicon Values: The Future of Free Speech Under Surveillance Capitalism by Jillian C. York.
York explains how social media, and its role in moderating content, has evolved since the 2000s. If nothing else, it’s a fantastic review of major historical and cultural moments such as the Arab Spring, Gamergate, the Rohingya genocide, key elections, and their intersection with social media.
Throughout, York emphasises the mistakes platforms have made in censoring speech too heavily. She also explains that she once believed free speech was the answer; Gamergate forced her to reconsider, and she wrestles constantly with the fact that there are no easy answers to the questions she poses about what we should moderate. This is a worthwhile addition to the bookshelf on content moderation.
Recommended Articles: From the Kinzen Slack channels this week
Journal of Democracy. The Future of Platform Power: Making Middleware Work
Daphne Keller writes about the concept of middleware: how platforms could give users more control over what they want to see moderated. Keller explores the practicality of setting up third parties to manage this moderation process. They’d need to be given user data, for example, and to provide the tooling and judgements that would allow informed users to make these decisions. These companies could act on behalf of all participating platforms, so that when, for example, a researcher flags a particular post as problematic, all platforms get the benefit of this insight. It’s an interesting approach.
Journal of Democracy. The Future of Platform Power: Quarantining Misinformation
Like the post above, this is part of a series responding to Francis Fukuyama’s suggestions around the concept of middleware. Here, Robert Faris and Joan Donovan find Fukuyama’s diagnosis too narrow. Nevertheless, they argue that middleware could be useful by tapping into the expertise of librarians, who are, after all, experts in sorting information and evaluating trustworthiness. However, Faris and Donovan also point out the possibility of self-created echo chambers, with the ensuing dangers of radicalisation and polarisation.
Poynter. Here are Twitter’s most prolific citizen fact-checkers
Alex Mahadevan and Harrison Mantas talk with three of the most prolific users experimenting with Twitter’s Birdwatch programme.