
Kinzen's Weekly Wrap - October 15, 2021

One of the most jarring revelations from Facebook whistleblower Frances Haugen concerns the international failures of content moderation. Haugen revealed to Congress that 87% of Facebook's misinformation investments are focused on the English language, even though only 9% of Facebook users actually speak English.

John Oliver did a deep dive on this topic, so if you’re looking to learn more about the experience of disinformation in other global languages, check it out.

Kinzen works in 10+ languages, expanding to 20+ soon, so we have a rare perspective on all this. Tune in next week for more from me on this…

Also this week, we announced a unique partnership with Griffith College here in Dublin to provide a leadership programme for people working in Trust and Safety. This course is aimed at early-stage trust, safety and content moderation professionals who are keen to progress into management and team leader roles in the sector. More course details are available here.

For Your Headphones This Weekend

It’s worth listening to Ben Decker on the Synthetic Society podcast. Ben is a former colleague and now the founder and CEO of Memetica. He talks through lessons from his research into disinformation on social media and considers the future of content moderation.

Editor’s Pick

Writer Otto English recently published Fake History: Ten Great Lies and How They Shaped the World. In it, he journeys through various events and aspects of historical thinking to puncture received wisdom, documenting how simplified historical narratives are used to serve present-day politics.

It’s a fun and surprisingly light read, considering the weightiness of the topic. For a reminder that disinformation is hardly a new problem, it’s worth checking out.


Recommended Articles: From the Kinzen Slack channels this week

The Washington Post. Why outlawing harmful social media content would face an uphill legal battle

If you’re looking into the regulation of social platforms and disinformation, look no further than the work of Jeff Kosseff and Daphne Keller. Here, they team up to explain the difficulties of regulating algorithms in the United States, where the First Amendment has a decisive bearing on what is possible. Lately, there’s been some discussion that while it may be impossible to regulate “freedom of speech”, it might be possible to regulate “freedom of reach”. But Kosseff and Keller argue that the same First Amendment challenge awaits. (Also worth checking out: Twitter’s guiding principles for regulators, released this week.)

The Atlantic. It’s Not Misinformation. It’s Amplified Propaganda.

Renée DiResta argues that the word “propaganda” is so often considered from a top-down perspective because of the rise of totalitarianism in the 1930s. But in the age of social media, a similar phenomenon springs from the bottom up: amplified propaganda, or as she calls it, “ampliganda”. This piece speaks to the growing calls for more widely agreed definitions of terms like disinformation and misinformation.

First Draft. Covid-19 vaccine misinformation and narratives surrounding Black communities on social media

Useful research offering insights into how vaccine misinformation affects Black communities, with a detailed breakdown of the methodology involved and recommendations for platforms.

Bloomberg. Social Media Platforms Share Little in Fight Against Misinformation

This article focuses on how platforms don’t always work in tandem when dealing with misinformation. On the one hand, cooperation can be useful: shared lessons, and content moderation standards guided by best practice. But it may also be true that each platform has its own unique challenges and needs its own way of working through them.

Sky News. COVID-19: Unvaccinated pregnant women make up one fifth of most critically ill coronavirus patients in England

It’s tragic to see vaccine misinformation around fertility and pregnancy having this real-world impact.