The events of January 6 have changed the content moderation space.
Even for those of us who investigate misinformation, the past couple of weeks have been a whirlwind.
There is the question of whether this is actually unprecedented; platforms have taken actions like this outside the Anglosphere before, but those cases received far less attention. That is one of the many points Jillian C. York made after compiling a list of everything pundits are getting wrong about this moment in platform moderation.
There is, obviously, the question of free speech and moderation itself. Now every platform hosting content, big or small, has to think carefully about these questions and act on them.
Finally there is the Great Migration of MAGA supporters to alternative platforms. This has become very tricky for disinformation investigations.
Since January 6, a number of clients have asked us for help monitoring alternative platforms such as Telegram, 8kun, Gab, and Discord.
Our response has been very deliberate. We want to avoid thrusting people into these platforms, where there is a real danger of them being identified, harassed, and threatened.
If you are stepping into monitoring these platforms, please keep these rules in mind:
Aside from security concerns, remember that a lot of the chatter in these closed networks is self-referential. It might take considerable study before you “get” what is going on. Much of the conversation will be attempts at childish gags or will contain lewd or violent imagery. Much of the content will require further verification before you can even think about acting on the information you’ve seen.
Our job at Kinzen is to support moderators and researchers, and indeed any organisation at risk from disinformation. We've set up a daily briefing that helps subscribers navigate a rapidly evolving landscape, covering the latest disinformation narratives and the key content being promoted each day.
To get it, you can get in touch with us at firstname.lastname@example.org.