Kinzen’s product is not a newspaper or a story.
We are defined by the blend of human and machine. Our human research takes the best of the journalistic craft, scales its impact through technology, and is applied to help the people doing the work of protecting our online communities.
Doing this involves utilising a raft of old skills and learning new ones. Sometimes true innovation is not starting everything again from scratch but rethinking principles and practices, and finding new ways to invigorate them.
We apply old-fashioned journalistic rigour through the keen eye of sub-editors, the itch to investigate, and an awareness of the power of language. Equally, we adopt new skills from data scientists and machine learning engineers in data collection and in building technological models for pattern detection.
Some of these conversations are extremely challenging. We are pushing the technology to do things we have never seen before. We are aware of how technology has previously failed because of poor-quality training data. We are equally aware of the potential for bias in such training data and the need for independent oversight of our structures and processes. (More about that in upcoming blog posts.)
How we work
Our researchers track the daily evolution of dangerous narratives, from election fraud to anti-vaccine rhetoric, and many more.
We do this across multiple platforms, both mainstream and fringe. This includes, but is not limited to, Facebook, Twitter, YouTube, Instagram, TikTok, Snapchat, Parler, Gab, MeWe, 4chan, 8kun, and Telegram. We will often see a new trend develop on one platform before ultimately infecting them all.
We work across a diverse range of languages. We’ve started with English, German, Arabic, Spanish, Portuguese, Hindi, Turkish, French, Russian and Swedish. We regularly find that dangerous narratives affect different regions differently. Sometimes hashtags are copied verbatim across languages; sometimes they are tweaked for localisation. Other times there is a long time lag in the spread of narratives from one language to another.
We focus on every possible format - text, audio, video, live. This includes everything from podcasts to videos, newsletters to blog posts, tweets to message boards.
Kate Starbird coined the term participatory disinformation to describe the “tight feedback loops between ‘elites’ and their audiences”. Disinformation is powered both from the bottom up and from the top down. We agree. But we also evaluate risk by considering influence. An account with one million followers should be subject to more scrutiny than one with 20.
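That influence weighting can be illustrated in code. This is a minimal sketch, not Kinzen’s actual model; the function name, the severity input and the logarithmic scaling are all hypothetical choices made for illustration:

```python
import math

def risk_score(narrative_severity: float, follower_count: int) -> float:
    """Illustrative only: weight the severity of a narrative by the
    poster's reach. Log-scaling the follower count means very large
    accounts rank higher without completely drowning out small ones."""
    reach_weight = math.log10(follower_count + 1)
    return narrative_severity * reach_weight

# For identical content, the account with one million followers
# scores higher than the account with 20 followers.
print(risk_score(1.0, 1_000_000) > risk_score(1.0, 20))
```

The design point is simply that identical content carries different risk depending on who amplifies it; any monotonic function of reach would capture the same intuition.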
We think carefully about crafting policies to steer our work, about the nature of the disinformation data we collect, and how that feeds into actionable intelligence for our clients.
We also think deeply about the mental health of our staff. Doing this work can be extremely challenging. It’s important everyone knows the latest advice from leaders in the field, such as the Dart Center.
Journalism applied in new ways to help the essential workers of the internet
Although traditional journalism faces challenges, we can use the same values and skills in innovative ways to benefit society. And although journalism has a vital role in holding technology companies accountable, we can also apply it to identify disinformation at scale with smart technology.
Journalism at its best has always been about helping people navigate information so they can make informed judgements. Through decades of trial and error, journalism has become the craft best suited to weighing the necessary nuances in tackling disinformation, and to balancing freedom of expression against harmful threats to democracy. Journalists are tasked with understanding the complexity of language and are often, therefore, best placed to understand the way narratives are shaped by new dog whistles, or by attempts to evade moderators.
Remember: these moderators on the frontlines need all the help they can get.
Just as we all reflected on the role of “essential workers” during COVID lockdowns, so too must we think of moderators as the essential workers of the disinformation age. There have been copious reports on the struggles these people face in cleaning up the internet every day. How can we help them make better and faster judgements?
That’s what we’re doing at Kinzen.
Our private daily disinformation digest keeps such teams up to date on what they need to know at the start of every day. Our Intelligence and Analysis products can help them scale up their efforts to understand emerging threats on multiple platforms.
They deserve our support. Kinzen is ready to help.