How does disinformation get produced and amplified? It’s a seemingly simple question, but it has many different answers. Yochai Benkler has regularly argued that elites are the most worrisome spreaders of disinformation: people like Trump, and indeed the mainstream media, clearly gain huge engagement online. In this view, disinformation is a top-down phenomenon.
Another school of thought holds that disinformation springs from the bottom up: from fringe message boards to the dominant social platforms, and from there onto TV. Disinformation is tested and refined in smaller networks and gradually works its way up to the superspreaders.
I think recent events show that both are happening at the same time, and that they reinforce each other. For months, we’ve seen promotion of ivermectin as a cure for COVID. (There’s not enough evidence for that.) As this story gained traction, last weekend Rolling Stone published an article, based on a local news report, claiming that Oklahoma hospitals were overrun with patients who had taken ivermectin and needed treatment. The story was shared by some prominent liberals, convinced it proved their thesis that people on the right are taking the drug in significant quantities. But there were significant flaws in the original story and in how it was repackaged. This led some on the right to respond by vilifying the doctor at the centre of the story, who, as Daniel Dale explains here, didn’t necessarily say anything false or inflammatory.
Meanwhile, Joe Rogan spoke about using ivermectin to treat his case of COVID, which sent the drug’s original promoters in COVID-denial networks into a frenzy. Joy, vindication and hope were flying through these groups.
Disinformation is not a case of bottom-up or top-down. It’s both, and they reinforce each other all the time. The Oklahoma story teaches us that in the fast-paced world of online information, it’s worthwhile reserving judgement whenever you see a headline that inflames your passions. Slowing down and taking a breath is one of the best things individuals can do in these moments.
Editor’s Pick: Book Slot
When we talk about causes of disinformation, discussion often turns to the business model. In this area, one book has become canonical, even if you might disagree with it: Shoshana Zuboff's The Age of Surveillance Capitalism.
Zuboff’s writing style veers throughout: at times as lyrical as poetry, at times as dense as academia. At all times she is fierce in her criticism of tech platforms and their advertising-driven business models, which, she argues, compel them to constantly collect data and surveil us.
The book is long and a fuller explanation would require more space. But if you’re working in this area or debating these ideas, there’s no question you need to be familiar with Zuboff’s thesis, as its influence continues to grow.
Recommended Articles: From the Kinzen Slack channels this week
Rest of World. “Disinformation influencers” for hire, only $15 a day
A profile of recent research from Mozilla showing how, in Kenya, a social media campaign to undermine the country’s judiciary was partly promoted by “disinformation influencers”. The concept of influencers paid to spread propaganda is deeply concerning for democracy, and young people can be particularly susceptible to their messages. It’s not the first example of this we’ve seen, and it won’t be the last, but coming up with solid countermeasures is tricky. We’ll always need more expert-led research like Mozilla’s into the disinformation-for-hire industry.
The Wall Street Journal. Pro-China Online Network Used Fake Accounts to Urge Asian-Americans to Attend Protests, Researchers Say
From influencers to state actors. We regularly talk to clients about how state actors are just one part of the disinformation story, alongside domestic, home-grown conspiracy theories. This research is a useful reminder of the geopolitical interests often at work in disinformation. We’ve seen evidence of Russians posing as pro- and anti-Black Lives Matter organisers before; now there’s similar research about Chinese networks.
Science Advances. Scaling up fact-checking using the wisdom of crowds
Fact-checking struggles to scale: it’s a small industry, and producing fact-checks for everything on the internet is expensive. So what can be done? This research studied how asking “laypeople” to get involved can have surprisingly positive results. As platforms like Twitter explore decentralised content moderation and crowdsourced fact-checking, this kind of finding is certainly interesting. More experimentation is needed, however. The research was summarised here.
Newsguard. Sizing the Infodemic: NewsGuard Analysts Have Now Found More than 500 ‘News’ Sites Peddling COVID-19 Misinformation and Identified 50 Hoaxes Relating to the COVID-19 Vaccines
It’s very difficult to combat this problem, but research like this at least helps us understand its scale.