America First: What We Get Wrong About the World’s Information Crisis

Part one of a two-part post on the global challenge of content moderation.

Watching America’s election cycle from abroad is like witnessing a solar eclipse. We see our collective fears in its dark shadows. But we get blinded if we stare too long.

I can’t look away, even though I live in Ireland. But I am conscious of how US politics distorts our perception of fundamentally global challenges like disinformation.

I could make a credible argument that COVID-related disinformation is the primary threat to the 2.7 billion Facebook users on the planet. But judging by the public debate, all that really matters right now is disinformation as it affects voters in the US, where roughly one in twelve Facebook users lives.

I am not arguing that America’s experience of politically charged disinformation doesn’t matter. Trump’s victory in 2016 was the wake-up call heard around the world. US politics is a petri dish in which toxic speech and conspiracy theories are bred for export. But there’s something deeply unsettling about the politics of one country dictating content moderation policies for the world.

It’s time to turn our gaze away from America’s political eclipse, to separate what is truly universal about the American experience of disinformation from what is exceptional. 

Domestic divisions drive disinformation campaigns

The spectre of Russian interference is a case in point. Four years on, state-sponsored disinformation is still framed as a matter of national security for the US, rather than as the manifestation of home-grown divisions (as my friend Claire Wardle has eloquently explained). American democracy suffered not because its enemies were strong, but because its immune system was weak.

Perhaps Russia’s lasting victory in 2016 was persuading the world to think about disinformation as an act of war. Talking about campaigns of online deception with the language of conflict is self-defeating. This is not a battle with a victor, or an end. A far better metaphor for disinformation is public health. We live in a global commons, at risk from a virus without borders.

There are American mutations of the virus that need to be understood in isolation. Misinformation in the 2020 election cycle was an infection spread from the top down. A study by Cornell University found Donald Trump made up nearly 38 percent of the overall “misinformation conversation” around coronavirus. Other influential research showed Trump’s misinformation was further amplified by mainstream and partisan media.

The top-down spread of misinformation by populist demagogues and elite media is by no means an American phenomenon. But there is a danger in attaching too much significance to what we can measure in the United States, while failing to measure what is of universal significance.

Politicians and mainstream media do play the role of ‘superspreaders’ within national boundaries. But they are rarely the source of the virus. Instead, conspiracy theories are incubated within online spaces that have porous borders, and migrate and evolve as they move through content platforms, languages, cultures and content formats. 

Disinformation thrives best in host societies where identity is in flux, not just those with Donald Trump as President. The very same denial and delusion about COVID and mask-wearing that drive protests in Michigan bring people to the streets in Dublin and Berlin. The feedback loop through which the QAnon conspiracy travels to Germany can work the other way. How long before we see attacks on the 5G network exported to the United States?

The difficult challenges facing platforms

In my experience, there are plenty of good people inside content platforms who understand the global interconnectedness of the disinformation crisis. But they are boxed in by economic and regulatory realities.

A small minority of social media users live in the United States. But social media companies generate far more revenue from each American user than from users anywhere else.

Whatever the outcome of the coming election, all roads will continue to lead back to Washington as the new US Congress considers the future of Section 230, the legal bedrock on which global content moderation rests. No one outside the United States can influence the reform or repeal of this legislation, but everybody on the planet with a social media account would feel the impact of any change.

The human cost of this power dynamic is already here; it’s just not equally distributed.

When COVID forced big tech companies to send their human moderators home, they had to rely more heavily on algorithmic content filters. In Syria, dissident activists saw their social media accounts closed down and content documenting potential war crimes scrubbed. In other countries, the algorithms had the opposite effect. Anti-racist campaigners in France reported a 40 percent increase in hate speech on social media.

Without doubt, the most compelling testimony about the unequal impact of disinformation comes from groundbreaking Filipino journalist Maria Ressa.

“The West now knows that online hate and violence translates to real-world violence,” she says. “But what you guys feel there is a fraction of the reality we live in.” 

One of the most telling details in Maria’s story is that her experience of disinformation pre-dates the last US election. She rang an alarm bell during a meeting with Facebook in August 2016, sharing data about networks of disinformation in the Philippines:

“The people I met with were shocked but didn’t know what to do. At the end of the meeting, I said, You have to do something, because if not Trump could win. And we all laughed because that didn’t seem possible … “ 

We now know everything is possible. But we still don’t seem to fully appreciate that we are all connected. There is no American solution to a global information crisis. There is no global solution unless we see beyond the American crisis. 

Tomorrow, in the second part of this two-part post, I’ll ask how we might build a truly global and collaborative approach to content moderation.
