Taking a Different Approach to Measuring the Impact of Content Moderation

In a recent blogpost, Nick Sainsbury explained the problem with process-driven moderation metrics. Here, he shows how the ‘jobs-to-be-done’ framework might help rethink what success looks like for a Trust and Safety team.

Why are people really using your platform or product? What do they arrive looking to do? And how do they expect it to make them feel?

Product teams around the world ask themselves these questions when coming up with new services or features. Widely known as the ‘jobs-to-be-done’ approach, or jobs theory, this way of thinking has helped companies avoid lacklustre launches and expensive failures for decades, and gives them a user-focused lens through which to see the world.

At Kinzen, we use jobs theory in our day-to-day work building tools and providing vital insight to help keep online communities safe. But we also believe this framework offers Trust and Safety teams a smart way to choose a metric, or set of metrics, that demonstrates the effectiveness of their content moderation efforts.

Let’s look at how it might work for two common ‘jobs-to-be-done’. 

Help me feel informed

For millions of people, social media is the main source of information about what is going on in the world. It helps them keep up with current affairs across a wide range of topics and is often the first place they hear of important events. Many also use social media to gather a wide range of opinions, and perhaps to borrow someone else’s opinion so they can feel and look smarter. Its ability to inform is part of its appeal. However, Pew Research Center findings show that many users feel the majority of what they see is ‘largely inaccurate’ and often leaves them ‘more confused about current events’.

Core to feeling informed are trust and accuracy. Trust is a volatile feeling and extremely difficult to measure effectively. Accuracy, while still challenging, is more feasible to define and measure. How might we begin to measure accuracy in a transparent and trustworthy way? I believe public, decentralised crowdsourcing is the most promising way to have a real impact.

Twitter is acutely aware that people turn to its feeds as a way to stay informed. Birdwatch enables Twitter’s users to highlight and explain where information may be misleading. While we don’t have visibility into how the pilot is being measured, one can see how a credibility or trustworthiness score could be built from the newly available data. Imagine being able to track how many pieces of information you engage with that have been genuinely disputed, or found to be misleading, by the wisdom of the crowd. Such a metric would give Trust and Safety teams a powerful number with which to understand and improve a core job-to-be-done.
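To make that concrete, here is a minimal sketch of such a metric: the share of a user’s engaged posts that crowd notes have rated misleading or disputed. The record structure and rating labels are illustrative assumptions, not Twitter’s actual data model.

```python
from dataclasses import dataclass

@dataclass
class Engagement:
    """One post a user engaged with, plus its crowd verdict.

    `crowd_rating` is a hypothetical Birdwatch-style label:
    "misleading", "disputed", or "not_misleading".
    """
    post_id: str
    crowd_rating: str

def misleading_exposure_rate(engagements):
    """Fraction of engaged posts the crowd judged misleading or disputed."""
    if not engagements:
        return 0.0
    flagged = sum(e.crowd_rating in {"misleading", "disputed"} for e in engagements)
    return flagged / len(engagements)

history = [
    Engagement("t1", "not_misleading"),
    Engagement("t2", "misleading"),
    Engagement("t3", "disputed"),
    Engagement("t4", "not_misleading"),
]
print(misleading_exposure_rate(history))  # 0.5
```

Tracked over time, a falling rate would suggest moderation is reducing users’ exposure to crowd-disputed information.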

Help me pass time 

Not all users are necessarily looking to be informed; some ‘hire’ platforms to solve other emotional jobs, such as being entertained or connecting with others.  

This is more difficult to track. Connection drives a huge proportion of users’ actions and personal belief systems, but it often runs counter to the operational metrics that moderation teams keep an eye on. How might we assess whether people derive real value and satisfaction from their time spent on a site or platform?

Sentiment is one way to measure the quality of connections on a platform and could help pinpoint where toxicity, identity attacks, insults, and harassment occur. A study published in 2019 found sentiment was a good predictor of toxicity and could even detect abuse when users posted with deliberate misspellings to avoid filters. 
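As a toy illustration of why misspellings alone don’t defeat this kind of analysis, the sketch below normalises common character substitutions before matching against a tiny abuse lexicon. Both the substitution map and the lexicon are invented for the example; real systems use learned sentiment and toxicity models rather than word lists.

```python
import re

# Illustrative character substitutions ("1d10t" -> "idiot").
SUBSTITUTIONS = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "@": "a", "$": "s"})
# Illustrative mini-lexicon; a real list would be far larger.
ABUSE_LEXICON = {"idiot", "loser"}

def looks_abusive(text: str) -> bool:
    """Crude lexicon check that survives simple filter-evasion tricks."""
    normalised = text.lower().translate(SUBSTITUTIONS)
    # Collapse runs of three or more repeated letters ("looooser" -> "loser").
    normalised = re.sub(r"(.)\1{2,}", r"\1", normalised)
    words = re.findall(r"[a-z]+", normalised)
    return any(w in ABUSE_LEXICON for w in words)

print(looks_abusive("what an 1d10t"))  # True
print(looks_abusive("great point!"))   # False
```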

The team at Hedonometer approached this challenge by attempting to define happiness and then measuring that sentiment online. And while it’s not explicitly stated that a moderation team's job is to make sure users are happy or influence emotions, how platforms make us feel is an important consideration in the overall conversation about what gets measured. 
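The Hedonometer’s approach can be sketched in a few lines: average per-word ‘happiness’ ratings over a text. The tiny score table here is invented for illustration; the real Hedonometer draws on roughly 10,000 crowd-rated words scored on a 1–9 scale.

```python
# Hypothetical word-happiness ratings on a 1-9 scale (illustrative only).
HAPPINESS = {"love": 8.4, "happy": 8.3, "hate": 2.3, "terrible": 2.0, "ok": 5.5}

def mean_happiness(text: str):
    """Average happiness of the rated words in `text`; None if none are rated."""
    scores = [HAPPINESS[w] for w in text.lower().split() if w in HAPPINESS]
    return sum(scores) / len(scores) if scores else None

print(mean_happiness("i love it happy days"))  # ~8.35
```

Aggregated across a platform’s posts, a score like this could flag communities whose emotional tone is deteriorating.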
To create an open and inviting online environment for users, we must identify and measure the right metrics that a Trust and Safety team can use, day in and day out. But, to find the right metrics, we must first get close to our users and find out why they are here. 

Each platform or site hosts a diverse range of users, there to accomplish a wide variety of jobs-to-be-done. I believe that combining the reduction of exposure to harmful content with the jobs-to-be-done approach outlined in this two-part blogpost gives teams involved in content moderation a powerful balance of metrics to anchor their work around.
