
“It’s a Challenge From a Morale Perspective”: What it’s Like Setting the Rules at Some of the World’s Largest Platforms

“In some sense, it is a little bit like pushing back the ocean” - Trust & Safety expert

Online abuse has existed for as long as people have browsed the web. But as millions of people have come online over the last decade, the need to provide a safe space for people to exist and interact has grown ever more important. 

Trust and Safety teams are at the forefront of this work, trying to prevent bad actors from misusing their platforms while keeping an eye on the overall health of their communities. While roles vary in scope and seniority, these professionals are responsible for spotting trends in user behaviour, assessing risks and writing policies that ensure users act appropriately.

Working with a wide variety of internal and external stakeholders, it’s their job to respond to escalated reports, reconcile new guidelines with existing rules and put in place protocols for live or developing scenarios. Those working in this space know the work is technically difficult, emotionally draining and high stakes. 

Kinzen works hand-in-hand with Trust and Safety professionals at some of the largest companies in the world to spot emerging risks and detect violating content. Our team thinks about, and tries to solve, specific technical problems, such as evolving narratives among anti-vaxxers or techniques for moderating audio content. However, we don’t always hear about the wider challenges that indirectly shape how Trust and Safety experts keep users safe. Over the last few months, we set out to change that.

First-hand accounts

While there have been a number of seminal pieces of research about the people enforcing platform rules, comparatively little is known about those who develop the rules in the first place.

“This is an industry that is brand new and young… and no one, I think, knows what they're doing” - Trust & Safety expert

We conducted a survey of people working in these kinds of roles and spoke to six Trust and Safety practitioners — five from well-known platforms and the sixth working in academia — involved in the process of setting platform policy. Though not a representative group, they work across a range of countries, company types and areas, such as gaming and business. We also worked with the Trust and Safety Professional Association (TSPA) to identify suitable interviewees, all of whom agreed to participate on condition of anonymity.

One important thing to note: given the sensitivity of the work being done by Trust and Safety teams, we found that interviewees needed to trust how we would use their insights before they could speak to us about their work. As such, we agreed to share a draft of this post with them in advance. No substantive changes were made.

This post summarises some of the main challenges that we heard about, which we’re sharing here to inform conversations currently taking place in the wider Trust and Safety space.

A challenging environment

As media coverage over the last few years has demonstrated, people working in Trust and Safety teams face a host of challenges in their work. From keeping up with new online safety regulations around the world to assessing the accuracy of artificial intelligence, online safety professionals have their hands full right now.

These topics came up to some degree during our research but, throughout our survey and interviews, three other pressing challenges came to the fore. In no particular order, these were:

1. “I need to be better at thinking about metrics”

Our research showed that meaningful success metrics within Trust and Safety teams have been difficult to develop, often resulting in key performance indicators (KPIs) that are unclear, not mature in their usage or even non-existent. Interviewees also felt that they lacked the time and space to develop these metrics.

Those who had experience of quantifying their work tended to use output-oriented, rather than outcome-oriented, metrics. These included the following (an illustrative sketch of how each might be computed appears after the list):

  1. Prevalence of violations, such as the number of user reports or percentage of content that contravenes guidelines 
  2. Time to action, like average time to review a piece of content or to resolve a report
  3. Action accuracy, such as internal audits on human-made or algorithmic decisions
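
To make these three metric types concrete, here is a minimal sketch of how each might be computed in Python. The Report structure and its field names are hypothetical, chosen purely for illustration; they do not reflect any particular platform’s actual tooling.

```python
# Illustrative only: a toy data model and the three output-oriented metrics
# described above. Field names are assumptions, not any platform's real schema.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class Report:
    """One hypothetical user report and how it was handled."""
    reported_at: datetime
    resolved_at: Optional[datetime]   # None if the report is still open
    violating: bool                   # did the content contravene guidelines?
    audit_agreed: Optional[bool]      # did a second reviewer agree? None if not audited


def prevalence(reports: list[Report]) -> float:
    """Share of reported content found to contravene guidelines."""
    return sum(r.violating for r in reports) / len(reports)


def avg_time_to_action_hours(reports: list[Report]) -> float:
    """Average time, in hours, from report to resolution (resolved reports only)."""
    resolved = [r for r in reports if r.resolved_at is not None]
    seconds = sum((r.resolved_at - r.reported_at).total_seconds() for r in resolved)
    return seconds / len(resolved) / 3600


def action_accuracy(reports: list[Report]) -> float:
    """Share of audited decisions where the auditor agreed with the original call."""
    audited = [r for r in reports if r.audit_agreed is not None]
    return sum(r.audit_agreed for r in audited) / len(audited)


if __name__ == "__main__":
    sample = [
        Report(datetime(2022, 6, 1, 9, 0), datetime(2022, 6, 1, 11, 0), True, True),
        Report(datetime(2022, 6, 1, 10, 0), datetime(2022, 6, 2, 10, 0), False, None),
        Report(datetime(2022, 6, 2, 8, 0), None, True, True),
    ]
    print(f"prevalence: {prevalence(sample):.0%}")          # 67%
    print(f"avg time to action: {avg_time_to_action_hours(sample):.1f}h")  # 13.0h
    print(f"action accuracy: {action_accuracy(sample):.0%}")  # 100%
```

Even in this toy form, the limitation interviewees described is visible: all three metrics describe how reports were handled, not whether users were ultimately safer.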

Several interviewees also noted that it was difficult to measure things that had been avoided. For example, if a new product feature stops users from receiving abuse, how do you quantify the effect of hateful messages that were never sent?

Without clear goals, teams reported working reactively or playing ‘whack-a-mole’ with issues. Some used positive media coverage, or the absence of negative press, as an alternative indicator of success.

“I get into a cycle because my work is hard to measure. How do you measure something that didn’t happen?” - Trust & Safety expert

2. “It feels frustrating that there's that sort of roadblock”

We heard that the organisational environment, rather than technical issues related to their work, posed a significant challenge to scaling solutions that keep users safe.  

Interviewees reported that constantly changing company priorities, senior involvement in key decisions and a lack of access to other departments slowed down work or prevented it from being executed. The difficulty of getting internal alignment from stakeholders came up in answers to a number of questions.

In some cases, this working culture was a cause of frustration and dampened professionals’ natural inclination to find solutions to remove bad actors from the platform.

“There’s several right answers to [known internet personality] and some of the challenges associated with his content. But the internal and external criticism that our team received from multiple sides was a challenge from a morale perspective” - Trust & Safety expert

3. “The general understanding is ‘Make the porn go away’”

Interviewees noted that a limited understanding of online safety, and a lack of clarity both inside and outside their companies about what Trust and Safety teams do, made their work more difficult. This affected practitioners’ ability to align expectations with senior staff and to agree what is and isn’t the responsibility of the Trust and Safety team.

“We can't have good conversations (about difficult issues). And I think that's the most frustrating thing” - Trust & Safety expert

It was also felt that the general population, as well as legislators and policymakers, did not understand the trade-offs of product or policy decisions, such as ID verification or what constitutes adult content. This low baseline knowledge makes it challenging for Trust and Safety professionals to design inclusive online spaces. 

Our takeaways

These first-hand accounts provide a small window into the ongoing challenges faced by Trust and Safety teams, and particularly those with a policy remit. The list is by no means exhaustive, but a number of conclusions can be drawn that merit additional thought and research:

  • Moving towards an outcome-oriented approach to measuring success could help Trust and Safety teams prioritise burning issues and demonstrate the impact of their work, but it would require time, support and headspace to develop.
  • Non-technical issues relating to the organisation, its structure and culture have an outsized effect on the work of Trust and Safety professionals compared to technical ones (such as reducing the prevalence of one type of violating content).
  • Society’s general lack of understanding about online harms and the role of Trust and Safety teams has a direct and tangible effect on those working to improve online safety on platforms, and requires more thinking.

Further questions that build on this work could include:

  • What impact, if any, do output-style metrics related to volume and time-to-action have on the behaviour of platform users? 
  • What other metrics are being used to quantify the work of Trust and Safety teams, and what can we learn from them?
  • How can organisations empower Trust and Safety teams to keep users safe and clear barriers or obstacles that impede that process? 
  • What can Trust and Safety teams, and the wider online safety community at large, do to improve the base knowledge of decision-makers and citizens alike?

“The focus at the moment is solely on whether or not we can get platforms to comply with policy rather than whether or not policy actually works for people” - Trust & Safety expert

The team at Kinzen will continue to engage with civil society organisations, research institutions and the platforms themselves to ensure moderators, policymakers and Trust and Safety teams have the information and tools they need to make the right decision at the right time. 

If you or your organisation have the same mission or want to talk about our research and how we might be able to help, get in touch.

“You need your CEO to understand that you exist, to understand why you exist, to care about you and (how) you stack up against the other parts of the team” - Trust & Safety expert

Learn more

Want to understand more about setting better Trust and Safety KPIs? Kinzen is hosting a roundtable discussion with the Trust and Safety Professional Association about the merits of using different metrics to measure your community’s health and inform your strategy. This is a small, intimate event with a limited number of available spaces. Get in touch to express your interest in attending.
