NO MORE EXCUSES! WE HAVE A DUTY TO PROTECT MODERATORS FROM HARM

We’re delighted to share an article penned by Sharon Fisher, Head of Trust & Safety at Keywords Studios – which powers our Individual Ambassador Programme – urging the games industry to protect moderators…

I recently read about an investigation by Colombia’s Ministry of Labor into troubling, traumatic content moderation practices in the country. Similar concerns have been expressed about moderators working in Tunisia. And moderators in Germany are organising to demand better working conditions.

These articles really hit home for me. I feel compelled to share my thoughts on better practices that we can all adopt to prevent this from happening at our own organisations or at the vendors we work with.

How many times have we seen organisations put cost savings ahead of human wellbeing? Moderators are treated like second-class citizens, even though their work is extremely difficult and important, and their worth is still measured by unreasonable performance metrics.

Taking it one step further, companies pay thousands of dollars to shield users (humans) from toxic or unlawful content by handing that same content to moderators (also humans), as if one group of humans were worth more than the other.

The reality is that moderators who are exposed to disturbing content continue to experience mental distress. Some organisations offer wellbeing and resilience resources, but they are often difficult for moderators to access. In the Colombia case, there are allegations that moderators who took time off to care for their wellbeing lost out on lucrative bonuses or even lost their jobs because of it.

My fellow Trust & Safety, gaming, player engagement and social platform professionals — we must (and can) do better!

If we are going to ask human beings to do this difficult work, we have a moral duty to provide them with working conditions that truly support their wellbeing and mental health.

I believe that at least four conditions need to be met before we should even consider asking someone to moderate content on our platforms:

  1. Thoughtful recruitment practices that prioritise skills like resilience, communication, empathy, and bias awareness. HR should carefully recruit, interview, and onboard employees based on the content they could potentially be exposed to, not only on their ability to understand a language. At Keywords Studios, my Trust & Safety team and I collaborated with our incredible HR team to revise the moderator job description and create an initial moderation test that evaluates candidates’ knowledge, bias awareness, openness to mental health support, and analytical skills.
  2. Accessible and trustworthy wellbeing and resilience programs. My team calls this ‘everyday care’. To be successful, these programs require daily leadership support from the top down and across the board. Moderators should have access to mental health resources at any time, during work hours or outside of them, not only at the end of the line once the damage has been done. With this, we are focusing on prevention: providing constant knowledge, tools, support and processes for everyday care, while the end-of-the-line resources remain in place.
  3. Empathetic and compassionate leadership at every level. Leadership must see content moderators as more than KPIs or numbers on a spreadsheet. At Keywords Studios, we refer to moderators as “superheroes”, in honour of their extraordinary skills, commitment to saving lives, and bravery in the face of challenging work. Leadership is committed to equipping our superheroes with the armour — a combination of training, technology, processes, everyday care, and wellbeing resources — they need to protect themselves.
  4. Technology that minimises exposure to damaging content through automation. It is 2023 and technology has come a long way, yet we cannot eliminate the risk that moderators will be exposed to shocking content; when lives are at stake, human review is necessary. But AI and automation are two of the tools we can use to lower that risk.

When all four conditions are met, I believe that we can move into a new era of ‘responsible moderation’ — an approach that honours the superheroes who ensure the safety of our online (and real life) communities with a blend of technology, everyday care, and wellbeing resources.

I recently reflected on my year of building Trust & Safety at Keywords Studios. In that year, I’ve had the opportunity to speak with many passionate members of the Trust & Safety and gaming communities. We all agree that, for too long now, the lack of consideration for moderators’ wellbeing has been the ‘elephant in the room’. Now, we want to protect and honour the superheroes on our platforms and show up for them the same way they show up for our online communities.

At Keywords Studios, we want to bring responsible moderation to every online platform in the world — but we cannot do it alone!

Today, I invite my fellow Trust & Safety professionals in the gaming and social industries to ask ourselves:

  • Are we giving our most under-appreciated human beings the support they need to thrive?
  • How can we convince our organisations to embrace the new era of responsible moderation?
  • How will we come together as an industry to challenge the status quo?

So, who’s with me? It’s time to become Allies to our Superheroes and push towards the next era of responsible moderation!