We need to talk about the mental health of content moderators

Selena Scola worked as a public content contractor, or content moderator, for Facebook in its Silicon Valley offices. She left the company in March after less than a year.

In documents filed last week in California, Scola alleges unsafe work practices led her to develop post-traumatic stress disorder (PTSD) from witnessing “thousands of acts of extreme and graphic violence”.

Read more: Facebook is all for community, but what kind of community is it building?

In a blog post published in July, Facebook acknowledged that the work of moderation is not easy. In the same post, Facebook’s Vice President of Operations Ellen Silver outlined some of the ways the company supports its moderators:

All content reviewers — whether full-time employees, contractors, or those employed by partner companies — have access to mental health resources, including trained professionals onsite for both individual and group counselling.

But Scola claims Facebook fails to practise what it preaches. Previous reports about its workplace conditions also suggest the support it provides to moderators isn’t enough.

It’s not the first time

Scola’s legal action is not the first of its kind. Microsoft has been involved in a similar case since December 2016, brought by two employees who worked on its child safety team.

In both cases the plaintiffs allege their employer failed to provide sufficient support, despite knowing the psychological dangers of the work.

Both Microsoft and Facebook dispute the claims.

How moderating can affect your mental health

Facebook moderators sift through hundreds of examples of distressing content during each eight-hour shift.

They assess posts including, but not limited to, depictions of violent death – including suicide and murder – self-harm, assault, violence against animals, hate speech and sexualised violence.

Studies in areas such as child protection, journalism and law enforcement show repeated exposure to these types of content has serious consequences. That includes the development of PTSD. Workers also experience higher rates of burnout, relationship breakdown and, in some instances, suicide.

Aren’t there workplace guidelines?

Industries including journalism, law and policing have invested a significant amount of thought and money into best practice policies designed to protect workers.

In Australia, for example, those working in child safety opt in to the work rather than having cases assigned to them. They then undergo rigorous psychological testing to assess whether they can effectively compartmentalise the work emotionally. Once working, they have regular mandated counselling sessions and are routinely reassigned to other areas of investigation to limit their exposure.

The tech industry has similar guidelines. In fact, Facebook helped create the Technology Coalition, which aims to eradicate online child sexual exploitation. In 2015, the coalition released its Employee Resilience Guidebook, which outlines occupational health and safety measures for workers routinely viewing distressing materials. While these guidelines are couched as specific to workers viewing child pornography, they are also applicable to all types of distressing imagery.

Read more: Facebook's moderation rules prove it's OK with being a hostile place for women

The guidelines include “providing mandatory group and individual counselling sessions” with a trauma specialist, and “permitting moderators to opt-out” of viewing child pornography.

The guidelines also recommend limiting exposure to disturbing materials to four hours, encouraging workers to switch to other projects to get relief, and allowing workers time off to recover from trauma.

But it’s not just about guidelines

Having support available doesn’t necessarily mean staff feel they can actually access it. Most of Facebook’s moderators, including Scola, work in precarious conditions as outside contractors employed through third-party companies.

Working under these conditions has been shown to have a detrimental impact on employee wellbeing. Not only are these employees less likely to be able to access support mechanisms, they often feel that doing so could put their job at risk. In addition, low pay can leave employees unable to take time off to recover from trauma.

Insecure work can also affect one’s sense of control. As I’ve previously discussed, moderators have little to no control over their workflow. They do not control the type of content that pops up on their screens. They have limited time to make decisions, often with little or no context. And they have no say in how those decisions are made.

According to both the filing and media reports about Facebook’s moderator employment conditions, employees are under immense pressure from the company to get through thousands of posts per day. They are also regularly audited, which adds to the stress.

Where to from here?

Adequate workplace support is essential for moderators. Some sections of the industry already provide best-practice examples. In particular, the support provided to those who work in online mental health communities, such as Beyond Blue in Australia, is exemplary and provides a good blueprint.

Read more: 'Haters gonna hate' is no consolation for online moderators

We also need to address the ongoing issue of precarity in an industry that asks people to put their mental health at risk on a daily basis. This requires good industry governance and representation. To this end, Australian Community Managers have recently partnered with the MEAA to push for better conditions for everyone in the industry, including moderators.

As for Facebook, Scola’s suit is a class action. If it’s successful, Facebook could find itself compensating hundreds of moderators employed in California over the past three years. It could also set an industry-wide precedent, opening the door to complaints from thousands of moderators employed across a range of tech and media industries.

Authors: Jennifer Beckett, Lecturer in Media and Communications, University of Melbourne

Read more http://theconversation.com/we-need-to-talk-about-the-mental-health-of-content-moderators-103830
