Ella Dawson is regularly exposed to the darker side of the internet, as she’s tasked with seeking out and removing social media’s worst content so the rest of us never see it.

In many ways, social media moderators like Dawson are the bodyguards of the internet, leaping in front of every offensive bullet to protect brands or communities. If they do their job well, no one else may even realize a bullet was fired …

… never mind that it can still bruise or scar the person who took the hit.

Dawson is the Facebook platform manager for TED Conferences, the global nonprofit dedicated to spreading ideas. Many of the presentations are published online as TED Talks, and Dawson manages the ensuing conversations on Facebook. Some of these ideas are more contentious than others – particularly in the polarized, us-versus-them, often factually fluid culture that is sadly all too common on social platforms today.

“I try to cultivate a fairly positive, thoughtful comment thread, but that can be difficult when it’s a topic like race, feminism, domestic violence, global warming, or vaccinations. People really come out of the woodwork to ruin each other’s day,” Dawson says.

Many of these comments come from people she calls “drive-by commenters.” These are people who show up to share an opinion but may not know your brand or even want to see your content.

Social media moderators have feelings too

In most cases, the negative activity is aimed at the brand or a community rather than the anonymous employee working behind that logo. Still, it can be hard not to take certain things personally. “It’s very difficult to take your own personal identity out of it,” Dawson says. “There are moments where it does feel like you’re the one being insulted or attacked.”

Some commenters may intentionally make offensive comments to provoke a negative reaction – the garden-variety troll. Some may think their extreme or objectionable view is just as valid as any other – and any outrage or offense is in the minds of other people. And then there may be honest, well-intentioned discussions that still risk being insensitive or triggering to people with a different or closer relationship to the topic – including the person moderating the discussion.

Dawson gives the example of what would happen if TED published a talk on suicide prevention: “We know we’ll get a bunch of people making inappropriate comments about suicide – judging things, triggering things.”

The TED team recognizes that the intent or severity of the comments doesn’t always matter. If someone is affected, they’re affected.

“We’ll have people tap out if they need to, if it’s personally upsetting to them to moderate those comments,” she says. “We do the same if there’s a talk about race, police brutality, or diversity in hiring – all those things. If someone feels they relate to the content and that reviewing those comments might be upsetting to them, they can step back and ask somebody to take it on.”

Social comments and the damage done

No one would dispute that certain topics can upset some people or that unambiguously abusive, offensive, or extreme content can ruin a person’s day. That’s why this sort of content is moderated in the first place.

However, it’s easy for employers and managers to dismiss a certain amount of negative behavior – complaints, aggravation, the usual low-grade trolling activity – as part of the territory for those working in social.

But Dawson warns that trivializing or normalizing this aspect of the job can conceal the real impact it can have on those tasked with handling such negativity for extended periods of time. “It can prevent people from fully understanding the emotional impact of the work,” she says. “There’s something really frustrating about having the real emotional labor of reading through those comments dismissed as ‘Just the internet’ or ‘Don’t read the comments.’ It shrugs away the impact it has on people’s daily psyche.”

This regular – sometimes daily – exposure to negativity and conflict can gradually build up until symptoms begin to appear, like a repetitive strain injury. “It’s like a death by a thousand cuts, every day wearing you down a little more, bit by a little bit,” Dawson says.

“You lose faith in humanity on some days,” she says. “When you’ve been moderating a bunch of awful comments, or if you’re wading through 4chan to track a scandal, it can really demoralize you. It can impact your mood. It can make you feel really burnt out, cranky, and tired. It can impact your mental and emotional well-being.”

Dawson says her job can sometimes exacerbate her anxiety, particularly when dealing with a PR crisis or handling an intense situation: “I find it saps my creativity. It saps my ability to collaborate with others. It impacts my temper. There are a lot of ways in which it hurts my ability to be good at my job when I’m experiencing psychological burnout. It makes me a less valuable employee.”

Management’s duty of care for social media teams

By not taking the negative impacts of social media moderation seriously enough, employers may be violating their duty of care – with potential legal consequences.

Facebook is defending a class-action lawsuit brought by a growing number of former content reviewers who claim to have developed symptoms of post-traumatic stress disorder from viewing thousands of extreme and graphic videos in the course of their work. While Facebook does have guidelines in place, the plaintiffs allege the company isn’t doing enough to protect the 15,000 content moderators employed through third-party companies.

While most moderators will rarely, if ever, encounter the same kind of content as Facebook’s content reviewers, employers still need to be aware of the mental health risks of working continuously in an environment as emotive, extreme, and quarrelsome as social media can be.

Unfortunately, people aren’t always good at admitting to others when they’re struggling – particularly to their managers. “People are afraid of seeming weak and unqualified. We valorize resilience. It’s embarrassing to say, ‘I’m really upset by these Facebook comments.’ That’s not something we feel a serious individual would say in the workplace. Yet it is absolutely the right and responsible thing to do,” Dawson says.

These conversations are much easier to have in a supportive, empathetic workplace. Managers can remove the stigma by taking a proactive approach, regularly discussing what the team is seeing and what the possible impacts might be, offering support, or making it easy to “tap out” and have someone else take over.

Dawson is the first to admit that some people are more emotionally resilient than others, but she warns against making resilience a disqualifying factor in hiring or assuming brands can simply recruit moderators with thicker skins. “Emotional resilience is very difficult to identify in candidates,” she says. “It’s often difficult to identify in yourself. You can be emotionally resilient for a few days and then over the course of a few months find that wears down.”

However, depending on the role, employers should be upfront during the hiring process about what a prospective social media employee might be exposed to. This also opens up the conversation about the kinds of support available, as well as any specific situations or topics that may be personally distressing and should be taken into account.

Company policies can also include measures to reduce the risk of negative stress building up, such as limiting moderation shifts to no more than four consecutive hours or allowing for recovery days following a major PR crisis or toxic episode. Dawson suggests greater flexibility may also help, such as remote working and flexible hours, so that moderators work where they feel most secure and comfortable and don’t feel they need to be alert to what’s happening on social media 24/7. “If people are doing this emotional labor, can they do it from home? If people are only commenting from 7 p.m. to 11 p.m., can the moderator work those hours?”

Dawson doesn’t expect the social media environment to become less toxic any time soon. “I think a lot of the social platforms are rotten to the core. They were built in a way that leans into our worst instincts as people. They were built to outrage and provoke. The type of content that succeeds on Facebook and Twitter is what makes you respond in a knee-jerk way. It doesn’t encourage empathy and thoughtfulness before you comment, like, or share.”

Despite this skepticism, Dawson sees a growing understanding of these factors that she hopes will gradually improve working conditions for social media workers. “There’s more awareness now that being on social media has a real toll, both on the user and the employee,” she says. “I think that being a junior staffer or working a job that requires emotional labor is always precarious.

“Social media managers are starting to have this conversation about how we can recover, what we should be asking for at work, where we should be working in terms of the support given to staff. In some way it’s getting better but, in some ways, I think we have a very long way to go.”