
Platforms struggle to keep up with moderating content amid COVID-19

Record numbers of people around the globe are spending increased amounts of time at home on their favourite platforms

While hundreds of thousands of companies across the country have seen work grind to a halt amid COVID-19, Chris Priebe is experiencing the opposite.

The owner of Two Hat, an artificial intelligence-powered content moderation company based in Kelowna, B.C., has never been busier. His customers, including gaming brands Nintendo Switch, Habbo, Rovio and Supercell, rely on the company to sift through billions of comments and conversations and quickly identify and remove anything harmful to users.

“We processed 60 billion last month. It used to be 30 billion. That’s how bad coronavirus is. That is at least twice the normal volume,” said Priebe in April, before monthly processing volumes hit 90 billion.

“(Platforms) are faced with, in some cases, 15 times the volume. How can they possibly care for their audience? Because that doesn’t mean that the revenues are up 15 times or that they can afford to hire that many more people.”

Priebe is not alone in the scramble to keep online, social media and gaming platforms safe amid COVID-19. Companies including Facebook, Instagram, Twitter, YouTube and Google have all been warning users since at least April that they are experiencing shortages of content moderators, causing a backlog in the removal of harmful posts.

The stakes are high. Record numbers of people around the globe are spending increased amounts of time at home on their favourite platforms, challenging servers and turning messaging services, social networks and comment sections into a wild west.

The situation has heightened privacy experts’ worries about the spread of misinformation and the likelihood that users will stumble upon hate speech, pornography, violence and other harmful content.

“Quite a few people are fairly dissatisfied with the content moderation process as it is…and then you add on this pandemic…You are seeing a huge increase in harassing behaviour and problematic behaviour and then having the content stay up longer,” said Suzie Dunn, a University of Ottawa professor who specializes in the intersection of technology, equality and the law.

“It’s a real challenge because content moderators are a little bit like frontline workers. They’re an essential service that we need to have at a time like this, so we would hope to see more content moderators working.”

However, unlike workers in other sectors who have been working from home since the COVID-19 pandemic arrived, such a shift is difficult for many content moderators because their jobs involve images and language you wouldn’t want kids or other family members catching a glimpse of.

“Some of them may not be able to work on certain things that they would work on in the office,” Kevin Chan, Facebook Canada’s head of public policy, told The Canadian Press.

“They’re looking at potentially private, and sensitive things that have been reported to them and we need to make sure … that these things can be treated in the secure and private manner that they deserve.”

Full-time Facebook employees have stepped up to take on some of the moderating work, including duties shifted from contractors who can’t handle proprietary and sensitive content at home. These workers deal with content related to “real-world harm,” such as child safety, suicide and self-injury.

“There is no question this is going to pose challenges to the degree to which we can be as responsive,” Chan said.


To deal with the situation, Facebook has rolled out measures meant to curb the flow of COVID-19 misinformation and is focused on weeding out and removing content around terrorism and anything inciting violence or linking to “dangerous” individuals and organizations.

At Twitter, machine learning and automation are being used to help the company review the reports most likely to cause harm first, and to rank content or “challenge” accounts automatically.

“While we work to ensure our systems are consistent, they can sometimes lack the context that our teams bring, and this may result in us making mistakes,” Twitter said in a blog post. “As a result, we will not permanently suspend any accounts based solely on our automated enforcement systems.”
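Reviewing the reports most likely to cause harm first amounts to a priority queue keyed on an estimated harm score. The sketch below is a generic illustration of that idea, not Twitter’s actual system; the class, field names and scores are invented for the example:

```python
import heapq

# Hypothetical harm-first review queue: reports with higher estimated
# harm scores are popped before lower-scoring ones. heapq is a
# min-heap, so the score is negated on insertion.
class ReviewQueue:
    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker keeps insertion order stable

    def push(self, report_id, harm_score):
        heapq.heappush(self._heap, (-harm_score, self._counter, report_id))
        self._counter += 1

    def pop(self):
        """Return the pending report with the highest estimated harm."""
        _, _, report_id = heapq.heappop(self._heap)
        return report_id


queue = ReviewQueue()
queue.push("spam_link", 0.3)
queue.push("violent_threat", 0.9)
queue.push("misleading_claim", 0.6)
print(queue.pop())  # prints "violent_threat"
```

With this ordering, a flood of low-severity reports during a traffic spike cannot starve the review of the items most likely to cause real-world harm.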

Google has also increased its reliance on machine-based systems to reduce the need for people to work from the office, and said the shift toward automation has downsides, including a potential increase in content mistakenly classified for removal and slower turnaround times for appeals.

“They are not always as accurate or granular in their analysis of content as human reviewers,” added a Google blog released in March.

This is a sentiment Priebe has encountered many times, but he has a counter-argument: “AI is not perfect but…humans are also not perfect.”

He gives the example of a child playing a game at home during the pandemic, when pedophiles might be more active online and trying to contact young people.

“You have three different humans look at the same conversation and they’re not going to give you the same answer. Some of them are going to call it grooming and some of them aren’t,” said Priebe.

Priebe believes an ideal system blends humans and AI, because the latter is good at handling obvious cases: when a user’s content is flagged almost a dozen times in a short period, for instance, or when someone receives a message that reads only “hello” and hits report just to see what the button does.

“You don’t need a human to have to be looking at their screen and looking at this absolutely sexual content in front of potentially their children who snuck up behind them because artificial intelligence is going to win every time on that,” he said.

“Let humans do what humans do well, which is deal with that middle category of stuff that is subjective, difficult or hard to understand, that the AI is not confident about.”
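The division of labour Priebe describes can be sketched as a simple confidence-based triage rule. This is purely illustrative, not Two Hat’s actual system; the thresholds, field names and labels are all invented for the example:

```python
# Hypothetical triage: route each reported item based on an AI
# confidence score. Clear-cut cases are resolved automatically;
# the subjective middle band goes to a human moderator.
def triage(report):
    score = report["ai_confidence"]  # 0.0 = clearly harmless, 1.0 = clearly harmful
    flags = report["recent_flags"]   # times flagged in a short window

    # Obvious cases: let the AI act immediately.
    if score >= 0.95 or flags >= 12:
        return "auto_remove"
    if score <= 0.05:
        return "auto_dismiss"  # e.g. a "hello" reported out of curiosity

    # Ambiguous content is what human judgment is for.
    return "human_review"


print(triage({"ai_confidence": 0.99, "recent_flags": 1}))  # prints "auto_remove"
print(triage({"ai_confidence": 0.02, "recent_flags": 0}))  # prints "auto_dismiss"
print(triage({"ai_confidence": 0.60, "recent_flags": 3}))  # prints "human_review"
```

The point of the structure is that humans only ever see the middle branch, so no moderator has to stare at content the machine could have classified on its own.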

Regardless of how the moderation gets done, some things will always slip through the cracks, especially in a pandemic, said Dunn.

“No system is perfect.”

Tara Deschamps, The Canadian Press

