TikTok poaches content moderators from Big Tech contractors in Europe

TikTok has poached hundreds of content moderators in Europe from outsourcing companies that serve social media rivals such as Facebook, as the group seeks to tackle a growing problem of harmful content.

The short-video app, owned by China’s ByteDance, has rapidly expanded its “trust and safety” hub in Dublin and hired additional moderators in London to review material posted by European users.

At least 190 of those who have joined since January 2021 previously worked through contractor companies for Accenture, Covalen and Cpl, according to an FT analysis of public LinkedIn profiles.

Meta, the parent company of Facebook and Instagram, as well as YouTube and Twitter, are known to rely heavily on these contractor companies to oversee and remove some of the platforms’ most violent and harmful content. TikTok said it has hired several hundred moderators in the UK and Ireland since January last year, adding to the thousands it has at similar hubs in California and Singapore.

Earlier this month, Meta chief executive Mark Zuckerberg blamed slowing growth on younger users fleeing Facebook and Instagram for TikTok, a disclosure that wiped more than $220 billion off the company’s market value in a single day. But with TikTok’s huge growth comes the problem of dealing with the worst user excesses, an issue that has put major social networks in the crosshairs of politicians and regulators around the world.

“Our continued investment in our trust and safety operations reflects our focus on maintaining TikTok as a place for creativity and entertainment,” said Cormac Keenan, global head of trust and safety at TikTok.

This push meant that TikTok’s European workforce grew by more than 1,000 people in 2020, when the company’s revenue in the region increased by 545% to $170.8 million. But according to documents filed with Britain’s Companies House, pre-tax losses quadrupled to $644.3 million, “mainly due to the increase in the number of employees to support the growth of the business [in Europe]”.

TikTok’s strategy has been to offer moderators internal positions with better salaries and benefits in order to attract experienced staff from the same limited talent pool as Facebook, Instagram, YouTube and Snap.

New hires often speak multiple languages and have experience moderating content, according to people with first-hand knowledge of the hiring process. The company said languages were a “key consideration for potential applicants”.

“I chose TikTok because the benefits are better, the environment is better, and the company values every member,” said a TikTok employee, who joined last year from Accenture. “It was better for my career and I wanted to be able to work from home, which was a battle at Accenture.”

Another content moderator who switched from YouTube to TikTok said levels of disturbing content at work were similar, but psychological support was better at TikTok.

Accenture, Cpl and YouTube did not respond to requests for comment, and Covalen declined to comment.

Candie Frazier, a former content moderator in California, is suing TikTok, claiming the company failed to protect her mental health after she watched extreme and violent videos. The company said it does not comment on ongoing litigation, but continues to develop a range of wellness services to support moderators.

Facebook also previously agreed to pay $52 million to a group of thousands of US moderators who said they were traumatized after watching disturbing content on the platform.

Meta said it offers training and wellness support for in-house and contracted content moderators, spaces that allow reviewers to step away from their desks if necessary, and technology that ensures reviewers are not exposed to potentially graphic content for long periods of time.

Last month, Meta revealed that monthly active users across its services had declined for the first time, to 2.9 billion. TikTok has over a billion monthly active users, putting it in line with Instagram and above Snap, which has over 500 million.

Mary I. Bruner