Last October, when Facebook was facing heat over Russian interference in the 2016 US presidential election, the social media giant told the Senate that it would double the number of employees working on sensitive security and community issues to 20,000 by the end of 2018.
Their job description includes content moderation: sifting through online posts, articles and videos to flag offensive, violent, fake, criminal and terror-related content. In the wake of the Cambridge Analytica scandal, Facebook founder and CEO Mark Zuckerberg announced that the company had already hired 5,000 of the promised employees. The search is now on for the rest to clean up Facebook.
And according to The Economic Times, Indian graduates are lining up to apply. "The teams working on safety and security at Facebook are doubling in size this year to 20,000 [globally]. This includes our growing team of 7,500 content reviewers - a mix of full-time employees, contractors and companies we partner with," a spokesperson for Facebook told the daily. "This lets us scale globally, covering every time zone and over 50 languages... we do not disclose the locations for security reasons."
Genpact, a professional services firm, reportedly won a contract to provide content management services to Facebook this year. Facebook directly employs content moderators but also outsources the task of screening posts in more than 50 languages across the globe to companies like Genpact. To that end, the latter is busy hiring content moderators across a slew of Indian languages - including Tamil, Kannada, Oriya, Chhattisgarhi, Nepali, Marathi, Mizo and Punjabi.
The firm has put out advertisements on online employment platforms, saying: "Monitor and moderate user-generated content/video on one of the reputed Social Website to ensure that the online community is maintained as a safe and fun environment. Should be comfortable for any content Sexual Assault, Terrorism, Child Abuse, may be live suicidal videos and bloodshed." The daily added that Genpact even held walk-in interviews for these roles last month.
While the ads did not specify that the jobs are for Facebook, the connection is highly likely given that Genpact is hiring in Hyderabad, where the social media company has a large presence.
Though it is far from a cushy job - constantly reviewing unpalatable content can take a serious psychological toll on those who do it for a living, and hence leads to high attrition - there is no dearth of applicants. The report added that on jobs platform Naukri.com, over 3,000 people applied for 30 Genpact openings for screening Kannada-language content. The salary on offer helps too: Rs 2.25-4 lakh a year, plus monthly incentives.
But recognising that the job of a content moderator involves emotional stress, possibly even secondary trauma, Facebook is reportedly increasing its focus on the health of employees doing this tough job. Citing a Facebook executive, the daily reported that the company was implementing multiple measures to protect its reviewers. "There is an ongoing training for reviewers and a wellness programme as well [to look into the side effects of the exposure to violent content]," the source added.
If you thought this policing job had long since been taken over by artificial intelligence, you are partly correct. While machine learning - which offers speed and none of the human frailties - is already being deployed to flag inappropriate content, the algorithms still cannot replace humans entirely. So human content moderators will be needed for quite a while longer.
Edited By Sushmita Choudhury Agarwal