Last week, using search terms listed in Facebook’s internal research on the subject, CNN located active Instagram accounts purporting to offer domestic workers for sale, similar to accounts that Facebook researchers had flagged and removed. Facebook removed the accounts and posts after CNN asked about them, and spokesperson Andy Stone confirmed that they violated its policies.
“We prohibit human exploitation in no uncertain terms,” Stone said. “We’ve been combatting human trafficking on our platform for many years and our goal remains to prevent anyone who seeks to exploit others from having a home on our platform.”
“To counter these challenges … we have also developed technology that can proactively find and take action on content related to domestic servitude,” Facebook said in the letter. “By using it, we have been able to detect and remove over 4,000 pieces of violating organic content in Arabic and English from January 2020 to date.”
A ‘severe’ risk to Facebook’s business
Following the publication of the BBC investigation, Apple contacted Facebook on October 23, 2019, threatening to remove its apps from the App Store for hosting content that facilitated human trafficking. In a November 2019 internal document titled “Apple Escalation on Domestic Servitude — how we made it through this [Site Event],” a Facebook employee detailed the actions the company took over the course of a week to mitigate the threat, including taking action against more than 130,000 pieces of domestic servitude-related content in Arabic on Facebook and Instagram, expanding the scope of its policy against domestic servitude content and launching proactive detection tools in Arabic and English.
“Removing our applications from Apple platforms would have had potentially severe consequences to the business, including depriving millions of users of access to IG & FB,” the document states. “To mitigate against this risk, we formed part of a large working group operating around the clock to develop and implement our response strategy.”
Despite the scramble during that week, Facebook had been well aware of such content before the BBC reached out. “Was this issue known to Facebook before the BBC enquiry and Apple escalation?” the internal report asks. “Yes.”
In March 2018, Facebook workers assigned to the Middle East and North Africa market flagged reports of Instagram profiles dedicated to selling domestic laborers, internal documents show. At the time, these reports “were not actioned as our policies did not acknowledge the violation,” a September 2019 internal report on domestic servitude content states.
Stone, the Facebook spokesperson, said the company did have a policy prohibiting human exploitation abuses at the time. “We have had such a policy for a long time. It was strengthened after that point,” he added.
Internal Facebook documents show that Facebook launched an expanded “Human Exploitation Policy” on May 29, 2019 that included a prohibition on domestic servitude content related to recruitment, facilitation and exploitation.
In September 2019, a Facebook employee posted to the company’s internal site a summary of an investigation into a trans-national human trafficking network that used Facebook apps to facilitate the sale and sexual exploitation of at least 20 potential victims. The criminal network used more than 100 fake Facebook and Instagram accounts to recruit female victims from various countries, and used Messenger and WhatsApp to coordinate transportation of the women to Dubai, where they were forced to work in facilities disguised as “massage parlors,” the summary said.
The investigation identified $152,000 spent to buy advertisements on its platforms related to the scheme, including ads targeting men in Dubai. The company removed all pages and accounts related to the trafficking ring, according to the report. Among the recommended “action items” listed for response to the investigation is a request that Facebook clarify policies for how it handles ad revenue associated with human trafficking to “prevent reputational risk for the company (not to profit from ads spent for HT).”
About a week later, a subsequent report outlined more broadly the issue of domestic servitude abuse on Facebook’s platforms. The document includes samples of advertisements for workers posted to Instagram; one describes a 38-year-old Indian woman for sale for the equivalent of around $350 (the company says it removed the related accounts).
More recent documents show that despite efforts Facebook took to remove such content immediately and in the weeks and months following the Apple threat, it has still struggled to regulate domestic servitude content.
A report distributed internally in January 2020 found that “our platform enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via complex real-world networks,” and identified some commonly used naming conventions for domestic servitude accounts to help with detection. Traffickers from labor “recruitment agencies” used “FB profiles, IG Profiles, Pages, Messenger and WhatsApp to exchange victims’ documentation … promote the victims for sale, and arrange buying, selling and other fees,” the document said of one trafficking network the company identified.
In a February 2021 report, researchers found that labor recruitment agencies often communicated with victims via direct messages but rarely posted public content violations, making them difficult to detect. The report also said Facebook lacks “robust proactive detection methods … of Domestic Servitude in English and Tagalog to prevent recruitment,” although the Philippines is a top source country for victims, and that it didn’t have detection capabilities turned on for Facebook stories. The report laid out plans for a preventative educational campaign for workers, and said researchers identified at least 1.7 million users who could benefit from information about workers’ rights.
“While our previous efforts are a start to addressing the off-platform harm that results from domestic servitude, opportunities remain to improve prevention, detection, and enforcement,” the February report stated. The company has implemented on-platform interventions to remind people seeking employment of their rights, and has information on its Help Center for users who encounter human trafficking content, Stone said.
And although Facebook researchers have heavily investigated the issue, domestic servitude content appears to still be active and easily found on Instagram. Using several common account naming trends highlighted in one domestic servitude internal research document, CNN last week identified multiple Instagram accounts purporting to offer domestic workers for sale, including one whose account name translates to “Offering domestic workers” that features photos and descriptions of women, including their age, height, weight, length of available contract and other personal information. Facebook confirmed these posts violated its policies and removed them after CNN asked about them.
In early 2021, Facebook launched “search interventions” in English, Spanish and Arabic that create “friction” in search when users “type in certain keywords related to certain topics (that we have vetted with academic experts),” according to Stone. He added that the company launched these search interventions for sex trafficking, sexual solicitation and prostitution in English, and for domestic servitude and labor exploitation in Arabic.
“Our goal is to help deter people from searching for this type of content,” Stone said. “We’re continuing to refine this experience to include links to helpful resources and expert organizations.”