USA: Meta allegedly fails to detect payments for child sexual abuse material made via its platforms, report says
"How Facebook Messenger and Meta Pay are used to buy child sexual abuse material", 22 March 2024
When police in Pennsylvania arrested 29-year-old Jennifer Louise Whelan in November 2022, they charged her with dozens of counts of serious crimes, including sex trafficking and indecent assault of three young children.
One month earlier, police said they had discovered Whelan was using three children as young as six, all in her care, to produce child sexual abuse material. She was allegedly selling and sending videos and photos to a customer over Facebook Messenger. She pleaded not guilty.
The alleged buyer, Brandon Warren, was indicted by a grand jury in February 2022 and charged with nine counts of distribution of material depicting minors engaged in sexually explicit conduct. Warren also pleaded not guilty.
Court documents seen by the Guardian quote Facebook messages between the two in which Warren allegedly describes to Whelan how he wants her to make these videos.
Whelan received payment for the footage over Meta Pay, Meta’s payment system, according to the criminal complaint against Warren.
A spokesperson for Meta confirmed that the company has seen and reported payments via Meta Pay on Facebook Messenger that are suspected of being linked to child sexual exploitation.
“...We support law enforcement in its efforts to prosecute these criminals and invest in the best tools and expert teams to detect and respond to suspicious activity. Meta reports all apparent child sexual exploitation to NCMEC [the National Center for Missing & Exploited Children], including cases involving payment transactions,” the spokesperson said.
Meta fails to detect payments for child abuse material, say moderators
Based on a review of documents and interviews with former Meta content moderators, a Guardian investigation has found that payments for child sexual abuse content taking place on Meta Pay are probably going undetected and unreported by the company.
“We responded to valid legal process,” said a Meta spokesperson, in response to the Guardian’s findings that the company did not detect these crimes.
This means that payments connected to illicit activities are probably taking place undetected, financial crime experts said.
A Meta spokesperson said that the company uses a combination of automated and human review to detect suspicious financial activity in payment transactions in Messenger.
Meta has a team of about 15,000 moderators and compliance analysts who are tasked with monitoring its platforms for harmful and illegal content. Possible criminal behavior is supposed to be escalated by Meta and reported to law enforcement. Anti-money-laundering regulations also require money services businesses to train their compliance staff and give them access to enough information to detect illegal financing when it occurs.
Yet contractors monitoring Meta Pay transaction activity do not receive specific training in detecting and reporting money flows that could be related to human trafficking, including the language, codewords and slang that traffickers typically use, a former contractor who worked as a Meta Pay payment compliance analyst said.
A Meta spokesperson disputed the payment compliance analyst’s claims.
“Compliance analysts receive both initial and ongoing training on how to detect potentially suspicious activity – which includes signs of possible human trafficking and child sexual exploitation. Our program is regularly updated to reflect the latest guidance from financial crime regulators and safety experts,” the spokesperson said.
...
Siloed work prevents flagging suspicious transactions, say ex-moderators
Other former content moderators interviewed by the Guardian compared their jobs to call center or factory work. ... They say they could not communicate with the Meta Pay compliance analysts about suspicious transactions they witnessed.
“We were not allowed to contact Facebook employees or other teams,” one former moderator said. “Our managers didn’t tell us why this was.”
Gretchen Peters, executive director of the Alliance to Counter Crime Online, has documented the sale of narcotics, including fentanyl, over Meta’s platforms. She has also interviewed Meta moderators who were not permitted to communicate with other teams in the company. She said this siloing was a “major violation” of “know your customer” banking regulations.
A Meta spokesperson said the company prohibits the sale or purchase of narcotics on its platforms and removes that content when it finds it.
Messenger’s encryption will hide illicit behaviors on Meta Pay, say advocates
In December, Meta announced it had rolled out end-to-end encryption for messages sent on Facebook and Messenger. Encryption hides the contents of messages from anyone but the sender and intended recipient by converting text and images into unreadable ciphertext that is unscrambled only on receipt.
Yet this move could also affect the company’s ability to prevent illicit transactions on Meta Pay. Child safety experts, policymakers, parents and law enforcement criticized the move, arguing that encryption obstructs efforts to rescue child sex trafficking victims and to prosecute predators.
A Meta spokesperson said the decision to move to encryption was to “provide people with privacy”, and that the company encourages users to self-report private messages related to child exploitation to the company.
PayPal did not respond to a request for comment.