Meta withdraws a sextortion expert’s job offer after he publicly criticizes Instagram

Meta withdrew a job offer from a prominent cyber intelligence analyst immediately after he criticized Instagram for its failure to protect children online.

Paul Raffile had been offered a job as a human exploitation investigator, focusing on issues such as extortion and human trafficking. He had participated in a webinar on April 24 on protecting against financial extortion, in which he criticized Instagram for allowing children to fall prey to scammers and offered possible solutions.

“The only reason I can think of for withdrawing the offer is that I am trying to shed light on this major problem of these crimes happening on Instagram, and that Instagram is doing little to prevent this so far,” Raffile said.

Raffile co-organized the webinar, which featured the parents of four children who died after being scammed on Instagram. The 350 attendees included employees from Meta, the National Center for Missing & Exploited Children (NCMEC), law enforcement agencies, the United Nations Office on Drugs and Crime, Visa, Google and Snap.


Raffile told the Guardian that his own introductory remarks during the webinar were brief, lasting less than a minute.

Raffile was scheduled to start his new $175,000-a-year role the following Monday, but he received the call rescinding the offer within hours of the webinar ending. Meta’s hiring manager did not share the reason for the decision, stating only that the directive came from “many pay levels above us,” Raffile said.

Meta declined to comment, calling the situation an “individual personnel matter.”

Raffile said: “It shows that Meta does not want to take this issue seriously. I have raised legitimate concerns and recommendations, and they may not be willing to be aggressive enough to address this issue.”

Financial extortion schemes have skyrocketed over the past two years, with more than 26,700 cases involving child victims reported to NCMEC in 2023 alone. According to the FBI, sextortion is the fastest growing cybercrime in the US.

The victims are mainly teenage boys, who are approached by scammers pretending to be attractive girls. After coercing a victim into sending sexually explicit images of himself, the scammer threatens to distribute the photos to his friends and family unless he pays a ransom.

A significant portion of these cases are the result of cybercriminals in Nigeria targeting teenagers abroad. Scammers call themselves ‘Yahoo Boys’ and are usually active on Instagram and Snapchat. The crime can be deadly. Minors are often overwhelmed by threats from scammers, and financial extortion has led to at least 20 teen suicides between October 2021 and March 2023, the FBI said.

Meta said in a statement that it has strict rules against non-consensual sharing of intimate images.

Raffile questioned why Meta and other social media companies have failed to take effective action against financial extortion.

“I had dealt with Yahoo Boys at previous employers, including financial institutions and technology companies,” he said.

He previously held positions at consulting firms Booz Allen Hamilton and Teneo.

He said: “We were able to remove them from our platforms within four to six months. Yet the social media platforms have had two years to deal with this.”

A spokesperson for Meta said its expert teams are aware that sextortion actors are disproportionately concentrated in a handful of countries, including in West Africa.

Raffile said Instagram’s design features help facilitate this cybercrime, including its plans to encrypt direct messages, which would provide more privacy but could hamper investigations. Another major problem is users’ inability to keep their follower and following lists private, meaning an extortionist has ready access to a victim’s friends and family, he said.

“They message the victim and say, ‘Hey, I got your nudes and I took a screenshot of all your friends and family, your followers.’ Meta does not take the privacy of teenagers seriously enough,” said Raffile.

Raffile criticized Meta’s April announcement that it would blur images found to contain nudity by default for under-18s, noting that teens can still choose to view them.

“It sounds like it should be illegal to allow minors to transmit these images on their platform,” he said. “Why don’t we just block them?”

Meta said in a statement: “This feature aims to strike the balance between protecting people from seeing nude photos and educating them about the risks of sharing them, without preventing or interrupting people’s important conversations.”
