By Ihechi Enyinnaya
Meta Platforms Inc., the parent company of Facebook, Instagram, and WhatsApp, has announced the removal of 63,000 accounts linked to the notorious scam group known as the “Yahoo Boys.” The action was reported in Meta’s Q1 2024 Adversarial Threat Report, released on Wednesday.
The accounts, removed in recent weeks, were involved in financial sextortion scams and in distributing scripts used to blackmail victims. Among them was a smaller, coordinated network of around 2,500 accounts, linked to a group of roughly 20 individuals, that targeted primarily adult men in the United States using fake identities.
Meta identified and disabled the accounts through a combination of technical signals and in-depth investigations, and used the findings to strengthen its automated detection systems. The company noted that financial sextortion is a global crime and that the Yahoo Boys, who operate primarily from Nigeria, specialize in a variety of scams.
“We removed around 63,000 accounts in Nigeria attempting to target people with financial sextortion scams, including a coordinated network of around 2,500 accounts,” Meta stated. The company also removed Facebook accounts, Pages, and groups run by the Yahoo Boys that were used to organize, recruit, and train new scammers; the group is banned under Meta’s Dangerous Organizations and Individuals policy.
During the investigation, Meta found that while the majority of the scammers’ attempts were unsuccessful, some had targeted minors. Those incidents were reported to the National Center for Missing and Exploited Children (NCMEC). Meta also shared information with other tech companies through the Tech Coalition’s Lantern program to help curb these scams across platforms.
Meta further disclosed that it removed approximately 7,200 assets in Nigeria, including 1,300 Facebook accounts, 200 Pages, and 5,700 groups, that were providing scam-related resources. These resources included scripts and guides for carrying out scams, as well as links to collections of photos used to populate fake accounts.
Since the crackdown, Meta’s systems have been actively blocking attempts by these groups to return, and the company continues to refine its detection capabilities. Meta has also been working closely with law enforcement, supporting investigations and prosecutions by responding to legal requests and alerting authorities to imminent threats.
Beyond account removal, Meta funds and supports NCMEC and the International Justice Mission in running Project Boost, a program that trains law enforcement agencies around the world to process and act on NCMEC reports. The company has held several training sessions, including in Nigeria and Côte d’Ivoire, with the most recent taking place last month.
To protect users, especially teens, Meta has implemented stricter messaging settings for users under 16 (under 18 in some countries) and displays safety notices to encourage cautious behavior online.
Recently, Meta was fined $220 million by Nigeria’s Federal Competition and Consumer Protection Commission (FCCPC) for multiple violations of data protection laws linked to WhatsApp. The investigation, opened in May 2021, found that Meta’s privacy policies infringed on users’ rights, including through unauthorized data sharing and discriminatory practices. Meta plans to appeal the decision, disputing both the findings and the penalty. The FCCPC aims to ensure fair treatment of Nigerian users and compliance with local regulations.