
The National Crime Agency (NCA) has issued a stark warning about organized online groups grooming girls as young as 11, coercing them into self-harm and even suicide attempts, and sharing abusive content [1]. These “com networks,” predominantly composed of teenage boys, operate on mainstream platforms such as Discord and Telegram, leveraging social engineering and doxing tactics [2]. The NCA reports a sixfold increase in such cases between 2022 and 2024, with thousands of UK users exchanging millions of harmful messages [3].
Technical Tactics and Platform Exploitation
These gangs avoid the dark web, instead using gaming chats and encrypted messaging apps to evade detection. Tactics include blackmail via stolen credentials, API abuse to automate harassment, and manipulation of gaps in platform moderation. For example, Discord’s webhook APIs have been weaponized to flood victims with abusive content [4]. Case studies reveal that offenders such as Richard Ehiemere (17) and Cameron Finnigan (19) exploited Telegram’s file-sharing features to distribute child abuse imagery [1].
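To make the detection angle concrete, below is a minimal sketch of the kind of burst detection a platform could run over incoming messages, whether they originate from webhooks or ordinary accounts. The `Message` structure, window size, and threshold are illustrative assumptions, not any platform’s actual API or policy.

```python
from collections import defaultdict, deque
from dataclasses import dataclass

# Illustrative limits (assumptions, not platform policy): flag any sender
# that posts more than MAX_MESSAGES within WINDOW_SECONDS.
WINDOW_SECONDS = 10.0
MAX_MESSAGES = 20

@dataclass
class Message:
    sender_id: str    # user or webhook identifier
    timestamp: float  # Unix epoch seconds

class BurstDetector:
    """Flags senders whose message rate looks like automated flooding."""

    def __init__(self, window: float = WINDOW_SECONDS, limit: int = MAX_MESSAGES):
        self.window = window
        self.limit = limit
        self._history: dict[str, deque] = defaultdict(deque)

    def observe(self, msg: Message) -> bool:
        """Record a message; return True once the sender exceeds the burst limit."""
        times = self._history[msg.sender_id]
        times.append(msg.timestamp)
        # Discard timestamps that have fallen out of the sliding window.
        while times and msg.timestamp - times[0] > self.window:
            times.popleft()
        return len(times) > self.limit

# Example usage with synthetic timestamps: 25 messages within one second.
detector = BurstDetector()
flagged = [detector.observe(Message("webhook:123", t / 25)) for t in range(25)]
print(any(flagged))  # True once the sender exceeds 20 messages in 10 seconds
```

A positive result is better treated as a trigger for throttling and human review than an automatic ban, since legitimate bots can also post in bursts.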
Detection and Mitigation Strategies
Platforms can implement the following technical measures:
- Behavioral Analysis: Monitor for rapid-fire message bursts or abnormal login patterns indicative of botnets.
- Content Moderation APIs: Integrate tools like Google’s Perspective API to flag coercive language in real time (a minimal integration sketch follows this list).
- Parental Controls: Tools such as Apple’s Screen Time or Microsoft Family Safety can restrict unknown contacts.
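On the content-moderation item above, the sketch below shows one way a platform-side service might call Google’s Perspective API over HTTP and act on its TOXICITY score. The API key placeholder, the 0.8 threshold, and the decision to flag rather than block are assumptions made for illustration.

```python
import requests

# Perspective API analyze endpoint; key and threshold below are placeholders.
PERSPECTIVE_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
API_KEY = "YOUR_API_KEY"   # obtained via Google Cloud; never hard-code in production
FLAG_THRESHOLD = 0.8       # illustrative cut-off, tuned per platform in practice

def flag_coercive_message(text: str) -> bool:
    """Return True if Perspective's TOXICITY score meets the flag threshold."""
    payload = {
        "comment": {"text": text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }
    resp = requests.post(PERSPECTIVE_URL, params={"key": API_KEY},
                         json=payload, timeout=10)
    resp.raise_for_status()
    score = resp.json()["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
    return score >= FLAG_THRESHOLD
```

Because the scores are probabilistic, flagged messages are better routed to human moderators than removed automatically.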
“Tech companies must act to protect children. These groups collaborate at scale to inflict harm.” — Graeme Biggar, NCA Director General [1]
Relevance to Security Professionals
For threat researchers, these groups exhibit APT-like coordination, warranting analysis of their TTPs (Tactics, Techniques, and Procedures). Blue teams should audit enterprise communication tools for similar vulnerabilities. The NCA’s undercover operations highlight the need for enhanced logging of metadata in encrypted apps to trace offenders [3].
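As a rough sketch of what content-free metadata logging could look like on the platform side (the field names and logging sink are assumptions, not any vendor’s schema), an event record might capture who messaged whom, when, and how much, without touching message content:

```python
import json
import time
import uuid
from dataclasses import dataclass, asdict

@dataclass
class MessageMetadata:
    """Content-free event record: who messaged whom, when, and how much."""
    event_id: str
    sender_id: str
    recipient_id: str
    channel_id: str
    timestamp: float
    payload_bytes: int
    has_attachment: bool

def log_message_event(sender_id: str, recipient_id: str, channel_id: str,
                      payload_bytes: int, has_attachment: bool) -> str:
    """Emit one JSON metadata line (no message content) for later analysis."""
    record = MessageMetadata(
        event_id=str(uuid.uuid4()),
        sender_id=sender_id,
        recipient_id=recipient_id,
        channel_id=channel_id,
        timestamp=time.time(),
        payload_bytes=payload_bytes,
        has_attachment=has_attachment,
    )
    line = json.dumps(asdict(record))
    print(line)  # in practice, ship to a SIEM or append-only log store
    return line
```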
Conclusion
The Online Safety Act [2] mandates platform accountability, but technical enforcement remains inconsistent. Proactive measures such as hardened API permissions and machine learning-based anomaly detection are critical to disrupting these networks.
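To illustrate the anomaly-detection idea, the sketch below fits scikit-learn’s IsolationForest on a handful of synthetic per-account features (messages per hour, unique recipients, attachment ratio); the feature set, baseline data, and contamination rate are all assumptions for the example.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Illustrative per-account features: [messages_per_hour, unique_recipients,
# attachment_ratio]. Real pipelines would derive these from metadata logs.
baseline = np.array([
    [12, 3, 0.05],
    [20, 5, 0.10],
    [ 8, 2, 0.00],
    [15, 4, 0.07],
])

# contamination is an assumed prior on the share of abusive accounts.
model = IsolationForest(contamination=0.05, random_state=42).fit(baseline)

candidates = np.array([
    [14,  4, 0.06],   # within the baseline range
    [400, 90, 0.80],  # far outside the baseline: high volume, many recipients
])
print(model.predict(candidates))  # 1 = inlier, -1 = flagged as anomalous
```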
References
1. “Sadistic online harm groups putting people at unprecedented risk, warns the NCA,” National Crime Agency, Mar. 2025.
2. “Online gangs of teenage boys sharing extreme material an emerging threat in UK,” The Guardian, Mar. 25, 2025.
3. “Sadistic online gangs pose grave risk, says NCA,” BBC News, Mar. 2025.
4. “Sadistic and violent: NCA says online gangs of teen boys pose grave risk,” ITV News, Mar. 25, 2025.