Editor's note

This content contains descriptions of sexual violence, rape and assault.

A new investigation by CNN has uncovered a disturbing global network of online communities where men share tactics, tools, and encouragement to drug and sexually assault women, often their own partners, while evading detection. 

The report, published as part of CNN’s As Equals series, shows how these spaces function like what one expert described to CNN as an “online rape academy,” where abuse is normalised, taught, and even monetised. 

According to CNN, this activity cuts across multiple platforms, from pornographic websites to encrypted messaging apps. On one site, CNN reported that more than 20,000 videos categorised as “sleep content” have been uploaded by users, many showing women who appear unconscious or sedated. These videos are often tagged with labels like “#passedout” and “#eyecheck,” the latter referring to clips where perpetrators lift victims’ eyelids to demonstrate they are not awake. 

CNN said that within these communities, users openly exchange advice on how to carry out the abuse. In one chat reviewed by the network, participants discussed how to administer substances without causing an overdose while still ensuring their victims remained unconscious. As CNN put it, “Members trade advice on how to drug their partners.”

The investigation also found that some perpetrators are turning this abuse into a business. CNN reported that users advertised livestreams of assaults for as little as $20 per viewer, often paid in cryptocurrency. In one exchange cited in the report, a user described how paying viewers actively directed the abuse in real time, telling CNN, “They told me what to do and I did it.”

CNN further reported that one individual claimed to be selling so-called “sleeping liquids” to buyers around the world, promising the substances were tasteless and odourless and would leave no memory of the assault. 

The network traced this activity across multiple regions, including Europe and West Africa, highlighting how anonymity and fragmented enforcement allow these communities to thrive. CNN connected this ecosystem to the 2024 case of Dominique Pelicot, where dozens of men were convicted for repeatedly assaulting a woman who had been drugged by her husband. While that case briefly drew global attention and led to the shutdown of a related platform, CNN reported that similar networks have continued to emerge elsewhere. 

Central to the report are the voices of survivors, many of whom said they were abused by people they trusted. One woman told CNN, “You don’t expect anything other than innocence to come from your partner,” describing the shock of discovering years of abuse. Another survivor said her partner attempted to manipulate her perception of reality after she began to suspect something was wrong, telling CNN that “he was trying to change my reality… saying that it hadn’t happened.” A third survivor said she only came to understand what had been done to her after discovering recorded videos, explaining that without them, “it would have been difficult to believe.” 

Experts told CNN that the problem is deeply embedded in both technology and social systems. The report notes that platforms often benefit from legal protections that shield them from liability for user-generated content, while recommendation algorithms can push increasingly extreme material to wider audiences. At the same time, CNN reported that law enforcement and healthcare systems frequently lack the training needed to properly identify cases of drug-facilitated sexual assault. 

Citing the World Health Organisation, CNN said reliable data on these crimes is “scarce by design,” in part because victims may have limited memory of the assault and are less likely to report it. 

Despite mounting evidence, accountability remains inconsistent. CNN reported that Telegram removed one of the groups identified in the investigation after the network contacted the company, with Telegram stating that content promoting sexual violence violates its rules.

“Moderators empowered with custom AI tools proactively monitor public parts of the app and accept reports in order to ban accounts breaching our terms of service and remove millions of pieces of harmful content each day, including content that calls for sexual violence,” Telegram said in a statement to CNN. 
