The children’s charity NSPCC has called on technology companies to introduce real-time safeguards to prevent the creation and sharing of child sexual abuse images, following an increase in recorded offences across the UK.
New data analysed by the charity shows that police forces logged nearly 37,000 offences involving indecent images of children between April 2024 and March 2025, an 8% rise on the previous year. The NSPCC said the figures highlight the scale of online abuse and the need for stronger preventative measures.
Of the cases where a platform was identified, 43% were recorded on Snapchat, while platforms owned by Meta (Instagram, WhatsApp, Facebook and Messenger) together accounted for almost a quarter.
The charity warned that the true scale of offending is likely to be higher, as end-to-end encryption can make detection more difficult and lead to under-reporting.
The NSPCC is urging companies to deploy existing technology that can block nude images from being created, shared or viewed on children’s devices. It argues that embedding such safeguards could prevent children from being coerced into creating images that are later used for exploitation or blackmail, as well as protecting them from exposure to harmful content.
Chris Sherwood, chief executive of the NSPCC, said it is “indefensible” that around 100 offences are recorded each day and accused technology companies of failing to do enough to protect children using their services. He added that the technology required to prevent such harm already exists and could be implemented immediately.
The charity also highlighted the long-term impact on victims, including cases where children are subjected to blackmail or ongoing harassment after images are shared online. It warned that without effective safeguards, young people remain at risk of grooming, coercion and exploitation.
The Government has indicated it is prepared to take further action if companies fail to act. Jess Phillips, Minister for Safeguarding and Violence Against Women and Girls, described online child sexual abuse as one of the most serious forms of offending and said the scale revealed by the data is “deeply shocking”.
Phillips said the Government is committed to making it impossible for children in the UK to take, share or view nude images and has already announced plans to ban so-called “nudification” applications used to generate abusive content.
The NSPCC said it will continue to press both industry and government to ensure that stronger protections are introduced, arguing that mandatory safeguards may be necessary if voluntary action by technology companies proves insufficient.
