19 Feb 2026


UK mandates 48-hour removal of non-consensual intimate images

Social media and tech firms must delete private images shared without consent within two days or face heavy fines and possible service blocks

The UK government is introducing a new rule requiring social media and tech companies to remove non-consensual intimate images within 48 hours of being reported. The legislation is aimed at protecting victims, particularly women and girls, from online abuse and the spread of private or AI-generated explicit content.

Under the amendments to the Crime and Policing Bill, platforms will have a legal duty to act quickly once notified. Failure to comply could result in fines of up to 10% of a company’s global revenue or, in extreme cases, a ban on operating in the UK.

Sharing intimate images without consent is already illegal, but victims often face delays and must report the same content repeatedly to multiple platforms before it is removed. The new rule simplifies the process by requiring firms to remove flagged images across all their services and prevent them from being re-uploaded.

Prime Minister Sir Keir Starmer said tech firms were “on notice” and emphasized that the burden of action should shift from survivors to the platforms hosting harmful content. He described the rapid removal of such material as a key part of the broader fight against violence targeting women and girls.

The regulator Ofcom may treat non-consensual intimate images with the same seriousness as child sexual abuse or terrorist material. Digital tools could be used to detect and block content automatically before it spreads further.

In addition, the government is exploring stricter measures on AI-generated sexually explicit content and potential age restrictions for social media users under 16. Officials hope these steps will reduce the circulation of harmful content and provide quicker relief for victims.
