Online platforms have until March 16, 2025, to assess and begin mitigating the risk of illegal content on their services, or face hefty fines under the UK’s Online Safety Act. Ofcom, the regulator enforcing the Act, has published its final codes of practice, which require firms to identify and address risks from illegal material such as child sexual abuse, extreme violence, and the promotion of self-harm.

Ofcom chief executive Dame Melanie Dawes described the codes as the industry’s “last chance” to comply, emphasizing that platforms must act now to avoid enforcement action beginning in March. Critics, including the Molly Rose Foundation and the NSPCC, argue the OSA lacks specific measures to protect children from harmful content.

The OSA, which became law in October 2023, mandates that platforms use hash-matching technology to detect known child sexual abuse material and implement child safety features. Some tech firms have pre-emptively introduced safety measures for teens, such as restricting contact from unknown accounts.
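For context, hash-matching works by comparing a fingerprint of each uploaded file against a database of fingerprints of previously identified illegal material, so known content can be flagged without human review of every upload. The sketch below is a minimal illustration using exact SHA-256 digests and a hypothetical blocklist; production systems such as Microsoft’s PhotoDNA instead use perceptual hashes that survive resizing and re-encoding.

```python
import hashlib

# Hypothetical blocklist: digests of files previously confirmed as illegal.
# The entry below is a placeholder, not a real database record.
KNOWN_ILLEGAL_HASHES = {
    "9f2feb0f1ef425b292f2f94bb8e333c1f6e55f4e0cc1b4f3a4c9f2c3d1e0b5a7",
}

def file_sha256(path: str) -> str:
    """Stream a file through SHA-256 and return its hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_illegal(path: str) -> bool:
    """Flag an upload whose digest matches a blocklist entry."""
    return file_sha256(path) in KNOWN_ILLEGAL_HASHES
```

Exact hashing fails if a file is altered by even one byte, which is why deployed systems favour perceptual hashing; the principle of matching against a shared database of known material is the same.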

Technology Secretary Peter Kyle praised the codes as a “significant step” towards safer internet use, urging technology companies to comply. The codes still require parliamentary approval, but platforms are expected to be ready by mid-March.