“It is reasonable to conclude that legislation and regulation is driving the companies to make significant numbers of important child privacy and safety changes” – Steve Wood, PrivacyX Consulting, author of the report.
Using the number of changes made by social media companies as a metric, regulation aimed at making children’s digital lives safer is shown to have been beneficial. The project behind the report “Impact of regulation on children’s digital lives” shares promising results with regard to what the companies operating online platforms are required by law to adhere to.
To investigate whether legislation and regulation have a benefit in increasing safety for children, data was obtained from the websites of Meta, Snapchat, Google and TikTok for the period 2017–24, drawing on the changes the companies themselves had publicly announced about how well they were meeting requirements.
The companies, selected because of their extensive use by children, were scored on two dimensions: (1) risks in the OECD’s typology (contact, conduct, consumer, content and cross-cutting) and (2) four types of changes to the design of the service as delivered: by default (including algorithms), (automated) tools, information and support.
These modern-day capabilities of AI and automation are responsible for targeting harmful content at teenagers and children. The findings indicate that the OECD risk category attracting the most changes was content risk, with 56 changes, followed by cross-cutting (41), contact (16), consumer (11) and conduct (4).
The findings reveal 128 changes recorded over the period of the project, with a peak of 42 changes in 2021, coinciding with the UK’s Age Appropriate Design Code (AADC) coming into effect.
Meta, which owns Facebook and Instagram, was found to be the most active company, announcing 61 changes.
The most common type of change was ‘by default’, with 63 changes, followed by tools (37), information (21) and support (7).
The project’s conclusions come with a full disclaimer: while regulation is shown to have “substantive benefits in protecting children online”, further research is needed to assess the full extent of those benefits, especially as the EU’s Digital Services Act (DSA) and the UK’s Online Safety Act (OSA) are fully implemented through 2025 and 2026.