Meta wants a common age limit set across all digital services, as it bows to pressure from campaigners and regulators to introduce safety features and address the ‘dark side’ of its services. The social media giant and other tech corporations have been reprimanded for contributing to the deaths of young people exposed to dangerous, algorithmically amplified content online.

Meta’s proposed age threshold would give parents the power to decide when their teens are ready to access digital services.

“We think an EU wide digital majority age can be an effective solution to the challenge of ensuring teens have a safe, appropriate experience online,” Helen Charles, policy director at Meta, told MLex in an interview.

By addressing child safety concerns on its own platforms in the European Union, Meta hopes to placate EU countries pushing for national solutions, including strict social media bans and mandatory identity verification for tech companies.

Meta, which owns Facebook and Instagram, said the safety of young people is its “top priority.” With the online world now posing harms at least as serious as the physical one, the company says it has spent the past decade building and adapting products with teens in mind, and it has acknowledged that AI could drive a proliferation of harmful content into social feeds. There is growing awareness that youths need “consistent protection” across “all the different digital platforms they use”, which is the rationale for establishing a common Digital Majority Age across EU member states. Research shows that three-quarters of EU parents want to be involved in their children’s digital lives: to oversee what their children view online and to give parental consent to accounts until their children turn 16.

Meta agrees that parents are the primary teachers of children about dangers both online and offline, and that parents should have the final say on which online services their children use.

“Regulation should empower this, underpinning their ability to make decisions for their family,” Charles said.

The mandate should cover all digital services, not just social media platforms, because teens engage with a wide variety of apps — at least 40 per week — spanning gaming, streaming, messaging, and browsing.

If regulators focused only on social media sites, they would miss the “bigger picture” of harms that persist across the wider web, and unwittingly accept less safe digital spaces.

Robust age verification mechanisms, enforced on tech providers, are also critical to avoid placing the burden on parents alone; such mechanisms should be reliable, easy to use and privacy-preserving. Meta emphasises that verification should happen at the app-store or operating-system level, where providers have first contact with users and can verify their age and identity — though this approach deflects some responsibility away from individual tech services.