Brussels is weighing new regulations that could restrict access to popular social media platforms like TikTok, Instagram, and Snapchat for younger users. The European Commission has tasked a group of experts with developing proposals to better protect children and adolescents online, including an assessment of whether a legal minimum age for social media use is necessary. This move comes amid growing concerns about the impact of social media on youth mental health and well-being, and increasing pressure from member states to take action.
The debate centers on finding the right balance between protecting vulnerable users and upholding fundamental rights. While a blanket ban could offer a degree of safety, it also raises questions about freedom of expression and access to information. The Commission’s approach will likely involve navigating complex legal and technical challenges, as well as reconciling the varying positions of EU member states. The core question is how to verify age online effectively without compromising user privacy.
What’s Driving the Push for Regulation?
European Commission President Ursula von der Leyen announced the formation of the expert group in response to mounting calls for greater online safety for young people. The group is expected to deliver recommendations by the end of the summer, focusing on how to best safeguard children and adolescents online. A key consideration is whether a statutory minimum age for social media is the most effective solution. Currently, such a measure isn’t provided for in EU rules, meaning any implementation would require new legislation or regulatory changes proposed by the European Commission.
The move also highlights a tension between national sovereignty and EU-level regulation. While individual EU member states can impose certain obligations on their citizens, enforcing age restrictions on large online platforms requires a coordinated approach. According to experts, the EU Commission holds sole responsibility for prescribing and enforcing rules on these platforms, meaning individual countries may be limited in their ability to impose additional requirements like age verification. Stephan Dreyer, a media law expert at the Leibniz Institute for Media Research in Hamburg, noted that simply “criminalizing or sanctioning children” for seeking their own protection is a problematic approach.
Existing EU Frameworks and the Digital Services Act
Despite the current lack of specific age restrictions, existing EU regulations already address the protection of minors online. The Digital Services Act (DSA), a landmark piece of legislation governing online content, requires platforms like YouTube, Instagram, TikTok, and Snapchat to take “appropriate and proportionate measures” to protect underage users. However, the DSA does not explicitly mandate age verification; in fact, it specifies that platforms are not obliged to collect additional personal data in order to assess users’ ages.
The Commission argues that platforms that don’t verify age must find alternative ways to protect young people from harmful content, such as pornography or violent material. This interpretation is contested, and it remains unclear whether European courts would uphold this view. The DSA’s Article 28 specifically addresses “Online protection of minors,” outlining the responsibilities of platforms.
Growing Support and Potential Solutions
There’s increasing momentum across the EU for stricter rules regarding youth access to social media. In October 2025, a majority of EU heads of state and European Parliament members voiced support for a minimum age requirement. France is already moving forward with its own legislation, which has cleared a hurdle in parliament. However, some EU countries with more liberal stances may resist, citing concerns about infringing on the fundamental rights of children and adolescents.
Within Germany, the governing coalition is divided, with the CDU and SPD advocating for a complete ban on social media for children under 14, while the CSU remains unconvinced. The SPD has even suggested utilizing the EU Digital Identity Wallet (EUDI Wallet) for age verification. The EUDI Wallet, expected to be available in Germany by early 2027, is designed to allow for anonymous age verification without storing personal data like names or birthdates. The EU Commission believes that integrating this system, or a similar alternative, could fulfill the DSA’s requirements for age control.
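The privacy promise behind such a wallet is selective disclosure: a platform learns only a yes/no answer to an age question, never the underlying birthdate. The EUDI Wallet’s actual protocols are not shown here; the following is a minimal illustrative sketch of that idea, in which an HMAC over a shared demo key stands in for the issuer’s real cryptographic signature (real systems use asymmetric signatures or zero-knowledge proofs), and all names are hypothetical.

```python
# Toy sketch of selective disclosure for age checks. The key point: the
# wallet derives a signed boolean claim ("over 16: yes/no") and the
# platform verifies that claim alone -- the birthdate never leaves the
# wallet. HMAC with a shared demo key stands in for a real issuer signature.
import hashlib
import hmac
import json

ISSUER_KEY = b"demo-issuer-key"  # hypothetical; real issuers hold asymmetric keys

def issue_age_attestation(birth_year: int, threshold: int, current_year: int) -> dict:
    """Wallet side: turn a birthdate into a signed boolean claim."""
    claim = {"age_over": threshold, "value": current_year - birth_year >= threshold}
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def platform_verifies(attestation: dict) -> bool:
    """Platform side: check the signature and the boolean, nothing else."""
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation["sig"]) and attestation["claim"]["value"]

att = issue_age_attestation(birth_year=2012, threshold=16, current_year=2025)
print(platform_verifies(att))  # prints False: a 13-year-old fails the over-16 check
```

Note that the platform only ever sees `{"age_over": 16, "value": false}` plus a signature; forging the boolean without the issuer’s key fails verification.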
What’s Next?
The expert group’s recommendations will be crucial in shaping the future of social media regulation in the EU. The Commission will then need to decide whether to propose legislative changes or new regulations. The debate is likely to continue, with stakeholders weighing the benefits of increased protection against the potential drawbacks of restricting access to online platforms. The success of Australia’s recently implemented social media laws, which bar users under 16 from holding accounts on major platforms, will be closely watched by the EU. Von der Leyen has indicated she is following the implementation of Australia’s policy closely to inform the EU’s approach.
The path forward remains uncertain, but the growing consensus around the need to protect young people online suggests that some form of regulation is likely. What form that regulation will take – a complete ban, age verification, or a combination of measures – remains to be seen.