
Brussels Weighs Measures to Protect Minors Online, Pressuring Apple and Google


EU Commission Considers Device-Level Age Verification for App Stores

Brussels, Belgium – The European Commission is open to exploring age verification for digital platforms conducted directly on users’ devices or within app stores, signaling a potential shift in its approach to child online protection. This comes as the Commission releases the results of a recent survey seeking public opinion on the age-access filters currently employed by tech giants Apple and Google.

The survey results, published on Friday, precede next week’s release of new EU guidelines on protecting minors online, a key component of the Digital Services Act (DSA). Brussels also plans to launch a beta version of a European age verification application.

The survey specifically probed whether Apple’s iOS App Store and Google Play Store have sufficient measures in place to prevent minors from accessing harmful content. This focus suggests the Commission may be considering holding the two major technology companies more accountable for age verification.

Meta (formerly Facebook), which does not operate its own operating system, has been advocating for age verification to be performed on the device itself. In contrast, Google and Apple prefer that individual applications be responsible for verifying user ages, rather than the platforms hosting them.

While the European Commission has previously favored an application-level verification process, Friday’s survey results indicate a willingness to consider device-level verification as well. Meta has been actively campaigning in Brussels to promote this approach.

Both Apple’s and Google’s app stores are classified as Very Large Online Platforms (VLOPs) under the DSA. This designation imposes a legal obligation on them to prevent minors from accessing sensitive or harmful content.

“Their scale and influence make them key actors in the application of age assurance methods, including age verification tools, as well as related solutions, such as parental controls and age classification,” the Brussels survey stated.

Survey participants were asked whether the two app stores have adequate measures in place to prevent children from accessing content deemed inappropriate for their age.

“The Commission has commissioned an external contractor to carry out a study on the supervision of app store ecosystems under the Digital Services Act,” announced Thomas Regnier, a spokesperson for the Commission.

Regnier further elaborated that the survey findings will provide more insight into the measures VLOPs, “including app stores,” are taking to protect minors online.

He concluded, “The Commission considers that all online platforms, including app stores, must comply with the obligations imposed by the Digital Services Act and protect minors online.”

What specific measures is the European Commission considering to enforce stricter age verification beyond self-reporting?


The Expanding Regulatory Landscape for Child Safety Online

The European Commission is intensifying its scrutiny of tech giants Apple and Google, considering new measures to bolster online safety for children. This push comes amid growing concerns about exposure to harmful content, data privacy violations, and the potential for online exploitation of minors. The proposed regulations aim to significantly increase platforms’ responsibility for protecting young users, potentially reshaping the digital landscape for both companies and consumers. Digital Services Act (DSA) compliance is at the heart of these discussions.

Key Proposed Regulations & Their Impact

Brussels is exploring several avenues to enhance child online protection, including:

Stricter Age Verification: Moving beyond self-reporting, the Commission is evaluating methods for more robust age verification systems. This could involve digital ID solutions or collaboration with telecom providers. The goal is to prevent underage access to age-restricted content and services; a sketch of how a device-level age signal might work appears after this list.

Enhanced Parental Controls: Regulations are expected to mandate more granular and user-friendly parental control tools within app stores and on platforms. These tools would allow parents to manage screen time, filter content, and monitor their children’s online activity more effectively.

Prohibition of Manipulative Techniques: The Commission is targeting “dark patterns,” deceptive design choices that manipulate users, especially children, into making unintended purchases or sharing personal information. This aligns with broader efforts to promote digital wellbeing.

Increased Transparency Requirements: Apple and Google may be required to provide greater transparency regarding their algorithms and content moderation practices, specifically concerning content viewed by minors.

Data Minimization for Children: Regulations are likely to emphasize the principle of data minimization, limiting the collection and processing of personal data belonging to children. This is a core tenet of the General Data Protection Regulation (GDPR).
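To make the device-level approach concrete, here is a minimal sketch of how an app might consume an operating-system-level age signal instead of verifying ages itself. This is purely illustrative: neither Apple nor Google currently exposes such an API, and the AgeBracket, DeviceAgeSignalProvider, and ContentGate names are invented for this example.

```swift
import Foundation

/// Hypothetical coarse age signal. A device-level scheme would expose
/// only a bracket, never the user's exact birth date, keeping the data
/// an app receives to a minimum.
enum AgeBracket {
    case under13
    case teen
    case adult
    case unknown
}

/// Hypothetical interface the OS or app store would implement after
/// verifying the user's age once, at the device or account level.
protocol DeviceAgeSignalProvider {
    func currentAgeBracket() -> AgeBracket
}

struct ContentGate {
    let provider: DeviceAgeSignalProvider

    /// Apps consult the shared signal rather than running their own
    /// verification, and fail closed when no bracket is available.
    func mayShowMatureContent() -> Bool {
        switch provider.currentAgeBracket() {
        case .adult:
            return true
        case .under13, .teen, .unknown:
            return false
        }
    }
}
```

Two design points are worth noting: the age is verified once at the platform level, which is the model Meta has lobbied for, and the signal carries only a coarse bracket, consistent with the data minimization principle in the last item above.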

Pressure on Apple’s App Store and Google Play

The proposed measures directly impact both the Apple App Store and Google Play Store. Both platforms are facing pressure to:

Improve App Review Processes: Strengthen scrutiny of apps targeting children or potentially accessible to them, focusing on data privacy, content appropriateness, and the presence of manipulative techniques.

Enforce Age Ratings More Effectively: Ensure accurate and consistent application of age ratings, preventing apps with inappropriate content from reaching younger audiences.

Provide Clearer Information to Parents: Offer easily accessible information about app privacy policies and data collection practices.

Facilitate Reporting Mechanisms: Streamline the process for reporting harmful content or apps targeting children.

The Role of the Digital Services Act (DSA)

The Digital Services Act (DSA), which became fully applicable in February 2024, provides the legal framework for these new regulations. The DSA designates Very Large Online Platforms (VLOPs), including major social media platforms and search engines, as having special obligations to protect users, including minors. The Commission is leveraging the DSA to enforce stricter standards and hold platforms accountable for failing to protect young users. Online content moderation is a key focus.

Real-World Examples & Case Studies

Several recent incidents have fueled the push for stronger regulations:

TikTok Fines (2023): The Irish Data Protection Commission (DPC) fined TikTok €345 million for violating GDPR rules related to processing the personal data of child users. This highlighted the vulnerabilities in existing data protection measures.

YouTube’s COPPA Settlement (2019): YouTube paid a $170 million settlement to the US Federal Trade Commission (FTC) for violating the Children’s Online Privacy Protection Act (COPPA) by illegally collecting data from children.

Growing Concerns about Online Grooming: Reports of online grooming and sexual exploitation of minors continue to rise, prompting calls for more proactive measures to protect vulnerable children.

Benefits of Enhanced Online Child Safety

Stronger regulations offer several benefits:

Reduced Exposure to Harmful Content: Protecting children from exposure to inappropriate or dangerous content, such as violence, hate speech, and misinformation.

Enhanced Data Privacy: Safeguarding children’s personal data and preventing its misuse.

Improved Digital Wellbeing: Curbing manipulative design techniques and giving parents better tools to manage screen time, supporting healthier online habits for young users.
