Age Assurance, Digital Rights, and the Cost of Simplification

[Header image: “Age Assurance in Europe: Privacy, Rights & What’s Changing”]

A conversation with Svea Windwehr (D-64)

Regulation is moving faster than the infrastructure meant to support it.

Across Europe, age assurance frameworks are being introduced with increasing urgency. The stated goal is clear: protect minors. But the mechanisms being proposed—and in some cases implemented—raise more complex questions about privacy, proportionality, and who can realistically comply.

To ground this discussion beyond abstraction, we collaborated with policy expert Svea Windwehr of D-64 – Zentrum für Digitalen Fortschritt to better understand what is at stake.

What regulators are trying to solve

From a digital governance perspective, age assurance is not driven by a single policy objective.

As Svea Windwehr explains, one long-standing focus has been to block users below a certain age threshold from accessing entire categories of content or services. Historically, that has applied mainly to gambling, pornography, and websites selling alcohol or tobacco. Increasingly, social media platforms are also being considered within that logic.

At the same time, Svea points to a more nuanced debate that is beginning to take shape. Rather than focusing only on blocking access, some policymakers are looking at age signals as a way to create more age-appropriate online environments for young people. In practice, that could mean adapting platform features, default settings, or other parts of the user experience depending on age.

The distinction matters. One approach is primarily restrictive; the other is more focused on adaptation. Yet both leave unresolved the question of whether age can be assessed online in a way that is private, secure, and does not disenfranchise individuals or communities, particularly those traditionally underserved or exploited by technology providers.

Beyond the binary of safety vs. rights

The public debate is often framed as a conflict between youth protection and digital rights. Svea suggests that this framing is too simplistic.

In her view, young people need and have a right to both: safety and fundamental rights online, including privacy and the ability to explore, play, and express themselves freely. Privacy and safety are often presented as competing values, but, as she notes, this is not necessarily a zero-sum trade-off. Privacy-protective defaults and confidential messaging services, for example, are crucial to young people’s safety online.

What follows from that is not a need to choose one value over the other, but a need to take seriously the challenge of designing systems that support both, and to prioritize holistic approaches to empowering young people online.

Where implementation becomes difficult

This is where the debate becomes more fragile.

Svea notes that most age assurance systems currently available tend to fall short in one of three ways: they may lack privacy and security, they may not be widely accessible, or they may be too easy to circumvent. That raises a difficult question for platforms: when is it proportionate to implement a system that may solve one problem while creating another?

For smaller providers in particular, this is not an abstract policy issue. It becomes a practical and operational one. As Svea points out, they are often left to navigate these trade-offs on their own, without enough guidance from policymakers on how to weigh fundamental rights, robustness, and accessibility in a fair and workable way.

Proportionality and the bias toward scale

For independent platforms, “proportionate compliance” is often easier to invoke than to define.

Svea highlights a structural issue in current policymaking: big tech’s products and practices frequently serve as the default mental model when online safety laws are developed. This can flatten the wide diversity of services and platforms that fall under the same rules, even when their scale, resources, and public function differ significantly.

The result is a compliance burden that smaller platforms and not-for-profit projects may struggle to absorb. As Svea notes, this affects not only existing providers but also future ones. When regulation is built around the capabilities of the largest actors, it becomes harder for new and innovative alternatives to emerge. In that sense, the issue is not only about compliance costs. It is also about whether policy unintentionally reinforces the dominance of big tech.

What is at stake for sexual health and adult education platforms

This becomes particularly relevant for independent platforms operating in lawful but sensitive areas such as sexual health and adult education.

Svea warns that a heavy-handed approach to online safety—especially one that does not account for the diversity and value of smaller, more specialised platforms—risks harming an ecosystem that provides important public resources. That would not only affect the platforms themselves, but also the people who rely on them to explore, learn, or educate their communities.

As she notes, this would also matter for young people. Removing or weakening responsible spaces for engagement with sexuality and sexual health does not eliminate the need for those conversations. It may simply reduce the quality, care, and responsibility with which they happen.

What users are likely to feel

For adult users of lawful platforms, age assurance is likely to be experienced less as an abstract policy framework and more as friction.

Svea notes that age assurance always introduces some form of friction between the user and the platform. Even if that friction can be mitigated, she suggests it may still leave users feeling less private and less empowered over time.

That change in experience is significant. It points to the fact that age assurance is not only a technical system or legal requirement. It also reshapes how people relate to digital spaces and what kind of trust they feel those spaces deserve.

Why privacy concerns are real

Privacy concerns around age verification are not incidental. They depend heavily on the system being used.

Svea points in particular to age estimation technologies based on facial images, video, or behavioural signals. These systems, she notes, often rely on very large amounts of data collection. They can also involve opaque value chains, where users are expected to hand over sensitive information to corporations they know very little about.

That opacity matters. It affects not just compliance, but legitimacy.

Where XO stands

XO operates at the intersection of education, sexuality, and media. That requires us to engage with these regulatory developments seriously, but also with precision.

We support the need to protect minors online. At the same time, we believe this objective cannot be separated from questions of privacy, proportionality, access, and the structural impact of regulation on independent platforms.

Like Svea, we believe the conversation must move beyond false binaries. Safety and digital rights should not be treated as mutually exclusive. And if compliance is to be meaningful, it must take into account the actual diversity of platforms, not only the capabilities of the largest ones.

This is especially important in areas such as adult education and sexual health, where lawful platforms may serve a broader educational and social function than regulation often acknowledges.

One step forward

This is not a settled conversation.

Age assurance frameworks will continue to evolve, and so will the debates around them. What matters now is resisting simplification—especially when simplification obscures the real trade-offs involved.

We will continue following these developments, engaging with experts and partners, and contributing to a conversation that is more precise, more proportionate, and more attentive to the realities of independent platforms.

Because clarity will not come from flattening the issue.

It will come from examining it properly.

Author:

Anarella Martínez Madrid