Minors are traditionally framed as subjects in need of heightened protection in private law relationships. However, they are also developing individuals, whose agency and autonomy should be respected and fostered.
This tension is particularly visible in the digital environment. The internet is now an integral part of children’s lives, offering a wide range of opportunities for learning, exploring, and interacting with others. Nevertheless, the online environment exposes children to numerous risks, such as cyberbullying, fraudulent marketing practices, grooming, abuse, and exploitation. The scale of these problems is alarming, with a significant proportion of children experiencing some form of online harm. Furthermore, the rise of generative artificial intelligence creates additional risks for young users, who may fall victim to AI hallucinations, disinformation, misinformation, and deepfakes.
Article 28 of the Digital Services Act (DSA) emerges in this landscape as one of the most interesting – yet still largely unexplored – provisions among the European “digital acts”. Article 28 aims to tackle the online protection of minors, imposing on platforms the obligation to take appropriate and proportionate measures to ensure a high level of “privacy, safety and security” for minors online.
This blog post suggests a specific interpretation of Article 28 DSA. I argue that, if Article 28 DSA is read in the light of the best interests of the child, it can support a comprehensive conception of minors’ “privacy, safety and security”, one capable of incorporating interests such as data protection, freedom of expression, education, and self-determination, in line with European and international human rights charters. I also maintain that privacy, safety, security, and the other fundamental rights underlying Article 28 DSA are particularly well suited to enforcement through aggregate litigation. If these interests are interpreted on a collective basis, Article 28 DSA can act as a driving force to shape a child-friendly digital environment.
The digital vulnerability of minors in online markets
In many legal systems, minors enjoy a specific legal status, defined by their age. Thus, age operates as a convenient proxy, in terms of legal certainty, for assessing cognitive development, awareness, and decision-making capacity in legal transactions.
Nevertheless, the awareness and capabilities of minors vary according to their age (a six-year-old and a sixteen-year-old differ significantly from one another), their level of education, their social background, and their relationship with their guardians.
European “digital law” also relies on age in granting specific protection. The GDPR, the DSA, and the AI Act all treat minors as vulnerable subjects. The concept of digital vulnerability has been extensively addressed by legal scholars (Malgieri, 2023; Grochowski, 2025; in this Symposium, see Giacalone-Manz). Minors are intrinsically vulnerable, as they are particularly exposed to the risk of being harmed. While children often use online tools better than their guardians, they are not necessarily fully aware of the consequences of their acts. In other words, digital ability does not mean digital literacy.
And yet, the online environment also allows minors to express their personal identity, exercise their freedom of expression and assembly, access educational resources, and participate in social life. This ambivalence should be taken into account when formulating, interpreting, and applying legal rules, in order to avoid a paternalistic approach that undermines minors’ agency.
Nonetheless, there seems to be an increasing tendency to impose bans on the use of social media by minors. Australia has recently banned children below the age of sixteen from creating and using social media accounts, and the European Parliament has advocated for the adoption of a similar measure. The French Parliament is working on a bill banning the use of social media for users below the age of fifteen, and other countries, such as Denmark, Germany, Spain, and Greece, are evaluating the possibility of adopting similar measures. However, bans curtail freedom of expression and connection, and are extremely difficult to enforce, which risks rendering them ineffective as well. It is crucial to protect children by decreasing – rather than attempting to neutralize – the risks to their fundamental rights. The DSA shows great potential in this respect.
“Privacy, safety and security” of minors
Article 28 DSA requires all platforms “accessible to minors” to adopt “appropriate and proportionate measures” ensuring a “high level of privacy, safety, and security” for children. The provision’s scope of application, and its potential effects, are vast. Unlike the risk assessment and mitigation measures provided for in Articles 34 and 35 DSA, which should also prevent harm to children but apply only to very large online platforms (VLOPs), Article 28 DSA applies irrespective of the platform’s size (except for small and medium-sized enterprises, as per Article 19 DSA) and of the service offered. Rental platforms, marketplaces, dating apps, and social media services are all potentially covered by this provision, provided that they are accessible to minors.
At the same time, the provision’s wording leaves some doubts and open questions. The European Commission Guidelines on the interpretation of Article 28 DSA define minors as users below the age of eighteen – unlike the GDPR, which sets the age of consent for data processing at sixteen (Article 8 GDPR). Moreover, Article 28 DSA is silent as to when a platform is accessible to minors. Recital 71 states that a platform is accessible to minors when it targets minors, when it is predominantly used by them, or when the provider is otherwise aware that some minors are using its services. However, what counts as “predominantly” used by minors? How many minors are “some minors”? And how should platforms effectively assess whether these thresholds are met without “processing additional personal data”, as Article 28(3) prescribes?
It is also difficult to identify the “appropriate and proportionate measures” which platforms should adopt, and to distinguish them from the risk-assessment obligations of Article 34 DSA. The EC Guidelines provide some clarity in this respect, specifying the need to integrate privacy, safety, and security into the design of platforms by default, for instance by implementing age assurance methods or recommender systems that account for diversity, inclusiveness, and fairness. However, the Guidelines also adopt a risk-based approach, and the difference from the measures laid down in Section 5 DSA is not crystal clear in practice.
Difficulties also remain in interpreting the true meaning of “privacy, safety and security”. To begin with, these interests may conflict with one another. For instance, parental controls may enhance the security of minors while undermining their privacy. And while it may be understandable to limit the privacy of a ten-year-old for the benefit of security, this is far less the case for a sixteen-year-old. Moreover, Article 28 DSA does not mention other fundamental rights, such as data protection, freedom of expression and information, and the right to education.
However, it is possible to strike a balance between conflicting interests on a case-by-case basis, in the light of the best interests of the child. The best interests of the child – a principle and interpretative tool increasingly employed by courts, and acknowledged by fundamental rights charters – could help resolve conflicts between competing rights, ensuring the full development of minors’ personality without sacrificing their protection. The principle is in fact enshrined in Article 24(2) of the EU Charter of Fundamental Rights (CFR), which is primary EU law, and is also recalled in the EC Guidelines.
Furthermore, the best interests of the child may warrant interpreting “privacy, safety and security” broadly, as encompassing other fundamental rights, such as freedom of expression and information, data protection, non-discrimination, and access to education. These rights are all acknowledged in the CFR, and the Guidelines mention them as rights that platforms must include in their assessments when implementing Article 28. Safeguarding them is crucial to developing a safe and secure digital environment, where minors can participate and express their personality meaningfully.
The collective dimension
Ensuring the effectiveness of Article 28 DSA is of the utmost importance, and the task cannot be left to public authorities alone. Public enforcement under the DSA is well structured, but it is also complex and requires substantial resources. Moreover, there is still some uncertainty over the powers of the Digital Services Coordinators, whose role overlaps in many respects with that of other supervisory authorities.
Hence the need to consider private enforcement and judicial remedies, and their collective dimension. Platforms’ design affects minors in a structurally similar way. In other words, a failure to implement appropriate measures ensuring minors’ privacy, safety, and security does not only affect the individual rights of underage users; it systemically impacts the well-being of all minors accessing the platform.
The interests protected by Article 28 DSA thus have a group dimension: they are collective interests of a specific group of recipients, namely minors, as well as general interests of society. Furthermore, due to their vulnerability, minors and their guardians often struggle to access judicial remedies individually, especially given the non-material nature of the harms they might suffer, which may appear trivial on an individual scale and not worth litigating.
The DSA provides two avenues for aggregating disputes. First, Article 90 DSA refers to the Representative Actions Directive (RAD). In addition, Article 86 DSA introduces a mandate-based mechanism, allowing qualifying organizations to bring collective actions in the interest of the recipients of information society services, beyond the RAD, similarly to Article 80 GDPR (Federico, 2024, 219). Read in conjunction with Article 54 DSA, which grants individuals the right to seek compensation for damages, this provision supports the existence of a private right of action for enforcing the DSA collectively, requesting injunctive or compensatory relief. Accordingly, where national collective proceedings regimes exist, these, alongside the consumer representative action, can be used to enforce the DSA.
The collective enforcement of the DSA is already happening, as some recent proceedings attest. Several scenarios for further enforcement actions under Article 28 DSA can be envisaged in the light of the Guidelines: for instance, when age-assurance methods are not properly implemented, causing harm to minors, a collective claim may be brought. Such claims may be brought by consumer organizations (as minors are also young consumers) or by other non-profit associations having standing under the various European jurisdictions.
The DSA articulates a vision of a digital environment where minors are conceived not as passive objects of protection, but as active agents who may thrive in the digital world. If read in the light of the child’s best interests, and enforced collectively, Article 28 DSA may contribute significantly to realizing this goal.
(Photo: Max Harlynking)