Last updated: Wednesday, 28 May 2025

These new guidelines aim to support platforms used by minors and ensure a high level of privacy, safety, and security for children, as required by the EU Digital Services Act. The European Commission is now seeking feedback from all stakeholders, which can be submitted until 10 June.

The guidelines outline a list of measures that all platforms can implement to protect minors (with an exception for micro and small enterprises), while avoiding restrictions on children's rights to participation, information, and freedom of expression.

For example, platforms should:

  • Implement age assurance measures that reduce the risks of children being exposed to pornography or other age-inappropriate content.
  • Set children’s accounts as private by default, thus reducing the risk of unsolicited contact by strangers.
  • Adjust their recommendation systems and prioritise explicit signals from users on whether or not they like the content they see, reducing the risk of children ending up in rabbit holes of harmful content.
  • Enable children to block and mute any user and ensure they cannot be added to groups without their explicit agreement, which may contribute to reducing the risk of cyberbullying.

The European Commission is seeking feedback on the proposed guidelines from all stakeholders, including children, parents and guardians, national authorities, online platform providers, and experts.

Find out more about the guidelines to protect minors under the Digital Services Act and give your feedback before 10 June 2025.
