Discord is testing a new age verification process that involves scanning a user’s face or ID to access sensitive content, marking a significant shift in how the platform handles mature material. The feature is currently being trialed in Australia and the UK, where new laws aim to restrict children’s access to such material online.
The age verification process is triggered when a user encounters content flagged as nude or sexually explicit, or attempts to disable content filters. In these situations, users are prompted to verify their age through a one-time process: either by allowing Discord to access their device’s camera to scan their face, or by scanning a QR code and uploading a photograph of their ID.
This measure is part of Discord’s efforts to comply with local laws, such as the UK’s Online Safety Act, which requires online platforms hosting pornographic content to introduce “robust” age-checking techniques. Discord emphasizes that the information submitted during the age verification process will not be stored by the company or its vendors. The face scan tool operates on-device, preventing the collection of biometric information, and the ID scan is deleted upon verification.
Users whose age is assessed incorrectly, or who are banned as a result, can retry the process, request a manual review, or appeal the decision. The trial has raised questions about whether this age verification process will expand to other regions. While Discord has not confirmed plans for a broader rollout, its response to the new laws in the UK and Australia suggests a willingness to adapt to evolving regulatory landscapes. As the experiment continues, its outcome will likely influence how Discord and other social media platforms approach age verification and content moderation.