Meta, TikTok, and Snap will participate in an external grading process that evaluates social platforms on how well they protect adolescent mental health, alongside YouTube and numerous other platforms.
The program was created under the Mental Health Coalition’s Safe Online Standards (SOS) initiative and comprises roughly two dozen standards spanning platform policy, functionality, governance, transparency, and content oversight. Dr. Dan Reidenberg, Managing Director of the National Council for Suicide Prevention, leads the SOS initiative.
The Mental Health Coalition stated that SOS produces user-informed data on how social media, gaming, and digital platforms design products, protect users aged 13–19, and address exposure to suicide and self-harm content. Participating companies voluntarily submit documentation on their policies, tools, and product features, which an independent panel of global experts then evaluates.
Platforms will receive one of three ratings: “use carefully,” “partial protection,” or “does not meet standards.” The highest rating, “use carefully,” requires accessible, easy-to-use reporting tools, along with privacy, default, and safety settings that are clear and simple for parents to configure. It also requires that platform filters and moderation help reduce exposure to harmful or inappropriate content. Platforms achieving this rating will receive a blue badge to display.
A rating of “partial protection” indicates that some safety tools exist but may be difficult to locate or use, while “does not meet standards” is assigned if filters and content moderation do not reliably block harmful or unsafe content.
The Mental Health Coalition, founded in 2020, has counted Facebook (now Meta) among its partners since its inception. In 2021, it announced plans to partner with Facebook and Instagram to destigmatize mental health and connect people to resources during the COVID-19 pandemic.
In 2022, the nonprofit published a case study, supported by Meta, which concluded that mental health content on social media can reduce stigma and increase individuals’ likelihood of seeking resources, positively impacting mental health. In 2024, the Mental Health Coalition partnered with Meta to launch the Time Well Spent Challenge, which encouraged parents to have meaningful conversations with teens about healthy social media use.
That same year, the partnership with Meta established “Thrive,” a program that lets tech companies share data about content violating suicide or self-harm guidelines. The Mental Health Coalition lists Meta as a “creative partner” on its website.
Meta, however, has faced allegations regarding its handling of mental health concerns, including claims that it suppressed internal data from “Project Mercury,” a research project begun in 2020, showing the negative effects of its products on users’ mental health. Meta is currently on trial in California over allegations that its addictive products harm children, one of several pending lawsuits against the company.
Other participants in the rating program include Roblox and Discord, both of which have faced concerns regarding child wellbeing on their platforms. Discord has enhanced its age-verification processes in response to child endangerment concerns.