In an effort to combat the spread of suicide and self-harm content online, the nonprofit Mental Health Coalition (MHC) has introduced a new program called Thrive. The program, which counts Meta, Snap, and TikTok as founding members, aims to encourage online platforms to share “signals” of potentially harmful material. Participating platforms will be able to share hashes, which act as unique digital fingerprints, of graphic suicide and self-harm content as well as content depicting or promoting viral challenges. The hashes will be linked only to the content itself and will not contain identifiable information about any accounts or individuals.
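To make the idea of sharing hashes rather than content concrete, here is a minimal sketch of how a platform might turn a flagged file into a shareable signal. The function names, the category labels, and the use of a plain SHA-256 digest are illustrative assumptions, not Thrive's actual implementation; real signal-sharing systems often rely on perceptual hashes so that re-encoded copies of the same media still match.

```python
import hashlib
import json


def fingerprint(media_bytes: bytes) -> str:
    """Return a hex digest for a piece of media.

    Illustrative only: a plain SHA-256 of the raw bytes stands in for
    whatever hashing scheme Thrive actually uses, which the program has
    not publicly detailed.
    """
    return hashlib.sha256(media_bytes).hexdigest()


def build_signal(media_bytes: bytes, label: str) -> str:
    """Package a shareable signal containing only the content hash and a
    category label: no account IDs, usernames, or other personal data."""
    return json.dumps({
        "hash": fingerprint(media_bytes),
        "label": label,  # e.g. "graphic-self-harm" or "viral-challenge" (hypothetical labels)
    })


# Example: hash a locally flagged upload and produce a signal to share.
with open("flagged_upload.jpg", "rb") as f:  # hypothetical file
    print(build_signal(f.read(), "graphic-self-harm"))
```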
Meta has provided the technical infrastructure for Thrive, the same infrastructure that powers the Tech Coalition’s Lantern child safety program. Thrive members will be able to aggregate information about self-harm content and receive alerts when shared material raises concerns or violates their policies. Each company will then independently decide whether to take action based on that information, as sketched below.
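The receiving side of that workflow might look something like the following sketch, where a platform checks hashes it receives against its own content index and queues any matches for review. The function name, data structures, and example values are hypothetical; the article only specifies that matching and enforcement decisions are left to each member company.

```python
def handle_alert(shared_hashes: set[str], local_catalog: dict[str, str]) -> list[str]:
    """Compare hashes received through the program against a platform's own
    content index and return the internal content IDs that need review.

    local_catalog maps a content hash to the platform's internal content ID.
    A match only flags an item for review; whether anything is removed is
    governed by each company's own policies.
    """
    return [
        content_id
        for content_hash, content_id in local_catalog.items()
        if content_hash in shared_hashes
    ]


# Hypothetical usage: two incoming hashes, one of which matches local content.
alerts = {"3f5a9b", "9c2e41"}
catalog = {"9c2e41": "post_84121", "77aa03": "post_10007"}
print(handle_alert(alerts, catalog))  # -> ["post_84121"]
```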
Thrive’s director, Dan Reidenberg, will oversee the operational aspects of the program. Participating companies will be responsible for uploading, reviewing, and acting on any content shared through Thrive. Kenneth Cole, founder of the MHC, expressed excitement about the collaboration and praised Meta, Snap, and TikTok for their commitment to addressing suicide and self-harm content.
Notably absent from Thrive is X, formerly known as Twitter, which has faced criticism for its moderation practices. Data shows that X has a smaller moderation team than other major social platforms, due in part to owner Elon Musk’s decision to cut the number of engineers dedicated to trust and safety. Google, which owns YouTube, is also not a member; YouTube has been criticized for failing to protect users from self-harm content.
Recent studies have shown a link between heavy social media use and reduced well-being, including mood disorders such as depression and anxiety. Meta, Snap, and TikTok have all faced lawsuits accusing them of contributing to users’ mental health problems. In one prominent case, a British coroner ruled that self-harm content a 14-year-old girl viewed on Meta-owned Instagram contributed to her death.
