Meta, TikTok, and Snap commit to joining suicide and self-harm content prevention program.

In an effort to combat the spread of suicide and self-harm content online, the nonprofit Mental Health Coalition (MHC) has introduced a new program called Thrive. The program, which counts Meta, Snap, and TikTok as founding members, aims to encourage online platforms to share “signals” of potentially harmful material. Platforms will be able to share hashes (unique digital fingerprints) of graphic suicide and self-harm content, as well as content depicting or promoting viral challenges. The hashes will be linked only to the content itself and will not contain identifiable information about accounts or individuals.
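
To make the mechanism concrete, here is a minimal sketch of what hash-based signal sharing of this kind can look like. It is illustrative only: Thrive's actual fingerprinting method is not public, and the names used here (ContentSignal, compute_hash, build_signal) are hypothetical stand-ins.

```python
# Illustrative sketch of hash-based signal sharing, not Thrive's actual implementation.
# SHA-256 stands in for whatever fingerprinting the member platforms really use.
import hashlib
from dataclasses import dataclass


@dataclass
class ContentSignal:
    """A shareable signal: only a content hash and a category label,
    with no account- or user-identifying information attached."""
    content_hash: str
    category: str  # e.g. "graphic-self-harm" or "viral-challenge"


def compute_hash(media_bytes: bytes) -> str:
    """Derive a fingerprint from the raw media bytes."""
    return hashlib.sha256(media_bytes).hexdigest()


def build_signal(media_bytes: bytes, category: str) -> ContentSignal:
    """Package a hash for sharing; nothing tied to the uploading account
    ever leaves the originating platform."""
    return ContentSignal(content_hash=compute_hash(media_bytes), category=category)


def matches_known_signal(media_bytes: bytes, known_hashes: set[str]) -> bool:
    """A receiving platform checks its own content against hashes shared by others,
    then decides independently whether the material violates its policies."""
    return compute_hash(media_bytes) in known_hashes
```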

Meta has provided the technical infrastructure for Thrive, the same infrastructure used for the Tech Coalition’s Lantern child safety program. Members of Thrive will have the ability to aggregate information on self-harm content and receive alerts about content that raises concerns or violates their policies. They will then independently decide whether to take action based on this information.

Thrive’s director, Dan Reidenberg, will oversee the operational aspects of the program. Participating companies will be responsible for uploading, reviewing, and taking action on any content shared through Thrive. Kenneth Cole, founder of the MHC, expressed excitement about the collaboration on Thrive and praised Meta, Snap, and TikTok for their commitment to addressing suicide and self-harm content.

Notably absent from Thrive is X, formerly known as Twitter, which has faced criticism for its moderation practices. Data shows that X has a smaller moderation team compared to other social platforms, partly due to CEO Elon Musk’s decision to reduce the number of engineers dedicated to trust and safety. Google, the owner of YouTube, is also not a member of Thrive. YouTube has been criticized for its failure to protect users from self-harm content.

Recent studies have shown a link between heavy social media use and reduced well-being, including mood disorders like depression and anxiety. Meta, Snap, and TikTok have all faced lawsuits accusing them of contributing to mental health issues. In one prominent case, a British coroner concluded that self-harm content viewed on Meta-owned Instagram contributed to the death of a 14-year-old girl.
