News

Meta, TikTok, Snapchat and other social media platforms have long been criticized for failing to remove content deemed harmful to teens, including videos and images of self-harm.
Meta, Snap, TikTok, and the Mental Health Coalition developed Thrive to stop graphic self-harm and suicide content from spreading across social media platforms such as Instagram and Facebook.
Meta Platforms, Inc. (NASDAQ:META), Snap Inc. (NYSE:SNAP), and ByteDance-owned TikTok have announced a joint initiative to combat the spread of suicide and self-harm content online.
Thrive, which counts Meta, Snap, and TikTok as founding members, will provide ways for platforms to share hashes — essentially unique fingerprints — of graphic suicide and self-harm content, so that participating companies can investigate and act on the same material on their own services.
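Thrive's exact matching pipeline has not been published, but the general idea of hash-based signal sharing can be sketched. The following is a hypothetical illustration: it uses a SHA-256 cryptographic hash for simplicity, whereas production systems typically favor perceptual hashes (such as Meta's open-source PDQ) so that re-encoded or slightly altered copies still match, and the in-memory set here is a stand-in for Thrive's cross-platform database.

```python
import hashlib

# Hypothetical stand-in for the database of hashes shared through Thrive.
shared_violation_hashes: set[str] = set()

def fingerprint(media_bytes: bytes) -> str:
    """Compute a fixed-length fingerprint of a media file.

    SHA-256 matches only byte-identical copies; real systems favor
    perceptual hashes so visually similar re-uploads also match.
    """
    return hashlib.sha256(media_bytes).hexdigest()

def share_signal(media_bytes: bytes) -> None:
    # A platform that removes violating content contributes the hash,
    # not the content itself, so no graphic material is redistributed.
    shared_violation_hashes.add(fingerprint(media_bytes))

def is_known_violation(media_bytes: bytes) -> bool:
    # Other platforms check new uploads against the shared hash set.
    return fingerprint(media_bytes) in shared_violation_hashes
```

The design choice worth noting is that only fingerprints cross company boundaries: platforms can flag known material without ever exchanging the graphic content itself.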
Meta is teaming up with Snapchat and TikTok as part of a new initiative to prevent content featuring suicide or self-harm from spreading across the social media platforms, Meta said Thursday.
Sept. 12 (UPI) -- Three of the biggest social media platforms are teaming up to address online content that features suicide and self-harm, Meta announced Thursday.
Meta, TikTok and Snap are partnering with the Mental Health Coalition to launch a program that invites companies to share signals about graphic content depicting self-harm or suicide.
Meta, the owner of Facebook, Instagram, and WhatsApp, has teamed up with Snap (the company that develops and maintains Snapchat) and TikTok to form an initiative called Thrive. The programme aims to stop graphic suicide and self-harm content from spreading across their platforms.
Separately, Meta will restrict content related to suicide, self-harm and eating disorders from teen users as part of an update to its youth safety and privacy policies, the company announced Tuesday.
Meta, Snap, and TikTok have launched a joint initiative called Thrive, aimed at combating the spread of suicide and self-harm content online by sharing "signals" to identify and address such material.