Meta Introduces New Safety Measures for Instagram Accounts Featuring Children

In a recent blog post, Meta announced additional safeguards for Instagram accounts run by adults that primarily feature children. These accounts will now be automatically placed into the app’s strictest message settings to prevent unwanted messages. The platform’s “Hidden Words” feature will also be enabled to filter offensive comments on these accounts. Moreover, new safety features are being rolled out for teen accounts as well.

Keeping Kids Safe Online

Accounts that will be affected by the new message settings include those run by adults who frequently share photos and videos of children, as well as accounts managed by parents or talent managers representing children. Meta acknowledges that while these accounts are mostly used innocently, there are individuals who may try to exploit them by leaving inappropriate comments or soliciting sexual content in direct messages.

Preventing Abuse and Exploitation

To prevent potentially suspicious adults from accessing accounts primarily featuring children, Meta will take steps to block these individuals from finding such accounts on Instagram. The company will also avoid recommending suspicious adults to these accounts and make it harder for them to connect through Instagram Search.

Addressing Mental Health Concerns

Today’s announcement comes in response to growing concerns over mental health issues related to social media use. Various states and the U.S. Surgeon General have highlighted these concerns, with some states going as far as requiring parental consent for minors to access social media platforms.

Empowering Teens with Safety Features

In addition to the new safety measures for accounts featuring children, Meta is introducing new safety features for Direct Messages (DMs) on Teen Accounts. These include in-chat safety tips for teens, the display of when an account joined Instagram, and a combined option to block and report a user at the same time. Meta aims to give teens more information about the accounts they interact with and help them identify potential scammers.

Encouraging Positive Online Behavior

Meta reports that teens have been proactive in responding to safety notices, blocking accounts and reporting suspicious activity. The company also shared updates on the effectiveness of its nudity protection filter, noting that the majority of users, including teens, have kept it enabled.

By implementing these new safety measures, Meta is taking steps to create a safer online environment for children and teens on Instagram.