Instagram is experimenting with removing the message button for teenagers as part of a privacy push | Tech Reddy


Instagram will remove the message button for teenagers if their account is viewed by a “suspicious adult”, as part of a series of privacy changes introduced by parent company Meta.

The updates, which also include restrictions for young users on Facebook, come after a landmark ruling found that social media content viewed by British teenager Molly Russell had contributed to her death by suicide.

Meta’s latest update builds on the limits it introduced last year to prevent teens from interacting with adults they don’t know. These include preventing adults from messaging teen users they are not connected to, and from seeing teen users in their “People You May Know” suggestions. Meta is now testing removing the message button from teen users’ accounts altogether when they are viewed by a suspicious adult. Meta defines a “suspicious” adult as one who has recently been blocked or reported by a teenager on its platforms.

Currently, Facebook applies private settings by default for users under 16 (or under 18 in some countries). It also encourages teens to limit who can see their friends list, the people and pages they follow, the posts they’re tagged in, and who can comment on their posts, and prompts them to review posts they are tagged in before those posts appear on their profile. The rules mirror similar updates previously rolled out on Instagram. In August, the photo-sharing app also updated its safety controls for teenagers to make them less likely to encounter sensitive content on the platform.

As part of its ongoing privacy push, Meta is also actively encouraging young users to report suspicious activity. A new prompt encourages young people to report accounts to Meta after blocking someone, and a safety notice offers information on how to handle inappropriate messages from adults.

According to Meta, more than 100 million people saw its safety notices on Messenger in a single month in 2021. The company also reported a 70 percent increase in reports submitted by minors on Messenger and Instagram DMs in the first quarter of this year compared with the previous quarter.

Finally, Meta is partnering with the National Center for Missing and Exploited Children (NCMEC) on a global platform aimed at stopping the non-consensual spread of intimate images that young people have created and shared online.

“We have been working with NCMEC, experts, academics, parents, and victim advocates around the world to help develop the platform and ensure that it responds to the needs of young people, so they can regain control of their content in these awful situations. We’ll have more to share on this new resource in the coming weeks,” said Meta vice president and head of global safety Antigone Davis.


