Why Meta’s new safety revamp for teens doesn’t stop the alarm bells

Meta’s new policies for teens on Facebook and Instagram are facing skepticism over whether they will be effective.

Jan 12, 2024 - 03:30

Earlier this week, Meta announced that it is rolling out new safety measures for teens on Instagram and Facebook in the coming weeks, amid a lawsuit from more than 40 states accusing the company of harming young users on its platforms.

Some of the changes Meta plans to implement include automatically placing teen accounts on Instagram and Facebook under restrictive settings, without the ability to opt out, that will block them from seeing sensitive content.


It is also expanding its effort to hide search results for people who look up “terms related to suicide, self-harm and eating disorders” and will instead direct them to resources where they can seek help. Lastly, Meta is pushing new “single tap” notifications on Instagram to remind teens to check their safety and privacy settings “regularly.”

Even though Meta says it consulted with experts in developing these new policies, it is now facing skepticism about how effective the measures will be at ensuring child safety on its platforms. The move comes after Meta rolled out default end-to-end encryption last month for Facebook and Messenger, and amid its increased use of artificial intelligence.

“Given Meta’s move to E2EE and their aggressive competition around AI with its competitors, it remains to be seen if these recent changes will be effective in moving toward greater child protection on the platform overall,” said Lina Nealon, vice president at the National Center on Sexual Exploitation, in a statement.

End-to-end encryption is a feature that makes private messages and calls visible only to the sender and receiver; they cannot be accessed by the messaging service itself or by law enforcement.
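For readers less familiar with the concept, the snippet below is a minimal sketch of the general idea behind end-to-end encryption, using the open-source PyNaCl library purely as an illustration. It is not Meta's implementation or protocol; the keys and message are made up for the example.

# Illustrative sketch of end-to-end encryption (not Meta's actual protocol).
from nacl.public import PrivateKey, Box

# Each participant generates a key pair; private keys never leave the device.
sender_key = PrivateKey.generate()
receiver_key = PrivateKey.generate()

# The sender encrypts using their private key and the receiver's public key.
sender_box = Box(sender_key, receiver_key.public_key)
ciphertext = sender_box.encrypt(b"Only we can read this.")

# The relaying service only ever handles ciphertext; without a private key,
# neither the service nor anyone who compels it can recover the plaintext.
receiver_box = Box(receiver_key, sender_key.public_key)
plaintext = receiver_box.decrypt(ciphertext)
print(plaintext.decode())

The point of the sketch is that decryption requires one of the two private keys, which stay on the users' devices, which is why a provider that enables this by default cannot inspect message contents.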

Previously, end-to-end encryption was an option users could turn on; it is now enabled automatically. The company also plans to bring default end-to-end encryption to Instagram in the future.

A 12-year-old boy looks at an iPhone screen on Dec. 19, 2023 in Bath, England. 

Matt Cardy/Getty Images

“This ‘see no evil’ policy without exceptions for child abuse material has placed millions of children in grave danger,” said Nealon. “If Meta truly wants to protect children, it must revisit its decision to move to E2EE or prioritize technology to prevent and identify CSAM creation and sharing on its platforms. At the very least, it must roll back E2EE from minor accounts immediately.”

Meta’s new measures to protect children on its platforms also come on the heels of the lawsuit from more than 40 states, which was filed on Oct. 24 of last year. The lawsuit alleges that the company “has repeatedly misled the public about the substantial dangers of its Social Media Platforms” and has “concealed” how its platforms “exploit and manipulate” children and teens. The coalition of state attorneys general behind the lawsuit is seeking to curb Meta’s alleged “harmful tactics” and to have the company face penalties and pay restitution.

“It is apparent that Meta is finally facing enough public pressure from lawsuits, media, advocates, whistleblowers, and Congress to do what they should have done since the very beginning: treat teens as kids – not adults – and to afford them more protections,” said Nealon.

The battle to protect minors on social media may be a long one, as platforms have reportedly garnered billions of dollars from minors using their services.

According to a recent study from the Harvard T.H. Chan School of Public Health, social media platforms including Facebook, Instagram, X and TikTok collectively took in about $11 billion in advertising revenue in 2022 from users under the age of 18.

“Although social media platforms may claim that they can self-regulate their practices to reduce the harms to young people, they have yet to do so, and our study suggests they have overwhelming financial incentives to continue to delay taking meaningful steps to protect children,” said Bryn Austin, a professor in the Department of Social and Behavioral Sciences at Harvard and a senior author on the study, while speaking to The Harvard Gazette.

