Can Teen Instagram Accounts Help Kids?

Meta, the parent company of Facebook, Instagram and WhatsApp, announced Tuesday that it would roll out measures limiting what kind of content young people can access, whom they can talk to and how much time they spend on specific media. The new measures began with Instagram in the U.S. on Sept. 17 and will eventually extend to Facebook and WhatsApp.

The new policies include automatically making Instagram accounts for users 16 and under private, restricting who can contact or tag teen accounts in posts, muting certain words associated with cyberbullying, and setting the most restricted access to content by default. They also encourage young people to spend less time on the app.

The new protocols are the result of years of debate about the effects of social media use on young people. Many experts and politicians claim that social media and smartphones are driving the decline in teenagers’ well-being.

Legislation and lawsuits have blamed social media for problems ranging from bullying and suicidal thoughts to eating disorders, attention deficit disorder and predatory behavior. Meta’s new policies address those concerns, and some may have positive effects, particularly those focused on privacy. But they also focus on the rhetoric of politicians rather than the well-being of teens, and come even as some experts warn there is no causal link between young people’s use of social media and those poor outcomes.

Meta attempts to address much of the criticism about its effect on teenagers

Meta and other social media companies have come under intense scrutiny for their perceived negative impact on the mental health and well-being of young people. Cyberbullying, eating disorders, anxiety, suicidal thoughts, poor school performance, sexual exploitation, and addiction to social media and technology are all concerns that Meta’s new Instagram protocols were designed to address.

In recent years, reporting — such as the Wall Street Journal’s 2021 Facebook Files series — has examined how Meta’s leadership knew Instagram could be toxic to teenage girls’ body image but failed to mitigate the risks for vulnerable users. Surgeon General Vivek Murthy has also blamed rising rates of depression and anxiety on social media use; his office released a report last year warning that social media use was a major contributor to declining mental well-being among young people.

The report says that up to 95 percent of U.S. children ages 13 to 17 use social media, and nearly 40 percent of children ages 8 to 12 do so. “We do not currently have enough evidence to determine whether social media is sufficiently safe for children and adolescents,” the report’s introduction says, citing overuse, harmful content, bullying and exploitation as key areas of concern.

Murthy also called for a Surgeon General warning label on social media — similar to the labels on cigarette packages and alcohol bottles that warn of the health risks of those products — in a New York Times op-ed in June. The op-ed also called for federal legislation to protect children who use social media.

Similar legislation is already pending in Congress: the Kids Online Safety Act (KOSA). KOSA passed the Senate in July and heads to the House of Representatives for a markup on Wednesday. It’s unclear whether a version of the bill will pass both chambers, but President Joe Biden has indicated he would sign such a bill if it did.

The version of KOSA passed earlier this summer would require companies that allow child or teen accounts to disable targeted algorithmic features and restrict design features that reward use of the platform or game in question or enable sustained use. It would also require companies to limit who can interact with minors, as Meta’s new policy does; “prevent other users from viewing the minor’s personal data”; and limit and prevent harm to teens’ mental health.

The Senate-approved version of KOSA goes further than Meta’s new teen account policy, particularly when it comes to young people’s data privacy. It’s also unclear what effect, if any, Instagram Teen Accounts will have on the laws surrounding teen social media use.

Who are the new protocols for and will they improve teens’ lives?

The wording in Meta’s press release focuses on parents’ concerns about their children’s use of social media, rather than on young people’s online privacy, mental health or well-being.

The reality is that Meta’s teen accounts, and indeed the KOSA legislation, can only do so much to address cultural and political fears about what social media is doing to children’s well-being, because we simply don’t know that much about it. The available data does not show that social media use has more than a negligible effect on the mental health of teenagers.

“A lot of the things that are being proposed to fix social media aren’t really questions of scientific rigor, they’re not really questions about health or anxiety or depression,” Andrew Przybylski, a professor of human behavior and technology at the University of Oxford, told Vox. “They’re really questions of taste.”

Christopher Ferguson, a professor of psychology at Stetson University who studies the psychological impact of media on young people, said he believes the outcry over the effect of social media on children’s well-being has all the hallmarks of a “moral panic,” reminiscent of the concerns of previous generations that radio, television, the Dungeons & Dragons role-playing game and other new media were ruining the minds and morals of children.

It’s unclear exactly what metrics Meta will use to determine whether the new rules are helping kids and parents. When asked about those metrics, Meta spokesperson Liza Crenshaw told Vox only that the company would be “iterating to make sure Teen Accounts work” for Instagram users. Crenshaw did not respond to follow-up questions before this article went to press.

“These all seem like good faith efforts,” Przybylski said. “But we don’t know if it’s going to work.”
