Meta, which owns Instagram, is taking a proactive step by setting stricter guidelines for teenagers under 18 in an effort to create a safer online environment. Starting in the US, UK, Canada, and Australia, the user experience for teenagers will undergo significant changes, and Meta plans to roll out the policies worldwide by early 2025.
Meta announced policies that will enable parents to control their children’s accounts in ways such as setting daily time limits and blocking teens from using Instagram at night. Parents will also be able to see their children’s message history, along with the content they view.
Teenagers under the age of 16 will need their parents’ permission to turn off the default settings, including sleep mode, private accounts, and content filtering, while teens 16 and older will be able to change them on their own.
Even as these regulations take effect, many studies continue to criticize the influence of social media, linking it to rising rates of suicide and depression among teenagers. A 2021 Wall Street Journal report on Instagram’s internal research into teen girls found that the app “make[s] body image issues worse for one in three teen girls.”
Meta’s policies, however, differ significantly from what many identify as the core issues of social media.
Naomi Gleit, Head of Product at Meta, said the company is addressing the three largest concerns of parents: adolescents viewing unwanted content, being contacted by potential online predators, and being addicted to their phones.
Although these are important aspects of creating a safer online environment, the underlying problem of addiction, driven by features like sensationalized content, remains unaddressed.
“The initial problem of addiction is quite hard to fix as everyone communicates through Instagram nowadays,” Claire Park (10), social media user, said. “Although certain regulations need to be implemented for teenagers, it would not necessarily decrease their screen time.”
Still, how will these policies change Instagram for young users?
Teen safety issues have followed Meta since its early success with Facebook, and some claim that these policies are simply a tactic to deflect public criticism. At the January Senate hearing, Mark Zuckerberg, the CEO of Meta, apologized to parents in the audience, a gesture critics describe as one such action to garner support.
Dozens of emerging lawsuits, which accuse Meta of deliberately exacerbating the youth mental health crisis by knowingly designing highly addictive features such as its recommendation algorithm, are presumably another reason for the company to initiate change.
Yet experts are skeptical that Meta would genuinely be willing to give up the potential profit from teenage users for the sake of their mental health.
“People would find a leeway anyways,” Michael Jang (9), Instagram user, said. “I don’t see much meaning in this change as it would not bring practical change.”
No specific measure directly reduces Instagram usage, and parents may or may not take on the responsibility of regulating their child’s Instagram account.
“I don’t think the borderline privacy invasions that Meta is doing should be overlooked,” Katie Hong (9), Meta user, said. “Especially with ineffective policies, the problem should be highlighted.”
While Meta’s new policies offer greater parental control, their effectiveness is questionable. Without a solid grasp of how Instagram operates, parents may struggle to enforce the guidelines, potentially leaving teens largely unsupervised.