Hoping to avoid a repeat of the child-safety controversies surrounding Instagram, Meta will introduce a number of features to increase protections for teens on its virtual-reality platform. It will also expand existing safeguards on Instagram, aiming to quiet the debate over the app’s impact on kids.
With the parental supervision tools for Meta’s Quest VR headset, parents will be able to approve or deny purchases, block apps, view the apps a teenage user owns, and receive notifications about any purchases. The controls, which are active only when a teenager links their account with a parent’s, also allow parents to view a child’s screen time, see who the child has added as a friend, and block content streamed from a PC to the headset.
“We’re adding more app-mediated interventions to encourage teens to have more positive experiences with a variety of online content, and to be more aware of the time they are spending online,” said Vaishnavi J., head of youth well-being at Meta.
Meta has recently come under fire from politicians over how it manages young people. Most notably, lawmakers criticized the lack of safeguards on Instagram, citing internal research leaked by a whistleblower that revealed the company’s own concerns about teenagers’ mental health when using the app. Meta has responded by releasing new safety tools and by trying to discredit the leaked research.
The company is introducing several new measures to strengthen child safety. On Instagram, parents can now set times of the week when use of the app is limited and view the details of any post their child reports. Instagram will also start sending so-called “nudges” to teen users: notifications encouraging them to spend less time on one topic, potentially a harmful one, and move on to another. The nudges will appear even when a teenage user spends a long stretch on one genre of Reels, the short-video format Instagram is focusing on in its bid to win back popularity and viewership from TikTok.