Instagram Gives Parents More Control Over Teen Accounts

Stock photo of three young people using their smartphones (Getty Images)

Instagram is changing the way it works for teens, promising more “built-in protections” for young people and adding extra controls and reassurances for parents.

The new ‘teen accounts’ for children aged 13 to 15 will have many privacy settings turned on by default, rather than relying on children to enable them.

Teens’ posts are also made private, meaning they are not visible to people who don’t follow them. Additionally, teens must approve all new followers.

These settings can only be changed if a parent or guardian provides supervision of the account, or when the child turns 16.

Social media companies around the world are under pressure to make their platforms safer amid concerns that not enough is being done to protect young people from harmful content.

The NSPCC called the announcement a “step in the right direction” but also said Instagram owner Meta appeared to be putting the onus on children and parents to keep themselves safe.

Rani Govender, child online safety policy manager at the NSPCC, said Meta and other social media companies needed to take more action themselves.

“This must be supported by proactive measures that prevent harmful content and sexual abuse from spreading on Instagram in the first place, so that all children can benefit from comprehensive protections on the products they use,” she said.

Meta describes the changes as a “new experience for teens, led by parents” and says they will “better support parents, and give them peace of mind that their teens are safe with the right protections in place.”

However, media regulator Ofcom raised concerns in April about parents’ willingness to intervene to keep their children safe online.

In a talk last week, senior Meta director Sir Nick Clegg said: “One of the things we did find… is that even when we build these controls, parents don’t use them.”

Ian Russell, whose daughter Molly viewed content about self-harm and suicide on Instagram before taking her own life at 14, told the BBC it was important to wait and see how the new policy would be implemented.

“Whether it works or not, we will only know when the measures come into effect,” he said.

“Meta is very good at setting up PR and making big announcements, but they also need to be good at being transparent and sharing how well their measures are working.”

How does it work?

Teen accounts will largely change the way Instagram works for users between the ages of 13 and 15, with a number of settings enabled by default.

Examples include strict controls on sensitive content to prevent recommendations for potentially harmful material, and muted notifications at night.

Accounts will also be set to private instead of public, meaning teens will have to actively accept new followers and their content won’t be seen by people who don’t follow them.

Infographic showing how some teens are prompted to add a parent when they try to change teen account defaults (Instagram)

Instagram will begin showing a pop-up to under-16s who try to change key default settings on their teen accounts, telling them they need parental permission.

Parents who choose to monitor their child’s account can see who they are messaging and what topics their children are interested in. However, they cannot view the content of the messages.

Instagram says it will move millions of existing teen users to the new experience within 60 days of notifying them of the changes.

Age identification

The system relies primarily on users being honest about their age. However, Instagram already has tools to verify a user’s age if there are suspicions that they are not telling the truth.

Starting in January, the company will also use artificial intelligence (AI) tools in the US to proactively detect if teens are using adult accounts and revert them to teen accounts.

The UK’s Online Safety Act, passed in 2023, requires online platforms to take measures to keep children safe or face huge fines.

Ofcom warned social media sites in May they could be named and shamed – and banned for under-18s – if they do not comply with the new online safety rules.

Social media industry analyst Matt Navarra called the changes significant but said they were dependent on enforcement.

“As history has shown with teenagers, in these situations they will always find a way around the barriers if they can,” he told the BBC.

“I think Instagram needs to make sure that security can’t be bypassed by more tech-savvy teenagers.”

Questions for Meta

Instagram is certainly not the first platform to introduce such tools for parents. The company claims to have more than 50 tools focused on teen safety.

In 2022, it introduced a Family Center and supervision tools for parents, allowing them to see which accounts their child follows and who follows them, among other things.

Snapchat has also introduced its own Family Center, which allows parents aged 25 and over to see who their child is messaging and to limit their ability to view certain content.

In early September, YouTube said it would limit recommendations of certain health and fitness videos for teens, such as those that ‘idealize’ certain body types.

Instagram already uses age verification technology to verify the age of teenagers who try to change their age to over 18, by means of a video selfie.

This raises the question of why, despite the many security measures on Instagram, young people are still exposed to harmful content.

An Ofcom study earlier this year found that every child they spoke to had seen violent material online, with Instagram, WhatsApp and Snapchat being the most commonly cited services on which they found the material.

Although these are among the biggest platforms, it is a clear sign that the problem has not yet been solved.

Under the Online Safety Act, platforms must demonstrate a commitment to removing illegal content, including child sexual abuse material (CSAM) or content that encourages suicide or self-harm.

But the rules are not expected to come into full effect until 2025.

In Australia, Prime Minister Anthony Albanese recently announced plans to ban social media for children by introducing a new minimum age for social media use.

Instagram’s latest tools put more control in the hands of parents, who will now take on even greater responsibility for deciding whether to give their child more freedom on Instagram, and for monitoring their activity and interactions.

Of course, they must also have their own Instagram account.

But ultimately, parents don’t run Instagram itself, and they have no control over the algorithms that push content to their kids, or what gets shared by its billions of users around the world.

According to social media expert Paolo Pescatore, it is an “important step” towards protecting children in the world of social media and fake news.

“The smartphone has opened the door to a world of misinformation and inappropriate content that is changing the behavior of children,” he said.

“More needs to be done to improve children’s digital wellbeing, and that starts by giving parents back control.”