Instagram is overhauling how it handles teenage users, adding stronger safety features for younger users and giving parents more control. Starting Tuesday, new “teen accounts” will roll out in the UK, US, Canada, and Australia. These accounts automatically switch on several privacy settings for users under 18, including making posts visible only to followers and requiring approval of new follower requests. Teenagers aged 13 to 15 will only be able to change these settings by adding a parent or guardian to their account.

The move comes as social media platforms face growing pressure worldwide to improve safety measures, particularly to protect young users from harmful content. The NSPCC, a UK children’s charity, welcomed the announcement as a step in the right direction but said the onus for safety still rests largely on children and their parents.

Meta, Instagram’s parent company, describes the updates as a “new experience for teens, guided by parents,” and says the changes are intended to better support parents and reassure them that their teenagers are protected by suitable safety measures.

For users aged 13 to 15, many settings will be turned on automatically, including strict restrictions on sensitive content to limit exposure to harmful material and muted notifications during nighttime hours. Teen accounts will be set to private by default, so young users must approve any new followers and their posts will not be visible to anyone who does not follow them.

Parents who choose to supervise their child’s account will be able to see who the child is messaging and what interests they have, but they will not have access to the messages themselves. Ofcom has raised concerns about whether parents will actively use such tools to keep their children safe online. Instagram will rely on users being honest about their ages, but it has methods to verify when someone is not telling the truth.
Starting in January in the US, Instagram will use AI to identify teenagers using adult accounts and move them back to teen accounts. The UK’s Online Safety Act also requires platforms to ensure children’s safety or face significant penalties.

The new features give parents greater control over their children’s use of the platform: parents can decide how much freedom their child has on Instagram and monitor their activity and interactions. To supervise a child’s account, however, a parent must have an Instagram account of their own. Even with this added oversight, parents cannot influence the algorithms that decide what content is shown to their children or what is shared by the platform’s vast global user base.

The initiative is seen as a significant step toward protecting children from the hazards of social media and misinformation. Smartphones have exposed young users to a barrage of misleading information and inappropriate content that can shape their behavior. Experts say further work is needed to improve children’s digital wellbeing, starting with giving parents real control over their children’s online experiences.