Instagram is pausing development of Instagram Kids, a version of the app aimed at children under the age of 13, in order to address concerns about the vulnerability of younger users. In a blog post published Monday, Instagram head Adam Mosseri stated that the delay will allow the company to "engage with parents, experts, legislators, and regulators, to listen to their concerns, and to demonstrate the value and relevance of this initiative for younger teenagers online today."
According to Mosseri, Instagram Kids is intended for children aged 10 to 12, not younger. It will be ad-free, with age-appropriate content and features, and will require parental permission to join. Parents will be able to monitor their children's time in the app and control who can message them, who can follow them, and whom they can follow. While work on Instagram Kids is paused, the company will offer opt-in parental supervision tools for teen accounts belonging to users 13 and older. More information about these tools will be released in the coming months, according to Mosseri.
The news comes after The Wall Street Journal published an investigative series revealing that Facebook knew Instagram use was contributing to mental health problems and anxiety among some teenage girls. The plan for an Instagram aimed at a younger audience had, however, met widespread opposition almost as soon as it was announced.
In March, Facebook announced the creation of an Instagram for kids, stating that it was "exploring a parent-controlled experience." The backlash was nearly instantaneous, and in May a bipartisan coalition of 44 attorneys general wrote to Facebook CEO Mark Zuckerberg, urging him to abandon the initiative on child-safety grounds. They highlighted rising cyberbullying, potential exposure to online predators, and Facebook's "disappointing track record" in protecting minors on its services. Facebook faced similar criticism in 2017 when it released the Messenger Kids programme, which was billed as a way for children to communicate with family members and parent-approved friends.