Facebook is working on a version of Instagram for children under 13

Instagram head Adam Mosseri has confirmed that a version of the popular photo-sharing app for children under 13 is in development, BuzzFeed News reports. The Facebook-owned company knows that many children want to use Instagram, Mosseri said, but there is no “detailed plan yet,” according to BuzzFeed News.

“But part of the solution is to create a version of Instagram for young people or children where the parents have transparency or control,” Mosseri told BuzzFeed News. “It is one of the things we are exploring.” Instagram’s current policy prohibits children under the age of 13 from using the platform.

“More and more children are asking their parents if they can sign up for apps that help them stay in touch with their friends,” Joe Osborne, a Facebook spokesperson, told The Verge by email. “At the moment there are not many options for parents, so we are working on building additional products – as we did with Messenger Kids – that are suitable for children, managed by parents. We’re exploring bringing a parental-controlled experience to Instagram to help kids keep up with their friends, discover new hobbies and interests, and more.”

BuzzFeed News obtained a post from an internal message board in which Instagram vice president of product Vishal Shah said a “youth pillar” project had been identified as a priority by the company. Its Community Product Group will focus on privacy and safety issues “to ensure the safest possible experience for teens,” Shah wrote in the post. Mosseri will oversee the project along with vice president Pavni Diwanji, who oversaw YouTube Kids while she was at Google.

Instagram published a blog post earlier this week describing its work to make the platform safer for its youngest users, but it made no mention of a new version for children under 13.

Targeting online products at children under the age of 13 raises not only privacy concerns but also legal issues. In September 2019, the Federal Trade Commission fined Google $170 million for tracking children’s viewing histories to serve them ads on YouTube, a violation of the Children’s Online Privacy Protection Act (COPPA). TikTok’s predecessor, Musical.ly, was fined $5.7 million for violating COPPA in February 2019.

Facebook launched an ad-free version of its Messenger chat platform for children in 2017, aimed at kids aged 6 to 12. Child health advocates criticized it as harmful to children and asked CEO Mark Zuckerberg to shut it down. Then, in 2019, a bug in Messenger Kids allowed children to join groups with strangers, leaving thousands of children in chats with unauthorized users. Facebook quietly closed the unauthorized chats, which it said affected “a small number” of users.

March 18 update, 7:46 pm ET: Added Adam Mosseri’s tweet.
