Instagram's owner, Meta, has been fined €405m (£349m) by the Irish data watchdog for allowing teenagers to set up accounts that publicly displayed their phone numbers and email addresses.
The Data Protection Commission confirmed the penalty after a two-year investigation into potential violations of the European Union's General Data Protection Regulation (GDPR).
Instagram allowed users aged between 13 and 17 to operate business accounts on the platform, which displayed those users' phone numbers and email addresses. The DPC also found that the platform's registration system set the accounts of 13-to-17-year-old users to "public" by default.
The DPC regulates Meta – which also owns Facebook and WhatsApp – on behalf of the whole EU because the company's European headquarters are in Ireland.
The penalty is the largest the watchdog has ever imposed on Meta, following a €225m fine in September 2021 for "serious" and "egregious" GDPR violations at WhatsApp and a €17m fine in March this year.
The fine is the second largest under the GDPR, behind the €746m imposed on Amazon in July 2021.
A spokesman for the DPC said: "We adopted our final decision last Friday, and it does contain a fine of €405m. Full details of the decision will be published next week."
Caroline Carruthers, a UK-based data expert, said Instagram had failed to think through its privacy responsibilities when allowing young people to set up business accounts, and had shown a "lack of care" over users' privacy settings.

"There are specific requirements under GDPR to make sure services aimed at children maintain a higher level of privacy and protection. Instagram fell foul of this by setting children's accounts to public by default rather than private."
Last year Meta suspended work on a children’s version of Instagram after revelations about the app’s impact on youth mental health.
Instagram said at the time that the pause was intended to address concerns raised by parents, experts and regulators. The move followed revelations from a whistleblower, Frances Haugen, that Facebook's own research showed Instagram could affect girls' mental health on issues such as body image and self-esteem.
Instagram said that before September 2019, business accounts could display the user's contact information, and that users were told about this during the setup process. Accounts belonging to under-18s are now set to private by default when they join the platform.
Andy Burrows, head of online child safety policy at the NSPCC, said: “This is a serious breach that has huge safeguarding implications and has the potential to cause serious harm to children using Instagram.
"The decision shows how effective enforcement can protect children on social media and underlines how regulation is already making children safer online."
A Meta spokesperson said: "This inquiry focused on old settings that we updated more than a year ago, and we've since released many new features to help keep teens safe and their information private.
"Anyone under 18 automatically has their account set to private when they join Instagram, so only people they know can see what they post, and adults can't message teens who don't follow them.
"While we've engaged fully with the DPC throughout their inquiry, we disagree with how this fine was calculated and intend to appeal it. We are continuing to carefully review the rest of the decision."