Italian Parents’ Group Files Landmark Legal Action Against Meta and TikTok Over Minors’ Social Media Access

A courtroom in Milan has become the latest theatre in the global controversy over social media giants’ responsibilities for children’s wellbeing. On Thursday, an Italian parents’ association and several Italian families appeared before the city’s business court for the first hearing of a class action lawsuit against Meta and TikTok. MOIGE, a prominent Italian parents’ group, claims that existing protections for minors on platforms such as Facebook, Instagram and TikTok are woefully inadequate. The question at the centre of the legal action is simple but pressing: should tech companies be required to alter their age verification protocols and recommendation engines when children are involved?

In a world where daily life increasingly depends on the internet, many parents have resigned themselves to the idea that their children will encounter social media well before they turn 18. But what happens when that quiet worry becomes organised legal action? MOIGE has become the voice of thousands of Italian families who feel the time for patience has passed. According to the group, as many as 3.5 million Italian children aged seven to fourteen are now active on social media apps, many in direct breach of the platforms’ own age restrictions. The figure reflects something many families already know: age gates are easy to get past, and once a child is inside, the algorithms treat them no differently from adults.

The Milan court is being asked to make several concrete decisions. First, the plaintiffs want the platforms to implement far more robust age verification for anyone who claims to be younger than 14. Second, they want Meta and TikTok to dismantle what the lawsuit calls “potentially manipulative algorithms”: those designed to keep users scrolling towards ever more outrageous or engaging content. Third, the parents’ group wants full transparency from the companies about the potential physical and mental health risks of overuse. These are not hypothetical questions; they go to the heart of what makes social media so profitable and so addictive.


MOIGE’s concerns are supported by a growing body of medical and developmental research. Studies have reported rising rates of anxiety, sleep disturbances and attention problems in young children who are heavy users of social media. There is also the risk of exposure to harmful content, from material promoting self-harm to predatory behaviour. What makes the Italian case distinctive is that it treats algorithmic manipulation as a structural problem rather than a series of content-moderation failures. The logic is that even if all harmful content were removed, the reward mechanics engineered to maximise screen time would still be problematic in their own right for the developing brain.

The tech companies see things differently. TikTok acknowledged that the case is moving forward but pointed to its existing protections, saying it strictly enforces its Community Guidelines, particularly around mental and behavioural health. A spokesperson continued: “We are also looking after safety features to diversify recommended content, block potentially harmful searches and link vulnerable users to support services available.” That last point is significant: it concedes, at least indirectly, that harm can occur and that mitigation is necessary.

Meta, which owns Facebook and Instagram, struck a more assertive, if defensive, tone, strongly disagreeing with MOIGE’s allegations. “Parents care about their child’s safety online, which is why we’re continually improving things to keep teens safe,” the company said in a formal statement. It pointed to its Teen Accounts and the protections they provide, such as default privacy settings and messaging restrictions. “We will remain committed to our record of keeping young people safe and will do more to continue to do so,” Meta added.

The legal mechanism itself is noteworthy. The case before Milan’s business court seeks a class injunction rather than monetary damages: the plaintiffs want the court to compel changes going forward. That is a higher legal bar in some respects, but it also carries greater transformative power. If Milan rules in MOIGE’s favour, it could force a reengineering of core features of Meta’s and TikTok’s products for the entire Italian market, with ripple effects across Europe.

The lawsuit is not without counterarguments. Critics question how robust age verification could work in practice, citing privacy concerns: how do you establish that a user is under 14 without requiring everyone to present government identification? A breach of that data could put children at even greater risk. Others argue that responsibility for children’s online behaviour lies with parents, not platforms. There is also the problem of enforcement, since even the most advanced age gates can be bypassed by tech-savvy teens, while forced transparency about algorithms could expose trade secrets the companies are entitled to protect.

Kristina Roberts

Kristina R. is a reporter and author covering a wide spectrum of stories, from celebrity and influencer culture to business, music, technology, and sports.
