The growing obligation on social media companies to safeguard their users is back in the spotlight, with Snapchat now officially facing an inquiry by the European Commission. The investigation raises important questions about whether the platform has done enough to protect minors and to stop the proliferation of illicit activity, especially at a time when virtual spaces are so intertwined with daily life.
The focal point of the inquiry is the Digital Services Act, an expansive law designed to hold large internet companies to account for harmful and unlawful content. The law, introduced to make the internet a more transparent and safer place, requires companies to proactively assess risks and put effective protective measures in place. Non-compliance can result in fines of up to 6% of a company's annual global turnover. For a company like Snap Inc., the stakes are undeniably high.
Regulators have questioned whether Snapchat is doing enough to address child grooming and the sale of illicit or age-restricted goods. These are not minor missteps but fundamental safety failures that could expose vulnerable users to serious harm. The Commission's view is that the platform's current measures may be inadequate to prevent minors from being contacted by individuals with predatory intent, whether aimed at sexual exploitation or at drawing them into criminal activity.
Citing grooming, exposure to illicit goods, and account settings that put minors at risk, EU tech chief Henna Virkkunen said in a press release that Snapchat appears to have overlooked the Digital Services Act's requirement of a high degree of safety for all users.

The sentiment is echoed by a wider group of regulators who believe platforms have been largely reactive, rather than proactive, on user safety. Public awareness of online risks has surged in recent years, particularly of the ease with which minors can be targeted through seemingly innocent features such as messaging, friend requests, or ephemeral content.
The effectiveness of Snapchat's content moderation systems is another critical issue raised in the investigation. Regulators believe these systems are not robust enough to stem the flow of content steering users toward illicit marketplaces. This concern covers not just drugs but also legally age-restricted goods such as vapes and alcohol. That such material can circulate freely points to gaps in oversight and enforcement, and raises the question of how well the platform can balance user freedom with regulatory obligations.
This is not the first time Snapchat has come under the microscope in Europe. The Commission's inquiry takes over from an earlier investigation opened by Dutch regulators in September, which focused narrowly on the sale of vapes to minors through the platform. The move signals an escalation in both centralization and seriousness: the concern is no longer an isolated problem but a broader pattern that warrants further scrutiny.
Another weakness flagged in the probe is age verification, or what regulators call age assurance. Snapchat currently relies on self-declaration: users simply enter their age when creating an account. Convenient as it is, this approach is widely regarded as easy to circumvent, particularly by younger users. In practice, it means children can reach adult spaces or content, exposing them to greater risk. Regulators say more robust and dependable systems are needed to actually enforce age restrictions.
Default account settings are also a problem. These settings define how much information a user shares and who can contact them. Unless they are designed with safety in mind, they can expose younger users to unwanted contact. The Commission believes Snapchat's current protections may be inadequate, especially for minors who may not fully understand privacy settings or the consequences of their online activity.
Beyond these technical questions, the inquiry also extends to the platform's design choices, including so-called dark patterns. These are interface designs that subtly steer user behavior, sometimes against users' best interests. Making it harder to report harmful material, for example, or nudging users toward less safe settings can undercut safety efforts. Regulators are taking such design decisions increasingly seriously, recognizing that user experience is about more than convenience; it is about protection too.
More broadly, the case underscores the perennial tension between innovation and responsibility in the technology business. Social media has revolutionized how people communicate and share. But with that power comes a responsibility to ensure these spaces do not become breeding grounds for harm. The challenge is to build systems that are both open and safe, letting people engage freely while keeping risks to a minimum.
The public, too, increasingly expects companies to do far more than the bare minimum. Trust is fragile for any digital platform, and incidents involving child safety or unlawful activity can erode it quickly. For many users, particularly parents, the question is no longer just whether a platform is enjoyable, but whether it is safe.
The investigation is likely to set significant precedents for how social media companies operate in the European Union and potentially beyond. If violations are confirmed, the penalties could go beyond financial fines to include tighter oversight and mandatory changes to how the platform operates.