Meta, TikTok, and Snap Agree to Follow Teen Social Media Ban Despite Strong Opposition

Australia has passed a new law that will soon stop people under 16 from using social media platforms such as Instagram, TikTok, and Snapchat. The law has sparked a major debate around the world, especially about how it will affect young people's mental health and online safety. Even though Meta, TikTok, and Snap say they disagree with the new rule, the companies have decided to follow it once it takes effect in December.

Meta, which owns Instagram and Facebook, said it will comply with the law even though it believes the rule will not really help young people. TikTok, which is owned by ByteDance, and Snap, the owner of Snapchat, took the same position. All three plan to start deactivating accounts belonging to users under 16 once the law takes effect on December 10.

The companies told Australia's parliament that they believe the rule might do more harm than good. However, since it is now a legal requirement, they said they will follow it and begin preparing young users for the change. Meta, TikTok, and Snap said they would soon contact the owners of more than a million underage accounts to explain what will happen.


The Law and Its Purpose

The new Australian law requires all social media platforms to take "reasonable steps" to stop users younger than 16 from using their apps. Platforms that fail to do so could face fines of up to 49.5 million Australian dollars (about 32.5 million US dollars). The government says the purpose of the rule is to protect children from the negative effects of social media, such as cyberbullying, online predators, and mental health problems.

Many lawmakers in Australia believe that social media has become too harmful for young teenagers. They say that children under 16 are not mature enough to handle the pressure of social media, which often causes stress, anxiety, and poor self-esteem. This new law is part of a bigger global conversation about whether young people should be allowed to use social media freely or whether they need stronger protection.

The Companies’ Concerns

Even though Meta, TikTok, and Snap said they would comply, they made it clear that they still think the ban is not the right solution. According to them, the rule could actually make things worse instead of better. They believe that young people who are banned might move to unsafe corners of the internet where there are fewer rules and less monitoring. This could expose them to even more dangerous online environments.

The companies have also said that completely banning young users could take away their sense of belonging and limit their social interactions. For many teenagers, social media is a way to stay connected with friends, express their creativity, and learn about the world. “We do not think this ban will protect young people in the way the government hopes,” one company representative said during the parliament session.

In addition, the platforms noted that enforcing such a ban is complicated. It is difficult to confirm someone's real age online, as many children lie about their birth dates when creating accounts. While the companies are working on better age-verification tools, they argue that no system is perfect and mistakes can happen.

Global Impact and Debate

The Australian law is being closely watched by other countries around the world. Governments in Europe and North America have also been discussing how to manage the growing influence of social media on children. Australia’s decision might encourage other nations to pass similar laws if they think it helps protect kids.

However, not everyone agrees with the idea. Some experts say that instead of banning young people, governments should focus on digital education. They believe children should be taught how to use social media responsibly, understand privacy settings, and identify online dangers. This, they argue, would be a better long-term solution than a complete ban.

Others believe that parents also play a major role. They say parents should monitor what their children are doing online and set healthy limits. Technology companies, on the other hand, should continue improving their safety tools, like filters for harmful content, stricter privacy settings, and better reporting options for abuse or bullying.

What Happens Next

Meta, TikTok, and Snap have all said that they are preparing for the December deadline. They plan to notify users under 16 that their accounts will be deactivated. The companies also said they will make sure the process is done carefully and respectfully, especially since many of these users might be upset or confused about losing their accounts.

The Australian government said it will monitor how the companies follow the rule and will take action if they fail to meet the legal requirements. Authorities have made it clear that the safety of young people is their top priority. Officials have also said they are open to reviewing the law later, depending on how it works in practice.

Reactions from the Public

The new rule has received mixed reactions from the public. Some parents are relieved and say this will protect their children from the harmful effects of social media. They believe that the constant use of these platforms has been making teenagers more anxious and less focused on school or real-world friendships.

On the other hand, many young people are upset about the decision. They argue that social media is not always bad—it helps them stay informed, build communities, and express themselves creatively. Some also say that banning them from using it will not solve deeper problems like bullying or loneliness.
