Big technology companies Meta and TikTok are once again facing regulatory trouble, this time over transparency rules. According to the European Commission, both Meta and TikTok failed to provide researchers with easy access to public data, which is required under the Digital Services Act (DSA). The law was created to ensure that large social media platforms operate responsibly and protect people from harmful or illegal content online.
Meta, the parent company of Facebook and Instagram, has also been accused of not offering simple tools for users to report harmful material such as child abuse images or terrorist content. These are serious claims, because such material must be easy to report and quick to remove if the internet is to stay safe for everyone.
The Commission said that both Facebook and Instagram seem to have “burdensome procedures and tools” for researchers trying to access public data. This means that researchers who study how social media affects people’s lives are finding it hard to get the data they need. The DSA clearly states that researchers should be allowed to examine data from large social media platforms to understand their impact on people’s physical and mental health.
The main purpose of allowing researchers access to this information is to help governments and experts find out if these platforms are spreading harmful or false information. Without this access, it becomes very difficult to understand how these platforms affect society, especially young users who spend hours scrolling through them every day.

Meta and TikTok are two of the most influential social media platforms in the world. Billions of people use them to share pictures, videos, and opinions. However, the more users these platforms have, the more responsibility they carry to keep their services safe, transparent, and honest. The DSA was introduced to make sure that companies like these cannot hide behind complex systems or confusing policies when it comes to protecting users.
The European Commission also raised concerns about how Meta’s systems make it difficult for users to report harmful content. According to the Commission, Meta’s tools include “several unnecessary steps and additional demands on users” and use what they called “deceptive interface designs.” This means the process is so complicated that many users might give up before completing it.
“Such practices can be confusing and dissuading,” the Commission said in its statement. It added that Meta’s methods for reporting and removing illegal content might not be effective enough. This matters because, under the DSA, companies must give users a simple way to report anything that breaks the law. This process, known as the “Notice and Action” mechanism, lets users alert a platform when illegal or harmful content is posted.
Meta, however, does not agree with these claims. A spokesperson from Meta told Reuters that the company does not believe it has broken the DSA in any way. “In the European Union, we have introduced changes to our content reporting options, appeals process, and data access tools since the DSA came into force and are confident that these solutions match what is required under the law,” the spokesperson said.
In other words, Meta believes it has already done what the law requires. The company points to the updates it made to its reporting, appeals, and data access systems after the DSA took effect and maintains that they meet the necessary standards.
TikTok, owned by the Chinese company ByteDance, has also been named in the same investigation. The Commission said TikTok’s platform has similar problems related to transparency and data access. Researchers have found it difficult to get the information they need from TikTok, which makes it harder to study how the app affects users — especially teenagers and young adults.
Many experts say that social media companies often use complex systems to limit how much information they share, arguing that this protects user privacy and their business secrets. Critics, however, counter that too much secrecy can hide how these platforms spread harmful content, fake news, or even dangerous trends.
The Digital Services Act was introduced to prevent such situations. It is meant to ensure that big online platforms are open about how they work and take responsibility for what happens on their sites. It forces companies to remove harmful content, stop the spread of fake news, and give researchers access to data that helps in understanding the risks of online behavior.
For example, if researchers can study data from these platforms, they might find links between heavy social media use and problems like anxiety, depression, or body image issues among teenagers. Without this data, these important studies cannot happen, and social media companies can continue operating without much accountability.
The Commission’s statement made it clear that these preliminary findings are not a final decision. Meta and TikTok will have a chance to respond and explain their side before any final action is taken. If the breaches are confirmed, the companies could face fines of up to 6% of their global annual turnover or be forced to change their systems.
This situation once again raises an important question: how much control should big tech companies have over what happens on their platforms? While these platforms bring people together and allow free expression, they also carry risks — such as exposure to harmful content or misuse of personal data. Governments across the world are now trying to find a balance between protecting users and allowing these companies to grow and innovate.
Meta and TikTok are both expected to continue discussing these issues with the Commission in the coming weeks. Meanwhile, researchers and digital safety experts are calling for clearer systems, simpler reporting options, and easier data access to ensure transparency.
In the end, this issue is not just about two big companies. It’s about how safe and fair the online world should be for everyone — especially the millions of young people who use these apps every day. As one digital safety expert said recently, “When transparency fails, trust disappears. And without trust, even the biggest platforms lose their strength.”
This case will likely take time to reach a conclusion, but it has already reminded both companies and users that transparency and honesty are key to building a better and safer digital world.

