TikTok Is Subject of E.U. Inquiry Over ‘Addictive Design’
European Union regulators on Monday opened an investigation into TikTok over potential breaches of online content rules aimed at protecting children, saying the popular social media platform’s “addictive design” risked exposing young people to harmful content.
The move widens a preliminary investigation conducted in recent months into whether TikTok, owned by the Chinese company ByteDance, violated a new European law, the Digital Services Act, which requires large social media companies to stop the spread of harmful material. Under the law, companies face penalties of up to 6 percent of their global revenue.
TikTok has been under scrutiny from E.U. regulators for months. The company was fined roughly $370 million in September for having weak safeguards to protect the personal information of children using the platform. Policymakers in the United States have also been wrestling with how to regulate the platform for harmful content and data privacy, concerns amplified by TikTok’s links to China.
The European Commission said it was particularly focused on how the company was managing the risk of “negative effects stemming” from the site’s design, including algorithmic systems that it said “may stimulate behavioral addictions” or “create so-called ‘rabbit hole effects,’” where a user is pulled further and further into the site’s content.
Those risks could potentially compromise a person’s “physical and mental well-being,” the commission said.
“The safety and well-being of online users in Europe is crucial,” Margrethe Vestager, the European Commission’s executive vice president overseeing digital policy, said in a statement. “TikTok needs to take a close look at the services they offer and carefully consider the risks that they pose to their users — young as well as old.”
TikTok, in a statement, said that it had “pioneered features and settings to protect teens and keep under-13s off the platform, issues the whole industry is grappling with.”
The company added, “We’ll continue to work with experts and industry to keep young people on TikTok safe, and look forward to now having the opportunity to explain this work in detail to the commission.”
TikTok has become a target of parents, policymakers and regulators who are concerned about the company’s data-collection practices and the platform’s effect on young people’s mental health.
Shou Chew, TikTok’s chief executive, was grilled by U.S. lawmakers last March about TikTok’s ties to China and the app’s impact on children. He stressed that TikTok was an independent company that was not influenced by China, and he pointed to its 60-minute screen-time limits for users 12 and under, which parents can control.
In Europe, where TikTok has more than 150 million monthly users, regulators last year faulted the service for having settings that made videos and posts public by default, exposing the information and data of its youngest users. They said TikTok also used so-called dark patterns, design techniques that nudged users toward privacy-intrusive options during sign-up and when posting videos.
The E.U. investigation will also examine the effectiveness of TikTok’s age verification tools, which are intended to keep minors from gaining access to inappropriate content, and will check whether TikTok provides a searchable and reliable repository of advertisements, as required under the Digital Services Act.
European regulators began a separate investigation in October into whether the social media platform X violated the Digital Services Act over the prevalence of gory images and terrorism content related to the Israel-Hamas war.