On Friday, European Union regulators fined TikTok approximately $370 million for failing to adequately safeguard children's personal data. The penalty was issued by Ireland's Data Protection Commission, which acts as TikTok's lead privacy regulator on behalf of the bloc's 27 member states, and it marks the EU's first such fine against the platform.
The social media platform, owned by China-based ByteDance, was faulted for insufficient privacy protections for users aged 17 and under. Regulators found that the platform's default settings made children's content public, exposing their data, and criticized the company for a lack of transparency about how that data is used.
This development follows increasing scrutiny of TikTok by parents, policymakers, and regulators over its data collection practices and potential negative impacts on young people's mental health. According to a 2022 Pew Research Center survey, 67% of American teens use TikTok, with 16% using it almost constantly.
In a separate ongoing investigation, Irish regulators are examining whether TikTok is unlawfully transferring EU user data to China. This investigation is expected to conclude by the end of the year.
In the U.S., both state and federal policymakers have been grappling with how to regulate TikTok. Several government agencies have banned the app on work devices due to concerns about Beijing accessing sensitive user data. Montana has even enacted a law prohibiting the use of the app in the state.
With more than 150 million monthly users in the EU, TikTok has drawn criticism for doing too little to protect children. The service is open to users aged 13 and over, but regulators found that its default settings violated data protection rules.
The investigation into TikTok's practices covered the period from July 31, 2020, to December 31, 2020. It found that TikTok did not adequately prevent its youngest users from bypassing restrictions on the service, such as limits on sending and receiving direct messages, and that its "family pairing" feature allowed a user who was not verified as a parent or guardian to lift those limits for a child user.
Regulators also reported that TikTok employed "dark patterns," techniques designed to prompt users to select more privacy-invasive options during sign-up and when posting videos.
TikTok has disputed the fine's relevance, noting that it had already updated its policies for children in 2021, before the decision was issued. Those changes included making accounts private by default for users aged 13 to 15 and giving young people more information about how their data is collected and used.
This is not the first time TikTok has been penalized for mishandling children's data. In April, British regulators fined the company approximately $15.8 million for failing to prevent children under 13 from registering for the service. In 2019, Musical.ly, which later became TikTok, agreed to pay $5.7 million to settle Federal Trade Commission charges that it had violated U.S. children's data protection rules.
This article was generated with the support of AI and reviewed by an editor. For more information see our T&C.