On August 5, the European Commission announced that TikTok had committed to permanently withdrawing the TikTok Lite Rewards Program from the EU market and not to launch any similar program. The Commission had earlier opened a formal investigation into TikTok Lite, launched in France and Spain, on suspicion that it violated the EU Digital Services Act (DSA). TikTok's commitments address the concerns the Commission raised during that investigation; the Commission has now made them binding and closed the case, the first it has concluded under the DSA.
Case Facts
TikTok Lite is a lightweight version of TikTok. In April of this year, TikTok Lite officially launched in France and Spain together with a "Tasks and Rewards Program" that let users earn points by completing certain tasks, such as watching videos, liking content, following creators, and inviting friends to join TikTok. These points could be redeemed for rewards such as Amazon gift cards, PayPal gift cards, and TikTok coins that could be used to tip creators.
The program raised concerns at the European Commission, which believed it could be addictive. Moreover, under the DSA, very large online platforms must conduct a risk assessment before deploying any new functionality that is likely to have a critical impact on systemic risks; the assessment must include measures to mitigate those risks and must be provided to the European Commission and the competent Digital Services Coordinator. TikTok was designated by the European Commission as a very large online platform as early as April 2023, yet it failed to comply with these requirements before launching the Tasks and Rewards Program.
On April 22, the European Commission announced a formal investigation into whether the launch of TikTok Lite's Tasks and Rewards Program in France and Spain violated the DSA. It required TikTok to submit a risk assessment report within 24 hours and warned that it was considering suspending the program across the EU (under Article 70 of the DSA, the European Commission may impose interim measures on a very large online platform where there is urgency due to the risk of serious harm to recipients of the service).
On April 24, TikTok announced on the social media platform X (formerly Twitter) that it had voluntarily suspended the Tasks and Rewards Program, stating that it "always seeks to engage constructively with the European Commission and other regulatory bodies."
The European Commission had already opened a separate investigation into whether TikTok violated the DSA in areas such as the protection of minors, advertising transparency, researcher access to data, addictive design, and the management of risks related to harmful content. That investigation is still ongoing; see Kaamel's detailed discussion of that case.
TikTok Lite Rewards Program Violates Risk Assessment Obligations
TikTok is one of the very large online platforms designated by the European Commission (others include Amazon Marketplace, Alibaba AliExpress, Facebook, Instagram, and Twitter), and as such it must comply with the additional obligations the DSA imposes on very large online platforms, including the obligation to conduct risk assessments.
Article 34 of the DSA requires providers of very large online platforms and very large online search engines to conduct a risk assessment at least once a year, and also before deploying any new functionality likely to have a critical impact on systemic risks. Systemic risks include:
(i) the dissemination of illegal content through the platform's services;
(ii) negative effects on fundamental rights (in particular human dignity, respect for private life, protection of personal data, and the rights of the child);
(iii) negative effects on public discourse, electoral processes, or public security;
(iv) actual or foreseeable negative effects relating to gender-based violence, public health, and the protection of minors, as well as serious negative consequences for a person's physical and mental well-being.
Article 35 further requires providers to take reasonable, proportionate, and effective mitigation measures tailored to the risks identified in the assessment.
The European Commission took the view that the TikTok Lite rewards program, which lets users complete specific tasks in exchange for tangible rewards, may be addictive and could harm the physical and mental health of users, particularly minors. This falls within the systemic risks defined in Article 34 of the DSA, so TikTok was required to conduct a risk assessment and implement corresponding mitigation measures. TikTok, however, neither submitted a risk assessment report to the Commission as required nor took such measures, thereby breaching the risk assessment obligations under the DSA.
Compliance Insights from the TikTok Incident
In this case, TikTok was investigated for failing to comply with the obligations the DSA imposes specifically on very large online platforms, namely the duty to assess the risks of new functionality that may have a critical impact on systemic risks before launching it. Many countries and regions have strengthened legislation on data and privacy protection, but they differ in the strictness of their laws and enforcement, in regulatory focus, and in regulatory models, so the compliance obligations attached to the same type of business can vary from one jurisdiction to another. Companies expanding internationally must therefore track the latest legislative developments in each local market and ensure they meet local regulatory requirements.
When announcing the investigation, the European Commission also noted that it had previously asked TikTok to submit a risk assessment report and that TikTok had not complied, which contributed to the opening of the investigation. Companies must respond promptly to regulatory requests; failure to do so can have adverse consequences. In addition, many laws require covered entities to produce reports or other documents, and companies should be prepared to produce these documents and retain them for the required period for review by regulators.
TikTok currently faces multiple investigations and lawsuits brought by regulators in Europe and the U.S. Recently, the U.S. Federal Trade Commission (FTC) accused TikTok of violating the Children's Online Privacy Protection Act (COPPA) and asked a court to impose civil penalties; see Kaamel's detailed discussion of that case.
DSA Compliance Highlights
The DSA is a landmark regulation adopted by the EU in 2022 to govern the conduct of digital intermediary service providers, especially online platforms. Below is a brief overview of its main content and compliance requirements.
1. Scope of Application
The DSA applies to all intermediary services offered to recipients who are located or established in the EU, regardless of where the service provider itself is established.
According to the DSA, intermediary services include the following three types:
(i) Mere conduit services, which transmit information provided by a recipient of the service over a communication network, or provide access to a communication network; examples include wireless access points and virtual private networks (VPNs);
(ii) Caching services, which automatically and temporarily store information provided by a recipient, for the sole purpose of making its onward transmission more efficient upon request; an example is a content delivery network (CDN);
(iii) Hosting services, which store information provided by, and at the request of, a recipient of the service; examples include online platforms.
2. Obligations
The DSA adopts a tiered regulatory approach, imposing different obligations on intermediary service providers based on their size and the type of services they provide.
(1) For all intermediary service providers, the DSA requires:
- Transparency reporting: At least once a year, publish a content moderation report in a machine-readable, easily accessible format, covering the number of orders received from Member State authorities, basic information on content moderation activities, any automated means used for content moderation, and so on (a minimal machine-readable sketch follows this list).
- Readable terms of service: The terms of service must explain any restrictions imposed on the information provided by recipients, and recipients must be notified of any significant changes to the terms.
- Points of contact: Designate and publicly disclose a point of contact for communication with Member State authorities, the European Commission, and the European Board for Digital Services, as well as a point of contact for recipients of the service.
- Legal representative: An intermediary service provider with no establishment in the EU that offers services in the EU must designate, in writing, a legal representative in one of the Member States where it offers its services.
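As noted above, transparency reports must be machine-readable. Below is a minimal sketch of what such a report could look like as JSON; the DSA mandates machine readability but no particular schema, so every field name here is an assumption made purely for illustration.

```python
import json
from dataclasses import dataclass, asdict

# Purely illustrative schema: the DSA mandates a machine-readable format,
# not these field names, which are assumptions made for this example.
@dataclass
class TransparencyReport:
    period: str                      # reporting period, e.g. "2024"
    member_state_orders: int         # orders received from Member State authorities
    notices_received: int            # illegal-content notices received
    items_restricted: int            # pieces of content removed or access-disabled
    automated_decision_share: float  # share of decisions taken by automated means

report = TransparencyReport(
    period="2024",
    member_state_orders=120,
    notices_received=45_000,
    items_restricted=12_300,
    automated_decision_share=0.62,
)

# Serializing to JSON satisfies "machine-readable"; easy accessibility
# (e.g. a stable public URL) must be handled separately.
print(json.dumps(asdict(report), indent=2))
```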
(2) For providers of hosting services (including online platforms), the DSA additionally requires:
- Establish an easy-to-use mechanism for reporting illegal content and act on reports promptly upon receipt (a minimal notice-handling sketch follows this list).
- Where content is illegal or violates the terms of service, the hosting service provider may impose restrictions (including removing or disabling access to specific content, or partially or completely suspending the service), but must give the affected recipient a clear and specific statement of reasons.
- Report information giving rise to a suspicion of a criminal offence involving a threat to the life or safety of persons to the law enforcement or judicial authorities of the Member State concerned.
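As a sketch of how such a notice-and-action flow could be structured, the example below records an illegal-content notice and, where the provider restricts the content, produces the statement of reasons owed to the affected recipient. All class and field names, and the `review` stub, are assumptions for illustration; the DSA prescribes the obligations, not this design.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Notice:
    content_id: str
    reporter_contact: str
    explanation: str  # why the reporter considers the content illegal
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class StatementOfReasons:
    content_id: str
    restriction: str             # e.g. "removal", "access disabled", "service suspended"
    ground: str                  # legal ground or terms-of-service clause relied on
    facts_and_circumstances: str
    redress_options: str         # internal complaints, out-of-court body, court

def review(notice: Notice) -> bool:
    # Stub standing in for human and/or automated review of the reported content.
    return True

def handle_notice(notice: Notice) -> Optional[StatementOfReasons]:
    """Process a notice; if the content is restricted, return the statement
    of reasons that must be sent to the affected recipient."""
    if not review(notice):
        return None
    return StatementOfReasons(
        content_id=notice.content_id,
        restriction="removal",
        ground="illegal content under applicable national law",
        facts_and_circumstances=notice.explanation,
        redress_options="internal complaint system; certified out-of-court body; court",
    )
```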
(3) Online platforms are the focus of DSA regulation and face higher compliance requirements. The additional obligations for online platform providers mainly include:
- Establish a free, easily accessible, user-friendly internal electronic complaint-handling system, and ensure that complaints are handled in a timely, non-discriminatory, diligent, and non-arbitrary manner.
- Ensure that information about certified out-of-court dispute settlement bodies is easily accessible, and engage with such bodies when recipients turn to them to contest the platform's decisions.
- Take the necessary technical and organizational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, are processed and decided with priority.
- In transparency reports, disclose the number of disputes submitted to out-of-court dispute settlement bodies, their outcomes, the average time taken to resolve them, and the platform's compliance with the outcomes.
- When displaying advertisements, ensure that users can identify the content as advertising and can see who the advertiser is, the main parameters used to decide which users the ad is shown to, and how those parameters can be changed (an illustrative disclosure record follows this list).
- Do not design online interfaces in a way that deceives or manipulates users, or that otherwise distorts or impairs their ability to make free and informed decisions.
- Do not present advertisements based on profiling using the personal data of recipients known to be minors.
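To make the last two advertising obligations concrete, here is a minimal sketch of a per-ad disclosure record, together with a guard against profiling-based ads for minors. The field names, the example advertiser, and the settings path are illustrative assumptions, not anything the DSA specifies.

```python
from dataclasses import dataclass

@dataclass
class AdDisclosure:
    is_advertising: bool        # surfaced to the user as a visible "Ad" label
    advertiser: str
    paid_by: str                # who financed the ad, if different from the advertiser
    main_parameters: list[str]  # main parameters used to select the audience
    how_to_change: str          # where the user can adjust those parameters

def may_profile_for_ads(user_is_known_minor: bool) -> bool:
    # No advertising based on profiling with a minor's personal data.
    return not user_is_known_minor

disclosure = AdDisclosure(
    is_advertising=True,
    advertiser="Example Brand SARL",  # hypothetical advertiser
    paid_by="Example Brand SARL",
    main_parameters=["language", "approximate location"],
    how_to_change="Settings > Ads > Ad personalization",  # hypothetical path
)
```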
(4) For providers of very large online platforms (VLOPs) and very large online search engines (VLOSEs), the DSA imposes the strictest compliance requirements. In addition to the obligations above, these mainly include:
- Risk assessment: Conduct a risk assessment of systemic risks (including illegal content, privacy violations, election manipulation, hate speech, etc.) at least once a year, and before deploying any new functionality likely to have a critical impact on systemic risks, and adopt measures to mitigate the identified risks. Supporting documents for risk assessments must be retained for at least three years.
- Independent audit: Undergo an independent audit at least once a year and take due account of the recommendations in the audit report. Within one month of receiving the recommendations, adopt an audit implementation report setting out the remedial measures and then implement them.
- Recommender systems: Offer, for each recommender system, at least one option that is not based on profiling (a minimal sketch follows this list).
- Advertising transparency: Maintain a publicly available repository of advertisements in a dedicated section of the online interface, allowing users to look up an ad's content, the advertiser, the period during which it was displayed, and so on.
- Data access: Upon reasoned request, give regulatory authorities access to the data necessary to monitor compliance, and, subject to certain conditions, give researchers access to the data they need.
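As an illustration of the recommender-system obligation, the sketch below exposes a profiling-based ranking alongside a non-profiling alternative (here, a simple chronological feed) and lets the user's choice decide which one runs. The function names and ranking logic are assumptions made for the example, not TikTok's or any platform's actual design.

```python
from typing import Callable

Item = str
Profile = dict[str, float]

def personalized_feed(items: list[Item], profile: Profile) -> list[Item]:
    # Stand-in for a profiling-based ranker (e.g. predicted engagement per item).
    return sorted(items, key=lambda i: profile.get(i, 0.0), reverse=True)

def chronological_feed(items: list[Item], _profile: Profile) -> list[Item]:
    # Non-profiling alternative: no personal data used; items are assumed
    # to arrive newest-first and are shown in that order.
    return list(items)

def get_feed(items: list[Item], profile: Profile, use_profiling: bool) -> list[Item]:
    # The user's choice, not the platform's, selects the ranker.
    ranker: Callable[[list[Item], Profile], list[Item]] = (
        personalized_feed if use_profiling else chronological_feed
    )
    return ranker(items, profile)

# A user who opts out of profiling gets the chronological feed.
print(get_feed(["newest", "older", "oldest"], {"older": 0.9}, use_profiling=False))
```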