Perspectives January 31, 2023
A TikTok Trend You Can’t Ignore: Addressing the Risks by Protecting Privacy and Bolstering Transparency
As one of the world’s most ardent defenders of internet freedom, the United States should strengthen requirements for privacy and transparency rather than resort to an outright ban.
The future of TikTok on the American smartphone is shaky. From statehouses to university campuses, a storm of political fervor against the app has been gathering. Some federal lawmakers and a Federal Communications Commission (FCC) commissioner have called for the app to be banned for everyone within the United States. One pending federal bill would do just that.
A nationwide ban would be a mistake. It raises serious First Amendment concerns, would normalize censorship globally, and could serve as justification for other governments to block international platforms and restrict free expression. Instead, the United States should bolster requirements for privacy, transparency, and platform responsibility at the legislative and regulatory levels, alongside other ongoing efforts to analyze and mitigate national security risks in a rights-respecting way. This approach would allow for a deeper understanding of how the company functions, could reduce TikTok's vulnerabilities as a vehicle for foreign influence, and would treat TikTok not as an isolated problem but as part of a wider ecosystem of problematic platforms that need urgent attention.
What are the concerns?
TikTok is owned by the Chinese tech behemoth ByteDance, raising unique and serious human rights and national security concerns. These include the broad swath of data TikTok collects, where it is stored, how it can be accessed, and by whom, including by the Chinese Communist Party (CCP). Additional concerns relate to whether the company does, or will in the future, censor or amplify posts preemptively or at the behest of the Chinese government, and whether the platform's global reach can be exploited to spread disinformation.
China-based companies must comply with Beijing's onerous regulatory controls. For ByteDance specifically, the CCP holds a one percent stake and one of three board seats, and several revelations have illustrated how the ruling party exerts its influence over the company, including outside of China. Reuters, for instance, reported that Indonesian moderators of ByteDance's news aggregator Baca Berita were instructed to censor stories critical of Chinese officials from 2018 to 2020. WeChat, a messaging app owned by the Chinese tech titan Tencent and popular among the Chinese diaspora, has likewise been found to censor political, social, and religious content, and even to shutter the accounts of Chinese Americans who share such posts.
TikTok has failed to adopt best practices in platform accountability and has not been transparent and forthright with the public. Former employees have also alleged that TikTok and ByteDance have a cozy relationship. For instance, ByteDance confirmed in December that it had accessed at least two journalists' locations via TikTok in an attempt to discover whether TikTok employees had leaked internal documents to the reporters. Forbes reported that TikTok and ByteDance staff have handpicked videos of certain influencers and brands and amplified them across the platform's “For You Page,” showing the companies' willingness to interfere with user-generated content.
However, it is not publicly clear exactly how ByteDance’s relationship with the CCP trickles down to TikTok. A recent paper analyzing TikTok’s national security impact found that CCP sway over ByteDance is focused on Douyin, the Chinese version of TikTok, rather than its global counterpart.
A comprehensive approach to making TikTok safer
Banning TikTok for millions of Americans does not fully address the challenges posed by the platform, some of which are common concerns across the social media sector. As a starting point, the US government should instead raise the standards by which all social media services must abide so that they are more resilient to foreign influence. Stronger legal standards would also incentivize TikTok and other services to act more responsibly in the future in order to avoid penalties for noncompliance.
We know too little publicly about how TikTok operates, how susceptible it is to foreign pressure, and what impact the platform has on human rights and national security. Laws that bolster transparency around recommendation and data collection systems, require human rights due diligence reporting or related risk assessment audits, and provide platform data to vetted researchers in a privacy-conscious way would shed light on company operations, decision-making, and content governance.
TikTok has shown some responsiveness to public condemnation, and strengthened transparency regulations could spur additional legislative and civil society scrutiny, provide insights into the extent to which TikTok and ByteDance are interwoven, and inform further research, advocacy, and policy development. Greater transparency would also allow people to make more informed choices about whether they want to accept the risks of using the platform.
Narrowly focusing on TikTok’s data practices also does not address the threats of the broader data ecosystem and the multiple ways foreign governments and other actors can access and exploit Americans’ information. These include pressuring US and foreign-owned platforms directly, illegally hacking into data systems, and simply purchasing data from the shadowy data broker industry.
What is truly needed is a robust federal privacy law, which does not exist in the United States despite bipartisan congressional and broad public support. Such legislation can minimize the data that TikTok and other platforms collect and share, and would be a critical step toward reining in the broader free-for-all market that sells access to Americans’ data. It can also limit the personal data that can be fed into recommendation and advertising systems, reducing the reach of state propaganda campaigns that rely on microtargeting users based on their personal characteristics. In the absence of such a law, the Federal Trade Commission (FTC) could also use its existing authorities to issue stronger protections against commercial surveillance and weak data security, a process that is currently underway.
The dangerous impact of a ban, at home and abroad
Imposing an outright ban on millions of Americans’ ability to use TikTok would be a dangerous trade-off. It would eliminate space for free expression, information sharing, connection to communities abroad, and political and social organizing. During massive protests in Iran, people capitalized on TikTok’s algorithm to evade government censorship and promote content supporting the demonstrations to users everywhere. In December, when hundreds of thousands of people in the United States were stranded by mass cancellations of Southwest Airlines flights, Transportation Secretary Pete Buttigieg took to TikTok to educate passengers on how to file claims for the compensation they were owed.
A US ban would also set a dangerous precedent: TikTok is not the only app with ties to authoritarian states that raises these challenges, and such a move would reverberate globally. The push against TikTok comes as censorship is deepening around the world. A record high of at least 40 national governments blocked websites hosting political, social, and religious speech over the past year, while 22 blocked social media platforms.
Our research shows that authorities learn from one another, with less-free governments often pointing to the problematic actions of democratic states to justify their repressive policies. A US ban on TikTok could inadvertently encourage other governments to tighten their grip on, or outright block, dominant US-based platforms, shrinking the digital civic space. It would also considerably diminish the United States’ credibility as an advocate against censorship internationally, which would be particularly alarming given its position as the 2023 chair of the Freedom Online Coalition and the recent election of an American as secretary-general of the International Telecommunication Union (ITU).
The United States and democratic governments everywhere have an opportunity to show that censorship is not a proportionate or effective response to the genuine risks of the digital age. There are better ways to protect people.