On Thursday, the chief executive officer of TikTok’s U.S. operations testified before Congress in a highly anticipated appearance. While TikTok creators rallied outside, CEO Shou Chew faced a tough line of questioning from the U.S. House Committee on Energy and Commerce over five hours. The hearing provided key insights into how the creator economy could face significant regulatory and policy changes in the coming years.
In Chew’s written statement, the company shared it has amassed 150 million users in the U.S.—a figure large enough that any sudden disappearance of the mobile app would certainly send shockwaves throughout the creator economy.
Yet Congress appeared more focused on the platform’s current operations than on waiting around for the promised fixes of “Project Texas,” TikTok’s proposed plan to protect U.S. user data.
For creators, it was clear that members of Congress were often disconnected from the cultural significance of the platform and, therefore, its significance to creator-owned businesses. The questioning from congressional members at times included long diatribes seemingly aimed at producing usable soundbites for constituents. There were also moments during the hearing that showcased their lack of understanding of how TikTok operates.
There were also a number of moments when lawmakers were unable to get clear answers from the company regarding its data storage and access policies, with Chew instead referring to the “global interoperability” of the platform and the data. The arguably vague use of “interoperability” underscores that there is some level of information and data sharing between TikTok’s U.S. operations and its other entities around the world, including in China.
Even if creators are willing to trade their personal data for access to TikTok’s audience of 150 million American users, Rep. Dan Crenshaw (R-TX) delivered a harsh warning during his closing statement.
“You may not care that your data is being accessed now, but it will be one day when you do care about it,” he said. “With data comes power. They can choose what you see and how you see it. They make you believe things that are not true. They can encourage you to engage in behavior that will destroy your life. Even if it is not happening yet, it could in the future.”
Crenshaw, along with other representatives, often referred to the power of TikTok to influence Americans through its control of the platform, content, and algorithms. Just as creators are often in the dark working to reverse-engineer the platform’s recommendation systems and technologies via the “For You” page, members of the Committee were left with little new information from Chew about how the platform actually operates or works to keep users safe.
“I think your testimony has raised more questions for me than answers,” Rep. Lisa Blunt Rochester (D-DE) expressed during the hearing.
TikTok’s Section 230 liability shield and its impact on creators
Section 230 of the Communications Decency Act, a 1996 law that provides broad immunity to internet platforms for the user-generated content they are hosting, was frequently brought up during Thursday’s hearing.
For TikTok, Section 230 protections mean it can’t be sued if a creator posts a defamatory video or if someone uses the comment section to harass another user or creator, provided it implements content moderation controls. The law has notably been updated only once, in 2018, to require that platforms proactively remove sex trafficking material.
Thursday’s congressional hearing focused on potential limits to Section 230’s protection for a platform based on the algorithms it creates, the content moderation decisions it implements, and the impact such decisions have on users.
As Rep. August Pfluger (R-TX) pointed out during the hearing, TikTok’s own platform terms expressly state its ability to step in and moderate content shared on the platform. The Terms of Service prohibit users from sharing “material that, in the sole judgment of TikTok, is objectionable,” a commonly recognized right among internet platforms.
In response, TikTok’s CEO pushed for comparisons with other U.S. social media companies, rather than with the Chinese version of the app, or even the Singaporean version that Chew’s kids are unable to use because local laws prohibit users under the age of 13.
Chew specifically called out similar algorithmic and content moderation activities of Meta and other social platforms. As previously reported by Passionfruit, rumors from creators were swirling on TikTok that Meta was somehow involved behind the scenes with lobbyists to push the agenda to ban TikTok.
Rep. Yvette D. Clarke (D-NY) expressed concern over TikTok’s algorithms flagging and suppressing content about topics such as race and Black Lives Matter. Clarke and Chew did agree on the need for both platform and algorithm transparency when it comes to content moderation.
Changes to Section 230 and other internet laws raise concerns that platforms will adopt more restrictive content moderation policies to remove illegal or harmful content. This, in turn, could result in creators’ legitimate content being removed or suppressed as a side effect.
Such changes could also create more gray areas around what content may or may not expose a platform to liability, resulting in restrictions on entire subjects or broad swaths of content categories.
Following the hearing, reporter Taylor Lorenz noted that statements made during the hearing alleging TikTok banned entire hashtags were false. “No hashtags were censored,” she explained during an interview with the Washington Post. “Briefly views on all hashtags on TikTok were temporarily unable to be seen from the search page. TikTok has never censored, as far as we’ve seen evidence of, a specific hashtag like that in that context.”
A reminder of creators’ global reach
There were moments in which members of Congress highlighted discrepancies in how the platform operates from one country to the next, specifically in the treatment of algorithms, content moderation policies, user experience and product design decisions, and data privacy and security.
As platforms continue to roll out features that engage users and viewers across the globe, such as YouTube’s recent multi-language audio track support, creators will need to become aware of laws outside their own jurisdictions. Even if unintentional, violating the laws of one territory could put a creator’s future success and revenues in that jurisdiction in jeopardy.
Rep. Lori Trahan (D-MA) referenced the UK’s Age Appropriate Design Code and its 15 design standards aimed at protecting children online. Trahan specifically asked whether TikTok would commit to voluntarily expanding such protections to also cover U.S. children and teens.
“We take the safety of the younger users of our platform very seriously,” Chew responded, to which Trahan interrupted to point out, “This is a good way to prove it.”
Platforms affording users different rights based on their citizenship is not a new issue. Laws across Europe have offered E.U. citizens arguably more control over their personal information and data, such as the right to be forgotten, established in 2014, and the more expansive privacy rights rolled out in 2018.
It’s important to remember that TikTok’s regulatory problems will easily become the regulatory problems of all social media platforms. Any sweeping legislative changes will likely impact more than just TikTok.
The elephants in the room were proposed pieces of legislation known as the RESTRICT Act and the DATA Act, which, once passed by Congress, would allow President Biden to decide on a TikTok ban and data-sharing restrictions. However, such solutions are only a temporary fix to the much larger policy issues and discussions that still need to take place surrounding user data privacy and security.
It remains to be seen how, if at all, TikTok plans to follow up with written responses on the many questions Chew left unanswered. Passionfruit reached out to TikTok and the congressional members quoted in this piece for comment via email and did not hear back in time for the publication of this article.
What should creators do now?
Until a final decision is made and action is taken, creators can take proactive steps to ensure the communities they have built on TikTok aren’t entirely lost. First, creators should have an established presence on multiple platforms. They should study the impact of India’s 2020 TikTok ban on creators there, which largely led to an explosion of Instagram’s and YouTube’s short-form video offerings.
Second, creators should be cross-posting short-form content to other platforms, like Instagram Reels and YouTube Shorts, by downloading their videos from TikTok.
“I take so much issue when they talk about this being a free speech issue,” Lauren Schnipper, an executive at content company Jellysmack said in a podcast episode about a potential TikTok ban. “Nobody is silencing the people on these platforms.”
Third, creators should be reminding their audience of the risk to their community should a ban become a reality. Creators can use link-in-bio tools like Linktree or Koji to easily send viewers to one place that presents a range of alternative options for connection.
Creators can consider prioritizing a platform they control, such as an email list or website. However, such an option may vary depending on the type of content they create and the ways in which they engage with their community.