
TikTok Ban Needs More Work

President Joe Biden signed into law a requirement for ByteDance to sell TikTok or be banned from U.S. app markets. TikTok has initiated a legal challenge.

This law, while framed around national security, primarily impacts young people and follows state-level actions to constrain social media platforms, such as the law signed last month in Florida.

For years, policymakers took a wait-and-see approach and deferred to tech companies to set their own rules, often at the expense of individuals on issues like privacy and mental health.

If upheld in court, these actions may offer hope to some that American society is at a turning point. Unfortunately, these policies will not work: their designs are poorly suited to their stated aims of protecting young people and preventing Chinese interference.

To be sure, tech entities cannot regulate themselves, but it is also imprudent to allow regulation by an angry mob. As a technology policy expert, I know that effective governance requires domain expertise and cooperation. Social media governance needs to respond to social norms, community expectations, and the voices of the public. But it must also be practically possible.

And this is where many recent proposals and new laws fall short. While parents might know what is best for their children, with respect to screen time or readiness for a particular device or app, their emotional pleas for a ban do not offer a solution detailed enough to be impactful.

Banning Chinese ownership of one app does nothing to stop data flows via the thousands of other apps with Chinese developers—just look at the popularity of Shein and Temu—or any of the cheap smart devices that constantly send data internationally. Research documents significant international traffic to Internet-of-Things (IoT) developers, both through experiments and crowdsourced data shared by users, with China among the most prominent recipients.

Preventing new downloads via Google or Apple app markets does nothing to stop current users, or even out-of-market downloads. Sideloading, the practice of installing an app directly from the developer or through peer-to-peer sharing, is a popular way to test new apps and support research, and the EU even requires Apple to allow it under the Digital Markets Act. It is also growing in popularity globally.

Requiring parental consent for 14- or 15-year-olds to use social media draws on notice-and-choice mechanisms that already fail to protect user privacy. It is implausible to expect the same mechanisms to do more in this situation.

Many experts agree that these policies are too simplistic. Meaningful age-gating would likely require collaboration with internet service providers and device manufacturers, or more complex or multifactor authentication by parents.

Addressing international data flows requires a combination of coordinated legal and infrastructural solutions. Scholars have argued for decades that technologies are governed by a combination of law, social norms, markets, and technical architecture.

Good solutions to these issues require intentionally addressing all four components.

Outside of my work as a researcher, my main responsibility is as a parent, and I am deeply invested in these issues. Parents must navigate everything from administrators requesting consent for social media screening as part of daycare admissions to a preschool's requirement to use an app documenting dietary, potty-training, and medical details.

It is frustrating when the privacy policies of apps that document children's lives address only how parents' data will be protected. As children grow, so too do the concerns, including whether the school teaches good password practices or whether YouTube Kids has solved its problems with explicit content. In using age restrictions on Disney+, parents need to figure out what is being censored and whether it aligns with their values on intellectual freedom.

Fortunately, I have the tools and expertise to effect change in this space, but many do not. All parents need to be heard in this conversation.

Lawmakers need to understand that the domain of technology policy is unique and that they must balance public concerns with expert guidance. Laws addressing society's many problems around social media need to reflect what is technically possible and what is effective in serving social concerns and goals.

In my 2021 book, Governing Privacy in Knowledge Commons, I demonstrate that sustainable and impactful technology governance requires cooperation. Policymakers, advocates, teachers, administrators, and parents all need to work together.

Multifaceted problems require multifaceted solutions.

© 2022 VISIBLE Magazine. All Rights Reserved.

