Montana Is Right to Ban TikTok
Our children deserve nothing less.
This text has been adapted from an amicus brief filed by the Institute for Family Studies in support of Montana’s SB 419, which bans TikTok from the state.
Although a relatively recent entrant into the U.S. social media market, TikTok escalates dangerous online activity. Nearly a third of Americans of all ages spend more than an hour on the app daily. With over one billion monthly users, it ranks fourth globally among social media platforms. More than any other social media platform, TikTok appeals to the youngest market possible.
In May 2023, Surgeon General Vivek Murthy issued a public advisory warning that social media’s mental health effects have created an “urgent public health issue” for young Americans. The most critical period of adolescent brain development is from ages 10 to 19, when the brain is “especially susceptible” to harm and influence by these applications.
The psychologist and Institute for Family Studies (IFS) contributor Jean Twenge finds significant evidence of social media’s negative effect on teen mental health, and says, “With heavy users twice as likely to be depressed as light users, it seems odd to describe the links as small. The associations are just as large as factors subject to public health interventions like smoking, obesity, and lead exposure.”
Certain populations prove particularly at risk for technological addiction. A 2022 report by IFS and the Wheatley Institute showed that children from non-intact families—i.e., those who don’t live with married parents—suffer disproportionately. Unfortunately, “Children being raised in non-intact families [tend to] have fewer rules guiding their use of technology and more exposure to that technology,” spending almost two hours more daily on their devices than their peers who live with married parents. This makes TikTok an extremely influential force for children in such circumstances.
TikTok fosters injurious social behaviors through its addictiveness and the false authority granted to so-called “influencers.” TikTok’s algorithm encourages children to engage in deadly “challenges,” such as the notorious “Benadryl Challenge,” which claimed the life of 13-year-old Jacob Stevens. At the algorithm’s urging, Stevens ingested large doses of Benadryl and filmed himself doing so; he died on a ventilator days later.
These monetized influencers also offer users mental health “hacks” to self-diagnose and self-cure. Young users investigate their own mental health problems, and in many cases manifest symptoms they did not present before their self-study, such as the “TikTok tic.” Kids are lured into the app for help when they should be alerting a parent or guardian or seeing a professional. When one considers the young demographic to which TikTok caters, the risk of inexperienced diagnoses becomes more apparent.
TikTok’s algorithm is not designed for sociality. Unlike Facebook, for example, a TikTok user’s feed is not a timeline of the life events of friends and family. The New York Times journalist Ben Smith summarized: “[TikTok] displays an endless stream of videos and, unlike the social media apps it is increasingly displacing, serves more as entertainment than as a connection to friends.”
A 2019 New York Times article on TikTok provides a synoptic view:
The most obvious clue [of TikTok’s nature] is right there when you open the app: the first thing you see isn’t a feed of your friends, but a page called ‘For You.’ It’s an algorithmic feed based on videos you’ve interacted with, or even just watched. It never runs out of material. It is not…full of people you know, or things you’ve explicitly told it you want to see. It’s full of things that you seem to have demonstrated you want to watch, no matter what you actually say you want to watch.
While TikTok’s algorithm considers industry-standard metrics such as likes and comments in organizing a user’s feed, it puts more weight on replicating the content a user tends to return to and view for the greatest duration. This means that users cannot direct the feed away from content they may be drawn to by addiction and compulsion. Thus, children fall down the “rabbit hole” to very dark places. When a child is scrolling through her feed, she has little to no agency. TikTok’s algorithm is the dominant agent.
TikTok resists adequate monitoring by parents or guardians. Under severe pressure, in July 2022, the platform introduced tools for parents to customize the experience of their children. But the tools are notoriously difficult to operate. And even with these tools, no parent can possibly monitor content in such volume. A kid’s feed can evolve from one day to the next, one hour to the next, even one video to the next.
The platform shares YouTube’s visual appeal and the power of video, but its shorter clips tend to be even more intimate. TikTokers don’t just show their faces and bedrooms; they often record as they move around these and other places they frequent, such as their school, neighborhood, and gym, exposing friends, family, and strangers as well. TikTok thus destroys young users’ agency, intimate social connections, and privacy in one fell swoop.
As IFS executive director Michael Toscano has summarized in Compact magazine, TikTok is “controlled by the Chinese government through its parent company, the China-based ByteDance.” TikTok is so dangerous for children, however, that the Chinese government itself does not allow its own children to use it. Douyin—TikTok’s Chinese counterpart—is not allowed to provide the same content that TikTok shows to American kids. Instead, China restricts Douyin to edifying material and rigorously excludes content that could threaten the moral and spiritual welfare of Chinese youths. Even so, the Chinese government placed a 40-minute-per-day restriction on Douyin for users under the age of 18. As Tristan Harris, founder of the Center for Humane Technology, recently told 60 Minutes, “It’s almost like they [the Chinese government] recognize that technology is influencing kids’ development, and they make their domestic version a spinach version of TikTok, while they ship the opium version to the rest of the world.”
In more than thirty states, TikTok is banned from the devices of government employees. On December 30, 2022, as part of a new appropriations bill, President Biden signed a federal ban into law. Why? As the Guardian has reported, an analysis of TikTok’s source code revealed that it can “collect user contact lists, access calendars, scan hard drives including external ones, and geolocate devices on an hourly basis.” The Guardian writes that the app needs none of these data collections to operate, which suggests to analysts that, whatever the stated objectives of the app, its real purpose is data collection. Millions of American children emit geolocation data hourly, potentially providing a textured map of the United States. This threat to national security has been underexamined.
Perhaps the gravest threat of all is that this seemingly autonomous algorithm is in fact designed to reshape the identity and personality of American kids, making them pliant to behaviors that, as we have seen, the CCP finds destructive among its own youth. Prudence demands that we not give TikTok the benefit of the doubt as to whether this is intentional. Senate Bill 419 frees the children of Montana from being objects of surveillance and manipulation. They deserve nothing less.