
TikTok executives know about app's effect on teens, lawsuit documents allege

Unredacted documents show TikTok is aware of the dangers caused by its app. (Sebastien Bozon / AFP via Getty Images)

In newly revealed internal communications, TikTok executives discuss being aware of the harms caused by their app. Officials were warned of its dangers to minors.

For the first time, internal TikTok communications have been made public that show a company unconcerned with the harms the app poses for American teenagers, despite its own research validating many child safety concerns.

The confidential material was part of a more than two-year investigation into TikTok by 14 attorneys general that led to the states suing the company on Tuesday. The lawsuits allege that TikTok was designed with the express intention of addicting young people to the app, and the states argue the multibillion-dollar company deceived the public about the risks.

In each of the separate lawsuits state regulators filed, dozens of internal communications, documents and research data were redacted (blacked out from public view) because authorities entered into confidentiality agreements with TikTok.

But in one of the lawsuits, filed by the Kentucky Attorney General's Office, the redactions were faulty. This came to light when Kentucky Public Radio copied and pasted excerpts of the redacted material, revealing some 30 pages of documents that had been kept secret.

After Kentucky Public Radio published excerpts of the redacted material, a state judge sealed the entire complaint at the request of the attorney general's office "to ensure that any settlement documents and related information, confidential commercial and trade secret information, and other protected information was not improperly disseminated," according to an emergency motion to seal the complaint filed on Wednesday by Kentucky officials.

NPR reviewed all the portions of the suit that were redacted, which highlight TikTok executives speaking candidly about a host of dangers for children on the wildly popular video app. The material, mostly summaries of internal studies and communications, shows that some remedial measures, like time-management tools, would yield only a negligible reduction in screen time. The company released and touted the features anyway.

Separately, under a new law, TikTok has until January to divest from its Chinese parent company, ByteDance, or face a nationwide ban. TikTok is fighting the looming crackdown. Meanwhile, the new lawsuits from state authorities have cast scrutiny on the app and its ability to counter content that harms minors. 

In a statement, TikTok spokesman Alex Haurek defended the company's child safety record and condemned the disclosure of once-public material that has now been sealed.

"It is highly irresponsible of NPR to publish information that is under a court seal,鈥 Haurek said. 鈥淯nfortunately, this complaint cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.鈥

He continued: "We have robust safeguards, which include proactively removing suspected underage users, and we have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16."

Kentucky AG: TikTok users can become 'addicted' in 35 minutes

As TikTok's 170 million U.S. users can attest, the platform's hyper-personalized algorithm can be so engaging it becomes difficult to close the app. TikTok determined the precise amount of viewing it takes for someone to form a habit: 260 videos. After that, according to state investigators, a user "is likely to become addicted to the platform."

In a previously redacted portion of the suit, Kentucky investigators wrote: "While this may seem substantial, TikTok videos can be as short as 8 seconds and are played for viewers in rapid-fire succession, automatically. Thus, in under 35 minutes, an average user is likely to become addicted to the platform."
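
That math checks out under the complaint's own assumptions: 260 videos at 8 seconds apiece comes to 2,080 seconds, or roughly 34.7 minutes of continuous, autoplayed viewing, just under the 35-minute mark the investigators cite.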

Another internal document shows the company was aware that its many features designed to keep young people on the app created a constant and irresistible urge to keep opening it.

TikTok's own research states that "compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety," according to the suit.

In addition, the documents show that TikTok was aware that "compulsive usage also interferes with essential personal responsibilities like sufficient sleep, work/school responsibilities, and connecting with loved ones."

TikTok: Time-limit tool aimed at 'improving public trust,' not limiting app use

The unredacted documents show that TikTok employees were aware that too much time spent by teens on social media can harm their mental health. The academic consensus recommends an hour or less of social media use per day.

The app lets parents place time limits on their kids' usage that range from 40 minutes to two hours per day. TikTok created a tool that set the default time prompt at 60 minutes per day.

Internal documents show that TikTok measured the success of this tool by how it was "improving public trust in the TikTok platform via media coverage," rather than by how much it reduced the time teens spent on the app.

After tests, TikTok found the tool had little impact: teens' usage dropped by about 1.5 minutes, from around 108.5 minutes per day before the tool to roughly 107 minutes with it. According to the attorney general's complaint, TikTok did not revisit this issue.
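
Measured against the complaint's own figures, that amounts to a reduction of roughly 1.4 percent: a 1.5-minute drop against a baseline of about 108.5 minutes per day.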

One document shows a TikTok project manager saying, "Our goal is not to reduce the time spent." In a chat message echoing that sentiment, another employee said the goal is to "contribute to DAU [daily active users] and retention" of users.

TikTok also has break videos, which are prompts to get users to stop endlessly scrolling and take a break. Internally, however, it appears the company didn't think the videos amounted to much. One executive said they are "useful in a good talking point" with policymakers, but "they're not altogether effective."

Document: TikTok demoted people it deemed unattractive on its feed

The multi-state litigation against TikTok highlighted the company's beauty filters, which users can overlay on videos to make themselves look thinner and younger or to have fuller lips and bigger eyes.

One popular feature, known as the Bold Glamour filter, makes users resemble models with high cheekbones and strong jawlines.

Internal documents show TikTok is aware of the harm that beauty filters, like Bold Glamour, can cause young users. (Image created by NPR's Grace Widyatmadja / TikTok)

TikTok is aware of the harm these beauty filters can cause young users, the documents show.

Employees suggested internally the company "provide users with educational resources about image disorders" and create a campaign "to raise awareness on issues with low self esteem (caused by the excessive filter use and other issues)."

They also suggested adding a banner or video to the filters that included "an awareness statement about filters and the importance of positive body image/mental health."

This comes as the documents showcase another hidden facet of TikTok's algorithm: the app prioritizes beautiful people.

One internal report that analyzed TikTok's main video feed found that "a high volume of ... not attractive subjects" was filling everyone's app. In response, Kentucky investigators found, TikTok retooled its algorithm to amplify users the company viewed as beautiful.

"By changing the TikTok algorithm to show fewer 'not attractive subjects' in the For You feed, [TikTok] took active steps to promote a narrow beauty norm even though it could negatively impact their Young Users," the Kentucky authorities wrote.

TikTok exec: algorithm could deprive kids of opportunities like 'looking at someone in the eyes'

Publicly, TikTok has stated that one of its "most important commitments is supporting the safety and well-being of teens."

Yet internal documents paint a very different picture, citing statements from top company executives who appear well aware of the harmful effects of the app yet have not taken significant steps to address them.

One unnamed TikTok executive put it in stark terms: kids watch TikTok because of the power of the app's algorithm, "but I think we need to be cognizant of what it might mean for other opportunities. And when I say other opportunities, I literally mean sleep, and eating, and moving around the room, and looking at someone in the eyes."

TikTok's internal estimate: 95% of smartphone users under 17 use TikTok

TikTok views itself as being in an "arms race for attention," according to a 2021 internal presentation.

Teenagers have been key to the app's early growth in the U.S., but another presentation shown to top company officials revealed that an estimated 95% of smartphone users under 17 use TikTok at least once a month. This led a company staffer to state that the app had "hit a ceiling among young users."

TikTok's own research concluded that kids were the most susceptible to being sucked into the app's infinitely flowing feed of videos. "As expected, across most engagement metrics, the younger the user, the better the performance," according to a 2019 TikTok document.

In response to growing national concern that excessive social media use can increase the risk of depression, anxiety and body-image issues among kids, TikTok has introduced time-management tools. These include notifications informing teens about how long they are spending on the app, parental oversight features and the ability to make the app inaccessible for some downtime.

At the same time, however, TikTok knew how unlikely it was these tools would be effective, according to materials obtained by Kentucky investigators.

"Minors do not have executive function to control their screen time, while young adults do," read a TikTok internal document.

TikTok pushes users into filter bubbles like 'painhub' and 'sadnotes'

TikTok is well aware of "filter bubbles." Internal documents show the company has defined them as what happens when a user "encounters only information and opinions that conform to and reinforce their own beliefs, caused by algorithms that personalize an individual's online experience."

The company knows the dangers of filter bubbles. During one internal safety presentation in 2020, employees warned that the app "can serve potentially harmful content expeditiously." TikTok conducted internal experiments with test accounts to see how quickly they descended into negative filter bubbles.

"After following several 'painhub' and 'sadnotes' accounts, it took me 20 mins to drop into 'negative' filter bubble," one employee wrote. "The intensive density of negative content makes me lower down mood and increase my sadness feelings though I am in a high spirit in my recent life."

Another employee said, "there are a lot of videos mentioning suicide," including one asking, "If you could kill yourself without hurting anybody would you?"

In another document, TikTok's research found that content promoting eating disorders, often called "thinspiration," is associated with issues such as body dissatisfaction, disordered eating, low self-esteem and depression.

Despite these warnings, TikTok's algorithm still puts users into filter bubbles. One internal document states that users are "placed into 'filter bubbles' after 30 minutes of use in one sitting." The company wrote that having more human moderators label content is possible, but "requires large human efforts."

TikTok's content moderation missing self-harm, eating disorder content

TikTok has several layers of content moderation to weed out videos that violate its Community Guidelines. Internal documents show that the first set of eyes isn't always a person from the company's Trust and Safety Team.

The first round typically uses artificial intelligence to flag pornographic, violent or political content. The following rounds use human moderators, but only if the video reaches a certain number of views, according to the documents. These additional rounds often fail to account for certain types of content or age-specific rules.

The unredacted filing shows that, according to TikTok's own studies, some suicide and self-harm content escaped those first rounds of moderation. One study points to self-harm videos that had more than 75,000 views before TikTok identified and removed them.

TikTok also has scattershot policies on content that includes disordered eating, drug use, dangerous driving, gore and violence. While TikTok's Community Guidelines prohibit much of this content, internal policy documents say the company "allows" it. Often, the content is findable on TikTok and just not "recommended," meaning it doesn't show up in users' For You feeds or is given a lower priority in the algorithm.

The company has talking points around its content moderation work. One example highlighted in the documents details a child sent to the emergency room after attempting a dangerous TikTok challenge. When dealing with the negative fallout from the press, TikTok told employees to use an internal list of talking points that said, "In line with our Community Guidelines, we do not allow content that depicts, promotes, normalizes, or glorifies [dangerous] behavior, including dangerous challenges."

TikTok acknowledges internally that it has substantial "leakage" rates of violating content that's not removed. Those leakage rates include: 35.71% of "Normalization of Pedophilia"; 33.33% of "Minor Sexual Solicitation"; 39.13% of "Minor Physical Abuse"; 30.36% of "leading minors off platform"; 50% of "Glorification of Minor Sexual Assault"; and 100% of "Fetishizing Minors."

TikTok slow to remove users under 13, despite company policy

Kids under 13 cannot open a standard TikTok account, but there is a "TikTok for Younger Users" service that the company says includes strict content guardrails.

This is a vulnerable group of users: federal law dictates that social media sites like TikTok cannot collect data on children under 13 unless parents are notified about the personal information collected, and even then, apps must first obtain verifiable consent from a parent.

In August, the Department of Justice sued TikTok for violating the federal law protecting the data of kids under 13, alleging that the app "knowingly and repeatedly violated kids' privacy."

In the internal documents, however, company officials instructed TikTok moderators to use caution before removing accounts of users suspected to be under 13.

An internal document about "younger users/U13" says TikTok instructs its moderators not to take action on reports of underage users unless their account identifies them as under 13.

The previously redacted portions of the suit suggest the company is aware these young users have accounts (through complaints from parents and teachers) but does little to remove them.

TikTok in crisis mode after report on TikTok Live being 'strip club filled with 15-year-olds'

After a report on Forbes about underage kids stripping on TikTok's live feature, the company launched its own investigation.

That's when TikTok officials realized there was "a high" number of underage streamers receiving digital currency on the app, in the form of a "gift" or "coin," in exchange for stripping: real money converted into a digital currency, often in the form of a plush toy or a flower.

TikTok discovered "a significant" number of adults direct messaging underage TikTokkers about stripping live on the platform.

As part of this internal probe, TikTok officials found that in just one month, 1 million "gifts" were sent to kids engaged in "transactional" behavior.

In an understated assessment, one TikTok official concluded: "[O]ne of our key discoveries during this project that has turned into a major challenge with Live business is that the content that gets the highest engagement may not be the content we want on our platform."

Copyright 2024 NPR

Bobby Allyn is a business reporter at NPR based in San Francisco. He covers technology and how Silicon Valley's largest companies are transforming how we live and reshaping society.
Sylvia Goodman
Dara Kerr
Dara Kerr is a tech reporter for NPR. She examines the choices tech companies make and the influence they wield over our lives and society.