TikTok raises age requirement for going live, adds adult-only streams

The TikTok logo displayed on a phone screen and a laptop is seen in this illustration photo taken on Aug. 10, 2022. (Photo by Jakub Porzycki/NurPhoto via Getty Images)

TikTok is increasing the age requirement for hosting a livestream on the platform and is launching adult-only livestreams, the company announced this week. 

Currently, any TikTok user who is at least 16 years old and has at least 1,000 followers can go live on the social media platform, and users 18 and older can send and receive tips, a way to make money on the app. Starting Nov. 23, the minimum age to host a livestream will rise to 18, the company said. 

TikTok will also soon let creators restrict their livestreams to adult-only audiences, a feature rolling out "in the coming weeks." TikTok does not allow nudity, pornography, or sexually explicit content on its platform, so the new adult-only setting for livestreams is intended to keep minors from encountering more mature subject matter. 

"For instance, perhaps a comedy routine is better suited for people over age 18. Or, a host may plan to talk about a difficult life experience and they would feel more comfortable knowing the conversation is limited to adults," TikTok stated in a post. "We want our community to make the most of the opportunities LIVE can bring without compromising on safety."

TikTok, wildly popular with teens and younger children, is owned by the Chinese company ByteDance. It is now the second-most popular domain in the world, according to online performance and security company Cloudflare, exceeded only by Google. 


TikTok said it is also updating its blocked-keywords feature for live creators. Creators can already use a keyword filtering tool to limit comments they consider inappropriate, and "in the coming weeks" the app will roll out an updated version of the feature that will send creators a reminder and suggest new keywords they may want to add to their filter list. 

To generate those suggestions, a tool analyzes the comments a creator most commonly removes from their livestreams, spots similar words in new comments, and then suggests that the creator add those words to their filter list, TikTok said.
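
TikTok has not detailed how the suggestion tool works under the hood. As a minimal, purely illustrative sketch, assuming a simple frequency-based heuristic (the function, sample data, and word-length cutoff below are hypothetical, not TikTok's code), the idea might look like this in Python:

```python
from collections import Counter

def suggest_filter_keywords(removed_comments, existing_filters, top_n=5):
    """Illustrative heuristic: suggest filter keywords from removed comments.

    Counts words that appear in comments the creator already removed,
    skips words already on the filter list, and returns the most
    frequent remaining words as suggestions.
    """
    existing = {word.lower() for word in existing_filters}
    counts = Counter()
    for comment in removed_comments:
        for word in comment.lower().split():
            if word not in existing and len(word) > 2:  # ignore very short words
                counts[word] += 1
    return [word for word, _ in counts.most_common(top_n)]

# Hypothetical example: comments a creator previously removed
removed = [
    "buy followers cheap now",
    "cheap followers here",
    "follow for followers",
]
print(suggest_filter_keywords(removed, existing_filters=["spam"], top_n=2))
# ['followers', 'cheap']
```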

"The foundation of TikTok is built on community trust and safety," TikTok said. "To protect our users and creators and support their well-being, we constantly work to evolve the safeguards we put in place."


Last year, executives from TikTok, YouTube and Snapchat appeared before a Senate Commerce subcommittee on consumer protection seeking to learn what the companies are doing to ensure young users' safety on their platforms. The lawmakers cited harms that can come to vulnerable young people on the sites, ranging from eating disorders to exposure to sexually explicit content and material promoting addictive drugs.

"Sex and drugs are violations of our community standards; they have no place on TikTok," Michael Beckerman, a TikTok vice president and head of public policy for the Americas, told lawmakers last year. 

TikTok has tools in place, such as screen-time management, to help young people and parents moderate how long children spend on the app and what they see, he said.

The company has said it focuses on age-appropriate experiences, noting that some features, such as direct messaging, are not available to younger users. After federal regulators in 2021 ordered TikTok to disclose how its practices affect children and teenagers, the platform tightened its privacy practices for users under 18.

This story was reported from Cincinnati. The Associated Press contributed to this report.