
‘Watchdog moms’ on TikTok are trying to keep minors safe

Seara Adair uses TikTok to educate her followers about the dangers of social media. (Courtesy Seara Adair)

By Samantha Murphy Kelly, CNN Business

Seara Adair, a mother of two young daughters from Atlanta, revealed in a TikTok video last year that she was sexually abused by a family member as a child. In the weeks that followed, she gained tens of thousands of followers, many of whom appeared to be minors.

After that, she started using the popular short-form video app to educate her followers about various digital dangers. She posted about the risks of being approached by strangers online and the problematic content found hidden in the deep corners of TikTok and other platforms.

“Minors started tagging me in posts or sharing things they’d come across that made them scared,” Adair, who now has 350,000 TikTok followers, told CNN Business. “They’d say, ‘Hey, I came across this. What do I do? Can you do something?'”

In one example, a young follower informed Adair of an alleged practice on TikTok in which minors and adults post explicit videos privately to their accounts’ “Only Me” feed, where they can then be accessed by anyone with a shared password. (A TikTok spokesperson told CNN Business it moderates each video uploaded to the site, whether set to private or public, and removes and reports any child sexual abuse material to the National Center for Missing & Exploited Children.)

In another case last month, Adair used TikTok to discuss the spread of alleged “grooming manuals” on the dark web, which teach predators how to spot a vulnerable child, gain their trust and prey on them.

Adair is part of an emerging community of so-called “watchdog moms” calling attention to potential issues for younger users on TikTok and other platforms, and building up an online following in the process. This small group of parents flags issues ranging from the sharing of exploitative videos to more routine concerns about the oversharing of photos and personal information about children online. Adair and her peers work to get problematic content taken down, which she said is often a “very long battle.”

Sarah Adams, another member of this advocate cohort, is known as mom.uncharted on TikTok and has attracted millions of views on the platform for exposing these types of dangers. She started by creating videos on TikTok about child safety and sharing practices, and eventually began highlighting specific examples of the problematic content she’d find, including how seemingly harmless photos shared by parents online can be fetishized by sexual predators.

“My main goal was just to share a different perspective so parents would reflect on their own sharing practices and maybe think, ‘I never thought about it like that. Why am I sharing my child publicly?'” she said.

These influencers never meet in person but chat often to share findings and experiences, and band together to try to take down concerning videos, according to Adams. The digital movement, which comes amid heightened scrutiny of the impact social media platforms have on younger users, also highlights the challenges tech companies face in effectively policing problematic content.

Last year, executives from TikTok, Snap and Meta testified before a Senate subcommittee as lawmakers questioned their apps’ impact on the mental health of teens. At the time, TikTok’s VP and head of public policy, Michael Beckerman, said the company is working to “keep its platform safe and create age-appropriate experiences” but added, “we do know trust must be earned.”

“We’re seeking to earn trust through a higher level of action, transparency and accountability, as well as the humility to learn and improve,” Beckerman testified. He cited a handful of parental controls, moderation improvements and age restrictions, such as no direct messaging for anyone under 16, as ways TikTok protects its young user base. He also encouraged parents to get on the platform to learn more about how it works and how it makes their children feel.

TikTok requires users to be 13 or older and has said it takes additional steps to identify and remove suspected underage account holders, such as training its safety team to look out for and disable accounts it believes may be used by someone younger. Its community guidelines prohibit nudity and content that promotes sexual solicitation on the platform, including attempts to redirect users to sexual content or services.

A company spokesperson told CNN Business TikTok encourages members of its community to report content that may be in violation. But Adair said she believes the onus has fallen too much on herself and other advocates on TikTok to step up in areas where the site is failing.

“It should not be our responsibility as creators on the app to do this, but TikTok is not taking care of their part of it,” Adair said. “As moms, we feel the responsibility to warn other parents so they can take responsibility for their own children and prepare them for what they could face on these apps.”

Hitting ‘report’ for hours

Like other large tech platforms, TikTok relies on users and algorithmic systems to flag potentially violative content, which then gets reviewed by a human moderator to determine whether it should be removed. The company said it has thousands of safety experts around the world, including child safety professionals, moderators and policy experts, with three main hubs in Singapore, Dublin and Mountain View, California, staffed to take down videos that may violate its policies in real time.

According to TikTok’s latest transparency report, the company removed 85 million problematic videos between October 2021 and December 2021, with minor safety violations making up 45% of removals, followed by illegal activities (20%) and adult nudity and sexual activities (20%). The report said the videos removed during that period made up about 1% of all videos uploaded to TikTok.

But as with other platforms, TikTok gets criticized for what it does and does not take action on. Over the years, numerous researchers, journalists and everyday users have flagged posts and accounts to tech companies for seemingly running afoul of various platform policies. The “watchdog moms” fit into this long history, but with a focus primarily on concerning content and activity for social media’s youngest users, including on TikTok, which is widely popular with teenagers.

Carly Yoost, CEO and founder of the Child Rescue Coalition, called the emergence of parent influencers who warn about exploitation methods on social media an “important” movement to ensure both parents and children are educated on digital safety. Gabrielle Usatynski, a family therapist who focuses on social media’s impact on families, also applauded their efforts to bring attention to violative content.

But Michela Menting, a research director at global technology intelligence firm ABI Research, said it is “quite concerning” that parents have to take on this role. “It shows a total lack of regulation engaging the liability of social media providers and digital platforms to police exploitative content,” she said. “It shows a willful ignorance of the very well-known dangers of the cyber-world for minors.”

These factors enable problematic content to endure on the site, Menting added.

Catie Reay, a mother of four known as The TikTok Advocate on TikTok, said she’s helped remove about 100 videos from TikTok in the past two and a half years but has reported “thousands” of videos that were never taken down, from videos she said showed explicit nudity to videos she said supported rape culture. (When CNN provided TikTok with the video about rape culture, TikTok said it was reviewing it.)

A TikTok spokesperson said explicit nudity is a breach of its community guidelines.

Reay, who said she experienced sexual abuse by a family member for over a decade as a child, uses the platform to share warning signs of sexual abuse and offer advice to her 1 million followers on how to help others or themselves. She also works alongside nonprofits that run safe houses for survivors of sexual exploitation and with organizations such as Rescue America and the National Center on Sexual Exploitation to raise awareness of the issue.

“I will spend a whole 7-hour day just hitting ‘report’ [on a post] in between making sandwiches for my kids and doing [other things],” she told CNN Business. “I’ll call on my armies of thousands of followers to help me report, and it will take 24 to 48 hours to be taken down. Some posts were posted two years before I saw them, so I often wonder how many people were sexually exploited in that timeframe.”

A TikTok spokesperson cited a previous blog post by Eric Han, TikTok’s US head of safety, which noted that mass reporting of content or accounts does not lead to an automatic removal or a greater likelihood of removal by its safety team.

Although Reay has had some success helping take down content, she said the vast majority of videos she reports to TikTok come back with no violation found. Adams said she also reports “a very high number” of problematic accounts and posts that come back with no violations.

“The Community Guidelines on TikTok currently feel more like a ‘suggestion’ rather than hard rules that will be enforced by the platform,” she said.

In the same blog post, Han said if a violation is confirmed, the video will be removed and the creator will be notified of the removal and reason, and given the opportunity to appeal it. If no violation is identified, the video will remain posted and others on TikTok will be able to view it.

Watching the watchdogs

While speaking out on these issues has helped broaden their followings on TikTok, it also appears to come with some headaches. Multiple parents said they believe TikTok has taken action against their accounts.

Reay, for example, said she believes some of her videos get less prominent placement on the platform, so they’re seen by fewer people. Other videos are taken down for violating community guidelines, she said.

She added that when she calls attention to the creators who post inappropriate content, her own account is often temporarily suspended for bullying or flagged for minor safety violations. “It really inhibits the work that I do when I get put on suspensions and they’re usually about a week long,” she said.

A TikTok spokesperson said accounts or videos will be removed if they themselves contain violations, such as bullying, and every user has the right to appeal a decision they feel is incorrect.

Adair said her account has been removed from TikTok on various occasions, too. She added that part of the reason she started talking about the dangers of TikTok was to get the company’s attention. Although she said it has partially worked, she believes TikTok still has a lot of work to do to make the site a safer place for young users.

“For a company of this size, there are going to be faults, but when it comes to child exploitation and grooming, these are serious things that [shouldn’t slip through any cracks],” she said.

