Blumenthal Questions Snapchat, TikTok & YouTube

“The companies are in effect grading their own homework, they’re evaluating their own effects on kids when it comes to addiction and harms.”

[WASHINGTON, D.C.] – Today, U.S. Senator Richard Blumenthal (D-CT), Chair of the Senate Commerce, Science, and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security, questioned Snapchat, TikTok, and YouTube representatives during a hearing to examine the impacts of these social media companies on young people.

Blumenthal asked each of the companies if they have conducted research on “whether your apps can have a negative effect on children’s or teens’ mental health or well-being and whether your apps promote addiction-like use,” and pressed them to commit to releasing this research to the Subcommittee. The companies all confirmed they have done research and that they would make it available to the Subcommittee.

Blumenthal questioned the witnesses about whether the companies “provide external independent researchers with access to your algorithms, data sets, and data privacy practices.” Snap Vice President of Global Public Policy Jennifer Stout stated that Snapchat’s algorithms “work very differently,” but agreed that they would provide access to independent researchers to study them. TikTok Vice President and Head of Public Policy for the Americas Michael Beckerman affirmed the company allows external access to its algorithm, stating it has published publicly how its algorithm works and has “opened transparency and accountability centers where we invite outside experts and invite you, Senator, and your staff to come in and see exactly how the algorithm works.” YouTube Vice President of Government Affairs and Public Policy Leslie Miller stated the company “regularly partners with experts,” which Blumenthal emphasized was not the same as allowing independent researchers, stating: “I’m going to cite the difference between your response and Mr. Beckerman’s and Ms. Stout’s which indicates certainly a strong hesitancy, if not resistance to providing access.”

Blumenthal pressed YouTube on its fight against privacy rules, including by its parent company Google, and asked the company to “commit that you’ll support privacy legislation as has been proposed.” Miller evaded answering the question.

Blumenthal praised the United Kingdom’s child safety laws, the Age Appropriate Design Code, pressing the companies on whether they would support similar child safety legislation in the United States, stating: “Whether it’s Facebook, or your companies in various ways, I think you have shown that we can’t trust Big Tech to police itself. And so when you say, ‘We already do it,’ well, you may decide to do it, but there’s no legal obligation that you do it and there’s no way to hold you accountable under current law. That’s what we need to do; that’s why I’m asking you about a law.” In response to Blumenthal’s question, Snap, TikTok, and YouTube expressed support for a U.S.-specific child safety law.

Blumenthal questioned Snapchat about the impact of the platform’s popular filters, which can change a person’s appearance, on teens’ mental health, including “Snapchat dysmorphia,” a term that describes the depression, mental health, and other issues associated with the platform. Blumenthal pressed the company about the type of research it conducts on its filters before they are made available for use, and Snapchat committed to checking on this issue. Blumenthal raised Snapchat’s speed filter, which was available on the platform for eight years before pressure from safety advocates and multiple deaths in car wrecks led the company to discontinue it. Blumenthal shared the story of a Connecticut constituent whose son, Carson Bride, took his own life after being bullied on Snapchat, stating: “How can parents protect kids from the relentless bullying that follows kids home from school—as Ms. Haugen said so movingly—no longer stops at the schoolhouse door. Now, 24/7, comes into their homes.” Blumenthal pressed the company to share what it is doing to stop bullying on its platform.

Blumenthal questioned TikTok about what the company is doing in relation to challenges that take place on its platform, including the Blackout Challenge, saying: “A mother who lost her child to the Choking Challenge shared questions with me, and they’re questions that deserve answers from you and from others who are here today, and I’m just going to ask her question: How can parents be confident that TikTok, Snapchat, and YouTube will not continue to host and push dangerous and deadly challenges to our kids?” In his response, TikTok’s Beckerman denied the existence of this challenge on the platform, saying: “We have not been able to find any evidence of a Blackout Challenge on TikTok at all,” and emphasizing that the company takes down videos that are illegal or dangerous. Blumenthal pushed back, saying: “We found ‘Pass Out’ videos. We found them. So, I have a lot of trouble crediting your response on that score.”

Blumenthal shared the story of a Connecticut parent who described her 13-year-old daughter’s experience on TikTok, including being inundated with suicide, self-injury, and eating disorder videos, and said his staff created a TikTok account after hearing from parents across the country. “Within a week, TikTok started pushing videos promoting suicidal ideation and self-harm. We didn’t seek out this content, we can’t show these videos in this room, because they were so disturbing and explicit,” said Blumenthal, pressing Beckerman to explain why the company would inundate kids with these kinds of videos.

Blumenthal concluded the hearing by offering the companies an opportunity to describe how they differentiate themselves from Facebook, and in closing said: “I thank each of you for being here today. I hope you have grasped the sense of urgency and impatience that many of us feel. The time for platitudes and bromides is over. We’ve seen the lobbyists at work, we’ve seen millions of dollars arrayed against us, and I think we’re determined this time to overcome them. I hope that we’ll look back on this hearing as a turning point, along with others that we’re going to continue to hold.”

-30-