[WASHINGTON, D.C.] — Today, U.S. Senator Richard Blumenthal (D-CT) chaired a hearing of the Senate Commerce, Science, and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security on protecting kids online. With children’s screen time surging on popular apps like TikTok, Facebook Messenger, and Instagram, kids are being exposed to safety and privacy issues including data collection, aggressive marketing, and sexual exploitation.
“Big tech and data brokers are spying on children, watching them play, monitoring their lives. No company should be allowed to collect permanent, invasive dossiers on our children. Even more concerning is the cesspool of illicit pitches to kids,” said Blumenthal in his opening remarks, highlighting a recent survey from anti-human trafficking organization Thorn which found that more than a quarter of children nine to 11 years old received sexual solicitations on social media, often by adults. “These children are also assailed by aggressive, sophisticated and undisclosed marketing that preys on their impressionable minds and exploits those dossiers of private information for commercial gain.”
Blumenthal and Ranking Member Marsha Blackburn (R-TN) invited TikTok to testify at the hearing, due to the app’s popularity with young audiences, but the company declined the invitation. Blumenthal called out TikTok for refusing to take part in the hearing, stating: “We asked them to come in and explain how they are safeguarding children. Parents deserve to hear from TikTok and I’m disappointed that TikTok rejected our invitation and refused to discuss these issues with Congress. We’re going to continue to invite them to come. I hope that they will give parents and Congress the explanations we deserve.”
Blumenthal also expressed ongoing concerns about Facebook’s recent announcement that it will launch a version of Instagram for children, pointing out the app’s “notorious record of disinformation, bullying, and deception,” and aggressive Instagram influencer marketing to young fans. “I have no trust, none, that Facebook will keep these young users safe. It has failed far too often,” said Blumenthal. “Facebook should stop this additional intrusive and potentially dangerous interference in kids’ lives and abandon plans for an Instagram Kids.” Earlier today, Blumenthal, along with U.S. Senator Ed Markey (D-MA) and U.S. Representatives Kathy Castor (D-FL) and Lori Trahan (D-MA), issued a joint statement calling on Facebook to abandon its plans to develop this new platform.
To address rampant safety and privacy issues facing children online, Blumenthal expressed support for strong Congressional action, including the bipartisan Children and Teens Online Privacy Protection Act, examining the United Kingdom’s online safety bill, and broader action on Section 230 reform as outlined by the bipartisan EARN IT Act, stating: “[W]e must stop these business practices of negligence and commercial exploitation of children that now exist online. Spying and preying on children is never okay. Parents are powerless to prevent it now. They need the tools to stop it themselves or Congress must intervene to end it.”
The transcript of Blumenthal's opening remarks is available below.
Welcome to this hearing, Protecting Kids Online. I want to thank Ranking Member Blackburn and the witnesses for being here today. Her collaboration has been invaluable, and I’m looking forward to the excellent observations that we’ll hear from you and from the United Kingdom.
As children spend drastically more time online, the tech platforms really have become a perilous minefield for many of them. They are deeply addictive, potentially destructive without sufficient parental supervision or safeguards. I’ve fought for data privacy rules for consumers and accountability for tech companies, focusing on the harms that they cause. Nowhere is that more profound and urgent than for children.
Big tech and data brokers are spying on children, watching them play, monitoring their lives. No company should be allowed to collect permanent, invasive dossiers on our children. Even more concerning is the cesspool of illicit pitches to kids. In a survey last week by the anti-human trafficking organization Thorn, the finding was that more than a quarter of children nine to 11 years old received sexual solicitations on social media, often by adults. A quarter of those children received sexual solicitation.
These children are also assailed by aggressive, sophisticated and undisclosed marketing that preys on their impressionable minds and exploits those dossiers of private information for commercial gain.
Two examples: one TikTok, the other Instagram. According to Thorn, 66 percent of young children nine to 12 years old use the video sharing app TikTok. It can often be informative and entertaining. It has held itself out to parents as safe but it has aggressively recruited young users. Regrettably, TikTok has a troubling track record on children’s privacy. Only two years ago, TikTok paid a then-record $5.7 million fine for disregarding our children’s privacy rules and illegally collecting data about kids. It then shared this sensitive information with third parties and advertisers.
This practice still continues. In March 2020, children’s advocates led by Professor Campbell filed a complaint with the Federal Trade Commission alleging TikTok continues to violate the law. TikTok is also facing investigations in Europe for failing to protect children.
Privacy is not the only issue. Organizations like the National Center on Sexual Exploitation and the Center for Digital Democracy have all raised concerns about predatory sexual content and manipulative advertising on TikTok. The FTC even called attention to TikTok being used by predators to groom nearby children in its case against the company.
Because TikTok is so popular with young audiences, Ranking Member Blackburn and I invited the company to this hearing. We asked them to come in and explain how they are safeguarding children. Parents deserve to hear from TikTok and I’m disappointed that TikTok rejected our invitation and refused to discuss these issues with Congress. We’re going to continue to invite them to come. I hope that they will give parents and Congress the explanations we deserve. They’ve been failing to do it.
I’m also alarmed by Facebook’s recent announcement that it will launch a version of Instagram marketed to children. Instagram has a notorious record of disinformation, bullying, and deception. Prominent Instagram influencers often push alcohol, tobacco, and other dangerous products on young fans, despite warnings from the FTC.
Sexual exploitation is also a problem. According to Thorn, 16 percent of children and teens report that they have been sexually harassed on Instagram, tying it with Snapchat for the most reports of harm. I have no trust, none, that Facebook will keep these young users safe. It has failed far too often. For example, one design flaw in its Messenger Kids app allowed strangers to chat with children. Given that record, I cannot imagine why Facebook would bulldoze ahead into kids’ lives. Senator Markey and I wrote to Facebook asking questions about its plans and we have received woefully inadequate answers. I agree with the 44 state attorneys general and dozens of child welfare specialists saying no. Facebook should abandon its plans for Instagram Kids. Facebook should stop this additional intrusive and potentially dangerous interference in kids’ lives and abandon plans for an Instagram Kids.
As for the way forward, we must stop these business practices of negligence and commercial exploitation of children that now exist online. Spying and preying on children is never okay. Parents are powerless to prevent it now. They need the tools to stop it themselves or Congress must intervene to end it.
I commend my colleagues Senators Markey and Cassidy for introducing the bipartisan Children and Teens Online Privacy Protection Act. I’ve worked on this issue with them and I will be strongly supporting and advocating such measures. But we need to do more. The EARN IT Act that Senator Graham and I introduced last session, approved unanimously by the Senate Judiciary Committee, offers a template for even broader action on Section 230. Eventually, tech platforms must be held accountable; they must bear liability for obvious violations of criminal and perhaps civil law, and the cutbacks in Section 230 immunity, carefully tailored to meet the needs of harm to children, offer a very important path forward.
And I also commend the trailblazing work of Baroness Kidron, who drafted an age-appropriate design code and an online safety bill in the United Kingdom. It too offers a potential model for us.
I’ll now turn to Ranking Member Blackburn.
-30-