ICYMI: Child Safety Experts Testify in Support of "Duty of Care" to Protect Kids Online

[WASHINGTON, D.C.] – U.S. Senator Richard Blumenthal (D-CT), the author of the Kids Online Safety Act (KOSA) with U.S. Senator Marsha Blackburn (R-TN), asked two child safety experts about their support for a “duty of care” that would require online platforms to prevent and mitigate certain harms that they know their platforms and products are causing to young users.

KOSA includes a “duty of care” that forces online platforms to consider and address the negative impacts of their specific product or service on younger users, including things like their recommendation algorithms and addictive product features. The specific covered harms include suicide, eating disorders, substance use disorders, and sexual exploitation.

“There needs to be a duty of care because ultimately these children are on their platforms,” answered John Pizzuro, the CEO of Raven, an advocacy organization focused on ending child exploitation. “So there's a burden on them to make sure that the children are safe.”

Michelle DeLaune, the CEO of the National Center for Missing and Exploited Children (NCMEC), agreed: “We cannot prosecute our way out of the problem. The reports are coming in, law enforcement rightly is investigating. Really, we need to be looking upstream about preventing these crimes from happening in the first place.”

Blumenthal and Blackburn first introduced KOSA in February 2022 following reporting by the Wall Street Journal and after spearheading a series of five subcommittee hearings with social media companies and advocates on the repeated failures by tech giants to protect kids on their platforms. KOSA will require platforms to enable the strongest privacy settings by default, force platforms to prevent and mitigate specific dangers to minors, provide parents and educators new controls to help protect children, and require independent audits and research into social media companies.

The full text of Blumenthal’s exchange with Pizzuro and DeLaune is available here and copied below:

U.S. Senator Richard Blumenthal (D-CT): Thank you very much, Senator Hawley. We have worked together, and my hope is that we will continue to work together, especially on the issues that are before us today—and most especially, the Kids Online Safety Act, which was approved by the United States Senate in a vote of 91-3. 91-3. Doesn’t happen very often these days in the Senate. It was last session, and unfortunately the House never gave it a vote, which in my view, is a tragedy because it helps protect kids against the toxic content and the algorithms, the black box methodology that social media uses.

And of course, the tech companies who would be held accountable under this law say they are for it and then work behind the scenes against it, and they try to shift blame for this skyrocketing increase in online harms to others, avoiding the blame that they well deserve. But more important than the blame are the reforms that they could well institute: providing tools and safeguards for parents and children, and a duty of care so that they are required to mitigate harm if they know it is happening or have reason to know it is happening. And of course disclosure of the algorithms—the black box drivers of that toxic content.

Mr. Pizzuro, you say that we are not going to arrest our way—I think in your testimony, we cannot arrest our way out of this problem. Let me ask you, perhaps you and Ms. DeLaune, what you think about the duty of care as a means of providing some safeguards here.

John Pizzuro: Well, I think, from my standpoint, there are no safeguards. And I think that's the problem, right? The AI algorithms push all this content to them, and it doesn't matter what the mechanism is. So, there needs to be a duty of care because ultimately these children are on their platforms. So there's a burden on them to make sure that the children are safe.

Michelle DeLaune: Thank you, Senator. It is really incumbent upon the companies to know their customer. You know, at this point, most of the sites, most things online, you just check a box, “You’re over 13,” “You’re over 21,” whatever it may be. They are working and looking at age assurance, knowing who the child is, what age they are at. Going back to the case that we just saw a moment ago, knowing who they are engaging with, whether or not they are over age, under age. There is a shared responsibility, in our view, for the platforms, for the app stores, for the device, in knowing who the child is and building and designing safer experiences for them, recognizing their age.

I will also talk briefly about the necessity. We cannot prosecute our way out of the problem. The reports are coming in, law enforcement rightly is investigating. Really, we need to be looking upstream about preventing these crimes from happening in the first place.

One area where we are seeing an increase, in fact a 1300% increase in one year, is the use of generative AI to create child sexual abuse imagery. There are blockers right now in trying to prevent that. The current legislation allows the National Center to provide specific elements to help in the prevention of these crimes only with electronic service providers.

What the Stop CSAM Act also allows us to do is share this information with other entities who are furthering the protection of children—whether it be an NGO, whether it be a gen AI tech provider. Right now, we are hearing requests and working with Meta, with OpenAI, with Google, who are looking to build classifiers to detect generative AI CSAM. But there is limited information in some cases about what we can share. So, another really important thing that we just keep going back to is we will continue responding. We need to be preventing, they need to know who their customers are, and we need to be able to share good data, helpful data, to help them build solutions to the problems.

Blumenthal: And I agree that they have the technology. They know the customers. The burden should be on them. That is the importance of the duty of care. It is a design feature, it’s not a censorship mechanism. It does not block content. It gives the consumers choices so that they can block it if they wish, or their parents to take action to protect their children with tools that they deserve to have. And the duty of care imposes a measure of responsibility on the tech companies themselves to address the kind of problems that they are seeing and we are seeing children facing.

Thank you all for your testimony today. Thanks, Mr. Chairman.

-30-
