Blumenthal Chairs Hearing With Head of Instagram on Social Media's Dangers to Kids & Legislative Solutions

“The resounding bipartisan message from this committee is: legislation is coming.”

[WASHINGTON, D.C.] – U.S. Senator Richard Blumenthal (D-CT), Chair of the Senate Commerce, Science, and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security, convened a hearing on Wednesday titled “Protecting Kids Online: Instagram and Reforms for Young Users.” While hearing testimony from Head of Instagram Adam Mosseri, Blumenthal highlighted the strong bipartisan momentum building around concrete proposals to hold Big Tech accountable, and pressed the company on its support for those proposals and its efforts to protect children and teens from dangers online. The hearing was the fifth in a series of bipartisan hearings spearheaded by Blumenthal and the Subcommittee’s Ranking Member Marsha Blackburn (R-TN) to inform legislation and prompt action by social media companies.

“The time for self-policing and self-regulation is over,” said Blumenthal in his opening remarks. “Self-policing depends on trust. The trust is gone. What we need now is independent researchers, objective overseers, not chosen by Big Tech, but from outside, and strong, vigorous enforcement of standards that stop the destructive, toxic content that now, too often is driven at kids and takes them down rabbit holes to dark places.”

Blumenthal emphasized the strong bipartisan momentum building on legislative reforms to protect children online, stating: “The resounding bipartisan message from this committee is: legislation is coming.” Blumenthal outlined bipartisan proposals including updating children’s privacy laws, providing parents and children with effective tools, Section 230 reform, and improved enforcement. “I hope that we will begin the effort of working together,” Blumenthal said to Mosseri, “but one way or the other, this committee will move forward.”

In response to Tuesday’s announcement by Instagram that it will be introducing new safety tools for young users and parents, Blumenthal said, “These simple time management and parental oversight tools should have—they could have—been announced years ago. They weren’t, and in fact, these changes fall way short of what we need in my view.” In a later exchange with Mosseri, Blumenthal added: “If you said to a teen who was on Instagram fixated on eating disorders, ‘why don't you try snorkeling in the Bahamas,’ that nudge just won’t work. And Instagram has a real asymmetric power here. It drives teens in a certain direction and then makes it very difficult for the teen once in a dark place to find light again and to get out of it.”

Blumenthal discussed his office’s repeated research into Instagram’s recommendations for teens, saying: “On Monday, we created another fake account for a teenager and followed a few accounts promoting eating disorders, and again within an hour, all of our recommendations promoted pro-anorexia and eating disorder content. Two months ago, the Global Head of Safety for Facebook was put on notice by this subcommittee. Nothing has changed. It’s all still happening…We all know that if Facebook saw a significant threat to its growth or ad revenue it would not take two months to take action. So why does it take months for Facebook to act when our kids face danger?”

In response to Mosseri asserting, “I don't believe the research suggests our products are addictive,” Blumenthal replied: “We can debate the meaning of the word addictive, but the fact is that teens who go to the platform find it difficult, maybe sometimes impossible, to stop, and part of the reason is more content is driven to them to keep them on the site, to aggravate the emotions that are so seductive and ultimately addictive.”

In his closing remarks, Blumenthal pushed Mosseri and Instagram for firm commitments on legislative proposals and actions to protect its young, vulnerable users, stating: “Well, you’ve said repeatedly that you’re in favor of one goal or another ‘directionally.’ And I find that term really incomplete when it isn’t accompanied by specific commitments, a yes or a no...The kinds of baby steps that you suggested so far, very respectfully, are underwhelming. A nudge, a break…I think you will sense on this committee a pretty strong determination to do something well-beyond what you’ve indicated you have in mind, and that's the reason that I think self-policing based on trust is no longer a viable solution.”

Video of Blumenthal’s opening remarks can be found here and the transcript is copied below.

U.S. Senator Richard Blumenthal (D-CT): We really appreciate your being here, Mr. Mosseri; your response to our invitation is very welcome. I want to thank you and your team for your cooperation, and I want to thank the Ranking Member, Senator Blackburn, for being such a close partner in this work, as well as our Chairwoman, Maria Cantwell, and Ranking Member Roger Wicker for their support, and all the members of the committee for being so engaged on this topic.

As a note to start, I understand Mr. Mosseri has a hard stop at 5, so I’m going to be strict on the 5-minute time limit. I know everybody thinks of me as a very nice guy, but I’m going to be ruthless, at least attempting to be ruthless, as best as any senator can be with his colleagues.

In this series of hearings, we've heard some pretty powerful and compelling evidence about the dangers of Big Tech to children's health, well-being and futures. Our nation is in the midst of a teen mental health crisis. Social media didn't create it, but it certainly fanned the flames and fueled it. And if anybody has doubts about the potential harmful effects of social media, the Surgeon General yesterday issued a powerful report about the impact of social media, as well as video gaming and other technologies, on teen mental health. That's part of the reason we're here.

The hearings have shown that social media, and in particular Big Tech, actually fans those flames with addictive products and sophisticated algorithms that can exploit and profit from children's insecurities and anxieties, and our mission now is to do something about it. We're here to do more than shake fists. We really are seeking solutions. And we welcome the voices, and the vision, of Big Tech itself in that effort.

I believe that the time for self-policing and self-regulation is over. Some of the Big Tech companies have said “trust us.” That seems to be what Instagram is saying in your testimony, but self-policing depends on trust. The trust is gone. What we need now is independent researchers, objective overseers, not chosen by Big Tech, but from outside, and strong, vigorous enforcement of standards that stop the destructive, toxic content that now, too often is driven at kids and takes them down rabbit holes to dark places.

The day before this hearing, Instagram announced a set of proposals. These simple time management and parental oversight tools should have – they could have – been announced years ago. They weren't, and in fact, these changes fall way short of what we need in my view. Many of them are still in testing, months away. The roll-outs will be done at some point in the future. We don't know exactly when, and unfortunately, these announced changes leave parents and kids with no transparency into the black box algorithms, the 600-pound gorillas in those black boxes that drive that destructive and addictive content at children and teens.

No effective warning or notice to parents when their children spiral into eating disorders, bullying or self-harm; nothing more than the bare minimum controls for parents; and of course no real accountability to assure parents and kids that these safeguards will work.

I'm troubled by the lack of answers on Instagram Kids. Once again, this pause looks more like a public relations tactic brought on by our hearings, just as these announced changes seem to be brought on by these proceedings, announced just hours before your testimony. And we need real, serious review of those changes.

The magnitude of these problems requires bold and broad solutions and accountability, which has been lacking so far. Facebook's own researchers have been warning management, including yourself, Mr. Mosseri, for years about Instagram's harmful impacts on teens' mental health and well-being. And the whistleblower who sat exactly where you are told us about those documents, about the research, the studies, which showed that Facebook knew. It did the research, it had the studies, but it continued to profit from the destructive content, because it meant more eyeballs, more advertising, more dollars.

Given those warnings, it seems inexcusable that Facebook waited a decade to begin, and only to begin, figuring out that Instagram needed parental controls. In the past two months, this subcommittee has heard horrifying stories from countless parents whose lives, and their children’s lives, have been changed forever.

One father from Connecticut wrote to me about his daughter who developed severe anxiety in high school because of constant pressure from Instagram. That pressure became so intense, following her home from school, following her everywhere she went, all the way into her bedroom in the evening, where she attempted suicide. Fortunately, her parents stepped in and sought help and found a recovery program, but the experience continues to haunt her and her family. Facebook’s researchers call this fall into that kind of dark rabbit hole a “perfect storm.” That's the quote: a perfect storm, created by its own algorithms that exacerbate downward spirals harmful to teens.

Again, Facebook knows about the harm. It's done the research, the studies, the surveys repeatedly. It knows the destructive consequences of the algorithms and designs. It knows teens struggle with addiction and depression on Instagram, but that data has been hidden, like the algorithms themselves.

Just yesterday, the Surgeon General's report provided powerful documentation on how social media can fan those flames and fuel the fires of the mental health crisis that we face among teens. And it signals that something is terribly wrong.

What really stuns me is the lack of action, in fact, just within the last few months. Two months ago, the subcommittee heard testimony from Facebook's Global Head of Safety, Ms. Antigone Davis. At that time, I showed her the pro-eating disorder [content] rampant on Instagram. I demonstrated, through an experiment, how its algorithms will flood a teen with triggering and toxic messages in just hours after we created an account. This was glorification of being dangerously underweight, tips on skipping meals, images we could not, in good conscience, show in this room.

It's been two months, so we've repeated our experiment. On Monday, we created another fake account for a teenager and followed a few accounts promoting eating disorders, and again within an hour, all of our recommendations promoted pro-anorexia and eating disorder content.

Two months ago, the Global Head of Safety for Facebook was put on notice by this subcommittee. Nothing has changed. It’s all still happening. And in the meantime, more lives have been broken, real lives with real families and futures, and you hear from them yourself. We all know that if Facebook saw a significant threat to its growth or ad revenue it would not take two months to take action. So why does it take months for Facebook to act when our kids face danger?

Time is not on our side. No wonder parents are worried. In fact, parents are furious. They don’t trust Instagram, Google, TikTok, and all of their Big Tech peers. By the way, this is not an issue limited to Instagram or Facebook. Parents are asking: what is Congress doing to protect our kids?

And the resounding bipartisan message from this committee is: legislation is coming. We can’t rely on trust anymore, we can’t rely on self-policing. It’s what parents and our children are demanding. Senator Blackburn and I are listening to them, as are other members of this committee; we are working together. Your proposal for an industry body asks parents once again: “trust us, we’ll do it ourselves.” But self-regulation relies on that trust that has been squandered.

We need to make sure that the responsibility is on Big Tech to put a safe product on the market. You can't be allowed to conceal when products are harming kids, so the first imperative is transparency. We need real transparency into these 800-pound gorilla black box algorithms and addictive designs, and disclosure has to include independent, qualified researchers who will then tell the story to the public.

We need to update our children’s privacy laws. Congress should pass the bipartisan Children and Teens’ Online Privacy Protection Act, authored by Senator Markey, who is here today. I'm proud to be working with him on updating and expanding it.

Parents and children need more power and more effective tools to protect themselves on the platform. And that’s why Senator Blackburn and I are working on a framework and we’ve made good progress to enable that protection.

There really should be a duty of care. The United Kingdom has imposed it. It's part of a law there. Why not here? That ought to be part of the framework of legislation that we're considering.

Section 230 reform: you make reference to it in your testimony. The days of absolute, broad, unique immunity for Big Tech are over.

And finally, enforcement. Whether by state authorities or federal authorities, law enforcement has to be rigorous and strong. I hope that we will begin the effort of working together, but one way or the other, this committee will move forward. Again, I thank you for being here. I thank all of my colleagues for attending.

-30-