In the closing minutes of a congressional hearing Wednesday at which lawmakers berated tech executives for failing to protect children online, Sen. Richard J. Durbin, D-Illinois, urged his colleagues to act to protect younger internet users.
“No excuse,” he said.
Lawmakers have long made similar statements about holding tech companies accountable — and they don’t have much to show for it. Republicans and Democrats have said at various points that it’s time to regulate the tech giants on issues like privacy and antitrust. But for years, that’s where it ended: no new federal regulations for the companies to follow.
The question is whether this time will be different. And already, there are signs that the issue of children’s online safety may gain more traction legislatively.
At least six bills waiting in the wings in Congress target the spread of child sexual abuse material online and would require platforms like Instagram, Snapchat and TikTok to do more to protect minors. The efforts are supported by emotional accounts of children who were victimized online and died by suicide.
The only major federal internet law passed in recent years, FOSTA-SESTA (the Fight Online Sex Trafficking Act and Stop Enabling Sex Traffickers Act), which made it easier for victims of sex trafficking to sue websites and online platforms, was passed in 2018, also following shocking testimony from a victim's mother.
Child safety is a personal and visceral issue that is easier to sell politically than some other issues, internet safety experts and lawmakers said. At Wednesday’s hearing, faced with stories of children who had died after being sexually exploited, Meta’s Mark Zuckerberg said he was sorry the families had suffered.
“Much like the tobacco industry, it took a series of disturbing hearings on tobacco — but Congress finally acted,” said Jim Steyer, president of Common Sense Media, a nonprofit children’s advocacy group. “The dam has finally broken.”
Any legislative progress on children’s online safety would be a reversal of the gridlock that has engulfed Congress in recent years on other technology issues. Time and time again, proposals for rules to govern tech giants like Google and Meta have failed to become law.
In 2018, for example, Congress grilled Mr. Zuckerberg after Facebook allowed Cambridge Analytica, a firm that created voter profiles, to harvest user data. Outrage over the incident led to calls for Congress to pass new rules to protect people's online privacy. But while California and other states have since passed online privacy laws, Congress has not.
Lawmakers have also attacked a legal statute, Section 230 of the Communications Decency Act, which protects online platforms like Instagram and TikTok from many lawsuits over content posted by their users. Congress has not substantially changed the statute, other than making it more difficult for platforms to use the legal shield when they are accused of facilitating sex trafficking.
And after companies like Amazon and Apple were accused of being monopolies and abusing their power over smaller rivals, lawmakers proposed a bill that would have made some of those business practices illegal. The push to pass the legislation failed in 2022.
Senators Amy Klobuchar, D-Minnesota, and Josh Hawley, R-Missouri, and other lawmakers have blamed the power of tech lobbies for killing the proposed rules. Others said technology regulations have not been a priority for congressional leaders, who have focused on spending bills and measures aimed at subsidizing U.S. companies that make critical computer chips and harness renewable energy.
The Senate Judiciary Committee, which hosted Wednesday's hearing, promoted five child safety bills aimed at tech platforms ahead of the hearing. The committee voted to advance the bills last year, but none became law.
Among the proposals were the STOP CSAM (Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment) Act, which would give victims new ways to report child sexual abuse material to online companies, and the REPORT (Revising Existing Procedures On Reporting via Technology) Act, which would expand the types of potential crimes that online platforms are required to report to the National Center for Missing and Exploited Children.
Other proposals would make it a crime to distribute an intimate image of someone without that person's consent and would push law enforcement agencies to coordinate investigations of crimes against children.
A separate proposal approved last year by the Senate Commerce Committee, the Kids Online Safety Act, would create a legal duty for certain online platforms to protect children. Some of the legislative proposals have been criticized by digital rights groups such as the Electronic Frontier Foundation, which say they could encourage platforms to remove legal content while companies try to comply with the laws.
Ms. Klobuchar, who questioned the tech executives at Wednesday's hearing, said in an interview that the session "felt like a breakthrough." She added, "As someone who has taken on these companies for years, this is the first time I've felt hope for movement."
Others were skeptical. For any of the proposals to pass, they will need support from congressional leaders. And bills that advanced out of committee last year will have to be reintroduced and go through the process again.
Hany Farid, a professor at the University of California, Berkeley, who helped create technology that platforms use to detect child sexual abuse material, said he had watched Congress hold hearing after hearing about protecting children online.
“That’s one thing we should be able to agree on: that we have a responsibility to protect children,” he said. “If we can’t get this right, what hope do we have of anything else?”