Hearings filled with heart-wrenching stories of children who died by suicide after their interactions with artificial intelligence chatbots have made it easy for those on both sides of the aisle to agree that something needs to be done to regulate the new technology.

But as one bill has been voted out of committee and another heads toward a markup, observers say bipartisan support in the Senate for curtailing kids’ use of chatbots masks as-yet unanswered questions. Those center on which bots need regulating and how far legislation can or should go before it crosses boundaries of free speech and privacy.

Late last month, the Senate Judiciary Committee voted unanimously to advance a bill sponsored by Sen. Josh Hawley, R-Mo., that would ban minors from using companion chatbots and ban all bots from engaging in sexual conversations with minors or encouraging them to commit suicide.

Senate Commerce Chair Ted Cruz, R-Texas, voted for the bill, despite having introduced his own bipartisan chatbot regulation bill days earlier, which would require AI providers to give parents tools to control their child’s use. Cruz has said he will hold a markup of his measure in the next month, along with other children’s internet safety legislation.

Earlier this week, Cruz offered his support for a measure sponsored by Sen. Marsha Blackburn, R-Tenn., known as the Kids Online Safety Act, that would put a “duty of care” on social media providers to prevent certain harms to children.

A midterm issue?

Both bills come as voters are looking with a skeptical eye at the rise of AI. An April poll from Politico found that 43 percent of adults think the risks of AI outweigh the benefits, as opposed to 33 percent who think the benefits outweigh the risks. Twenty-five percent said they didn’t know.
Andrew Zack, policy director for the nonprofit Family Online Safety Institute, said high-profile safety incidents like the ones parents have testified to Congress about have combined in the public consciousness with concerns about data centers and the energy grid.

“It’s definitely on people’s minds, and … depending on where you live, some states have already passed laws on this. So people will have already seen that,” Zack said.

According to the Future of Privacy Forum, three states have enacted laws related to kids’ use of chatbots, and six others are considering bills this year.

Maddie Daly, assistant director of federal affairs at the privacy-focused Electronic Frontier Foundation, said the midterms are part of the “political appetite” for chatbot bills.

“I think they're coming from a good place. I think that there [are] very well-intentioned … motivations from lawmakers and from parents and from, really, consumers to try and regain some of the control that they feel like they've lost from their online experience,” Daly said.

Prospects

Midterm elections also mean that the congressional calendar for the year is condensed as members go back to their districts to campaign in August and October, leaving less time for remaining questions about the bills to be resolved before the end of the 119th Congress in January.

The two bills take different approaches. Cruz’s would put more power in the hands of parents, while Hawley’s would draw strict lines around what chatbots can discuss with kids.

Hawley’s is further along the legislative track, but observers were quick to point out hesitations among senators, despite the unanimous vote. Sen. Alex Padilla, D-Calif., said prior to the vote that he hoped senators could continue to “fine-tune” the bill.

“I know my colleagues have done careful work on this bill,” Padilla said.
“I just want to register that there's still some questions, concerns that we have about potential privacy and security risks with the age verification component.”

Hawley’s bill would require providers of AI companions to verify users’ ages, possibly by using government ID. Privacy advocates have expressed concern that this could create valuable targets for hackers.

Cruz said at the committee markup that Hawley’s bill “still needs some revisions” and pointed to the ban on children using AI companions, which he characterized as applying to all chatbots. Critics say that reading is plausible given the bill’s definition of an AI companion as simulating “a sustained interpersonal relationship or emotional interaction,” exhibiting “persistent responses suggesting affection or attachment,” engaging in interactions “involving emotional disclosures from the user” or pretending to be a sentient being or character.

Daly said she’s concerned that the definition would cover more tools than intended.

“That includes these customer service bots, which are programmed to respond with emotions and empathy, et cetera,” Daly said. “If you're talking with an airline chat or an Amazon chat, they're going to respond in some sort of mimicking human emotion way, and under this definition, it would have been swept up anyway.”

Cruz’s bill does not distinguish between companion AI and other chatbots. Instead, it includes exceptions for uses the bill wouldn’t apply to, including customer service, internal research and educational services.

Tori Hirsch, legal counsel for the National Center on Sexual Exploitation, said those exceptions are helpful. “But at the same time … that’s the challenge with writing legislation. It’s like, when you start listing, oh, this is exempted and this is exempted, and this is exempted … it starts to be like, well this wasn’t specifically listed so it should be included.”

Both bills have bipartisan support.
Hawley’s has 19 co-sponsors, including 13 Democrats and six Republicans, while Cruz’s has three co-sponsors — two Democrats and one Republican.

Neither bill has industry support, which senators noted at the Judiciary Committee markup of Hawley’s bill. Ranking member Richard J. Durbin, D-Ill., commended the committee’s work on the chatbot bill and others, but noted that “there has been a consistent pattern of Big Tech stopping us cold in our tracks on the floor.”

Industry group NetChoice, whose members include OpenAI, Google and Meta, objects to bills that require age verification or assurance on the grounds that they violate users' rights to access speech.

Zack, of the Family Online Safety Institute, also noted that the Senate and House are not aligned in their approaches to children’s safety on the internet. The House Energy and Commerce Committee in March approved a kids’ safety bill that combined 12 different measures. It would set new requirements for parental controls and require online platforms to put policies in place to address certain harms to kids online.

Zack noted that so far this Congress, success on online safety came in the form of a law, known as the Take It Down Act, requiring platforms to remove nonconsensual pornography, including images created with AI, which he called “targeted and narrow and specific.” The Federal Trade Commission, which has enforcement authority over the law's requirements for online platforms, has notified companies that they must be in full compliance by May 19 and that the commission is ready to begin enforcement.

“When you start packaging these things all together … you might run into some First Amendment, teens-accessing-free-speech problems,” Zack said.

He pointed to the Senate’s passage in March of a bill that would amend current law to boost protections for minors regarding the collection and use of their data by online platforms and extend such protections to children and teens under the age of 17.
Zack called that bill, known as COPPA 2.0, more likely to pass because of its narrow protections.

Daly said that privacy protections, either specific to children or for all users online, could actually help answer some of the concerns the chatbot bills are meant to address. She said she’s seen “very understandable frustration at these companies for not looking out for their users’ best interests.”

“We really don’t have any sort of comprehensive federal privacy legislation. And that’s really the harm that people are getting at here, is that these companies can siphon up all of your data and sell it and use it to … target advertising to your kids. We just don’t think that that should be allowed.”

Cruz’s bill includes a prohibition on serving targeted ads to users who are minors; Hawley’s legislation does not.

The House Republican data privacy working group recently released a data privacy framework, but it does not have a Senate companion and is not yet scheduled for a legislative hearing.