
A bipartisan pair of senators reintroduced the Kids Online Safety Act on Tuesday with updates aimed at addressing concerns that the bill could inadvertently cause more harm to the young internet users it seeks to protect. But some activists who raised those issues say the changes still don’t go far enough.

The bill aims to make the internet safer for children by holding social media companies accountable for preventing and mitigating harms that may stem from their services. The new version outlines a specific list of harms that platforms must take reasonable steps to mitigate, including preventing the spread of posts promoting suicide, eating disorders and drug use. It would also require those companies to undergo independent annual audits of their risks to minors and to enable the strongest privacy settings by default for children.


Congress and President Joe Biden have made clear that protecting children online is a major priority, and KOSA has become one of the leading bills on the topic. KOSA has amassed more than 25 cosponsors, and the previous version of the bill passed unanimously out of the Senate Commerce Committee last year. The new version has gained support from groups including Common Sense Media, the American Psychological Association, the American Academy of Pediatrics and the Eating Disorders Coalition.

At a virtual news conference Tuesday, Sen. Richard Blumenthal, D-Conn., who introduced the bill alongside Sen. Marsha Blackburn, R-Tenn., said Senate Majority Leader Chuck Schumer, D-N.Y., is “one hundred percent” behind the bill and efforts to protect children online.

While acknowledging that the timing is ultimately up to Senate leadership, Blumenthal said, “I hope and expect that we will have a vote in this session.”

A spokesperson for Schumer did not immediately respond to a request for comment.

Late last year, dozens of civil society organizations warned Congress against passing the bill, arguing that it could further endanger young internet users in several ways. For example, the groups worried that the bill could pressure online platforms to “over-moderate, including from state attorneys general seeking to make political points about what kind of information is appropriate for young people.”

Blumenthal and Blackburn made several changes to the text in response to the criticism from outside groups. They sought to tailor the legislation more narrowly, limiting social media platforms’ duty of care to a specific set of potential mental health harms grounded in evidence-based medical information.

They also added protections for support services such as the National Suicide Hotline, substance abuse groups and LGBTQ+ youth centers to ensure they aren’t inadvertently hampered by the bill’s requirements. Blumenthal’s office said it did not believe the duty of care would have applied to those kinds of groups, but chose to make that explicit regardless.

But the changes were not enough to satisfy some civil society and industry groups.

Evan Greer, director of the digital rights nonprofit Fight for the Future, said Blumenthal’s office never met with the group or shared the updated text ahead of the reintroduction, despite multiple requests. Greer acknowledged that the cosponsors’ offices met with other groups, but said in an emailed statement that “it seems they intentionally excluded groups that have specific expertise in content moderation and algorithmic recommendation.”

“I have read it and can say unequivocally that the changes made do not address the concerns we raised in our letter,” Greer wrote. “The bill still contains a duty of care that covers content recommendation, and it still allows state attorneys general to effectively dictate what content platforms can recommend to minors.”

“The ACLU continues to strongly oppose KOSA because it would ironically expose the very children it seeks to protect to increased harm and increased surveillance,” Cody Venzke, ACLU senior policy counsel, said in a statement. The group joined the letter warning against the bill’s passage last year.

“KOSA’s core approach still threatens the privacy, security and free expression of both minors and adults by deputizing platforms of all stripes to police their users and censor their content under the guise of a ‘duty of care,’” Venzke added. “To accomplish this, the bill would legitimize platforms’ already-prevalent data collection to identify which users are minors, when they should be seeking to minimize those data practices. Furthermore, parental guidance in minors’ online lives is critical, but KOSA would mandate surveillance tools without regard to minors’ home situations or safety. KOSA would be a step backward in making the internet a safer place for children and minors.”

At the press conference, in response to a question about the criticism from Fight for the Future, Blumenthal said the duty of care had been “purposely narrowed” to target specific harms.

“I think we’ve met that kind of criticism very directly and effectively,” he said. “Obviously, our door remains open. We’re willing to listen and talk about other kinds of proposals that are being made. And we have talked to many of the groups that were very critical, and a number of them have dropped their opposition, as I think you’ll hear today. So I think our bill has been clarified and improved in a way that meets some of the criticism. We are not going to solve all of the world’s problems with a single bill, but we are making a very important and measurable start.”

The bill has also faced criticism from several groups that receive funding from the tech industry.

NetChoice, which sued the state of California over its age-appropriate design code law and counts Google, Meta and TikTok among its members, said in a press release that despite lawmakers’ attempts to respond to the concerns, “unfortunately, how this law would work in practice still requires an age verification mechanism and data collection on Americans of all ages.”

“Working out how young people should use technology is a difficult question and one that has always been best answered by parents,” Carl Szabo, NetChoice vice president and general counsel, said in a statement. “KOSA instead creates an oversight board of Washington, D.C. insiders who will replace parents in deciding what’s best for children.”

“KOSA 2.0 raises more questions than it answers,” Ari Cohn, free speech counsel at TechFreedom, a think tank that has received funding from Google, said in a statement. “What constitutes reason to know that a user is under 17 is completely unclear and not specified in the bill. In the face of that uncertainty, platforms will almost certainly have to age-verify all users to avoid liability, or worse, avoid obtaining any knowledge whatsoever and leave minors without any protections at all.”

Matt Schruers, president of the Computer and Communications Industry Association, whose members include Amazon, Google, Meta and Twitter, said, “Protecting young people online is a broadly shared goal. But it would be counterproductive to the goals of bills like this to impose compliance obligations that undermine teens’ privacy and security. Governments should avoid compliance requirements that would compel digital services to collect more personal information about their users, such as geolocation and government-issued identification, particularly when responsible companies are working to collect and store less data on customers.”


WATCH: Sen. Blumenthal accuses Facebook of adopting Big Tobacco’s playbook
