A new bill proposed by four lawmakers aims to strip Section 230 protections from algorithm-based recommendations like Facebook's News Feed and hold social media companies accountable for what is fed to their users.
Section 230 of the Communications Decency Act currently prevents people or entities from suing web services such as social media platforms over the content posted by their users. In short, it shields companies from lawsuits based on content uploaded to their sites, which is often difficult to moderate, especially for large platforms like Facebook, Reddit, or Instagram.
Section 230 has been the focus of criticism over the last few years, especially since former President Trump brought it into the limelight. He claimed that social media platforms like Twitter were unfairly biased against conservative speech, specifically with regard to how Twitter was adding fact-check warnings to his false statements.
The newly proposed bill would keep the protections of Section 230 mostly intact, but would specifically exempt algorithm-based feeds from them.
Four Democratic lawmakers — Reps. Anna Eshoo (D-CA), Frank Pallone Jr. (D-NJ), Mike Doyle (D-PA), and Jan Schakowsky (D-IL) — have introduced the “Justice Against Malicious Algorithms Act,” which would amend Section 230’s protections to exclude personalized content recommendations on platforms, including algorithm-based feeds like Facebook’s ubiquitous News Feed, The Verge reports.
The bill follows a bombshell report from the Wall Street Journal revealing that Facebook knew its platforms were toxic for teen girls, as well as testimony from Facebook whistleblower Frances Haugen. Haugen provided a trove of internal Facebook documents as part of her testimony, one of which showed that Facebook was aware its algorithm was capable of feeding new users conspiracy theories in as little as one week.
If passed, the new exception would apply to any service that knowingly or recklessly uses a “personalized algorithm” to recommend third-party content, which in Facebook’s case would include posts, groups, accounts, and other user-provided information.
Part of Haugen’s testimony alleged that Facebook has complete control over its algorithms and should be held accountable for the content it serves users, even if the social network wasn’t the one that created that content.
“They have a hundred percent control over their algorithms, and Facebook should not get a free pass on choices it makes to prioritize growth and virality and reactiveness over public safety,” Haugen said in her testimony.
“Designing personalized algorithms that promote extremism, disinformation and harmful content is a conscious choice, and platforms should have to answer for it,” Representative Pallone said. “While it may be true that some bad actors will shout fire in a crowded theater, by promoting harmful content, your platforms are handing them a megaphone to be heard in every theater across the country and the world.
“The time for self-regulation is over. It is time we legislate to hold you accountable.”
Image credits: Header photo licensed via Depositphotos.