Lawmakers want to remove legal protection from Facebook News Feed
Democratic lawmakers want the social network to face legal liability for recommending harmful content to users. Reps. Anna Eshoo (D-CA), Frank Pallone Jr. (D-NJ), Mike Doyle (D-PA), and Jan Schakowsky (D-IL) introduced the "Justice Against Malicious Algorithms Act," which would amend Section 230 to exclude protection for "personalized recommendations" of material that contributed to physical or severe emotional injury.
The bill follows testimony from Facebook whistleblower Frances Haugen before Congress last week. Haugen, a former employee who leaked extensive internal Facebook research, encouraged lawmakers to crack down on algorithms that promote, rank, or otherwise order content based on user engagement. The bill applies to web services with more than 5 million monthly visitors and excludes certain categories of services, including infrastructure services such as web hosting and systems that return search results.
For covered platforms, the bill targets Section 230 of the Communications Decency Act, which shields web services from lawsuits over third-party content posted by users. The new exception would let these suits proceed if the services knowingly or negligently used a "personalized algorithm" to recommend the third-party content in question. That could include posts, groups, accounts, and other user-supplied information.
The bill wouldn't necessarily let people sue over the kinds of content Haugen criticized, including hate speech and material promoting anorexia. Much of that content is legal in the United States, so platforms don't need an additional liability shield to host it. (A Pallone statement also lambasted sites for promoting "extremism" and "disinformation," which aren't necessarily illegal either.) The bill also covers only personalized recommendations, defined as those that rely on "information specific to an individual" to sort content algorithmically. Companies could still use broad analytics to recommend generally popular content.
In her testimony, Haugen suggested the goal was to add enough general legal risk that Facebook and similar companies would stop using personalized recommendations altogether. “If we reform [Section] 230 to make Facebook accountable for the consequences of their deliberate ranking decisions, I think they will get rid of engagement-based rankings,” she said.