A pair of cases going before the US supreme court this week could drastically upend the rules of the internet, putting a powerful, decades-old statute in the crosshairs.
At stake is a question that has been foundational to the rise of big tech: should companies be legally responsible for the content their users post? Thus far they have evaded liability, but some US lawmakers and others want to change that. And new lawsuits are bringing the statute before the supreme court for the first time.
Both cases were brought by family members of terrorist attack victims who say social media firms are responsible for stoking violence with their algorithms. The first case, Gonzalez v Google, is expected to be heard on 21 February and will ask the highest US court to determine whether YouTube, the Google-owned video website, should be held responsible for recommending Islamic State terrorism videos. The second, which will be heard on 22 February, targets Twitter and Facebook in addition to Google with similar allegations.
Together they could represent the most pivotal challenge yet to Section 230 of the Communications Decency Act, a statute that protects tech companies such as YouTube from being held liable for content that is shared and recommended by their platforms. The stakes are high: a ruling in favor of holding YouTube liable could expose all platforms, big and small, to potential litigation over users’ content.
While lawmakers across the aisle have pushed for reforms to the 27-year-old statute, contending companies should be held accountable for hosting harmful content, some civil liberties organizations as well as tech companies have warned that changes to Section 230 could irreparably weaken free-speech protections on the internet.