Back in 1996, in an effort to encourage the growth of the new "internet," Congress passed Section 230 of the Communications Decency Act, immunizing online platforms from liability for their users' posts.
If you ask Google, which owns YouTube, they'll tell you that the law provides absolute immunity from liability for actions — including terrorist attacks — that are arguably provoked by violent videos recommended by its algorithms.
If you ask the parents of a girl killed in a terrorist attack allegedly provoked by videos a YouTube algorithm recommended, they will tell you that YouTube and its parent, Google, should be held responsible.
In this strange debate, or rather this debate with strange bedfellows, liberals are arguing for less free speech and conservatives for more. The reversal reflects the fact that the platforms have banned former President Donald Trump and his ilk, to the great consternation of conservatives, but, according to liberals, have not done enough to ban hate, white supremacy and right-wing lies.
So, who is right?
Two hundred-plus years of jurisprudence interpreting the First Amendment, which protects free speech as against state action, have made clear just how difficult it is to draw lines in this arena.
Speech is powerful.
It does provoke.
And in the marketplace of ideas, the truth does not always win out; sometimes, hate, ignorance and evil attract more adherents.
But recognizing that speech can be dangerous doesn't tell you how to draw lines or where the lines should be.
"I know it when I see it," the late Justice Potter Stewart famously wrote of obscenity, and while his definition has been much mocked, few have done better.
The test for incitement — falsely crying, "Fire!" in a crowded theater — requires that the threat of violence be imminent, which is not an easy test to apply, or meet, especially if it is being applied before anything goes wrong.
Hindsight may be 20/20, but the platforms attempting to engage in foresight have no such advantage.
Is it better for them to ban too much — or too little? In regulating the marketplace of ideas, are they in fact engaged in a "public function," which would make them accountable under the First Amendment?
Should Section 230 be modified or eliminated, so as to subject platforms to tort standards of liability?
Those tort standards of liability would still require plaintiffs, such as the parents in the case before the court, to prove a "but for" causal connection between the offensive videos and the terrorist acts.
In theory, that is not an easy standard.
But in practice, a jury shown horrible videos and then an act of horror may make connections after the fact that are more difficult to draw in advance.
The traditional challenge for civil libertarians is to assume that the power to decide which speech is allowed is not in your hands, but in the hands of someone you disagree with entirely (no doubt how conservatives feel right now), and ask yourself how much power that person should have.
Thus, the traditional argument for free speech is that no one can be trusted, or should be, to decide for us what we should or should not hear.
Lies about the dangers of vaccines, which could cost children's lives?
Should those be protected? Why?
Violent videos spewing hatred based on religion or race or ethnic origin?
Should hate be protected?
Calls to overturn democratically elected slates of electors? Is there a place for them online?
Susan Estrich is a politician, professor, lawyer and writer. Whether on the pages of newspapers such as The New York Times, the Los Angeles Times and The Washington Post or as a television commentator on countless news programs on CNN, Fox News, NBC, ABC and CBS, she has tackled legal matters, women's concerns, national politics and social issues.