Facebook CEO Mark Zuckerberg’s congressional testimony made two things clear.
First, a wave of new regulations may soon hit social media and the Internet. Second, the folks elected to write these regulations know fairly little about social media and the Internet.
Problems of that sort are endemic to the regulatory state. Relatively few of our regulations were written with a deep understanding of the industries they upended. Because the public knows even less about most industries than does Congress, however, most regulatory harm goes unnoticed.
Not so with social media. Mess with that industry, and people will notice.
The people pushing hardest for Congress to create that regulatory mess will soon complain most bitterly about the mess Congress creates. The case for increased regulation is hitting the Internet giants from two sides: for censoring too much innocuous speech, and for filtering out too few dangerous foreign bots. America is clamoring for regulations that stop censoring, improve filtering, and leave the basic user experience intact.
D**n the contradictions — full speed ahead!
Yet there is an elegant way out. As is often the case, the key lies not in greater regulation, but in appreciating the trouble that existing regulations have caused. In this case, the culprit is the safe harbor provision of the Communications Decency Act (CDA) of 1996.
That provision arose to address a very specific problem. In the early days of the Internet, most websites allowing third parties to post content were basically bulletin boards — anyone with a message and a thumbtack could post information. When people posted pornography, some of the websites’ owners pulled it down.
Because no good deed goes unpunished, those laudable efforts created a legal problem. Pruning objectionable content is a form of editing. Lawsuits arose claiming that websites whose owners exercised any control over content were editors or publishers — and thus liable for content that they allowed through.
Enter Washington, D.C. Proclaiming that "the Internet and other interactive computer services offer a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity," Congress enacted the CDA, declaring an explicit policy "to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet."
The CDA thus created a generous exemption to longstanding general principles of law: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
It’s unclear that Congress understood just how generous that exemption was. Over the past 20-plus years, the courts have acted as if Congress had announced that Internet sites that allowed third parties to post content were immune from all liability. Period.
Though that broad exemption allowed Internet companies to develop popular sites and powerful tools for weeding out objectionable content, its downside is becoming clear: Anyone who qualifies as "an interactive computer service" may edit or publish freed from the liabilities that typically restrain editorial abuse. It’s no surprise that the editing has become abusive.
In the offline world, companies providing information must decide whether they’re community bulletin boards or newspapers. That decision determines the actions they take, the rights they reserve, and the obligations and liabilities they assume. In the online world, companies need never make such a choice; they can claim to be bulletin boards while acting like editors.
Some immediate consequences? Fake news, viewpoint discrimination masquerading as balance, susceptibility to foreign interference, widespread public confusion, protection for those abetting sex trafficking, and (as Zuckerberg’s testimony made clear) idiosyncratic definitions of "hate speech."
The failure to choose is creating a public backlash. Google, Facebook, and other Internet giants provide far too much editorial oversight to qualify as bulletin boards, but far too little to qualify as responsible journalism. How do they see themselves? The time has come for them to choose. More importantly, the time has come to revisit the CDA’s safe harbor. Ideas that made sense given the Internet of 1996 do not necessarily make sense today.
The safe harbor is reaching its breaking point. No one should be exempt from choice. Internet journalism should be subject to the same constraints as off-line journalism.
And Internet sites that hold themselves out as bulletin boards should not be allowed to exercise editorial authority. There are plenty of ways to divorce content posting from content streaming. The law should force companies to choose whether they provide an open forum, a moderated service, or both — and treat them accordingly.
Life on the Internet should not be an exemption from responsibility.
We don’t need new regulations to promote responsible behavior.
We just need to rethink an old one.
Bruce Abramson is the President of Informationism, Inc., Vice President and Director of Policy at the Iron Dome Alliance, and a Senior Fellow at the London Center for Policy Research. Jeff Ballabon is CEO of B2 Strategic, a Senior Fellow at the American Conservative Union's Center for Statesmanship and Diplomacy, and an advisor to Donald J. Trump for President, Inc.
© 2021 Newsmax. All rights reserved.