Trending Topics

U.S. Supreme Court ruling lets Big Tech profit from calls for violence against police

With lawmakers in limbo, I’m betting on cops and communities to win the fight against radicalization


On May 18, 2023, the U.S. Supreme Court decided a lawsuit that the National Fallen Officer’s Foundation (NFOF) President said was,

“[A]n important landmark case that will change the landscape of public safety for future generations. Facebook, Google and Twitter have enjoyed broad liability protection under [section] 230, while fueling societal instability and leaving police officers and citizens vulnerable to attacks facilitated by online radicalization.”

Both the NFOF and The National Police Association (NPA) had joined to file an “amicus curiae” (friend of the court) brief in the litigation.

The cases

There were actually two cases – Twitter v. Taamneh and Gonzalez v. Google. The plaintiffs were the families of two people killed in separate ISIS terrorist attacks in different parts of the world. The defendants were tech giants Google (owner of YouTube) and Twitter, but the decision held equal portent for Facebook (now Meta).

The plaintiffs argued Twitter and Google “aided and abetted” the terrorists in violation of the Anti-Terrorism Act by providing a platform where they could post their content, enlist new recruits, and plan and execute attacks. The Act provided for civil liability in addition to criminal penalties.

The tech defendants said they were protected by section 230(c)(1) of the Communications Decency Act, which states:

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

The plaintiffs responded that algorithmic content recommendations and targeted ads changed the tech platforms from an “interactive computer service” to an “information content provider.” The latter is defined in 230(f)(3) as any “person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet.” Information content providers don’t have immunity from liability granted by 230(c)(1).

Why police care

The NPA and NFOF argued in their brief that police are also suffering from social media-fueled hostility and attacks. Accordingly, if the Court ruled against section 230 immunity for algorithm-curated content that calls for radicalization against police, it would help decrease animosity and violence.

The decision

There were a lot of amicus curiae briefs on both sides of the case. Big Tech, free speech organizations and the U.S. Chamber of Commerce argued that interpreting 230(c) to not provide immunity for social media platforms would end free speech, the internet and the economy as we know it. In addition to the law enforcement organizations previously mentioned, weighing in on the side of the families were The Anti-Defamation League, Senator Chuck Grassley, former US national security officials, and retired American military generals, who contended the nation’s security, public safety, and democracy were at stake.

So, what did the Supremes do? They dodged the question – in a 9-0 decision. The Court said it didn’t have to decide the section 230 immunity question because it held the tech companies hadn’t “aided and abetted” the terrorists. This involved a long and exhausting (to this reader) discussion of what “aid and abet” means and what must be “aided and abetted.” Check it out if you suffer from insomnia.

In short, the Court concluded that the tech companies’ failure to control terrorist content, as well as their algorithms that pushed content and ads, didn’t meet the requirement for some “affirmative act” that involved meaningful participation in a specific crime so as to constitute aiding and abetting. The Court’s reasoning regarding the algorithms was that they operated indifferently among the companies’ billion-plus users (without any intent other than to make money).

The implications

I wouldn’t count on the Supreme Court settling the question of Big Tech liability for their platforms’ use by individuals advocating violence and terrorism – not even as their algorithms help advance, hone, and target such radicalization.

At oral argument, Justice Kagan suggested such a task was better left to Congress, adding, “These are not, like, the nine greatest experts on the internet.” It was a sentiment Justice Kavanaugh echoed, and one with which I heartily agree.

Lobbying of lawmakers is already underway. Changing section 230 is something even Trump and Biden agree on. But it’s complicated – technologically, economically, constitutionally and politically. That likely means task forces ad nauseam and lobbyists crowding the pig troughs.

As for me, I put my money on cops and their communities winning the war against radicalization, violence and hate before the judiciary or legislature solves anything. Where radicals sow division, hatred and violence, front-line officers build bridges.

And so citizens honor officers – with memorials, parades, packed cathedrals, streets lined with candles and flags, and by placing a higher value on an officer’s life taken by a criminal than they place on their own.

Yes, there are terrorists. Yes, there are some extremists in policing. But the majority of cops and citizens support each other, care about each other and want to join together to solve communities’ problems. That’s what will win against radicalization.

As a state and federal prosecutor, Val’s trial work was featured on ABC’s PRIMETIME LIVE, Discovery Channel’s Justice Files, in USA Today, The National Enquirer and REDBOOK. Described by Calibre Press as “the indisputable master of entertrainment,” Val is now an international law enforcement trainer and writer. She’s had hundreds of articles published online and in print. She appears in person and on TV, radio, and video productions. When she’s not working, Val can be found flying her airplane with her retriever, a shotgun, a fly rod, and high aspirations. Contact Val at