University of Colorado prof favors tougher censorship against ‘misinformation’

A University of Colorado-Boulder professor recently spoke out in favor of Big Tech’s acts of censorship.

One free speech expert said that censorship is not the solution, and "the answer is more speech."

After the January 6 riots on Capitol Hill, one information science professor at the University of Colorado-Boulder argues that social media companies need to do even more than they are currently doing to restrict the spread of “misinformation.”

During an interview with KDVR-TV in Denver, Casey Fiesler claimed that “misinformation” surrounding the 2020 election prompted the January 6 Capitol Hill riots, arguing that “filter bubbles” allowed the “widespread” accusations of election fraud and that social media makes it easy for people to “only see information that aligns with their beliefs.”

“You have a sense that you know so many people because you have so many interactions with strangers on forums like Facebook. You feel like you have this bigger sense of the world but you have still curated your social media feeds such that you’re only interacting with people who are like yourself,” Fiesler told KDVR-TV.

Because of this, Fiesler suggested that Big Tech step up its censorship efforts by restricting social media users’ ability to share what is deemed misinformation.


[RELATED: Academics & scholars are standing up to ‘social media mobs’ and ‘cancel culture’]

“At some point, it becomes a matter of principle and values. What are you going to allow your platform to be a part of?” Fiesler said.

Campus Reform reached out to Fiesler for more insight on her position. She said it’s “still possible” that “misinformation” prompted the January 6 violence.

“I do think it is possible that misinformation contributed to the unrest that fueled the riots at the Capitol, based on reports about the motivations of those involved, but of course there is no way to know for certain,” she said.

Fiesler added that “filter bubbles” are only one way in which misinformation arises.

“I was drawing a comparison to algorithmic filter bubbles in noting the idea that we often choose to surround ourselves with people who think similarly to us. One reason that someone might believe something that is not true is confirmation bias, and I suspect that confirmation bias is even stronger when everyone around you also agrees with you. This is a broad idea that is not limited to any particular type of belief, of course,” Fiesler said.

She then told Campus Reform that previous censorship strategies, such as labeling misinformation, are “likely ineffective,” which may be why Twitter simply removes content altogether.

However, she said this does put tech companies in a tough position.

[RELATED: Prof draws criticism for sharing ‘Wuhan Plague’ picture on social media]

“My point about previous strategies was that labeling misinformation is likely ineffective, since it is easy enough not to believe those labels,” Fiesler said. “Therefore, I understand why platforms like Twitter might have decided to remove content that violates their rules, since platforms can create their own policies (against misinformation, against hate speech, against inciting violence, whatever they like) and then decide how to define and enforce them. (And of course this is not a First Amendment violation since the First Amendment applies to government suppression of speech.) As I said in the interview, where to draw this line is very tricky. I don’t envy the position these platforms are in.”

Campus Reform asked Fiesler if she favors the idea of censoring voter fraud information, and whether she thinks Parler, one of the latest social media applications to be de-platformed entirely, was rightfully banned.

“My understanding is that Twitter’s decisions have been based on violations of their civic integrity policy but I don’t have any knowledge of how they decide what content violates that policy,” Fiesler told Campus Reform via email, adding that Parler’s ban came in light of alleged illegal activity.

“My understanding is also that the app store TOS provisions that Parler was accused of violating had to do with coordinating illegal activities, though I’m not sure...Platforms are in the tricky position of deciding where to draw the line between protecting people from certain kinds of content and unreasonable censorship,” she said.

Speech First President Nicole Neily told Campus Reform that censoring alleged “misinformation” raises the question: Who determines what it is?

“While a platitude like ‘stopping misinformation’ sounds good, it raises the question - who’s going to be determining what is or is not ‘misinformation’? The government? That’s an immensely powerful position - and not one that I trust others to fulfill, no matter what political party they identify with,” Neily told Campus Reform.

Neily suggested the solution is not to crack down on free speech, but rather to allow for more of it, and to allow people to govern themselves.

“As Justice Louis Brandeis famously noted, ‘the answer is more speech.’ I trust my fellow Americans to make up their own mind about issues, and I wish that others would do the same,” Neily said.

Campus Reform reached out to CU Boulder for comment on Fiesler’s position. 

Spokesperson Andrew Sorenson replied with a pre-written statement on CU’s commitment to free speech.

“Freedom of expression, as guaranteed under the First Amendment, and academic freedom, as defined by the Laws of the Regents, while distinctly separate concepts, are central to CU Boulder’s academic mission and underlie our community values of inclusivity and critical thinking,” the statement read.

Follow the author of this article on Twitter: @_AddisonSmith1