Episode #011: Believing Outrageous Lies, Facebook Stops Political Ad Research, and Social Media at the Olympics

Duration: 15 min, 58 sec.

Summary:

 • “A terrifying new theory: Fake news and conspiracy theories as an evolutionary strategy” by Paul Rosenberg

 • “Facebook shut down political ad research, daring authorities to pursue regulation” by Christianna Silva

 • “Sharing videos on social media from Tokyo Games is not allowed -IOC” by Karlos Grohmann

Subscribe:

RSS | iTunes | Anchor.fm | Spotify | Google | Breaker | RadioPublic | Pocket Casts


Links from this episode:

 • A terrifying new theory: Fake news and conspiracy theories as an evolutionary strategy | Salon.com

 • Facebook shut down political ad research, daring the U.S. to regulate (mashable.com)

 • Sharing videos on social media from Tokyo Games is not allowed -IOC | Reuters

Rough Transcript:

This week’s three topics:

Social scientist Michael Petersen on why people believe outrageous lies

Facebook shuts down political ad research, daring the U.S. to regulate it

And social media at the Olympics.

The date is Sunday, August 8th, 2021, the time is 2100 hours, and you’re listening to Episode #11 of Communicate For Effect.

Segment 1

A good Sunday evening to you all.

I’m Mike Nicholson.

Do you ever ask yourself, why do all these people around me believe these outrageous lies on the news and in politics, especially when many of those lies can be proven to be false?

I do, all the time, and that’s what social scientists Michael Petersen, Mathias Osmundsen, and John Tooby were also asking themselves.

They recently wrote a paper titled “The Evolutionary Psychology of Conflict and the Functions of Falsehood,” and Paul Rosenberg from Salon interviewed them.

Rosenberg’s article sets the stage by citing Noam Chomsky’s book Manufacturing Consent, which challenged the notion that those who believe outrageous lies do so out of ignorance.

He also cited the memorable line from an academic study on coverage of the first Gulf War: “the more you watch, the less you know.” Fast forward to the social media age, and there is discussion around “motivated reasoning,” or, to use the more common phrase, people hearing what they want to hear.

So this thesis by these three social scientists revolves around deception being a part of humans’ evolutionary history because it provides an advantage.

“Being part of a successful social group was every bit as essential as food and water. So deception among humans evolved from group conflicts.”

And that’s the premise of their current work.

It “seeks to illuminate the evolutionary foundations and social processes involved in the spread of outright falsehoods.”

From the interview, here are a few sections that stuck out to me and that I’d like to highlight.

The sections are the narrative, using nature as a model, group mobilization, group loyalty, and the intended effect.

The first highlight is a section on the traditional narrative of those that believe these outrageous lies:

They say: “The traditional narrative is, ‘Well, if you believe false things, then you must be stupid. It must be because you haven’t really made an effort to actually figure out what is going on.’ But over the last few decades, more and more research has accumulated that suggests that’s not the case. In fact, the people who are responsible for spreading misinformation are not those who know the least about politics. They actually know quite a lot about politics.”

This is something that I must say I ask myself all the time when I watch the news.

Does this person actually believe this nonsense, or are they just saying it to be bold and to get headlines?

I think the answer is that there are some out there who say things publicly that aren’t true because they are politically or financially motivated to do so.

Then there are receivers who are blinded by their loyalty to a group, and some who are just plain ignorant.

The second highlight is how they used nature as a model for why people believe outrageous lies.

“Animals are trying to get an upper hand in conflict situations by making false signals. First, that is also what we should expect that humans do, that if they can send false signals that are advantageous to them, then they should do it. That means there might be certain advantages, within one group, to spread misinformation and spread falsehoods, if that can give them an upper hand in the conflict with the other group.” 

This goes back to those that know what they are saying isn’t true, but they’re saying it anyway because it provides them some kind of advantage.

Here, they are talking about those that are communicating the false information, not those that are receiving it.

The third section I’ll highlight is on the mobilization of a group, or in nature, the signaling that I’m in trouble:

“When you want to mobilize your group, what you need to do is find out that we are facing a problem, and your way of describing that problem needs to be as attention-grabbing as possible before you can get the group to focus on the same thing. In that context, the reality is seldom as juicy as fiction. By enhancing the threat — for example, by saying things that are not necessarily true — then you are in a better situation to mobilize and coordinate the attention of your own group.”

This, to me, is just political communication.

Not all political communication, but the kind where people choose to use their position and the media for “attention-grabbing” to push their agenda.

This section on mobilization and the next section go hand in hand.

The next section is on loyalty to a group, and if you are loyal to a group, then when you’re signaling a problem with attention-grabbing rhetoric, you can rally the troops even if what you are saying is known to be an exaggeration or an outright lie.

“Humans are constantly focused on signals of loyalty: “Are they loyal members of the group?” and “How can I signal that I’m a loyal member? A good way to signal that I’m loyal to this group and not that group is to take on a belief that is the exact opposite of what the other group believes. So that creates pressure not only to develop bizarre beliefs, but also bizarre beliefs that this other group is bad, is evil, or something really opposed to the particular values that they have. So while there is this motivation or incentive to create content as bizarre as possible, there is also another pressure or another incentive to avoid the situation where you’re being called out by people who are not motivated to engage in the collective action. That suggests it’s better to develop content about situations where other people have a difficult time saying, “That’s blatantly false.” So that’s why unverifiable information is the optimal kind of information, because there you can really create as bizarre content as you want, and you don’t have the risk of being called out.”

The first part says “A good way to signal that I’m loyal to this group and not that group is to take on a belief that is the exact opposite of what the other group believes.”

This is U.S. COVID politics right now.

Take a position that is the exact opposite of the other political party, even if it is negatively impacting the health of your own loyal followers.

This sums up discussions I’ve had with numerous people over the last 5 years or so.

I think there are some who are more loyal to a political party than to their country.

Their political party is their identity, it can do no wrong.

In the interview, Petersen touches on religion as well, and how some of the religious mentality – not the beliefs – carries over to politics.

Remember in the first section, he said “people who are responsible for spreading misinformation are not those who know the least about politics. They actually know quite a lot about politics.”

He’s talking about the person spreading the information, not the receivers of that information.

The receivers of the information put loyalty to the group over everything else.

The final section is on the ultimate effect:

“Evolution cares about material benefits and, in the end, reproductive benefits. So the beliefs that you have should in some way shape real-world outcomes. We are arguing that these false beliefs don’t just exist to make you feel good about yourself, but exist in order to enable you to make changes in the world, to mobilize your group, and get help from other group members. I think that’s an important point to think more about: What it is that certain kinds of beliefs enable people to accomplish, and not just how it makes them feel.”

I will add that I think those who know what they are saying is false are doing so because they know they have the loyalty of a group; at least in my mind, they often frame the issue as a crisis for the group when it’s really a crisis for them personally.

People want to gain or remain in power, they are politically motivated, they are financially motivated.

So, an interesting article and paper; I’ll put the links in the show notes.

Many of you, like me, probably already felt this was the case, but it’s nice to see it explained in a more structured way than just a gut feeling.

That doesn’t mean you have to like it, but it does help to understand it.

Segment 2

This past Tuesday, Facebook stopped a team of researchers from NYU from studying political ads and COVID-19 misinformation by blocking their personal accounts, pages, apps, and access to its platform.

NYU’s Ad Observatory has used a browser add-on since 2020 to collect data about the political ads users see on Facebook.

They received permission from everyone who used the add-on, but the article says “Facebook’s attempt to stop their research has more sinister roots in the platform trying to stop the academics from exposing problems.”

In a statement, researchers said, “Facebook has also effectively cut off access to more than two dozen other researchers and journalists who get access to Facebook data through our project” and “The work our team does to make data about disinformation on Facebook transparent is vital to a healthy internet and a healthy democracy.”

Facebook spokesman Joe Osborne said the company is required to maintain a privacy program and that the researchers violated its rules.

Who is correct? You can decide for yourself, but deleting the researchers’ personal accounts, if that is indeed what happened, seems like overreach and maybe an indicator of Facebook’s true motivations.

The group is now calling for increased regulation of Facebook and social media companies, saying “The public urgently needs to know and needs to understand the implications of Facebook’s platform for public discourse and democracy.”

Segment 3

The Olympics just wrapped up.

I watched exactly zero minutes of it on TV, being a cord-cutter myself, and the only clips I saw were short ones on social media.

So I was curious to see this last article where the International Olympic Committee said that sharing videos on social media from Tokyo Games was not allowed.

It’s not allowed because of broadcaster rights.

The IOC will receive more than $4 billion in broadcasting rights fees for the period that includes the 2018 Pyeongchang Winter Olympics and the Tokyo Games, with the biggest chunk coming from NBCUniversal, which paid over $7 billion for the rights to broadcast the Olympics through 2032.

Now, the IOC says that 90% of that income is redistributed to athletes and sports, but it was trying to crack down on sports stars who were posting videos on their social media accounts.

A Jamaican sprinter was blocked from Instagram for a while because she posted videos of her 100- and 200-meter races to her 300,000 followers.

Athletes have posted a lot of videos on TikTok and other platforms, and the IOC said that still photos are OK, but video is not.

Fast forward another four years, and I’ll bet they’re going to have to take some kind of drastic measures. I don’t see social video decreasing anytime soon; it will only keep growing, and companies like NBC, which paid $7 billion, still need to make sure that investment was worthwhile.

Wrap Up

So that’s it for #11.

If you have any questions or comments for me, just go to 46alpha.com and shoot me a note.

You can subscribe to the Last 24 daily news summaries, or follow my FlipBoard magazine if you want to read more articles that I find interesting on digital comms, marketing, and technology.

I’m Mike Nicholson, and I’ll see you again next week.

 

Mike Nicholson

I've spent my career working in a variety of Strategic Communications, Public Relations, Public Affairs, Information Operations, and Executive Outreach positions. With a history of planning, preparing, executing, and assessing communication strategies in the U.S. and abroad, I use this site to write, think and share lessons learned on organizational communications.
