Episode #010: Targeted Ads, Retargeting, and Mapping Disinformation Networks

Duration: 12 min, 16 sec.

Summary:

 • “Targeted ads aren’t just annoying, they can be harmful. Here’s how to fight back” by Silvia Milano

 • “Retargeting: What Is It, Why You Need It, and How To Do It” on 46ALPHA

 • “Hate “Clusters” Spread Disinformation Across Social Media. Mapping Their Networks Could Disrupt Their Reach” by Jigsaw

Subscribe:

RSS | iTunes | Anchor.fm | Spotify | Google | Breaker | RadioPublic | Pocket Casts


Links from this episode:

 • Targeted ads aren’t just annoying, they can be harmful. Here’s how to fight back (fastcompany.com)

 • Epistemic fragmentation poses a threat to the governance of online targeting | Nature Machine Intelligence

 • Retargeting: What is it, Why You Need It, and How to Do It : 46ALPHA

 • Hate “Clusters” Spread Disinformation Across Social Media. Mapping Their Networks Could Disrupt Their Reach. | by Jigsaw | Medium

Rough Transcript:

This week’s three topics are:

How targeted ads create filter bubbles and echo chambers by Silvia Milano.

Retargeting: What is it, why you need it, and how to do it.

And Jigsaw, the unit inside Google that explores threats and builds technology, is mapping clusters of disinformation.

The date is Sunday, August 1st, 2021, the time is 1930 hours, and you’re listening to Episode #10 of Communicate For Effect.

Segment 1

I made it to #10, double-digits.

The next big benchmark, I think, is #14.

I read somewhere that half of all podcasts have fewer than 14 episodes, so if I can make it to #14, I guess I’m doing alright.

I don’t know what the next big benchmark will be after that. Maybe #100?

I’ll figure it out; it’ll give me something to shoot for.

For the first segment, I’ve got an article written by Silvia Milano.

She is a Research Fellow at the Future of Humanity Institute (FHI) in the Faculty of Philosophy at the University of Oxford.

The title of the article is, “Targeted ads aren’t just annoying, they can be harmful. Here’s how to fight back” and it’s on the Fast Company website.

The premise of the article is that online targeted advertising divides and isolates us, and she advocates for consumers to become more active participants in helping to regulate online advertising.

She mentions political advertising, the Cambridge Analytica scandal, and the 2016 US election.

These, I think (and she thinks), are relatively well known, and we’ve learned more about filter bubbles and echo chambers over the past few years…at least some of us have.

Where she takes this article is not in the direction of political ads, but in the direction of commercial ads.

She says, “Commercial targeted advertising is the primary source of revenue in the internet economy, but we know little about how it affects us.”

She and several others conducted some research into this area of how targeted ads affect us.

There is a link to their study in the article, and the study is titled, “Epistemic fragmentation poses a threat to the governance of online targeting.”

We can collectively see the effects of ads on groups of people in the physical world because we can all see an ad at a bus stop or train station, but online, consumers are more isolated because what they are seeing is specifically targeted at them.

She writes that advertising standards authorities rely primarily on consumer complaints.

In the U.K. in 2015, there was a “Beach Body” ad on the subway that commuters complained about, saying it promoted harmful stereotypes.

In this case, there were 378 total complaints before action was taken to take down the ads, but she is curious about all the other people who saw it and did not complain, yet were perhaps still affected by it.

I think the Beach Body ad sounds like something done in poor taste, which to me ultimately reflects on that company, but what she and the others on her team are looking into, as she points out in the article, are things like ads for high-fat-content foods that target children, or gambling ads that target a person who suffers from a gambling addiction, and so on.

There’s probably a number of other scenarios like this you can think of.

Their research centers on what they call “epistemic fragmentation,” which is where the information available to an individual is limited to what is targeted at them, without the opportunity to compare it with others in some kind of common space.

She says, “Currently, regulators are adopting a combination of two strategies to address these challenges. First, we see an increasing focus on educating consumers to give them “control” over how they’re targeted. Second, there’s a push toward monitoring ad campaigns proactively, automating screening mechanisms before ads are published online.”

But what she is advocating for is giving consumers a bigger role and having them be active participants.

She writes at the end, “Our research shows it’s not just political targeting that produces harms – commercial targeting requires our attention too.”

So, interesting topic, looking at the greater impact all these targeted ads have on society.

I’ll have a link to both the Fast Company article and the academic article in the show notes.

Segment 2

For topic #2, we’re talking about ads and retargeting.

What is it, and how do you do it for a business?

Retargeting is an advertising technique that allows companies to display customized, targeted ads to people who have previously visited their site, as those people browse other websites.

These ads are based on information collected about you when you visited the company’s website.

If you own a website and have a decent amount of traffic, you can do this because you will have collected data on your website visitors, including what pages they viewed and the content that is of interest to them.

If you set them up and pay for the ads, past visitors to your website will see your ads while they are browsing the web, watching YouTube videos, or reading news sites.

As a business owner, this is useful because 97% of people who come to your website for the first time will leave without buying anything.

In marketing, there is a thing called the rule of 7.

The rule of 7 says it takes an average of seven interactions with your brand before a purchase will take place.

So retargeting can help you get more of those interactions.

So how do you do it?

You can do it through Google Ads.

They have a relatively easy process; it requires that you add a ‘tag’ to each page on your website that you want to track.
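
Just to make that concrete, here is a rough sketch of what that tag boils down to: a small script that loads Google’s gtag.js library and registers the page view so the visitor can later be added to a remarketing audience. This is my own illustration, not the exact snippet Google Ads generates for your account; the conversion ID "AW-XXXXXXXXX" and the installRemarketingTag helper are placeholders.

```typescript
// Rough sketch of what the retargeting "tag" does on each tracked page.
// The conversion ID ("AW-XXXXXXXXX") is a placeholder; Google Ads generates
// the real snippet for your account when you set up a remarketing audience.
declare global {
  interface Window {
    dataLayer: unknown[];
  }
}

export function installRemarketingTag(adsId: string): void {
  // Load Google's gtag.js library asynchronously.
  const script = document.createElement("script");
  script.async = true;
  script.src = `https://www.googletagmanager.com/gtag/js?id=${adsId}`;
  document.head.appendChild(script);

  // Standard gtag bootstrap: queue commands on the dataLayer until the
  // library loads and processes them.
  window.dataLayer = window.dataLayer || [];
  function gtag(..._args: unknown[]): void {
    window.dataLayer.push(arguments); // gtag.js expects the raw arguments object
  }
  gtag("js", new Date());
  gtag("config", adsId); // records the page view so this visitor can be retargeted
}

// Call once per page you want to track, e.g.:
// installRemarketingTag("AW-XXXXXXXXX");
```

In practice, Google Ads hands you the exact snippet to paste into the head of each page, so you rarely have to write this yourself.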

So it’s good for your business: you can target customers who have been to your site and maybe get a sale.

But as talked about in Segment #1, there’s the larger impact on society as a whole: what are the effects of this filtered view on massive numbers of people?

If you own a small business, you are most likely just trying to generate sales and make a living, but there are others out there who can use this for other purposes, like in the next segment.

Segment 3

This one is interesting.

Jigsaw is a unit within Google that explores threats and builds technology, and they are mapping clusters of disinformation with George Washington University here in D.C.

They interviewed violent white supremacists and found that these groups use different online platforms for complementary purposes, like recruiting new followers and coordinating private events.

They write, “Malicious actors, such as violent white supremacists and people who knowingly spread COVID-19 disinformation…operat[e] primarily in loose social networks rather than cohesive groups. This informal structure means that the name or logo used by a group on one platform may look different on another, making them more difficult to observe across the internet. To tackle these issues effectively, a dynamic, internet-wide approach is necessary that can capture malicious actors’ informal, decentralized networks.”

And that’s what they are looking at.

In cooperation with social scientists from GW, they applied some of the techniques that were used to map ISIS online in 2016 to hate speech and COVID-19 misinformation.

What they found was that these “hate clusters” spread quickly, using hyperlinks that acted like “wormholes” to move users between platforms, and between moderated and unmoderated platforms.

No big surprise there, I mean that’s how the internet and the sharing of information works.

White supremacists intentionally chose a decentralization model to subvert content moderation and de-platforming.

Much like ISIS, I suspect, and there’s probably an argument in there that de-platforming works, and there’s probably something in there that also helps feed the conspiracy theorists.

“I’ve been de-platformed, they’re trying to hide the truth from you, meet me over here on my more secret platform.”

One of the authors writes:

“An extremist group has incentives to maintain a presence on a mainstream platform (e.g., Facebook Page) where it shares incendiary news stories and provocative memes to draw in new followers. Then once they have built interest and gained the trust of those new followers, the most active members and page administrators direct the vetted potential recruits towards their accounts in less-moderated platforms such as Telegram and Gab, where they can connect among themselves and more openly discuss hateful and extreme ideologies.”

Very interesting, at least to me, and the mapping aspect of this is especially compelling because if you can map it, you can start to address it and maybe even predict it.
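
Just to make the mapping idea concrete, here is a hedged sketch of my own (not Jigsaw’s or GW’s actual tooling; the Cluster and crossPlatformLinks names are made up): treat each page, group, or channel as a node, treat each hyperlink as a directed edge, and then surface the edges that jump between platforms, i.e., the “wormholes” described above.

```typescript
// Hedged sketch: model communities as nodes and hyperlinks as directed edges,
// then surface the links that cross platform boundaries ("wormholes").
interface Cluster {
  id: string;        // e.g. a page, group, or channel identifier
  platform: string;  // e.g. "facebook", "telegram", "gab"
}

interface Link {
  from: string; // id of the cluster posting the hyperlink
  to: string;   // id of the cluster the hyperlink points to
}

function crossPlatformLinks(clusters: Cluster[], links: Link[]): Link[] {
  // Index each cluster's platform by its id.
  const platformOf = new Map<string, string>();
  for (const c of clusters) {
    platformOf.set(c.id, c.platform);
  }
  // Keep only hyperlinks whose endpoints sit on different platforms.
  return links.filter((l) => platformOf.get(l.from) !== platformOf.get(l.to));
}

// Toy example: a mainstream page funneling followers to a less-moderated channel.
const clusters: Cluster[] = [
  { id: "pageA", platform: "facebook" },
  { id: "channelB", platform: "telegram" },
];
const links: Link[] = [{ from: "pageA", to: "channelB" }];

console.log(crossPlatformLinks(clusters, links));
// -> [ { from: "pageA", to: "channelB" } ]
```

Once those cross-platform edges are on a map, you can start asking which clusters act as bridges, and what happens to the network when a bridge is removed.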

Wrap Up

So that’s it for #10.

If you have any questions or comments for me, just go to 46alpha.com and shoot me a note.

You can subscribe to the Last 24 daily news summaries, or follow my FlipBoard magazine if you want to read more articles that I find interesting on digital comms, marketing, and technology.

I’m Mike Nicholson, and I’ll see you again next week.

 

Mike Nicholson

I've spent my career working in a variety of Strategic Communications, Public Relations, Public Affairs, Information Operations, and Executive Outreach positions. With a history of planning, preparing, executing, and assessing communication strategies in the U.S. and abroad, I use this site to write, think and share lessons learned on organizational communications.
