How to stop doom scrolling – and have a better experience online with Jay Van Bavel (from ReThinking)
TED Tech · November 26, 2024 · 34:48 · 31.86 MB

It’s impossible to separate the way people engage with AI from the way they engage with the internet as a whole. This is an episode of ReThinking, another podcast from the TED Audio Collective, featuring a compelling discussion of why the internet can feel so unfriendly, and where we can go from there. You’ll hear from Jay Van Bavel, an award-winning professor of psychology and neural science at NYU, on the science of virality, why bad news commands our attention, and how we can find common ground around more uplifting content.

If you liked this episode, you can find more ReThinking wherever you get your podcasts.

Learn more about our flagship conference happening this April at attend.ted.com/podcast


Hosted on Acast. See acast.com/privacy for more information.


[00:00:00] TED Audio Collective

[00:00:06] Hey everyone, no matter who you are, you're likely to spend a lot of time on the internet.

[00:00:12] In fact, you probably needed a Wi-Fi connection to access this episode you're hearing right now.

[00:00:18] This week, we wanted to share with you a great episode of ReThinking, hosted by organizational psychologist Adam Grant.

[00:00:25] It's all about the ways we interact online.

[00:00:28] You'll not only hear from Adam himself, but also renowned psychologist and neuroscience professor Jay Van Bavel.

[00:00:35] They have a great discussion about why it's so easy for bad news to go viral, how to make online conversations kinder,

[00:00:43] and what you can do to feel better when you're scrolling through your favorite social media platform.

[00:00:48] I hope you enjoy it.

[00:00:50] And if you like what you hear, you can find ReThinking wherever you get your podcasts.

[00:00:55] Now, on to the show.

[00:00:56] Hey everybody, this is John Wygel, the host of The Hustle Daily Show, a short daily podcast that brings you the latest news in business and tech.

[00:01:03] Every day, I'm joined by amazing writers of The Hustle Newsletter, and we cover the biggest business headlines in 10 minutes or less,

[00:01:09] and explain why you should care about them.

[00:01:11] We're not your typical boring news podcast full of stock picks or quarterly performance breakdowns.

[00:01:15] We have a lot of fun.

[00:01:16] We're irreverent, and we find that sweet spot where business news meets culture and tech.

[00:01:20] We go deep on topics like how a man won the lottery 14 times,

[00:01:23] or why it's nearly impossible to buy an original Bob Ross painting.

[00:01:26] If nothing else, you'll walk away with an interesting piece of news to talk about with your friends

[00:01:30] that makes you sound way smarter than you actually are.

[00:01:33] So search for The Hustle Daily Show on Apple Podcasts, Spotify, or wherever you listen to your podcasts.

[00:01:39] When I use social media, it's like your diet.

[00:01:41] It's like, trust me, like, I'd rather have like a chocolate cake right now,

[00:01:44] but like, I'm actually going to go have like a salad for lunch because I'm 46.

[00:01:49] And like, if I have a chocolate cake for lunch every day, it's like not going to work for me.

[00:01:56] Hey everyone, it's Adam Grant.

[00:01:57] Welcome back to Rethinking, my podcast on the science of what makes us tick with the TED Audio Collective.

[00:02:03] I'm an organizational psychologist, and I'm taking you inside the minds of fascinating people

[00:02:08] to explore new thoughts and new ways of thinking.

[00:02:13] My guest today is Jay Van Bavel.

[00:02:16] He's a professor of psychology and neuroscience at NYU,

[00:02:19] and an award-winning teacher, researcher, and writer.

[00:02:22] His book, The Power of Us, with Dominic Packer,

[00:02:26] is a great read on how to overcome our differences.

[00:02:29] Jay is also a leading expert on why what goes viral often makes us miserable,

[00:02:33] and how to change that.

[00:02:35] It's like taking out the tiny tumor in your brain so you don't have seizures anymore.

[00:02:40] Like, that's how we do surgery.

[00:02:41] I think, like, this is a level of which it might help if people knew that,

[00:02:45] and they could, like, figure out what those accounts were and unfollow them.

[00:02:48] They might actually enjoy their online experience a lot more.

[00:02:54] All right.

[00:02:55] Jay, I have to tell you, part of the reason that you're here, among many,

[00:03:00] is I find the news incredibly depressing.

[00:03:05] And I'm hoping you're going to cure that.

[00:03:07] Okay.

[00:03:08] I don't know if I can promise that, but I'll try.

[00:03:10] I've sworn off watching TV news altogether.

[00:03:13] And increasingly, I don't even want to read online news.

[00:03:17] And I just came across, ironically enough, a headline,

[00:03:20] which I guess means I haven't totally shut off the news.

[00:03:24] But the basic finding was 23 million news headlines from 2000 to 2019

[00:03:28] showing that anger, fear, disgust, and sadness have all gone up

[00:03:33] and emotional neutrality has declined.

[00:03:36] What's going on?

[00:03:37] Most news now is interwoven with social media.

[00:03:42] And so we're all in this universe where we're engagement metrics.

[00:03:45] It's called the attention economy.

[00:03:47] And winning the attention economy means more people clicked on a news article,

[00:03:51] went to the website, and that increases ad revenue.

[00:03:54] Human psychology is really old.

[00:03:56] It's millions of years old.

[00:03:58] And our brains are basically wired for survival.

[00:04:00] And that means, like, social connection and belonging.

[00:04:02] But that also means things like avoiding threats

[00:04:05] and being hyper-vigilant about things that could have killed us.

[00:04:08] And we are the offspring, the descendants, of generation after generation

[00:04:13] after generation of people who, like, avoided getting wiped out

[00:04:16] by, like, a dangerous poisonous snake or eaten by a bear or something like that.

[00:04:20] And so we're, like, hyper-attuned,

[00:04:22] and that type of information is hyper-engaging to us.

[00:04:25] And so what happens is when the news posts some story about some tragedy or corruption or scandal

[00:04:32] or anything negative, disgusting, outrageous,

[00:04:35] we're more likely to want to click on it.

[00:04:37] And therefore, the news company that's producing it or the influencer who shared it

[00:04:42] gets engagement.

[00:04:43] And they're more likely to create and share more and more information

[00:04:46] that takes that form or uses those headlines.

[00:04:48] And so we're now in this kind of, like, doom cycle

[00:04:52] where we're getting more and more of that fed back at us,

[00:04:55] and we're engaging with it more and more.

[00:04:57] And that's why people like you eventually just, like, hit a burnout point

[00:05:00] and disengage.

[00:05:01] Like, I don't want to see this.

[00:05:02] This is so depressing.

[00:05:03] And that's where we're at.

[00:05:05] It reminds me of something that Liv Boree said not too long ago.

[00:05:09] She said that the problem with raging against the machine

[00:05:11] is that the machine begins to feed on rage.

[00:05:14] Yeah, they're profiting off us.

[00:05:15] It reminds me of, like, The Matrix where we're all, like, getting exploited for our energy.

[00:05:19] Here are our tensions getting exploited.

[00:05:21] This is not new, right?

[00:05:23] You're pointing out that this is a fundamental feature of human psychology.

[00:05:26] Psychologists have been writing about this phenomenon for decades.

[00:05:29] I think about Paul Rozin on negativity bias,

[00:05:32] Baumeister and colleagues on bad is stronger than good.

[00:05:36] And I feel like those ideas have long had a lot of evolutionary hand-waving in them, right?

[00:05:41] And I think I'm guilty of that often.

[00:05:43] And, like, fail to notice a lion and you might not survive to pass on your genes.

[00:05:47] Overlook an adorable koala.

[00:05:49] Guess what?

[00:05:49] You live to fight another day.

[00:05:51] But I think you've made a pretty compelling case that this is actually hardwired in us.

[00:05:56] And if that's true, it ought to start early.

[00:05:58] So do we see this with young children?

[00:06:00] Do we see it with infants?

[00:06:02] Walk me through the case that this is actually part of our evolutionary wiring.

[00:06:07] I'm not a developmental psychologist, although I've collaborated and done some developmental work.

[00:06:11] My understanding is that you see this in kids.

[00:06:13] You also see this in our primate ancestors.

[00:06:16] So, for example, if you show, I think this was done with macaque monkeys.

[00:06:20] If you show them, like, if they see, like, a snake,

[00:06:23] all they need to see is one other monkey freaking out to the snake and they learn to fear it instantly.

[00:06:28] Whereas there's other things that are, like, harder for them to fear condition to.

[00:06:31] And so basically what that means is we have a preparedness to learn that some things are more dangerous than others.

[00:06:38] And so that seems to be something that is pretty conserved across species.

[00:06:43] We can have, like, news headlines and stories and videos and stuff that, like, exploit one another's tendency to do this.

[00:06:50] I think that's what makes us special is that we can figure out ways to push one another's buttons to persuade and control and manipulate one another.

[00:07:00] Well, that brings me to one of my puzzles, which is why do we keep falling for it?

[00:07:05] Like, surely we're smarter than this, aren't we?

[00:07:07] Yeah.

[00:07:08] So I'll tell you a study that we finished.

[00:07:10] This was led by my PhD student, Claire Robertson.

[00:07:13] And we got access to this huge data set of experiments that the website Upworthy ran.

[00:07:18] And these are, like, your classic A-B experiments where they, like, try a headline one way, try it another way, see which one people click on, and then, like, use that one.

[00:07:25] So they were, like, masters of using science and data and analysis to, like, optimize headlines.

[00:07:31] But when we analyzed those 22,000 experiments, we found, if anything, the exact same news story when it had a negative headline was generating more engagement.

[00:07:40] It was getting more clicks.

[00:07:42] And if it had a positive headline, it was getting fewer clicks.

[00:07:44] There are a couple of interesting nuances in your data.

[00:07:48] First of all, those data came from a particular period in time when, to your point, the internet was operating based on a set of algorithms that kind of fed that.

[00:07:59] But at some level, it seems to me that the dynamic hasn't changed even though the algorithms have.

[00:08:05] So we have another paper where we analyzed about 3 million news stories and posts by political leaders on Facebook and Twitter before it was X.

[00:08:15] And what we found there was a couple of interesting things.

[00:08:19] One was that when people spread negative news about the political outgroup, when they shared that content, it was more likely to get spread.

[00:08:27] It got shared about 67% more.

[00:08:29] And that's one of the single most effective strategies we found for making things go viral: just spreading bad news about the other guys.

[00:08:37] So that's probably where they're coming from: just a data-driven, probably not some top-down, you know, evil strategy about how to make people mad all the time.

[00:08:45] If anything, internal documents from Facebook show mostly they were trying to do the opposite.

[00:08:49] They're trying to, like, create community and stuff like that.

[00:08:51] But they were just like – their algorithm is amplifying based on engagement.

[00:08:55] And that's the thing that was driving engagement.

[00:08:57] What's troubling about this is that we're only measuring short-term engagement.

[00:09:00] And in the long term, what a lot of people did is what I did, which is to say I don't want to get my news through social media because it just makes me mad or sad.

[00:09:11] I used to be actually like a techno-optimist.

[00:09:13] I used to think a handful of people in Silicon Valley, if they wanted, could, like, snap their fingers and change something and make the world a better place, like, at a level of scale that we never have had in human history.

[00:09:24] And I often thought that they would get this data and hear these stories and want to do more of that.

[00:09:29] And I actually am pessimistic now.

[00:09:32] I don't think they do, because they found what buttons to press on the dark side of human nature, and then we just get more of it.

[00:09:38] But again, these are, like, short-term things.

[00:09:40] They're making more profit in the short term.

[00:09:42] But if they erode democracy to the level of massive intergroup conflict, then maybe they're going to, like, damage all the institutions and societies and communities that they actually care about.

[00:09:53] Well, I think from my conversations with a lot of those folks, I think I've landed somewhere in the middle.

[00:09:59] They're trying not to destroy their business.

[00:10:01] They're also trying not to destroy democracy.

[00:10:03] And so what they've been testing a whole bunch is, like, okay, can we, for example, figure out some middle ground solutions?

[00:10:12] And one of those is deprioritizing political news.

[00:10:15] I've seen some evidence that political news is much more negative than other topics.

[00:10:19] I think one example, probably on the other end of that spectrum, is actually science news, which skews very positive.

[00:10:25] People are fired up.

[00:10:26] They're inspired.

[00:10:27] They're excited when there's a new discovery.

[00:10:29] Most of the uncivil content that is spreading, including the stuff that's not just negative but has a lot of misinformation, is heavily in the political domain.

[00:10:38] So there was a big paper in Science Magazine where they analyzed misinformation and found that it spreads further and faster than true information.

[00:10:45] Oh, this is the Sinan Aral paper, right?

[00:10:46] Yeah, yeah.

[00:10:47] And if you look carefully at their data, they found a couple of moderators.

[00:10:51] So a couple of factors that seem to be driving this.

[00:10:53] One is stories that are loaded with emotion.

[00:10:55] So a lot of stories that have a lot of emotion tend to be in the political domain.

[00:10:59] Part of it is our political leaders, as I was saying.

[00:11:01] Like, this is data from Rob Willer's lab showing that they've become less and less civil and more and more hostile over time in the last, like, decade or more.

[00:11:10] And so these people have massive, massive following.

[00:11:13] Some, you know, when Trump was on Twitter, he had, like, 90 million followers.

[00:11:16] And most other political leaders have millions of followers.

[00:11:19] And they set norms.

[00:11:20] They establish, like, the tone of the conversation around certain issues.

[00:11:23] Okay, now I'm going to tell you an optimistic thing.

[00:11:25] So this is a paper that's unpublished.

[00:11:27] We're about to submit it in a couple weeks.

[00:11:28] It's led by a postdoc of mine, Steve Rathje.

[00:11:30] And we decided to take a scalpel to this problem.

[00:11:33] One thing that we've noticed is that a lot of this negativity online is driven by a small number of accounts.

[00:11:40] And so we're talking about the spread of anti-vaxxer misinformation.

[00:11:44] Like, on Facebook, there was the disinformation dozen, which was, like, a dozen accounts that created 50% of the misinformation for the whole platform.

[00:11:51] And then it gets amplified by influencers and so forth.

[00:11:54] But it's really a small number of people.

[00:11:55] So what we did was we identified the most polarizing accounts on Twitter or X.

[00:12:01] And we paid people to unfollow them for a few months.

[00:12:05] And in another condition, we paid people to follow some science accounts, like NASA, where it's, like, you see those amazing images, like the James Webb telescope.

[00:12:13] This is the most powerful intervention we've ever run in my lab.

[00:12:16] First of all, it reduces a sense of partisan animosity because you're not seeing this nonstop stream of, like, the most negative people.

[00:12:23] And the negative people, I forget what the exact figure is, but it's something like 10% of people share 97% of social media posts about politics.

[00:12:33] So when you're talking about politics as really negative, it's really only 10% of the people who are driving it.

[00:12:38] And then those 10% of the people, on average, are the people at the extremes.

[00:12:41] So if you unfollow a couple of those people and replace it instead with following some science accounts, people feel happier.

[00:12:48] Here's the other cool thing.

[00:12:49] The effects lasted for six months.

[00:12:51] And I've never seen that before.

[00:12:52] And here's why.

[00:12:54] Once they unfollow just a handful of these accounts, once the study's over and we say you're allowed to follow them again, they don't.

[00:13:01] They realize, I think, that they're, like, enjoying it more without those handful of accounts.

[00:13:07] And I'm liking these science accounts, so I'm just going to keep it as is.

[00:13:10] Every once in a while, I'll share a study that upsets people.

[00:13:14] And then I notice a bunch of people unfollow me.

[00:13:17] I'm like, listen, like, you just shot the messenger here.

[00:13:20] Like, the fact that I raised a hard question or you disagreed with the finding of the rigorous research that I thought was worth knowing, even though I didn't like the result of it either.

[00:13:31] Like, I don't think that is a good reason to unfollow someone.

[00:13:34] You should unfollow if their posts are disrespectful, if they're consistently spreading lies or bullshit, or if their content just ruins your day.

[00:13:41] And it makes me wonder, should content creators have positivity ratios in mind?

[00:13:47] Should journalists have positivity quotas?

[00:13:50] How do you think about that dynamic?

[00:13:52] Okay.

[00:13:53] That's a great one because I experience the same thing as you but probably on a much lesser scale.

[00:13:57] And sometimes I just delete it because I just, here's the difference.

[00:14:01] I think you're a little bit like me, but I'm probably even more sensitive.

[00:14:03] Like, if I share something controversial, I'll, like, scale back the controversy for a week just for my own psychological well-being.

[00:14:10] Sometimes I don't know it's controversial when I post it.

[00:14:14] I kind of start from the default assumption that the whole audience who's chosen to follow me knows that intellectual integrity is one of my core principles.

[00:14:23] And I'm not going to bother to share something unless it's cleared my high bar of, I think this is credible information from credible experts.

[00:14:31] And, like, every once in a while I sort of have this jolt of, like, someone accuses me of just posting research to get clicks.

[00:14:39] And I'm like, that goes against my core values as a science communicator, which is, I'm not just going to share what makes you feel good.

[00:14:47] I want to share what makes you think hard.

[00:14:49] I think it matters to know this information.

[00:14:52] It's not fun, but it's meaningful or it's a helpful perspective.

[00:14:55] And I think that requires a long view from the audience, but it also requires me as a content creator and a communicator to be clearer about the principles behind what I share.

[00:15:04] When I use social media, it's like your diet.

[00:15:07] I think some people just want, like, the junk food diet version of it.

[00:15:11] And others of us, like, actually make an effort to use it as a way to make ourselves smarter or more informed about the world.

[00:15:17] Sometimes I'll share a study and I'll have other scientists, like, critique it or debunk it.

[00:15:20] And that actually informs the way I think about that research.

[00:15:23] And I might not cite that in the future because I didn't know of a critique of it that they brought up.

[00:15:27] Or they might connect me to another paper that I hadn't read that kind of goes deeper.

[00:15:32] And so there's also a conversational thing where I feel like if I'm having those conversations, even if I'm wrong in what I share or I'm interpreted the wrong way, I feel like I'm getting smarter about it.

[00:15:42] And so it's kind of like that's like a healthy diet.

[00:15:44] We need a badge for social media for science communicators that's the Rawlsian veil of ignorance only applied to research methods as opposed to justice.

[00:15:54] I share content if I think the methods were high quality, regardless of the conclusions.

[00:16:01] Yeah.

[00:16:01] Yeah.

[00:16:02] As a scientist, all you care about is, like, the study was rigorous and then we'll update our thinking and then we'll, like, get better.

[00:16:08] And guess what?

[00:16:09] If we do solve some kind of bias in society, that's great.

[00:16:12] Let's talk about how to uncondition.

[00:16:16] That's not a thing.

[00:16:18] I guess you can do extinction of conditioning, right?

[00:16:21] Extinguish, yeah.

[00:16:21] Yeah, extinguish.

[00:16:23] So you mentioned Upworthy earlier.

[00:16:25] One of the things I've really loved about their work is they have made a concerted effort to try to shift the balance and say, we want to be an organization that elevates people.

[00:16:34] And I think they've done some very clever things around activating positive emotions like joy, inspiration, curiosity, hope, gratitude, pride.

[00:16:47] I thought it'd be interesting to get you to analyze a few of their success cases.

[00:16:50] I've enlisted an expert in my house to help with this.

[00:16:54] My daughter, Joanna, is actually interning with Upworthy right now.

[00:16:58] And she's picked some examples for us.

[00:17:01] Joanna, can you hear us?

[00:17:02] Yeah.

[00:17:03] Do you want to come over so we're recording you on my mic, too?

[00:17:06] Hi, Joanna.

[00:17:06] Nice to meet you.

[00:17:07] I'm Jay.

[00:17:07] Hi.

[00:17:08] Nice to meet you, too.

[00:17:09] One person said, I need the people to know that Olympic silver medalist Giorgia Villa is sponsored by Parmesan cheese and regularly posts pictures of herself with giant wheels of cheese.

[00:17:20] I love the wheels of cheese in that photo.

[00:17:22] You want to hold it up so Jay can see it?

[00:17:24] Well, I saw her go viral, yeah.

[00:17:26] And I loved those pictures of the giant Parmesans.

[00:17:29] Okay, so here's some variables that predict why that works.

[00:17:32] First of all, one of the things that generates attention in the attention economy of the online universe is surprise.

[00:17:39] And it's very surprising.

[00:17:41] I didn't even realize they were sponsored by cheese companies.

[00:17:44] The other thing that caught my eye was the caption.

[00:17:47] Upworthy wrote, that's big Parma for you.

[00:17:52] I thought it was hysterical.

[00:17:54] Everything's better when you have comedy.

[00:17:56] Okay, Joanna, you want to do another?

[00:17:57] This guy's parents put on a cousin camp every summer for the grandkids.

[00:18:02] And for the past eight years, they've invited all the grandkids ages five plus to spend four days on the farm away from parents with their cousins.

[00:18:10] So they go all out and they make it feel like a real summer camp where they turn the bedrooms into cabins.

[00:18:16] And they say the Pledge of Allegiance.

[00:18:17] There's arts and crafts, kitchen duty, talent shows, music, time swimming, campfires, field trips, even sports, anything.

[00:18:24] So you have a couple themes that are important to people is family.

[00:18:28] It's like a beautiful family thing and the grandparents staying involved with the kids' lives.

[00:18:32] I also thought that was actually a story that might have resonated with people because it's useful.

[00:18:36] Like a lot of grandparents or parents might see that and get an idea to do something similar.

[00:18:41] So I also thought that that was one that connected that way.

[00:18:44] It also shows love.

[00:18:45] Love is like one of those really strong positive emotions.

[00:18:47] And we found in terms of moral emotions tend to go viral.

[00:18:51] And most of them are negative, like outrage and disgust.

[00:18:53] But one that does go viral is love.

[00:18:56] And so it's kind of like a show of love and dedication for the grandparents to spend all year prepping this and doing this for their grandkids.

[00:19:03] We ran a study like with 500 Americans.

[00:19:07] And we asked them what types of things do go viral.

[00:19:09] And they all say like misinformation, outrage, negativity.

[00:19:13] And we say what types of things do you want to go viral?

[00:19:15] And overwhelmingly they say they want stories like this.

[00:19:18] They want stories that are heartwarming, that show pro-sociality, that show something positive and uplifting.

[00:19:23] And so there is this strong stated desire by people that they want way more of this in their lives.

[00:19:29] And they want social media to feed them this.

[00:19:31] And they just feel like they're not getting it.

[00:19:32] So I think people are also feeling a little starved for stories like this.

[00:19:35] I think there needs to be a good news podcast where you just analyze uplifting stories over and over again.

[00:19:41] Maybe the closest thing is like the Happiness Lab by Laurie Santos at Yale.

[00:19:45] It was just a happiness class and like what you can concretely do to be happier.

[00:19:49] And it's, I think, the most popular class in the history of the university.

[00:19:52] So there's definitely like a deep, deep desire, especially among young people who are super anxious and depressed right now.

[00:19:58] They want stuff like that.

[00:20:04] You ready for lightning?

[00:20:06] Yep.

[00:20:06] Okay.

[00:20:07] Okay.

[00:20:07] First question is, what is something you've rethought lately?

[00:20:11] Well, obviously I've been rethinking how to make social media better.

[00:20:15] I used to think you have to change the algorithms and all that.

[00:20:17] And now I'm thinking that a small number of people are driving most of the problems.

[00:20:21] And we need to figure out how to navigate that issue.

[00:20:24] I think that's very true.

[00:20:26] It reminds me of the Michael Bang Petersen work on how trolls use aggression to get attention.

[00:20:30] If you have one jerk standing on the corner yelling at everybody, it's hard for them to cause a lot of harm.

[00:20:35] But online they can be harassing hundreds or thousands of people all day or spreading misinformation to huge audiences.

[00:20:42] And then the other thing about Michael Bang Petersen's work that I thought was really interesting is it's high status seeking people who are doing this.

[00:20:49] And so what has happened is we've given status to people who do this.

[00:20:54] We need to change what we give status to online.

[00:20:56] And we need to stop giving status to those folks and find a way to give status to the people who are shipping their puffin sweater to strangers.

[00:21:03] Please make that your next project.

[00:21:05] Okay.

[00:21:06] The world needs you, Jay.

[00:21:08] What is the worst advice you've ever gotten?

[00:21:13] The worst advice I ever got was not to go to college.

[00:21:15] I grew up in a blue collar town.

[00:21:18] My dad tried to convince me not to go to college and work for his construction company.

[00:21:21] And very shortly thereafter, it went bankrupt.

[00:21:24] So thank God I didn't take this advice.

[00:21:27] What's a hot take you have?

[00:21:28] An unpopular opinion.

[00:21:30] I have so many unpopular opinions.

[00:21:32] Okay.

[00:21:32] There's huge conflict in society and on social media around age.

[00:21:38] It was millennials and boomers and OK Boomer.

[00:21:40] And now it's everybody seems to be mad at Gen Z.

[00:21:42] I tend to think that a lot of that is false conflict.

[00:21:47] That the differences between generations are real, but they're really small.

[00:21:51] And that the moment we start to categorize people into groups, we create the us-them dynamics that we can create in the lab by assigning people to a blue team or a red team.

[00:22:00] Once we start to put a label and attach stereotypes to it, we create intergroup conflict where it doesn't need to be.

[00:22:06] Oh, that may be an unpopular opinion, but it's actually one that I hold strongly.

[00:22:11] Oh, do you?

[00:22:11] I think you're exactly right there.

[00:22:13] Okay, good.

[00:22:14] Exactly right.

[00:22:14] I mean, it tracks for me with the Brent Roberts et al. findings that when people called millennials Generation Me, that was actually more of an age and life-experience effect than a cohort difference.

[00:22:26] I read Jean Twenge's book Generations, which I thought was excellent and super compelling to me.

[00:22:31] But like there is more individualism, but it's not just one generation.

[00:22:34] It's just increasing individualism over time.

[00:22:37] And it's correlated with technology use, which just affords us more and more individualism.

[00:22:41] But it's the cutoffs and the labels and how people perceive those labels to me is it's like zodiac signs or astrology or something like that.

[00:22:50] What's a prediction you have for the future 20, 30, 50 years down the road?

[00:22:55] This is like my darkest one.

[00:22:57] It's around climate change.

[00:22:58] In our book, we identified three problems that are going to be related to identity.

[00:23:01] And you think of identity as ways to like find solutions and fix it.

[00:23:05] One was on threats to shared national identities.

[00:23:07] You're going to have more and more of that.

[00:23:08] Other one was on inequality.

[00:23:09] When there's massive inequality, people start to identify with their economic group.

[00:23:15] And in organizations, when there's huge inequality in salary, people no longer identify with the organization.

[00:23:20] So inequality erodes social cohesion in ways that are challenging for society.

[00:23:25] And then the last one is climate change.

[00:23:27] It's that if we don't build some identity that's bigger and we don't care about ourselves as humans and that highest level identity,

[00:23:34] then there's going to be no way to solve it because I think it requires us to care about one another and people in different countries who are going to suffer worse than we are.

[00:23:42] And so that's my biggest concern.

[00:23:44] I actually don't think, I think we're capable of it and we're getting better at it, but I don't know if we're going to solve it.

[00:23:51] And it's like I have kids and I'm terrified about what their future looks like with another 10, 20 years of climate change.

[00:23:58] I really hope you're wrong on that one.

[00:24:00] Me too.

[00:24:00] I desperately hope that someone's listening to this in 20 years and saying, God, he had that wrong.

[00:24:08] What is a question you have for me?

[00:24:10] I would say that you are the most influential person probably that I follow because I don't really follow a lot of politicians.

[00:24:16] And one thing I've noticed with you, you'll share something from science and then it seems like you're very good at also drawing a prescriptive piece of advice from it.

[00:24:26] I think that's like a special skill that you have in your books and your social media.

[00:24:30] So I'd love to know from you, like how you find stuff that you think is useful for people.

[00:24:36] Well, thank you.

[00:24:37] First of all, I used to say, okay, basically I want to share what's either interesting or useful for people.

[00:24:46] And I found I had a really hard time gauging that because what I find interesting is not always interesting to other people.

[00:24:53] If I were to draw a big circle of all the things I find interesting and then put a tiny dot in it, that dot is the share of those things that are interesting to my audience, I've discovered.

[00:25:05] Yeah.

[00:25:06] And also, like it's really hard to gauge what's useful for other people because what helps me is not necessarily what's going to help everyone else or anyone else.

[00:25:16] So I've learned a lot actually through posting things that didn't resonate.

[00:25:21] And I think the first thing I especially tell people who are venturing into more public communication is you should post a lot, because you're going to run a lot of poorly controlled A/B tests and you will start to see patterns.

[00:25:36] And as I've done that, one of the things I've learned is the heuristic that serves me better than do I think this is interesting is do I want to share this with other people that I know?

[00:25:47] Oh, interesting.

[00:25:48] I'm sure you also get a lot of daily journal alerts for studies and new papers.

[00:25:54] And when I get those alerts, I used to say, oh, like if this study blew my mind, I'm going to post that.

[00:26:00] And now I don't do that.

[00:26:01] I put it in a Word doc.

[00:26:03] There's a doc of every study I've ever found interesting.

[00:26:06] And when I'm looking for something to post, I go into it.

[00:26:09] And my first test now is, is there someone specific I want to share it with?

[00:26:14] And then the second test is, are there more people that come to mind?

[00:26:17] And if it's more than a few people, I'm like, okay, now this is interesting or useful to others.

[00:26:23] That's my strategy.

[00:26:24] What do you make of it?

[00:26:25] I think it's like a great strategy for just collating everything interesting.

[00:26:28] And then the things that might be interesting right now that are very timely because they're yoked to the news, you've removed yourself from that by only posting them when you go back later.

[00:26:37] So do they survive the test of: it's interesting right now, and four months later when I open this, is it still interesting?

[00:26:42] Those are things that are probably more like evergreen or enduring.

[00:26:44] The other thing is you do a theory of mind exercise where you start to imagine other audiences.

[00:26:50] And I think like that, that sounds really useful.

[00:26:52] I always think like that's the key to communication is whether you're giving a talk or writing a paper or an op-ed or anything is like stepping outside yourself.

[00:27:02] The only thing I'd push on, that I would ask you about, is: how do you avoid...

[00:27:06] So one thing I noticed from people on social media, they end up getting what I would call audience capture.

[00:27:12] That they end up getting like more and more extreme audiences and post more and more extreme content until they're posting like paranoid conspiracy theories.

[00:27:19] And they've gone down this like weird rabbit hole.

[00:27:21] I've seen this happen for several people.

[00:27:23] How do you avoid doing that?

[00:27:25] What makes you think I've succeeded?

[00:27:28] Well, I haven't seen you post anything crazy yet.

[00:27:30] Good, good.

[00:27:32] All right.

[00:27:32] Please tell me if you ever do.

[00:27:33] Okay.

[00:27:34] It's the same problem we face as writers, which is you don't want to completely ignore the reader, but you don't want to be a slave to the reader either.

[00:27:42] This is actually why I try to think of who do I want to share this with.

[00:27:45] I'll read a study and almost always like the first person I want to share a study with is a fellow academic.

[00:27:51] Yep.

[00:27:52] I actually do send it to that person or two.

[00:27:55] And if I notice that I can't find somebody who's extra critical to send it to, I'm like, okay, maybe there's a problem here.

[00:28:02] Or if I notice that all the people I'm sending it to are people who are just enthusiastic consumers of all knowledge and they don't have strong filters, then that's a red flag for me too.

[00:28:13] And so I pay a little bit of attention to: would I send this to the person that I know eviscerates ideas, or that I know actually thinks psychology is, you know, kind of junk science?

[00:28:25] And if so, all right, I feel much better about that.

[00:28:28] Oh, good.

[00:28:28] Okay.

[00:28:29] I like that, like thinking about your critics in advance and making sure it's above that threshold.

[00:28:33] It's like a good, like quality control probably.

[00:28:36] Sometimes.

[00:28:37] And other times you just cannot anticipate why a study will make people angry.

[00:28:41] Yeah.

[00:28:42] There's different tribes, you know, in each place with different vibes.

[00:28:46] Perfect segue to our final discussion topic.

[00:28:50] I went to the Olympics for the first time in Paris.

[00:28:53] I took two of our kids and we went to an Olympic basketball game.

[00:28:57] It was Team USA against Serbia, semifinals.

[00:28:59] And it was crazy to watch how far behind we fell before we came back and won.

[00:29:06] But even crazier was when I heard half the stadium booing an American player.

[00:29:12] And I watched that happen.

[00:29:14] The moment he touched the ball, the entire arena, boo.

[00:29:18] And it just felt completely counter to the spirit of the Olympics.

[00:29:25] I've always thought the whole point of the Olympics is we're not just going to elevate excellence.

[00:29:30] We're going to celebrate excellence no matter who is doing great things, no matter what country they're from.

[00:29:36] And of course, you're going to root for your own, but you don't root against your opponent.

[00:29:39] We want to build solidarity and peace.

[00:29:42] And I thought of this research on the Olympic paradox.

[00:29:45] And I saw that you also posted on this.

[00:29:48] So we clearly were having similar reactions here.

[00:29:51] What we hope from the Olympics, obviously, is that there's like this Olympic spirit, which is transcendent, which is unifying across countries.

[00:29:58] But what often happens in practice, and this is what the research finds, is that people's national identities get activated.

[00:30:05] And when that happens, there can be animosity towards competitors, towards outgroup members.

[00:30:10] Or in this case, if I'm remembering, was this Joel Embiid?

[00:30:13] Yes, it was.

[00:30:14] It's because he could have played for the French national team.

[00:30:17] His national identity would have allowed him to, and he chose not to.

[00:30:20] By the way, the people who are treated worst in society are traitors.

[00:30:24] And we have terrible terms for these in organizations, right?

[00:30:27] Like devil's advocate.

[00:30:28] You'd think that we would understand the value of dissent, but we give it like literally the worst name on earth.

[00:30:34] He was seen as being a traitor to France, and it's mostly French fans.

[00:30:38] And so that's something that transcended it.

[00:30:41] They were thinking about their national identity, and he could have helped them.

[00:30:44] He's one of the best players on earth.

[00:30:45] So I think like that's the unintended effect of the Olympics.

[00:30:50] But it makes complete sense because that happens in every competition on earth.

[00:30:55] The fan rivalries get intense and sometimes border on violence.

[00:30:59] But there's often incredible acts of transcendent identity and Olympic spirit.

[00:31:06] One of the most beautiful ones in this Olympics, there was this great picture.

[00:31:09] I think it was Simone Biles who had won every Olympic gold forever and ever.

[00:31:12] And then she got, I think, silver in one competition.

[00:31:16] I forget if it was like the floor routine or something.

[00:31:18] And this woman, I think she was from Brazil, Andrade.

[00:31:21] I'm trying to remember her name.

[00:31:22] Yeah, Rebeca.

[00:31:22] Yeah, and she was amazing.

[00:31:23] And then Simone and the other American who got bronze bowed down to her on the podium.

[00:31:28] And I've never, ever seen that before.

[00:31:31] And this is like Simone Biles, like the GOAT, one of the best athletes of all time, period.

[00:31:34] And obviously at the Olympics, but anywhere.

[00:31:37] And then for her to lose and do that is just a sign of incredible sportsmanship and a sense that you're cheering on other athletes and there's something bigger here that you're all part of, and it's beautiful.

[00:31:50] And you see that at every Olympics.

[00:31:52] There's always examples of athletes picking up someone who's about to faint before they cross the finish line, some other athlete running by who kind of carries them to the line.

[00:32:00] The competition of the Olympics happens within a broader cooperation.

[00:32:05] To have good competition, you require cooperation: that we all play by some rules, that we're not going to harm one another, that we're all in this together.

[00:32:13] We all benefit from this cooperative environment of the Olympics.

[00:32:17] What I found so powerful about seeing this up close at the Olympics was this was not just the athletes.

[00:32:25] The fans were like this too.

[00:32:27] We're in the pool and two different divers failed dives, which is extremely rare at the Olympic level.

[00:32:33] And the entire crowd was cheering for them to bounce back.

[00:32:37] The whole crowd, like not just the fans of their countries.

[00:32:40] Everyone wanted to see them succeed.

[00:32:42] I think we've been missing that.

[00:32:43] We often ignore that.

[00:32:45] Guess what?

[00:32:46] It's not as interesting to people, even though I think it's more beautiful and profound and useful for humanity.

[00:32:51] This was something we tried to capture in our book, because a lot of people have focused, in the last five or ten years of American discourse, on tribalism and the downsides of partisan identities and other types of identities and identity politics.

[00:33:02] And no question, there's a long history of intergroup conflict and violence that is linked to that.

[00:33:09] But there's tons of evidence, over and over again, study after study, example after example, of how identities and groups can bring us together.

[00:33:19] We have to focus more on harnessing these healthy parts of identity and understand they're there for a lot of people.

[00:33:25] And then maybe make norms so that the other part, the part that's potentially dangerous or violent or xenophobic, is not part of who we are when we talk about our identities.

[00:33:36] Yes.

[00:33:37] We need patriotism without nationalism.

[00:33:40] Yeah.

[00:33:41] Just because you identify with the group doesn't mean you're going to discriminate against outgroups.

[00:33:44] If the norm is inclusivity, the more you identify with the group, the more inclusive you become.

[00:33:50] And the more you embrace other people and difference in other groups and cooperate more.

[00:33:54] And so it's really about like identity is not a bad thing as long as we have healthy norms.

[00:33:59] Yes.

[00:34:00] Well, Jay, you are just so full of knowledge that I could ask you questions all day.

[00:34:04] And I think that you're a rare psychologist who is as insightful and brilliant in doing research as you are in communicating it.

[00:34:14] And I really admire that.

[00:34:16] Thank you so much.

[00:34:17] The feeling is mutual, Adam.

[00:34:20] This Olympic discussion reminds me that we need more patriotism and less nationalism in the world.

[00:34:27] Patriotism is taking pride in your country.

[00:34:29] Nationalism is being hostile toward other countries.

[00:34:32] In-group solidarity does not require out-group prejudice.

[00:34:36] You can love your people without hating others.

[00:34:43] ReThinking is hosted by me, Adam Grant.

[00:34:45] The show is part of the TED Audio Collective.

[00:34:47] And this episode was produced and mixed by Cosmic Standard.

[00:34:50] Our producers are Hannah Kingsley-Ma and Aja Simpson.

[00:34:53] Our editor is Alejandra Salazar.

[00:34:55] Our fact checker is Paul Durbin.

[00:34:57] Original music by Hansdale Hsu and Allison Leyton-Brown.

[00:35:00] Our team includes Eliza Smith, Jacob Winik, Samiah Adams, Roxanne Hai Lash, Ban Ban Cheng, Julia Dickerson, and Whitney Pennington Rodgers.

[00:35:09] For more uplifting news, check out Upworthy's new book.

[00:35:12] It's called Good People, Stories from the Best of Humanity.

[00:35:18] This is my favorite news story every year in the Olympics: at the Olympic Village, they almost always run out of condoms.

[00:35:25] What's going on at the Olympic Village is you have all these super hot, ripped athletes hooking up with one another, like the day after their event's over, right?

[00:35:33] Because they can party and cut loose.
