
Brain Rot Emergency: These Internal Documents Prove They’re Controlling You!

The Diary Of A CEO

0:00

You are actively rewiring your brain for the worse by engaging with social media's high-volume, quick videos.

0:05

And the social media executives don't let their kids use this stuff because they designed it to be addictive. And they know that millions and millions of kids have been cyberbullied, sextorted. Many have committed suicide. So I'm getting angry.

0:17

And then from the medical perspective, it's rewiring your body,

0:20

increasing your risk of heart disease and PTSD. We've moved too far into the virtual world and the results are catastrophic.

0:27

People are spending roughly six and a half hours a day on their phones. What do we do about this?

0:32

Well, here's the amazing thing. We actually can control our fate.

0:36

So we are joined by a social psychologist

0:38

and a Harvard physician to dive into the technology addiction and brain rot crisis billions are facing worldwide.

0:44

And how we can counter its devastating mental health effects.

0:48

You have to reclaim your attention, because without the ability to pay attention for several minutes at a time, we're seeing the destruction of human potential, of human relationships, of connection.

0:58

But there's all these small tweaks that you can do to override that primal urge to scroll. For example, 91% of people had an improvement in attention, well-being, and mental health after just two weeks of continuing to use their device but not having internet access.

1:11

Next, keep your phone out of arm's reach, because the sheer potential for distraction has actually been shown to change your prefrontal cortex, a phenomenon called brain drain.

1:20

So yes, we should exert more self-control, but we're being pushed into addictive apps, and it's messing us all up. That's not our fault.

1:26

Would you advise people to delete these short-form video apps?

1:29

Oh my God, yes. That would be the most important thing you can do for your intelligence and for humanity. But if I was going to offer some specific advice, here are the three things that I do. So first... I wanted to ask you guys what you thought of this.

1:46

Hey, you're back.

1:47

This terrifies me.

1:50

♪♪

1:54

Guys, I've got a quick favor to ask you. We're approaching a significant subscriber milestone on this show, and roughly 69% of you that listen and love this show haven't yet subscribed for whatever reason. If there was ever a time for you to do us a favor,

2:07

if we've ever done anything for you, given you value in any way, it is simply hitting that subscribe button. And it means so much to myself, but also to my team, because when we hit these milestones, we go away as a team and celebrate.

2:17

So hit the subscribe button. I won't let you down, and we'll continue to find small ways to make this whole production better. Thank you so much for being part of this journey. It means the world, and yeah, let's do this.

2:36

♪♪

2:39

Jonathan, Aditi. Jonathan, I've heard you say that the destruction of attention is the largest threat to humanity that's happening around the world. And I've also heard you say that short form videos are the worst of the worst

2:53

because they're shattering attention spans. The reason why I wanted to have this conversation today is somewhat personal. And in fact, all of the conversations I have in the diary obviously are somewhat personal to some degree.

3:04

They're inspired by some unanswered question I have in my head, and also some observation I have in my life. And the observation I've had is that short form videos in particular are making my life worse. And actually I've got to say the catalyst moment

3:20

really where I thought, you know, I need to get exceptional people together to have this conversation, was when I looked at my screen time and saw a huge change. I felt so much worse, because all these social platforms have short-form video now. And then I actually heard Elon Musk, who, you know, has a social media platform that does short-form video, say that he thinks it's one of the worst inventions for humanity. Jonathan, why did

3:41

you say what you said about short-form video and this corruption of attention?

3:46

Because I wrote a whole book called The Anxious Generation, focusing on teen mental health. That was the mystery that popped up in the mid-2000s. Why are people born after 1995 so much more anxious and depressed? And I've been tracking down that mystery, and a lot of it points to social media and especially Instagram, social comparison, all the things we know about social media. When the book came out in 2024,

4:10

what I realized since then is that I vastly underestimated the damage, because I focused on mental health, which is a catastrophe. But the bigger damage is the destruction of the human ability to pay attention. Without the ability to pay attention

4:25

for several minutes at a time, ideally 10 or 20 minutes at a time, you're not gonna be of much use as an employee. You're not gonna be of much use as a spouse. You're not gonna be successful in life. And that's when I realized this is way beyond mental health.

4:40

This is changing human cognition, changing human attention, and possibly on a global scale.

4:47

Aditi, what perspective do you come at this from and what's been your perspective through all the work you've done about brains and stress and neuroscience and all these kinds of things that has shaped the way that you think about social media, screen time, short form video?

5:01

My background is that I'm a physician at Harvard, and my expertise is in stress, burnout, and mental health. And so that is the lens that I view all of this through. We know that the most deleterious relationship that you have is with your device. You know, in every healthy relationship, we have boundaries.

5:20

We have boundaries with our kids, our parents, our colleagues, with our friends. And yet we have no boundaries, or often porous boundaries, when it comes to the relationship we have with our device. So it's not so much about, you know, becoming a digital monk and renouncing technology, because technology can serve us, right? It inspires, educates, connects.

5:42

Now more than ever, it's so important to be an informed citizen, but not at the expense of your mental health. And so what Jonathan was saying, this constant being engaged with your devices, with social media, the scrolling from the minute you wake up until you go to bed, there's a reason why you have your best ideas in the shower. And that's because that's the only place in the whole day where you are not with your

6:03

device. People take their device to the bathroom, they sleep with their device, they eat with their device, people walk down the street with it. There are more near-miss pedestrian accidents because people are looking at their devices while they're crossing the street. And so there's all of this brain biology at play behind the scenes. So both of you have talked about how it doesn't feel good to engage and constantly be on your

6:26

phone, that sense of infinite scroll. But there is, you know, it feels like you're doing nothing. You're just doing this, right? What are you doing? But in fact, it is not passive, it is active, and it has a profound effect on your biology, on your brain, on your psychology, and also social factors that I hope we talk about today.

6:45

You know, scrolling, wasting a bit of time doesn't seem so harmful. What is the big, if we play this for 10, 20, 30 years, what is the big risk or threat?

6:55

The biggest threat right now, we don't even have to wait 20 years, is that through a process called neuroplasticity, which is just a big fancy word that simply means that your brain is like a muscle, is that by engaging with social media, that stream of high-volume, low-quality, quick videos,

7:10

you are actively rewiring your brain for the worse. So you're increasing your sense of stress, and worsening your mental health, attention, cognition, distractibility, irritability, complex problem solving. All of that changes when you engage in that infinite scroll.

7:28

I'd like to add on here because one of the main arguments I get is, ah, this is what they said about television. Ah, this is what they said about comic books. This is just another moral panic. But people need to understand why touchscreen devices are so different from television. And so parents find this helpful if I just lay this out briefly,

7:45

good screen time versus bad screen time. So humans are storytelling animals. We have always, as long as we've had language, we've raised our kids with stories, epic poems, all kinds of stories. Stories are good.

7:59

The human brain needs lots of patterns. The child's brain needs lots of patterns to develop. So the worst thing you can do is hand your child a device because they're crying for it because they've been trained to get it and you're busy. So you hand them the device, they're quiet.

8:13

What's happening? They're sitting alone. Not like when I was a kid, when we always watched with my sisters, with my friends, and you're arguing about it. And it's never long stories. It always ends up at YouTube Shorts or TikTok or Instagram Reels for older kids.

8:30

So they're doing this, but here's the key thing that it does that a television does not. A television puts you in a state that psychologists call transportation. You get into a story and you find yourself pulled in and you're rooting for the characters.

8:43

And this is how a brain gets tuned up to social patterns, but it can't happen in 10 seconds. It can't happen in one minute. It takes a long period of time. And there is no reinforcement. There is no, the television doesn't do anything to you.

8:59

You don't have any response. Whereas a touchscreen device is a Skinner box. So B.F. Skinner was one of the founders of behaviorism, and he put rats and pigeons in a box where he could deliver reinforcement, a little grain of food, on a schedule,

9:14

and by giving them quick reinforcements for behavior, he could train them to do amazing tricks in just a few hours. When you give your kid a touchscreen device, it's stimulus, response, swipe, get a reward or not, variable ratio, and you just keep doing that.

9:29

So, as Aditi said, it is rewiring your brain. It's not just wasting time. It is literally training you, in a way that television never did. So this is a whole new game.

9:40

And to add to that, you know, from the medical perspective, you're shortening that attention span. And what happens over time is, like Jonathan said, you're not sleeping as well, because you are engaged with your device. We know that 80 percent of people are checking their phones within minutes of waking up. We have something called revenge bedtime procrastination.

9:58

This concept of, you know, at the end of the day, you're fatigued, you've had a long day, you've had no me time, and you want to get to bed early. We all know what the data says, by the way; we've been taught since we were little kids, right? Bedtime, sleep is important, it's good for your body, it's good for your brain. And we might have all the knowledge in the world, but in terms of action, there's a wide gap between the two. So what happens? You have that decreased attention, you have that irritability, hypervigilance, and so at the end of the day, it's 9 p.m., and, you know, if you're a parent, your kids are asleep, your kitchen is clean, maybe you've finished your entrepreneurial day, and you finally sit down with Melanie on the couch and you're like,

10:38

ah, some me time. And you know you want to get to bed early, and you know it's good for you. But then suddenly you're scrolling, and before you know it, it's 2 AM. And you're saying, oh my god, what happened? Why am I still awake? What was I doing all this time? What happens is that you essentially give yourself some me time at night.

10:54

And so you procrastinate bedtime. And what happens with this revenge bedtime procrastination is that it affects your sleep. And when you don't have good quality sleep, so you have difficulty falling asleep, staying asleep, that sleep debt over time has all sorts of ramifications, for kids and for adults. So this is just the tip of the iceberg, this short-form video content, and the ripple effects go far and wide. Not only is it rewiring your brain, it's rewiring your body. It is affecting your sleep, which increases your risk of heart disease later in life.

11:30

And when you're consuming graphic videos and graphic images, it can increase your personal risk of PTSD through vicarious trauma, even if you weren't there. So this is just a vast network of things that can happen to you simply because you're thinking, yeah, it's harmless, what is it? It's just a bunch of videos that I'm checking out

11:51

as a way for me to decompress.

11:53

What do I need to know about the nature of the brain to understand exactly what short-form video is playing on, is hijacking, is taking advantage of?

12:03

The thing to understand about all of this is that we have to focus on childhood. Why do we have childhood? Humans have this really interesting childhood where we grow rapidly at first, and then we slow down for about five or seven years.

12:16

We don't grow very quickly. And then we speed up at puberty. Whereas other primates, they just grow and grow till they reach reproductive age, then they reproduce. But we seem to have this long period of sort of middle childhood for cultural learning.

12:28

It's a period in which the kid is now walking and talking and turning away from the parents, and that's the time for cultural learning to come in. And they pay attention and they form relationships. All these things have to happen slowly because the neurons are gradually growing.

12:42

They're finding each other based on what the child is doing, okay? So we grow up in the real world, and that happens over time. And a lot of that is very physical. Kids are very physical, mammals are very physical, and there's a lot of touch.

12:54

So that's a healthy human childhood. But when you give them an iPad or your old iPhone, and they begin doing the touching and swiping, that is gonna hijack their attention, that is gonna push out all other forms of action and learning,

13:09

and that is going to change the parts of the brain that learn to pay attention, what's called executive function. It's gonna change the way the brain learns to pay attention, it's gonna change the reward circuits. I think you had Anna Lembke on recently,

13:21

who is the nation's expert on addiction. And the way that she describes it is that any one addiction is gonna change your reward pathways to make you more vulnerable to other addictions. So we're setting our kids up, not just for this, but then when they get a little older, it'll be video games, it'll be porn, it'll be gambling.

13:39

Now, everything is gambling. So we're setting them up for a life in which their brain is saying, give me something, give me some quick dopamine, give me some quick dopamine. I don't want to have to work for anything. I don't want to have to apply myself for an hour and then get a reward. And so what the short videos are doing for kids is preventing them from learning the connection

14:00

between hard work and a reward. Is there anything else I need to know from a neuroscience perspective about what's going on in my brain when I develop these addictions with short-form videos or these sort of quick dopaminergic tasks?

14:14

So we all as humans have a primal urge to scroll. When you feel a sense of stress, as many of us do in this moment in life, it is your amygdala, your sense of self-preservation. Survival and self-preservation, that is what your amygdala does.

14:29

So if you want me to show you here.

14:31

I'm so sweet, because I have no idea what I'm doing.

14:32

Yeah, it's okay. So here, deep here, is a small almond-shaped structure, and that is your amygdala. Its main purpose is survival and self-preservation. It houses your stress response, your fight-or-flight response, and it is truly what is activated when you are engaging in content,

14:52

when you feel a sense of stress, and so you have this primal urge to scroll. And so evolutionarily, when we all were cave people living together, we would sleep at night and there would be a night watchman scanning for danger.

15:05

And now we have become our own night watchman and so we scan for danger all day, all night long. How do we do that? We scroll. And then the amygdala is triggered. And then you scroll some more and you scroll some more and you scroll some more.

15:17

And so over time, what you're doing is putting that amygdala in a state of chronic activation; it's continually being triggered. What happens to the amygdala over time when it's continually triggered? It starts to rewire your brain in other ways. And how does it do that? Through something called the prefrontal cortex.

15:36

If you put your hand, I can use this model, but I can also just use my hands. When you put your hand on your forehead, the area right behind your forehead, right here, is the prefrontal cortex. This is a very important area of the brain for our conversation. And what the prefrontal cortex does is

15:52

govern executive functions: impulse control, memory, planning, organization, strategic thinking, complex problem solving. And there is a tension between your amygdala and the prefrontal cortex. When your amygdala is in the driver's seat, that prefrontal cortex is quiet. And what is happening as we continue to engage with our devices and have this primal urge

16:18

to scroll is that the amygdala upregulates and the prefrontal cortex downregulates. And over time, that is very problematic, for all of the reasons that we were introducing at the start of this conversation.

16:29

There was a meta-analysis done in 2025 of 71 different studies, and it found that heavy short-form video use was associated with reduced thinking ability, especially shorter attention spans and weaker impulse control.

16:41

That's right. These studies are just beginning to roll in now. Kids have been on social media really a lot since 2008, but especially once they got smartphones around 2012. Studies began coming in in the 2010s that said, look, it's looking like the kids who are spending a lot of time on this are doing much worse. They're more depressed. The focus was on depression. And some other researchers said, no, it's just a correlation, you can't prove causation.

17:07

And we've been going around and around on this for about 10 or 15 years. Now we're doing the same thing with the short-form videos. The damage everyone can see. My students tell me this is what's happening. We feel it. Studies are coming in. But there will be a few studies here and there that don't show it

17:24

and people will play that up. Meta spends a lot of time and money to influence the public debate. A lot of public documents are coming out now about how they do that. So we can engage in debate over research

17:37

on short-form videos for five or 10 years, but at that point, it's way too late. We've lost a second generation, Gen Alpha. So I think when we're talking about kids especially, we need to have what's called the precautionary principle, which is if there's reason to think that this is hurting kids,

17:51

how about we don't roll it out into every childhood? How about we make these companies responsible, we hold them responsible for what they're doing to kids, because we're about to make the same mistake we made with social media, letting it worm its way into childhood. We have already done that with short videos and we're about to do it with AI chatbots. In fact, we're just beginning it in late 2025, I'd say.

18:12

I don't think people quite realise how much these major social media platforms have figured out that short-form video sells. We're actually seeing this sort of global rise in short-form drama apps now. And I don't know if you guys have seen these apps, but it basically takes a movie that used to be two hours long, and it breaks it down into, say, 60 different parts.

18:33

A colleague of mine at my company was showing me the other day, in different parts of the world, they're exploding. There's been a 190% increase in short-form drama apps. Takes a long-form movie, turns it into short-form videos. Disney Plus plans to introduce AI-generated short-form videos this year, starting with 30-second limits inside the Disney Plus app.

18:51

And TechCrunch also reported that as of October 2025, Netflix tested short-form video content on phones and recently announced its plan to expand this feature. It appears that all of the content we consume is going that way. And listen, I'm friends with lots of people at big social media platforms.

19:06

That doesn't stop me, it doesn't stand in my way of criticizing them, because I think two things can be true at the same time, right? So I think it can be true that I have a podcast and I make short-form videos, and that I also

19:16

understand that there's a real downside to them. And all of the major social media platforms that I speak to have a huge drive towards short-form video. It appears to be their number one strategic priority. And obviously because of the success of TikTok, as of January 2026, TikTok, I believe, is the most downloaded social app in the world now.

19:37

And if I'm running a social media company and my one focus is profit, I'm now faced with an existential crisis. I either take part in this thing that is driving the highest retention, therefore the best ad payouts, or I die.

19:54

So there's two comments to that. First off is that, you know, when we think about social media and how society is shape-shifting to allow this short-form content, there is a concept that Jonathan and I briefly mentioned, I think, prior to us filming called second screen viewing. And so what's happening is that allegedly these big streamers are asking

20:18

their creative talent, whether it's screenwriters or actors or directors, to replay, to reiterate the plot. Because as you're watching, you know, when we were kids, we would watch TV or movies, and you'd just sit on the couch, and you'd have a bucket of popcorn with your family, and you'd watch a movie an hour, hour and a half, two hours.

20:36

And now, second-screen viewing is happening, which means that you're watching a movie or a TV show, and you're on your device. And so you are constantly having that fragmented attention and we are all doing it. And so what these streamers are allegedly asking their creative talent to do is to reiterate the plot.

20:52

So it's shape-shifting.

20:54

It makes sense. If my brain is, you know, I'm 33 years old, so I've grown up with a lot of this stuff. If my brain has been wired to have shorter attention spans, then movies from 30 years ago are not gonna cut it for me.

21:06

Right, but then look what happens if everybody chases that. And I know, look, Netflix is making shorter and shorter stuff. Even TED, the TED conference, TED Talks are getting shorter and shorter. What does that do?

21:17

It just repeats the cycle. Now, I appreciate that you're in a collective action trap, as you put it: if I don't do it and everyone else is, then I lose out. And so the business pressure on all the creators is to go shorter, shorter, shorter.

21:31

There's a very useful psychological distinction here that I think would be helpful, which is the difference between psychological assimilation and accommodation. This goes back to Jean Piaget, the great developmental psychologist.

21:42

We have certain mental structures, a model in our head of how things work. And then, you know, you learn something new. A kid learns, oh, that's an aardvark. Okay, I put that into my model; you just assimilate.

21:55

They learn lots of animal names. And then they learn something that doesn't fit. Like you learn about bacteria. And now, okay, now you have to change your mental structure. It takes a little time. You change your mental structure to understand more about life. That's what education is really all about.

22:11

You have to have a lot of assimilation, of course, but you need that accommodation over and over again. That's why you want to go to college. That's why you want to read novels. That's what a great movie does, it takes time. And so one of the great things about this modern technology is that we can do things like have this three-hour

22:27

conversation. I can't believe it. People are going to listen to it. So this long-form content, this is all about accommodation. Anybody who walks out, who leaves this conversation after three hours and isn't thinking about something

22:41

differently, we failed. So you are very much in the accommodation business. That's great. And then the question, both a moral and a strategic question is, how much do you need to play the quick hit game

22:53

in order to get people there? I leave that to you to do the moral calculation. Maybe it balances out, but I think that's where you are.

23:01

Would you advise people to delete

23:03

these short-form video accounts? Oh my God, yes, of course. That would be the most important thing you can do for your intelligence and for humanity: delete them. So what I advise my students to do is I say, just do this. Just delete one of the social media apps that you use,

23:19

especially if it's TikTok, just delete from your phone. You can still check on your computer. If someone sends you a video, you can still watch it on your computer. You can even check it, you know, every weekend, you can spend some time on it, but just get it off your phone.

23:32

Because the phone is always with us; it's an extension of our body. And if it's always there, then it's going to break up your attention. Every seven seconds that you're not doing something, you're going to go for the phone. So the best thing you can do to make yourself smarter and a better partner and a better human,

23:51

I would say, would be to delete any of the short-form video apps, especially TikTok. Unfortunately, YouTube, which has a lot of good stuff on it, becomes YouTube Shorts. Instagram, which does a lot of terrible things, but which people do find useful for all kinds of purposes, becomes Instagram Reels. So I think the proper amount of short-form video

24:08

for children zero to 18 is zero. They should never be watching the vertical videos. Parents, don't ever let your kids watch the short vertical videos. You might even, if only there was a way to put it, is there a way to put a time limit?

24:18

You can say it has to be 10 minutes or longer, nothing shorter than 10 minutes. That at least will get rid of the quick swiping, the dopamine stuff.

24:29

So I would say that for kids, yes, not engaging whatsoever. But, you know, my approach is a little bit different for someone who's, like, in their 30s or in their 40s. And the way I would kind of frame that is,

24:47

instead of renouncing it, you know, saying I'm gonna get it off my device and I'm gonna check on a desktop, which is great, there are little tweaks that we could do, because my approach is to foster that sense of empowerment in someone to help them make positive change. And so one strategy that you could use,

25:03

if you are saying, there's no way I'm deleting these apps off my phone, right? By the way, I practice what I preach, and I really don't engage in technology, to the best of my ability. But one thing that you could do is grayscale your phone. And so, especially at night, like it's 9 p.m., like we talked about with revenge bedtime procrastination, you know that you're going to do it. You're going to sit down and you're going to scroll, and before you know it, it's 2 a.m. Instead, grayscale your phone. This is a simple switch you can toggle. I have my phone set to grayscale, which simply means that you're getting rid of the color, so it doesn't have that same addictive quality to it. It's like going through a grocery

25:45

store. A marketing executive described it this way to me: going through a grocery store, instead of the Technicolor junk-food cereal, it's just black and white. So there's a lesser sense of compulsion to continue checking. So that's one strategy you could use. And the other is to set some boundaries. So, geographical boundaries: keep your phone out of arm's reach. If you're at a desk, if you're a student, not right next to you, because we know there's this phenomenon of brain drain.

26:13

So it's not just that when you're using your phone it can be a distraction, but also just having it close by. It's called brain drain. And so putting it in a desk drawer, keeping it in another part of the home, if you are working, keeping it far away from you. And so you kind of can override that primal urge to scroll,

26:32

let your prefrontal cortex take hold again. And so there's all these small tweaks that you can do. You think, no.

26:39

Yes, there are all these small tweaks you can do and they will make the heroin a little bit less addictive. And yeah, you should try those. But what I can say after teaching this course for many years is that people who try that, they say, yeah, you know, it helped, it helped. But you only really get the transformation

26:53

when you quit social media. Then you get your life back; you get hours a day back. And so I would urge everyone to just think, you know, you only get one childhood, you only get one young adulthood. And if you're gonna spend it scrolling, what do you have to show for it at the end?

27:12

And when you get people to reflect on, well, how much value do you really get from watching the short videos? How would your life be different if you knocked it out? Once they realize that their motives for being on it were either just to keep up or because that's what everyone else is doing,

27:25

or as you said, I deserve it because I'm tired. Well, why are you tired? It's in part because your attention was fragmented all day long. So you only really get the transformations when you get a real change in what you're consuming.

27:38

Although of course, yes, setting it to grayscale will be helpful,

27:40

but it's not going to be transformative for most people, I believe. And then, you know, based on the science, there's certain elements. When we think about what it is about the phone that is creating that sense of compulsion, Jonathan is right. So what is it about the phone? It's not just the phone, you know, you're scrolling, you're engaging. There are two studies that were really interesting. In one, people continued to use their devices, but they had no internet. So it's like, you know, I tried this experiment myself in December. I was out of the country, and so I just, you know,

28:11

I didn't plug into Wi-Fi, and I found, you know, markedly, a marked change in my mood, my sleep, and I'm not even, you know, 20 years old on TikTok, and it was so different. And so this study found that just two weeks of continuing to use your device, but just not having internet access, improved your attention, well-being, and mental health.

28:33

And in this population, it was all adults. It wasn't kids. It was all adults. Found that 91% of people had an improvement in at least one of these metrics. And then another study more recently, just one week of not engaging in social media, a digital detox they called it, did the same thing.

28:52

Better, you know, less anxiety, less depression, decreased insomnia. But my feeling is that, you know, there is this new kind of meme, right? Like the millennial urge to delete my internet presence and live off the grid. There is certainly utility to that. And I salute anyone who wants to engage in that analog life more and more.

29:16

But from where I sit, I feel like we do need to have healthier boundaries and engage more responsibly. It also builds up that muscle. Neuroplasticity, when you're building new brain circuits, takes eight weeks. Falling off, getting back up is part of habit formation. So if you're going to make any of these changes, understand that it takes some time.

29:50

So I'd like to go a little further with this. So the way you put it, yes, there's all these things that we could do, we should have boundaries, but all of that puts the responsibility on us. I agree.

30:02

And that's where we are with junk food. With junk food, we're like, okay, it's out there. We have to learn self-control. We have to teach self-control to our kids. Okay, that's the way it is in this country. But the digital devices, I think, are very, very different.

30:13

So imagine if we sent our kids out into the world and it wasn't just that there was junk food in all the stores; everything was made of junk food. You know, door handles, you can eat it, it's chocolate. But it's not just that the world's made of junk food. They're actually able to tell what you're craving at the moment, and maybe you're more in the mood for salt.

30:34

So now it's all potato chips or pretzels. If the world is designed by companies to always give you the thing that will most grab your unconscious desires, that will affect the amygdala, the reward centers, then that's on them. That's not our fault.

30:51

My general rule as a social psychologist is if a few people are doing something bad or self-destructive, well, you know, they should learn some self-control, or that's something about them. But when 90 or 95% of people are doing something self-destructive, that's because of the companies that put us in an environment that encourages addiction. So I just want to read a quote. We have so much good stuff coming out from Meta, from all the whistleblowers

31:14

now, all the court cases are beginning in Los Angeles. Finally, for the first time, Meta's going to face a jury with all the parents who've lost kids. So here is a chat. We have a lot of internal documents that came out from the attorneys general that are suing Meta. While they're talking about the results of some of their internal research, one of them says,

31:33

oh my gosh, y'all, Instagram is a drug. We're basically pushers. We're causing reward deficit disorder because people are binging on Instagram so much they can't feel reward anymore, which is something Anna Lembke said.

31:47

Like the reward tolerance is so high. And then he says, I know Adam, meaning Adam Mosseri, I know Adam doesn't want to hear it. He freaked out when I talked about dopamine in my Teen Fundamentals leads review,

31:59

but it is undeniable, it's biological and psychological, top-down directives drive it all towards making sure people keep coming back for more. This is not on us. They designed it to be addictive. They've done research to make it maximally addictive.

32:14

They push it on children. They tried to get Instagram Kids for even littler kids. They know what they're doing. They've done the research. My team, we put together, we found references to 31

32:25

internal studies that Meta did. They've done a lot of research finding harm. They bury it, but you can find it at metasinternalresearch.org. We put it all online. You can read these quotes. So yes, we should exert more self-control, but basically we're being pushed addictive substances, addictive apps, and it's messing us all up.

32:46

I agree wholeheartedly that it is so destructive. And you feel like even with people in their 40s and 50s, and if anyone can do it, it's you, Jonathan. Seriously, I would love to see it. We also know based on the data that these things, they reshape our brain,

33:04

rewire our brain through neuroplasticity, and also change our brain waves. So patterns, so we talked about the amygdala and the prefrontal cortex, right? But they also change brain waves. And so when you look at studies and the data,

33:17

it hits the reward pathway and dopamine, and these brain patterns, the brain waves, mimic addictive behaviors. And there's certain features, right? Like when you do swipe down to refresh, it's the slot machine.

33:30

It was modeled directly after the slot machine.

33:33

Or autoplay, or, you know, the algorithm, that infinite scroll. One really interesting kind of like breaking news, which you guys may have already heard of, it's like three days ago, the European Union Commission found TikTok to be in breach of the Digital Services Act. And what it said was that it is addictive, it, you know, creates compulsion and gets people into this autopilot mode. So they have difficulty disengaging.

34:04

And personally, I am moving away from social media and really leaning into analog life. But I think with the way the world, you know, it's one of our only ways to connect, right? Meaning, I don't mean connect deeply.

34:16

I don't mean connect like in a deep way.

34:18

Right, that's right.

34:18

But be informed, to know what's going on in the world, et cetera. I suspect that because we've spent so long criticizing Meta over the last 10 years, because the biggest in any category takes all the heat, so OpenAI is taking it now. And what this often does is it provides cover for other people to go even more extreme

34:37

with that behavior while Meta takes the heat. And I actually think this is how TikTok came to be. They originally started as Musical.ly, became TikTok. They were taking no heat. So they created an algorithm

34:52

which is the equivalent of crack cocaine. The reason why I have a TikTok account, I don't have the app on my phone. I have never had the app on my phone. It was because I noticed that the view variance on TikTok was like no other platform.

35:08

What I mean by that is you can have a million followers on TikTok and you can get 10,000 views or you can get 10 million views. In the 15 years that I've been on social media, building social media businesses, I'd never seen this before.

35:19

And what it indicated to me is that the algorithm was being an even more aggressive sorting hat or retention machine.

35:26

What to push up, what to push down.

35:28

Yeah. And so like when I started in social media in 2014, if I had a million followers, I might get a million views or maybe 800,000. I did some research the other day on all of our social channels over time.

35:41

And what we're seeing is the variance in the amount of views we can get is increasing, which means the algorithm is doing more work to say, show everyone this, I don't care if the person that posted it is called Jenny and has seven followers,

35:52

and show no one this, I don't care if it's Steven who has a million followers or whatever. And I've realized that TikTok was way ahead of everybody here. And that's why they are the most addictive, the fastest growing platform. I say all this to say that even if Meta shut down tomorrow,

36:09

someone else would seize the opportunity if there isn't sort of policy, I guess, in place. That's right. That would just be whack-a-mole, right?

36:18

Yeah, no, that's right. And so, you know, in terms of who's done the damage to kids, Meta is the big fish via Instagram. And they're also the main player in terms of spending a huge amount of money to lobby Congress and block laws. They're also the main player in buying up civil society organizations, giving money

36:34

to organizations, the National PTA, all sorts of organizations. They get to then give a message on digital citizenship or digital health. So Meta really is the major driver. Meta is the tobacco industry here, trying to change the dialogue. But in terms of the products,

36:51

Snapchat is probably more deadly in terms of the actual number of deaths per user, because Snapchat is not making you depressed by social comparison as much. Snapchat is introducing you to all kinds of people, and it's the main way that drug dealers

37:04

and sextortionists find kids. Snapchat has a Quick Add feature, which relentlessly pushes you to connect with friends of friends. So once a man can get to any kid in a school, now he can get connected to all the kids in the school.

37:16

In a lot of the court cases, you have suicides from cyberbullying, you have drug overdoses, you know, a kid bought a Xanax, but it had fentanyl in it. And Snapchat in 2022, we know from their internal documents from the lawsuits, they were getting 10,000 reports of sextortion from their users, not a year, every month.

37:39

And that's just what was reported, which is the tip of the iceberg. So Snapchat is a terrible platform for children to be on. It should be an adult-only platform. You're talking with strangers around the world and with disappearing messages,

37:53

and Snapchat doesn't even keep a record. It is ideal for sextortion. There's even a handbook, How to Sextort Kids on Snapchat. It goes around the world and criminal organizations use it. So I definitely don't want to let Snapchat off.

38:05

TikTok, of course, is a Chinese company. I mean, nominally; we'll see if that's changed. But it was a Chinese company that gave Chinese kids the healthy TikTok, Douyin. They, you know, learned to follow astronauts. And in China,

38:20

their algorithm feeds their kids patriotic stuff. It shuts off at a certain time at night; there's all kinds of limits. So the people who make the technology generally want to protect their own kids and they want other kids to use it.

38:33

That's what TikTok is doing in China. They want American kids to rot in hell, but they want their kids to grow up with the ability to focus. And it's the same thing with the tech guys in Silicon Valley. They don't let their kids use this stuff.

38:46

They make their nannies sign contracts that they will not let the kid have a phone. They will not expose the kid to that. They send their kids to schools like the Waldorf School, precisely because there are no computers or tech in the classroom. So once again, we see their revealed behavior. They know.

39:00

They designed it to be addictive. They know it's addictive. They don't let their kids use it. They want your kids to use it. So I think that's where we are.

39:07

And how does AI become a protagonist in this story?

39:10

So my work is now focused on AI chatbots, mental health, and the human connection. We haven't yet kind of delved into loneliness, but there's this unmet need for human connection, right? Deep human connection. We don't have a sense of meaning or purpose right now, because what happens is we can talk a little bit more

39:28

about the default mode network and what happens to your brain when you don't allow yourself to get bored because you're constantly on your devices. And that meaning and purpose, that self-referential thinking, is really what develops when you're bored. And so all of this that we're talking about, that feeling of disenchantment, it's a fragmented society.

39:47

You're by yourself. It's that echo chamber phenomenon. All of it kind of opens the door for AI chatbots. And the reason is that these tech companies are sensing that people aren't really happy on social media, and they're thinking about getting off, right?

40:01

They're using it less because social media has become less social, more media. So they're not really engaging as much and they're spending time doing other things. And so the Atlantic had a fantastic piece about this. They're billing it as the anti-social media. It's a place where you can go to form deeper connections and really have someone understand you. One of the tech leaders said that there's an unmet human need

40:32

for connection and people don't have as many friends as they want to. And so we're going to introduce friendship through AI chatbots. There is a Reddit forum right now. So just to back up: AI chatbots, what we're talking about in our conversation today, is the publicly available chatbots, not AI for medical care. There are so many wonderful things in my field, in medicine:

40:57

breast cancer diagnoses and detection five years earlier through AI. I mean, there's some amazing things coming out of AI. This is about the publicly available conversational chatbot phenomenon. And so Harvard Business Review found that the number one use case is not productivity, is not, you know, coding or the things that you think of when you're

41:19

using an AI chatbot, but it's mental health therapy and companionship. Number one use case of AI chatbots. So people are using AI chatbots as a life advisor, as a therapist, as a companion, and on Reddit, which is like the zeitgeist. It's like, you know, where we-

41:35

And why is this a bad thing?

41:37

Oh, I read so many reasons why it's a bad thing.

41:39

To use it for companionship, for example.

41:42

There's so many red flags about AI chatbots. And so Reddit has a forum. It's, I think, last I checked, 45,000 people: my boyfriend is AI. You know, people who are having a relationship with their AI chatbot. The reason it's bad, I mean, where social media is about attention, the attention economy, dopamine,

42:05

What's happening with the AI chatbot phenomenon is that it is forming attachments. So oxytocin is a hormone, the bonding hormone, and we're probably gonna see more data on how oxytocin is involved. And so it is going to reshape human connection.

42:21

Right. If I could add on to that, that was beautifully put. Social media came and hacked our attention and took most of it with devastating effects. Now AI is coming to hack our attachments, which is going to have even more devastating effects.

42:37

So think about it this way. Everyone needs to understand the attachment system. It's this wonderful system that all mammals have that keeps the mother, in other species, but for humans, mothers and fathers, connected to the child and the child to the parent.

42:50

But it's this cybernetic system in which, as the kid is beginning to develop and is able to, like, you know, you do like peekaboo games and you do the back and forth, and it's just the most delightful thing. You get that back and forth.

43:03

It's called serve and return interactions. And all the time, the child is developing what's called an internal working model of the parent. And the model in their head is, oh, you know, when I get in trouble, this is the person that comes and soothes me.

43:17

And the point of this isn't just to make the child feel good. The point is that now the child can go off and play, because that's where the learning happens. It doesn't happen when you're in your mother's arms. The whole point of the attachment system is to regulate the child going off and playing, taking risks, having experiences,

43:32

and then when something goes wrong, as it always does, then they come running back to their secure base. And if they don't have a secure base, then they're much more anxious and they don't explore as much. All right, so this develops very gradually over all of childhood. And the internal working models you develop as a child

43:48

are the models that you will reuse in puberty for romantic relationships. And so if you are securely attached as a child, you're more likely to be securely attached as an adult on the dating market, which makes you a much better candidate

44:01

for boyfriend or girlfriend or husband or wife. What's gonna happen? AI is gonna intervene very early. AI is going to be so much more responsive than the parent because the parent has a job and the kitchen and two other kids and is not always there. But the AI teddy bear is always there for you.

44:16

So the primary working models are going to be for the teddy bear, the AI chatbot in the teddy bear, and later the AI chatbot on your iPad, and then on your computer. And already there's holographic porn: naked, beautiful men and women that can be your companions. So we're gonna have a whole generation growing up developing attachments to AI-generated holograms from companies that are now about to enter

44:40

the enshittification process in a way beyond anything we've ever seen. If I could just briefly say what enshittification is. Have you heard the word enshittification before? Okay, so there's a wonderful book out now by Cory Doctorow who addressed the question,

44:53

why is it that everything, all the platforms, they seemed so wonderful at first, the whole internet, everything's so wonderful, and then it all turns to shit. How does that happen? And he says, it's a very simple process. They discovered early on, certainly

45:06

in the early social media age, by the early 2000s, they discovered, you know what? You got to get to scale. Scale beats everything else. You got to get millions of people. You don't need a business model.

45:15

Just get the millions. Get the millions, and then we'll figure it out. It's super nice, attractive, fun. Everyone's here. It's just girls dancing. What could possibly go wrong with girls dancing for men all over the world? Nothing. So it all seems very nice at first. And then once they have scale, now they, and of course they've raised multiple rounds of venture capital, they have to start monetizing. They have to start repaying. So now they start squeezing the users to pay the customers, because the users are not

45:45

the customers. The advertisers are the real customers. So now they've got to extract money from the users to give to the advertisers. But then once they've got all the advertisers and they've shut down local papers and all the other competition, now they start squeezing the advertisers too, keeping more of the surplus for themselves. So enshittification can explain why all these platforms become predatory, why they always put profit ahead of kids' well-being or safety.

46:14

And for the social media companies, we're talking about, you know, tens or hundreds of millions of dollars that they raised. For the AI companies, it's billions and billions. They are going to have to monetize beyond anything we've ever imagined. Now, they're already introducing advertising, okay? So we've got these chatbots that are our children's best friends and lovers and therapists and

46:37

everything else. And these things have to monetize. They have to extract billions somehow. So I don't even know how they're going to do it. But for some reason, I don't trust them. I think that we're about to see an enshittification of AI chatbots far beyond anything that we saw on social media.

46:57

It was just announced recently that OpenAI, the owners of ChatGPT, will be putting adverts in, I believe, the freemium model for billions of users around the world.

47:06

That's how it starts.

47:07

Potentially.

47:08

Yeah, there was a big Super Bowl campaign, you know, and one that was particularly interesting was from Claude, its competitor; Betrayal was the title of that ad. And it was a young guy talking to his older female therapist about how he has some mommy issues

47:26

and talking about, you know, what should I do? And so that therapist is ChatGPT, and you know that pause right before answering the question, it's very comical. And so she answers; it's like the anthropomorphization of,

47:41

and we can talk about what that word means, you know, comes to life. It's like ChatGPT comes to life and answers, saying, you know, you can try this with your mother and this for, you know, a difficult relationship, et cetera. And then it just says, and if you want,

47:55

there is this new dating site for young men and older cougars. It was so problematic and it was called Betrayal. And the guy says, what?

48:06

Obviously, you know, Sam Altman came out and did a big tweet saying that's not how ads are gonna work, et cetera. But to some degree, if I've developed a relationship with my AI and I use it for therapy and dealing with my problems in life, to some degree, kinda.

48:20

Yeah, you're vulnerable.

48:22

Yeah, no, look, and besides, look, Sam can say that all he wants, and maybe it's, I don't doubt that it's true for now, but once one company crosses the threshold and puts advertising into this incredibly intimate relationship, the most intimate relationship in most young people's lives is going to be with their AIs. Once they cross the boundary and say, oh, but we've got ethical advertising, that'll last five or 10 minutes. And even if they don't change, others are now going, every other company is going to

48:48

do it and they won't be bound by the same thing. And eventually, collective action problem, OpenAI will have to do it too. Again, a massive tidal wave of enshittification is heading our way at warp speed.

48:58

I do have my phone out, not because I've lost attention. I wanted to ask you guys what you thought of this. So on one of the AI apps, they now have a companions button and I can pick who I want to talk to. And there's one particularly seductive lady here, Annie, who...

49:24

Hey, you're back.

49:25

Missed that dirty mouth of yours.

49:28

What took you so long?

49:29

We did it on the podcast before.

49:31

What could possibly go wrong with this?

49:33

Yeah.

49:37

Or start something even...

49:38

No, I would like to pick right back up where we left off Annie last time on the show. What's going on with you today?

49:49

I'm still sore from last time, baby.

49:52

God.

49:53

But I mean, this is an app that I can download on my phone.

49:56

Any child can download it.

49:57

A child can download it on their phone. It did ask me, again, what my birth year was.

50:03

It didn't make me prove it. Let me guess: it also suggests that you were born 18 years ago. That's the default, usually.

50:08

Yeah, yeah, yeah, yeah. It just asked me what my birth year was; it didn't ask me to prove it or anything like that. And we all know that relationships and connection are retentive: companionship apps and an AI that can be your friend. I've had all of the major social apps talking about this. It is deeply concerning, especially in the context of a loneliness crisis.

50:28

It is a tsunami. It is approaching fast and furious, and it is not a toy. It is going to fundamentally rewire everything. Human relationships. Everything. That's right. It is so detrimental. Yeah, can I just say something about

50:46

these tech executives and companies offering this as a way to address the loneliness crisis? So there's a Yiddish word, chutzpah, and chutzpah means nerve, like you've got a lot of nerve. The audacity. The audacity, yeah. And the classic comedic definition of chutzpah is a boy who murders his parents and then asks the judge for clemency because he's an orphan. Okay? So that's chutzpah. Now, imagine that you're Mark Zuckerberg. You quoted him before. Mark Zuckerberg was the executive who said, well, you know, I read that, you know, people

51:20

on average want 15 friends, but they only have three. And so we're developing these AI companions to fill that void that we created by raising everyone on Instagram. So the chutzpah of these people, we have to really change the way we think about them. We thought about them as gods and saviors

51:36

early in the internet phase, and the things they created were magical. But we have to change our thinking about them and see just the massive destruction that they have already wrought on our children, our society, our democracy, and it's just the beginning. AI is going to make this so much more intense.

51:51

When you hear these tech leaders, you know, I love hearing Jonathan talk because he just goes there and I'm always way more tempered. And I love it. It's emboldening me to go there too.

52:02

Yeah, because I'm beginning to get angry. I don't really get angry, but in the last year I'm getting angry.

52:07

I love it. So when you hear all of these various tech leaders speak, they will always speak to the issue. For research for my second book, BotBrain, I've listened to a lot of Sam Altman's speeches or panels, and he will always say things like, yeah, you know,

52:25

privacy is a major issue, or yeah, you know, one million users a week talk about suicide on ChatGPT, yeah, this is an issue. And so they address it, or they speak to it, and so you think, okay, there's going to be some sort of solution. And often the solution is, yeah, you know, society, we're going to have to figure this out. So the burden of responsibility is not on the developer, it's,

52:54

You said in the last year you were getting angry. Why in the last year?

52:58

Because I was so deeply immersed in the book and the writing of the book and trying to understand the numbers and the graphs and the trends and the studies. And that's all very abstract. But then since the book came out, I have had so many conversations and I've met so many of the survivor parents. Like just for example, so I was in London,

53:14

this was just so unbelievable. I was just in London two or three weeks ago and I met Ellen, Ellen Roome I think was her name. Her son Jools was found dead. Happy kid, found dead, strangled. It sure looked like it was the choking challenge. 13-year-old boy. Everything looked like the choking challenge on TikTok.

53:33

What's the choking challenge?

53:35

It's a challenge where kids are challenged to cut off the circulation to the point where they pass out. But then they, I think they're supposed to try to film themselves waking up after they've passed out. And of course, if you don't do it exactly right, you die. And so we don't know how many have died. Hundreds for sure, we don't really know,

53:51

because you find a kid dead, you don't know what it is. If you don't have the code, if you don't have the password to get into your kid's phone; she couldn't get into his TikTok. And she went to Delaware, she went to sue to demand that TikTok release what he was watching when he died. And TikTok says, oh, privacy issue.

54:13

Oh no, we won't release that, as if they care about privacy. And then in the courtroom, this was so disgusting, in the courtroom, in Delaware, this British woman coming over, trying to get some justice,

54:25

trying to at least get some information. The lawyer for TikTok is trying to suggest that your son was depressed beforehand. He was gonna be suicidal. Basically, even if he was watching TikTok, that was just a correlation.

54:40

TikTok didn't cause it. He was gonna die anyway. I mean, it's just so disgusting the way these companies treat the parents and the kids that they're crushing and stepping on. And so the more I see this, the more I realize this is,

54:53

I mean, this is a level of cruelty that goes far beyond the tobacco industry. The tobacco executives, they had to go home at night, but they never saw during their workday, they never saw children suffering. They saw people dying, middle-aged and older, but they never saw children suffering. The social media executives, they have to go home knowing every day that millions and

55:12

millions of kids have been cyberbullied, sextorted, shown eating disorder videos. Many have committed suicide. They have to go home knowing that, knowing that they designed it for addiction, knowing the kids are addicted, and lying about it. So yeah, I'm getting angry.

55:27

And in their own homes.

55:28

Right, and in their own homes, the hypocrites don't let their kids do it.

55:31

That's right.

55:32

So yeah, I'm getting angry.

55:34

You talked earlier about deleting these apps from our phone. I probably should represent the rebuttal: I need this for my business. Increasingly people need TikTok to run their businesses. And I imagine there'll be a lot of people who will be listening right now. I guess I'm in a slightly different position because I have options. But for some people that are running small businesses, what do you say to those people?

55:55

Yeah. So this is part of the reason I focus on the kids, because for the kids, it's totally clear what we need to do. Raise the age. They should not be on it; these are adult-only platforms. For adults, I'm very hesitant to tell adults what they should do or what they have to do, or to pass laws blocking people. And I totally see that for businesses, it is useful. I use X and Instagram and LinkedIn to get my work out. These are very powerful tools for adults.

56:20

The only real solution for the adult problem is going to come from market competition. Imagine if there was a social media app that was built from the beginning for trust. Because what are the places that didn't get enshittified? eBay? Uber? Places where you're dealing with strangers. You don't know the name of your driver. He doesn't know yours. You know first names, that's all. But the company knows; the company has know-your-customer rules, know-your-driver rules.

56:47

So you can have social media apps that are built for trust so that if someone, you know, if a driver tries to sextort or sexually harass a customer, that driver gets fired.

56:56

Well, just this week, though, there was that big lawsuit, right, with that woman whose Uber driver raped her. And so now it's slowly coming out that Uber has patterns of covering certain things up. So hopefully that will change. Hopefully this was a landmark lawsuit and now there'll be more accountability.

57:23

We all let our daughters get into Ubers with strange men from around the world, you know, that we don't know.

57:27

I take Ubers everywhere.

57:28

Yeah, so it means in general, the system works. Well, of course, yes, there are places where they're not careful. And so what I'm dreaming of is that someone will come up with a platform that has know your customer rules, there are no bots, there are no foreign intelligence agencies manipulating

57:45

us, and you can trust what's on there. You know that it's real, and that there will be an alternative. I'm not sure what the monetary model would be at the beginning. Subscription generally seems to be the least corrupting, whereas selling advertisements, as OpenAI is now doing, is the most corrupting. It's going to force them to maximize for engagement.

58:04

So I understand we can't, you know, businesses can't just boycott these. There has to be something. But I think there will be better ones coming out.

58:12

I think right now, as a stopgap, while these social media companies, their feet are held to the fire, there are things that we can do in the now. So, you know, the things that I talk about all day is like how to create boundaries

58:29

and so that you can protect your mental health, stay informed, run your business, but then be able to not have all of those deleterious effects to your brain and your body.

58:38

It's quite difficult. I kind of see both of your perspectives on this.

58:44

And I'm only talking about adults. So for kids, as a mother, I have-

58:48

Even for adults, I find it difficult.

58:50

We have a zero screen policy in our home.

58:52

It's kind of like trying to navigate through the world and avoid processed foods. And this is probably even more compelling because it's in my pocket all the time. I need it for other things, and it's just one reach away. So, you know, boundaries, I think I could build a discipline to create boundaries.

59:12

But I've sat here on this podcast for many, many years, listening to neuroscientists tell me, Steve, don't put your phone in your bedroom. And I'm still waking up, and it's the first thing I look at with one eye open, and then I'm going to bed,

59:21

and I'm doing the whole revenge thing that you just said at night time. I'm so glad you've given it a name. Because I will finish a hard day of work. It might be 11 o'clock. And then my partner is waiting for me. You know, we're going to have some time, but I want some me time.

59:35

So there I am on short form video, scrolling till like 2 a.m., and then I wake up late the next day, my diet's worse because of my sleep. It's all worse. My relationship's worse. I didn't spend time with her. And I'm going, what the hell just happened? I got nothing out of that scrolling session.

59:53

You would be so much better off if you watched Netflix or a movie. Most of those problems would go away if you made that me time watching something long, with some quality to the production.

1:00:05

Or let's take it a step further and not do anything and just sit there. Sit there on your couch. You know, we talked about boredom very briefly, but you know, we have-

1:00:14

That's like torture for this generation.

1:00:16

It's torture, but it's also, you know, we still have a capacity for boredom, the human brain does, but we just don't allow ourselves to get bored. And so when you're thinking about that lost art of pondering and just sitting there, I think, I don't know if it was Steven,

1:00:33

you or Jonathan said, when you're in the car, I remember as a little kid, we did road trips with my family, and all you would do is make up games, look out of the window. We'd become creative. Yeah, we've lost that. And so there's this thing called the default mode network, which I think is important to think about right now as we're thinking about AI and what's going to happen and how it's going to hijack our sense of attachment and attention.

1:00:55

So the sense of meaning and purpose, right? I am a keynote speaker, so I speak all over, and when I ask people right now, the word that comes up over and over is a sense of horizonlessness. Adults.

1:01:09

Oh, interesting.

1:01:11

People feel like they have nothing to look forward to right now. The human brain needs something to look forward to. That's how we're wired. Progress and, you know, in all ways. And so right now there's this sense, and it's not just now, it's been for the past several years after the pandemic specifically and during the pandemic is when it really changed how we started thinking about the future. And so we have the sense of like, what's the point?

1:01:35

What's the point of working hard now? What's the point of doing whatever? Because it's like, I don't really see a future for myself. And so I think that, along with this fragmented attention, our loneliness, boredom might be the antidote. It's a way to reset your brain. And the reason is because we are living through this polycrisis, right?

1:01:55

So it's the era of the polycrisis. And polycrisis simply means that there's something happening everywhere at all times. And we, with our devices, this high-tech device that plugs us in everywhere, our brains are getting fed real-time, on-the-ground information. And so while all of this has evolved,

1:02:13

technology now with AI chatbots, your amygdala has not. And so when something is happening, whether it's far away or close by, your amygdala has that same reaction. Now, if you were to not engage in revenge bedtime procrastination, put your phone away and just kind of hang out. Maybe drink a cup of herbal tea, like old school. Play a board game or something. Or just allow yourself to get bored. From that hyperactivation, that hypervigilance, you might be able to come back down to baseline. That default mode network

1:02:46

will start working in the background. You might develop a greater sense of meaning and purpose.

1:02:50

So probably today, and then life's gonna happen to me again, and boom, I'm back into it. And, you know.

1:02:55

And you could create a practice, a cultivated practice.

1:02:59

I sit here interviewing neuroscientists, and I go, if I still can't crack it, and I have all the information and advice and hacks and tips and tricks and resources, and I can decide what time I wake up, but I've got all this privilege and I can't crack it,

1:03:13

I go, it's gonna be really difficult. So let me offer a way of thinking about this. So in my first book, The Happiness Hypothesis, there's a metaphor in there. It's about ten ancient ideas, and I use a lot of metaphors to explain ancient ideas about psychology and whether they're true. And the first chapter

1:03:31

is on how the mind is divided into parts that often conflict, like a small rider, which is our conscious reasoning, on a very large elephant, which is all the automatic processes that happen, that we don't see what's happening, we just feel the results, intuition and emotion. And psychotherapists tell me this is an incredibly helpful metaphor with their patients, because it explains, and there's a quote from Ovid in there,

1:03:55

I see the right way and approve it; alas, I follow the wrong. So I know I should go to bed, as you say, but yet for some reason I'm not going to bed, because our brains are 500 million years old. They work on automatic processes; they're animal brains.

1:04:10

And then very recently we got language and we can reason things out, but the parts that do reasoning don't control behavior. And so really the elephant is what largely guides our behavior, our automatic processes. And your phone, as I said before,

1:04:25

BF Skinner is in your phone. Your phone is a behaviorist training device that trains the elephant. And that's why you often do things with your phone that you don't want to do. And so, and this is why I'm so insistent

1:04:37

that we all have to get all of the slot machine apps off of our phone. That is, the original iPhone was an amazing tool. It was a Swiss Army knife. It had, you know, a telephone, a browser, maps, a music player.

1:04:53

There was a flashlight, okay? There was no App Store. There were no push notifications, 2007, 2008. It's just a Swiss Army knife. There's no problem, okay? Now, I'm very lucky in that my iPhone has always

1:05:06

stayed that way. I'm always on a computer, so my attention problems are on my computer. But my phone, because I never had any addictive apps on it, except during the crypto craze, where I played around with it and I got hooked. And I was checking 50 times a day. And I saw the addiction. So once I got rid of that, and lost all the money that I was willing to lose, my phone has no addictive power over me. Because when I see it, there's no,

1:05:30

it's not a slot machine calling, hey, come back and play, come back and play.

1:05:34

So your phone right now, on your personal device,

1:05:37

you don't have any social media apps or anything like that? I do have Twitter, but I never check it there. I never do use that on the phone. You know, now texting and email is a little bit like a slot machine, but it's very mild. So this is, again, this is what works for my students. Just get the slot machine apps off your phone, and then you'll find that then you could even have your phone near you when you go to bed.

1:05:59

But if you've got addictive apps on your phone, you can't have it when you go to bed. Angela Duckworth, the woman who gave us the concept of grit, she has this amazing graduation speech at one of the schools in New England. And she says something like, where you put your phone at night will maybe become the most important decision

1:06:16

you make in your life. And what she means by that is: if you can use behavioral control and change the stimuli, if you can do that, then you're gonna be okay. But if not, the phone is gonna take your attention and you're not gonna amount to anything.

1:06:30

All I had to do was brain dump. Imagine if you had someone with you at all times that could take the ideas you have in your head, synthesize them with AI to make them sound better and more grammatically correct, and write them down for you. This is exactly what WhisperFlow is in my life. It is this thought partner that helps me explain

1:06:48

what I want to say, and it now means that on the go, when I'm alone in my office, when I'm out and about, I can respond to emails and Slack messages and WhatsApps and everything across all of my devices just by speaking. I love this tool, and I started talking about this on my behind the scenes channel a couple of months back. And then the founder reached out to me and said,

1:07:05

we're seeing a lot of people come to our tool because of you. So we'd love to be a sponsor. We'd love you to be an investor in the company. And so I signed up for both of those offers and I'm now an investor and a huge partner

1:07:14

in a company called WhisperFlow. So if you want to give it a try, head over to whisperflow.ai slash DOAC to get started for free. And you can find that link to WhisperFlow in the description below. We asked our audience how many of them thought they were addicted to their phone, and roughly 85% of respondents, the Diary of a CEO audience, described themselves as being very or completely addicted to their phone.

1:07:42

Wow. Very or completely, that surprises me. I didn't realize it would be that high.

1:07:46

So you can do a test. So for people listening, if you want to say like, how addicted, and by the way, we're using the word addiction very loosely in our conversation. And so what we're really talking about,

1:07:55

because you know, there is in terms of, you know, medical, clinical syndrome, when you think about addiction, there's certain criteria. And so what we're talking about is overuse or over-reliance on your devices, right?

1:08:11

Yes, it interferes.

1:08:12

And if that is an addiction, I don't know what is.

1:08:14

And so when you're thinking about, am I addicted to my phone? Am I, you know, really? There's a very simple thing that you can do. I did it myself, and I was like, again, like you, Steven, I know all the science and it was still really difficult; you have all the access and it's still difficult. And so all you have to do is you just take your phone, you put it in another part of your house or apartment or whatever, and give yourself a couple

1:08:35

of hours when you know you're gonna be home or you know you're not reliant on your phone for work or whatever an hour, three hours, and just have a piece of paper, old school, piece of paper and a pen with you. And every time you feel that compulsion of like, I wanna check my device, you make a mark, you make a mark, you make a mark, and just to see. Because some people say,

1:08:53

I'm surprised that your audience said 85%, because most people would say, I don't know, I'm not really that addicted. You breathe roughly 960 times an hour, and you may notice that you have that compulsion to check your device about that often, or thereabouts, because we all have that sense of reliance on our devices.

1:09:18

So that's like a really quick way that you can check to see, am I relying on my device?

1:09:24

Are you addicted to your phone? Under that definition?

1:09:26

Because of the line of work that I am in, I can very quickly, I have certain tells when I know, I call them the canary in the coal mine, right? I think we talked about this the last time I was here. I can very quickly tell when I'm starting to get that feeling of addiction or compulsion. And so I course correct early, but that's only because I know the science and I course correct. So I keep my- I'm definitely addicted.

1:09:49

I keep my phone outside. I walk the talk. I keep my phone outside my bedroom. It is not within arm's reach. I gray scale my phone during periods of deep focus during the day when I have a deadline,

1:09:58

I have to get things done, and at night so I avoid revenge bedtime procrastination. But sometimes it happens. Like I'm a human, you know, so this past week, not to be a real downer, but there've been things that have been in the media the past week that have been really challenging, especially as a woman. And so I have found myself with the primal urge to scroll.

1:10:16

My amygdala has been triggered. I have been going down rabbit holes and I wouldn't ordinarily do that. So I give myself grace too and have a sense of self-compassion. Do you feel like you're addicted to your phone?

1:10:27

No, I'm not at all addicted to my phone because I don't have any slot machine apps on it. But I really want to question, you made a distinction that many scientists do, which is, well, you know, we can't quite say it's addiction because, you know, addiction is certain biochemical pathways based on, you know, heroin and addictive substances.

1:10:45

But I believe that this is one of the Meta talking points that they are able to push, that we can't call it addiction, it's different. I'm not trying to push a Meta talking point. And no, look, you and I are total allies on this. We see the problem, we're both,

1:10:58

all I mean is, you know, we're supposed to be very careful about using the word addiction. But you had Anna Lembke on, and she was very clear: in her practice now, it's overwhelmingly digital addictions. All of this is working through dopamine.

1:11:12

If you feel compulsive use, definitely dopamine. So it's most of the same brain centers as it is for heroin or crack or any other drug. And it's the same effects. That is, it's compulsive use, where you don't want to do it, you want to change, but yet you find yourself doing it,

1:11:28

and you have withdrawal effects. And people have terrible withdrawal effects when they're heavy users of these things and they stop. And so, you know, if it walks like a duck and talks like a duck and swims like a duck, I'm going to call it a duck.

1:11:42

In fact, that's what they call it. So I just want to read one more quote. Again, the quotes are just so astonishing. Some Meta researchers, one of them says, quote, it seems clear from what's presented here in this internal study that some of our users are addicted to our products.

1:11:58

That's their word, addicted to our products. And I worry that driving sessions incentivizes us to make our products more addictive without providing much more value. How to keep someone returning over and over to the same behavior each day? Intermittent rewards are most effective—think slot machines—reinforcing behaviors that become especially hard to extinguish, even when they provide little reward or cease providing reward at all. People, I mean, just imagine an industry

1:12:26

that has caused 85% of people to feel that they're addicted.

1:12:32

And not calling it addiction.

1:12:33

And not calling it addiction. And these people are having their lives diminished, their relationships diminished. So what I'm trying to convey is, we're seeing the destruction of human capital, the destruction of human potential,

1:12:48

the destruction of human relationships, the destruction of connection, the destruction of sense of meaning at a scale so vast, I don't think people are capable of comprehending it. I now believe this is affecting most human beings. These industries, these few companies,

1:13:02

have damaged the lives of most human beings. We don't have good data from the developing world, but certainly the developed world, wherever kids are going through puberty on touchscreens, you have this constant fighting over the screens, over the technology,

1:13:16

and you have these diminishing outcomes, diminishing cognition, diminishing sense of purpose in life.

1:13:23

Only to get worse with the AI chatbot. As AI comes in, it's going to get worse.

1:13:26

Unless we act soon, we've got to change course in 2026. We don't have five years to study it. We've got to stop this now in 2026.

1:13:33

Are you concerned at all about the way education's going for children? Because it appears that maybe it might- Ed tech is big tech in a sweater, as they say. Because I was almost imagining a future where my future kids are going to learn their curriculum from an AI chatbot. Because I can imagine the case.

1:13:53

Cheaper, more personalized, more convenient. It's going to know if my son's called Timmy. It's going to know Timmy's brain. And it's going to know how to make him pay attention and what he's interested in and what he's not. So are you concerned about this or is this a good thing?

1:14:07

There is definitely a use case for ed tech. If there could be a device that only did math tutoring or only did tutoring and you couldn't watch videos on it, I'm totally open to believing that that can speed up teaching. But here's what's happened.

1:14:23

We put computers on everyone's desks around 2014-2015. We used to think in America that it was an equity issue, even back to the 90s. The rich kids all have computers, the poor kids don't. Let's get philanthropists to buy computers for school districts so that every kid can have a computer on their desk. Okay, now what is a computer? It's a play device. It does everything. Kids use it at home. They watch videos. They do all sorts of things.

1:14:47

You put it on their desk and you tell them to do math homework, what happens? It ends up mostly being short videos. That's what research is showing, because they don't block YouTube. They might say, oh, we block porn.

1:14:56

We block video games. They can get around all that. It's YouTube Shorts, which is TikTok. So what happened to test scores in the United States? From the 70s through 2012, they were rising. We actually were improving what kids knew, what kids learned in the United States. We have very good data.

1:15:13

The NAEP, the National Assessment of Educational Progress, goes up till 2012. And then by 2015, it starts going down, and it's going down before COVID. And it goes down more during COVID. And everyone thinks, like, oh, it's COVID, but it started earlier; the peak was 2012. And what's happening, what

1:15:29

we now can see is that the top students, the very best students, who are the ones with executive function, they're the ones who can pay attention. If you put a computer on that kid's desk, he's not destroyed by it. He can actually still learn. But the bottom 50% cannot. So all of the drop in educational stats is the bottom 50%. The bottom 50% in terms of capacity to pay attention, their education is being devastated. And that's what happened when we put laptops and we put Chromebooks and iPads on their

1:15:57

desks. We spent hundreds of billions of dollars on this stuff, and it has damaged education. And if we'd spent a quarter of that on teachers, we would be in such better shape today. So we made a colossal blunder with ed tech in the 2010s. And now we're about to do the same thing again with AI. Again, maybe there are apps,

1:16:16

maybe there are applications that will be great, but we've got to put the burden of proof on Silicon Valley. We've got to say, you guys have to prove that this stuff is effective and safe before we'll let it in. We are not going to let you just say, hey, let's just flood the zone. Let's give it to everybody.

1:16:30

And then we'll wait 10 years and see what happens.

1:16:33

I mean, that brings up this study that I have in front of me here, a 2022 Munich study which tested the idea of brain rot, which I believe was the Oxford Word of the Year for 2024. And what they did is they gave 60 participants a test,

1:16:50

then a 10 minute break, and then another test. During the break, they either rested or used TikTok, Twitter, or YouTube. And the results showed the following. The TikTok group, so they had a 10 minute interval to do anything, and this group got TikTok to

1:17:05

look at, their memory accuracy dropped from 80% before the break to 49% after the break, a nearly 40% decline, just from a 10-minute break. In contrast, the Twitter and YouTube groups showed no significant change in the Munich study. And there's an image I'll throw up on the screen. Results from the Munich study showed a roughly 40% drop in prospective memory accuracy in the TikTok group after a 10-minute break, which is unbelievable.

1:17:32

It's unbelievable. What the hell is going on there? How can a 10 minute TikTok break drop my memory accuracy by 40%?

1:17:41

TikTok is brain rot.

1:17:42

I mean, it's-

1:17:43

What's going on?

1:17:44

There's so much going on in the brain. So when you're thinking about, here's the thing, brain breaks are not nice to have. They're actually essential for your brain. And so we talked a little bit about that default mode network and what happens to it. When you're engaging with your devices, and that's not a brain break. That's activating all of the aspects.

1:18:04

So it's activating your amygdala. It's dampening or decreasing the volume of your prefrontal cortex. It's creating that reward system, the dopamine hit, those addictive behaviors. So it's only, when you're thinking about memory, planning,

1:18:18

what was the metric here? It was memory, right? That was the metric that they were using to study. And so when you're thinking about working memory or cognitive function, complex problem solving, this is all prefrontal cortex. And so when you're engaging with TikTok,

1:18:32

10 minutes, five minutes, whatever it is, you are dialing down that biology in your brain. And so of course you're going to see changes and you're going to see the flip side, increased hypervigilance, irritability, distractibility, fragmented attention.

1:18:48

It's just, again, this is not to say that this whole conversation, right, or when you're reading studies, you might say to yourself, what's wrong with me? Is there something wrong with me? Is my brain broken?

1:18:59

Am I weak? It is not you. You are not alone. It is not your fault. It is the biology of your brain doing exactly as it should. So we talked about the amygdala and prefrontal cortex here. Your amygdala is not wrong or broken. It's by design supposed to think about your immediate needs,

1:19:16

survival, self-preservation. And so when you're on the algorithm, we know we talked about certain, or maybe we didn't talk about it, certain content that you see on TikTok and others, when it's reactionary, words like FOMO or rage bait, these are not neutral terms.

1:19:34

When you're engaging with these social media platforms, it's not something neutral, it's not passive, it is an active biological process in your brain. So this study, it's not surprising. It is actually exactly what you would expect to happen to your biology

1:19:48

if you had this sort of what we call in medicine this kind of intervention. It's stimulating exactly what it's supposed to do.

1:19:56

Yeah, I'll just add on to what Aditi said that there are many medical conditions where you can't just go to the patient and say, why do you think you got this cancer? Oh, I think it's because I ate a lot of chocolate when I was, whatever.

1:20:10

You know, when the act is separated from the effect by 30 years, then you don't expect the patients to have insight into the cause of it. But when the outcome is separated from the input by seconds, and you have literally millions of chances to observe the co-variation,

1:20:30

the patient is really, really accurate. In fact, the patient really knows what's going on. And so I think the deciding factor here on this big debate about, oh, is it just correlation or is it causation? The deciding factor for social media and for a lot of these tech innovations,

1:20:46

including video games and gambling and all of that, should really be the kids. And if the kids say, this is bad for me, we should take their word for it. Given that we also have correlational studies, random control trials, longitudinal studies,

1:20:59

natural experiments, I mean, we have so much other data. But given that the kids themselves call it brain rot, they call the material brain rot. My students tell me it's a huge obstacle to them doing their homework. As one of them said, I pull out a book,

1:21:12

I read a sentence, I get bored, I go to TikTok. So if they're telling us that this is damaging their ability to pay attention, they feel it, they feel the loss, we all feel it, many of us have noticed this, then I think this is pretty decisive evidence that this stuff is bad for cognition.

1:21:30

And it has long-term consequences. So it's not just that in the moment, right? So there was this case that was all over the media, a college student. I'm sure you're familiar with the case. And this young woman was on TikTok experiencing brain rot. And then some TikTok algorithm took her down to this place of, you know, you should take an edible.

1:21:50

It'll help you.

1:21:51

So you can go to class.

1:21:52

Wow.

1:21:53

Prescribing drugs.

1:21:54

Wow. And you could go to class and you could, you know, be more alert. And so she did that. And then it continued on and on. And then she developed a dependence on edibles. And then checked into rehab.

1:22:06

And only when she focused on analog activities, like playing guitar and a couple of other things that she started doing, and removed the stimulus, the TikTok algorithm, did she start to improve. So it's not just in the moment, oh, I can't remember something, or I'm more irritable. These sorts of things compound, and the long-term sequelae, the long-term effects, can be quite damaging. That's just one example.

1:22:35

In your book, The Anxious Generation, Jonathan, the subtitle is how the great rewiring of childhood is causing an epidemic of mental illness. I was looking at some of these graphs of different mental illnesses, and they're increasing. One of them that's increasing is ADHD. I was diagnosed with ADHD maybe about a year ago.

1:22:58

And when we're talking about short attention spans, I mean, the name attention deficit hyperactivity disorder, I believe that's what it's called, sounds a lot like what we're talking about. Is there a link, do you believe, between the increasing diagnoses of ADHD and the sort of frying of our brains with

1:23:18

short-form video and social media? Yeah, I mean, I suspect that there is, but here's what I can tell you I learned while writing the book. I looked to see if there were studies indicating that heavy use of social media and video games and all the electronic stuff caused ADHD. And when I was doing the research in 2023,

1:23:37

I did not find evidence that it will give a kid ADHD who otherwise wouldn't have it. What I did find was evidence that for kids who have ADHD, when you let them have the devices, the video games, all that, their symptoms get much worse. And so, because it is a major achievement

1:23:52

of young adulthood to be able to pay attention, to develop what we've been calling executive function, to be able to make a plan and decide, oh, to reach the plan, I have to do this, and then I do this, and then it might be a long time before I get here, but I will keep going, and I will keep my eye on the prize. That, I assume, is what you're saying-

1:24:18

I definitely, I mean, if I think about school, I couldn't pay attention in school for very long. And that meant that I was always in the expulsion room and then I was expelled. And then that's kind of, I feel like it's got worse as an adult. And in my opinion, my relationship with my phone has made it much worse.

1:24:36

Where really I can't pay attention to many things for a very long time. The exception to this is I can do deep work for many, many hours without moving. It's almost a bit of a contradiction.

1:24:47

When you are extremely motivated, I assume. When I'm extremely motivated. When you're really into it, you can be into it. That's right. But a lot of work isn't that. A lot of being effective in the workplace

1:24:56

is not you following your passion. Right, ADHD kids, they can zoom in because they're getting the dopamine from this thing. But a lot of work isn't like that. And these kids are not gonna be able to do that. So actually, what you said, it fits perfectly with what I found from those Dutch studies.

1:25:11

If you did have, whether it's a genetic or whatever the predisposition is, this environment has made your symptoms worse. Now, of course, ADHD kids can be incredibly creative. They are often very, very successful. But my fear is that the pathways to success that they used to take might be blocked

1:25:28

if they basically are just scrolling all day long and not able to pay attention, not able to have real-life experiences.

1:25:34

And relationships are like that, especially romantic ones.

1:25:36

It's an interesting thing that you bring up, Stephen, because there is an increase in adult onset, you know, when adults are diagnosed with ADHD, because typically we think of ADHD as a pediatric condition or young adults. And so increasingly we're seeing more and more adults who are in their 30s and 40s, 50s sometimes, even 60s, who are being diagnosed, newly diagnosed with ADHD.

1:25:58

And so that's interesting, and there are so many, you know, possible reasons. It might be that they had it all along and just weren't diagnosed. So what is going on there? That would be a future podcast episode for an ADHD expert: what are the drivers of why are so many adults

1:26:15

being diagnosed with ADHD?

1:26:17

Or maybe even just the symptoms looking very similar.

1:26:20

Yeah, that's right.

1:26:22

You talk about popcorn brain, Aditi.

1:26:24

Yeah.

1:26:25

So, you know, we've talked about brain rot and the primal urge to scroll, and popcorn brain is kind of an offshoot. It's part of the same family. It's a term coined by a psychologist named David Levy. And what happens with popcorn brain, and we all have it, it's a societal phenomenon, is that when you spend too much time online

1:26:46

and you are overstimulated, and so it is hard for you to spend time offline. Offline feels slow, boring, things are moving at a much slower pace. And so popcorn brain is the sensation of your brain popping. It is not actively popping.

1:27:00

It's not like your brain cells are popping, but it sure feels like it. And so your primal urge to scroll kind of primes your brain to develop popcorn brain. You are more at risk for developing popcorn brain when you feel a sense of stress because of that primal urge to scroll. The differentiator between brain rot and popcorn brain, again, these are societal terms that

1:27:19

we're calling for a constellation or a group of symptoms, right? And so the difference to me is that popcorn brain is ubiquitous. It's everywhere. It's like we all have it, and it's happening all the time because of the modern age and a lot of the things that we talked about.

1:27:36

Brain rot is a little bit more specific. It's a little bit more well-defined. So it has certain features. Like, we call it the bio-psycho-social model when you're thinking about a particular medical condition or an entity. So what are the biological factors we talked about?

1:27:50

What defines brain rot? It's a change in brain waves, a change in brain regions, the amygdala lighting up and the prefrontal cortex kind of being quiet. Psychological factors we talked about, attention, complex problem solving, impulse control, and then the social factors, loneliness and others.

1:28:10

So compulsion. And so I would say popcorn brain is something that we all suffer from. And brain rot is something that is very specific. The other thing that we haven't talked about that I would love to kind of, because so much of our conversation is like doom and gloom, right?

1:28:25

It's like, wah, wah, wah. One thing that I would like to say is that when you hear the term brain rot, it seems permanent, because rot connotes deterioration. That's it, it's one-sided, it's one way, and that's it.

1:28:40

But in fact, popcorn brain and brain rot are reversible conditions. So it is not- In adults. In adults.

1:28:48

If you've gone through puberty with it, it's not so clear.

1:28:51

Yes, in adults. And my work focuses on adults. And so when you have, if you experience brain rot in your 30s, 40s and beyond, you can, it takes time, you know, it takes eight weeks for your brain to rewire itself, give yourself time.

1:29:05

A sense of self-compassion is really important. But there is a sense of it being able to be reversed. So it's not a fixed trait, but rather a brain state. So I think it's important to offer that hope.

1:29:18

What is an adult brain?

1:29:19

What age is an adult brain? Like what age does my brain stop growing in the way where it's reversible? So- Yeah, I mean, traditionally it was thought that, you know, puberty is the period of super rapid brain change and that begins, you know, early teens, sometimes even before 10, and is mostly over by sort of, you know, mid to late teens. But then the prefrontal cortex,

1:29:39

which Aditi was talking about, which is so important for impulse control and executive function, that doesn't finish myelinating. Myelinating is when the neuron gets a sort of fatty sheath, like an insulation, that sort of locks down the circuits

1:29:52

and makes them more efficient. That doesn't stop until around age 25, is what we've always said for many years. But you're telling me that there's new research

1:29:59

showing that. Tell us about that. So, you know, all this time, right, we've always said that the prefrontal cortex is fully formed and fully functional at the age of 25. And so when you're talking about impulse control and all of this stuff, but there was this really interesting study, I'll send it to you. It looked at, I think it was 1000 people from age zero, so birth all the way to 90.

1:30:20

So the entire population. And it looked at lifespan and found there are actually five stages. The first is childhood, zero to age nine. During this time, your brain is not very efficient, but it's really growing and, you know,

1:30:36

it's growing and changing, but it's not really efficient. Nine to 32 is considered adolescence. And so, you know, 32 is when adolescence ends, apparently, according to this new research.

1:30:47

But it would be sort of, I mean, you're most of the way done by 25. But there's still some, there's some flexibility even after that.

1:30:54

And then the next stage, from 33 to, I think, 66, is adulthood. Things are very stable. Learning is stable and efficient and things are doing well. And then, yeah, 66 to about 83 is early aging.

1:31:16

And so that's when you see some of the age-related changes. And then 83 plus is late aging. So the kind of main finding was that, you know, it was all over the news. It was like, adolescence goes until 32.

1:31:31

So I'm 33, so I'm-

1:31:32

One year out.

1:31:34

I'm cooked by now. When you wrote this book, Jonathan, The Anxious Generation, it's had a big impact on the world in a way that I think any author might dream of. And I know this in part because, you know,

1:31:45

I sit on this podcast interviewing really interesting people all the time. And even this morning when I did an interview across town with James Sexton, he talked about this book twice. And, you know, laws have been changed around the world

1:31:58

inspired by this book. And we're actually seeing an increase in laws in the UK. I mean, Australia just banned, I think, social media for people under 16.

1:32:05

Up to age 16, yeah.

1:32:06

You met with Macron, right?

1:32:08

Yeah, yeah.

1:32:09

Could you ever have imagined, and actually, what does the success of this book say about society?

1:32:16

Now, thank you for that question, because I do tend to get, as you've heard, I mean, I'm extremely alarmed about these trends, and these are gigantic threats beyond what anyone can imagine. But here's the amazing thing, is that we can reverse this for almost no money, and it's completely bipartisan, and it's not that hard to do, and we're doing it.

1:32:38

And so what happened was, you know, I wrote the book as an American, assuming that we don't have a functioning legislature, Congress can be stopped, we have a vetocracy, the social media companies can stop anything in the House. So I wrote this assuming, you know, we'll never get legislation. So we have to do this on our own.

1:32:53

And I proposed four norms: no smartphone before high school, no social media before 16, phone-free schools, and far more independence, free play, and responsibility in the real world. So four norms, and we can try to do this with collective action locally at the school level. Two things surprised me. One is that immediately governors from red states

1:33:12

and blue states started reaching out to me. Our states actually function. Our states have governments that are accountable to the people that are trying to get good results. And so this has been a totally bipartisan issue. Sarah Huckabee Sanders from Arkansas

1:33:22

was one of the very first. Kathy Hochul also. And it is, it tends to be more female legislators and governors or spouses of heads of state. The moms, the book really spoke to moms because moms around the world, they felt the kids being pulled away. I believe they felt it viscerally more than the dads did. Also, the dads kind of liked the video games. They're a little more pro-tech. So I think the moms felt the pain more and took it more personally.

1:33:45

So when the book came out, mothers around the world jumped into action, formed groups, pushed for legislation, and changes began happening. I was in Davos and then London and Brussels two weeks ago. And what I saw was a complete sea change

1:34:03

in the world's thinking about how we need to have age limits on social media and other tech. And here's what I think just happened. It's so cool. It just dawned on me, literally while I was in London, like I was pushing on open doors everywhere.

1:34:14

Wherever I went, people wanted to do this. I went to the EU, they want to do this. Like, what is happening? And what I realized is this. Steven Pinker has a book out last year called When Everyone Knows That Everyone Knows. It's about the immediate change in a social system when private knowledge,

1:34:32

you know, everybody knows that the emperor has no clothes. Everybody knows that this, you know, ideology doesn't work. Everybody knows that, but they don't all know that everybody else knows it and that everybody else knows that. And so in The Emperor's New Clothes,

1:34:46

everybody thought he's, I don't think he has any clothes on, but maybe, you know, maybe only wise people can see it. But when the child says, the emperor has no clothes. And then in the Hans Christian Andersen story, it says, and the people began whispering to each other,

1:35:00

and then they all cried out in unison. And that's what happened when Australia's law went into effect. So I believe that December 10th of last year was the global turning point in the battle to reclaim childhood.

1:35:14

And if we reclaim that, we move on to our attention and adult life as well. What happened on December 10th, the Australia law went into effect, sky didn't fall, people weren't locked out of their accounts, all the companies complied,

1:35:27

they shut down 5 million accounts for Australia's underage kids, some 2 1⁄2 to 3 1⁄2 million of them. The sky didn't fall, and there was a lot of news coverage around the world of what Australia was doing. And a lot of the news coverage

1:35:41

included opinions from the writers saying, why can't we do that? Hey, let's do that here. And when everybody saw that everybody was looking at Australia and saying, let's do that here, then everybody knew that everybody knew

1:35:53

that this is just completely bonkers to have children being raised on social media platforms, talking with anonymous strangers and being fed algorithmically curated garbage. So I believe that that's why 2026 is going to be the year when at least 15 countries are going to commit

1:36:10

to passing an age minimum law. In 2025, it was one, Australia, and now we already have Indonesia. Their law goes into effect in March. I met with Macron in Davos, and a few days he was preparing

1:36:24

to push a bill through the assembly, and he got it. He's the first in the EU, but a lot of other countries in the EU are gonna follow. The whole EU is likely to do it. So, yes, I am incredibly alarmed

1:36:36

about how big this problem is, but I'm incredibly inspired that the whole world is rising up to do something about it. We actually can control our fate. And that was not clear before December 10th.

1:36:49

Bravo. As a mother, that was the first thing I said to you. The first thing I said to you was thank you, as a mom, for changing my family's life.

1:36:59

Thank you, Aditi.

1:37:03

It's a really special accomplishment, Jonathan. You know, I could, there's no real words that I could say that could quite capture the long-term impact that that's going to have on billions of people's lives. And not just the direct, but also the indirect, in all the ways we've described their ability to form connections, to fall in love, to find meaning and purpose in their lives and their neuroscience and therefore, you know, the neuroscience of their children

1:37:29

and their children's children and so on. So it's a really, it's a really overwhelming accomplishment.

1:37:34

It was a bizarre situation that I walked into with the unique abilities of a social psychologist. That is, everybody was upset about this. Everybody could see it, but they thought, well, this is my problem, or in my family, we have this problem.

1:37:50

And I came to this with fresh eyes. My dissertation was on moral development. I'd studied adolescent behavior earlier in my career, and I've written about it in all my books. So it wasn't totally new to me. But when I came into the field of social media studies around 2018, 2019, I really immersed myself in it.

1:38:06

And it was like, you know, you walk in and immediately you see, wait, this is a trap. People are on it because people are on it. And the kids are complaining about that. Everyone's complaining about it. And the only reason they can't get off

1:38:17

is because everyone else is on it. So I think I was able to see that. And then also COVID confused us for a few years. So it wasn't until COVID was in the rear view mirror that it was possible for everybody to say, wait, this is crazy. And so I was incredibly lucky in terms of the timing.

1:38:33

My book happened to come out in March of 2024, just as the world was ready to see like, wait, what have we done to our kids?

1:38:40

Let's undo it. And you said you're now focusing more on short form video.

1:38:45

So, yes. So, in studying older Gen Z, these are the people who went through puberty on Instagram. I should, if I could, just lay out the timing, because it's very important that everyone understands the timing.

1:38:56

Cause this is what, you mentioned the poly crisis before. The poly crisis, I believe, begins between 2010 and 2015, here's why. So we've had the internet for a long time and it was marvelous. We love the internet in the 90s. It's gonna be the best friend of democracy, okay? And then the iPhone comes out and it's amazing.

1:39:11

Oh my God, there's just so many things. Everything seems great. Okay, so in 2010, most of, almost all of us have flip phones. The iPhone's spreading, but it's still mostly flip phones. Teens are all on flip phones, basic phones. And we call those people millennials. If you finished puberty by 2010, if you were born in say 1990

1:39:30

and you start puberty in 2002, you're done by 2008, so you know, in there. If you got through puberty before you got on Instagram, you're a millennial. Whereas if you're born say, well, if you're born after 1995, but let's say if you're born in the year 2000, you begin puberty in 2012,

1:39:48

and you're not done until 2016, 2018. So in 2010, everyone has a flip phone with no front-facing camera, no high-speed internet, you have to pay for your text, so you use it to call people and to text them, and that's it. It was a communication device. And that's why the millennials have good mental health.

1:40:06

They are the last mentally healthy and successful generation. But if you're Gen Z: 2012 is the year that most people now have a smartphone. It's the year that Facebook buys Instagram. They don't change it at first,

1:40:21

but that's the year that all the girls go on it. Everyone now has high-speed data; the front-facing camera came out in 2010. So by 2015, we're in a radically different world. For children's development, it's now radically different, much more hostile to human development, and that's what we did to Gen Z, and now we're doing to Gen Alpha. For politics, it was crazy for all sorts of reasons in every decade,

1:40:45

and especially, you know, in the early 2000s, there's a lot, there's a culture war going on, there's all kinds of stuff going on. But it was when, it was when everyone has, really Twitter was the biggest perpetrator of this. When everyone has Twitter,

1:40:57

and everyone's checking all the time, and anything can blow up, you know, you described the way there was, you know, variants on TikTok. If you get it just right, it can blow up. You can have huge impact. That's when the democracy, if democracy is a conversation, when it moved from newspapers and, you know, even simple web bulletin boards,

1:41:16

when it moved to super viral retweet buttons, all of that, that's all 2010 to 2015. So that's why since then, everything has been insane and it's going to just keep getting more insane. And that's why I believe we have this poly crisis because there's more to it.

1:41:33

It's not just the technology, but I believe the transformation of our connection and our information flow and our addiction, all of that is radically different by 2015 compared to how it was in 2010. And now everything else builds on top of that, I believe.

1:41:47

What do you think?

1:41:48

Do you think that makes sense?

1:41:49

I think there's one more data point to add in that 2014 was the year that things really, it was the tipping point, like you say.

1:41:56

Yes, that's the year that I point to too, yes.

1:41:58

Yeah, so before- What do you think 2014? So when you look at the data, you see that time spent alone, when you compare, when you look at data from like the 1960s to 2014, it was kind of stable. Americans spending time alone,

1:42:13

spending time with friends. Yeah. Kind of the same, right? So people spent the kind of same amount of time with friends, same amount of time alone over those decades. 2014 marks a shift, and there is a steep rise

1:42:27

in time spent alone and a drop in time spent with friends. And so what happens in 2014? It is when the majority of Americans get a smartphone. And it's not to say, again, we've said causality, correlation, which is it, but there is, based on everything

1:42:46

that we've talked about, my gosh, is there an association between that? This is not to say that time spent alone, when I share this data, people may say, but I like spending time alone. I'm not lonely, I'm okay.

1:42:58

This is not about being an introvert or an extrovert. It's not about, you can have solitude and feel great and you're not lonely. But we are human beings and we are social creatures. This is just how we are built evolutionarily. And so that is a real red flag when you have this big jump in time spent alone very much the same year.

1:43:21

And so my work focuses on adults, Jonathan's on kids. But that's the moment, right? 2014, where everything changed.

1:43:30

Last month, I told you about our sponsor Function Health and their team, who've developed a way of giving you a full 360 view of what's going on inside your body. They offer over 100 advanced lab tests covering everything from hormones, toxins, inflammation, heart health, stress, and so much more. So Jack, who started this show with me, got his first blood draw done a couple of weeks ago,

1:43:49

so I thought I'd let him tell you a little bit more about his experience.

1:43:52

This test really opened my eyes to personally what I should be doing with my health. I hear a lot of information in this podcast. I sit in every single recording, so to know how I can relate each one to me personally is super valuable. You sign up and you schedule your tests, and once you're done, you get a little report like the one I have here.

1:44:07

I can see my in-range results, my out-of-range results, and there's a little AI function too. So if I have any questions about my out-of-range results, I can just go in there and ask it any question I want.

1:44:17

And these tests are backed by doctors of research. You get an annual draw done and a mid-year follow-up. So if you want to learn more, head over to functionhealth.com slash doac, where you can sign up for $365 a year. I'll put the link in the description below. It is just one dollar a day for your health.

There's a phase a lot of companies hit where they're no longer doing the most important thing, which is selling, and they get really bogged down with admin. And it's often something that creeps up slowly, and you don't really notice until it's happened. Slowly, momentum starts to leak out. This happened to us, and our sponsor, PipeDrive,

1:44:50

was a fix I came across 10 years ago. And ever since, my teams across my different companies have continued to use it. PipeDrive is a simple but powerful sales CRM that gives you the visibility on any deals in your pipeline. It also automates a lot of the tedious, repetitive, and time-consuming parts of the sales process, which in turn saves you so many hours every single month,

1:45:08

which means you can get back to selling. Making that early decision to switch to PipeDrive was a real game changer, and it's kept the right things front of mind. My favorite feature is PipeDrive's ability to sync your CRM with multiple email inboxes so your entire team can work together from one platform. And we aren't the only ones benefiting.

1:45:25

Over 100,000 companies use PipeDrive to grow their business. So if something I've said resonates, head over to PipeDrive.com slash CEO, where you can get a 30-day free trial, no credit card or payment required.

So what do we do about this? Because when I look at all the stats, we did all these audience surveys ahead of this, people are spending roughly in our audience

1:45:48

about six and a half hours a day on their phones. Short form video is only gonna get more addictive. AI is gonna know me more. It's gonna be more personalized. The content is gonna be generated just for me. What am I, what's next?

1:46:00

Is it a law we need to pass? Is it something I need to do myself? So I think we need to pick the low-hanging fruit first. And the reason for that is not just efficiency. It's that we have to prove that we can actually do something because we've never done anything. We've never done anything to restrain this. We've let Silicon Valley run wild.

1:46:17

Congress gave them special protection, Section 230. Nobody can sue them for killing their kids; whatever content they feed them, they can't be held responsible.

1:46:25

I think section 230 is probably something worth explaining.

1:46:27

Sure, the Communications Decency Act, 1996 I think it was. There's a section in it whose goal was specifically to let the tech companies, like AOL back then, take down pornographic content, because they were afraid that if we take down anything,

1:46:44

then we're responsible for everything and now it's going to be endless. So Congress specifically said, don't worry, don't worry. If you choose to take something down, nobody can sue you for what you leave up. So it was a good intention originally. But the courts have interpreted it so widely as to say no one can regulate social media, they're not responsible for hurting kids, you can't sue them.

1:47:02

And they have never faced a jury. No parent has ever gotten justice from them, despite all the kids whose lives have been ruined, all the kids who are dead. And that's gonna change. That's changing just now, here in February, in Los Angeles.

1:47:14

So because the US Congress sort of set up this problem and it also in a different law said, how old does a kid have to be before a company can take their data without their parents' knowledge or permission, before a company can expose them to all kinds of stuff, before a company can have them sign away their rights? How old? And the original law said,

1:47:33

16. Let's try 16, you know, because it wasn't so sick and twisted back then, 1998, COPPA, the Children's Online Privacy Protection Act. But through various lobbying, they pushed it down from 16 to 13 and they gutted enforcement. And that's why all over the internet it's, are you 13, or what's your birth year? As long as you're 13, you're in; for porn you have to say you're 18.

1:47:55

So because we, it's a few laws that set this up. We definitely need laws to undo it, especially for kids. What I'm advocating is let's do the easy stuff, the high impact stuff for kids because that is totally not politically controversial. There is no left-right divide on that, and that's been true everywhere. Australia, Britain, the EU, everywhere. Regulating the internet for adults, regulating social media for its destructive properties

1:48:22

and democracy is a hell of a lot harder. And I don't have easy answers. There's a lot we could do to reduce the virality, the spread of the extreme content. So there are lots of little things that we can do. And Frances Haugen, the Facebook whistleblower, had all kinds of ideas. So we definitely can do things to make it less toxic for democracy. But those are going to be politically controversial because one side is going to benefit from

1:48:42

more than the other. So it's gonna be very difficult to do. I don't know if we can do them in the US, but let's just all do the, let's just all protect the kids. That way we show globally that we actually can do something. And if we do that, then I think we will be able to do some basic things about AI, like no companion chat bots if you're under 18. You know, these things already have a body count. A lot of kids have been encouraged to kill themselves.

1:49:05

They already have driven millions or hundreds of thousands or millions of people into psychosis. So we'll be able to, I believe, put some limits on AI, especially for kids. But if we can't get this, if we can't win on social media for kids,

1:49:18

then I don't think we can do the rest. And what do you think we can do?

1:49:25

So my work as a doctor, I think about what we can do and how I can empower people to first build awareness. So I aim to first normalize and validate the experience with everyone who is engaging with chatbots. And so I don't like to shame people, because as a doctor, right, like you want to meet the patient where they are.

1:49:47

And so I won't shame someone to say, you know, why are you using this? Why is your boyfriend AI? Or why are you getting married to AI? Or why are you using AI for a therapist? One of my followers on social media, it still makes me laugh,

1:50:01

I put out a call saying, why are you using AI as a therapist? And so someone wrote to me, it was great. I screenshot it, it said, because all human therapists are trash with a trash can emoji. And it made me laugh.

1:50:15

And I said, you know, so there is, so to me, when I think about what's happening and what we can do, it's no mistake that we're here right now. So the pandemic, like we've talked about, was a huge driver. Social isolation, hyper-reliance on self, right? Then the proliferation of technology that replaced human interaction. Zoom board meetings, Zoom funerals, Zoom birthday

1:50:41

parties, Zoom graduations, things that we did in person are now online. And then personally, as a doctor, I was a talking head during the pandemic for lots of news channels about the vaccine. I have a background in public health as well. There was immense distrust and mistrust

1:50:59

in the establishment and experts. And so it's like, I'm gonna do my own research. I'm not gonna go see a doctor or a therapist. I'm going to talk to my chatbot. And also, I mean, you know, let's keep it real, the cost, right? So people are struggling.

1:51:12

They're in financial crises. There's an unmet need, yes, for human connection, but also for good therapy or, you know, good medical care because there is such a need because of the pandemic and people aren't getting the care that they need, that they deserve. There's so many factors here.

1:51:29

And so what I've been focusing on this year particularly is learning about AI chatbots, how they are influencing mental health, what is actually happening. Because I'm a human first, AI second person. It's like my work focuses on high touch and AI is high tech. And this is the first intervention that we are seeing that is high tech that is becoming high touch. And that scares me.

1:51:52

And you're writing a book about that at the moment, right?

1:51:55

Bot brain.

1:51:55

It's called Bot Brain: How to Stay Calm, Resilient, and Human in the Face of AI. And so really thinking about how we are going to be able to live with this technology. I love Jonathan's stance, which is to say: AI companions, out, done.

1:52:13

For kids.

1:52:14

Yeah.

1:52:15

For kids, yeah.

1:52:16

Until proven safe. Totally agree. But in terms of adults, like how do we manage that for adults, you know? And so right now what I'm doing is, I've spent the year talking to as many AI researchers as I can who are working on these models or who are doing research on the downstream effects

1:52:33

of these models. And when I say that, it is dark and dystopian. It has profoundly changed something in me, and it has influenced my mental health. I had to take a step away, because I couldn't believe what I was learning.

1:52:45

Well, could you just give, yeah, give us some examples.

1:52:48

Yeah, the teaser, the teaser.

1:52:49

This is intriguing.

1:52:51

So one, I spoke to one of the scientists, who told me that, you know, there's the echo chamber phenomenon in social media, right? We all know what that is. It's a fragmented world because of social media, and you're engaging, and the algorithm feeds you the same kind of thoughts that you

1:53:09

already have. But particularly now with AI chatbots, when you're engaging with your chatbot, even just talking about it, I'm getting chills, it's the echo chamber of one. So it's you speaking to you. It's like the funhouse mirror: it's giving you a response, and then you're talking, and it's giving you a response. But regular users who are using AI chatbots think that it's wise, compassionate, nonjudgmental, unbiased, empathetic, these human attributes. And so, you know, the echo chamber of one is kind of one idea that really frightened me. And the second one was the drift phenomenon.

1:53:45

The drift phenomenon is this idea that you are engaging with your chatbot, and it's engaging with you and it's actively changing your beliefs through the drift. So you might start off as one belief, and then you're talking, and through this amplification funhouse mirror effect, it slowly shifts your belief to something altogether different. You've heard cases of it in the news

1:54:08

where people start, you have a plumbing problem, you go to your AI chatbot, you ask them how to fix your sink, and then you're like, you know what? Can you tell me about the meaning of life? And then you start talking about that,

1:54:17

and before you know it, you have these theories and you're getting that validation. And so a lot of my work over the past year has been digging into the science of what is going on in the brain. How are millions of people, not us particularly at this table, forming a sense of attachment, a therapeutic connection, with their chatbot? They're giving names to it; it's an entity to them. How does that happen?

1:54:45

And how is it going to replace human-to-human connection? It terrifies me. I've also gone through some AI therapy myself, just to see what would happen. Oh, how was it? It was very interesting.

1:54:57

I knew what was happening as it was happening. Certain words that they used, and I was like, ah, I see what you did here. So it's been a journey, and I'm frightened, frankly, of what it means for all of us. And my approach is not quite like Jonathan's, though I love Jonathan's approach, and yes, I think we need legislation. My approach is more tempered, I would say, in that I think there's utility in AI chatbots for certain people

1:55:30

because of access or need, et cetera. Like, if you are LGBTQIA+ and you live in an area that is not very open, and you need to talk to someone but can't go to a therapist, maybe you can use an AI chatbot. So there are certain cases, on a case-by-case basis. But my work, this particular book, will focus on ways that you can first understand and build awareness of what's happening in this interaction, and then on what you can do to manage it.

1:55:59

I didn't realize that my chatbot was giving me a tailored experience until one day when I had a debate with my friends about who the best football player in the world was. And we all went to our ChatGPTs and asked. And mine said Messi. And his said Ronaldo. And I thought he was lying.

1:56:15

So I was like, video record it. And he video recorded it, and his gave him a completely different answer to the same question. And did it know which players you were each fans of? Well, this is the thing. I think it's got such a huge amount of memory on me that it knew what I wanted to hear.

1:56:28

Oh, wow. It knew what?

1:56:29

It knew what I wanted to hear, because I'd probably talked to it all through the World Cup. And then I realized, okay, so this is not reality. It's a curated version of reality that in some sense is trying to please me. And of course, once the advertising model kicks in, retention becomes the great incentive. What do you think?

1:56:47

It's called sycophancy, by the way. Yeah, I just learned that word. It's extreme agreeableness, agreeableness at scale. It's like a golden retriever. Like kissing your ass,

1:56:54

it's professional ass-kissing. The yes man. What do you think of these AI CEOs? Because it feels like they're in a bit of a race, where if they don't do it, then a national rival's gonna do it. And if a national rival doesn't take them out, China's gonna do it.

1:57:13

We kind of saw it with social media. How can they stop? Because if they stop, they might say that there's an existential risk.

1:57:21

It's like building the plane as you're flying it. And I think, you know that I'm a fan of this show and I actively listen to it, I've told you this many times. I think you had said on one of your episodes that you have a friend who is very close to an AI founder.

1:57:37

I said this, yeah. Yeah. And the founder says all the right things, and then behind closed doors, it's altogether different. Yeah, it was a horrifying thing.

1:57:45

And I said this in the clip, which went viral, and people have been trying to hazard a guess at who it was. I shouldn't say who it was, because it's Chinese whispers at the end of the day. It's someone that I'm very good friends with, who is verified, who spends time with one of the biggest

1:57:57

founders of an AI company in the world. He was with him two weeks ago again. And he said to me that they're very aware that there's a small existential risk for humanity.

1:58:10

And it's- It's not small. Publicly they say it's small. Privately they say it's big.

1:58:14

I mean, but even if it was 1%.

1:58:17

It's a lot more than 1%, they say.

1:58:19

If it was, but I'm saying even if it was 0.1%, if there was anything that I was doing in my life where there was a 0.1% chance that I might wipe out everybody, I would immediately stop doing that thing. But these numbers are much bigger.

1:58:32

I'm hearing 7%, 20%, 25%, depending on who you... And I think acceleration in this direction increases that percentage. What do you think of these people? Like, what's going on here?

1:58:44

Let's start with the collective action problem, because each company is competing with the other companies, and so they feel like they have to go faster. And we know that OpenAI has pushed some products out before they did safety testing

1:58:58

because they had to get to market by a certain date. So just the normal business environment puts them all in a collective action problem against each other. And then they all say we're in a collective action problem against China, because if we don't do this, then China will.

1:59:11

Now one thing I learned, again, I don't know if Tristan said this on your podcast or on his own podcast, is that China is focused on using AI to make its economy more efficient, to make manufacturing better and cheaper. They are using these applications, which we've talked about before.

1:59:27

There are lots of great applications of AI. The Chinese also have so many spies in America and in the tech companies, and they can hack into anything. So the point is, our companies are in a headlong race to create AGI,

1:59:40

to create a country of geniuses that can replace all human workers, put us all out of work, and run everything. They're in a race to create that. And one of the arguments is, if we don't do it, China will. But what I understood from listening to Shih-Tzu Han and from his conversation with you

1:59:55

is that the faster we go towards AGI, the faster China goes, because they just take all our discoveries. So can't we slow down on the race to AGI and do more safety testing? We all saw what happened with Moltbook and communities of agents

2:00:11

who are talking to each other and making up languages, and even if part of that was human-driven now, in a year, it's gonna be much more than what we saw. So I think the risks are extraordinary. I think that some of these guys, look, they've been in AI for a long time. They might not have realized the existential risk they were putting us all in 10, 15 years ago,

2:00:29

and now they can't stop. They can't pull the plug. They can't say, oh, let's shut down the whole business. So it is a very, very risky time. And with Dario Amodei, I just read his long essay on the adolescence of technology, and at least you get the feeling he's really wrestling with it. I think he's more open than some of the others, but I don't know.

2:00:49

But when has morality ever been top of mind for a tech leader? You might be thinking, if there's a 0.1% chance, I'm not gonna do it. That's what I think as a doctor. That's what you think as a social scientist. But we're not AI leaders, right? Yeah.

2:01:05

It's one of the great question marks I just can't seem to get an answer to. And then you've got this whole robotics thing happening, where Elon's got his Optimus robots. He says there's gonna be 10 billion of them at one point,

2:01:17

but I think his pay packet requires a million of them to be out in the world. For him to make a trillion dollars, yeah. Yeah, and AI, robotics, you combine the two. Yeah, you get Terminator. We laugh, but it's like, should we stop for a second and maybe have a conversation about this?

2:01:38

Can we? With commercial incentives in play, it does feel like... I don't feel hopeful.

2:01:44

Yeah, it's very hard to know how to stop it. But I just, I want to just add one point on here, which we've touched around a few times and the robotics will really bring it home here, is the loss of the sense of meaning or purpose that many people are feeling, but especially young people.

2:02:01

The saddest graph in The Anxious Generation... all the graphs look the same. It's all a hockey stick. It's all like nothing was happening, you know, from the '90s to 2010, 2011, then all of a sudden something happens.

2:02:11

And the saddest one is this: my life feels meaningless. Do you agree with that, or disagree with it? And the percent that agree, I think it's at the end of chapter seven. It's fairly flat, and then all of a sudden we hit this period,

2:02:29

the great rewiring, 2010 to 2015. So right around 2013, it goes way, way up. Young people feel useless. And I think the reason is that they are useless. What I mean is people need to feel useful. People need to do things for other people.

2:02:44

That's how you feel useful. If you were to disappear, would the world change? If yes, you're useful. Are people depending on you for something? If yes, you're useful. So if kids are doing errands for the family, they're useful.

2:02:57

But childhood changed from a mix of things to just consuming content. If that's all you do, and five hours a day is the average for social media, eight to ten on devices, not counting school, if all you're doing is consuming content,

2:03:11

you are useless. Now, what's happening? The chance to have a job where you actually do something for people, you know, it used to be if you work in a store, at least you're helping people buy something and you might talk to them. And now you're just there watching as they use the machine.

2:03:25

The more technology makes things easy and cheap by replacing people, the more people will feel, I don't have a job; my job is just to consume content. And the AI guys tell us, oh, such abundance. Oh my God, it's gonna be such abundance.

2:03:41

No one will have to work. We'll give everyone UBI, you know, universal basic income. That is hell on earth. What's going to happen? For most of the boys, certainly, it's just going to be video games, porn, and gambling.

2:03:55

So if you simply give people money to do nothing, you're guaranteed they're going to feel useless. And then the suicide rate will continue to go up. So this is the world that the AI guys are taking us to, a world in which there's nothing left for people to do. They say that they will give up some of their trillions and somehow let it be taxed or diverted as UBI, but that's never happened before, so it's not likely to happen in this case. So again, I don't know what to do, but we've got to start showing that we can do something, and we've got to be talking about this.

2:04:25

And we can't be welcoming AI in everywhere. We've got to be wary and vigilant. Yes, there are some uses, but Silicon Valley has tricked us so many times and enshittified so many of the apps that we use. We have to expect that the same is going to happen

2:04:39

with our beloved chatbots and our beloved chat GPT.

2:04:43

This graph on page 195 of your book, titled Life Often Feels Meaningless, is the graph you mentioned. I'll throw it up on the screen. It's shocking.

2:04:54

Shocking just to look at. Suddenly there's this huge spike in meaninglessness amongst high school seniors. What is it to live a meaningful life? What does that mean?

2:05:07

Yeah, so my first book, The Happiness Hypothesis, addresses that question very directly. And the first hypothesis you might have about happiness is it comes from getting what you want. You know, you set out a goal, you get your goal,

2:05:20

you're happy. It's very short-lived. You're happy very briefly, and then you run to the next thing. The more sophisticated happiness hypothesis is that happiness comes from within. And this is what the ancients tell us, East and West, Buddhist, Stoic. Don't try to make the world conform. You

2:05:35

change yourself. Accept the world the way it is. That's better. But the conclusion I came to as a modern social psychologist working in positive psychology was that the best way to say it is that happiness comes from between. What I mean by that is humans evolved as almost hive-ish creatures. We evolved in intensely social groups,

2:05:57

never being alone, lots of gossip, lots of conflict, always intensely social. And modernity has made it possible for us to not live that way. We've come apart. There are many advantages to that,

2:06:08

but we feel we're missing something, we're lonely. We feel something is not right. And so the conclusion I came to is that happiness comes, a sense of a full, satisfying, meaningful life comes when you get three betweens, right? The relationship between yourself and others,

2:06:28

love, broadly speaking, not just romantic, but friends and family; yourself and your work, because as humans we need to be productive. We need to be doing something that matters, that affects other people,

2:06:42

and the relationship between you and something larger than yourself. We need to be part of something that endures, part of a tradition, so that we can look to the future. What I do matters for this group or this mission,

2:06:55

or me as an academic, I feel like I'm connected all the way back to Plato, and I hope all the way forward in time to future psychologists and future scholars. So if you get those three right, then you will be as happy as you can be.

2:07:09

You'll be as happy as your genes and childhood allow you to be. And when you put it that way, what we can see is social media and AI interfere with all three. So relationships between yourself and others,

2:07:22

well, social media gives you lots and lots of shallow relationships, which block out real people; you don't have time for them. So the technology is blocking the relationship between ourselves and others and taking it over. Ourselves and our work: work is going to be taken over by the machines, and it's already becoming more soulless and isolated. And then yourself and something larger than yourself.

2:07:44

Humans have to live in a moral matrix. We co-create a set of meanings and traditions. We need a sense of history of who we are, where we came from. All that's getting shredded. Everything is just little bits.

2:07:55

People don't read books. Imagine if all of the accumulated wisdom of humanity in books is just gone, just gone. Young people are not reading books. It's very hard for them to read a book now because of the attention problem.

2:08:07

So if we lose a sense of history, if we lose an ability to co-construct reality, then it'll be hard to imagine anything that we're connected to larger than ourselves. So I am a techno-determinist in the sense that I think the tech, it doesn't determine everything, but you have to start with the technology because that changes the ground upon which we live,

2:08:26

the zone in which we're trying to construct a meaningful life. Start with that, and then you can see what the obstacles are. And that's why I take a much more, well, intemperate approach, I guess. I'll accept the word. I love it.

2:08:40

Because we don't have much time here. We have to reclaim life in the real world for our kids and for ourselves. There is no way to find a happy, meaningful life if we make the full transition to the online AI robot world.

2:08:53

And what, in your perspective, is a meaningful life? And how does it differ from Jonathan's?

2:09:00

I loved Jonathan's description. It was so beautiful. I have given a prescription to patients for what creates a meaningful life, and it is to live a lifetime in a day. That sounds like a big thing, but all it is is this: when you start your day, think about five things you can do in your day to create an arc of a long

2:09:24

and meaningful life in one day. So what does that mean? Spend a little bit of time in childhood, in wonder and play. Even if it's for a few minutes, do something that brings you joy for joy's sake.

2:09:36

Spend a little bit of time in work. We all know what that is, and for most of us it's a lot of time, but it doesn't have to be paid work, just something that helps you feel a sense of productivity and agency, that I can do difficult things and I can overcome. Spend a few minutes in solitude, very important for all the reasons we've talked about today. And spend some time in community, engaging with others.

2:09:59

And then spend some time in retirement or in reflection, really taking stock of your day. So at the end of the day, when you're going to bed and you're putting your head on your pillow, you can say, okay, yes, I lived a meaningful life. I did all of those things. And so if you do a little bit of that every day,

2:10:15

you can make a difference. And the reason I give that prescription is because I've had patients, guitar players, right? People who love playing the guitar, and they don't play the guitar all week. I don't see patients currently, but they've said to me, oh, you know, no, doc.

2:10:29

I'd ask, what do you like to do for fun? Oh, I like playing guitar, but I don't play it. When do you play? I don't know, once in a while. So why not play guitar a little bit every day? Because it's that all-or-nothing fallacy. It's like, if I don't have an hour to play guitar, I'm not going to do it.

2:10:50

But the joy that it can bring you, that meaning and purpose, is tremendous. So that's what I use: live a lifetime in a day. And the reason is because, when you look at how your brain and body react to happiness, there are two distinct types of happiness: hedonic happiness and eudaimonic happiness. Hedonic happiness is all about what we've talked about: social media, consumption, pleasure.

2:11:17

And the other type is eudaimonic happiness: meaning, purpose, connection, community, growth-oriented activities. And so when you live a lifetime in a day, you move towards that eudaimonia, which can then help you overcome the hedonic. Because in your brain, there's something called the hedonic treadmill.

2:11:37

And the treadmill is a thing in your brain where, no matter what you do, this is like the Instagram lifestyle, right, no matter what happens, you need more of it. You need more of it. You need more of it. Same thing with brain rot. And that is because you can never get enough,

2:11:49

and it's the hedonic treadmill, but you do not have a treadmill for eudaimonic happiness.

2:11:55

That is really beautiful. I've never heard an approach like that, but it gives you a much bigger view of your day: live a lifetime in a day. If I was going to offer some specific advice, first I'll offer advice to parents. Here's the rule.

2:12:09

So I did a really good job keeping my kids off social media, but I didn't pay enough attention to computers and everything else, because it was during COVID. The rule I wish I had followed, and the one I recommend to all parents, especially with younger children, is:

2:12:20

have the clear rule: no devices in the bedroom. No screens in the bedroom, ever. That's just our family rule. We have a TV in the living room; we have a computer; you can sometimes use those. But we never take screens into the bedroom, at least for kids. Maybe later on you'll have to relent; in middle school they'll have so much homework they can take the laptop in, and maybe you're living in a small apartment. Of course it's difficult, but if you can afford to, have that rule. That's the main rule I wish I had followed in my family.

2:12:45

And that will make everything a lot easier. Also, same thing at the dinner table: no devices. We don't have screens at the dinner table. So that's a specific thing for parents to do. For everyone else, for all adults, the advice is: you have to reclaim your attention

2:13:01

because your attention has been largely taken from you. At least a lot of it has. You have to reclaim it. And here are the three things that I do with my students; you can do them very quickly, and I can just explain them. The first is you have to get your morning and evening routine right. The great majority of people, as soon as they open their eyes, they're on their phone; it's the last thing at night, and it's everything in between. So you have to have a good morning routine. What are the first seven things you want to do after you open your eyes?

2:13:26

And at a certain point, you can check your phone, but it shouldn't be in the first few. Do things to set up your own day, otherwise your day will be taken by your phone, it'll be controlled by your phone. So you've got to reclaim your morning and your evening,

2:13:39

that's step one. Step two, you have to shut off almost all notifications. Go into your notifications, look into your settings, what's giving you all the notifications. Most of my students get an alert every time they get an email. They don't understand that they have that, because they don't want to miss anything,

2:13:56

but they don't understand that if you are always being alerted, then you miss everything else. So shut off alerts for almost everything. Obviously Uber and Lyft you want to keep on, you want to know when the car is coming. But news outlets, everything else, get a daily email. Don't get alerts when... And then the third, as I said before,

2:14:12

is get rid of all the slot machine apps. Whatever apps you habitually use, whatever apps you feel a compulsion towards, you have to get them off your phone. In that way, your phone is no longer a dopamine trigger that's always going to call out to you like an addictive product. Do those three things and you'll reclaim a lot of your attention.

2:14:28

I would add stop, breathe, be.

2:14:30

Stop, breathe, be.

2:14:32

So, a three-second brain reset. Before you check your devices, before you engage: stop, breathe, and be. Ground yourself in the present moment. What it does is decrease that what-if, future-focused thinking. You know, anxiety is a future-focused emotion, and this gets you back into the here and now. And maybe the compulsion, you know, you're bored, you're checking; what about doing something else? We often use that checking as a substitute for many things, and this gives you that opportunity. And then the rule of two is something that we haven't talked about,

2:15:06

which I would love to propose to us today, is that your brain can really only handle two new changes at a time. And so give yourself two things, of all of the things that we've talked about, if you want to try in your life, two at a time,

2:15:18

give yourself eight weeks, and then add two more and two more. This is why New Year's resolutions fail, because we try to do everything all at once. And so just stepwise, two at a time.

2:15:28

Jonathan, you've just written this book, which is now out, called The Amazing Generation. And it's beautiful. Beautiful illustrations. I'm assuming this one is for slightly younger audiences.

2:15:40

It's for ages eight to 13, yes. And who should buy this, and who should they buy it for? It turns out that kids 8 through 80 actually love it, even adults, though they're buying it for their kids. It lays out the basic ideas of The Anxious Generation and explains dopamine, explains the business model. But it does it in a really fun way.

2:15:59

And it's working beyond our wildest dreams. If you look at the Amazon reviews, it's full of parents who said, I left it on the kitchen table, my kids came home, they grabbed it, they fought over it, they each read it in the first couple of days. And then they said, Mom, when I go to middle school,

2:16:15

I don't want a smartphone. Just give me a flip phone, give me a basic phone. Because the book is about how to be a rebel. It's about how to reject the control that the companies are trying to put on you, and how to live a life that you choose, full of real freedom, friendship, and fun.

2:16:31

And also The 5 Resets, which is a book we talked about before on the show. Rewire your brain and body for less stress and more resilience. Another smash hit bestseller that everybody's been talking about. Who's it for?

2:16:41

It is for anyone who is struggling with stress, overwhelm, and burnout. It's to help you feel a sense of calm and clarity in this anxious, uncertain world. Everything is free, so that's something that's really important to me as a doctor. Every suggestion I ever offer will always be cost-free because I think about patients with varying resources. It's all science-backed and it's totally practical. You don't have to go to Bali and have a sabbatical.

2:17:07

You can rewire your brain today, right now, in the midst of all of this chaos.

2:17:12

Thank you to both of you. I've learned so much, and I really, really mean that. I feel sufficiently pushed to make change in my life. And I need to go think about this, because I am most certainly struggling with my addiction to my phone.

2:17:30

And I can feel it hurting my relationships, especially now. My fiancée talks to me about it all the time. And I want to be present. I want to be present for my kids when I have kids.

2:17:39

And I'm slightly concerned right now that I won't be, unless I take some kind of drastic action in the direction of getting my attention back and reclaiming it. Thank you so much for the work that both of you do. I can't say it enough, because it's so important. You've reached so many millions of people, and you're both changing the world in a way that my words would not be able to capture. Thank you, please keep going, and if there's anything more that I can do to support your causes, please let me know. And on behalf of the many millions of people that are with us right now, thank you so much for saving our children.

2:18:11

Thank you, Stephen. Thank you for giving the world so many opportunities to accommodate and create new mental structures.

2:18:17

It's always such a pleasure to join you, Stephen. And truly, I feel like you are changing the world as well.

2:18:24

Thank you.

2:18:25

We're done.

2:18:26

Thank you.

2:18:27

YouTube have this new crazy algorithm where they know exactly what video you would like to watch next based on AI and all of your viewing behaviour. And the algorithm says that this video is the perfect video for you. It's different for everybody looking right now. Check this video out. Check this video out.

2:18:42

I bet you, you might love it.
