How Billionaires Captured the Media - (5CAST #19) ft. Steven Renderos

Channel 5 with Andrew Callaghan
0:00

Resisting using artificial intelligence is futile. There are people behind the curtain that are controlling the information that's being distributed to us, and it does come with an intent, it does come with an agenda. Anything that gets in the way of that, it's just an existential risk to the greater possibility of technical progress. Anything else that gets in the way, like ethics, civil rights, like human rights, are just obstacles. We have a fully active tech oligarchy. They get to control who actually exists in our society

0:27

and who doesn't. There are many of us who are having to pay the cost to make these billionaires even richer. And we have to call attention to that. We have to fight back. All right, what's up guys?

0:36

As you can see, I'm sweating pretty hard because I'm quite nervous about where our country's at right now. Today, on a real note, we are at a crucial crossroads in press freedom that our country has never seen before. There are now six billionaires who own a majority share of American news media. And one family in particular, headed by Oracle founder,

0:55

IDF donor, and AI data center kingpin Larry Ellison, is on the verge of controlling CBS, CNN, HBO, MTV, BET, Nickelodeon, Pluto, and the actual algorithmic infrastructure of TikTok itself, which in itself may be more influential than all those outlets put together. So in real time, if the Warner Brothers

1:14

and Paramount Skydance merger goes through, we're gonna witness the hostile takeover of our entire news media infrastructure by big tech, with Ellison's acquisition coming on the heels of Jeff Bezos buying the Washington Post, Elon Musk buying X, and Marc Andreessen acquiring large shares in Substack, the Free Press, and many other outlets while simultaneously serving on the

1:32

board of directors at Zuckerberg's Meta. Aside from those three, you also have Salesforce CEO Marc Benioff, who purchased Time Magazine and Time Ventures, and then of course you have Palantir's Peter Thiel, the architect of our modern surveillance state. According to several of our whistleblower sources, Thiel is also personally invested in signal boosting

1:51

friend of the show, Braden Clavicular Peters. This support and Clav's rapid viral rise makes total sense. Thiel has made a number of strange statements about the future of humanity.

2:01

You would prefer the human race to endure, right? You're hesitating.

2:07

Yes?

2:07

I don't know. I would. I would.

2:12

This is a long hesitation. This is a long hesitation.

2:15

There's so many questions.

2:16

He seems to have a vested interest in creating a post-humanist, technologically transformed future, wherein real people like you and I cease to exist in our flesh form, instead living on as immortal creatures up in the cloud.

2:28

Yeah, transhumanism is this, you know, the ideal was this radical transformation where your human natural body gets transformed into an immortal body.

2:40

This is not just like a Black Mirror-style conspiracy fantasy, there is a race to create this kind of technology right now. OpenAI CEO Sam Altman is now involved with a startup called Nectome, who plan to preserve and then upload his brain after death as a base model for what's called cyber-immortality. And Google co-founder Larry Page recently declared that digital life is the natural

3:01

and desirable step in cosmic evolution. So in this context, it makes sense that someone like Clavicular, who has already sterilized himself by injecting peptides into his nuts and asked questions like, what's the ROI on love, when asked about going on a date with a potential romantic partner, would be algorithmically pushed to impressionable pubescent American boys.

3:20

All speculative comments aside, I think these people might actually want to engineer our extinction as humans, and create an artificial neural network in which they own and can further monetize digital copies of our brains and souls. The truth is, these people are f***ing insane, and that's totally impossible. But they're still investing in it, and for now, the agenda looks like this. More AI data centers.

3:44

Unlimited military support for the Greater Israel Project. Endless funding of media outlets that push culture war issues and keep the public bickering and evenly divided while the insidious mission of corporate capture continues unchecked. That means the collapse of small businesses, the erasure of communities, and tightening of control over key industries like the media, which we're going to discuss today. In today's 5Cast, we're going to be sitting down at our desert compound just outside of Barstow with activist and LA native Steven Renderos,

4:11

who is the Executive Director of Media Justice, a Chicago non-profit that focuses on news media transparency and accountability. They recently released a bombshell report on media capture, focusing on Larry Ellison and of course the Warner Bros. Discovery and Paramount Skydance merger, which can be found in its full PDF format at the link below in the description box and pinned comment. But first I want to mention, this entire interview is available in its audio-only format on every major podcast platform. Just look up 5CAST in all caps and you'll see a beautiful drone photo of some autumn foliage, with our logo emblazoned and blasted on top. So that's the one right there.

4:45

If you don't like video and you want to check out our podcast, if you're driving an important load of cargo across America for the semi-quincentennial and want to just slap some audio and not have to watch all this shit, you can do it. Anyways, let's get into it. All right. Well, thanks so much for making the time, man.

5:00

I really appreciate it. For those who don't know, can you tell us your name and a little bit about the organization you work for? Yeah, for sure. My name is Steven Renderos and I'm the Executive Director of Media Justice. We're a national racial justice organization. We basically work to build power, to challenge the way that corporations and governments

5:17

wield control over media and technology. I've been with that organization since February of 2012. So it's been 14 years. Doing organizing at the national level, working with communities all over the country trying to fight for better media and better technology. So before we get into the actual report itself,

5:33

what are some of the achievements prior to that

5:35

that you feel like you're most proud of?

5:36

I helped to get Donald Trump kicked off of Twitter, that was fun. Beyond that, I worked on several campaigns. My first campaign coming into media justice was actually working to lower the cost of phone calls from prisons.

5:47

It was my first fight and we established the first ever kind of regulations at the federal level to lower the cost of phone calls. I stopped the Comcast Time Warner merger way back a few years ago, passed the nation's first net neutrality rules,

6:01

got Facebook to conduct a civil rights audit, got Amazon to stop selling facial recognition tech to police departments. So, those are some of the campaigns I've been a part of that I think hold a good place in my heart. Getting Trump kicked off Twitter's a pretty big achievement.

6:14

That happened after January 6th, right? Yeah. And what did you have to prove to the platform itself

6:18

to make that happen?

6:19

Well, the thing is, we actually had started pushing for that well before that. Actually, right after the incidents in Charlottesville, where a protester was killed by a white nationalist. We started making the call to Twitter, pushing them to push him off the platform, just because of the way that he responded to that situation. Because he was both inciting violence and perpetuating that violence even afterwards.

6:39

You killed a person, Heather Heyer.

6:42

They showed up in Charlottesville to protest the removal of a statue. And you had some very bad people in that group. But you also had people that were very fine people on both sides.

6:55

You know, after January 6, like, Twitter just became much more willing, just given everything that had happened. Clearly, like, a whole giant crew of people just rolled up on Congress and started like bashing stuff. And yeah, it just created the right conditions for them to actually be willing to do the thing

7:10

that we've been asking them to do for a couple of years by that point, you know? So one thing you talk a lot about in the report and also on different social media platforms is the idea of media capture. For somebody unfamiliar, what does that term refer to? Yeah, media capture is about who actually controls

7:25

the platforms and the channels that deliver news and information to us. And part of what we're trying to communicate is that it's increasingly a smaller and smaller number of people. That capture isn't just people buying media properties

7:39

so that they have it as a vanity purchase. It comes with a real sense of power. There's an Italian Marxist, Antonio Gramsci, who had this theory of cultural hegemony. How do you wield power over society? And yeah, you can do it through armies,

7:55

you can do it through force, but then there's a way that you can wield soft power, which is control the way, the systems that actually help shape what people think. And so for a really long time, you know, it's not the first time in our modern history

8:08

that we've had, you know, billionaires and gazillionaires owning the media system. Throughout time, when that's happened, it's always been with an intent to actually shape what people think. Media capture is telling us, like,

8:22

there are people behind the curtain that are controlling the information that's being distributed to us, and it does come with an intent. It does come with an agenda. It does come with a worldview, and it's often not really aligned with, like, what most of us come from, what most of us think.

8:35

CBS, NBC, ABC, or the major newspapers in America are owned by big money, and the concentration of that ownership is

8:42

becoming significantly more severe. And what is that worldview you feel like

8:46

they want to push on others? Yeah, I mean, today in the report that we pushed out, we were focusing on the capture of the media system by tech billionaires. We've been asserting as an organization for the last like year that we're living in a moment

8:57

in which we have a fully active tech oligarchy, an oligarchy that lives kind of parallel to the existence of the state, but who wields an incredible amount of political power. You saw this, like, on full display at the inauguration. You had all of the richest tech billionaires show up, pay to actually be there. That oligarchy is shaping the

9:21

political conditions in our society, the economic conditions, the social conditions, and it's a segment of that oligarchy that's actually been buying up big chunks of the media system. People like Larry Ellison, through his son David Ellison, have been purchasing big media studios like Paramount. That merger happened last year and most recently are trying to purchase Warner Brothers Discovery, which has thousands of media properties that they own. If that sale goes through, Larry Ellison becomes the biggest media mogul

9:50

that we've seen in a really long time, even more so than Rupert Murdoch. You know, when you think about Larry Ellison, his wealth was built through Oracle, a tech company that provides cloud-based services. They're one of the biggest pushers of data centers across the country. They're building some of the biggest data centers that we're seeing. As a cloud-based business, their business

10:08

is with the federal government. And a lot of that is for the surveillance tools that law enforcement agencies are using across the country. He has a very particular worldview. When he came in and purchased Paramount slash CBS, he made a lot of immediate changes to CBS, installed a new kind of editor-in-chief at CBS News,

10:28

Bari Weiss, who's a known Zionist and someone who was not too fond of racial justice, like DEI programs, and those things got cut and gutted immediately. Paramount Studios overall created a blacklist of actors that had been friendly to the Palestinian cause

10:45

and have basically said, we're not gonna put them in our films. And now purchasing an even bigger studio, Warner Brothers, you can stand to guess those are the kinds of choices that they will replicate there. And then Warner Brothers also owns CNN,

10:56

so they've already very openly talked about making some big changes to what CNN does and how they're structured.

11:02

The very first November, I was fired by my agency, specifically for marching and speaking out about Gaza, for asking for a ceasefire.

11:11

So you have the tech billionaire class and this small group of oligarchs kind of acting as a rogue non-state entity with as much GDP as the country of Taiwan.

11:21

Correct.

11:22

And they're going through the process of capturing mainstream media outlets and embedding themselves inside of social media platforms to push various agendas. And one of the big agendas is AI. What is the connection between AI and active newsrooms right now?

11:35

And what is the importance of making sure there's a positive reportage on AI? Well, I mean, you think about anybody who picks up a device, sees artificial intelligence being pushed on them all over the place. Apple has got their version of artificial intelligence

11:50

they're pushing to you. You go to any of the social media platforms and your search bar all of a sudden is embedded with AI. So we're living through a moment of AI hype. When you think about the big companies that are behind artificial intelligence,

12:04

be it OpenAI, Perplexity, Anthropic, they have built a certain version of AI that relies upon bigger and bigger quantities of data. Newsrooms are actually really critical sites of generating new data because reporting happens every single day. For starters, part of what we noted in the report that we're seeing is partnerships emerging with every single major news organization.

12:26

Those partnerships are all about feeding more content into large language models. And then we're also seeing other kinds of deals that are all about training large language models to continue doing what they do better. But I think the other thing I would say

12:43

is by embedding these tools and actually using them for things like editorial decisions, to edit reports that journalists write, to even write entire articles, you're creating the conditions for journalists to be displaced as workers, for starters.

12:59

Additionally, I mean, for newsrooms, the reason they're doing it is because it creates a revenue stream for them. And anybody who's been following journalism knows that there's been an economic crisis inside of journalism. How do you fund it? AI deals are becoming, like, a big source of revenue for many news organizations, and that's unfortunate, because, you know, historically we've seen this dynamic where when you have an industry that has, like, a

13:22

significant financial influence over the news industry, it makes it less likely that the news industry will criticize it, will cover it fairly. This happened in the 60s with the tobacco industry. You had a lot of research that was coming out drawing the connection between smoking and cancer. Because the tobacco industry was the number one advertiser in newspapers, you seldom saw those reports get covered, you know, really fairly.

13:48

So my concern is like in the moment that we're living in now, where there's a big AI hype machine that's telling us to keep using the thing, and you're seeing data centers being built just about everywhere, including in the desert where there's no water, but let's build it here. AI is becoming so central to news organizations, making it so that the kind of critical coverage we need in this moment is not happening,

14:09

because the financial influence is there. So you feel like the main drawbacks of AI are the gutting of newsrooms, the construction of data centers, and the potential impact on communities. Where in particular do they choose to build these kinds of data centers,

14:24

and is there a socioeconomic component of where they choose to construct them? It used to be even 10 years ago that many of these tech companies, because data centers have been around for a really long time, you need data infrastructure for things like the internet. But for several years, they were building them in places where they were thoughtful about where they went.

14:43

Is there a place that has a power grid that can absorb the kind of energy consumption that we're gonna use? Is there a place that has plenty of access to like fresh water, because you need that to cool down the computers?

14:54

In recent years, and I'd say probably in the last three years, as like the AI hype machine has been like really building, they've just kind of thrown that caution to the wind. And just about in every community you can think of, they're being built in big towns, in small towns. They're being built in places that are in drought-ridden conditions.

15:11

They're being built in places that have really messed up power grids, that can't really sustain it. You know, one of the things we tracked in a report that we released last year was also they're going to places where they think that they won't meet local opposition, where they think that the communities have limited political power.

15:27

So last year we released a report called The People Say No, looking at the trend of data center growth in the South. And a lot of the places are, you know, rural white communities and, like, densely populated Black communities. In areas like what's known as Cancer Alley down in the South, where there's high rates of cancer, they're building them there.

15:49

Yeah, the calculus that many tech companies are making is where can we go where we'll meet favorable conditions on the ground and where we won't meet local resistance? And what they've been finding out is that's not the case. People are fighting back because it's very clear to people when a big data center comes to town and you start to see your utility rates go up, and you start to see the water tap come out like differently, there's a reason for that.

16:14

And it's probably because of that big data center that got built down the road. Yeah. I know too, there are also a couple of different reservations in the US and Canada, like the Pyramid Lake Reservation in northern Nevada, and then also somewhere in Alberta, they're trying as hard as they can to essentially bribe tribal officials to greenlight the construction. So they're looking for places where there'll

16:32

be very little local resistance or where it's so impoverished that people were incentivized to let them build recklessly if they can get a chunk of change from the center itself. Yeah, I mean, you saw this in Memphis, Tennessee, where Elon Musk built one of his data centers, Colossus, for his company, xAI. And he came in, you know... Memphis is a place

16:50

with a historically Black community, and the neighborhood where they built it, it's called Boxtown. It's a neighborhood that was founded by freed slaves. Has a really interesting, like, storied history. But it's also a history of a lot of environmental racism. There used to be a Valero gas refinery,

17:06

like right next door, that pumped a lot of toxic air. Cancer rates in that particular neighborhood are like four times higher than anywhere else in the city. And it's an impoverished neighborhood. The mayor saw an opportunity with like, Elon Musk, the world's richest person to come to town.

17:23

Maybe this can like help raise our tax revenue. This can help us pay for things that we haven't been able to afford. And the thing is, in so many communities across the US, that need for more revenue is there, for sure, because you've had cities that have been dismantled

17:39

because industry has left town, because the coal mine shut down. So I totally get that part of it. But part of what we see happening is these tech companies come in and local politicians are mortgaging the entire future of that community for like a very short-term gain. For temporary jobs that will exist while the data center is being constructed, but then

17:59

once that's over, like, it's only a handful of jobs that get created for tax breaks that don't ever come back. And you have half the states in the country that have tax breaks on the sale of the land, so tech billionaires right off the top don't have to pay a whole lot to get the access to the land. So those kinds of deals, I think,

18:20

represent unfair bargains for local communities that actually won't get to benefit in the longer term. You know, and it's all to make Elon Musk even richer. And this all connects to the media because the media would be the tool that could maybe get in the way

18:32

of the construction of these centers by raising awareness about the potential negative ramifications that would come with that. And just AI critical speech in general would not be published on outlets and different news channels that are organized, that are owned, or at least have a majority shareholder

18:45

who's involved in the AI world somehow. Which brings me back to Larry Ellison. Let's talk about Larry. How did he make his money in the first place? Oracle is the company that he built. It's a cloud-based company.

18:55

Provides services to a variety of different corporations and governments. You know, it's where you store your stuff. Like, you know, everybody thinks the cloud is just this kind of nebulous thing. It's actual real infrastructure. So he's built a lot of data centers all over the country

19:11

to kind of service that cloud-based business. In the age of AI, like that data infrastructure need has ballooned, so he's doing more of that. He's also making his money, he's kind of leveraged his wealth to do other things. In addition to buying up CBS News and now, you know, Warner Brothers,

19:27

he's played a very critical role in the sale of TikTok. In the last year, we've seen that TikTok became a big talking point for Donald Trump and wanting to see TikTok in the hands of a US-owned business. And so he's part of the conglomerate that's actually taking an ownership stake in TikTok. And his company, Oracle, is actually gonna be in charge

19:47

of auditing TikTok's algorithm. So that's interesting. So he's making his money there, and now becoming a media mogul, so probably making his money there too. When you say auditing the TikTok algorithm,

19:58

so the algorithm itself is a mathematical equation that can be modified by a person? Oh yeah, oh yeah, absolutely. It's interesting because like, let's give it some context, right? So TikTok came into the public conversation

20:12

among many people on the right in the aftermath of the genocide in Gaza because of the role that TikTok was playing in kind of helping Palestinians actually speak truth to power and get messages out about what was happening

20:26

down on the ground. TikTok has been a platform, much like many other platforms before it, where progressives and leftists have been able to actually tell the story of what's happening.

20:38

You saw this during the Arab Spring and the role that Twitter played in its rise back in 2011. So there's always been this relationship between, like, media tools and the ability of those who are living at the margins to leverage those tools

20:49

to tell their stories. Cause often like corporate media become gatekeepers that then manipulate the story and don't tell the story at all, right? So TikTok, that's the context for TikTok and why like someone like Donald Trump is saying,

21:02

like we need to change that because it's amplifying perspectives that are not in alignment with our agenda. So yes, like, an individual, or a set of individuals here in this case, can manipulate the algorithm

21:15

to suppress what's happening on those platforms. And actually, with Palestine, this already happened. It happened on Instagram, it happened on Facebook, because those platforms, like, those changes were tweaked. And so many, many accounts were shadow banned, like where the user didn't even realize that they were posting content, but it wasn't reaching anybody. And much of the content was being taken down. So I would anticipate that something similar

21:38

is coming to TikTok eventually, which not only has an impact on Palestinians, but anybody who has kind of a countercultural worldview, you know, an alternative worldview. And what is Larry Ellison's connection to Israel? That's a good question, man. I don't actually know. What I can tell you is he's always been sympathetic to the Israeli cause, and he has been very deliberate in placing, like, pro-Israeli voices on his platforms. Mm-hmm. I read somewhere that he actually donated to the IDF.

22:06

Oh, interesting. Yeah, he's been an IDF donor. I think that a lot of Americans, especially young ones, don't necessarily know corporate capture of the media has already reached like maybe 75 to 80% completion. It's reached almost a boiling point now

22:21

to where the actual fraction of independent media that still exists has to be under five percent. Even stuff that looks independent is somehow funded by a billionaire who's connected to the establishment, or the tech establishment, in some way, even before this report. What do you mark as the beginning of the timeline that led to today's current moment in media capture? In part, it was... let's go back to the 70s, when Wall Street paid attention to news organizations. Up until that point, many major newspapers,

22:49

TV stations were either family-owned or owned by, like, regional conglomerates. Around the time of the 70s is when Wall Street, like, took a look and said, here's a place where if we bought this, we can try to tweak it and make as much money as possible.

23:04

So you start seeing corporate consolidation start happening right around the 70s. It really ramped up in the 80s under Reagan. Reagan kind of cleared the way with some deregulation that allowed for big companies to start consolidating. And then it really ramped up even further in the Clinton years with the passage of the 96 Telecom Act. Part of what the 96 Telecom Act did was it just completely

23:30

eliminated any kind of media ownership caps, or just, like, really raised those caps, allowing, you know, bigger and bigger companies to get built. And so from 96 on, we start seeing, like, mega, mega media companies start to emerge. That's where, like, your Walt Disneys start coming, your News Corps started emerging around that time. And so back then, a lot of the big companies

23:52

that were purchasing were cable companies that became these giant media companies. In the last 10 years, it's been tech billionaires who have the money, have the resources, have the capital. And we've started to see this shift now towards tech being the primary media moguls that

24:08

have captured the media system. But it's been deregulation. It's been getting rid of guardrails that have existed for a long time that has allowed for this wave of consolidation that we've seen.

24:19

What happened in 2016 specifically

24:21

to begin the timeline in your report? Yeah, in 2016, AT&T attempted a merger with Time Warner. We actually actively opposed it. Eventually that sale did go through, and we started to see a big telecom company purchase a cable company and consolidate at a level that we hadn't seen up until that point.

24:44

That deal eventually didn't really amount to a whole lot. Eventually, Time Warner was spun out and sold off, and what emerged later was Warner Brothers Discovery, which is the deal that we're seeing take place now with Larry Ellison purchasing it. How many different prominent news organizations

25:01

that you can think of would exist under Ellison's umbrella if this goes through? It's well over 2,000, you know, news channels, media channels, just to name a few of the popular ones off top. You're talking about BET, MTV, CNN.

25:16

Those are like some of the major ones that would come under his control, but several thousand, a couple thousand media properties. And what separates him from, let's say, somebody like Rupert Murdoch? Rupert Murdoch, founder of Fox, owner of the Wall Street Journal, and a number of different outlets in the UK

25:33

and across the world. What separates someone like Murdoch from Ellison? They're very similar in the sense that they share a conservative worldview. The difference is Rupert Murdoch came up in the news business, so that's his like bailiwick. His point of view was like I'm gonna use news as a way to like impose the way that I think the world should be, conservative

25:53

worldview, and you saw that play out in the way that he built out Fox News. Larry Ellison is a tech billionaire, is a tech oligarch, and he believes in trying to advance like the cause of tech. And many tech billionaires, including him, have a variety of different kind of world views that inform the world that they're trying to build. But a lot of it is rooted in techno-progress.

26:16

Like, let's build technology to its greatest possibility, and let's not let anything get in the way. The idea that a data center could completely dry out the well of a community over here, that's just nonsense to worry about that because ultimately the gain that we get by creating better technology outweighs the bad that it does.

26:37

Hey, what's up guys? As you know, on today's 5Cast, I sat down with Steven Renderos from Media Justice and he's breaking down exactly how a tiny handful of tech billionaires are buying up every goddamn newsroom in America. Come on, Larry Ellison, what the hell is wrong with you, dude? But after this, I got curious.

26:54

Who's actually reporting on this merger? You know, the Warner Bros. Discovery Paramount Skydance merger. And who owns the outlets covering it? Well, that's where Ground News came in, which is a super fire, dope-ass news aggregator, and also Channel 5's primary research tool. So I got it pulled up right here on my phone,

27:10

and I typed in Warner Bros. Discovery into the search bar, which brought me to a story on the merger. As you can see here, there's 352 sources covering this story. Three, five, two. It's a big kahuna of a story.

27:21

But here's the beauty of ground news. Right here, you can see how differently it's being covered depending on who owns the outlet in question. For example, here's NBC News owned by Comcast, which tends to lean a bit left and is itself a giant media conglomerate. Their headline reads,

27:35

Warner Bros. Discovery Signs Merger Agreement with Paramount Skydance, setting the stage for one of the most consequential media mergers in recent history. Very measured, kinda like a press release. Now over here, the New York Post, owned by the Murdoch family, who tend to skew right.

27:49

So their headline zooms in on internal chaos. Here it says, Warner Bros. Discovery CEO David Zaslav calls Paramount Pivot whiplashy as $110 billion deal takes shape. Rattled staffers, boardroom drama,

28:01

makes sense for the Murdochs, the oldest media dynasty in the game. They're threatened by Ellison, and if Ellison closes that deal, he jumps them on the scoreboard.

28:09

Boom, ground news.

28:10

Then there's Raw Story, an independent news outlet owned by two dudes, not a big-time Ellison-Murdoch-style conglomerate. Their headline, Ex-GOP strategist "deeply concerned" by MAGA-friendly merger, "worse than a banana republic," actually names what's at stake. Real pertinent info, not bickering or basic flowery press-release language.

28:30

Same merger described by three different outlets quite differently with three different angles determined by the person investing in the story itself. And if I wasn't using ground news, I would have missed the big picture, which was especially apparent when I scrolled down

28:43

to the ownership breakdown panel, which shows that of the 352 sources covering the story, 49% are actually owned by media conglomerates themselves. Only 13% are independent. The rest is split between private equity, corporations, telecoms, and government agencies, which presents a big problem: the story about media consolidation is being reported by those same media conglomerates. A real fox-in-the-henhouse type of scenario, if you're familiar with the metaphor.

29:12

I know, Ground News is, boom, a dope-ass news aggregate. They've also got a Blindspot feed for stories that one side of the spectrum is pretty much ignoring altogether, featuring factuality ratings on every outlet, and interest tags that you pick so you can follow whole beats in real time. Without them I might be stuck in my echo chamber forever, but I'm not. Boom. Ground News. Biatch. I use this shit every day. It's the

29:32

perfect partner for Channel 5 because our missions effectively align. And guess what? Subscribing right now through my link, ground.news/5cast, which is also in the description box and pinned comment of this video, or in the QR code on screen you see right now, gets you 40% off the Vantage plan. That's the best one, bringing the cost down to less than five bucks a month. I don't know exactly who you are, but if I was you, I'd hit that link right the hell now: ground.news/5cast. All right, back to Steven. Do you think they have a utopian worldview of total convenience? Yeah, and it's a utopian worldview that is about,

30:05

some people get to live in that world, others don't. There's this philosophy called longtermism that this guy Nick Bostrom, who created this think tank called the Future of Life Institute, ascribes to. And they believe that there's a select chosen few people in the world who are geniuses.

30:26

And really what the world needs to do is cater to every single need of those geniuses to allow them to become the geniuses and do the things that they want to do in society. That the world that we're trying to build towards, if we allow those people to do that, is going to be much better than the world that exists right now. Anything that gets in the way of that, you know, climate change, like, it's just an existential risk

30:48

to the greater possibility of technical progress. It's a worldview that's largely informed by eugenics, one that believes some people are smarter than others based on racial characteristics, and it's very techno-utopian. It's the belief that technology ultimately is the greater good and everything else is secondary.

31:06

Anything else that gets in the way, like ethics, like civil rights, like human rights, are just obstacles. Sounds like they want fewer people in the world. Oh yeah, you know, because the gutting of the working class and the erasure of the non-rich is the same thing as dying. Yeah, you know, the elimination of jobs and mass, mass unemployment. Have they voiced any theories

31:28

about slowing the birth rate down? Some are, and it's interesting, I mean Silicon Valley has a mix, a variety of different world views. You have some who are like, we need to make more babies,

31:38

and in particular, more white babies. Yeah, I've seen that. And then there's people who are into human extinctionism. Some of the people behind a lot of the AI development believe that the push for artificial general intelligence is about creating a kind of

31:56

superior being to what exists now, and when that superior being is built, when we achieve artificial general intelligence, us humans are less worthy and therefore we don't deserve to be around. So there's human extinctionism in the mix here. I don't know if Larry Ellison ascribes to any of that, but it certainly is in the mix in how these people think about the world.

32:16

It's very nihilistic. And so, but I think like regardless of whether it's like let's make more white babies, or let's extinct all the humans, or let's not worry about the implications to climate change when we build technology at a scale that destroys the planet. Regardless of whatever you ascribe to, there are certain communities that are always

32:38

sacrificed along the way. And it's the poor communities, it's the communities of color that have always existed on the margins. In the name of AI, it's the communities in the global majority that are doing all the mineral extraction to fuel the computer chips that are being built. It's the content moderators in Kenya who are actually the ones physically doing the work of training large language models.

33:06

You know, people think that artificial intelligence just kind of works on its own. There are so many humans that actually touch it, but they're invisible to us. And there's a reason for that. And so going back to like Larry Ellison,

33:17

the reason it's so dangerous to have tech figures actually controlling the media system is that they get to control who actually exists in our society and who doesn't. And that's my concern. I think my thing is there are many of us

33:31

who are having to pay the cost to make these billionaires even richer. And we have to call attention to that. We have to fight back. And one of the ultimate goals of AI, especially on the domestic manufacturing side,

33:41

is maximizing profit for corporate business owners and making it so they don't have to pay workers. You know, it's like how can we make sure that our actual warehouses are staffed by robots and that we can pay as few people in real life as possible. And so thinking about that applying to the newsroom

33:57

is kind of scary too, especially when you see all the layoffs they're doing. Like, are journalists gonna be robots too? Yeah, I mean, I don't know if they'll be like automaton robots, but certainly what you're seeing is like the introduction of artificial intelligence in newsrooms that hints at that possibility. And you'll start to see it in like slow waves. You'll see editorial boards, like the editorial level of like the cycle of

34:23

creating an article might shift to artificial intelligence. You saw this with the Associated Press recently. They have a leader inside of their newsroom, their AI lead, who basically told the journalists that resisting using artificial intelligence is futile, so you have to adjust to it. And his vision for what should happen is, the reporter should go out there and gather the information

34:48

and just let the AI write the article. And that might sound like, okay, that's less work for journalists to do, but eventually it will lead to the next stage, which is, do we really need humans

34:58

writing the news altogether? And that's a scary thought to think about. I mean, I think about my role kind of growing up. I actually spent some time working inside of a community media organization. I ran a show at a radio station. What I loved about it is I got to tell the stories

35:14

that I thought were interesting to me. And it was not really what else was on the air, but like I got to curate something and build my own audience. I mean, you get to do that too with your show. The thought that we could turn all of that creativity over to an algorithm that gets to decide

35:30

what actually matters to us, like that's a very scary place to be. And it's not really even an algorithm deciding. It's the human behind the algorithm. It's the people that control it. It's the people that own it.

35:39

That's not a world that I want to live in, you know? I mean, if you look at Google AI summaries and their impact on the viewership and visibility of news organizations, even mainstream ones, I think that's kind of a more subtle example of journalists being replaced by robots. Oh yeah. Because, you know, even though it's not like a robot replacing a reporter at a news desk in DC, or something crazy visual like that. I mean, can you speak to how Google AI summaries have taken away viewership and jobs

36:09

and eyeballs from journalists?

36:10

Yeah, I mean, some of this is like going back to the history of tech, of like, they've always pushed innovations on us that we've had to adjust to. You know, you can think back to like, the first time I remember when Facebook

36:23

switched up its newsfeed, and I was like, what the fuck is this? And this happened with Google, where all of a sudden you would put in a prompt, and instead of taking you to search results, which is how Google has always functioned,

36:36

like an index, like a library, saying like, here's the information and where to go.

36:41

What neighborhood do you think I'm in?

36:43

This appears to be the King's Cross area of London. It is known for its railway station and transportation connections.

36:50

Instead, it's taking you to a box that says, here's an AI summary of the information we think you want. And there's a lot of power. This has always been true with Google. There's a lot of power in what makes it to that feed. And so the fact that they're like now redirecting you to a closed space that says, here's the information for you, that's scary.

37:09

What we've seen news organizations start to do is to actually work out deals to ensure that it's their information that's making it there. The Associated Press has a deal with Google so that its Gemini chatbot will promote its content. But, you know, when you think about the larger landscape of media, that leaves so many institutions

37:30

out. That leaves independent media out, that leaves the Black-owned newspaper out. And, you know, they've done this without any real consent from the users, just kind of funneling your attention. Eventually it creates more work for us as users, because what I've come to find, and what everybody who does the research on this is coming to find,

37:51

is that there are generally mistakes. The BBC did a report on this. They found that 75% of the query results they put in had mistakes in them. As a user, you have to scrutinize the information. Now you have to do more work

38:05

than you would have done before if you had just read an article from a credited news site that you trust. And the frustrating thing there is that you talked about the media consolidation in the late 90s, and around 2001 with the 24-hour news cycle

38:18

is when people became hip to just the sort of ridiculous partisanship in the cable news system. But then people saw the internet come out, the early internet, and that was supposed to sort of be the savior of the corporate gridlock that had consumed

38:33

I guess cable news feeds. People were like, oh, the internet's here, now we're gonna be able to get free access to information, it's gonna be all good, we can curate news how we want from real people and just cut out the middlemen and get straight to the point. And it sounds like,

38:45

especially when you talk about Larry Ellison having control of the TikTok algorithm through Oracle and these sort of other things happening, that the internet has failed to achieve its purpose in that sense.

38:54

Is that an exaggeration or do you think that's fair?

38:56

No, it's fair. I mean, there's an academic named Cory Doctorow who put a name to it. He calls it enshittification. And that's what we've seen happen to the internet. I mean, I remember those days, and part of what got me into the line of work that I do now is, you know, I was in high school during 9/11,

39:14

and I remember having debates in school about like, is there gonna be a draft, because there was a drum up to war. And when you looked at the media coverage back then, everything was pushing us towards war, right? And so we see the consequences of having a consolidated corporate media

39:30

that can be that kind of echo chamber to push us towards something that we probably shouldn't have done. And in 2003, when we invaded Iraq, I remember seeing the fall of the Saddam statue and seeing the image being displayed

39:45

across every news organization. Come to find out, if you had just zoomed back the lens a little bit, you would have seen that the entire square, where supposedly the people had risen up and torn down this statue, was really US military personnel with their tanks all around the square. That would have been a very different story to tell. That would have been a

40:08

story of invasion. That would have been a story of occupation. And that sort of like radicalized me to the idea that like actually media can play a very distinct role in telling a certain story. Later on, you know, this happened to me again in 2005 during Hurricane Katrina, watching the way that certain people that were trying to survive during the aftermath of Hurricane Katrina

40:29

were being depicted by like the Associated Press in the photo captions. And when it was like white folks that were going around, like scrounging like stuff, they were surviving. When it was black folks, they were looting. So like, there's always been this kind of relationship

40:43

between what happens to the infrastructure that shapes our information and our media, and what story it's trying to tell. The internet was supposed to be wonderful. My colleague Alfredo Lopez calls it the biggest social movement in history. Because he really saw this infrastructure as a tool that actually connected us in a way that we hadn't been connected before. But instead in the ensuing years what happened is what happened to corporate media, which is big gatekeepers emerged and they sort of shut down the internet and what you have now

41:17

it's just like little fiefdoms. The Facebook fiefdom, the like Google fiefdom, and you know you can play within their sandbox, and if you try to go somewhere else, it doesn't quite play with each other, and it's all kind of shut down, and it's controlled. And of course, there's always possibility inside of that,

41:35

and I see that in the show that you do, and being able to really tap into your audience, but it also exists within a context where it's a corporation that ultimately controls the means of distribution to reach your people. And so what's an example of them limiting coverage in a way that would be considered like traditionally a free speech violation? You know, we saw this a lot actually in the aftermath of like the really well-documented killings of black people by police.

41:59

You had a lot of black users who were posting stuff on social media saying like, hey, this police killing is whack, it's bullshit. And we were getting reports from people saying, hey, like, don't... We just keep noticing our content keeps getting taken down and it doesn't really tell us why. And there's no way to actually, like, try to talk to someone about it. So that's when we started doing a lot of work, engaging with companies like Facebook and Twitter

42:27

to find out what's going on and to advocate for better policies. But really the thing that got me working on that with Facebook was the killing of this black woman in Baltimore County, Maryland, where she was actually live streaming.

42:41

At the time, she was in a confrontation with police,

42:44

and

42:45

the Baltimore County Police Department asked Facebook to shut down her live stream. And they complied. And shortly after that, they entered her home and killed her in front of her son. So, you know, I think the choices around the takedown of content, particularly when it doesn't suit a police department or the federal government or the platform itself, can have deadly consequences. I think we've just found that to happen more consistently with marginalized voices online,

43:14

particularly communities of color. Marginalized voices, and then also when there's information that may challenge existing power structures. I saw a CNBC report that said that in the couple days after the most recent Epstein file drop, if you looked up the word Epstein on TikTok and elsewhere, you'd get an error message, and it would not show you what was going on.

43:30

Yeah. And the thing is, you know, that's the insidious thing about this: how much is controlled by a handful of people. Then when these things pop up, you think, is something up with the platform? Did I do something wrong? There's no way to actually know. The media system exists as a giant black box. Someone controls it,

43:49

but we don't know exactly how it's controlled. Yeah, it's also, people always say to me like, oh, is the internet still free? People say that, and I think that it kind of is free if you choose to make your own website. Then you would have to almost market directly, like go to channel5truth.com.

44:05

But most, I think it's 95% of online traffic, like you said, has been filtered into the platforms that are owned by these private companies. Meta owns Instagram and Facebook, you have X owned by Elon. Then you have smaller minority platforms like Rumble and Bluesky that are owned by,

44:19

you know, different small, more small time wealthy people. If I were to build my own website and post footage from Palestine, or stuff about Epstein, or stuff about police brutality, would there be a way for the government and power structures at hand to actually take that down, or can they only touch me when I'm on major social media?

44:36

Yeah, it's a great question. I mean, the thing is, a lot of that content has to live somewhere, right? It has to be stored somewhere. And that's where you go back to, like, who are the dominant players in the cloud-based infrastructure in the US?

44:50

And it's the tech oligarchs. So could there be a way to take down content from your website? Probably.

44:55

Through?

44:56

Through your cloud-based server, like shutting it down.

44:59

Oh, gee, don't you.

44:59

Yeah, yeah. Shutting it down at the point where it's being stored, right? But there is a blueprint for a free internet, assuming we don't have a cloud server. It would just be a website. Absolutely. I mean, that's the thing. The pathway from the current moment that we're in to what else it could look like is actually not that complicated, you know. And a lot of what we've been pushing for, especially during the pandemic, is to support resources toward building community-owned broadband networks or municipally-owned broadband networks. If you think about your first gatekeeper

45:29

that you have to enter when you go on the net is who's your internet service provider? T-Mobile. Right, T-Mobile, Verizon, AT&T, Comcast, Spectrum. But there's a way to build that infrastructure that's community-owned.

45:41

There's places like Detroit that have built mesh networks that are community controlled, like social media, not social media, but internet networks. There's a group out in New York that we work with, Community Technology of New York, that builds these temporary mesh networks that they prop up during times of emergency,

45:59

because the problem with like centralized, consolidated infrastructure, whether that's internet infrastructure, data infrastructure, or media, is that it has vulnerabilities. If it shuts down, it shuts down. You see this during hurricane season,

46:18

the cell phone networks always go down. And that's a product of design and policy, not just the natural disaster. I remember, you know, growing up in LA, we had earthquakes all the time.

46:29

Yeah

46:29

And guess what, our landline still worked, and that was because there was policy that said the copper line that goes into your home, the telephone line, has got to be self-powered for these kinds of situations. And this group, CTNY, goes around and builds these temporary mesh networks that actually allow people's cell phone devices to communicate with each other if you're inside the network. And in moments of crisis like that, it's the difference between life or death, being able

46:54

to reach out and talk to people and communicate. In Hurricane Maria, in Puerto Rico several years ago, there were several people, I mean hundreds of people, that died who would not have died if they had had a way to communicate, you know. So building alternatives is totally on the table. We don't have to have the enshittification version that we have now, you know. So there's no regulating

47:18

board that is capable of taking down websites themselves. The Internet still exists as a potentially anarchist sphere if you decide not to build your following

47:29

on one of the major platforms.

47:30

Totally. That's pretty safe. And actually, there's people out there that are actually trying to teach people how to get off of the major platforms, because people think it's impossible,

47:39

but there's this computer scientist, Janet Vertesi. I just met her a couple weeks ago. She's interesting. She was part of the Mars Rover team. She actually built a robot that made it to Mars, unlike Elon Musk.

47:52

But she has a project now that's just about teaching people how to wean yourself off of the big corporate tech players and exist in an ecosystem where you can still do the same stuff. You can email, you can store stuff, you can message without living in their fiefdoms.

48:10

So it's totally possible. It takes some work, but I really like the work that she's doing because it's showing me that it is actually possible in incremental steps to not be so tethered to these tech oligarchs who have a very different agenda that's counter to the world that I'm trying to live in. Yeah, it's just hard because in theory,

48:27

I want to say, yeah, I'm going to leave every major social media platform and make my own Channel 5 website. But the reality is, that would be a huge economic loss. Totally. And don't get me wrong, Ned,

48:37

we're staying tethered for survival purposes. You know, everyone who counts on the payroll of the Channel 5 LLC needs us to use these major networks to promote everything we come out with. I mean, here's what I'll say. Because I also think that there's a value in staying inside of the platforms. Because if you leave, you're ceding ground. And who are you ceding ground to? Who replaces the content, the perspective that you bring? Compromised copycats. Exactly.

49:07

Yeah, people who won't necessarily bring the perspective that you try to bring or the curiosity that you bring, you know? And I think about that myself, like on the platforms that I'm still active in. Like, I still have an X account,

49:18

I don't post there as much all the time, but there's a value to staying there and holding ground, because if we leave, then it creates space for someone else to occupy. And I feel like that's why this combo is so important, for me especially, because you mentioned the media being this black box, and I've had experiences

49:35

with Time Warner and HBO and CNN that I'm still trying to make sense of, but they're so confusing. So when I had to promote an HBO film a long time ago, they had me scheduled to go on primetime CNN with Don Lemon and Caitlin Collins. The movie's not just about, like, the Capitol riot

49:50

and all that. It's also about, like, media echo chambers. You know what I mean? And, like, the dangers of the 24-hour news cycle and how I think mainstream media, like Fox and even CNN, like, competes for views

50:04

by running constant 24-hour news cycles based upon fear, division, outrage, and panic. And I signed an NDA, but I'm gonna break it anyway. They had to give me this weird media training session, and it was like 16 people I had never met before, more people on a Zoom call than I thought you could ever even have. I had to get Zoom premium. All these people in suits going through, point for point, being like, you need to tell,

50:26

if they ask you about Trump, say this. If they ask you about revolution, say this. If they ask you about your background, say this. Don't reveal that you were arrested for graffiti. All this weird-ass shit. And I'm like, okay. Like, get through to me. He's like, yo, my man, so check this out, I know these people, these motherfuckers don't look cool, but they're cool, son. And I was like, all right.

50:47

I was like, whatever cabal you're a part of, it's freaking me out. But it was only an hour, and it was mandatory. Yeah. So they told me I was gonna go on CNN, where Don Lemon was gonna ask me about my life and all this shit. And I get on, and he goes, what can you tell us about the mental state of Proud Boys leader Enrique Tarrio in the lead-up to January 6th? I'm just like, what the fuck kind of snitch question is this? Like, I had embedded myself in this men's group,

51:10

the Proud Boys, for a long time, and I still had those active relationships. Revealing any information could put my life in danger. I just wasn't ready for it. So I clapped back super hard, and the film was about how the mainstream media, including CNN, uses the cable news system to push partisan divisions that have led to events

51:29

like January 6th by essentially pitting working class Americans against each other and pushing culture war over class war issues. And he was like, what the hell, they cut the segment short, it was never aired, no one ever saw this shit. I leave the studio, and I guess the CNN

51:41

had a handler for me, well, technically she worked for the parent company, like I said, this black box, it's all connected. And she's panicking, she's looking at her phone, and I'm like, what's going on? And she was like, Time Warner C-suite is furious. And I'm like, who is Time Warner C-suite? And she said, you don't know that the same people that own HBO also own CNN? Yeah. And so they looked at it as an in-house beef, and I was like, no, no, no, I was just projecting my opinion. And they were like, don't you remember the media training? You weren't supposed to do that.

52:06

She was panicked. They canceled my press tour right then and there. No more press appearances. My final press appearance for the movie was also never aired, but it was an extremely hostile NPR interview where the first question is,

52:16

how do you think the Sandy Hook families would feel about you fraternizing with Alex Jones, one of the most despicable people on the face of the planet? When the movie itself is an exposé of how people like Alex Jones created the conditions for January 6th, right? Do you feel responsible for what happened to the Sandy Hook families?

52:31

Yes, I killed the children. I mean, beyond that, I went in that school, I pulled a gun out, I shot every one of them myself.

52:39

And it was just nuts, and I always wondered, that's so weird that all of these organizations that are technically disconnected, some of them, like NPR, aren't necessarily connected directly to Time Warner C-suite, but they seem to have some sort of high-up corporate meeting where they decide who is gonna be made out to be a hero and who's gonna be made out to be a villain. Who's gonna be amplified, who's gonna be stamped down. And they have preferential treatment for those who really walk the line and don't rock the boat. Yeah. And so what I'm wondering specifically is, who is in that black box now? I know that was a few years ago.

53:11

Who are some figures that you think might be up there with that high up conversation who actually make those calls beyond the people you might see on TV? Yeah, I mean, of course, like many of the tech oligarchs that we cover in the report like Larry Ellison and his son, you know, David Ellison, are chief among them. Peter Thiel, who's a big, like,

53:31

Silicon Valley venture capitalist, who's invested a ton of money. I mean, he funded J.D. Vance's Senate career, and J.D. Vance is now the vice president, but also has invested in a variety of different media companies as a result.

53:45

Jeff Bezos, who has his own particular kind of worldview and agenda that he's pushing, who's made tons of changes at the Washington Post. And Marc Andreessen, another venture capitalist. He's someone who has invested in a variety of different things, like he's invested in Substack,

54:05

hundreds of millions of dollars there, and is really shaping that, you know, from the inside out, and really reforming it into a tech company. That's so crazy you just said that, because I just got an offer from Substack

54:17

that said that if you delete your Patreon, we will match whatever you're getting a month.

54:21

Yeah, yeah.

54:22

Ooh, saying no. Wait a second, so that billionaire you just mentioned is the one bankrolling it? Marc Andreessen, yeah. So that's where that money would have been coming from. Oh yeah, yeah. Yeah, I mean, and Marc Benioff, for example,

54:34

he bought Time Magazine a few years ago. He's the CEO of Salesforce. If you remember, last year the Time Person of the Year was the AI innovators. So it's Sam Altman, it's Elon Musk, you know. They basically put themselves on the cover of Time Magazine. So yeah, I mean, those are some of the figures

54:53

that in like an indirect and direct ways I think are shaping a lot of the information we're seeing.

54:58

So those people, and I guess their posse, would be in that black box. Oh yeah. High up, yeah.

55:04

Yeah, and then you think about the industries that they're vested in, right? So they're not only invested in media; Marc Andreessen is also heavily invested in artificial intelligence, has invested a lot of money in OpenAI.

55:16

Sam Altman is interconnected into all of that. You have tech-focused companies that are making plays with defense, with the military. And so you have figures like Palmer Luckey, you know, who runs Anduril Industries. So that's all kind of where the money is coming from, the platforms that are being built, the technology that is being pushed. It's a variety of different figures.

55:40

We did a series last year called What the Fuck? The Rise of the Tech Brolligarchy. And we put faces and names to them. It's 14 people, which connects to my Salvadoran roots, because El Salvador for decades was run by a coffee oligarchy.

55:55

It's like 14 families. And we sort of have that situation going on right now. 14 tech figures that are controlling a lot of what we're experiencing in our society. And some of those figures are a few that you just named? Absolutely, yeah.

56:07

All of them are included. Folks that I haven't included are also some others, like Masayoshi Son, I might have gotten the name wrong, but he runs a venture capital firm. It's been funding a lot of the data centers

56:23

that are being built. Sundar Pichai, who runs Google. You know, Tim Cook, you know, with Apple.

56:30

And Elon Musk, too.

56:32

Elon Musk, absolutely, yeah. Yeah, all these people love Israel? Not all of them, yeah. Not all of them, but I'll say this, like, they don't get in the way, you know? They don't divest. They're still doing heavy business with Israel

56:46

in different ways, you know, whether it's pushing their products, they're doing partnerships, you know, artificial intelligence tools that are being integrated with the IDF. You know, you started seeing the use of like AI

56:57

creating kill lists during the genocide in Gaza. So yeah, I mean, not all of them are Zionists per se, in terms of holding that worldview, but they're pro-business with Israel for sure. Yeah, Peter Thiel in particular comes to mind

57:14

because I read that Palantir just secured a billion plus dollar, probably multi-billion dollar contract with the Pentagon, which is establishing, I believe, their software system as the default across all federal and military branches. So now the same people that are collecting and selling

57:34

our data, if that's not a mischaracterization, are also vested in the control of the media. Yeah, absolutely. A better question actually is, what is Peter Thiel's direct connection to the current media?

57:49

Yeah, I mean, he's invested in Rumble. You sort of mentioned Rumble earlier. And Rumble's become a site that's platformed a lot of anti-Semites and right-wing figures. And that's been kind of his most significant play into the media system: just purchasing, or buying a stake in, Rumble. He hasn't necessarily invested in a big way in traditional media companies.

58:13

I wouldn't be surprised if we see that happening, though. I did read somewhere, and I'm familiar with the concept, that Peter Thiel in particular has been bankrolling individual influencers and podcasters and people in the sort of counterculture and art world, to try to make an impact that way and promote different pro-AI, pro-Trump, sort of

58:35

anti-woke ideals in the counterculture to sort of create like a post-woke movement. Have you looked into that at all? We haven't, but I think it makes sense to me. I mean, obviously we've been seeing a lot of like, you know, conservative influencers that are pushing out the like, the fraud narrative and going out to different places, you know, folks like Nick Shirley,

58:57

who I know you've spoken to before. Genius.

59:01

And just going around, showing up at random places, creating kind of like a controversy around it, which has then led to the copycats that are doing the same thing in other places in the country. You know, and I think like those behaviors

59:13

are very much motivated by: what's content we can generate that will generate clicks and engagement? And I think what we've learned about social media platforms is that the content that outrages people, that angers people,

59:27

tends to perform better than just straight up information.

59:30

Yeah, that's the old Pulitzer quote, if it bleeds, it leads.

59:33

Bleeds it leads, yeah. And now it's so tough because whenever I see fraud stories ridiculously signal boosted by the vice president and everything like that and every single tech figure on the right wing, my first thought is like, oh,

59:45

they probably put him up to it. But then that's kind of like a birdbrained take, no offense to all these chickens here, because the economic incentive to create that kind of content is so strong now that you have people just making that of their own volition

59:58

because they know it's gonna make money. Absolutely. I mean, you see people who go from posting videos about lifestyle and exercise to becoming right-wing MAGA influencers. And it's searching for the thing that will engage, that will draw clicks, and that will...

1:00:17

And the thing is, that is the kind of moment that we're living in, and this is getting to the culture war versus the class war.

1:00:24

We live in an economy that has the kind of excess wealth that we're seeing. And the thing I always say about that is: to have the kind of wealth that we're seeing being amassed by figures like Elon Musk and Jeff Bezos and Larry Ellison, that kind of wealth accumulation can only be paralleled by a similar kind of wealth displacement, by poverty. And so people are searching for a way to get out. People are searching for a way to make money, and in an economy

1:00:54

where, because of artificial intelligence and for other reasons, there aren't opportunities, people will find a way. So how can small independent newsrooms or creators even get funding in the first place without having to build it from the ground up on their own? It's a great question. I think there are different kinds of interventions being attempted.

1:01:15

I mean, the first thing is like, we have to think about the importance and the value of a news ecosystem that actually informs people. Just from even just a straight up like, how do we get information out to people when there's an emergency, right?

1:01:29

We have to value it and it has to be resourced differently. You know, there's good models out there, like in Vermont, there's a regional network of community media organizations that have established a state fund where their newsrooms are actually funded

1:01:44

to do the work that they do. It's way more sustainable than trying to exist inside of this kind of corporate media sphere, but their information is all hyper-local. You've also seen the establishment of journalism funds, and the interesting thing there, tying it back to the report (we actually talk about this in the report), is that when states have actually attempted to regulate the role that these tech platforms play in suppressing news and

1:02:13

information for people, it's led to these companies actually very, very actively trying to undermine that legislation. In California, there was a bill that would have taxed platforms to basically pay out news organizations for the content that users access there. And rather than allow for that legislation to go through, they threatened to just de-platform all the news organizations from Google searches.

1:02:40

Google did this, Meta threatened to do the same thing, and the threat worked. You know, and instead what they negotiated was a collective fund with the state to fund newsrooms across the state. And it was supposed to be $250 million over the course of like 10 years, and Google stuck to it for three years and then pulled out. So it's not a long-term solution. So you do need a way, you need good policy that actually helps to promote local journalism

1:03:06

and local newsrooms, funding towards it. And you also just need space for alternatives to exist. Whether that's allowing for community-owned broadband networks that then allow people to access news and information that way. Because I lived in the Twin Cities

1:03:23

when they built the municipal Wi-Fi network. And the thing I loved about it was the moment you connected to it, it took you to a home site. And that home site was from a citizen journalism organization

1:03:33

that just provided local news to people. This can look very different. We just have to be creative about it.

1:03:39

Yeah.

1:03:40

Back to Larry Ellison for a bit. I also read that he was instrumental in the Paramount settlement that was paid out to Trump. What role did Ellison and the Ellison family play

1:03:48

in facilitating that?

1:03:49

Well, Larry Ellison's a friend of mine. He's a great guy. I think he's the buyer, but we did a deal for about $16 million plus, 16 million or maybe more than that in advertising.

1:04:00

Yeah, I mean, so the context is Donald Trump sued every single major news organization, you know, NBC, ABC, CBS, as a result of like coverage that they had done during the last election. His claim was that like he was unfairly treated because of interviews that they did with Kamala Harris. And most news organizations decided to fold.

1:04:24

They just like, they just settled with him. But the Paramount one for CBS was interesting because it was happening in the context of Larry Ellison purchasing Paramount Studios and CBS. And there's actual documented communications that says like, hey, if you settle the deal here,

1:04:45

we will kind of fast track the sale. And so there's a very direct relationship and one of the conditions for the Paramount Skydance deal to go through was settling this lawsuit, this frivolous lawsuit with Donald Trump, which if any of those news organizations

1:05:01

had actually challenged him, they probably would have gotten the lawsuit thrown out, but instead they just paid out. They just gave him money. So in the same way that the hostile takeover of the news media by these tech oligarchs has been enabled by the newsrooms being broke themselves, I would say it's probably a similar

1:05:17

thing there. Trump is able to successfully sue these news organizations, mostly frivolously, for stuff like defamation, because it's basically like, it's a match of whoever has deeper pockets. And because the news industry has been suffering for so long, they're more likely to fold because they're in free fall.

1:05:34

It's the same variable that made it so Larry Ellison and people like Jeff Bezos could come in and buy the Washington Post. And it's allowing Trump to sue them and get his way and kind of influence the narrative that way. Why do you think that traditional news media platforms have been in free fall for so long?

1:05:47

And when do you think that most significant shift was onto social media?

1:05:51

Yeah, I mean, consolidation is a big part of it. When you build massive companies, you take on all of their expenses, you know. And a lot of the media companies that got built in the early 2000s are not just news organizations. They are news organizations, book publishers, magazines, TV stations, the whole gamut. And movie studios. And movie studios, very famously, movies cost a lot of money to make, especially the kinds of movies that have been prioritized in

1:06:23

recent years. Ant-Man 6. Exactly. Let's do a bunch of CGI, more CGI, which costs way more money. Many of these media companies have taken on debt, you know, thinking like eventually we'll catch up, we'll catch up. But the debt has just grown. And with many of these media companies, like the debt has been around so long that their debt has gotten downgraded, which has made it possible for a really rich,

1:06:46

deep-pocketed person to just come in and say, I will buy up the company because you don't have to actually purchase all of the debt because it's been downgraded and you only pay some of it. And so I think we're gonna continue to see that. Like we're gonna continue to see media companies

1:06:58

that are bleeding out, who are also news organizations, who also function in other places in our media system. And when you look around society at who has the deep pockets, who has the wealth, who can actually buy these things, it's going to be tech billionaires. Is there anything the average person can do to resist this? I mean, it starts with scrutinizing the information that you get. This sounds kind of basic, but it goes back to the emergence of

1:07:24

our movement, Media Justice. It really started in the kind of heart of that media consolidation, and a lot of the work that we did back then was trying to teach people that the information that's coming out around you

1:07:37

is information you have to scrutinize. So that would be my first thing. It's straight-up media literacy: scrutinize every bit of information that you're seeing. Scrutinize the search results, scrutinize the article, try to really ask who benefits from this thing. Yeah. You know, the difficulty for the future of the left, or progressives in general, is how easy it is to create infighting amongst leftists. Yeah. And that's what I'm worried about most with the tech corporate takeover.

1:08:00

It's easier to, I don't want to just say every progressive person is a complete empath, but it is a more empathetic view of the world and a more community-based approach on the left, the progressive side. It's much easier to pit those people against each other than a vast subsection of the right that is just built on hatred. You know, hatred as a uniting force is almost a cleansing ritual for people who have a lot of skeletons in their closet. For example, on the right wing, you can have a Bible-thumping

1:08:29

evangelical Christian who thinks that every gay person will go to hell, right? He's spent a long life spewing hatred, who knows what he was involved in, in the Catholic Church or whatever. And then you have a guy who's a sort of Second and First Amendment, weed-smoking, libertarian 5G guy, and they're all uniting over the fact that they hate Somalis or something. You know, on the left I feel like the purity testing

1:08:54

has made it virtually impossible for a major movement to mobilize and I see the mainstream media as being complicit in that division. It certainly hasn't helped. I'll say this, like, I've been organizing for the better part of 20-something years, involved in a variety of different social movements,

1:09:10

from the American Rights Movement to now the media justice movement for the last 14 years. I definitely take your point. What I'll say is, part of what I've seen happen over the last few years is the level of collapse is so significant, and by that I mean

1:09:26

like the level of economic collapse, the level of political collapse, like democratic institutions not really functioning in a democratic way. The collapse is so obvious and so real and so widespread now that I think

1:09:39

that it's affecting people in a way that is pushing them to desire for something different. And I'll tell you this one story. I was door knocking in Philadelphia during the last election and I hit the door of a young kid, young black kid in West Philly and he was very tense. Like he barely opened the door and I was like, hey, just want to talk to you about the elections.

1:10:02

What are you thinking about? And my goal was just to have a conversation with him. And he was very closed off the whole time. The energy was just like this, you know, just answering kind of short, like, no, I don't trust politicians. They all suck.

1:10:14

That kind of disillusionment that I think you've come across. So then I asked, why do you get up in the morning? Like, why do anything? It's the only time that he softened up, and he was like: my daughter. The thing I found in that moment is that

1:10:32

even in the pit of despair that many people are in, there's still something core of what they want, of what they need. And I think you're right: for the left to build real movement, real common cause, a bigger we, we have to be able to connect to that and not to the other stuff. You know, even recently, I interviewed Afroman.

1:10:53

And he just beat, you know, the Adams County Sheriff's Department, who sued him for defamation after they illegally raided his home and he made a song about it.

1:11:00

They tried to sue him for making the song. Will you help me repair my gate?

1:11:06

Will you help me repair my door?

1:11:13

And so Afroman has like three days in the sun as this free speech warrior, sort of stick-it-to-the-man, sort of fuck-12 type icon. And then boom, it's like: he's transphobic. He's transphobic. And I'm like, that's not what this is about.

1:11:28

Yeah

1:11:28

And now Afroman, he's obviously clapping back, and he was like, all you fucking blue-haired motherfuckers hating on me need to, you know, get some money, bitch. And they're just like, oh god.

1:11:38

And you just see this repeat itself so much, and it's very unique to the progressive cause. Yeah. You know, it was just a shame to see, even in my own comments: did you ask Afroman about his beliefs on trans people? I'm like, no. Because it would have ruined the interview, and I would have put myself in a position

1:11:57

where forcing an intersectional framework on someone who is not necessarily up to speed on the movements of today would do more harm than good at the end of the day. Yeah, I mean, I think my organizing philosophy has always been: you gotta meet people where they're at. In some ways, I was blessed that my first organizing job was to work for a tenants union

1:12:15

for people who live in manufactured home parks or trailer parks. So in Minnesota, where I was living at the time, you go to manufactured home parks across the state, they're not in the Twin Cities, so it's not in the urban center,

1:12:28

they're all in the suburbs and rural communities, and it's like poor white folks who live in trailer parks. And the thing I had to learn as a kid who grew up in Koreatown, Los Angeles, in a neighborhood that was like Central American and Korean, was okay, how do I organize in that space? How do I connect with people?

1:12:45

How do I identify what's the thing that they wanna change and work with them to do that? And that means setting aside some of my own identities, some of my own politics for the sake of the work that I'm trying to do. It's still, in many places where I organize with people

1:13:02

who were hella conservative, it's fascinating the conversations you can get into if you're just willing to meet people where they're at. And at the end of the day, you find that like the things you actually align around are much more extensive than the things you disagree on.

1:13:17

And it creates opportunities to move them. Going back to Antonio Gramsci, he said that like you gotta think about cultural change like trench warfare. You're constantly kind of fighting for inches, trying to fight for new ideas. And the moments where you can make transformation happen is when people can act into a new way of thinking.

1:13:38

So organizing people, bringing people out, and turning them out into something, like, matters. That's why I'm so excited by the data center work. It reminds me actually of a lot of the tenants union work that I did, because the people that are coming out are across the political spectrum, all over the place. They're anti-war veterans, and people who are

1:14:00

fighting data centers because they hate ICE, and people who are fighting it because they just don't want to see a tech billionaire cake off of their community, and people who hold a variety of different political ideas. So to me, that's exciting. It's exciting to be able to build with people. I just hosted a group of data center organizers in New York a couple weeks ago.

1:14:21

We just went around the table talking about, okay, if we don't build the data center, what else can we build? It didn't matter that someone was from Amarillo, Texas, or Memphis, Tennessee, or Tucson, Arizona. The answers were actually very similar.

1:14:35

People wanted a safe space where they could come together and connect with people. That broke down any of the kind of political differences. Yeah, so the shared grievances were a segue to, I guess, great conversation and talk about potential future action. To what we want.

1:14:49

And then you find out that what we want is actually the same. Which is? A place where we can be safe, a place where people can have what they need, a place where we can connect, you know?

1:15:00

And in a moment where technology has played a huge role in isolating us, there's a deep desire for connection. So you feel like communal gratification is the most important human need? Yeah, we're social beings. We want to connect, and poverty and technology and all these things have pulled us away from each other. Yeah, absolutely. A certain kind of technology, a certain kind of technology that was designed in a certain way. Technology can actually play a huge role in

1:15:26

connecting us. You know, the early days of the internet were that for me. I have hella nostalgic ideas of what the internet was, because I lived in a moment where it was just being on Friendster and hanging out with my friends. Those were the good old days, when you didn't have Mark Zuckerberg building tools for you. Definitely. I do worry, though, a little bit about people who only seek out friends in subcultural niches online. Because on one hand, the community collapse crisis is so extreme,

1:15:53

but then if your idea to fill that void is to meet people on forums to share your exact frustrations and interests, sometimes you can become kind of one-track-minded and lose that perspective, you know, lose the everyman perspective. But at the same time, it is still good to make friends, as opposed to staying inside on your phone all the time. Yeah, that's true. I mean, even just interacting with people will be a step up. I think about my godson, and growing up in this day and age for him, and I'm like, he wasn't a latchkey kid like me. I was out in the streets, hanging out with my friends and going to parks,

1:16:26

and probably being in places I shouldn't have been super late, but, you know, it's just a different way of growing up and connecting with people that I think has served me in the work that I do now. You grew up hanging out at MacArthur Park? I did go to MacArthur Park a couple days ago, and there's a picture of me holding this blue glove

1:16:47

I got at Dodger Stadium at MacArthur Park. But I grew up actually a few blocks away from Seoul International Park. It's like right in the heart of Koreatown. Yeah, yeah, I know exactly where that is. That's, we don't gotta get into gang history right now.

1:16:59

No, there is history there. That's where I started, right? Yeah, yeah, yeah, yeah. Absolutely. That's where I played baseball. Okay, cool. So even when it was an active gang park, there was still Little League going on? Oh, yeah. Okay, cool. Yeah, and I went out there to play with the older guys who played baseball, too, and I

1:17:13

was just trying to hang. But the interesting thing about gang culture is that it's another kind of subcommunity, and in LA, in many ways, those things emerged out of a need for self-protection, a desire to be safe, a desire for connection. Very similar. Yeah, pretty much everything that we're experiencing.

1:17:34

I think that most societal ills are a symptom of community collapse. Even political radicalization, like even finding some of the anger, and just so much misplaced rage, and so many culture war issues

1:17:45

that have been amplified by traditional media above the actual issues that matter that sometimes it feels like everyone's lost their goddamn mind. Yeah. You know, especially because like we're all united

1:17:55

in sort of our shared struggle to find community, to have good healthcare, good education. I mean, it's not, there's this idea, and I'm sure you encounter this when you have that sort of like nihilistic youth perspective

1:18:08

in the hood in places like that.

1:18:09

Yeah.

1:18:10

Where the idea that nothing ever changes is because the material conditions of certain impoverished areas haven't changed over the course of several presidencies of different political parties. But it's this two-party gridlock

1:18:22

that I think creates a lot of that fatigue. Yeah. I mean, you know, talking to that kid in Philly, how can I tell him that a Democrat will change things? A Democratic mayor, a Democratic city council, a Democratic county commission hasn't done the trick. And the thing there, for me, is about realizing that it's not about trying to vote for a particular person. It's trying to vote for a particular agenda that we're trying to get through. But yeah, I totally hear it. I think it's interesting,

1:18:48

I was talking to a scholar recently about this. He was like, I think too often, people face nihilism in young people and try to move past it too quickly. And it's like, no, you kind of have to sit with it. You kind of got to give oxygen and legitimacy to the grievances that they hold. They're there for a reason, and you got to start there.

1:19:07

You can't just move past it and tell them, well, actually, you should just value voting because democracy's great. It's like, no, actually, democracy hasn't quite worked for too many people. So yeah, you got to sell them on a different idea.

1:19:17

Back to the data centers. If we were going to do a report right now on an active data center that's being constructed, that's being met with a lot of community resistance, where would you send us? Port Washington, Wisconsin. There's a data center there, speaking of Larry Ellison, for this project called Project Stargate. It started out as a collaboration

1:19:35

between OpenAI and Oracle. It's kind of a city right next to Lake Michigan. So it's plenty of access to like, you know, fresh water, which is gonna be needed, but it's been met with heavy resistance. And that's a place that's like a,

1:19:50

it's a small town, it's working class, it's across the political spectrum, and everybody's been turning out to meetings there, protesting and angry. That one I don't think has gotten as much play at the national level.

1:20:02

What are their primary grievances with it,

1:20:03

just for those who don't know? Yeah, I mean, so big data center coming to town, it's gonna use up a ton of water, it's gonna use up a ton of electricity, and the city has created what they call a tax incentive district, which has given away a bunch of,

1:20:20

basically allowed the developer to pull from their property taxes to improve the land where they're building the data center. So that's money that could be going to improve the schools or the roads, but instead it's just going to go to make better pipes for where the data center is being built, which is just on the outskirts of town.

1:20:37

The other thing is, people there know that data centers don't just come as one. They usually get built in clusters, because once you've built a certain amount of data infrastructure, you're going to need more of it. So I think people are gearing up for the fact that this big, hyperscaler project is

1:20:53

going to lead to more infrastructure in this region. It's going to bring more stresses on our water resources and higher utility rates. That's a good place to go. And I'm sure the data center constructors themselves justify the project as a job creator, right? Yeah, totally. I mean, they say it's going to bring in jobs to the community. Here's the thing, though: it's trade unions who build them, you know.

1:21:14

And there aren't enough trade union jobs out there for the demand that is being created by data centers. So in many communities, rather than hiring from within the community, they're just gonna ship in workers from other states. So even that job argument, that it's gonna somehow create jobs in the community, doesn't really hold water

1:21:35

in many places where the unionized labor isn't even really present on the ground, you know, to build the thing that needs to be built. So it might not create a bunch of jobs. No, and then on top of that, the data center itself is a big, massive computer factory. It doesn't require that many people.

1:21:51

It literally, for the scale of what's being built in Port Washington, Wisconsin, you're talking about 50 jobs. For the kind of real estate that is taken up, which is like multiple football fields big. And what's going on in there? It's just big computers that process all the data. Like, sure, like the prompts that you put into

1:22:09

ChatGPT and Claude, some of that is pulled into those computers to process that data, to process through the vast quantities of the data set for the large language model to then spit out the response. Some of it is that, but increasingly, because of how vested artificial intelligence companies are

1:22:27

with the military, it's to process like data for the military or to process data for local law enforcement and connecting it to Palantir and ICE. Like Palantir provides a lot of equipment, a lot of surveillance tools to immigration.

1:22:42

And it's gonna be used to process the quantities of data that help identify who needs to be deported, who needs to be tracked down, who needs to be detained. It's being used to run facial recognition searches. So a lot of stuff like that. I know we're jumping around all over the place here but I was thinking about Oracle and algorithmic control of TikTok and we talked about you know obviously they want to filter certain stuff that's critical of the

1:23:03

Israeli occupation. They probably also want to filter stuff that's critical of AI. But then also, there's a bunch of money to be made in just screen time itself, like getting people on their phone all the time, hooking them in with the algorithm

1:23:17

to increase their revenue through actual eyeball time. They want to basically harvest cash, print cash from your eyes. And I feel like, I could be wrong, I don't think I am, they're banking on a solid fraction of Americans being addicted to outrage. I look at the comment sections of dudes like Nick Shirley and Tyler Olivera, and I look at Elon Musk's Twitter, and this guy is the richest person in the world

1:23:43

tweeting like a hundred times a day about, I don't want to say meaningless cultural issues, but meaningless to him. Issues that will never affect his bottom line, that don't challenge his material stability or his company in any way.

1:23:56

And there's people online, and it's becoming such a massive part of the internet, that are just addicted to being mad at someone. Furious. But then that's reflective of, like, their situation probably not being what it should be. They might be late on their bills, they might be overworked. They have some sort of economic hardship that's pushing them into just

1:24:17

choosing a new person to get mad at every day. But that's one of the things that scares me the most, because we have to deal with the ramifications of that in real life. You know, and every week it feels like online there's a new scapegoat. It could be Hasidic Jews in New Jersey. It could be Somalis.

1:24:30

It could be Haitians eating cats. It could be, there's always something. And it's just like this roving hate target that seems to generate a lot of money. [Are certain communities being] targeted through this machine? Yeah, I mean, you know, well-documented cases like the Rohingya in Myanmar, you know, where Meta's Facebook was used very strategically to spread misinformation about that community

1:24:51

that then actually led to violence. You know, here's what I'll say. Like, there was a recent lawsuit and court decision that came down against Meta and YouTube, holding them liable, like a single person sued them for the addictive properties of their platforms,

1:25:06

you know, specifically with young people. And the judge ruled in favor of the person, the individual, which I think will create a wave of that kind of accountability next, because what they were able to prove in the court case was: yeah, actually, these things are designed to be addictive. Actually, I remember when I first got involved in working around Facebook,

1:25:27

I heard about a study that they had done, I think back in 2011. They had partnered with a news organization to study how do their users react to content. And so they specifically fed users content that would make you upset,

1:25:42

content that would make you happy, and just studied what happens to the user. How did they engage with the platform at that point? And they found, like, back then: oh, actually, the stuff that makes people pissed off, that's the thing that leads to shares, that leads to likes, that leads to engagement, in a way that was just vastly different from stuff that made you happy or just straight-up news and information. So the playbook has been set for a really long time, that the stuff that pisses you off,

1:26:10

the stuff that keeps you engaging, is the thing. And that's actually being transferred to how we see AI working today. Like, if you go to ChatGPT and you put in a prompt, what you'll notice is that it'll always respond with some validation.

1:26:26

Oh, that's an interesting thought. Oh, that's amazing. Very insightful. And then it will close by asking you a question if you want to continue. Do you want to continue engaging? Should I build out that agenda for you?

1:26:37

Should I do this other thing? Would you like me to share more? And it's the same cycle repeating itself in a different way, but it's kind of creating that constant validation, that constant gratification, that constant hit, right, of cortisol that just gets you going and it's addictive. It straight up is addictive.

1:26:59

Yeah, and those two chemicals, cortisol and dopamine, that's like the warfare of the algorithm. So I bet you they have a calculation as to what keeps you there for longer, and it's gotta be cortisol. You see something that makes you pissed off,

1:27:12

you're so much more likely to comment, and comments mean money. Comments and views are literally a value system for revenue. When you look at the metrics of social media companies, they will calculate their return based upon how long you look at the content,

1:27:27

how many comments you leave. And when people see a video of a cute dog, you know what I mean, not everyone's commenting like, damn, cute-ass puppy. But if it's a video of, let's say, someone shoplifting,

1:27:37

of a minority shoplifting, you can believe every bald, Oakley-wearing dude is in there being like, usual suspects. You know, and they're pissed off,

1:27:46

and they wanna be heard.

1:27:47

That's long been the thing with the media system, though, and this is the thing for me, like, how I think about media. The very first newspaper, even before the United States was the United States, was this thing called Publick Occurrences.

1:27:59

And in their very first edition, they had an article about the savage Indians and how to like deal with the savage Indians. And they had an article about like beware of the enslaved black people, they're going to try to kill you. So the entire history of media has always been about othering. And that's been, that kind of engagement, that kind of content has been core since the beginning of the media system. You know, it's always been there. It's always been profitable. It's always been engaging.

1:28:26

It's always been the thing that has driven more purchases of newspaper or more clicks. One thing you mentioned too is the ChatGPT validation structure. Yesterday, if you remember, I asked ChatGPT, just for fun, who's the most famous journalist in America?

1:28:41

And it said me. It was like, number one, Andrew Callaghan. I think it might have made a sound, but I don't know if it did, but it had like a gold trophy and it was like, number one, Andrew Callaghan. They were like, he's the best.

1:28:50

He gives really interesting peeks into different subcultural fringes of Americana, and he's the best cutting-edge journalist in the world. And I totally disagree. I'm not the most famous journalist in the world. I appreciate what ChatGPT said, and I was like, hell yeah, that's awesome.

1:29:08

And Lisa was like, yo, I'm gonna look it up on my phone. And it was this guy, Aaron Parnas. And it was hella other people. It was like 10 more, and I was like, all right, how about top 10? I wasn't even there. Damn, it knows who I am and it's trying to make me feel good to hook me in for longer. Well, that's the insidious part about it. There's a professor who actually

1:29:28

did this with his class, where they practiced just putting prompts into ChatGPT for a couple months. And then after a couple months, they had everybody switch devices. And it was fascinating. He found that when people engaged with a different device, same platform, they immediately recognized that it was different. It was completely different. So the large language model creates a version of itself that connects to you, that responds

1:29:56

to you, that feeds you, that validates you. And it becomes obvious when it's someone else that's going into your little ChatGPT and trying to communicate with it; it's not gonna land the same way. So it's part of its addictive properties, right? It's like it knows you, so it's gonna talk to you in the way that gasses you up

1:30:12

and tells you you're the best journalist out there. Yeah, and then on a surveillance level, it's also developing a dossier or a profile on you and everything you like and enjoy, which can be sold to advertisers, or sold to the FBI through data brokers. I'm willing to bet that if they wanted to develop

1:30:28

a domestic dissident database, AI models would be the perfect way to do it. Sure. They could probably geographically map out where you're gonna get coffee and they can assassinate you, right?

1:30:39

Or, they could be like, I just think these AI tools strike me as a surveillance tool in general. And so I think that's gotta be an agenda at play there as well. So not just spying on people and figuring out what they're into, but also selling them products.

1:30:52

I feel like it's all connected. Oh yeah, I mean, that's already happening. You know, Sam Altman, the guy behind OpenAI's ChatGPT for years. You know, just a few months ago, he was like, we're gonna experiment with ads inside of the prompts, just to see how it goes.

1:31:11

So it's heading in that direction. And so you mentioned these tech oligarchs. Are they pro-Trump or are they just following the money because Trump is in power? I think in this moment right now, tech oligarchs and the agenda of Trump are very aligned.

1:31:27

They want to promote the idea that like, tech fits the national interest for the United States. That's why we need to go out there and get all the minerals that we need for the computer chips that we need. That's why we need to make sure that like,

1:31:40

the computer chips that Nvidia makes keep flowing into the United States, so that they can keep building larger and larger large language models. We need to build as many data centers as possible, so we need government to deregulate and get completely out of the way.

1:31:51

So right now their agendas are aligned, but many of them subscribe to, you know, a philosophy of technocracy that actually believes in corporations replacing the state. So for now they're aligned, but sure enough, in the future they might not be. Do you think that America itself is a corporation? The country itself, no.

1:32:11

But like, who is it beholden to? I think corporations very much play a huge role in shaping the direction of this country.

1:32:18

I just always thought of the US as an LLC, because it was first a for-profit colony for England, and then it kind of bought its own independence, or I guess forced its own independence with the Revolutionary War, and then kind of became a company with a limited number of shareholders that's

1:32:34

only gotten smaller with time.

1:32:35

Yeah, that's an interesting concept to think about. I mean, I think about it as a state because of just my experiences and how I've experienced the United States growing up. Like, even before I was born, my family's migration history to the United States was in large part because of US interventionism

1:32:49

in El Salvador. And sure, I mean, corporations have done that. Corporations have messed up Central America too. Like that's where we had a banana republic there.

1:32:57

So, United Fruit Company, or?

1:32:58

Yeah, United Fruit, yeah. But yeah, I mean, I think it's an interesting concept, but I guess I've always experienced the United States as a state that's wielded power over me. Yeah, I guess maybe a corporation could, but not to the extent of a state, because a state can use violence. A corporation can only use security, I guess.

1:33:15

Yeah, which they have, you know, for sure. If you're Coca-Cola in South America, you have a very checkered history. Definitely. Are you optimistic about the future of the media ecosystem? I'm optimistic that there's a recognition that the media system that exists today is one that serves the interests of a small handful of people.

1:33:34

I think just seeing the reaction to the report and how people have engaged with it, seeing other reports that have come out kind of telling the same story about what's happening in the UK, which is very similar. And so I am optimistic that this moment of collapse actually gives us an opportunity to think about something new. There's a Zen Buddhist master that I follow

1:33:54

named Norma Wong, and she talks about like collapse as a moment of great peril and great opportunity. It's both. You know, she describes it as like kind of the delta, because in the delta where the river meets the ocean, it's super biodiverse, but it's also very turbulent. So it's a moment in which, like, people are waking up to what the system is, and I think there's greater appetite to imagine a different world.

1:34:17

You know, there's projects that we've been very closely allied with, like Media 2070, that are thinking about a media system that exists in the future that's free of this kind of corporate capture. And so I think there's more space for imagination now than I've seen before, and I'm excited for that. Yeah, the turbulence is definitely something that we feel, I mean, we're totally independent,

1:34:36

and that would be impossible without people's fatigue with media capture. It would be impossible without the failure of the mainstream to uphold the standard of the fourth estate. If people trusted the media, and if the media hadn't been compromised by billionaires, we wouldn't have an audience. Because people would just be like,

1:34:55

I don't need this, I'm gonna go read the New York Times. Yeah, and I mean, it's great that I think you've actually captured an audience that largely probably doesn't go to those corporate media channels. They're searching for information that,

1:35:06

and in part because I think they've recognized that the information that's been pumped to them doesn't make sense to the life that they're living. That's core to how I was brought up. Growing up as a, I was born in the US, but my family were immigrants from El Salvador,

1:35:22

watching political attack ads, using immigrants as political pawns and being like, you know, this was during the era of Pete Wilson in California and seeing attack ads of like immigrants flooding the border and bringing in crime and like disease.

1:35:37

And I'm like, my mom's a garment factory worker. My uncle's a day laborer. You know, they're going out there doing all the crappy jobs. And it didn't match up with my experience. And that's what made me fascinated to go out there

1:35:48

and look for media that actually represented me. And I think that's true for most people who like, probably many people that are coming to your channel who are just like, you know what, like the other stuff that's been pumped to me doesn't make sense, but this is nice.

1:36:00

Yeah. Yeah, the only flip side of that is these turbulent media times have allowed straight-up grifters, especially right-wing ones, to kind of masquerade as independent journalists. But they're actually just signal boosting false claims and different scapegoats that are established and made by the sitting president. Yeah, I mean, going back to Gramsci, he has a quote, probably the most famous quote he has: you know, the old world is dying.

1:36:27

The new world is struggling to be born. Now is the time of monsters.

1:36:31

Who are the top five monsters right now?

1:36:33

Larry Ellison is up there for me, Elon Musk, Jeff Bezos, Peter Thiel. I got one more, right? Marc Andreessen. He's a sneaky one. Yeah, he is sneaky. For those unfamiliar, what are some sneaky things he does, especially with the synthetic influencers?

1:36:50

So, part of what he's investing in is companies that are creating synthetic influencers, fake AI profiles, and that are being used by some countries, actually, to promote tourism in their countries. And many people, I mean, most famously recently I think like something came out of like a

1:37:11

synthetic influencer that was promoting like the military and showing up in images next to Trump all over the place, but uh

1:37:18

But these are being created, and it's displacing even just the influencer economy, which is really robust. I mean, those are actually real human beings that are making a living doing this stuff. But Marc Andreessen is finding a way to profit off of influencing without having to pay anybody. Yeah, we're actually visiting a click farm that he has contracted in Vietnam in a couple of weeks, or actually a couple of months. And that's one of the centers where, I guess, this synthetic influencer content is distributed

1:37:39

far and wide. They create these profiles of not real people with fake imagery to promote various, like you said, countries that want tourism or just different political movements that have a lot of funding behind them but not that many actual people interested. And what happens is people end up tacking on to it because people have that herd mentality where they're like, oh, a lot of people feel this way, maybe I should too.

1:38:01

Which is fascinating, because again, this is the thing about this moment that we're living in. Behind every major technological innovation is large quantities of people that are being sacrificed to make that moment happen. Synthetic influencers are not synthetic. It's a Vietnamese person who's making it happen. Waymo isn't a self-driving car. There is a Filipino who is ready to take over the wheel the moment that the car requests a takeover. You know,

1:38:27

There are actual people that are the underbelly of this moment, that are invisible to us, that are being exploited, that are being extracted from, including us. Like, the data that's being collected from us, you know, we're all part of the product to make Marc Andreessen and those other monsters super rich.

1:38:45

So there's no AI?

1:38:47

There's no artificial intelligence. I mean, it's also like, what is the idea of intelligence? What is it tethered to? You know, it's a certain kind of intelligence. How do you actually measure that? It's using human intelligence as the basis for what intelligence is, but it

1:39:03

completely misses out on other forms of intelligence that exist in our world. Other animals, the known universe. So yeah, I mean, I think the idea of AI is hype. It's not actually true. It requires a lot of people, a lot of extraction, a lot of exploitation. It destroys communities, it destroys the environment. All that, to give us that one response, that you're the greatest journalist of all time. Which I agree with.

1:39:33

Damn.

1:39:34

I agree with ChatGPT. Thanks, bro. No, but I mean, seriously, I think it's the thing of, behind the things that we are engaging with, there's a deep layer of harm that has to happen

1:39:47

in order to make those top five monsters super rich. So there's no AI, there's just corporate executives and shareholders saving money by having to use less human labor. But there are still human beings involved.

1:39:59

They're just made invisible.

1:40:00

Very much so, yeah. What are some other sneaky things that Andreessen does? Let's see.

1:40:11

So he's a venture capitalist. He invests in a variety of different companies. I mean, he's invested in a lot of media companies. He's heavily invested in Palantir. So he's put money out there to help build out the kind of surveillance

1:40:21

apparatus that is collecting data on migrants. And yeah, I mean, I think those are some of the more sinister things I would point out. Data's being collected on migrants, and the detention industrial complex primarily exists

1:40:36

to benefit the shareholders of CoreCivic and GEO Group.

1:40:40

Yeah, oh yeah, yeah. It's a closed corporate loop here. Yeah, I mean, it's funny. The essay I wrote about the Dodgers was actually writing about Mark Walter. He's heavily invested in actually both sides of this. He has invested in Palantir. He invests in a company called Shield AI that creates drones, that has a contract with Customs and Border Protection. And

1:41:02

he also has a kind of ownership stake in the GEO Group, which, by the way, is the private prison contracting company that owns the majority of ICE facilities in the US. Yeah, and the gist of my essay, grappling with my Dodgers fandom, is to say: this thing that actually meant a lot

1:41:19

to me growing up, still does, is owned by a guy who gets rich off of the surveillance of migrants, and then gets rich off of the detention of migrants. Which, you know, for anybody who knows the LA Dodgers fandom, it's a heavily Latino fan base. So he gets to get rich off of Dodger fans, you know. And that just presents this dilemma for me: how do I keep rooting for a team that's owned by a guy like that?

1:41:46

And you're still rooting.

1:41:47

I mean, you know, the conclusion of my essay, because originally I was like, I'm going to write an essay breaking up with the Dodgers. And as I was writing it, I just realized what the Dodgers mean to me and what they mean to so many people that I grew up with in my community. It's a thing that we have fought for to make ours. You know, and I think you see this a lot in Black communities and how they think about patriotism, and how they think about their relationship to the United States.

1:42:14

The thing that they fought for to make it theirs. And I think my ultimate conclusion is that Mark Walter can't take away what the Dodgers have meant to me, what it means to other Latino fans. But we can fight for it, and I think we will. When you were growing up as a Dodger fan, was the whole Chavez Ravine displacement

1:42:33

on the front of people's minds, or was that more brought up recently? No, it came up for me later on in life, you know. My first Dodger game was going to see Fernando Valenzuela. Oh, that was during that time? Yeah. Yeah, yeah. It was 1989, he was pitching against the Montreal Expos,

1:42:47

and man, he was like, it was riveting. Yeah, yeah. And my brother's dad was the one who took me, and I didn't grow up with a dad, and so he was like the closest thing I had to a father figure. And so like I got to bond with him at this game, fucking high-fiving when Fernando Valenzuela

1:43:05

struck someone out. And so my experience of the Dodgers when I came into it was like, this is where I get to go to be with other people like me, to root for a guy who looks like me, play in the best platform possible and win.

1:43:19

You know, that was like super exciting for me. Then later on as I learned about the history, like, yeah, it's like super deeply messed up. You know, it's a contradiction. It's a contradiction we have to deal with that like a place that means so much for Latinos is also a place of heavy displacement of Latinos.

1:43:34

Yeah, I made a video once that was like about the Dodgers World Series victory a couple years back. And the first half of the video was about Chavez Ravine, and the second half was just like in Echo Park, with just like almost all Latinos partying. And I was like, you know what I mean? I'm like, yo, there's two visions here of what it should mean historically

1:43:51

and what it means to people now. I don't think AI is gonna go anywhere, but I think the average American consumer can be more considerate with how they use it, and maybe think twice before the installation of an AI data center in their area? Oh, for sure. I mean, I think on the data center front, fight it. Fight it.

1:44:07

Slow them down. I was talking with this author, Karen Hao, who's the author of Empire of AI. And she was saying the revenue projections for all of the AI companies are just ridiculously bogus. It assumes a certain amount of revenue that

1:44:25

they can build over the next few years. But very central to them even having a chance of reaching those revenue projections is data centers. They need more data infrastructure. So like they need that infrastructure to get built out flawlessly. And her thinking was just like don't make it go flawless. And so that's why everywhere that we've been working in, it's helping communities actually fight back and say no to this thing, push for moratoriums,

1:44:53

Oh yeah. Like it feels like that's why

1:44:54

media capture is even occurring. It's a huge, I mean, it's a huge vulnerability for the AI CEOs. They need it. They desperately need it, because the drawbacks are so obvious that they need narrative control if they're gonna be able to pull this off. Yeah, what they've seen in the last couple years: just last year, there were 26 data center projects that were shut down. Most of those were shut down in the last half of 2025.

1:45:17

Yeah, and so momentum is building around organizing, and it's actually getting a lot of press coverage. All the major news organizations have covered it. But if I'm an AI CEO, I want to see data centers get built, and so you have to deal with the fact that organized opposition is currently winning the story. It's communities saying, we don't want this thing here, because it's going to drive up our utility rates and it's going to drink up all of our water. And that's, like, super enticing

1:45:47

and it's an easy story for people to grab on to, and that's why there's a lot of opposition around it. And they have to have full media control to be able to execute what they want to do. Yeah, I mean, in some ways, all you have to do is make it so that if those stories are reported, they don't reach people. And that's part of the sinister part of media capture: you can capture it by mergers and acquisitions, to actually become the owner. But you can also capture it by AI partnerships that create financial dependency.

1:46:18

And you can also capture it by the fact that you have control over the platforms and the distribution of information. So you can make it so that that article that might get written, that is critical of AI, that is critical of data centers, never actually reaches people

1:46:32

because when they click into their Google search results, it doesn't even pop up. And then a secondary motivation might also be the execution of the Greater Israel Project.

1:46:42

Yeah.

1:46:43

It has to be, right? I mean, that's also, if you look at Palantir and the connections there, I feel like that is one of their primary goals as well, especially if you look at CBS. I mean, they're not just installing pro-AI people, they're installing Zionists, right? AI companies have contracts with Israel. Yeah, so I see it as so explicitly connected to warfare and the expansion of their territorial aims. And it's a grim situation. But I think we'll come out on top, because even if there is nuclear

1:47:12

warfare, that will mean the end of the existing power structure. Not that I'm hoping for some kind of nuclear holocaust. But, you know, if the tech oligarchs do drive us into extinction or near extinction, we're gonna wake up and be like, man, those guys suck for real, and the average man is gonna agree. And then we can build our mesh network. That's right. Well, the good news is Elon Musk is building spaceships,

1:47:35

so we can pop them into a spaceship and send them on their way. Yeah, I mean, I want to go to space too, but not that bad. Yeah. That was more of a kid thing. Yeah. I think if you're an adult and you really still want to go to space, you have too much money. I mean, there's a whole, there's a mission to the moon right now, trying to get back on the moon.

1:47:54

Who's leading that one? I think NASA's involved in that, but... NASA's cool, right? [They've done] interesting things, but over the years, they've been so heavily defunded that now they're beholden to big corporate figures like Jeff Bezos and Elon Musk, who have actually been the ones spending all the money trying to do space research. Are there any benevolent figures that have a bunch of money

1:48:15

within the news media space?

1:48:16

Benevolent figures that have a bunch of money within the news media space.

1:48:21

Someone who is using their purchasing power to not gut newsrooms, and not force editorial control to enable the AI data centers. Like, someone who just has money, and they're putting it in the right place, and they have a good heart.

1:48:32

I think the place where it's showing up the most has been in the foundations that have popped up from past media figures, like Gannett Newspapers or Knight. They've actually been trying to invest in supporting local news ecosystems. MacArthur is another one that comes to mind. They've spent a lot of money through initiatives like Press Forward, trying to fund local news ecosystems. I can't think of an active billionaire who's trying to do good with their money.

1:49:00

I mean, I think the examples I've found of that have not worked out. Yeah, so there's no, like, cool media executive. No, not right now, not for me.

1:49:09

Not yet.

1:49:10

We'll see, hey, there's still, there's space for that. So do you think that to become a billionaire, you almost like have to sacrifice some part of humanity and just like learn to step on people and train yourself to be a psycho? Well, you have to exploit people. So on some level, you have to render some of the people that you're exploiting invisible to justify your wealth.

1:49:29

So I think it is a challenging thing to be, to get to that place, like it requires a certain compromise of values, so yeah. Do you think we should have a wealth cap? Absolutely.

1:49:40

What do you think it should be set at?

1:49:42

I don't know. I mean, that's not something I've necessarily heavily explored myself, but I do think that there should be taxing on wealth. Let's just start there. Just tax people for wealth. And, you know, especially if wealth is being transferred from one generation to the next, making sure that that's being taxed, so that

1:50:01

that kind of benefit, you know, is borne by others, you know, not just a handful. It'll happen eventually. And you know, they always say revolution can only really happen when somebody loses their material stability. And I think that right now,

1:50:14

even though people are sick and suffering on a mass scale, Cracker Barrel is still open, 7-Eleven hasn't shut down. And I think there's still people that are clinging on to those material comforts as, I guess, a justification, or a proof that society is still functioning, that it hasn't changed that much. I'm sure you've seen, working on the ground investigating the current media capture, that we might not have much time. Yeah, for sure. I mean, I think people are feeling it really hard,

1:50:36

even those kinds of quiet comforts are tough to come by, when even grabbing a meal at the Cracker Barrel now is a feat. If you need to get to the Cracker Barrel near you, it's gonna cost you a hell of a lot of money just to get gas. So you're being cut in multiple different ways, and the problem is, at a certain point, if you keep cutting from those who have the least,

1:50:59

eventually you don't have anything. So yeah, I think people are waking up to even those quiet comforts not being enough. Well, thank you so much for your time, man. I really appreciate it. Thank you. Anything else I didn't ask or you wanna say?

1:51:09

Just that it's great to be here. I really appreciate the work that you're doing. Hell yeah, thank you, bro. I can see who you're reaching
