Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
AI as a thing in Silicon Valley really exploded during
COVID because tech companies lost control of their workforce. The
workforce went home and didn't want to come back into
the office, and so they wanted to break the back
of the employees. And then the writers' strike and the
directors' strike happened in Hollywood. They had another industry they could
(00:23):
go and pitch: we're going to just wipe out all
of these high-cost employees. We're going to just get
rid of them. That's the goal. And I'm sitting there
and going, we haven't gotten to politics yet.
Speaker 2 (00:37):
You've reached American History Hotline. You ask the questions, we
get the answers. Leave a message. Hey there, American History Hotliners,
Bob Crawford here, thrilled to be joining you again for
another episode of American History Hotline, the show where you
(00:59):
ask the questions. The best way to get us a
question is to record a video or a voice memo
on your phone and email it to American History Hotline at
gmail dot com. That's American History Hotline at gmail dot com. Okay,
today's question is a hot one. Ouch. I mean, I'm
getting burned by it already. Just looking at it on
(01:21):
the screen. Anyway, I've got a great guest here to
help me answer it. He is Roger McNamee, and I'll
let him give his own introduction.
Speaker 1 (01:30):
So I'm Roger McNamee. I spent thirty four years investing
in the tech industry, and then beginning in twenty sixteen,
I have been screaming at the top of my lungs
to reform the tech industry, really big tech, to protect democracy.
I wrote a book called Zucked: Waking Up to the
Facebook Catastrophe. It was about the first part of that journey,
(01:53):
and I am at the moment chagrined by what has happened.
Speaker 2 (01:58):
So, Roger, we had a question from Jason Comas in Perkasie, Pennsylvania,
your old stomping grounds outside of Philadelphia.
Speaker 1 (02:06):
What is political technology and is it being used in
the United States? Okay, So the issue that we have
here is that technology has been a factor in politics
for a very, very long time, and it's been used
a lot of different ways. So people started collecting data
on voters to try to understand voting patterns and how
(02:27):
to influence them beginning in the sixties and seventies, and
you know, you saw the beginnings of direct mail in
the seventies, and then as personal computers came along and
then networks came along, this got a lot more sophisticated.
The flip side of that was that once social media
came along, the opportunity to take that data and then
(02:50):
directly target prospective voters with messages that might be in
some cases to inform them of something, in other
cases to misinform them, or in completely other cases, maybe
to suppress their votes. And since twenty sixteen, the US
electoral landscape has been drowning in technology, and I would
(03:17):
argue that it's not just around elections, it's really everything now.
And the Trump administration's success in twenty twenty four owes
an enormous amount to the fact that the tech industry
en masse decided it was going to back Trump, not
(03:40):
just with money, not just with their votes, but with
their technology platforms to try to affect the outcome. And
you know, it's going to be a long time before
we know exactly how and what they did to get that
to work, but it was quite clearly successful.
Speaker 2 (03:58):
Obviously a lot of money invested on the part of
Elon Musk.
Speaker 1 (04:03):
True, but not just Elon Musk, not just Elon Musk.
A really important thing to understand is that Meta,
which is Mark Zuckerberg, Amazon, which is Jeff Bezos, but
also the Washington Post, Google, Microsoft, all of them, to
different degrees, lent really important assists to this.
Speaker 2 (04:26):
Let's go back to go forward. So your career, I
mean your career has been watching all of this change
and transform. Tell our listeners about yourself, what
you've seen throughout your career, how it has
changed, and how you foresaw it was going to change.
Speaker 1 (04:45):
So I answered the call of Steve Jobs in nineteen
eighty two when we were in the absolute kind of
early days of personal computing, and Steve talked about the
opportunity to use technology to empower people and increase their productivity.
(05:06):
And to me that seemed like just the perfect amalgamation
of the values of the Apollo Space program and the
hippies that I so admired, you know, the people who
created Atari and Apple, and it was my tribe, and I
was really happy to go there. And I was lucky
because I began my career right before personal computers became
(05:29):
significant enough to be their own industry. And so I've
gotten to see the whole thing, and a little bit
like Forrest Gump, I have found myself in
the right place at the right time, repeatedly, purely by accident.
And I moved from Baltimore, Maryland, to California in early
(05:50):
nineteen ninety one, when essentially Windows was just beginning to
work and the industry was making its pivot, which meant
I was in California inside the offices of a venture
capital firm called Kleiner Perkins Caufield & Byers when Marc
Andreessen came in to pitch Netscape, Jeff Bezos came in to
pitch Amazon, and Larry and Sergey came in to pitch Google.
(06:14):
I mean talk about having a front row seat for
the revolution that was me. And then in two thousand
and six, by then I was really well established in
the industry. I'd been around a long time, and somebody
at Facebook reached out to me and said, my bosses
are facing an existential crisis. Would you take a meeting
(06:35):
with Mark Zuckerberg? And it was the strangest meeting I've
ever had. But Mark was twenty two at the time.
Facebook was two years old. It had only nine million users.
In fact, it only had high school and college students,
so I wasn't even able to be a user. But
I believed that his original architecture, which required authenticated identity
(06:57):
using your school email address and gave you control over privacy,
that those two things would allow Facebook to create the
first really successful social media platform. And it never occurred
to me that there would ever be a problem with
that. You know, how could talking to your friends
and family ever be a problem? And you know, Mark
(07:18):
fooled me in some pretty profound ways. Or, put a
different way, I missed some cues that, in retrospect,
I wish I'd picked up, and that I really didn't pick
up until a number of years later.
Speaker 2 (07:31):
Roger, you're an old Deadhead.
Speaker 1 (07:33):
I am, indeed from literally from childhood.
Speaker 2 (07:36):
And any old Deadhead knows the name John Perry Barlow. You
bet. Talk about him, because I think about him
a lot, and how he embraced the Internet in its
earliest moments for community, for a utopian society.
Speaker 1 (07:53):
Right, so, talking about John. Yes, so I knew John
well and I considered him a good friend. He
grew up in Wyoming on a ranch, a very successful ranch,
so he was a child of privilege, and his dream
for the Internet was a little bit like the Wyoming
(08:16):
Ranch writ large, you know, borrowing from the Ronald
Reagan vision of everybody as the Marlboro Man, but without
the overt right wing elements to it. It was an
inherently libertarian vision, and he and I used to debate
with some frequency that aspect of it, because I believe
(08:40):
that libertarianism is too anarchic for me. You know, I
do believe societies need structure to survive, and the notion
that each of us can be a completely independent unit
strikes me as not only not realistic, but not desirable.
And society is about an evolution where people learn to
(09:07):
coexist by essentially pooling their skills, each according to their ability,
each according to their need, and creating something that's much
greater than the sum of its parts, where libertarianism is
just the parts. And I never agreed with John about
that part of it. The part of it I found
(09:30):
absolutely intoxicating was the notion that the Internet was a
tabula rasa that we could architect. Therefore,
I thought the debate was fantastic, and John was far
more articulate than I and a much better networker than
I was. And his vision obviously was intoxicating to a
(09:52):
large number of people. And in retrospect, we made some
really fundamental errors in the design of the Internet: the
decision not to build identity into it, the decision not to
have a separate payment system. Those are two things I
think we regret. Let's talk about those just for a minute.
Speaker 2 (10:12):
Explain those two ideas, like building identity into the Internet.
Speaker 1 (10:16):
The libertarian notion was that anonymity was inherently a
good thing, and that it would scale infinitely and you
never had to worry about it. Even when
John was alive, it was really obvious that whenever you
brought people together and gave some people the ability to
(10:36):
hide their identity, that inevitably trolling would take over that space,
and that it only took a very small number of
trolls to ruin an otherwise fantastic community. And you saw
this in chat rooms, you know, whether it was on
AOL or other services. You saw this in newspaper comments sections,
(10:58):
everywhere they came up, you saw it. And with respect to
the early social media platforms, if you count AOL
as one of them, you know, it certainly goes
on to Friendster and MySpace and all that. The
ability to obscure your identity led to massive trolling, and
(11:22):
that's not a good thing in my opinion. Now, to
be clear, I think that the ability to protect your
identity from others is also a very important value. So
this is not one of these things where we want
to disclose who everybody is all the time. But I
do think we have to have some level of accountability,
(11:44):
and we have to decide which are the spaces where
anonymity is okay and which are the ones where it's
not okay. So I don't think you have to enforce
a situation where people must be disclosed all the time.
But the flip side of it is you have to
have spaces where trolling is discouraged by the architecture
(12:06):
of the product.
Speaker 2 (12:08):
So we have this libertarian space where people can mask
their true identities. Now let's talk about late nineties into
the Obama period. How did politicians begin to use this
or governments begin to use this?
Speaker 1 (12:27):
So it's a great question. So the way I handicap
it is that when Obama got elected the first time
in two thousand and eight, the role of internet technology
was inconsequential. But by the time he got re elected
in twenty twelve, the model that we see today had begun.
(12:52):
And what Obama did is he had a program which
he did with Facebook, where Facebook users would agree to
share their friends list with the Obama campaign in order
for the Obama campaign to get people to register to
vote and be aware of election day. So what
they were trying to do was pro-democracy and good,
(13:17):
but the underlying issue was that I, as a Facebook user,
could share my one thousand friends, or whatever the number
is, with a political campaign without the permission of the
people who were affected. That was wrong, and because they
got away with it, Facebook went into twenty sixteen thinking
(13:40):
that was going to be the golden ticket in politics. Basically,
what happened between two thousand and eight and twenty twelve is that
Facebook first grew its audience dramatically by cutting deals where
it shared friends lists with really attractive partners, think video
game platforms and, like, you know, people who could bring
(14:03):
millions of users onto the Facebook platform. And so Facebook
was very much in the business of sharing without permission
people's identity and friends lists. The Federal Trade Commission in twenty
eleven entered into a consent decree with Facebook, saying thou
shalt not do that, and if you do that again,
(14:24):
we're going to blast you big time, and you have
to go through a process where you prove each year
that you haven't done it. Facebook literally paid no attention
as far as I can tell, to that consent decree.
They just kept going as though nothing had happened. And
so by twenty twelve, they did this thing with the
Obama campaign, which is a clear violation of the consent decree
(14:44):
with Obama's own FTC. And by twenty sixteen, they were
in this position where they had traded so much stuff
that everybody's friends list had been shared multiple times, you know,
maybe hundreds of times, and there was no ability by
(15:07):
anybody to protect themselves from it. And so when twenty
sixteen came along, Facebook realized, Okay, we had an impact
in twenty twelve. We're going to be the decisive factor
this time, and we're going to get a lot of
revenue because campaigns are going to target audiences with Facebook
(15:28):
advertising and Instagram advertising. But we're going to help them
do that to make it work better. And that's going
to be a game changer. And they offered this to both
the Trump campaign and the Clinton campaign. The Clinton
campaign refused, And so what wound up happening was that
the Trump campaign, with its partner Cambridge Analytica, paired essentially
(15:56):
thirty million voter records in the United States. So that's
thirty million out of two hundred and twenty million voters,
so one in seven, to a Facebook ID. So this
was this was the big mistake of the Clinton campaign,
I would argue. I would argue that it was
a huge mistake of the Federal Elections Commission, It was
(16:18):
a huge mistake of the Obama administration. Obviously, the Clinton
campaign put itself in a huge hole by not agreeing
to do this, if for no other reason than to,
you know, just neutralize the threat. But they looked at
this and said that's wrong. But Trump looked at it
and went, wow, We're going to win because of this.
(16:39):
Here's what mattered, okay: what is
called a custom audience of thirty million voters was then taken
to the San Antonio digital offices of the Trump campaign,
where employees of Facebook and employees of Microsoft worked hand in
hand with Cambridge Analytica and the Trump campaign to create
(17:05):
and target ads which were designed to suppress votes of white
suburban women, young people, and people of color. And they
ran a bazillion experiments, they spent a bazillion dollars on Facebook,
(17:26):
and if you look at it, with uncanny accuracy they
suppressed those three constituencies in precisely the congressional districts that
they needed to decide the election, an election where they
got way fewer votes. The one piece that nobody's ever
(17:47):
followed up on is that earlier in the year, in
twenty sixteen, the Russians hacked the servers of the Democratic
National Committee and the Democratic Congressional Campaign Committee and stole
all the election books from them, which included the campaign
books of both Clinton and every congressional district. Now, if
you wanted to have perfect targeting, if you combine that
(18:10):
with the thirty million voter data set that Cambridge Analytica
created, you had a weapon unlike anything that had
ever been used in politics. And surprise, surprise, they squeaked
the thing out by the narrowest of margins.
Speaker 2 (18:33):
This is American History Hotline. I'm your host, Bob Crawford. Today,
my guest is Roger McNamee. He's the author of Zucked:
Waking Up to the Facebook Catastrophe. He's been sounding the
alarm bells about runaway technology since we were playing Snake
on our Nokias. Today we're talking about the intersection of
technology and politics. I know it's a dumpster fire, but
(18:56):
it's a great conversation. Remember, send us your burning question
about American history, record yourself using your voice memo app
on your phone and email it to American History Hotline
at gmail dot com. That's American History Hotline at gmail dot com.
Now back to our show. Roger, talk about January sixth
and the role technology played in the attack on the Capitol.
Speaker 1 (19:19):
Sure, okay, So Trump gets elected, he goes into office
in twenty seventeen, and he tries a whole bunch of
things right away, and all of them either backfire
or at least make him very unpopular, and then essentially
COVID comes along, beginning in twenty twenty. It's an election year. Now,
(19:40):
Facebook had been warned in May of twenty nineteen by
the Federal Bureau of Investigation that there was this group
called QAnon, and the FBI was classifying QAnon, which had
begun around Pizzagate, and it basically was this monster conspiracy theory,
and the FBI warned them that it was growing like crazy
(20:01):
and that Facebook needed to take steps to make sure it
didn't grow like crazy on Facebook. By June of
twenty twenty, so thirteen months later, Facebook admitted to at
least three million followers of QAnon on Facebook groups
or Facebook pages devoted to QAnon. Right, so they
(20:23):
didn't slow it down at all, And the Trump campaign
had this genius insight, which was that COVID created a
perfect melting pot and breeding ground for conspiracy theories, and the
definitive purveyor of conspiracy theories was QAnon, which had spent
the early part of twenty twenty absorbing the anti-vax movement.
Speaker 2 (20:46):
And it was.
Speaker 1 (20:46):
Pretty straightforward, because on social media, anti-vax was really
about new mothers' groups. So anti-vaxxers infiltrated new mothers'
groups, and Facebook would then herd new mothers to these
groups where they would get radicalized. And Facebook did this
on purpose, right, They did it because radicalized people are
(21:08):
just so much more economically valuable because they're vulnerable to
scams and they're vulnerable to emotional pitches in a way
that a normal person is not. So QAnon absorbs
anti-vax, and in the summertime, Trump goes, we're gonna
absorb QAnon, so he merges MAGA with QAnon. He starts
(21:30):
talking QAnon stuff left, right and center, so that by September,
when it's clear Trump is going to lose, he's got
this massive conspiracy theory network online in his pocket. So
he starts to talk about Stop the Steal. What happened?
The QAnon guys pick it up, go nuts with it.
(21:52):
The anti-vax guys go nuts with it. And as
you get to the election and he starts talking about,
you know, how corrupt everything is and how it's been stolen,
they start pointing people towards not accepting the outcome. Here's
the key to understand: of the thousands of people who attacked
(22:12):
the Capitol on January sixth of twenty twenty one, the
vast majority of those people were small business owners who,
had you asked them two years prior, prior to QAnon,
could you ever imagine yourself attacking a police officer? Could
you ever imagine yourself attacking the US Capitol? They would
have looked at you like you were insane. But what
(22:35):
had happened was they were groomed, either by anti-vax,
QAnon, or MAGA, to be incredibly vulnerable to this alternate
reality that was set up for Stop the Steal. And
then the insurrection itself was organized entirely, well not entirely,
(22:56):
but most heavily on Facebook, and people like me are
looking at this thing, going, hello, look what's going on here?
I mean, there was no surprise. You could see it
coming miles away. But of course Trump was president, so
he's got Washington turned off, right, he's got you know,
(23:21):
he's got the standby button on, so his amp is
turned on, but nothing's coming out right. And you know,
you look at this and you go, we are so screwed.
And yet because they committed so many crimes, all you
had to do was let the law take its course.
(23:43):
And instead we said we're going to move on. Biden
and Merrick Garland? Yeah, but not just them. You know,
obviously the Justice Department, all the way down, right, Congress,
all the way down. The Democratic Party had conditioned itself
by then to be non confrontational to such an extreme
(24:07):
degree that even in that window, when Mitch McConnell and a
gazillion other people agreed that this was an insurrection and
could not be allowed to stand, we did nothing. And
anyone who thinks that political technology didn't play a role
in that too isn't paying attention.
Speaker 2 (24:28):
Well, part of the technology that followed that was, you know,
the rise of the bro podcaster. Yeah, and so one
of the things that led to Trump's
victory in twenty twenty four was the support he received
from the Ben Shapiros and the Charlie Kirks and the
(24:49):
Steve Bannons and Joe Rogan. Of course. So how did
these guys, and how did the right, become
the political technological innovator?
Speaker 1 (25:00):
I don't actually think of it as political and technological.
It's definitely political innovation. It was not technological innovation. I
think what's going on here is that we created an
environment over the course of, call it, seventy years where we
trained Americans to view themselves as consumers first, as opposed
(25:24):
to being citizens. To beat the Depression, to win the
Second World War, every American had to be a citizen.
We had to recognize that we had shared interests and
that we had to make sacrifices in pursuit of those
shared interests, and we were all good with that. Tax
rates were incredibly high, but they were used to finance
(25:46):
public goods enjoyed by everyone. Public health, public transportation, public education,
all of that, right, and we built this extraordinary engine
that raised standards of living throughout the entire economy. And
then beginning in the late sixties, the Republican Party decided
(26:07):
it was going to do everything it could to kill
the New Deal and return the country to oligarchy, which
is what had prevailed before the Depression. And it was
a very long term strategy, and the Democratic Party stopped
attempting to fight it and more or less embraced it.
(26:27):
Clinton governed basically, you know, socially to the left of Reagan,
but economically to the right of Reagan. And you know,
Obama continued that right, these guys were neoliberals. They were
big on deregulation, they were big on, you know, allowing
(26:47):
white collar crime to proliferate. And you know, I mean
Obama had what is called a Section One case under
the Sherman Antitrust Act against Google. So Section
One is the portion of the Sherman Antitrust Act
that deals with anticompetitive crimes that are so severe you
don't actually have to prove them, just the attempt. You know,
(27:09):
it's like price fixing and things like that. The attempt
to do it is all you have to have
in order to get a win. And they had a
Section One case against Google that would have, had they
prosecuted it, prevented surveillance capitalism from undermining our politics from twenty
sixteen on. But in twenty eleven,
(27:30):
Eric Schmidt, who was the most important funder of Obama's
reelection campaign, prevailed on Obama to kill that anti trust case.
So when we're looking backwards, the key thing we need
to understand is the Democrats have been enthusiastic contributors to
their own demise. And you know, the thing that we
(27:52):
have today with the continuing resolutions and all the other
things related to whether Democrats are going to fight or
capitulate to Trump, all of that is part and parcel of
a Democratic culture that has been in place and evolving
for certainly thirty years.
Speaker 2 (28:14):
To wrap this up, let's talk about AI technology. One
of the things we didn't see so much of in
twenty twenty four cycle were the deep fake videos. We
did see some and they were starting to come in.
How do you see AI political technology being utilized in the next cycle?
Speaker 1 (28:33):
I'd like to actually zoom back slightly,
because I think AI as currently practiced is probably the
most evil class of technology products ever created. You know,
in my view, social media, Google Search, the Microsoft Office
(28:54):
three sixty five, cloud-based productivity tools, Google apps, all
those things, those are really toxic products, but they've evolved
into that position. AI is designed to be essentially predatory
(29:19):
and to be hostile to the interests of everyone it touches.
So you're talking about a technology where, the way that
Silicon Valley has described it, they said: we're going to invest
a trillion dollars, we're going to create a technology that's
going to revolutionize the economy for the better. Now here
are the things we know for sure. They've invested roughly
(29:39):
three hundred and fifty billion dollars so far. In doing so,
they've totally distorted the power grid and accelerated climate change.
They have used millions of gallons of drinking water in
places where water is really scarce. They have stolen
from you, from me, and from everybody else every copyrighted
(30:02):
work that's in digital form. They have stolen from you,
from me, from everybody else every piece of personal information
that exists anywhere in digital form. And they've done that
to create products that do not work as well as
the things they're supposed to replace. You know, you look
at Google search now with Gemini, and you go, who
(30:26):
wants a search engine that you have to fact check?
Speaker 2 (30:29):
That's right, right, right.
Speaker 1 (30:31):
There are clearly use cases where AI is valuable, but
they don't require three hundred and fifty billion dollars worth
of investment. Distorting the power grid, distorting the water tables,
destroying copyright, destroying privacy: none of those things is
necessary. Those things are choices. Why? Because
(30:53):
AI as a thing in Silicon Valley really exploded during
COVID because tech companies lost control of their workforce. The
workforce went home and didn't want to come back into
the office, and so they wanted to break the back
of the employees. And then the writers' strike and the
directors' strike happened in Hollywood. They had another industry they could
(31:16):
go and pitch: we're going to just wipe out all
of these high-cost employees. We're going to just get
rid of them. So the whole exercise is we're going
to spend a trillion dollars in total, we've spent three
hundred fifty billion so far, in order to unemploy every
creative person in the economy. That's the goal. And I'm
sitting there and going, we haven't gotten to politics yet.
What is good about that? In what way does society benefit?
(31:40):
What it is is it's literally fascism in a box, right,
because you're essentially concentrating economic power in the hands of
a handful of people at the expense of hundreds of millions,
if not billions of people.
Speaker 2 (31:56):
Do you think people will always be okay with this?
Because it doesn't seem like there's a good amount of
people that are kind of okay with it.
Speaker 1 (32:04):
Well, what's wrong is how those people are distributed, right.
It's politicians and journalists first and foremost. I think if
we went into the street, there's more antagonism to AI
than any technology I've seen in my lifetime. I mean,
it's actually really encouraging. The problem is the industry has
bought off the politicians, and it's bought off the press.
(32:27):
I mean, the press is so dumb on AI. It's
really hard to even know where to start. And
it's not every journalist. I'm just talking about the big institutions.
You know, they mindlessly act as stenographers every time Sam
Altman or you know, the CEO of Microsoft, or you
know Elon Musk talks about it. I mean, Musk's rock product,
(32:53):
when applied to news sources is wrong ninety four percent
of the time. I mean you sit there and you go,
I mean you can't explain that through chance, right, That's
just a product that doesn't work at all. And you know,
explain to me Keir Starmer, the Prime Minister of the
(33:15):
United Kingdom, and he's going, you know, we're going to
get rid of copyright law to let these guys get
away with this. I'm going, what in God's name are
you doing? Look at Tony Blair. What is your excuse, man?
You should know better. You're an MP for Labour.
Speaker 2 (33:33):
We are at a hinge moment of history, Roger, and
we are glad we have you to help us kind
of work through it. Thank you so much for answering
this question before we go. Is there a question you
might want us to answer on American History Hotline?
Speaker 1 (33:50):
The most important thing I want every American to think
about is how are they going to change their relationship
to technology? That's our only hope at this point, because politicians
are on the wrong side, the press is on the
wrong side, every institution's on the wrong side. It's totally
on us. What are you willing to give up in
order to be part of the solution? I mean, years ago,
(34:12):
I gave up Google, I gave up cloud based apps,
I gave up all the meta apps you know. And
it turns out it's really easy to lift that way,
but most people just can't imagine it. And I'm sitting
there going I got news for you. If you don't
abandon these products, you're screwed. Now, abandoning alone isn't
enough. It only works if everybody does it. But weirdly, we have
(34:34):
every incentive to do that. Now, just say no.
Speaker 2 (34:41):
You've been listening to American History Hotline, a production of
iHeart Podcasts and Scratch Track Productions. The show's executive
producer is James Morrison. Our executive producers from iHeart are
Jordan Runtagh and Jason English. Original music composed by
me Bob Crawford. Please keep in touch. Our email is
(35:03):
American History Hotline at gmail dot com. If you like the show,
please tell your friends and leave us a review in
Apple Podcasts. I'm your host, Bob Crawford. Feel free to
hit me up on social media to ask a history
question or to let me know what you think of
the show. You can find me at Bob Crawford Bass.
(35:25):
Thanks so much for listening, See you next week.