
January 7, 2020 (16 mins)

George Noory and media theory professor Dr. Douglas Rushkoff explore his research into the increased reach of the internet in the commercial, news and transportation worlds, and how increased technology is taking us farther away from each other.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Now here's a highlight from Coast to Coast AM on
iHeartRadio, and welcome back to Coast to Coast, back
with Douglas Rushkoff. Douglas, I got to ask you about your Aleister and Adolf the next hour, that fascinating occult book. But you were talking about being on the phone and they somehow track and follow you, and did you want to get into that again? Sure, I mean

(00:23):
it's funny, you know, you might be having a phone conversation with a nephew or someone and you mention, oh, I'm going to have to buy a new baby seat or something for you,
and then you know, you go on the internet and
all of a sudden there's like an ad for a
baby seat there, and naturally we assume, oh my gosh,
they're listening in on the call, right the sort of

(00:44):
the Ed Snowden thing, where, you know, the NSA is listening in on every call. And though they are doing that, that's not how they get the baby seat on there.
It's almost scarier to realize the way they figured out
about the baby seat has nothing to do with hearing
your conversation, everything to do with watching all of your
behaviors online and not even the content of your email

(01:07):
or the specific sites that you're going to. They're analyzing everything else: how rapidly you click your mouse, what you hover over with your pointer, how many times you've checked your Snapchat that day, or if you've gone on Twitter between ten and ten fifteen or ten fifteen and
ten thirty. They lump you into these big statistical buckets,

(01:31):
and there are enough people and enough data using all this that they can figure out that, oh my gosh, people who have this combination of nine keystrokes are seventy percent likely to want to be getting a baby seat for their car, and they'll throw the ad at you. So
the weirdest thing is that this is the way the

(01:52):
algorithms see us. It's amazing technology, though, geez. Yeah, it is amazing. I mean, it's just predictive algorithms.
But then, when you think about it, for me, these are kind of the dark arts that I'm really concerned about. This is why I wrote Team Human, kind of to arm people against this and to fortify,

(02:13):
to fortify their resistance. For me, these algorithms are
almost like demons. I mean, think about it. You build
a piece of code, you teach it how to look
for vulnerabilities in a human being, and then how to
leverage those vulnerabilities to get us to act, often against

(02:34):
our own best interests. Well, you're gonna get a kick out of this, Douglas. A couple of years ago, I got an email from one of our listeners who said, George, I've been looking at the website of your guest for tonight, and he's got some very inappropriate stuff on his website, and if I were you, I'd cancel the guest. So I emailed

(02:55):
them back and I said what do you mean by inappropriate?
And he said, there's stuff there, and explained it to me, and it's so bad I don't want to mention it on the air. But you can get a picture of that. So I go to the guy's website, the guest's, and everything's fine. And I'm looking at it, and I go back again an hour later. Everything's fine, all his stuff is fine, his ads are fine.

(03:19):
It dawned on me that this guy, the listener, was
looking at inappropriate material and it followed him to this
guy's website. Exactly. What people don't realize is that, you know, your Google search results are different than my Google search results. Exactly. The ads that you get are different ones than
I get. Your Facebook feed is different than my Facebook feed.

(03:42):
And that's why we're so screwed up as a society
right now. I mean, it used to be bad enough
back in the day when we would argue about things,
but at least we were seeing the same thing and
then arguing about it. Now we're arguing about things that we're seeing differently, being presented different pictures of the world.

(04:03):
And the dark part, I mean, the part that bothers
me is that this is not random. This is by design.
This is a divide and conquer approach to humanity. They're
preventing us from finding common ground because the more isolated
we are, the more alienated we are from one another,
the better customers we are. The whole design of the

(04:26):
interactive space, of the digital space is to keep us
back in our reptile brain, to keep us responding automatically,
and they do that by showing us the equivalent of
car crashes. You know, whether you're on the pro Trump
side or the anti Trump side. You're going to see
images that get you really upset, and that's because when

(04:47):
you're upset, you can be more easily manipulated, more easily
sent to one place or another. You know, there's a
whole division at Stanford University called captology that technology developers
go through. And captology, just like it sounds, is the
study of how to capture people online, how to addict
them and manipulate their behavior. They port the algorithms from

(05:10):
Las Vegas slot machines into our Facebook news feeds and Snapchat,
and that's because they're using what's called behavioral finance in
order to modify human behavior to get us clicking on
buy buttons and just clicking, you know, just engaged online
in that strange, panicked state. Then we come out into

(05:32):
the world and we hear about, oh, you know, Trump
did something to Iran or Iran did something to us,
and everything is panic, everything is catastrophe instantly. And you can't approach any problem, you can't do anything, you can't even engage with your family and your friends
if you believe you're in the midst of a catastrophe.

(05:53):
Earlier today, Douglas, a brigadier general had a letter that he compiled that said that we
were going to leave Iraq, our troops were going to leave,
and it hit all the websites. I mean, I'm getting
these notifications like crazy, and I'm going, oh, all right,
this is interesting. Like ten minutes later, there's a retraction

(06:14):
from our government. No, this is not true. Whoa, we're not leaving. You know, this was an honest mistake. It's not happening. What are the dangers of things getting out there so fast, uncorroborated, unchecked, and starting a panic or whatever? Well, the danger, I mean, gosh, now

(06:36):
I'm really going to sound old, and I wasn't alive for it, but the danger is Orson Welles' War of the Worlds. Oh yeah, the original fake news. Only that was being done as a joke. People missed the disclaimer at the top of the show. Yeah. And even though they put a disclaimer, as we all know, people were piling into their cars and trying to get away from the

(06:58):
US coasts because they were afraid of the alien invasion. And we're in an equivalent state now. One way or another,
Almost every piece of media that we're looking at is
telling us you are in danger. You know, the world
is ending. These people are not your friends, and that's
what I'm trying to fight against. That's why

(07:19):
I came up with this kind of team human meme.
You know, partly it's to say, like, team human against
the robots. You know that I don't think we're going
to get replaced. I don't want to upload my consciousness
to a chip. I think, you know, even if the singularity is coming, I don't think human beings have
to pass the evolutionary torch to our robot successors and

(07:39):
accept our inevitable extinction. I think we deserve a place
in the digital future. But I'm also arguing that being
human is a team sport. You know, we've bought this really cockamamie understanding of evolution as a competition for survival between everybody. Evolution is no more the story of competition than it is of collaboration, of the way that

(08:00):
trees share resources through their root systems. They're not crowding
each other out for sunlight. They're sharing their energy with
one another through a living system of roots and mushrooms that take a service fee for the transfer. And human beings are like that too. We're at our best when we're connected for real to one another, which

(08:21):
was the original intent of the Internet. I was there.
I remember these were loving, open minded, kind of hippie
psychedelic people, thinking that the Internet could connect humanity, could be the brain for, you know, the spirit of Gaia on the planet. Way out. But we're using it for the opposite now. We're using it to divide

(08:43):
and conquer, and really to amplify some of the oldest and most extractive anti-human agendas. The original dominator agenda of Pharaoh, you know, is really coming through these technologies. I am opposed to driverless vehicles, especially driverless trucks, because

(09:06):
I don't want to see our truck drivers put out
of work. Hasn't the technology gone too far? Yeah, well, I mean, the idea of an automatic truck is not bad in itself. It's more
a matter of we're using it to amplify the worst
effects of an extractive economy. You know, we're taking drivers

(09:31):
and they're not only driving the vehicle, right, and they
should be paid for that. They're driving the vehicle across
the country, but they're also training their robot replacements in
real time as they drive. You know, every Uber driver
out there is not just getting slave wages for driving us around when they could, or

(09:52):
should be driving a cab like a human being. But
they're training their replacements. If they owned the company, if it were some kind of a cooperative, then at least they'd have shares in the robot, you know. Sure,
but they don't. Now they don't. They're being exploited, as are we all. And truckers understand it better

(10:12):
than anybody. Construction workers too, if they look at their construction equipment, the machines they're using now, you know, the Caterpillar and John Deere and everything else. There's AI on these things, you know. And that's not just to help the construction person build a better place. It's to teach the machine how to do it without the human being,
you know. And that would be fine if you're gonna

(10:34):
pay the person and say, look, you've trained all our trucks. You and your family now get royalties on these trucks forever. That's great, and then you can do something else, learn other things if you want. But that's not what they're saying. They just want to replace and discard. They're discarding the human contribution. That's what every

(10:55):
single digital business plan has to do with: how do we have fewer people? How do we get rid of the people and save money doing it? Right. What's the
upside to this kind of technology? Is there any? Well, there are a lot of upsides if we can embed them with human values. I mean, I've got

(11:15):
no problem with, you know, robots tilling the soil
and the farmer getting to spend a lot of time
watching his robots and drinking iced tea as they do
the work. You know, there is a future, a balanced future, where we use technology to do things that are, you know, difficult or painful for human beings

(11:35):
to do. Right now, to build a cell phone, they still have to send a kid in Africa into a cave at gunpoint to get the rare earth metals. Now, that's blood on our hands. It's just like blood diamonds. These rare earth metals are dangerous to get.

(11:56):
I mean, so sure we could be using tech to
do that, but it's really a matter of looking at
what technology do people need, rather than what technology can
we build to exploit people more, to repress their humanity,
to keep them from talking to each other, to take
more of their stuff. You know, we're looking at something very old, what the Native Americans called a disease. You know,

(12:19):
when the colonizers came to North America, the natives had such faith in humans that they couldn't believe we would just tear down forests and enslave people and kill like that. So they said we had a disease that they called wetiko. They thought it was a
spiritual disease that led us to want to just dominate
others and take their things and extract their value. And

(12:40):
I think we're still suffering from that disease that we
don't realize that it's so much easier to include your
neighbors in the profit that you're making rather than to
try to exclude them, that your success is not dependent
on someone else's failure, that you do better when everyone does better.

(13:02):
We need to teach even Walmart, we have to teach them: no, actually, if your customers and employees are wealthier, you're going to do better as a business, not worse. The wealthier your customers are, the more they can spend with you. That's right, Douglas, it's that simple. With the way things are moving, here we are in twenty twenty. What do you

(13:25):
think things will look like and be like in twenty
twenty five, only five years from now? It's interesting, you know. It's easier for
us to imagine like the zombie apocalypse of the future
of fifty years from now than it is to imagine
what happens just five years from now. I mean, I

(13:45):
have to believe that over the next five years more
people are going to wake up to what it is
that we're talking about here, you know, either because they're
too poor to keep buying all of this tech. Eventually
you just can't get the iPhone twenty or the Android
one thousand, you know, and as you fall behind, things stop working. I mean it's like, oh, I can't

(14:06):
use that app, and you start to look up. I mean,
what I'm doing in New York now, I'm starting to notice it turning around. I walk around the streets and I never have my cell phone out, never looking at it, but I always see, like, everybody just staring into their phones. I can't believe they don't smash into the poles and into other people, into the street and the taxicabs.

(14:29):
But every once in a while, I notice somebody else
looking up, and I make eye contact with that person,
and it feels like we're in a secret club of
the humans, like in Body Snatchers. Like, oh, there's another
one who hasn't been snatched. But I'm finding now over
the last year, more and more people are looking up.
I'm having that experience more and more. So I think

(14:49):
people are realizing they're getting almost nauseous with their devices.
And yes, I know, you know, your nephew or
your cousin is on Facebook and they had a kid
or whatever, but you could just find out about that
and okay, great, but they're not here right now. Somebody is here right now. There's your neighbor, there's your coworker,
there's your friend. And these people are coming to the

(15:11):
realization that their real world experiences have so much more bandwidth, to use the tech term, than any Skype connection. You can establish rapport, look into someone's eyes, see if their pupils are getting larger or smaller, see if their breathing is syncing up with yours. You know,
it fires the mirror neurons in your brain and releases

(15:33):
oxytocin into your bloodstream, which is the bonding hormone, and
all of a sudden, you feel really different. You don't
feel alone. You've found another person. You've begun that process.
So I think five years from now we will be
much more social. I think we're going to look back
on this red blue state thing just five years from
now and realize, oh wow, that's not us. They did

(15:57):
that to us. I'm not a red person, I'm not
a blue person. I'm just a person. And this whole
thing has been imposed on us by an industry that's looking to make money off us. That's the name of the game for them. Yep, yep. And they don't need it. That's the thing. When you look at Zuckerberg with his billions, they don't even need it. They don't need that money. They're just as

(16:21):
addicted to this as anybody. Listen to more Coast to Coast AM every weeknight at one a.m. Eastern, and go to coasttocoastam.com for more.

The Best of Coast to Coast AM News
