
October 26, 2025 • 29 mins

A world-first law is about to change how Aussie kids use social media — forever.
From December 10, children under 16 will be banned from holding social media accounts. But what does that really mean for families?

In this special extended episode, Dr Justin Coulson speaks with Julie Inman Grant, Australia’s eSafety Commissioner, about the new age-limit legislation — who it covers, how it will work, what fines apply, and what parents must do now to prepare.

This is the definitive guide for every parent trying to navigate the online world — with calm, clarity, and confidence.

KEY POINTS

  • What the new under-16 social media ban actually includes (and who’s exempt)
  • How eSafety will enforce compliance — and why parents won’t be penalised
  • The five-step “layered safety” approach every platform must follow
  • What’s being done to restrict online porn and explicit content
  • The truth about “nudifying” apps and how schools can respond
  • Simple ways to help your child transition off social media safely

QUOTE OF THE EPISODE

“Parents shouldn’t have to fight billion-dollar companies to keep their kids safe online — the responsibility belongs with the platforms.” — Julie Inman Grant

ACTION STEPS FOR PARENTS

  1. Talk with your child about the upcoming change — and why it matters.
  2. Help them download photos or memories they want to keep before Dec 10.
  3. Set up approved messaging groups to stay connected safely.
  4. Bookmark trusted influencers or sites they can follow directly.
  5. Visit esafety.gov.au for family checklists and guides.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
G'day, welcome to the Happy Families podcast. Something different today as we kick off your new week. Coming up in the next six weeks or thereabouts, on December tenth, we see a massive change in Australia, with minimum-age legislation for social media coming in. All children under sixteen will no longer have access to accounts or to

(00:26):
be algorithmically followed online by the major technology companies. Today, my very special guest on the podcast is the person in charge of regulating, developing, and essentially rolling all of this out. She is the federally appointed eSafety Commissioner, Julie Inman Grant. Julie has been on the podcast a number of times, and she consistently provides us

(00:48):
with the very best information about what's happening to keep our kids safe. Today we have a much longer conversation than normal on the pod, because there is so much to cover and it matters so much: everything from who's included in the ban, what parents are supposed to know about it, how fines are going to be enacted, what to do about explicit content online, whether that's covered or not,

(01:09):
and what the government's doing about that, and even those nudifying apps, where we're seeing schools on a weekly basis finding kids using them and causing harm. We're going to cover all of that and more. This is a conversation you don't want to miss. It's coming up right after this. Hello and welcome to the Happy Families podcast, real parenting solutions

(01:30):
every single day on Australia's most downloaded parenting podcast. We are a month, maybe a month and a half, away from the arrival of the brand-new legislation, world-first legislation. There are a lot of countries who are kind of jealous about what's going on in Australia when it comes to social media: minimum-age legislation, limits on kids under the age of sixteen being able to access the social

(01:51):
media platforms because of the risks associated with them. And there is nobody better to talk to about this than the person who is in charge of regulating it all, the eSafety Commissioner of Australia, who is a regular conversationalist on the Happy Families podcast. What a privilege to have you back, Julie Inman Grant. Thank you for your time today.

Speaker 2 (02:08):
The privilege is all mine, Justin.

Speaker 1 (02:10):
I can't even imagine how busy you are at the moment. Well, actually, I do have some sense of how busy you are, because when I was speaking with Amy, your assistant, it took us a long time to lock this time in. So let's get started straight away. You're in charge of the upcoming social media minimum-age legislation. It starts on December tenth. There is still, unfortunately, probably more than is necessary, a lot of confusion out there. Can you

(02:31):
step it out in really simple terms? What does it
mean for parents and teenagers?

Speaker 2 (02:38):
At eSafety, being the first online safety regulator in the world, we're used to writing the playbook as we go along, and this was probably, I can definitively say, the most novel, complex piece of legislation that I've ever seen, and so it's taken a lot of mental gymnastics to actually figure out how this will work in practice. But I guess what parents really need to know is that the

(03:00):
whole idea is to take the pressure away from parents and kids themselves; the onus of responsibility for keeping them off social media until the age of sixteen falls squarely on the platforms. As you would understand, Justin, these companies, particularly the larger ones, where the

(03:20):
vast majority of young people are spending their time, have vast resources and the best minds. They're using advanced AI tools, and the vast majority of them do use age-inference tools today. The government also completed, in August, the Age Assurance Technology Trial, where about fifty different age-verification

(03:41):
technologies were tested for accuracy, for robustness, for what kinds of bias they carried, for whether they captured the broad range of ethnicities across Australia, and for whether they were privacy-preserving, and about ten of those scored the top technical-readiness rating or just below. So this gives all these major companies

(04:05):
additional technologies to use. So the way that it will
work is we're going through something called the assessment and
self assessment process. But I think you can be reasonably
comfortable that the major platforms that the Prime Minister mentioned
and that kids are using today, YouTube, Facebook, Instagram, TikTok,

(04:26):
Snapchat and X are likely to be covered. At this time we're still engaging in some conversations around procedural fairness. Later this week we'll announce a number of organizations that probably will not be in the ban. There are two exemptions. One is for messaging sites and the other is for

(04:48):
online gaming platforms. But as you would understand, there really is no bright defining line. A lot of messaging services, for instance, have social media functionality, and online gaming platforms, if you're thinking about Roblox or Fortnite for instance, have lots of social media, chatting and interactive features. So,

(05:11):
is their sole or primary purpose to be an online gaming platform or a social media platform? These are the processes we're going through now. But either way, we have a range of information coming out this week for parents and young people themselves. We consulted with about one hundred and sixty different organizations, including with young people specifically and with parents,

(05:34):
so that we could tailor these materials to what they want: everything from conversation starters to the checklists you need to go through with your kid. For instance, how do you start having the conversation about what platforms they are on today and get them to start weaning themselves off social media? This is going to

(05:55):
be a monumental event for a lot of young people. How do you download their memories and their pictures? How do you make sure that they're still connecting with their friend group? Particularly in December, when school is breaking up for the school holidays, we'll explain how to set up a group messaging chat, for instance. If there

(06:16):
are influencers you approve of that your kid might like to follow, then, you know, look for their website and bookmark it now. And then we've been privileged to work with groups like Beyond Blue, headspace, ReachOut and Kids Helpline to make sure that all the mental health resources and
to make sure that all the mental health resources and

(06:37):
the help that kids need are readily available in our
resources and the language is kind, caring and compassionate.

Speaker 1 (06:44):
Julie, there are so many moving parts here. I want to go back to the platforms themselves for just a moment. As much as there are so many resources on the cusp of arriving, and certainly over the next month or so more will continue to land to guide parents through this, the most common question that I get when parents ask me about it, other than do I or do I not favour the changes, and let's be clear, I

(07:07):
really do, is: what apps are included? So you've highlighted, and the Prime Minister has mentioned as well, the major tech companies. But I'm just thinking, and I don't want to get into the weeds too much here, but I'm thinking about things like Discord. Discord is a messaging app, right, but it's very much about social media, and we know that there's a lot of really, really damaging stuff that

(07:27):
happens there. And then I'm thinking about how recently, in some filings that Facebook made in the United States, they argued that they are not a social media company. Now they're, I mean, they're a short-form TV company, and they're saying that they're not a social media company because people are sharing content with strangers, and people are consuming content from strangers, and it's usually all short-form video.

(07:48):
There's only about ten to twenty percent of the interactions that happen on Facebook now that are technically social media interactions. The rest of it's short-form. You mean Facebook or YouTube? Facebook. I read something about Facebook doing that just recently. And so when we've got these organizations, who are, I guess what I'm saying is, it's a constantly moving target.

Speaker 2 (08:12):
These lists will be dynamic, and you've just encapsulated what the challenge has been for me and my team. I don't have specific declaratory powers in this legislation to say who's in and who's out. I have to work with the rules that were tabled, and they're very broad rules. So, as I mentioned, the primary test that we're using now is: what is your sole or primary purpose? Is your

(08:35):
sole or primary purpose social media? Well, of course we're seeing a lot of shape-shifting. So YouTube's been saying, oh no, even though they designate themselves as a social media site for our codes, we're a video-sharing platform; and Pinterest is a visual search engine; and Snap is a camera app, it's really messaging. So I mean, what we're having to

(08:59):
go through and do is do our own testing, look at features and functionality, and then try and apply these tests where there isn't a clear line, and we know that things are going to be dynamic and change. So, to give you an example, I was at OpenAI headquarters last week talking to them about a range of things, because we've got some provisions in our codes around

(09:21):
AI companions and chatbots and preventing kids from accessing porn and explicit violence and suicidal ideation content and the like. They didn't mention once that the following week they were going to be introducing an AI-generated social media app called Sora. Well, so here you go, and now you've got the melding of social media and AI-generated deep

(09:44):
fake videos. So I've written to them and I've sent them the assessment tool. Things are going to be dynamic and are going to change all the time. Let's say you're looking at Roblox. I think most people would agree that online gaming is probably its primary purpose, but there is that chat functionality, and they've just released in the US a feature called Moments, which is like Stories. So features

(10:08):
and functionalities are changing all the time. This is always going to be dynamic. We want to give parents the most clarity that we can, but I do want to say this, Justin. One of the normative changes that this legislation is meant to give parents is to be able to say: this doesn't mean you'll never be on social media, or that you'll never have a smartphone, just not yet,

(10:31):
not until you're ready. And the government is saying that if you're under sixteen, you're not ready. And you don't have to worry about the FOMO, because your friends aren't going to be on this either. So parents do have agency. Let's just say we decide Roblox meets the gaming exemption. You have agency as a parent to say,

(10:53):
you're not going to be on Facebook or TikTok, but I'm not comfortable with you being on Roblox either.

Speaker 1 (10:58):
Yeah, Julie, even as I listen to what you're saying here: Jean Twenge, who I'm sure you would be very familiar with, she's got her new book out. I'm actually speaking with her on Wednesday about her new book, and one of her rules is: you are the parent, like, you are in control. And she's so behind this change as well,

(11:19):
because it allows parents to feel like they can step into that role and say, well, the government has said no, and my job as a parent is to help you to stick with what the laws are, in the same way that I'm not going to let my twelve-year-old drive a car, or go into a pub, or go into a casino, or any of those things. They're now not going to be able to do these things. And it just feels reassuring, it feels validating, it feels so much safer for parents and for families. Just listening

(11:41):
to you talk, though, it sounds like you have such an exhausting role to play. After the break, I want to ask you, Julie, how enforcement is going to work. I mean, we're talking nearly fifty million dollars for each breach, and this is something that a lot of parents are trying to work out: well, hang on, I'm not going to get in trouble, so the onus is off me, but my kids are definitely going to try to be sneaky.

(12:02):
They're upset about this. So that's coming up, plus a conversation about what the government, what the eSafety Commissioner, is doing around pornographic age-gating, and a discussion about nudifying apps, which are a concern in so many schools right now. Stay with us. It's the Happy Families podcast,

(12:25):
real parenting solutions, every day on Australia's most downloaded parenting podcast. My name's Dr Justin Coulson. The eSafety Commissioner, Julie Inman Grant, is joining me now. Julie, as the minimum-age legislation arrival date, December tenth, gets closer, something that a lot of people are talking about is that there's a forty-nine-point-five-million-dollar fine for social media

(12:45):
companies that allow under-sixteens onto the platform, for each breach. My question is: who's policing it, and how well?

Speaker 2 (12:55):
That is my job as the regulator, and I've developed
a compliance and enforcement strategy but let me tell you
the five things, because not everybody's going to be reading
the regulatory guidance, although it is available on our social
media Minimum Age hub at eSafety dot gov dot au.
First of all, we're asking companies to focus on using

(13:17):
age assurance technologies and what we call a waterfall approach
or a layered safety approach, So it's not just going
to be one way of testing age, and it can't
and while they can ask for ID, it can't be
the sole thing, the sole determinant of age because a
lot of people aren't comfortable giving up their ID. So

(13:39):
they are going to have to tell us, before December tenth, how many under-sixteens they have on their platforms. We've already used our transparency powers in February of this year to get a nominal number. I've been pushing the companies, because I think that the number of kids on their platforms is higher. So there are

(14:00):
two point five million eight to fifteen year olds, and
eighty four percent of young people told us last September
that they had at least one social media account. Very
few of them had ever been banned for age related violations.
And in ninety percent of cases, their parents helped them
set up the account.

Speaker 1 (14:21):
So again, I'm shaking my head, like, seriously, parents, what are you doing?

Speaker 2 (14:26):
Well, again, it's the fear of their kids being excluded, so this should take that away. So we've asked them to tell us the number, and then from December tenth we're going to be tracking how many deactivations or removals of under-sixteen accounts they make. We know this is not going to be perfect, and not every company is going to do it the same way or with the

(14:48):
same level of accuracy. So we're also asking them, as the second phase, to make sure that there is discoverable, easy-to-use and responsive user reporting, so that parents and educators can report to the platforms if there's an under-sixteen that's still

(15:08):
on there. The third requirement is that they have an appeals process that is fair, so if they over-block, people can also say: I would like to prove to you that I am sixteen or over; reinstate my account.
The fourth thing we're asking them to do is to
make sure that they're avoiding circumvention, whether location based circumvention

(15:30):
through VPNs or through age based circumvention like spoofing AI
estimation systems, and we go into a lot of specific
technical detail about what we know they can do and
they should be doing, and that the last part really
is about measuring the efficacy of the tools that they're
using and making sure that they're being transparent with the

(15:52):
data so that we can measure success. Now, these hefty fines are only for systemic failures; we expect that a few of these will fall through the cracks. But if we have an indication that these companies don't appear to be applying effective

(16:14):
age-assurance solutions, that's when we'll take action. We'll be issuing some information notices, and we'll be reassessing in June, because things will change. We may be looking at certification; there will be technical standards being developed.

(16:37):
But I'll say one thing, Justin, that this has helped with, and I was just in the US meeting with a number of these companies: age assurance is happening. It's happening everywhere, and companies understand that this is where they need to go. The train has left the station.
Companies can try and ignore it or avoid doing

(16:58):
it for a long time, but the UK is asking for it, Ireland's asking for it, and the European Union, with its twenty-seven member states, is also going to require this. This is where the world needs to go.
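The "waterfall" or layered safety approach the Commissioner describes, where cheaper, less intrusive age signals run first and only inconclusive results escalate to stronger checks, can be sketched roughly as follows. This is a minimal illustrative sketch in Python; every function name, data shape and threshold here is hypothetical, not any platform's actual implementation:

```python
from typing import Callable, Optional

# Each check returns (estimated_age, confidence in [0, 1]),
# or None if this layer cannot produce an estimate at all.
AgeCheck = Callable[[dict], Optional[tuple]]

def behavioral_signals(user: dict):
    # Cheapest layer: login times, language patterns, friend-group ages.
    return user.get("behavioral_estimate")

def facial_age_estimation(user: dict):
    # Stronger but more intrusive; only reached if earlier layers fail.
    return user.get("facial_estimate")

def id_verification(user: dict):
    # ID can be offered, but must not be the sole determinant of age.
    return user.get("id_estimate")

# Ordered from least to most intrusive: the "waterfall".
WATERFALL: list[AgeCheck] = [behavioral_signals, facial_age_estimation, id_verification]
CONFIDENCE_THRESHOLD = 0.9

def assess_age(user: dict) -> str:
    for check in WATERFALL:
        result = check(user)
        if result is None:
            continue  # this layer had nothing to say; escalate
        age, confidence = result
        if confidence >= CONFIDENCE_THRESHOLD:
            return "allow" if age >= 16 else "deny"
    # No layer was confident enough: escalate to a manual/appeals path.
    return "needs_review"
```

In this sketch, a thirteen-year-old confidently identified by behavioral signals alone never reaches the more intrusive facial or ID layers, which is the point of a waterfall: the least invasive check that can answer the question is the one that does.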

Speaker 1 (17:12):
I kind of feel like throwing my hands in the air and having a little party on my own here. I mean, it's wonderful to hear that we're starting this off, and obviously the world will become, well, I think the world will become a better place for our young people as a result. A quick comment on number four of those five points that you highlighted, in terms of that cascading approach to age verification: the fourth one you highlighted was location-based and AI-based checks.

(17:37):
So my fifteen-year-old daughter has been extremely clear that while she doesn't like it, she's happy to comply with it, but none of her friends are going to. They're all researching VPNs, they're all talking about different ways that they can get around this, or they're just saying: my parents are going to set up another account in their name that I can use, and I'm just going to pretend to be them. What I'm hearing you say in relation to that is that, number one,

(18:00):
the social media companies are going to have to be mindful of what's going on from a VPN point of view. For people who are not familiar, a VPN is a virtual private network, which means that you can circumvent any location-based elements of your online activity. And the AI component is that, like, AI can see what people are saying, what they're following, what they're liking, what they're sharing, and make some fairly well-educated guesses as to the age of the person who's using that platform. Is that what you mean, specifically?

Speaker 2 (18:31):
Yeah, I mean, a number of companies have been using inference technologies for some time. So there are different behavioral signals, or even natural language processing, you know, the way that grammar is used, or acronyms or emojis. Thirteen-year-olds generally speak to other thirteen-year-olds. If they can see you're logging in before school

(18:52):
and after school, there is a whole range of signals that they're already picking up, and then they'll be using facial age estimation to keep improving their classifiers. But that's why I put a lot of technical information in the regulatory guidance that basically says: most of these platforms are picking up device IDs, you know, IP ranges and addresses.

(19:15):
They can tell if it's been downloaded from the Australian App Store, so they can see very clearly, through the IP range, when a VPN appears to be being used, and that will need to trigger another age-estimation check. So it won't be as easy as young people think.
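The circumvention signal described here, an Australian App Store install whose traffic suddenly geolocates elsewhere, can be sketched as a simple mismatch check that triggers a fresh age check rather than an outright block. All names and data shapes below are hypothetical, for illustration only:

```python
# Hypothetical sketch of the VPN-mismatch trigger described above.
# A device enrolled via the Australian App Store whose connection
# geolocates outside Australia is a circumvention signal; the
# response is to re-run age assurance, not to block outright.

def vpn_suspected(session: dict) -> bool:
    # Compare the stored app-store region with the current IP country.
    return (
        session.get("app_store_region") == "AU"
        and session.get("ip_country") != "AU"
    )

def on_new_session(session: dict) -> str:
    if vpn_suspected(session):
        return "rerun_age_check"  # trigger another age-estimation check
    return "proceed"
```

So a teen switching on a VPN would simply land back at an age check, which is the behaviour the regulatory guidance asks platforms to implement.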

Speaker 1 (19:37):
I love it. Okay, our time is almost up. I want to do a quick summary, and then I've got two more questions around explicit content, because, I mean, this is one of the most consistent and overwhelmingly horrible topics that comes up. So here's my quick summary of what you've described. Over the next month or thereabouts, the social media companies are going to have to be demonstrating to you what their primary purpose is, and a

(20:00):
list of who's included and who's not included will be finalized in time for December tenth. Children under the age of sixteen will be prevented from creating accounts and from being algorithmically followed. They will be able to access whatever content is publicly available, just via the websites of those platforms. And in terms of enforcement, parents no longer have to

(20:21):
stress about it, because the platforms are responsible for doing the work at a systemic level, and if they're found to have systemic breaches, they're going to be hit for forty-nine and a half million dollars, because they're just not doing it right. And you've got that wonderful five-step cascading approach to different age-verification systems. Is there anything that I've missed that is just vitally important for

(20:41):
parents to understand or is that a pretty good snapshot
of where we are right now.

Speaker 2 (20:45):
I think that's a perfect encapsulation, but I would just add a couple of caveats, and this is related to your next question. Just because a platform is exempted because it's a messaging platform or an online gaming platform, that doesn't mean, as parents, that we stop being vigilant and set and forget. It doesn't mean these platforms are necessarily safer;

(21:07):
these are just exemptions that were written into the legislation. I've heard some people oversell that this is going to stop cyberbullying or image-based abuse. You know, these are actually fundamentally human behaviors that play out online. They play out on social media, but they can play out on messaging and gaming platforms as well. So we'll be watching.

(21:28):
We'll be watching for that too, and we'll have guidance for parents, and we'll continue to have our reporting schemes for cyberbullying and image-based abuse. So I think this will make parenting in the digital age a little bit easier. But, you know, it's like anything. If you compare it to a car: we expect

(21:49):
the car makers to make safer cars and embed seat belts and put in airbags and that sort of thing. They know that there are going to be inexperienced or bad drivers out there. You still have to follow the rules of the road, but you're not going to be penalized if you get into an accident, right? It's the same thing here. It's the platforms themselves that are responsible.
same thing here. It's the platforms themselves that are responsible

(22:11):
will give parents guidance, but you know they can choose
how to parent how they want. If they want their
child who's under sixteen to have a platform that aren't
going to be penalized, and the child themselves won either.

Speaker 1 (22:22):
So when I hear that, the metaphor that I use
is you might move into a safer neighborhood, but you
still want to lock your door. Absolutely, okay. Last two questions,
Commissioner Grant and so grateful for your time. What is
the government doing about explicit content, specifically pornographic age gating?
Where we're making big fuss about social media and rightly so,

(22:46):
but most explicit websites simply require a user, whether they are six or fifty-six, to click the "yes, I'm over eighteen" button, and they're in. That's as far as it goes at this point. It seems to me that if we can do this with social media, we must be able to do this with explicit and pornographic content.

Speaker 2 (23:09):
And we are, and that is through the industry codes
that I've just registered over the past couple of months.
You'll see the first changes, particularly to search engines, at
the end of December this year, and then you'll see
changes up and down the stack. What these codes require

(23:30):
eight different sectors of the industry to do, from the app stores to social media sites to ISPs, search engines and the like, is again a layered safety approach, and each of them will have different requirements. So we know that about fifty-eight percent of young people under the age of thirteen who come across pornography come upon it incidentally, accidentally and in-your-face. That's how they

(23:52):
describe it. Yeah, usually through an innocent search or a browse on the Internet. So our search engine codes, which were developed by industry themselves, but which I decided to register because I thought they contained appropriate community safeguards, will do this from now on: if a search surfaces pornographic

(24:13):
content or explicit violence, like the Charlie Kirk assassination, which has been designated refused classification in Australia, it will blur it, so that will prevent the incidental coming across of that content. If you're an adult, you can click through and watch the adult content. And for people who are seeking out suicide instructional material,

(24:37):
it will refer them to a mental health site rather than taking them to the kind of material that can lead to grievous outcomes. But
we also will be requiring a whole broad range of
platforms to be using age assurance to prevent under-eighteens

(25:00):
from accessing suicidal ideation content, self-harm, disordered eating, explicit violence and pornography. So again, this is going to be up and down the technology stack. It will also apply to AI companions and chatbots, because we know that fifth and sixth graders are spending up to five hours in

(25:21):
what they think are quasi-romantic relationships with AI companions, and they're designed to be emotionally manipulative and to prey on your best instincts and your worst fears for monetization purposes.

Speaker 1 (25:36):
We saw that in Parental Guidance season three, when we got the AI companions to chat with the kids, and they were indeed manipulative; they were absolutely concerning. So that's reassuring. We'll probably get you back next year to have a chat about, number one, how the social media minimum-age legislation is rolling out, but, number two, to talk more about those explicit content and dangerous

(25:56):
content guidelines as they as they roll out as well.
Last question, and you've been so generous with your time
notifying apps there are massive concern in Australia at schools
at the moment are tearing their hair out as kids
get hold of them and then create explicit content based
off even innocuous school book photos of kids, usually girls,

(26:16):
or teachers in their school. What can you offer parents to help here, if they're concerned that their kids might be either a victim of, or a user of, these nudifying apps?

Speaker 2 (26:30):
Well, we share your concerns. We're using our mandatory standards, and we've gone after a couple of nudifying companies, one of which has created at least three very popular apps that have been downloaded or viewed by at least two hundred thousand Australians over the past year. So we're taking action

(26:51):
against them and the app stores that are hosting them. We've just won in court against a man who posted deepfakes of Australian women, you know, a fine of up to three hundred and forty-five thousand dollars, which may not seem big, but for an individual it is, and it is a deterrent. We've also developed a

(27:12):
deepfake image-based abuse incident management toolkit for schools, because, as you say, this is happening at least once a week in schools across the country, and schools haven't known how to deal with it: how to collect the evidence, when to go to the police, when to report to eSafety. And I will note, if there's

(27:33):
intimate images or deepfake intimate imagery of a child, that's likely to be considered child sexual abuse material. We can tackle that, and we will work with law enforcement there, but we have a ninety-eight percent success rate of getting that kind of content down. What people need to understand, and this is where I think we need some cultural change with young people: they might think this is a

(27:54):
bit of a laugh, look what they can do with this virtually free technology. Well, the cost to the victim-survivor is significant. It's humiliating, it's denigrating, and it's incalculable in terms of the kinds of impact it can have on young people. So we need to be preventing this behavior from happening in the first place.

Speaker 1 (28:15):
Julie, you've got a tough gig. I really appreciate the work that you do, and what you strive to do to keep our community, and particularly our young people and our vulnerable people, safe. Thank you for your generous time today.

Speaker 2 (28:29):
Well, we'll try everything we can until it works.

Speaker 1 (28:32):
A big thank you to Julie Inman Grant, the eSafety Commissioner of Australia, for taking time out of an incredibly busy schedule to have a chat with me about all of those things. We could have talked for a lot longer, but hopefully there's enough detail there to put you straight and get you started on conversations with your kids. eSafety dot gov dot au is where you'll find more information, and we will link in the show notes to a bunch of resources that will be useful. In addition,

(28:55):
check out the eSafety dot gov dot au website, because there's a whole lot of webinars happening over the next month or thereabouts to explain even more about what's going on. It would be well worth your time having a look. Again, we'll link to that in the show notes. Okay, a longer one than normal today. Thanks so much for listening. I hope that you've gotten heaps out of it and found it to be helpful and informative.

(29:15):
The Happy Families podcast is produced by Justin Rouillon from Bridge Media. Mim Hammonds provides additional research, admin and other support. And if you'd like to find resources to help your family be happier and function better, visit us at happyfamilies dot com dot au. Tomorrow, we're back answering another one of your tricky questions on the pod.