
April 10, 2023 · 23 mins

Discord is a social chat app designed initially for the gaming community, now extending to a wide variety of interest groups. In this episode, listen to Discord Senior Policy Director Savannah Badalich explain what Discord does to keep users safe. Part of keeping young people safe online is understanding the platforms they use.

Resources:

Discord and the Discord Safety Centre. See eSafety for reporting, and the eSafety Guide for games, apps and social media. For more information about SHV, go to shvic.org.au.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
This podcast is produced and recorded on Wurundjeri land.
It contains discussion about adult topics,
use your judgment if there are little ears around.
Welcome to Doing IT.
This is a podcast made by the Everybody Education team at Sexual Health Victoria.
We run a whole lot of education programs for communities and medical professionals across Victoria.

(00:23):
We also run sexual health clinics in the city and Box Hill in Melbourne.
My name is Anne and I'm part of the SHV Schools and Community team.
We go to schools and run classes for all year levels on bodies,
babies,
growing up, puberty,
sex,
reproduction,
relationships,
consent.
This podcast is for parents and carers of school aged children so we can share what goes on in our relationships and sexuality education class and help support these sorts of conversations at home.

(00:52):
Today
I'll be talking to Savannah Badalich who is the Policy Director at Discord.
Not all parents, carers and teachers I speak to know about Discord, but young people do.
It's a platform for communicating with friends about common interests.
Often it's associated with gaming.

(01:15):
Savannah works towards making sure users are safe.
The eSafety Commissioner often refers to safety by design,
meaning that tech companies should be responsible for user safety.
I'm really interested to ask what sort of things Discord are doing to help keep young people safe.

(01:38):
Savannah, thank you so much for coming in and talking to us about Discord.
Happy to do so.
Thank you for the invitation.
My first question,
can you explain what Discord is?
Yes.
Um I love the little pop quiz at the beginning.
Um So Discord is a free chat video and audio app.
Um think of it as a communication service for communities.
Uh the way that it works is we have these spaces.

(02:01):
Um we call them servers.
It references our old nomenclature when it came to the gaming community,
but you have your own space where you can create different channels dedicated to it.
So think about it as you could have a dog server.
So it could be a whole space dedicated to your love of dogs.
In it you can have separate text channels or audio or video channels.
Um so it could be around,

(02:22):
let's say your own dog pics,
it could be around um uh different breeds,
whatever it might be,
but you can organize it like that.
Folks really use it for the free audio feature.
So the idea is you can just jump into a room and you're already having a conversation.
And so friends will often see each other already live in that room.

(02:42):
-- Joining a phone call. -- Exactly.
An always-on phone call, if you wanted it.
Um and so Discord was really popular and we had created it around the gaming community so that it's easy for you all to listen to each other,
talk, coordinate.
Um so you could play video games together,
but it's expanded,
especially um during the pandemic,
it ended up being used by a lot of educators,

(03:02):
schools,
universities,
a lot of friend groups um who just wanted to create their own sort of private server for themselves.
I have one for my siblings um that I often use, but we still are primarily for the gaming audience.
Uh and so young people,
you'd say that's the bulk of your audience. Yeah.
It's not dog photos,
it's mainly people wanting to game together.

(03:23):
Yes.
So it's,
it's primarily 13 to 24 year olds um with many teens in particular that are interested usually around things like yes,
a game or a hobby um or around their school.
Do you have, off the top of your head, a gender breakdown of that?
I'd say it's predominantly men.
Um and that's because of just the association with gaming.

(03:44):
But we have so many women gamers and then more and more non gamers joining the platform.
So I,
I can't say for certain the the breakdown,
but that's how it started.
Um and we're getting closer and closer to that parity. Right, and why do gamers like it?
Gamers like it because it,
it was free.
Um I'll say that at the very beginning there were other softwares that gamers could use,

(04:07):
but they weren't as secure.
You could actually kind of hack in to one of those conversations pretty easily.
So for Discord,
it's not like you can look up me,
Savannah,
any Savannah that's on Discord.
You have to have my exact name and then a number.
So think of it more like a phone number or a street address.
And so it's very hard to randomly message someone um or to find someone on Discord.

(04:30):
Now,
that security aspect meant that it was easier to have private conversations with gamers.
We also see those gamers started using it for other reasons.
Um There's a lot,
there's a huge trans and non-binary population on Discord.
Uh a lot of youth that are interested in testing out their identity because you can have some pseudonymity there.
And so it started with gamers just wanting more of that security,

(04:53):
wanting it to be easier to have audio conversations while they're gaming or to stream with their friends.
And then it quickly became,
oh my gosh,
it's so much easier for us to create our own friend group.
Um and then I could use a persona and test out a different part of my identity.
So your website provides some information on keeping accounts and servers safe.

(05:14):
Yes,
which sounds like that was a priority for the users.
Um what are the main tips for safety?
Totally.
Um so I would say Discord has a lot of customization.
If you can think of it,
there's probably a setting for it,
which makes our settings a little daunting.
So I'd say the,
the main places where you should go for account level security um are our privacy and safety settings tab.

(05:36):
So you click the little gear in the bottom left hand corner.
It takes you to settings,
privacy and safety.
Um and there's a few different things that you can turn on.
So there's one uh setting that allows Discord to scan any media that's being sent to you in case it might be sexual in nature.
And if it's turned on,
you won't receive media that's sent like that.

(05:56):
Another one -- Does it look for keywords or, -- No,
it scans the actual image and we have a model,
a machine learning model that can detect whether or not it could be sexual in nature.
Yeah.
So you can have that turned on.
You could also have it so that you won't receive any direct messages from anyone.
So you can determine whether or not that means I will only receive DMs, direct messages, from my friends.

(06:20):
It could be friends of friends or it could be anyone um in a in another server that you're also part of.
Uh I recommend turning that to the highest setting just like the other one um for that sort of like sexual media.
Um especially if folks are,
are uncomfortable with their,
with their kids seeing that kind of content.
Um and then lastly,
there are a few other more uh server level settings that can mean that if it's a private server where it's only your friends,

(06:47):
you might want to have your DMs open to any new friends that join that server.
Um or you can set it by server.
If you're in a bigger one that's just dedicated to Fortnite,
you can make it so that you don't receive any direct messages um or friend adds um in any of those servers either. So it can be at the account level or the server level.
I always recommend that parents um walk through their settings with their,

(07:11):
with their kids um and set them to the most uh sort of the strongest of all the settings.
And I'm just thinking as a parent,
like they wouldn't necessarily know what questions to ask -- Yep -- or even how the platform was going to be used.
So what like,

(07:31):
how would they approach that conversation?
Like,
would they ask,
what do you want to use Discord for -- Yeah -- and go from there?
I love this question.
Um we actually have on our safety center which is Discord dot com slash safety.
A whole section dedicated to parents having these kind of conversations.
Teens know a lot about technology,
trust them to show you.
I would start it off with a with a tutorial.

(07:54):
Ask them what kind of communities are you in?
Like,
where are you hanging out?
What are they around?
You'll learn a lot about the hobbies that your,
your teens or young adults in your life are part of,
um,
why they're part of it.
Maybe ask for a little tutorial on how to join a server.
Um and then ask them like,
how have you set your safety settings?

(08:15):
Have you set them?
Have you received any messages from people that you don't want?
Did you know that you can prevent that from happening?
So I would just uh open it up as a conversation to get to know who they are in these different spaces because depending on their settings,
depending on what the customization,
they could be a different persona in each one.

(08:36):
And we love that kind of creativity.
Um so I have my,
for my family server,
I have a particular silly name and then I have another one for,
you know,
more professional.
My,
my work servers,
I promise you that uh the teens in your life are probably doing the same thing and testing out different parts of themselves.
So I would explore as you would,
anything that they're interested in.

(08:57):
Are there communities where people want to exchange sexual pictures and like,
is that part of Discord as well?
Like if you were,
if you were wanting that and you're a grown up?
So we do allow for that kind of content on Discord.
Um It is gated to those that are above the age of 18. Um so when someone signs up and tells us their age that locks them into certain types of experiences for their own safety.

(09:21):
Um,
but if there are adults that are interested in this kind of content,
the sexual content,
adult content,
which can also include other things like gambling conversations,
it could be conversations around hobbyist communities,
like interests around weapons,
whatever it might be,
all of that content we gate
for those that are above 18,
for the safety of teens.

(09:43):
So if you are above those ages,
yes,
you could have those conversations and find it.
We don't often make it easy to find it in,
in the,
in the product.
You have to know sort of like have other friends who are sharing those um those spaces.
But otherwise,
yeah,
if you're above the age of 18 and you want to engage in those communities,
you can.
Uh,
so what sort of things are reported as issues on Discord and how does Discord respond to them? For any message that you see on Discord,

(10:10):
If you hover over the message,
there will be three little dots that show up.
If you click that you can report it.
It gives you a few different options,
um I,
I highly recommend folks,
uh,
read the community guidelines.
It's pretty uh obvious the kind of content that wouldn't be allowed.
Uh no harassment,
hate speech,
Um anything that would be sexual harassment,

(10:31):
non-consensual
intimate image sharing,
terrorist content,
whatever it might be, it is against our rules.
And I highly recommend folks uh read through those community guidelines.
Um but when you click through,
it'll ask you what you're reporting.
Um and whether or not it fits into one of those sort of categories,
uh,
once you report it our trust and safety team,
which is,

(10:52):
um,
we,
we have a,
a huge safety function in Discord.
It's roughly 15% of our whole employee base.
Um and they're doing a lot of different things uh that can be investigation.
So that's our trust and safety team.
Um It could be other types of machine learning or,
you know,
automated ways of finding this content. But just reacting to a report,

(11:12):
it'll go to either a general um full time employee or,
or someone else that,
that is one of our front line workers that will look at the content and uh do some sort of enforcement action.
If it's against our rules,
we remove it.
If it's severe,
we remove the account and remove the server.
Um we have specialized teams dedicated to what we view as the highest harm content that being things like child safety issues,

(11:40):
anti-violent extremism,
uh cyber crime and other sort of mental health issues like self harm.
So we do have specialized training for those trust and safety agents.
They do in-depth onboarding to get to that point.
And so they take special care with those kind of reports.
But we don't just remove content in reaction to a user report.

(12:02):
We have a few other ways that we find this kind of content.
We also have trusted flagger organizations, like potentially,
you know,
Sexual Health
Victoria or um the eSafety Commissioner.
Uh we have direct lines of communications where they will send us content that they think might be in violation of our community guidelines or a local law and we remove it.

(12:22):
Um they have immediate response,
right.
We also predominantly remove most of the content and accounts and servers that are in violation of our rules,
we find proactively,
we don't need a user report to find it.
We use different methods that could be, you were mentioning kind of earlier, looking at some keywords,
flagging through machine learning,

(12:43):
we look at all media that's uploaded to Discord for anything that might be child sexual abuse materials.
We have a lot of different proactive methods,
but a majority of the content that is harmful,
we remove, before we ever get a user report because we don't want to put that burden on a user to have to find it or flag it first.
When we talked with young people about reporting in particular,

(13:07):
it was one thing that they would say,
yeah,
I know how to do it.
But why would you?
Because the reporter doesn't always hear back about the outcome. -- Yep --
Um how would you encourage reporting for young people?
Yeah,
there's other research too that we've seen from Thorn,
which is an international child safety organization, that if they don't report,
and often don't tell a trusted adult,

(13:29):
they'll also block or mute.
And so we take all of those different methods really seriously and make sure to make it as prominent as possible in the product. When we receive a report,
we give that extra care,
especially when it has to deal with teens or teens are reporting it.
So I would just say please report it.
Um hopefully we found it before you even had to.

(13:51):
But if we didn't, report it so that we can take a look at that and I promise you,
there's probably other things around that content that you're reporting that uh a trust and safety agent will want to see so they can remove it and make Discord a a healthy place where you feel like you can participate.
And it sounds like from what you're saying that that conversation at home about consent is really important -- Yes -- that young people don't have to put up with things that they don't want to see or don't consent to,

(14:16):
especially when it comes to sexual content.
Exactly right.
Consent is key here.
And I think having those conversations early and often with teens reinforcing it is really important.
I just want them to know that they have a lot of control over their experience.
It's on platforms like ours to help educate and making sure that folks understand how to set those settings.

(14:37):
But you do have tools at your disposal to make your experience better and where you are seeing unhealthy things,
please report it so we can remove those people in those communities.
So interesting that word 'control' -- Yeah -- because I think parents and carers don't feel in control of this.
So that's a good starting point for a conversation.
Um what complications of regulation are there when users can be anywhere in the world?

(15:00):
Regulators are doing their best to protect their citizens,
their residents.
And so I just want to say that we really encourage these conversations and higher standards for safety.
Um I think that the trouble comes when there are uh conflicting regulations,
there's certain expectations in one region that might conflict with another.
It makes it harder,

(15:21):
but it's not impossible for us to,
to make sure that we're doing the best for our users.
I'll say that with the eSafety Commissioner here,
there are really higher standards for certain types of abusive content.
And because of those higher standards,
we, in Australia,
we're able to take it more seriously and make changes to our overall policies that apply globally.

(15:43):
And this happens in Europe as well.
So that change has a global ripple effect.
But it can be hard for,
for companies to make sure that we're able to address it at a global level or even if there's some local changes in different regulations,
it can be difficult.
It's not impossible though,
I don't want to say that just because it's hard,
there shouldn't be those regulations.

(16:05):
Uh,
what sort of things could parents or carers talk about with their children before they connect on Discord?
It's a great question.
I think it should be conversations around the internet generally.
Um,
I'll just say that the online
and offline divide doesn't exist for teens.
There's no such thing as online versus offline there.
It's hybrid at all times.

(16:27):
So the people that they're talking with at school,
they often talk with online who then introduce them to people that they might be gaming with,
who might be uh across the world.
Um but they don't think of it as,
as that differentiation.
They're just thinking about conversations in different spaces.
And so the same conversation you would have or you're talking to someone about,

(16:47):
you know,
uh respecting boundaries,
consent,
um talking about uh you know,
how to interact with strangers in person or online.
All of those things should be together.
Um and so, I would just say talk early and often about respecting boundaries,
identity,

(17:08):
who they are, um best practices around interacting with anyone.
All of those should be happening before they even get onto Discord. Once they're on Discord,
Uh I,
I think that that's where we can really do more and,
and have these conversations and we often have this at the,
at the local server community level.

(17:29):
Um so I,
one thing I didn't mention about before is that these,
these spaces are controlled by the users who create them.
So the people who create them can moderate them as they like.
So the example about the dog server earlier,
um you can have a rule to say that you don't want to talk about cats, and you can remove content within that space that talks about cats.

(17:50):
Uh we see a lot of communities that want to share conversation but want to keep it to either a particular topic or don't want certain types of conversation in their space period.
Often we see like healthy norms being showcased by these uh server owners,
the people -- It's adjudicated by the people in the space. -- creating the space.

(18:10):
Yeah,
it's not only their responsibility to find that content that they don't want to see.
That's,
that's a lot on us as Discord,
but there is some sort of healthy norm sharing that happens within those spaces that I think are really cool.
Um and so we,
we try to equip moderators,
that's what we often call those people in those spaces, with the tools they,
they can use to have those conversations too.

(18:33):
So you're saying you can see, like this might sound like a stupid question,
but you can see the stuff on Discord,
like you're saying it's,
it's private.
Like if,
if you're a user,
you can't just hack into someone's -- Nope. -- account,
but you people at Discord can see everything that goes on.
Like you can actually see those conversations? No.
So uh one of our biggest commitments, alongside safety, is privacy.

(18:57):
So a lot of our trust and safety efforts are privacy preserving.
There are spaces on Discord like let's say a,
a direct message or a group direct message of,
you know,
1 to 10 people, or what we call servers that are
more private in nature,
um that can be under a certain number.
We are not constantly looking at message content,

(19:20):
that's not what we do.
And we actually do a lot of our proactive efforts to find bad content using other signals,
things that aren't message content because we want to preserve the privacy of those conversations.
So I would say no,
we can't see everything but when a user reports we can and when we have some other signal,
something that tells us,
hey,

(19:40):
there might be something dangerous or harmful in the space,
then we do look at that content.
But we're not constantly just reading through your messages.
The other thing that we,
we haven't mentioned so much is is strangers.
And I know that's a big thing that parents and carers are worried about.
Um yes,
absolutely,
we want our young people to engage socially and online is how they do that, you can engage with strangers through Discord,

(20:04):
Uh,
if you're playing a game or you're in that room where you've been invited and there's people you don't know in there.
Are there particular safety concerns around that?
Yes.
I mean,
with any sort of messaging someone that you don't know,
um,
or having a conversation with someone you don't know,
you don't know their intentions or their motives,
or anything else.
That's why those,
you know,
account level settings and also server level settings are so important where you can prevent people from randomly messaging you.

(20:31):
But within those spaces,
I think with the smaller ones,
it's usually you and a couple of friends and the other person can vouch for them.
Um in the bigger servers, which are not
all that common on Discord,
there might be like a Fortnite server or a Minecraft server or whatever it might be.
Um for those ones, definitely have conversations with teens:

(20:52):
hey,
if someone randomly messages you or is trying to ask if you can open up your DMs so that they can message you,
be wary,
make sure that you're having more conversations and,
and building trust before doing that,
know that you can always report,
you also have the option to block them or mute them on the on the platform.

(21:12):
But with any sort of interactions between strangers,
there can definitely be potential for harm and there are different ways to mitigate that.
And that's,
that's where settings can really help you.
Awesome.
Thank you so much.
Yeah.
It's so interesting,
It's given me a lot to think about and discuss.
I'm so glad, yeah, and you all can find anything else around safety at Discord by going to Discord dot com slash safety.

(21:37):
And I'm just,
I'm happy that we could have this conversation.
Thank you. Great.
Thank you so much to Savannah.
A few things that really stood out for me in this discussion are, talk to your young person about what sort of things they want to do on the platform.
This could guide the customization of the platform and make it safer. Be an askable, tellable adult. A massive barrier to reporting viewing sexual content online is that young people don't think their parents would understand, or that they would get them in trouble.

(22:16):
Discord has real people looking at reports that come through and multiple mechanisms to detect patterns of behavior.
This could lead to detecting criminal behavior such as grooming.
Once again the conversation has come back to consent,
no one should have to engage with sexual content if they don't want to.

(22:37):
It's not any different online.
I'll link to some resources in the show notes,
I'll link directly to Discord and Discord Safety Center.
You can see eSafety for reporting options and eSafety guide,
which talks through different apps and games and social media and gives a bit of a summary of each one.

(22:58):
For more information about Sexual Health
Victoria go to S H Vic dot org dot A U.
You can follow us on Instagram,
Facebook or Twitter,
contact us directly at doing it at S H vic dot org dot A U,
subscribe to the podcast um like it if you like it.
Thank you so much for listening.