Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:08):
Welcome to Audioarchiv, the channel for historical interviews with female writers, philosophers,
activists, and intellectuals from all over the world.
(00:34):
Hello. Imagine you are leaving the house.
Suddenly, someone approaches you on the street, insults you in the worst possible way, calling
you an idiot, a paedophile, or a left-wing terrorist.
He calls his mates over, who spread the lies everywhere.
(00:54):
Now all sorts of people begin to insult you in the worst possible way.
Every time you go out.
Not only do your family and friends become afraid, but your employer confronts you, your bank
wants to close your account, and your children are bullied. What would you do?
Would you go to the police and file a report?
(01:17):
You would probably defend yourself in any case, as your sense of justice tells you that public defamation is punishable.
Section 187 of the Penal Code states (01:25):
"Whoever, knowing better, asserts or disseminates an untrue
fact concerning another person, which is suitable to make them contemptible or to demean them
in public opinion or to endanger their credit, shall be punished with imprisonment of up to
(01:50):
two years or with a fine, and if the act is committed publicly in a gathering or by disseminating
content (Section 11, Paragraph 3), with imprisonment of up to five years or with a fine." So far, so good. That is the law.
But what if someone denounces you on the internet with the same insults, threats, and lies? Do you defend yourself?
(02:18):
Do you file a report? Do you demand compensation? Probably not.
But what is different in the digital realm compared to the analogue?
Nothing! says Anna-Lena von Hodenberg in our interview.
Because, says the co-founder of HateAid, an NGO that advises and supports victims of digital
(02:39):
violence (links, as always, in the show notes), what applies in the analogue world also
applies in the digital world:
human rights, the Basic Law, the Penal Code, and so on.
But why do most of us think that the internet is a lawless space?
Perhaps because we were overwhelmed by the digital revolution in the last decade.
(03:04):
Perhaps because we silently accept that Facebook, X, YouTube, and Instagram, to name just a few,
have mutated into playgrounds for far-right conspiracy theorists and hate speakers, promoted
and encouraged by the Trumps and Musks of this world.
Their goal is clear (03:24):
intimidation of minorities, queer communities, women, migrants, and democrats through emotional manipulation,
and extracting as much profit as possible.
Just like in real life.
How we as individuals can resist this is explained by Anna-Lena von Hodenberg from HateAid in the interview.
(03:47):
It might also be good to consider nationalising the companies' hardware, from undersea cables
to server farms, placing it under public control, or forcing them to disclose their algorithms.
Another possibility would be to hold the operators of the platforms, the CEOs, owners, and supervisory
(04:08):
boards personally liable for any violations of human rights on their platforms.
The conversation is conducted by Berlin journalist Nadja Luer.
My name is Anna-Lena von Hodenberg.
I co-founded HateAid five and a half years ago and serve as its founding managing director.
HateAid is the first nationwide advisory service for people who experience digital violence.
(04:31):
We offer emotionally stabilising initial consultations, safety advice, communication advice,
and we also encourage people to file reports and stand up for themselves.
And now, after these five years, we have evolved from an advisory service for victims into a human rights organisation.
So we are now developing proposals for European and national policy, legislative proposals,
(04:52):
regulatory proposals, also concerning the large platforms. On the one hand, we are still involved in victim
support, advising individual victims.
At the same time, we are looking systemically at how this digital space, these social media,
should actually be structured so that they do not harm democracy but strengthen it when we engage with them.
(05:13):
So public service broadcasting is also an institution.
We are also increasingly experiencing that journalists are being hindered in their daily work,
attacked, and are afraid to tackle certain topics because they are immediately, so to speak, punished in social media.
What would you advise us on how we can better organise ourselves?
(05:34):
What is important to say again is that I myself am a former journalist, and the issue of female
journalists being put under pressure by digital violence, by hate, but also by analogue violence, is a huge topic.
And it is also no coincidence that journalists are under pressure.
It is actually about female journalists who, as studies have now shown, no longer want to report on
certain contentious topics, such as climate, migration, right-wing extremism, the AfD, and feminism,
(05:56):
because they know what happens to them afterwards.
And they are massively and personally put under pressure not only in the digital space but
also in the analogue space, because we are not one digital person who logs off and becomes
an analogue person; whoever spreads hate in the digital realm is also aggressive in the analogue
(06:19):
realm, and whoever faces hate in the digital realm will also encounter it at the next demonstration, for example.
That is why Germany has been downgraded repeatedly by Reporters Without Borders in recent years.
If we look at who is actually being attacked, these are, so to speak, the pillars of our democracy.
And female journalists, the fourth estate in the state, are one
(06:42):
of the pillars of our democracy, which is being repeatedly undermined; this tree is being sawn at
again and again, until eventually everything collapses in on itself.
And yes, female journalists can come to us for advice, they should, but it is not really about
the individual female journalists, but about what they represent in a democracy.
(07:05):
And that is what is meant to be damaged.
And that means it should not be the individual who has to defend themselves; the institution,
which is what is actually being attacked, should take responsibility and look after its female journalists.
And that is why it is important, first of all, that there is awareness in the institutions.
When one of us is attacked, it actually means that we are all being attacked and institutions
(07:29):
should also prepare internally to stand up for their journalists.
This is an important question because we currently feel that we are still somewhat defending
ourselves from a passive position, that we send security personnel with our journalists, for
example, when they go to demonstrations.
However, I feel that this is still very sporadic and that we remain very reactive in such hate situations.
(07:56):
So how can we institutionalise this even more?
Anyone working in journalism today cannot avoid this topic.
This means it must actually become part of the training.
How can I protect myself preventively?
We know from counselling that an attack one is prepared for, where one knows what to do and
what can happen,
(08:18):
is an attack that may still happen, but it does not hit as hard as one that comes completely
out of nowhere, like a knife in the back.
This means the first thing is to provide education in prevention.
How can I configure my data and privacy settings?
How can I keep as little personal data as possible in the digital space?
So really, the individual needs to be well informed, and it should be clear in a crisis,
(08:42):
during an attack, who is actually taking care of things: who takes over my private account, which is already
filling with hate comments. Because it is always the private account that is attacked, in the
hope that the employer or the media institution will say, well, this is your private account,
we won't deal with it.
One is always attacked personally. That is a calculation.
(09:04):
We see this not only in journalism, but also, for example, with female scientists.
We have seen a significant increase in attacks on scientists who are in institutions that have
researched, for example, Corona, or who have researched the Russian war of aggression against Ukraine.
And it is exactly the same pattern.
The institution is never attacked.
It is always the person who is attacked personally, in order to isolate them and to try to prevent
(09:25):
solidarity. So institutions need an awareness that, okay, when something like this happens: who takes
over the accounts, who files the criminal complaints, who pays if lawyers need to be
involved, who provides psychological support, who perhaps talks
to the family, and is there a press release in which the media institution, and not the individual person,
(09:47):
clearly comments on this.
Especially in cases of defamation, when false claims are made online, it is quite clear that
the communication does not rest on the individual person, but rather that as much responsibility
as possible is taken away from the affected person, so they can continue their work.
What I believe is still not entirely clear:
we often think, or most people think, that the internet is a lawless space. Is it?
(10:11):
Or how are the laws structured, so that it is clear that this is not the case and one can also
be punished for what one does on the internet?
Thank goodness the internet is not a lawless space, although some think so and behave accordingly.
I believe it is also very important to convey this to your journalists.
(10:32):
They are often very resilient.
And part of their professional self-image is that one simply has to endure a lot of criticism and a lot of hate and so on.
And that is totally counterproductive.
We see this in our consultations as well.
They only come very late, when really very serious things have already happened.
But it starts with the small things.
And actually, if one reacts to that, and we will also come to what is actually illegal in the
(10:54):
digital space and what is not,
if one reacts early, then one can usually prevent worse.
So in principle, everything that is criminally relevant in the analogue world is also relevant in the digital space.
No significant difference is made.
There are some criminal offences like threats of rape, which do not happen so often in the analogue
world, but suddenly occur massively in the digital space.
(11:17):
This has also been tightened up in recent years.
But insults, defamation, threats, death threats, threats of rape, these are all things that
are illegal, that can be reported, and against which one should defend oneself.
It is not brave to not defend oneself.
It is rather brave to defend oneself.
Firstly, to show perpetrators red lines.
(11:39):
This is the issue we have in this digital space.
Are there actually red lines or not?
And to also hold the judiciary accountable, so to speak, to carry out prosecution here, because
that can only happen if we report and defend ourselves.
And also to show that we are defending ourselves, that we do not accept this, and to show the
perpetrators relatively early on:
(11:59):
here is a red line for us. Because we also see that if the insult goes unanswered, the
threat comes next, perhaps the threat of rape, then the death threat, then someone finds
out where the person lives, then someone might drop by there, and so on.
So the more one allows, the fewer boundaries one draws, and the longer one does
not address this, the more violent it becomes.
(12:22):
And at the same time, for all the others who are watching, and these are also your viewers, it sends a signal:
it is apparently acceptable to treat our journalists this way.
That means it is apparently also acceptable to treat each other this way in this society,
because public service broadcasting also has this role model function.
And I see this as something very important, that public service broadcasting also very clearly
(12:42):
says no, this is not acceptable, and we stand up for them, and this is not a way of treating
people that we will tolerate.
And we use the means and avenues available to us in this constitutional state to protect our journalists.
Where actually is the threshold at which these online-amplified hate messages turn someone into a perpetrator, into an attack in reality?
(13:03):
Yes, there is a technical term for that: stochastic terrorism.
It means that calls for violence against a person, coming from many, and not only the
calls but also those who tolerate them and simply do not contradict, who must be said to be indirectly involved,
create a mood in which at some point a person, who perhaps
(13:25):
has never written anything in the digital space, feels: I actually have to act now. They think,
ah, look, there are so many who are shouting, so it must be alright, something has to be done now,
that person must be attacked now.
That is what everyone is waiting for now.
I have an entire community behind me urging me to do this, and in the end they will also celebrate it when I do.
(13:48):
We have seen examples of this in Halle, with the perpetrator who tried to force his way into the
synagogue there and ultimately killed two people.
He streamed it for an audience that had long been calling for violence, and he thought that
what he was doing would be celebrated.
(14:11):
We have seen this with other terrorist attackers as well, for example, with the perpetrator
who murdered Walter Lübcke outside his house, who was also the one who posted the video of Walter
Lübcke at that community meeting where he defended the right to asylum.
Under this video, which was shared thousands of times, tens of thousands, if not hundreds
(14:34):
of thousands, of hate comments calling for violence against Mr. Lübcke could be found.
And after the perpetrator went and shot Mr. Lübcke, one could read the community's jubilation there.
Does that mean it takes 500 hate comments and then someone goes out and becomes violently active,
that there is then a terrorist attack on a person?
(14:56):
It may also be that no one feels called to do anything.
But the more vehemently violence is called for and the more people read it,
and that is what the digital space does, the dissemination is simply so broad,
the more violence-prone individuals read it, the more it may trigger their own radicalisation,
and the more likely it is that an attack will occur.
(15:19):
And now I have only spoken about terrorist attacks.
That is, so to speak, the most extreme thing we can imagine.
Less extreme is that journalists suddenly face massive hostility when they report at demonstrations,
as we have experienced for ten years now, at the Pegida demonstrations or during the Corona demonstrations,
where people were indeed no longer safe.
(15:40):
And this is, so to speak, a violent, aggressive atmosphere, where people have charged themselves
up digitally beforehand and then go to this demonstration in real life, and there this violence
also manifests itself in physical violence.
If we look at Reporters Without Borders, the situation in Germany for journalists is no longer
(16:02):
good regarding press freedom, but only satisfactory.
It is not just that in an autocratic system, like in Turkey or Hungary, journalists are
simply restricted in how they report; here among us there is a public atmosphere
that is frightening for individual journalists, and that in a democracy.
(16:25):
So press freedom is restricted, and we slide down the rankings, not because the
system promotes it, but because a mass of people online contributes to this within a democracy,
so in a sense the democracy itself enables it.
And I ask how intentional and instrumentalised this is by certain organisations and groups,
(16:51):
or how much such things simply become self-sustaining because the internet is the way it is.
Exactly, in an autocratic system, there is simply no regulatory gap.
The regulation is relatively clear there.
There is an autocrat, and there is censorship, and journalists are put under pressure or end
(17:11):
up in prison, and so on.
You can read all about it at Reporters Without Borders.
And in a democratic system like ours, we have a law enforcement gap and a regulatory gap.
Two things that enable this.
This digital space has simply happened to us.
It has simply happened to us, and suddenly ten years later, in the blink of an eye, we wake
(17:32):
up in the morning with our mobile phone and go to bed in the evening with our mobile phone,
and everything is on our smartphone.
We read newspapers there, we discuss public issues, we fall in love there, we book our travels, we look up directions.
It is a communication revolution that has happened to us.
And we have simply gone along with it and for a long time did not ask ourselves what this actually
(17:57):
means for our society, and we have left the design of these new spaces, which now make up the
main part of our lives, to private platforms whose only goal is to make a profit.
It is completely clear, there is no question about it.
The problem is that these are now our public spaces, which are essentially oriented towards profit.
(18:19):
And what we have missed is, on the one hand, that our institutions, especially our law enforcement
agencies, were not present in these spaces at all, and did not take these spaces seriously at the beginning.
Even now we still speak with judges or prosecutors who say, well, this only happened on the
internet; or at police stations victims are sent back home and told, well, a
(18:41):
threat you receive on the internet is not a concrete threat.
This means that a distinction is still being made.
The digital space, despite being so important for all our lives, is completely trivialised.
So that is one problem. You asked me what is actually happening with the law in the digital space:
it applies, but it is not enforced.
(19:03):
It is not consistently enforced because we have not digitalised our institutions in such a way
that they can operate at the same speed at which unlawful things happen in this digital space.
If I can receive insults or threats within an hour, but each
(19:23):
one still has to be recorded manually at a public prosecutor's office, then that is a problem; I cannot keep up at all.
This is essentially one area, namely law enforcement.
And you asked whether democracy actually enables this?
No, it does not enable this.
Actually, we have a resilient rule of law, which has simply been asleep, one must say.
There are indeed these positive examples of people who spread hate online and cross boundaries,
(19:50):
legal boundaries, also being arrested.
Can't we orient ourselves a bit towards that?
There are positive examples and we would not be sitting here today if I did not believe that change can indeed happen.
And things have also been moving in recent years, but primarily under pressure from
civil society, including organisations like ours, which conduct and finance these proceedings
(20:13):
and also hold the public and the law enforcement agencies accountable, really with public pressure,
so that a cultural change slowly comes about.
When I started five years ago, there was a single specialised public prosecutor's office in a German federal state.
Meanwhile, only four federal states still lack specialised units for digital violence.
For example, we work together with the specialised public prosecutor's office in Hesse, the ZIT.
(20:37):
And it works like this: through the cases, which we also report, the investigators regularly
go out, ring the doorbell at 7 am, and seize entire smartphones and computers.
Exactly, they may also take the person in first and then examine everything, and then there can really
(20:57):
be significant prison sentences following.
We also carry out civil law enforcement and can also assert claims for damages, the highest
we have had so far.
This means it can lead to prison sentences and it can become really, really expensive if this happens in civil proceedings.
This means our law allows this and there is currently a cultural change happening to enforce it.
(21:23):
But we simply need to become much, much faster.
We probably also need to report on it when something like this happens.
So when the public prosecutor actually takes action or the police do, then we have the task
of showing that so that people understand, okay, such things can be punished, even with hefty fines.
So these are the effective mechanisms that probably deter people from doing such things in the future. Absolutely.
(21:47):
And you are also knocking on open doors with me.
A lot of what we do, we also try to bring into the public eye.
And we also work with prominent individuals and influencers on social networks who take a stand with us,
to show others: look, I defended myself, you can do it too.
That is absolutely no problem.
And I also see a total responsibility of public broadcasting here.
(22:08):
Every year there is also a BKA action day, where perpetrators really are targeted throughout Germany.
Exactly, where there are also these searches of perpetrators' homes, and they are very, very successful.
And that, for example, is where there is often too little coverage; there is usually just a brief news item, but actually
these should be reports where people can really see, okay, this is what happens when I exercise violence online.
(22:32):
You had already described very well earlier the important role of public broadcasting and what is important there.
So what more should we do? Perhaps come together more, perhaps even run some videos with
prominent people; that just comes to my mind spontaneously.
There are three things that public broadcasting should do in my opinion.
Firstly, we have already talked about this, support for journalists, both preventively and acutely,
(22:56):
and without ifs and buts.
Exactly that must happen all over Germany.
And this must also happen for freelance journalists, because they are, so to speak, the most vulnerable. Firstly that.
Secondly, there needs to be more reporting on law enforcement, and it also needs to be demanded.
And public broadcasting must also take the lead.
Exactly, when journalists are attacked, then also report it, initiate civil proceedings,
(23:18):
and really show, so to speak: we defend ourselves too, and you can do that as well. That is the second.
And third, we need to look at the platforms, because that is the third point.
We have talked about cultural change, but not yet about regulation in a democracy.
The platforms make money from hate; they earn money from it.
The algorithm that serves us content has no morals.
(23:38):
It does not care whether you stay on the platform because you are watching a cat video or because
you are watching a violent video; it simply wants you to stay on the platform for a long time
so that it can eventually serve you tailored ads and make money from you.
If you are not paying, you are always the product on these platforms.
And what unfortunately goes viral the most, what is the most emotional, are indeed things that
(24:04):
polarise, that outrage, that incite anger, that stir up hatred.
That is what draws people in, what people share. That is no coincidence.
We see that right-wing and far-right groups have now specialised in igniting exactly these fires in social networks.
So, massively attacking a specific person.
(24:25):
It starts, say, in a Telegram group, where the name of the person is mentioned, and then,
so to speak, with thousands of accounts they simultaneously go to that person's profile
and attack them, try to find out private data about the person, publish it again on the
internet, and hope that perhaps someone will come by and take a photo of the house, to increase the pressure even further.
(24:46):
And others who are then online, who may have had a bad day, may also be a bit racist or misogynistic
or whatever, join in, so to speak, and the algorithm notices, ah, a lot is happening here and
this seems to interest many people, so I will show this to even more people in the newsfeed.
And that is, so to speak, a huge problem that is hardly known at all.
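To make the mechanism described here concrete, a minimal sketch of an engagement-only feed ranker follows; the field names and weights are invented for illustration, and no real platform's ranking code is public, so this is an assumption about the general shape of such systems, not any company's actual algorithm.

```python
# Minimal, hypothetical sketch of an engagement-only feed ranker: the score
# rewards whatever keeps a user on the platform (predicted dwell time,
# comments, shares) and contains no term for whether the content is hateful.
# Names, fields, and weights are invented for illustration.

from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    predicted_dwell_seconds: float   # how long the user is expected to watch/read
    predicted_interactions: float    # expected comments, shares, reactions
    flagged_as_hate: bool            # known to moderation, but unused in ranking


def engagement_score(post: Post) -> float:
    # Only engagement counts; the hate flag plays no role in the score.
    return 0.7 * post.predicted_dwell_seconds + 0.3 * post.predicted_interactions


def rank_feed(candidates: list[Post]) -> list[Post]:
    # A coordinated pile-on generates thousands of comments, so its predicted
    # interaction value is high and it rises to the top of the feed.
    return sorted(candidates, key=engagement_score, reverse=True)


if __name__ == "__main__":
    feed = rank_feed([
        Post("cat_video", predicted_dwell_seconds=40.0,
             predicted_interactions=2.0, flagged_as_hate=False),
        Post("coordinated_pile_on", predicted_dwell_seconds=55.0,
             predicted_interactions=30.0, flagged_as_hate=True),
    ])
    print([p.post_id for p in feed])  # ['coordinated_pile_on', 'cat_video']
```

The point of the sketch is simply that a coordinated attack produces exactly the signals such a score rewards, so it gets amplified rather than dampened.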
(25:08):
This means that what public broadcasting actually needs to do is proper digital journalism
that explains to its viewers:
How does it actually work online?
Who is actually organising themselves there?
How do these algorithms actually work?
If I see that a person is receiving hate comments, it might be that they are only from 1000
people, all of whom have fake accounts.
(25:29):
All of this can be discovered with good data journalism and monitoring.
At the moment, civil society is still doing this and providing it to journalists.
For example, we also do this, CMAS does this, the Institute for Strategic Dialogue.
But I actually see this as a task of public broadcasting, really as an educational mission:
to inform people, who only see things from the outside, about what it looks like behind the scenes in social
(25:55):
networks, so that as informed citizens they do not fall into all these traps,
can assess things much better, and are not so influenced by these moods.
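As an illustration of the kind of monitoring mentioned above, checking whether a wave of hate comments comes from a broad public or from a small, coordinated cluster of accounts, here is a rough sketch; the field names and thresholds are assumptions made for illustration, not a real platform API or any named organisation's actual tooling.

```python
# Rough sketch of coordinated-pile-on monitoring: does a wave of hate comments
# come from a broad public, or from a narrow, bursty cluster of mostly new
# accounts? Field names and thresholds are illustrative assumptions only.

from collections import Counter
from datetime import datetime, timedelta


def looks_coordinated(comments: list[dict],
                      max_unique_authors: int = 1000,
                      burst_window: timedelta = timedelta(hours=1),
                      min_share_new_accounts: float = 0.5) -> bool:
    """Flag a comment wave that is narrow, bursty, and dominated by young accounts."""
    if not comments:
        return False

    authors = Counter(c["author_id"] for c in comments)
    timestamps = sorted(c["created_at"] for c in comments)
    new_accounts = sum(1 for c in comments if c["account_age_days"] < 30)

    few_authors = len(authors) <= max_unique_authors
    bursty = (timestamps[-1] - timestamps[0]) <= burst_window
    mostly_new = new_accounts / len(comments) >= min_share_new_accounts
    return few_authors and bursty and mostly_new


if __name__ == "__main__":
    start = datetime(2024, 1, 1, 12, 0)
    wave = [{"author_id": f"acct_{i % 300}",          # only ~300 distinct accounts
             "created_at": start + timedelta(seconds=i),
             "account_age_days": 5}                   # all accounts a few days old
            for i in range(3000)]
    print(looks_coordinated(wave))  # True: narrow, bursty, mostly new accounts
```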
But one does get the feeling that countering the emotional with rationality is a bit like the task of Sisyphus.
(26:17):
It is probably the only chance. Yes, it is time-consuming.
But you can still report them.
And that is not the only possibility.
We have only been talking about content being removed from the platforms.
You can also still report perpetrators, and you can demand in civil proceedings that things
be taken down, first that the items are deleted and no longer disseminated, and then you can also demand compensation. That option always remains open to you.
(26:40):
Journalists can always contact us.
We always support them in this.
This means that even if this does not work, there is still this possibility. That is not excluded.
But the second thing that is really interesting about the Digital Services Act is that it
tries not just to extinguish individual fires, which is what happens when violent content is
(27:02):
already there, but to tackle this systemically: the platforms are obliged to conduct investigations
into what risks, what systemic risks, arise for which groups, such as journalists,
women, members of the queer community, and so on.
Looking at that, I can tell you from our counselling work: it is all well and good that women have exactly the
(27:25):
same rights as men or as diverse individuals to access these platforms.
But for women, they are not a safe place; women are much more likely to experience violence, as are
queer individuals or Black people, and at the moment we are seeing this with antisemitism: if I,
for example, am a Jewish person, then the likelihood of experiencing violence on these
(27:46):
platforms is much, much higher.
And now the platforms are obliged to investigate and examine what risks exist for which groups
and what they can and must do to eliminate these risks.
At the end of the day, we are talking about product safety.
So if you go to a pharmacy and buy an aspirin, it has undergone clinical trials, there is a
(28:10):
package leaflet that states which risk groups should not take it and how much one can take.
It also states what is in it, what ingredients it contains.
We do not have that with social media platforms.
And that is precisely what the DSA is trying to implement, firstly a risk assessment and then also a risk minimisation.
That sounds great at first.
(28:31):
However, it will be crucial how much the platforms disclose, for example, how their algorithm
works and whether this algorithm perhaps promotes violence against women or violence against journalists.
How much of that will they disclose?
They are also supposed to make their algorithm, and how it functions, transparent,
but how much will they actually do in practice?
And I am initially pessimistic, because we have experience with the NetzDG, where the platforms
(28:56):
actually did everything to avoid taking real measures and, wherever something was not fully defined,
looked for legal loopholes in order to avoid it, because that naturally calls
their business model into question.
That means: what is the role of public broadcasting in reporting on this and educating people about it?
(29:19):
Firstly, that the platforms are not the same for everyone, but that there are specific risks
for certain groups on these platforms, that this did not fall from the sky, but is related to
how the platforms are designed, how the algorithm works, that these are also decisions made
by the platform, and that we, I always say, do not have to accept everything we are given, but
(29:45):
that citizens should also shape these platforms through their votes, protests, and petitions,
just as we shape other political areas and public spaces.
I believe that this awareness has not yet reached the population at all, that we as citizens
can make demands on how this digital space should be shaped.
(30:07):
And I believe that this is also an essential task of public broadcasting and reporting, to educate citizens about it.
Thank you for being with us at audioarchiv.
Follow us so you won't miss an episode, and don't forget the like button.
See you next week, your audioarchiv team.