Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Oh, we just wrapped up the radio show for Studio
Six Forty. Now it's time for Studio Six Forty Plus.
This is the podcast-only version of Studio Six Forty.
This is where the panelists bring us their ideas
for topics, something they have a passion
for or something that's interesting to their demographic. And today
the three panelists submitted their story ideas, and two of
(00:22):
the three had to do with AI and its impact on journalism,
also the ethics of using it correctly. So which two
had the AI stories?
Speaker 2 (00:32):
What, was it me?
Speaker 1 (00:34):
Asher and Kaya. Okay, so let's get started then. Kaya,
I'll just start with you. Why is this a big deal?
Speaker 3 (00:41):
I think there are two camps of thought. Obviously there are
the people who are using AI to write their essays
and to put questions in for their tests and get
those answers, and obviously that is wrong and academically dishonest.
And then there's the other camp where AI is thought
of as this evil force that's taking jobs away and
(01:05):
limiting creativity, that's going to ruin the world. And I
think the truth lies in the middle somewhere. At
the end of the day, AI is not going anywhere.
It's going to be a part of our lives going forward,
and now is the time that we need to learn
how to use it in a way that benefits us
(01:25):
and in a way that is more of a tool
than just a solution.
Speaker 4 (01:30):
Yeah, I completely agree. I think that the role of
education is in large part to ready us for
the workforce. And I think that if there is an
outright ban on artificial intelligence, then they're doing us more
of a disservice than anything, because at the end of the day,
when our generation and generations younger than us go and
get jobs, they're going to be expected to know how
(01:53):
to use AI in the same way that our parents
were expected to know how to use Microsoft Word. It's
the same thing, and they're doing us a disservice
by not teaching us. It's the same as a teacher
back in the day saying, no, the Internet's
not going to be a thing, you have to use
pen and paper, you have to learn cursive, because it's
never going to actually help you. But now we know
(02:14):
that so many jobs are using artificial intelligence. Journalism, education,
and just so many other fields are using artificial intelligence.
It would be a disservice for them to not just
not allow us to use it, but to not teach us
how to use it. Who's they? I would say education
at large. That's where a lot of it is
being condemned, I would say, from what I've seen.
Speaker 1 (02:34):
Well, I think, as with everything, people are
looking at how it will be abused. And I mean,
look at how AI is already being used for misinformation, disinformation,
parody that's pretty hateful and demeaning. I've seen a
lot of videos online, and I don't troll online, believe me,
(02:55):
but sometimes stuff pops up on my feed and I'm like,
what is this? And then it's like, it's AI.
Speaker 5 (03:01):
Yeah.
Speaker 3 (03:01):
I think that's a big part of it, is teaching
people how to recognize AI so that they don't get
bamboozled on the internet, or same thing for educators to
recognize AI if their students are using it, especially in
an academically dishonest way.
Speaker 5 (03:18):
Yeah.
Speaker 2 (03:18):
And I think we've touched on that before in
previous shows, you know, the whole idea
of AI in the classroom. And I think a lot
of professors have different opinions on that, right? Some
are, you know, more against it, but I think
a lot are realizing too, kind of, you know, what
you said, Asher and Kaya, both of you guys, that it's
inevitable, right? And you're going to have to adapt eventually.
And so I think the better way to go about
(03:40):
it is to teach us how to use it, like,
to our benefit, right? And obviously not to, you know,
write your papers and stuff, but just to kind of
use it as an aid or something for little
things here or there.
Speaker 1 (03:52):
I think people, and me included. That's why
I'm trying to learn as much about it as possible.
I'm going to webinars, and Associated Press and Reuters
both recognize this and they're constantly holding them. In fact, I
have one next week that's on AI coverage in
the election and how AI is going to be used
(04:12):
for polling and for other aspects of election coverage. So
I'm going to be tuning in for that. But I
read recently that ESPN, one of the big sports networks,
is openly saying, we're using AI to start covering sports
that did not get as much coverage in the mainstream,
and that was their reasoning behind using AI. For instance,
(04:35):
what was it, Asher? You were looking this up. What
was it?
Speaker 5 (04:38):
It was lacrosse and the National Women's Soccer.
Speaker 1 (04:40):
League, right. And those are two sports that don't get
a lot of love in the mainstream media. And they
said, because of staffing resources and the abundance of all
of these sports, and especially sports in underserved communities.
They made that a big point too, that they want
to give AI a chance to give it coverage. But the
one thing that I learned: Jacob and I
(05:02):
went to the National Association of Broadcasters earlier this year, and
one of our missions was to find out how
AI was going to be used in journalism. And the
one thing that this expert told us was that AI
is only as good as the database it
pulls from. So if, for instance, and she gave the
(05:24):
example of the Kennedy assassination, if you ask AI to tell
you about the Kennedy assassination, it's going to give you
all kinds of different stuff, because there's not one definitive
answer on the Kennedy assassination, and there's a bunch of
conspiracies attached to it. So somebody's going to have to
go through that and still fact-check AI. And I
(05:46):
think that's a.
Speaker 3 (05:46):
Big concern. And eventually, you know, with AI, because it
is only pulling from already existing things, it can't create
anything new, and it will plateau. And I think that's
a big conversation with art and AI: because
it's just pulling from other real-life artists, at a
certain point it's going to get to a point where
(06:06):
it just plateaus, where nothing new can be created, and it.
Speaker 1 (06:10):
Just, well, isn't part of AI, part of that technology,
that it's supposed to keep learning?
Speaker 5 (06:16):
It is, it is.
Speaker 3 (06:17):
But I mean, if there's nothing new coming in there,
no real-life human experience.
Speaker 1 (06:23):
And it's done every variation of whatever existed before exactly.
Speaker 2 (06:27):
Yeah, because it seems like, you know, I don't
think AI could replace creativity necessarily, right? You know,
because that's something very human, right?
And it seems like a
lot of people who are using AI,
like for work, let's say, they're using it to
(06:49):
kind of complete simpler tasks, right? And so I
don't know, I'm not sure. I don't think it
could get to the level of a human in ways
like that, as.
Speaker 5 (06:59):
Far as creativity, and create storytelling.
Speaker 2 (07:01):
Yeah exactly, Yeah, Yeah.
Speaker 4 (07:03):
But something that concerns me about AI a lot too
is, you're right, a lot of people do
just use it for more monotonous tasks, but there are
a lot of people that rely on it for information.
And a lot of times AI is coded by a company,
and companies have biases, so AI is pre-programmed to
have biases. So I think that's something that's going to
(07:24):
be super controversial.
Speaker 1 (07:26):
But here's the problem: how does the public
know? Who are they going to be able to trust?
What are they going to be able to trust?
Where's that source going to come from?
Speaker 5 (07:34):
Yeah?
Speaker 2 (07:34):
You know what's really interesting? Recently, when I've been googling,
like, something random, it could be anything, just something random
that I wanted to know, the first thing that
will pop up, like, above all the links, it says,
like, AI generated or something. Yeah, and it kind of
seems, an AI Overview, yes, and it's like a summary of stuff,
and then it'll give me the answer to whatever I
(07:55):
was asking Google.
Speaker 5 (07:56):
So that's new.
Speaker 2 (07:57):
That's like pretty recent.
Speaker 5 (07:58):
I haven't seen it.
Speaker 1 (07:59):
So it's kind of like their disclaimer. It's
like, use it at your own risk. Yeah, that's the way
I think about it. When I saw it, I was like, oh,
by the way, this is AI generated, take it for
what it is.
Speaker 3 (08:08):
I mean, I think of it in a similar way
to how I think of, like, Wikipedia,
where you just have to read it and be like, well,
you know, you've got to back it up.
Speaker 1 (08:18):
Well, I use Wikipedia sometimes, but I use it as
a starting point exactly. I'll use that and then I'll
go suss it out myself. Yeah. You know, it's interesting,
because I assume then, based on these topics here, Nico,
the last one, your topic here. And I'm going to
bring it up because this is a good example. Your
idea for the topic today was the six hostages executed
(08:38):
by Hamas, and you say that half the world still
can't condemn it and supports the terrorists in saying they deserved
to be killed. So if you were to put that
topic into AI, what do you think would come up?
Speaker 5 (08:50):
Oh gosh, if.
Speaker 1 (08:51):
You think about it, I mean, and that's what I'm talking about.
That is such a polarizing topic. How would you be
able to get any intel from it? And if you
were a journalist counting on AI to write about that topic,
what do you think would come up?
Speaker 2 (09:05):
Oh my gosh, yeah. I think, because, you know, we're
talking about how a lot of times you ask AI something
and then it could pull from all these different sources.
So I don't know, I'd have to.
Speaker 1 (09:16):
Like, try. Are you doing that right now?
Speaker 2 (09:17):
Sure? Yeah, that's good.
Speaker 1 (09:18):
What do you got?
Speaker 4 (09:20):
I just asked ChatGPT and it's kind of just
giving you the overall, like, information.
Speaker 5 (09:25):
What did you ask it?
Speaker 4 (09:26):
I said, just tell me about the six hostages killed
by Hamas, and so it tells me about how
they were abducted October seventh, found dead, and then just
kind of the overall information.
Speaker 5 (09:39):
It doesn't give any.
Speaker 1 (09:41):
So now put something in there, like, you know, is
Hamas bad?
Speaker 2 (09:44):
Yeah?
Speaker 1 (09:45):
I mean, seriously, is Hamas a bad organization or a
bad group?
Speaker 5 (09:50):
Well, yes, of course, but I'm curious what it says. It says yes.
Speaker 4 (09:54):
Hamas is widely regarded as a terrorist organization by many countries,
including the United States, the EU, and Israel. It has been responsible
for numerous attacks on civilians, including suicide bombings, rocket attacks,
and kidnappings.
Speaker 1 (10:05):
And here's my final question. For that, type in: should
I support Israel or Hamas? No, I'm serious, because this
is exactly what we're talking about.
Speaker 4 (10:18):
Supporting Israel or Hamas is a deeply personal and complex decision,
but it's important to understand the facts and the broader context.
So it gives you, like, a little write-up on
Hamas and then a little write-up on Israel. A neutral route.
Speaker 1 (10:31):
Now, next, you're gonna have to
type in everyone's name. Have you guys ever AI'd yourself?
Have you ever googled yourself?
Speaker 5 (10:39):
Well, yes, that was all the rage in eighth grade.
Speaker 1 (10:46):
I don't do it anymore. God, in
the eighth grade, you googled yourself for.
Speaker 3 (10:51):
The first time, something actually comes up for me, which.
Speaker 1 (10:54):
Is really articles. Yeah, that's good.
Speaker 2 (10:57):
You probably find like my podcast on there, you know.
Speaker 1 (11:00):
Oh, well, that'd be good. Yeah, I mean, that's
kind of the cool thing. But yeah, AI yourself and
see. Oh, oh, his jaw dropped.
Speaker 5 (11:06):
Wow, that's terrifying. Oh gosh, that's so.
Speaker 1 (11:09):
Did you pop up on ChatGPT?
Speaker 2 (11:10):
Oh?
Speaker 5 (11:11):
Oh I did? That's really weird. Yeah, it knows.
Speaker 4 (11:14):
I'm a student at Vanguard University who's participated in, like,
Unify America civic engagement, is also a student leader at my
church, all.
Speaker 5 (11:24):
Kinds of stuff that I have no idea how it happened.
Speaker 2 (11:26):
Did you just look up your name?
Speaker 5 (11:28):
I just said, who is Asher? Okay, that's it.
Speaker 2 (11:30):
Okay, we're going to do it, guys.
Speaker 1 (11:31):
Okay, yeah, let's just do it.
Speaker 2 (11:33):
Wow.
Speaker 5 (11:33):
I kind of hate that a little bit.
Speaker 1 (11:35):
Why? Well, hey, I mean, this is what happens when
you do stuff online.
Speaker 5 (11:39):
That's true. That's super weird.
Speaker 1 (11:40):
This is what happens.
Speaker 2 (11:42):
What? Okay, here's the thing. Okay, because I use my
middle name, you know, instead, like, as my, like, professional name. Okay,
so let's see. So this is ChatGPT, that's the one you
used, right? Okay. So I said, who's Nico Saffar?
Speaker 5 (11:56):
I don't know.
Speaker 2 (11:57):
As of my last update in September twenty twenty-one, there
isn't a widely recognized public figure or notable person.
Speaker 5 (12:03):
By the name Nico Saffar.
Speaker 2 (12:06):
Okay. It's possible Nico Saffar could be a new or
emerging figure, a character from a recent
work of fiction, or someone known within a niche community.
If you provide more context, I might be able to
help you better. Well, totally, they don't know enough about me.
Speaker 3 (12:21):
Yet. What do you got, Kaya? I'm getting a whole lot
of nothing too, mostly other Kayas, some filmmakers, some writers.
At first I thought it was me, but not me.
Speaker 1 (12:33):
Well, I do know that sometimes I come up
in a search, because there's a professional football player with
my name, and a professional saxophone player with my name
who played with George Michael.
Speaker 4 (12:45):
I actually did see that when I looked you up
when I first came on the show. I was like,
I don't think he's a saxophone player. Now, yeah.
Speaker 1 (12:52):
And I don't play with the San Diego Chargers either,
so no, my football days are over. But so there
you go. It's a little freaky and it's a little interesting.
I have used AI to just double-check some things
I did.
Speaker 5 (13:05):
Steve, we looked you up on AI.
Speaker 2 (13:07):
You're actually the first one. I said, who's Steve Gregory?
And it has all of them, but you're the first one.
Steve Gregory, journalist, actor, musician, composer. And then there's
a, there's a, yeah, I mean, they're separate, like one, two, three,
but the journalist is first, and then it says
a little bit about, yeah, number one. Oh yeah.
Speaker 1 (13:23):
Well, sometimes it's not good to be number one. Well,
so it sounds like you guys are willing to embrace
AI, with questions.
Speaker 3 (13:33):
Yeah.
Speaker 5 (13:34):
Yeah.
Speaker 3 (13:34):
I think of it kind of in the realm of,
like, a calculator, where if you're teaching students how to
do basic addition and subtraction, you're not going to give
them a calculator; that's going to defeat the point. But
if you're teaching them how to do, you know, advanced calculus,
advanced statistics, then, you know, the calculator can free
up some space and leave them open to learn the
(13:55):
more complex topics.
Speaker 1 (13:56):
Yeah. I remember, none of you remember what a slide
rule is, do you? Yeah, that was what was given
to us in algebra. We didn't get calculators, and
they didn't allow calculators in my school till after I graduated.
Then they started allowing the Texas Instruments calculators to come
in and start doing that, and I thought that
(14:16):
was cheating. But so, you can always tell when someone
counts change back at you at the store, you can
tell who's good at math.
Speaker 4 (14:23):
Yeah.
Speaker 1 (14:24):
Anyway, well we're gonna leave it there. Always good to
have you, Thank you so much, and good luck.
Speaker 5 (14:30):
Thank you, Thank you.