
November 13, 2025 38 mins

A new UNH poll finds that just 28 percent of respondents think artificial intelligence will have a positive effect on the country overall. Where do you come down on AI? An AI-generated photo of a Texas high school sent the community into a panic and disrupted school activities. How can schools be prepared for AI-generated threats? Should there be laws or restrictions on AI? In this age of AI, when AI is used dangerously, what should be the consequence?

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
It's Night Side with Dan Ray on WBZ, Boston's news radio.

Speaker 2 (00:06):
That is correct. Bradley Jay for Dan Ray on Nightside.
Shout out to Dan. And we have Rob Brooks and
Jay working the wheel in master control, guiding
the ship through the evening. Top of the hour routine.
We all have our top of the hour routine. Here,
go get a candy bar at the top of the hour. I
just finished off, I polished off, Reese's peanut butter cups

(00:31):
and it gets my brain working, firing on all cylinders.
Excuse me. All right, last night a side topic kind
of popped in, and then we had a gentleman call
and say he had no use for computers. He liked
life fine without a computer, and I get that to

(00:57):
some degree. But sir, and all you folks like him,
you really are missing out. And this does lead into
our topic here. You're missing out on being able to
do certain things, like renew your license, buy plane tickets.
How do you buy plane tickets without using a computer?

(01:22):
These, uh, self-induced difficulties have been around for
a while. But now, if you don't use a computer,
you're going to be missing out on even more. And
I'll get to what that is, but I really recommend it.
In a way, it's none of my business, but in
a way it's my responsibility to tell you: please don't

(01:45):
be intimidated by the computer. It's actually quite simple. And
I feel, tell me if I'm wrong, six one seven, two five four,
ten thirty, that you say you don't want to just
because you're afraid. Admit it. Admit it. Anyway,
it doesn't take a lot of time, and the payoff

(02:07):
is huge. Yes, there are risks that go along with it,
but it's here, and not embracing the computer, it's kind
of like when the automobile came along. Say, I like horses. Yeah,
and you could get along on a horse back in

(02:29):
ought six or whatever it was. But it becomes a hardship.
And I feel that many of you not getting involved
in understanding what a computer is, it is a hardship that
is self-inflicted, you're doing it to yourself. There's no, there's
no shame in saying, I want to learn how to
do it, I don't know how to do it. I

(02:49):
would think that there would be many parents who would
ask their kids to teach them, so that would be
the easy way. Can you please teach me? And the
very simple things are: teach me how to write an email,
teach me how to look something up. All right.
Now there's another whole layer, and that is artificial intelligence,

(03:13):
a super overused word, I wish there were other words. AI.
AI is different things to different people, but it's, it's
a thing. Like I just showed Michael Coin something that
blew his mind. I said that I was thinking about

(03:36):
asking about interviewing ChatGPT. ChatGPT is
an AI program which is stunning. It's only twenty bucks
a month and it does crazy stuff. You can talk
to it like a human being. It seems to have
a sense of humor. And if you, if you say,

(03:59):
can you not use that word anymore, they don't. Of course,
there's a lot of down, there's a lot of downsides too.
And the reason this comes up tonight is because a
school I used to go to and
liked very much in New Hampshire did a poll, and

(04:20):
it says New Hampshire has mixed feelings about the effects
of AI. And I'm, I'm wondering a
couple, I'm wondering your take on a couple of things.
One of them is, how do you think people in
New Hampshire view AI? Do you think they're mostly in
favor of it or not? And here's this, this is

(04:44):
kind of interesting, really interesting. Of those that don't trust
AI and don't think that it will make the country
a better place, of those doubting Thomases, do you think
those people who doubt and dislike AI would be more

(05:04):
likely to be Republican conservatives or liberal Democrats? Well, let's
find out. This is phase one of what we'll do.
What we're going to do is find out about this poll,
but I'll ask you the same question as the poll.
And we've brushed upon this before, but we've never, we've
never drilled down and made it a complete topic, that I
can remember, as far as I know. New polling data

(05:29):
shows twenty-eight percent of respondents think artificial intelligence will
have a positive effect on the country overall. Wow. So
you're taking in mind this is New Hampshire, kind of
a mixed state, a lot of conservatives. Oh yeah, trust me.
When I drive up north, there's all kinds of lawn
signs and flag waving going on up there, and you

(05:52):
know who they voted for. And in the more urban areas,
it's a different thing. But the article says, and this
is the Boston Globe: Granite Staters anticipate artificial intelligence will have
a positive impact on medical care in the coming decade, but
they are largely pessimistic about the emerging technology's broader

(06:15):
social effects, according to a new survey. And I'm going
to say that, just as I feel it's a hardship
not to embrace the computer, it's going to increase that
hardship if you don't embrace AI. Again with the automobile analogy:

(06:37):
you may not like it, but it's coming. And even
though you may not like it, if you don't use it,
then you'll be behind the eight
ball even further. And in a future topic, I will
address how this is really creating a generation gap way

(07:01):
more substantial, way more gaping than the one in the sixties.
Because the gap, it's not through choice, anyway, in many ways
not through choice. If you don't understand, if you don't
get technology, and if you don't use it, you don't
understand your kids, and you don't understand your grandkids. You
live in a different world. They don't get you at all.

(07:26):
You know, you are going to be much less
on their minds and much less a part of their
life if you can't communicate with them, and you're not
gonna be able to communicate with them if you don't
get their technology. I don't think many of you understand
how technology has insinuated itself into their life way more than

(07:48):
you get. Anyway, let me get into this, and do
you agree with me or not? You can say, hey,
Bradley Jay, I don't care. I am refusing to learn
the computer. You can say the same thing about AI.
I don't like it, I'm not going to do it.
But I'll give you the pros and cons later. What
do you think? Six one seven, two five four, ten thirty. So, Granite

(08:09):
Staters, they feel like it's going to help medically, but
they still don't like it. Kind of. Here's the thing:
it's coming. There's nothing you can do. Nothing you can do.
It's coming. It's going to make demands of you. You're
going to need to learn how to use it, to
compete, to stay up. Well, less than a third of the

(08:29):
respondents predicted AI will have a positive effect; more than
twice as many thought it was going to have a
negative effect. And even so, you need to adopt it,
even though you may think it's evil. You need to
learn it, if for no other reason than to protect yourself
from it. And as I was mentioning before, Michael

(08:53):
Coin was blown away when I said, you know what
I'd like to do? I'd like to ask my bosses,
I'd like to interview ChatGPT on the radio. And
unless you've experienced it, you might think, well, that would
be weird, that would be no good. It will be
someone talking, you know. I'll have to type it in

(09:14):
and I'll get a text response and I'll read it to you,
or you'll have some robotic-sounding person speak, like
a lot of those dumb AI podcasts that make dumb
AI mistakes. Oh yeah, you're probably judging current AI on
those old podcasts, and you can, you can tell by

(09:35):
how they misread stuff, like "he went to the store
and he bought all oaf of bread," that they just
don't get a lot of stuff. But then the new
AI is so far advanced, I can talk to it,
and it, right out of my laptop, has a big,
clear voice, and I can ask it about itself, which

(10:00):
is pretty intense. I ask it, why do you not
like me to call you Bob? And I ask AI
if it's legal to interview AI on the radio. And
I wonder, in a court of law, if that would
hold up if, say, the OpenAI company said you

(10:23):
can't interview ChatGPT. What if ChatGPT told me
I could? That would be a slam dunk defense, I
would think. Anyway, more about it. And this poll: the
influence of AI-generated synthetic media has made its mark on
politics too. Oh my god. And of course, when it

(10:47):
comes to negative campaigning, what a delicious tool AI is.
You can create such embarrassing, demeaning images that have a
great effect, and images that would make a statement that
was false. Then you run into all kinds of legal

(11:10):
areas on hey, does this qualify as slander or libel?
I mean you were putting out an image that is
making a statement about me that is not true. It's
just like you spoke it or wrote it. It's the

(11:30):
same thing. Also, what if you put out a fake
AI image of a building on fire? You yelled, you
visually yelled fire in a crowded theater, and then what
if fire departments responded, and then what if a firefighter

(11:52):
was hurt? Would you be on the hook? You probably would,
and you probably should. I don't know, you probably
would, you probably should. Glenn in Brighton is a
smart guy. He knows this is a good time to call.
And we'll talk to Glenn right after this. On WBZ, you're.

Speaker 1 (12:08):
On Night Side with Dan Ray on WBZ, Boston's news radio.

Speaker 2 (12:14):
In case you just joined us, I was speaking about
a poll taken in New Hampshire regarding opinions about
how AI will affect the quality of life in this country.
Also, there has been a situation where a fake AI-generated
image caused a lot of problems with a high

(12:35):
school and a fire. And then I will also give
you a serious list of the pros and cons, and
I'm really interested in how you feel about it. And
for me, it's simultaneously evil and helpful. I'll tell you
at this point I don't mean to be gloom and doom. However,

(12:59):
I do believe it will be, it will cause the
actual end of the world, and in the not-too-distant
future. And I'll explain why. What can we do
about it? Nothing. How's that for an upbeat assessment? We
go to Glenn in Brighton.

Speaker 3 (13:16):
I agree with you, all right, Glenn.

Speaker 2 (13:19):
First of all. How you been. I haven't talked to
you in a long time.

Speaker 3 (13:21):
I know. I have a friend that's living with me,
he's homeless. He worked in a music store. He
got evicted because they closed the building. And he
has a friend. He and I are trying, trying
to track down Steve Kilroy, who does piano refinishing.
That's why I'm calling you. I missed you last night.

(13:43):
I didn't know you were on somebody told.

Speaker 2 (13:45):
Interesting. So yeah, he does. Actually, Steve Kilroy
is a drummer and a piano refinisher. He has his
own piano, right, and he's currently, he's in a bunch
of bands. He may be on tour. But I can
connect him. Why do you want to connect? Because you want.

Speaker 3 (14:04):
To... You know a guy, we know a guy in
Newton. His piano needs refinishing. His cat,
cat claws scratched the hell out of the thing,
out of the leg, especially one of the legs.

Speaker 2 (14:18):
Of course, you are a piano tuner, so that's all right,
that all fits together. Yeah, it's expensive, though. I think
maybe he needs to just get used to those cat
claw, cat claw scratches.

Speaker 3 (14:29):
Yeah, I know. I thought, well, the guy is a
contract contract attorney. He's got more.

Speaker 2 (14:34):
Money than God. Okay, then, okay, fine, so there's that,
but that's kind of inside baseball. So in general, you're
healthy or, you know, yeah, just give me the good,
don't tell me the bad news, just everything.

Speaker 3 (14:48):
I have a crush on that woman that does a
commercial about the Dreams by James.

Speaker 2 (14:54):
She speaks very highly of you as well. That's nice.

Speaker 3 (14:58):
Yeah. Well, a woman in the ad has a voice
like someone that I'm trying to track down named Christine
in Dorchester, Christine Doherty. I met her at a trumper.
I met her at a trumper.

Speaker 2 (15:10):
If you use last names, bleep, bleep.

Speaker 3 (15:14):
All right, Oh you're getting you guys get nervous. I'm sorry.

Speaker 2 (15:18):
That's a bad policy. So you called during the AI thing, Glenn.
Is that a coincidence?

Speaker 3 (15:24):
Yeah, I don't like it either. The Ride, for people with disabilities,
they have a new AI system.

Speaker 2 (15:30):
Okay, so how does that work?

Speaker 3 (15:32):
The drivers, I don't know, the drivers hate it, the dispatchers
hate it, everybody hates it. But the head, the head
of The Ride, and I can mention his name because
Dan Ray tried to have him on as a guest,
Joel Cook, he doesn't care that everybody hates it.
Marina McKinnon tried to get him on as a guest. So,

(15:56):
how does the.

Speaker 2 (15:56):
System work or how did it change?

Speaker 3 (16:01):
Well, it's, I don't know how to explain it, it's all computerized.
When you call The Ride, it calls you back and
confirms everything that you told it. Well, that's fine.
And then the next day it calls you an hour
before it's coming and ten minutes before it's coming. But
sometimes it forgets to call you, and sometimes it gives

(16:23):
the drivers the wrong address. I had a driver no-show
me. I had to take an Uber to tune
the piano because The Ride didn't even come
to my apartment.

Speaker 2 (16:33):
Well, that is a flaw. Hopefully they can fix that.

Speaker 3 (16:36):
In the pouring rain, I'm waiting outside.

Speaker 2 (16:38):
How much does it cost to tune a piano?

Speaker 4 (16:42):
Uh?

Speaker 3 (16:43):
Well, I charge either one sixty or two sixty. If it's
a single tuning, one sixty. If it's a double tuning,
two sixty.

Speaker 2 (16:53):
How much does it cost to tuna fish? How could
I not say? How could I not say that?

Speaker 3 (17:00):
Okay, so what are you doing for fun?

Speaker 2 (17:05):
You know? Do you get out to the Eagle deli and.

Speaker 3 (17:09):
Stuff? And yeah, there were no nice women there.

Speaker 2 (17:17):
That's no good. So you're, you're not down? So you
think AI is kind of going to be the
end of the

Speaker 3 (17:25):
World, like me, Yes, I agree with you, totally.

Speaker 2 (17:29):
Okay. Well, I will just allude to the thing
I'm going to talk about in a little more detail. There
was a kerfuffle caused by an AI image
of a school on fire, and people, I guess, thought
that there was a school on fire. The fire department came

(17:51):
and all. And so it's a very short distance from
that to a video of missiles taking off, you know,
along with an AI-generated phone call from
a fake Putin or a fake world leader, and other,
other AI hacks that make our country or some other

(18:14):
country believe we are under attack. We have some seven
minutes to use them or lose them, and, and, and
then it's all over. That's it. Good night. Good night.

Speaker 3 (18:29):
And so is that near Plaistow, New Hampshire? You mentioned
New Hampshire.

Speaker 2 (18:37):
Oh, this was a poll. I mentioned New Hampshire in
terms of the poll that was taken, and I believe
it was UNH in Durham, New Hampshire.

Speaker 3 (18:46):
Chris, Chris Q in Plaistow, New Hampshire, according to the website.

Speaker 2 (18:50):
Okay, so that's it. Anything else, Glenn? You seem kind
of down.

Speaker 3 (18:56):
Come on now, I am because I need you to
have him call my friend.

Speaker 2 (19:01):
I should? Okay, that's like not a radio thing. Like,
we've got to remember there's

Speaker 3 (19:07):
A lot of people. Okay, Okay, do I leave a
number with a producer?

Speaker 2 (19:14):
I don't know, Glenn. Let's take a big picture, man.
How are you doing in the big picture?

Speaker 3 (19:21):
Oh, I don't know. My life is like Groundhog Day.
Every day is the same.

Speaker 2 (19:26):
Okay, that's, I'm so sad. Well, I'm glad you
checked in, Glenn. I'd like to talk to you longer, but
you don't seem to have much to talk about. And
I don't mean it in any sort of bad

Speaker 3 (19:38):
Way. You on tomorrow night?

Speaker 2 (19:41):
Yeah, there are a lot of people out there that need to
be entertained. And yeah, I know usually you're super entertaining,
telling, telling stories of your wild past and your, your
wild adventures. But, but you're down here.

Speaker 3 (19:55):
I did. I can tell you a story. All right,
I need a story. Yeah, I mean, when I,
when I tuned the piano Monday, the guy had
two loud dogs. I had to keep telling him, hey,
I can't tune with the dogs barking.

Speaker 2 (20:10):
All right, well, that was a pretty short story, but
that was a good one. As you can see, Glenn, it's
past time for the break, and I appreciate it. And
I hope, thanks to our talk, life is a little better
for it. And everybody say a prayer for Glenn and
hope he cheers up. Thanks, Glenn. WBZ.

Speaker 1 (20:27):
You're on Night Side with Dan Ray on WBZ, Boston's
news radio.

Speaker 2 (20:32):
Let's go right to Sandy in Hampton on WBZ,
the number six one seven, two five four, ten thirty. Sandy in Hampton, Hampton,
New Hampshire.

Speaker 4 (20:41):
Yep, Hi Sandy, Hi, Hi.

Speaker 5 (20:45):
So I think that AI is gonna be the
death of critical thinking.

Speaker 2 (20:50):
Oh yes, I already.

Speaker 5 (20:52):
I already see it with just computers, as people get
tied up in a knot and don't think through the steps
to figure something out on a computer. But AI, they're
just going to have to talk.

Speaker 2 (21:08):
Yes, yeah, yeah, I agree.

Speaker 6 (21:12):
And I can give you an example. When one of
my twins was in sixth grade, they had the national testing.

Speaker 1 (21:22):
Fun and.

Speaker 6 (21:25):
The school called me because

Speaker 5 (21:28):
To ask if he got a hold of the test, because
he got the only algebra question right.

Speaker 6 (21:35):
Uh, on the test, the only one in the whole school. And he hadn't been
And he hadn't been.

Speaker 5 (21:39):
Taught algebra yet. And I asked the teachers, did you
think to ask him? And his answer to the teachers
was that he knows order of operations. He figured out algebra
on his own. But that's because he had an interest
in it and learned all the different signs. And even the

(22:03):
teachers couldn't get into his homework. So I told the teacher,
I said, well, he wants to learn how to
use square roots, like

Speaker 6 (22:11):
In second grade. Oh, he's too young to that. I'm like, no,
he wants to know.

Speaker 5 (22:14):
If you want him to do his homework, tell
him to do his homework first.

Speaker 6 (22:17):
Then they teach him about that.

Speaker 5 (22:19):
So he learned everything, and he can do everything without
a calculator, although he, he can use a calculator.

Speaker 4 (22:28):
Now.

Speaker 5 (22:28):
Wow. And, and like when I was a kid in school,
we weren't allowed to use calculators until we understood it
and worked problems out on our own using our brains.

Speaker 2 (22:41):
Wow, this kid seems super smart. I mean, even in
high school, I think I cried. Maybe not in
high school, but somewhere along the line, I didn't understand,
you know, the order of operations, does the thing in
the parentheses come first or last, I don't
even know all that. Oh my god, I was, I was

(23:03):
bad at math, and, and here I am. I don't,
this job doesn't require a lot of math.

Speaker 6 (23:11):
Well, it's just critical thinking, and a little, like, the
math is how, yeah, you remember the order of operations.

Speaker 2 (23:23):
Yeah? Yeah, well I agree with you one hundred percent absolutely.

Speaker 1 (23:29):
And.

Speaker 6 (23:31):
Calculators, people just don't, even half of them don't even
know how to use it to do a decimal.

Speaker 2 (23:38):
Well, I did used to know how to do it,
and I know it a little bit. I mean I
kind of get it. I could figure it out again, jeez,
you know, I'm exposing myself as as pretty uh ignorant
on a bunch of stuff here. I should probably keep
it to myself.

Speaker 4 (23:55):
You get the basics down. Yeah, I mean, I did
learn it.

Speaker 2 (23:58):
I forgot it. I learned it, what, fifty, forty years ago,
as a kid.

Speaker 5 (24:04):
You could recall it if there was an EMP that
knocked out all of the electronics.

Speaker 2 (24:10):
Oh yeah, if I go to the library and I
pull out the little drawer and I see the card
and the number, I can find, I can find the book.

Speaker 5 (24:16):
Yes, yeah, so there's some people that can't do that.
There's some people that and if AI.

Speaker 6 (24:24):
Goes down, then something happens.

Speaker 5 (24:27):
People are not going to know how to cook.

Speaker 6 (24:28):
They're not going to know how to plan their meals because

Speaker 5 (24:31):
People can look at the refrigermators now and have ask
what can I make from supper by looking at what's
in the fridge.

Speaker 2 (24:38):
Yeah, that's right.

Speaker 6 (24:40):
They don't even make those decisions though.

Speaker 2 (24:43):
By the way, let me share: I made some
soup today in just that way, without AI. Maybe I
need AI. Let me tell you my ingredients, and then
I'll tell you if it was good. It was carrots,
I had a whole bag of carrots, got to use
them up; had onion, chopped-up onion; I had half
of a thing of sun-dried tomatoes. I don't, don't

(25:06):
have a recipe or anything. I'm just taking whatever I got,
sticking it in a pot. I had two things of
chicken broth. I had two cans of boiled oysters. I
had one can of salmon, like a can of tuna, but it
was salmon, and two bay leaves. And I had no

(25:30):
idea if it would be good, and it was. It
was surprisingly not so bad. But you know, if you
just stick a bunch of stuff in chicken broth, most
of the time it turns out okay.

Speaker 6 (25:42):
Yeah.

Speaker 5 (25:43):
That's the fun of cooking.

Speaker 6 (25:44):
It's chemistry and you learn what works together what doesn't.

Speaker 2 (25:50):
I did learn that sun-dried tomatoes are so strong
in their flavor that they kind of overpowered everything else,
so I would actually not do that again. It kind
of had an oystery flavor; it was very healthy,
but it just had too much of the tomato, the
sun-dried tomato taste. I hear what you're saying, Sandy.
Thank you, I agree one hundred percent. Thanks. And now

(26:12):
it's Mike in Cambridge. Hello, Mike. Hello, how are you?
I am super well, thank you.

Speaker 4 (26:18):
I love you. I love your topic tonight about the
AI and the, and the computers, and whether or not
we should embrace it or be fearful. And so
I just, I just want to weigh in with a
slightly different way to look at it. I don't think
we need to do one or the other, as if
they're the only two choices. I think in different parts

(26:38):
of our lives we can, we can embrace it. So
certainly in the workplace, that's the way now that we
have to communicate with our co workers or if we're
talking to other companies in the course of business. Certainly
you have to know what you're doing with the AI
and with the computer in order to be able to
do your job. And when you're making purchases, it's easy
to use the computer to do searches of like if

(27:00):
you want to buy a particular item, you can compare
the price, you can see who has it in stock,
or whether you want to have it delivered to
your home. So sure, I think the AI and I
think the use of the computer is the way to go.
But with family and friends, I think we need to
hold on to some aspects of communication from the past.
I think we should regularly with our children, and with

(27:22):
our spouses and with our friends, we should have conversations
either in person and when that's not possible, by voice
on the telephone, because I think that remains the best
way to communicate and to show how you feel while
you're communicating. So I think we can embrace it in
the workplace because we need to or we won't be
able to do our jobs, but we need to be

(27:42):
a little more wary, not necessarily fearful, but we need
to be aware, as we increase the
dependence on these devices, that we don't lose contact with
people and the opportunities that we have to communicate face
to face, ideally and when that's not possible, at least
talk by voice. Talk on the telephone. Either you can

(28:05):
hear the feeling, you can hear the pause, you can
you can hear the tone of voice of the person
you're talking to. So I think we can do both
at the same time.

Speaker 2 (28:16):
I know, I don't know if you heard this,
but Michael Caine and Matthew McConaughey have both made deals
to have their voices AI-copied, so if they called
you up on the phone, you would think it was them.
And here's here's something to watch out for as that
becomes more, uh, higher quality and more prevalent. There's a

(28:40):
scam that's pretty common where people will somehow get information
about, say, a grandmother, and know that she has a grandson
named little Jimmy. And in one of the scams, they'll call
up and say little Jimmy's been in an accident, we
need you to send money right away. And that, you

(29:01):
know, fools people all the time. Yeah, but
with AI, they can get a hold of some sort
of voice print of little Jimmy. Well, then, then Grammy's
gonna completely believe it and send all her money. It's,
it's kind of, kind

Speaker 4 (29:17):
Of I know, I know, it's it's kind of scary.
I mean, at least we have the caller ID that
can kind of help us a little bit to be
sure of who's really calling us. If we see, uh,
you know, you see your wife's name or phone number
jump up on your phone, chances are better that
you're really talking to your wife. But if it
comes from an unknown number, sure, we're gonna have to

(29:38):
be careful about that.

Speaker 2 (29:40):
You know what, I don't even answer unknown numbers anymore,
do you. No, it's just a number. I have no idea.

Speaker 4 (29:47):
Well, but what they do now is they'll, they'll actually
come up with your area code and even maybe an
exchange that's, that's close to your neighborhood. And it
might be a number you don't know, but you might think it's
a friend, because it looks like
it's a local call. And so that's unfortunate, that the
communications companies, the telephone companies, are allowing the marketers to use

(30:08):
tactics to be deceptive, and it's unfortunate that the phone
companies are playing right into it because of the profits
that they get from those businesses.

Speaker 2 (30:17):
One thing that's kind of good is, if you see
a number pop up, it calls and you just look
at it, now you get a text version of what
they're saying. So if they leave a message,
it says, hi, this is your brother Timmy.

Speaker 4 (30:33):
Yeah, yeah, then you can go, oh.

Speaker 2 (30:35):
Well, well, actually not Timmy, because you'd probably see his
name pop up. But you can get a hint from
the text of the message they're leaving whether or not you should
actually answer or not. But I don't, because, you know,
those telemarketers, I don't want to have to talk to them.
So I just wait, and if they leave a message, if
it's somebody legit, I call them back.

Speaker 4 (30:56):
I really do, so, Mike, what do you do for
little I really, I really would call them back. I'm
annoyed right from the get-go that they bothered me
with something in the first place. So when i'm when
I when I make purchases it's because I want to
buy something before somebody's trying to sell me that thing.

Speaker 2 (31:14):
Hey, let me ease into the next topic. When you
make purchases online? We were just talking about, this is
going to be the ten o'clock topic. Do you think
it's okay to purchase something, say, on Amazon, or anyone, but
for the purposes of concise speech here, from now on
I'll be saying Amazon. Is it okay
to purchase something on Amazon knowing you're going to use

(31:37):
it and send it back? Say you're going to, you're
going to, you know, a Halloween party, and you know that
you got a Halloween costume you're going to use one time.
Is it morally wrong to order a Halloween costume, in
this case, wear it and send it back, knowing you're
going to send it back?

Speaker 4 (31:56):
But, you know, I don't think this is, it's, it's
a new issue as far as, you know, online purchases.
But even when people were making retail purchases at Zayre's
or Bradley's or all those stores that have gone, gone
past us now, people would buy something like a, like
a, maybe a battery charger, they'd go home and
charge up their battery, and then they'd take the charger

(32:16):
back to the store. And so this is a, this
isn't something new. This has been going on since probably
people have been bartering with each other. Well, no, I
think it's wrong. Of course it's wrong.

Speaker 2 (32:27):
Okay, good one, you're right. And Filene's Basement had a
very generous return policy. And I heard the story, someone
told me about that. I actually tried, I actually tried
to do that at Filene's Basement. But I didn't plan
on, no, no, I know, but I wore it out
a little bit, and I took it back, and they go,
you wore it. Somehow they knew I wore it, and

(32:48):
I just, I didn't even wear it anywhere. I just
kind of wore it to the store. But they, they said,
see this? Yeah, that's how we know.

Speaker 4 (32:55):
Oh yeah, well, you know, I bought something, I bought
something in the store a couple of weeks ago, and
it was an item that somebody might be tempted to
bring home and use because they need it,
and then bring it back. And what some of the
retailers are doing now is that, yes, you
can return the item, but they have what
they call a restocking fee. And that's to discourage

(33:15):
people from, you know, buying something and bringing it back
two days later after they used it and getting
all of their money back. And so sometimes, I
think the restocking fee discourages at least some of it.

Speaker 2 (33:27):
Anyway, Okay, but yeah, that's wrong.

Speaker 4 (33:28):
That's wrong. I don't, I can't imagine you're gonna get
anybody who's gonna call you and say that it's the
right thing to do.

Speaker 2 (33:33):
I don't know, you'd be surprised. I might even make
the case myself. I might take, you know, I might
be the sophist and take both sides.

Speaker 4 (33:42):
Uh.

Speaker 2 (33:43):
So you're saying it's morally wrong. Like, morally wrong?

Speaker 4 (33:48):
Yeah, and in some cases it might be legally wrong.
It depends, you know, if you bring it back and
you're not honest. Okay, but I mean, yeah, it's wrong
on every, on every level. I think it's wrong.

Speaker 2 (33:59):
Sure. So if there were an eleventh commandment, if,
if some, if the Pope goes up on a
mountain and gets another tablet, gets a tablet, it might
say thou shalt not buy stuff and return it on purpose.

Speaker 4 (34:16):
Well, but I, but I realize that other people are
going to do that, and I think it's already built into
the price of the product. They, they know, they
know how many, you know, what percentage of the people
who buy a product are going to do exactly what
your, your question is tonight, your moral question. People will
do that, and that's built into the original price.

Speaker 2 (34:37):
Of the item. So it's okay.

Speaker 4 (34:38):
We're all we're all paying for that. I don't think
the retail is losing.

Speaker 2 (34:41):
You're so interesting, but I need to break.
Thanks so much, Mike, take care. Thank you. Yeah,
okay. More in a moment on WBZ.

Speaker 1 (34:52):
It's Night Side with Dan Ray on WBZ, Boston's news radio.

Speaker 2 (34:58):
Jay for Dan tonight on Nightside. I'm going to finish
up this topic by giving you an exhaustive list of
pros and cons of AI. Pay attention, because you're
probably never going to get as detailed
a list as this. Pay attention. Pros, and a lot
of these that are pros are good for business and

(35:19):
bad for you. I will grant you that.
Efficiency and speed: automates repetitive tasks in seconds. Right, that's
like jobs gone. Processes huge volumes of data faster than humans.
More jobs gone. Cuts down administrative work in fields like healthcare,
law, and finance. More jobs gone. Good for business, bad

(35:40):
for you. If it reduced your cost too, if they
handed down the savings to you, that'd be good, but
what are the chances of that? Cost savings: reduces labor
costs for routine and data-heavy tasks. Lowers overhead by
streamlining workflow and reducing human error, because there won't be
humans, because their job is gone. Accuracy and consistency: delivers

(36:04):
precise output when trained correctly. Minimizes human bias in certain
analytical tasks. That's only true depending on how it's programmed.
Repeats tasks reliably without fatigue. More jobs gone. Provides twenty-
four-seven assistance in customer service. Bad. That's supposed
to be a pro, but it's bad for you, because

(36:24):
who wants to talk to AI for customer service, tech
support, or education. Education, I don't think they mean school,
I think they mean product use. Aids individuals with disabilities
through voice recognition. That's a pro, that's legit. Innovation and discovery:
helps researchers solve complex problems, accelerates drug discovery,

(36:52):
identifies patterns humans often miss. Personalization: recommends products,
et cetera, to you, we know about that. And improves
vehicle safety with advanced driver assistance systems. That's good.
Monitors industrial environments for hazards. Good. Flags cybersecurity threats faster

(37:14):
than humans. Good. Let's get to some cons. Job displacement:
automates routine jobs in manufacturing, transportation, basic office work, and
customer service. Lots of jobs, bye-bye. Pressures older workers
and those without technical training. As I have been
telling you, get smart, learn the stuff, or you're going

(37:38):
to be out to pasture. Bias and discrimination as a
con: reflects biases in the data it's trained on, can
reinforce social inequities if not carefully monitored. Decision-making systems
in housing, lending, and hiring can produce unfair outcomes. Privacy

(37:59):
cons, well, you don't need to be
a genius to understand that. Can generate convincing false text,
images, and videos; makes it harder for the public to
distinguish truth; enables political manipulation and fraud. These are huge
bad things. The thing is, it's here, whether you like
it or not, and you might as well jump on

(38:21):
and get what you can out of it. Coming up:
Is it morally okay to order stuff knowing you're going
to return it? It's WBZ.