
December 26, 2025 39 mins

Bradley Jay Fills In On NightSide with Dan Rea:

No, that wasn’t a typo. We know it’s “for all intents and purposes,” but alas, how many times have we heard it the other way? How about it being a “mute point,” instead of “moot point?” Bradley’s talking eggcorns and fake grammar rules with Mark Liberman, Christopher H. Browne Distinguished Professor of Linguistics and Director of the Linguistic Data Consortium at the University of Pennsylvania. Don’t get “flustrated!”

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
It's Night Side with Dan Rea on WBZ, Boston's
news radio. It's ten oh seven on
the dot.

Speaker 2 (00:09):
And I told you earlier that I watch YouTube
a lot. There's a lot to learn and it's
fun to do. It's a great place. If I
had children, I would try to ease them over into
YouTube to learn stuff while I'm doing their screen time.
And one of the sites I subscribe

(00:32):
to is called RobWords. And this guy Rob Watts,
he lives in Berlin now, but I think he's English.
He does segments on the meanings, the root meanings,
the origins of the names of the states in the United States,

(00:53):
and the complete history of the English language. I love
the language and I love its idiosyncrasies. One day while
watching Rob, he had a guest, a distinguished guest,
Mark Liberman, the Christopher H. Browne Distinguished Professor of Linguistics

(01:16):
and Director of the Linguistic Data Consortium at the University
of Pennsylvania. He's a heavy hitter, but he's fun too,
and we're very glad to have Mark with us to
talk about, initially, these things called eggcorns. And as
you find out what this is, I'll ask

(01:38):
you to get involved. But you're going to need to
know what an egg corn is before I can explain
how to get involved. So, Professor Liberman, thank you for
being with us. So can you explain what an egg
corn is?

Speaker 3 (01:58):
Well, this all started some years ago, when I heard
about a case where a woman had complained to a
car dealership that the hood of her car had
gotten dinged by what she called eggcorns falling on it,
because she parked under an oak tree. So obviously
she was talking about acorns, but she thought, sensibly

(02:22):
enough, that they were actually eggcorns. It's a sensible mistake,
because an eggcorn kind of looks like an egg,
and it's a kind of a seed, which is one
thing that corn means.

Speaker 2 (02:35):
And so all her life she thought acorns were called
eggcorns because it sounded that way, and they kind
of looked like corn on a fried egg or something
like that.

Speaker 3 (02:49):
Well, or an egg in the shell, I mean, the
nut part of an acorn actually looks a little bit
like an eggshell. And then the question was what
to call this, and I did a little bit
of looking around on the web and discovered she's not

(03:09):
the only one who ever thought this. A bunch of
other people did as well, though it's not all that common.
But the thing is, it's obviously
a mistake, although it's a reasonable mistake, since acorn itself
doesn't mean anything in the contemporary language except the seed
of an oak tree.

Speaker 4 (03:33):
Uh.

Speaker 3 (03:33):
And my colleague Geoff Pullum suggested, well, you know, often
when something new comes up, we name it after the
example that brought it to our attention. So why not
call things like this an eggcorn?
And when you start looking around, there are lots and
lots of other examples of cases in which people

(03:57):
think that a phrase or a word that's not the
quote-unquote correct phrase or word is what that phrase or word
should be construed as in their internal lexicon, so to speak.

Speaker 2 (04:13):
And this is where I want to ask the
listening audience to share anything that, when they were
a kid or maybe way into life, they
didn't quite hear right and had been saying incorrectly until
someone corrected them. And that can include music, folks: musical lyrics.

(04:34):
For example, the big example is Jimi Hendrix:
excuse me while I kiss this guy. A lot of
people thought that's what he was saying. And everybody has
some word that for a while they were saying wrong.
For me, mine was murger. I thought murder was murger,

(04:55):
M-U-R-G-E-R.

Speaker 1 (04:57):
And I don't know.

Speaker 2 (04:58):
I must have been, like, you know, I'd hear it
on Perry Mason, here on trial for murger one. And
it took forever, a little too long, for me
to find out that that was not the case. Now,
what are some other fun examples of these eggcorns?

Speaker 3 (05:14):
Well, excuse me. Famous ones are things like Old-timers'
disease for Alzheimer's disease.

Speaker 1 (05:24):
Old-timers' disease, okay.

Speaker 3 (05:27):
Yeah, in one foul swoop instead of in one fell
swoop, for all intensive purposes instead of for all intents
and purposes.

Speaker 1 (05:39):
Yeah, And so.

Speaker 2 (05:40):
Back up to fell swoop. I thought it was,
I don't know whether it's felled or fell.

Speaker 1 (05:48):
And what is the origin of that phrase that idiom?

Speaker 3 (05:52):
I guess. Well, it's typical of eggcorns that the true
version of the phrase either never meant anything or it
doesn't mean anything anymore in the contemporary language. So fell
is basically an obsolete word meaning bad or evil.

Speaker 1 (06:14):
Okay, so what a all right?

Speaker 3 (06:18):
All right?

Speaker 1 (06:18):
I get it?

Speaker 3 (06:18):
Yeah, where what.

Speaker 1 (06:22):
Were some of the others again? Because I don't want
to just.

Speaker 3 (06:25):
Old-timers' disease for Alzheimer's disease.

Speaker 2 (06:28):
How prevalent is it that people say Old-timers' disease?

Speaker 3 (06:35):
I'm not sure that I've ever heard it said, but
if you google it, you'll find lots and lots of
examples from comment sections and things like that, where people
seem to think that that's the phrase.

Speaker 1 (06:47):
Okay, and you mentioned a couple others as well.

Speaker 3 (06:52):
Well, another example that's pretty common is ex-patriot: the
word expatriate, meaning somebody who lives in a country
they weren't born in, rendered instead as
ex-patriot, as a former patriot.

Speaker 1 (07:12):
So people would just spell it with the letter X.

Speaker 3 (07:14):
You mean, no, they would spell it as ex-hyphen-patriot,
P-A-T-R-I-O-T, rather than expatriate,
E-X-P-A-T-R-I-A-T-E.

Speaker 4 (07:26):
Y.

Speaker 2 (07:26):
He had been x patrioted as opposed to repatriated et cetera.

Speaker 3 (07:32):
Yeah. Another example is old wives' tail for old
wives' tale. That's a case where both make sense:
old wives' tale makes sense, old wives' tail makes sense.

Speaker 1 (07:55):
That's interesting.

Speaker 2 (07:57):
I've never heard that either: old wives' tail. Every
one of you out there must have had one
of these errors. Now, don't be shy. Let us know
something you thought was said one way that was actually said
another way. I shared my mistake; now you should share yours.
Let's continue with Professor Liberman in a moment on WBZ.

Speaker 5 (08:18):
You're on Night Side with Dan Rea on WBZ, Boston's
news radio.

Speaker 2 (08:23):
Let's continue with the professor, Mark Liberman, a
distinguished linguistics professor.

Speaker 1 (08:30):
By the way, a professor.

Speaker 2 (08:32):
Can you give me a handle on what a linguistics
professor actually teaches? What kind of courses might you teach?
I don't believe I took a linguistics course,
even though I was a communications major. What's a class
of yours

Speaker 3 (08:46):
like? Well, I teach a bunch of different kinds of courses.
I actually have positions in not only the Department
of Linguistics, but also the Department of Computer and Information Science
and the Psychology Graduate Group. And so, on the linguistics side,

(09:08):
I teach the general undergraduate introduction without any prerequisites. I
teach graduate courses in phonetics, that is, the sound patterns
of language. On the computer science side, I have
taught courses on what's basically digital signal processing,

(09:34):
how to find things in signals?

Speaker 1 (09:37):
You know what, let me back up
and be more basic. What is linguistics?
How do you define that?

Speaker 3 (09:46):
It's the study of speech and language?

Speaker 1 (09:48):
Okay, the study of speech and language? And why is
that important?

Speaker 2 (09:52):
And I suppose it's mostly important when you do it
comparatively, correct? When you compare one culture's speech with
their customs and things like that.

Speaker 3 (10:04):
Well, that's fun and interesting and often worthwhile and valuable.
But there's a full computational side of it, which is,
you know, everything that Google does and Apple does and
IBM does and so on with what's now called AI
and large language models; that came out of computational linguistics

(10:27):
over the last few decades. There are also applications
in teaching, when you think about how to teach kids
to read, or how to deal with language development issues.
There are applications in political science, figuring out what's going on

(10:48):
when people talk about politics, what they really mean, how
you compare different ways of talking about politics.

Speaker 2 (10:57):
Those sound like interesting classes. There's a lot going on.
It's at the very root of much of what
we do, you know, actually.

Speaker 3 (11:05):
Well, and in the law. A lot of what goes
on in law is trying to figure out what laws mean,
what contracts mean, what the Constitution means, and so on.

Speaker 1 (11:18):
So would the wording of the Second Amendment be a linguistics

Speaker 3 (11:21):
Issue? Absolutely, absolutely.

Speaker 1 (11:25):
Saying that a.

Speaker 2 (11:28):
Standing militia being necessary, you have the right to bear arms,
but what if a standing militia is no longer necessary?

Speaker 1 (11:36):
It seems that they put that caveat in there for
a purpose.

Speaker 2 (11:39):
Well, it's necessary for you to have weapons
because we need a militia, it
seems to say.

Speaker 3 (11:49):
Of course that has changed, and there's been a lot
of controversy over the decades and centuries, actually,
about what that preliminary phrase in the Second Amendment means.
But there's also the question of what bear arms means.
And actually in the eighteenth century, at the time that
that amendment was written, there's a good argument that bear

(12:13):
arms meant to serve in the military.

Speaker 2 (12:17):
So a lot of linguistics, you have to pay attention
to the evolution of meaning and make sure that the meaning.

Speaker 3 (12:26):
How words are used, not just look them up in
the dictionary and look at how they're actually used.

Speaker 2 (12:31):
Well, if you had taught at my university, I
definitely would have taken your course, more than one probably.
We have Neil in Watertown who wants to join in.

Speaker 4 (12:41):
Hi Bradley, hello Professor. The reason I called
was I've heard that the first time one fell swoop
was ever used, at least in print, at least the
first time that is known about, was when Macduff
said it in Macbeth, when he got the bad news
from Ross that all his children had been killed. I

(13:02):
think he said something like all my chickens in one
fell swoop. And if I may, just two other uses: earlier,
Lady Macbeth is trying to psych herself up to be evil,
and she says, let no compunctious visitings of nature shake
my fell purpose. And in Sonnet seventy-four, it's but be

(13:24):
contented when that fell arrest without all bail shall carry
me away. So it's like a personified policeman comes and
takes you away, meaning death. And it's a cousin
of felon, I think. It comes from.

Speaker 1 (13:38):
The same place, really? Fell and felon?

Speaker 4 (13:42):
Well, I haven't looked it up in a while, but
that's in my head. I apologize if I'm wrong.

Speaker 1 (13:45):
So fell means cruel, terrible, and fierce.

Speaker 2 (13:51):
So probably a felon is a person who has committed
a particularly fearsome crime, I guess. Neil,
you bring something up. I believe the Professor will
concur that so many idioms and phrases that we
use come from Shakespeare. That was really the mother lode, right?

Speaker 1 (14:13):
Nothing on?

Speaker 4 (14:14):
Ye, the word useful? I think those so, Professor.

Speaker 1 (14:18):
Thanks a lot, Neil, appreciate it. How about that? Is
it true that Shakespeare

Speaker 2 (14:24):
is responsible for many of the little phrases we
use and don't even realize it?

Speaker 3 (14:33):
We believe there are more phrases with an origin in
Shakespeare than from any other English-language writer. Of course,
there are plenty of phrases that were not first used
by a famous writer, or were first used by some
other famous writer. But still, Shakespeare's definitely up there.

Speaker 1 (14:51):
What are some of the others Shakespeare might have given us?

Speaker 3 (14:58):
Gee, that's not something I can immediately bring to mind.
But I bet that if I asked Google for
phrases that originate in Shakespeare, we'd get a long list.

Speaker 2 (15:15):
Okay. So let's get back to the phrase eggcorn.
And there are related, for lack of a
better word, things: there are types of phrases related
to eggcorn that have different titles. We talked about
those earlier. Mondegreen is one of them.
Can you explain what that is and how it's different?

Speaker 3 (15:38):
Mondegreen. Well, that arose from an old Scottish ballad
that talked about the fact that they killed the Earl
of Moray and laid him on the green. But a
woman hearing that song sung interpreted it as they killed

(16:00):
the Earl of Moray and Lady Mondegreen.

Speaker 2 (16:05):
So instead of laid him on the green, it
was heard as Lady Mondegreen.

Speaker 3 (16:11):
Right, and so she wrote about that. Since
that kind of mishearing of lyrics is extremely common,
it became the way of talking about examples of that
kind in general: examples of the kind are mondegreens.

Speaker 2 (16:28):
What are the policies of dictionaries about taking these misheard
phrases and making them into official words?

Speaker 1 (16:36):
Does that happen?

Speaker 3 (16:39):
Well, there are many words that everybody uses that started
out as a mishearing of this kind. One
famous one is Jerusalem artichoke, which is not all that
well known, but probably a lot of people around the

(16:59):
Boston area have eaten Jerusalem artichokes. It's kind of a
root vegetable; it's native to North America and
easy to grow. It's the tubers
or roots of a sunflower-like plant. It's kind of
a pretty plant. But because it's a sunflower-like plant,

(17:20):
it was originally called by the Italian word girasole, which means turned
to the sun in Italian. But English speakers misheard that
as Jerusalem. It has nothing to do with Jerusalem, it was
never even seen in Jerusalem and so on, but
they thought it was Jerusalem.

Speaker 1 (17:38):
So there's there are a lot.

Speaker 2 (17:40):
Of musical manda greens, and I want to just share
they're funny. So here are some classic examples from Cretan's
Clear Clear Water Revival CCR.

Speaker 1 (17:55):
Let's go with CCR.

Speaker 2 (17:57):
There's the song Bad Moon Rising, and some folks think,
or thought: instead of singing there's a
bad moon on the rise, they were humming along as
they were vacuuming, there's a bathroom on the right, is
what they thought.

Speaker 3 (18:14):
This, Yes, there's a bathroom on the right.

Speaker 2 (18:19):
And now anytime anybody hears that CCR song, they will
be saying that. Okay, we talked about the Jimi Hendrix one,
and that's completely understandable, because it certainly sounds
exactly like it. They think it's excuse me while I

(18:39):
kiss this guy, not excuse me while I kiss the sky.
This is an interesting one: the Elton John song Tiny Dancer.
Instead of hold me closer, tiny dancer, some folks,
and I wish I had some numbers on how many folks,
hear hold me

(19:03):
closer, Tony Danza. There's something really inherently funny about these.
Hold me closer, Tony Danza.

Speaker 1 (19:14):
Okay.

Speaker 2 (19:15):
Next we have a Beatles song, and this is almost
too weird to believe: from Lucy in
the Sky with Diamonds, instead of the girl with kaleidoscope eyes,

Speaker 1 (19:33):
Some, and again, I'd like to see the numbers.

Speaker 2 (19:36):
some think it said the girl with colitis goes by
instead of the girl with kaleidoscope eyes. And Lord knows
I don't want to laugh at that, but it is interesting.

Speaker 1 (19:49):
So there are.

Speaker 3 (19:50):
Those, yes. My favorite
is in the Taylor Swift song: the long list of
ex-lovers heard as all the lonely Starbucks lovers.

Speaker 2 (20:02):
Yes, that's listed in the modern pop examples. Let's see,
these are modern; I hope I get these right. So
in the Pearl Jam song Jeremy, instead of
Jeremy spoke in class today, it's Jeremy smoking grass today.

Speaker 1 (20:23):
That's all I can say. And you know, there is a.

Speaker 2 (20:27):
There's currently a trend in music production, at least in some
genres of music production, to bury the vocals a little
bit more than before. It used to be that the
music was just kind of this little ice skating rink
for the vocals to skate around on, very clearly distinct,

(20:49):
but now many times the vocals are treated more as instruments.
They do have meaning, but a lot of the essence
of the vocal is more instrument-like than the meaning
of the words, so it gets easier to make mistakes.
Let's do another one here, and we do have

(21:10):
David in San Francisco; we'll get to you after the break.
From Queen, We Will Rock You: instead of kicking your
can all over the place, it's kicking your cat all
over the place. How can you go through life thinking that?

Speaker 1 (21:24):
Anyone? Come on? And the fun thing is what happens someday.

Speaker 2 (21:30):
You're in a car with your friend and you're singing
along to the radio, and you sing kicking your
cat all over the place, and the friend looks at you,
like, what? And you say, yeah, kicking your can,
kicking your cat all over the place. See, you're super embarrassed.

(21:52):
I can't think of any of these that I have done,
but I'm sure there are some. Let's see another,
a more recent one: instead of she's an easy lover,

Speaker 1 (22:07):
And I have my doubts.

Speaker 2 (22:08):
I don't know if I believe this: she's an easy lama. No,
I mean, come on, she's an easy lama? I don't
know if he sings.

Speaker 3 (22:16):
It does not make sense, but you never know.

Speaker 2 (22:21):
And here's, I'm reading these now and I don't know
what's coming up; I did not proofread these. It's kind
of dangerous. America, America, okay: for America, America, God shed
his grace on thee, from America the Beautiful, some folks
think it's America, America, God is Chef Boyardee. Whether that's

(22:50):
true or not, whether a significant section of the population
thought that, it is kind of funny. One more
before the break. This is from Three Dog Night, Joy
to the World. The line is joy to the fishes
in the deep blue sea, joy to you and me.

(23:11):
What they heard was joy to the visions.

Speaker 1 (23:15):
That people see. That's not very funny, but there you go.

Speaker 3 (23:19):
So, uh, it's a reasonable misunderstanding.

Speaker 1 (23:23):
I love those. It'd be fun to make a song
out of only those.

Speaker 2 (23:29):
So we have, actually, David in San Francisco who wants to
speak with us, and perhaps someone else after this break.
Thanks for being with us, Professor; more in a
moment on WBZ.

Speaker 5 (23:41):
It's Night Side with.

Speaker 1 (23:45):
Boston's news Radio. That's correct.

Speaker 2 (23:48):
We are talking linguistics. I guess we're having fun with words;
that's a better way to say it. I'm an entertainer.
I want to have fun and I like words, and
when I can meld the two, I like to
do it. Professor, let's talk to
David in San Francisco. David in San Francisco, how do
(24:08):
you do?

Speaker 3 (24:09):
David?

Speaker 1 (24:09):
Say hello to Professor Mark.

Speaker 6 (24:13):
Bradley, Professor, let me ask you: do you
know where the term drag, as in drag queen, comes from?

Speaker 3 (24:24):
The word comes from? Yes?

Speaker 2 (24:27):
The question is where does the word drag from drag
queen come from. That is the question posed by David
from San Francisco.

Speaker 6 (24:35):
Yeah, I told you, Bradley.

Speaker 3 (24:38):
I know.

Speaker 2 (24:39):
Let's see if the professor can answer. So it's just
a guess, nothing fancy. You've just gotta make
a wild guess. It's not academic, and
there's no rhyme or reason.

Speaker 1 (24:54):
It's just a guess.

Speaker 3 (24:57):
Well, I bet that the Internet can
tell us. But you can tell us. David, tell the professor.

Speaker 6 (25:12):
Professor, it comes from Shakespeare. Back when Shakespeare was writing his stories,
all the actors were men. Next to a character in the story,
he would write the word drag, and the
drag was dressed as a girl.

Speaker 1 (25:32):
So there you go.

Speaker 3 (25:34):
It's certainly, that's certainly what it means. I don't recall
having seen it from that long ago, but you might
very well be right.

Speaker 6 (25:50):
I've heard that studying history. I learned that.

Speaker 2 (25:56):
Well, there you go, you win there, David.
That's good; thank you very much. Andrew in Worcester, hello,
you're on WBZ with Professor Mark Liberman.

Speaker 5 (26:06):
Hi, Bradley. I used to talk to you a while back.
I lived in Spencer and I'm the Trader Joe's guy.

Speaker 1 (26:17):
Oh wow, how are you?

Speaker 5 (26:19):
How are you?

Speaker 1 (26:19):
I'm great Trader Joe's guy.

Speaker 5 (26:24):
The one that always gets me is quinoa, keen-wah.
Customers come in and say keno or kee-no-ah instead of keen-wah.

Speaker 1 (26:35):
So people don't know how to say keen-wah.

Speaker 4 (26:39):
Yeah, that's interesting as well.

Speaker 1 (26:43):
And how have you been,

Speaker 3 (26:44):
Andrew?

Speaker 1 (26:45):
You're still with Trader Joe's?

Speaker 5 (26:47):
Yeah, I'm still at Trader Joe's, twenty years now. Wow.

Speaker 1 (26:51):
Good for you and thanks for joining in with us.
That was a good one. I appreciate it.

Speaker 2 (26:55):
So, Professor, let me see. Oh, now, I
don't want to overlook your wheelhouse, Professor, and part of
your wheelhouse is how linguistics

Speaker 1 (27:09):
is integral to AI. Can you flesh that out for
us some more and make us understand it better? Right now
it's a pretty amorphous feeling I have about the connection.

Speaker 3 (27:27):
AI is a rather amorphous term. It basically means, you know, a complicated
computer program that answers what seem like hard questions. But
one of the most important particulars in that area
is what are called large language models.

Speaker 1 (27:47):
What does it mean?

Speaker 3 (27:49):
Well, let's leave the large part out for the moment
and get on to language model. The very first language
model was developed by Alan Turing in the late nineteen
thirties as part of the effort to do cryptanalysis on
the Enigma encryption machine that the Germans used. And why

(28:13):
do you need a language model? Because there was an
unreasonably, astronomically large number of possible settings of
the encryption machine. But they had a clever way of
using patterns in the ciphertext to cut that number down
to a few thousand. And then, in principle, you could

(28:37):
try each of the thousand possible settings and see whether
the result of producing clear text from the ciphertext made
any sense. But that would take a long time if
you had a human being do it, looking at each
of the outputs, most of which would be gibberish. And
so Turing invented a model of German letter sequences, and

(29:03):
that model would assign a probability to a sequence
of letters, as in how likely is it that this is German?
And so they built a machine, which was actually also
one of the world's first computers, which would run through
thousands of possible encryption keys, produce the clear text corresponding

(29:27):
to that key, and then ask, does this seem
like German? And if it seemed enough like German, then
it would spit it out and give it to a
human being to check. So, you know, that was in
thirty-nine or so, and ever since then people have
been using techniques of that kind to produce models of language,
and those were used, for example, in speech recognition. If

(29:50):
you're trying to figure out, you have some noises and
you're trying to figure out what's the text that corresponds,
you put together what you can gather from looking at
the sounds and what you can predict from asking how
likely is the text that the
speech-to-text system is producing. And that again has been

(30:14):
going on for many decades. And then at a certain
point it occurred to somebody, you know, we could take
these language models and use them in a generative way.
That is, rather than using them to ask, what is
the right way to interpret this sound, what is the
right way to interpret this encryption, instead let's turn it

(30:35):
the other way around and say, all right, you know,
write me a story. And you could
do it by giving a prompt, by saying, all right,
start this way and then continue. That's what
a language model does: it says, if you have a
certain pattern of letters, what are the likely ways to

(30:55):
go to get there, or to go from there?
And what does large mean? Well, because now we have
big computers and lots of GPUs, you know, warehouses full
of computers, these language models, instead of being the small
to medium-sized language models from the thirties and forties

(31:16):
and fifties and sixties and seventies, now they're large language
models with hundreds of billions, even trillions, of parameters. And
it turns out that that scale of
parameters, for modeling what is a likely sentence of English
or German or Chinese or French or whatever, produces

(31:40):
these amazing emergent properties that come up when you start
using those very large models in a generative fashion.
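The scoring idea the professor describes, asking a statistical model "how likely is it that this string is real language?", can be sketched as a toy character-bigram model. Everything below (the training text, the candidate strings) is an invented illustration of the general technique, not Turing's actual wartime method:

```python
import math
from collections import defaultdict

def train_bigram_model(text):
    """Count character bigrams in the training text and convert the
    counts to add-one-smoothed log probabilities."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    alphabet = sorted(set(text))
    model = {}
    for a in alphabet:
        total = sum(counts[a].values()) + len(alphabet)  # add-one smoothing
        model[a] = {b: math.log((counts[a][b] + 1) / total) for b in alphabet}
    return model

def score(model, text, floor=math.log(1e-6)):
    """Average log probability of a string under the model; higher
    means the string looks more like the training language."""
    pairs = list(zip(text, text[1:]))
    if not pairs:
        return floor
    return sum(model.get(a, {}).get(b, floor) for a, b in pairs) / len(pairs)

# Toy "language": a little English-like training text.
training = "the quick brown fox jumps over the lazy dog and the dog sleeps in the sun "
model = train_bigram_model(training)

# A plausible candidate should outscore letter salad, which is the kind
# of check a decryption pipeline can apply to each trial key's output.
candidates = ["the dog naps in the sun", "qjx wvk zzt gqpn hkr"]
print(max(candidates, key=lambda c: score(model, c)))
```

The same scheme scales up: replace character bigrams with longer contexts and billions of parameters and you are in the territory of the large language models discussed above.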

Speaker 1 (31:48):
Thank you for that.

Speaker 2 (31:49):
I can't quite follow all of that, but I kind of get
it; thank you. And the fact that there are many languages:
so with generative AI, there are many languages,
but there are subtle differences. There are certain words in

(32:09):
one language that don't exist in another, and a translation
might not be exactly the same.

Speaker 1 (32:16):
So does this cause a problem for AI that has
to somehow be dealt.

Speaker 5 (32:22):
With and.

Speaker 1 (32:25):
Will the end result be one language? Will
we someday, way down the line, to

Speaker 2 (32:34):
Maximize the use of artificial intelligence, only have one language
around the world.

Speaker 3 (32:42):
Well, I don't think that's very likely. Of the seven
or eight thousand existing languages in the world, probably at
least half are what's called endangered, in the
sense that there are relatively few, if any, kids that
are learning that language and using it at home. But

(33:02):
there are still thousands of languages and language varieties that
are being learned and used in the home. And even
when you have a situation where there's a dominant language
over a large area that everybody is motivated to learn
and in many cases forced to learn that doesn't actually

(33:24):
prevent people from speaking other languages at home, so to speak.
And so I think it's very unlikely that we're going
to wind up with one single language.

Speaker 1 (33:34):
But go ahead.

Speaker 3 (33:36):
What I was going to break in and say is
that one of the first practical applications of language models
was in machine translation. That was
used in systems that had a model
of English and a model of French, or a model
of English and a model of German, or a model

(33:58):
of English and a model of Chinese or whatever, and
would attempt to construct a system which would figure out
what pattern of letters in language X corresponds best to
a given input pattern of letters in language Y. And

(34:21):
that, you know, worked not very well in the nineteen seventies,
a little better in the nineteen eighties, a little
better in the nineteen nineties, and by now, for languages
that have lots and lots of training material, although it's
far from perfect, it works pretty well. And, you know,
there's an old saying that goes back to a British

(34:45):
linguist in the nineteen fifties, J.R. Firth, which is, you
shall know a word by the company it keeps. And
so rather than knowing what an English
word means, or what a Chinese word means, or what
a Spanish word means, by looking it up in a dictionary, instead,

(35:06):
what they do is they ask, well, okay, now we
have a few billion documents in English and this word
occurs hundreds of thousands or millions of times. What is
in its neighborhood? What is the company it keeps?

Speaker 1 (35:23):
The context?

Speaker 3 (35:23):
Yeah? So how does it get used?
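Firth's line about the company a word keeps is the seed of what's now called distributional semantics: represent each word by counts of its neighbors, and words used in similar contexts end up with similar vectors. Here is a minimal sketch of that idea, using an invented five-sentence corpus rather than the billions of documents the professor mentions:

```python
import math
from collections import Counter, defaultdict

def cooccurrence_vectors(sentences, window=2):
    """For each word, count the words that appear within `window`
    positions of it -- literally 'the company it keeps'."""
    vectors = defaultdict(Counter)
    for sent in sentences:
        words = sent.split()
        for i, w in enumerate(words):
            for j in range(max(0, i - window), min(len(words), i + window + 1)):
                if i != j:
                    vectors[w][words[j]] += 1
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[k] * v[k] for k in u if k in v)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

corpus = [
    "the cat chased the mouse",
    "the dog chased the cat",
    "the cat ate the fish",
    "the dog ate the bone",
    "parliament passed the law",
]
vecs = cooccurrence_vectors(corpus)

# 'cat' and 'dog' keep similar company (chased, ate, the), so their
# vectors are close; 'cat' and 'law' keep different company.
print(cosine(vecs["cat"], vecs["dog"]), cosine(vecs["cat"], vecs["law"]))
```

Modern systems replace raw counts with learned embeddings, but the underlying bet is the same one Firth articulated: usage patterns stand in for dictionary definitions.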

Speaker 2 (35:27):
Linguistics must be really important in deciding proper prompting
for AI, especially, again, with different languages not quite meaning the
same thing, when you want to get the exact prompt right.
Are you, or people in your discipline, involved with standardization
of prompts across different languages?

Speaker 3 (35:51):
Well, I do know that there is this field
that was extremely popular a couple of years ago, though it
is kind of declining in popularity now, called prompt engineering,
the idea being that rather than rewrite AI systems,
what you do is worry about how to
construct the prompt to a general system in order to

(36:13):
get it to do whatever you want it to do.
And it is true that some prompts work a lot
better than other prompts, and so knowing how to construct
prompts is you know, is worthwhile. But a lot of
what's going on now is more along the lines of
trying to adapt the systems to the task, the relevant

(36:39):
tasks, and to sort of fine-tune them to work
better on one thing or another, rather than to make
people learn how to phrase the prompt exactly right to
get it to do what they want.

Speaker 2 (36:52):
Thank you so much for joining us, Professor. I
really do appreciate it.

Speaker 3 (36:58):
My pleasure. Thanks for having me; it was fun.

Speaker 1 (37:00):
And we'll talk to you again soon. Thanks.

Speaker 3 (37:03):
Okay, bye bye, good luck, bye.

Speaker 2 (37:05):
Bye. And say hi to Rob
Watts for me if you see him again.

Speaker 1 (37:11):
So, uh, there you go, folks.

Speaker 2 (37:13):
We have a little bit more before the top of
the hour, and then we'll have open lines. I have
some suggestions. I do like the open lines, because that's
when people feel comfortable calling in. I noticed that
Alex in Millis did call in; haven't heard from some
of the folks.

Speaker 1 (37:32):
Glenn. I don't know.

Speaker 2 (37:33):
Glenn a lot of times likes to wait till the
end on Friday to call in. Maybe we'll hear from Glenn
and find out what's up with him. Something like
that is coming up on WBZ.

Speaker 1 (37:45):
It's Night Side with Dan Rea on WBZ, Boston's
news radio.

Speaker 2 (37:50):
Before we get to the top of the hour, I
just want to say it's open lines, and maybe throw
some things out there for you that have been
ruminating with me. One of them is the following:
should smoking marijuana in public be banned? You can't
drink in public; why is it okay to smoke in public?

(38:13):
I don't know the rule, but I'm walking, trying
to live my life, and I don't mind marijuana, and
I don't mind the smell, but I kind of do
mind the intrusiveness of the smell. Say I'm walking in
the park walking my doggie and you get this big
blast of skunky smell.

Speaker 3 (38:31):
That is uh.

Speaker 2 (38:31):
You know, we have certain freedoms.
You have the freedom to move your arm wherever you
want until it hits someone's face.

Speaker 1 (38:38):
You have the freedom to

Speaker 2 (38:42):
Blow your smoke where you want, until it's blown in
my face. So I'm gonna say, yeah, let's just
make a rule: no smoking weed in public. I know
nobody's going to enforce it, but at least let
us get it down there, so we know, oh,
that's kind of what the community

Speaker 1 (39:02):
Expects and what the law expects. And if you really.

Speaker 2 (39:05):
flout it, maybe you get a ticket or something. So
that's one thing, but it's open lines: six one seven, two five four, ten

Speaker 1 (39:11):
thirty. Easygoing, good fun, on WBZ.