Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
So already the information, the data that is going into
the model is biased, right, because they're just scraping the web, right,
And so to have a model that is less biased,
they'd have to start over.
Speaker 2 (00:13):
And nobody is starting over.
Speaker 3 (00:17):
That is poet and professor Lillian Yvonne Bertram. Lillian practices
what they call computational poetry. Their twenty nineteen collection,
Travesty Generator, was long listed for the twenty twenty
National Book Award for Poetry. Their most recent collection is
called Negative Money.
Speaker 1 (00:37):
I was like, there's nothing else I can be doing
right now except for engaging these questions of anti blackness
and the codification of it.
Speaker 2 (00:44):
It was the most important thing.
Speaker 1 (00:45):
I was like, I'm not doing anything with computation unless
it's about this intentionally digital, intentionally black computational poetics.
Speaker 2 (00:58):
And heavy handy. Yeah, so with the world.
Speaker 4 (01:01):
Take a sip of brandy and smoking.
Speaker 2 (01:03):
Guys who know what.
Speaker 1 (01:04):
The plan is?
Speaker 2 (01:04):
O be kama Latin. You know when to see the
stand me. My name is George M. Johnson.
Speaker 3 (01:10):
I am the New York Times bestselling author of the
book All Boys Aren't Blue, which is the number one
most challenged book in the United States. This is Fighting Words,
a show where we take you to the front lines
of the culture wars with the people who are using
their words to make change and who refuse to be silenced.
Today's guest: Lillian Yvonne Bertram. Hello, everyone, I am
(01:35):
here today with our very special guest, Miss Lillian Yvonne Bertram.
Who is Lillian Yvonne Bertram.
Speaker 2 (01:44):
Oh Dear.
Speaker 1 (01:46):
First of all, I just want to say, like, I'm
so humbled to be speaking with you, and it is
such an honor. So I'm just having a fan person
moment privately right now. I guess professionally, I'm a poet,
writer, instructor, sure. I teach at the University of Maryland.
I direct the MFA in Creative Writing. I've written a
(02:06):
bunch of things, mostly poetry.
Speaker 3 (02:08):
Yes, And you talk about you said being like more
mechanically inclined. Okay, So I have a math brain, and
so it's like I am very analytical.
Speaker 2 (02:19):
I love data, I love all those things.
Speaker 1 (02:21):
Oh that's so great. I mean I do not have
a math brain. Let me be clearer. But I struggled
and struggle with math, and so I probably at some
point thought that like being a creative would be my way.
Speaker 2 (02:32):
Out of that.
Speaker 1 (02:33):
Right, no one's gonna come at me with math if
I'm doing poetry and lo and behold, I wind up
doing like computational poetry and poetics. So I think, like mechanically,
I am interested in how things work. It's like, there's
got to be a way to multiply words or divide
(02:54):
them or do a kind of permutation with them in
some kind of way. There has to be a way
to represent that linguistically. And I'm not saying that I
succeeded in any of that, but that's something that from
my very first book that I was interested in. So
I guess that would be a way in which, like
the mechanical brain is always trying to kind of like
(03:16):
unite very disparate concepts or apply something to language to
see what could happen.
Speaker 3 (03:22):
Like, Okay, sometimes, like I guess, my art runs up
against facts or like against uh, yes, things that are
like permanent, right: two plus two is always going to
equal four, right, like, so it's like that's a permanent thing.
Speaker 2 (03:33):
There's no way around it. But there are ways you
can also get to four.
Speaker 3 (03:38):
Like you say, you can divide, or you could do this,
or you could you know, and I think, yeah, Like,
that's what I hear when you say that. Like, the
computation, when you say computational poetry, I think it's computational poetics.
Speaker 2 (03:51):
Yeah, what does that mean?
Speaker 3 (03:52):
Because I don't think I've ever heard anyone put computational
in front of poetry.
Speaker 2 (03:56):
Yeah. So the way I describe.
Speaker 1 (03:59):
It, which may not be how someone else describes it,
is the application of computational processes to poetry and poetics.
So for me, that has taken this shape of again
attempted permutations of language, right, either using a machine process
to do that, which can take a variety
(04:21):
of forms. One of them is like writing a computer
program that does that to language that exhausts all the
permutations of a sentence or a poem or something like that,
writing computer programs that compose a poem or a text
out of textual material. It can go up to using
(04:42):
things like language models. There are lots of different ways
that you can apply what ultimately are mathematical processes right
to how language is generated, arranged, organized.
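To make that concrete, here is a minimal sketch of the kind of permutation program described above, offered as a hypothetical illustration in Python rather than Bertram's actual code: it exhausts the orderings of the words in a short line.

```python
# Minimal sketch of a word-permutation program (a hypothetical illustration,
# not Bertram's actual code): exhaust every ordering of the words in a line.
from itertools import permutations, islice

line = "there has to be a way to multiply words"
words = line.split()

# 9 words -> 9! = 362,880 possible orderings; show only the first ten here.
for ordering in islice(permutations(words), 10):
    print(" ".join(ordering))
```

Every printed line is one permutation of the original sentence, the kind of mechanical operation a person could do by hand, only far more slowly.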
Speaker 2 (05:00):
Power.
Speaker 3 (05:00):
How far does that teeter into, like, the hot topic of
the country, AI, artificial intelligence. We've got to get
into it, as someone who... we've got to get into it.
I know. I just recently got into the Writers Guild
for film and television, and when the strike was happening, that
was one of the largest concerns was, like, what happens
(05:22):
if artificial intelligence is writing the script and doesn't have
a heart and doesn't have the soul of an actual writer,
but costs a lot less than an actual writer, right, and
so does this teeter into that place of artificial intelligence?
Speaker 1 (05:37):
Yes, it does. So I...
Speaker 3 (05:40):
I don't think all AI is bad. For many things, yeah,
there are many reasons to use it, and then yeah,
there are sinister reasons.
Speaker 2 (05:50):
Oh, for sure reasons.
Speaker 1 (05:54):
And that's sort of where it goes to. Like,
one needn't use the tool to unemploy people, right, So
that's a choice that a studio is making, and that's
a choice that they do not have to make.
Speaker 2 (06:10):
There's a difference.
Speaker 1 (06:11):
Between the blanket term AI and what we know as
large language models, which you could debate those terms. Right,
do they actually model language? Are they actually models? What
do they actually do? So what I do with computational
poetics is what I have done, what people have called
AI when it's not. It is, like, writing a program
(06:33):
using a computational process, one that you could actually do
by hand right but would take far longer, and sort
of applying it to text. But it has nothing to
do with what we call artificial intelligence, which has a
foundation in machine learning. You can do a lot of
computational poetics that have nothing to do with machine learning.
And the anthology that my colleague and I just put
(06:53):
out, called Output: An Anthology of Computer-Generated Text, 1953-2023,
the majority of that is not machine learning. So there's, like, a very long history
of computational text generation and computational poetics.
Speaker 2 (07:06):
That has nothing to do with it.
Speaker 1 (07:08):
We wouldn't have machine learning without some of that earlier work.
But machine learning and like generated text in literature doesn't
really come on the scene as hugely as it is
until maybe like twenty twenty, twenty twenty one. Although I
was involved with large language models back in twenty seventeen
twenty eighteen, but they weren't publicly available until ChatGPT all
(07:28):
of a sudden just burst out, like out of Zeus's head,
and there was... So you mentioned, like, the writers: does
what is written have heart or soul? I would argue,
I would push back even on the term written, because
what a language model does is it applies a probability
across like a distribution and predicts the next word.
Speaker 2 (07:52):
Right.
Speaker 1 (07:52):
So it's a very vast and competent prediction model. That
doesn't mean that it is quote unquote right. So, like
if you have the letter Q and it's looked
at a billion trillion tokens, right, there's a high probability
that the letter that comes after.
Speaker 2 (08:11):
Q is U, right, okay, right. It's not going
Speaker 1 (08:14):
to be K, it's probably not going to be M. So,
and so that's... so what it gets really good at
is predicting that over long strings of tokens, more than
just the words. That's why I differentiate it from, like, text writing.
This is how I understand the models to work and
encourage everyone to, like, really figure out as much
(08:35):
as we can about how the model works, which is
very difficult because the models that we have access to,
these corporate models like open AI and Google and so
on and so forth, we cannot know how they work
because they are black boxed.
Speaker 2 (08:47):
Right.
Speaker 1 (08:48):
Chances are they don't even know fully how they work
because there are so many embedded layers, right, you cannot
see inside or visualize how the model is actually working.
Speaker 2 (08:59):
But what we do
Speaker 1 (09:00):
Know, right is that words are separating into tokens assigned values,
and then the analysis has run like the numeric distance
between words.
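To make the Q-and-U example concrete, here is a toy sketch of that statistical idea in Python, a character-level counting model offered purely as an illustration and not how the corporate models are actually built:

```python
# Toy next-character predictor (an illustration of the statistical idea above,
# not how large language models are actually implemented): count which character
# tends to follow which, then predict the most probable continuation.
from collections import Counter, defaultdict

corpus = "the queen asked a quick question about quality and quiet"

# Tally how often each character is followed by each other character.
follows = defaultdict(Counter)
for current_char, next_char in zip(corpus, corpus[1:]):
    follows[current_char][next_char] += 1

def predict_next(char):
    """Return the most frequent next character and its estimated probability."""
    counts = follows[char]
    best, n = counts.most_common(1)[0]
    return best, n / sum(counts.values())

print(predict_next("q"))  # ('u', 1.0): in this tiny corpus, 'q' is always followed by 'u'
```

Real models do the same kind of thing over billions of tokens and many layers of learned weights, which is why the prediction is so much more competent, and so much harder to inspect.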
Speaker 3 (09:09):
Right. Okay, I actually understood that. That was
a good lesson for me, and I was like, okay, wait,
I get it, okay, like predicted.
Speaker 1 (09:19):
Yeah, so I think like when we're talking about writing,
by and large, what we're talking about are language models.
My colleague Nick Montfort will say they don't model language
because language is.
Speaker 2 (09:32):
More than just text.
Speaker 1 (09:34):
Language is gesture, it can be spoken speech, it's facial expressions.
There's a lot more that goes into language. Sometimes it's
not verbal even there's a lot more that goes into
language than just text.
Speaker 3 (09:47):
Yeah, it can definitely be nonverbal because as a person
who is in the black community, we can just get looks
exactly and have a whole conversation without exactly words.
Speaker 1 (10:01):
And that's still... so there's a question, like, what
is it actually modeling?
Speaker 5 (10:05):
Right?
Speaker 2 (10:05):
Yeah.
Speaker 3 (10:24):
This week's Queer Artist Spotlight is Weekend by Sierra Sean.
Speaker 2 (10:28):
Here's a short sample.
Speaker 6 (10:38):
You'll go away in the top one and Big and
Blue Sea that don't worry you.
Speaker 2 (10:57):
You can listen to the whole track at the end
of the episode.
Speaker 3 (11:01):
And now back to my conversation with Lillian Yvonne Bertram.
I asked her about African American Vernacular English, AAVE,
for sure, and how does this play into especially because,
like you said, language could be so many things.
Speaker 2 (11:19):
But you know, with.
Speaker 3 (11:20):
The I don't want to say, we're not going to
say the acceptance. We're gonna say the acknowledgment of
AAVE as language. And oftentimes these systems aren't built by
people who use that particular type of language. And so
do these systems, even with them being, like, prediction models,
(11:42):
is there something like inherently biased based on.
Speaker 2 (11:47):
Like how black folks use language versus how white people.
Usually... we'll just say what it is. Yeah, it is
Speaker 3 (11:57):
a primarily male-dominated, white-dominated space, oftentimes, that
it could be inherently racist in the ways in which
it works.
Speaker 2 (12:06):
Whether it's.
Speaker 3 (12:08):
The ever struggle I have when I'm traveling and trying
to wash my hands in the airport and the stupid
sensor will not recognize my hands but recognizes white hands
much easier. Right, Is that something that has been run
into with using these language models, especially in terms of
how just black folks use language and the do was
(12:30):
language, I think, which we'll give the name Black Queer Lingo,
which is being adopted more and more quickly by the day.
Speaker 1 (12:39):
The short answer is yes, right, short answer is yes. I think
in a lot of ways, language models are an exercise
in bias, and they do that, and that's something
that they do incredibly well. And that's because you can
make a language model on a small amount of text,
but it might not be that.
Speaker 2 (13:00):
great in terms of doing what you need it to do.
Speaker 1 (13:03):
And I want to emphasize that it's not like open
AI or Google or whoever was like, you know what,
let's make this like really cool thing that's going to
be able to approximate AAVE and all these other kinds
of things, and artists are going to use it and
they're going to love it.
Speaker 2 (13:18):
I'm pretty sure that's not what they went into this for.
Speaker 1 (13:20):
Right. They probably went into this for: if we can
make a competent language model, then chatbots will be better.
We can sell the chatbot, people can subscribe to it,
businesses will use it. They'll be able to search their
archives and records better, and provide consumers with more accurate information.
(13:40):
So that is a consumer facing, profit driven tool. So
the text that is generated has to be of a
certain type. It can't be ambiguous, it can't be creative,
it has to be a certain way. So going into
language model use like one has to approach it with
(14:02):
that awareness and knowledge that this is a biased system.
Speaker 2 (14:05):
It is not objective.
Speaker 1 (14:07):
If you're using a language model that you have tweaked
on your own and you're using it on the back end,
you're using it via code and not using a service
because it is a service, right. So then you have
more control over what can happen within it, but you
still don't have access to all of the parameters that
(14:27):
you can tweak. So that's the long answer to yes,
that models are biased. They were always going to be
biased, okay, right, and you have to, you know,
were they built for AAVE?
Speaker 5 (14:40):
No?
Speaker 2 (14:40):
Can you build one that is? Yes. Would it be,
like... it would be all
Speaker 1 (14:47):
over the place. Would it, you know, ring of mimicry,
stereotype, parody? Probably.
Speaker 2 (14:53):
Yeah, right, in terms of, would you want that? So,
switching gears a little bit to your poetry. Yeah?
Speaker 3 (15:11):
You authored a book in twenty nineteen called Travesty Generator. Yes,
can you unpack, like, what the multiple meanings of that
title are?
Speaker 1 (15:21):
Well, one is that Travesty Generator was a program and
it tweaked language in a way that, they said, turned
language into a travesty, right. It's just, like, really distorted
and manipulated language. And I like, I loved that, like
Travesty Generator, and I've at the time when I was
(15:44):
writing the book too, I was like, all the stuff
that was going on twenty sixteen, twenty seventeen, twenty eighteen,
George Floyd, all of that a travesty, right, just like
the way in which like the United States continually generates travesties,
like especially appropriate for this for today, right, it's and
(16:05):
it's it's kind of crazy because the book came out
a while like in poetry Land, right, this came out
a while ago, and there's still so much there's like
there's renewed interest, and it's it's because I'm just like
I could I could have written this twenty years ago.
I could have it's because the same things like keep happening,
and so it's a nod to the program itself. It's
a nod to the generation of travesties: the ongoing
(16:27):
anti blackness, police brutality, oppression, right, A travesty that is
the United States.
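For background, programs in the 'travesty' tradition typically recombined short n-grams from a source text to produce distorted but statistically plausible nonsense; the following is a minimal sketch of that general technique in Python, an assumption about the family of programs being referenced and not Bertram's specific code.

```python
# Minimal n-gram "travesty" text scrambler (a sketch of the general technique
# associated with travesty programs, not Bertram's specific code).
import random

def travesty(source: str, order: int = 3, length: int = 120) -> str:
    # Map every `order`-character window to the characters seen to follow it.
    table = {}
    for i in range(len(source) - order):
        key = source[i:i + order]
        table.setdefault(key, []).append(source[i + order])

    out = source[:order]  # seed with the opening of the source text
    while len(out) < length:
        followers = table.get(out[-order:])
        if not followers:  # dead end: this window never continues in the source
            break
        out += random.choice(followers)
    return out

sample = ("the united states continually generates travesties "
          "and the travesty keeps generating the same travesties")
print(travesty(sample))
```

The output keeps the local texture of the source while scrambling its sense, which is the "distorted and manipulated language" effect described above.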
Speaker 3 (16:33):
Right, because it's almost like tragedy and travesty are like
like a tragedy happens and you're like, that's tragic, but
it doesn't always feel like it's self-imposed. Whereas travesty, it's
a choice that is made by, like you said, the
government or a person in power to inflict this particular
(16:57):
type of travesty on someone else. And like you said,
it's generated. It's like this didn't have to happen, right,
Like we're all right here, right.
Speaker 2 (17:06):
Right exactly.
Speaker 1 (17:07):
I mean a lot of what the book focuses on
are these, like, really absurd: the travesty of a miscarriage
of justice, this absurd oppression and anti blackness. And it's
written using code because of the way so much of it
is codified, right, computational determinisms like abound when it comes
(17:30):
to the oppression of black people and other minorities. Right,
it's codified in our society.
Speaker 3 (17:36):
It's inherent right code, right, Okay, Yes, yes, it is
embedded in it.
Speaker 1 (17:47):
It's embedded in the language exactly. It's embedded in the language.
It's embedded in the processes, the ones that we see
and that we don't see. And so there's a poem
that includes actually like a lot of code and it's
been manipulated, like coded language. It uses the codes from
the Underground Railroad, right. It does a mashup of a
(18:07):
lot of different things, contemporary moments like refusing to put
Harriet Tubman on the dollar bill, the connection of that
to Emmett Till, money, Mississippi cotton that goes into the
making of bills, all of these things coming together to
try and reveal again, I guess the ways in which
these things are codified, right, and that they operate both
(18:29):
beneath the surface and, like, in plain sight.
Speaker 3 (18:32):
Wow, that's going to be the light bulb moment I
think for a lot of people because that it's like
it's not that I didn't know that, but it's like
when you break it down into this is it is
all code, right, Like it's codified, which makes it code,
which makes it this. And I'm big on history. I
wrote a book about the Harlem Renaissance, called Flamboyants, and
(18:54):
I even wrote poetry for the first time, which everybody,
everybody really likes.
Speaker 2 (18:59):
The poem.
Speaker 3 (18:59):
So I was like, I would never do a poetry
book because I have too much respect for my friends Jericho,
and I just have too much respect for the poets
in my life and the work that they do.
Speaker 2 (19:14):
But why was poetry the medium that you wanted to use?
Speaker 3 (19:18):
And what do you think is the importance of keeping
poetry as a medium alive?
Speaker 1 (19:26):
Well, I mean, poetry is important because it's what I do,
and so it's what I do and I could not
think of it appearing in any other way, partly because
what attracts me to poetry, and what is always attracted
me to poetry is that it is such an expansive space.
Speaker 2 (19:46):
It allows for so much.
Speaker 1 (19:48):
And it's not that other forms of writing don't, but
I feel like in the universe of poetry there's room
for just about everything.
Speaker 2 (19:57):
And I had gotten into.
Speaker 1 (19:59):
computational poetry because it was very thrilling to me, where I
was like, Okay, finally I now have access to a
method by which I can actually apply the things I
was thinking about language to the language. And poetry for
me is about language in new and surprising ways, and
so it had to be poetry. Now, to be clear,
(20:20):
One I didn't expect the book to be published.
Speaker 2 (20:22):
Two.
Speaker 1 (20:22):
I didn't expect anybody to read it, you know, I
didn't expect it to gain any traction, and so it
was wild that it did. But it was also for me, Okay,
I want to show for myself, at the very least,
that there is a way that computation can be a
(20:43):
poetic method, but one that did more than demonstrate itself right,
one in which, like the outputs could be urgent and
relevant and it could engage.
Speaker 2 (20:55):
These issues in these questions.
Speaker 1 (20:57):
Because at the time, I was like, if I'm not
engaging these questions, like, there's nothing else I can be
doing right now except for this, except for engaging these
questions of anti blackness and the codification of it. It
was the most important thing, and that has continued to
be my impetus for using any kind of like computation is,
(21:18):
you know, what I've called an intentionally digital, intentionally black
computational poetics.
Speaker 3 (21:50):
Now back to my conversation with poet Lillian Yvonne Bertram,
what's your approach been to I guess sharing this wisdom
with your students and how you try to craft their voices.
Speaker 2 (22:03):
That's a great question.
Speaker 1 (22:04):
I think like the voice is the voice is there, right,
maybe it's a matter of amplifying it or understanding like
what it is, right, what makes it your voice? I mean,
if I think about like the poems that I write,
they're very much like sometimes like how I speak and
how people know me, so like they sound like me
(22:28):
in that way, because, I mean, they can't sound otherwise, right.
I've long abandoned, like, being able to write like other writers
that I admire. So I'm like, that's just not what
I have to contribute to the world, right, I have
something else, So I'll just go with that. But you know,
I teach my students coding, and I teach them computation,
and early on when I was teaching this kind of
(22:50):
computation in digital poetics, I... you know, students were like,
it's just going to sound like the computer if I
use this method. And the first challenge to that is, well,
what do you think a computer sounds?
Speaker 2 (23:01):
Like?
Speaker 1 (23:01):
How do we identify computer voice? What makes it different
from another voice?
Speaker 2 (23:06):
Right?
Speaker 1 (23:06):
So then we can start to pinpoint like, well, what
makes a voice?
Speaker 5 (23:10):
Right?
Speaker 1 (23:11):
And then through actually doing the work and engaging with
it, it's always fascinating the way their computational work sounds exactly
like them, and it doesn't take away the voice, right,
you do have a lot more agency than maybe people
think from the outset, and so I think that my
role really is to encourage the student's voice, especially with
(23:34):
beginning writers. And this is maybe where I'm mechanically minded,
right, from, like, what about the words and the structure?
Let's, like, literally look at grammar. Let's look at diction,
Let's look at syntax, Let's look at nouns, adjectives, Let's
look at the way a sentence is put together. And
actually working with computation helps it immensely because students do
(23:55):
have to think about how is the sentence put together?
Like how do I put together a sentence? And then
how can I do that a different way? Right? So
looking actually at the mechanics of language in the line,
and it's not like accidental that you sound the way
that you sound, right, we can look at what literally
(24:16):
makes that up, right, and then we can explore that further,
And that I think is helpful for students because then
they can push those boundaries and go a little further
and you know, and expand what they're able to do
with language.
Speaker 3 (24:41):
We always like to close out the show with my
favorite segment, which is called George is Tired. It was
a column I had many many years ago where every
week I would just write about what I was tired
of in the world. It could be anything big going
on politically, or anything small going on just in my house.
I think this week is going to be small because
it's already enough harshness in the world. George is Tired
(25:03):
this week of sixty dollars sweatpants and sixty dollars sweatshirts.
I don't know who thought that that was the move,
but when I am looking for a tracksuit or a sweatsuit,
I am not going in there with the intentions of
ever spending over one hundred dollars for those two pieces together.
So I really do need whoever is in these industries
that are giving us these matching set sweatsuits to figure
(25:25):
it out, because I don't mind paying that much for
denim sometimes, but I am not going to continue to
pay over one hundred dollars for sweatsuits that I need
to wear in the airport.
Speaker 2 (25:35):
Is there anything that you are tired of this week?
Speaker 4 (25:37):
Yeah?
Speaker 1 (25:38):
I mean it's like I guess, like similarly to the
sweatsuit I'm tired of everything just like costing a thousand dollars. Okay, like, oh,
tire busted, thousand dollars; your cat is sick,
Speaker 2 (25:51):
Thousand dollars. CVS run, right.
Speaker 1 (25:54):
You want to put some plants in your garden, you
need some soil stuff like thousand dollars. Yeah, everything, every
thousand dollars. Everything you need costs a thousand dollars.
I'm so tired of that.
Speaker 2 (26:06):
Yeah. Yeah, I was thinking about that the other day.
Speaker 3 (26:08):
I was like, money just does not go as far
as like who no, like no, yeah, it's.
Speaker 1 (26:14):
Like little-ass things, things you wouldn't... You're just
like, damn, that's a thousand dollars.
Speaker 3 (26:20):
I want to thank you for coming on fighting words today.
This has been one of those episodes where it's like
I'm going to now leave here and have a lot
to think about and process, because I think talking about
code and language it probably activated the math and the
creative at the same time, and now they're going to
be fighting all day. So I appreciate that because that's
usually when my best work comes out.
Speaker 2 (26:42):
But thank you.
Speaker 3 (26:43):
Thank you so much for your words, for your poetry,
and just helping us to learn language a little bit better.
Speaker 2 (26:50):
Thank you.
Speaker 1 (26:50):
This has been fantastic And again it's just been such
an honor and fan personing over here to get to
talk to you.
Speaker 2 (26:58):
I'm just wonderful.
Speaker 3 (26:59):
Yes, today's quote is actually not just a quote, it's
the whole poem from our guest Lillian Yvonne Bertram.
Speaker 1 (27:14):
Okay, so I'm going to read a poem called it
took me all those years to remember who I was
and to remember why. And this poem came from the
early days of language model use, pre-ChatGPT, using
it on the back end and prompting the model with
(27:34):
questions around like police brutality and anti blackness and so
it, it became interesting, because there are definitely ways in
which my voice merged with the outputs in interesting ways
as we had this 'conversation,' in quotes, because it's
not really a conversation, but.
Speaker 2 (27:51):
This was written in concert with a GPT.
Speaker 1 (27:55):
It took me all those years to remember who I
was and to remember why. To me, I was still
a woman. But to them, me and all my friends
were in prison.
Speaker 2 (28:08):
There were no.
Speaker 1 (28:08):
Cars or buses or street lights, or women's liberation or
sex or politics or the post office, But there were police,
and we were all black, which meant the same to
the police as the prisoners did, prisoners who had been
there before that night and that day, and before that
and that and that. The prisoners wanted to talk about
the women who had come before in search of a
(28:31):
more just world. They would talk about how whoever had
been there before had to deal with the same horror,
and that in the end they would be told that
the white police officers could never be blamed for what
they did, even if we saw the black man on
his bicycle being gunned down from the back, even if
we were all killed there by white police. The night
(28:52):
before that night we sung together like brothers and sisters.
Do they knew we were dead, us women and the
countless others who came out to the protest. They knew
we were dead. They knew we were young black women
fighting to end this way of being in the world
and earn some respect.
Speaker 2 (29:12):
They knew it.
Speaker 1 (29:13):
But to them we were merely instruments, a way for
them to make a way through all the misery they
had made. The night of the march, we thought it
was our duty to stand up for fair housing and
voting rights.
Speaker 2 (29:28):
So we made the call. We all made the call.
Speaker 1 (29:33):
We all felt our lives would be worth more if
they were our own.
Speaker 2 (29:59):
Now Here in full is Weekend by Sierra Sean.
Speaker 3 (30:03):
Thanks for listening to Fighting Words, and I hope you'll
join us for another round next week.
Speaker 6 (30:08):
You'll go away in a time one Friday Ahead, said
fifty picking Blue.
Speaker 5 (30:21):
Okay, see Clay, but I don't break you, and I
just want to ride with you, leave it all.
Speaker 2 (30:31):
Behind with you, open road, open Mi.
Speaker 6 (30:37):
Let me be a weekend to give me something to leap. Amen,
(30:57):
my baby.
Speaker 4 (30:58):
King baby, he's like.
Speaker 5 (31:02):
Everything else in the rear of you I think is
right there. We lave and of the colo. Let me
be awaken.
Speaker 4 (31:20):
A week, give me something to live, I do, Pat.
Speaker 2 (31:43):
Doesn't. Just let me be a week.
Speaker 3 (32:13):
Fighting Words is a production of iHeart Podcasts in partnership
with Best Case Studios. I'm George M. Johnson. This episode was
produced by Charlotte Morley. Executive producers are Myself and Sweaty
Fuji Guar Song with Adam Pinkess and Brick Cats for
Best Case Studios. The theme song was written and composed
by Coolevas Bambianna and myself. Original music by Colevas. This
(32:38):
episode was edited and scored by Max Michael Miller. Our
iHeart team is Ali Perry and Carl Ketel. Follow and rate
Fighting Words wherever you get your podcasts.