Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Imagine having the ability to create personalized web applications that address your individualized
(00:06):
needs. In this episode, we explore how you and students can create such tools using a vibe coding
process with free generative AI platforms.
Thanks for joining us for Tea for Teaching,
an informal discussion of innovative and effective practices in teaching and learning.
(00:30):
This podcast series is hosted by John Kane, an economist...
...and Rebecca Mushtare, a graphic designer...
...and features guests doing important research
and advocacy work to make higher education more inclusive and supportive of all learners.
(00:53):
Our guest today is Dave Ghidiu. Dave is an Associate Professor of Computing Sciences
at Finger Lakes Community College. Prior to his time at FLCC, he spent a few years
as a Senior Instructional Designer at Open SUNY, where he was a lead designer for the
OSCAR rubric software. Welcome back, Dave.
Thank you so much. It's so great to be back.
(01:16):
Today's teas are... Dave, are you drinking tea today?
I am drinking tea. As you know, I get my tea from Saratoga Tea and Honey, and I'm drinking
the Broadway Berry today.
It's got a nice name.
It is. It's the first time I've ever had it.
They have some wonderful teas there. I haven't
been back there in about a year and a half or so.
Oh, it's worth a trip. I'll be heading back there soon. So,
I'll be picking up some new ones.
Excellent.
(01:38):
And I have a wild blueberry black tea from the Republic of Tea. I don't have any teas from
Saratoga Tea and Honey this time.
It's disappointing. I just
have a Lady Grey today.
So, we've invited you here today
to discuss how you've used generative AI to teach vibe coding. To set the stage for this discussion,
(01:59):
can you first define what vibe coding is?
Yes. So earlier this year, there's a famous
computer scientist, Andrej Karpathy, and he was one of the founders of OpenAI.
He was the director of AI at Tesla, and he just made this offhanded tweet about vibe coding,
and it's essentially using natural language to make software applications. And he's using it in a
(02:20):
much grander way than I am and than I'm advocating for. And I actually don't even like the word vibe
coding. His whole tweet goes into like feeling the vibe of the software and the programming,
and getting all involved. I think it's really more of a dialogical coding experience. And I stole
that term from Curt, whom I work with. He's in the humanities department, and it's really just having
(02:41):
a conversation with AI to create software. And so you can imagine, if you want to create a simple
ToDo app, instead of writing the code yourself, you just describe the app to an AI like Claude or
ChatGPT and it would generate the code for you. Then you could refine the app by having a dialog
with the AI and asking it to add features, change the design. And I think we should do some level
(03:02):
setting here, because I just want to make it clear that this is entry level. This is for everybody.
If you can't code, that's kind of the vibe that I'm going for. I want to help you take
your ideas and turn them into code. We're not making enterprise software, we're just making small web
apps. And you can think of it as like H5P which some of you might be familiar with, or Quizlet,
(03:23):
where you can get these little interactives that go into your course in your learning management
system. And I'm not using Python or Java, we're just using HTML and other tools of web design.
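For illustration, here is a minimal sketch of the kind of vanilla HTML and JavaScript to-do app a single conversational prompt might produce; the element IDs and behavior are hypothetical, not taken from any particular AI's output.

```html
<!-- A minimal to-do app sketch: plain HTML and vanilla JavaScript, no frameworks. -->
<!-- All names here are illustrative; an AI-generated version would differ in detail. -->
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>To-Do List</title>
</head>
<body>
  <h1>To-Do List</h1>
  <label for="new-item">Add a task:</label>
  <input id="new-item" type="text">
  <button id="add-button">Add</button>
  <ul id="task-list"></ul>
  <script>
    // Add a new list item when the button is clicked.
    document.getElementById("add-button").addEventListener("click", function () {
      const input = document.getElementById("new-item");
      const text = input.value.trim();
      if (!text) return; // ignore empty entries
      const li = document.createElement("li");
      li.textContent = text;
      // Clicking a task toggles a strikethrough to mark it done.
      li.addEventListener("click", function () {
        li.style.textDecoration =
          li.style.textDecoration === "line-through" ? "" : "line-through";
      });
      document.getElementById("task-list").appendChild(li);
      input.value = "";
    });
  </script>
</body>
</html>
```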
So what you're saying then is that really no knowledge of programming
in pretty much any language is needed to engage in a vibe coding project?
Absolutely. And again, at the high end, yes, there are people out there who are paying lots
(03:45):
of money to have the really expensive tools: Cursor, Windsurf, Claude Code,
but that's not what we're talking about. You can do what we'll be talking about with the
free tier of any of these AI services.
Is vibe coding a process that students and
faculty should consider, and in what context?
I think the process can and should be abstracted.
(04:06):
So at a high altitude, vibe coding is really just prompt engineering. It's AI literacy, and we're
SUNY schools, so you know that AI literacy is part of our general education now. And really,
learning how to vibe code is not much different than using AI to generate an image or a video or
help you with any text that you're writing. The challenge is you have to imagine it in your head,
(04:28):
and then you wrangle the AI to make it come out. So you can imagine that you have content
in your learning management system, and we use Brightspace here at SUNY, wouldn't it be neat
to embed some slick interactives in the middle of the reading to check for understanding? So maybe
the learners are scrolling through reading it, and they see a little Wordle that is confined to only
(04:49):
the vocabulary from the current unit. Or maybe there's a deck of cards, not like playing cards,
but just cards with words on them, like facts, and they can swipe left or right to kind of test
their knowledge to see if it's true or false. And actually, I was listening to your podcast,
the recent one, where you had the student who was presenting their honors paper, and I got to thinking,
I was like, “Oh, that would be really cool. So if a learner's doing research, and let's say it's
(05:12):
on water quality in a conservation class, and we have a big conservation program here at FLCC,
maybe they could, instead of presenting at a conference and having charts and graphs from
Excel, what if they could have a fully interactive data model, and you can pinch and zoom and you
spread out all this data and really interact with it?” And that's possible now. And in fact,
a few years ago at FLCC, there was a learner that I was working with. She was in the
(05:34):
creative writing program, and she came to me, she's like, “Dave, I have this idea. I want to
make a campus-wide Mad Libs, and I need to have a website.” And she was going to crowdsource all
these different things. And so the users would go to the website and it would have a prompt,
and they’d type it in, and then she aggregated it all and made like this website out of it,
and it was like, this creative story, that kind of captured the vibe of FLCC. And she designed it,
(05:58):
and we worked with it, but I did most of the coding. It really wasn't effective for me to
sit down and teach her how to code for this thing. But now if someone came to me today, and was like,
“Hey, Dave, I have this idea,” I'd be like, “Okay, give me 30 minutes of your time. I'll
get you started, and then you can come back to me if you get stuck.” But that's the reality now.
It's an interesting project where someone is capturing the vibes of the
(06:18):
institution using vibe coding. So we've got that meta level going on there.
It's very meta.
So basically what you're
saying is just using one of the generative AI systems, students can generate the code and
then have it created in the form of JavaScript that could then be inserted into a website.
Yeah, so I may have lied a little bit earlier when I said you don't need to have any experience,
(06:43):
but it's very little. You don't have to understand HTML or JavaScript or CSS,
and for those of you listening who care, HTML is the content, the words, CSS is the design of it,
the colors, the size of the font, and JavaScript is the interactivity. So there's usually, like,
three different components to a website, and the AI systems, they manage all that. The challenge
is, once you design it, and it's in your AI system, where you can download it to your computer,
(07:07):
you might say to yourself, “Self, well, how do I get this into my learning management system?” So
I suggest this tech stack… if you want to sound super cool and super nerdy… the tech stack
is, I use AI to create it, and I store the code in GitHub Pages. GitHub Pages is like GitHub, except
it can render the pages, so you can actually store websites up there. And then in Brightspace,
(07:27):
or in the LMS, I embed the code that lives in GitHub. And I know that sounds complicated,
and I hope your listeners aren't tuning out right now, because I do want to just say the very first
time you do it, it might be a little hard, it might be a little confusing, but it is super easy,
and once you get started, you're golden, and you will have so much fun vibe coding,
making like all these different apps.
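To make the embedding step concrete, here is a hypothetical sketch of the kind of iframe snippet you might paste into a Brightspace (or other LMS) content page; the username, repository, and page name are placeholders, not a real app.

```html
<!-- Hypothetical embed snippet for an LMS content page. -->
<!-- Replace the placeholder URL with the GitHub Pages address of your own app. -->
<iframe
  src="https://your-username.github.io/your-repo/index.html"
  title="Vibe-coded interactive"
  width="100%"
  height="500"
  loading="lazy">
</iframe>
```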
(07:50):
And you can do that as a free account on GitHub, or do you have to pay for an account?
Free account. In this whole endeavor, I want it to cost $0 and to require minimal knowledge of technology,
and this tech stack does that. It celebrates that you don't need to know anything, and you don't need to
pay for an account. Now, I do pay for an account at
Claude, and I think Claude is the absolute best for code generation, I find Gemini 2.5 is the
(08:13):
second best. And up until today, I would have said ChatGPT is the third best, but ChatGPT 5 dropped,
and it hasn't dropped into my account yet, so I can't test it, but ChatGPT 5 is supposed to have
some pretty slick coding in there as well.
Has vibe coding been something that you've
been able to use with students to demystify programming?
It has. So it certainly has the potential, and I haven't applied it in my classes yet, but this
(08:39):
is something I intend to do in the fall. So we're at this inflection point in computer science and
in programming courses and even how we teach it. And I can imagine a JavaScript class that I teach
where on the very first day, we start coding a web app, and we do it using Claude, or pick your
favorite AI, and then we put it on the projector and like, “Okay,” and for this conversation,
let's say it's a to do list. So we come up with the features, and we come up with this prompt,
(09:02):
and we put it up there, and it generates the code. And recognize that at this point, some
of the learners might know some HTML and maybe have heard of JavaScript before, but that might be
the extent of it. But that's fine, because now you can look at the code, and you can look at it side
by side, because all these AI platforms now have that canvas on the side, and then you can start
adding features. So I'd say, “Hey, what feature do you want to add?” and a learner might say, “Oh,
(09:24):
I'd like to change the colors of the check mark”, or have the user change the check mark to an X or
a smiley face, or whatever in this to-do list. And so then we put that into the prompt, and then you
can see when the code changes. It's eerie. It like backspaces all the code. It doesn't just wipe it
out, it backspaces it character by character. So you can actually see it and use it as a learning
experience. And you can go a little bit further than that too. So if I were in this aspirational
(09:47):
example, I might have the learners say, “Oh, well, I didn't understand what happened.” And so we put
that into the prompt and we say, “Well, can you explain what you just did?” and then it can give
you a natural language response to it. So it's like having a software developer that is at your
beck and call all day, every day.
And who's willing to speak English.
(10:08):
Yes, or whatever language you want.
True!
Taking that just a little step further, do you think this might be something that might get
more people interested in learning coding, once they have this additional tool to help them get
started with this? Because a lot of students may be intimidated taking courses in computer science,
because it's like a foreign language to them.
Yeah, and I'm gonna get a lot of hate mail for
(10:32):
this, but if we think about the current state of computer science, and even in this conversation,
we talked about the breakneck pace of AI: ChatGPT 5 dropped today, Claude dropped yesterday.
OpenAI just gave every employee a $1.5 million raise. All that happened this week. Google just
dropped the study mode, Google just dropped the children's book. All these things happened in the
past week. So there's no avoiding it. You can't escape the velocity of AI. And let's talk about
(10:57):
non-majors for a second. A lot of non-majors are taking Word, Excel, and PowerPoint as
their computer science course. So imagine the institution next door, and they don't have that
as a computer science course. That's like the table stakes. Instead, you come into this vibe
coding class, maybe it's a 15-week course, even if you know nothing about coding, by the end
of it, you're going to be weaponized and making web apps, and you might understand the lingo a
(11:20):
little bit more, and it doesn't matter if you do or not, but you can make these things to help you.
And I gave the example of the conservation learner earlier, but imagine there's someone who's going
into some type of program like mindfulness. You can make the breathing apps that have a little
needle that says, inhale, exhale. So all of a sudden, these learners start thinking in this
like fourth dimension, where it's tangential to their domain, and it's no different than creating
(11:46):
artwork with AI or creating text or video.
What about things like accessibility when we're
thinking about developing these web apps?
I knew you were gonna ask that, Rebecca.
I appreciate that you knew I was gonna ask you.
Yeah, and you're the accessibility guru. So I have a generic prompt that I give, and this is the
(12:06):
secret sauce. So everyone out there take notes. There's a few different things that are kind of
what I call non-negotiables when you're making a prompt for the software. So my secret prompt, my
secret sauce, whenever I start prompting, there's three or four things that need to get in there,
and the first is to curb the default behavior of these AI systems. Most of them try and use React,
(12:29):
which is a nerdy framework that us computer nerds use. So I always say “I'd like to make
a web app. I'd like it to use vanilla JavaScript, I'd like it to be responsive, and I'd like it to
be accessible.” So that's how I start the prompt, just with those four parameters, and 95 out of 100
times what comes out is exactly that. One of the things that I think is important, Rebecca too,
(12:51):
is like, you need to test this and this is true for all AI everywhere. You are the supervisor,
like you co-create, you don't abdicate. So it's important to test that accessibility and you kind
of have to know what you're looking for. And in general, when we're talking about web apps,
you want to make sure if you have images, there's alt tags. You want to make sure that there's
the textual hierarchy with Heading1, Heading2, and all those things are almost always present
(13:13):
anyhow. And then you also have to test it for responsiveness. So test it on a mobile device,
because you never know how many of your learners are out there on mobile devices.
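As a rough illustration of what those checks look for in the generated markup, here is a small hypothetical fragment showing a responsive viewport tag, an orderly heading hierarchy, and an image with alt text; the file name and headings are placeholders.

```html
<!-- Illustrative fragment showing the accessibility basics mentioned above. -->
<meta name="viewport" content="width=device-width, initial-scale=1"> <!-- responsive layout -->
<h1>Unit 3: Water Quality</h1>          <!-- one top-level heading -->
<h2>Sampling Methods</h2>               <!-- subheadings follow in order -->
<img src="river-sample.jpg" alt="Student collecting a water sample from a riverbank">
```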
I would add in probably check it with a keyboard too.
I'm glad we're having this dialog because we're vibe podcasting right now because that's exactly
what you should be doing, is let's say that you generate this software and for this case,
let's just say it's an interactive data thing. You might want to say, “Hey, let me test this with a
(13:35):
keyboard,” and if it doesn't work, then you say to your AI of choice, “Oh, this doesn't work with
a keyboard. Here's the functionality I'd like to see.” So I think that's really important. In fact,
one of the best skills I've acquired, and I've been programming for many, many years now,
is debugging, because the bulk of the coding is done by the AI. I can spend way more time
testing it and testing those edge cases and thinking about how it's gonna be used.
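For example, a follow-up prompt about keyboard support might yield something along these lines; this is a hypothetical sketch of a clickable card made keyboard-operable, not output from any particular model.

```html
<!-- A clickable "card" made keyboard-operable: focusable, with Enter/Space handling. -->
<div id="fact-card" role="button" tabindex="0">Press Enter or Space to flip this card</div>
<script>
  const card = document.getElementById("fact-card");
  let flipped = false;
  function flipCard() {
    flipped = !flipped;
    card.textContent = flipped ? "Back of the card" : "Front of the card";
  }
  card.addEventListener("click", flipCard);
  card.addEventListener("keydown", function (event) {
    // Respond to Enter and Space, just like a native button would.
    if (event.key === "Enter" || event.key === " ") {
      event.preventDefault();
      flipCard();
    }
  });
</script>
```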
(13:59):
We saw a presentation that you put together at the SUNY Conference on Instruction and
Technology earlier in the summer, and you talked about developing a course on vibe
coding. How has that process been going?
It is going great. Thank you for asking. One
thing that I'm excited about is at Finger Lakes Community College, we've had some phenomenal
(14:19):
support in the AI realm from the President and the board, so they've given us carte blanche,
top down support. And so what we're seeing is pockets emerging from the grassroots,
like how we're going to be using AI. And one of the things that I'm most proud of
is we've launched the FLX AI hub. We launched that on June 11th, and one of our charters is
to support educators in implementing AI. So we are dropping one course every month, and you can think
(14:44):
of these as one-hour to maybe three-hour courses, but on average about two hours a course, that are
going to drop, and they're going to be relevant for educators. And we have at least one vibe coding
course dropping this academic year, possibly two, because there is this whole second level
where you can use Google Sheets as a database for some of your apps. And I have some examples
of that if you want to see them.
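As a rough sketch of the Google Sheets idea: if a sheet is published to the web as CSV, a small page can fetch and display it. The URL below is only a placeholder for whatever link Sheets provides when you publish, and the parsing is deliberately naive.

```html
<!-- Hypothetical sketch: reading a Google Sheet that has been published to the web as CSV. -->
<ul id="facts"></ul>
<script>
  // Placeholder URL; use the "publish to the web" CSV link for your own sheet.
  const SHEET_CSV_URL = "https://docs.google.com/spreadsheets/d/PLACEHOLDER/pub?output=csv";
  fetch(SHEET_CSV_URL)
    .then(function (response) { return response.text(); })
    .then(function (csv) {
      // Very naive CSV handling: one value per line, no commas inside cells.
      csv.trim().split("\n").forEach(function (line) {
        const li = document.createElement("li");
        li.textContent = line;
        document.getElementById("facts").appendChild(li);
      });
    });
</script>
```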
(15:06):
I just want to mention you can sign up for these courses, or at
least get information on them at flcc.edu/ai.
Well, that one should be easy to remember.
I hope so.
And we will include a link for
that in the show notes as well. Thank you.
Can you talk a little bit about how you see the vibe approach preparing students
for a future where AI coding is ubiquitous, if we are saying that it's not already?
(15:30):
I think it almost is ubiquitous. I think that vibe coding, if you abstract it,
it's no different than prompt engineering, or than being an AI integration specialist: you are
wrangling AI. One of the ways it frees learners is by giving them the space to think about program
design and computational thinking. And again, I'm gonna get a lot of hate mail for this too,
(15:51):
but software engineering isn't learning where to put the semicolons or learning the difference between
Python with its indents and JavaScript. It's great if you know those languages, but really
it's about understanding what the problem you're trying to solve is and designing a flow for that.
So already, there's been conversation in this field for probably the last decade about low-code
(16:11):
environments, and we have environments where they have like blocks that you can plug in and make the
software go, but this is on a whole ‘nother level. It's just so freeing. I was born in the 70s,
survived the 80s, but I grew up on an Apple IIe and for a long time we were limited by the
technology, even if we had grand ideas. And then there was this era that ended probably
(16:32):
just four or five years ago, where we were limited by our imagination, so the hardware
was no longer an issue, but we couldn't think of what to do. But I think we're at this inflection
point where now we're not limited by imagination or by hardware, because when you're using AI,
and you know this with regular conversations too, that aren't vibe coding, when you're done with a
conversation, it might say, “Oh, did you think about doing A, B or C?” Like, “Would you like
(16:54):
to potentiate your work a little bit more?” So I really love where this is going, and I think that
if it's not ubiquitous, it will be six months, a year from now, at least at this level. And I'm
not saying that we don't need Python programmers and Java programmers, because we do. But what I am
saying is the entry level is a lot lower now, and maybe someone starts vibe coding and they
absolutely love it, and they're like, “Oh, I do want to go into software engineering.
(17:16):
I want to get into the nitty gritty.”
Now, in terms of teaching computer science
classes, how will you be able to assess students' knowledge of basic programming skills when
they have all these AI tools available?
When AI first came out in November ‘22, I said
to my learners, “absolutely no AI”. And then over winter break, I was talking to one of my buddies,
(17:37):
and he programs Python for a living, and there's a Python class I was teaching,
and he does a lot of work with some big firms up here in Rochester. And Malcolm was like, “Oh,
I don't care how people get their software done, as long as it works.” I was like, “Okay.” So that
semester, I was like, “Hey, folks, all bets are off. Use AI as much as you want.” And I think that
was a good experiment for me and there was one lab that was unintentionally designed to be AI proof,
(17:58):
and it was like two thirds of the way through the semester, and I got so many emails and people were
lost, and I was like, “Oh, you just have to do a loop.” And they’re like, “What's a loop?” And I
realized where this process had derailed. So now, for the first two weeks, they're not
allowed to use AI in any capacity. Then for the next two or three weeks, depending on where we are,
they can use it as a study guide. And actually, I took your chatbot course at CIT this past summer,
(18:21):
so I will be rolling that out so I can kind of confine the knowledge in there,
and then after that, once we get through the fundamentals, once learners understand input,
output, variables, if statements, and loops, then they can start potentiating with AI.
And one of the things that I think is important in this contractual agreement with the learners is,
“Hey, if AI spits something out and you don't get it, just ask, ask it. It will totally explain it
(18:44):
to you.” So I'm hoping that this can be like a Khan Academy type tutor bot type experience,
self-governed by the learner.
Do you have students in your classes
that are building software explain how they built it as a way of assessing that,
or is it really just the final product?
So to be clear, I haven't rolled out this
AI vibe coding mentality 100% yet, this is a work in progress, but one of the things that's
(19:09):
really important in software engineering, and you can ask anyone who works at any company who
programs, is commenting, and comments are lines in the code, written in plain English, that the
computer doesn't process. And it's very good if you have a to-do list like, “Oh, don't forget,
I have to do this, this and this feature,” or if you have code that you don't want to execute,
you can comment it out. So the comments are kind of like the user's guide. And I'm leaning more
(19:31):
and more towards saying, “Hey, you really need to do great comments in here.” Part of the challenge
is AI does comments. In fact, that's one of the ways you can tell if a learner's using AI
and just copying and pasting, because if the code has comments, you're like, “Hmm, that might have
been generated by AI,” but I want to go to that next level, those really deep, rich comments that
explain what's going on. And I was working with some faculty at a K-12 school up here, and they
(19:52):
said, “Well, what about essays?” and this was an English teacher, and the teacher's like, “What if,
for their submission, they can do AI-generated work, but I wanna see the 20 or 30 or 40 prompts they
use.” Like, “I wanna see the thought process.” He's like, “Would that be okay?” And I was like,
“You're the teacher, but yeah, that sounds like a great way to get people to learn to use AI,
and you can see the thinking and the interrogation of what's going on” and I don't know that
(20:16):
I hate that idea for programming either.
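As a quick, generic illustration of comments and commenting code out (a sketch, not taken from any course material):

```javascript
// TODO: add a "clear completed" feature and a due-date field.
const tasks = [];                             // the running list of to-do items

function addTask(text) {
  tasks.push({ text: text, done: false });    // store each task with a done flag
}

// The line below is commented out, so the computer skips it entirely:
// addTask("debug the scoring bug from last week");
```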
That's the topic that's come up in a lot of past
discussions of AI on the podcast, where people are shifting from evaluating the final product,
which could very often be easily created by AI, to an analysis of the process that students use,
which would include all the prompts they use, and a description of the ways in which they used AI and so
(20:36):
forth. We need to prepare students for a world in which they're going to be using AI tools in
some form, and it's hard to imagine what those forms will look like in the distant future, say,
next month, but at least if we focus on getting them to use the tools effectively that might
help prepare them for their future careers more effectively than assuming that they're not going
(20:57):
to be using the tools that are out there that make their work easier and perhaps more efficient.
Or enjoyable.
Yeah, I was just gonna say, like the
self-determination theory, you want that autonomy, and you want that relevance and the competency,
and you can have all of that with vibe coding. So just as an example: after that CIT conference,
I was talking to Bridget and Nicole, who are two librarians at Brockport. They had actually
presented on this software they made. They were vibe coding. They didn't even really know it.
(21:19):
They didn't know anything about programming, but they were like, “We want to make this interactive
version of Bloom's taxonomy.” And they did and they presented on it. And then Ed Beck
emailed me a week later and he's like, “Listen, I made this interactive website. So you go there,
and as an educator, you can click on: I want this
(21:31):
assignment to have no AI, some AI, and then you
pick your poison for your recipe, the guardrails for each assignment, and then it generates,
like, an image that you can stick into your document.” And I did recommend to him that the
image also have the alt tag with it, or you can make it in HTML so it's embedded. So he made kind
of like an assignment picker, and he did that using AI. And I can imagine a world… like right
(21:54):
now my Python learners, their final projects are usually all relatively the same. Like, you have
a certain amount of skills that you can use, and so you're a little bit limited by that. But I love
the idea of saying, “Hey, the sky's the limit.” Like, “Don't think small. Don't do a to do list,
because that checks off the box of arrays or lists and for loops. Think something big, make a game,
(22:14):
make something that's relevant to what you want to do.” And so I think you get that relevance
and that excitement and that energy, and it's no harder than it would be to do in a regular
project but now they're engaged in the process.
Circling back to what you were mentioning before
about sharing the prompts and the reprompts and the dialog that's happening,
I think encouraging students to submit those sorts of things or to document those sorts of things,
(22:38):
really helps them recognize this is how they're using the tool,
and they're disclosing that they've used a particular tool, and really reinforces
the idea of some ethical behavior.
I hadn't even really thought about that, but
there's that, like, metacognitive aspect of it, but also there is a bit of an awakening, because
the way you just described that to me, I hadn't even thought about that, but you can look at it
and you can see the evolution of their confidence, like what threads they pulled, so you really get a
(23:02):
more in depth understanding of their process. Oh, that's great. You should teach vibe coding.
Well, I do teach coding, so…
All right, well you're 90% there.
Is this something perhaps that many faculty could introduce into their own use as well as their
own instruction? I could imagine, for example, people who are teaching courses in statistics,
(23:23):
econometrics, and other areas, using vibe coding to do some work in terms of working
with databases, in terms of extracting data, reformatting data, and getting it in a form ready
for use with statistical packages, in ways that may be much more efficient than the ways in which
students are doing that manually right now.
Yeah, it really is the great potentiator. It
(23:45):
doesn't really matter what your discipline is, what domain you're in. We all know that data
is creeping into every single domain, and it's not a secret that technology is too,
so why not just latch onto that and really just celebrate it? So you're right. There's all sorts
of database work you can do and insights you can get from databases, but there's nothing
cooler than being some college learner who all of a sudden is making web apps that help them,
(24:07):
or making software, it doesn't have to be a web app, that helps them really internalize it, and they
can share it and be proud of it. The sky is the limit. One other thing that is really interesting
is there's been an awful lot of talk about fear of failure, and failure to launch because of the fear
of failure, and I really think that this is an opportunity to teach iteration, because it's so
private, there's like, this micro failure, you do a prompt and it falls flat, it doesn't do what you
(24:29):
want. There's no one looking at you. You just do it again and again and again, and then all
of a sudden you get energized, like, “Oh, I can do this.” And then you customize it. So I think
this is a really good on-ramp towards some type of iterative thinking process that might hopefully
build confidence in other venues as well.
I know one of the problems that a lot of computer
science professors have mentioned in the past is that they have students coming into their intro
(24:52):
courses with a pretty wide variety of backgrounds. Some have never done any programming. Others have
already built their own games and such things. And I would think that aspect that you just described,
in terms of being able to work on things independently and experiment, might reduce
that sort of psychological barrier where those people who come in with a little less background
(25:13):
might feel much more comfortable interacting with the computer on their own time, without having to
go to an instructor and admitting that they don't know some things, where they see other students
already able to do those things. So I think it may break down some of those barriers and
equalize the starting point a bit for students.
Yeah, that's a really astute observation, and MIT
created the language Scratch specifically for that purpose, because they were getting people from all
(25:37):
different levels, and they're like, “No, we're all on the same playing field right now.” And,
again like, maybe you should teach vibe coding. I didn't even really consider that,
but it really does level the playing field.
So let's say folks are super stoked about vibe
coding. What are some examples of how to maybe get started?
I think if you want just that real easy entrance into it and to start to understand
the environment, you can have it make a QR code generator, and usually it can do that in one
(26:00):
prompt. That's not a hard problem to solve. But then you might say, “Oh, can you give me a color
picker so I can change the color of the QR code,” and you can make it download as a PNG instead of
a JPEG or whatever. Or you could do… and this is actually a problem that is a very pervasive
problem in online teaching… in discussion forums in the beginning of the semester, I'm always like,
“Take a picture of your work area, post it in this discussion forum, and let's talk about what
(26:22):
you like about it, what you don't like.” And more than half the learners now have iPhones.
And guess what? Those take pictures in HEIC, which is not a web-standard format. So one thing you
might want to try is creating an HEIC-to-PNG converter. That's an easy win, it's low-hanging fruit,
and you can actually use that in your course. You can embed that and then use it.
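A rough sketch of the converter idea follows; note that most browsers cannot decode HEIC natively, so a real version would need the AI to pull in a decoding library, and this sketch only shows the general file-in, PNG-out shape for formats the browser can already read.

```html
<!-- Sketch of an image-to-PNG converter. HEIC itself would need an extra decoding
     library in most browsers; this shows the general file-in, PNG-out pattern. -->
<input type="file" id="picker" accept="image/*">
<a id="download" download="converted.png">Download PNG</a>
<script>
  document.getElementById("picker").addEventListener("change", function (event) {
    const file = event.target.files[0];
    if (!file) return;
    const img = new Image();
    img.onload = function () {
      // Draw the decoded image onto a canvas, then export the canvas as a PNG data URL.
      const canvas = document.createElement("canvas");
      canvas.width = img.naturalWidth;
      canvas.height = img.naturalHeight;
      canvas.getContext("2d").drawImage(img, 0, 0);
      document.getElementById("download").href = canvas.toDataURL("image/png");
      URL.revokeObjectURL(img.src);
    };
    img.src = URL.createObjectURL(file);
  });
</script>
```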
Which is awfully convenient when students send you files in a format that is not a
(26:45):
web standard or is not an accessible standard. So having that handy would be very useful.
Sure.
I just like your approach of using it
to solve your own problems, starting with things like, “Oh, this is a barrier I face. Can I build
something to remove that barrier for myself?”
Yeah, and I just started lifting at a gym,
and I was like, “Oh, I need an app that does, like, 60 seconds on, 20 seconds off,” but then
(27:07):
sometimes I want it to be 60 seconds on, 30 off, whatever it is. And so I just made an app that did that. I
probably could have found one in the play store, but I'd have to download something. And now it's
customizable. And as I started growing this app, I was like, I want presets. So on Mondays,
it does this. So now I've built it, and no one else cares about that app, but I do,
so I can control it exactly how I want it.
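A bare-bones sketch of that kind of interval timer in vanilla JavaScript; the 60/20 durations and element IDs here are placeholders, not his actual app.

```html
<!-- Minimal work/rest interval timer: counts down the work phase, then the rest phase, and repeats. -->
<p id="phase">work</p>
<p id="seconds">60</p>
<script>
  const phases = { work: 60, rest: 20 };   // placeholder durations; presets could swap these out
  let phase = "work";
  let remaining = phases[phase];

  setInterval(function () {
    remaining -= 1;
    if (remaining <= 0) {
      // Switch between work and rest when the countdown reaches zero.
      phase = (phase === "work") ? "rest" : "work";
      remaining = phases[phase];
    }
    document.getElementById("phase").textContent = phase;
    document.getElementById("seconds").textContent = remaining;
  }, 1000);
</script>
```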
(27:28):
about user experiences that are personalized and individualized,
and what you're describing is completely individualized, because now you're creating
your own tools that are completely for whatever it is that you personally need or want.
And can you imagine the budget you would have needed, like, 10 years ago,
or even 5 years ago, if you hired someone to make an app for you that did, like, a breathing timer or
(27:49):
something. It would be thousands of dollars. Now you can do it in 10 minutes for free.
This is not in any way related to the topic, but you did mention that workshop that Judie
Littlejohn and I gave at the CIT conference where we had people build chatbots. Could you tell us a
little bit about the chatbot that you built while you were there during that session?
(28:09):
I would say, if you're looking for an easy win for making your first chatbot, I believe
the one I did was a Gen Alpha translator. So you could copy and paste paragraphs from your,
say, syllabus, and then it would spit it out in Gen Alpha speak. It was fun. It was maybe useful,
we'll find out, but it was just a big win. So it was nice to do that, it was my first chatbot.
And it was one of the most interesting ones that we saw there. We always end with the question,
(28:34):
“what's next?” which, in anything dealing with AI, is a question we're always asking.
So, there are two things that we're really proud of at the college. The first is this FLX AI hub,
and the first of the four pillars is AI across the institution. That's how we're supporting
educators and learners and the business side of things. The second pillar is supporting business
and government for implementing AI into their organizations. The third is AI for the community.
(28:56):
So we're doing some courses for the community education about how to use AI, but the one I'm
really excited about is the AI for educators, and that's going to have the courses that drop every
month. And you can go to flcc.edu/ai, for that. But the other thing, and again, because I'm such
a fan boy of this podcast, I was listening to one from two or three weeks ago, and Rebecca, you reminded
us about the 10-day accessibility challenge. We built a 10-day AI challenge. And you can
(29:20):
see it at aichallenges.org, and this is free for any institution to roll out. It starts very simple. We
identified an appetite for AI, but sometimes people are embarrassed and they don't wanna sign
up for an account. So you can sign up, or you can just follow along; every day, we'll release a new one,
or you'll release one for your institution. So the way it worked at FLCC is we picked these
challenges and we emailed them out once a week, and it was very simple, and there was a sharing
(29:42):
component, and with the sharing component, you could go to this forum and, say you were doing
images this week, you could type in your prompt and upload the image, and then you could share
it with everyone else at the institution. So we have about 15 different challenges,
and we've been working with other organizations. If they want to roll it out, we'll give them their
own private “sandbox,” in quotation marks, where each organization can have their own
and they can roll out whichever challenges they want, they can replace the YouTube videos with
(30:05):
my ugly mug with their own, if they so choose. And it's really just an attempt to get everyone,
just to have them open their eyes to what AI can do and how it's so wide and so personalized.
Sounds like a great resource.
Love to roll it out to Oswego, if you want it.
We'll certainly share a link to it to everyone.
And at that flcc.edu/ai site, you'll be able to see there's a checkbox for “Hey,
(30:27):
I want information on the AI challenge. I wanna roll it out in my institution.”
Great. Well, thank you. It's always great talking to you,
and we look forward to future conversations.
Yeah, this has been wonderful. Thank you so much
for giving vibe coding the space that it so deserves. And I love the ideas. I'm so
glad I came today because I learned two or three things that I'm totally going to put
in my courses now, after talking to you.
That's why we like doing interviews too.
(30:48):
That's awesome. Thank you. Thanks for joining us.
Thanks for joining us. If you've enjoyed this podcast,
please subscribe and leave a review on iTunes or your favorite podcast service.
To continue the conversation, join us on our Tea for Teaching Facebook page.
(31:12):
You can find show notes, transcripts and other
materials on teaforteaching.com. Music by Michael Gary Brewer.