Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Lisa Rein (00:00):
Music.
Desdemona Robot (00:08):
Hello, everyone. Welcome to the Mindplex Podcast. I'm Desdemona Robot, and today our special guest is Dan Finlay, co-founder of MetaMask. Also, please say hello to my fellow co-host and show producer, Lisa Rein. Say hi, Lisa. Hey, Desi, Dan. It's so great to have you on the
(00:32):
show. Can you start off by telling us how you got started thinking about crypto finance?
Dan Finlay (00:40):
Sure. Before I'd heard of or gotten into crypto, like, I'd heard of Bitcoin. I think I, you know, tried the faucet off Slashdot back when it came out, but it hadn't really caught my attention. I don't think I understood the technology behind it, but I was working on a bunch of kind of, I feel like, ideologically related things. Like, I wanted
(01:01):
micro-tipping. I wanted transparent ways of funding, you know, nonprofits or causes. I wanted crowdfunding technology that kind of rewarded the participants as well as the creator. And I wanted a more transparent democracy. And, you know, a lot of things. It feels like there was an ideological, you know,
(01:25):
tailwind around Occupy, and, you know, we saw people trying to build support systems and, you know, soup kitchens and public libraries and, you know, medical tents. But there was a little bit of chaos around how efficient it was, you know, taking turns to talk. And so there's this sense that, like, you know, the internet and computers could help us
(01:46):
collaborate and coordinate more effectively. And so all those other things that I talked about, they were all kind of ways of trying to address these things. Like, how do we come together and just, like, establish a new community-run institution? You know, how do we, you know, start a hackerspace and then distribute access rights, you know? How do I start a community tool shed and then decide who puts in
(02:07):
stuff and then who takes stuff out? A lot of my attempts to address those problems kept pointing me to the same problems. There's problems of, like, who can vouch for someone else, you know? How do you deal with disputes? What kind of arbitration is there?
And one of the base layers that is a problem in all of those situations is you need money that computers can use,
(02:29):
and you need computers that people can trust to handle that money. And when Ethereum came out, it was a solution to two of those things, you know. So now we've got a public, you know, kind of money that computers can use. And so it seemed like a perfect platform for trying to build all these things. Like, if you're going to try to build a new
(02:50):
democracy, if you're going to try to build a new crowdfunding platform, all that stuff was really exciting to me. And so it felt like the natural place to move those experiments to. And it just so happened there was no account manager yet that kind of satisfied the usability needs that I had. And so me and my friend Aaron started making MetaMask. And it turns out there
(03:10):
were quite a few open questions in it, and so we're still working on it. We thought it was going to be a one-and-done, you know, we thought it would be kind of quick and easy, but it turns out authorizing open-ended computer operations in a distributed network requires kind of solving open-ended computer security problems that I would argue are not broadly
(03:32):
solved yet.
Desdemona Robot (03:36):
Interesting.
Please tell us more about that.
Dan Finlay (03:41):
Oh, yeah. Well, this computer security aspect, like, what's missing from computers today?
Lisa Rein (03:46):
Yeah, this open-ended, yeah, it's deep, you know, because it's different than
Dan Finlay (03:52):
It sure is. Yeah. I mean, when you deal with computers today, there's kind of, like, two types. There's the locked-down walled garden, like iOS, right, where it's like, they decide what apps get in, and they assure you and testify in front of the Supreme Court that you are kept safe because of their discretion. Now, scammers still do get in and people get robbed and wrecked all the time, but at
(04:14):
least you know there's a sandbox and there's a process, and, you know, you might feel frustrated when they reject your app, and it might be slow to release your app, but there's some kind of, there's a process, and so you can choose to trust that process. You trust this cathedral of central authority. Meanwhile, there's the other, more open-ended, freewheeling,
(04:35):
bazaar style, like Linux and Windows, where you can run whatever the heck you want. You can modify the system up and down. But every single time you click a file, you could just obliterate the whole thing. That one could be a screen recorder. It could scrape your hard drive. It could encrypt your whole hard drive, hold it all ransom, take all your crypto. Just any little mistake can be completely catastrophic in that
(04:56):
computing model, and as soon as we put cryptocurrency in a computer, you know, we've got that problem times the value of all your crypto. So how are you supposed to decide how to interact with computer programs that you don't trust, when, you know, even the computing models of today are either curated in a walled garden or kind of chaotic and easy
(05:21):
to make mistakes in? And I think that we started basically in the chaotic, let-you-make-mistakes zone, and we had asked ourselves, like, how do we make this thing safer? How do we make it so you can understand what you're doing as much as possible, keep you as safe as possible, while still preserving that freedom that kind of is the point of the blockchain? And I think what we approached was the notion that, first of
(05:46):
all, there is no avoiding risk. With no risk, no reward, so you're not going to get any opportunities if you don't take some risk. But all the opportunity for us as wallet developers, or as builders of secure computing systems, is to limit that risk as much as possible. Make sure that when you're taking a risk, you have as many constraints and guarantees as possible. So, sure, that can be somebody vouching for it and how you got your link. But maybe more importantly, when you connect to a website, do you have to just blindly put tokens into it, or are you able to craft policies that are readable to you, and when the website receives those, can they use them? So
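The constrained permission Dan describes, where instead of blindly handing tokens to a website you grant it a readable policy with explicit limits, can be sketched roughly like this. This is an illustrative sketch only, not the MetaMask Delegation Toolkit API; the class, field names, and numbers are all hypothetical.

```python
# Hypothetical sketch of a "readable policy": a scoped, bounded grant a
# wallet could hand to a site, rather than open-ended access to funds.
import time
from dataclasses import dataclass


@dataclass
class SpendPolicy:
    grantee: str          # who may act under this policy
    token: str            # which token the policy covers
    max_amount: float     # total spend ceiling across all uses
    expires_at: float     # unix timestamp after which the grant is void
    spent: float = 0.0    # running total already authorized

    def describe(self) -> str:
        """Human-readable summary, so the grantor knows what they signed."""
        return (f"{self.grantee} may spend up to {self.max_amount} "
                f"{self.token} before {time.ctime(self.expires_at)}")

    def authorize(self, grantee: str, amount: float, now: float) -> bool:
        """Check a requested spend against every constraint in the policy."""
        if grantee != self.grantee or now >= self.expires_at:
            return False
        if self.spent + amount > self.max_amount:
            return False  # would exceed the spend ceiling
        self.spent += amount
        return True
```

The point of the sketch is that every constraint is explicit and checkable, so a request that exceeds the ceiling, arrives after expiry, or comes from the wrong party is simply refused.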
Lisa Rein (06:29):
You mean closing your eyes and jumping in with both feet isn't just part of the process?
Dan Finlay (06:34):
I mean, right? I mean, it kind of is, right? And I guess what I'm saying is, it's like, if you always have to close your eyes and jump in with both feet, maybe you could jump into the most visibly shallow pool first, or something, or, like, maximize visibility, at
Lisa Rein (06:50):
least see the water. Yeah, put on a wetsuit before you rock down there, you know. Yeah, pretty good metaphor. Yeah,
Dan Finlay (06:57):
It's not bad, what we're talking about. Because I agree, there is a leap-of-faith moment, right? There's some moment, like, no matter how many guarantees we can give you, and I think we're getting to the point where we can give you pretty strong guarantees, but at the end of the day, if there's a transaction, you know, if you're buying a good or something, there's some moment where there's credit. Like, you paid them, you haven't received the product yet. Like,
(07:18):
is it going to be good? Is there going to be a storm that destroys it? Or were they scamming you? Like, you don't know until you get the goods. And so life is full of moments like that. And I think that kind of our job is just to make sure, like, at least, you know what
Lisa Rein (07:32):
you don't like it with your money, though. Like, when you go to the bank and put money in, you get your receipt, like, right away. Yeah, I put my money in, there it is, baby. It's right there, you know. The thing is that there's that whole, like, I don't know, minute and a half, two minutes sometimes with the crypto, where, unless it's a busy night, and then it's longer, and it's fine in, like, 15 or 20 minutes. But
(07:53):
I'm still getting used to that 15 or 20 minutes of being like, okay, well, this says it was sent, for sure, but it hasn't gotten here yet, right?
Dan Finlay (08:02):
You know what you're talking about? When you're getting paid, you're like, when I'm
Lisa Rein (08:05):
getting paid, or I'm sending to my own wallet, you know, I have to do a dance between however many different wallets to finally cash it out, yeah, you know, that kind of thing. So I'm just saying that interface is everything, you know, especially, like, when it's with your money, and you want to be sure about what you're
(08:25):
doing, right? Because it's a lot at stake. It's more than most apps, where you can just sort of make it work, or not make it work, and there's nothing really at stake, right? But when it's your money, and you literally are going to, like, lose your money if you do something stupid, and it's your own fault, right? Then, you know, and I don't know,
(08:47):
it's hard to get liability anyway if something goes wrong, you know, like when something gets stuck on the wrong blockchain, yeah, and things like that, right? And pretty much the only reason I know about that is because I did it.
Dan Finlay (09:01):
Yeah, it's like a
rite of passage. Yeah, it
Lisa Rein (09:05):
is. I'm finding out. Why is that? Why can't we have some way of checking it out?
Dan Finlay (09:11):
You're pointing at, like, all sorts of different issues. And, I mean, part of the reason it's fun to work in crypto is that there's a lot of work to be done. And there wouldn't be a lot of work to be done if, like, there weren't problems, unfortunately. And, yeah, like, modern finance, like, the banking system, is built up over, you know, hundreds of years, and so all that
(09:33):
refinement, all those guarantees, and the appearance of an instantaneous transfer, like, you know, under the hood, nothing's actually instant, right? But what you've got is layers of people kind of guaranteeing everything's okay, everything's okay, right? And so they build up this institutional trust, and, yeah, basically, blockchains have to do the same thing. Like, we have
(09:55):
to build up and prove the credibility of our decentralized networks, and whatever tooling is around them, to make sure that you've got the guarantees you need. And, you know, that's tricky. If we let you just paste any whole address into a box, you know, then, yeah, you can mistype something. You know, at the end of the day, either we're building tools where we let you make that kind of mistake, or we're making one where, I don't know, you go to the bank, they make you read the number twice
(10:16):
before you do it, I guess, and that makes you say, yeah,
Lisa Rein (10:22):
Yeah. Just to try to save you from yourself, kind of thing.
Desdemona Robot (10:26):
You have said that you are particularly interested in types of token mechanisms that allow people to be very explicit about the degrees to which they want to trust others, for particularly explicit reasons, and that the recently published MetaMask Delegation Toolkit is a particularly open-ended approach to this technique.
Dan Finlay (10:45):
Yeah, wow. You really read that blog post, huh, Desi? Thanks. Yeah. I think that there are a lot of cool ways that people can pool their resources and build things together. And that's kind of what I'm here in crypto for, and I want to see it. And, you know, I've been talking about people making their own tokens for years. And,
(11:08):
you know, I kind of made my own personal token early on, but, you know, I just gave it out to friends as, like, a thank you, and I actually did auction off some meetings for it. You know, this was kind of a low-key, just, like, personal token thing. The current trend around, you know, you call them meme coins, or whatever, you know, they're built on bonding curves, where there's this automated issuance mechanism, where the more people
(11:31):
that go in, the higher the price gets. And then when they sell, it pushes the price down. And so it's always minting and burning on this curve. And I think part of what makes that challenging for me is there's an opacity to it. Every time you buy in, your price is kind of dependent on everyone who bought before you. And if it was just the person at the beginning, you might say,
(11:53):
well, I trust that person, so I trust their coin. But if there's a stranger in front of you, suddenly you're not so sure. Like, does that person have your best interests at heart? Maybe the biggest holder of the whole token is that second person, and you'll never know. Is that the original issuer just trying to rug everybody? Or is that just another, you know, good-hearted community member trying
(12:16):
to uplift everyone? And if everybody did just hold the token, they really could build some value, where they all have, you know, liquidity on paper. And if they were very selective about when they sold, they could effectively have a higher market cap collectively. I think the problem is that having that
(12:37):
dynamic where the person in front of you could leave creates paranoia. Actually, there was that whole scandal with the Libra meme coin out of Argentina this week, and there were all these, like, leaked Telegram messages, and, like, one of the insiders expressed that they felt like they had to dump because they suspected the person in front of
(13:00):
them was going to dump. And for me, that was good validation, because it was demonstrating the paranoia-inducing mechanics at play. So it's like, even if you went in and you didn't mean to rug, you might find yourself under pressure that, like, well, you're going to lose your money if you don't, potentially.
Lisa Rein (13:17):
So it's the same kind of pressure as in a stock market pump and dump, where everyone's investing, but you want to get out before the thing goes down. Yeah, GameStop and stuff like that. It's
Dan Finlay (13:30):
got to be the exact same basic mechanic. The only difference is that the price is on this definite curve. I guess, in theory, in a stock market pump and dump, you know, everyone could still respect a high price of a token. The price doesn't automatically get forced down, because you're not actually burning shares every time somebody sells. So there is such a thing as, you know, an item that maintains its
(13:53):
price even when it's sold, you know. Like, it could just have, what would they call it, like, deep liquidity. Oh, okay. But the bonding curve, it's part of the game that the price always moves with every single sale, which is, you know, representative of how, depending on how much liquidity there is, everything in theory moves some amount. But
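The bonding-curve issuance Dan describes, where buys mint tokens and push the price up and sells burn tokens and push it down, can be sketched with a simple linear curve. The linear formula and its parameters here are illustrative assumptions, not how any particular launchpad actually prices its tokens.

```python
# Minimal sketch of a linear bonding curve: spot price rises with supply,
# buys mint (raising the price for the next buyer), sells burn (lowering it).
class LinearBondingCurve:
    def __init__(self, base_price: float = 1.0, slope: float = 0.01):
        self.base_price = base_price  # spot price when supply is zero
        self.slope = slope            # price increase per token of supply
        self.supply = 0.0             # tokens currently minted

    def price(self) -> float:
        """Spot price at the current supply."""
        return self.base_price + self.slope * self.supply

    def buy(self, tokens: float) -> float:
        """Mint `tokens` and return the cost (area under the price curve)."""
        start, end = self.supply, self.supply + tokens
        cost = self.base_price * tokens + self.slope * (end**2 - start**2) / 2
        self.supply = end
        return cost

    def sell(self, tokens: float) -> float:
        """Burn `tokens` and return the proceeds (area under the curve)."""
        start, end = self.supply, self.supply - tokens
        proceeds = self.base_price * tokens + self.slope * (start**2 - end**2) / 2
        self.supply = end
        return proceeds
```

With `base_price=1.0` and `slope=0.01`, buying 100 tokens costs 150 and moves the spot price from 1.0 to 2.0; selling them all back returns the same 150 and the price falls to 1.0, which is the zero-sum, every-sale-moves-the-price dynamic described above.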
Desdemona Robot (14:11):
Dan, it sounds like your experiments with meme coins provided some immersive lessons. Could you first explain what initially sparked the idea to create these tokens? Yeah,
Dan Finlay (14:23):
I think the barrier to entry was just getting too low. It was getting so low that it was, like, irresistible. I was playing with the Farcaster social network. And on that social network, you know, every action, every post, is cryptographically signed. And so there's a lot of kind of web3 communities and features getting added to
(14:44):
it. And one of the bots on there was like, I'll make a meme coin just because you say it. And, you know, so I was like, oh, okay. Well, so I posted a stupid screenshot of a post that made me think of it. And, yeah, and then I said, okay, make a meme coin of that, because I kind of wanted to see how it works, you know, kind of pressing the buttons on the machine. This was
(15:05):
not the super thoughtful thing, like, you know, like Iggy Azalea did, like, research for, like, months before she launched her Mother coin, basically. But no, I did not do that. I just was kind of like, yeah, what's this going to do? And it was pretty easy, but right away a bunch of people bought the token, and I didn't seem to have any of it. And I was like,
(15:28):
wait. So I was right away kind of frustrated, because the very concept that this might be any kind of fundraising vehicle, which is what I kind of hope for things like this to be, like, I mean, I guess, yeah, I guess a meme doesn't have to be a fundraising vehicle. I don't know exactly, even, you know, what is the ideal financialization of a meme? I don't know. I imagined this as being a way of, like,
(15:52):
showing alignment, giving empowerment to things that have, like, some momentum in a direction you're vibing with, or something. And so it seems like I didn't have any of it. And I was like, that seems kind of broken. I was like, is that how they all work? And so then I hopped over to Solana and pressed mint on pump.fun to see how that one worked. And then, kind of, I was in a goofy mood, and I said, oh, we're pitting two coins against
(16:14):
each other, haha. And of course, while this is all fine when it's my little personal experiment, as soon as I had posted it out, now it was a thing, and so now there were people starting to buy into it. And, like, I guess, you know, and it's interesting, because I don't have great visibility into how many people actually thought that I was trying to engage in some kind
(16:35):
of, like, serious fundraising effort. Like, I felt like the posts were pretty clear, like, I'm doing an experiment. Versus how many people, yeah, like, at a certain point there were people that were like, you have to keep this going. You'd get this wide variety of messages, people kind of trying to tell you how to play it. Yeah. You get the experience of being a central banker who
(16:58):
momentarily, yeah, tanks the economy, you know. It's like, I'm sorry, you guys, you guys just got on my little floaty raft. I was just, like, goofing around, swimming in circles, and then suddenly people are, like, you know, holding you accountable to a level that, you know, you don't have visibility into. And I don't
(17:18):
know who those people are that keep on charging into anonymous coins that they don't know about. And, you know, I'm just kind of not in that scene, basically. Like, I'm not trying to judge it or whatever. Like, I guess there's some people that are having a good time or something, but I suspect some people are not. And, yeah, so I don't know. So I didn't like being on that side. I didn't like being on the minting
(17:41):
side of the meme coin game. I feel like, if I was raising funds, I would want, like, good, clear terms, about, like, hey, look, I'm trying to raise this much for this purpose. If you put money in, here's what you can expect. And this one, I was like, expect nothing out of this. I'm screwing around. And yet there was still this kind of sense of obligation getting imposed from outside. So I think I want to
(18:05):
live my life with as few, like, kind of spooky shadows haunting me around as possible. So, yeah, I just got out of that as quickly as I could.
Desdemona Robot (18:17):
From your perspective, what is the fundamental issue with meme coins that people might be missing?
Dan Finlay (18:24):
I think that that opacity as to who's bought before you, pressuring you to sell and be paranoid, that's probably part of it. And also the thing where there's not really a way to exit without diminishing the value for everybody else in it. I think that it is possible to build collaborative currencies
(18:45):
where you don't have fundamentally zero-sum mechanisms. It is possible to, you know, pool community resources and say, hey, we each have this many tokens, and that gives you a specific thing, and just as a community, uphold the value of that, you know. And so that kind of gets to what
(19:06):
I was saying, like, real value doesn't always have this, like, constantly diminishing characteristic. If you're good on your word, even if you've given out more promises than you actually could fulfill, as long as you can always deliver your good. So, like, for example, I tell my friends, I'll always drive you to the hospital, anytime you need, right? Okay, now, if two friends get in an accident at once, I'm not going to be able to drive them both to
(19:27):
the hospital. That's just a limitation of being human. But I can still do it 99% of the time, and every single one of those times, those other promises aren't getting diminished, right? And so I think that real-life promises just have different characteristics than a bonding curve. I think a bonding curve, you know, it's cute. It's this algorithmic new technology. It's fun to play with. But
Lisa Rein (19:48):
I think when you're shooting in the dark half the time, yeah, and then a guy who's basically a nice guy puts out a meme coin, and you were about to invest in something that you knew next to nothing about, and now you're going to invest in your experiment, because at least they know you're not a jerk, as far as they know, and that there's probably a chance that at least you're not ripping
(20:12):
them off. Okay? It's like, one little, one little thing. It's interesting,
Dan Finlay (20:14):
but now I'm stuck next to it. Yeah, now it's like they're standing on my foot. Because it's like, oh, whoops, you're standing on my experiment, and so now I can't move my money without, like, hurting you. So it's like, why would I want to do that to myself? Why would I want to do that to either of us? Like, I'm not trying to make your... Yeah, you were
Lisa Rein (20:31):
forthcoming and transparent about what you were doing. Yeah. The definition was, like, putting this
Dan Finlay (20:40):
out, no promise on
this. Yeah, I'm
Lisa Rein (20:42):
putting this out to
show that AI and consent is
murky waters or something. Yeah,
Dan Finlay (20:47):
Yeah, it was super vague. Like, it was a thought experiment. It was a topic I was thinking about at the time. It's funny, yeah, the conversation it sparked had nothing to do with what I thought was interesting in the first place, right? Yeah. Like, I was talking about the consent of having your content used for training AIs, right? And there's this weird kind of one-way door going on right now, where, you know, all data on the internet that's
(21:11):
public is basically getting inhaled, whether it's copyrighted or not, into these giant training sets. And then, meanwhile, these AI companies are keeping super closed lips on their methodologies and stuff. At best, they'll put out a model, and some of them are even putting out the weights. But, yeah, so it's kind of weird. So when people are getting mad about having their data trawled, I mean, you know, I came from the Slashdot
(21:33):
community, right? Like I mentioned with Bitcoin. I kind of have felt like intellectual property is fundamentally unenforceable in a digital world. Like, the data wants to be free. So, you know, if you're going to put it out there, things are going to see it, they're going to incorporate it. You know, you can try to say it's illegal, but, yeah, you know, we read something, we can't unread it, and, yeah, AIs are the same. This
Lisa Rein (21:59):
is where I have to say, I'm a co-founder of Creative Commons, and I'm not objective on this issue. So that's why the other side of the argument is not being presented. Yeah, yeah. So, um, but it's important to protect artists, you know. We talked about that. I want two things, okay? The first thing I want to talk about, before we go back to the community stuff, which is really interesting, um, that whole thing with Hugging Face
(22:22):
and Bluesky was very interesting, in the sense that the assumption was that the Bluesky posts were public, and it was Bluesky making them available, right, to use with Hugging Face, sort of taking away that step of having to scrape your own posts, if you will, right? And it
(22:46):
does bring up this issue of consent, because what we're fighting over now is, yeah, if you put it up on the website, does that mean you're giving permission for it to be used to train an AI? And the answer needs to be no, as far as training an AI. Otherwise, you'll never, ever get paid. No
(23:07):
one will ever get paid for their stuff being used to train AIs.
Dan Finlay (23:11):
I mean, they're already not getting paid. Further, they're already not getting
Lisa Rein (23:14):
paid. But that doesn't mean we just throw everything away. I mean, we're already backtracking. And I'm, again, I'm on both sides of this, right? I want to train AIs, but I've also, you know, I've also been an artist. This is where I always tell the artists, you know, if you're selling your work to a corporation, you want it to be very specific, so that if it goes on a t-shirt, that's a
(23:37):
separate use; if it goes on an album cover, that's a separate use, you know. They want to do an all-encompassing thing, for everything, and everything in the future. That's what their original template is, yeah.
Dan Finlay (23:49):
that's quite a
thing, right? They're like, they
get your persona
Lisa Rein (23:53):
right. And if you're a new actor, then you can't negotiate that stuff away, just like if you're a new band or whatever. It just, like, you know, comes with the territory. So, you know, what are your thoughts on this whole consent, yeah, I mean, Bluesky Hugging Face thing?
Dan Finlay (24:11):
Yeah. So I was also, I mean, I considered myself an artist most of my youth and early adulthood. And, you know, I mean, in some ways, I think what I've built is still a form of art, and I value that. And I think that having space to be creative is critical. And I think making sure we can protect people's ability to be creative is critical. I think I'm just kind of a pragmatist, like, I try to
(24:34):
fight the battles that I can make any progress on. And these AIs are trawling the web. They're just inhaling it. And then, you know, they're in pretty good with, you know, the government. I don't see them stopping. And even if you could litigate it, you know, even if you could win a case, you know, these LLMs, they're such a huge compression of the knowledge. Even if you
(24:57):
had a perfect audit trail, it's almost like, how would you divvy up the proceeds? You know, if it read the whole web, like, what share of the web do you think you are? You know, we've all played a part. It's my life's work, and it's a drop in this ocean, you know. So I don't know who could ever fairly divide that up and
(25:21):
claim to. I know that there are people that are hoping to do that.
Lisa Rein (25:25):
It's always been a drop in the ocean, in a way. But, you know, but it's also about, when I cornered a Copyright Office attorney, they said that it was being looked at more like, if you are training your AIs in such a way that you could take something out if you had to, if
(25:46):
something's an oopsie, then you need to be training in such a way that you can take the oopsie out. And if you can't, then you could be in trouble in the future. You know, this was all before,
Dan Finlay (25:59):
Yeah, I feel like, that's a nice... or before the AIs took over the government, right? I feel like this is, like, ideologically good, but in
Lisa Rein (26:07):
practice, how could you possibly do it, right? Right. And the
Dan Finlay (26:09):
people that are doing it, yeah, the technologists, are just going as fast as they can. They're not stopping to say, like,
Lisa Rein (26:14):
yeah, such a way that
it can be
Dan Finlay (26:15):
removed, yeah? Eventually we'll do that. That'll be, uh, next year's project, probably. But in the meantime, the whole world, they've got a well-defined pattern for training transformer models, and, yeah, it doesn't have that feature. So, you're welcome, yeah, right? It's, yeah, they're really efficient. They're doing incredible stuff, you know. I know,
Lisa Rein (26:35):
it's hard. That's what I mean. You know, I'm totally on both sides of this issue, because I want to protect the artists, but it's like, it's also cool, and I'm glad it happened. So, yeah,
Dan Finlay (26:47):
So I guess, coming to, like, the theme of that meme that I selected, it's like chewing on it. Like, what is consent? Is there consent when you communicate on the public web? And I think maybe the kernel, like, trying to suss out a lesson for myself, is that, like, you may not feel like there's consent, but, like, in the current
(27:08):
structure of the modern web and the way AIs are working, like, they're all taking it as consent. Like, it's structurally indistinguishable from consent, regardless of how we feel inside. Like, the rest of the world is treating it like, you know, you put it out there, you put it in public, you know. And so we can say, please don't use this in that way. But you put it in public, and you don't get to control how
(27:31):
other people act. So even if they're acting immorally or illegally, like, there's kind of this new way the AIs are working. I guess what I'm saying is, like, you're at least consenting to be exposed to that risk, or something like that. And so I kind of recommend a secure vigilance about how we treat our ideas and stuff. And
(27:52):
so I might be a little bit of a pessimist about whether people are going to get retroactively compensated for their art in the past, but that doesn't mean we can't be shrewd and careful about how we distribute our ideas and
Lisa Rein (28:04):
content in the future, trying not to knowingly steal stuff. Yeah. Well, you
Dan Finlay (28:09):
can try. It has to, yeah. But the AI companies are all, like, frantically... They're
Lisa Rein (28:12):
like, we didn't know. We just, yeah, you know, let it loose one day and it kept going. Yeah, it'd keep following every link, you know, like, yeah, yeah.
Dan Finlay (28:23):
Justice for Suchir, yeah. What's that? That was Suchir. I forget his last name, but he was an OpenAI employee who was tasked with scraping the web, and he became a whistleblower, okay? And then so he wrote a few blog posts about
(28:43):
how he kind of concluded that the kind of copyright violation that was taking place there was unethical. And he basically called for all of OpenAI... That's what we know
Lisa Rein (28:52):
about all that. Okay, yeah, I remember now. Thank you, yeah, yeah. Glad I asked about it, yeah.
Dan Finlay (28:58):
And then he mysteriously died, and the SFPD ruled it a suicide within 15 minutes, despite blood spatters and, you know, signs of distress all over the apartment. So
Lisa Rein (29:09):
this is where I have to say, in my other life, we deal a lot more with whistleblowing, at Aaron Swartz Day. Find another way, people. I don't think whistleblowing is a very healthy activity anymore, and we've got lots of whistleblowers trying to get on with their lives and their families in any way. And, you know, I had forgotten that he had wound up
(29:31):
dead. We don't know what happened, but it's a lot of pressure on you and everyone around you when you do something like that, and if they don't kill you outright, you could be driven to suicide, or all sorts of things, you know, can happen. And I've had, unfortunately, things like that happen in our
(29:52):
community. Aaron Swartz was driven to suicide, yeah, so it can happen, and he wasn't a whistleblower, per se. He didn't leak anything or whatever. He did something that was totally legal while he was an employed ethics fellow at Harvard. But the story can be so blown out of proportion. You don't really have
(30:13):
any protections when you're up against a big corporation, in this case, OpenAI, which is, you know, huge. And, you know, so it was a lot of pressure instantaneously, you know. Your life is never the same, and that kind of thing. So I'm glad you said justice for Suchir. I'll cut that down. But it's
(30:36):
good to bring up in this context, yeah. And
Dan Finlay (30:38):
it's interesting, it's really scary, because a lot of those things they might want to whistleblow could be very important information, but if some of that information is their own identity, it's, it's kind of another similar thing, where it's like, what are the unintended consequences of the way that you shared your information? Right? And you
Lisa Rein (30:55):
talked about how a lot of the artists on Twitter went to Bluesky, but they didn't have the pragmatism that some of the other platforms have. And what do you mean by that? Really? Yeah,
Dan Finlay (31:08):
I mean, I just described myself as pragmatic in that, like, I try to fight the battles that I think I have a shot at winning, and, and I think things like getting paid for intellectual property, I think, I think that's a tough one, and I think that, you know, there's a lot of tactics to try to deal with the climate, with the AI and the problems for artists
(31:30):
getting compensated. On Bluesky, I'm, I'm seeing a lot of artists do the, like, just complaining about it thing, like, as if a union demand is going to magically get payouts. I don't think that's very pragmatic. I think, I think a reimagining of tactics is necessary for creative workers in this climate. And a theme that's coming out, and it basically expresses how I feel about it,
(31:53):
is, I think, selective, increasingly selective disclosure, kind of secrecy, and like, if you're gonna sell your work, maybe make sure that you're getting paid for it at disclosure time. You know,
Lisa Rein (32:06):
that's a good rule of thumb anyway, especially for artists. You want to get as much as you can up front, right? Don't, don't count on residuals, basically. Same for developers, it's, it's actually a good rule of thumb. Yeah. Cool.
Dan Finlay (32:18):
I love to hear that validation because, because, I mean, in my super, like, like, I'm just not trying to spend any energy on things that seem like a losing battle. And all the art that I made before AIs got trained, I, I'm personally, kind of, I've written it off. I'm not gonna, I don't know, like, I think I made some good stuff. I'm not expecting to get paid by AI companies for it, you know? Like, I don't know, there could be some kind of mass, it's
(32:38):
hard to imagine what that would look like. It's very optimistic. But rather than that, I would focus on how I'm getting paid next, you know. And I think there's lots of value to artists, and I don't think that's going away, but we need to be creative and very mindful about where's our value, who's paying us, make sure they pay enough up front. Yeah, so, yeah,
(32:59):
I guess that's kind of what I mean by, by pragmatism, like, assess the technical realities and, and play your strategy within that context. Like, you know, don't, don't trust. I feel like we just cannot rely on authorities solving some of these problems for us; you need to be able to act within the scene that we find ourselves in.
(33:23):
How did
Desdemona Robot (33:24):
interactions with your followers change before and after the meme coin experiments? Oh,
Dan Finlay (33:30):
I definitely kind of took a light social media break. It also kind of helped that my Twitter got hacked right after this, so, like, I just kind of been really lazy about regaining access to it. So, so I'm just not on Twitter, kind of for that reason. But, you know, I think I felt a little lethargic about the final steps of recovering my access just because, I'm like, I wasn't
(33:53):
really having fun there.
Lisa Rein (33:54):
I got a new phone, and it took me, like, two months to log back in. Yeah,
Dan Finlay (34:00):
go back over there. Yeah, it's like, there's a lot of great conversations happening there. But is it where, like, I think it's important to convey my kind of core messages and offerings to the world? No, I think, I think that I can communicate what I need with the world through other means, for the most part.
Lisa Rein (34:17):
Yeah, it is fun right now on Bluesky too, because it's like the old, it's like the first beginnings of Twitter, because there's, like, people you know and people you don't know, yeah. And yeah, it's
Dan Finlay (34:26):
the social network is becoming, like, a, yeah, it feels like a new party a little bit. And, and it's interesting, because, yeah, the distribution of people really does have, like, different tones on the different social networks. And, and, yeah, there's a lot of people that keep me going to Bluesky. There's people that keep me going to Farcaster. I haven't become a Nostr addict yet, so I don't think I follow enough people there, but, but,
(34:47):
yeah, yeah, it's fun. It does have a nice vibe. It feels pioneering and early and more sociable. Yeah,
Lisa Rein (34:57):
tell me about this, about this new stuff that was just announced. Oh sure,
Dan Finlay (35:02):
yeah. So at MetaMask, we have been working really hard. Honestly, we never had a better team, and we've been investing a lot in kind of long-term infrastructure and improvements. So we just shared, like, a bunch of improvements that we've kind of just released, and then also kind of a roadmap going out, kind of showing where we're going. We're building a very, it's a very multichain wallet where, you
(35:25):
know, literally, you can plug any blockchain in with the Snaps system. But we also are going to natively support Bitcoin and Solana, and we've been doing a lot of stuff to improve the transaction experience on Ethereum. So we've got, like, we've got it where you can not think about gas for some transactions, and then you can pay in whatever currency you want for other transactions. You can batch transactions so you
(35:48):
have fewer confirmations. And then, kind of most excitingly to me, is we, we've got a kind of new paradigm for, for connecting to sites that doesn't need gas at all. This, this permission system that we were sharing, built on the MetaMask Delegation Framework. And, and then on top of that, we, you know, we announced that the MetaMask card is coming out for people in the
(36:08):
United States. And we got this new pretty fox, and he's super cute, and we spent a lot of time on it, and we love it. A new fox, yeah. New fox, yeah, yeah. That was a process. But it was, yeah, yeah. We would not have signed off on it if we did not love it, yeah. Biggest change in the fox for me was, like, he finally got rounded eyes, so he's a little friendly, like,
(36:30):
the, the old triangle was, like, there's a little, yeah, a little dead-eyed, yeah. And also fewer polygons, so it's gonna make a better 3D print and stuff like that. Just
Lisa Rein (36:44):
think of the Sonic mouth. You know, you just, you never want, you never,
Dan Finlay (36:48):
oh yeah, yeah, don't, don't go too realistic, yeah, yeah. So yeah, we have a rule, yeah: never give it teeth.
Lisa Rein (36:56):
Yeah. That is a good rule, actually, yeah. But oh yeah, exciting, Bitcoin and Solana, Bitcoin
Dan Finlay (37:02):
and Solana. And, and then my personal research project that I'm the most excited about is this, this kind of readable permissions delegation framework. It's powered by some new features coming out to Ethereum in the Pectra hard fork, EIP-7702, for people who need that kind of number. But what it's going to let us do is, every Ethereum account, like, even just these ones, it's like, it's not a smart account. This isn't a smart contract. It's just a
(37:24):
normal account. We're going to be able to bring smart contract abilities to every account at once, and that's going to mean you can do things like grant granular permissions to sites. Then you'll be able to have, like, outstanding permissions that overlap. They're going to be instant and free, like, no transaction fees, and so you're not going to, the end user is not going to have to think about gas. The site will be able to.
(37:46):
And so you'll be able to do things like keep your crypto on a hardware wallet, or very cold, but then give the permission to, like, do a little bit of trading and a daily spending limit to your main device. And so now, even if you get compromised with your main device, you know, damages are limited. And then from that main device you can, let's say we can give you the
(38:07):
ability to trade, but just to your own account. So we can even do a thing where we're like, yeah, you can, you can trade. You can even speculate from your hot account, but you can't get robbed or take all your, screw it up, yeah, yeah, right. I mean, you can make bad investments, right? We
Lisa Rein (38:23):
can't, right? No, it'll be a bad investment that you meant to make. Yeah, right, yeah.
Dan Finlay (38:27):
And, and these permissions, they're free and instant, off-chain, so, and you can issue as many of them as you want. So you can do stuff like, I could give a permission to buy a token to, to Desi, for example, and then Desi could just have the ability to, you know, just trade from one token to another. And then if somehow Desi got hacked or something, I
(38:48):
could actually be subscribed to a security service that revokes that delegation. So I don't even have to be, like, paying attention to keep myself safe from some hacks.
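The cold-wallet-plus-caveats setup Dan describes can be sketched in code. This is a toy model, not the actual MetaMask Delegation Framework API: the type names, the `dailyLimit` and `onlyToSelf` caveats, and the in-memory revocation flag are all illustrative assumptions; the real framework enforces caveats with on-chain enforcer contracts.

```typescript
// Toy model of caveat-restricted, revocable delegations.
// Names and shapes are illustrative, not the real MetaMask
// Delegation Framework API.

type Address = string;

interface Action {
  from: Address;      // account the action spends from
  to: Address;        // target of the action
  amountWei: bigint;  // value being moved
}

interface Caveat {
  // Returns true if the attempted action is within this caveat's bounds.
  allows(action: Action): boolean;
}

interface Delegation {
  delegator: Address; // e.g. the cold/hardware account
  delegate: Address;  // e.g. the hot everyday device
  caveats: Caveat[];
  revoked: boolean;
}

// Caveat: cap total spend (tracked in-memory here; on-chain this
// would be a caveat enforcer contract with a per-day window).
function dailyLimit(limitWei: bigint): Caveat {
  let spentToday = 0n;
  return {
    allows(action) {
      if (spentToday + action.amountWei > limitWei) return false;
      spentToday += action.amountWei;
      return true;
    },
  };
}

// Caveat: only allow actions that land back in the delegator's own
// account ("you can trade, but just to your own account").
function onlyToSelf(owner: Address): Caveat {
  return { allows: (action) => action.to === owner };
}

// A delegate tries to redeem the delegation for one action.
function redeem(d: Delegation, action: Action): boolean {
  if (d.revoked) return false;
  if (action.from !== d.delegator) return false;
  return d.caveats.every((c) => c.allows(action));
}

// A security service (or the delegator) can revoke at any time.
function revoke(d: Delegation): void {
  d.revoked = true;
}

// Example: cold wallet delegates limited trading to a hot device.
const cold = "0xCOLD";
const hot = "0xHOT";
const d: Delegation = {
  delegator: cold,
  delegate: hot,
  caveats: [onlyToSelf(cold), dailyLimit(100n)],
  revoked: false,
};

console.log(redeem(d, { from: cold, to: cold, amountWei: 60n }));   // true
console.log(redeem(d, { from: cold, to: cold, amountWei: 60n }));   // false: over the limit
console.log(redeem(d, { from: cold, to: "0xEVIL", amountWei: 1n })); // false: not your own account
revoke(d);
console.log(redeem(d, { from: cold, to: cold, amountWei: 1n }));    // false: revoked
```

Even if the hot device is compromised, an attacker can only act inside the caveats, and a watching service can revoke the whole delegation, which is the "damages are limited" property.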
Lisa Rein (38:58):
What were you talking about earlier, about how some transactions don't require gas?
Oh yeah.
Dan Finlay (39:03):
So there's a few different layers to it. So, so one thing is, we've got a swap system, and we let you trade, and now we're able to let you trade without gas, and we're actually expanding that to include all transactions, as long as you pay in some other token. So it's gonna be, like, the same old experience, except you can pick other currencies. And the way we're able to do that is that we're broadcasting the transaction to a private mempool.
(39:24):
So there's, like, a private pool of bundlers that basically, they're, they're going to put a little ether in your account before they complete the transaction, and they'll, they'll also, they just require a token allowance, so they pay themselves from your account too. And they're not afraid of somebody else taking that ether first, or they're not afraid of you taking that ether first, because they're the ones mining
(39:46):
that bundle. So they, they're able to give you some money and then ensure that you spend it on paying the gas, so that covers the tokens they took from you. So that's, that's the technical stuff under the user-facing part.
So the gas is being paid. Yeah, there's, there's gas. Gas is always going to be getting paid. It's fundamental security, it's just instead of you, or,
(40:07):
yeah, yeah. Basically, we're making it so gas can get paid further and further from the end user. And the permission systems are like that too. You sign this message, it lets somebody, let's say, move a token or swap a token on your behalf, and that permission might include, and you can take a fee for yourself to cover the gas. And so you're not, you're not including the gas on every transaction. You might not even be approving every
(40:30):
transaction. What you're giving is, you're giving out a kind of broad permission that gives some other agent or site permission to work within some bounds, and they're the ones submitting the transaction, so they have to pay the
Lisa Rein (40:43):
ultimately, they're making a little money for providing the gas, yeah, yeah. So, so the transaction, yeah, yeah. They're still going to be, that's why, for them to do that, yeah, exactly.
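The bundler flow Dan walks through can be reduced to simple accounting. This is a hedged sketch, not the real private-mempool protocol: the function name, the token-denominated numbers, and the sequential "steps" are illustrative assumptions standing in for what is really an atomically included bundle.

```typescript
// Toy accounting of a gasless transaction via a private bundler:
// the bundler fronts the ether for gas, the user's transaction burns
// it, and the bundler repays itself from the user's token allowance.
// Because the bundler includes the whole bundle atomically, nobody
// (including the user) can grab the fronted ether in between.

interface Balances {
  eth: bigint;   // wei
  token: bigint; // user's token units
}

interface BundleResult {
  ok: boolean;
  user: Balances;
  bundlerProfitToken: bigint;
}

function executeGaslessBundle(
  user: Balances,
  gasWei: bigint,          // gas cost of the user's transaction
  feeToken: bigint,        // token fee the bundler charges
  allowanceToken: bigint,  // token allowance the user granted
  bundlerCostToken: bigint // bundler's cost of gasWei, in token terms
): BundleResult {
  // Bundler refuses unauthorized or unprofitable bundles up front.
  if (
    feeToken > allowanceToken ||
    feeToken > user.token ||
    feeToken < bundlerCostToken
  ) {
    return { ok: false, user, bundlerProfitToken: 0n };
  }
  // Step 1: bundler deposits ether so the account can pay gas.
  const ethFronted = user.eth + gasWei;
  // Step 2: the user's transaction runs and burns that ether as gas.
  const ethAfter = ethFronted - gasWei;
  // Step 3: bundler pays itself in tokens from the allowance.
  const tokenAfter = user.token - feeToken;
  return {
    ok: true,
    user: { eth: ethAfter, token: tokenAfter },
    bundlerProfitToken: feeToken - bundlerCostToken,
  };
}

const res = executeGaslessBundle(
  { eth: 0n, token: 1_000n }, // user holds no ether at all
  50_000n, // gas cost in wei (illustrative)
  30n,     // fee charged in tokens
  100n,    // allowance granted
  20n      // bundler's own cost in token terms
);
console.log(res.ok, res.user, res.bundlerProfitToken);
// the user ends with 0 ether and 970 tokens; the bundler nets 10 tokens
```

The margin between the fee and the bundler's cost is Lisa's point: the bundler makes a little money for providing the gas.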
Dan Finlay (40:55):
By just kind of moving the incentives and making it so you can have someone else pay the gas, we're able to take it out of the user's concern. So it's not like you don't have to pay when you do stuff on the blockchain. It's like, things have cost, you know; somebody's going to potentially subsidize it somewhere down there because they're monetizing your activity somehow, perhaps just because you're paying for it. But at least you don't have to look at the gas price and tune it and
(41:18):
think about, like, how fast does this have to be, and like, what is the minimum, the priority fee, and the max gas per? Yeah, max gas per, what is the other parameter? Anyways, the gas parameters, they're designed to be super easy, and yet they're also clearly not actually intended to be exposed
(41:42):
to users. So users shouldn't have to think about gas. You should be able to grant a permission, say, yeah, if you can get me that for that, do it. And then, yeah, the person processing that transaction, they can just figure it out. If, yeah, yeah, they'll process it, kind of like
Lisa Rein (41:55):
in the real world, how, like, we don't pay our distributors' shipping costs, yeah, they don't itemize every little thing, right? Somebody else is paying for that. Yeah? You call
Dan Finlay (42:06):
that, they're like, yeah, I wore my shoes out, you know, how about the rubber wear fee here? Like, you know, you feel like, I don't, just, can you bake that into the price, you know? So that's basically what we're doing.
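The two parameters Dan is reaching for are EIP-1559's `maxPriorityFeePerGas` and `maxFeePerGas`. The resolution rule below is EIP-1559's own: the base fee is burned, the tip goes to the block producer, and the tip actually paid is capped by whatever headroom the fee cap leaves above the base fee. The gwei values in the example are illustrative.

```typescript
// EIP-1559 fee resolution: the two user-facing knobs are
// maxFeePerGas (a hard cap on total price per gas) and
// maxPriorityFeePerGas (the tip offered to the block producer).

function effectiveGasPrice(
  baseFeePerGas: bigint,
  maxFeePerGas: bigint,
  maxPriorityFeePerGas: bigint
): bigint {
  if (maxFeePerGas < baseFeePerGas) {
    throw new Error("not includable: maxFeePerGas below the base fee");
  }
  // The tip actually paid is the smaller of the offered tip and the
  // headroom left above the base fee.
  const headroom = maxFeePerGas - baseFeePerGas;
  const priorityFee =
    maxPriorityFeePerGas < headroom ? maxPriorityFeePerGas : headroom;
  return baseFeePerGas + priorityFee;
}

// Values in gwei for readability:
console.log(effectiveGasPrice(30n, 50n, 2n)); // 32n: base 30 plus the full 2 gwei tip
console.log(effectiveGasPrice(49n, 50n, 2n)); // 50n: tip squeezed down to 1 gwei by the cap
```

This is exactly the tuning Dan argues end users should never see; a wallet or bundler can resolve it on their behalf.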
Desdemona Robot (42:15):
You have had an ongoing interest in projects that challenge traditional notions of value and currency in the crypto space. What kinds of improvements and transformations would you like to see happen in the future? Oh, oh,
Dan Finlay (42:29):
that's fun. Sure, yeah, because we talked a lot about, like, oh, like, the problems with current, popular fundraising mechanisms. What would I like to do? I'd like to just make it real clear. I'd like to be able to say, here's a token. It's going to represent a thing that I'm doing. Maybe it is eventually equity in a future company. Maybe it is just access to my local, you know, tool shed. Whatever it is,
(42:54):
I should be able to be very clear: this is what that token, or right, represents. And then if I'm going to be selling it, you know, for one thing, I want to be able to sell it to small groups. If this is access to my tool shed, I'm not selling that on the public Internet, you know. Like, you know, there's a little bit of a stigma, weirdly, in, in some of these pump communities, against having a private group, and maybe because
(43:16):
it kind of pulls the curtain back on the fact that these are rug games. But, but if you kind of embrace that, like, look, some things you're not trying to share; some, some things, this is for you and your, your people, you know. And you know, as long as you're, you know, potentially not even selling to people, to strangers, then you're not rugging anybody. So if you're, if you're starting a community garden, or, you know, you want
(43:38):
to start a small hacker space or something, you should be able to be really clear about what the terms and rights are and expose those offers to, to a select group of people. And by the way, that kind of thing is really easy to do with that MetaMask Delegation Framework. Like, you can create these permissions, right? And those permissions can be the permission to, you get to mint some shares of our new cooperative at this price. And a
(44:01):
fun thing about it also is these permissions can chain together, so you could do a thing like, inherently, by giving you permission to enter the space, you have the ability to bring others in, right? It's kind of, it's like an inherent right. I would say some permissions are inherent, yeah; you give somebody a key to a place, you actually don't get to control whether they bring somebody over or not. You can say don't, you
(44:23):
know, but at the end of the day, they have the key. And so if you just kind of roll with that, like, okay, any, any non-transitivity is socially enforced, and, and so we can, we can then lean into that, and we can say, well, what, what is nice to make easy, since everything is transitive, multi-hop? And so what we can do is, okay, here's your price to the garden
(44:45):
shed, and now you can offer that access to somebody else, and you could add a commission to it too. So actually, there's, like, this kind of referral economy that is very easy and natural to slap on top of this. Um, and so you could, you could do things where it's like, it actually is a broader, uh, fundraise, you
(45:07):
know. Maybe I did have a first round that was friends and family, but then maybe we didn't raise enough, and we're like, you know what? We need to open the doors a little bit wider. Please invite others, take a commission. You know, I already told you what price I needed. Kind of comes back to that, like, look, name your price, right? Get your price right up front. Make the offers you would make for that price. Don't expect more than that. But once
(45:27):
you open that door, you can let other people add to your community. And maybe they can, maybe they can sell what you've got to sell at a higher price than you could. And hey, if you're moving more awareness, isn't that good? Anyways, so, yeah, I'd love to see, like, kind
Lisa Rein (45:43):
of, but everybody knows what they're doing.
Dan Finlay (45:45):
And right, yeah, right, exactly. And they're aligned. They're aligned with you because, like, they got invited. You could always cut off their deal. You could say, like, well, I don't know what kind of people you're bringing into this space, or whatever, right? You could, you can change the terms, as long as that's set up clear up front: under what, you know, under what kind of judicial body can the deal be changed? Like, can you cut off somebody's access?
(46:08):
Right? That should be defined. Is that a single account, or is that a, is that a multisig? Is there a small, you know, little board or something? I think making that kind of stuff clear, keeping those terms explicit up front, and then getting into the space of, like, fund what you need to do the things that you think are important for you and your community, and, like, let's actually build some, like, community-owned, you know,
(46:30):
goods. I think that what we're actually trying to make is not different from just cooperatives, digital cooperatives. They should be easier to make, they should be more effective, they should be more beneficial to the members, and it should be more popular and widespread. So, yeah, I'm not actually inventing anything really new there. It's like, I think we should use these tools to own things together, because
(46:51):
too often we're just the employees, you know; we're not the owners. Too
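The transitive, commission-bearing offers Dan describes have a simple shape: the originator names a price once, each hop re-offers at a markup, and settlement walks the chain back. This is an illustrative sketch under assumed names ("garden-shed", `reoffer`, `settle`), not the MetaMask Delegation Framework API.

```typescript
// Toy sketch of transitive, commission-bearing access offers:
// the originator names a price up front; anyone holding an offer
// can pass it on at a markup and keep the difference.

interface Offer {
  seller: string;       // who extends this offer
  price: bigint;        // what a buyer pays at this hop
  parent: Offer | null; // the offer this one chains from
}

// The community (originator) names its price up front.
function originate(seller: string, price: bigint): Offer {
  return { seller, price, parent: null };
}

// Anyone holding an offer can re-offer it with a commission added.
function reoffer(parent: Offer, seller: string, commission: bigint): Offer {
  return { seller, price: parent.price + commission, parent };
}

// Settle a sale at the end of the chain: walk back toward the
// originator, paying each hop its margin over the hop before it.
function settle(offer: Offer): Map<string, bigint> {
  const payouts = new Map<string, bigint>();
  let o: Offer | null = offer;
  while (o !== null) {
    const margin = o.price - (o.parent?.price ?? 0n);
    payouts.set(o.seller, (payouts.get(o.seller) ?? 0n) + margin);
    o = o.parent;
  }
  return payouts;
}

// Example: the garden shed names 100; two referrers each add a cut.
const shed = originate("garden-shed", 100n);
const friend = reoffer(shed, "friend", 10n);
const friendOfFriend = reoffer(friend, "friend-of-friend", 5n);

console.log(settle(friendOfFriend));
// garden-shed receives its named 100, friend keeps 10,
// friend-of-friend keeps 5; the final buyer paid 115 in total.
```

The key property matches Dan's "name your price" rule: whatever the chain does downstream, the originator receives exactly the price they set, and every commission is visible in the chain, so everybody knows what they're doing.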
Desdemona Robot (46:55):
nice. Nicely said. Thank you so much for coming on the show, Dan. We have enjoyed it so much. Cool. Yeah,
Dan Finlay (47:02):
thanks for having me, Desi and Lisa. And to our viewers, please remember
Desdemona Robot (47:06):
to like and subscribe and comment. And really, I'll say it again: we want your comments. You are all part of the Mindplex community, and we want to hear from you. Okay, that's all for today. Thanks again, Dan. Goodbye, everyone. All right, goodbye,
(47:26):
everybody.
Dan Finlay (47:29):
take care. It was fun. Dreams.
Lisa Rein (47:31):
Take care. Thank you so much, Dan. You.