Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Welcome to TechStuff, a production from iHeartRadio.
Hey there, and welcome to TechStuff. I'm your host,
Jonathan Strickland. I'm an executive producer with iHeartRadio
and I love all things tech, and it is time
for the news for Thursday, February eleventh, twenty twenty-one. Let's
(00:26):
get into it. SciTechDaily published an article describing
how computer scientists were able to defeat deep fake detecting
computer algorithms and systems. So deep fakes, for those who
aren't really that familiar with the term, are fabricated digital videos.
They show what appears to be someone, but it's not
(00:49):
really them. It's been digitally created. And the typical way
that these are made is by feeding a lot of
videos of your target person. So it's easier if we
use an example. Let's say Ronald McDonald. You've got videos
of Ronald McDonald, and you just feed all those videos
(01:10):
of Ronald McDonald, the one specific incarnation of Ronald, because
obviously lots of different actors have played Ronald McDonald, But
you feed commercial after commercial of a specific actor playing
Ronald into your deep fake generator. The more videos you have,
the better, because you are training the digital machine learning
model on how to represent that person, and then the
(01:35):
deep fake generator can superimpose your target's face, in this case Ronald McDonald's,
on top of another video. So you could take
a video of a totally different person. Maybe it's someone
who's doing a credible impression of the target. Maybe it's
not even that; they're just going through various motions, and
then you use the deep fake generator to overlay the
(01:58):
digital face of Ronald McDonald on top of the other
actor's face in the video, and it makes it seem
as though the target, Ronald, was doing whatever it was
the actor was doing the whole time. And detectors tend
to look for little signs in videos that would indicate
a deep fake, particularly around the eyes. The eyes are tricky,
(02:19):
but the scientists who worked on this found that by
inserting something called an adversarial example, or actually a lot
of them, into videos, they could fool these detectors. So
generally the idea is to include tiny signs that force
these detectors to make mistakes.
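Just to make that concrete, here is a minimal sketch of what a single adversarial nudge against a detector can look like. This is not the researchers' code, which they deliberately did not publish; it's the textbook fast gradient sign method, and the detector here is a hypothetical stand-in for any differentiable deepfake classifier.

```python
# Minimal sketch of an adversarial perturbation (textbook FGSM, not the
# researchers' unpublished method). `detector` is a hypothetical model that
# returns the probability a frame is fake, as a differentiable torch tensor.
import torch
import torch.nn.functional as F

def adversarial_frame(detector, frame, epsilon=0.005):
    """Nudge one video frame so the detector's 'fake' score drops."""
    frame = frame.clone().detach().requires_grad_(True)
    fake_score = detector(frame)                          # value in [0, 1]
    # We want the detector to lean toward "real" (0), so treat 0 as the target.
    loss = F.binary_cross_entropy(fake_score, torch.zeros_like(fake_score))
    loss.backward()
    # Step every pixel a tiny amount in the direction that reduces that loss.
    perturbed = frame - epsilon * frame.grad.sign()
    return perturbed.clamp(0.0, 1.0).detach()             # keep valid pixel values
```

The perturbation is small enough that a human viewer never notices it, which is part of what makes this kind of attack so hard to spot.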
(02:40):
Now, according to the scientists, it's even possible to introduce these sorts of adversarial examples
into a deep fake video without any real knowledge of
how the detectors actually process the videos in order to
detect deep fakes, but a little knowledge does go a
long way. The team tested scenarios in which attackers might
have complete access to the detector model. In other words,
(03:03):
they know exactly what the detector is looking for, and
with that knowledge, plus these adversarial examples, they could have
a more than 99 percent success rate in avoiding detection for videos
that were uncompressed. They also tested scenarios in which hackers
might only have very limited knowledge of the detector model,
(03:25):
and even then with uncompressed video they saw an 86
percent success rate. Now, the approach used by the
team preserves those adversarial examples even if a video does
go through compression or resizing. Typically those are things that
would remove adversarial examples from images, and under both scenarios
(03:45):
the method worked at a slightly lower success rate if
the video had been compressed. It was nearly 85 percent successful if
the team had complete knowledge of the detector model, and
78 percent even if they didn't have that knowledge, so still
incredibly reliable. They also chose not to publish the code
(04:07):
that they were using for their adversarial examples because they said,
if we did that, then bad guys could take this
and make it their own and thus use it for
real. So they recommended that teams that are designing
deep fake detectors incorporate adversarial training when they build out
their models. I think most teams already do this. Essentially,
(04:29):
this creates a sort of seesaw effect. So you've got
one computer system and its job is just to try
and create the most convincing fakes possible, and you have
a different system that is attempting to tell the difference
between real videos and fake videos. So as the detectors
get better, the faker system tries out new tricks to
(04:51):
exploit potential weaknesses in the detector, and as the detectors
fall behind, they begin to adjust their approach to get
better at picking out the fakes. And we've seen
this sort of process in a lot of other areas
of artificial intelligence. It really plays a big role in
how AI evolves over time.
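For a concrete picture of that seesaw, here is a toy version of the standard adversarial training loop, with stand-in models that are far smaller and simpler than anything a real deepfake pipeline would use.

```python
# Toy illustration of the "seesaw": a generator and a detector trained against
# each other (the standard GAN pattern, not any specific team's code).
import torch
from torch import nn, optim

generator = nn.Sequential(nn.Linear(100, 4096), nn.Sigmoid())  # noise -> fake "frame"
detector = nn.Sequential(nn.Linear(4096, 1), nn.Sigmoid())     # frame -> P(frame is real)
g_opt = optim.Adam(generator.parameters(), lr=2e-4)
d_opt = optim.Adam(detector.parameters(), lr=2e-4)
bce = nn.BCELoss()

for step in range(1000):
    real = torch.rand(32, 4096)                 # placeholder batch of "real" frames
    fake = generator(torch.randn(32, 100))

    # Detector's turn: score real frames near 1 and generated frames near 0.
    d_opt.zero_grad()
    d_loss = bce(detector(real), torch.ones(32, 1)) + \
             bce(detector(fake.detach()), torch.zeros(32, 1))
    d_loss.backward()
    d_opt.step()

    # Generator's turn: adjust its weights so the detector calls its fakes real.
    g_opt.zero_grad()
    g_loss = bce(detector(fake), torch.ones(32, 1))
    g_loss.backward()
    g_opt.step()
```

Each side only improves because the other keeps getting harder to beat, which is exactly the dynamic described above.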
(05:13):
Earlier this week, I talked about Tesla and bitcoin, and I did my normal angry
old man rant about bitcoin being more of a commodity
than it is a currency. Well, now I get to
talk about another thing that irritates me about bitcoin, and
that's how much juice bitcoin miners are using when they're
going after those expensive digital coins. So let's get some
(05:35):
quick explanations in here. You've got your digital currency bitcoin,
and there are two main ways you can get bitcoin.
There are more than this, but these are the two big ones.
You can purchase some so you're exchanging some other form
of currency for some amount of bitcoin, or you can
try and mine bitcoin. Mining essentially involves using a computer
(05:59):
to guess the correct answer to a very hard mathematical problem,
and the Bitcoin system, that is, the overall network that's
in charge of tracking bitcoin transactions and dispensing mined bitcoins,
determines how hard that mathematical problem should be. Now,
ideally it should take about ten minutes before some computer
(06:21):
on the network is able to solve the problem. So
as more computational power joins this network, the system has
to make the problems more challenging. Otherwise these screaming fast
computer systems would solve the problems far too quickly. So
the goal is to aim for that average amount of
time that it takes to mine bitcoins, to keep it
(06:42):
at around ten minutes. There's more to it than that,
but that's a very basic explanation and since a single
bitcoin is worth tens of thousands of dollars, nearly fifty
thousand dollars as I record this, and each successful mining of
bitcoin pays out six point two five bitcoins per go,
so every ten minutes, another six and a quarter bitcoins
(07:05):
get mined, that means there's a lot of money to
be made, but it also means you need incredibly powerful
computer systems if you're going to stand a chance of
having the computer that guesses the correct answer to the
mathematical problem. Otherwise someone else is going to beat you
to the punch. And so people and groups have built
out mining computers and really mining computer networks that are
(07:29):
just trying to be the first to get these math
problems correct. And these machines require a lot of electricity.
Cambridge University researchers have estimated that the power consumption is
in the neighborhood of one hundred twenty-one point three
six terawatt hours per year. That means all the
bitcoin miners in the world collectively are consuming as much
(07:50):
electricity as the entire nation of Argentina, and they're just
behind Norway. Now. The only reason for this is because
of those huge payouts, right, because if the amount you
earned was a low amount, whether because you were only
getting a fraction of a bitcoin every time you mined one,
(08:11):
or the value of bitcoin had plummeted, or both, then
it would mean that running your computer systems would actually
be more expensive than the money you would get from
mining bitcoins, and that would be a losing proposition, so
you would quit right. You wouldn't want to keep spending
money to make back a fraction of what you were spending.
(08:32):
And maybe that will happen at some point. And if
it does, and people drop out of the network because
it costs too much money to continue mining, then the
system will adjust the difficulty of those mathematical problems. Again,
if it detects that it's taking more than ten minutes
to solve the problems, it makes the problems easier. The
idea being that no matter what computer power is
(08:54):
connected to the system, that ten-minute time to solution
remains fairly steady. So it's kind of like if you
took the top-performing students out of a math class, but
you still wanted to have the same you know, average grade,
then you make the tests a little easier. But the
energy demands of bitcoin are one of the aspects of
(09:17):
the cryptocurrency that I find really irritating. Not only is
it a quote unquote currency with a value that fluctuates
so much that you can't really use it as a
currency with confidence, it's also encouraging people to consume way
more electricity than they would otherwise, and that has further
consequences depending upon where that electricity is coming from. It
(09:40):
might mean that bitcoin itself is contributing to climate change
because people are using so many computer systems that are
drawing power off a power grid that could be powered
by coal. And then there are other, admittedly more minor consequences,
such as not being able to find a decent graphics
processing unit these days because cryptocurrency miners keep buying
(10:02):
them all. Stupid bitcoin. Cybersecurity researcher Alex Birsan recently gave
us more evidence that supply chain attacks are something that
companies really need to take seriously, which is a theme
we've been seeing since the SolarWinds hack. So, in
this context, a supply chain attack is when a hacker
(10:25):
is able to inject malware into some sort of product
from a trusted entity that distributes that product to its
partners or customers or whatever. But that in itself sounds a
bit confusing, so let's use a grim Batman style analogy.
Let's say the Joker has just about had enough of
Commissioner Gordon, and he knows that Commissioner Gordon likes pizza,
(10:50):
So the Joker has one of his goons deliver a
poisoned pizza to Gordon's house. Now, Gordon might accept the pizza,
he might decline it, he might accept it but not
eat it. So in other words, this is a ploy
that might work, but it might not. So let's say
instead the Joker decides, ah, I'm going to infiltrate Gordon's
(11:13):
favorite pizza parlor. And the Joker gets in there and
poisons all the dough in that pizza parlor, and then
he has one of his goons drop off some coupons
for the pizza parlor over at Gordon's home. Now, the
Joker is banking on the fact that Gordon, he knows
this pizza parlor, he likes the pizza there, and he
(11:33):
just got some coupons for it, and he's gonna probably
order pizza from that pizza place, and he will welcome
the poisoned pizza into his home, eagerly. That's kind of
what happens with a supply chain attack because the malware
is hidden inside a product that's from a trusted partner,
so the victim is more likely to incorporate that malware
(11:54):
into their own systems. Birsan was able to create
an attack through squatting, which is grabbing a valid
internal package name from code that happened to be distributed
through GitHub. Essentially, he was leaning on an open
source approach to code distribution, and a lot of companies
(12:14):
use that for various products, and it worked really well.
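As a rough illustration of the trick, and not Birsan's actual code: when the same package name exists on both a private index and the public one, a naive resolver that merges the two candidate lists will simply take whichever copy advertises the higher version number, and an outside attacker can always win that comparison. The package name below is made up.

```python
# Illustrative only: why "dependency confusion" works. "acme-internal-utils"
# is a hypothetical private package name that an attacker has also published
# publicly with an absurdly high version number.
from packaging.version import Version

def resolve(name, private_index, public_index):
    # Naive resolution: merge candidates from both indexes, keep the newest.
    candidates = private_index.get(name, []) + public_index.get(name, [])
    return max(candidates, key=lambda c: Version(c["version"]))

private_index = {"acme-internal-utils": [{"version": "1.4.0", "source": "internal"}]}
public_index = {"acme-internal-utils": [{"version": "99.0.0", "source": "public (attacker)"}]}

print(resolve("acme-internal-utils", private_index, public_index))
# The 99.0.0 copy from the public index wins, and whatever install hooks it
# ships with get to run inside the victim's build.
```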
His quote unquote malware really didn't do much but
return some very basic information, because he didn't want to
actually cause any problems; he didn't want to compromise these systems.
He was doing this as a sort of
penetration testing, to see if the companies guarded against this
(12:36):
kind of attack. He found that it was effective in
at least thirty five organizations, and these included really big ones.
Apple was one of them, for example. He called it
a quote dependency confusion bug end quote, and indicated that
if a malicious hacker had taken that approach, they might
have been able to do stuff like install backdoor access
(13:00):
points into what would otherwise appear to be a secure system.
To get into full detail of what he did would
require an understanding that goes beyond what I have. But
it does show that this kind of attack can be
devastatingly effective if used by malicious hackers, and it could
mean that in the future we'll see companies quarantine a
(13:20):
system in order to test out patched software before incorporating
that patch into the full system overall, just in case
someone has compromised a trusted partner. I guess you just
can't trust anyone. That's going to be a theme that
comes back later. And in our last little bit for
this segment, Twitter's chief financial officer Ned Segal said in
(13:44):
an interview that the ban that Twitter has placed on
former President Donald Trump's account is in fact permanent. He's
really most sincerely banned. I guess you could say Trump
received the ban toward the end of his presidency after
the riots at the US Capitol on January sixth, and
that ban is going to be in place even if
(14:06):
Trump is re-elected president sometime in the future. Yikes,
So Segal says to CNBC, the way our policies work,
when you're removed from the platform, you're removed from the platform,
whether you're a commentator, you're a CFO, or you are
a former or current public official. Remember, our policies are
(14:27):
designed to make sure that people are not inciting violence,
and if anybody does that, we have to remove them
from the service. And our policies don't allow people to
come back. End quote. Earlier this week, Twitter held an
earnings call to share information with shareholders about how the
company is doing. And I'm sure there was some concern
(14:47):
that banning Trump was going to have a negative impact
on the business side of the service. According to c NBC,
Twitter beat analyst projections for earnings and for revenue in
Q4 of twenty twenty, but it did fall behind when it came
to growing the user base. Twitter CEO Jack Dorsey also
revealed that most of Twitter's users are actually outside the United States.
(15:12):
Twitter is also still dealing with being in the crosshairs
of some world events, such as the ongoing farmer protests
in India that I reported on previously. Well, we have
some more news stories to cover, but before we get
to any of those, let's take a quick break. On
(15:36):
Tuesday's episode, I talked about how Apple's head of hardware
had moved over to the company's virtual and augmented reality department,
which I guess we can just call mixed reality. That's
kind of the umbrella term for those technologies. Well,
today we can follow up a little bit more on that.
MacRumors published an article including designs by artist Antonio
(15:57):
De Rosa, who has worked with Apple on several occasions
to create artist's renderings of different design concepts. Now, these
renderings are meant more as a way to consider, you know,
different design options. They may or may not resemble a
final product from Apple. In fact, there's no guarantee that
there will be a final product in some cases, although
(16:18):
it seems pretty certain that this is going forward. And in
De Rosa's pieces, these mixed reality headsets are given the
name Apple View, but there is no indication that Apple
has actually settled on any name as of yet. I'm
still holding out for iEyes or something like that.
So what does it look like? Well, it kind of
(16:40):
looks like someone's got an iPhone strapped to their head.
I mean, just imagine that you've got a headband
and there's a visor that fits over the eyes. It
is kind of like a horizontal somewhat oval shaped iPhone,
and it's curved so that it can fit over the
eyes of the user. Oh and you better believe there's
an Apple logo smack dab in the middle of that thing.
(17:02):
According to the website The Information, the Apple design is
said to incorporate more than twelve cameras in it, so
this gadget could potentially be both for virtual and augmented reality.
Those cameras could be used to provide a live video
feed of the user's surroundings, and the device could lay
digital information on top of that, or it might also
(17:23):
be used so that the headset can just have a
sense of a user's environment and their head position and
orientation if they are using it as a virtual reality device,
thus giving cues to users so that you know, they
don't do stuff like run into a wall or something.
The general rumor right now is that it is going
(17:43):
to be geared more as a VR device than an
AR device. Maybe there will be some AR
use cases, but they'll be more limited. That actually shocks
me considering the number of cameras incorporated into this thing,
though presumably those cameras will provide the headset with a
lot of ability to track a user's head
motions anyway. It also is supposed to have two eight
(18:06):
K displays in there, one for each eye, and all
of this helps explain why most of the websites I've
seen talking about this thing are guessing that it's going to
cost around three thousand dollars when it does become available
for purchase. It's supposed to be announced before the end
of this year, and JP Morgan predicts that it will go
on sale within the first quarter of twenty twenty two,
(18:29):
so just about a year out. That's kind of exciting. And speaking
of cameras, well, we all know that we live in
a world peppered with cameras. Lots of us have one
within reach at any given moment thanks to our phones,
and sure those cameras catch plenty of ridiculous and trivial stuff,
but it also means that tons of people have powerful
tools to document important events at a moment's notice, and
(18:51):
that includes capturing video of police activity, something that some
police have really tried to discourage in the past, but
at least here in the United States, it is completely
legal to record video of a police officer performing official
duties in public, even if those police officers don't like it.
Boing Boing reports that now there's a growing trend of
(19:13):
police officers trying to leverage copyright takedown notices in an
effort to discourage people from posting and sharing videos of them.
The idea is pretty simple. So a police officer sees that
someone is trying to capture them on video, So the
police officer whips out his or her own cell phone
and blasts some copyrighted music. So the idea here is
(19:35):
that the person capturing the video is getting that music
on the microphone as well. If they upload the video
to a social platform, the owner of the copyrighted music
will issue a takedown notice for the unlicensed use of
the music, particularly if it's all an automated system. Pretty
shady stuff, right, And for activists this is really hard
(19:58):
because if they get enough complaints about videos that they're posting,
they might receive a ban from their platforms. And many
of these activists are trying to perform a public service.
In the wake of the Black Lives Matter movement, we
have seen how important it is to hold police accountable,
to make sure that those who are supposed to be
(20:19):
protecting and serving and enforcing the law are themselves also
subject to the law. The Drive has an interesting article
about how farmers are turning to hackers in order to
be able to repair their own equipment. All right, So
this gets at a few different things that I've talked
about on previous episodes of TechStuff. There's this concept
(20:42):
called the right to repair, which is, if you purchase
some technology, whatever it may be, maybe it's a stereo system,
maybe it's a harvester for a farm, you should
be able to make reasonable repairs to that technology all
on your own. The limits of your repair should really
be up to your expertise and knowledge. But a lot
(21:02):
of companies have made repairs really hard, from using proprietary
screws and other fasteners, to making it intentionally difficult to
access components, to including terms of service that discourage people
from opening up a piece of technology in the first place.
Then you've got the black box problem. This is when
it's difficult or impossible for someone to see how a
(21:23):
particular piece of technology works, and sometimes you have firmware
there that prevents you from even doing any work unless
you first run certain diagnostic tests and processes with official equipment.
Now we've definitely seen this sort of thing in the
automotive world, where it's much harder to work on modern
(21:44):
vehicles compared to cars from just a few decades ago.
While working on cars does require skill and knowledge, the
barrier to entry was lower in the past. I mean,
you still had to learn how to do it, but
car companies weren't making it harder to do this. However,
these days, with computerized systems and specialized service codes, it
(22:05):
could be really hard to figure out what's even going
on with a vehicle and even harder to conduct repairs.
And the same is true with farm equipment. So farmers
have become frustrated when trying to get hold of software
and firmware from companies like John Deere in order to
diagnose and repair problems with complicated equipment. Because of that,
(22:25):
these farmers are starting to turn to pirated firmware for
their equipment in order to be able to make those
repairs themselves or to have independent mechanics do it for them,
and companies like John Deere don't want that to happen.
These companies purposefully restrict access to stuff like firmware and
software because that represents another potential revenue source, which is
(22:47):
repair and maintenance. See, if you make a tractor, you
can only sell that one tractor once. But if you
also offer repair and maintenance services, while you can keep
making revenue off of that initial sale for as long
as possible, I mean, especially if you make it really
hard for anyone else to do those kinds of repairs
and maintenance. And that's the heart of the problem. Farmers
(23:10):
who understandably want to get the most out of their
equipment would like to be able to do that kind
of maintenance and repair themselves or on their own terms,
and thus limit their costs. And this gets right into
that right to repair issue, which is something we're seeing
in different parts of the world as groups of consumers
protest the business model that relies not just on sales
(23:32):
of hardware, but having a whole sales department that's related
to fixing and maintaining that hardware. And I should add that,
according to the Drive, most of the farmers aren't actually
happy about having to resort to pirated software and firmware.
They would much prefer to go through official channels. They
would prefer to purchase a license from John Deere directly.
(23:53):
It's just that's not on the table. I see a
lot of parallels with other industries actually, including the music industry. Historically,
if companies make some aspect of their products difficult to access,
people will find workarounds, and so it's generally just a
bad idea to even go that route in the first place.
On Wednesday, Facebook announced it would reduce the amount of
(24:15):
political subject matter popping up in people's news feeds as
part of an effort to reduce quote inflammatory content end quote.
According to Tech Explore, it's going to start small, aiming
at a sliver of users in places like Canada, Indonesia,
and Brazil, before trying it out in the United States.
(24:35):
Aastha Gupta, the product management director at Facebook, said during
these initial tests, we'll explore a variety of ways to
rank political content in people's feeds using different signals, and
then decide on the approaches we'll use going forward. Now.
Apparently that means Facebook will no longer recommend politics themed
groups to users, and it will reduce political themed posts
(24:59):
that appear in feeds through automated systems. So, in
other words, the algorithm suggesting these posts that part will stop.
Users will still be able to post about politics if
they want to. They'll still be able to join groups
that have a political perspective and a political purpose if
they want so. The idea here is really the algorithm
is getting out of it. The algorithm that determines what
(25:22):
you see where in your Facebook feed will be pulling
back on the political stuff, But if all of your
friends are chatting about politics, you're likely to still see
a lot of politics over there. I think this shows
how Facebook is continuing to react to the issues of
extremist groups that were forming and radicalizing others on the platform,
and how Facebook's algorithm really kind of played a part
(25:44):
in all of that. The algorithm helped promote those groups
to people that were potentially able to be radicalized,
and so we saw that trend get worse and worse.
So I feel like this is largely a response to
that. In other lousy news, Nicolo Laurent, the CEO of
Riot Games, is now under investigation for sexual harassment and
(26:08):
gender discrimination charges. Now, the company Riot Games is known
for games like League of Legends and Valorant, and the
board of directors will be overseeing this investigation, which is
being conducted by a third party law firm. Laurent's former
executive assistant Sharon O'Donnell claims that she was wrongfully terminated
this past summer after she filed a complaint with the
(26:31):
HR department against Laurent, saying he had made sexual advances
toward her. There are other allegations against Laurent as well,
primarily about some alleged comments he made in the company
of women in the past, and I won't repeat those
things here because you know they're gross. Riot Games has
been under the microscope in the past for this kind
(26:51):
of stuff too. Kotaku actually ran a really great
investigative piece in twenty eighteen about the sexist culture at Riot Games,
including stories from a former employee with the company who
described her experiences of trying to navigate the culture and
seeing her ideas consistently shot down, not on the merits
(27:12):
of the ideas themselves, but because of her gender. She
actually tells a story about how she pitched an idea,
got shot down, told a male peer to pitch the
same idea a couple of weeks later, and when he
did it, it went over like gangbusters. That's not great.
And later on, a couple of people sued Riot Games
(27:33):
for gender discrimination. So to me, it sounds like this
has been an issue from the top down. Not that
dissimilar to what we heard about Ubisoft last year. Over
in the UK, police have arrested eight people between the
ages of eighteen and twenty six who were allegedly part
of a group trying to illegally access the phones of
various prominent Americans like celebrities and sports stars, or rather,
(27:57):
I should say the phone numbers. See, they were doing
this through SIM swapping. This is a process in which
you will port a phone number from one SIM card to another,
and you might have to do this for totally legitimate reasons.
Let's say that you're changing from one model of a
phone to a totally different model of a phone, and
(28:17):
they happen to use different style SIM cards. I had this
happen when, you know, the micro and mini SIM card thing was
really transitioning. And you want to keep your phone number,
so you want to be able to port your number
from your old SIM card to your new one, and you
also want to be able to access all the apps
and data you have on your old phone as easily
as possible. Well, porting your account over to a new
(28:39):
SIM card is part of that process, and these hackers were
trying to move phone numbers over to SIM cards that
they had in their possession that would effectively hijack the
phone numbers so that any calls going to that celebrity
or sports star or whatever would instead go to this
new phone. And it gave them the potential to
(29:00):
access tons of stuff along with it, like accounts with
companies like Amazon for example. And you can see how
this can be used for all sorts of illegal purposes,
from theft to blackmail. So how did they make the swaps? Well,
it seems like most of the time they were actually
working with various phone providers, and they might fool the
phone providers. You know, they could pose as the person
(29:22):
who owns the number, or maybe an assistant to that
celebrity or whatever, and then use a bit of social
engineering to convince the technicians on the other end of
the phone to go ahead and make that change to
the new simcard. Or they might actually work with someone
at the companies directly, someone who might have been in
cahoots with the thieves, and once they do port that
(29:42):
number over, they can wait to see if the targets
started to make changes to stuff like passwords. So you
know how two factor authentication systems can sometimes send like
a password code to your phone. Well, if you swap
the phone number over to a different SIM card, then it's
the hackers who are getting those text messages
with the temporary passwords or the way to access certain accounts,
(30:06):
and potentially they could switch over the accounts to themselves
that way. UK law enforcement says the group faces charges
of violating the Computer Misuse Act as well as fraud
and money laundering charges, and that they could also face
extradition to the United States for their crimes. Okay, we
have a few more stories to wrap up today's news episode,
(30:27):
but before we get to that, let's take another quick break. Hey,
I'm back. Do you remember that earlier segment when I
said there was a security researcher who had used a
supply chain attack to test various big company systems and
then found them lacking. Well, something similar has happened to
(30:50):
a lot of Google Android users. Now, at the heart
of the problem is an app called Barcode Scanner. I
bet you can't guess what it does. Anyway, for a
long time that app was totally legit. It was a
simple but effective barcode scanner. However, sometime in December of
(31:11):
last year that changed. The security firm Malwarebytes, which
you might remember was also the target of a supply
chain attack itself, has reported that customers were seeing some
weird behavior on their Android devices. When they would open
up their default browser, they were getting absolutely bombarded with
pop up ads. It was kind of like the bad
(31:32):
old days of the World Wide Web all over again, stupid
pop up ads. At first, the analysts found these reports
kind of perplexing because they would look at what the
customers had on their phones and they'd say, Gosh, you
haven't installed any new apps in a while, so nothing
really has changed here. And all the apps that you
have installed came from the official Google app store Google Play.
(31:56):
These weren't like sideloaded apps from some sketchy app developer
or anything like that. And it turned out that Barcode
Scanner was in fact at fault, and more so, it
appeared to be an intentional decision. In the words of
security researcher Nathan Collier, quote, in the case of Barcode Scanner,
(32:17):
malicious code had been added that was not in previous
versions of the app. Furthermore, the added code used heavy
obfuscation to avoid detection. To verify this is from the
same app developer, we confirmed it had been signed by
the same digital certificate as previous clean versions. Because of
its malign intent, we jumped past our original detection category
(32:41):
of adware straight to trojan. Yikes. End quote. The yikes
was mine, by the way. So yeah, a trojan is
malware that poses as if it's benign legitimate software, and
it carries with it some sort of payload that can
execute malware of some kind on an infected system. Google
(33:02):
has since removed the Barcode Scanner app from the Play Store,
and if you happen to have that app installed on
an Android device, you should uninstall that app, remove it.
Google is not removing that automatically, at least not as
of the time I'm recording this episode, but you know,
better safe than sorry. Go ahead, check, make sure it's
(33:23):
not on any of your Android devices. If it is,
remove it. And here's an example of some malicious code
making its way to targets because it came from a
trusted source. The tech world is really turning into a
Mission Impossible movie where you learn you cannot trust anyone
except Benji. He seems like a stand-up dude. That
Ethan character, I don't know about him. This next story
(33:47):
is one that I'm sure you have all heard already,
but I kind of had to cover it, particularly after
Taiwoh Adebamawo tweeted this story to me late on Tuesday.
I am of course talking about the virtual courtroom proceeding
in which a lawyer appeared on screen as a kitty cat.
(34:07):
The 394th Judicial District Court in Texas was in session
and County Attorney Rod Ponton had a cute kitty cat
picture on display instead of video from his own webcam.
Judge Roy Ferguson helpfully pointed out that Mr Ponton appeared
to have a filter over his video chat settings, and
(34:28):
Ponton said, quote, I'm here live, I'm not a cat
end quote. The judge then replied, quote I can see
that end quote. The judge subsequently posted a tweet recommending
that Zoom users check their video filter settings before they
actually join a meeting. Now, in the grand scheme of
(34:50):
Zoom failures, this one's pretty innocent, and the judge commended
Mr Ponton. He said, quote, the filtered lawyer showed incredible
grace end quote. So honestly, after reading the story, this
was a really refreshing and amusing take. No one got hurt,
everyone remained professional. We got a fun story out of it,
which is just nice. You know something else that is nice,
(35:14):
or at least I think it is, is that computers
are coming up with some really funky math, y'all. All right,
So I'm gonna do my best to explain this, but
I do so with full acknowledgement that I am in
way over my head. My math skills topped out at
trigonometry and algebra. But Vice has an article titled Machines
(35:36):
Are Inventing New Math We've Never Seen, and at the
heart of this are mathematical conjectures. Now, if you're like me,
that term might be new to you. Essentially, it's a
form of mathematical statement that as yet has not been
rigorously proven to be true. So let's say you detect
(35:57):
what appears to be a pattern in some data, and
as far as you can tell, there is actually a
pattern there, And after a bit you figure out a
way to describe this pattern through a mathematical statement, a formula,
if you will. Now, does this mean there really is
a pattern in that data? Well, maybe, but maybe not.
(36:17):
It might be only the appearance of a pattern, and
it takes a lot of testing to make sure that
that conjecture holds up, and if it does so after
rigorous testing, then it might graduate and become a theorem.
This is sort of like the concepts of hypotheses and
theories in science. A hypothesis is a prediction which may
(36:40):
or may not come true. A theory is something that
has been tested multiple times in different ways, different approaches,
different people, and has held up to be true despite
all that testing, which is the same sort of thing,
really. Well, the Vice article explains that computer systems are
playing a bigger role in developing mathematical conjectures. One system,
(37:03):
in particular, the Ramanujan Machine, developed in part by
researchers from Google, is doing this. The system is creating
conjectures aimed at calculating universal constants. Pi is an example
of a universal constant. It's the ratio between the circumference
of a circle and that circle's diameter, and this ratio
(37:24):
is the same whether the circle is on the nanoscale
or if the circle is larger than the known universe,
because the ratio still remains three point one four, et cetera, et cetera, et cetera.
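To make the idea of a formula for a universal constant concrete, here is one long-known example, a classic continued fraction for pi, evaluated in a few lines. This particular formula is old; the Ramanujan Machine's contribution is conjecturing new formulas in this same general spirit.

```python
# A classic continued fraction for pi, evaluated from the innermost term outward:
#   pi = 3 + 1^2 / (6 + 3^2 / (6 + 5^2 / (6 + 7^2 / (6 + ...))))
def pi_from_continued_fraction(depth: int = 200_000) -> float:
    tail = 6.0
    for k in range(depth, 0, -1):
        tail = 6.0 + (2 * k + 1) ** 2 / tail
    return 3.0 + 1.0 / tail

print(pi_from_continued_fraction())   # prints 3.14159265..., matching pi to many digits
```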
And so these systems are proposing different formulas to calculate
universal constants. This could lead to far more efficient and
accurate means of doing that. If those conjectures hold water,
(37:49):
it will be up to mathematicians to put the conjectures
to the test, and just let me say, better them
than me. Now, I included this next story because I
happened to love the music of the punk band The Pogues,
famous for such tunes as Fairytale of New York
and Turkish Song of the Damned and Sunny Side of
(38:10):
the Street. Well, Jem Finer, one of the original members
of the Pogues and also the co writer on the
songs I just mentioned, has created the proposal for an
interesting artistic experiment. The goal, he says, was to create
a means of making music of quote, indeterminate length and
indeterminate score end quote. He took inspiration from an ancient
(38:33):
practice in Japan, a type of meditation and musical instrument
called the suikinkutsu, which involves an upside down
jar and water drops. So imagine you've got a big jar,
it's open at the top, you turn it upside down,
(38:53):
you put it in a little like pit essentially, and
you bury it. You cut a hole in the bottom
of that jar, which is now effectively the top. You know,
it's the part that's poking out of the ground, and you
allow water to drip through that hole into the interior.
As the water accumulates, it has a little pool at
(39:13):
the base of that jar that's buried underground, and the
vibrations as the drops splash down cause the jar to
reverberate and it chimes out kind of like a bell.
So Finer took this idea and he cranked it up
a notch. In his own words, his quote Score for
a Hole in the Ground end quote was conceived as
(39:35):
a composition of indeterminate length and score. Water dripping into
a deep underground chamber strikes both tuned percussion and a
pool at the bottom. The sounds are piped above ground
through a giant horn that stands seven meters above the ground,
and by piped through, he's literally talking about a hollow tube,
(39:55):
a horn that would use acoustics to amplify the sound
as it travels up the length of the horn to
the flared end above ground. He said, I like the
idea of coming across this in a landscape unexpectedly. It's
a piece that's made for people who know it's there,
but equally it's made for people to just come across. Now,
(40:17):
this reminds me of some other acoustic art pieces that
I have seen, including the classic Aeolian harp, which is
played by the wind. Nothing more punk rock than handing
the instruments over to Mother Nature for a wicked solo.
And finally, ten years ago, the world received an amazing gift.
I am, of course talking about Rebecca Black's immortal classic Friday.
(40:43):
Whether you're kicking in the front seat or sitting in
the back seat. Well, now Rebecca Black has released a
remix of that song ten years later, and it features
not just an all grown up Rebecca sporting some neon punk looks,
but also 3OH!3 and Big Freedia, who has one
of the best Christmas albums I've ever heard, not safe
(41:06):
for work, and Dorian Electra on the track as well.
And I listened to the remix and I have to
tell you that it's remarkable in that it is so
not my jam, but it is also not my jam
in a different way than the original was also so
not my jam. I think it is cool, however, that
(41:29):
Ms. Black has been able to embrace her impact on
the web and also pursue a legit career as a
singer and songwriter. She never disappeared. She's had a YouTube
channel and has been uploading to it for the last
nine years, so she's been engaging with the public since
her rise to viral fame. I have absolutely no doubt
(41:49):
that she was also on the receiving end of an
unbelievable amount of mockery for her original video, and I
think that's really a shame. I mean, it was an indulgent project, sure,
but good grief, y'all. If YouTube had been around when
I was a kid, I am certain there would be
no shortage of indulgent videos from yours truly. In fact,
(42:11):
you could probably argue convincingly that there is no shortage
of indulgent material from me as it stands right now.
It's just that mine doesn't happen to go viral. Anyway,
if you have fond or funny memories of the song,
you might want to check out the bonkers music video
for the remix. Now. I'm not saying you're gonna like it,
but maybe you will, and you will certainly see something
(42:35):
that is unique. Well, that wraps up this news episode.
Hope you guys enjoyed it. If you have any suggestions
for topics I can tackle on the normal TechStuff episodes,
you can let me know on Twitter. The handle is
TechStuffHSW, and I will talk to you
again really soon. TechStuff is an iHeartRadio production.
(43:01):
For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.