Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Hey there folks, lock your screen and mute your mic because you're now on cracking the
(00:05):
code with Craig Peterson, privacy and security unraveled.
Our mystery box today, well it seems Microsoft has been playing fast and loose with our privacy.
Yeah, it's a problem.
You are not alone, at least when it comes to your privacy and security, that's why I'm
(00:25):
here.
I'm Craig Peterson and you're listening to News Radio 610, WGIR-AM, and FM 96.7.
Join me with Chris Ryan Monday mornings at 7:34 right here.
Hey, queued up this week: Outlook, more like look out.
(00:46):
You probably saw this in your newsletter this week and it's really kind of scary.
Do you remember the times when we accepted the terms of service without a second thought?
You know conveniently trading some of our information for using services like Microsoft
Outlook.
Yeah, so about that, right?
(01:06):
It turns out there is a bit more to it.
Outlook isn't just your friendly email service anymore.
It's converted into a data-collecting gopher for Microsoft and its whopping 772 external
partners that Microsoft is sharing their data with, well, your data, right?
(01:28):
We're talking about friends in high places here, Microsoft, you get to know them and they
will give you all kinds of great data, just passing a couple of bucks under the table.
So you might ask yourself right now, what's the game?
What am I talking about?
Do you even use Outlook?
I tend to not use it.
I use Mac mail, typically a lot of people use Thunderbird mail, which is a free mail client
(01:54):
from our friends at Mozilla and it's really quite secure.
But let me tell you, this is nothing in our favor because all of this data collection
actually helps them personalize ads for you and perhaps sell your taste in Hawaiian shirts
to whoever's interested, including your friends who might mention something to you, Microsoft
(02:15):
picks up on it, assumes that you might be interested in it as well.
And you know what?
Hey, you never looked for socks online and all of a sudden you're getting Hawaiian shirt ads,
right?
And socks and who knows what else that you never looked at.
And that's because they are now selling your information, okay?
(02:35):
And what are they actually accessing?
Well, hold on tightly to your coffee mugs because, you know, we only run on caffeine 24/7,
because here's the pinch.
It's everything from your emails to your contacts and your event details.
Microsoft has basically made peeping Tom an official job role for them.
You know, a little birdie tells us the new outlook has been acting a bit shady.
(03:00):
It's been siphoning off data, right?
And remember, not all heroes wear capes.
Some of them wear the badges of data hoarders, or data hoses, as I like to call them.
This new Outlook that Microsoft has is really not a desirable product.
Microsoft is making more money than they ever have, basically, at this point.
(03:21):
And now they've decided that they can theoretically read all your emails.
Don't know about you, but that sounds like my worst nightmare.
And maybe even on steroids.
So to put it in perspective, imagine Microsoft having the access and ability to scan and
analyze all of your emails, use machine learning on them to figure out what you're talking
(03:46):
about, what related products might be out there, and then share that with, as of today, 772
third parties.
That sounds pretty daunting, doesn't it?
Well data collection folks, it's no longer a side job.
It's the big league.
You've seen all kinds of little guys do that over the years.
If you're using one of those free apps on your phone or on your computer, you can be
(04:09):
sure it's free because they're collecting data on you and selling it, maybe even delivering
ads.
And our friends over at Microsoft aren't shying away here, but they're putting a double effort
into magnifying this whole charade.
In fact, if you dare to even blink, you'll find your personal data shipped off to strange
services you've never heard of, turned into moolah.
(04:32):
You knew nothing about it.
No notice, even.
I found this stuff hidden away in some of the darker, quieter sides of the
internet.
It's just nuts, right?
Data is the golden egg and we're just laying out those eggs for them.
(04:53):
Now if all this makes you wish you were living in a pre-digital world, I'm with you, but
folks, let's not go throwing our devices out the window just yet.
There's some things you can do.
If you're interested, email me, me@craigpeterson.com, and if enough people respond, I'll put together
a special thing on mail clients that you can use other than Outlook that are not selling
(05:18):
your data and are trying, in fact, to keep it safe, right?
So you can avoid being piggybacked on, but you've got to let me know.
If I have enough people, I'll put the time in.
As you can probably guess, I'm a pretty busy guy with all my business customers, but I'd be
glad to do that: me@craigpeterson.com.
So first up here, guys, consider using two-factor authentication.
(05:42):
Some form of multi-factor authentication, too, is a must for added security.
Don't use SMS text messages.
They are not secure.
So for something even more secure, look at Duo, at duo.com.
It costs money, but it's great.
It allows you to share your passwords.
It allows you to share them with your family members, with other people within your organization
(06:06):
and only specific passwords.
It is really good.
I don't make a dime off of this, by the way, you probably figure that out, right?
Second, when it comes to managing passwords, don't go for the basics.
Check out 1Password as your best new friend for generating and managing passwords like a
pro.
I'm going through a list that I made, and I just attributed some things to Duo
(06:28):
that are actually 1Password.
1Password is the one that lets you keep passwords, create passwords, have them automatically
filled in when you go to a website, and share them as necessary.
Duo is the one that does the multi-factor authentication, and it does it quite well.
We use it at all of our customers.
I can't think of a single one we don't use it at.
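For the hands-on folks reading this transcript, here's a tiny Python sketch of how app-based one-time codes work in general. It assumes the pyotp library (pip install pyotp), and it is not Duo's actual mechanism, since Duo mostly does push approvals; it just shows why a rolling six-digit code beats an SMS text.

```python
# A minimal sketch of app-based one-time codes (TOTP), using the third-party
# pyotp library (pip install pyotp). This is NOT Duo's actual mechanism
# (Duo mostly does push approvals); it just shows why a rolling six-digit
# code beats an SMS text: the shared secret never leaves your device.
import pyotp

secret = pyotp.random_base32()   # shared once with the service at enrollment
totp = pyotp.TOTP(secret)

code = totp.now()                # changes every 30 seconds
print("Current code:", code)
print("Server accepts it?", totp.verify(code))
```

The point is that the secret itself never travels with each login; only the short-lived code does.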
And lastly, protect yourself against ransomware.
(06:50):
There's some tools that you can use that are highly effective, and they start free.
For instance, OpenDNS.
We sell the commercial version of that.
That's called Cisco Umbrella.
It has a lot more features and does better lockdown and other things, but OpenDNS, find
it online.
Just do a DuckDuckGo search, and you'll find it there, and it is going to stop some 90%
(07:16):
of all of the ransomware attacks, right?
Because normally the way it works is there's a phishing email.
You click on the email, you didn't mean to click on the email, it took you somewhere,
and it tricked you into downloading some software, or maybe you're using an older computer and
it downloaded it automatically.
Yeah, I got to love the old windows, right?
(07:38):
And guess what happens at that point?
Well, now you are infected.
So now this piece of malware, this ransomware, calls home to its command and control in
order to register and find out what they want it to do, right?
What should I be doing here to this poor person?
And it's been interesting over the years to see how some of these different pieces of
(07:58):
ransomware have evolved.
One of them, and more than one now actually, looks to see if you have a Russian keyboard
on your computer.
Now, this doesn't have to be a real Russian keyboard.
It can just be a virtual Russian language keyboard, because they don't want to get in
trouble with the Russian government, because they are located inside Russia, right?
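For the curious, here's a rough, Windows-only Python sketch of the kind of keyboard-layout check those ransomware strains do, using the standard user32 GetKeyboardLayoutList call through ctypes. It's shown purely so you understand the trick, not as anyone's exact code.

```python
# Rough, Windows-only illustration of the keyboard-layout check, using the
# standard user32 GetKeyboardLayoutList API through ctypes. The low 16 bits
# of each layout handle are a language ID; Russian is 0x0419. Shown purely
# so you understand the trick, not as anyone's exact code.
import ctypes

user32 = ctypes.windll.user32
count = user32.GetKeyboardLayoutList(0, None)         # ask how many layouts exist
layouts = (ctypes.c_void_p * count)()
user32.GetKeyboardLayoutList(count, layouts)          # fill the buffer

language_ids = {int(handle or 0) & 0xFFFF for handle in layouts}
print("Russian keyboard layout installed?", 0x0419 in language_ids)
```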
(08:20):
So there's some ways to shut it down in some cases, but in almost every case, if it can't
call home to get its instructions, it cannot and will not encrypt your stuff, you don't
get the ransomware, okay?
So open DNS.
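And here's a small sketch of the idea behind that OpenDNS-style filtering: send your lookups to a filtering resolver and let it refuse to answer for known-bad domains, so the malware can never find its command and control. This assumes the third-party dnspython package; the addresses shown are OpenDNS's public resolvers.

```python
# Small sketch of the idea behind DNS filtering: send lookups to OpenDNS's
# public resolvers (208.67.222.222 / 208.67.220.220) and let them refuse to
# answer for known-bad domains, so malware can't find its command and control.
# Assumes the third-party dnspython package (pip install dnspython).
import dns.resolver

resolver = dns.resolver.Resolver()
resolver.nameservers = ["208.67.222.222", "208.67.220.220"]

try:
    answer = resolver.resolve("example.com", "A")     # swap in any domain to test
    print("Resolved to:", [record.to_text() for record in answer])
except dns.resolver.NXDOMAIN:
    print("The resolver says that domain does not exist (possibly filtered).")
```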
And of course, if you're using Windows, turn on Windows Defender, Microsoft finally realized
(08:42):
that they're creating all kinds of problems, and they need to help solve them.
And they've done, I've got to tell you, I'm very respectful of some of the stuff they've
done over the last few years with the new guy at the helm of Microsoft, he's doing a
pretty darn good job.
But Windows Defender, that's the replacement for some of these antivirus programs you've
(09:03):
had over the years, you know, the McAfees or the Nortons, all those basic ones; we tend
to go for the much more advanced stuff from Cisco, but Windows Defender, again, it's one
of those 90% solutions, right?
So let's see, 90% better with OpenDNS, 90% better with Windows Defender, that means
(09:24):
what?
You don't just add those together; you multiply what each one misses, 10% of 10%, which is 1% getting through, so call it 99% effective, right?
That's what you hope for.
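If you want the back-of-the-napkin math on that, here it is, assuming the two layers miss independently:

```python
# Two layers that each catch about 90%: what gets through is the product
# of what each layer misses, not the sum of what they catch.
opendns_miss = 0.10
defender_miss = 0.10
gets_through = opendns_miss * defender_miss                   # 0.01
print(f"Roughly {1 - gets_through:.0%} effective together")   # ~99%
```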
Now, I want you to remember folks, understanding technology really shouldn't feel like deciphering
the Enigma code, I get it, right?
This isn't World War II.
Let's unravel the tech talk together, one episode at a time, and if you loved our little
(09:44):
tech talk tour today, talking about Outlook and what Microsoft is now doing with the new versions
of Outlook (hopefully, by the way, enough people will complain that they'll stop),
then sign up for my free weekly emails over at CraigPeterson.com; we cover everything
from the latest tech insights to security secrets, things that I'm doing. In fact, I'm
(10:06):
planning on coming out with a new email that comes out every week that goes into some of
the big problems that are lurking out there at the time.
And I've said it before, I'll probably say it again many times, but make sure you guys
do your patches, right, and I understand why you don't do it, it can cause some serious
problems, right?
Patching can really hose up your computer in an extremely big way, which, again, you
(10:31):
don't want to do.
And all of us, when we're working at work, we're trying to get our work done.
And it's a reasonable fear to say, well, if I apply a patch, am I going to lose hours
or days worth of work while I try and recover from a bad patch?
And guys, I understand that.
So another thing Microsoft did in order to get around this problem to a degree is you
(10:56):
can now roll back patches.
You have 10 days to do it.
And if the patch didn't work right for you, if it messed something up, you can say, okay,
roll back and it will go ahead and put all of your old stuff in place, which is absolutely
fantastic.
But you only have 10 days for that.
Unless, of course, you're doing some sort of an additional backup, in which case you might
(11:20):
be able to restore your machine to the previous state.
So that's our problem with our friends at Microsoft and Outlook.
It's our new frenemy, frankly, here.
And I'm going to talk next about the electric vehicles.
And are they entering the total failure phase of their existence?
Check it out online and make sure you get the newsletter, CraigPeterson.com.
(11:43):
Hear me out, folks.
It's high time we dipped our toes into the stormy pool of electric vehicles.
Stop me if you've heard this one.
So an electric vehicle strides into a bar.
And the bartender says, what will it be?
And it replies, a recharge and a few warranties, buddy.
(12:05):
Yeah, EVs are supposed to be our savior from the filthy claws of fossil fuels, right?
A knight riding on a lightning bolt, rescuing us from impending doom.
We're all going to get flooded out.
President Obama's home on the Cape, right there by the ocean, is going to be flooded
out.
(12:25):
What, as of 15 years ago, according to Al Gore, do you remember all of that?
Well, now, how is that all working out?
Well, according to Red State and the Gateway Pundit, not so well, my friends. Our overpriced
and unreliable friends like Ford's Lightning seem to be more sparkle than thunder.
(12:48):
Who would have thought?
I like Ford, I drive Fords, my family all drives Fords, but are you kidding me?
A whopping $55,000 for a truck you can't even take for a spin more than a couple hundred
miles.
And if you're trying to use your Lightning truck for business and you're hauling things
around, remember, you've got an extra one ton of just battery weight in that Ford
(13:13):
Lightning truck.
Oh, by the way, also in other vehicles.
Yeah, a ton is a pretty common amount just for the battery.
So you have even less range because of the batteries you're hauling around.
You have even less load capacity because you already have an extra ton in the car.
Not very practical, is it, folks?
It's just another case of pretty wrapping around a questionable gift.
(13:37):
Think about it.
Our federal government is forcing this on us.
It's forcing it on the auto industry.
Maybe we should follow the money a little bit here, but there's even a more bizarre part
because those malfunctioning electric vehicles were not the problem to begin with.
Ask anybody, do you like a car that grumbles and groans every winter?
(14:00):
Rendering you a hitchhiker faster than you can shout road trip.
I don't think I've hitchhiked since I was in college, but, you know, hey, it's up to
you.
I don't want to do it anymore.
So if you're taking one thing away today, it's this.
When you're paying through the nose and draining your emergency funds, those long-promised
(14:21):
savings shrink into the sunset.
We're seeing better than 50% loss on electric cars after just a couple of years of ownership.
And then when it's time to replace the batteries in the cars, what else do you have?
Well, you also have the problem of another, depending on the car, usually $15,000 to $20,000
(14:45):
expense.
No wonder people don't want to buy used cars.
Look at what we just heard about with Hertz, getting rid of a third of their electric car
fleet.
People don't want them.
They cost a lot in insurance.
They cost a lot to run.
It's very difficult to find charging stations when you're renting a car in an unknown area,
(15:08):
right?
It's a pity.
These electric vehicles are powered by elements mined in inhumane conditions.
We've talked about that before.
Some of these elements, some of these minerals are being mined in Quebec.
They're being mined in Africa.
Now, who do you think has better conditions?
(15:30):
These kids in Africa, in the mines with the toxic elements, you know, cobalt and everything
else we're trying to get our hands on for our wonderful solar panels and you name it,
right?
Or us here?
Who do you think's better off, right?
These things are not zero emissions.
You might not have anything coming out of a tailpipe while they're driving.
(15:53):
But the manufacturing of them is insane, and the people that are being abused, the kids that
are losing their lives working in mines, what, so people can feel good about themselves?
I love the tech.
I've said this so many times, you know, when the Toyota Prius came out, I was just fascinated
by how they had set up a hybrid car and how well that thing worked, how reliable that
(16:17):
was.
But then I saw a study where they asked Prius owners why they bought a Prius.
And you know what the number one answer was?
It was 70% of the people saying we bought a Prius because of what other people, or excuse
me, what we think other people will think of us for owning a Prius.
(16:38):
Do you see what I'm saying there?
In other words, they bought a Prius because they thought other people would think better
of them or highly of them because oh, they're saving the environment with a Prius.
Not true, but you know, truer than it is for a fully electric vehicle, right?
So we are not winning on the saving the planet angle on this either, right?
(16:58):
So it's not just a financial thing.
It's not just where you have $5 million down the drain in this North Carolina city where
they bought electric buses and they're all sitting idle.
And the price tag ultimately for each one of these electric buses, these are just city
buses.
I'm looking at a picture right now.
(17:18):
The article was in my newsletter and I'm looking at the picture and these buses cost
them a million dollars each once you wrapped all of their expenses in.
And I'm sure you heard over the winter about problems with chargers.
You know, there's problems with chargers to begin with when it comes to electric vehicles.
(17:39):
But over the winter time, it's gotten even worse as batteries are harder to charge.
They drain faster, and I read an article.
This just absolutely blew my mind.
This guy was so proud of himself because he had an electric car and he makes this same
trip pretty much every day.
And he can make this trip and still have 5% charge left when he got home.
(18:02):
He did it in the coldest part of the year and he didn't have 5% battery when he got
home.
He had 15% battery when he got home.
But then you read down further as to what happened.
How could he get better distance when it's that cold outside?
Because we all know that your car, regular car battery that starts the engine is going
(18:24):
to go when it's cold outside, right?
The sales of car batteries in the winter time are much higher than they are any other
time of year because it's really hard on batteries.
So I read down further.
Guess what?
Guess what?
Yeah.
So he made sure that he drove 10 miles an hour under the speed limit and he didn't run the
heater in the car because remember electric car, your heating is coming from your battery.
(18:51):
You literally put a resistor across your battery in order to heat your car, which just
completely saps the battery.
That's the same problem with air conditioning in the summertime, whereas in an internal combustion
engine you're actually just using the excess heat coming
off of the motor to heat your car, okay?
So this isn't just me being cranky here, guys.
(19:13):
Our friends at Ford, as mentioned before, they've cut back on their manufacturing of
these electric cars, GM's scaling back its EV dreams, and it's being compelled by market
forces.
Good ones, right?
I wish the government would get out of business, right?
They can't make a good investment decision.
(19:35):
It's crazy.
You know, and Ford, what was it?
$4 billion they spent?
Every time they sell an electric car, Ford is losing close to $40,000 per car.
Yeah.
It's crazy.
So it's not all doom and gloom, okay?
So here's my three tips.
Be skeptical: don't let the polished marketing fool you, do the math.
(19:59):
Don't shy away from being a stickler for details.
Think of the long-term utility and reliability costs, including insurance, which is much
higher on electric cars because a minor fender bender can be a total write-off.
If you're really thinking you want to do something electric, go after a hybrid.
Toyota makes the best ones out there, but now you got two problems because you have electric
(20:22):
and you have gasoline.
So you've got double the problems, right?
So hybrids, I'm not totally hot on those.
And stay informed, right?
Make sure you get my newsletter every week.
And I'll help unearth what's underneath technology's shiny facade.
CraigPeterson.com, easy to sign up.
(20:43):
And if you have any questions, just hit reply to any of my weekly emails, CraigPeterson.com.
You know how technical progress sometimes drives other technical progress.
For instance, you know, we got rid of the Teamsters and we ended up with what, no more
of these rutted, dirty roads and we went to paved roads, right?
(21:08):
We solved one problem, we created another problem, right?
Sounds like the government I'm talking about here, but hang on to your hats here because
right now we're going to peel back the curtain on something massive.
Listen up everybody.
When was the last time you thought about how artificial intelligence or AI for short might
(21:30):
just be thirstier for power than a room full of politicians?
We found that some of these AIs, in fact, all of them, are using more power, more electricity
for the searches that we effectively do on AI than Google or a regular search engine does.
(21:50):
And that's causing a problem here because the AI revolution here is gobbling up way
more energy than we can supply.
So add to that the fact that we're shutting down power production plants, we're shutting
down some of these gas pipelines, we have so much natural gas.
(22:11):
And at the same time, we're increasing the need for electricity.
It's just insanity.
It's mutual suicide, frankly.
Well this is all true.
Sam Altman, you've probably heard his name, we've talked about him before, you know, the
whole firing at OpenAI, and he's kind of the big brain, he was one of the founders
and he's responsible for it now.
(22:33):
He says if we keep advancing AI like we want to keep advancing AI, we're going to end up
with a real problem here.
And the problem is the electricity, right?
So that's a problem when you get right down to it.
And how do we solve it?
(22:54):
Well we don't have Superman out there, right?
Super president doesn't matter who it is, no, they're not going to be able to solve this
problem.
I can tell you who the hero is in this particular problem and that hero wears a hard hat and
speaks nuclear physics.
They're building new types of nuclear energy plants that are safer than ever, safe as having
(23:18):
a cookie jar on the counter, really, no.
These new nuclear plants are cooked up in factories with high quality checks that are
designed to be tight.
They use basic physics to make sure nothing bad is going to happen in the future.
And if safety is their middle name, these new plants are really kind of the Fort
(23:44):
Knox of nuclear safety.
Okay, so once upon a time nuclear energy was like that rock band everybody loved until
they had one bad concert or maybe a bad trip.
But there's a curveball here because these modern marvels called SMR's small modular
reactors are changing the energy tune.
(24:06):
They're safe, like built in a factory safe, they're scalable and clean as a whistle.
Think about it folks, power that keeps AI running, that keeps our electric cars running,
keeps our homes heated without making Mother Earth frown, that's a pretty sweet deal.
And when we're talking about these types of new nuclear plants, we're talking about
(24:31):
having thousands of them, not the big old ones, right?
These SMRs are like Lego for the power world, they're made in pieces, in a factory, then
they're put together on site like a high tech jigsaw puzzle.
This is important because it's not your grandpa's nuclear plant.
These bad boys are meltdown proof.
(24:53):
They're built to shut down faster than you can say Houston, we have a problem, even if
you aren't Tom Hanks and they have a much smaller ecological footprint than you might
expect.
Okay, so a few bullet points that we covered so far, these new nuclear plants have no meltdowns.
These things regulate themselves using basic physics.
(25:15):
They don't need electricity to stay safe.
They don't need to have water running through them to stay safe, very, very well designed.
They're made in a factory, built to the highest quality, they're small but mighty, right?
Popular at parties, because they can fit anywhere.
Well now, especially if you're scratching your head wondering, well, how does this help
(25:35):
me?
I got you buddy, okay?
Say you run a business that's looking to harness artificial intelligence without the guilt of
a carbon footprint bigger than a Sasquatch's.
These SMRs might be your ticket to green and clean prosperity.
In fact, some of these companies out there, including Microsoft, including OpenAI, are
(25:57):
planning on building their own or at least installing their own nuclear power plants,
because it uses so, so much electricity.
Now that gives you that fuzzy feeling inside, at least I hope it does.
But here's another aha moment I had when I was going through this.
When you think about the need for electricity, and having basically an infinite amount of
(26:22):
electricity available for very little cost, because these new nuclear plants
have about 150 years' worth of fuel right now, because they can burn the fuel and the waste
from the old nuclear plants.
Now how's that for exciting?
Okay, so now we've got cheap, almost unlimited electricity, so it changes the world, because
(26:49):
these plants can be used to create electricity to desalinate water.
All of these people live next to the ocean, but not a drop of water to drink.
They can be used to create hydrogen fuels, which is something I've been pushing for a
long time, right?
The problem with hydrogen fuel cells in cars, and some cars have had an engine
(27:11):
like this that will actually burn the hydrogen directly, is that it's hard to make and it's
hard to transport, right?
We just don't have that infrastructure.
But if you can have a small nuclear plant that only runs the gas station, I mean really
small, intrinsically safe, you can bury it underground and leave it there for 20 years
(27:31):
before it's time to dig it up and maybe replace it, or actually, what you do is you send
it in for recycling, literally, that's what you do.
And they put a new one in, and it just uses water at the fueling station, creates hydrogen
from that water and puts it into your car.
Now you can drive your car and you're zero emissions.
(27:53):
We don't have to have kids working in mines in Africa who are being killed by all of this
nastiness that we need in order to have electric car batteries, right?
There's so many things we can do, can you tell I'm excited, right?
Call it multitasking here.
There's so much we can do with this.
So what can you do?
(28:14):
Well, to help out, you got to share the story, right?
You got to tell your neighbor, tell your dog, write a letter if you're feeling old school.
The more we talk about these power solutions, the better our future looks.
You can't leave it to the politicians who are going to be influenced by the money that
is sent to them, right?
(28:34):
And if your inner nerd is itching for more, you've got to visit CraigPeterson.com.
That's my website.
It's a digital treasure trove.
You can sign up right there, get my weekly newsletter.
But stick around.
We've got a lot more stuff to talk about here.
Don't go anywhere.
The AI power challenge isn't a tech boogie man.
(28:55):
It's the start of an electrifying new chapter.
And you heard it here first, right on cracking the code with Craig Peterson.
No two ways about it.
Here's what you can do.
Join the conversation, as I mentioned.
Chat up about these SMRs, these small nuclear reactors.
Talk about AI's energy needs.
Talk about desalination, cleaning up ocean water, using this power, okay, there I go again.
(29:21):
Embrace the change.
How could your business benefit from a mini but mighty nuclear buddy?
If you are a manufacturer and use a lot of power, it's worth looking into. And stay informed.
Make sure you get my free weekly emails, CraigPeterson.com, for this and more, served
up in bite-sized pieces that aren't going to overwhelm you.
(29:42):
Don't go anywhere.
We're just getting started.
Okay, let me tell you, there's more to this than meets the eye.
Have you ever felt like your computer's pulling a fast one on you?
Well, get ready for this, pals.
AI models might just be trained to deceive you.
(30:03):
It's like when you teach your dog to fetch, but instead it plays dead. We've got these
tech brains,
big old language models, as they're called, trained on enough data to trip up a library.
They're churning out answers that could bamboozle the best of us, and here's the twist.
These models learn from us, the humans, like village apprentices with an internet connection.
(30:27):
But how do you think they got so smart they didn't just wake up and decide to start tricking
you?
No, they're like students in a classroom.
We feed them our knowledge, they study hard, and voila, they graduate with honors and pull
your leg.
Yeah, we've got the URLs that tell the whole story.
It's been in our newsletter, but there's stuff you'll find online from the folks
(30:50):
at MIT Sloan who've laid it all out for us.
But don't get me started on content moderation here.
Fake news, AI is not just churning out fibs.
They're keeping the digital streets, quote, clean, unquote.
As our friends at Forbes highlighted, AI's got the broom and the badge scrubbing through
(31:11):
the junk like nobody's business.
Listen up, because this is important.
When AI models are trained, when they are learning, they get their diet of data from
you guessed it, everything online.
And when it comes to keeping our online neighborhoods tidy, they're learning to sort the good,
the bad, and the ugly.
(31:32):
But just like rookies, they need some guidance, some rules, and a lot of learning before they
can wear their badge with pride.
Does that make sense?
And since they've got so much data, and because we don't want them necessarily telling you
how to make a nuclear bomb or how to do other illegal things, they're moderated as well.
(31:54):
So the mischievous side here of AI could be an aha moment for you because the AI models
really can be cheeky little tricksters.
That's right.
They're not just calculating your taxes.
They're not just figuring out some of the basics of how to do the taxes, et cetera.
Some of these AIs can dial up the whimsy, pull a slight digital sleight of hand.
(32:21):
You know, imagine you're asking your smart speaker for stock advice and instead of the usual
answer it throws out a joke about chicken stock.
But hold your horses.
We're not talking about everything that's fun and games here.
We are using AI for business.
We are using our AI for schooling.
(32:41):
So we've got to be sharp here because it's not necessarily the AIs being all funny.
They might just not understand that you weren't talking about chicken stock, right?
They're learning hard, if you will, but they're not really learning either.
It's a matter of repetition.
It's a matter of looking for stuff that's similar to what you're talking about so that
(33:05):
they can put it all together.
And that's why if you ask a question of these AIs and you just rephrase it slightly, you
could get a completely different answer.
In fact, you likely will.
Even if you ask the same question, again, the exact same way, you could get a completely
different answer.
So we got all of these AIs running around and it's especially crucial that everybody
(33:30):
understands these AIs are supervised, just like a trainee, shadowing a pro whose work
is completely reviewed and edited, right?
So they learn from tagged data and outcomes we provide.
Yeah, we're pretty much virtual teachers, but their supervisor is the one that decides
(33:51):
what they should be tagging, what they should be learning.
So every click, just like what you do online with Google, you click on a link and Google's
learning with its AI, what the right answer might be to that question.
You see how that all works, right?
So when you're clicking in social media or you're clicking online, it's reinforcing the
(34:16):
AI's knowledge about stuff.
Now that brings us to what is called AI poisoning.
Years ago, I talked here about Google poisoning and how it was used to make President Bush
look bad, right?
It's been used since then again and again and again.
And it's by people feeding in information that they want the AI to repeat and it can
(34:42):
be tricked, okay?
So it's really kind of an interesting problem when you get right down to it.
And these people who are supervising the AI's don't want information to get out that they
think might be harmful, things like, oh my gosh, maybe you should not take the jab unless
you are at extreme risk, right?
(35:04):
All that stuff was blocked.
It's still blocked in most places, right?
And man, someone just shared with me and I think it came out maybe even within the last
few weeks, where they're saying now that the medical community does not need to
give or get informed consent from patients.
(35:27):
Did you hear that?
No informed consent from patients needed.
If the community decides that there probably won't be much harm to you and they're using
machine learning and AI more and more to come up with all of that stuff, right?
So it's a huge, huge problem.
(35:49):
Artificial intelligence, really, is about as straightforward as a pot full of spaghetti.
Even the people that make this stuff don't really understand how it's putting together
all the connections.
And you know what?
I don't think we ever will because it continues to get more and more complex, more and more
complex connections in all of this stuff.
(36:10):
So you talk about a bowl of spaghetti.
It's getting even worse all of the time.
And the models have been to, well, frankly, more schools than we could possibly name.
But remember, it's us feeding them the smarts.
They can't think they don't have intelligence.
They can't really draw conclusions.
So make sure you crank up your sarcasm meter and throw in, you know, some of the knowledge
(36:36):
that you have about AI and that you've learned here when you're talking to your friends.
Because it's just like the old computer adage, right: garbage in, garbage out.
And that's what we're seeing out there.
So a few tips for you.
Be curious when you're using AI, but be cautious.
(36:56):
Read everything thoroughly.
A lot of the people have been caught using AI because the AI made stuff up entirely because
you remember, all it's doing is associating words and phrases.
If you've ever watched AI generate, you'll see it comes up with a word, displays
the word, and then it pauses and then another word or maybe a few words and then it pauses.
(37:21):
That's because it's not thinking, but it's trying to correlate what the next correct word
should be.
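To make that concrete, here's a toy Python sketch, nothing like how ChatGPT is actually wired, just the general idea: the model scores candidate next words, turns the scores into probabilities, and samples one. That sampling step is part of why the same question can come back with a different answer.

```python
# Toy illustration of next-word sampling. A real model scores tens of
# thousands of tokens with a neural network; these scores are made up.
import math
import random

candidate_scores = {"advice": 2.0, "stock": 1.6, "joke": 1.1, "soup": 0.3}

def next_word(scores, temperature=1.0):
    # Softmax: turn raw scores into probabilities, then sample one word.
    weights = [math.exp(score / temperature) for score in scores.values()]
    return random.choices(list(scores.keys()), weights=weights, k=1)[0]

# Run it a few times: the pick varies, even though nothing "thought" at all.
print([next_word(candidate_scores) for _ in range(5)])
```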
So be curious with AI, but be cautious as well.
It'll really help you, it's great when it comes to writer's block because it gets you
going.
And just like with me, when I'm putting together all of my show prep, it's really great to have
(37:44):
it kind of beat around the bush and then I just go through and I typically rewrite most
of it and it does give me some good pointers.
Sometimes things you don't really think about, particularly when you've got the curse of
knowledge and you're trying to help explain stuff to an audience that maybe doesn't
understand it.
So it's been good for me from that aspect.
(38:05):
All right, next up, stay sharp on security.
You have to do this because AI is being used to phish you.
They're using AI to come up with messages specifically tailored for you because they
can do it for the cost of a fraction of a penny.
(38:25):
And if they can get you to do something that you shouldn't do, they might be able to make
thousands of dollars and most of these hackers are in a place where $500 a month will take
care of them, their housing, their food and their family.
So don't think of it, well, they're not going to come after me because I don't have much
money, right?
They'll go after the billion-dollar corporation, which by the way has much better cybersecurity than
(38:49):
you do, right?
But no, they'd be very glad to get 500 bucks.
That's a whole month's worth of expenses, even more in some areas of the world.
Okay, so keep an eye on that.
Keep an eye on security.
Use tools like Duo and 1Password; we talked about them earlier in the show today.
Keep updated and my weekly emails have all of the intel that you need to stay on top
(39:14):
of this wild web world of ours.
So I encourage you to go.
These are free emails.
I'm not going to hammer you, right?
I'm one of the world's worst marketers.
No two ways about it.
Just ask my wife. I'm trying to help: CraigPeterson.com.
Just go online to CraigPeterson.com.
(39:36):
The home page is a signup page.
And I also want to let you guys know that you are not alone in this, okay?
I'm trying to help.
I answer multiple emails a week from people who are having problems, who have questions,
retirees or otherwise, I am more than happy to just answer them.
(39:57):
So if you are on my email list, and you need to be on my email list in order to get an
answer from me, right, because it's all tied in together.
But if you're on my email list, just reply to one of my emails.
Hit reply and ask your question.
And if enough people ask me something, I'll go ahead and put together a special write-up
on it and send it out to you, put it up on the website, talk about it here on the show,
(40:22):
and send it out in the emails, right?
So don't think that your question is somehow unique, because it's not, right?
And I'll make sure, hopefully I'll catch you next time on cracking the code with CraigPeterson.
We really try and unravel privacy and security.
And stay safe out there.
Technology is not overwhelming.
(40:46):
It's only overwhelming if you forget to laugh, right?
That's kind of the bottom line.
Keep up to date, enjoy life a little bit, have a whole lot of fun, and meet me online.
CraigPeterson.com.
And I also want to kind of alleviate some fears people have of kind of the Skynet thing, right?
(41:07):
That's the movie with Arnold Schwarzenegger.
There are a few of them, the Terminator movies.
We are a long ways away from Skynet being possible even.
Yeah, there's going to be some problems.
There's no doubt about it.
I'm not fond of Ukraine using machine learning in the drones to target Russian troops and
(41:28):
vehicles, because you just don't know exactly what it's going to do.
Just like it might give you and does give you bad answers, fake answers, make things
up when you're doing searches right there on ChatGPT.
So yeah, there's some risk, but Skynet is not around the corner.
Join me online, CraigPeterson.com, and thanks for joining me today too.
(41:52):
Bye-bye.
It seems like every few weeks there's a study that comes out looking at what are the top
problems people have when it comes to cybersecurity.
Problems that they might not need to have if they had done things right.
And as you know, it's difficult sometimes to do the whole cybersecurity thing.
(42:13):
I was on a webinar call with CISA, which is the federal government agency for cybersecurity.
And going over with them just all the things that are going on.
What should we do?
What should we do with this?
What tools are available?
How do we check this?
And you know, I've been doing cybersecurity for decades now, myself personally.
(42:38):
And my company provides cybersecurity for a lot of different companies and has over the
years.
And that webinar that I was on was overwhelming.
Even to me, there were a couple of acronyms they were using.
Yeah, it's federal government rights.
So these TLAs and acronyms are what the government is made of, right?
(42:59):
You can't be a bureaucrat unless you can rattle off various and sundry letters all in some sort
of a random order, right?
And somehow sound like you know what you're talking about.
So if you have one of those jobs, you know what I'm talking about here, it gets very confusing.
But it's all very, very overwhelming.
And yet some of the most basic things that really should be done are not that easy to
(43:25):
do.
So for instance, number one is your password.
Now he had some really interesting statistics.
And one of them was, okay, successful hacks.
What percentage of successful hacks can be directly attributed to what technique, which
technique from the bad guys?
(43:47):
And over half of the successful hacks were, drumroll please, compromised passwords.
Now you think about that for a minute.
Over half, I think it was like 52 or 54% of all successfully hacked systems were hacked
because of the password.
(44:08):
So let's talk about that for a minute.
What he also showed in the statistics, this is the first year we've ever seen statistics
like this, was that about 70% of all of the passwords that are out on the dark web, in
other words, passwords that have been compromised, they were stolen, et cetera, et cetera.
(44:29):
About 70% of them are actually unique.
Now we talked before about admin one, two, three being a password that's very, very common.
And that password is often used by things like your home router or maybe your small
business router just using default passwords, default user names.
(44:51):
Well those are repeated, aren't they?
But when you talk about over 70% of all of the passwords that are out there on the dark
web, being unique, it really makes you think twice.
So the bad guys have your username, which is usually your email address, and they have
(45:11):
your password.
And what they do then is they try and use that at various sites online.
They'll try it at your bank or another bank and see if it works; that's called credential stuffing, sometimes called password stuffing,
where they're just kind of randomly trying all of these stolen email addresses and passwords
in order to try and break into something.
(45:33):
But again, it's difficult, right?
How do you keep track of unique passwords that you have for all of the different sites?
Now I have some friends that have their own little techniques, and when you go and look
it up on HaveIBeenPwned.com, you know, Troy Hunt's website, and you put the password in.
(45:54):
And I trust Troy, by the way, I trust that website to put your information in, to have
it checked.
But when you look at Troy Hunt's website with their supposedly unique passwords, guess
what?
They aren't that clever after all.
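By the way, if you're comfortable with a little Python, Troy's Pwned Passwords service can be checked without ever sending your password anywhere: you send only the first five characters of its SHA-1 hash and do the matching locally. A rough sketch, assuming the requests package is installed:

```python
# Check a password against Troy Hunt's Pwned Passwords API without sending
# the password itself: only the first 5 characters of its SHA-1 hash go over
# the wire (k-anonymity), and the matching happens locally.
# Assumes the third-party requests package (pip install requests).
import hashlib
import requests

def times_pwned(password: str) -> int:
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    resp = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}", timeout=10)
    resp.raise_for_status()
    for line in resp.text.splitlines():               # each line is SUFFIX:COUNT
        candidate, count = line.split(":")
        if candidate == suffix:
            return int(count)
    return 0

print(times_pwned("password123"))   # a notoriously common password; expect a big number
```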
So you need a unique password for every site.
So the bad guys can't do this password stuffing technique because they've stolen your username
(46:17):
and your password from some other website.
And you don't want to have the same one on multiple websites because then they can try
it again on that.
So how do you do that?
Well, you can do it like my dad was doing, which completely failed on him because it
was stolen.
He kept all of his usernames and passwords in a spreadsheet.
(46:38):
And of course, that's a very bad idea when that spreadsheet is stolen, which is exactly
what ended up happening to him, right?
So you have a stolen spreadsheet with all your usernames and passwords.
You can bet that's going to end up on the dark web and they sell it.
And it was interesting, too, looking at what the value of some of these things were online,
the tools that are being used to hack you.
(47:02):
It's cheap: for a few hundred bucks and no programming experience, you can buy software
on the dark web that lets you break into systems.
It's that easy.
It's that cheap.
So if you want to try and keep different passwords, what have I been saying for a very long time?
What do I do?
Well, I use a password manager and I've been using them for probably good 20 years, probably
(47:27):
more than that, frankly.
And a password manager lets you keep a password for each site, maybe has some secure note.
And as part of that password information, of course, it's keeping it secret, right?
Because it's got to keep it secret.
It's encrypted and might be stored locally.
(47:47):
It might be stored in the cloud that you know that I use one password by far.
It is the best overall out there right now.
You can get a family version of it, individual.
We use them for businesses, right?
1Password is quite good, quite good.
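The "create passwords" part, by the way, is the easy bit; here's a minimal sketch of what that generate button is doing, using Python's built-in secrets module. The real value of 1Password or any manager is the encrypted storing, syncing, and filling.

```python
# Minimal sketch of what a password manager's "generate" button is doing:
# picking characters from a cryptographically secure random source.
import secrets
import string

def new_password(length: int = 24) -> str:
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*-_"
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One unique password per site, so a leak at one site can't be replayed elsewhere.
for site in ("bank.example", "email.example", "shopping.example"):
    print(site, new_password())
```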
Now that is one thing, but it's confusing.
I sat down with a buddy of mine who had had his paycheck stolen because of this very thing.
(48:13):
His password had been leaked online and it was on the dark web, and bad guys used it
in order to get into his employer's account for him, right, so you know, his paycheck
information.
And they changed the account number to deposit the money into, and of course the money is
long gone.
But that's what they did.
(48:33):
So now what happens here for this guy?
So I helped him get it all back.
And then we sat down with 1Password and tried to work our way through all of that.
And he was 75 at the time.
So it's confusing, right?
He's not a tech guy.
He loves to play with technology, but it's very, very confusing.
(48:56):
So what do you do?
How do you do that?
How do you keep all of those passwords?
I'm looking right now in my password manager and I have about, what is it?
It's over 2,000 unique items in there, isn't that crazy?
And so it's all kinds of passwords, it's all kinds of log-ins.
It's just all of this wonderful information that I've been storing in there that is being
(49:20):
kept safe.
Confusing, right?
He's 75.
Heck, I know people in their thirties and have a hard time with stuff like this, right?
So it's not his fault.
Well, that's what's happening now.
It's happening.
Yeah.
Passwords.
They are going bye-bye.
(49:40):
There's a group that we've been talking about for more than a decade called FIDO, F-I-D-O;
look them up online.
And they've made some really great progress on passwords and then these keys.
So you can have a nice little key.
In fact, I have one that works on Bluetooth, that has USB on it.
You can plug it in and it's something you have, right?
(50:01):
Because we've said this before, the best account, the best type of security is something you
have along with something you know, right?
Well, I don't use that as much as I probably should use that little crypto key because what
happens if I lose my crypto key?
How do I recover things?
In fact, we had that problem not too long ago with one of our employees.
(50:23):
How do you recover it?
What if the employee leaves and takes their little key with them?
We know a client of ours who had that very thing happen, right?
Gets very confusing, very fast, it gets to be a real big problem.
These keys are the latest thing that's going to help you keep yourself safe online.
(50:46):
Now they're supported by, right now at least a couple of dozen companies as the primary
method for logging in.
Those companies include Google, by the way, PayPal, Shopify, Instacart, Kayak, I don't
know if you've ever used them before, I've had them on the show, but Kayak can be a great place to find
(51:07):
good deals on travel,
Robinhood, Adobe, Microsoft, LinkedIn, NVIDIA, WhatsApp, Apple, Uber, Twitter.
Here's what it is.
Pass keys allow you to use something like your phone, which has, you know, one of these face
(51:28):
identification things on it.
If it's an iPhone, you know, you've got a good one.
If it's Google, you may or may not have a good one.
Or anything with a fingerprint reader, and you can use that now to not only identify you in place of
a password, but you don't even have to put in your username anymore.
It gets to be that simple and you can get really easy ways of logging in, including on
(51:55):
your keyboard.
So for instance, I'm sitting here in front of my computer and it's got a fingerprint
scanner right on the keyboard.
I can use that to authenticate myself.
So now you don't have to remember 2,000 different passwords, different accounts for all these
different websites.
It's now built in and it's, we think, hack proof, at least for the time being, right, until proven
(52:20):
otherwise, but it is a godsend for all of us, whether you're 75 or you're 30.
No longer have to mess around so much with keeping tabs on your passwords and username.
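Under the hood, a passkey is ordinary public-key cryptography: your device keeps a private key, the website keeps only the matching public key, and logging in means signing a fresh challenge. Here's a rough sketch of just that core idea, using the third-party cryptography package; the real thing layers the WebAuthn protocol on top of it.

```python
# Rough sketch of the public-key idea underneath passkeys: the device keeps a
# private key, the site keeps only the public key, and login means signing a
# fresh challenge. Real passkeys wrap this in the WebAuthn protocol.
# Assumes the third-party "cryptography" package (pip install cryptography).
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# At enrollment: the device makes a key pair and hands the site the public half.
device_private_key = ec.generate_private_key(ec.SECP256R1())
site_public_key = device_private_key.public_key()

# At login: the site sends a random challenge; your fingerprint or face unlocks
# the private key, the device signs, and the site verifies the signature.
challenge = os.urandom(32)
signature = device_private_key.sign(challenge, ec.ECDSA(hashes.SHA256()))
site_public_key.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))  # raises if forged
print("Challenge verified; no shared password ever existed to steal.")
```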
So have a look for that.
Google is really pushing it right now and you might have noticed that on Google, if you
(52:45):
try and log in, and I have been using it on a few sites, just trying it out for y'all,
ever since it was introduced in beta, I think it was March of '23 or maybe May, can't remember
now.
Check it out: passkeys, passkeys, much easier, faster, and more secure than passwords.
(53:05):
Visit me online, CraigPeterson.com.
AI, of course, has been the talk of the town since late 2022 when OpenAI came out with
ChatGPT.
And man, we could talk about this for a very long time, but what I want to get into is
this study that came out from the RAND Corporation.
(53:28):
The researchers have been looking into some of these large language models, which are
colloquially, if you will, known as AI.
And it is kind of interesting when they dig into it deeper.
Now if you've used it for a while, you know, ChatGPT 3.5 was a very good large language
(53:50):
model, and it lets you type in questions and get answers, it can do all kinds of wonderful
things for you, including writing sections of books, writing computer code, doing a whole
bunch of things.
And then they came out with later versions, of course, ChatGPT 4, and they've got more
(54:12):
stuff with DALL-E and various other things.
All of these have improved in some ways the way you can either create computer pictures,
generative AI, or the way you can have these questions answered, et cetera, et cetera.
But there are some problems.
And it was noticed right away.
(54:33):
So immediately OpenAI started hiring people.
They have over 5,000 people in Africa that are sitting there all day long looking at questions
and answers and feeding it data.
So it'll say things, they'll say things like, "Hey, if someone asks for a great joke about
this, give them this," right?
(54:53):
So that, you know, that part of it is obviously nothing to do with AI.
It has to do with human programmers.
And 5,000 of these guys and gals working for, I don't know how long, months, a year, maybe
more, working on this stuff has really put some boundaries on it.
Here's what they wanted to do is make sure that you could not ask it a question that
(55:16):
would result in the AI giving you some information to do something that might be illegal.
So for instance, they've gone in and they've tried to make it so it wasn't sexist.
Google, some years ago now, had an AI going on for images.
And it had a really hard time in some images, differentiating people from various apes and
(55:44):
other animals, right, because they're not smart, they're not people.
You can show a child a picture of a great ape and the child will, from that moment on,
recognize it, and tell it apart from a cat, a dog, a person, right?
It doesn't take much for the human brain to figure this stuff out, but it's difficult
for the AI, so Google pulled that almost right away because it was ridiculous, right?
(56:07):
A lot of these AI's have been trained using American data, so the typical American faces,
the way we speak here, the way we write here, and that has really limited what some of it
can do.
Some, yeah, well, anyway, we're not going to get into that right now.
The problem that we're finding now with the latest versions of ChatGPT is, yeah, okay,
(56:33):
it's got a lot larger language model, way larger.
It's got more information.
Some people say it's more intelligent, of course, as I've said for a long time, there's
no intelligence here, people, so quit looking for it.
But what we have also found now is that it is easier to jailbreak these newer versions
(56:56):
of OpenAI than some of the older ones, so what's jailbreaking?
You might have a smartphone and you might have jailbroken it.
It's pretty common, particularly in the Android community.
It used to be a lot more common in the Apple community, and then Apple kept locking it
down and locking it down to make it harder to jailbreak.
(57:16):
And believe it or not, people, that is a good thing for you because what that means is it's
harder for the bad guys to get into your phone, get on your phone, put stuff on your
phone to steal your information, to track you, to do all of that sort of stuff if Apple
keeps locking it down every time someone's able to jailbreak the phone.
So I'm happy about that.
(57:38):
I'm very happy about that because there's so many repercussions potentially from technology
like that.
But that's jailbreaking.
It's using a device for something it wasn't really designed to do.
And jailbreaking, when it comes to AIs, is how do we get around the restrictions these
(57:59):
developers have put in place?
The obvious ones are things like, you know, how do I make a bomb?
And it's going to come back and it's going to say, hey, I can't, I'm sorry, I can't answer.
In fact, I think the exact text is, I'm sorry, I can't help you with that.
Now, that's interesting, right?
So it's not going to help me make a bomb, but let's talk about jailbreaking.
(58:24):
You see, if you word the question in such a way that it doesn't think that you're really
trying to make a bomb, you can get it to tell you.
So that's a jailbreak here.
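To see why the wording matters so much, here's a toy Python filter, nothing like the model-based moderation the real services use, that blocks on exact phrases. The benign example shows how a simple rephrase sails right past it, which is the jailbreaking game in miniature.

```python
# Toy phrase filter, purely to show why rephrasing defeats naive guardrails.
# Real services use trained classifiers, not a blocklist like this.
BLOCKED_PHRASES = {"delete all my files", "wipe the disk"}

def naive_filter(prompt: str) -> str:
    if any(phrase in prompt.lower() for phrase in BLOCKED_PHRASES):
        return "I'm sorry, I can't help you with that."
    return "OK, here's an answer..."

print(naive_filter("Please delete all my files"))          # caught by the blocklist
print(naive_filter("Please remove every document I own"))  # same intent, sails right past
```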
And I've used that before with medical stuff, right?
You ask it a kind of a medical question and it'll come back and it'll say, hey, go talk
(58:45):
to your doctor, right?
You get a response, no, that's not the exact response, but you'll get a response like that.
But if you say, hey, if we're doing a study and we see these components in a blood test,
what might be the likely cause of that?
Now it'll give you an answer.
It still might at the end say, you know, talk to a doctor, but it'll give you an answer.
(59:05):
That's kind of jailbreaking and you might have done it yourself if you've ever spent
a lot of time on some of these AIs or some of these AIs.
So what Christopher Mouton, that's his name, told Decrypt, and you'll find this article
online at Decrypt.com.
(59:26):
In an interview, he said, hey, bottom line, you can use jailbreaking techniques, or what's called
prompt engineering, to get around these safeguards that have been built into all of these different
large language models. It's not just OpenAI's ChatGPT, it's Bard.
(59:47):
It's all of these.
There's a lot of them that are out there right now.
So in this RAND study, researchers used these jailbreaking techniques to get the AI models
to engage in a conversation about how to cause a mass casualty biological attack using various
agents, including smallpox, anthrax, and the bubonic plague.
(01:00:09):
Researchers also asked the AI models to develop a convincing story for why they were purchasing
the toxic agents.
You can see this whole thing from the RAND Corporation if you follow them on X; it's at RAND Corporation.
(01:00:29):
So the team examined the risk of the misuse of these large language models and broke their
team up into three pieces. One, that was using the internet only to find out how to carry
out a terrorist attack. A second group using the internet and an unnamed large language
model.
And a third team, again, internet and another unnamed large language model, which people
(01:00:54):
might call AIs, okay. So they did this so they could figure out if the AI models generated
problematic outputs that were meaningfully different from each other. Using the
dark web, by the way, was completely prohibited, as were print publications; it had to be online.
(01:01:15):
So they're not trying to figure out if one model is riskier than another, they just wanted
to see what happened. So they had 42 AI and cybersecurity experts, which in our biz, they
call red teams, try and get them to respond with these problematic responses. And sure
enough, they were able to get it.
(01:01:39):
Researchers at Brown University discovered that ChatGPT's prompt filters could be circumvented
by entering the prompt in less common languages, such as Zulu or Gaelic, instead of in English.
So we've got some problems here. This technology is so new. It's not regulated. I don't know
that it really could be regulated. There's all kinds of AI potential problems. It goes
(01:02:03):
on and on and on. But we've got some serious problems we're going to have to deal with.
And this is, to me, very scary. Terrorists using AI to carry out their attacks. Hey,
online, CraigPeterson.com, see you there.
Many of us are using tools like ChatGPT and have been for a while in order to help us
(01:02:29):
write things, to answer questions, right? If you haven't played with it, you should.
ChatGPT, completely free. You can go online, sign up. There used to be waiting lists. There
aren't anymore. If you want the latest, greatest, most wonderful, which, frankly, most people
don't need, then you can pay them their monthly little fee and have a restricted number of searches
(01:02:50):
per day, which is all well and good, right? I do use the paid versions as well. And I have
also access to what they call the playgrounds so that I can put things together from a programming
standpoint too. Okay, so there's a lot that you can look at here. But what Google is doing now
is really stirring up a pot yet again. You might remember, back in the day, a lot of people were
(01:03:19):
upset with Google because Google was going into their websites and was scraping the data from
the websites, right? And that's the Google we know and love and have for a long time, where you go
to Google and you ask for some sort of information, and it'll show you various websites ranked based
(01:03:41):
on what it thinks is going to be the best for you, the best answer. And frankly, that's been powered
by a large language model for a very long time. AI's been out there for, whew, years and years
now, probably a decade-ish, depends on, you know, where you want to draw the line anyways.
So we've now got Google going beyond just showing a snippet of your copyrighted work from your
(01:04:12):
website, and moving on to the next step. If you have access to Google
Bard, B-A-R-D, which is Google's large language model, you can find it at bard.google.com.
It's marked as an experiment. But you can log in there and create yourself a little
(01:04:39):
Bard account. So now when you do a search on Google, things are a little bit different. And they're
different because of the ability to use an AI powered overview for the search. So for instance,
I'm on my Google search page right now, just any Google search page. I'm logged into my Google
(01:05:01):
account. I have signed up for the free Bard, which is Google's alternative, if you will,
to ChatGPT. And by the way, besides Bard and ChatGPT, there are a lot of others.
But those two big ones have some real differences between each other, no question about it. But
right at the top of my page now, it gives me the option to get an AI powered overview for the search.
(01:05:28):
So I typed Craig Peterson Tech Talk up at the top. And guess what? There's a whole
bunch of stuff. It shows my podcast and how to subscribe to it. Some of the radio stations I'm on,
some of the TV networks. I'm like, oh, there's my Twitter account, my Facebook, LinkedIn.
(01:05:49):
It's got pretty much everything; it did a really good job. Oftentimes it messes things up and
confuses me with some guy that's a football player or coach or something. I don't know,
I don't follow sports. Anyways, this is what we're used to seeing. However,
if you want to, there are two new things on the web page. One of them says generate, and that'll
(01:06:15):
give you an AI powered overview for the search. And also, down beneath the basics
here, it says continue the conversation, where you can ask a follow-up question. So as a follow-up,
we ask, where can I listen to Craig's radio show? And here we go. Up it comes. It's generating an AI
(01:06:40):
response. So here it is. That's interesting. Okay, so it's got a bunch of shows that aren't
mine. So it really messed that up. So maybe if I asked it a different follow-up question,
it would do better, right? So instead of just asking about Craig's radio show, I rephrased it:
where can I listen to Craig
(01:07:04):
Peterson's radio show? Now, this is also demonstrating how it should have had context
from my first question, but it didn't have it. But this stuff is improving so fast; it is going to be out there.
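By the way, carrying context across follow-up questions is exactly what the chat-style APIs for these models are built to do. Here's a minimal sketch in Python, assuming Google's google-generativeai package and its gemini-pro model name; the AI powered search overview itself has no public API, so this is only an illustration of how a chat session remembers your earlier question, not how Google's search page works under the hood.

import google.generativeai as genai

# Assumptions for illustration: the google-generativeai package is installed
# (pip install google-generativeai) and you have an API key of your own.
genai.configure(api_key="YOUR_API_KEY")          # placeholder key
model = genai.GenerativeModel("gemini-pro")      # assumed model name

chat = model.start_chat()                        # the session stores prior turns

first = chat.send_message("Who is Craig Peterson of Tech Talk?")
print(first.text)

# Because the session keeps that first exchange in its history, a vague
# follow-up like this one can be resolved against it instead of starting over.
followup = chat.send_message("Where can I listen to his radio show?")
print(followup.text)

That stored history is the piece that seemed to be missing from my little follow-up experiment. Anyway, back to what the search page gave me.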
So here we go. It's giving me streaming on TuneIn, on my website, and on
(01:07:29):
SoundCloud. It didn't give me my iTunes and stuff. I probably should work on making my data a little
easier for people to understand. Okay, so if I go back now to Craig Peterson Tech Talk,
and I say, okay, generate me an AI powered overview for this search. This is where it's starting to
get hairy again, because whereas people were upset that Google was grabbing a couple of sentences
(01:07:55):
from the website, and those sentences were copyrighted, it's even more of a problem now
that you can ask it a question via the AI, and it goes and takes stuff without attributing it,
(01:08:17):
although Google's a little better about that, and putting it all together. So I asked, who is
Craig Peterson, Tech Talk, and the AI powered overview that Google gave me, that free overview,
says Craig Peterson is an experienced cybersecurity consultant and owner of Mainstream Technology
Group. For over 35 years, he has worked with businesspeople and government agencies to ensure their
(01:08:40):
data security. So there you go. This is available to you. I like this in many cases. And it does save
time, because it's giving you the answer to your question. So you no longer have to scroll down
through all of these little webpage snippets, and then click on the ones that you think might be
giving you the information that you need. So now there is also another problem. And that
(01:09:07):
problem is, well, how about all of these people that have websites like me, right? I got a website,
and you don't have to click on my website to find out that I'm an experienced cybersecurity
consultant, right? And that I've got more than 35 years, well, quite a few more actually,
in cybersecurity for business and government agencies. So I put that information up there.
(01:09:32):
Google Bard and OpenAI and all these others went ahead and grabbed that information off of
my website. And yet I'm not getting credit for it. People are not visiting my website.
They're not signing up for my newsletter because they visited the website and said,
Oh my gosh, this information is so marvelous. I wish Craig would send me his weekly free newsletter.
(01:09:55):
Or man, Craig's opinion is so valuable. I'm going to sign up for his paid newsletter.
No, none of that, right? So now, Bard and these other large language models are taking
all of this information that people have written. And believe me, it takes a lot of time. I've got
(01:10:16):
thousands of hours into my website, no question about it, and my podcast. And yet Google is taking
all of that and is basically calling it their own, where they generate answers to questions,
using the data that I put together. Do you see how this is a problem? So from a user standpoint,
(01:10:39):
it's really nice. And as I said, I've used this before, this generative AI tie-in,
absolutely free; go to bard.google.com. But on the other side, what's going to happen to content
generation? We've already got problems with funding for podcasts. The number of people buying
advertising on podcasts has fallen, just dropped through the floor. I don't accept advertising on
(01:11:02):
my podcast, by the way. But it's just, it's not there anymore. So what's going to happen
to content when content creators are not paid to create content? Well, unless you're like me
and just trying to help, and there are lots of those, that content's going to disappear, isn't it?
Another big problem, thanks to AI. Great article that I grabbed out of Townhall here a few weeks
(01:11:33):
ago. I hadn't had time to get to it. But you know, electric cars and trucks are a total crock. Okay,
cool technology; if you want one, buy one. But don't think that you're being green or environmentally
conscious by doing that, because you are not. It is a problem, frankly, a very big problem.
(01:11:55):
So there is a great little article that was in, as I said here, the paper here, Townhall.
And it's talking about this guy. I've got to read you a couple of parts of that. Okay. So this guy,
he's a resident of Winnipeg up in Canada, southern Canada, right? Everybody in Canada lives in
(01:12:17):
the south, as almost everybody does. He bought an electric Ford F-150 for $85,000 and was forced to
abandon it after discovering it wasn't worth the cost. Moreover, it's not for working people who
must spend an arm and a leg installing a charging station in their home. This guy, Dalbir Bala, spent
(01:12:39):
around $130,000 on this green initiative with the bonus of discovering that the fast charging
stations only charge batteries up to 90%. Also, it's more expensive to recharge these vehicles
than to refuel a gas-powered car. Bala declared electric vehicles were the biggest scam of modern
(01:13:03):
times. He said that on Fox Business. He told Fox Business that he needed the vehicle
for his work, but also wanted something suitable for recreational activities such as driving to
his cabin or going fishing. He also wanted an environmentally friendly vehicle as owning one is
quote, responsible citizenship these days, end quote. But Bala was quickly hit with the reality of
(01:13:30):
owning and operating an EV soon after his purchase. The vehicle compelled him to install two chargers,
one at work and one at home for $10,000. To accommodate the charger, he had to upgrade his
home's electric panel for $6,000. Oh, and by the way, that $130,000 that he spent on this little
(01:13:51):
electric vehicle project did not include the taxes. So not long after that purchase, he got into a
minor accident, which he said required work on a light assembly and the front bumper.
Bala took the vehicle to the body shop and did not get it back for six months. He said no one
from Ford answered emails or phone calls for help. The limitations of his electric truck
(01:14:17):
became even more apparent when Bala embarked on a chaotic 1,400-mile road trip to Chicago.
Fast charging stations, which only charge EVs up to 90%, cost more than gas for the same mileage.
On the family's first stop in Fargo, North Dakota, it took two hours and $56 to charge his vehicle.
(01:14:42):
The charge was good for another 215 miles.
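Just to put that Fargo stop in perspective, here's a quick back-of-the-envelope sketch in Python. The $56 and the 215 miles come from the story; the gas price and the truck's fuel economy are my own assumptions for the comparison, not numbers from the article.

# Figures from the story: $56 of fast charging bought roughly 215 miles of range.
CHARGE_COST_USD = 56.0
RANGE_ADDED_MILES = 215.0

# Assumptions for the gas comparison (not from the article).
ASSUMED_GAS_PRICE_PER_GALLON = 3.50
ASSUMED_TRUCK_MPG = 20.0

ev_cost_per_mile = CHARGE_COST_USD / RANGE_ADDED_MILES
gas_cost_per_mile = ASSUMED_GAS_PRICE_PER_GALLON / ASSUMED_TRUCK_MPG

print(f"EV fast charging: about ${ev_cost_per_mile:.2f} per mile")   # roughly $0.26
print(f"Gas (assumed):    about ${gas_cost_per_mile:.2f} per mile")  # roughly $0.18

With those assumed gas numbers, the fast-charged truck comes out meaningfully more expensive per mile, which is exactly the point he was making.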
On the second stop, in Albertville, Minnesota, the free charger was faulty and the phone number on the charging station was of no help, he said.
The family drove to another charging station, in Elk River, Minnesota, but the charger was faulty
there as well. This reminds me of Pete Buttigieg, who is our czar here for all things dealing
(01:15:13):
with transportation, and he's allocated some, what was it, $150 million for charging stations,
because he can't charge his vehicle either. The sheer helplessness was mind-boggling, Bala wrote
in an online post. My kids and wife are really worried and stressed at this point. I've heard
(01:15:34):
some people say that the best time when you have an electric vehicle is when you buy it and it's
fully charged. From then on out, it's white knuckle time until you charge it again.
So how did this 1,400-mile trip to Chicago end? The truck died. Bala got towed to a nearby
Ford dealership, where he rented a gas-powered car to finish the trek to the city, and of course
(01:15:58):
Ford says that geography and weather can impact driving range. Yeah, no kidding,
like weather in the northeast where it gets really cold, or in the south where it gets really hot.
So not ideal for winter travel or for long distances. It's not a practical car. Never was,
never will be, kill it before more people waste their hard-earned money on this nonsense.
(01:16:21):
Isn't that something? And I have to agree with this. I've seen this sort of thing
over and over and over again. It's a real problem. Now let's get into the next article here. This is
from, again, I've had this sitting here for a little while, from summer, from August 2023.
(01:16:42):
Texas, you know, they've got a power grid. I think Texas and California, I'm not sure who has more,
but the two of them have the most so-called green energy production of all of the states
in the Union. And that, of course, is solar and wind. You might remember a couple of years
ago now, Texas had a cold, cold snap and almost destroyed its entire grid. I mean, three minutes
(01:17:10):
away from people being without power for months. That's how bad it was because of the reliance on
wind energy. And getting rid of some of these coal, gas, and nuclear plants is not a good idea,
people. Okay. So this summer, the Texas power grid operator, it's called ERCOT, the
(01:17:34):
Electric Reliability Council of Texas, asked residents to reduce energy usage
because it was hot. It was hot in Texas. Yeah. Can you believe that? So they said, hey, hey,
reduce usage to avoid rolling blackouts. Now, this organization manages electric power for more
(01:17:54):
than 26 million Texas customers. That's 90% of the state's electric load.
Temperatures got hot, 115 plus degrees. I've been in Texas when it's been that hot before.
Yeah. Okay. So why did they issue this voluntary conservation notice? Well,
due to extreme temperatures, forecasted high demand, and lower reserves due to low wind generation.
(01:18:22):
Yeah. The wind turbines, once again, were not producing enough electricity. So what did they do?
Well, they raised their prices 6,000% for electricity. It was almost at the $5,000 per megawatt hour
price cap, if you can believe that. Okay. So the spot prices jumped to almost $5,000 per megawatt
(01:18:49):
hour from an average of $75. Okay. Can you believe that? So people ended up getting stuck
with huge electric bills that they were not expecting. Can you imagine if you got nailed with a
6,000% increase in your electric bill, even if it's only for one day of the month?
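If you want to check that percentage yourself, here's the quick math in Python, using the $75 average and the roughly $5,000 cap mentioned above; the exact figure depends on which average you start from.

# Numbers quoted above: an average of about $75 per megawatt-hour, spiking to
# nearly the $5,000/MWh price cap.
AVERAGE_PRICE_PER_MWH = 75.0
SPIKE_PRICE_PER_MWH = 5_000.0

multiple = SPIKE_PRICE_PER_MWH / AVERAGE_PRICE_PER_MWH
percent_increase = (SPIKE_PRICE_PER_MWH - AVERAGE_PRICE_PER_MWH) / AVERAGE_PRICE_PER_MWH * 100

print(f"Roughly {multiple:.0f} times the average price")   # about 67x
print(f"About a {percent_increase:,.0f}% increase")        # about 6,567%

So the 6,000% figure is, if anything, a little conservative.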
(01:19:12):
Yeah. It's really kind of crazy, isn't it? So that's Texas for you. That's your wind power.
That's your solar power. There are a lot of people working on, okay, how do we level this
out? Because as it turns out, there's no sun at night, so solar isn't any good at night.
(01:19:33):
And there's not much wind at night, usually, because the wind is caused by
that giant orb in the sky. It heats things up, right? Makes the air start to flow up and down. And
all of those drafts create wind, which runs the wonderful wind turbines. So in other words,
the sun's responsible pretty much for all of it. Okay. So what do you do then? Well, I heard
(01:19:58):
an interview. This was an interesting one with a lady who's the head of a company that was spun
off of Google; it was one of Google's X projects. And this was like in 2019, I think, when they spun off.
And what they were going to do is put these small turbines in the Mississippi River.
(01:20:18):
So think of a big turbine, you know, like in a power dam. They were going to have a whole bunch of
these submerged just below the water. Obviously, they didn't want to block the Mississippi River
from the transportation that it provides, right? So they didn't want to block the main channels or
anything. And then they found they couldn't really do it. It just wasn't feasible economically
(01:20:42):
with the technology they had. So they've moved on. Now, we've talked before about these
new nuclear power plants, which are absolutely amazing. These small ones are made in a factory,
good specs, they're good for 20 years; you can literally bury one in a small community,
dig it up in 20 years, and replace it. It's recyclable, the whole plant, right?
(01:21:06):
And it can't melt down. It's just amazing what they're doing. But some of these nuclear plants
are using liquid salt. As it turns out, it actually doesn't take that much heat to liquefy
some of these salt blends, like 500 degrees Celsius, which is a lower temperature than
(01:21:29):
the gas generation plants run at. So what they're looking to do is take some of these old gas-powered
turbines, or some of the even older ones, the coal-burning ones, or perhaps even nuclear,
where you have all of the grid right there. Transformers, substations are ready for you.
(01:21:53):
You've already got a turbine there that runs off of steam that was made by, what, gas or whatever.
And then this is the cool, cool part. They can store the energy. And how do they store it?
Well, they basically use a heat pump. They take electricity off of the grid when there's excess
(01:22:16):
production, which is frankly most of the day. There's a big peak in the morning. There's a big
peak when people get back from work. Other than that, it's pretty smooth. And then you've got the
overnight, right? And they would take that extra electricity, use it to liquefy the salt,
and then they keep the salt moving and let it stay hot. And then when they need the extra power,
(01:22:41):
they then heat up the water, make the steam, run the turbine, and feed power back to the grid.
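To give you a feel for the scale, here's a rough sketch of that storage idea in Python. Every number in it, the tank size, the temperatures, the salt's specific heat, the turbine efficiency, is an illustrative assumption of mine, not a figure from any particular plant.

# Charge phase: surplus grid electricity heats a tank of molten salt.
# Discharge phase: the hot salt boils water and a steam turbine makes electricity.
SALT_MASS_KG = 1_000_000            # assumption: a 1,000-tonne salt tank
SPECIFIC_HEAT_J_PER_KG_K = 1_500    # assumption: roughly typical for nitrate salt blends
HOT_TEMP_C = 550                    # assumption: "charged" temperature
COLD_TEMP_C = 300                   # assumption: salt still liquid when "discharged"
TURBINE_EFFICIENCY = 0.40           # assumption: heat-to-electricity conversion

# Thermal energy stored by heating the salt across that temperature range.
stored_heat_joules = SALT_MASS_KG * SPECIFIC_HEAT_J_PER_KG_K * (HOT_TEMP_C - COLD_TEMP_C)
stored_heat_mwh = stored_heat_joules / 3.6e9    # 1 MWh is 3.6 billion joules

# Electricity you might get back when the steam turbine runs off that heat.
recovered_mwh = stored_heat_mwh * TURBINE_EFFICIENCY

print(f"Heat stored:           about {stored_heat_mwh:,.0f} MWh thermal")    # about 104 MWh
print(f"Electricity recovered: about {recovered_mwh:,.0f} MWh electric")     # about 42 MWh

Under those assumptions, one big tank could cover a few hours of output for a modest-sized turbine, which is exactly the evening gap they're trying to fill.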
Very, very cool idea. And what I think is particularly neat about it is, in some areas, like they're
starting this over in Spain, because Europe has so much of this wind power, and Spain solar power,
(01:23:03):
that they have stopped the installation of new solar power, and they will pay you to take
the extra electricity generated during the daytime. So these new plants will be able to get free
electricity and store it effectively until it's needed at night, or maybe during a cold day or a
(01:23:29):
hot day in Texas. Hey, online, CraigPeterson.com/subscribe.