Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
The moment you bring, say, a smart thermostat into your home,
a connected car, or even a pair of those shoes
with sensors in them, you're not just dealing with physical
objects anymore, are you? You're actually building a cyber-physical system,
a digital twin of your own reality. Exactly.
Speaker 2 (00:17):
And that's the key. Our physical environments, our homes, cities,
I mean, even our body area networks in healthcare, they've
all merged with the digital world. It's a total merger,
it is, and it's not just about convenience. This is
a high-stakes legal challenge. That simple appliance, the moment
it connects to Wi-Fi, it instantly starts generating data.
Speaker 1 (00:36):
And that data has real-world legal...
Speaker 2 (00:38):
Consequences. Consequences that, well, our traditional laws were never designed
to handle.
Speaker 1 (00:43):
And that is our mission today. We're going to take
a deep dive into what we're calling the law of
smart things. We want to unpack how existing legal frameworks
are being stretched, how they're being applied to this
hybrid reality of connected tech, the people using it, and
all the activities, both good and bad, happening within this
Internet of Things ecosystem.
Speaker 2 (01:03):
Yeah, and it's not just about creating brand new laws
from scratch. A lot of it is about defining what
a computer even is now, or what a breach means
when every single object is smart.
Speaker 1 (01:14):
For companies building these devices, and for you, the person
using them.
Speaker 2 (01:18):
For everyone. There are really five core areas of concern
that the law has to address, and urgently.
Speaker 1 (01:25):
Okay, so let's list those out. I think that's really
important to define the scope of the problem here, right?
Speaker 2 (01:31):
So first up, and this is the big one, you've
got massive privacy concerns.
Speaker 1 (01:35):
From all that constant data collection.
Speaker 2 (01:37):
Exactly, that constant, low-level data collection. Then, second, there's
access and control.
Speaker 1 (01:42):
Who can connect to what and when.
Speaker 2 (01:44):
And under what conditions. Third, and this is maybe the
most critical, is cyber-physical security. That means protecting your
digital systems and your physical safety at the same time.
Speaker 1 (01:54):
Right. And fourth, the one everyone thinks about when things
go wrong, liability and blame. If your autonomous vacuum
cleaner decides to attack your neighbor, who's at fault?
Speaker 2 (02:04):
It's a genuine question now. And finally, the foundation that
people always ignore: the agreements between companies.
Speaker 1 (02:12):
And users, the terms of service.
Speaker 2 (02:13):
It's the terms of service that define the legal relationship between
you and your smart toaster from the very beginning.
Speaker 1 (02:19):
Okay, so to really get a handle on this invisible
data trail, let's use that classic example, the smart fridge.
Speaker 2 (02:25):
Ah, the smart fridge. It seems so harmless, doesn't it?
Just sitting there ordering milk.
Speaker 1 (02:30):
But what actually happens when we trace the data from
that one appliance?
Speaker 2 (02:34):
You see the complexity almost immediately. The data chain involves,
well, the user, the fridge manufacturer, maybe a third-party
service provider like a grocery delivery company, and...
Speaker 1 (02:45):
Even, I suppose, your friends or visitors who...
Speaker 2 (02:48):
Are just in your house. Anyone interacting with that device,
or even just the home network. Every single interaction generates information.
Speaker 1 (02:56):
And what kind of information are we talking about here
that has legal weight? It's got to be more than
just knowing I'm out of tomatoes.
Speaker 2 (03:02):
Oh, it's way beyond that. We are talking about highly
detailed metadata. This includes location, communication patterns, so when and
how often the fridge contacts the network, maybe even biometric
data if it uses facial recognition.
Speaker 1 (03:16):
Wow.
Speaker 2 (03:17):
And while companies, you know, they market the positive side,
analyzing your purchases to offer you a discount, the real
legal issue is about classification and access. If your fridge
reveals purchasing habits that, say, strongly suggest a medical condition,
that is now sensitive personal data.
Speaker 1 (03:34):
So data minimization, which is a core principle of GDPR,
suddenly becomes this huge compliance hurdle for a refrigerator company.
Speaker 2 (03:41):
Precisely. It's what we call privacy by design. If you're
selling products in or to Europe, the General Data Protection
Regulation, GDPR, insists that manufacturers build in security and
privacy from the very start, not just bolted on later. Exactly.
And the sheer volume of data being generated, from doorbells,
from shoes, from CCTV, just necessitates these strong protection principles.
Speaker 1 (04:03):
Okay, let's pivot a bit from data privacy to the
darker side of this, which is security. This is where
it gets really interesting, because the threats aren't just, you know,
traditional hacking anymore. What are the specific cyber-physical threats
that are emerging?
Speaker 2 (04:19):
The threats are systemic, really, because they exploit the weakest link.
And it's not always some sophisticated attack. Sometimes it's just
human error.
Speaker 1 (04:27):
Like a family member giving out the guest Wi-Fi password.
Speaker 2 (04:30):
That or a physical security breach allowing someone to physically
tamper with the device.
Speaker 1 (04:34):
But on the technical side, criminals are getting very, very creative.
Speaker 2 (04:37):
Absolutely, we've moved beyond simple network sniffing, you know, monitoring
the traffic between your smart lock and the internet. Now
you've got data injection and modification.
Speaker 1 (04:47):
What does that mean, exactly?
Speaker 2 (04:48):
It means criminals can inject false data, modify existing instructions,
or, and this is more worrying, they can replace the
firmware on the device entirely.
Speaker 1 (04:59):
So you're turning an innocent household object into a weapon.
Speaker 2 (05:02):
Essentially, you're turning it into a malicious tool. Exactly. When
a device's operating system gets replaced, that smart fridge or
that doorbell camera becomes a zombie. It's just a node
in a massive botnet used to launch denial-of-service
attacks against targets all over the world.
Speaker 1 (05:19):
And if the legal systems don't adapt to cover the
use of these devices in crime...
Speaker 2 (05:23):
We risk creating huge legal loopholes, loopholes for crimes carried
out using smart drones or fitness trackers against people or infrastructure.
Speaker 1 (05:32):
It really sounds like we can't just sit around and
wait for brand new IoT-specific laws to be written.
So how are existing laws, particularly in the UK, which
our sources detail, being stretched to cover these new connected realities?
Speaker 2 (05:46):
This is really where we see the whole concept of
a computer being redefined. Our sources point to four main
legal pillars in the UK that are now being applied
to all these connected devices.
Speaker 1 (05:55):
Okay, let's start with the foundational one.
Speaker 2 (05:57):
The first is the Computer Misuse Act, the CMA. Historically
this law targeted unauthorized access against, you know, personal computers, but
now the crucial shift in definition is that, for IoT,
a computer includes every single connected device.
Speaker 1 (06:13):
Your watch, your garage door, your drone controller, all of it.
Speaker 2 (06:17):
The CMA covers unauthorized access and any acts that impair
the operation of those computer systems.
Speaker 1 (06:23):
And there was a specific kind of nuanced point about
jurisdiction in there too, right?
Speaker 2 (06:26):
Yes, the CMA has what's called a material requirement about
British citizenship. So if a hacker is in a foreign
country, but the crime, the unauthorized access, affects a British
system or a British citizen, the law can apply, but
the specifics of prosecution often hinge on the location and
citizenship of everyone involved. It's a very complex way of
(06:47):
trying to apply a national law to international cybercrime.
Speaker 1 (06:50):
Right. So what are the other pillars that kind of
bolster this?
Speaker 2 (06:52):
Well, second, we have the Police and Justice Act of
two thousand and six. This act wasn't about creating new
laws from nothing. It was about amending previous acts, including
the CMA, to make it easier to prosecute computer crimes.
For instance, it broadened the scope to explicitly cover the
possession or supply of things like hacking tools if they're
intended to commit a Computer Misuse offence. It
(07:15):
recognizes that just preparing for an attack is also a crime, which...
Speaker 1 (07:18):
Makes the law more proactive than just reactive. Okay, what
about the communication side of things?
Speaker 2 (07:24):
That's pillar three: the laws governing communications networks and services.
This one is critical because it targets malicious interference with
the actual communication between devices.
Speaker 1 (07:34):
Ah, so not the device itself, but the signal. Exactly.
Speaker 2 (07:38):
Imagine someone using a jammer to interfere with the Wi-Fi
between your smart hub and your security system, or
disrupting the communication from a connected car to a traffic
control center, even if the device isn't technically impaired. Disrupting
the network traffic is a separate, punishable offense.
Speaker 1 (07:54):
So the communication path itself is legally protected. And the
final pillar you mentioned involves investigations.
Speaker 2 (08:00):
That's RIPA, the Regulation of Investigatory Powers Act. RIPA's main
scope covers the lawful and authorized interception of electronic communications. So,
for instance, if law enforcement needs to monitor the data
stream from a drone they suspect is being used for
illegal activity, they do so under RIPA's authority.
Speaker 1 (08:18):
But there's a twist there, right? There is.
Speaker 2 (08:20):
The interesting part is that RIPA also strictly regulates what
law enforcement themselves can do. It ensures they can only monitor, modify,
or interfere with communications traffic when they are properly authorized.
It's a check on state power.
Speaker 1 (08:33):
Okay, that gives us a really clear picture of the
UK baseline. Let's zoom out now and look at the broader,
more global regulatory pressures that are shaping how these devices
are even built. Maybe starting again with GDPR.
Speaker 2 (08:46):
The General Data Protection Regulation is, well, it's the ultimate
hammer for data handling across the world. Any company doing
business with the EU or handling the data of EU citizens is
bound by...
Speaker 1 (08:57):
It. And the penalties are what really give it teeth.
Speaker 2 (09:00):
That's what mandates action. The fines can be up to
four percent of a company's global annual turnover or millions
of pounds. This is not a slap on the wrist.
It forces companies to prioritize minimum security standards, to implement
things like data encryption and specific breach notification procedures.
Speaker 1 (09:20):
So that's the big regulatory push on manufacturers. But what
about specific objects that are creating these unique legal headaches
right now? Drones are a huge one. There's this friction
between governments wanting control and users wanting freedom. Exactly.
Speaker 2 (09:35):
And the UK proposals are focusing on user registration and
mandatory safety testing, usually spurred by public fears over drones
hitting airplanes or something.
Speaker 1 (09:44):
They're trying to regulate the operator and the hardware, right?
Speaker 2 (09:46):
But then you look at the US approach, like in
the John A. Taylor case, and you see that friction
point really clearly.
Speaker 1 (09:52):
Yeah, why did the US Court of Appeals side with
the recreational drone pilot against the FAA, the Federal Aviation Administration?
Speaker 2 (09:59):
It's a fascinating legal distinction, and it's all based on
congressional intent. The court argued that Congress had previously limited
the FAA's power to regulate model aircraft that are used
just for recreation, so it stopped the agency from imposing
the same strict rules they might on commercial aviation. It
really highlights this fundamental disconnect in global regulation.
Speaker 1 (10:21):
Which is: does the law regulate the technology itself, or
the activity someone's performing with it?
Speaker 2 (10:28):
That's the question.
Speaker 1 (10:28):
And if drones are tough, then autonomous vehicle liability that
feels almost impossible. Who is responsible when a self driving
car crashes? Is it the passenger, the car itself, the
AI designer, the map data provider.
Speaker 2 (10:42):
This is probably the biggest shift in liability doctrine that
we face. In traditional law, it's usually about negligence: the
driver was speeding or texting.
Speaker 1 (10:50):
Simple cause and effect, right? But...
Speaker 2 (10:51):
In the autonomous world it shifts into product liability. The
UK Parliament is grappling with this right now with the
Automated and Electric Vehicles Bill.
Speaker 1 (11:00):
And what is that bill actually trying to do?
Speaker 2 (11:02):
It's trying to establish a clear legal framework for insurance
and liability. The core problem is assigning fault when so
many different components are involved. Was it the faulty sensor
from company A, the software update from company B, or
just poor maintenance by the owner. The law needs to
define who holds the liability for the automated function.
Speaker 1 (11:22):
Our sources mentioned the UK government published eight Principles for
Connected and Autonomous Vehicles. It sounds like that's the framework
they're using to try and solve this.
Speaker 2 (11:31):
They're crucial for setting baselines. For example, Principle one insists
on safety and security by design, which mirrors that GDPR
approach we talked about. Principle three focuses on authorized access,
making sure only the right people can control safety critical systems.
Speaker 1 (11:47):
And what about the liability part.
Speaker 2 (11:48):
That's Principle seven, which emphasizes establishing a clear liability chain.
It's about ensuring transparency about who is responsible for which
component of the vehicle system, and that feeds directly into
that parliamentary bill.
Speaker 1 (12:01):
It really hammers home this point that integrating technology into
every part of life is forcing us to redefine these
fundamental legal concepts. It's not just about stopping malicious attacks,
it's about defining blame when a complex system just makes
a mistake.
Speaker 2 (12:17):
Absolutely, and legal systems all over the world are realizing
that these older statutes need this kind of expansion. Concepts
like unauthorized access, interference, negligence. They have to be constantly
redefined based on evolving tech, like that autonomous car that
has to decide whether to hit a squirrel or a pedestrian.
The law is chasing innovation, just constantly trying to catch...
Speaker 1 (12:39):
Up, which leaves us with a final thought experiment for you,
the listener, to consider, and it links right back to
those foundational laws we discussed earlier.
Speaker 2 (12:46):
So imagine the scenario. A smart doorbell camera, which is
clearly a computing device under modern interpretation, gets hacked. The
criminal not only steals the recorded video, but also uses
the camera's internal microphone to record private conversations and monitor
encrypted traffic on your entire home network.
Speaker 1 (13:03):
Okay, so which two of the four main UK laws
we talked about, the CMA, the Police and Justice Act,
the Communications Networks and Services law, or RIPA, which two would
primarily cover the act of the hacker accessing the network
without permission and the subsequent monitoring of your network communication?
Speaker 2 (13:20):
Think about the two distinct criminal actions happening there: the
initial entry into the system, and then the interference with
the flow of data. If you can differentiate those two concepts,
you'll know exactly which pillars are holding up the legal
response to the Internet of Things.