Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
This article was published in January 2025.
It's titled Algorithm Reviews: public versus private reports.
And the background to this is that there's quite a bit of discussion around public reporting of AI audit reports, and I just wanted to clarify some of what that means, because there is already anxiety around
(00:23):
this particular topic.
So here we go.
If you've encountered the view that AI audit reports need to be made public and you're responsible for an algorithmic system, you might feel a bit uneasy.
However, for most listeners of this episode, there's no need to be overly concerned.
This article explains why you shouldn't be too worried
(00:45):
and how to redirect the energy misspent on anxiety.
Context matters.
AI and algorithm audit guidelines vary widely and are not universally applicable.
We discussed this in a previous article, outlining how the appropriateness of audit guidance depends on your circumstances.
Here are some further specifics to explain what this notion
(01:07):
of public audit reporting may mean for your situation.
Firstly, where this comes from.
Public financial statement audit reports are often cited in the context of AI audits.
Here's some background information on financial audits that will help in understanding the potential direction that AI audits may take.
(01:28):
Historical context: public financial audit reporting emerged in the early 20th century, driven by the need for transparency and investor protection.
Applicability: public reporting typically applies to publicly traded companies and certain large private entities.
(01:49):
Exemptions: small businesses and most private companies are often exempt from public reporting, of their financial audits that is.
And time lapse: the current financial audit regime developed over decades, responding to changing governance requirements and some corporate scandals.
(02:09):
AI audit reporting may follow a similar path.
It will probably happen faster, but it is reasonable to assume that it won't happen overnight.
Then we have emerging legislation.
While public reporting isn't universally required, recent legislation shows some signs of moves in that direction.
(02:29):
The EU AI Act has mandatory audit expectations with potential public reporting.
The Colorado ECDIS Act requires insurance companies to consider external data or algorithms and file annual reports with the state.
These signal a growing trend towards transparency.
The EU AI Act primarily targets high risk applications in
(02:53):
terms of audit requirements, and the ECDIS Act requires annual returns but not necessarily public audit reports.
And then let's talk about the distinction in terminology: audit versus review.
Again, we explored this topic in depth in a previous article, but here are the key points from there that
(03:13):
are relevant to our discussion.
An audit is a specific formal process, typically conducted by an independent external party.
It follows a defined methodology.
The result is a report that may be used for compliance or certification purposes.
An audit can be considered a type of review, but there are
(03:37):
other types of reviews too.
Those other reviews produce documents or reports for internal use, identifying what's working and what needs to be addressed.
And this distinction is important, because the debate around public reporting relates to formal audits,
(03:59):
not other types of reviews.
Understanding this should alleviate some anxiety about public reporting.
If you're conducting internal reviews of your AI systems, these are likely not the audits discussed in the context of public disclosure debates.
Transparency demands mainly target formal independent
(04:20):
audits, and primarily for high risk AI applications.
Let's talk about when public reports might be necessary.
So the need for public reports will likely be driven by several factors, including legislation: specific laws may require public disclosure of the results of certain
(04:40):
algorithm audits, like the EU AI Act.
And then transparency: your organization may choose to be transparent as part of its ethical or public relations strategy.
However, this won't apply to all reviews.
And when public reporting is required, the level of detail in a public report is likely to differ from
(05:03):
an internal review report.
This is because some information needs to remain confidential, not all aspects of a review need to be reported on publicly, some findings may not be appropriate for public disclosure, and many reviews are conducted to proactively address issues.
(05:23):
For example, if you commission a review of your algorithmic fraud system to check for potential bias, you might choose to publicly share some information, but you won't disclose all the details.
For instance, how you check for fraud may need to remain hidden; otherwise it may be easier for fraudsters to figure out how to bypass the checks.
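To make that bias-check idea concrete, here is a minimal sketch of one metric such a review might compute: the gap in fraud flag rates between customer groups, sometimes called a demographic parity difference. The group names, data, and tolerance threshold below are illustrative assumptions, not part of the original article or any standard.

```python
# Minimal sketch of one check an internal bias review might run:
# compare how often a fraud model flags transactions across groups.
# All data and the threshold are hypothetical, for illustration only.

def flag_rate(flags):
    """Fraction of transactions flagged as fraud (flags are 0/1)."""
    return sum(flags) / len(flags)

def parity_gap(flags_by_group):
    """Largest difference in flag rates between any two groups."""
    rates = [flag_rate(f) for f in flags_by_group.values()]
    return max(rates) - min(rates)

# Hypothetical review data: the model's flags per customer group.
flags = {
    "group_a": [1, 0, 0, 1, 0, 0, 0, 0, 0, 0],  # 20% flagged
    "group_b": [1, 1, 0, 1, 0, 1, 0, 0, 0, 0],  # 40% flagged
}

gap = parity_gap(flags)
print(f"flag-rate gap: {gap:.2f}")  # prints: flag-rate gap: 0.20

# An internal report could include the gap and the follow-up plan;
# a public report might only state that bias checks were performed,
# without revealing which transaction features drive the flags.
THRESHOLD = 0.10  # illustrative tolerance, chosen by the reviewer
print("needs investigation" if gap > THRESHOLD else "within tolerance")
```

This also illustrates the public-versus-private split discussed above: the metric's value might be shareable, while the features and thresholds behind the fraud checks stay confidential.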
(05:45):
Here's a quick note: for high risk AI systems that fall under the EU AI Act, certain information may need to be registered in the EU database, and that includes a description of the system, its intended purpose and information about the provider.
And then let's talk about preparation for potential public reporting.
Even if your organization isn't currently required to
(06:07):
publish AI audit reports, it's wise to be prepared.
Here are three things you can consider now.
Number one, ethical AI practices.
You are likely already implementing AI governance practices.
If not, now is a good time to start.
This type of commitment will position you well for any
(06:28):
future reporting requirements.
Number two, regular reviews.
Conduct periodic assessments for fairness, accuracy, security, privacy, and other risk areas.
These help you to identify issues to resolve early.
Even if the issues are not complicated, they can take time to fix.
(06:50):
And then number three, reporting strategy.
Consider the information you would be comfortable sharing publicly if required.
A proactive approach can help you prepare for potential future reporting requirements with safeguards, for example, to maintain confidentiality.
Lastly, where to from here?
(07:10):
The potential for public AI audit reports shouldn't be a source of anxiety.
Instead, view it as an opportunity to strengthen your AI governance practices.
While there's a growing trend towards transparency in algorithmic decision making, especially in the public sector, the need for public reports on algorithm integrity
(07:31):
reviews is not universal.
However, banks and insurance companies using high risk AI systems, as defined by the EU AI Act, may be subject to specific requirements.
And further public reporting requirements may emerge as regulators gain more experience in overseeing these systems.
(07:52):
So, being prepared is important, and this includes identifying and mitigating risks before they become issues.
By conducting regular, thorough reviews of your algorithmic systems, you can ensure their integrity, fairness and reliability.
That's the end of the article.
Thanks for listening.