April 24, 2025 · 28 mins

As artificial intelligence (AI) continues to transform the automotive industry, one question remains at the center of innovation: Who’s accountable when AI goes wrong? In this thought-provoking episode of Testing 1-2-3, Parasoft’s Arthur Hicken (a.k.a. The Code Curmudgeon) and Parasoft Chief Marketing Officer Joanna Schloss tackle the ethical, legal, and technical challenges posed by AI in autonomous vehicles.

🚗 What You’ll Learn in This Episode:

  • What is the "trolley problem," and why is it constantly referenced in discussions about AI ethics?
  • Who’s legally responsible when an autonomous vehicle causes an accident: the manufacturer, the developer, or the driver?
  • How Volvo’s bold move to accept liability may reshape consumer trust in self-driving technology.
  • The role of software testing, quality assurance, and transparency in minimizing real-world risks.
  • Surprising responses from popular AI tools like GitHub Copilot when asked about ethical decision-making and liability.

🎙️ Why This Episode Matters:
Whether you're a developer building enterprise applications, a QA engineer testing machine learning models, or a tech leader steering digital transformation, the implications of AI-driven automation go far beyond code. As AI systems increasingly influence safety-critical decisions, understanding your ethical and legal responsibilities as a technologist is essential.

🧠 Real Talk, Real Risks:
Arthur shares firsthand experience using open source software to enable semi-autonomous driving in his own vehicle, raising important questions around consent, responsibility, and risk. Joanna draws a compelling parallel to the real-world fallout of the 2024 CrowdStrike endpoint failure, highlighting how even non-automotive software can impact public safety in unexpected ways.

💡 Key Takeaways:

  • Autonomous driving isn’t just a hardware problem. It’s a software quality problem.
  • Product liability laws are being tested in new ways as AI makes more independent decisions.
  • Software engineers must think beyond functionality to ethics, safety, and accountability.
  • AI may lack principles, but your software design and testing shouldn't.

Don’t miss this essential conversation for anyone working with or around AI.
Hit play, subscribe to Testing 1-2-3, and join us as we break down the intersection of ethics, automation, and accountability, one question at a time.

🔗 Explore More from Parasoft

Stay connected and dive deeper into the world of automated software testing and AI-driven quality assurance:

Join our community for the latest insights, episodes, and discussions on software testing, AI integration, and quality assurance best practices.
