September 10, 2025 · 37 mins

Susan C. McLeod, VP of Data Center Market Development at Hitachi Energy, joins us to explore how large organizations can successfully navigate AI adoption. Drawing from over 20 years in enterprise tech, Susan offers practical insights for turning complex technology into wins for people on the ground.

With a background leading global support and success at Hitachi Vantara, Susan now works at the critical intersection of AI, data centers, and the energy sector. In this practical conversation, we discuss why most AI pilots fail, how to truly prepare your data and teams for new tools, and why communication is the most vital skill for a future shaped by automation.

Susan's insights challenge us to look beyond the hype, preserve the invaluable human knowledge within our organizations, and champion the critical thinking skills that will define the next generation of leaders.

Highlights:

  • The "leapfrog" moment that sparked a shift from simple bots to generative AI
  • Why leaders need to say everything seven times
  • The reason 95% of corporate AI pilots are failing
  • How to prevent the loss of irreplaceable knowledge as experts retire
  • Why the next generation’s greatest challenge is protecting their own critical thinking

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Welcome back to Kinwise Conversations, where we explore the real crossroads of humanity and technology. Today we're joined by Susan McLeod, the newly appointed VP of Data Center Market Development at Hitachi Energy. She has also served as an executive advisor to Envira Global, a woman-owned business accelerating sustainable infrastructure and smart cities. If you're a leader wondering how to prep your systems, people, and data for AI, this conversation is for you.

(00:24):
Susan shares lessons from the enterprise world that apply just as powerfully in education, including why communication is the skill that will shape our AI future.

Hello everyone. I am so excited to be here today with Susan C. McLeod. Susan is this rare leader who can talk code with engineers and strategy with execs and make both sides feel heard. She's helped big companies actually make AI useful, and she cares about doing it in a way that's good for people and for the planet.

Thank you, Lydia. Thanks so much for having me. I'm excited to talk to you and just have this discussion. So my background: over 20 years in IT, specifically focusing on data applications within data centers and enterprise businesses. I really started with a focus in services, professional services and services delivery, and then evolved over time into the support organization, the post-sale side of the business. It's been an incredible journey for me, and I'm excited to share some of the learnings and insights, specifically around generative AI and what's happening in our space today.

Amazing. And speaking of AI, I am really curious about how AI first showed up in the picture for you. I think different people became aware of artificial intelligence or generative AI really recently. So for you, what did that look like and what was your experience?

That's a great question. Over the last seven years, I ran global support and success for Hitachi Vantara, which was a great experience. As you can imagine, with services and support, AI and generative AI specifically have been looked at as one of the low-hanging-fruit areas that can really bring value to services, support, and call center organizations. I'll be honest, I was very heads down looking at bots: how do we enable bots, how do we enable self-service from our customer service and support lens at Hitachi Vantara. And generative AI came onto the scene back in 2021, 2022 really quickly, and at the time I was still very much focused on enabling the bot through the Salesforce CRM tool. We had to stop and pause and say, okay, this is what we're working toward. In a large organization, it takes time. You can't just go turn it on, right? You're
working from established data sets and established tools, and we had to just take a break and say, okay, wait a minute. Do we continue down this path of enabling this feature function when we could pause and actually leapfrog to the future state, the next-generation offerings that bring in true generative value? And we did. We had to pause and say, okay, let's level set and figure out how we work to ready our data, ready our tools, ready our solution to be able to take advantage of this new offering through generative AI, which the Salesforce CRM platform we were using brought to the table. And so, I don't wanna say it caught me off guard, but it was moving so fast, and with large enterprises you have a timeline to ready for certain rollouts. It was moving so quickly and bringing so much value, we had to pause and say, we're not even gonna try to do A; we're gonna ready ourselves and jump to C, if that's a good way to describe it.

Yeah, it makes a lot of sense. AI in general over the past few years has changed and moved so fast. So you had some foresight to be able to say, how do we prepare ourselves to take advantage of the new technology? I think recently there was a report that came out about how 95% of AI pilots are failing, and I think part of this failure rate probably has to do with the preparation that underlies it. So I'm curious, as you were at Hitachi deciding you wanted to take advantage of this technology, what kind of preparation did you have to do as an organization to make that happen?

That's a great question, and I read that article as well. I believe what's happening in the space is everyone is just moving so quickly, and there's a lot of pressure to take advantage of and roll out generative AI within enterprises and corporations because it's the hot topic right now and it's so new. Companies and corporations still have to do their due diligence in understanding: what is the problem we're trying to solve? What is the right use case for our business where we can have success and bite it off in small chunks? Don't just go out and say, we're gonna invest X dollars in generative AI, and just start developing and testing. You still have to go through the process like you do with any solution, which is understanding the problem you're trying to solve. Is it gonna bring financial savings? Is it gonna bring time management savings? What is the return on investment gonna look like, so that you can plan for it correctly and be successful?
I would say with our team and the team at Hitachi Vantara, they did a really good job of identifying use cases where we could be successful, attacking those small use cases first versus trying to be everything to everyone. I think a lot of companies are trying to kinda reel it back now to say, all right, we've invested in these tools, we've invested in this licensing. Now how do we actually use them to create value for our organization and create efficiencies for the organization?

Right. There's been so much hype and pressure to take advantage of this technology because it seems like it can really create a tremendous amount of value at the organizational level. I think in part because, as individuals, it's very easy to adapt and create individual value, but at an organizational level there's a lot more data and a lot more challenges in terms of identifying the right use cases, ensuring that you have the right data in place to use the technology in an effective way, and navigating the legal or policy component. So it's very complex at an organizational level, even though it's incredibly simple at an individual level to open up a generative AI tool and start using it to collaborate or create something that can help you work more effectively.

Yeah, that's a great way to describe it. Microsoft Copilot or Gemini, or through Salesforce, their agents. And you also have to think about the security, the compliance, all of these security wrappers around enterprises. They are creating their own environments where employees, individuals, can utilize AI to do their work, but do it in a protected fashion, right? So the data they're using is still protected within their cloud and their secure environment. You're not just releasing proprietary IP out into the open. Me just going out to OpenAI's ChatGPT, which I do personally, and using that for my own personal creation: you can't necessarily do that through an enterprise. You have to do it through your protected data sets. And it takes time, right, for organizations, and IT organizations especially, to ready that and make sure the security policies are in place. But there are so many different ways, as organizations do that, that the individual employee can use it for efficiencies.
And you know, why start from scratch in creating a document when you can feed the prompts to your enterprise AI tool and ask it to create something for you, giving it the guidelines, giving it even templates? It'll get that individual 70% of the way there, which is great. And then you just take it and you customize it. You put in your own message and tone, make sure it reflects your goals, and then you've saved tremendous time: efficiencies and efficiency gains for individuals.

Absolutely. And being able to set up those systems so that individuals can use them in a way that's compliant, I think, is very challenging. For people who may have relied on ChatGPT or some sort of publicly available tool, that's change management in and of itself, to move from using something like ChatGPT to an enterprise-based tool. Yes, and being able to understand the types of data or the types of information that are proprietary, that you don't want publicly shared, versus things related to your work that are not proprietary and that you can brainstorm about. There's this education component of helping people make choices, and also the change management piece of helping people understand how important it is to protect certain types of data so that they stay inside your enterprise system.

Yeah, that's great. And it is change management and just awareness and training. To your point, you were spot on. It's training. There are things, for my work now, where I'm still using my personal ChatGPT or DeepSeek. I actually have three, and I'll go to each and ask for different ideas, because you have to validate your sources and you wanna make sure you're not getting bad data, 'cause it's feeding you off of what's out there, and there are so many bad, incorrect sources out there as well. So you have to really validate your sources. I will do research on overall trends and market trends, but I can't feed it anything specific to my business, because once you enter it into DeepSeek or ChatGPT, it's out there now. It's out and utilized in the public domain. So you do have to be very cautious of that.
And I think the training that enterprises and organizations need to do for their employees is absolutely critical: understanding how to use your internal IT and AI tools versus your personal and public AI tools. And it's changing every day. You and I were talking last week about how quickly this industry and this specific space are changing, and it's a challenge for organizations to keep up and be able to train and educate their teams.

Yes, and I think understanding how the technology works, even at a very basic level, how AI tools are trained, how data labeling works, where the data comes from, all of that is important too. Because if you understand at least a little bit of how generative AI or AI tools function, then you can begin to make choices that are more aligned to the way you should use the tool. You can be more critical. You can question. You can say, based on what I understand about how AI tools work, I wouldn't wanna put this information into DeepSeek or ChatGPT. So I think building that understanding matters. There's this compliance level of saying, okay, it's really important that you don't put X, Y, and Z in a public generative AI tool, and then there's also this component of, and this is how it works, so that employees and individuals can make choices that are more aligned to what they hope to accomplish.

Yeah, absolutely, I agree with you.

You've been at the head of and been leading some pretty complex initiatives at Hitachi, and you're moving into doing some energy-focused work. I'm curious, what is a lesson you've learned over time about turning a complex technology into something that is usable and a win for people on the ground?

That is a great question. I would say the learning, and it gets back to one of our prior discussions, is really understanding the problem you're trying to solve. And I think you have to stay true to that. You really have to understand, whether it's an internal rollout or working with a customer, solutioning for an organization, what the problem statement is that that customer, or you internally, are trying to solve. If you can stick with that as you define the problem and the options, the solutions that you can bring to the table to roll it out, if you can stick to that and stay true to that, you are gonna avoid so many of the misfires.
Just like, again, to your point about the article that came out last week about pilots failing, I think you really have to stay true to what it is you're trying to solve. And as a leader within any organization, or even just personally, as you're rolling out, you have to be able to communicate and have that strong communication mindset, so that you can ensure that everyone on your team, or the people that you're working with and influencing, are all marching to the same drum, marching down the same path to the solution, versus everyone being out there trying to do really good things with good intent but not being in sync on how they're rolling it out. That's how a lot of companies get into trouble. So those communication skills are absolutely critical, especially now with generative AI. You know, we think that it's gonna take over everyone's jobs in certain roles. What I believe, in my opinion, is that it's gonna make communication for these businesses and leaders even more critical, because we have to set the tone and we have to set the strategy that the company is driving to, or the solution sets you're driving to.

The communication piece reminds me: when I first started my career, I was a teacher, and something that teachers say a lot is, just because you taught it doesn't mean students learned it. I think there's some tie-in to communication in general. Just because you say something doesn't mean your team is on board. It doesn't mean the message that you said one time has landed. So it's really important, once you identify a problem, for everyone to be working toward solving that problem in an aligned and collaborative way, and that focus on communication in our increasingly complex world is so key in creating alignment between individuals. And acknowledging that people don't necessarily understand what you said the first time, and that communication is an ongoing experience. Just because you've communicated it once doesn't mean that everyone's on the same page, so being able to validate that and go back to the table, I think, can be really valuable and powerful too.

Absolutely. And I'm not sure where I heard this throughout my career, it was in one training years ago, but it's, you know, you have to say it and reinforce it, what, seven times.
Say it, say it, say it again, quite often seven times, before people truly absorb it, understand it, and then can echo it back to you. And I think as leaders, that is something we have to remember. Everyone is wired a bit differently. Everyone absorbs information differently. So as you're talking to your teams and readying your teams, especially for change management and transformation, you have to be crystal clear in setting the direction, the North Star of the organization and where you're going, and then continue to re-emphasize it to the different groups to make sure that they truly are on the same page.

AI in particular has just added a level of complexity and change, along with the many other things that are happening in our world right now that organizations have to navigate, and that ability to communicate when you're in a time of immense change is even more critical. And so a moment ago you said, I'm not worried as much about people's jobs being taken as about how we build that skill of communication so that we can use the technology that we have and move effectively forward.

Very true. And look, the jobs are gonna change. I think that's the key, and there's a lot of fear. There's a lot of fear and uncertainty, and I understand it, especially as organizations are so quickly rolling out generative AI solutions or AI solutions. We're seeing it, we're seeing jobs change, and we're seeing some impacts. For the next generation, and I know for you your passion is education, as you think about the next generation coming up, I think we really need to think about what those new roles of the future look like. And, you know, we are seeing it. So again, if I say I'm not stressed about the jobs, I'm not being intellectually honest; the jobs, the roles, are changing. And if you're in certain roles that can be done now through tools that are being rolled out within organizations, you probably do need to really think about how you evolve and adapt with the new tool sets that are coming out. AI is a tool. Generative AI is truly a tool, and companies are using it to do things more effectively and efficiently, which means they may need fewer people to do that. So as that is evolving,
individuals need to think about how they modify or adapt to these new roles. And that's where, to your point about communication, we're always gonna need people to communicate with customers, to speak to customers, to determine what you want that user interface to look like. We may use AI and generative AI to build and code in the future, but those individuals are still gonna be needed to define what that solution looks like. So the jobs are evolving and the roles are evolving, and I think that's what everyone needs to really think about, especially the younger generation getting ready to go into college, right? Where people before were so excited about being coders, I think those individuals need to really think about how to put more of a business lens, a finance lens, a communication lens around the technology, to ensure that you have flexibility and agility when you come outta college.

It's making me think about something that I feel like I've heard a lot recently: this importance of almost soft skills, of your ability to interact with other people in a way that is effective. You're able to communicate, get your point across, understand the situation that you're in, because a lot of the analytical pieces, that kind of left-brain thinking, is something that AI can do very effectively. And so we're upskilling all of us who are adults in the workforce now, upskilling the college students who are about to enter the workforce, and then we have this whole group of K-12 students who are in school right now. Thinking about how you become more communicative, how you understand the macro environment that you're in so that you can make some of those decisions, feels like a piece of it. What do you see as the priority for teachers right now, and maybe educators in general? What do you recommend prioritizing to help them upskill students for the changing world?

That's a great question, and what a challenge they have, right? Well, first of all, this generation is gonna come up with AI there, so they're just naturally going to know how to use it. They're using it every day and may not even realize what it is. It's just now their norm, and education, especially K through 12, of course should continue that training, helping them understand how to use it. I believe, though, it's gonna be even more important for teachers to find ways to delineate between the technology and the tools,
'cause it's a tool, and to keep focus on overall critical thinking, human critical thinking. How do you ensure that while they've got this incredible tool set access in everyday life, in everything they're doing today, they continue to focus on critical thinking and how to think through different options, decision making, and validating the data that's presented to them? That's the big thing. Have you checked the sources? How do you vet the data? How do you know the data is correct? For me, I think that's something teachers are really gonna have to think through: how to ingrain into this generation that what's coming out of these tools is not just solid truth, right? It's not the source of truth from a data perspective. You have to vet it. You have to know your sources.

It's interesting, because I was working with another organization, helping them develop an AI strategy and how they wanna recommend AI be used within their organization. In my conversation we talked about how they were seeing some employees give their own critical thinking a backseat and say, oh, this output looks so good, I'm just gonna use this AI-generated output, but not necessarily evaluate it. And kind of, you know, not respect their own expertise, because in five minutes you can write some prompts and get this amazing-looking document that doesn't have any grammatical errors. It's so easy to say, oh, this is perfect, why would I do my own thing? And we even see adults doing this. Yes. At least when they first interact with the tool, which is concerning. You want to be vetting and understanding what you are creating, and why it's there, and what you think and what you believe about what you've done. And if AI can just do all of your work for you, then you're not useful anymore. So it was a very interesting conversation, 'cause as an organization, they did not want their employees generating everything through AI. They wanted them to really put in the work and think: the work they're doing is important, and they need to carefully vet what they create before they put it out into the world. And so I think it is sort of
seductive to see something created so fast with no obvious errors. Right, like all the punctuation is in the right place. Exactly. Yeah. So I see that with children, I see that with K-12, but I also see it with adults who are maybe newer to it; maybe as you work with a technology more, you're less susceptible to this perfect-looking creation that the technology made.

It's happening everywhere. Look, it just happened to me the other day. I was writing an email and it was a little technical. Three years ago, I would've never questioned myself writing that email. I'm very clear on what I know, what I can talk to, and the point of delineation where, okay, now I need to bring somebody else into this conversation so that they can go deeper. I started questioning myself and I went into ChatGPT and started asking it. I had to stop. I'm like, what are you doing? You never did this before. You're very comfortable and confident and capable of communicating this without having to lean on a tool. So it's happening to me, and I know it's happening to a lot of other people. You know, we should never devalue our experience and our expertise. Use the tool if you have questions about, oh, is that technically accurate to say it that way? Of course, then you might wanna vet it through a tool, or through another SME, a subject matter expert. But it is concerning that individuals with incredible experience and IP are gonna start devaluing that. And then you've got the younger generation who are just coming up, and they're so used to it being there. Are they even going to be able to learn how to critically think, and how to put the tool down instead of just immediately doing what they're used to? It's gonna be a very interesting dynamic in the future, how things evolve.

Absolutely. I feel like education about how the technology works, and perpetually reinstating the importance of critical thinking and making sure you know what you think and why you think it and how the world works,
I think all of those are such important skills that can complement the technical expertise that many people are already bringing to the table. I wanna pivot us a little bit, because you have done such interesting work, particularly with women leaders. I know you've done some work with Wake, and you also have a blog where you've brought together different executives and talked about AI. So I'm curious, what has inspired you to bring different people together to talk about AI, or to support this larger community?

Thank you for asking that question. This is a very important passion for me. The blog series started when I left Hitachi Vantara last August, having done what, as we described, I believe foundationally readied the organization to now start training the LLM models, which they're doing successfully, and I love seeing it and hearing about it. It was such a learning for me of what needed to happen within established enterprises, established organizations, large organizations that have so much data, legacy tools, multiple tools, to actually be prepared to utilize AI, and especially the LLMs, the large language models, to be able to execute and bring those efficiencies. After that happened, I decided, you know, I'm gonna sit back and use this time. I had some time to just document it and educate myself even more on tools, generative AI, and what it would take. As I started, I was gonna do it all in one blog, and then I realized very quickly, oh, there's so much more to this. In speaking to really good mentors, friends of mine in this space, they recommended, Susan, this should be a series. There are so many different items that have to be tackled from an organizational perspective to ready and move these forward; this could be a full series. And so I took their guidance and decided to break it up into truly how we as an organization had readied and moved it forward, and broke it out into different articles. And as I was talking to these great peers and mentors of mine, I thought, wait a minute, they all have their own insights and their own experiences as well. So I decided to bring in others, and they were female. It didn't start that way.
It wasn't intended to be all female; it's just how it evolved. And I thought, you know, this is actually really cool. You've got a mix of women leaders talking about generative AI in real-world experiences: how it works, what doesn't work, and lessons learned. So that's how the blog series started. I brought in what I called co-authors to help drive different messages. Renee Latte, who's an incredible leader, CIO level, on boards, really shared information and data on how to talk to boards, how to ready boards, how to ready your executive team for having successful solutions and launches using this new tool. It's been a great pleasure to work with all of these women, get their viewpoints from different industries as well, and launch the blog. In parallel to that, I've done a lot of work with a group called Wake, the Women's Alliance for Knowledge Exchange. They do some incredible work within the US and globally, helping bring female industry leaders with different levels of experience and technical skill sets into groups of younger female entrepreneurs, to give guidance and feedback on how to help them be successful in the launch of their business. It's an absolutely great opportunity, and probably one of the best things I've ever done personally and professionally is working with the Wake organization.

That's it: there's this level of giving back, right? You've learned all these things over your career. You've been able to give a lot to the organizations that you've worked in, but being able to come together with a group of women, whether it's creating dialogue or education about AI adoption initiatives or supporting the next generation of women business leaders, you're able to give back. When I think about generative AI, it's this very technical thing, a machine, but it's human-like. There are a lot of things that it can do, but there are some things that it can't do, and one of those, I think, is mentoring and sharing real-life experience. So even though you've been a leader who's able to prepare a large corporation for AI adoption, you've also been able to lean into that human side and do the things that artificial intelligence can't do:
mentor, guide, support, share personal experiences. I think that's really cool to hear about, and that balance that exists within you as a person: being able to ready the machine, but also ready the person.

We have to remember at the end of the day in the business that the team members who are utilizing these tools, creating and feeding the data to the tools, are human, and we have to make sure that we're taking care of those individuals as well. It is a balance, and it's a challenging balance, especially today, because the industry is changing and moving so quickly. I think leaders just have to find a way to ensure that they're taking advantage of the technologies and the tools but don't lose sight of their people, ensuring that you protect the human capital and the IP, because you can't get back the data and the knowledge that are in people's brains. You know, so many people are retiring. I've started shifting, as you mentioned earlier, into the energy sector, which I'm so excited about, starting with Hitachi Energy: taking the generative AI knowledge, the enterprise IP knowledge, into the data center focus within Hitachi Energy. One of the challenges the energy sector has is the loss of individual knowledge. So many people who have run these power generation substations, the OT skills, are of retirement age, and there's a big gap in the sector. There's a very small group of people that the energy companies need, the utility companies need, and now the hyperscalers need too. It's a challenge in the industry, and it's one of the things that I know Hitachi Energy, and Hitachi as a whole, is working on: continuing to educate and share knowledge. But it's something leaders have to think about. You know, as these individuals retire, or maybe they're not happy or they feel like they're not getting a balance of growth opportunities within their company, if you lose that IP to a competitor or they retire, do you have that backfill? Is there someone ready, someone who is shadowing them, learning from them before they retire, so you can pass that knowledge on? That's really important for companies to think about.

It's interesting, because we have this new technology that everyone's trying to upskill around, generative AI, but we also have a wealth of knowledge that's been accumulated over many, many years of personal experience that you have to maintain, and you have to figure out how to pass that on.
And so I feel like there's this balance right now of people trying to upskill their employees, upskill themselves, while we also have this incoming wave of folks who are ready for retirement: just brilliant people who've worked for many, many years and have a lot of specialized knowledge and leadership capacity. I think that's a real challenge for companies right now, balancing those two needs at the same time.

It is a big challenge for corporations. You know, everyone is trying to do the right thing and to find solutions to capture knowledge and IP, and knowledge management systems are absolutely critical for organizations. Everything that's in your head as an individual who has worked in a certain technology for 30 years and is maybe getting ready to retire, how do you capture some of that into a knowledge management system? And guess what, by the way, that knowledge management system is now feeding into the AI, into the LLMs, the big models that are being created, so others can now utilize that IP. We can't lose sight of that: yes, we're training with new technology, but we also have to find a way to capture the IP of the individuals who are retiring, and not lose the IP that a lot of companies have invested in from people who, for whatever reason, are considering leaving the company.

Yeah. It's definitely interesting and important, and I hope that as you step into the energy sector you're able to learn a lot and help fill some of those gaps, because energy is so important for all of us, and being able to act in a sustainable and responsible way is really critical. I think losing that knowledge is a risk even for a lot of people who don't work directly in that sector. So it's an exciting place for you to move to, and a very important one as well.

Absolutely. I'm very excited to bring this knowledge into the energy sector and to learn. I mean, there's so much within the sector that is truly where IT and OT intersect. Mm-hmm. And energy is now right in the middle of such demand from the generative AI boom, an explosion over the last two to three years. So it's a very exciting place to be.

Amazing. Okay. I have one last question for you, Susan.

Yes.

I always end this podcast asking folks to share an idea or a question about AI that is really sitting with them right now. It might be the thing that's keeping you up at night, or just something that you're chatting about or thinking about throughout your day. So I'm curious, what's your question or idea related to generative AI right now?

The thing that I would say is keeping me up, or that I'm very thoughtful about, and we touched on it earlier: I'm very concerned about, or curious, I'm gonna say curious about, the next generation and the ability to not lose that critical thinking function and capability. Again, this next generation is gonna grow up with these tools just at hand. They don't even realize it. To them, it's just gonna be part of their daily life on their iPhones or their iPads or their systems. And it's gonna come so simply to them just to ask a question and get, to your point, a really beautifully formatted response and answer. How do we ensure that this next generation doesn't lose that capability of critical thinking? That is something that is concerning to me. And I know the K through 12, as you mentioned, and the education system face a bit of a headwind: how are they going to overcome that and keep that ability in the younger generation? That is concerning to me. I don't wanna say laziness, but you could see, with this type of functionality and tool, how the human race could ultimately become very lazy. That concerns me from a critical thinking perspective, but also just in how we do our day-to-day life. I have to say, I love my pool robot now. I call him Tea. He cleans the bottom of the pool, which used to be really challenging to do. Okay, that's actually a great thing and I don't want that to go away. But, you know, what's next? I mean, at some point, how does it go too far? That is one of those things that, if you sit down and really think about it, can be a little scary at times.

Mm-hmm. And thinking about what we want to continue to keep really sharp in ourselves as humans, and what we wanna lean into and develop in ourselves and in the next generation.
I think there have been lots of cycles of innovation that have led to some skills becoming a little weaker and other skills becoming stronger. So what do we wanna prioritize? I wonder about the intentional aspect of that. I do think we have choice as people about what we wanna prioritize in terms of what we sharpen and how we sharpen it, and how we mentor and encourage the next generation, too, to pay attention to the skills that they can really bring up in themselves. It is kind of frightening to think about people not thinking and losing those critical skills, but there's a lot of potential as well, and trying to hold those two things is a perpetual balance.

That's a wrap on our conversation with Susan McLeod, tech leader, AI strategist, and connector of connectors. Three takeaways to carry forward. First, slow down to speed up: as Susan puts it, pausing to align your systems, people, and use cases is what turns AI pilots into real value. Second, communication is the killer app in a world of automation; human alignment is everything, from boardrooms to classrooms. And third, the AI shift isn't just technical, it's generational. As leaders and educators, it's on us to model curiosity, critical thinking, and care in how we teach and use these tools.

If you're ready to build your team's AI muscle, Kinwise offers everything from a 30-day teacher pilot to a one-day AI leadership lab for boards and leadership teams. Learn more or get started at kinwise.org. And if this episode sparked something for you, the best way to support Kinwise Conversations is to subscribe, leave a quick review, or share it with someone you're building the future with. Until next time, stay curious, stay grounded, and stay Kinwise.