Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Bloomberg Audio Studios. Podcasts. Radio. News.
Speaker 2 (00:08):
You're listening to Bloomberg BusinessWeek with Carol Massar and Tim
Stenovec on Bloomberg Radio. Remember last week, it was all
the way... Yes. Last week, Carol, Alphabet, the parent company
of Google, reported a surge in demand for its cloud
and AI services. It pleased investors, who sent its shares up,
even as the company said capex for the year will
be even higher than expected. The company's investing record amounts
(00:30):
to try to push progress in AI and infuse answers
and assistance from its LLM Gemini into its popular products, including search.
Speaker 1 (00:37):
That's where Ryan J. Salva comes in. He is Senior
Director of Product over at Google, where he builds AI
tools for developers, such as Gemini CLI. I think I'm
saying it correctly. We're talking about the command line interface.
It's an open source AI agent for developers, as well
as Gemini Code Assist, Google's AI coding assistant, Tim.
Speaker 2 (00:57):
We've got Ryan J. Salva with us. Also with us is
Mandeep Singh, Bloomberg Intelligence Global Head of Technology Research. He's
also the host of the Tech Disruptors podcast. Ryan was featured
on an episode of the Tech Disruptors podcast with
Mandeep back in the spring. Welcome to both of you. Ryan,
our audience, some who code, probably more who don't. I'm
wondering, though, if you can explain for everybody out there
(01:19):
how AI assistants, including those from Google, work
right now with programmers, and the vision that
you have for the future.
Speaker 3 (01:27):
Yeah, absolutely, and first, thank you so much for having me.
You know, really, what we see today is that a
lot of developers are really caught in kind of the
labor of writing if-then-else statements, getting caught up
in little tiny logical loops, and so often developers and
organizations are really just trying to deliver user requirements. They're
(01:49):
trying to deliver real value to their customers, and so
they're able to use AI and large language models to
write those requirements in natural language, translate that to code, and
through that ultimately accelerate their pace of iteration,
their pace of learning, so that developers can focus more
on building features rather than on the syntax of the
(02:13):
code itself.
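A concrete picture of the "if-then-else labor" Salva describes, using a hypothetical shipping-fee requirement of our own invention (this is an editorial illustration, not Google code): the first version hand-codes each rule as control flow, while the second states the same requirement as data, so the intent stays visible and the syntax fades into the background.

```python
# Hypothetical requirement: shipping is free over $100, $5 over $50, else $10.

def shipping_fee_imperative(order_total: float) -> float:
    # The "if-then-else labor": each business rule is hand-written control flow.
    if order_total >= 100:
        return 0.0
    elif order_total >= 50:
        return 5.0
    else:
        return 10.0

# The same requirement expressed as data: (threshold, fee) pairs, read top-down.
FEE_TIERS = [(100, 0.0), (50, 5.0), (0, 10.0)]

def shipping_fee_declarative(order_total: float) -> float:
    # Pick the fee for the first tier whose threshold the order total meets.
    return next(fee for threshold, fee in FEE_TIERS if order_total >= threshold)
```

Both functions implement the same requirement; the declarative form is closer to how one would state it in natural language, which is the style of specification the interview suggests developers increasingly write.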
Speaker 4 (02:14):
And so what kind of productivity benefits do you think you've
seen, both internally as well as with clients? Maybe
talk to us about one of the best use cases that
you've come across with Gemini.
Speaker 3 (02:28):
Oh my gosh, there are so many, you know. So I'll
maybe first talk a little bit, from a metrics standpoint, about
what we tend to see. So one of the teams
within Google is the DORA research team. DORA effectively surveys
thousands and thousands of engineers every year, and follows that up
with hundreds of hours of qualitative interviews. One of the
(02:50):
things that we're seeing is that today roughly ninety percent
of developers are integrating AI into their everyday work. They're
using AI for roughly two hours or so. That tidal
wave of adoption has already swept over us all, and
now we're swimming in the ocean of AI. At Google,
(03:11):
what we see is that today roughly fifty percent of our
code is being written by AI. And I want you
to stop and maybe...
Speaker 1 (03:19):
Wait a second, say that one more time.
Speaker 3 (03:22):
Five zero. Fifty percent of code is being written by AI.
That is a tremendous amount of code. And this is
in all of Google's products, from search to YouTube, to
cloud to you name it. And so this is allowing
our developers to really iterate again at a much much
(03:43):
faster pace to experiment, to learn, to test out new ideas,
and ultimately to be just a little bit less precious
about every line of code they write. Because they're able
to use the large language models to experiment, it's real
easy for them to try out an idea on
a Tuesday, put it in front of a couple of
(04:04):
users on a Wednesday, and get a feel for whether
or not it provides real value. This is the real
magic and the real value that I feel like AI
unlocks.
Speaker 1 (04:14):
I love that idea, less precious, because it almost, to me, is akin
to when we got, like, digital cameras on our phones, right?
And we used to take pictures with film, and everyone...
Like, I used to think about, how many more
photos did I have?
Speaker 4 (04:27):
Left?
Speaker 1 (04:27):
Now I don't even care, right? I just take a
million photos. Ryan, I do wonder, though: if we're less precious,
we're more efficient, we're more productive, which is what I'm
kind of getting from this conversation, what does it mean
for developer jobs?
Speaker 3 (04:41):
Oh so, I mean, let me tell you this right now,
within my team, we are hiring more engineers, we are
hiring more product managers. And I see this when I
talk to so many other enterprises and organizations today. It's
not so much that the developer's job is any less important,
but what it does mean is that our job requirements
(05:01):
are changing. The skills that we need are a little
bit different. Because developers are spending a little bit less
time writing syntax, they're spending more time thinking about requirements.
We're really asking developers to think more like architects, to
think about systems design, to think about negotiating the contract
between components. And it means that ultimately, as our next
(05:26):
generation of creators and developers and builders are coming up,
we're asking them to think not just about can they
speak the language of programming, can they speak Java or
Python or C Sharp, but rather, can they do good
basic problem solving, and can they think about large
(05:46):
systems-level design? That's where the magic is at.
Speaker 2 (05:49):
We're speaking with Ryan J. Salva, Senior Director of Product
at Google. Ryan, you must remember that New York Times
article from August, "Goodbye, one hundred sixty five thousand
dollar tech job," as it went through all the entry
level tech jobs... I see that you're laughing, but the entry
level tech jobs that were drying up, and people, you
know, comp sci graduates, essentially working at Chipotle because they
(06:09):
couldn't find those entry level jobs. When you say you're
hiring engineers, are you hiring entry level engineers? Or has
entry level just dried up because of LLMs?
Speaker 3 (06:19):
Yeah. And by the way, I don't mean to laugh,
because every job is really important and I want folks
to be able to discover it. But I laughed because
I do think that the meme, or sometimes the headline, is a
little bit easier to grab attention with than the ground
level reality. You know, I have...
Speaker 1 (06:39):
So that headline, Ryan, is wrong?
Speaker 3 (06:42):
You know what, I think... I'm not saying that
an individual use case or an individual company doesn't go
through periods where they may let go of workers, or
they may make different hiring decisions. But what I am
saying is that, writ large across the industry, I'm still
seeing a very, very healthy engineering ecosystem, and I'm
still seeing companies really prize and value the developers who
(07:06):
can come bringing skills that are more appropriate for this
new AI era. And that does mean, again, that just
being able to speak programming, to be able to
speak Java or JavaScript or TypeScript, is not enough anymore.
The developers really need to think about how they solve
the problem.
Speaker 4 (07:26):
So, Ryan, one of the stats from Google that has
caught my attention is the increase, the exponential increase, in
its token count, you know, to almost one point three
quadrillion tokens. Where does coding assistance fit into that
number, one point three quadrillion? It's like...
Speaker 2 (07:45):
Is that what he's going to ask?
Speaker 4 (07:47):
Look, I mean, these numbers are staggering, but when it
comes to use cases, I think there's a big variance
between, you know, a simple chatbot Q and A
versus a coding agent, or an AI agent running for days.
How would you characterize the contribution of coding assistants and
the products that you oversee to the overall token consumption
(08:10):
at Google?
Speaker 3 (08:11):
Sure, sure. So, I mean, I'll start here. We don't
necessarily count whether a token is used for a Google
search versus a software development problem versus someone doing their homework.
Having said that, what I can tell you is that
perhaps nowhere better than in software development have I seen
(08:32):
product-market fit between large language models and a
particular use case. There are a lot of reasons for this.
I think probably the biggest one is that, with large language models,
you know this when you use Gemini or
use ChatGPT or any other kind of large language
model out there, if you're asking it to help you
(08:54):
write an email or help you write a document of
some kind, often the response, the quality of the response,
depends an awful lot upon your personal judgment and your
personal taste. Whereas with software development, we have decades of
deterministic quality measures that let us know whether the software
(09:16):
is good and safe and useful or not. We have
unit tests and static analysis and all these other ways
of validating the quality of software. And so what I
see is a lot of organizations using AI, using agents,
using large language models to accelerate their engineering life cycle,
because they can deterministically say: this is of good quality,
this is of bad quality, this is something I want
(09:39):
to use, this is something I don't. That's how I
see it really accelerating, particularly in the software development space.
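The deterministic gate Salva describes can be sketched in a few lines of Python (an editorial illustration of the idea, not any actual Google tooling): a candidate implementation, say one proposed by a model, is loaded and then accepted or rejected purely on whether it passes a fixed test suite, with no human taste involved. The function name `solve` and the spec format are assumptions made for the sketch.

```python
# Accept or reject a candidate implementation based only on deterministic tests.

def passes_tests(candidate_source: str, tests: list) -> bool:
    """Run candidate_source (which must define solve) against (args, expected) pairs."""
    namespace: dict = {}
    try:
        exec(candidate_source, namespace)  # load the candidate code into a fresh namespace
        solve = namespace["solve"]
        # Every test case must produce the expected output.
        return all(solve(*args) == expected for args, expected in tests)
    except Exception:
        return False  # any crash, missing function, or wrong output counts as a failure

# A tiny "spec": solve(a, b) should return the sum of its arguments.
SPEC = [((2, 3), 5), ((-1, 1), 0)]

good = "def solve(a, b):\n    return a + b\n"
bad = "def solve(a, b):\n    return a - b\n"
```

Here `passes_tests(good, SPEC)` returns True and `passes_tests(bad, SPEC)` returns False; the point is that the verdict is mechanical, which is what makes software development unusually well suited to model-generated output compared to, say, email drafting.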
Speaker 4 (09:48):
So do you expect a big migration of legacy systems
to the modern architecture that you mentioned, as a result
of, you know, coding agents being that good? Or do
you see limitations in terms of, you know, where the
practical use cases are versus where the legacy
technologies are just too hard to move?
Speaker 3 (10:10):
Yeah, you know, actually, migration and modernization is one of
the areas where I see the most interest among large
engineering teams today. There are a lot of reasons for that.
In some cases, the engineers who are maintaining those legacy
kind of applications are retiring or moving on. Skill sets
are atrophying. And there is a thing within software development
(10:32):
called code rot, effectively, when an application just sits around
so long that it atrophies over time and becomes less performant.
Speaker 4 (10:41):
So AI is good at that without consuming too many
tokens or, you know, increasing your bill?
Speaker 3 (10:47):
So what I actually hear is that a lot of organizations
are willing to dedicate waves and waves and waves of
tokens, because the cost of maintaining those legacy applications is
so high. Often they're having to maintain entire data centers,
which means that you're paying not only the cost of
the engineers to maintain them, but you're also paying for
(11:08):
the facilities, for the hardware, for all of the extra
IT that goes with maintaining those. And honestly, even if you
just set the cost of maintaining them to the side,
the fact that you're not able to carry those applications
forward and innovate with them and do new things with them,
often that's the real cost.
Speaker 1 (11:29):
Ryan, come back. We'd love to continue this. Ryan J.
Salva, Senior Director of Product at Google,
and of course our own Mandeep Singh of Bloomberg Intelligence.
Speaker 2 (11:37):
For more insights from Mandeep and the Bloomberg
Intelligence team, check out the Tech Disruptors podcast. You can
find it on Apple, Spotify, or wherever you get your podcasts.