Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome back to the deep dive. If you are here,
you are probably one of the millions who saw Apple
make that huge leap, you know, from iOS eighteen all
the way to twenty six, and you realized, Okay, this
isn't just an update. This feels different, a reboot, almost,
especially for Apple Music.
Speaker 2 (00:16):
That's exactly it. The scale is just immense. Skipping eight version numbers, that tells you they're not just tweaking things. It's a total rethink, a restructuring.
Speaker 1 (00:24):
All under this new software philosophy they've rolled out.
Speaker 2 (00:27):
Exactly. This is Apple Music for iOS twenty six. It
landed globally September fifteenth, twenty twenty five, after that big
WWDC reveal back in June.
Speaker 1 (00:37):
So our mission today is simple: cut through the noise.
We've got stacks of sources here talking about this supercharged update.
There's the liquid glass design which is everywhere, and then
there's the AI stuff, some really game changing features potentially.
We want to find the important bits, the surprises, and
figure out how listening, discovering, organizing music, how it's all
(00:57):
actually changed for you.
Speaker 2 (00:59):
Yeah, and the core themes really are immersion and intelligence. That Liquid Glass, like you mentioned, it's all about translucency, fluid motion, dynamism. It really changes the whole feel of the OS, and Apple Music is front and center.
Speaker 1 (01:13):
But it's not just cosmetic, right? Not at all.
Speaker 2 (01:15):
Underneath that shiny surface, you've got things like AI-powered DJing with AutoMix, real-time layered translations across languages, and crucially,
tools to actually manage your library, you know, tackling that
feeling of being overwhelmed by millions of songs.
Speaker 1 (01:31):
Okay, so let's dive right in with the visuals, then. That seems like the most immediate change. What is this Liquid Glass design, and why the big shift away from what they had before?
Speaker 2 (01:42):
It's way more than just picking a new color or rounding some corners, you know. It feels like a fundamental
shift in how they think about UI design.
Speaker 1 (01:49):
So when we talk about the core concept, how is it different from, say, the transparency effects iOS has used for years? Is it just more transparency?
Speaker 2 (01:57):
Not quite. It's more about replacing those flat, static backgrounds and sharp lines we're used to. Now you have these translucent animated layers, and they genuinely look like, well, like flowing glass. They refract light, they move, they have
this sense of depth.
Speaker 1 (02:13):
And weight. Okay, like a physical material, almost. Exactly.
Speaker 2 (02:16):
It simulates a physical material reacting within the app. It's
trying to elevate the whole visual thing, and.
Speaker 1 (02:21):
Apple Music seems to be where they've really leaned into this.
So how does that change how you navigate the app
day to day?
Speaker 2 (02:26):
The first thing you probably notice is the bottom navigation bar,
you know, Home, Search, Library, that used to be just
stuck there.
Speaker 1 (02:34):
Opaque, right, fixed at the bottom.
Speaker 2 (02:36):
Now, as soon as you start scrolling down through a playlist or an album, that bar sort of shrinks and dynamically becomes this subtle, floating, translucent overlay.
Speaker 1 (02:46):
Almost like it's receding, letting the album art or the tracks...
Speaker 2 (02:49):
Come forward, precisely. It hovers, and then the moment you
scroll back up, it just smoothly expands back to its
full size, ready to be tapped.
Speaker 1 (02:56):
So it gets out of the way when you don't
need it.
Speaker 2 (02:58):
Enhances immersion, yeah, but it's instantly back when you do. Even the little control buttons, shuffle, repeat, autoplay, they've got this softer pill shape now. They're designed to kind of blend into that Liquid Glass background.
Speaker 1 (03:10):
Making them feel less like clunky buttons slapped on top
more part of the design itself.
Speaker 2 (03:15):
That's the goal. Yeah, making the controls feel natural, part
of the canvas.
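[Editor's note: for the developers listening, the interaction described here, a bottom bar that collapses as you scroll down and expands when you scroll back up, can be sketched in a few lines of SwiftUI. This is only an illustration of the pattern under our own assumptions, not Apple's implementation; the offset tracking and the translucent capsule bar are stand-ins.]

```swift
import SwiftUI

// Illustrative sketch of a scroll-responsive floating bar, not Apple's code.
private struct ScrollOffsetKey: PreferenceKey {
    static var defaultValue: CGFloat = 0
    static func reduce(value: inout CGFloat, nextValue: () -> CGFloat) {
        value = nextValue()
    }
}

struct FloatingBarDemo: View {
    @State private var collapsed = false
    @State private var lastOffset: CGFloat = 0

    var body: some View {
        ScrollView {
            LazyVStack(spacing: 12) {
                ForEach(1...50, id: \.self) { track in
                    Text("Track \(track)")
                        .frame(maxWidth: .infinity, alignment: .leading)
                        .padding()
                        .background(.thinMaterial, in: RoundedRectangle(cornerRadius: 12))
                }
            }
            .padding()
            // Report the content's vertical position so we can infer scroll direction.
            .background(GeometryReader { proxy in
                Color.clear.preference(key: ScrollOffsetKey.self,
                                       value: proxy.frame(in: .named("scroll")).minY)
            })
        }
        .coordinateSpace(name: "scroll")
        .onPreferenceChange(ScrollOffsetKey.self) { offset in
            // minY decreasing means the user is scrolling down: collapse the bar.
            withAnimation(.spring()) { collapsed = offset < lastOffset }
            lastOffset = offset
        }
        .safeAreaInset(edge: .bottom) {
            HStack(spacing: 28) {
                Image(systemName: "house")
                Image(systemName: "magnifyingglass")
                Image(systemName: "music.note.list")
            }
            .font(.title3)
            .padding(.vertical, collapsed ? 6 : 14)
            .padding(.horizontal, collapsed ? 20 : 36)
            .background(.ultraThinMaterial, in: Capsule()) // translucent, glass-like pill
            .padding(.bottom, 8)
        }
    }
}
```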
Speaker 1 (03:19):
You mentioned content prioritization and making the artwork the star.
How does that play out with this new design?
Speaker 2 (03:26):
Content, especially album art, is definitely the hero now. It's not just stuck in a little square; full-screen artwork can extend right out, even onto the lock screen. But the really interesting part, I think, based on our sources, is the support for artist-designed animations.
Speaker 1 (03:41):
The dynamic visuals, where the image isn't static, it moves with the music. Exactly.
Speaker 2 (03:46):
If the artist supplies the right kind of assets, you
get these subtle animations on compatible tracks. The sources mentioned
Taylor Swift's The Tortured Poets Department visuals, where the black and
white images sort of pulse gently with the rhythm. It's
not just a visualizer. It feels like Apple's trying to
bring back some of that you know, Vinyl era immersion
(04:07):
where you'd stare at the album cover.
Speaker 1 (04:08):
But making it dynamic for digital. Right, and on the Now Playing screen?
Speaker 2 (04:12):
The way the artwork's colors and textures kind of bleed into those translucent control layers around it, it creates this real sense of depth but without feeling cluttered.
Speaker 1 (04:22):
Okay, that sounds really elegant. But you know, big design changes like this, they often get pushback during the beta testing phase, don't they? What did the early feedback look like?
Speaker 2 (04:32):
It was mixed, actually. Lots of praise for the elegance, people saying it felt instantly familiar yet delightful, that sort of thing. Yeah. But the big critique, especially from people using dark mode or listening in dim light, was about legibility.
Speaker 1 (04:46):
The classic beauty versus function problem. Exactly.
Speaker 2 (04:49):
Yeah, when you have dynamic translucency floating over album art with vastly different colors, sometimes white text or icons could just get lost. People reported eyestrain.
Speaker 1 (04:58):
That's a real accessibility concern. So how did Apple tackle
that before the official release? Did they dial it back?
Speaker 2 (05:06):
They did a couple of things. First, yeah, they tweaked the default contrast a bit in the final iOS twenty six release, just boosted it slightly to help with baseline legibility. Okay, but the really key thing for you, the user, is the control they added. You can actually customize the intensity yourself.
Speaker 1 (05:22):
Oh interesting. Where do you find that? It's
Speaker 2 (05:24):
tucked away under Settings, Accessibility, Motion. There's a new slider called Liquid Glass...
Speaker 1 (05:29):
Intensity. Accessibility, not Display. Why there?
Speaker 2 (05:31):
And that placement is telling, I think. Putting it under Accessibility shows Apple acknowledges that for some people, maybe with visual sensitivities or just older eyes, the aesthetic can interfere with basic usability.
Speaker 1 (05:44):
So it's giving people an out if they need it.
Speaker 2 (05:46):
Right, you can choose Subtle, Default, or Vivid, so
can dial back that flowing glass effect if it's bothering
you without losing the overall new look and feel of
the OS.
Speaker 1 (05:56):
That shows some good awareness, balancing the design vision with practicality. Okay, let's talk performance. More animations, more translucency. Usually that means more battery drain, right? You'd
Speaker 2 (06:07):
Think so, But apparently it's kind of the opposite. This
is where that big version jump eighteen to twenty six
starts to make more technical sense. It's tied tightly to
the hardware. How so, iOS twenty six includes specific optimizations
for battery life when you're listening to spatial audio.
Speaker 1 (06:24):
Oh wow, that's huge for people using AirPods Pro or Max. That stuff normally eats battery, right?
Speaker 2 (06:29):
The sources we looked at had some specific numbers. Early testers reported getting ten to fifteen percent longer listening sessions with Dolby Atmos tracks compared to iOS eighteen. Fifteen percent.
Speaker 1 (06:41):
How are they managing that?
Speaker 2 (06:42):
It comes down to smarter on-device processing. The newer chips, like the A seventeen Pro, are using the Neural Engine much more efficiently for the heavy lifting involved in spatial audio encoding and decoding.
Speaker 1 (06:53):
So it's not just brute forcing it with the main
CPU anymore exactly.
Speaker 2 (06:57):
It's offloading the work to silicon that's optimized for these kinds of machine learning tasks. Runs cooler, uses less power. Better...
Speaker 1 (07:05):
Battery life, which is great if you have the newer hardware.
This sounds like it might leave older devices behind. What's
the cutoff?
Speaker 2 (07:11):
It's pretty clear cut. The iPhone eleven series is the minimum, so if you're still using an iPhone XS or anything older, you won't get the iOS twenty six update.
Speaker 1 (07:21):
Ouch. So that whole Liquid Glass experience, the efficiency gains, yeah, it needs the A thirteen Bionic chip or newer.
Speaker 2 (07:28):
Yeah. It seems that level of dynamic rendering and AI
processing just demands more horsepower. It's a hard stop, which
is tough for people holding onto perfectly good older phones.
Speaker 1 (07:38):
Understandable but frustrating. Okay. Lastly, on the design front, ecosystem sync. Handoff between Apple devices is usually good, but does Liquid Glass change that experience?
Speaker 2 (07:47):
The descriptions use words like silkier, apparently. When you hand off music from your iPhone to, say, your Apple Watch or HomePod, those Liquid Glass visual cues, the dynamic art, the shrinking bar, they actually sync visually across devices now.
Speaker 1 (08:01):
So it's not just the audio transfer. The visuals bridge
the gap too.
Speaker 2 (08:04):
Yeah. It creates this visual continuity that makes the handoff
feel even more instant, more seamless, blurs the lines between
the devices.
Speaker 1 (08:11):
Okay, so the interface is undeniably striking. It's customizable for
accessibility and surprisingly efficient if you have the right hardware.
Let's shift gears now to what's under the hood, the
intelligence layer. This sounds like where Apple Music really starts
to change from just a player to something more proactive.
Speaker 2 (08:29):
Exactly. This is Apple Intelligence really making its mark on
how you listen. The AI isn't just recommending songs anymore.
It's basically taking over the DJ booth.
Speaker 1 (08:38):
Let's start with the headline feature here, AutoMix. They're calling
it a personal AI DJ. What's it actually doing technically?
Speaker 2 (08:45):
Okay, so AutoMix uses on-device machine learning, that's important, to analyze the music in real time. It's not just looking at genre tags. It digs into the audio itself, like the waveform, yeah, characteristics like tempo, the key signature, the energy level, even harmonic structure. It uses all that to create really seamless, genre-aware transitions between songs in your queue or playlist.
Speaker 1 (09:07):
So the goal is no more awkward silences or sudden
jarring shifts between tracks.
Speaker 2 (09:13):
Precisely, it aims for that continuous flow you might get
from a live DJ, but totally personalized to your music.
Speaker 1 (09:19):
Can you give an example of where it works well,
like a specific transition?
Speaker 2 (09:23):
Sure. The sources mentioned a good example, smoothly fading from Billie Eilish's Birds of a Feather, which is pretty ambient and down-tempo, directly into the start of Lorde's Green Light, which is also moody but builds energy.
Speaker 1 (09:35):
How does it manage that blend?
Speaker 2 (09:37):
The AI identifies that they share a similar key, and it sees the energy level rising in the Lorde track, so it syncs the beats just before the transition, making it feel natural, not abrupt.
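[Editor's note: to make the idea concrete, here is a hypothetical Swift sketch of the kind of transition scoring described, comparing tempo, key, and energy between two tracks. Apple has not published AutoMix's internals; the feature names, weights, and example numbers below are invented for illustration.]

```swift
import Foundation

// Hypothetical transition scoring, not Apple's AutoMix model.
struct TrackFeatures {
    let tempoBPM: Double
    let keyPitchClass: Int  // 0-11, where C = 0
    let energy: Double      // normalized 0...1
}

func transitionScore(from a: TrackFeatures, to b: TrackFeatures) -> Double {
    // Tempo: small BPM gaps blend smoothly; cap the penalty at 1.
    let tempoPenalty = min(abs(a.tempoBPM - b.tempoBPM) / 40.0, 1.0)

    // Key: distance on the circle of fifths; neighboring keys mix harmonically.
    let posA = (a.keyPitchClass * 7) % 12
    let posB = (b.keyPitchClass * 7) % 12
    let rawDistance = abs(posA - posB)
    let keyPenalty = Double(min(rawDistance, 12 - rawDistance)) / 6.0

    // Energy: a gentle rise or fall feels natural; a sudden cliff does not.
    let energyPenalty = abs(a.energy - b.energy)

    // Higher score = smoother candidate transition (weights are invented).
    return 1.0 - (0.4 * tempoPenalty + 0.35 * keyPenalty + 0.25 * energyPenalty)
}

// Placeholder feature values for the kind of pair discussed above.
let ambientCloser = TrackFeatures(tempoBPM: 105, keyPitchClass: 2, energy: 0.35)
let moodyBuilder  = TrackFeatures(tempoBPM: 108, keyPitchClass: 2, energy: 0.55)
print(transitionScore(from: ambientCloser, to: moodyBuilder)) // close keys and tempos score high
```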
Speaker 1 (09:48):
Okay, you stressed the on-device machine learning part. That sounds computationally heavy for real-time mixing. Why is it so important that it happens locally on the phone?
Speaker 2 (09:56):
This is a really big deal, both technically and philosophically for Apple. Doing it on device means two crucial things. First, privacy. Your detailed listening habits, the deep analysis of your music, it never leaves your
Speaker 1 (10:08):
phone. Right, no data going to the cloud for this? Exactly.
Speaker 2 (10:12):
And second, no cloud dependency. AutoMix works perfectly when you're offline. On a plane, in a subway tunnel, just listening to downloaded tracks, doesn't matter. It still works.
Speaker 1 (10:21):
Plus no network latency, I guess. Zero latency.
Speaker 2 (10:24):
It happens instantly on the chip. That's a massive advantage
over streaming competitors, whose smart features often need an Internet connection.
Speaker 1 (10:32):
It's enabled by default, you said, but can you turn
it off or tweak it?
Speaker 2 (10:36):
Yep, enabled by default, but easy to toggle off. Just tap the three-dot menu in the Now Playing screen. And if you like crossfades but want more control, they've actually enhanced the manual crossfade feature too. You can now set fades up to twelve seconds
Speaker 1 (10:49):
long. Twelve seconds? That's a long blend.
Speaker 2 (10:51):
It is, and AutoMix apparently still works underneath that manual setting, ensuring even that long fade sounds harmonically good.
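[Editor's note: the manual crossfade concept can be illustrated with public AVFoundation APIs. A minimal sketch, assuming two locally loaded files and a user-chosen fade of up to twelve seconds; Apple Music's own crossfade and AutoMix pipeline are not exposed to developers.]

```swift
import AVFoundation

// A minimal crossfade between two AVAudioPlayers; illustrative only.
func crossfade(from outgoing: AVAudioPlayer,
               to incoming: AVAudioPlayer,
               duration: TimeInterval = 12) {
    incoming.volume = 0
    incoming.play()
    // setVolume(_:fadeDuration:) ramps the volume over the given time window.
    outgoing.setVolume(0, fadeDuration: duration)
    incoming.setVolume(1, fadeDuration: duration)
    // Stop the outgoing player once its fade-out has finished.
    DispatchQueue.main.asyncAfter(deadline: .now() + duration) {
        outgoing.stop()
    }
}
```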
Speaker 1 (10:58):
Okay, so we have this AI DJ. Sounds smart. But how smart is it? Does it handle everything well, or does it struggle with certain types of music?
Speaker 2 (11:07):
Ah, that's the critical question, isn't it? And the honest answer is it's not perfect yet. It excels with music that has a pretty predictable structure, I think. Electronic music, mainstream pop.
Speaker 1 (11:19):
Hip hop, stuff with a steady beat, consistent key.
Speaker 2 (11:22):
Exactly. Blending, say, The Weeknd into Drake, that's easy for the AI. Their song structures are similar enough. Where it tends to falter is with complexity and big dynamic shifts.
Speaker 1 (11:34):
Like classical music with quiet parts and then sudden loud sections.
Speaker 2 (11:38):
Classical is a tough one for it, yeah. Or think about progressive metal, where tempos and time signatures can change wildly within one song. The AI struggles to predict those non-standard structures, so
Speaker 1 (11:49):
It might make awkward transitions sometimes it can.
Speaker 2 (11:52):
There was an anecdote from Reddit quoted in the sources that summed it up pretty well. It's like having a DJ who sometimes nails the drop but occasionally skips the build-up entirely. Still useful, especially for casual listening or parties, but not flawless.
Speaker 1 (12:05):
But hang on, if they're marketing this as an intelligent DJ,
shouldn't we expect better? Or is this just typical for
a version one point zero algorithm trying to handle the
sheer variety of music.
Speaker 2 (12:14):
I think we should expect it to get better, definitely, but mixing everything perfectly across a one-hundred-million-song library, that's a huge challenge. Apple seems to be treating this initial release as a foundation.
Speaker 1 (12:27):
And they're trying to improve it.
Speaker 2 (12:29):
How? By getting artists involved. Through the Apple Music for Artists platform, artists can now upload their tracks with stems or even just specific beat markers.
Speaker 1 (12:37):
So they're basically telling the AI, here's a good place
to mix out of my song.
Speaker 2 (12:41):
Essentially, yeah. It teaches the AI the intended transition points. That should dramatically improve how well AutoMix handles those specific tracks.
Speaker 1 (12:50):
Artists training the AI, interesting. What's the near-term outlook?
Any hints of improvements coming soon?
Speaker 2 (12:56):
Apple's definitely promising rapid updates, and we're already seeing hints in the iOS twenty six point one beta, things like genre-specific tuning, so the
Speaker 1 (13:05):
AI might learn different rules for mixing pop versus, say, ambient music.
Speaker 2 (13:09):
Potentially, yeah, and the ultimate goal seems to be customizable mix modes. Imagine telling Apple Music, give me high-energy transitions for my workout, or slow, smooth fades for chilling out.
Speaker 1 (13:21):
Okay, that sounds powerful, personalized adaptive mixing on the fly.
Speaker 2 (13:25):
That's the potential. Yeah, huge potential there. All right, so
Speaker 1 (13:29):
From blending tracks seamlessly, let's switch to breaking down language barriers.
This feels really relevant with global music, K-pop, J-pop, Latin hits dominating the charts. Totally.
Speaker 2 (13:41):
This is Apple positioning music as a cultural bridge. It comes through two features, lyrics translation and pronunciation mode, and again it's powered by Apple Intelligence running on-device translation models.
Speaker 1 (13:53):
How does it work visually in the app, and crucially, does it avoid that robotic Google Translate feel? You know, where the poetry gets...
Speaker 2 (14:00):
So when you open the synced lyrics view for a supported song, you'll see the translation appear right next to the original lyric, line by line. Side by side is key for context. And the quality? Apple claims they specifically trained their models to focus on preserving poetic intent and cultural nuances. They want the metaphors, the idioms, the emotion to come through, not just a literal word swap. The
(14:21):
goal is lyrics that feel like they were written, not
just processed.
Speaker 1 (14:25):
That's ambitious. Which languages are they confident enough to launch with?
Speaker 2 (14:28):
The initial list is English, Spanish, French, Japanese, Korean, and Simplified Chinese. They say the full list is in Settings and will expand over time.
Speaker 1 (14:38):
Okay. And the pronunciation mode, that sounds like it goes beyond just understanding into participation.
Speaker 2 (14:44):
Yeah, this is really cool. It overlays phonetic guides directly onto the lyrics. So for Korean you might see romanized Hangul, for Chinese maybe pinyin, letting
Speaker 1 (14:55):
You actually try to sing along, even if you don't
speak the language. Exactly.
Speaker 2 (14:58):
It lowers the barrier to accurately singing along. It turns Apple Music from just a player into this kind of stealth language-learning tool. It encourages active engagement, which feels very Apple.
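[Editor's note: the layout described, original lyric with optional translation and phonetic guide per line, can be mirrored with a small SwiftUI sketch. The line-level model here is hypothetical; Apple's actual lyrics format and rendering are internal.]

```swift
import SwiftUI

// Hypothetical line-level lyrics model; not Apple's real format.
struct LyricLine: Identifiable {
    let id = UUID()
    let original: String
    let translation: String?
    let phonetic: String?   // e.g. romanized Hangul or pinyin, when available
}

struct SyncedLyricsView: View {
    let lines: [LyricLine]
    let currentIndex: Int   // index of the line currently being sung

    var body: some View {
        ScrollView {
            VStack(alignment: .leading, spacing: 16) {
                ForEach(Array(lines.enumerated()), id: \.element.id) { index, line in
                    VStack(alignment: .leading, spacing: 2) {
                        Text(line.original).font(.headline)
                        if let phonetic = line.phonetic {
                            Text(phonetic).font(.caption).italic() // pronunciation guide
                        }
                        if let translation = line.translation {
                            Text(translation).font(.subheadline)
                        }
                    }
                    // Dim every line except the one currently playing.
                    .opacity(index == currentIndex ? 1.0 : 0.5)
                }
            }
            .padding()
        }
    }
}
```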
Speaker 1 (15:11):
Is there evidence people are actually using the pronunciation guide?
Does it work culturally?
Speaker 2 (15:15):
The early feedback, especially from international music fans, seems really positive. People saying these guides, which you used to only find in dedicated language apps, make them feel more confident trying to sing along. Deeper engagement, basically. Makes sense.
Speaker 1 (15:28):
What are the catches? Is this available for every song?
Speaker 2 (15:31):
No, that's the main limitation right now. It's listed as select songs. Just like the artist animations, it needs the labels and artists to submit clean, verified lyrics through that Apple Music for Artists
Speaker 1 (15:43):
portal. So adoption depends on the industry playing ball, right?
Speaker 2 (15:47):
But the good news is big players like Universal and
Sony seem to be on board because Apple estimates it
already covers about seventy percent of the songs currently in
the global top charts.
Speaker 1 (15:57):
Seventy percent is pretty good coverage for a launch. Where does it struggle?
Speaker 2 (16:01):
Predictably, it has trouble with non-Roman scripts that aren't supported yet, like Arabic, the sources mentioned. It can struggle with accuracy on really slang-heavy rap or hip hop lyrics where meanings are very contextual. But beyond just listening, think about the ecosystem angle. You can share translated lyric clips straight to Messages. Imagine sharing a K-pop chorus with
(16:21):
your friend, instantly translated. Or
Speaker 1 (16:23):
using it in CarPlay for a road trip sing-along in another language. Totally.
Speaker 2 (16:27):
And for education, think about language teachers exporting translated lyrics
with notes to create study materials. It's got layers. And all of this...
Speaker 1 (16:36):
The AI DJ, the translation, the pronunciation, it's included in the standard subscription, no extra cost?
Speaker 2 (16:42):
No extra cost. It's a pretty significant value add for subscribers.
Speaker 1 (16:46):
Huge. Okay, but all the smart features in the world
don't help if you can't find your own music. For
people with big libraries, that library tab can be a nightmare.
Speaker 2 (16:54):
Yeah, anyone who's been collecting digital music for years knows
this pain. You scroll and scroll, the cognitive load of
just finding that one album you want.
Speaker 1 (17:02):
Exactly, it's library chaos. So iOS twenty six introduces a couple of organizational MVPs, as the sources called them, aimed squarely at heavy users drowning in songs. First up, Music...
Speaker 2 (17:16):
Pins. Pins, like pinning a chat conversation? How does that work for music?
Speaker 1 (17:20):
Pretty much the same idea. It lets you pick up to ten items, could be individual songs, albums, artists, or even specific playlists, and stick them right at the very top of your library.
Speaker 2 (17:28):
Ten items. That's not very many. Is that limit intentional? Very intentional, it seems. The idea is to force you to curate what's truly essential. If you could pin unlimited items, you'd just end up scrolling through pins. Ten keeps it focused, always visible, zero scrolling needed to reach
Speaker 1 (17:43):
them. And how do you use it?
Speaker 2 (17:45):
Just long press, simple as that. Long press the album, playlist, whatever, tap Pin, done. It appears in that top row. Dragging it off unpins it. Super intuitive.
Speaker 1 (17:55):
And this syncs, right? That feels like a key advantage
over just liking tracks on some other services.
Speaker 2 (18:00):
Absolutely, critical advantage. Spotify's pinning is more local. Apple's Music Pins sync instantly via iCloud across your iPhone, iPad, Mac. Pin your top ten workout albums on your phone, they're right there at the top on your Mac too.
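[Editor's note: a minimal sketch of what a ten-item, iCloud-synced pin list could look like, using the public NSUbiquitousKeyValueStore key-value sync. The cap, the storage key, and the PinnedItem type are assumptions for illustration; Apple Music's real pin syncing is internal to the service.]

```swift
import Foundation

// Hypothetical pinnable item; Apple's internal model is not public.
enum PinnedItem: Codable, Equatable {
    case song(id: String), album(id: String), artist(id: String), playlist(id: String)
}

struct PinStore {
    static let maxPins = 10                        // the curated cap discussed above
    private let key = "music.pins"                 // hypothetical storage key
    private let store = NSUbiquitousKeyValueStore.default

    func load() -> [PinnedItem] {
        guard let data = store.data(forKey: key),
              let pins = try? JSONDecoder().decode([PinnedItem].self, from: data)
        else { return [] }
        return pins
    }

    @discardableResult
    func pin(_ item: PinnedItem) -> Bool {
        var pins = load()
        // Enforce the ten-item limit and ignore duplicates.
        guard pins.count < Self.maxPins, !pins.contains(item) else { return false }
        pins.insert(item, at: 0)                   // newest pin lands at the front
        guard let data = try? JSONEncoder().encode(pins) else { return false }
        store.set(data, forKey: key)
        _ = store.synchronize()                    // ask iCloud to push the change
        return true
    }

    func unpin(_ item: PinnedItem) {
        let pins = load().filter { $0 != item }
        if let data = try? JSONEncoder().encode(pins) {
            store.set(data, forKey: key)
        }
    }
}
```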
Speaker 1 (18:13):
Consistent experience everywhere.
Speaker 2 (18:15):
And people are finding clever uses, like pinning a podcast episode and a music playlist. If you switch between them on your commute, it creates a little hybrid queue right at the top.
Speaker 1 (18:23):
Okay, pins help with quick access to the absolute essentials. But what about the person with, like, two hundred playlists, nineties rock, workout mixes, dinner party vibes? Pins alone won't solve that scrolling nightmare.
Speaker 2 (18:35):
Nope. For that level of chaos, you need structure, and that's where playlist folders come in. This was apparently the feature that got the biggest cheers from longtime power users.
Speaker 1 (18:45):
Ah, folders for playlists, finally, like on the desktop Music...
Speaker 2 (18:49):
app. Exactly like that. They finally brought that desktop-grade organization to iOS. It seems simple, but it's a massive change.
Speaker 1 (18:56):
So you can create a folder, name it and just
drag playlists into it.
Speaker 2 (19:00):
Tap the plus in Library, Playlists. New Folder, name it, say, Workout Mixes, then just drag all your relevant playlists into it.
Speaker 1 (19:07):
Can you nest folders, like a Genres folder, then Rock
inside that, then maybe nineties alternative inside that?
Speaker 2 (19:13):
You can, full nesting support. That's huge for really granular organization. It transforms that endless flat list of playlists into a proper, browsable filing system.
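[Editor's note: the nesting described maps naturally onto a recursive tree. A small Swift sketch with hypothetical names, just to model the folders-inside-folders structure.]

```swift
// Hypothetical recursive model of nested playlist folders.
enum LibraryNode {
    case playlist(name: String)
    case folder(name: String, children: [LibraryNode])
}

let library: LibraryNode = .folder(name: "Genres", children: [
    .folder(name: "Rock", children: [
        .folder(name: "Nineties Alternative", children: [
            .playlist(name: "Grunge Essentials")
        ]),
        .playlist(name: "Classic Rock Road Trip")
    ]),
    .playlist(name: "Workout Mixes")
])

// Walk the tree and print an indented view of the filing system.
func printTree(_ node: LibraryNode, depth: Int = 0) {
    let indent = String(repeating: "  ", count: depth)
    switch node {
    case .playlist(let name):
        print("\(indent)- \(name)")
    case .folder(let name, let children):
        print("\(indent)+ \(name)")
        children.forEach { printTree($0, depth: depth + 1) }
    }
}

printTree(library)
```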
Speaker 1 (19:23):
I can literally hear the sighs of relief from people
managing huge collections.
Speaker 2 (19:28):
It was genuine relief reported in the user feedback, yeah. And because it's iOS twenty six, even the folders look nice. They use the Liquid Glass aesthetic to show these little translucent collages of the covers of the playlists inside. Makes it visual and easy to scan.
Speaker 1 (19:42):
Nice touch. But we need to mention platform availability here. Does everyone get folders? Oh, good point. No?
Speaker 2 (19:49):
While the Android version of Apple Music, version five point zero, did get the new pins feature pretty quickly, playlist folders remain an iOS and macOS exclusive, at least for now. That's a significant organizational advantage kept within the Apple ecosystem.
Speaker 1 (20:05):
Interesting differentiator. Okay, so library tamed. Let's talk about making
the actual listening experience more immersive across all the different
places you might listen. Right.
Speaker 2 (20:15):
This is about making the music feel like it takes
over the whole experience, not just the audio channel.
Speaker 1 (20:19):
Starting with the lock screen and widgets. Yeah, big changes there.
You mentioned artwork extending.
Speaker 2 (20:25):
Yeah, it's much more dynamic when music is playing. If you tap the little mini player on the lock screen, it expands, and the Now Playing screen takes over full screen with that animated
Speaker 1 (20:35):
Artwork and the clock.
Speaker 2 (20:36):
Does it just disappear, No, it does something clever. It
dynamically resizes and shifts its position. It might tuck into
a corner or frame the artwork elegantly. The point is
the music's visual identity dominates the screen even when locked.
Speaker 1 (20:51):
And you mentioned earlier those three-D effects for certain albums, like the Beyoncé example. You see that on the lock screen too?
Speaker 2 (20:57):
Yep, if the album supports it, you get that subtle three-D parallax effect, that sense of depth. It stays visible even when the phone is locked and the screen dims for the always-on display. The artwork feels constantly alive.
Speaker 1 (21:10):
Always the question with visuals like this: battery life. How did they manage that?
Speaker 2 (21:14):
Again, it goes back to the newer hardware, especially the A seventeen Pro chip, and the efficiency of the GPU and the low-power display modes. These ambient animations are optimized to run without being a major power hog.
Speaker 1 (21:26):
Okay, and widgets, did they get an update to match
the new features?
Speaker 2 (21:30):
They did. You can now have medium and large widgets
dedicated just to your music pins, so those top ten
items are right there on your home screen for one
Speaker 1 (21:37):
tap playing. Quick access from the home screen.
Speaker 2 (21:39):
Nice. And there's also a new large live radio widget. If you listen to stations like Apple Music One, this widget streams it live and shows the dynamic Liquid Glass artwork in real time.
Speaker 1 (21:51):
Okay, let's move from the lock screen to the living room. This karaoke feature sounds potentially amazing, or maybe gimmicky. How does it actually work?
Speaker 2 (22:00):
It's actually a really smart integration between devices. Basically, iOS twenty six on your iPhone and the updated tvOS twenty six on your Apple TV work together. Your iPhone becomes the wireless microphone for Apple Music Sing, playing on the TV.
Speaker 1 (22:16):
So no need to buy a separate karaoke mic system
or anything.
Speaker 2 (22:19):
Nope, you just hold your iPhone and sing into its microphone. The connection uses a low-latency protocol, so your voice should sync perfectly with the lyrics scrolling on the big screen.
Speaker 1 (22:27):
And you mentioned emoji reactions? Yeah.
Speaker 2 (22:29):
Apparently you can send little emoji reactions from your iPhone
screen that pop up on the TV. Adds a bit
of fun party mode interaction.
Speaker 1 (22:35):
And it supports multiple singers.
Speaker 2 (22:37):
Up to four people can queue up. Yeah. But the
really interesting technical detail is the audio processing. It uses
spatial audio tech to process your.
Speaker 1 (22:45):
voice. How so? To make you sound better? Kind of.
Speaker 2 (22:48):
The sources described it as giving your voice booth-like vocals. It tries to simulate that clean, slightly isolated sound you get singing in a recording booth. Just elevates the whole amateur karaoke experience.
Speaker 1 (23:01):
Okay, that sounds more polished than just raw mic input.
Speaker 2 (23:04):
Interesting.
Speaker 1 (23:05):
Moving from fun to function, Voice Memos integration. How does that help musicians or creators?
Speaker 2 (23:10):
Voice Memos got a serious upgrade. There's a new recording mode called Studio Voice. Studio Voice? Yeah, it lets you capture ideas, singing, spoken word, instrument snippets, directly in spatial audio. The idea is it's a much higher fidelity capture than a standard voice memo. Pro grade, almost.
Speaker 1 (23:28):
Right on your phone. And then what, can you easily move that audio somewhere else?
Speaker 2 (23:32):
That's the key. These high-fidelity spatial audio recordings can be exported directly. You can send them to Logic Pro or GarageBand, or even just save them to Files to use in other production software. It bridges that gap between capturing a quick idea and actually working on it, streamlining the creative workflow. Okay. Lastly, in this section, Apple Music
(23:53):
Replay, that yearly summary of your listening habits. What's new? It used to be just a website, right? It
Speaker 1 (23:59):
Did, and it felt a bit disconnected. The huge change
is that replay is now fully native within the Apple
Music app itself.
Speaker 2 (24:07):
Ah, finally, where do you find it?
Speaker 1 (24:09):
Just scroll right down to the bottom of the Home tab.
It's there year round now, not just a thing that
appears in December.
Speaker 2 (24:15):
Being native must mean they can do more with it, right,
more data.
Speaker 1 (24:19):
Way more. You still get your top songs, artists, genres,
listening time, but now you can see that data broken
down month by month throughout the year. See how your
tastes changed.
Speaker 2 (24:28):
Okay, historical trends. That's cool. But you mentioned Apple Intelligence
playing a role here too.
Speaker 1 (24:33):
Yeah, this is the really unique part. It tracks what
Apple calls mood arcs.
Speaker 2 (24:38):
Mood arcs, huh? How on earth does it know my mood?
Am I supposed to tell it?
Speaker 1 (24:42):
No, it's inferring it. Highly sophisticated inference, apparently. It looks beyond just genre tags. It analyzes the acoustic properties of the music you listen to at certain times.
Speaker 2 (24:51):
Like tempo, key, instrumentation? Exactly, but also your listening behavior. Yeah, how long are your sessions? Do you skip tracks often?
Speaker 1 (24:59):
So?
Speaker 2 (24:59):
For example, if you consistently listened to slow instrumental minor
key music late at night and you listen all.
Speaker 1 (25:06):
The way through the tracks, it might infer that's a
calm or reflective period.
Speaker 2 (25:10):
Precisely, it's trying to piece together an emotional narrative of
your year through your music choices, and it presents these
insights as these really nicely designed shareable cards.
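[Editor's note: a hypothetical Swift sketch of the kind of mood-arc heuristic described, combining acoustic features with listening behavior. Apple has not documented Replay's model; every threshold, weight, and label below is invented.]

```swift
import Foundation

// Invented session summary; Apple's real inputs are not public.
struct ListeningSession {
    let averageTempoBPM: Double
    let minorKeyShare: Double   // fraction of tracks in a minor key, 0...1
    let skipRate: Double        // skipped tracks / tracks started, 0...1
    let hourOfDay: Int          // 0-23
}

enum Mood: String { case energetic, upbeat, reflective, restless }

func inferMood(_ s: ListeningSession) -> Mood {
    // Heavy skipping suggests searching for something rather than settling in.
    if s.skipRate > 0.5 { return .restless }
    // Slow, minor-key, late-night listening played through reads as reflective.
    if s.averageTempoBPM < 95 && s.minorKeyShare > 0.6 && s.hourOfDay >= 21 {
        return .reflective
    }
    // Fast tempos with committed listening read as energetic.
    if s.averageTempoBPM > 125 { return .energetic }
    return .upbeat
}

// The late-night example from the conversation, with placeholder numbers.
let lateNight = ListeningSession(averageTempoBPM: 78, minorKeyShare: 0.8,
                                 skipRate: 0.05, hourOfDay: 23)
print(inferMood(lateNight)) // reflective
```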
Speaker 1 (25:21):
Wow, that moves Replay from just stats to actual storytelling. Makes it feel much more personal.
Speaker 2 (25:27):
Okay, so we've covered a ton of new features within the Apple ecosystem, but Apple Music exists on Android too. It's important to see how this update plays out across platforms.
Speaker 1 (25:36):
Right, the competitive landscape. Did Android users get left out in the cold, or did they get some of these goodies too? When did their update land?
Speaker 2 (25:42):
They didn't have to wait long. The Android Apple Music update, version five point zero, rolled out around September twenty fourth, twenty twenty five, so just about nine days after iOS twenty six. Pretty quick turnaround.
Speaker 1 (25:54):
And what features made the cut for Android? Did they get AutoMix, Liquid Glass?
Speaker 2 (25:58):
Ah, no. AutoMix and the full Liquid Glass design language, those seem to be iOS exclusives, likely due to the deep hardware integration needed, especially the Neural Engine for AutoMix.
Speaker 1 (26:10):
So what did Android users get?
Speaker 2 (26:12):
They got the key organizational stuff, Music Pins are there, and the global features, lyrics translation and pronunciation mode, made it across, and the enhanced native Replay experience is on Android too.
Speaker 1 (26:23):
So functional parity on organization and the core translation and Replay features, but the advanced AI mixing and the signature look remain Apple only.
Speaker 2 (26:32):
That seems to be the strategy, yeah, keep some core advantages tied to the ecosystem. Though interestingly, Android did get the new pill-shaped buttons, so there's a visual nod towards the new design language, even without the full Liquid Glass effect.
Speaker 1 (26:45):
Okay, makes sense. Now, software development never sleeps. iOS twenty six just launched, but Apple already started beta testing the next update, twenty six point one. What hints did that give us about where things are heading immediately?
Speaker 2 (26:56):
Yeah. The twenty six point one beta popped up just about ten days after the main launch. The biggest user-facing change in that beta was swipe to skip in the mini player.
Speaker 1 (27:04):
Ah, like on Spotify, swiping the little player at the bottom to change tracks.
Speaker 2 (27:09):
Exactly like that. Swipe left or right on the track title in the mini player to go next or previous. It's a small thing, but something users coming from other platforms really missed. Much quicker than tapping.
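[Editor's note: the swipe-to-skip gesture can be sketched with a simple drag threshold in SwiftUI. The real gesture handling in Music is not public; this only shows the interaction pattern.]

```swift
import SwiftUI

// Illustrative mini player with a horizontal swipe-to-skip gesture.
struct MiniPlayer: View {
    let title: String
    var onNext: () -> Void
    var onPrevious: () -> Void

    var body: some View {
        HStack {
            Text(title).lineLimit(1)
            Spacer()
            Image(systemName: "play.fill")
        }
        .padding()
        .background(.ultraThinMaterial, in: Capsule())
        .gesture(
            DragGesture(minimumDistance: 30).onEnded { value in
                // Swipe left advances to the next track; swipe right goes back.
                if value.translation.width < -30 {
                    onNext()
                } else if value.translation.width > 30 {
                    onPrevious()
                }
            }
        )
    }
}
```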
Speaker 1 (27:21):
Definitely reduces taps when you're busy doing something else. Anything else significant in twenty six point one?
Speaker 2 (27:27):
Some continued interface refinement, making that collapsed tab bar integrate even more smoothly with the mini player, just tightening up the navigation flow.
Speaker 1 (27:36):
Any bigger strategic hints in the beta, anything that suggests
future directions.
Speaker 2 (27:42):
Well, some code digging in the beta suggested Apple might be laying the groundwork for third-party smartwatch support for music
Speaker 1 (27:49):
control. Whoa, really? Moving beyond just Apple Watch exclusivity for controls?
Speaker 2 (27:54):
That's the speculation. It would be a pretty big shift if they open that up, suggesting maybe a slightly more platform-agnostic approach to accessories down the line. We'll have to see, but expect those twenty six point one features, like swipe to skip, to roll out publicly, probably by early October.
Speaker 1 (28:09):
Okay, exciting possibilities. Now, we've talked up a lot of positives,
but let's bring it back to reality. What are the
potential downsides or things listeners should be aware of before
they jump into iOS twenty six and the new.
Speaker 2 (28:21):
Apple Music? Right, always good to have a balanced view. Three main caveats come to mind from the sources. First, AutoMix. While cool, remember it can glitch, especially if you have really diverse, eclectic playlists. Don't expect flawless DJing across wildly different genres, at least not yet.
Speaker 1 (28:39):
Okay, manage expectations on AutoMix. What else?
Speaker 2 (28:42):
Second, Liquid Glass legibility. Even with the contrast boost, some people, particularly those with older eyes or visual sensitivities, might still find it a bit hard to read text over complex artwork. Be ready to dive into those accessibility settings and maybe switch the intensity to Subtle.
Speaker 1 (28:58):
Good practical tip. And the third caveat, Music Pins?
Speaker 2 (29:02):
Those super useful pinned items at the top of your library, they require an active Apple Music subscription. If you're just using the app to play music you bought years ago via the old iCloud Music Library match, without a current streaming subscription, pins won't work for you.
Speaker 1 (29:15):
Ah, so it's tied to the subscription service, not just the library itself. And the standard advice?
Speaker 2 (29:21):
Before any major OS update like this, back up your device first and make sure your phone is charged to at least fifty percent before you start the update process. Basic stuff, but crucial. Solid advice.
Speaker 1 (29:33):
Okay, let's try and wrap this up. If you had to synthesize the core identity of this massive iOS twenty six shift for Apple Music, what is it? What's the
big picture?
Speaker 2 (29:43):
I think the synthesis is that iOS twenty six really cements Apple Music as, like the sources said, the ecosystem's soul. It's moved beyond just being a utility for playing songs. It's blending serious AI smarts, the AutoMix DJing, the instant global translations, with really thoughtful, human-centric design, the Liquid Glass aesthetic, the organizational tools like pins and folders,
(30:05):
the deep integration with things like Voice Memos and karaoke. It all ties together.
Speaker 1 (30:09):
So it's becoming more about storytelling, connection, curation, not just streaming access. Exactly.
Speaker 2 (30:14):
It feels like a platform focused on cultural connection, personalized curation, and even enabling creativity. And with the success of the on-device AI and the visual foundation of Liquid Glass, this feels like just the beginning.
Speaker 1 (30:27):
What's next, then? What is this setting the stage for?
Speaker 2 (30:29):
Well, the fact that betas are already hinting at things like spatial remixing tools suggests the future is about even deeper interactivity, more creative control for the listener, perhaps. Fascinating.
Speaker 1 (30:40):
So a final thought for everyone listening. If your phone's
AI can now seamlessly DJ your life's soundtrack and instantly
translate global hits with poetic nuance, what barriers are actually
left in music? How close are we to a future
where literally any piece of music is instantly understandable, accessible,
maybe even remixable by anyone.
Speaker 2 (31:01):
It's a profound thought. For now, dive in, pin your absolute favorites, maybe try singing along to something new with
the translation help, and just let the intelligent music flow.
Speaker 1 (31:09):
Great stuff. Thanks for diving deep with us today into
the huge changes with Apple Music and iOS twenty six.
We'll catch you on the next deep dive.