Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:07):
Voices of Video.
Voices of Video.
Voices of Video.
Voices of Video.
Speaker 2 (00:17):
Welcome to Voices of
Video.
I'm Jan Ozer.
This is where we explore critical streaming-related topics with experts in the field.
If you're watching and have questions, please post them as a comment on whichever platform you're watching, and we'll answer live if time permits.
Today's episode is all about distribution at scale, and Alex
(00:38):
Zambelli, who's Technical Product Manager for Video Platforms at Warner Bros. Discovery, is our guest.
I've known Alex at least 15 years, going back to his history with Microsoft, and we'll start there, where he was a codec evangelist and a producer of events like the Olympics and NFL football, and we'll hear about some of the experiences there
(01:02):
and then we'll walk through the various points in his career that took him to Warner Brothers.
There are a lot of stops that are worth chatting about. And then, you know, I'm known, I think, as a codec theorist, right? I do a lot of testing and I render conclusions, and that's useful in a lot of ways, or at least I hope it is, but it's not real world, and Alex just has a ton of real
(01:23):
world experience that he's going to share with us today.
Things as high level as where the industry needs to go to make it simpler for publishers like Warner Brothers to focus on content as opposed to
(01:46):
compatibility, and issues as deep-diving as, you know, what's his percentage of VBR, you know, is it 200% constrained VBR, 300% constrained VBR?
And, particular to what I'm interested in, when does a company like Warner Brothers look at adopting a new codec? And I think Alex is going to talk about the decision that they're in the process of making, which is, you know, whether to
(02:06):
integrate AV1.
So Alex just has a ton of real-world experience in live event production at huge scales, as well as, you know, premium content encoding and delivery, you know, with some of the biggest names in the industry.
So I'm way excited to have Alex joining us today.
Alex, thanks for being here.
Speaker 1 (02:26):
Jan, thanks so much for having me. Real pleasure, and I'm looking forward to the next hour talking to you.
Speaker 2 (02:33):
Yeah, we don't get a chance to do this that often.
Let's dive in.
I'm not intimately familiar with your CV.
Did you start in streaming at Microsoft, or was there a stop before that?
Speaker 1 (02:46):
I did start my career at Microsoft, so that was my very first job out of college, actually.
So this was back in 2002, and I started out as a software tester.
So I started as a software test engineer in Windows Media Player, and I worked on both Windows Media Player and then
(03:08):
the codec team at Microsoft as a software tester for about five years.
It was during that second phase of my software testing role there, working on the codecs, where I started working with the VC-1 codec, which at the time was a new codec for Microsoft in the sense that it was the first codec that Microsoft had
(03:30):
standardized.
So there was a codec called Windows Media Video 9, WMV9, and Microsoft took that through SMPTE to get it standardized, and so that became VC-1.
Some folks may recall that that was basically one of the required codecs for both HD DVD and Blu-ray at the time, and so
(03:54):
that's what put it on the map.
During that time, when I was testing the VC-1 encoder, I started interacting a lot with Microsoft's external customers and partners.
That then transitioned me into my next job at Microsoft, which was technical evangelism.
(04:15):
I ended up doing technical evangelism for VC-1 for a few years, and then my scope broadened to include really all Microsoft media technologies that were available at the time and could be used for building large online streaming solutions.
(04:36):
And so when I started at Microsoft, you know, working in digital media, I mean, in 2002 it was still, you know, mostly dominated by physical media.
So we're still talking about CDs, DVDs, you know, Blu-rays.
By the time, you know, I transitioned into this technical evangelism job, which was around 2007 or so, streaming was
(04:59):
really starting to pick up steam, and so from that point on, really, you know, until this day, my career has been focused on streaming, because that has become the dominant method of distribution for digital media. And so I mentioned that, starting around 2007 or so, I started doing technical
(05:21):
evangelism for a whole bunch of different Microsoft media technologies.
At the time, Silverlight was a technology Microsoft was developing that was comparable to Flash.
It was seen as a solution for building rich web pages, because everything was still primarily online through websites and
(05:43):
browsers at the time.
Mobile applications hadn't even started picking up yet.
Really, the primary way of delivering streaming media at the time was through the browser, and this is where Silverlight came in.
It was a plugin that allowed both rich web experiences to be built but also really great premium media experiences as
(06:05):
well, and so that included even things like digital rights management, so using PlayReady DRM to protect the content, and so on.
Speaker 2 (06:14):
How did that transition to actual production and your work at the Olympics and with the NFL?
Speaker 1 (06:19):
Yeah, so at the time, Microsoft was partnering with NBC Sports on several projects.
The first one that I was involved with was the 2008 Olympics in Beijing, and so NBC Sports had the broadcast rights to the Olympics, and still does.
They wanted to basically put all of the Olympics content
(06:56):
online for essentially any NBC Sports subscriber to be able to access, and that was, I think, a first. Before that, you really had to wait for it to be broadcast on either your local NBC station or one of the cable channels, and so if it wasn't broadcast in live linear, you could never see it.
(07:16):
It wasn't available, and so NBC Sports had the idea to put all of that content online.
The very first version of the NBC Olympics site that we built in 2008 was still using Windows Media for live streaming, but was starting to use Silverlight and what, at the time, was actually
(07:38):
the very first prototype implementation of adaptive streaming at Microsoft to do on demand.
And then the next project we did with NBC Sports, in 2009, was supporting Sunday Night Football, and for that we built a fully adaptive streaming-based website.
So that was the origin of Microsoft's Smooth Streaming
(08:01):
technology.
So Microsoft had taken that prototype that was built during the 2008 Olympics and essentially productized that into Smooth Streaming.
We had live streams in HD, which was again a breakthrough at the time, to be able to do HD at scale.
(08:22):
Now we take it for granted, but in 2009 that was really seen as a big deal.
And then the 2010 Vancouver Olympics, that's when we really went, you know, full-on Smooth Streaming.
Everything was basically available on demand and live in
(08:43):
Smooth Streaming. So, yeah, those are some really, I would say, groundbreaking events that we did.
We ended up being nominated for a few Sports Emmys, technical Emmys, at the time. I don't remember which years we won or didn't win, but yeah, I think it was recognized by the industry as also pushing
(09:05):
the envelope.
Speaker 2 (09:06):
I'm remembering, and I don't want to mix you up with another technology, but I'm remembering either Monday or Sunday Night Football with a player that had four different views that you could kind of page through.
Was that you guys?
Speaker 1 (09:19):
That was us.
Yeah, that was us.
Yep, that was Sunday Night Football.
So, yeah, basically you could watch multiple camera angles simultaneously, and one of the cool things about that is that we used Smooth Streaming to do that, where it was actually a single manifest that had all four camera angles in the same manifest, and so switching between the camera angles was
(09:42):
completely seamless, because it was similar to switching bitrates the way you do in DASH or HLS today.
So it was a very cool solution that, actually, I don't think we've even rebuilt since then.
It was a feature that we developed in 2009 and then sort of lost to history.
Speaker 2 (10:00):
Did you actually go to the Olympics, or were you working back in the plumbing in Redmond?
Speaker 1 (10:06):
We were on the backend side of it, so I did get a chance to go to one Olympic event at the Vancouver Olympics, since they were close to Seattle, where I live.
But other than that, we spent most of those projects in windowless rooms and data centers, mostly in Redmond.
(10:28):
Some time in Las Vegas, because we were working closely with iStream Planet at the time as well, who were based out of Las Vegas.
We spent a lot of time in New York as well, at 30 Rock, because NBC Sports was still at the 30 Rock location at the time.
So, yeah, it was a fun time.
Speaker 2 (10:45):
What were the big takeaways?
You know, if you met somebody on a plane and they ask, gosh, I'm doing a live streaming event that's huge. What did you learn from the Olympics?
What are the high-level things that you took away from that, that you've implemented throughout your career?
Speaker 1 (11:01):
One of the perhaps obvious takeaways was that, you know, live streaming is hard in that it's not on demand. Like, everything you know about on-demand streaming, you kind of have to throw that out the window when you start working on live streaming, because you're dealing with very different issues.
You're dealing with real-time issues.
(11:22):
So even something as simple as packets getting lost on the way from your origin encoder to your distribution encoder, and dealing with packet loss, and then dealing with segment loss on the publishing side and figuring out how do you handle that, and handling blackouts and ad insertions, and so
(11:47):
everything's under a lot more pressure.
Right, because, you know, if you're doing on-demand streaming and there's something wrong with the content, if there's something wrong with, you know, the origin or any part of your delivery chain, you sort of have a little bit of leeway in that. Like, you know, you've got time to address it.
Hopefully you'll address it very quickly.
(12:08):
But if the content goes down for a few hours, it's fine. People will come back later, whereas with live you don't have that luxury.
You really have to be on top of it.
My memory of it is that every time we were doing these events, it was all hands on deck.
I mean, we had everyone from Microsoft to NBC to Akamai to
(12:30):
iStream Planet, like all the different companies that were involved in these projects.
We would just have everyone on calls, ready to go fix whatever needs to be fixed in real time, because that was the nature of it.
So that was a big learning lesson there, that live is not on demand.
You have to really give it a lot more focus, give it a lot
(12:53):
more attention than you would necessarily give to on demand.
Speaker 2 (12:57):
Does live ever get easy?
I mean, even events like what we're doing today, it seems like there's always something that breaks, or there's always the potential for it.
You never feel comfortable with it.
Speaker 1 (13:07):
I think that's a great way to describe it.
Like, you're just never comfortable, because, yeah, something could go wrong, and then you can't just say, well, you know, we'll fix it sometime in the next 24 hours. You have to fix it right now, right?
And so it's like, yeah, if our Zoom link went down right now, we'd be in trouble, right?
Speaker 2 (13:29):
No backup for that.
So you jumped from the frying pan into the fire.
I think your next stop was iStream Planet, where you were doing live events all the time.
Speaker 1 (13:38):
So tell us about that.
At the very end of 2012, I left Microsoft and I joined iStream Planet.
And iStream Planet, for those not familiar with the company, was a startup out of Las Vegas started by Mio Babic, and they, you know, built a reputation for themselves as being a premium live event streaming provider, and at
(14:04):
the time, they wanted to get into live linear and they wanted to also start building their own technology.
And so 2012 was when Mio started a software engineering team in Redmond, and so the next year I joined that software engineering team, and what I worked on was the very first
(14:26):
live encoder that was built in-house at iStream Planet.
One of the ideas at the time was to build it all on commodity hardware. Again, something that we now take for granted, because now we're accustomed to things running in the cloud, and so we assume that, yeah, of course you can go spin up a live encoder in the cloud and it's running on just, you know,
(14:46):
commodity hardware that's there.
But in 2012, 2013, that was not the case.
Right, it was mostly hardware-based encoders that you had to, you know, actually put in a data center and maintain, and so the idea that Mio had was, let's run it on commodity hardware, let's build a cloud-based live encoder. And so I worked on that
(15:10):
product for about four and a half years, and in 2015, if my memory serves me correctly, I think it was 2015 or 2016, iStream Planet got acquired by Turner, and Turner was part of WarnerMedia, and so iStream Planet became a subsidiary
(15:30):
of WarnerMedia, and so that was a pretty nice ending to that story as well.
Speaker 2 (15:36):
Real briefly, if you can.
So we had Silverlight here, then we had Flash here, and somehow we ended up with both of those going away, and I guess it was the whole HTML5 thing.
And that brought HLS, and Smooth is in there.
But when did you transition from VP or VC-1 to H.264?
And how did that work?
Speaker 1 (15:57):
When Silverlight launched, originally the only video codec that it could support was VC-1.
And then I think it was the third or fourth version of Silverlight where H.264 support was added, and I think Flash added it around the same time.
I think it was literally like one month after the other.
So the challenge with basically building any streaming solution
(16:18):
in HTML around that time, so kind of again going back to the 2007, 2008 timeframe, the challenge was that HTML was just not ready.
There were basically no APIs in HTML that would allow you to do streaming with the level of control that was needed.
(16:38):
And so, you know, there were some workarounds.
For example, Apple, when they came out with HLS as their streaming protocol, baked it into the Safari browser, and so if you used the video tag in HTML in Safari, you could
(17:00):
basically just point it at an M3U8 playlist and it would just work.
But that was, you know, the exception rather than the rule.
I mean, most other browser implementations, whether it was, you know, Chrome or Firefox or Internet Explorer at the time, did not do that, and so there was this kind of challenge of, well,
(17:22):
how do you stream?
And so what basically Flash and Silverlight, I think, brought to the table at that time was an opportunity to really kind of leapfrog HTML, to basically take the technology, even if it, you know, was a proprietary plugin, and advance it to a point where it was usable.
(17:43):
And so one of the innovations that Silverlight brought was the concept of a media stream source, which today now exists in HTML.
When you go build a solution in HTML today that's a streaming solution, you're using the Media Source Extensions and the Encrypted Media Extensions portions of the HTML spec.
(18:05):
At the time that was not yet in HTML5.
So Silverlight had that approach of, well, we're not going to bake any particular streaming protocol into the plugin.
We're going to basically open up an API that allows you to go handle your own downloading of segments and parsing of segments, and then you essentially just pass those video and audio
(18:28):
streams into a media buffer, and then the plugin goes and decodes and renders that and handles the rest.
And then another crucial part, I think, of what Silverlight brought to the table was DRM, because that was something that, again, HTML just didn't have a good solution for, content protection.
The reality of the industry that we work in is that if you
(18:53):
want to provide premium content to audiences, you have to protect it.
Generally, content owners, studios, will not let you go stream their content just in the clear, and so it was a big deal that Silverlight could both enable streaming but also enable
(19:14):
content protection of the content.
And then Flash ended up doing the same with Flash DRM, Adobe DRM, as well.
And so I think it was around 2011, 2012, if I remember, where both Silverlight and Flash kind of went away and were replaced by HTML, and it
(19:36):
was because by that point HTML had matured enough where that was feasible.
And there were still some growing pains there.
I remember there was a period where it was kind of like we were neither here nor there, but I would say by like 2014, 2015, HTML5 had all the needed APIs to enable basic stuff like
(19:57):
implementing DASH and HLS and streaming in the browser and protecting it with DRM.
So that's where we are today.
Yeah, it took a while to get there.
Speaker 2 (20:09):
Real quickly, what do you do at WarnerMedia?
So I'm hearing, were you a programmer or were you a live video producer?
You started in testing.
So what's your skill set?
Speaker 1 (20:21):
So I mentioned that earlier, when I started my career, I started in engineering and then transitioned to technical evangelism.
By the time that I moved over to iStream Planet, my job at that point became product management, and so I've been a product manager since then, so for the past 10 years.
So after iStream Planet, I went to Hulu and I was a product
(20:44):
manager for the video platform at Hulu for five years.
And then my most recent job, so for the past two years, I've been at Warner Bros. Discovery, also product managing the video platform here as well.
What my responsibilities are as a product manager is, I focus on the video platform itself, and I focus specifically
(21:08):
today mostly on transcoding and packaging.
So for the most recent launch of Max, which is the new service that combines Discovery Plus and HBO Max, which just launched last week, I was the product manager for the VOD transcoding and
(21:28):
packaging platform there.
That involved essentially defining the requirements of what different codecs and formats we need to support, what the workflows should look like, how we get content in from the media supply chain, what are all the different permutations of formats we need to produce,
(21:50):
you know, what kind of signaling needs to be in the manifest so that players would be able to distinguish between HDR and SDR.
So all those types of technical details, those are part of my job.
Speaker 2 (22:03):
Let's do a speed round of some technical encoding issues.
Your answers will appear in my next book.
Where are you on encoding cost versus quality?
And that would translate to, are you using the placebo or the very slow preset?
And I don't know if you use x264, but do you use that to
(22:25):
get the best possible quality for the bitrate, irrespective of encoding cost, or do you do something kind of in the middle?
I'm sure you're not in the ultrafast category, but real quick, where are you in that analysis?
Speaker 1 (22:39):
So, yeah, we currently do use x264 and x265 for the VOD transcoding at Warner Bros. Discovery.
So we typically use either the slow or slower presets for those encoders, though one of the things we have been discussing
(23:01):
recently is that we perhaps shouldn't necessarily use the same preset across all bitrates or even across all content.
And so that's an idea that we've been exploring where, you know, if you look at your typical encoding ladder, right, you've got, let's say, 1080p or 2160p at the top, but, you know, at the bottom of your
(23:23):
ladder you'll have, you know, 320 by 180, you might have a 640 by 360, right?
And so then the question becomes, well, why use the same preset for both those resolutions?
Right, because, you know, x264 very slow is going to take a lot less time on your 640 by 360 resolution than on your, you
(23:44):
know, 1080p resolution.
And so that's one of the ideas that we've been looking at is, okay, we should probably apply different presets for different resolutions, different complexities.
And then not all content is necessarily the same, in the sense that it's not equally complex, right?
So perhaps not everything requires the very slow preset.
(24:06):
And then not all content is equally popular.
If there's a particular piece of content that's watched by 20 million viewers versus something that's watched by 10,000 viewers, the one that's watched by 20 million probably should get the more complex preset, the slower preset, because whatever extra compute you spend on that is going to be worth it,
(24:28):
because it will hopefully translate to some CDN savings on the other side.
And so, yeah, hopefully that answers your question.
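To make that idea concrete, here is a minimal sketch of how a transcoding pipeline might pick an x264/x265 preset per rendition based on resolution and expected audience. The tiers and thresholds are hypothetical assumptions for illustration, not WBD's actual logic.

```python
# Hypothetical preset selection: slower presets for higher resolutions
# and more popular titles, faster presets elsewhere.
def pick_preset(height: int, expected_viewers: int) -> str:
    """Return an x264/x265 preset name for one ladder rung."""
    popular = expected_viewers >= 1_000_000  # assumed popularity threshold
    if height >= 1080:
        return "veryslow" if popular else "slower"
    if height >= 540:
        return "slower" if popular else "slow"
    # Low resolutions encode quickly anyway, so a slow preset is cheap,
    # but "medium" may be good enough for rarely watched titles.
    return "slow" if popular else "medium"

if __name__ == "__main__":
    for height in (180, 360, 540, 720, 1080, 2160):
        print(height, pick_preset(height, expected_viewers=20_000_000))
```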
Speaker 2 (24:37):
Question: you talked about x265, that's HEVC.
When did you add that and why?
Or were you even there?
Did Warner add it before you got there?
Speaker 1 (24:46):
Yeah, so HBO Max had already been using HEVC, and so we, you know, obviously continued using it for Max as well.
On the Discovery Plus side, we had been using HEVC for some 4K content, but there wasn't a lot of it, and so it was really mostly all H.264 on the Discovery Plus side.
(25:07):
But with Max we are using, obviously, H.264 still, and we are using HEVC as well for both SDR and HDR content.
And so right now, for example, if you go play something on Max, on most devices it's actually going to play back in HEVC.
(25:29):
So even if it's SDR, it will be 10-bit HEVC, and then, obviously, if it's HDR, it will definitely be HEVC.
Speaker 2 (25:40):
How many encoding ladders do you have for a typical piece of content?
Speaker 1 (25:44):
So when you say how many encoding ladders, do you mean sort of like different variations of encoding ladders, or do you mean like steps within the ladder?
Speaker 2 (25:53):
Different variations
of encoding ladders.
Speaker 1 (25:55):
Literally looking at the spreadsheet right now, and I think it's about six or eight different variations right now.
And so what we've tried to do is build an encoding ladder where, depending on the source resolution, we don't have to necessarily have different permutations of the ladders, and
(26:16):
so we sort of have a UHD ladder where, depending on what the source resolution is, that determines where you stop in that ladder, but it doesn't change the ladder itself.
Where the permutations come in is things like frame rates.
So if the source is 25p or 30p or 24p, that's going to go and
(26:39):
use a different ladder than if the source is 50p or 60p, because that is one of the things we've done for Max that wasn't supported before, for example, high frame rates.
So previously everything was capped at 30 fps, and most of that was due to the fact that there wasn't really a lot of source
(27:00):
content on HBO Max, for example, that required more than 30 fps.
But now that the content libraries of Discovery Plus and HBO Max are combined, there's a lot more reality TV on the Discovery Plus side.
A lot of that is shot at 50 fps if it's abroad, or 60 fps if
(27:20):
it's US, and so we wanted to preserve that temporal resolution as much as possible, and so we've started to support high frame rates as well, and so we have different encoding ladders for different frame rates.
And then, of course, there are different encoding ladders for SDR versus HDR, and even within HDR we have
(27:40):
different encoding ladders for HDR10 versus Dolby Vision 5, for example.
Speaker 2 (27:45):
What about for different devices?
So if I'm watching on my smart TV and then I transition to my smartphone, am I seeing the same ladder, or do you have different ladders for different devices?
Speaker 1 (27:58):
At this moment they're the same ladders for all the devices.
We might deliver different subsets of the ladder for certain devices, but that's typically capping on the high end of the ladder.
So if, for example, some device cannot handle 60 fps, or if it cannot handle resolutions above 1080p, for example, then we
(28:21):
might intentionally cap the manifest itself that we're delivering to that device.
But in terms of different bitrates and different encodings, we're not differentiating yet between different devices.
So I'll give you my personal take on that question, which is
(28:45):
that in most cases it's not really necessary, in my opinion, to have different encoding ladders for different devices, because your 1080p should look great no matter whether you're watching it on an iPhone or Apple TV, and so having two different 1080p encodes doesn't necessarily make sense.
(29:08):
I've definitely heard people say, well, perhaps on the lower end of the bitrate ladder, where you have your lower bitrates, lower resolutions, that's where you need to have differentiation.
But again, in my opinion, there's no harm in delivering, you know, 100, 200 kilobit per second bitrates in a manifest to a smart TV, because most likely it's never
(29:31):
going to play them. And so, you know, you can put them in the manifest, you can deliver it to the TV or to, you know, a streaming stick, and in the vast majority of cases it's never even going to touch that bitrate. It's just going to skip right over it, go straight for the, you know, HD and the UHD.
And the only time you might ever see that low bitrate is,
(29:55):
you know, if something catastrophic happens to your network and the player struggles so badly it needs to drop down to that level.
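As an illustration of that manifest-capping idea, here is a small sketch of the general technique; the Rendition fields and the device limits are hypothetical assumptions, not anything specific to Max's backend.

```python
# Hypothetical manifest filtering: drop ladder rungs a device can't play.
from dataclasses import dataclass

@dataclass
class Rendition:
    width: int
    height: int
    fps: float
    bitrate_kbps: int

def filter_ladder(ladder: list[Rendition], max_height: int, max_fps: float) -> list[Rendition]:
    """Keep only renditions at or below the device's reported limits."""
    return [r for r in ladder if r.height <= max_height and r.fps <= max_fps]

ladder = [
    Rendition(640, 360, 30, 800),
    Rendition(1280, 720, 30, 2500),
    Rendition(1920, 1080, 60, 6000),
    Rendition(3840, 2160, 60, 16000),
]

# e.g. a device that tops out at 1080p30 gets a capped manifest
print(filter_ladder(ladder, max_height=1080, max_fps=30))
```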
Speaker 2 (30:01):
What's your VBR maximum rate on a percentage basis?
When we started out, it was CBR, so your max was 100% of your target.
Where are you now with your VBR for your premium content?
Speaker 1 (30:16):
So we've taken an approach with x264 and x265 of relying primarily on CRF rate control, but it's a CRF rate control that uses a bitrate and buffer cap.
So when you are writing your command line in FFmpeg, right,
(30:38):
you can set the CRF target, but you can also specify a VBV buffer size and a VBV max rate.
And so we are doing that, and the reason behind that is we want to make sure that we're controlling essentially the codec level at each resolution, each period, and that we're sort of keeping the peaks also
(31:04):
constrained that way.
And so I can give you an example where, you know, if it's something like, let's say, HEVC and it's 1080p, you might want to stay at codec level 4 rather than codec level 4.1, because, you know, 4.1 might, or that one actually, like,
(31:29):
maybe it's not as big of a deal, but, for example, if you're choosing between level 5 and level 5.1, right, there are certain devices that might not support 5.1, for example.
And so in order to stay under codec level 5 for HEVC, you have to stay under certain buffer sizes,
(31:50):
right, and so that's how, that's what ends up driving a lot of the actual caps that we set.
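For reference, the FFmpeg options he's describing, a CRF quality target combined with VBV max rate and buffer size caps, map to something like the sketch below (shown here with libx264; the same idea applies to libx265). The specific preset, CRF, maxrate, bufsize, and level values are illustrative assumptions, not production settings.

```python
# Hypothetical capped-CRF encode: CRF quality target plus VBV bitrate/buffer caps.
import subprocess

def encode_capped_crf(src: str, dst: str) -> None:
    """Encode one rendition with a CRF target constrained by VBV caps."""
    cmd = [
        "ffmpeg", "-y", "-i", src,
        "-c:v", "libx264",
        "-preset", "slow",
        "-crf", "22",           # quality target
        "-maxrate", "8000k",    # VBV max rate: caps the peaks
        "-bufsize", "16000k",   # VBV buffer size
        "-profile:v", "high",
        "-level:v", "4.0",      # keep the stream within a target codec level
        "-an",
        dst,
    ]
    subprocess.run(cmd, check=True)

# encode_capped_crf("mezzanine.mov", "out_1080p_h264.mp4")
```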
Speaker 2 (31:54):
Circling back, I mean, CRF gives you a measure of per-title encoding as well, so is that intentional?
Speaker 1 (32:00):
Yeah, that's part of it.
The thing with CRF, right, is really, when you specify your VBV max rate, you're just specifying your highest average bitrate, really, for the video.
And so as long as you're comfortable with that max rate, then, you know, you can also count on CRF probably bringing your average bitrate below that max rate most of the time.
(32:23):
And so if we said, for example, 10,000 kilobits per second as the max rate, most of the time the CRF target is really going to bring that average bitrate in much lower, around five or six megabits.
And so that is a way of kind of getting per-title encoding, in a way, and achieving CDN savings without sacrificing quality,
(32:46):
right, because depending on the complexity of your content, it's either going to be way below your max rate or it's going to hit against the max rate.
And then at least you're sort of capping the highest possible bitrate that you'll have for that video.
Speaker 2 (33:01):
That's a pretty creative way to do it.
What's the impact of DRM on the encoding ladder, if anything?
So I know there's a difference between hardware and software DRM, and there are some limitations on content you can distribute with software-based DRM.
We're running a bit short of time, but can you encapsulate
(33:23):
that in, like, you know, a minute or two?
Speaker 1 (33:25):
The way most of the content licensing agreements are structured, you know, typically under the content security chapter there are requirements around what kind of, essentially, security levels are required to play back certain resolutions
(33:46):
and then often what kind of output protection is required.
And so typically what you'll see is that something like Widevine L1, which is a hardware-based security level of Widevine, or hardware-based protection, and then on the PlayReady side something like SL3000, which is also the
(34:06):
hardware-based implementation of PlayReady, those will be required for 1080p and above, for example.
So a lot of the content licensing agreements will say, unless you have hardware-backed DRM on the playback client, you cannot play anything at 1080p and above.
Then they'll have similar requirements around each level.
(34:34):
They'll group your resolutions typically into SD, HD, Full HD, UHD, and each one of those will have different DRM requirements in terms of security levels.
Also requirements around HDCP, whether that needs to be enforced or not, whether it's HDCP 1 or HDCP 2.
And so what that essentially means in practice then is that
(34:57):
when you're doing your ABR ladder, you have to define those security groups based on resolution and you have to assign different content keys to those groups.
And so your video streams up to, let's say, 720p might get encoded with one encryption key, and then everything between 720p and 1080p
(35:20):
gets a different encryption key, and then everything above 1080p gets another encryption key, and audio gets a different encryption key.
And so by doing that, we essentially accomplish that at playback time, when the licenses are being requested by the players for each of those bitrates, because they're using different keys, you can now associate different playback
(35:41):
policies with each key.
And so you can say, well, this SD content key, for example, has a policy that doesn't require HDCP to be enforced and doesn't require a hardware level of protection, whereas the HD group or the UHD group might require those.
And so that's really something that we do today in response to
(36:05):
the way the content licensing agreements are structured, and so in the future that might change.
My impression is that we're actually moving in a direction of more DRM rather than less DRM.
So, like, even as recently as, you know, three or four years ago, some studios, some content owners, were still
(36:29):
allowing certain resolutions to be delivered in the clear, like SD, for example, and a lot of that's kind of going away, where now essentially it's like, look, if you're going to do DRM, you might as well do DRM across the board, because it actually kind of makes it less complicated that way.
And one of the things I've also noticed is that, like,
(36:51):
when it comes to HDR, for example, it's the strictest requirements, you know, for all of HDR. And so even with HDR, you have an encoding ladder that ranges from UHD all the way down to 360p or something, and the requirements in the agreements are, well, you must use hardware-based DRM and you must
(37:13):
use HDCP 2.3 for the whole HDR ladder. And so it seems that that's the trend of the industry, is that we're actually moving just towards using DRM for everything.
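A hedged sketch of that key-grouping approach follows; the resolution breakpoints, key IDs, and policy fields are hypothetical and not tied to any particular DRM vendor's API or to WBD's actual agreements.

```python
# Hypothetical mapping of ABR resolution groups to content keys and
# license policies, mirroring the grouping described above.
SECURITY_GROUPS = [
    # (label, max_height, key_id, policy)
    ("SD",      576,  "key-sd",  {"hw_drm_required": False, "hdcp": None}),
    ("HD",      720,  "key-hd",  {"hw_drm_required": False, "hdcp": "1.x"}),
    ("FULL_HD", 1080, "key-fhd", {"hw_drm_required": True,  "hdcp": "1.x"}),
    ("UHD",     2160, "key-uhd", {"hw_drm_required": True,  "hdcp": "2.3"}),
]

def key_for_rendition(height: int) -> tuple[str, dict]:
    """Pick the content key and playback policy for one ladder rung."""
    for _label, max_height, key_id, policy in SECURITY_GROUPS:
        if height <= max_height:
            return key_id, policy
    raise ValueError(f"no security group covers height {height}")

print(key_for_rendition(360))   # ('key-sd', ...)
print(key_for_rendition(1080))  # ('key-fhd', ...)
```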
Speaker 2 (37:25):
What's the difference between hardware and software DRM?
Is that a browser versus mobile device thing?
Or where is software DRM and where is hardware?
Speaker 1 (37:36):
So the difference is in the implementation of the DRM client itself.
And so if you basically want to get the highest security certificate from either Google or Microsoft for their DRM systems, you essentially have to bake their DRM client into
(37:56):
the secure video path of the system.
So that means a tight coupling with the hardware decoder as well, so that essentially, when you send a video stream to the decoder, once it goes past the decoder, there's no getting
(38:17):
those bits back, right.
So essentially, once you send it to the decoder, at that point it's secure decoding and secure decryption. Well, first, I guess, secure decryption, then secure decoding, and then it goes straight to the renderer, right.
And so there's no API call that you can make as an application that says, now that you've decrypted and decoded these bits, like,
you know, hand them back to me. And so that's typically called a
(38:41):
secure video path or secure media path, and so that's what you get with hardware-based DRM.
Software-based DRM, you know, does either some or all of those aspects of decoding and decryption in software, and
(39:05):
therefore there's a risk that somebody could essentially hack that path at some point and get those decoded bits back and be able to steal the content.
Speaker 2 (39:09):
So if I'm watching 265 on a browser without hardware support, I'm likely to be limited in the resolution I can view, if it's premium content, because the publisher says, I don't want anything larger than 360p going to software.
Speaker 1 (39:26):
Exactly, yeah. And today, for example, if you're using Chrome, so Widevine DRM is available in Chrome, but only L3, which is the software-based implementation of Widevine.
And so oftentimes, if you're using Chrome, you actually get worse video quality with some of the premium streaming services
(39:49):
than if you're using Edge or Safari, for example, because both Safari on Mac and Edge on Windows do support hardware DRM, because they are just more tightly integrated with the operating system, and so they're able to essentially achieve that secure video path between the browser and the operating system and the output.
(40:11):
So let's jump to the packaging, because you're in the HLS, DASH, or CMAF camp these days?
Both. So at both Warner Bros. Discovery and then my previous job at Hulu, we've been using both HLS and DASH, and, interestingly enough, actually, even in distribution, the split between those two is almost
(40:33):
identical.
So we use HLS for Apple devices and we use DASH for streaming to all other devices.
What's common to them is the CMAF format, and so one of the things that I kind of get a little bit annoyed about in our industry is when people refer to CMAF as a streaming protocol,
(40:55):
and I always kind of feel like I need to correct them and say, no,
no, it's not a streaming protocol.
Because, you know, CMAF is really two things.
Right, like CMAF is, on one hand, a standardized version of what we, you know, frequently call fragmented MP4, right, the ISO base media file format, and what the CMAF spec did is
(41:19):
basically just define, look, if you're going to use fMP4 in HLS and DASH, here's, you know, the boxes you need to have and here's how common encryption gets applied to that and so on.
And so it's really just a more kind of buttoned-down version of, you know, what we have always called fMP4.
And so in many cases, like, you know, if you have been packaging
(41:40):
either DASH or HLS in fMP4 media segments, you're most likely already CMAF compliant, you're already using CMAF.
But the other thing that CMAF is, right, like the CMAF spec also
defines a hypothetical, logical media presentation model, and so it essentially describes what, really, kind of, when you
(42:02):
read between the lines, will sound a lot like HLS or DASH without HLS or DASH.
It's really defining, kind of, here's the relationship between tracks and segments and fragments and chunks, and here's how you sort of address all those different levels of the media presentation.
And so you can then think of HLS and DASH really being kind
(42:26):
of the physical manifestations of that hypothetical presentation model.
And there's a really great spec that CTA authored, I think it's CTA-5005, that is the HLS-DASH interoperability spec, and it's
(42:50):
heavily based on CMAF, using kind of CMAF as really the unifying model and then really describing how both HLS and DASH plug into CMAF, and how you can describe the same concepts in both.
And so it's almost like HLS and DASH are just programming languages that are describing the same, you know, sort of pseudocode.
Speaker 2 (43:07):
I want to come back to some other topics, but one of the topics important to you is, is the CTA part of the organization that's going to make it simpler for publishers to publish content and just focus on the content development and not the compatibility?
Because it seems like that's a pretty compelling issue for you.
Speaker 1 (43:23):
I hope that CTA will make some efforts in that space.
I think a lot of what they've been doing is trying to improve the interoperability in the streaming industry, and so I think it does feel like CTA WAVE is the right arena for that.
One of the issues that I think today makes deploying streaming
(43:46):
solutions really complex and challenging is that we have a lot of different application development platforms.
Just before this call, I kind of went and counted the number of app platforms that we have at WBD that we just developed for Max, and it's basically about, you know, a dozen or 16 different
(44:10):
application development platforms.
Now, there's overlap between some of them.
So, you know, Android TV and Fire TV are kind of, you know, more or less the same thing with, you know, slight differences.
But at the end of the day, you know, you're looking at probably, at the very least, half a dozen different app development platforms, and then, worst-case scenario, you're looking upwards of 20 or so app development
(44:32):
platforms, especially once you start considering set-top boxes made in Europe or Asia that might be, like, HbbTV compatible and so on.
And so that's a lot of complexity, because the same app needs to be built over and over and over again, right, in different programming languages, using different platform APIs.
(44:56):
And I think as an industry we're kind of unique in that sense.
I'm not actually aware of any industry other than streaming that needs to develop that many applications for the same thing.
You know, if you're working in any other industry, I think, if you're working in FinTech or, you know, anything else, right, you typically have to develop three applications, a web app, an iOS app and an Android app, and you're done,
(45:21):
right.
And so it's kind of crazy that, you know, in our industry we have to go build, you know, over a dozen different applications.
But the practical challenge that then brings, when it comes to things like encoding and packaging and so on, is that it's hard to know what the devices support,
(45:42):
because there is no spec, there is no standard, that specifies APIs, for example, that every different device platform could call and expect standardized answers from, right?
So when we talk about media capabilities of a device, like,
(46:03):
what are we talking about?
We're talking about, we need to know what decoders are supported, right, for video, for audio, but also for images, for text, right, timed text.
We need to know what different segment formats are supported.
You know, is it CMAF, is it TS?
Like, what brand of CMAF?
Right, CMAF has this nice concept of brands, but nobody's
(46:25):
really using it, right.
Like, you know, for that concept to be useful, you need to be able to query a device and say, well, what CMAF brands do you support?
Manifest formats, right, there are different versions of HLS, there are different profiles of DASH, there are different DRM systems, right.
And so these are all things that we kind of need to know if we want to play
(46:46):
something back on a device and play it well.
Speaker 2 (46:49):
So how do we
standardize the playback side?
Speaker 1 (46:51):
Probably one of the key steps I think we need to take is, I think we need to standardize device media capabilities detection APIs, and there have been some efforts in W3C of defining those types of APIs, in HTML, for example, but again, not every platform uses HTML.
(47:14):
When it comes to Roku, when it comes to Media Foundation and other different media app development platforms, we need essentially the same API, really, to be present on every platform.
And then, once we have APIs standardized in the way they detect media support, we need to also have a standardized method
(47:38):
of signaling those capabilities to the servers, because if you want, for example, to target specific devices based on their capabilities, the next question becomes, well, how do you express that?
How do you signal that to the backend?
How do you take action on that?
How do you do things like manifest filtering based on that?
So I think there's a lot of space there for
(48:01):
standardization, there's a lot of room for standardization.
And so, yeah, I'm hoping that, you know, CTA WAVE or one of the other industry organizations will take some steps in that direction.
Speaker 2 (48:10):
Final topic is going to be AV1, or new codec adoption.
So, you know, you're in charge of choosing which technologies you're going to support.
When does a technology like AV1 come on your radar screen?
I mean, you've heard of it since it was announced, obviously, but when does it come on your radar screen in terms
(48:32):
of actually supporting it in a Warner Brothers product?
Speaker 1 (48:35):
The first thing I typically will look at is device adoption, because that's really, I think, the most crucial requirement, is that there have to be enough devices out there that we can actually deliver media to with a new codec that makes it worthwhile.
Because there are going to be costs involved in deploying a new codec, right?
(48:55):
The first cost comes from just R&D associated with investigating a new codec, testing it, measuring quality, then optimizing your encoding settings and so on, right. And so that's both time and then also either manual or automation effort that needs to
(49:16):
be done, right, to be able to just understand, what is this codec, is it good, do I want to use it?
Right.
And then, if you suddenly decide you want to deploy that codec, there are going to be compute costs associated with that.
There are going to be storage costs associated with that.
Then in some cases, there might be licensing costs as well.
(49:37):
If you're using a proprietary encoder, maybe you're paying them.
Or if you're using an open source encoder, well, you still might owe some royalties on just usage, and you're pretty familiar with that.
I read one of your recent blog posts, and so I know that you've spent a lot of time looking at royalties and different business
(49:59):
models that different codecs now have.
So in order to justify those costs, in order to make those costs actually worthwhile, there need to be enough devices out there that can be reached by that new codec.
And so really the first question is, what percentage of devices that are, you know, active devices on a service are
(50:22):
capable of using that codec.
And, you know, interestingly, this kind of goes back to that previous question that you asked, which is, you know, about device capabilities and, you know, how do we basically improve those things.
So without good, healthy data coming back from players, coming back from these, you know, apps that tell us what's supported on
(50:45):
the platforms, it's hard to plan, you know, what your next codec is that you want to deploy.
Like right now, for example, if I wanted to estimate the number of AV1 decoders out there, my best resource would be to go study all the different hardware specs of all the different devices out there and figure out, you know, which ones support AV1, for
(51:09):
example, or VVC, or, you know, LCEVC, and then try to kind of extrapolate from that data.
Okay, what does that mean?
You know, how do we project that onto our particular active device base?
And so, yeah, it's not straightforward today, but I'm hoping that if we can improve the device capabilities
(51:32):
detection and reporting, then we can also get to a point where we can just run a simple query and say, okay, tell me what percentage of devices that the service has seen in the last week supports AV1 decoding, and specifically maybe AV1 decoding with DRM support or AV1 decoding of HDR, right. And so it's like
(51:55):
there are even nuances within, you know, beyond just which codec is supported.
Speaker 2 (52:01):
What kind of pressure do you get, if any, from, you know, your bosses or your co-workers about new codecs?
Because, you know, we love to talk about them. We read about them all the time.
But are people pounding on you and saying, you know, where's AV1 support, where's VVC, when's VVC? Or do they not care?
Is that not part of what they're thinking about?
Speaker 1 (52:21):
I would say there's not a lot of pressure from leadership to support specific codecs.
I think they're more interested in, probably, cost savings and looking at things like how do we lower CDN costs.
But one of the things that I usually always explain to them is that it's not a perfect one-to-one relationship between
(52:45):
deploying a new codec and CDN cost savings, for example.
Like, even if you save, for example, 20% on your encoding bitrate with a new codec, that doesn't necessarily translate into 20% of CDN cost savings.
Because in some cases, you know, like, if somebody's on a three-megabit connection speed, for example, right, somebody's
(53:08):
on 4G and the most they can get is, like, you know, three megabits per second, you being able to lower your bitrate from 10 to six megabits per second is not really going to impact it, right? They're still going to be pulling the same amount of data, and so that's why it's not a clear one-to-one mapping.
But, yeah, I would say, like, most of the demand for new
(53:29):
codecs comes from that aspect, right, from that direction, rather than somebody saying, well, we have to support VVC because it's the latest, greatest thing out there. Right, like, generally, that's not the case.
If anything, I'm usually the one that's, you know, pushing kind of for that and saying, like, well, you know, we really should be moving on from H.264 and moving on to the next
(53:50):
generation of codecs, because at some point, right, you just do have to leave old codecs behind and slowly deprecate them as you move on to the new technology.
Speaker 2 (54:00):
Do you have, I mean, do you have a sophisticated financial analysis for doing this, or do you, you know, do the numbers on an envelope, kind of thing?
Speaker 1 (54:10):
It's more of an envelope kind of thing right now.
Yeah, it would be something that would be based on, you know, again, like, number of devices supported, and then comparing that to kind of average bitrate savings, and comparing that to compute costs and, you know, potentially licensing costs associated with it.
(54:31):
So, yeah, it is, yeah, sort of a back-of-a-paper-napkin kind of calculation at this point, but I think the factors are well known.
It's really coming up with the data that, you know, feeds into those different variables.
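Purely for illustration, a back-of-the-napkin model along the lines he describes might look like the sketch below; the cost model and every number in it are assumptions for the example, not WBD figures.

```python
# Hypothetical napkin math for a new-codec rollout: compare estimated CDN
# savings against added compute and licensing costs.
def codec_rollout_savings(
    monthly_cdn_cost: float,    # current CDN spend per month
    device_share: float,        # fraction of traffic that can decode the new codec
    bitrate_savings: float,     # fractional bitrate reduction, e.g. 0.20
    added_compute_cost: float,  # extra encoding compute per month
    licensing_cost: float,      # royalties / encoder licensing per month
) -> float:
    """Return estimated net monthly savings (can be negative)."""
    cdn_savings = monthly_cdn_cost * device_share * bitrate_savings
    return cdn_savings - added_compute_cost - licensing_cost

# Example: 60% of traffic on capable devices, 20% bitrate savings.
print(codec_rollout_savings(1_000_000, 0.60, 0.20, 50_000, 25_000))  # 45000.0
```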
Speaker 2 (54:45):
A couple of questions.
What about LCEVC?
Are you doing enough live?
Or is that even a live versus VOD kind of decision?
Speaker 1 (54:55):
With LCEVC, I don't think it's even a live versus VOD decision.
I think what's interesting with that codec, right, is that it's an enhancement codec.
It's a codec that really piggybacks on top of other codecs and provides, you know, better resolution, better dynamic range, for example, at bitrates that would typically
(55:16):
be associated with lower resolutions, you know, more narrow dynamic ranges.
And so, you know, the way LCEVC works is that there's a pre-processor part of it that essentially extrapolates the detail that is then lost when the video is scaled down.
So you can start with a 1080p video, scale it down to, let's
(55:40):
say, 540p, encode that 540p, and then, with the LCEVC decoder on the other end, it can now take some of that sideband data and attempt to reconstruct the full fidelity of the 1080p source signal.
And so that concept works the same whether the baseline codec that you're using is H.264 or H.265 or VVC or
(56:05):
AV1.
And so I think that's what's interesting about that codec, is that it can always let you be a step ahead of whatever the latest generation of codecs is providing.
And then the other nice thing about it is that there's a backwards compatibility option there, because if a decoder doesn't recognize that sideband data
(56:25):
that is specific to LCEVC decoding, it'll just decode your base signal, which might be half resolution or quarter resolution.
And so I think it can be very applicable in ABR, because typically you have a lot of different resolutions in your ladder, right?
So it's like, if you could potentially deliver that, you
(56:47):
know, 360p resolution in your ladder at 720p, for example, to an LCEVC decoder, then, you know, why not?
Speaker 2 (56:57):
We've got a technical question here.
Are you able to deliver one CMAF package using one DRM, or do you have to have different packages for Apple and the rest of the delivery platforms?
Speaker 1 (57:13):
Yeah, that's a great question.
So right now what we do is we encrypt every CMAF segment twice, once with the CBCS encryption mode and the other one with the CTR CENC encryption mode.
And so the CBCS-encrypted segments, those are the ones that we deliver with HLS, to FairPlay devices, and
(57:38):
then at the moment the CTR segments are the ones that we then package with DASH and, you know, are used with both PlayReady and Widevine.
That said, both Widevine and PlayReady introduced support for CBCS a while ago, it's actually, I think, been probably over five years at this point, and so theoretically we
(58:02):
could deliver those CBCS-encrypted segments to all three DRM systems and it would work.
The challenge at the moment is that not all devices that are Widevine or PlayReady clients have been updated to the latest version of PlayReady or Widevine, because in a lot of cases these are hardware implementations, and so without
(58:22):
basically firmware updates from the device manufacturer, they're never going to be up to date with the latest DRM client.
And so we're kind of waiting to see when those last CTR-only Widevine and PlayReady clients are going to kind of be
(58:43):
deprecated, right, slowly kind of move out of the life cycle.
Once, you know, the vast majority of the PlayReady and Widevine clients out there are CBCS compatible, then that opens up the path to using CBCS segments everywhere.
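A small sketch of that dual-encryption routing, with hypothetical names and a mapping that simply mirrors what he describes (CBCS for HLS/FairPlay, CENC with AES-CTR for DASH with Widevine and PlayReady); it is not any packager's actual API.

```python
# Hypothetical routing of encryption schemes to DRM systems and manifests,
# mirroring the "encrypt every CMAF segment twice" approach described above.
ENCRYPTION_PLAN = {
    "cbcs": {"manifest": "HLS",  "drm": ["FairPlay"]},
    "cenc": {"manifest": "DASH", "drm": ["Widevine", "PlayReady"]},  # AES-CTR
}

def scheme_for_client(drm_system: str) -> str:
    """Pick which encrypted copy of the segments a client should receive."""
    for scheme, plan in ENCRYPTION_PLAN.items():
        if drm_system in plan["drm"]:
            return scheme
    raise ValueError(f"unknown DRM system: {drm_system}")

print(scheme_for_client("FairPlay"))   # cbcs
print(scheme_for_client("Widevine"))   # cenc
```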
Speaker 2 (58:58):
Final question: AV1 this year or not?
What do you think?
Speaker 1 (59:03):
I think probably not this year.
I would say, I mean, I think, you know, we might do some experimentation, I think, you know, just some research into, like, encoder quality and, you know, optimization this year with AV1.
But I wouldn't expect, you know, deployment of AV1 this year, not because of lack of support, because I think the support is
(59:26):
really starting to be there in significant numbers.
So I know that, I think it's the latest either Samsung or LG TVs, for example, now include AV1 decoders as well.
I think often people will look at mobile as kind of being the indicator of, you know, codec adoption.
(59:48):
Right, and especially Apple, people will be like, okay, well, you know, if Apple adopted it in iOS, then, you know, clearly it's here.
But when it comes to, you know, premium streaming services, so whether it's Max or Hulu or Amazon Prime or Netflix, most of that content is watched in living rooms, and so really the devices to watch are smart
(01:00:11):
TVs and connected streaming sticks.
So once those devices have support for a particular codec, then in my opinion that's really kind of the big indicator that, yeah, it might be ready.
Speaker 2 (01:00:26):
We're running over, but this is a question I need the answer on.
What's the HDR picture for AV1, and how clear does that have to be?
Because it seems like there's a bunch of TV sets out there that we know play Dolby Vision and HDR10 or 10+ with
(01:00:49):
HEVC.
Do we have the same certainty that an AV1-
Speaker 1 (01:00:52):
compatible TV set will play AV1 and HDR?
I don't think that certainty is there yet, and I do need to do some more research into that particular topic, because I've been curious about the same thing.
So I think some standardization efforts have been made.
I can't remember off the top of my head if it's CTA or some other.
Speaker 2 (01:01:09):
HDR10+ is now a standard for AV1.
I just don't know if TVs out there will automatically support it, and automatically doesn't work for you. You've got to make sure you test, yeah.
Speaker 1 (01:01:20):
And then, you know, with Dolby Vision, right, it's sort of like, well, until Dolby says so, then it's not a standard.
And so, yeah, I mean, I think that's an excellent question, is that there's nothing from a technical perspective that should be stopping somebody from using AV1 or VVC or any other new codec with HDR, because there's nothing specific to the
(01:01:43):
codec that HDR needs, and so it's really just a matter of standardization, a matter of companies implementing that standard.
So, yeah, I'm with you on this one in that it is sort of like one of those where, yeah, it should work, but until it's been tested, and it's been tested on many different devices, it's not
(01:02:06):
a real thing, right?
Speaker 2 (01:02:07):
Listen, we are way out of time, Alex.
I don't think we've ever done this for an hour, but it's great.
I really appreciate you spending time with us, being so open and honest about how you're producing your video, because I think that helps everybody.
And thanks, this has been great.
Speaker 1 (01:02:22):
Absolutely.
Thank you so much for having me, and, yeah, this has been really great.
Like, I feel like we could probably keep talking for another hour or two, and I think we'd still have plenty of topics to discuss.
I was taking some notes while we were doing this, and I think I have notes for another hour at least.
Speaker 2 (01:02:38):
OK, we'll talk to you about that, and I'll see you at IBC.
You're going to go to that show?
Yeah, I think I'll be at IBC, so I'm most likely to see you there.
Speaker 1 (01:02:48):
Cool, take care, Alex.
Thanks a lot.
All right, thanks so much.
This episode of Voices of Video is brought to you by NetInt Technologies.
If you are looking for cutting-edge video encoding solutions, check out NetInt's products at netint.com.