by crazygringo
10 subcomments
- Wow. To me, the big news here is that ~30% of devices now support AV1 hardware decoding. The article lists a bunch of examples of devices that have gained it in the past few years. I had no idea it was getting that popular -- fantastic news!
So now that h.264, h.265, and AV1 seem to be the three major codecs with hardware support, I wonder what will be the next one?
by IgorPartola
1 subcomments
- Amazing. Proprietary video codecs need to not be the default and this is huge validation for AV1 as a production-ready codec.
by shanemhansen
1 subcomments
- > AV1 streaming sessions achieve VMAF scores¹ that are 4.3 points higher than AVC and 0.9 points higher than HEVC sessions. At the same time, AV1 sessions use one-third less bandwidth than both AVC and HEVC, resulting in 45% fewer buffering interruptions.
Just thought I'd extract the part I found interesting as a performance engineer.
by VerifiedReports
2 subcomments
- I had forgotten about the film-grain extraction, which is a clever approach to a huge problem for compression.
But... did I miss it, or was there no mention of any tool to specify grain parameters up front? If you're shooting "clean" digital footage and you decide in post that you want to add grain, how do you convey the grain parameters to the encoder?
It would degrade your work and defeat some of the purpose of this clever scheme if you had to add fake grain to your original footage, feed the grainy footage to the encoder to have it analyzed for its characteristics and stripped out (inevitably degrading real image details at least a bit), and then have the grain re-added on delivery.
So you need a way to specify grain characteristics to the encoder directly, so clean footage can be delivered without degradation and grain applied to it upon rendering at the client.
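For what it's worth, libaom's aomenc appears to support roughly this workflow via a film grain table: you hand the encoder a file of grain-synthesis parameters and it signals them in the bitstream, leaving the clean footage untouched. A minimal sketch, assuming an aomenc build that exposes --film-grain-table; "grain.tbl" and the file names are hypothetical placeholders:

    // Sketch: encode clean footage and attach explicit AV1 film grain
    // synthesis parameters from a table file, so no fake grain has to be
    // baked into (and then stripped from) the source.
    // Assumption: an aomenc build exposing --film-grain-table; "grain.tbl"
    // is a hypothetical, hand-authored grain-parameter file.
    import { execFileSync } from "node:child_process";

    execFileSync("aomenc", [
      "--cpu-used=6",                  // encoder speed preset
      "--end-usage=q", "--cq-level=30",
      "--film-grain-table=grain.tbl",  // grain is signalled, synthesized at decode time
      "-o", "clean_with_grain.ivf",
      "clean_master.y4m",
    ], { stdio: "inherit" });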
- There's an HDR war brewing on TikTok and other social apps. A fraction of posts that use HDR are just massively brighter than the rest; the whole video shines like a flashlight. The apps are eventually going to have to detect HDR abuse.
- Netflix has been the worst-performing, lowest-quality video stream of any of the streaming services. Fuzzy video, lots of visual noise and artifacts. Just plain bad, and this is on the 4K plan, on 1 Gbps fiber, with a 4K Apple TV. I can literally tell when someone is watching Netflix without knowing, because it looks like shit.
- On a related note, why are release groups not putting out AV1 WEB-DLs? Most 4K stuff is h265 now but if AV1 is supplied without re-encoding surely that would be better?
- I'm surprised AV1 usage is only at 30%. Is AV1 so demanding that Netflix clients without AV1 hardware acceleration capabilities would be overwhelmed by it?
by aperture147
2 subcomments
- AV1 is not new anymore, and I think most modern devices support it natively. Some devices, like Apple's, even have a dedicated AV1 hardware decoder. Netflix has been pushing AV1 for a while now, so I thought the adoption rate would be more like 50%, but it seems AV1 requires better hardware and newer software than a lot of people have.
by liampulles
0 subcomment
- I'm a hobbyist video encoder (mostly I like to experiment with backing up my DVD collection), and I recently switched from HEVC to AV1.
I've found the ratio of fixed quality to CPU load to be better, and I've found it reasonably good at retaining detail over smoothing things out compared to HEVC. And the ability to add generated "pseudo grain" works pretty well to give the perception of detail. The performance of GPU encoders (while still not good enough for my perhaps stringent standards) is better.
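As an illustration of that kind of comparison, here is roughly how it can be scripted: encode the same source with SVT-AV1 (with synthesized grain) and x265 at fixed quality targets, then score both against the source with VMAF. This is a sketch under assumptions, not a tuned recipe: it assumes an ffmpeg build with libsvtav1, libx265 and libvmaf, and the preset/CRF values and file names are placeholders.

    // Sketch of an AV1-vs-HEVC comparison at fixed quality targets.
    // Assumptions: ffmpeg built with libsvtav1, libx265 and libvmaf;
    // preset/CRF values and file names are placeholders.
    import { execFileSync } from "node:child_process";

    const src = "dvd_rip.mkv"; // hypothetical source file
    const run = (args: string[]) =>
      execFileSync("ffmpeg", ["-hide_banner", "-y", ...args], { stdio: "inherit" });

    // AV1 via SVT-AV1, with synthesized "pseudo grain" instead of baked-in noise
    run(["-i", src, "-c:v", "libsvtav1", "-preset", "6", "-crf", "30",
         "-svtav1-params", "film-grain=8", "-an", "av1.mkv"]);

    // HEVC via x265 at a comparable quality target
    run(["-i", src, "-c:v", "libx265", "-preset", "slow", "-crf", "22",
         "-an", "hevc.mkv"]);

    // VMAF of each encode against the source (distorted input first, reference second)
    for (const enc of ["av1.mkv", "hevc.mkv"]) {
      run(["-i", enc, "-i", src, "-lavfi", "libvmaf", "-f", "null", "-"]);
    }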
by resolutefunctor
0 subcomment
- This is really cool. Props to the team that created AV1. Very impressive
by tr45872267
2 subcomments
- >AV1 sessions use one-third less bandwidth than both AVC and HEVC
Sounds like they set HEVC to a higher quality target then? Otherwise how could its bandwidth be the same as AVC's?
- Worth noting: H.264 High Profile is already patent-free in most countries and will soon be patent-free in the US too.
- I imagine that's a big part of the drive behind discontinuing Chromecast support.
https://www.androidcentral.com/streaming-tv/chromecast/netfl...
- Please Sir, can I have some more bitrate?
- Weirdly, for the past few months Netflix on my Samsung TV has been using only H.264, not AV1. When they first launched AV1, it worked there...
Honestly I'm not complaining, because they were using AV1 at around 800-900 kbps for 1080p content, which is clearly not enough compared to their 6 Mbps H.264 bitrate.
by conartist6
0 subcomment
- For a second there I wasn't looking very close and I thought it said that 30% of Netflix was running on .AVI files
by techpression
0 subcomment
- Compression is great and all, but Netflix is overdoing it and their content looks like an over-sharpened mess with lego blocks in high intensity scenes. And no, it's not my connection, Apple TV does it far better and so does Prime.
It's really sad that most people never get to experience a good 4K Blu-ray, where the grain is actually part of the image as mastered and there's enough bitrate to not rely on sharpening.
by nrhrjrjrjtntbt
1 subcomments
- > At Netflix, our top priority is delivering the best possible entertainment experience to our members.
I don't think that is true of any streamer. Otherwise they wouldn't provide the UI equivalent of a shopping centre that tries to get you lost and unable to find your way out.
- Qualcomm seems to be lagging behind and doesn't include an AV1 decoder except in its high-end SoCs.
by philipallstar
0 subcomment
- This is a great result from Google, Netflix, Cisco, etc.
by forgotpwd16
0 subcomment
- Am I the only one who thought, from the title, that this was an old article? AV1 is now 10 years old, and AV2 was announced a few months ago for a year-end release. If anything, the news is that AV1 powers only 30% by now. At least HEVC, released around the same time, has gotten quite popular in the warez scene (movies/TV/anime) for small encodes, whereas AV1 releases are still considered a rarity. (Though to be fair, 30% of Netflix & YT means total AV1 usage is much higher.) I would've expected a royalty-free codec to've been embraced more, but it seems its long-standing difficulty playing back on low-power devices hindered its adoption.
- The one big hardware deficiency of my Nvidia Shield TV is its lack of YouTube AV1 support.
by testdelacc1
1 subcomments
- Something doesn’t quite add up to me. The post says “AV1 powers approximately 30% of all Netflix viewing”. Impressive, but I’m wondering why it isn’t higher. I’m guessing most devices should support AV1 software decoding: 88% of devices certified in the last 4 years support AV1, all browsers support AV1 software decoding, and the Netflix apps on Android (since 2021) and iOS (since 2023) obviously do.
So why isn’t AV1 higher? The post doesn’t say, so we can only speculate. It feels like they’re preferring hardware decoding to software decoding, even if that means an older codec. If this is true, it would make sense - it’s better for the client’s power and battery consumption.
But then why start work on AV2 before AV1 has even reached a majority of devices? I’m sure they have an answer, but they’re not sharing it here.
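For reference, the hardware-versus-software distinction speculated about above is roughly what the browser Media Capabilities API exposes: a client can ask not just whether AV1 decode is supported, but whether it is power-efficient (usually, though not strictly, hardware). A hypothetical sketch of such a check, not Netflix's actual client logic:

    // Hypothetical capability check, not Netflix's actual client logic.
    // decodingInfo() reports supported / smooth / powerEfficient;
    // powerEfficient usually (but not strictly) implies hardware decode.
    async function pickCodec(): Promise<"av1-hw" | "av1-sw" | "hevc-or-avc"> {
      const info = await navigator.mediaCapabilities.decodingInfo({
        type: "media-source",
        video: {
          contentType: 'video/mp4; codecs="av01.0.08M.08"', // AV1 Main, level 4.0, 8-bit
          width: 1920,
          height: 1080,
          bitrate: 2_000_000,
          framerate: 24,
        },
      });
      if (info.supported && info.powerEfficient) return "av1-hw";
      if (info.supported) return "av1-sw"; // works, but may cost battery on mobile
      return "hevc-or-avc";
    }

A player that only serves AV1 on the first branch would land on exactly the kind of hardware-gated rollout a 30% share suggests.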
- I understand that sometimes the HN titles get edited to be less descriptive and more generic in order to match the actual article title.
What’s the logic behind changing the title here from the actual article title it was originally submitted with, "AV1 — Now Powering 30% of Netflix Streaming", to the generic "AV1: a modern open codec" it currently has? That is neither the article title nor representative of the article content.
by endorphine
0 subcomment
- Is it just me, or does this post have LLM vibes?
- Top post without a single comment and only 29 points. Clearly my mental model of how posts bubble to the top is broken.