With the bitrate set to 100 MB/s it happily encodes 2160p or even 3240p, the maximum resolution available when using Virtual Super Resolution (which renders above native resolution and downsamples; great for titles without resolution scaling when you don't want to use TAA).
The Scene shifted to h.264 successfully, but I haven't heard of any further conferences to move things forward in over a decade.
Currently "The Last of Us S02E06" only has one AV1 release - https://thepiratebay.org/search.php?q=The+Last+of+Us+S02E06 - same for The Handmaid's Tale - https://thepiratebay.org/search.php?q=The+Handmaids+Tale+S06... These are low quality at only ~600MB, not really early-adopter sizes.
AV1 beats h.265 but not h.266 - https://www.preprints.org/manuscript/202402.0869/v1 - though people dispute this paper's reliance on default encoder settings.
Things like getting hardware to The Scene for encoding might help, but I'm not sure what the bottleneck is; it might be bureaucratic, educational, or cultural.
[edit] "Common Side Effects S01E04" AV1 is the strongest torrent, that's cool - https://thepiratebay.org/search.php?q=Common+Side+Effects+S0...
AV1 hardware decoders are still rare so your device was probably resorting to software decoding, which is not ideal.
I don't know Instagram specifically, but I would expect any provider to handle almost any container/codec/resolution combination going (they likely use ffmpeg underneath) and to generate their different output formats at different bitrates for different playback devices.
Either Instagram won't accept AV1 (seems unlikely) or they just haven't processed it yet, as you suggest.
I'd love to know why your comment is greyed out.
There is one large exception, though I don't know the current scene well enough to know if it matters: grainy sources. I have some DVDs and Blu-rays with heavy grain, and AV1 can work wonders with those thanks to its in-loop grain filtering and synthesis; we are talking half the size for a high-quality encode. If I were to encode them with AVC at any reasonable bitrate, I would probably have to run a grain-removal filter first, which is very finicky if you don't want to end up with something overly blurry.
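For reference, grain synthesis can be enabled through SVT-AV1's encoder parameters. A minimal sketch, assuming an ffmpeg build with libsvtav1; the input filename, CRF, preset, and grain strength here are illustrative, not tuned values:

```shell
# Denoise before encoding and re-synthesize grain at decode time,
# instead of spending bitrate coding the grain itself.
# film-grain takes a strength from 0-50; film-grain-denoise=1 turns
# on the pre-encode denoising step.
ffmpeg -i grainy_source.mkv \
  -c:v libsvtav1 -preset 6 -crf 28 \
  -svtav1-params film-grain=10:film-grain-denoise=1 \
  -c:a copy output_av1.mkv
```

The decoder then applies the modeled grain on top of the clean reconstruction, which is why the size savings on grainy content can be so dramatic.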
In my case, I get both 4k (h.265) and 1080p (h.264) Blu-rays and let the client select.
The ITU standards have had a much better record of inclusion in devices people actually own, and using hardware encode/decode often takes care of licensing. But hardware encoding doesn't always match software on quality per bitrate, and may not be able to do fancier things like simulcast or SVC. Some of the hardware decoders are pretty picky about what kinds of streams they'll accept, too.
IMHO, if you're comparing software h.264 against software VP9, VP9 is likely to give you better quality at a given bitrate, but will take more CPU to do it. So, as always, it depends.
That's a pretty messy way to measure. h.264 with more CPU can also beat h.264 with less CPU.
How does the quality compare if you hold both bitrate and CPU constant?
How does the CPU compare if you hold both bitrate and quality constant?
AV1 will do significantly better than h.264 on both of those tests. How does VP9 do?
https://goughlui.com/2024/01/07/video-codec-round-up-2023-pa...