
305 points todsacerdoti | 8 comments
IgorPartola No.44061907
AV1 is an amazing codec. I really hope it replaces proprietary codecs like H.264 and H.265. It offers similar, if not better, compression than H.265 while being completely royalty-free. On an Intel-based MacBook it is currently only supported in some browsers, but newer video cards from AMD, Nvidia, and Intel do include hardware decoders.
1. karn97 No.44062089
The 9070 XT records gameplay in AV1 by default.
2. monster_truck No.44062596
RDNA 3 cards also have AV1 encode; RDNA 2 only has decode.

With the bitrate set to 100MB/s it happily encodes 2160p or even 3240p, the maximum resolution available when using Virtual Super Resolution (which renders above native resolution and downsamples; great for titles without resolution scaling when you don't want to use TAA).

3. kennyadam No.44063678
Isn't that excessive? 4K Blu-rays only encode up to about 128 Mbps, which is 16 MB/s. 100 MB/s seems like complete overkill.
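(The unit arithmetic in this comparison can be sanity-checked with a quick sketch; the 128 Mbit/s figure is the commenter's ballpark, not an official spec number.)

```python
# Bitrate unit conversion: 1 byte = 8 bits, so Mbit/s -> MByte/s divides by 8.
def mbit_to_mbyte_per_s(mbit_per_s: float) -> float:
    return mbit_per_s / 8

print(mbit_to_mbyte_per_s(128))  # 16.0 MByte/s, the 4K Blu-ray peak cited above
print(mbit_to_mbyte_per_s(800))  # 100.0 MByte/s, the rate claimed upthread
```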
4. vlovich123 No.44063998
I think the OP just didn't type Mbps properly. 100 MB/s (~800 Mbps) is far higher than the GPU's hardware encoder can even handle, I would think.
5. monster_truck No.44074043
100,000 kbps. It will more than double that for 3240p.

https://i.imgur.com/LyrhNXZ.png

6. monster_truck No.44074051
It isn't, given the amount of motion involved. Third-person views in rally simulators and continuous fast flicks in shooters require it.
7. rasz No.44078597
Is the encoder any better than previous AMD offerings?

https://goughlui.com/2024/01/07/video-codec-round-up-2023-pa...

8. vlovich123 No.44078841
Right. That’s 223642 kilobits/s (kbps) in your picture or ~200MBit/s whereas you wrote (intentionally or otherwise) 200Mbyte/s a nearly 10 fold difference (100Mbit/s =~ 12Mbyte/s). 100MByte/s is 800Mbit/s or ~800000 kbps which is an order of magnitude more insanity than already choosing 100Mbit/s for live streaming (and not physically possible on consumer GPUs I believe).