[FFmpeg-devel] [PATCH] avcodec/nvenc: High bit depth encoding for HEVC
Diego Felix de Souza
ddesouza at nvidia.com
Fri Apr 19 11:33:23 EEST 2024
Hi Roman and Timo,
Timo is right. As a general rule, hybrid video coding standards allow encoders to take advantage of encoding
an 8-bit input as 10-bit, because the interpolation filters (inter and intra) and transform coding then operate at 10-bit precision.
This can yield better predictions and reduce banding artifacts in smooth gradient areas, e.g., in the sky.
In the particular case of the NVIDIA Video Codec SDK, we do a simple 8 bit > 10 bit conversion. No SDR > HDR
conversion is performed. Because the video is encoded at 10 bits, it results in better de-correlation and hence
better compression at the same quality. We have observed ~3-5% BD-rate savings due to this feature.
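For reference, the plain expansion amounts to a left shift by two bits per sample; a minimal sketch of the idea (not the actual driver code, which operates on device memory):

    #include <stddef.h>
    #include <stdint.h>

    /* Widen 8-bit samples to 10-bit ones stored in uint16_t,
     * LSB-aligned as in yuv420p10: 0..255 maps to 0..1020.
     * No tone mapping or SDR > HDR processing takes place. */
    static void expand_8_to_10(const uint8_t *src, uint16_t *dst, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            dst[i] = (uint16_t)(src[i] << 2);
    }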
Although you are right that the same could be accomplished with an external filter, I would still humbly ask
you to consider including this patch in FFmpeg. Besides the fact that this patch, as I explained before, is a
more efficient way to achieve the same result in terms of memory accesses and storage, the same feature is already
supported in FFmpeg for AV1 (av1_nvenc). Hence, it would not make sense for the user to do this one way
for AV1 and another way for HEVC.
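For example, av1_nvenc already exposes this as its highbitdepth option; assuming the HEVC patch mirrors that option name, usage would look like:

    ffmpeg -i input.mp4 -c:v hevc_nvenc -highbitdepth 1 output.mp4

i.e. the user feeds ordinary 8-bit frames and the encoder emits a 10-bit (main10) bitstream, with no external filter and no extra 10-bit copy of the frames in between.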
Best regards,
Diego
On 19.04.24, 09:39, "Roman Arzumanyan" <r.arzumanyan at visionlabs.ai> wrote:
Thanks for the explanation, Timo!
I was hoping that the 8 > 10 bit up-conversion which happens in the driver might bring some extra benefit, like the SDR > HDR conversion recently presented by NVIDIA, or some other algorithm that is easier to keep proprietary.
Otherwise, although it is convenient in some use cases, it doesn't look more tempting than, say, a similar 8 > 10 bit NPP up-conversion, which should yield the same (presumably speed-of-light) performance.
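Such an up-conversion with NPP would just be a widening copy plus a shift; a rough sketch (assuming device buffers and LSB-aligned 10-bit output; P010 would instead need a shift of 8, since it keeps the 10 significant bits MSB-aligned):

    #include <nppi.h>

    /* Widen one 8-bit plane to 16-bit, then shift left by 2 to get
     * 10-bit samples. Steps are in bytes, buffers live on the GPU. */
    static NppStatus plane_8u_to_10u(const Npp8u *src, int src_step,
                                     Npp16u *dst, int dst_step,
                                     int width, int height)
    {
        NppiSize roi = { width, height };
        NppStatus st = nppiConvert_8u16u_C1R(src, src_step, dst, dst_step, roi);
        if (st != NPP_SUCCESS)
            return st;
        return nppiLShiftC_16u_C1IR(2, dst, dst_step, roi);
    }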
Thu, 18 Apr 2024 at 16:32, Timo Rothenpieler <timo at rothenpieler.org>:
On 18/04/2024 14:29, Roman Arzumanyan wrote:
> Hi Diego,
> Asking for my own education.
>
> As far as you've explained, the 8 > 10 bit conversion happens within the
> driver, that's understandable.
> But how does it influence the output? Does it perform some sort of
> proprietary SDR > HDR conversion under the hood that maps the ranges?
> What's going to be the user-observable difference between these two scenarios?
> 1) 8 bit input > HEVC 8 bit profile > 8 bit HEVC output
> 2) 8 bit input > 10 bit up conversion > HEVC 10 bit profile > 10 bit
> HEVC output
>
> Better visual quality? Smaller compressed file size?
> In other words, what's the purpose of this feature except enabling new
> Video Codec SDK capability?
Video codecs tend to be more efficient at 10 bit, even if it's just 8
bit content that's been up-converted to 10 bit.
I.e. yes, it'll (or can, at least; not sure if it's a given) produce
smaller/higher-quality output for the same input.
As for the exact reason, I can't explain it, but it's a well-known concept.