[FFmpeg-devel] [PATCH] Revert "avfilter/vf_palette(gen|use): support palettes with alpha"

Clément Bœsch u at pkh.me
Tue Nov 1 12:18:14 EET 2022


On Mon, Oct 31, 2022 at 10:58:57PM +0100, Michael Niedermayer wrote:
[...]
> > You have to make a call between whether you want to preserve the
> > transparency or the color while constructing the palette, but when
> > choosing a color you must absolutely not choose a color with a different
> > transparency, you must pick amongst the closest alpha, with a particular
> > attention to extreme alphas: an opaque color must stay opaque, and a fully
> > transparent one as well:
> > - rounding a color with 43% alpha into 50% alpha is acceptable
> > - rounding a color with 100% alpha into a 99% alpha is not acceptable in
> >   any way because you're starting to make transparent areas that weren't
> > - rounding a color with 0% alpha into a 1% alpha is not acceptable because
> >   some areas of the image are now starting to blend into an area that was
> >   supposedly non-existent
> 
> really ?
> so if I have all shades of green available for all transparencies from 1% to 99%
> I "must" make my plants all use 0% transparency even if I only have a single color and
> that is bright pink

I believe so, because you don't know how the alpha channel is going to be
used in the user's pipeline. The goal of the palette filters is to quantize
colors, not to mess up the alpha channel. It's better for these filters to
be very bad at quantizing colors while preserving the alpha as faithfully
as possible, than to give the illusion that the colors are great while
massively messing up the alpha channel (which is what currently happens).

BTW, the colors are not even premultiplied, so in the current state it
just makes no sense at all: we are comparing colors with different alpha
channels even though we have no idea how they will look once blended.

> There are perceptual differences between the different areas of the RGBA hypercube
> though. Hardly anyone would notice the difference between a 255 and 254 blue but
> having some slight transparency might be noticeable.

It's noticeable late: that is, when your assets reach the blending stage,
which is the worst user experience you can provide.

Just imagine: the user quantizes their files, thinking transparency is
going to be preserved. Blended with a black background it appears to be
somewhat OK (see softworkz's screenshot), so they start using it on their
website. Everything looks fine. Then, a few months later, the user decides
to change the black background to a brighter color: all the images suddenly
reveal their destroyed alpha channel, with artifacts everywhere.

> These different weights in different areas could maybe be considered in palette*
> and elbg, it likely would improve things. OTOH heuristics like "always" and "never"
> feel like they might become a lot of work to tune. I think it's better to attempt
> to achieve a similar goal with less hard and more perceptual scoring

Working on perception can only be done with colors; you cannot jam the
alpha in, it's another dimension entirely. So you first have to work with
premultiplied data, and then you need to score the alpha separately.


-- 
Clément B.


More information about the ffmpeg-devel mailing list