[FFmpeg-devel] [PATCH] Revert "avfilter/vf_palette(gen|use): support palettes with alpha"

Clément Bœsch u at pkh.me
Mon Oct 31 12:57:16 EET 2022


On Mon, Oct 31, 2022 at 01:43:11AM +0000, Soft Works wrote:
[...]
> > > > The patch I had submitted doesn't change the previous behavior
> > > > without the use_alpha parameter.
> > 
> > Yes I noticed, but unfortunately I'm reworking the color distance to
> > work in perceptual color space, and the way that alpha is mixed up in
> > the equation just doesn't make any sense at all and prevents me from
> > doing these changes.
> 
> If you want to implement a new color distance algorithm, it should 
> be either a new filter or a new (switchable) mode for the existing 
> filter.

Why?

> Photoshop has these different modes as well and it would 
> surely be useful, but I don't think it should be replacing the
> existing behavior.
> 

There is no point in keeping a ton of complexity exposed as user options
for something implementation-specific. We offer no guarantee over how the
quantization is expected to run.

> When it turns out that the use_alpha implementation doesn't fit with
> your new color distance calculation and you add it as an additional
> mode, then it would be fine IMO if the filter errors when that mode is
> attempted in combination with use_alpha.

IMO the use_alpha option shouldn't exist in the first place: honoring the
alpha is the correct thing to do, so it should be the default behaviour.
That's not what the option currently does, though.

> > > Do you think it might make sense to put more weight on the
> > > alpha value by tripling it? So it would be weighted equally to the
> > > RGB value?
> > 
> > You cannot mix alpha with colors at all, they are separate domains
> > and you
> > need to treat them as such.
> 
> What's interesting is that I've followed the same (simplified)
> way for adding a use_alpha option to vf_elbg and it provides excellent
> results without treating alpha separately.

I don't know how that filter works or what it's supposed to do, but if
it's indeed using the same approach as the palette filters, it cannot work.

> > From the paletteuse perspective, what you need to do is first choose
> > the colors in the palette that match the alpha exactly (or at least
> > the closest alpha, if and only if there is no exact match). Then
> > within that set, and only within that one, you'd pick the closest
> > color.
> > 
> > From the palettegen perspective, you need to split the colors into
> > different transparency domains (a first dimensional quantization),
> > then quantize the colors within each quantized alpha dimension. And
> > when you have all your quantized palettes for each level of alpha,
> > you find an algorithm to reduce the number of transparency dimensions
> > or the number of colors per dimension to make it fit inside a single
> > palette. But you can't just do the alpha and the colors at the same
> > time; it cannot work, whatever weights you choose.
> 
> I would be curious to see how well that would work, especially
> in cases where the target palettes have just a small number of colors.
> 

You have to make a call between preserving the transparency or the color
while constructing the palette, but when choosing a color you must
absolutely not choose one with a different transparency: you must pick
amongst the entries with the closest alpha, with particular attention to
the extreme alphas (see the sketch after this list). Opaque colors must
stay opaque, and fully transparent ones must stay fully transparent:
- rounding a color with 43% alpha to 50% alpha is acceptable
- rounding a color with 100% alpha to 99% alpha is not acceptable in any
  way, because you start introducing transparency in areas that had none
- rounding a color with 0% alpha to 1% alpha is not acceptable, because
  areas of the image start to blend into content that was supposed to be
  invisible
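
To make that concrete, here is a rough, hypothetical sketch of the
selection rule on the paletteuse side. None of the structures or names
below come from the actual filter, and the color distance is a mere
placeholder: the point is only the two-step search, where the candidate
set is restricted by alpha before any color comparison happens.

#include <limits.h>
#include <stdint.h>
#include <stdlib.h>

/* Hypothetical 8-bit RGBA palette entry; not the actual paletteuse structs. */
struct rgba { uint8_t r, g, b, a; };

/* Placeholder color distance (squared RGB error); the real filter has its own metric. */
static int color_dist2(const struct rgba *x, const struct rgba *y)
{
    const int dr = x->r - y->r, dg = x->g - y->g, db = x->b - y->b;
    return dr*dr + dg*dg + db*db;
}

/*
 * 1) keep only the palette entries whose alpha is closest to the pixel's
 *    alpha (an exact match when one exists), and never let a fully opaque
 *    or fully transparent pixel map to anything else;
 * 2) within that restricted set only, pick the closest color.
 */
static int find_index(const struct rgba *pal, int nb_pal, const struct rgba *px)
{
    int best_da = INT_MAX, best_dc = INT_MAX, best = -1;

    for (int i = 0; i < nb_pal; i++) {
        const int da = abs(pal[i].a - px->a);

        /* extreme alphas must be preserved exactly */
        if ((px->a == 255 && pal[i].a != 255) ||
            (px->a ==   0 && pal[i].a !=   0))
            continue;

        if (da > best_da)
            continue;
        if (da < best_da) {   /* strictly closer alpha: restart the color search */
            best_da = da;
            best_dc = INT_MAX;
        }

        const int dc = color_dist2(&pal[i], px);
        if (dc < best_dc) {
            best_dc = dc;
            best = i;
        }
    }
    return best;   /* -1 only if no entry satisfies the alpha constraint */
}

On the palettegen side, the corresponding step would be to quantize the
alpha levels first, run the color quantization separately within each
level, and only then merge the per-level palettes to fit the single
palette budget, as described above.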

> But to return to the proposal of removal: If everything from ffmpeg
> would be removed which is not perfect, then it would be lacking
> quite a number of features I suppose :-)

We're not talking about perfection; we're talking about files with
artifacts. It's almost as bad as a corrupted file: if it's used in a
pipeline where transparency matters, you're going to get completely
broken output.

> In the same way, one could say that palettegen/-use should be removed
> because its results are wrong and colors are randomly mixed and 
> misplaced while the vf_elbg filter does it right.
> When you look at the result under the heading
> 
> "Paletteuse/gen Regular (to 8-bit non-alpha palette; only single 
> transparent color)"
> https://gist.github.com/softworkz/deef5c2a43d3d629c3e17f9e21544a8f?permalink_comment_id=3905155#gistcomment-3905155
> 
> Even without the alpha, many color pixels appear to be wrong and
> random like for example the light purple pixels on the darker purple
> at the bottom of the "O". That's not much different from irregularities
> in the alpha channel you've shown (https://imgur.com/a/50YyRGV).
> So, I agree that it's not perfect, but the whole filter is not perfect
> (vf_elbg is close to it). Do we remove the filter because it's not
> perfect?

The colors are indeed wrong because the filter has some *incorrect*
computations (it works in sRGB space, and the MSE is wrong, at least);
these should be corrected (I'm working on it). I'm also improving the
quality by working in a perceptual color space.
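
For reference, a minimal sketch of what "perceptual space" could mean
here, e.g. converting 8-bit sRGB to CIE L*a*b* (D65 white point) and
using a squared delta E 1976 distance. This is only an illustration; the
actual rework may use a different space or metric.

#include <math.h>
#include <stdint.h>

/* 8-bit sRGB component to linear light */
static double srgb_to_linear(uint8_t c8)
{
    const double c = c8 / 255.0;
    return c <= 0.04045 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4);
}

static double lab_f(double t)
{
    /* CIE constants: epsilon = 216/24389, kappa = 24389/27 */
    return t > 216.0 / 24389.0 ? cbrt(t) : (24389.0 / 27.0 * t + 16.0) / 116.0;
}

/* 8-bit sRGB to CIE L*a*b*, D65 white point */
static void srgb_to_lab(uint8_t r8, uint8_t g8, uint8_t b8, double lab[3])
{
    const double r = srgb_to_linear(r8);
    const double g = srgb_to_linear(g8);
    const double b = srgb_to_linear(b8);

    /* linear sRGB -> XYZ, normalized by the D65 white point */
    const double x = (0.4124564*r + 0.3575761*g + 0.1804375*b) / 0.95047;
    const double y =  0.2126729*r + 0.7151522*g + 0.0721750*b;
    const double z = (0.0193339*r + 0.1191920*g + 0.9503041*b) / 1.08883;

    const double fx = lab_f(x), fy = lab_f(y), fz = lab_f(z);
    lab[0] = 116.0*fy - 16.0;
    lab[1] = 500.0*(fx - fy);
    lab[2] = 200.0*(fy - fz);
}

/* squared delta E 1976: plain Euclidean distance in Lab */
static double de76_sq(const double c1[3], const double c2[3])
{
    const double dl = c1[0]-c2[0], da = c1[1]-c2[1], db = c1[2]-c2[2];
    return dl*dl + da*da + db*db;
}

A Euclidean distance in such a space is a much better proxy for perceived
color difference than an MSE computed directly on gamma-encoded sRGB
values.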

> As mentioned above, if you want to add an additional mode for 
> calculating the color distance, it's fine when it doesn't 
> work with use_alpha IMO.

That's not how it's going to work, sorry; I'm not going to increase
complexity and maintenance effort for no gain. Implementing correct
support for alpha will likely involve a revert of that commit anyway.

Note that if I had been active at the time this patch was submitted, I
would certainly have rejected it in this state. So it's my fault, but I'm
working on fixing it.

-- 
Clément B.

