[FFmpeg-devel] [PATCH v3 1/1] avfilter/buffersink: Add video frame allocation callback

John Cox jc at kynesim.co.uk
Tue Jul 25 14:41:38 EEST 2023


On Sun, 23 Jul 2023 17:04:55 -0300, you wrote:

>On 7/23/2023 4:40 PM, Paul B Mahol wrote:
>> On Sun, Jul 23, 2023 at 9:26 PM Nicolas George <george at nsup.org> wrote:
>> 
>>> James Almer (12023-07-23):
>>>> What about when FF_FILTER_FLAG_HWFRAME_AWARE filters are present in the
>>>> graph? hw_frames_ctx from AVFilterLink can't be accessed from outside
>>> lavfi.
>>>> Is vf_hwdownload meant to be added to the graph before buffersink?
>>>
>>> I do not know how hardware acceleration works at all. (The tidbits of
>>> discussion I catch left me the impression all of it is very badly
>>> designed, but I have low confidence in that impression.) If this API
>>> only works with filters that output software frames, it is already very
>>> useful.
>>>
>> 
>> Patch is only marginally useful:
>> 
>> - missing audio support
>
>Trivially added if needed and when needed. alloc_cb is a union that can 
>get a new callback typedef field for audio.

Now added in v4
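
For illustration, the shape of that union could be roughly as below;
the typedef and field names are guesses, not the actual v4 patch:

    #include <libavutil/frame.h>

    /* Sketch only: not the names used by the real patch. */
    typedef int (*buffersink_alloc_video_frame_func)(void *opaque,
                                                     AVFrame *frame);
    typedef int (*buffersink_alloc_audio_frame_func)(void *opaque,
                                                     AVFrame *frame);

    typedef struct BufferSinkContext {
        /* ... existing buffersink fields ... */
        union {
            buffersink_alloc_video_frame_func video; /* video sinks */
            buffersink_alloc_audio_frame_func audio; /* audio sinks */
        } alloc_cb;            /* member selected by the sink's media type */
        void *alloc_cb_opaque; /* user pointer handed back to the callback */
    } BufferSinkContext;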

>> - missing full internal buffers allocation replacement support
>
>What is the benefit of supporting a custom allocator for all filters in 
>the chain? Internally, it's already using a very optimized buffer pool. 
>The caller only cares about how what they get out of buffersink is 
>allocated.

A case was made that you might want to tweak the number of buffers in
the pool, but I have yet to be convinced that the substantial extra
complexity of doing so would be worth it.

>> - missing/untested hardware acceleration support
>
>This however I agree about. We need to know how it will behave in this 
>scenario. How does buffersink currently handle things when the previous 
>filter in the chain propagates hardware frames?

It just delivers them to the user.
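
For illustration: a caller receiving hardware frames from buffersink
has to map or download them itself. A rough sketch (assuming
buffersink_ctx is the application's AVFilterContext for the sink, and
ignoring EAGAIN/EOF handling):

    #include <libavfilter/buffersink.h>
    #include <libavutil/frame.h>
    #include <libavutil/hwcontext.h>

    /* Pull the next frame; if it is a hardware frame, download it to a
     * software frame in 'out'.  Sketch only - error handling is minimal. */
    static int pull_frame(AVFilterContext *buffersink_ctx, AVFrame *out)
    {
        AVFrame *frame = av_frame_alloc();
        int ret;

        if (!frame)
            return AVERROR(ENOMEM);

        ret = av_buffersink_get_frame(buffersink_ctx, frame);
        if (ret >= 0) {
            if (frame->hw_frames_ctx)
                ret = av_hwframe_transfer_data(out, frame, 0); /* to system memory */
            else
                av_frame_move_ref(out, frame);
        }

        av_frame_free(&frame);
        return ret;
    }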

The case that this code was written for, and I suspect pretty much the
only case it will be used for, is where the user wants the frame in a
special buffer that can be passed directly to a display or other
hardware, without the overhead of copying.
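
As a sketch of that usage (the setter name
av_buffersink_set_alloc_video_frame_cb(), the callback signature and
the my_pool_* helpers below are all assumptions for illustration, not
necessarily the final API):

    #include <libavfilter/buffersink.h>
    #include <libavutil/frame.h>

    /* Application-owned pool of display-mappable buffers - hypothetical. */
    struct my_display_pool;
    int my_pool_get(struct my_display_pool *pool, AVFrame *frame);

    /* Hypothetical callback: fill frame->buf[]/data[]/linesize[] from the
     * application's pool, using the width/height/format already set on
     * the frame. */
    static int display_alloc_cb(void *opaque, AVFrame *frame)
    {
        return my_pool_get(opaque, frame);
    }

    static void install_alloc_cb(AVFilterContext *sink_ctx,
                                 struct my_display_pool *pool)
    {
        /* The setter name and signature are guesses at what the patch adds. */
        av_buffersink_set_alloc_video_frame_cb(sink_ctx, display_alloc_cb, pool);
    }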

Assuming that to be the case, there are two possibilities if the
final filter stage produces hardware frames:
a) the frames are already in the right sort of h/w buffer, in which
case no custom allocator is wanted - the code goes unused;
b) they need copying / conversion to the right sort of buffer - again,
custom allocation doesn't help.

So, given that, I could do the same as avcodec's ff_get_buffer():
if there is an existing h/w frame allocator then the user function
isn't called.
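
Concretely, the behaviour I have in mind is roughly the following
(sketch only; the function name and the alloc_cb fields, which follow
the union sketch above, are guesses rather than real code):

    #include <libavfilter/avfilter.h>
    #include <libavutil/frame.h>
    #include <libavutil/hwcontext.h>

    /* Sketch of the proposed logic inside buffersink, modelled on what
     * avcodec's ff_get_buffer() does for hardware frames. */
    static int alloc_output_video_frame(AVFilterLink *inlink, AVFrame *frame)
    {
        BufferSinkContext *s = inlink->dst->priv;

        if (inlink->hw_frames_ctx) {
            /* A h/w frame allocator already exists: the user callback is
             * not called and the existing path is used. */
            return av_hwframe_get_buffer(inlink->hw_frames_ctx, frame, 0);
        }

        if (s->alloc_cb.video)
            return s->alloc_cb.video(s->alloc_cb_opaque, frame);

        /* No callback installed: fall back to the default allocation. */
        return av_frame_get_buffer(frame, 0);
    }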

Would that be OK? If not, then I'd like some constructive suggestions
as to what exactly would be OK, please.

Regards

JC


>> 
>>> Regards,
>>>
>>> --
>>>    Nicolas George

