[FFmpeg-devel] [PATCH] lavfi/minterpolate: Allow bigger resolutions if SIZE_MAX is big
Carl Eugen Hoyos
ceffmpeg at gmail.com
Tue Mar 31 15:11:42 EEST 2020
> On 31.03.2020 at 12:24, Anton Khirnov <anton at khirnov.net> wrote:
>
> Quoting Carl Eugen Hoyos (2020-03-31 11:56:44)
>>> On Tue, 31 March 2020 at 11:18, Anton Khirnov <anton at khirnov.net> wrote:
>>>
>>> Quoting Carl Eugen Hoyos (2020-03-28 13:54:22)
>>>> Hi!
>>>>
>>>> The attached patch allows working around ticket #7140; it was tested on
>>>> a system with a lot of memory.
>>>
>>> This looks very ad-hoc.
>>
>> Is there another part of FFmpeg that rightfully allocates that much memory?
>
> Something decoding really big images?
Are you sure?
A quick calculation indicated a maximum allocation of about 1500 MB to me; what case were you thinking of?
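For reference, one way to land in that ballpark (just a sketch of a possible
estimate, not necessarily the calculation above; it assumes the frame size is
bounded by av_image_check_size(), which rejects anything much beyond
INT_MAX/8 pixels, and a 48-bit packed pixel format):

/* Back-of-the-envelope estimate, assumptions as described above. */
#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    uint64_t max_pixels = INT32_MAX / 8;   /* ~268 million pixels    */
    uint64_t bpp        = 6;               /* e.g. 48-bit packed RGB */
    printf("%"PRIu64" MiB\n", max_pixels * bpp / (1024 * 1024));
    /* prints 1535, i.e. roughly the 1500MB figure mentioned above */
    return 0;
}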
> 2GB is not really that much these
> days.
Please give an example.
>>
>>> The right thing to do is fix arbitrary limits in
>>> av_malloc_array().
>>
>>> That INT_MAX there looks wrong.
>>
>> I wonder if it is actually a good thing that demuxers and decoders cannot
>> allocate arbitrary amounts of memory...
>
> I see no valid reason why this specific function should have this
> specific limit, while other memory allocation functions have different
> limits.
I know of only one other function in FFmpeg that does this, but I believe it already respects max_alloc.
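For context, the two checks in question look roughly like this (a simplified
sketch of libavutil/mem.c, not the literal code): av_malloc() honours the
configurable max_alloc_size, while av_malloc_array() additionally hard-caps
the product at INT_MAX.

/* Simplified sketch of the checks being discussed. */
#include <limits.h>
#include <stdlib.h>

static size_t max_alloc_size = INT_MAX;    /* adjustable via av_max_alloc() */

void *av_malloc(size_t size)
{
    if (size > max_alloc_size)             /* the configurable limit */
        return NULL;
    return malloc(size);                   /* real code also handles alignment */
}

void *av_malloc_array(size_t nmemb, size_t size)
{
    if (!size || nmemb >= INT_MAX / size)  /* the hardcoded INT_MAX cap */
        return NULL;
    return av_malloc(nmemb * size);
}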
> Beyond that, I don't think we should have any arbitrary limits
> on what allocations are reasonable and what are not. There are many
> other, more appropriate, mechanisms for limiting memory usage.
I tend to disagree, especially as this does not affect any normal usage.
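(For reference, the mechanisms presumably meant above are the per-allocation
cap that callers can already set themselves, e.g. av_max_alloc() in the API
or -max_alloc on the ffmpeg command line, plus OS-level limits such as
ulimit. A minimal sketch:)

/* Sketch: set a hard per-allocation cap explicitly instead of relying on
 * the INT_MAX check inside av_malloc_array(). */
#include <libavutil/mem.h>

int main(void)
{
    av_max_alloc(512 * 1024 * 1024);  /* refuse any single block above 512 MiB */
    /* ... build and run the filtergraph as usual ... */
    return 0;
}

The command-line equivalent would be something like:
ffmpeg -max_alloc 536870912 -i input -vf minterpolate out.mkv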
Carl Eugen