[FFmpeg-devel] [PATCH 2/8] png: Don't fail when a packet is larger than INT_MAX
wm4
nfxjfg at googlemail.com
Mon Mar 30 18:02:35 CEST 2015
On Mon, 30 Mar 2015 17:47:03 +0200
Michael Niedermayer <michaelni at gmx.at> wrote:
> On Mon, Mar 30, 2015 at 05:11:05PM +0200, wm4 wrote:
> > On Mon, 30 Mar 2015 13:49:08 +0000
> > Donny Yang <work at kota.moe> wrote:
> >
> > > On 30 March 2015 at 02:48, Michael Niedermayer <michaelni at gmx.at> wrote:
> > >
> > > > On Sun, Mar 29, 2015 at 11:05:41AM +0000, Donny Yang wrote:
> > > > > Signed-off-by: Donny Yang <work at kota.moe>
> > > > > ---
> > > > > libavcodec/pngenc.c | 2 --
> > > > > 1 file changed, 2 deletions(-)
> > > > >
> > > > > diff --git a/libavcodec/pngenc.c b/libavcodec/pngenc.c
> > > > > index 3697dbb..bd3aae5 100644
> > > > > --- a/libavcodec/pngenc.c
> > > > > +++ b/libavcodec/pngenc.c
> > > > > @@ -373,8 +373,6 @@ static int encode_frame(AVCodecContext *avctx, AVPacket *pkt,
> > > > > enc_row_size +
> > > > > 12 * (((int64_t)enc_row_size + IOBUF_SIZE - 1) / IOBUF_SIZE) // 12 * ceil(enc_row_size / IOBUF_SIZE)
> > > > > );
> > > > > - if (max_packet_size > INT_MAX)
> > > > > - return AVERROR(ENOMEM);
> > > >
> > > > the check is necessary to prevent potential integer overflows
> > > >
> > >
> > > Doesn't ffmpeg support memory allocations greater than 4 GiB?
> > > I thought it did because the memory allocation functions accept either an
> > > int64_t or a size_t...
> >
> > No.
>
> That is false, the maximum allocation size is set by av_max_alloc()
> its INT_MAX by default but can be increased.
This is a library-unsafe hack that shouldn't exist. Access to it isn't
even synchronized, which is terrible.
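For reference, a minimal sketch of my own showing how that cap is raised
today (assuming it runs once at startup, before any other thread touches
libavutil):

#include <stdint.h>
#include <libavutil/mem.h>

int main(void)
{
    /* Raise the global allocation cap from its INT_MAX default.
     * The cap is a plain global inside libavutil with no locking,
     * so this has to happen before any other thread allocates. */
    av_max_alloc(SIZE_MAX);

    /* A 3 GiB allocation now succeeds if the system has the memory. */
    uint8_t *buf = av_malloc((size_t)3 << 30);
    if (!buf)
        return 1;
    av_free(buf);
    return 0;
}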
> The reason for it being INT_MAX by default is that it keeps broken
> media streams from allocating insane amounts of memory, and by limiting
> it below the 32-bit range it protects against integer overflows in
> places that use an int to store a size or index.
There are probably a bunch of situations where broken streams can do
this anyway, e.g. by creating oversized linked lists or arrays of
pointers to big data structures (consider an array of AVPackets), so I
can't take this argument quite seriously.
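For concreteness, the overflow class being described looks roughly like
this (a contrived sketch of my own, not actual FFmpeg code):

#include <inttypes.h>
#include <limits.h>
#include <stdio.h>

/* Contrived illustration of the overflow class the INT_MAX cap guards
 * against: a 64-bit size silently truncates when it lands in an int,
 * e.g. a field like AVPacket.size or a plain int index. */
int main(void)
{
    int64_t needed = (int64_t)INT_MAX + 1;  /* 2 GiB, one past the int range */
    int     stored = (int)needed;           /* implementation-defined truncation */

    printf("needed %"PRId64" bytes, stored as %d\n", needed, stored);
    return 0;
}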
Is the dimension limit of ca. 16000x16000 on images "intentional" too?
(This actually prevents ffmpeg from being used to read large images.)
>
> > The codebase is so rotten,
>
> The mailing lists and IRC channels exist to foster the development and
> use of FFmpeg. Non-constructive or off-topic messages, along with
> other abuses, are not welcome.
>
>
> > libavutil/mem.c even limits memory
> > allocations to INT_MAX.
>
> this is misleading; it limits allocations to a user-settable variable
> that has a default of INT_MAX
>
>
> > In your specific case the problem is that
> > AVPacket.size is an int.
>
> yes and no.
> yes, AVPacket.size could be changed to int64_t, but that would need major
> ABI version and soname bumps of all the libs. It's something that
> could be done during the next such bump.
>
> but that is missing a point: AVPackets of >4GiB are very unwieldy
> and would be very awkward to handle in applications
>
> If we do add support for frames with more than 4 GiB of samples, then maybe
> a system other than storing all of it in a single array in memory
> should be considered.
I suggest 5 dimensional storage.
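More seriously: as long as AVPacket.size stays a plain int, the cap is
baked into the struct itself. A trivial sketch of my own (nothing
pngenc-specific) of where the limit sits:

#include <limits.h>
#include <stdio.h>
#include <libavcodec/avcodec.h>

/* The payload size of a packet is a plain int, so no encoder can hand
 * back more than INT_MAX bytes in a single AVPacket, regardless of
 * what the allocator would permit. */
int main(void)
{
    AVPacket pkt;
    av_init_packet(&pkt);
    printf("sizeof(pkt.size) = %zu -> at most %d bytes per packet\n",
           sizeof(pkt.size), INT_MAX);
    return 0;
}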