[FFmpeg-devel] What is FFmpeg and what should it be

Kieran Kunhya kierank at obe.tv
Sat Aug 5 02:17:12 EEST 2023


On Fri, 4 Aug 2023, 13:35 Nicolas George, <george at nsup.org> wrote:

> Michael Niedermayer (12023-08-04):
> > Everything is there for a reason.
> > Every part of mp4 has a use; still, we extract the data and set up
> > various structs like AVStream, AVPacket, AVProgram and so on.
> > We do not return raw mp4/mov atoms.
> > The separation between programs in a stream of bits/bytes loses meaning
> > once the frames are in AVPackets with AVStream/AVProgram.
> > If there is more data in any framing that people want, there is a wide
> > range of ways to preserve and export that data.
> > OTOH, outputting AAC in TS or AAC in other framing is painful to handle,
> > especially when it is muxed into something again, because it then needs
> > the right framing; and even if it comes in as DAB framing and the output
> > wants DAB framing, it is unlikely everything in the framing will be
> > correct for the output.
> > The same is true for TS. I surely can take raw TS from 3 programs, but
> > if I just take these and concatenate them into something that supports
> > TS, that is quite likely going to blow up somehow.
> > All this framing stuff should IMHO be "removed" on the demuxer side and
> > the useful data extracted and properly exported. And if anything on the
> > muxing side needs something similar, it needs to rebuild it all.
> >
> > I may be missing something, but I don't think the raw framing is too
> > useful to the user.
>
> I recommend you do what feels most simple, most elegant, or most
> logical, whichever feels right.
>
> If somebody else, or you later, finds a use for the framing, the code
> that removes it can be turned into code that extracts information from
> it or reshapes it. If and when that happens is the right time to figure
> out how to bring that information to the user, because that is when what
> is necessary will be known.
>
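
For reference, the caller-side view Michael describes above (demuxers
handing back AVPackets attached to AVStreams regardless of what container
framing the input used) looks roughly like this. This is only a minimal
sketch against the public libavformat API, with most error handling
omitted:

#include <stdio.h>
#include <inttypes.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

int main(int argc, char **argv)
{
    AVFormatContext *fmt = NULL;
    AVPacket *pkt = av_packet_alloc();

    if (argc < 2 || !pkt)
        return 1;
    if (avformat_open_input(&fmt, argv[1], NULL, NULL) < 0)
        return 1;
    if (avformat_find_stream_info(fmt, NULL) < 0)
        return 1;

    /* Whatever the input container was (mp4, TS, ...), the demuxer hands
     * back the same AVPacket/AVStream view: the container framing is
     * already gone by the time the caller sees the data. */
    while (av_read_frame(fmt, pkt) >= 0) {
        const AVStream *st = fmt->streams[pkt->stream_index];
        printf("stream %d (%s): pts %" PRId64 ", %d bytes\n",
               pkt->stream_index,
               avcodec_get_name(st->codecpar->codec_id),
               pkt->pts, pkt->size);
        av_packet_unref(pkt);
    }

    av_packet_free(&pkt);
    avformat_close_input(&fmt);
    return 0;
}

Nothing in that loop depends on whether the input was mp4, TS or anything
else; that is the abstraction under discussion.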

Literally in this thread, someone has countered all your points by wanting
TCP replay (a form of framing). If you design a bad API around the simple
case, the edge cases (which have a tendency to make it into FFmpeg) will
immediately need hacks to support them.

There are plenty of examples of this, such as wrapped_avframe.
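
To give a concrete flavour of the special casing that forces on callers,
a check like the following ends up being needed wherever packets are
handled generically (a minimal sketch; the helper name is mine, not
anything in the tree):

#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

/* Returns 1 when pkt cannot be treated as ordinary coded data: a
 * wrapped_avframe packet carries a raw AVFrame rather than a bitstream,
 * so generic "an AVPacket holds coded bytes" code paths break on it. */
static int needs_special_case(const AVFormatContext *fmt, const AVPacket *pkt)
{
    const AVStream *st = fmt->streams[pkt->stream_index];
    return st->codecpar->codec_id == AV_CODEC_ID_WRAPPED_AVFRAME;
}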

Kieran
