[FFmpeg-devel] [PATCH] Frame rate emulation
Ramiro Ribeiro Polla
ramiro
Fri Jun 15 22:57:31 CEST 2007
Ramiro Ribeiro Polla wrote:
> Michael Niedermayer wrote:
>> Hi
>>
>> On Mon, Jun 11, 2007 at 11:01:50AM -0300, Ramiro Ribeiro Polla wrote:
>>
>>> Michael Niedermayer wrote:
>>>
>>>
>>>> Hi
>>>>
>>>> On Wed, Jun 06, 2007 at 03:31:03PM -0300, Ramiro Ribeiro Polla wrote:
>>>>
>>>>
>>>>
>>>>> Hello,
>>>>>
>>>>> x11grab provides its own frame rate emulation inside its
>>>>> read_frame function. Many other grab formats will need it (GDI,
>>>>> VFW...), so it would be best to implement format-level frame rate
>>>>> emulation once in FFmpeg rather than in each demuxer (also, I
>>>>> believe it doesn't belong in the demuxers themselves).
>>>>>
>>>>> FFmpeg already provides frame rate emulation (added for DV), but
>>>>> it is driven by the codec rather than working at the format level,
>>>>> as avcodec.h says:
>>>>> /**
>>>>>  * Frame rate emulation. If not zero, the lower layer (i.e. format handler)
>>>>>  * has to read frames at native frame rate.
>>>>>  * - encoding: Set by user.
>>>>>  * - decoding: unused
>>>>>  */
>>>>>
>>>>> It's in Fabrice's TODO "suppress rate_emu from AVCodecContext".
>>>>>
>>>>> I don't know DV's requirements for rate emulation, but couldn't it
>>>>> be moved to ffmpeg.c's AVOutputStream instead of AVCodecContext?
>>>>> It would then depend on the format and not on the codec.
>>>>>
>>>>> Also, I plan to add that to AVInputStream, right before
>>>>> av_read_frame, where it will be useful for grabbing demuxers.
>>>>>
>>>>> Can there be two places where rate emulation occurs (reading from
>>>>> the format, and writing to the codec)?
>>>>> Is it OK for DV if rate emulation happens when writing to the
>>>>> output format, so it's checked in AVOutputStream?
>>>>>
>>>>>
>>>> First, AVInputStream / AVOutputStream are private structs of ffmpeg.c.
>>>> libav* is used by more than just ffmpeg.c, so moving essential
>>>> functionality into ffmpeg.c is not OK.
>>>>
>>>>
>>>>
>>>>
>>> But ffmpeg.c is currently responsible for the actual frame rate
>>> emulation.
>>> Anyway, it's your decision, so I won't bother with output frame
>>> rate emulation anymore.
>>> I'll just leave that in Fabrice's TODO =)
>>>
>>>
>>>> Second, it's the input format's / demuxer's job to read the frames
>>>> properly; that includes reading them at the proper time for realtime
>>>> stuff. Sane capture APIs should provide the needed functionality to
>>>> capture at a specific time. Simply waiting in ffmpeg.c and then
>>>> calling the demuxer is an incredibly inaccurate way to do it; keep in
>>>> mind ffmpeg does encoding, writing and other things too, so using a
>>>> separate thread would probably be needed ...
>>>>
>>>>
>>>>
>>>>
>>> To avoid duplicating the same code in every grabbing format that
>>> requires it (only X11 for the moment, but VFW and GDI will also need
>>> it), is it OK to move the input frame rate emulation from x11grab.c
>>> to libavformat/utils.c under the name ff_rate_emu (or something
>>> similar)? Or should it be av_rate_emu?
>>>
>>
>> It should be moved into a new file so it's not compiled if no grabbing
>> interface is.
>>
>>
>
> Now that I have looked closer and implemented it, ffmpeg.c could also
> benefit from the same code, so it's best if it is unconditionally
> compiled (i.e. no need for a separate file that depends on grabbing
> interfaces). That being the case, is it acceptable to put it in
> utils.c? If not, where?
> There are two different implementations of rate emulation in FFmpeg:
> one in ffmpeg.c and the other in x11grab.c. Which one is preferred?
> The attached patch uses ffmpeg.c's.
> Is it possible for a codec to change time_base? I don't know whether
> it's best for time_base to be passed at initialization or on every
> call to av_rate_emu.
>
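For illustration only, here is a minimal sketch of what such a shared
helper could look like, following ffmpeg.c's approach of sleeping until
the wall clock catches up with the next frame's target time. The names
(RateEmuState, ff_rate_emu_wait) and the choice to pass time_base on
every call are assumptions made for this sketch, not the contents of
the attached patch:

    #include <stdint.h>
    #include <unistd.h>        /* usleep() */
    #include "avformat.h"      /* av_gettime(); pulls in av_rescale() and AVRational */

    typedef struct RateEmuState {
        int64_t start_time;    /* wall-clock time of the first frame, in usec */
        int64_t frame_count;   /* frames delivered so far */
    } RateEmuState;

    /* Block until it is time to deliver the next frame at the native rate.
     * time_base is taken on every call, so a mid-stream change would simply
     * shift the following frames; passing it once at init is the alternative
     * discussed above. */
    static void ff_rate_emu_wait(RateEmuState *s, AVRational time_base)
    {
        int64_t target, now;

        if (!s->frame_count)
            s->start_time = av_gettime();

        /* target offset of this frame: frame_count * time_base, in usec */
        target = av_rescale(s->frame_count * (int64_t)time_base.num,
                            1000000, time_base.den);
        now = av_gettime() - s->start_time;
        if (target > now)
            usleep(target - now);

        s->frame_count++;
    }

A grabbing demuxer's read_packet (or ffmpeg.c just before av_read_frame)
would then call ff_rate_emu_wait(&state, st->time_base) once per frame;
where the state lives (AVStream, AVFormatContext or the demuxer's private
context) is exactly the open question above.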
Ping.