[FFmpeg-devel] A change in r_frame_rate values after upgrade to FFmpeg 6.1
Anton Khirnov
anton at khirnov.net
Mon Sep 23 18:04:19 EEST 2024
Quoting Kieran Kunhya via ffmpeg-devel (2024-09-23 16:45:30)
> On Mon, Sep 23, 2024 at 3:27 PM Anton Khirnov <anton at khirnov.net> wrote:
> >
> > Quoting Antoni Bizoń (2024-09-23 10:09:51)
> > > I understand that the r_frame_rate is the lowest framerate with which
> > > all timestamps can be represented accurately. And I know it is just a
> > > guess. But why did the logic behind the calculation change?
> >
> > Because you're most likely using a codec like H.264 or MPEG-2 that
> > allows individually coded fields. In that case the timebase must be
> > accurate enough to represent the field rate (i.e. double the frame
> > rate), but the code doing this was previously unreliable, so you'd
> > sometimes get r_frame_rate equal to the frame rate rather than field
> > rate. That is not the case anymore.
>
> This is bizarre and kafkaesque to say the least.
As far as I'm concerned, r_frame_rate is a mistake and should never have
existed. But since it does exist, it's better for the calculation to at
least not depend on whether the lavf internal decoder has been opened
during avformat_find_stream_info() or not (which used to be the case).
--
Anton Khirnov