[FFmpeg-devel] [RFC] Issue with "standard" FPS/timebase detection
Jason Garrett-Glaser
Thu Feb 11 10:18:18 CET 2010
Test case:
http://stfcc.org/misc/fraps.fps.test.zip
In the "101fps" file, tb_unreliable gets triggered because of the
framerate being >= 101. The if(duration_count[i] &&
tb_unreliable(st->codec) code in libavformat then proceeds to run,
resulting in a framerate of 1/12, which obviously doesn't make any
sense whatsoever and is rather broken.
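
For context, the check that flags the timebase as unreliable is
roughly of the following shape -- a simplified sketch based on my
reading, not the verbatim libavformat function; the 101x threshold is
the part the 101fps sample trips:

    #include <stdio.h>

    /* Simplified sketch, not the verbatim libavformat code: a timebase
     * whose implied framerate (den/num) is 101 or higher, or
     * suspiciously low, is considered unreliable. */
    static int tb_unreliable_sketch(int tb_num, int tb_den)
    {
        return tb_den >= 101LL * tb_num ||  /* framerate >= 101 fps */
               tb_den <    5LL * tb_num;    /* framerate <   5 fps  */
    }

    int main(void)
    {
        printf("1/101 timebase unreliable: %d\n", tb_unreliable_sketch(1, 101));
        printf("1/25  timebase unreliable: %d\n", tb_unreliable_sketch(1, 25));
        return 0;
    }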
I understand the basic idea behind what the duration-error code is
doing, but I don't understand why it gives such a weird result.
The primary culprit seems to be the multiplication in

    double error = duration_error[i][j] * get_std_framerate(j);

which biases the result towards very small framerate values. But even
if I remove that factor, it still comes up with an extremely weird fps
(695/12). Something is off in the error calculation, but I don't
understand it well enough to judge what it is.
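
To illustrate the bias I mean, here is a standalone toy program. The
candidate framerates and error values are made up for illustration;
only the "scale the accumulated error by the candidate framerate"
step mirrors the line quoted above. Because of that scaling,
high-rate candidates pay a penalty roughly proportional to their
rate, so a tiny candidate can win even when its raw error is by far
the largest:

    #include <stdio.h>

    int main(void)
    {
        /* Hypothetical candidate framerates and per-candidate accumulated
         * duration errors -- the numbers are invented for illustration. */
        const double std_framerate[]  = { 1.0/12, 24000.0/1001, 25, 30, 60, 120 };
        const double duration_error[] = { 0.040,  0.0020, 0.0030, 0.0025, 0.0010, 0.0008 };
        const int n = sizeof(std_framerate) / sizeof(std_framerate[0]);
        int best = -1;
        double best_error = 1e9;

        for (int j = 0; j < n; j++) {
            /* The multiply in question: the error is weighted by the
             * candidate framerate before comparison. */
            double error = duration_error[j] * std_framerate[j];
            if (error < best_error) {
                best_error = error;
                best = j;
            }
        }

        /* Picks the 1/12 fps candidate even though its raw duration
         * error (0.040) is by far the largest. */
        printf("picked %.4f fps (weighted error %.5f)\n",
               std_framerate[best], best_error);
        return 0;
    }

With those made-up numbers the loop settles on 1/12 fps even though
every other candidate fits the measured durations better, which is
the same flavor of result I'm seeing with the real file.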
As this is Michael's code, I figure he can probably offer the best
insight on what's going on here.
Dark Shikari