[FFmpeg-devel] [PATCH] Incorrect Ogg Theora duration
Måns Rullgård
mans at mansr.com
Sun Apr 20 14:28:55 CEST 2008
Henrik Gulbrandsen <henrik at gulbra.net> writes:
> On Fri, 2008-04-18 at 00:45 -0700, Baptiste Coudurier wrote:
>> I think the "start" value is incorrect. According to specs, ogg
>> granule stores the pts plus the duration of the ogg packet, so last
>> packet granule is the stream duration.
>
> As mentioned previously, Ogg Theora bitstreams prior to version 3.2.1
> don't follow this rule. Since libtheora uses 3.2.0 for versions before
> 1.0beta1, I think there is a good case for being backwards compatible.
How many such files exist in the wild? Given that I've never seen a
wild Theora file, other than Xiph promotional stuff, I reckon there
can't be that many.
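
For reference, the distinction in question is visible in the Theora
identification header itself. A minimal sketch (my own helper, not oggdec
code) of checking whether a stream uses the old granule numbering, with the
byte offsets taken from the Theora spec:

#include <stdint.h>
#include <string.h>

/* Sketch: decide from the Theora identification header whether the old
 * (pre-3.2.1) granule numbering applies.  Layout per the Theora spec:
 * 0x80 "theora", then VMAJ, VMIN, VREV, one byte each. */
static int theora_uses_old_granule_numbering(const uint8_t *pkt, size_t len)
{
    if (len < 10 || pkt[0] != 0x80 || memcmp(pkt + 1, "theora", 6))
        return -1;                                   /* not an ID header */

    int vmaj = pkt[7], vmin = pkt[8], vrev = pkt[9];

    /* 3.2.0 and earlier (libtheora before 1.0beta1) store the index of
     * the last frame in the granule position; 3.2.1 and later store the
     * frame count, i.e. the packet's end time. */
    return vmaj < 3 || (vmaj == 3 && (vmin < 2 || (vmin == 2 && vrev < 1)));
}
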
> On Fri, 2008-04-18 at 09:10 +0100, Måns Rullgård wrote:
>> > Index: libavformat/oggdec.c
>> > ===================================================================
>> > --- libavformat/oggdec.c (revision 12884)
>> > +++ libavformat/oggdec.c (working copy)
>> > @@ -307,6 +307,7 @@ ogg_packet (AVFormatContext * s, int *st
>> > ogg_stream_t *os;
>> > int complete = 0;
>> > int segp = 0, psize = 0;
>> > + uint64_t start = url_ftell(s->pb);
>> >
>> > #if 0
>> > av_log (s, AV_LOG_DEBUG, "ogg_packet: curidx=%i\n", ogg->curidx);
>> > @@ -368,6 +369,7 @@ ogg_packet (AVFormatContext * s, int *st
>> > if (os->header < 0){
>> > int hdr = os->codec->header (s, idx);
>> > if (!hdr){
>> > + url_fseek (s->pb, start, SEEK_SET);
>> > os->header = os->seq;
>> > os->segp = segp;
>> > os->psize = psize;
>>
>> This looks very wrong.
>
> Interleaved multi-page packets. Right. That is a potential problem if
> later versions of the spec allow it, but somehow I doubt that this was
> your point.
> Could you please clarify? This particular line is only meant to run
> precisely when the first data packet has already been fetched and is
> about to be discarded after failing to be accepted as a header packet.
I still don't understand. Why do you seek back in the file after
reading the headers? It doesn't make any sense to me.
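
The pattern the hunk seems to be aiming for, sketched here with plain stdio
and made-up names rather than the oggdec internals: note the offset before
handing a candidate packet to the header parser, and rewind if the parser
says it is data, so the packet is not lost.

#include <stdint.h>
#include <stdio.h>

/* Sketch of the save/rewind idea: remember where the candidate packet
 * started, and if it turns out not to be a header, seek back so the same
 * bytes can be re-read as the first data packet.  is_header() stands in
 * for the codec-specific header check. */
static int parse_or_rewind(FILE *f, int (*is_header)(const uint8_t *, size_t))
{
    long start = ftell(f);
    uint8_t buf[4096];
    size_t n = fread(buf, 1, sizeof(buf), f);

    if (n == 0)
        return -1;                      /* EOF or read error */

    if (!is_header(buf, n)) {
        fseek(f, start, SEEK_SET);      /* hand the packet back untouched */
        return 0;                       /* caller switches to data packets */
    }
    return 1;                           /* consumed as a header packet */
}
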
>> > @@ -463,14 +465,14 @@ ogg_get_length (AVFormatContext * s)
>> >
>> > if (idx != -1){
>> > s->streams[idx]->duration =
>> > - ogg_gptopts (s, idx, ogg->streams[idx].granule);
>> > + ogg_gptopts (s, idx, ogg->streams[idx].granule + 1);
>> > }
>>
>> Baptiste is right, the last value gives the length of the stream.
>
> In this case, it's irrelevant what the low-level granule semantics are.
> The function is named ogg_gptopts(), and - unless that is just a random
> selection of letters - should presumably convert Granule Position to
> Presentation Time Stamp. Unless PTS is the time when presentation stops,
> we really need a granule from the (imaginary) packet after the last one.
Read the code again. The value is used as the PTS of the first packet
on the next page.
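
To make the arithmetic concrete: a granule position splits at the header's
KFGSHIFT into a keyframe number and an offset, and the sum is either the
last frame's index (pre-3.2.1) or the frame count (3.2.1 and later). A
sketch (assumed parameter names, not ogg_gptopts itself) of turning the
final granule into a duration:

#include <stdint.h>

/* Sketch: convert the last granule position of a Theora stream into a
 * duration in seconds.  kfgshift and the frame rate come from the
 * identification header; "old" means a pre-3.2.1 bitstream. */
static double theora_duration_sec(uint64_t last_gp, int kfgshift,
                                  uint32_t fps_num, uint32_t fps_den, int old)
{
    uint64_t iframe = last_gp >> kfgshift;                /* keyframe number */
    uint64_t pframe = last_gp & ((1ULL << kfgshift) - 1); /* frames since it */
    uint64_t frames = iframe + pframe + (old ? 1 : 0);    /* the patch's +1  */

    return (double)frames * fps_den / fps_num;
}
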
>> > ogg->size = size;
>> > ogg_restore (s, 0);
>> > ogg_save (s);
>> > while (!ogg_read_page (s, &i)) {
>> > - if (i == idx && ogg->streams[i].granule != -1 && ogg->streams[i].granule != 0)
>> > + if (i == idx && ogg->streams[i].granule != -1)
>> > break;
>> > }
>> > if (i == idx) {
>>
>> Why?
>
> Because 0 is a legal granule value for data packets in old Theora files,
> as noted above.
I see.
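
Concretely: the first frame of an old stream is a keyframe at index 0, so
both halves of its granule are zero and the page legitimately carries
granule position 0, while a 3.2.1-and-later stream counts it as "one frame
so far" and lands at 1 << KFGSHIFT. A tiny standalone demonstration, with
an example shift value:

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    int kfgshift = 6;                          /* example value from a header */
    uint64_t old_gp = (0ULL << kfgshift) | 0;  /* 3.2.0: frame index 0 -> 0   */
    uint64_t new_gp = (1ULL << kfgshift) | 0;  /* 3.2.1: frame count 1 -> 64  */

    printf("first-frame granule: old=%llu new=%llu\n",
           (unsigned long long)old_gp, (unsigned long long)new_gp);
    return 0;
}
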
> ogg_get_length() is a static function and is called only once when
> the headers have already been skipped, if that worried you.
I fail to see the relevance of that remark.
--
Måns Rullgård
mans at mansr.com