[Ffmpeg-devel-irc] ffmpeg-devel.log.20140929

burek burek021 at gmail.com
Tue Sep 30 02:05:02 CEST 2014


[00:29] <cone-228> ffmpeg.git 03Michael Niedermayer 07master:755f7929c6aa: avcodec/mpegvideo_enc: Only enforce rc_max_available_vbv_use on first encoding attempt
[00:29] <cone-228> ffmpeg.git 03Michael Niedermayer 07master:3f5095f213a0: avformat/hlsenc: fix outter typo
[11:45] <ubitux> BBB: do you know if there are security concerns about chromium linking to a system version of ffmpeg?
[11:45] <ubitux> like, can the probing be triggered to exploit a crash in a random codec or format
[11:46] <ubitux> or something along these lines?
[11:46] <ubitux> (because of the system ffmpeg being full featured instead of a very small subset)
[11:48] <wm4> I'm not BBB or google, but obviously the answer is yes
[11:48] <wm4> in addition to ABI problems
[11:48] <wm4> the latter makes system ffmpeg a nightmare for application devs anyway
[11:48] <ubitux> well, maybe they do the probing differently
[11:49] <ubitux> like, forcing the format according to mime types or whatever
[11:50] <Daemon404> ubitux, why do you ask?
[11:50] <Daemon404> because chrome will always bundle their own .... always
[11:51] <ubitux> there is a -Duse_system_ffmpeg=0/1
[11:51] <ubitux> my distro sets it to 0, like probably many/all of them
[11:51] <Daemon404> do you mean *Chrome*
[11:51] <Daemon404> or chromium
[11:51] <Daemon404> (sorry, i wasnt clear)
[11:51] <ubitux> chromium
[11:51] <Daemon404> oh ok
[11:51] <Daemon404> that makes more sense.
[11:52] <ubitux> isn't chrome just the windows version of chromium?
[11:52] <nevcairiel> chrome is their binary distribution, it also exists for linux
[11:52] <ubitux> ok
[11:53] <Daemon404> ubitux, main difference is bundle flash
[11:53] <Daemon404> bundled*
[11:53] <wm4> at least it makes no sense for chrome to enable obscure game format decoders and such, whose only value is probably adding security issues
[11:53] <ubitux> wm4: right, but the question is more about whether the code is triggerable or not
[11:54] <ubitux> if they do force codec and format, then it might be fine
[11:55] <Daemon404> ubitux, i dont see why they wouldnt though
[11:55] <Daemon404> it's such a small subset that it's actually preferable
[11:56] <ubitux> wouldn't what?
[11:56] <ubitux> force or probe?
[11:56] <Daemon404> hard set / enable vp8, vp9, h264, aac
[11:56] <Daemon404> during init
[11:56] <Daemon404> i mean they HAVE to have their own probe stuff, for their hw playback
[11:57] <Daemon404> which is what most users get
[11:57] <ubitux> ah right, you can indeed enable only some codecs/format, forgot that
[11:57] <Daemon404> yeah
[11:57] <Daemon404> just don't do av_register_all() etc
[11:57] <ubitux> yeah right ok
[11:57] <Daemon404> and manually register the few you need
[11:57] <ubitux> then it should be fine :)
[11:57] <wm4> huh? since when can you register individual codecs
[11:57] <ubitux> avcodec_register(...)?
[11:58] <wm4> ubitux: and how do you get the codec ptr?
[11:58] <ubitux> avcodec_find_decoder()?
[11:58] <ubitux> avcodec_register(avcodec_find_decoder("h264")) ?
[11:58] <wm4> but that finds only registered codecs
[11:59] <ubitux> ah
[12:00] <Daemon404> wow
[12:00] <Daemon404> yeah
[12:00] <Daemon404> what the hell?
[12:01] <Daemon404> wow that's pretty bad... now it bothers me
[12:01] <wm4> avcodec_register is basically just useless global state
[12:02] <wm4> someone in Libav actually made plans how to remove that, but never posted a patch
[12:02] <Daemon404> it seems that everything is always init'd anyway
[12:02] <Daemon404> so indeed useless
[12:02] <Daemon404> mind you for chrome, they only enable what they need in configure
[12:02] <wm4> yeah, that's the right way for this stuff
[12:04] <Daemon404> wm4, anyway, i dont think chrome uses probing
[12:04] <Daemon404> probably just find_decoder(AV_CODEC_ID_H264)
[12:04] <wm4> they probably have their own demuxers and still use libvpx too...
[12:04] <Daemon404> libvpx is still faster for vp9 decoding on their biggest platform
[12:05] <wm4> arm?
[12:05] <wm4> i386?
[12:05] <Daemon404> 32bit windows
[12:05] <Daemon404> like most of the world
[12:05] <Daemon404> that isnt in a bubble
[12:06] <wm4> nah, 64 bit windows is getting pretty popular
[12:06] <wm4> but 32 bit might still be easier to distribute
[12:06] <wm4> also the future is mobile
[12:06] <Daemon404> 64 bit chrome is still not in the release channel
[12:06] <Daemon404> its in beta though
[12:06] <wm4> (unfortunately)
[12:14] <rcombs> wm4: Chrome uses libavformat for demuxing, can confirm
[12:14] <wm4> rcombs: even webm?
[12:14] <rcombs> yup
[12:15] <wm4> libavformat doesn't do DASH though
[12:15] <rcombs> you can throw an MKV with H.264 at it and it'll chug along happily, but it'll fail if the required probe time for the file is longer than 5 seconds
[12:15] <rcombs> neither does Chrome, natively
[12:15] <rcombs> DASH in Chrome involves MSE and a lot of JS
[12:15] <ubitux> so it does actually call the probing code from ffmpeg?
[12:15] <wm4> then how are they doing youtube?
[12:16] <wm4> though from what I hear, youtube uses a subset of dash... just separate audio and video tracks
[12:16] <rcombs> ubitux: yup
[12:17] <rcombs> wm4: YouTube on CC, for example, uses MSE and segments
[12:17] <ubitux> so, if it links to a system ffmpeg, will it be able to play like... nut/ffv1 videos?
[12:17] <rcombs> YouTube on desktop just uses separate audio and video tracks, and probably loads them separately and plays at the same time
[12:18] <rcombs> ubitux: I'm not sure if it ever links to system ffmpeg; I think it always bundles its own (libffmpeg_sumo)
[12:18] <ubitux> rcombs: there is a -Duse_system_ffmpeg=0/1 
[12:18] <rcombs> oh, well then
[12:18] <kierank> ubitux: no because they ship their own ffmpeg
[12:18] <kierank> they aren't stupid
[12:18] <ubitux> i know right
[12:18] <kierank> their own ffmpeg only has stuff they need
[12:19] <kierank> 11:17 AM <ubitux> so, if it links to a system ffmpeg, will it be able to play like... nut/ffv1 videos?
[12:19] <kierank> so no
[12:19] <wm4> even open source applications ship their own ffmpegs
[12:19] Action: kierank does
[12:19] <rcombs> possibly, but I haven't tested it; they might special-case MKV (and possibly others) and otherwise not call into lavf at all
[12:19] <ubitux> kierank: -Duse_system_ffmpeg=1 will link against the system ffmpeg i suppose, which is not a subset
[12:19] <kierank> ah
[12:20] <ubitux> and i'm wondering about the impact on security it has (assuming it works in the first place)
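A minimal C sketch of the idea being discussed: an embedder that already knows the container and codec (e.g. from a MIME type) can force the demuxer and pick the decoder by ID instead of relying on general probing. This is purely illustrative (the function is hypothetical, not Chromium or ffmpeg code) and assumes av_register_all() or a trimmed build has already registered what is needed.

    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    /* Open a file whose container is already known, so lavf never probes
     * other demuxers, then pick the decoder explicitly by codec ID. */
    static int open_known_webm(const char *path, AVFormatContext **fmt_ctx)
    {
        AVInputFormat *fmt = av_find_input_format("matroska,webm"); /* demuxer name */
        if (!fmt)
            return AVERROR_DEMUXER_NOT_FOUND;
        *fmt_ctx = NULL;
        int ret = avformat_open_input(fmt_ctx, path, fmt, NULL);
        if (ret < 0)
            return ret;
        /* Decoder chosen explicitly as well, no probing involved. */
        AVCodec *dec = avcodec_find_decoder(AV_CODEC_ID_VP9);
        return dec ? 0 : AVERROR_DECODER_NOT_FOUND;
    }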
[13:04] <cone-810> ffmpeg.git 03Michael Niedermayer 07master:8ba694548782: avcodec/utils: Fix off by 1 error causing unneeded allocation in ff_fast_malloc()
[15:19] <wm4> why does the libavformat mpegts demuxer seek around in the stream
[15:19] <wm4> isn't mpegts designed to not need that
[15:19] <nevcairiel> it should only do that on opening to determine the duration, or when it loses sync
[15:20] <wm4> seems to be mostly for resync
[15:23] <cone-810> ffmpeg.git 03Michael Niedermayer 07master:cf32181b7011: avcodec/put_bits: Add rebase_put_bits()
[16:10] <rcombs> so, free SSL
[16:21] <BtbN> "free" "SSL"
[17:21] <ubitux> [] find ass headers without "Format:" line in styles, but still defining some styles
[17:22] <ubitux> of course, ffmpeg doesn't like it
[17:22] <wm4> ubitux: yes
[17:22] <wm4> vsfilter doesn't read the format lines at all
[17:22] <ubitux> i wonder why we do
[17:22] <wm4> libass also fails in this case
[17:23] <wm4> there's no reason to
[17:23] <ubitux> not really
[17:23] <wm4> just makes the code more complicated
[17:23] <ubitux> it seems to like it well
[17:23] <wm4> unfortunately, ffmpeg _still_ outputs ass that assumes you can change the format lines
[17:23] <ubitux> :D
[17:23] <ubitux> i know i know :)
[17:23] <ubitux> i'm working on it
[17:23] <wm4> libass falls back to default ass/ssa format line if it fails
[17:23] <ubitux> right, but ffmpeg doesn't fallback
[17:24] <ubitux> it just... returns -1 ;)
[17:24] <wm4> s/fails/is missing/
[17:24] <nevcairiel> the damn thing has a header indicating which fields it has, why would you not assume!
[17:24] <wm4> ubitux: if you want to do it "correctly", you have to read vsfilter sources
[17:24] <ubitux> nevcairiel: Format: is an alias for Comment:, but be careful because Comment: actually is a timed event
[17:24] <ubitux> so Format: is the real comment field
[17:25] <ubitux> wm4: yeah probably
[17:26] <ubitux> anyway, since i'm going to change the decoded text form, it will fix that at the same time
[17:26] <wm4> e.g. vsfilter even ignores sections
[17:26] <wm4> it just takes whatever headers it can find
[17:26] <wm4> and I forgot how it distinguishes ssa and ass
[17:27] <ubitux> V4+ Styles vs V4 Styles
[17:27] <ubitux> probably
[17:28] <wm4> possibly
[19:28] <ubitux> today i found a -lowres user 
[19:29] <wm4> ubitux: me too
[19:29] <ubitux> really?
[19:30] <wm4> a mplayer user who tried to play something on his underpowered device with mpv
[19:30] <wm4> not sure if he actually used it
[19:30] Action: wm4 sees deprecated API warnings when compiling ffmpeg.c... wat
[19:30] <ubitux> his nickname started with 'f'?
[19:32] <ubitux> wm4: yes we wait for our users to test the api before we do
[19:33] <wm4> no, not 'f'
[19:33] <ubitux> ok, then it's indeed 2 different users
[19:33] <wm4> fascinating
[19:33] <nevcairiel> unless its the same user under different names
[19:35] <ubitux> the guy was complaining that he couldn't play "hd" videos on his eeepc anymore because libav dropped it (no more mplayer -lavdopts lowres=1)
[19:42] <wm4> h264: The maximum value for lowres supported by the decoder is 0
[19:42] <wm4> when using lowres=1
[19:44] <ubitux> try a mpeg4 video
[20:33] <cone-810> ffmpeg.git 03Timothy B. Terriberry 07master:a05f5052fef3: sdp: Make opus declaration conform to the spec
[20:33] <cone-810> ffmpeg.git 03Michael Niedermayer 07master:77ab7407c272: Merge commit 'a05f5052fef3b3743fab7846da12861d8a8098ec'
[20:38] <Compn> yeah i think you can only lowres some strange h264 (maybe no slices? ) but mpeg4 should still work
[20:38] <Compn> and mpeg2 of course
[20:39] <Compn> i'm surprised some eeepc was using it :)
[20:39] <Compn> or underpowered device*
[21:02] <cone-810> ffmpeg.git 03Luca Barbato 07master:e3a00acde05c: hevc: Initialize mergecand_list to 0
[21:02] <cone-810> ffmpeg.git 03Michael Niedermayer 07master:46807921f768: Merge commit 'e3a00acde05c925617dc19b5373969d864bf8414'
[21:10] <cone-810> ffmpeg.git 03Michael Niedermayer 07master:2cd7c99498b1: h264: reset ret to avoid propagating minor failures
[21:10] <cone-810> ffmpeg.git 03Michael Niedermayer 07master:56c47364c3b5: Merge commit '2cd7c99498b1bcb450c328459cf85a686ce83456'
[21:35] <someone-noone> hello. I capture my mic & webcam with the avfoundation framework, but the resulting file is out of sync. Here's what I run: http://pastebin.com/EZRRCXST
[21:35] <someone-noone> I think it's because my webcam doesn't start as immediately as the mic, and that's why the video is a little bit late
[21:36] <someone-noone> I looked at avfoundation.m and saw that the first pts values are set in read_packet(), before devices are opened
[21:36] <someone-noone> so the pts values should be correct; I can't figure out why it's happening, can someone help?
[21:44] <cone-810> ffmpeg.git 03Justin Ruggles 07master:19133e96d30e: lavf: fix memleaks in avformat_find_stream_info()
[21:44] <cone-810> ffmpeg.git 03Michael Niedermayer 07master:a2d5f6b9dbcc: Merge commit '19133e96d30e3f80dbae236ef081aedef419a6bf'
[21:45] <michaelni> someone-noone, probably best you mail thilo or open a ticket on trac
[21:46] <someone-noone> michaelni, thanks
[22:08] <cone-810> ffmpeg.git 03Michael Niedermayer 07master:1441641d786a: avcodec/mpegvideo_enc: Allocate only a small buffer and reallocate as needed
[22:27] <akira4> Hi! I'm new here. I wanted to apply for OPW under ffmpeg. Could someone guide me on how to start?
[22:28] <hawken> Hi, was wondering if there were plans for mvc support in ffmpeg. I'd like to participate but I don't know where to start
[22:29] <llogan> akira4: have you seen this link? https://trac.ffmpeg.org/wiki/SponsoringPrograms/OPW/2014-12
[22:32] <akira4> llogan, I did check the page but couldn't figure out where exactly to start
[22:33] <llogan> do any of the listed projects interest you? do you have a project idea you would like to work on?
[22:35] <akira4> llogan, I am interested in the subtitles project.
[22:35] <llogan> i think ubitux can help you with that.
[22:36] <ubitux> oh?
[22:36] <llogan> you're listed as a mentor
[22:36] <ubitux> yeah right
[22:36] <ubitux> but i wasn't briefed on the opw process
[22:36] <ubitux> so yeah sure i can answer subtitles questions, but not so much about opw
[22:37] <akira4> I see. Thanks
[22:37] <akira4> Also how do I start with the qualification task?
[22:38] <ubitux> mmh let me see what i put there
[22:38] <ubitux> "write one subtitles demuxer and decoder (for example support for Spruce subtitles format). This is in order to make sure the subtitles chain is understood." mmh alright
[22:39] <ubitux> let me see if i can find something that looks like a specification
[22:39] <ubitux> http://documentation.apple.com/en/dvdstudiopro/usermanual/index.html#chapter=19%26section=13%26tasks=true
[22:40] <ubitux> alright, this looks like a good starting point
[22:40] <ubitux> not sure if we have samples around
[22:41] <akira4> cool. Thanks
[22:41] <ubitux> http://www.eso.org/~lchriste/trans/eyes/subtitles/soundtrack.stl
[22:41] <ubitux> alright, here is one
[22:42] <ubitux> akira4: are you familiar with libavformat/libavcodec and the demux/decode process or not at all?
[22:42] Action: rcombs would love to see subtitles in lavfi, and would probably help with it
[22:42] <akira4> ubitux, No I'm not actually.
[22:43] <ubitux> rcombs: yes that's the last step, but there are a few things before, to give me enough time to redesign the api so it's possible :P
[22:43] <ubitux> akira4: alright, so...
[22:43] <ubitux> akira4: basically, see this first: http://ffmpeg.org/ffmpeg.html#Detailed-description
[22:43] <ubitux> (just the ascii graph and the explanation below)
[22:44] <ubitux> poke me when you're done, i'll explain how subtitles fit into this
[22:44] <akira4> I see.
[22:44] <akira4> cool
[22:44] <akira4> I'll do that
[22:44] Action: hawken looks around
[22:45] <hawken> So, I don't have anything to do with OPW, being a guy, but I still want to see mvc happen :P Can I take part in that?
[22:45] <ubitux> hawken: sorry, i don't know anything about mvc, you should probably ask michaelni 
[22:45] <hawken> Okay..
[22:45] <hawken> michaelni: ^ ?
[22:45] <ubitux> iirc several people tried to approach the mvc problem over time
[22:45] <hawken> I'd like as much information as I can get
[22:45] <ubitux> we had mixed results
[22:46] <hawken> I tried making something from scratch to analyze the files but I feel I only scratched the surface
[22:46] <CoRoNe> Good afternoon,
[22:46] <ubitux> i think at some point one guy sent the worst patch dump ever
[22:46] <ubitux> hawken: maybe try "site:ffmpeg.org ffmpeg-devel mvc" on google
[22:46] <hawken> https://github.com/hawken93/mvc-decoder <-- so I didn't get to actual MVC but I think I made a nice m2ts debugger
[22:46] Action: hfvjkab is reminded about people trying to add cuda to x264
[22:47] <hfvjkab> WTF is up with my nick?
[22:47] <hawken> Ah I saw that..
[22:47] <hawken> so that drew me towards making my own stuff, since I totally don't know where to start in a project as big as ffmpeg :P
[22:47] <hawken> I'm a total newb lol :3
[22:48] <ubitux> https://encrypted.google.com/search?hl=en&q=site%3Affmpeg.org%20ffmpeg-devel%20mvc
[22:48] <hawken> mhm
[22:48] <hawken> Okay I'll be going through all the results now :P
[22:49] <ubitux> akira4: i realize that description doesn't actually explain much; can you start building ffmpeg and get maybe a .srt file somewhere?
[22:49] <CoRoNe> Good afternoon,
[22:49] <CoRoNe> Possible bug: whenever I'm trying to stream opus over rtp, ffmpeg by default prints:
[22:49] <CoRoNe> SDP:
[22:49] <CoRoNe> ...
[22:49] <CoRoNe> a=rtpmap:97 opus/48000
[22:49] <CoRoNe> ...causing the stream to be mono. If I manually append '/2' to 'opus/48000', the stream is stereo like the source.
[22:49] <CoRoNe> Is there anything you developers can do about it?
[22:50] <akira4> ubitux, Alright I'll try doing that.
[22:50] <CoRoNe> libavformat / sdp.c: line 606
[22:53] <llogan> CoRoNe: submitting a patch to ffmpeg-devel will allow us to see it in full context and allow others who are not in IRC to possibly comment.
[22:54] <CoRoNe> I'm sorry, I'm not familiar with that, nor am I a developer. Would creating a thread on http://ffmpeg.zeranoe.com/builds/ suffice too?
[22:55] <ubitux> CoRoNe: http://ffmpeg.org/bugreports.html or http://ffmpeg.org/developer.html#Contributing
[22:55] <ubitux> depending on what you want to do
[22:57] <CoRoNe> oh yes, there's a bugtracker of course. I'll create a new ticket. Thank you
[23:17] <akira4> ubitux, I'm done with building the source code. Should I read the documentation that you provided?
[23:18] <ubitux> akira4: it will just take you 2 min
[23:18] <ubitux> akira4: do you have a .srt file at hand?
[23:18] <akira4> ubitux, yep I have many .srt files with me
[23:19] <ubitux> ok; so do you have a ffprobe tool built in the source directory?
[23:20] <akira4> ubitux, yes I do.
[23:21] <ubitux> try running ./ffprobe -show_packets -show_data foo.srt|less
[23:21] <ubitux> this will show you the demuxing process
[23:21] <akira4> I see.
[23:21] <ubitux> basically, the srt demuxer (in libavformat/srtdec.c) will fill "packets"
[23:22] <ubitux> a packet is a simple structure, which has a few fields, notably pts, data and size
[23:22] <ubitux> and duration
[23:22] <ubitux> (and a few other things you can see here)
[23:23] <ubitux> the data is basically supposed to be kind of opaque
[23:23] <ubitux> in the case of the srt, you can see that it contains basically the text for each event
[23:23] <akira4> yes
[23:23] <ubitux> but it can have markup, right
[23:23] <ubitux> typically, it's the event copied verbatim
[23:23] <ubitux> so with <font ...> and stuff like that
[23:24] <ubitux> other subtitle demuxers will output similar packets, with their markup as well
[23:24] <akira4> I see.
[23:24] <ubitux> for microdvd typically, you'll get stuff like {c:...}
[23:24] <ubitux> and the same for every other format
[23:24] <ubitux> anyway
[23:25] <ubitux> these packets, you can send them directly to a muxer
[23:25] <ubitux> for instance with ffmpeg, you can do ffmpeg -i in.srt -c copy out.srt, and only the demuxer and muxer will be in the chain
[23:26] <ubitux> the demuxer will output timed packets, and the muxer will re-create a file by printing timestamps and the payload (data)
[23:26] <ubitux> similarly, you can do ffmpeg -i in.srt -c copy out.mkv
[23:26] <akira4> oh
[23:26] <akira4> wait so
[23:26] <akira4> if i'm getting this right
[23:26] <akira4> we're basically taking a .srt file with markup
[23:27] <akira4> and creating a file with timed events corresponding to the text?
[23:27] <ubitux> you should get the exact same file at the end
[23:28] <ubitux> can you open libavformat/srtenc.c ?
[23:28] <ubitux> the srt_write_packet() is the main callback of the muxer
[23:29] <ubitux> it takes a packet with a pts and duration, and prints the string "00:01:02:03 --> 04:05:..."
[23:29] <ubitux> and the payload
[23:29] <akira4> okay.
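As a rough, hedged sketch (simplified, not the actual srtenc.c code; the timestamp formatting below is only illustrative), the shape of such a write_packet callback is roughly:

    #include <inttypes.h>
    #include <libavformat/avformat.h>

    /* Print "start --> end", then the packet payload verbatim, then a blank line. */
    static int example_write_packet(AVFormatContext *s, AVPacket *pkt)
    {
        int64_t start = pkt->pts;
        int64_t end   = pkt->pts + pkt->duration;   /* both in the stream time base */
        avio_printf(s->pb, "%"PRId64" --> %"PRId64"\n", start, end);
        avio_write(s->pb, pkt->data, pkt->size);    /* the markup is copied as is */
        avio_printf(s->pb, "\n\n");
        return 0;
    }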
[23:30] <ubitux> the idea is that some containers (or formats) accept packets of different known tags
[23:30] <ubitux> so your srt demuxer is outputting packets, with the codec "subrip" (that's the name of the markup)
[23:30] <ubitux> several muxers can take these packets
[23:30] <ubitux> the srt muxer is obviously one, but the matroska (mkv) muxer also accepts them
[23:30] <ubitux> it means the muxer knows how to store these packets
[23:31] <akira4> I see.
[23:31] <ubitux> in the case of the srt muxer, it will create a new .srt file with just the timestamps printed as is, and the text
[23:31] <ubitux> and matroska has its own way of storing the timestamps
[23:31] <ubitux> so you'll have something like 32 bits for a timestamp, stored as binary
[23:32] <ubitux> (not sure if that's exactly that, but you get the point)
[23:32] <akira4> yeah
[23:32] <ubitux> so, is that fine with you so far?
[23:32] <akira4> one thing
[23:32] <ubitux> i'm going to go on to the decoding process now
[23:32] <ubitux> ok
[23:32] <akira4> so the srt file that was created out.srt
[23:33] <akira4> is in some ways different from in.srt ?
[23:33] <ubitux> ideally, it should be the same
[23:33] <ubitux> in practice you might have slight differences
[23:33] <ubitux> let me think of an example..
[23:34] <ubitux> right, imagine you have a lot of empty lines at the beginning of in.srt
[23:34] <ubitux> the srt demuxer will ignore them
[23:34] <ubitux> and just output packets, losing that "information"
[23:34] <akira4> okay
[23:34] <ubitux> similarly, you could imagine an in.srt with timestamps written in a weird manner
[23:35] <ubitux> like, i don't know, if the timestamps are written like 00001:02:03.04 --> ...
[23:35] <ubitux> when reading the timestamp and storing it in AVPacket.pts, the demuxer will lose that weird "0" padding
[23:36] <ubitux> only the timestamp value itself is kept
[23:36] <akira4> and that would lead to losing data if the .srt file is opened by a container that isn't compatible with it?
[23:36] <ubitux> the muxer will probably print 01:02:03.04 --> ...
[23:36] <ubitux> you can't actually send the packet to a muxer that doesn't support it
[23:37] <ubitux> if the muxer doesn't accept subrip packets, you'll have to convert them, that's the next step
[23:37] <ubitux> (convert them from one markup to another)
[23:37] <ubitux> this is basically the same as audio and video
[23:37] <ubitux> if you have a mkv with h264 in it, and you want to put that h264 into ogg, you can't
[23:37] <ubitux> because ogg doesn't accept h264 packets, and you'll have to convert these packets
[23:38] <ubitux> OTOH, you can demux h264 packets from a mkv file, and just remux them into a mp4 file
[23:38] <ubitux> because both mkv and mp4 accept h264 packets
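A hedged sketch of that packet-level copy (demuxer straight to muxer, no decoding), roughly what ffmpeg -c copy boils down to. Setup, stream creation and timestamp rescaling are omitted, and it assumes the output container accepts the input codec:

    #include <libavformat/avformat.h>

    /* Copy packets from an already-opened input to an already-opened output. */
    static int copy_packets(AVFormatContext *in, AVFormatContext *out)
    {
        AVPacket pkt;
        while (av_read_frame(in, &pkt) >= 0) {               /* demuxer fills the packet */
            int ret = av_interleaved_write_frame(out, &pkt); /* muxer stores it */
            av_free_packet(&pkt);
            if (ret < 0)
                return ret;
        }
        return 0;
    }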
[23:40] <akira4> hold on. Let me read the whole thing. it happened too fast.
[23:40] <ubitux> right, sorry
[23:41] <akira4> so the whole idea is that if there is a packet that a muxer doesnt support we convert it?
[23:41] <ubitux> yes
[23:41] <akira4> and the packets can be different if they have different tags?
[23:41] <ubitux> that's the decoding/encoding process i was going to explain
[23:41] <akira4> okay
[23:42] <ubitux> what do you mean by different packets and different tags?
[23:42] <ubitux> a subrip packet will be different from a microdvd packet yes
[23:42] <akira4> I'm not sure what tags mean
[23:42] <akira4> I was gonna ask that
[23:42] <ubitux> i don't remember talking about tags, but you did :p
[23:43] <akira4> sorry. I think I got confused
[23:43] <ubitux> i'm going to give you more examples before i continue
[23:43] <akira4> cool. thanks :)
[23:44] <ubitux> a .sub file (microdvd) contains lines like this: "{1400}{1500}hello world"
[23:44] <ubitux> 1400 is the starting frame, and 1500 is the ending one; you can consider them as timestamps for now
[23:44] <akira4> ok
[23:44] <ubitux> now microdvd also has a markup system
[23:45] <ubitux> it looks like this typically: "{1400}{1500}{c:$ff0000}hello world"
[23:45] <ubitux> this will make the text in red
[23:45] <ubitux> anyway, the demuxer has no knowledge of the markup
[23:45] <ubitux> and in this case, it will output a packet that looks like this:
[23:46] <ubitux> AVPacket { pts=1400, duration=100, data="{c:$ff0000}hello world" }
[23:46] <ubitux> (it's a C-struct, right?)
[23:46] <akira4> yep
[23:46] <ubitux> if you have a .srt file, it's different
[23:46] <ubitux> you will probably get something like this:
[23:47] <akira4> I see.
[23:47] <ubitux> AVPacket { pts=56, duration=100, data="<font color=red>hello world</font>" }
[23:47] <ubitux> and so, as you can guess, the srt muxer can not accept the microdvd packets
[23:47] <ubitux> and in the same way the microdvd muxer can not accept the srt packets
[23:48] <ubitux> otherwise, you would end up with files like this:
[23:48] <ubitux> {1400}{1500}<font color=red>hello world</font>
[23:48] <ubitux> and this is an invalid file
[23:48] <akira4> hmm.
[23:48] <ubitux> that's why the microdvd muxer only accepts microdvd packets
[23:48] <ubitux> and the srt muxer only accepts srt packets
[23:49] <ubitux> for example, matroska accepts the srt packets, but not the microdvd ones
[23:49] <ubitux> because that's how it was designed, it only accepts the srt markup
[23:49] <ubitux> did i lose you again?
[23:49] <akira4> nope
[23:49] <akira4> I got everything :)
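Put as a tiny C sketch (a hypothetical helper for illustration, not actual libavformat demuxer code), the microdvd packet above could be built like this:

    #include <string.h>
    #include <libavcodec/avcodec.h>   /* AVPacket, av_new_packet() */

    /* Build roughly: AVPacket { pts=1400, duration=100, data="{c:$ff0000}hello world" } */
    static int fill_demo_packet(AVPacket *pkt)
    {
        const char *text = "{c:$ff0000}hello world";
        int ret = av_new_packet(pkt, (int)strlen(text));   /* allocates pkt->data / pkt->size */
        if (ret < 0)
            return ret;
        memcpy(pkt->data, text, strlen(text));
        pkt->pts      = 1400;   /* start, in the stream's time base */
        pkt->duration = 100;    /* end - start */
        return 0;
    }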
[23:50] <ubitux> cool
[23:50] <ubitux> should i move on then?
[23:50] <akira4> yes
[23:50] <ubitux> alright, now it's a bit tricky
[23:50] <ubitux> i'm going to make a comparison with how audio and video are handled
[23:51] <akira4> okay
[23:51] <wm4> oh, someone is going to help out with subtitles stuff... that's nice
[23:51] <ubitux> if you have h264 packets, the data is the compressed data that only the decoder can translate to images
[23:51] <ubitux> every video decoder (they are in libavcodec/) only understands one kind of packet
[23:51] <ubitux> they do output "raw" frames
[23:52] <akira4> I see.
[23:52] <ubitux> there are various different forms of "raw", but all of them are generic
[23:52] <akira4> what exactly are frames?
[23:52] <ubitux> typically, some decoders will output RGB
[23:52] <ubitux> some others will output YUV, but that's all pure raw that can be piped to image processing code
[23:52] <wm4> please note that this separation is sometimes a bit non-sensical with subtitles
[23:52] <ubitux> wm4: wait wait ;)
[23:53] <wm4> because subtitles are complicated
[23:53] <wm4> ok ok
[23:53] <ubitux> wm4: i'm trying to make things simple first :P
[23:53] <ubitux> akira4: when talking about frames, i'm talking about the AVFrame structure
[23:53] <ubitux> it contains a lot of information about the decoded "frame" or image
[23:54] <ubitux> (or slice of sound in the case of audio)
[23:54] <akira4> Oh. I see.
[23:54] <ubitux> it's basically the usable data
[23:54] <ubitux> a decoder's task is to transform an AVPacket into an AVFrame (in the case of audio/video)
[23:54] <ubitux> the AVPacket is the opaque form that only demuxers and muxers understand
[23:54] <ubitux> and the AVFrame is the decoded form, which is usable by everyone
[23:55] <akira4> ok
[23:55] <ubitux> so if you want to display a frame in your video player, you have to decode it, to get the picture itself, and blend it to the screen or whatever
[23:55] <wm4> it's worth noting that a packet (AVPacket) is basically always a byte blob
[23:56] <ubitux> yes, the data is supposedly opaque, it just carries timing information and says what the data blob type is
[23:56] <wm4> so it's basically a byte array plus timestamps (and, very rarely, some "side" data)
[23:56] <akira4> ok.
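A short hedged sketch of that packet-to-frame step, using the lavc decode call of that era (avcodec_decode_video2); opening the decoder and allocating the frame are omitted:

    #include <libavcodec/avcodec.h>

    /* Feed one compressed packet to an opened decoder; on success, frame holds raw data. */
    static int decode_one(AVCodecContext *ctx, AVPacket *pkt, AVFrame *frame)
    {
        int got_frame = 0;
        int ret = avcodec_decode_video2(ctx, frame, &got_frame, pkt);
        if (ret < 0)
            return ret;       /* decoding error */
        return got_frame;     /* 1 if frame now contains a decoded picture */
    }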
[23:56] <ubitux> so now we reach the point where you can get a clue about what the ascii graphic @ http://ffmpeg.org/ffmpeg.html#Detailed-description means
[23:57] <akira4> yeah.
[23:57] <ubitux> the decoded frames box is where you plug your filters typically (to alter the image itself)
[23:57] <ubitux> ok, so this is how the audio and video works
[23:57] <ubitux> subtitles are a PITA so it's a bit different
[23:57] <akira4> the filters can be used on both audio and video frames right?
[23:57] <ubitux> yes
[23:58] <ubitux> nowadays both audio and video are stored in the AVFrame structure
[23:58] <wm4> ubitux: we should explain how audio/video is displayed
[23:58] <wm4> since video players are, you know, a very common use case of ffmpeg
[23:58] <ubitux> i don't think that's necessary now
[23:58] <ubitux> i'm trying to make sure the demuxer/muxer and decoder/encoder process is well known
[23:59] <ubitux> so akira4 can write a demuxer/decoder for a simple subtitle format
[23:59] <ubitux> and actually understand what's going on
[23:59] <ubitux> anyway
[23:59] <ubitux> akira4: so, are you ok so far?
[23:59] <akira4> Yep.
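For the subtitle case being hinted at here, the decode entry point differs from audio/video: it fills an AVSubtitle rather than an AVFrame. A hedged sketch (the helper is hypothetical; only avcodec_decode_subtitle2() and avsubtitle_free() are real API):

    #include <libavcodec/avcodec.h>

    /* Subtitle decoding fills an AVSubtitle (text or bitmap rects), not an AVFrame. */
    static int decode_sub(AVCodecContext *ctx, AVPacket *pkt, AVSubtitle *sub)
    {
        int got_sub = 0;
        int ret = avcodec_decode_subtitle2(ctx, sub, &got_sub, pkt);
        if (ret < 0)
            return ret;
        if (got_sub) {
            /* inspect sub->num_rects / sub->rects[] here ... */
            avsubtitle_free(sub);
        }
        return got_sub;
    }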
[00:00] --- Tue Sep 30 2014

