[Ffmpeg-devel-irc] ffmpeg.log.20130125
burek
burek021 at gmail.com
Sat Jan 26 02:05:01 CET 2013
[00:17] <KeshlWare> I has a PNG sequence. How do I loop it 3 times during encode? xwx
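For the record, one way to loop an image sequence three times (a sketch only: the `frame-%04d.png` pattern and the frame rate are assumptions, and option support varies by ffmpeg version) is to feed the sequence as three inputs and join them with the concat filter:

```shell
# Feed the same PNG sequence three times and concatenate the three
# video streams into one output; adjust pattern and framerate to taste.
ffmpeg -framerate 25 -i frame-%04d.png \
       -framerate 25 -i frame-%04d.png \
       -framerate 25 -i frame-%04d.png \
       -filter_complex "[0:v][1:v][2:v]concat=n=3:v=1:a=0" out.mp4
```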
[00:35] <notedible_> is there any way to mux dvdsubs into MP4 or will this never be supported since it is non-standard (although mp4box / nero recode / etc. support muxing)?
[00:35] <notedible_> using ffmpeg 1.1.1
[01:09] <JEEB> notedible_, I'm pretty sure DVD subs in "mp4" is nero-specific, and mp4box sure found nothing else to do if they implemented it :V That said, nero chapters got implemented, too.
[01:10] <JEEB> it is very much non-standard
[01:10] <notedible_> JEEB: you're right, mp4box only transcodes from other softsubs to TTXT
[01:11] <notedible_> is there a way to dump them to a file from ffmpeg?
[01:12] <notedible_> i don't know what extension to put (.sub is a softsub format)
[01:39] <frozenbone> hi folks - i've got a raw h264 video without a container - i've used ffmpeg to put it in an h264 container with -vcodec copy, but it takes up twice as much disk (one for the orig, one for the copy). Anyone know of a way to containerize a file inplace? Without the copy?
[01:51] <stephanedev> frozenbone: cannot you delete the original file once you're done if you don't want it anymore?
[01:52] <frozenbone> Thanks stephanedev - I can, but I'm up against a space crunch :-) Was holding out on buying another hard disk until I asked
[01:54] <stephanedev> well, all i can say is that most of the time programs don't write in-place. if it looks like they are doing it, quite often they actually create a new file, write to it, delete the original one and rename the old one to the original name
[01:55] <stephanedev> this helps with not losing data too in case of failure. imagine your computer crashes while overwriting the original file
[01:56] <frozenbone> Thanks stephanedev for the reply :-) Totally understand
[01:56] <stephanedev> you're welcome
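The pattern stephanedev describes can be approximated in one line (file names are placeholders; some builds may need `-f h264` before the input). The `&&` ensures the original is only deleted if the remux actually succeeded, so a crash mid-write cannot lose both copies:

```shell
# Remux the raw stream (stream copy, so fast and lossless), then remove
# the original only when ffmpeg exits successfully:
ffmpeg -i input.h264 -c copy output.mp4 && rm input.h264
```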
[03:12] <ulatekh> I'm trying to pack raw bottom-field-interlaced YUV into a container (e.g. "ffmpeg -i input.yuv -vcodec copy output.avi") but when I look at the result (i.e. "ffmpeg -i output.avi -vcodec rawvideo -f yuv4mpegpipe - | head -1") the header now says it's progressive! How do I preserve interlacing? Is this a bug? I'm using ffmpeg 0.10.6 on Fedora Core 17.
[03:19] <ulatekh> Anyone awake here?
[03:20] <michaelni> ulatekh, its probably a feature request
[03:20] <michaelni> for .avi at least
[03:20] <ulatekh> Should I be using a different container?
[03:21] <ulatekh> I just want my raw YUV in a format that kdenlive will accept, since it won't accept a .yuv file.
[03:23] <michaelni> maybe .y4m or .mov would work
[03:23] <michaelni> you can also open a feature request on the bug traker for the avi case
[03:24] <ulatekh> I've had bad luck with ffmpeg requests in the past...if I get any response at all, it's usually "send a patch", i.e. they expect me to fix it.
[03:32] <ulatekh> Encoding to .mov worked, but decoding to raw video again (i.e. "ffmpeg -i output.mov -vcodec rawvideo -f yuv4mpegpipe -pix_fmt yuv444p - | head -1") freaks out with an endless stream of "Error while decoding stream #0:0" messages.
[03:33] <ulatekh> y4m decode worked, but the header says the video is progressive now.
[03:37] <michaelni> ulatekh, you will have to submit a feature request then ... or a patch ;)
[03:38] <ulatekh> After seeing http://ffmpeg.org/doxygen/trunk/huffyuv_8c-source.html online, like 636 suggested adding "-flags +ilme" to the command line...result still progressive...argh
[03:38] <ulatekh> like -> line
[03:50] <ulatekh> I grabbed the source for my version of ffmpeg (i.e. "yumdownloader --source ffmpeg") and its libavcodec/huffyuv.c also sets the interlaced flag if "-flags +ilme" is set on the command line...but still no interlaced...
[03:51] <ulatekh> Hey, the comment at the top of huffyuv.c says "michaelni"...dude, this is your code :-)
[04:02] <ulatekh> Oh well, guess I'll have to work on this later...gotta sleep
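One more thing worth trying on newer builds is the setfield filter, which forces the interlaced/field-order flags at the frame level. This is a hedged sketch: the 720x480 geometry and yuv420p pixel format are assumptions, and the filter's availability depends on the ffmpeg version:

```shell
# Read headerless raw YUV (rawvideo needs explicit size and pixel
# format) and mark every frame as bottom-field-first before muxing:
ffmpeg -f rawvideo -s 720x480 -pix_fmt yuv420p -i input.yuv \
       -vf setfield=bff -vcodec rawvideo output.mov
```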
[04:48] <Demon_Fox> I think "Stream #0:0: Audio: mp3, 44100 Hz, stereo, s16, 224 kb/s" Sounds good for a CD
[05:16] <Mista_D> can I have ffprobe use `-vf idet` to check for interlacing, i need a json/csv output, instead of stderr??
[05:24] <Mista_D> anytime I run `ffprobe -filters` it shows none... compiled 5 versions already 0.6 - 1.1.1
[05:26] <Mista_D> is there a way to enable libavfilters in ffprobe?
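ffprobe has no filter support, but ffmpeg itself can run idet and print a detection summary to stderr, which a script can then grep (input name and frame count are placeholders; the exact summary wording varies between versions, and it is plain text rather than json/csv):

```shell
# Decode 500 frames through idet, discard the output, keep the summary:
ffmpeg -i input.mkv -vf idet -vframes 500 -an -f null - 2>&1 | grep -i detection
```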
[05:51] <aboba> Does this channel also support avconv?
[05:53] <aboba> I'm having some serious issues doing streaming with ffmpeg to twitch, and I think I might be screwing up my options somehow
[05:53] <aboba> I've played with a whole bunch, but just can't seem to get it right
[05:53] <klaxa> for avconv support visit #libav
[05:55] <aboba> I set this all up using ffmpeg originally
[05:55] <aboba> some options
[05:55] <aboba> maybe thats the problem
[05:55] <aboba> http://pastebin.com/PFD8FsGN
[07:34] <fling> I have a fine script http://bpaste.net/show/72678/
[07:34] <fling> how may I add the file length hh:mm:ss into the filename?
[09:39] <pzich> is it possible to get ffmpeg fast seek to properly sync with audio?
[09:40] <pzich> I'm currently running the same command but only changing the location of the -ss flag relative to the -i, the fast seek is much faster, but the sound is pretty horribly off
[09:40] <pzich> if you have any other flag tips while you're at it: http://pastebin.com/qSkusKMK
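A common workaround (hedged: exact behaviour differs across versions) is a two-stage seek: a fast keyframe seek before `-i` to land just short of the target, then a short accurate seek after `-i` that decodes the remainder, which tends to keep audio and video aligned. Times here are examples only:

```shell
# Jump on keyframes to 9:50, then decode 10 s accurately to reach
# 10:00, and transcode 30 s from there:
ffmpeg -ss 00:09:50 -i input.mp4 -ss 10 -t 30 output.mp4
```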
[09:53] <jeje> hi to all
[09:53] <jeje> I use ffmpeg to decompress video streaming from IP cameras
[09:54] <jeje> and I have to do some strange operation on the line size
[09:54] <jeje> av_image_fill_linesizes(m_lpFrame->linesize, PIX_FMT_YUV420P,dwWidth+(dwWidth%32)+32);
[09:55] <jeje> if I don't do this, all width resolutions which are not divisible by 32 have a wrong stride in the renderer (made by swscale)
[09:55] <jeje> If someone can explain this part to me, regards
[09:59] <jeje> for information, I use the ffmpeg version I found in the latest Zeranoe FFmpeg build, 32 bits
[10:04] <jeje> the video compression from camera is H264 of course
[10:07] <jeje> the renderer is made directly on a DirectDraw surface in RGB32
[10:09] <jeje> if needed I can make a pastebin code
[11:05] <jeje> also with the last version of ffmpeg, I have a crash when using avcodec_default_free_buffers
[11:05] <jeje> is this function deprecated?
[11:28] <jeje> please can someone make a comment, because it seems I just see the joins and quits in my IRC client
[11:38] <Samus_Aran> can ffmpeg dump single frames for specific timestamps in a video?
[11:40] <Samus_Aran> and if so, can it do it with a high level of time precision? mencoder is very inaccurate in this regard.
[11:40] <durandal_1707> Samus_Aran: try select filter
[11:41] <durandal_1707> it select decoded frames though...
[11:42] <Samus_Aran> I'm trying to automate this from the command line
[11:49] <Samus_Aran> the man page doesn't have any examples of the select filter, I'll try a web search, thanks.
[11:50] <durandal_1707> Samus_Aran: man page is not documentation, documentation is on web page
[11:51] <durandal_1707> http://ffmpeg.org/documentation.html
[11:53] <jeje> durandal_1707>thanks. About the buffer padding, I only see the input buffer must be padded with FF_INPUT_BUFFER_PADDING_SIZE
[11:53] <Samus_Aran> durandal_1707: man pages very often are documentation, i.e. man bash, man screen, etc.
[11:54] <Mavrik> not here.
[12:09] <Samus_Aran> what does this mean: "Buffering several frames is not supported. Please consume all available frames before adding a new one."
[12:09] <Samus_Aran> my command was: ffmpeg -i Video.mkv -r 1 -vframes 120 -ss 01:30:14 Snapshot-%5d.jpg
[12:11] <jeje> Can someone explain to me how to get and set the padding value for an AVFrame according to the width and height of the resolution ....
[12:14] <jeje> because I only use avcodec_alloc_frame to get my AVFrame, but I think I have to set a padding value
[12:15] <jeje> I need to call av_image_fill_linesizes(m_lpFrame->linesize, PIX_FMT_YUV420P,dwWidth+(dwWidth%32)+32) to have a good result
[12:16] <jeje> but I can't understand why, and durandal_1707 told me I have to pad and memset buffers
[12:16] <jmbccoffee> av_image_alloc ?
[12:16] <durandal_1707> buffers are allocated by other means ...
[12:18] <durandal_1707> jeje: your buffer that you allocate must be padded
[12:20] <jeje> but when I call avcodec_alloc_frame() to get my AVFrame, how can I set the padding buffer values?
[12:21] <jeje> because I don't allocate buffers
[12:21] <jeje> and it works
[12:30] <durandal_1707> that function allocates the AVFrame but not the buffers.....
[12:31] <jeje> I need to do a malloc or call an FFMPEG function to alloc the buffer then?
[12:36] <jmbccoffee> i just do an alloc_frame, image_alloc, and i never touch a video buffer
[12:37] <jeje> in the documentation of AVFrame, I just see: The buffers for the data must be managed through other means
[12:38] <Mavrik> avpicture_alloc is the call that fills AVFrame buffers properly
[12:39] <jmbccoffee> which is like obsoleted by image_alloc
[12:40] <Mavrik> mhm, av_image_alloc actually
[12:42] <jmbccoffee> heh, of course
[12:58] <Samus_Aran> can someone tell me why this command eats up all my system RAM and swap before I can kill it: ffmpeg -i Video.mkv -ss 1:13:13.13 -vframes 1 Shot-%6d.jpg
[13:00] <Samus_Aran> same error I mentioned a few minutes ago: [buffer @ 0xf43780] Buffering several frames is not supported. Please consume all available frames before adding a new one.
[13:00] <Samus_Aran> I'm just trying to save one frame...
[13:15] <Samus_Aran> this seems to work: ffmpeg -ss 1:13:13.13 -i Video.mkv -vframes 1 output.jpg
[13:25] <durandal_1707> Samus_Aran: you can achieve the same with the select filter
[13:26] <burek> Samus_Aran, try: ffmpeg -ss 1:13:13.13 -i Video.mkv -vframes 1 Shot-%6d.jpg
[13:27] <burek> (move -ss before -i)
[13:27] <burek> (it shouldn't crash your machine)
[13:27] <burek> i really should read all the messages before replying :)
[13:37] <Samus_Aran> burek: if it's after, it eats all the system RAM, then all the swap, then I can kill it
[13:37] <burek> yes
[13:37] <burek> you are actually parsing each frame that you decode
[13:38] <burek> and i guess ffmpeg buffers it (you didn't show your full log, so we can't tell more than this)
[13:38] <Samus_Aran> it repeated that message thousands of times
[13:38] <Samus_Aran> presumably for every frame
[14:12] <durandal_1707> Samus_Aran: select='gte(t\,10)*lte(t\,20)' <-- Select only frames contained in the 10-20 time interval
[14:17] <Samus_Aran> durandal_1707: I am trying to grab an individual frame for a given timestamp
[14:17] <Samus_Aran> which I have working now with the above command
[14:18] <Samus_Aran> I just tested on a 29.970 FPS video, doing the match for the first 29 frames, and they were all dumped correctly (each a different frame), so it appears to be time-accurate
[14:18] <Samus_Aran> *doing the math
[14:19] <Samus_Aran> -ss $frame/29.970
[14:24] <durandal_1707> Samus_Aran: are you telling me you are calling ffmpeg multiple times to grab single frame?
[14:47] <funyun> hi. can anyone tell me what these errors mean and what i can do to fix them? http://pastebin.com/nwc0qde4
[14:50] <JEEB> broken H.264 stream?
[14:56] <funyun> JEEB: any idea how to fix that?
[14:57] <JEEB> no idea :P
[14:57] <JEEB> you could poke the issue tracker with a sample
[14:57] <JEEB> and get a better explanation
[15:35] <Samus_Aran> durandal_1707: what do you mean? I am calling ffmpeg once to grab one frame
[15:36] <durandal_1707> Samus_Aran: so you need only a single frame from a single file?
[15:38] <durandal_1707> if you need multiple frames from single files select filter is better tool
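For reference, a sketch of that approach (frame numbers and filenames are placeholders; the escaped commas matter inside the filter expression):

```shell
# Grab frames 300 and 900 in a single decoding pass; -vsync 0 stops
# ffmpeg from duplicating frames to a constant output rate.
ffmpeg -i input.mkv -vf "select='eq(n\,300)+eq(n\,900)'" -vsync 0 frame-%03d.png
```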
[15:38] <Samus_Aran> durandal_1707: if you want to tell me how to dump frames at specific intervals, feel free. but if it requires the entire video to be decoded, it isn't of any use
[15:39] <durandal_1707> it should not need entire video to be decoded ...
[15:39] <Samus_Aran> it seemed that is what the select filter would be doing
[15:40] <durandal_1707> seemed?
[15:40] <Samus_Aran> it's a filter, filters do things to frames
[15:40] <durandal_1707> you actually tried it?
[15:40] <Samus_Aran> at least in other video editing apps
[15:40] <Samus_Aran> I have no clue how to use it
[15:41] <Samus_Aran> the man page doesn't explain, and the documentation I was pointed to isn't indexed well enough for me to have any idea where to look
[15:41] <durandal_1707> so is your solution working good for you?
[15:41] <Samus_Aran> it seems to be accurate to the frame.
[15:42] <Samus_Aran> takes about 0.5 to 1.0 seconds to dump a frame in PNG
[15:46] <Mavrik> Samus_Aran: doing a full decoding pass and grabbing frames you need is almost always the most efficient way of doing what you're trying to do
[15:46] <Mavrik> especially if number of frames is large
[15:48] <durandal_1707> Samus_Aran: so you have any further questions?
[16:00] <Samus_Aran> Mavrik: it takes me 1 second to grab a frame using my method, entire decoding would take over an hour. how is that more efficient?
[16:01] <Mavrik> entire decoding would take over an hour? what?
[16:01] <Mavrik> are you doing that on 300MHz ARM?
[16:01] <Samus_Aran> Mavrik: decoding a 3 hour 720p video
[16:01] <Mavrik> yes?
[16:01] <Samus_Aran> takes hours
[16:02] <Mavrik> well, that presumption is wrong.
[16:02] <Samus_Aran> if it is decoding the entire video, it isn't
[16:02] <Samus_Aran> it uses about 90% CPU time to play the video.
[16:03] <Mavrik> 300MHz ARM?
[16:03] <Samus_Aran> Mavrik: you're not being helpful in the slightest
[16:04] <Mavrik> yes, because you're not reading what I write.
[16:04] <retard> i've been using http://pastebin.com/1u96HKuz to extract clips from a file beginning at keyframes
[16:04] <Mavrik> you managed to miss this too: < Mavrik> | especially if number of frames is large
[16:04] <retard> which is ugly and slow but works
[16:04] <Mavrik> mhm
[16:05] <retard> without this convoluted bullshit the audio begins at the specified time while the video plays from the nearest keyframe
[16:05] <Mavrik> if you give seek parameters before "-i" ffmpeg will skip to keyframe
[16:05] <retard> leading to desynch in certain players
[16:05] <Mavrik> and then do a decoding pass until your set time
[16:05] <Mavrik> at least in new versions
[16:05] <retard> it still desynchs the audio, i tried
[16:05] <Mavrik> doesn't work on all videos though
[16:05] <Mavrik> yeah
[16:06] <retard> this script works with everything i throw at it, but it has to parse every god damn frame
[16:06] <Samus_Aran> Mavrik: -ss skips to any frame, not just keyframes. I have tested it accurately on a 29.970 FPS video to capture individual frames.
[16:06] <Mavrik> and again, you're not reading what I write.
[16:06] <Samus_Aran> <Mavrik> if you give seek parameters before "-i" ffmpeg will skip to keyframe
[16:06] <Mavrik> Samus_Aran: and next two lines.
[16:07] <Samus_Aran> yes, it does that. it drops 0 to 90 frames when capturing one shot
[16:07] <Samus_Aran> on my video sample
[16:10] <jeje> thanks a lot to all for the explanation about avcodec_alloc_frame and the need to use av_image_alloc afterwards to allocate the buffer of the AVFrame
[16:10] <jeje> it works well now
[16:11] <jeje> Just a final question. how to free the buffer of the AVFrame?
[16:11] <jeje> Is this call enough: av_free( m_lpFrame->data[0]) and after av_free( m_lpFrame)?
[16:11] <Mavrik> jeje: not always
[16:12] <Mavrik> (depends on pixel format)
[16:12] <jeje> so what is the "good" method to be sure to free everything allocated?
[16:12] <Mavrik> jeje: documentation says "The allocated image buffer has to be freed by using av_freep(&pointers[0]).
[16:12] <Mavrik> "
[16:12] <Mavrik> for av_image_alloc
[16:13] <Samus_Aran> Mavrik: for my usage, there will be a possibly large gap between frame captures (e.g. 20-30 seconds)
[16:13] <jeje> ok so I need to do av_freep(m_lpFrame->data[0]) and afterwards av_free(m_lpFrame)? that looks better?
[16:13] <Mavrik> jeje: yea
[16:14] <jeje> thanks a lot
[16:45] <jeje> Hi, I have a crash in my application when I use av_freep( m_lpFrame->data[0]).
[16:46] <jeje> i check beforehand that my AVFrame* m_lpFrame is not null (of course) and check if (m_lpFrame->data[0]!=NULL), but when I use av_freep(m_lpFrame->data[0]) I get an exception in the delete operator
[16:50] <jeje> To allocate my AVFrame pointer, I use m_lpFrame = avcodec_alloc_frame() and av_image_alloc(m_lpFrame->data,...)
[16:56] <jeje> sorry I just found my error : av_freep( &m_lpFrame->data[0])
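Putting the thread's conclusion together, the allocate/free pairing looks roughly like this against the libav* API of that era (a sketch, not compiled here; avcodec_alloc_frame was later superseded by av_frame_alloc, and the 32-byte alignment is the usual decoder-friendly choice):

```c
#include <libavcodec/avcodec.h>
#include <libavutil/imgutils.h>

/* Allocate an AVFrame plus its YUV420P picture buffer. */
static int make_frame(AVFrame **out, int w, int h)
{
    AVFrame *frame = avcodec_alloc_frame();  /* allocates the struct only */
    if (!frame)
        return AVERROR(ENOMEM);

    /* allocate the image buffer and fill data[] / linesize[] */
    int ret = av_image_alloc(frame->data, frame->linesize,
                             w, h, PIX_FMT_YUV420P, 32);
    if (ret < 0) {
        av_free(frame);
        return ret;
    }
    *out = frame;
    return 0;
}

static void free_frame(AVFrame *frame)
{
    av_freep(&frame->data[0]);  /* note the &: free the picture buffer */
    av_free(frame);             /* then free the struct itself */
}
```

Passing `frame->data[0]` instead of `&frame->data[0]` to av_freep is exactly the crash jeje hit: av_freep expects a pointer to the pointer so it can NULL it after freeing.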
[16:56] <yermandu> how can I convert to divx? my dvdplayer said that it accepts only divx 556 :|
[19:40] <lodenrogue> hello
[20:00] <Diogo> is it possible to have ffmpeg output the time in seconds??
[20:01] <Fjorgynn> what?
[20:01] <llogan> i don't understand the question
[20:01] <Fjorgynn> -t 60s or something?
[20:02] <llogan> you don't need the "s"
[20:06] <lodenrogue> i cant seem to install ffmpeg on Gnu/Linux mint
[20:06] <Fjorgynn> why?
[20:07] <lodenrogue> I tried installing it via ./configure, make, make install but after it does all that there is no executable file to run. I also tried to install it via the package manager and it won't appear anywhere on the computer
[20:10] <Fjorgynn> $ sudo apt-get install ffmpeg
[20:10] <Fjorgynn> lodenrogue: appear? You just do $ ffmpeg in the terminal...
[20:11] <lodenrogue> Fjorgynn: how do I get the program to run? I do ffmpeg in the terminal and all I get is release info
[20:11] <Fjorgynn> lodenrogue: ffmpeg isn't gui
[20:11] <lodenrogue> ???
[20:11] <lodenrogue> What?
[20:11] <lodenrogue> lol omg
[20:11] <lodenrogue> ok thanks
[20:11] <Fjorgynn> JEEB: give me a pillow
[20:12] <lodenrogue> is there a gui version of ffmpeg?
[20:13] <lodenrogue> or something similar with a gui option?
[20:13] <Fjorgynn> for what?
[20:13] <lodenrogue> I want to do gameplay live stream on twitch.tv
[20:14] <Fjorgynn> aha
[20:15] <Fjorgynn> and why do you want ffmpeg?
[20:15] <lodenrogue> I cant find any other GNU/Linux live stream recorder. And I read online that ffmpeg does that
[20:16] <Fjorgynn> http://linuxgamecast.com/2012/08/l-g-c-how-to-broadcasting-to-twitch-tv-and-justin-tv-with-linux/
[20:17] <lodenrogue> thanks man!
[20:17] <Fjorgynn> I googled twitch linux
[20:17] <lodenrogue> I get too specific with my searches so I miss the simple solutions. Thanks
[20:19] <hotwings> is ffmpeg git from videolan down right now? i get "Cloning into 'ffmpeg'..." but it just sits there forever
[20:20] <JEEB> yes
[20:20] <JEEB> and if you're on msys/mingw then it will clone like that
[20:21] <JEEB> (it doesn't have the detailed view by default or whatever)
[20:21] <hotwings> im using linux, which should give me more
[20:22] <hotwings> and git://source.ffmpeg.org/ffmpeg.git is just a redirect to videolan?
[20:22] <JEEB> yes
[20:22] <JEEB> git://git.videolan.org/ffmpeg.git is the same
[20:28] <hotwings> ok so it would be pointless to try cloning ffmpeg.org then. the way today has gone so far, im not surprised ... :\
[20:43] <llogan> hotwings: source.ffmpeg.org is the "official" one despite them currently being the same
[20:44] <JEEB> I don't see any real reason to move off of videolan though
[20:45] <beastd> JEEB: no reason ATM
[20:45] <JEEB> well yeah, maybe in case videolan gets destroyed or if there is :drama: between videolan and ffmpeg
[20:45] <JEEB> for whatever reason I don't really see that happening, though :)
[20:45] <beastd> yeah
[20:46] Action: beastd never saw any reason to split ffmpeg developer community and surely would have not seen it coming 5 years ago
[20:46] <llogan> developering is hard. let's go shopping.
[20:47] <JEEB> hats would be good
[20:47] <JEEB> D_S would join us for that too, probably!
[20:47] <llogan> handbag!
[20:49] <hotwings> [11:43:49] <llogan> hotwings: source.ffmpeg.org is the "official" one despite them currently being the same <-- i was told the opposite a couple months ago :\
[20:49] <JEEB> hotwings, the configure script currently derps at you if your remote is not source.ffmpeg.org >_>
[20:50] <llogan> hotwings: refer to line 4590
[20:50] <JEEB> so yeah, it's the "official" way of accessing it
[20:51] <llogan> i never did end up with many hats in TF2...
[20:51] <hotwings> im not saying youre wrong llogan, just noting whatever you were told before (even when its "official"), someone else will say the opposite :)
[20:51] <hotwings> its a common theme
[20:51] <llogan> then it sounds like to me then you're on the "official" internet
[20:51] <hotwings> indeed
[20:53] <beastd> hotwings: sorry for the inconvenience. but please just use/change to source.ffmpeg.org . we must put this system to work now, because the lazy way will surely be too late
[20:54] <hotwings> i originally used source.ffmpeg.org, but was told its just a mirror or videolan and shouldnt use it but rather use videolan directly :\
[20:54] <JEEB> they shouldn't be mirrors, it's just redirection or whatever
[20:54] <JEEB> so basically the same thing
[20:55] <llogan> it doesn't really matter at this point.
[20:55] <hotwings> why is there even two to begin with? seems pointless
[20:55] <beastd> hotwings: tje
[20:56] <hotwings> not following you beastd. tje a ffmpeg-scene thing or?
[20:56] <beastd> hotwings: nope. typo
[20:57] <beastd> hotwings: source.ffmpeg.org is just the DNS name you should use. the other one is just there because the repo is hosted by videolan and that is where the ffmpeg.git is at the moment
[20:57] Action: llogan adopts tje as a ffmpeg-scene thing.
[20:57] <beastd> tje :)
[20:57] <hotwings> lol
[20:57] <hotwings> tje!
[20:58] <beastd> historically source.ffmpeg.org did not exist when ffmpeg.git repo was created. that is the reason those two names are in the wild
[21:06] <Diogo> is it possible to have ffmpeg output the time in seconds??
[21:14] <Fjorgynn> Diogo: how do you mean?
[21:14] <Fjorgynn> ....
[21:23] <Diogo> change the time format so it can be parsed by an external program?
[21:23] <Diogo> time in seconds, not time for human reading like 00:00:20?
[21:24] <Diogo> is it possible for native ffmpeg to output json?
[21:25] <Diogo> with a status of encoding?
[21:27] <llogan> Diogo: ffprobe can output json: ffprobe -print_format json -show_streams input.foo
[21:35] <Diogo> no, and with ffmpeg?
[21:36] <Diogo> during the encoding...
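For machine-readable status during an encode, recent ffmpeg builds have a `-progress` option that writes key=value lines (frame, out_time, speed, progress, ...) to a file or URL. It is not json, but it is trivial to parse (hedged: the option and its exact keys depend on the version):

```shell
# Status blocks are appended to progress.txt roughly once a second:
#   frame=250
#   out_time=00:00:10.000000
#   progress=continue
ffmpeg -i input.mkv -progress progress.txt output.mp4
```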
[22:56] <mulicheng> Anyone using VDA hwaccel on OSX for h264 decode?
[22:58] <anesbit> hi all. i need to determine exactly how ffmpeg converts multichannel (esp. 2-channel or stereo) audio input into single-channel (mono) output. in other words, i need to know how the "-ac 1" output option works when presented with multichannel inputs.
[22:59] <anesbit> i'm pretty new to navigating the ffmpeg source code, so does anybody have any pointers, either as to where i should start looking, or what algorithm ffmpeg (or its libraries) uses?
[23:00] <JEEB> ffmpeg uses libswresample, and I would guess you would have to look at ffmpeg.c to see which exact settings it uses for it
[23:01] <anesbit> JEEB: thank you.
[23:01] <anesbit> i'll have a look
[23:05] <mulicheng> I stepped through the h264 code and found that the codec->id is always 0. Decoding works for me but the hwaccel is never found. I'm curious what initialization steps I may have missed to use hwaccel.
[00:00] --- Sat Jan 26 2013