[Ffmpeg-devel-irc] ffmpeg.log.20140624

burek burek021 at gmail.com
Wed Jun 25 02:05:01 CEST 2014


[00:01] <ogrgkyle> okay thanks!
[00:01] <ogrgkyle> I'll play around with that
[00:01] <ogrgkyle> maybe i'll be back at some point with more questions
[00:04] <ogrgkyle> So the Dev version doesn't contain ffmpeg.exe...
[00:04] <ogrgkyle> How do I use this?
[00:05] <c_14> Oh, the dev version is for linking programs you make against. If you want a .exe just go with the static or shared. static is probably the easiest if you just want to try facedetect on a video.
[00:05] <ogrgkyle> okay... So static will probably contain all the "opencv stuff"?
[00:06] <c_14> I'm guessing he compiled it in, but try it and you'll find out.
[00:06] <ogrgkyle> thanks
[00:24] <elichai2> hey
[00:24] <elichai2> i'm trying to use avconv
[00:24] <elichai2> and i get this error:
[00:24] <elichai2> encoder 'aac' is experimental and might produce bad results.
[00:24] <elichai2> Add '-strict experimental' if you want to use it.
[00:24] <elichai2> any idea how to fix?
[00:24] <elichai2> (already tried adding that -strict)
[00:24] <c_14> "Add '-strict experimental' if you want to use it."
[00:25] <c_14> Also, avconv is part of libav, not ffmpeg. Either use ffmpeg or see #libav for support.
[00:25] <elichai2> c_14: tried that
[00:25] <c_14> ffmpeg -i file -c:a aac -strict experimental outfile
[00:26] <elichai2> ffmpeg isn't deprecated?
[00:27] <c_14> libav is a fork of ffmpeg that provides a binary called ffmpeg for historical reasons, and added that message for whatever reason even though it is way too misleading.
[00:27] <c_14> If you just want to download a quick working ffmpeg.
[00:27] <elichai2> ok....
[00:27] <elichai2> so what command do I need to run with ffmpeg to convert 720p to 480p?
[00:27] <elichai2> (it's mp4)
[00:28] <c_14> ffmpeg -i file.mp4 -vf 'scale=-1:480' -c:v [videocodechere] -c:a copy out.mp4
[00:31] <c_14> See the filtering guide and the h264 guide if you want something decent to start out with: https://trac.ffmpeg.org/wiki/FilteringGuide https://trac.ffmpeg.org/wiki/Encode/H.264
[01:52] <ac_slater> hey all. I'm losing my mind with ffmpeg's (libavformat) RTP stack. The idea of all of these contexts is fine, but I see little usage documentation. How does an RTPContext relate to a URLContext, etc?
[02:16] <ac_slater> I'm actually a little shocked at how few examples there are ... ~4k lines of examples and ~1mill lines of code
[02:18] <c_14> Write some, submit a patch. Additions are always welcome.
[02:19] <ac_slater> c_14: I totally will. For the next ~8months I'll be working in almost nothing but ffmpeg.
[02:19] <ac_slater> (libav*)
[02:20] <c_14> The best documentation for the libav* libraries that I can give you is the doxygen.
[02:20] <ac_slater> c_14: yea, I mean the doxygen is nice. But picking the lucky context/function out of the large pool is not very guided.
[02:21] <ac_slater> Looking at the mux/demux/enc/dec code is useful too... but without setup or context
[02:24] <c_14> The libav* libraries have a rather steep learning curve.
[02:55] <ac_slater> c_14: it seems that way. I'm interested in writing some *muxers that work with RTP - thus implementing our own payload types.
[04:01] <ac_slater> so, regarding the decoding_encoding example ... the pts (h264) is set to the frame number. When playing the file in `mplayer`, I get complaints about PTS being wrong. What's the deal?
[04:14] <needmorespeed1> I'm also facing a problem with pts, using av_read_frame and avcodec_decode_video2, the AVPacket pts equals dts. Changing the rate that these are called, pts and dts are adjusted. I am expecting that pts is a constant regardless of decoding rate.
[04:18] <needmorespeed1> There is some info on pts/dts here:
[04:18] <needmorespeed1> http://dranger.com/ffmpeg/tutorial05.html
[04:18] <needmorespeed1> ac_slater: You find anything?
[04:19] <ac_slater> needmorespeed1: I found some various things .. I also read that tutorial ... didnt help :(
[04:19] <ac_slater> needmorespeed1: http://arashafiei.wordpress.com/2012/10/19/pts-for-h264-encoding/ ... this looks interesting ... except I have no AVStream
[04:19] <needmorespeed1> However, ffmpeg reorders the packets so that the DTS of the packet being processed by avcodec_decode_video() will always be the same as the PTS of the frame it returns. But, another warning: we won't always get this information, either.
[04:20] <ac_slater> needmorespeed1: oh nevermind, I do have an AVSTream
[04:20] <ac_slater> interesting
[04:20] <needmorespeed1> Not to worry, because there's another way to find out the PTS of a frame, and we can have our program reorder the packets by itself. We save the PTS of the first packet of a frame: this will be the PTS of the finished frame. So when the stream doesn't give us a DTS, we just use this saved PTS. We can figure out which packet is the first packet of a frame by letting avcodec_decode_video() tell us. How? Whenever a packet starts a frame, the avcodec
[04:21] <needmorespeed1> My file is not h264 though: Video: msmpeg4v2 (MP42 / 0x3234504D)
[04:21] <needmorespeed1> I think ffmpeg handles it generically at this level?
[04:25] <ac_slater> needmorespeed1: I'm a newbie with video stuff... I'm almost clueless regarding the importance of PTS/DTS and their calculations
[04:26] <needmorespeed1> ac_slater: In arasha link there is no AVStream, did you get a correct PTS?
[04:26] <ac_slater> needmorespeed1: no sadly
[04:27] <needmorespeed1> Need PTS to render at the correct rate. I can fake it now by decoding at roughly the frame rate. But I will need to do better.
[04:29] <needmorespeed1> It seems like a common problem on google. No one in this channel has stepped up yet. They are letting us tough it out.
[04:29] <ac_slater> needmorespeed1: ;)
[04:29] <ac_slater> it seems encoder/decoder and muxer specific... right?
[04:31] <needmorespeed1> Maybe it can be retrieved from the context or codec using the fps there?
[04:31] <needmorespeed1> Stream #0:0: Video: msmpeg4v2 (MP42 / 0x3234504D), yuv420p, 854x480, 1840 kb/s, 24 fps, 24 tbr, 24 tbn, 24 tbc
[04:31] <needmorespeed1> Create my own PTS using fps?
[04:33] <ac_slater> I tried ... pts = (1.0 / fps *  90 * current_frame); ... apparently the incorrect approach. It was monotonic though.
[04:34] <ac_slater> I'm a newbie with this stuff so I honestly dont know why 90khz was chosen
[04:34] <ac_slater> (apparently a requirement of h264 to some extent)
[04:35] <ac_slater> I might just recompile ffmpeg.c with debug flags ... find a commandline run that works and set some break points
[04:36] <needmorespeed1> Just use the fps, not sure about the 90khz either. So, (1/25 fps) / (1/1000) = 40 timebase units. Increment by 40 each frame.
[04:36] <needmorespeed1> http://stackoverflow.com/questions/13595288/understanding-pts-and-dts-in-video-frames
[04:37] <ac_slater> interesting
[04:37] <needmorespeed1> Would be nicer if it was provided by ffmpeg as pts, that would be official
[04:40] <ac_slater> needmorespeed1: I guess I'm confused. I set pts... ffmpeg is fine with it. Other players like mplayer claim "No PTS!"
[04:41] <needmorespeed1> VLC?
[04:41] <ac_slater> ffplay doesnt care either
[04:41] <ac_slater> only mplayer says this ... but to some extent... there must be some incorrect settings
[04:41] <needmorespeed1> try VLC
[04:42] <needmorespeed1> I think VLC is more maintained than mplayer
[04:42] <ac_slater> but my mplayer (actually mpv) uses libav* for its backend
[04:42] <ac_slater> yea I use the newer `mpv`
[04:42] <ac_slater> I don't like vlc very much ...
[04:43] <ac_slater> do you know of a way `ffplay` can dump its pts?
[04:43] <ac_slater> (for the playing stream obviously)
[04:43] <needmorespeed1> libav and ffmpeg diverged, probably some incompatibilities
[04:43] <needmorespeed1> libav split from ffmpeg
[04:44] <needmorespeed1> try VLC for a second opinion
[04:44] <needmorespeed1> or another player
[04:44] <ac_slater> I meant the ffmpeg libav* libraries ... wildcard meaning all
[04:44] <Dark-knight> mkv to mp4
[04:45] <Dark-knight> i need the coder
[04:45] <Dark-knight> code*
[04:45] <Dark-knight> please and thank you
[04:46] <ac_slater> Dark-knight: source code? or ffmpeg utility invocation?
[04:46] <Dark-knight> command line to convert mkv to mp4
[04:46] <ac_slater> Dark-knight: literally the first result on google
[04:47] <Dark-knight> nvm
[06:28] <ac_slater> I asked this in the devel channel since I can't really find the answer in the docs / examples ... but if I run `./decoding_encoding h264` then `./demuxing_decoding test.h264 test2` ... the second output reports NOPTS. What's the proper way to set PTS here?
[06:33] <ParkerR> ac_slater, where are you getting these scripts?
[06:34] <ac_slater> ParkerR: they are in `doc/examples` since 2.0
[06:34] <ParkerR> Ahh
[09:10] <polysics> hello! I am now concatenating some silent video files, then adding audio with a separate command
[09:11] <polysics> oops, nvm
[09:11] <polysics> I was using m2v as output :)
[09:12] <volmatrix> hello everyone, is it possible to encrypt a multicast udp live stream (with ffmpeg of course) ?
[09:25] <satanist> volmatrix: https://ffmpeg.org/ffserver-all.html#crypto
[09:31] <polysics> my H264 encoding looks really bad (and file size is not that great)
[09:31] <polysics> currently just using -vcodec mpeg4 -acodec aac -strict -2 output/concat.mp4
[09:32] <polysics> I guess I need way more options :)
[09:44] <satanist> you can use -b:v or -qscale:v to configure the output quality
[09:45] <satanist> and mpeg4 != h264
[09:45] <volmatrix> satanist: crypto works only with ffserver? I heard it's not maintained anymore
[09:50] <satanist> i don't know but how about you test it
[09:50] <satanist> if it doesn't work you can use srtp or a wrapper with openssl s_server
[09:52] <maksimkaa> hello, whenever I generate a new video with some drawn text on it, I keep seeing a cursor blinking on various parts of the screen. Can anybody advise why that is and how to hide that cursor?
[10:00] <volmatrix> satanist: I tried srtp with no success
[10:30] <kriegerod> if i have all my inputs specified in -filter_complex_script file, how can i execute ffmpeg so that it doesn't complain about missing input file directives?
[10:44] <kriegerod> and when i add "-f lavfi -i testsrc" as placeholder, i get "not connected to any destination" error. Is there currently a way to overcome this?
[11:03] <SleepyPikachu> Hello everyone, I'm looking for someone who has experience using libavfilter. I want to set the sample_fmt of a buffer source to AV_SAMPLE_FMT_NONE but can't work out how to do so.
[11:12] <saste> SleepyPikachu, what's the point of setting it to NONE?
[11:15] <SleepyPikachu> saste: It's compulsory to set, must match the decoder and the decoder has NONE
[11:15] <SleepyPikachu> saste: At least in my understanding. ;-)
[11:23] <saste> SleepyPikachu, from my understanding the decoder sample_fmt should be set
[11:24] <saste> you just need to make sure that decoder and buffer source sample format *match*
[11:24] <saste> to set the sample format on the filter, you can initialize the arguments in the init function
[11:24] <saste> or you can set the option on the filter context by using the av_opt API
[11:25] <saste> did you have a look at the doc/examples examples?
[11:27] <SleepyPikachu> saste: Thanks for taking the time to investigate, the decoder -> encoder has been working without setting the sample format. Now I'm trying to decoder -> filter -> encoder and as you say the decoder and filter must have matching formats. Could the decoder not need a sample format because the av concerned is the audio?
[11:45] <saste> SleepyPikachu, when you configure the decoder (avcodec_open) the decoder should have the sample format set
[11:45] <saste> i am assuming you have an *audio* decoder
[11:46] <saste> the sample format set on the buffersrc sets the sample format *expected* by the source buffer
[11:46] <saste> so yes you need to set that
[11:47] <saste> i don't understand what you mean by "av concerned"
[11:49] <SleepyPikachu> AV stands for AudioVisual, I was proposing that given the sample formats are things like floating planar and the decoder -> filter -> encoder is for audio it might not need a format.
[11:50] <saste> SleepyPikachu, this is the workflow, you init the decoder and the encoder
[11:50] <saste> then you set up the filterchain
[11:50] <saste> you need to make sure that decoder <-> abuffersrc and abuffersink <-> encoder sample formats match
[11:51] <SleepyPikachu> agreed
[12:36] <maksimkaa> hello, whenever I generate a new video with some drawn text on it, I keep seeing a cursor blinking on various parts of the screen. Can anybody advise why that is and how to hide that cursor?
[12:36] <kriegerod> maksimkaa: no idea. Describe your ffmpeg cmdline and how you playback it
[12:40] <maksimkaa> kriegerod: it is like this: ffmpeg -y -i input.mp4 -vf "drawtext=fontfile=/usr/share/fonts/truetype/ttf-dejavu/DejaVuSerif.ttf: text='Hello world!': draw='eq(n,12)': x=200: y=200: fontsize=14: fontcolor=white: box=1: boxcolor=black" -vcodec libx264 -preset fast -crf 18 output.mp4
[12:43] <kriegerod> maksimkaa: what's your ffmpeg version?
[12:45] <kriegerod> maksimkaa: i suppose what you see may be just a presentation artifact during playback
[12:46] <maksimkaa> i am using latest ffmpeg
[12:47] <maksimkaa> kriegerod: what i see is like a blinking cursor that is most of the time at the beginning of the text but sometimes it shows on other areas...
[14:44] <maksimkaa> kriegerod: any thoughts or suggestions?
[14:45] <slystone> Hey guys! I need your expertise. So I have to convert some video files (*.dv to *.webm). So I'm using ffmpeg. But I also have to add some images (beginning and end of the videos).
[14:45] <slystone> Is there a way to merge the images without reencoding the whole video (and thus not losing quality)?
[14:46] <slystone> Otherwise, would you recommend the version of ffmpeg in Debian testing (2.2.3)?
[14:47] <sacarasc> If you do it at the same time as the dv -> webm conversion, you'd only have to encode once.
[14:47] <sacarasc> To do that, you'd have to use the concat filter, I believe.
[14:50] <slystone> Cheers sacarasc :)
[16:09] <Dave77> where can I get a binary of ffmpeg?
[16:11] <sacarasc> Which OS?
[16:11] <Dave77> linux with ARM cpu
[16:12] <Dave77> never had much luck with compiling myself.. never normally works
[16:13] <sspiff> Dave77, what goes wrong when compiling?
[16:14] <Dave77> just linux source compiling in general.. either the compile script doesn't work, or there's some lib version or link problem etc etc..
[16:15] <sspiff> I see
[16:20] <Fjorgynn> troll
[16:21] <sspiff> Dave77, well, I don't know of any static builds, is your ARM board running a distribution?
[16:22] <Dave77> yes ubuntu.. will try compile src.. see how far i get
[16:22] <sspiff> Dave77, why not use the ubuntu provided package?
[16:24] <klaxa> sspiff: that would be avconv then presumably
[16:24] <sspiff> klaxa, really?
[16:26] <sspiff> doesn't ubuntu bundle ffmpeg proper?
[16:26] <c_14> Nah, ubuntu bundles libav. At least last time I checked.
[16:30] <sspiff> bah
[17:51] <bd12> Hello, is it possible for ffmpeg to redirect output to a socket, and create the socket at the same time? For example -f mpegts tcp://127.0.0.1:8000 , but I want ffmpeg to create that socket too if it doesn't exist
[17:58] <cuba> bd12: listen=1 ?
[18:00] <bd12> what is the complete command after -f
[18:04] <cuba> ttp://www.ffmpeg.org/ffmpeg-protocols.html
[18:04] <cuba> http://www.ffmpeg.org/ffmpeg-protocols.html
[18:09] <bd12> I'm using this command
[18:09] <bd12> ffmpeg  -i "STRAEM_URL" -acodec copy -vcodec copy -f mpegts tcp://127.0.0.1:8000?listen=1
[18:10] <bd12> but when i connect to 8000 then ffmpeg stops
[18:13] <bd12> It accepts one connection, it starts the stream but then when i close the connection
[18:13] <bd12> ffmpeg stops
[18:20] <cuba> you can use socat bd12
[18:20] <cuba> for every connection you start a new ffmpeg instance
[18:21] <bd12> can't it be done with one ffmpeg? I mean we can use netcat or something to handle this
[18:21] <bd12> ?
[18:23] <ogrgkyle> Hi
[18:23] <ogrgkyle> I downloaded Zeranoe FFmpeg dev build
[18:23] <ogrgkyle> not build
[18:24] <ogrgkyle> But I downloaded Zeranoe FFmpeg dev version
[18:24] <ogrgkyle> I'm trying to get one of the example C scripts to compile
[18:24] <ogrgkyle> But I get a bunch of "undefined reference to" errors in codeblocks
[18:25] <cuba> you need to link against the lib ogrgkyle
[18:25] <ogrgkyle> I've tried adding the lib folder to Search Directories > Linker
[18:25] <cuba> add the .lib things in codeblocks
[18:26] <cuba> you need to add like "avcodec.lib" etc
[18:26] <ogrgkyle> And adding the .lib files to Linker Settings > Link Libraries
[18:26] <cuba> i dont know codeblocks but you need to add every .lib on it own i guess
[18:26] <ogrgkyle> Yeah, that's what I don't understand.  I already added those
[18:26] <cuba> the names
[18:28] <ogrgkyle> Which IDE do you use?
[18:30] <cuba> vs ogrgkyle
[19:50] <fixedtype> hi. is it possible to use complex filtergraph with ffplay? i'd like to play two videos side by side to compare quality but for this i need overlay filter.
[20:02] <Filarius> hello, I want to stream frames generated from my software. I thought I could just save frames to a jpeg file and use ffmpeg to stream. But ffmpeg crashes if I try to replace the file. Is there any fix?
[20:18] <keynote2k> Hi, all. I'm with Software Freedom Conservancy. We were investigating a GPL violation report on behalf of one of our member projects.  During our investigation of the report, we discovered that the violator is likely using ffmpeg as well.
[20:18] <keynote2k> I'd like to report this possible violation to the proper channels.  Where should I send the report?  Please advise.
[20:20] <klaxa> ffmpeg-devel list might be the right place
[20:20] <klaxa> the mailing list that is
[20:21] <sacarasc> There used to be a fun page on the website about projects doing that...
[20:21] <sacarasc> Can't find it now, though.
[20:25] <sfan5> fixedtype: https://lists.ffmpeg.org/pipermail/ffmpeg-user/2013-June/015662.html
[20:33] <keynote2k> klaxa:  is the ffmpeg-devel list a better place to go than the #ffmpeg-devel IRC channel?
[20:33] <klaxa> maybe do both?
[20:33] <keynote2k> sacarasc: I had heard there was a website and/or a dedicated email address; I couldn't find it.
[20:33] <keynote2k> klaxa: worth a shot.
[20:34] <keynote2k> Thanks for your responses.  :)
[20:34] <klaxa> if it's about licensing, emails seem more appropriate
[20:34] <fixedtype> sfan5: this works with ffmpeg but not with ffplay. ffplay doesn't have -filter_complex option
[20:35] <sfan5> you could try ffmpeg <options> -vcodec huffyuv -f matroska - | ffplay -
[20:35] <sfan5> something like that might work
[20:35] <sfan5> ^ fixedtype
[20:39] <fixedtype> sfan5: yes, this works, although seeking is not possible with a pipe.
[20:39] <fixedtype> http://ffmpeg.org/ffmpeg-filters.html#movie
[20:39] <fixedtype> i think i've found the workaround
[20:51] <Filarius> why ffmpeg do so many fps ? http://pastebin.com/gDQs6HZM
[20:53] <c_14> Do you mean that fps counter on the bottom?
[20:53] <c_14> That's your encoding fps, how many frames you're encoding per second. It has nothing to do with your playback fps.
[20:54] <Filarius> oh, thanks
[20:58] <Filarius> hm
[20:59] <Filarius> I'm doing something wrong, I save it as an flv file and after a few seconds I have about a 2 hour movie
[20:59] <c_14> How many input images do you have?
[21:00] <Filarius> I need take 2 pictures per second and stream it like 20 fps video
[21:00] <c_14> I'm pretty sure -loop 1 isn't doing what you think it's doing.
[21:02] <Filarius> I want to make a live stream of pictures generated on the fly. I'm just too much of a dummy to add streaming inside my application
[21:03] <Filarius> so as test I make "1.jpg" and "2.jpg" and trying to stream it
[21:03] <c_14> ffmpeg -framerate 1 -i %01d.jpg [output options] rtmp://localhost:1935/live
[21:04] <c_14> Should give you 40 frames output video.
[21:05] <Filarius> what about loop ? if I want stream only this two images
[21:06] <c_14> afaik loop will only loop over a single image.
[21:06] <Filarius> and replace it with other images (same name)
[21:06] <Filarius> no, it's good for a sequence
[21:07] <Filarius> at least it works, I just see a big fps
[21:07] <c_14> ye, like I said before. That should be fine. That's just the encoding fps. Or do you mean in the output file/stream?
[21:09] <Filarius> if I save to flv file instead of stream I have file with hours of video, and it was generated in seconds
[21:10] <c_14> Well, you're looping forever. And encoding two frames forever isn't all that difficult.
[21:10] <c_14> Add something like -t 5:00 to the output options and it'll stop after 5min
[21:10] <Filarius> but I need livestreaming...
[21:11] <c_14> just for when you're outputting to the file...
[21:13] <Filarius> so for streaming it must be okay? I think I found strange behavior with the rtmp stream, and it looks the same as with a file - it doesn't show changes, even with big buffers in mind
[21:14] <Filarius> if I replace files with another one
[21:15] <c_14> I'm guessing that if you use -loop 1, ffmpeg just loads the files into memory and then never queries if they've changed.
[21:17] <Filarius> is there a trick to make live encoding/streaming to a file?
[21:18] <Filarius> so ffmpeg will not do it so fast
[21:20] <c_14> try adding -re as an input option? but I'm still not sure what you're trying to do.
[21:24] <Filarius> ok, from the start - I want to generate jpeg files from my application, and use ffmpeg to stream them on the fly, as a pretty dumb but easy way to live stream from my own application. First I thought to save frames to a jpeg file with the same name
[21:25] <c_14> just save them with increasing numbers ie 00000.jpg 000001.jpg etc, then ffmpeg -re -framerate 1 -f image2 %04d.jpg [output options]
[21:25] <c_14> (note, I didn't actually count how many zeros I used)
[21:27] <Filarius> but this way I somehow must know which image was last processed by ffmpeg
[21:27] <c_14> Hmm, why?
[21:28] <Filarius> what will ffmpeg do if no more images are left?
[21:28] <c_14> stop
[21:29] <Filarius> so if my software for some reason doesn't create a new image right in time, the stream will end, and I don't want that
[21:30] <Filarius> also too many images is not so good, if I want to make a long stream with a good framerate
[21:31] <ogrgkyle> Hello again
[21:31] <Filarius> I had to work with folder with +50000 images on usual HHD, it`s not so nice
[21:32] <ogrgkyle> I have solved the issue of the "undefined reference" errors, I think.
[21:32] <Filarius> *HDD, sorry for misstypes
[21:32] <ogrgkyle> Now there's one error left: ld.exe cannot find -lswscale.lib
[21:32] <ogrgkyle> Do you know what this is about?
[21:33] <c_14> Filarius: just delete the older pictures as you go along?
[21:33] <Filarius> but what if I delete images that were not yet processed?
[21:34] <c_14> ffmpeg should eat one frame per second, keep around 60 frames and you should be fine as long as your app generates frames at about 1 fps
[21:35] <Filarius> I do not want to make timing be boss here
[21:37] <Filarius> I know my situation is pretty synthetic, sorry if I seem like an annoying person
[21:38] <Filarius> I want to make a live stream that shows (almost) real time video, a 60 second delay would be too long
[21:42] <c_14> you might be better off muxing the pictures into some sort of video codec like mjpeg and then feeding that to ffmpeg
[21:43] <iamtakingiteasy> hi, which options would you recommend for white/black (even not grayscale) x264-encoded video of many small volatile text characters (ascii-art video) for real-time streaming to rtmp server?
[21:44] <iamtakingiteasy> for some reason i am experiencing massive lags on some clients at -maxbitrate >100k and -crf <40
[21:44] <iamtakingiteasy> but it is nearly unwatchable on such low bitrates
[21:45] <iamtakingiteasy> -maxrate, sorry
[21:45] <Filarius> I just tested it with 1 image file in the sequence, replacing it with 3 other images, and it saves to a file in pretty much the right way, just too many frames per second...
[21:45] <Filarius> it`s with loop
[21:45] <c_14> Filarius: -re -framerate 1 should limit ffmpeg to processing only 1 frame per second
[21:46] <c_14> iamtakingiteasy: you're probably going to need a relatively high bitrate because of the fast movement of the chars so it doesn't blur, what kind of lag are you experiencing?
[21:47] <iamtakingiteasy> c_14: video plays fine on the client (when the buffer is filled, which takes about 5-10 seconds), plays for a while and then stops (sometimes only video stops while audio keeps playing fine), then video jumps without playing and stops again
[21:48] <iamtakingiteasy> err
[21:48] <iamtakingiteasy> the first time it plays, it lasts no longer than 15 seconds
[21:49] <c_14> what if you were to output to a file and then play that file, same problems?
[21:49] <iamtakingiteasy> hmm, can't really say, i am encoding&streaming on headless system; i'll try to setup Xvnc now
[21:50] <Filarius> Oh thanks c_14, can I kiss you? I'm so stupid. Now it works how I need, at least with a file, will check it with a stream now
[21:59] <beastd> Filarius: Sorry, I am not sure what exactly you are trying to accomplish, but maybe image2pipe could help with your producer/consumer synchronization problem? See e.g. at the end of this FAQ for a simple example: http://www.ffmpeg.org/faq.html#How-do-I-encode-single-pictures-into-movies_003f
[22:00] <Filarius> I think pipes not available in OS Windows
[22:01] <iamtakingiteasy> c_14: yes, the same problem persists when played from local file
[22:01] <Filarius> just think about it like bad(easy) way to make rtmp streaming from own software
[22:02] <Filarius> not production, just hobby and curious
[22:02] <c_14> iamtakingiteasy: can you pastebin the command you're using to make the video and if possible a sample file with which the problem persists? If not a paste of the ffprobe output maybe?
[22:06] <iamtakingiteasy> the command i am using is:
[22:06] <iamtakingiteasy> ffmpeg -re -i 'VTS_01_1.VOB' -vf 'scale=iw/3:ih/4, aa=contrast=50:fontsize=10:fontname=liberation mono' -crf 30 -vcodec libx264 -pix_fmt yuv420p -preset ultrafast -ar 44100 -f flv ~/out.flv
[22:07] <iamtakingiteasy> the sample file: http://static.eientei.org/data/out.flv
[22:09] <iamtakingiteasy> ffprobe output: http://paste.eientei.org/show/225/
[22:10] <c_14> And you are experiencing the stuttering with this sample file?
[22:11] <iamtakingiteasy> yes, on my clients which are supposed to decode this as a real-time stream
[22:12] <c_14> The file plays just fine for me.
[22:12] <c_14> not stuttering.
[22:12] <iamtakingiteasy> hmm
[22:40] <beastd> Filarius: AFAIK pipes are available in Windows since somewhen in the 80s and still supported today. No guarantee about any behavioural differences that might matter in your use case though.
[22:42] <fixedtype> when ffplay sends output via xv does it prescale 720x480 (DAR 4:3) movie to "square" pixels or is the scaling done by hardware? how can i tell from output?
[22:43] <fixedtype> the output tells me: w:720 h:480 sar:0/1 -> w:720 h:480 sar:0/1
[22:44] <fixedtype> but i get correct aspect ratio on display. does it mean the scaling is done in hardware?
[22:44] <fixedtype> also what does 0/1 value for sar mean?
[22:45] <fixedtype> actual sar value is 8:9
[22:45] <sacarasc> Screen aspect ratio of 8 across for every 9 up.
[22:50] <fixedtype> sacarasc: yes, i know what it means. i'm trying to investigate who's in charge of scaling correction
[22:51] <fixedtype> i was referring to odd value for sar -- 0/1
[22:55] <fixedtype> i also confused xv (as in mplayer) for sdl in the previous message. ffplay is sdl based
[23:04] <ogrgkyle> Hey folks
[23:04] <ogrgkyle> Very basic question here
[23:04] <ogrgkyle> I want to build the FFmpeg libraries
[23:05] <c_14> https://trac.ffmpeg.org/wiki/CompilationGuide
[23:05] <ogrgkyle> Is "compiling FFmpeg" the same as "building FFmpeg libraries"?
[23:06] <sacarasc> They get compiled along with the ffmpeg executable.
[23:06] <sacarasc> You can pass a switch to the configure to not compile that bit if you don't want it.
[23:06] <ogrgkyle> Oh ok
[23:06] <_aeris_> ohai everybody !
[23:07] <ogrgkyle> What is cross compiling?
[23:07] <_aeris_> i try to concatenate 2 files in webm (vp8 576p + mono vorbis)
[23:07] <_aeris_> but the result is a very ugly "iron" sound
[23:08] <_aeris_> concatenating 2 files with stereo: ok, 2 files in mono: not ok :(
[23:08] <c_14> ogrgkyle: compiling something on one platform for use on a different platform
[23:11] <_aeris_> https://paste.imirhil.fr/?86e551d1e63566fa#X4ByEWhuchB7gh+GkYUec01Lg4kAB5pzwf2sIRdGmNM=
[23:13] <Nitrodist> anyone know why I get an sample_aspect_ratio of 0:1 (same for display_aspect_ratio) when I convert a gif to mp4 with these options? https://gist.github.com/Nitrodist/cadcf4c6949d903dc6b7
[23:14] <Nitrodist> the sample_aspect_ratio is determined from ffprobe -show_streams
[23:14] <c_14> _aeris_: And the command?
[23:15] <_aeris_> c_14 > what command ?
[23:15] <_aeris_> ffmpeg -y -f concat -i /tmp/concat20140624-29289-15dd8ul -c copy transition-libre-experience-ong.webm
[23:15] <_aeris_> this command, in the pastebin ?
[23:15] <ogrgkyle> Is there a way to download FFmpeg prebuilt with GCC 4.7.1?
[23:19] <fixedtype> ogrgkyle: it's probably easier to build it yourself
[23:20] <c_14> Nitrodist: can you paste the output of ffprobe ?
[23:21] <c_14> _aeris_: I don't know. Might be a bug. Does it have the same problems if you reencode the audio?
[23:21] <Nitrodist> c_14: https://gist.github.com/Nitrodist/a1cdbfaac1698e865f43
[23:23] <c_14> Nitrodist: try adding setsar=1 after the scale filter?
[23:27] <Nitrodist> c_14: sorry, I'm not sure how to append that to the end of that
[23:27] <c_14> *2,setsar=1'
[23:27] <Nitrodist> ah, comma, cool
[23:29] <Nitrodist> c_14: well, it's set now -- I'll see if it fixes my problem. Thanks a lot!
[23:30] <debianuser> _aeris_: If instead of concatting 2 different files you specify the same file twice, does the sound get concatted correctly?
[23:33] <_aeris_> debianuser > yep
[23:44] <debianuser> _aeris_: Both of them? Have you checked both file1+file1 and file2+file2 sounds?
[23:45] <_aeris_> one of the files is a generated one with "no" sound
[23:45] <_aeris_> a still image + /dev/zero for sound
[23:45] <_aeris_> difficult to say if it's good or not :P
[23:49] <debianuser> _aeris_: Heh... It's just that '-f concat' is very picky about what it wants to concat. I hit this recently when I tried to concat wav+ogg. I ended up using the concat filter, e.g.: ffmpeg -i 1.wav -i 2.ogg -i 3.wav -filter_complex concat=n=3:v=0:a=1 0.wav
[23:50] <_aeris_> not able to reencode for me
[23:50] <_aeris_> too long
[23:50] <_aeris_> and bad quality
[23:58] <debianuser> I just tried that with two 24kHz mono ogg/vorbis `ffmpeg -i file1.ogg -i file2.ogg -filter_complex concat=n=2:v=0:a=1 concat.ogg` worked fine for me...
[23:59] <_aeris_> for a 3min sound, perhaps
[00:00] --- Wed Jun 25 2014


More information about the Ffmpeg-devel-irc mailing list