[Ffmpeg-devel-irc] ffmpeg.log.20150319

burek burek021 at gmail.com
Fri Mar 20 02:05:01 CET 2015


[01:03:32 CET] <drFace> I'm new to ffmpeg and am having a problem with merging channels from 2 files. Is there an option like -shortest, but which limits the length of the output to that of a given input?
[01:32:11 CET] <c_14> concat all inputs shorter than the one you want with aevalsrc=0, then use -shortest
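c_14's trick (pad the shorter audio with silence, then let `-shortest` trim to the other input) can be sketched as command construction. A sketch in Python; the file names and stream labels are placeholders, not from the log:

```python
# Pad the shorter audio input with a silent source (aevalsrc=0), then
# concat, so -shortest trims the output to the longer input's length.
# "long.wav"/"short.wav" are hypothetical file names.
def build_pad_command(long_input, short_input, output):
    filter_complex = (
        "aevalsrc=0[s];"                       # silent audio source
        "[1:a][s]concat=n=2:v=0:a=1[padded]"   # append silence after the short input
    )
    return [
        "ffmpeg", "-i", long_input, "-i", short_input,
        "-filter_complex", filter_complex,
        "-map", "0:a", "-map", "[padded]",
        "-shortest", output,
    ]

cmd = build_pad_command("long.wav", "short.wav", "out.wav")
```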
[03:34:13 CET] <skyroveRR> I've got a device that outputs audio on a single channel, 16 bit signed little endian at about 8000 kHz, is it possible to listen to it using ffplay? The device shows up as /dev/ttyUSB3.
[03:40:37 CET] <c_14> wait
[03:40:41 CET] <c_14> 8000 kHz ?
[03:41:07 CET] <skyroveRR> Sorry, 8kHz.
[03:41:20 CET] <skyroveRR> 8000 Hz.
[03:42:50 CET] <c_14> try ffmpeg -f pcm_s16le -ar 8000 -i /dev/ttyUSB3 -f alsa default, though ffplay -f pcm_s16le -ar 8000 /dev/ttyUSB3 should work as well
[03:44:08 CET] <skyroveRR> c_14: Failed to set value 'pcm_s16le' for option 'f': Invalid argument
[03:44:21 CET] <c_14> Which one did you try?
[03:44:29 CET] <skyroveRR> ffplay command.
[03:45:19 CET] <c_14> eh
[03:45:21 CET] <c_14> right
[03:45:24 CET] <c_14> use -c instead of -f
[03:45:43 CET] <c_14> and -f u16le
[03:45:54 CET] <c_14> eh
[03:45:56 CET] <c_14> s16le of course
[03:46:28 CET] <c_14> eh, without -c
[03:46:40 CET] <c_14> seems ffplay doesn't like that as an input option
[03:46:46 CET] <c_14> the -f s16le should cover that though
[03:47:25 CET] <skyroveRR> ffmpeg -c pcm_s16le -ar 8000 -i /dev/ttyUSB3 -f s16le default is correct, c_14 ?
[03:48:09 CET] <c_14> either `ffmpeg -f s16le -ar 8000 -i /dev/ttyUSB3 -f alsa default' or `ffplay -f s16le -ar 8000 /dev/ttyUSB3'
[04:14:59 CET] <skyroveRR> c_14: ty.
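The raw-PCM layout discussed above (`-f s16le -ar 8000`, mono) can be reproduced for testing without the serial device. A sketch in Python that writes one second of signed 16-bit little-endian samples; the file name `test.raw` is just an example, playable with `ffplay -f s16le -ar 8000 test.raw`:

```python
import math
import struct

# One second of a 440 Hz sine as raw s16le mono samples at 8000 Hz --
# the same layout ffplay expects with `-f s16le -ar 8000`.
RATE = 8000
samples = [int(32767 * 0.5 * math.sin(2 * math.pi * 440 * n / RATE))
           for n in range(RATE)]
data = struct.pack("<%dh" % len(samples), *samples)  # "<h" = signed 16-bit LE

with open("test.raw", "wb") as f:
    f.write(data)
```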
[05:52:10 CET] <crazy6> I am successfully using FFMPEG to pull "raw" video data out of an AVI container, with "-vcodec copy -an -f rawvideo". How can I do the exact reverse of this? No debayering, no codec, just take binary data and stuff it back into an AVI, with a prescribed resolution and framerate?
[05:53:23 CET] <crazy6> This is the information that ffmpeg gives me on the source AVI:
[05:53:24 CET] <crazy6> http://pastebin.com/QdFAZqnM
[07:13:17 CET] <relaxed> crazy6: ffmpeg -f rawvideo -video_size 1920x1081 -pixel_format pal8 -i input
[07:17:36 CET] <crazy6> relaxed: Ah, thanks. I had to add a little to that, but this seems to work: ffmpeg -f rawvideo -video_size 1920x1081 -pixel_format pal8 -i iso3200.rawv -vcodec copy -an remux.avi
[07:18:25 CET] <crazy6> relaxed: I had a problem with choosing the right pixel format before ... but now it's obvious that I should have seen 'pal' in the info for the source AVI
[07:27:45 CET] <crazy6> relaxed, er, no, I spoke too soon: http://pastebin.com/F5KDV8KT
[07:28:12 CET] <crazy6> now I'm getting, "[avi @ 000000000447ba20] pal8 rawvideo cannot be written to avi, output file will be unreadable"
[08:10:57 CET] <shevy> is this scary? [mp3 @ 0x81cdb20] Application provided invalid, non monotonically increasing dts to muxer in stream 0: 13871 >= 12947
[08:11:00 CET] <shevy> from a .wma file
[08:54:30 CET] <relaxed> crazy6: try matroska
[08:54:40 CET] <relaxed> shevy: pastebin
[08:55:53 CET] <crazy6> relaxed, eh, I'm a bit constrained. This is an AVI that comes out of a special high speed camera. I'd like to do a black-level subtraction, then pack it back into their AVI format. Although, maybe I should rethink my workflow.
[09:23:14 CET] <onovy> hi, i have this project: I'm receiving a continuous mpeg2-ts stream and saving it to disk, split by size (for example 32 MB). Then I want
[09:23:18 CET] <onovy> to encode this input to 3 profiles (bitrates/resolutions), i-frame aligned, split by time (10s) for HLS adaptive playing. It's
[09:23:24 CET] <onovy> no problem to code something which reads these chunks from disk and sends them to ffmpeg for encoding. But I want a way to
[09:23:31 CET] <onovy> restart this process and start from the last encoded 10s chunk.
[09:52:37 CET] <heeen> is it possible to use ffmpeg to create glitch art
[09:53:06 CET] <heeen> e.g. remove i frames or repeat p frames
[09:54:18 CET] <heeen> http://i.imgur.com/bEPqGQy.gif
[09:54:21 CET] <heeen> like this
[09:54:39 CET] <heeen> or this http://i.imgur.com/ajizktD.gif
[09:55:49 CET] <Silex> heeen: my knowledge of ffmpeg is limited, but with filters you can "select/drop" i frames only so I guess it's feasible
[09:55:57 CET] <heeen> http://butmac.tumblr.com/post/103190316185/kryptinite-this-is-breaking-my-brain
[10:44:52 CET] <Silex> question: is there an index for the frames/timestamps in the h264 codec? when I seek into a video file, does it walk all the frames to find out where to jump?
[10:50:39 CET] <Silex> ah hum, apparently the index is done at the container level? e.g mkv?
[10:50:53 CET] Action: Silex is confused and wikipedia is unclear
[10:54:55 CET] <Silex> (fwiw, I understand that landing on a P/B frame will require you to load neighbor frames to display this frame properly, but my question is really about how seeking to a timestamp works)
[10:57:37 CET] <zotta> is there a way to remove dc from audio files using ffmpeg command line?
[11:09:01 CET] <cowai> If the only thing I am doing is "ffmpeg -i rtmp://origin -c copy -f flv rtmp://edge", can I get away with only building with flv and rtmp support?
[11:09:18 CET] <cowai> Or does it need to decode the video too even though it only copies?
[11:15:24 CET] <cowai> the stream is x264 and aac.
[11:18:27 CET] <Silex> cowai: I have no clue, but isn't this easy to test?
[11:19:07 CET] <beepbeep_> anyone knows what exactly goes wrong here?
[11:19:16 CET] <beepbeep_> $ /c/ffmpeg.exe -i "concat:/d/recordings/foo/loop00000.h264|/d/recordings/foo/loop00001.h264|/d/recordings/foo/loop00002.h264" -f h264 -vcodec copy -loglevel debug /d/captures/foo.h264
[11:19:39 CET] <beepbeep_> I get following output
[11:19:40 CET] <beepbeep_> concat:/d/recordings/foo/loop00000.h264|/d/recordings/foo/loop00001.h264|/d/recordings/foo/loop00002.h264: No such file or directory
[11:19:43 CET] <cowai> Silex: Good point, I will try it
[11:19:47 CET] <beepbeep_> the files definitely exist.
[11:20:11 CET] <beepbeep_> looks like ffmpeg doesnt know I want to concat
[11:21:57 CET] <beepbeep_> when I cd into the recordings/foo folder
[11:22:03 CET] <beepbeep_> and use relative filenames
[11:22:06 CET] <beepbeep_> then it does work
[11:22:14 CET] <beepbeep_> hmz
[11:30:13 CET] <cowai> beepbeep_: Are you sure "/d/" is an actual path in your ffmpeg binary
[11:31:07 CET] <cowai> I remember some time ago when I was using windows with a cygwin compiled ffmpeg binary I had to use "/cygdrive/d/" instead of "/d/".
[11:31:30 CET] <cowai> I may be totally off about that though.
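The concat protocol beepbeep_ is using takes one string of `|`-separated paths; building it programmatically avoids quoting mistakes, though (as the thread shows) whether absolute `/d/...` paths resolve depends on how the MSYS/Cygwin-built binary maps drive letters. A small sketch:

```python
# Build a concat-protocol input URL from a list of segment paths.
# The file names are the relative ones that worked in the log.
def concat_url(paths):
    return "concat:" + "|".join(paths)

url = concat_url(["loop00000.h264", "loop00001.h264", "loop00002.h264"])
# -> "concat:loop00000.h264|loop00001.h264|loop00002.h264"
```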
[11:39:06 CET] <Ders> Is there a way to tell when ffmpeg is done with its task after I've let in run via a c++ project? (using a popen() call)
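Ders's question goes unanswered in the log; with popen-style APIs the usual signal is the child's exit status: `pclose()` in C/C++ (or `wait()`/`communicate()` in Python) blocks until the process ends and returns its status. A runnable sketch, with a trivial child standing in for ffmpeg:

```python
import subprocess
import sys

# Detect when a spawned process (e.g. ffmpeg) finishes: communicate()
# blocks until the child exits, then returncode holds its exit status.
proc = subprocess.Popen([sys.executable, "-c", "print('done')"],
                        stdout=subprocess.PIPE, text=True)
out, _ = proc.communicate()     # returns only after the process exits
finished_ok = proc.returncode == 0
```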
[13:13:19 CET] <heeen> I compiled with libvpx enabled but it still says vp8 encoder not found
[13:13:51 CET] <Mavrik> ffmpeg -codecs says what?
[13:15:18 CET] <heeen> only D
[13:15:33 CET] <heeen> D.V.L. vp8                  On2 VP8
[13:15:58 CET] <heeen> D.V.L. vp9                  Google VP9
[13:16:11 CET] <heeen> the libvpx is from ubuntu 14.04
[13:16:23 CET] <heeen> trying with upstream libvpx
[13:17:38 CET] <Mavrik> heeen, full output.
[13:17:48 CET] <Mavrik> last I checked it was named libvpx, not vp8 or vp9
[13:19:49 CET] <heeen> http://pastebin.com/raw.php?i=J7YsXGjZ
[13:20:27 CET] <heeen> oooh
[13:20:29 CET] <heeen> wtf
[13:20:35 CET] <heeen> --prefix=~/linux
[13:20:52 CET] <heeen> it created a directory called "~"
[13:20:55 CET] <Mavrik> :)
[13:21:02 CET] <Mavrik> wrong ffmpeg binary I guess? :P
[13:21:41 CET] <heeen> yeah
[13:22:29 CET] <heeen> how come this mp4 file I created does not play in firefox or chrome
[13:22:46 CET] <heeen>  http://heeen.de/zdf_neo-fake/diff.mp4
[13:23:16 CET] <Mavrik> *shrug*
[13:30:50 CET] <heeen> [libvpx @ 0x30f5560] Failed to initialize encoder: ABI version mismatch
[13:30:52 CET] <heeen> damn it
[13:32:46 CET] <heeen> uhm
[13:32:56 CET] <heeen> libvpx did not build and install a .so
[13:33:00 CET] <heeen> only a .a
[13:33:22 CET] <__jack__> heeen: no configure option about that ?
[13:38:39 CET] <heeen> I assumed shared was the default. trying with --enable-shared
[13:38:54 CET] <heeen> make clean
[14:58:16 CET] <mop> hi all, do you know if it is possible to call an external program to produce a PNG image with alpha channel, use it as a frame overlay on an input video, and then save the merged frame to the output?
[15:01:52 CET] <cowai> Can I take three separate inputs and copy each to a different output with only one process?
[15:22:23 CET] <c_14> cowai: yes
[15:53:00 CET] <cowai> c_14: any link for more info?
[15:53:10 CET] <cowai> Or care to explain?
[15:55:05 CET] <c_14> I'm not entirely sure exactly what you want, but something like `ffmpeg -i in0 -i in1 -i in2 -map in0 out0 -map in1 out1 -map in2 out2' might be what you're looking for?
[15:55:13 CET] <c_14> All in all, various combinations of map should do it
[15:58:19 CET] <cowai> I use three of this today: ffmpeg -i rtmp://origin/[quality] -c copy -f flv rtmp://edge/[quality]
[15:58:31 CET] <cowai> for three different bitrates
[15:59:26 CET] <c_14> just replace the in[] with the streams, the out[] with the output streams, and add -c copy after each map
[16:06:31 CET] <cowai> c_14: have I used map correctly? "ffmpeg -i rtmp://origin/high -i rtmp://origin/medium -i rtmp://origin/low -map 0 -c copy -f flv rtmp://edge/high -map 1 -c copy -f flv rtmp://edge/med -map 2 -c copy -f flv rtmp://edge/low"
[16:08:10 CET] <c_14> should work
[16:08:31 CET] <cowai> so -map 0 means input 0 right ?
[16:08:47 CET] <c_14> map all the streams from the 0th input file, yes
[16:18:38 CET] <cowai> c_14: thanks. It worked!
[16:19:06 CET] <cowai> Although it used about 10 times more virtual memory (buffers/cache)
[16:21:15 CET] <cowai> instead of 3 processes using about 30mb each. It went to one process using 275mb
[16:21:49 CET] <cowai> but the actual used memory is about just 3 times more, as expected.
[16:22:07 CET] <cowai> But I dont know if I should care about the buffers/cache being used.
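The single-process relay cowai ends up with can be sketched as command construction (host names follow the log's `rtmp://origin` / `rtmp://edge` placeholders):

```python
# One ffmpeg invocation copying three RTMP inputs to three outputs:
# each -map N selects all streams of the Nth input for the output
# that follows it, with -c copy to avoid re-encoding.
qualities = ["high", "medium", "low"]
cmd = ["ffmpeg"]
for q in qualities:
    cmd += ["-i", "rtmp://origin/" + q]
for i, q in enumerate(qualities):
    cmd += ["-map", str(i), "-c", "copy", "-f", "flv", "rtmp://edge/" + q]
```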
[17:26:52 CET] <TommyC> Hello #ffmpeg, I've been stuck at a loss (and I don't consider this really a development question). I'm trying to find how ffmpeg generates its man pages given the texi files in doc/ (in the source). I can't find anything about a tex2man so was wondering if someone might know?
[17:38:12 CET] <Adtoox-Ola> Hello, everybody! Is this the right place to ask questions about ffmpegs audio capabilities?
[17:40:40 CET] <Adtoox-Ola> I was looking for a way to normalize a *lot* of audio files. I was thinking it might be possible in ffmpeg, using 2 passes... Any pointers?
[17:40:56 CET] <Mavrik> Adtoox-Ola, look at available audio filters
[17:41:01 CET] <Mavrik> I think normalize actually exists
[17:42:15 CET] <c_14> TommyC: I believe it's texi2pod and pod2man
[17:42:32 CET] <TommyC> c_14: Thank you!
[17:43:36 CET] <Adtoox-Ola> I've checked ffmpeg.org, and it seems like I can only VIEW for instance "peak level", but I can't find any way to automatically apply volume changes using that info.
[17:44:14 CET] <Adtoox-Ola> i.e. Normalization doesn't seem to exist yet, but I might be wrong :-/
[17:57:00 CET] <c_14> Adtoox-Ola: what do you mean with normalization? FFmpeg has filters for replaygain, which could be used together with the volume filter, volumedetect + volume can be used, as well as the compand filter for dynamic range compression.
[18:01:26 CET] <Adtoox-Ola> c_14 Scenario: Using ffmpeg to analyze an audiofile. (for instance with -af "astats"). The result shows that there is a lot of headroom, peak level is -16dB. I want a way to use this info, so that I can normalize the peak level to -1dB. For a lot of files (batch normalize).
[18:02:08 CET] <Adtoox-Ola> c_14: ^_^
[18:04:29 CET] <c_14> The volumedetect filter together with the volume filter can accomplish that. Sadly I don't think the volumedetect filter can inject data to the volume filter, so you'll have to do a 2-pass and parse the output from the volumedetect filter.
[18:04:49 CET] <c_14> https://ffmpeg.org/ffmpeg-filters.html#volumedetect
[18:12:37 CET] <Adtoox-Ola> Thanks c_14, I'll definitely try this using 2 pass. Tomorrow. Thanks again! :)
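The two-pass scheme c_14 describes boils down to: run `ffmpeg -i in.wav -af volumedetect -f null -`, parse `max_volume` from stderr, then apply the difference with the volume filter. A sketch of the parsing step; the stderr line below is a hand-written example of volumedetect's report format:

```python
import re

# Pass 1 parsing: extract max_volume from volumedetect's stderr report
# and compute the gain (in dB) needed to reach a target peak.
def gain_for_target(stderr_text, target_db=-1.0):
    m = re.search(r"max_volume:\s*(-?\d+(?:\.\d+)?) dB", stderr_text)
    max_db = float(m.group(1))
    return target_db - max_db   # dB to apply in pass 2

example = "[Parsed_volumedetect_0 @ 0x1] max_volume: -16.0 dB"
gain = gain_for_target(example)
# pass 2 would then be: ffmpeg -i in.wav -af volume=15.0dB out.wav
```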
[21:13:58 CET] <gcl5cp> what are OGG's parameters equivalent to HE-AAC compression?
[21:14:16 CET] <skitfoo> what would a ffmpeg command look like to capture a stream on rtsp if the ip/port were 192.168.1.9:559 with user name "skit" and password "password12345" just as an example?
[21:16:08 CET] <skitfoo> I just need a command to get me started where I can plug in my actual credentials, online documentation has not gotten me very far.
[21:30:40 CET] <c_14> skitfoo: `ffmpeg -i rtsp://skit:password12345@192.168.1.9:559' iirc
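When plugging real credentials into c_14's URL form, characters like `@` or `:` in the password must be percent-encoded or the URL parses wrongly. A sketch using skitfoo's example values:

```python
from urllib.parse import quote

# Build an rtsp:// URL with percent-encoded credentials so reserved
# characters in the user name or password don't break parsing.
def rtsp_url(user, password, host, port):
    return "rtsp://%s:%s@%s:%d" % (quote(user, safe=""),
                                   quote(password, safe=""),
                                   host, port)

url = rtsp_url("skit", "password12345", "192.168.1.9", 559)
# -> "rtsp://skit:password12345@192.168.1.9:559"
```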
[22:16:21 CET] <bramgg> How do I cut a gif without losing any quality? `ffmpeg -i original.gif -ss 00:00:05 cut.gif` cuts the gif as expected but it's very low quality and has these weird yellow borders everywhere.
[22:27:21 CET] <__jack__> bramgg: ffmpeg -i blabla -ss .. -c copy output
[22:30:36 CET] <bramgg> __jack__: thanks, but getting this error: "Assertion video_enc->pix_fmt == AV_PIX_FMT_PAL8 failed at libavformat/gif.c:130 \n Aborted"
[22:35:09 CET] <__jack__> that's a bug, hum
[22:36:05 CET] <bramgg> __jack__: I just installed the latest version (from git) of ffmpeg an hour ago. I passed no options to the configure file, could I have missed something important?
[22:37:38 CET] <__jack__> bramgg: try with a "stable" release, maybe there is a commit that breaks stuff
[22:38:44 CET] <bramgg> __jack__: the static build from https://www.ffmpeg.org/download.html?
[22:39:06 CET] <bramgg> *linux static build
[22:45:51 CET] <bramgg> __jack__: just did that and ran `ffmpeg -i out.gif -ss 00:00:05 -t 10 -c copy cut.gif` and am getting the same error
[22:48:29 CET] <__jack__> bramgg: can you share the source gif somewhere ?
[22:49:26 CET] <bramgg> __jack__: sure, do you know of an image host that doesn't edit the source?
[22:49:35 CET] <bramgg> __jack__: or would you be fine with downloading it from somewhere?
[22:50:53 CET] <__jack__> bramgg: tell me
[22:52:34 CET] <__jack__> a file upload like mega or whatever (don't know a few, don't use it) should not modify files
[22:52:50 CET] <bramgg> __jack__: just sent you the file via dcc
[22:52:56 CET] <bramgg> i'll upload to mega if you want though
[22:57:37 CET] <bramgg> __jack__: dcc request aborted, pm'd download link
[23:00:30 CET] <bramgg> actually I could have just uploaded that to my own server -_-
[23:04:54 CET] <viric> bramgg: http://viric.name/cgi-bin/filegive/
[23:06:51 CET] <bramgg> viric: interesting, thanks
[23:23:26 CET] <bramgg> In case anyone new can help out... `ffmpeg -i out.gif -ss 00:00:05 -t 10 -c copy cut.gif` is aborting with the error "Assertion video_enc->pix_fmt == AV_PIX_FMT_PAL8 failed at libavformat/gif.c:130"
[23:29:17 CET] <__jack__> still looking at your issue, got the same error, trying another way
[23:34:17 CET] <bramgg> __jack__: thanks, so you think this is specific to my gif?
[23:34:40 CET] <bramgg> It was recorded with Byzanz
[23:41:36 CET] <bramgg> __jack__: I don't want you to spend too much of your time on this. I'll just re-make it with ffmpeg.
[23:43:39 CET] <iive> i don't think you can do stream copy with gif
[23:58:34 CET] <Mista_D> Can FFmpeg's stdout and stderr output be suppressed except the filter reports (silence and black frame detection)?
[23:59:36 CET] <bramgg> So I recorded my screen with ffmpeg as an mp4, and it came out fine. Then I ran `ffmpeg -i input.mp4 output.gif` and it still gives this weird yellow tint: http://www.bram.gg/img/giffy.gif
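bramgg's yellow tint is typical of GIF encoding with a generic palette; a common workaround (not stated in the log) is a two-pass encode with the palettegen/paletteuse filters. A sketch of the two commands as argument lists:

```python
# Two-pass GIF encode: pass 1 computes an optimized 256-color palette,
# pass 2 maps the video through it. File names mirror bramgg's example.
pass1 = ["ffmpeg", "-i", "input.mp4", "-vf", "palettegen", "palette.png"]
pass2 = ["ffmpeg", "-i", "input.mp4", "-i", "palette.png",
         "-lavfi", "paletteuse", "output.gif"]
```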
[00:00:00 CET] --- Fri Mar 20 2015

