[Ffmpeg-devel-irc] ffmpeg.log.20130824
burek
burek021 at gmail.com
Sun Aug 25 02:05:01 CEST 2013
[01:31] <CentRookie> hello all :)
[01:32] <CentRookie> Can somebody who knows encoders tell me what I have to change to fix "MB-tree frametype 0 doesn't match actual frametype 2"?
[01:33] <CentRookie> I try to run the 1st pass with ultrafast but obviously it doesn't work well with my 2nd-pass settings; the first pass works fine with veryfast
[01:39] <CentRookie> hm, i guess it cant be fixed
[01:54] <CentRookie> so silent
[01:54] <CentRookie> maybe they are all ghosts
[01:54] <CentRookie> does somebody have experience with multi file 2 pass encoding?
[02:10] <mjiig> when i try to make a timelapse video using the command "ffmpeg -y -r 5.0 -f image2 -i /home/angus/glapse/%09d.jpg /home/angus/glapse/timelapse.mp4.avi" i get a video that's much much whiter than the original images
[02:10] <mjiig> is there anyway i can avoid this?
[02:19] <CentRookie> -chromaoffset -2
[02:22] <mjiig> that doesn't change anything
[02:22] <CentRookie> :(
[02:23] <CentRookie> i doubt it actually parses the images like videos, since you have no interframes
[02:23] <CentRookie> so there isnt much you can do with filters
[02:26] <mjiig> ugh, just got a friend to look at it and it's fine for him, so i think it's probably the player that's screwing not ffmpeg
[02:27] <mjiig> sorry for the time waster
[02:41] <rafael2k> people, I'm having trouble decoding HD h.264 using crystalhd hw accel. With gst-crystal all works fine; any recent issue related to crystalhd h.264 decoding?
[02:41] <rafael2k> btw, I'm using bcm 70012 hw decoder
[05:59] <elkng> how much RAM is enough for ffmpeg if you encode video ? if there is 8GB RAM will ffmpeg use it all ? what if there is 32GB RAM ?
[06:00] <relaxed> elkng: you can use `top` to view its memory usage.
[06:02] <relaxed> I doubt it would use that much memory unless your frames were *HUGE*
[06:02] <relaxed> like xbox
[06:02] <klaxa> lol
[06:05] <elkng> I'm converting 1920x800 to 650x350 and top shows it uses 81MB RSS, so about 128MB is enough?
[06:06] <elkng> seems like the only thing that benefits from a huge amount of RAM is kernel compilation
[06:06] <klaxa> virtualization :x
[06:06] <klaxa> that uses quite some ram too
[06:06] <elkng> relaxed: "like xbox", what do you mean by that ?
[06:07] <elkng> klaxa: blender
[06:07] <klaxa> that too
[06:07] <klaxa> also: http://knowyourmeme.com/memes/huge-like-xbox-hueg-like-xbox
[06:07] <relaxed> "huge like xbox" was a meme
[06:07] <klaxa> re: xbox hueg
[06:08] <elkng> or gimp could consume much RAM
[06:08] <elkng> or converters like imagemagic
[06:09] <elkng> actually I remember running imagemagick to convert some image, about 10000x5000 or so, on a machine with 1GB RAM, and it was killed by the system because of out-of-memory issues; I wasn't able to convert that image even with 1GB RAM
[06:11] <elkng> so it seems like all ffmpeg needs RAM for is the one current frame? so it converts video frame by frame and the amount of RAM the ffmpeg process needs depends on the size of a single frame?
[06:12] <klaxa> yeah imagemagic eats ram like pacman eats pills
[06:12] <klaxa> *imagemagick
[06:12] <klaxa> i think it also depends on the codec?
[06:12] <elkng> I thought there was some sort of optimization in ffmpeg, some kind of caching or so; the more RAM the faster the process
[06:12] <klaxa> x264 needs a lot of ram for mb-tree i think although i have no idea how it works exactly
[06:12] <klaxa> but i think they optimize P and B frames?
[06:13] <klaxa> hmm... encoding one frame will probably take significantly longer than reading a frame from the disc
[06:13] <klaxa> *disk
[06:13] <elkng> I'm converting now from "1920x800 x264" to "650x350 mpeg4"
[06:15] <elkng> and it eats 81MB RAM
[06:16] <klaxa> you could put the input file to /tmp if it is mounted as a ramfs and output to /tmp too
[06:16] <klaxa> that would speed up reading the file from the filesystem
[06:16] <klaxa> since it is actually in ram already
[06:16] <klaxa> but i don't think it will speed up the process really UNLESS, your bottleneck is your disk io
[06:16] <klaxa> which i highly doubt
[07:14] <elkng> I have atom 1.6
[07:15] <klaxa> optimizing on the ram-side is probably misplaced then :P
[09:14] <liquidmetal> I have a general codec question
[09:14] <liquidmetal> http://developer.android.com/reference/android/media/MediaCodecInfo.CodecCapabilities.html#COLOR_TI_FormatYUV420PackedSemiPlanar
[09:14] <liquidmetal> This 'color format' says it's packed AND semi-planar.
[09:14] <liquidmetal> how is that possible
[09:14] <liquidmetal> ?
[11:27] <JEEB> liquidmetal, I wonder if that's something like NV12
[11:28] <JEEB> (not related to nvidia even with that name)
[11:28] <liquidmetal> JEEB, I understand planar formats
[11:28] <liquidmetal> I understand packed formats
[11:28] <liquidmetal> aren't they mutually exclusive?
[11:29] <JEEB> well, if you have one plane as planar and two of the other planes as packed together
[11:29] <liquidmetal> (okay, I kinda understand those two terms)
[11:29] <JEEB> although to be honest that thing doesn't document what it is so lol
[11:30] <JEEB> liquidmetal, and yes
[11:30] <JEEB> COLOR_TI_FormatYUV420PackedSemiPlanar is NV12
[11:30] <JEEB> from a random piece of code I found on the internet
[11:30] <liquidmetal> ah!
[11:30] <liquidmetal> So packed semi-planar means, one part is planar and the other is packed
[11:31] <JEEB> something like that, the name in this case is rather ambiguous
[11:31] <liquidmetal> Got it!
[11:31] <liquidmetal> Now I need to figure out how to decode NV12 on android
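JEEB's guess above matches how NV12 is usually described: one planar Y plane followed by a single half-resolution plane where the U and V bytes are packed together, which is presumably where the "packed semi-planar" name comes from. A minimal Python sketch of that layout, assuming a tightly packed buffer with no row stride or padding (real Android SoCs often do add alignment):

```python
def nv12_to_i420(buf: bytes, width: int, height: int):
    """Split a tightly packed NV12 buffer into planar Y, U, V.

    NV12 = full-size Y plane, then one plane with U and V bytes
    interleaved (UVUVUV...), subsampled 2x2.  No stride handling.
    """
    y_size = width * height
    y = buf[:y_size]
    uv = buf[y_size:]
    u = uv[0::2]   # even bytes are U
    v = uv[1::2]   # odd bytes are V
    return y, u, v

# 4x2 frame: 8 Y bytes, then one 2x1 interleaved UV pair block
y, u, v = nv12_to_i420(bytes(range(8)) + b"\x10\x20\x11\x21", 4, 2)
```

The deinterleave step is the whole difference from a fully planar format like I420, where U and V would already be two separate runs of bytes.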
[11:43] <fschuetz> Why is there no avformat_close_output method? Am I missing something?
[11:49] <fschuetz> I could really use some help with converting an audio file to another format. I have checked the examples and documentation, however each example seems to take a different approach to tasks like opening/closing files and writing to them.
[11:50] <fschuetz> There also seems to be a general absence of symmetry in ffmpeg. Is there a particular reason, why opening and writing to files are so very different for input and output?
[12:34] <fschuetz> A great start would be, if someone could explain to me, what open_audio in http://ffmpeg.org/doxygen/trunk/doc_2examples_2muxing_8c-example.html does
[12:35] <fschuetz> I think everything it computes should be determined by output format and codec.
[13:13] <liquidmetal> Looking at http://developer.android.com/reference/android/media/MediaCodecInfo.CodecCapabilities.html I found that there are two different color formats mentioned there:
[13:13] <liquidmetal> COLOR_FormatYUV422SemiPlanar
[13:13] <liquidmetal> COLOR_FormatYUV422PackedSemiPlanar
[13:14] <liquidmetal> This makes me wonder - what's the difference between these two formats?
[13:17] <blez> hello
[13:17] <blez> is there a version of ffmpeg that supports hardware acceleration
[13:18] <fschuetz> compile it
[13:21] <Mavrik> blez, what is "hardware acceleration" for you?
[13:21] <Mavrik> liquidmetal, I think line alignment
[13:21] <blez> CUDA support for example?
[13:21] <Mavrik> liquidmetal, but I haven't used it enough to be sure
[13:21] <Mavrik> blez, for which part of transcoding process?
[13:22] <liquidmetal> Mavrik, I'm sure the naming wouldn't be android specific - so there must be some documentation about this.
[13:22] <Mavrik> anyway, no, there are no CUDA encoders in ffmpeg at the moment since they're silly
[13:22] <liquidmetal> Any clue where I can find more about this?
[13:22] <Mavrik> liquidmetal, why are you sure naming isn't android specific?
[13:22] <Mavrik> liquidmetal, and you will find that info in SoC documentation
[13:22] <Mavrik> so look at qualcomm, nvidia
[13:23] <liquidmetal> SoC documentation?
[13:23] <liquidmetal> Mavrik, sure because of faith in Android :) Which means I could be wrong
[13:24] <liquidmetal> But I just want to know the difference between 'packed semiplanar' and 'semiplanar'
[13:25] <blez> what about OpenCL?
[13:31] <Mavrik> liquidmetal, and I'm telling you you need to consult documentation of a SoC that produces that kind of images
[13:31] <Mavrik> since in Android doc it's not clearly specified
[13:32] <Mavrik> liquidmetal, also, if you love youself just a little
[13:32] <Mavrik> you'll forget about MediaCodec API for a few versions of Android more at least
[13:32] <Mavrik> it's a horrible horrible mess
[13:33] <liquidmetal> Mavrik, they now have tests for those APIs - so they'll stay consistent with newer version of android
[13:34] <liquidmetal> Just that the % of devices they work on is limited - I don't care about that just yet
[13:34] <Mavrik> liquidmetal, the problem is that pixel formats aren't defined
[13:34] <Mavrik> and the format in which the devices expect frames varies wildly without a reliable way to check
[13:35] <Mavrik> anyway, as I said, I think the non-packed YUV:4:2:2 has to be line aligned on 16-byte mark, but I'm not 100% sure
[13:39] <Mavrik> liquidmetal, doh sorry, I'm talking crazy talk
[13:39] <Mavrik> liquidmetal, "packed" formats have luma and chroma channels interleaved
[13:39] <liquidmetal> Mavrik, how's that different from semiplanar?
[13:40] <Mavrik> good question actually
[13:41] <Mavrik> that's also what doesn't make sense to me
[14:18] <CentRookie> hi
[14:18] <CentRookie> im trying to run multiple 2-pass encodings simultaneously
[14:19] <CentRookie> but the mbtrees keep overwriting each other
[14:19] <CentRookie> is there a way to define the mbtree names?
[14:21] <relaxed> -passlogfile
[14:22] <CentRookie> it also applies to mbtree?
[14:22] <CentRookie> and i only need that for first pass encoding right
[14:23] <relaxed> yes and no
[14:23] <CentRookie> thanks :)
[14:23] <relaxed> the second pass needs the name too
[14:23] <CentRookie> hmm
[14:23] <CentRookie> do i assign it as input somehow?
[14:24] <CentRookie> or do i say for 2nd pass -passlogfile video02
[14:24] <relaxed> the same name for both
[14:24] <relaxed> passes
[14:24] <CentRookie> ok
[14:24] <CentRookie> helpful like always :)
[14:25] <CentRookie> wish there was a parameter in ffmpeg to auto remove logtrees after last pass
[14:27] <relaxed> it's trivial to script
[14:28] <CentRookie> yup
[14:28] <CentRookie> still it is so useful
[14:28] <CentRookie> if there was such an option i mean
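The thread above can be sketched as a small helper: both passes share one -passlogfile prefix so parallel jobs don't clobber each other's stats, and the log files are deleted after the last pass (the option CentRookie wished for). The exact suffixes (`-0.log`, `-0.log.mbtree`) are what ffmpeg's libx264 wrapper writes per stream; the bitrate and filenames here are just placeholders:

```python
import os

def two_pass_cmds(src, dst, logname):
    """Build first/second pass ffmpeg command lines that share one
    -passlogfile name, so simultaneous jobs keep separate stats."""
    common = ["ffmpeg", "-y", "-i", src, "-c:v", "libx264", "-b:v", "1M"]
    first = common + ["-pass", "1", "-passlogfile", logname,
                      "-f", "null", os.devnull]
    second = common + ["-pass", "2", "-passlogfile", logname, dst]
    return first, second

def cleanup_logs(logname):
    """Remove the stats files after the last pass."""
    for suffix in ("-0.log", "-0.log.mbtree"):
        try:
            os.remove(logname + suffix)
        except FileNotFoundError:
            pass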
[15:36] <fschuetz> i am trying to convert audio. decode works and got packet, however avcodec_encode_audio2 crashes. Is there something I need to do with the decoded frame, before I can hand it to avcodec_encode_audio2? Code is here: http://pastebin.com/UkQywDHF
[15:51] <fscz> in the examples/muxing.c write_audio_frame there is a resampling step. Is resampling necessary, if you get the frame from avcodec_decode_audio4?
[15:59] <fscz> It would be smarter if this step were integrated into the respective encode/decode functions
[16:08] <hackeron> hey, I'm trying to record audio from alsa with: ffmpeg -f alsa -ac 1 -i hw:1,0 -dn -vn -codec:a libfdk_aac -flags +qscale -ar 44100 -y test.mkv -- but I am getting: "[alsa @ 0x2333bc0] cannot set sample format 0x10000 2 (Invalid argument)" -- the sample format used by arecord is "S32_LE" but I can't see it in the list of -sample_fmts -- any ideas?
[16:10] <klaxa> just try s32p maybe? :x
[16:11] <klaxa> and s32
[16:11] <klaxa> one of them might sound correct
[16:12] <hackeron> klaxa: I'm not sure where to put it, but I tried: ffmpeg -f alsa -ac 1 -sample_fmt s32 -i hw:1,0 -sample_fmt s32 -dn -vn -codec:a libfdk_aac -flags +qscale -ar 44100 -y test.mkv -- and it throws the same: "[alsa @ 0xe74cc0] cannot set sample format 0x10000 2 (Invalid argument)"
[16:12] <hackeron> am I putting it in the wrong place?
[16:12] <klaxa> no i think that place is right
[16:12] <klaxa> wait
[16:12] <hackeron> no matter what sample format I use, it always says: "cannot set sample format 0x10000 2"
[16:12] <klaxa> remove the second one
[16:12] <klaxa> hmm yeah weird
[16:13] <hackeron> same thing with: ffmpeg -f alsa -ac 1 -sample_fmt s32 -i hw:1,0 -dn -vn -codec:a libfdk_aac -flags +qscale -ar 44100 -y test.mkv
[16:13] <hackeron> ffmpeg -f alsa -ac 1 -sample_fmt dblp -i hw:1,0 -dn -vn -codec:a libfdk_aac -flags +qscale -ar 44100 -y test.mkv --- also returns: "cannot set sample format 0x10000 2 (Invalid argument)" :/
[16:14] <klaxa> can you try: ffmpeg -f alsa -i hw:1,0 test.wav ?
[16:15] <klaxa> and if that works add more and more arguments to the command line
[16:15] <hackeron> yep: [alsa @ 0xd738c0] cannot set sample format 0x10000 2 (Invalid argument)
[16:15] <hackeron> hw:1,0: Input/output error
[16:15] <klaxa> so
[16:15] <klaxa> input output error sounds quite suspicious
[16:15] <klaxa> second hardware soundcard, first device?
[16:16] <hackeron> this is the format from arecord: http://pastie.org/8265575
[16:16] <hackeron> the output I mean
[16:16] <hackeron> so the hardware is fine :/
[16:17] <hackeron> and the resulting wav file plays just fine
[16:17] <klaxa> hmm hmm
[16:20] <hackeron> klaxa: the full ffmpeg debug output: http://pastie.org/8265584
[16:21] <klaxa> hmm dunno wait for someone who knows about this to show up :/
[16:23] <hackeron> also, check this out: http://pastie.org/8265588 -- so no matter what sample format I try, it always says: "cannot set sample format 0x10000 2" - hmmm :/
[16:44] <hackeron> for a workaround I'm recording through dsnoop which is working :)
[16:45] <hackeron> but the audio input has 12 channels - how do I record just channel 6 for instance? -- I'm trying ffmpeg -f alsa -ac 12 -i plug:capt -ar 44100 -map_channel 0.0.5 -y test.wav -- but I'm getting silence (probably channel 1)
[16:45] <hackeron> any ideas?
[16:55] <zap0> 0.0.5 ?
[16:55] <hackeron> is that wrong?
[16:56] <hackeron> I tried this: ffmpeg -f alsa -ac 12 -i plug:capt -ar 44100 -map_channel 0.0.0 -y test0.wav -map_channel 0.0.1 -y test1.wav -map_channel 0.0.2 -y test2.wav -map_channel 0.0.3 -y test3.wav -map_channel 0.0.4 -y test4.wav -map_channel 0.0.5 -y test5.wav -map_channel 0.0.6 -y test6.wav -map_channel 0.0.7 -y test7.wav -map_channel 0.0.8 -y test8.wav -map_channel 0.0.9 -y test9.wav -map_channel 0.0.10 -y test10.wav -map_channel 0.0.11 -y test11.wav
[16:56] <hackeron> according to the manual, it should record each channel to a separate file
[16:57] <hackeron> but every one of the files is just complete silence :/ -- if I do just ffmpeg -f alsa -ac 12 -i plug:capt -ar 44100 -y test.wav -- I get a 12 channel wav file and there's sound on 8 of the 12 channels
[16:57] <hackeron> (only 8 channels have microphones so this is expected)
[16:59] <hackeron> the example given is ffmpeg -i INPUT -map_channel 0.0.0 OUTPUT_CH0 -map_channel 0.0.1 OUTPUT_CH1 -- which is what I did above :/ - any ideas why I am getting silence in all the output files?
[16:59] <zap0> hackeron, http://ffmpeg.org/ffmpeg.html then search for "map_channel", it has some examples
[17:00] <hackeron> zap0: that's what I'm doing
[17:00] <hackeron> the example is: "ffmpeg -i INPUT -map_channel 0.0.0 OUTPUT_CH0 -map_channel 0.0.1 OUTPUT_CH1" -- what I'm doing is: "ffmpeg -f alsa -ac 12 -i plug:capt -ar 44100 -map_channel 0.0.0 -y test0.wav -map_channel 0.0.1 -y test1.wav -map_channel 0.0.2 -y test2.wav -map_channel 0.0.3 -y test3.wav -map_channel 0.0.4 -y test4.wav -map_channel 0.0.5 -y test5.wav -map_channel 0.0.6
[17:00] <hackeron> -y test6.wav -map_channel 0.0.7 -y test7.wav -map_channel 0.0.8 -y test8.wav -map_channel 0.0.9 -y test9.wav -map_channel 0.0.10 -y test10.wav -map_channel 0.0.11 -y test11.wav"
[17:00] <hackeron> (for 12 channels instead of 2)
[17:00] <hackeron> and getting silence on all channels
[17:01] <zap0> if file 0, stream 0 really the input?
[17:01] <hackeron> yes: Input #0, alsa, from 'plug:capt':
[17:01] <hackeron> Duration: N/A, start: 1377355941.479040, bitrate: 9216 kb/s
[17:01] <hackeron> Stream #0:0: Audio: pcm_s16le, 48000 Hz, 12 channels, s16, 9216 kb/s
[17:02] <hackeron> am I misunderstanding something?
[17:03] <hackeron> if I try to use any other file or stream input, other than 0, it says: "mapchan: invalid input file stream index #0.1"
[17:03] <hackeron> and if I use a channel higher than 11, it says "mapchan: invalid audio channel #0.0.12" - so I'm using the right parameters it seems
[17:03] <zap0> why haven't you told in the format.. s16le
[17:03] <zap0> it/
[17:04] <hackeron> zap0: I don't need to, dsnoop sets that in .asoundrc - also ffmpeg -f alsa -ac 12 -i plug:capt -ar 44100 -y test.wav -- records all 12 channels just fine
[17:05] <zap0> if you say so.
[17:07] <hackeron> zap0: ok, I changed the command to: ffmpeg -f alsa -ac 12 -sample_fmt s16 -i plug:capt -ar 44100 -map_channel 0.0.0 -y test0.wav -map_channel 0.0.1 -y test1.wav -map_channel 0.0.2 -y test2.wav -map_channel 0.0.3 -y test3.wav -map_channel 0.0.4 -y test4.wav -map_channel 0.0.5 -y test5.wav -map_channel 0.0.6 -y test6.wav -map_channel 0.0.7 -y test7.wav -map_channel 0.0.8 -y test8.wav -map_channel 0.0.9 -y test9.wav -map_channel 0.0.10 -y ...
[17:07] <hackeron> ... test10.wav -map_channel 0.0.11 -y test11.wav -- no change, silence in all output files
[17:07] <hackeron> any other ideas?
[17:07] <zap0> are they the file length you expected?
[17:08] <zap0> also, im not sure s16 is valid.. how does it know endianness?
[17:08] <hackeron> yep, all 7 seconds - if I add -t 5 to the beginning, all are 5 seconds
[17:09] <hackeron> the only ones available are: s8 s16 s32 flt dbl u8p s16p s32p fltp dblp -- which one is the correct one?
[17:11] <zap0> don't know... i don't use -sample_fmt i use something else
[17:11] <hackeron> what do you use?
[17:11] <zap0> i'm trying to find it (hence the delay)
[17:11] <zap0> -f s16le
[17:13] <hackeron> ok, current command is: ffmpeg -f alsa -ac 12 -i plug:capt -f s16le -ar 44100 -map_channel 0.0.0 -y test0.wav -map_channel 0.0.1 -y test1.wav -map_channel 0.0.2 -y test2.wav -map_channel 0.0.3 -y test3.wav -map_channel 0.0.4 -y test4.wav -map_channel 0.0.5 -y test5.wav -map_channel 0.0.6 -y test6.wav -map_channel 0.0.7 -y test7.wav -map_channel 0.0.8 -y test8.wav -map_channel 0.0.9 -y test9.wav -map_channel 0.0.10 -y test10.wav -map_channel ...
[17:13] <hackeron> ... 0.0.11 -y test11.wav
[17:13] <hackeron> silence in all output wav files :/
[17:13] <hackeron> input is: Stream #0:0: Audio: pcm_s16le, 48000 Hz, 12 channels, s16, 9216 kb/s - so that looks right
[17:14] <zap0> have you given the output file format?
[17:14] <hackeron> if I output to a single output.wav file, it outputs all 12 channels correctly
[17:14] <hackeron> no, I haven't
[17:15] <hackeron> doesn't seem like I need to? < # file test1.wav
[17:15] <hackeron> test1.wav: RIFF (little-endian) data, WAVE audio, Microsoft PCM, 16 bit, mono 48000 Hz
[17:16] <zap0> hwo do you know it's silence?
[17:16] <hackeron> I opened it in audacity, there is absolutely no signal
[17:17] <zap0> if you open it in notepad... can you literally see the file is just a bunch of NUL chars ? (of whatever value is audio-silence) ?
[17:17] <zap0> or whatever/
[17:17] <hackeron> yes: ^@^@^@^@^@^@^@^@^@^@^@^@
[17:17] <hackeron> just a bunch of that, nothing else
[17:17] <zap0> do you have a sample of this 12chan file i could download and try ?
[17:18] <hackeron> sure
[17:18] <hackeron> one sec
[17:18] <zap0> k
[17:18] <zap0> back in 3
[17:20] <hackeron> recorded with: ffmpeg -t 10 -f alsa -ac 12 -i plug:capt -y test.wav -- http://itstar.co.uk/test.wav
[17:21] <zap0> .ogx file? it's downloading..... slowly.... ETA 14mins
[17:22] <hackeron> so as you can see, all 12 channels are there; there is signal on channels 3,4,5,6,7,8, and 11 and 12 are pink noise
[17:22] <hackeron> .ogx? -- it's test.wav: RIFF (little-endian) data, WAVE audio, Microsoft PCM, 16 bit, 12 channels 48000 Hz
[17:23] <hackeron> so now I need to figure out why -map_channel isn't working to split the channels to separate output files (or to pick just a single channel)
[17:24] <zap0> yep.. test.ogx and VLC plays it (although i only have stereo speakers), but it's info says 12 chns
[17:24] <hackeron> if you open it with audacity, it will show a waveform for each separate channel
[17:24] <hackeron> and allow you to solo each channel
[17:26] <hackeron> so any ideas how to record just 1 channel out of the 12?
[17:26] <zap0> the levels appear to be VERY low.. i 'see' 3,4,5,6,7,8,11,12 the others appear to be silent.
[17:26] <hackeron> say channel 4
[17:26] <hackeron> yes, the others are silent
[17:29] <hackeron> zap0: so any ideas how to record just channel 4 for instance?
[17:31] <zap0> trying a few things
[17:31] <hackeron> thanks :)
[17:31] <mecil9> hi guys
[17:31] <mecil9> anybody can help me?
[17:32] <mecil9> i cloned ffmpeg to my computer with git
[17:32] <mecil9> make all
[17:32] <mecil9> error: libmp3lame >=3.98.3 not found
[17:33] <mecil9> what does that mean?
[17:34] <hackeron> mecil9: http://bit.ly/1dCQNEK
[17:36] <mecil9> i can't open the pages
[17:38] <zap0> hackeron.. ffmpeg says during file creation.. "-map_channel is forwarded to lavfi similarly to -af pan=0x4:c0=c6."
[17:39] <zap0> hackeron, then pan says: "This syntax is deprecated. Use '|' to separate the list items"
[17:39] <hackeron> zap0: yeh, I saw that -- is ffmpeg doing it wrong?
[17:39] <Hfuy> Codec question: if I present an h.264 codec (not necessarily the one in ffmpeg) with a black-and-white image, is it likely to be able to use the fact that there's no energy in the U and V channels to improve compression?
[17:39] <zap0> hackeron, maybe.. try using | instead
[17:39] <Hfuy> I understand that h.264 specifies the decoder, not the encoder, so presumably various codecs could handle this differently.
[17:41] <hackeron> zap0: no difference at all
[17:42] <hackeron> zap0: I'm doing: for i in 0 1 2 3 4 5 6 7 8 9 10 11; do ffmpeg -i test.wav -t 5 -dn -vn -codec:a libfdk_aac -flags +qscale -ar 44100 -af "pan=0x4|c0=c${i}" -y ch${i}.aac; done --- no deprecetaion warnings anymore, silence in output files (3KB per file, empty waveform, etc)
[17:42] <hackeron> zap0: are you able to record just 1 channel from that test recording?
[17:44] <zap0> i have a ffmpeg command line that produces a correct file (although it is still full of silence).
[17:44] <zap0> it makes a .wav in the right size, header etc.. just full of NUL chars :(
[17:45] <hackeron> yeh, that's what I'm getting :/
[17:45] <hackeron> all examples I can find say something like: ffmpeg -i stereo.wav -map_channel 0.0.1 right_mono.wav -- the 1 in 0.0.1 being the channel id -- but the output file is silence regardless of channel id :/
[17:45] <zap0> is this a one-off problem? can you not just use Audacity?
[17:45] <hackeron> no, it isn't, it's for recording live audio
[17:46] <zap0> ok
[17:47] <zap0> the other day i wrote a WAV reader/writer in C. i'd have just used my own code by now :)
[17:48] <hackeron> lol, can it record from just 1 specified channel from an alsa source with 12 channels?
[17:48] <Hfuy> I once wrote a WAV reader in Javascript, in windows scripting host. Complicated it ain't.
[17:48] <hackeron> and create segment files?
[17:48] <zap0> hackeron: maybe use -af pan ? ? http://superuser.com/questions/601972/ffmpeg-isolate-one-audio-channel
[17:49] <hackeron> zap0: that's the page I have open on the screen - no change
[17:49] <hackeron> zap0: output file is silence :/
[17:49] <hackeron> it looks like ffmpeg is broken when dealing with 12 channel inputs?
[17:50] <Hfuy> I haven't heard the whole conversation - are you trying to process a 12 channel wav?
[17:50] <zap0> Hfuy, i'm currently trying to build a simple waveform generator/player in JS, to run on mobile phone browsers!
[17:50] <Hfuy> zap0: The problem is not so much the language itself, it's the facilities provided by the environment. The trick for doing it in windows was simply how to get a binary stream into a series of numbers. Figure that out and it's trivial.
[17:51] <hackeron> Hfuy: yep, here's a wav file, it is 12 channels: http://itstar.co.uk/test.wav -- I need to be able to get just 1 channel out of it with ffmpeg. All the documented methods including -af pan and -map_channel are not working and producing an empty output file. For example: for i in 0 1 2 3 4 5 6 7 8 9 10 11; do ffmpeg -i test.wav -codec:a libfdk_aac -flags +qscale -ar 44100 -af "pan=1|c0=c${i}" -y ch${i}.aac; done
[17:51] <Hfuy> I'm actually lying anyway - I did it in windows script host, but there was some aspect or other of binary handling that the JScript interpreter's tendency to make everything into ASCII was screwing up. I had to use a tiny bit of VBscript to get around it. But it worked.
[17:52] <Hfuy> Mmm, I wouldn't be surprised if any $SOFTWARE had a problem reading multichannel audio, they often do. But equally it isn't a complex format.
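Hfuy's point that multichannel WAV "isn't a complex format" holds up: the samples are simply interleaved frame by frame, so pulling one channel out is a slice. A pure-Python sketch of what -map_channel should be doing here, assuming 16-bit samples (like the pcm_s16le stream in the log):

```python
import struct
import wave

def extract_channel(src_path, dst_path, channel):
    """Copy one channel out of an interleaved PCM WAV into a mono WAV."""
    with wave.open(src_path, "rb") as w:
        nch = w.getnchannels()
        assert 0 <= channel < nch, "no such channel"
        assert w.getsampwidth() == 2, "sketch handles s16 only"
        rate = w.getframerate()
        frames = w.readframes(w.getnframes())
    samples = struct.unpack("<%dh" % (len(frames) // 2), frames)
    mono = samples[channel::nch]   # interleaved: take every nch-th sample
    with wave.open(dst_path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(struct.pack("<%dh" % len(mono), *mono))
```

For hackeron's 12-channel file this would be `extract_channel("test.wav", "ch4.wav", 4)` -- a workaround rather than a fix for the ffmpeg bug, and it loads the whole file into memory, so it suits clips more than continuous recording.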
[17:52] <zap0> i feel so dirty, even just hearing about the use of VB script ;)
[17:52] <Hfuy> zap0: Imagine how dirty I felt writing it. But it was only a couple of lines.
[17:53] <Hfuy> hackeron: Sorry, you're massively exceeding my experience with ffmpeg.
[17:54] <Hfuy> I don't even know if your commandlines have the correct intent, let alone their likely performance.
[17:54] <Hfuy> I have to ask, though: where the hell did you get a 12-channel wave from?!
[17:56] <Hfuy> zap0: http://pastebin.com/YCpVZw7S
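Hfuy's pastebin link is long dead, but the chunk walk he describes is short in any language. A minimal Python sketch of a RIFF/WAVE header parser (it only looks at the standard `fmt ` and `data` chunks and skips everything else, including the proprietary recorder chunks mentioned later):

```python
import struct

def parse_wav_header(data: bytes):
    """Walk the RIFF chunk list of a WAV file and return fmt/data info."""
    assert data[:4] == b"RIFF" and data[8:12] == b"WAVE", "not a WAV file"
    pos = 12
    info = {}
    while pos + 8 <= len(data):
        cid, size = struct.unpack_from("<4sI", data, pos)
        body = data[pos + 8 : pos + 8 + size]
        if cid == b"fmt ":
            fmt, nch, rate, _brate, _align, bits = \
                struct.unpack_from("<HHIIHH", body)
            info.update(format=fmt, channels=nch, rate=rate, bits=bits)
        elif cid == b"data":
            info["data_bytes"] = size
        pos += 8 + size + (size & 1)   # chunks are word-aligned
    return info
```

Format tag 1 is plain PCM; broadcast-wave extensions like `bext` would just be more `elif` branches in the same loop.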
[17:57] Action: zap0 reads
[17:58] <hackeron> ok, filed a bug report: http://trac.ffmpeg.org/ticket/2899
[17:59] <zap0> lol.. i just tried ffmpeg 2008 .. "unrecognized option '-map_channel'"
[17:59] <Hfuy> zap0: If you can spot the big ugly cheat, which isn't in that file, you can have one of my doughnuts.
[18:00] <hackeron> zap0: yeh, it was added in 2011 I believe?
[18:00] <zap0> on a diet
[18:00] <zap0> hackeron, good idea.
[18:00] <zap0> : re bug report.. good idea,.
[18:00] <hackeron> thanks :)
[18:01] <zap0> hackeron, it's not likely to get looked at unless a 12 channel input file is available... so add a link in your bug report. or at least email so you can be contacted
[18:01] <hackeron> zap0: the first line is a link
[18:02] <Hfuy> I would expect it's somewhat unlikely to be looked at anyway.
[18:02] <zap0> if a dev has an interest in 12 chn audio... it might get a lot of attention
[18:02] <Hfuy> And the likelihood of that is...
[18:02] <hackeron> Hfuy: actually every bug I filed to ffmpeg has generally been looked at very quickly
[18:03] <Hfuy> Hey, did I figure out how to parse wave files without needing my evil vbscript hack? I think I did!
[18:03] Action: Hfuy is a genius
[18:04] <zap0> hackeron, success!!
[18:04] <braincracker> my genius mice's buttons died early ;/
[18:04] <hackeron> zap0: yeh???
[18:04] <zap0> hackeron.. oh oh oh... O M G... w000tttt!!!! /me runs about the room naked!
[18:04] <hackeron> zap0: I'm going to join you, what is it???
[18:04] <zap0> hackeron.. used an older ffmpeg.. the file is non-full-of-NULs
[18:05] <zap0> lemme listen.. back in 3
[18:05] <hackeron> zap0: lol! - ok, now to figure out what someone broke and how and get a developer to revert it
[18:05] <braincracker> zap0 <= tin-foil-hat is a must!
[18:06] <braincracker> Hfuy <= in C? for loop?
[18:06] <zap0> hackeron, i selected channel 11,, it sounds a bit like white noise heard thru a toilet roll stuck to ones ear.
[18:06] <Hfuy> braincracker: Sorry?
[18:06] <Hfuy> hackeron: where did this 12 channel audio come from, out of interest?
[18:06] <hackeron> zap0: yep, channels 11 and 12 are pink/white noise -- try something like channels 3 to 8
[18:06] <braincracker> [180330] <Hfuy> Hey, did I figure out how to parse wave files without needing my evil vbscript hack? I think I did!
[18:06] <hackeron> Hfuy: M-Audio 1010LT sound card
[18:06] <braincracker> don't you just hate vb* ?
[18:07] <Hfuy> Oh I do.
[18:07] <hackeron> Hfuy: it has 8 analog inputs and 4 digital inputs
[18:07] <Hfuy> I did it in Javascript.
[18:07] <Hfuy> I can't quite figure out how, but apparently I did.
[18:08] <braincracker> okey, most ms things work like this.
[18:09] <zap0> hackeron, channel 3 sounded like pink noise too. channel 8 sounds very quiet.. just turned up it sounds a bit like the shitty audio chips on motherboards that produce digital noise into their crappy pre-amps
[18:09] <zap0> is there a specific channel that has something very distinguishable?
[18:09] <hackeron> zap0: yeh, it is recording sound in a few rooms that are empty right now - but at least it is working :D
[18:09] <zap0> yes!
[18:09] <hackeron> zap0: well, you should be able to hear rain on 7 and 8 I believe?
[18:10] <hackeron> zap0: ok, so any ideas what revision broke it and what specific code?
[18:10] <braincracker> nobody knows how, it just works
[18:10] <hackeron> also, can you add a comment what version works for you?
[18:10] <Hfuy> I'm not sure if it's really an MS thing or a JS thing.
[18:10] <Hfuy> The situation is that the file reader you get in JScript expects to work on text files, and munges character values above 128 in certain circumstances.
[18:10] <zap0> hackeron, ffmpeg.exe --version ffmpeg version N-35295-gb55dd10, built on Nov 30 2011 00:52:52 with gcc 4.6.2
[18:10] <braincracker> ddos logs will be forwarded to authorities
[18:11] <zap0> Hfuy, surely it has a binary mode!?
[18:12] <hackeron> zap0: it's so annoying :( - I need the latest version of ffmpeg because it has all the -segment beauty but -map_channel is broken, grrr
[18:12] <zap0> hackeron, at least we identified it's a bug.. and not a missing feature.. so it should be fixable... quickly-ish
[18:12] <Hfuy> zap0: Well, to be completely fair, the function is called openTextFile()
[18:13] <hackeron> zap0: yeh, I've added a comment: "Also note that ffmpeg version: N-35295-gb55dd10, built on Nov 30 2011 00:52:52 with gcc 4.6.2 works fine, but latest trunk is broken.
[18:13] <hackeron> "
[18:13] <zap0> hackeron, let me find a mid point-date-wise.. see if thats broken too.. maybe something near... oct/nov 2012
[18:17] <zap0> hackeron, ffmpeg-20121003-git-df82454 throws an error, and writes a zero sized output file.
[18:18] <Hfuy> zap0: Aha. You have to do some chicanery with translating unicode numbers to their ASCII equivalents.
[18:18] <hackeron> zap0: aha, so somewhere between 201111 and 20121003, lol?
[18:18] <Hfuy> If you read a byte >127 from a "text" file and then do charCodeAt() on it, you get a unicode code point rather than the raw byte value.
[18:19] <zap0> hackeron, indeed!! the last verison i just quoted has some git reference... perhaps that is valuable to someone
[18:19] <Hfuy> Which is why I got Visual Studio and started using C# instead :)
[18:21] <hackeron> zap0: hmm, someone replied - have a look at the ticket
[18:22] <Hfuy> I don't get that. How is there supposed to be a "known" channel layout for 12 arbitrary inputs?
[18:23] <zap0> cause 5.1 and 7.1 has known layouts... 12 is not a "standard"
[18:23] <Hfuy> Pan filter just takes numeric input, though, doesn't it?
[18:24] <Hfuy> (and in fact, in many situations, the channel layout of streams known to contain 5.1 and 7.1 tracks is really not very consistent!)
[18:25] <hackeron> zap0: hmm, how do I specify the aformat=channel_layouts=0xFFF?
[18:25] <zap0> hackeron, i don't know. im still staring at it trying to comprehend that too
[18:26] <hackeron> heh
[18:27] <Hfuy> In movie postproduction, multichannel surround is almost always ferried around as a set of single channel files.
[18:27] <Hfuy> For this exact reason.
[18:27] <Hfuy> Few things support multichannel files, and even fewer support them properly.
[18:28] <Hfuy> Really you need something like -af "assign_channels=l,r,c,ls,rs,lfe"
[18:29] <hackeron> Hfuy: I want to record 1 channel, in mono, a channel number I specify from an input with 12 channels - there is no left/right/whatever - every channel is a different room
[18:29] <Hfuy> Oh I completely understand. But if it's going to insist on somehow knowing what the channels represent, there ought to be a way to assign them labels.
[18:30] <Hfuy> But I agree there seems to be no reason why that should be necessary simply to split out one of the channels.
[18:30] <zap0> RIFF format provides packets for data
[18:30] <zap0> someone needs to set (YET ANOTHER) standard ;)
[18:31] Action: Hfuy cries
[18:32] <Hfuy> The reason I wrote that wave parser was so as to have the ability to read and write "broadcast wave" extensions, with timecode etc.
[18:32] <zap0> i hereby declare a new RIFF packet called 'channel layout', containing a list like 1=23ºW @ 206.4mm from center. 2=...
[18:32] <Hfuy> Gathering example files from field audio recorders, I immediately discovered a collection of RIFF chunks I'd never heard of before.
[18:33] <Hfuy> There were n of these proprietary chunks, where n is in fact slightly larger than the number of recorder manufacturers involved.
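[The proprietary-chunk zoo Hfuy describes is easy to survey: RIFF is just a flat sequence of (id, size, payload) records after the `RIFF…WAVE` header. A minimal chunk walker, sketched in Python (not Hfuy's actual parser), is enough to list whatever vendor chunks a field recorder wrote:]

```python
import struct

def riff_chunks(data: bytes):
    """Yield (chunk_id, payload) pairs from a RIFF/WAVE file.

    `data` is the whole file: 'RIFF' <size> 'WAVE' followed by chunks.
    Standard chunks ('fmt ', 'data'), broadcast-wave 'bext', and any
    vendor-specific chunks all come out the same way.
    """
    assert data[:4] == b"RIFF" and data[8:12] == b"WAVE"
    pos = 12
    while pos + 8 <= len(data):
        cid, size = struct.unpack("<4sI", data[pos:pos + 8])
        yield cid, data[pos + 8:pos + 8 + size]
        pos += 8 + size + (size & 1)  # chunk payloads are word-aligned
```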
[18:33] <zap0> Hfuy, lol.. the reason i'm writing audio in JS is for a timecode generator!
[18:33] <Hfuy> Writing a slate app?
[18:33] <zap0> more or less!
[18:33] <Hfuy> I would counsel against it :/
[18:34] <zap0> why?
[18:34] <Hfuy> We tried two different ipad slate apps against a real Ambient clockit slate.
[18:35] <Hfuy> The problem I think is that the accuracy of the slate apps is dependent on the clock accuracy of the audio hardware in the ipad.
[18:35] <Hfuy> And it isn't good enough.
[18:35] <Hfuy> It lost whole frames an hour, which is way not good enough.
[18:35] <zap0> are you implying the accuracy/drift is an issue?
[18:35] <Hfuy> There are circumstances where you could make it work, but it isn't good enough to jam sync then walk away.
[18:35] <zap0> yes, i guess you are!
[18:36] <Hfuy> Depends what you're doing I guess.
[18:36] <zap0> i am quite aware of the haphazard timing of this android/ipad consumer hardware.
[18:36] <Hfuy> If you want to let it listen to incoming SMPTE timecode using the audio input, and just display what you're getting, fine.
[18:36] <hackeron> zap0: any luck? -- I tried ffmpeg -v debug -i test.wav -filter:a aformat=channel_layouts=0xFFF -af "pan=0x4|c0=c4" -y ch4.wav -- still getting silence in the output
[18:37] <Hfuy> Equally if it's going to be a timecode master and you're going to record its output onto a spare audio track for later syncing, probably fine.
[18:38] <Hfuy> Perhaps you could put some sort of calibration term into the software but I'm not sure how much of the drift we saw is down to interrupts and so on.
[18:39] <zap0> how often do you record a single take that goes for over an hour ?
[18:40] <Hfuy> Not often. But that's not the factor, if you want to jam sync it.
[18:40] <Hfuy> The issue is whether it has been RUNNING for an hour since it was last synced.
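[The "whole frames an hour" complaint above is just clock accuracy in parts per million. A quick back-of-envelope sketch in Python (the ppm figures are illustrative, not measurements of the ipad apps discussed):]

```python
def drift_frames_per_hour(ppm_error: float, fps: float) -> float:
    """Timecode drift accumulated in one hour by a clock that is
    off by `ppm_error` parts per million, expressed in frames."""
    return fps * 3600 * ppm_error / 1_000_000

# a cheap ~50 ppm crystal drifts several frames per hour at 25 fps,
# so a jam-synced generator running on it is useless after an hour;
# a ~1 ppm TCXO stays well under a tenth of a frame
```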
[18:42] <zap0> this is for some simple stuff anyway... if i wanted something i'd have to rely on, then i'd use this microcontroller to do it, and run it off a real-time-clock module thingy.
[18:43] <Hfuy> I think really this is a microcontroller project.
[18:43] <Hfuy> I've been pondering doing just that for ages, but it's a lot easier if you can simply ensure the uC is accurately clocked, as opposed to trying to refer your code to an external RTC.
[18:43] <Hfuy> And I think you can do that.
[18:43] <zap0> i'm using a uC for displaying milliseconds anyway... for the high-speed cameras
[18:44] <Hfuy> Ooh, high-speed cameras
[18:44] Action: Hfuy rubs his hands
[18:44] <Hfuy> It's not as if SMPTE code is complicated, anyway.
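[Hfuy is right that SMPTE (LTC) code is simple at heart: the 80-bit frame carries the timecode as pairs of BCD digits at fixed bit positions. A Python sketch of just that digit packing (user-bit groups, flag bits and the sync word are deliberately omitted):]

```python
def ltc_bcd_fields(hh: int, mm: int, ss: int, ff: int) -> dict:
    """Split a timecode into the (units, tens) BCD digit pairs that
    an 80-bit LTC frame carries at fixed bit offsets."""
    def split(v):
        return v % 10, v // 10
    return {
        "frames":  split(ff),  # bits 0-3 units, 8-9 tens
        "seconds": split(ss),  # bits 16-19 units, 24-26 tens
        "minutes": split(mm),  # bits 32-35 units, 40-42 tens
        "hours":   split(hh),  # bits 48-51 units, 56-57 tens
    }
```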
[18:45] <zap0> i've seen some people trying to use POV displays for high-speed.
[18:45] <Hfuy> I just wish it encoded frame rate.
[18:45] <zap0> there are some empty blocks in SMPTE you can write your own data into
[18:45] <zap0> although not all hardware likes it when you do
[18:45] <Hfuy> Only a couple of bits.
[18:45] <Hfuy> Although that'd be enough to indicate whether we were at a fractional frame rate or not.
[18:45] <Hfuy> But, as you say...
[18:46] <zap0> how many frame rate changes per second do you need ? just write 1 bps of your frame-rate-info stream, until it's done!
[18:46] <Hfuy> Heh.
[18:47] <Hfuy> Really the issue when I was doing it was simply being able to tell the difference between, say, 29.97 and 30.
[18:47] <Hfuy> Which I found was fine, even from analogue tape. So it really isn't a huge deal.
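[Telling 29.97 from 30 really is easy over any decent stretch, because NTSC's rate is exactly 30000/1001 fps and the two diverge by roughly 108 frames per hour. A small Python check of that arithmetic:]

```python
from fractions import Fraction

NTSC = Fraction(30000, 1001)  # the exact "29.97" rate

frames_30 = 30 * 3600          # frames in one hour at exactly 30 fps
frames_ntsc = NTSC * 3600      # frames in one hour at 30000/1001 fps

# the gap is ~108 frames/hour, so even a rough count over a minute
# or two separates the two rates cleanly
gap = frames_30 - frames_ntsc
```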
[18:48] <zap0> lol.. NTSC
[18:49] <hackeron> zap0: this seems to work! < ffmpeg -v debug -i test.wav -filter:a "aformat=channel_layouts=0xFFF,pan=0x4|c0=c4" -y ch4.wav
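[The magic `0xFFF` in hackeron's working command is just a channel-layout bitmask: ffmpeg layouts set one bit per channel position, so 12 anonymous channels are the low 12 bits. A tiny Python illustration (not part of ffmpeg itself):]

```python
def layout_mask(n_channels: int) -> int:
    """Bitmask with one bit set per channel, as used by
    aformat=channel_layouts=... for N unnamed channels."""
    return (1 << n_channels) - 1

# layout_mask(12) == 0xFFF, which is why the filter chain
# "aformat=channel_layouts=0xFFF,pan=0x4|c0=c4" accepts the 12-channel
# input and then maps channel c4 (the fifth, zero-indexed) to mono out
```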
[18:50] <Hfuy> zap0: Tell me about it. I live in the UK, where we can count to 25. IN WHOLE NUMBERS.
[18:51] <zap0> lol... .au PAL too ;)
[18:51] Action: Hfuy waves a very small union jack
[18:51] <sacarasc> So, you're out to sea, Hfuy?
[18:51] <Hfuy> Oh dear. A heraldic pedant.
[18:51] <Hfuy> OK, OK. Union FLAG.
[18:51] <sacarasc> \o/
[18:52] <zap0> hackeron, well done!
[18:52] <Hfuy> Nobody in the UK understands the difference, or has any idea what you're on about when you talk about the "union flag." And if you use the term in international company, they tend to think of the American civil war.
[18:52] <mecil9> hackeron
[18:52] <Hfuy> So yes, I tend to use the more common term. But be happy; I know how not to indicate I'm in distress when flying said flag.
[18:53] <zap0> Hfuy, i have no idea... but i'm going to guess you are talking about the Cross-of_... overlayed on the cross-of-.... that makes up the multiple colours?
[18:54] <Hfuy> The Cross of St. George is a red cross on white. George, by one mythology, is the patron saint of England.
[18:54] <hackeron> mecil9: yes?
[18:54] <Hfuy> The Cross of St. Andrew is a diagonal white cross on blue. By said mythology, Andrew is patron saint of Scotland.
[18:55] <mecil9> hackeron a new error appeared
[18:55] <mecil9> libx264 must be >=0.118
[18:55] <mecil9> i got x264 from git.videolan.org
[18:56] <mecil9> and built it
[18:56] <Hfuy> Aaaand the cross of St. Patrick is a diagonal red cross on white. But we made that bit smaller. Because frankly, who cares about the Irish. :)
[18:56] <Hfuy> And mainly because it was selected more or less at random from the flags of the great houses of Ireland.
[18:56] <Hfuy> Infodump ends.
[18:58] <zap0> hackeron, i've noted your example in my 'notebook of ffmpeg tricks' which grows by the day!
[19:00] <hackeron> zap0: haha, can I see that notebook?
[19:01] <fscz> I've decoded a frame, using avcodec_decode_audio4, however when I put this frame into avcodec_encode_audio2 it crashes on me.
[19:02] <fscz> I've debugged it all the way to where the segfault is thrown, which is at samplefmt.c/av_samples_copy
[19:02] <fscz> the problem is, that the source parameter of the function is messed up and therefore the memcpy call in that function fails
[19:04] <fscz> I am not sure about this, but i think the problem might be in avcodec_encode_audio2, as it enters the branch where pad_last_frame is called.
[19:04] <fscz> this happens, even though it is the very first frame that was decoded.
[19:10] <hackeron> zap0: ok, here's my command to record audio :D < ffmpeg -loglevel info -f alsa -ac 12 -i plug:capt -map 0 -analyzeduration 0 -dn -vn -codec:a libfdk_aac -flags +qscale -global_quality 1 -afterburner 1 -f segment -segment_time 60 -segment_wrap 10 -segment_list_flags live -segment_list_size 10 -reset_timestamps 1 -segment_list 'test.csv' -ar 44100 -filter:a "aformat=channel_layouts=0xFFF,pan=1|c0=c4" -y test_%02d.mkv
[19:10] <hackeron> simples :P
[19:16] <Hfuy> Does ffmpeg have -filter:v "camerawork=good"
[19:16] <Hfuy> and if not why not
[19:18] <fscz> Am I missing something obvious, or is there nobody here who has the answer?
[19:55] <shahinkhan> Hi, can anybody help me with this? : how can I have less delay when I use ffplay via RTSP
[21:31] <taladan> hey folks, quick q - is there a way to use ffmpeg to capture from two different windows at once without having to have those windows overlayed on the desktop?
[21:32] <taladan> like, for instance if I was streaming out to a source and wanted to overlay a webcam video in the lower right quadrant of the screen, could I have the webcam on a separate portion of my desktop without having to have it positioned over the lower right quadrant of the main window I'm streaming?
[21:59] <fscz> decoding audio file, I got following error: [vorbis @ 0x2226e0] Not a Vorbis I audio packet. Error decoding frame: Invalid data found when processing input
[21:59] <fscz> is this recoverable
[22:00] <fscz> maybe with skipping?
[00:00] --- Sun Aug 25 2013
More information about the Ffmpeg-devel-irc
mailing list