[Ffmpeg-devel-irc] ffmpeg.log.20121116

burek burek021 at gmail.com
Sat Nov 17 02:05:01 CET 2012


[01:27] <ozone89> uhmmm... what could i be doing wrong?
[01:27] <ozone89> ffmpeg -f lavfi -i "amovie=04\ Wish\ You\ Were\ Here.flac, asplit[out0],showspectrum=s=1920X1080[out1]" -r 24 -f image2 test/image-%3d.jpeg
[01:28] <ozone89> i'm trying to get images from the rawvideo generated by the filter
[01:38] <ozone89> there, with verbose stuff too, so you couldn't say i've cherrypicked it http://pastebin.com/RiChZ8vV
[01:40] <burek> Could not open file : test/image-001.jpeg
[01:40] <burek> permissions, permissions...
[01:40] <ozone89> of what, of what...?
[01:41] <ozone89> it doesn't exist yet!
[01:42] <ozone89> if i output a video, it's ok, if i oitput images, i get this error
[01:42] <burek> try to type touch test/image-001.jpeg
[01:42] <ozone89> s/oit/out/
[01:42] <burek> just to see if you have permission to do that
[01:44] <ozone89> oh, dear... ffmpeg is not clever enough to make the output dir if it doesn't exist O-o
[01:44] <burek> it's smart enough not to screw up the system if its owner is not smart enough :)
[01:48] <ozone89> wow! what a bullseye! stopped right at 100th frame XD
[01:48] <ozone89> err
[01:48] <ozone89> 1000
[01:49] <ozone89> uhmmm... ok, it's almost right
[01:50] <ozone89> but it looks like it does a snapshot every frame instead of every 24 as stated by -r 24
[01:51] <burek> you didn't state that with -r 24
[01:52] <ozone89> ah.
[01:52] <ozone89> how do i do that?
[01:53] <burek> you read the manual first, usually
[01:53] <burek> http://ffmpeg.org/ffmpeg.html#select
[01:54] <burek> not(mod(n,24)) for example
[01:59] <ozone89> ffmpeg -v verbose -f lavfi -i "amovie=04\ Wish\ You\ Were\ Here.flac, asplit[out0],showspectrum=s=1920X1080=select='not(mod(n\,100))'[out1]" -filter:v select="not(mod(n\,24))" -f image2 test/image-%5d.png
[01:59] <ozone89> like this?
[02:00] <burek> why do you have 2 selects?
[02:01] <ozone89> !!
[02:01] <ozone89> nice question
[02:03] <ozone89> ok, dropped the second one and put 24 instead of 100
[02:04] <ozone89> but i swear i don't know where that came from
[02:10] <ozone89> ok, i've added -vsync 0, and now frames are different, yet still too many
[02:10] <ozone89> ffmpeg -v verbose -f lavfi -i "amovie=04\ Wish\ You\ Were\ Here.flac, asplit[out0],showspectrum=s=1920X1080=select='not(mod(n\,100))'[out1]" -vsync 0 -f image2 test/image-%5d.png
[02:13] <ozone89> ah! did it!!
[02:14] <ozone89> ffmpeg -v verbose -f lavfi -i "amovie=04\ Wish\ You\ Were\ Here.flac, asplit[out0],showspectrum=s=1920X1080[out1]" -vf "select='not(mod(n,100))'" -vsync 0 -f image2 test/image-%5d.png
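The working command above, cleaned up as a sketch (the filenames and the every-100th-frame interval are just the values from this session; note the output directory must already exist, since ffmpeg won't create it):

```shell
# Render a spectrum of a FLAC file and keep one PNG per 100 frames.
# "track.flac" and the test/ output dir are placeholders; mkdir test/ first.
ffmpeg -f lavfi \
  -i "amovie=track.flac, asplit[out0], showspectrum=s=1920x1080[out1]" \
  -vf "select='not(mod(n,100))'" \
  -vsync 0 -f image2 test/image-%05d.png
```

`-vsync 0` passes the selected frames through with their own timestamps instead of duplicating frames to hit a fixed output rate.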
[02:21] <ozone89> damn... i can't get the spectrum slide, it still overlaps start once it reaches end :|
[02:22] <ozone89> ffmpeg -v verbose -f lavfi -i "amovie=04\ Wish\ You\ Were\ Here.flac, asplit[out0],showspectrum=s=1920X1080=slide[out1]" -vf "select='not(mod(n,100))'" -vsync 0 -f image2 test/image-%5d.png
[02:22] <burek> this wasn't valid: showspectrum=s=1920X1080=select='not(mod(n\,100))'[out1]
[02:22] <ozone89> sure, that manual is very... thin about examples
[02:22] <burek> =select should have been ,select
[02:22] <burek> manual is ok
[02:23] <burek> you're just not paying enough attention to what you write
[02:23] <ozone89> slide
[02:23] <ozone89>     Specify if the spectrum should slide along the window. Default value is 0.
[02:23] <ozone89> should i put a come before slide instead of = too?
[02:23] <ozone89> *coma
[02:24] <burek> well, do you need another filter in the chain
[02:24] <burek> or do you need to select additional params to showspectrum
[02:24] <ozone89> isn't that an option of showspectrum?
[02:24] <burek> read about showspectrum
[02:24] <ozone89> read
[02:24] <burek> then you already know the answer
[02:25] <burek> you just need to understand it now :)
[02:25] <ozone89> i've even tried with showspectrum=s=1920X1080=slide=1
[02:25] <ozone89> but nothing
[02:25] <burek> that's incorrect notation
[02:26] <ozone89> =slide 1   nothing
[02:29] <burek> http://ffmpeg.org/ffmpeg.html#lavfi
[02:31] <burek> and this also http://ffmpeg.org/trac/ffmpeg/wiki/FilteringGuide
[02:34] <burek> -f lavfi -i "amovie=04\ Wish\ You\ Were\ Here.flac, asplit, showspectrum=s=1920X1080: slide=1, select=not(mod(n\,100))"
[02:34] <burek> try that
[02:35] <burek> -f lavfi -i "amovie=04\ Wish\ You\ Were\ Here.flac, asplit[out0], showspectrum=s=1920X1080: slide=1, select=not(mod(n\,100)) [out1]"
[02:35] <burek> btw, if you are creating png images
[02:35] <burek> why are you creating audio output (out0) ?
[02:37] <ozone89> Error initializing filter 'showspectrum' with args 's=1920X1080: slide=1'
[02:38] <ozone89> the problem here is to put multiple options on the same filter
[02:39] <burek> you should use ':' separator
[02:39] <burek> maybe even showspectrum='s=1920X1080: slide=1'
[02:39] <burek> try looking at examples here http://ffmpeg.org/trac/ffmpeg/wiki/FilteringGuide
[02:39] <burek> and see what you can figure out
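The syntax burek is pointing at: within one filter, options are separated by ':'; separate filters in a chain are separated by ','. A sketch of the sliding-spectrum variant (assumes a build recent enough to have the slide option; filename is a placeholder):

```shell
# ':' joins options of a single filter; ',' joins filters in a chain.
# slide=1 needs an ffmpeg built after the option was added (late Oct 2012).
ffmpeg -f lavfi \
  -i "amovie=track.flac, asplit[out0], showspectrum=s=1920x1080:slide=1, select='not(mod(n\,100))'[out1]" \
  -vsync 0 -f image2 test/image-%05d.png
```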
[02:40] <burek> filters were always a mess to parse for me
[02:54] Action: ozone89 throws the biggest swearing ever
[02:54] <ozone89> burek, surprise!  http://ffmpeg.org/pipermail/ffmpeg-cvslog/2012-October/056329.html
[02:55] <burek> what about that?
[02:55] <ozone89> slide option added on oct 25
[02:56] <ozone89> mine is built oct 21st
[02:56] <burek> and...?
[02:56] <burek> well update
[02:56] <burek> :)
[02:56] <ozone89> <burek> and...?      and... we were both right and wrong
[02:56] <ozone89> :)
[02:56] <burek> ok :)
[02:58] <ozone89> worst thing is, probably the sources were from august :(
[02:58] Action: ozone89 hopes nothing drastic changed in dependencies
[02:59] <ozone89> lemme guess... idiotic hoping, uh?
[03:00] <burek> why, what'll break?
[03:00] <ozone89> i'm hoping i have anything i need
[03:00] <ozone89> it's 3 AM here, not very keen to hunt down deps
[03:04] <ozone89> ok, sources were from oct 21
[03:04] <ozone89> shouldn't be a trauma updating
[03:04] <burek> checkout that commit only?
[03:04] <burek> if you are too afraid of breaking stuff :)
[03:04] <ozone89> you mean the showspectrum file only?
[03:05] <ozone89> or the whole oct 25 branch?
[03:05] <burek> no, find the commit right after oct 25 and check it out
[03:05] <burek> yes
[03:05] <burek> it's not an oct 25 branch, but you get the idea
[03:05] <burek> commit id*
[03:05] <ozone89> lemme check how the script works
[03:06] <ozone89> GIT_URL="git://git.videolan.org/ffmpeg.git"
[03:06] <ozone89> damn
[03:09] <ozone89> uhmmm... it's doing the CC, YASM, CXX stuff, seems ok
[03:12] <ozone89> btw, burek, may i ask you a question about audio hardware, or at least a channel dedicated to? :)
[03:13] <burek> feel free :)
[03:13] <ozone89> ok, here it is :)
[03:13] <tremby> i have one audio input file and a bunch of video-only input files. i want the videos to be concatenated end to end, while the audio stream is over top of everything. i've tried the concat:vid1|vid2|vid3 but only vid1 shows, and i'm stuck on -filter_complex concat syntax. can someone please help?
[03:14] <ozone89> tremby, try $cat vid* > out ;)
[03:14] <burek> you can't just cat
[03:14] <ozone89> provided they are the same type of file, of course
[03:14] <burek> it only works in some cases
[03:14] <burek> and you still lose timestamps
[03:15] <tremby> ozone89, burek: no good in this case
[03:15] <tremby> can the -filter_complex syntax do what i'm asking? if so i'll try further to figure it out, but the docs are pretty opaque there
[03:15] <burek> tremby https://ffmpeg.org/trac/ffmpeg/wiki/How%20to%20concatenate%20(join%2C%20merge)%20media%20files
[03:15] <burek> first join all your videos
[03:16] <tremby> burek: i've seen that, it's with named pipes. i'd very much like to avoid those since i want this to be cross platform. i want to do it in one ffmpeg invocation if possible
[03:16] <burek> then just do: ffmpeg -i video -i audio -c:v copy -c:a copy output
[03:16] <burek> tremby then good luck :)
[03:16] <tremby> are you saying it's not possible?
[03:16] <ozone89> eh... split is easy, join is not
[03:16] <burek> no, im saying good luck with that
[03:17] <ozone89> where was i?
[03:17] <ozone89> oh, yeah
[03:17] <ozone89> my father told me that a couple decades ago, it was important to get amp and loudspeaker impedance as similar as possible
[03:18] <ozone89> while now i read the loudspeaker/headphone impedance should be at least 8 times the impedance of the amp
[03:21] <ozone89> how so? :)
[03:26] <burek> this has got to be a million dollar question right? :)
[03:26] <ozone89> not really
[03:26] <ozone89> just simple curiosity
[03:27] <ozone89> :)
[03:27] <ozone89> it just riddles me, something seems wrong about the change of position
[03:28] <burek> http://nwavguy.blogspot.com/2011/02/headphone-impedance-explained.html
[03:31] <tremby> is anyone able to help explain to me the last example at http://ffmpeg.org/ffmpeg.html#concat -- where does that stuff go on the command line?
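What tremby is after (concatenate video-only files end to end, with one audio track over the whole thing, in a single ffmpeg invocation) can be sketched with the concat filter. Assumes three video-only inputs with matching resolution, pixel format and frame rate; all filenames are placeholders:

```shell
# Concatenate three video-only inputs and map one audio file over the result.
# n=3 input segments, v=1 video stream out, a=0 audio streams out.
ffmpeg -i vid1.mp4 -i vid2.mp4 -i vid3.mp4 -i audio.mp3 \
  -filter_complex "[0:v][1:v][2:v]concat=n=3:v=1:a=0[v]" \
  -map "[v]" -map 3:a -shortest output.mp4
```

`-shortest` trims the output to the shorter of the joined video and the audio track.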
[05:40] <JEEB> burek, btw the problems with fdk-aac and he-aac v2 with itunes were because it by default with mp4-like output uses a newer signaling mode for the stream. "-signaling implicit" makes it work. The actual fdk library uses this mode by default, but wbs's libavcodec encoder picks the fancier mode by default :)
[09:18] <petep> Hi, I have a problem with ffmpeg on mac osx. I installed ffmpeg via macports. I can reproduce the following: when activating version ffmpeg @0.7.13_2 I can create a h264 video from pngs and safari can replay it from file:///.../mymovie.mp4 w/o any problems. Now having updated ffmpeg to ffmpeg @1.0_2 I had to modify my command line which creates the movie. The preset file I used before didn't work any more. Instead I use now -preset
[09:18] <petep> ultrafast and the movie is created successfully. VLC can play the vid. BUT: Safari cannot. When trying to play the movie in safari the qtkitserver plugin process just gets high cpu but the movie fails to load. A similar behavior can be reproduced in Chrome. I tried here ogv and webm movies with ffmpeg @0.7.13_2 vs @1.0_2. Chrome could not play the @1.0_2 movies, but VLC still can. Any ideas about that weird issue? Thanks a
[09:18] <petep> lot.
[12:03] <sledges> hello
[12:03] <Mavrik> good day.
[12:06] <sledges> just transcoded 6.2Meg sized 10sec h264 1080p at 48FPS (big buck bunny) to raw (using method 1 in http://www.hdslr-cinema.com/news/workflow/convert-between-framerates/) and then down to 30fps and one to 24fps of same format with -c:v libx264 -qp0 , but both files resulted in the same 24M size! (byte to byte). vlc shows correct framerate for each 30 & 24fps... where did I go wrong?
[12:08] <Mavrik> sledges: can you put ffmpeg command-line AND outputs for both commands in pastebin please?
[12:12] <i42n> hey guys, I try to record a screencast with the command listed here: http://pastebin.com/5RtUwQx5 This works fine on my laptop. But on my desktop it does not. I get a "ALSA buffer xrun." error and only the first second of sound is recorded. suggestions what I am doing wrong?
[12:12] <sledges> Mavrik: http://pastie.org/5386938
[12:12] <i42n> I already tried "export ALSA_BUFFER_SIZE_MAX=524288" but this did not help
[12:14] <Mavrik> sledges: there's no ffmpeg output which is the important bit ;)
[12:14] <Mavrik> i42n: hmm, ALSA can be pretty weird
[12:15] <i42n> Mavrik: alternatives?
[12:15] <Mavrik> i42n: I can't help with your immediate problem, but using pulseaudio as input (-f alsa -i pulse) usually fixed a lot of problems for me
[12:15] <Mavrik> that is, if you have a distro with pulseaudio server installed
[12:16] <i42n> I am on archlinux and have alsa running. Would take a lot of time to change to pulse I think.
[12:16] <i42n> so you think my command might not be the problem. maybe an alsa bug?
[12:16] <divVerent> when can this happen:
[12:16] <divVerent>     Stream #0:3[0xa0]: Audio: pcm_s16be, 48000 Hz, 2 channels, 1536 kb/s
[12:17] <divVerent> where it should be
[12:17] <divVerent>     Stream #0:1[0xa0]: Audio: pcm_s16be, 48000 Hz, 2 channels, s16, 1536 kb/s
[12:17] <sledges> Mavrik, 24fps: http://pastie.org/5386953  30fps: http://pastie.org/5386951
[12:17] <Mavrik> divVerent: what exactly is the problem?
[12:17] <divVerent> this causes, when encoding, [graph 1 input from stream 0:3 @ 0x26060e0] Invalid sample format '(null)'
[12:17] <divVerent> s16 is missing, which is likely the cause here
[12:17] <i42n> Mavrik: I found this: https://ffmpeg.org/trac/ffmpeg/ticket/615 But the BufferSize change did not fix it for me.
[12:18] <divVerent> I got this when trying to use a DVD ISO file as direct input
[12:18] <divVerent> and when mounting the DVD, the VOB of the menu track also does this
[12:18] <divVerent> while the other VOBs are fine
[12:18] <divVerent> I don't care if this file really is not "fixable" because it may simply have no audio
[12:18] <divVerent> but why is it "pcm_s16be" but not "s16"?
[12:18] <ubitux> it's the codec
[12:18] <divVerent> is there another kind of pcm signed 16bit big endian that's not signed 16bit?
[12:19] <i42n> Mavrik: ok maybe I should actually change to pulseaudio https://bbs.archlinux.org/viewtopic.php?id=100144
[12:19] <divVerent> right, codec name is correct
[12:19] <divVerent> but how can the sample format be missing
[12:19] <sledges> Mavrik: the very first ffmpeg (decoding to raw): http://pastie.org/5386957
[12:20] <divVerent> basically, for my specific input I have a way around this to fix it properly
[12:20] <divVerent> I just want to understand how this can happen to begin with
[12:20] <divVerent> because I'd assume the pcm_s16be codec ALWAYS would have the s16 sample format no matter what its input is
[13:36] <pettter> is there any nice and easy rundown of what timestamps to set, to what, and at what point, when encoding and muxing both audio and video into a (say, mpeg-ts) container?
[13:39] <JEEBsv> pettter: when you feed stuff to an encoder, you should have PTS set. And then you get a DTS and a PTS from the encoder and use those for muxing
[13:39] <pettter> and that goes for both video and audio?
[13:42] <JEEBsv> yes
[13:43] <JEEBsv> pts = presentation time stamp aka "when stuff is meant to be shown" and dts = decode time stamp aka "when stuff is supposed to be decoded"
[13:46] <Mavrik> um, one question about that
[13:46] <Mavrik> will x264 automatically set DTS when it creates a B-frame?
[13:47] <JEEBsv> the encoded result? yes I would think
[13:47] <pettter> yes, I know about that, but something is apparently failing
[15:05] <salomartin> Hello everyone! I'm trying to output one image per each actual frame from a variable framerate flv with "ffmpeg -i input.flv out%d.png" but it outputs a frame for each millisecond (ie 1k frames for first second). How can I make it output only the actual frames which I see when I do "ffprobe -i input.flv -show_frames"? I'm using ffmpeg version N-46469-gc995644.
[15:07] <JEEBsv> it seems to be actually outputting every frame for a timescale tick ^^;
[15:07] <JEEBsv> if it's 1000 per second
[15:08] <JEEBsv> the flv container saves timestamps on the 1/1000 of a second level
[15:08] <salomartin> Yes - I actually see the same effect when I want to convert it to mp4 container
[15:08] <JEEBsv> o_O
[15:08] <salomartin> I thought -copyts and -copytb would maybe help, but they didn't
[15:09] <JEEBsv> try poking the trac with a bug report
[15:09] <JEEBsv> and a sample that reproduces the issue
[15:09] <salomartin> I should probably try an older version first
[15:09] <salomartin> maybe it's a regression or maybe I'm doing something wrong
[15:12] <pettter> has anyone tested if the muxing example actually works with recent versions of ffmpeg?
[15:13] <pettter> or if it is misleading i.e. worse than useless?
[15:15] <ubitux> it's maintained
[15:15] <ubitux> do you have a problem with it?
[15:17] <pettter> I'm trying to make a Simple(tm) app for doing similar magic to ffplay - stitching together lossy/gappy mpeg-ts files into something that is nice, monotonic and synced
[15:17] <ubitux> mmh seems it's not that working well indeed
[15:18] <pettter> but apparently something is going horribly wrong when I set things up
[15:18] <ubitux> though, i saw some patches not that long ago
[15:18] <pettter> such as only getting a single output stream, despite adding both audio and video
[15:21] <salomartin> Tried with the version N-34549-g13b7781 from a year ago- still the same behavior so I think it's rather me missing some flag. Any ideas?
[15:22] <ubitux> salomartin: -vsync vfr?
[15:27] <salomartin> Perfect! Thank you so much! This worked on the new version.
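The fix that worked for salomartin, in one line (filenames are placeholders):

```shell
# Emit one image per actual decoded frame of a variable-frame-rate FLV,
# instead of one frame per 1/1000 s container timestamp tick.
ffmpeg -i input.flv -vsync vfr out%d.png
```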
[15:28] <petep> Hi, some hours ago I asked: I have a problem with ffmpeg 1.0 which I had not with v0.7.13. I installed ffmpeg via macports on Mac OS X. I can reproduce the following: when activating version ffmpeg @0.7.13_2 I can create a h264 video from a set of pngs and Safari can replay it from local url file:///.../mymovie.mp4 w/o any problems. Now having updated ffmpeg to ffmpeg @1.0_2 I had to modify my command line which creates the movie.
[15:28] <petep> The preset file "-fpre .../libx264-ultrafast.ffpreset" I used before does not work any more. Instead I now use the built-in "-preset ultrafast", which creates the movie successfully. VLC can play the video just fine. BUT: Safari cannot. When trying to play the movie in Safari the QTKitServer plugin process runs at high cpu but the movie fails to load. A similar behavior can be reproduced in Chrome. I tried ogv and webm movies
[15:28] <petep> encoded with ffmpeg @0.7.13_2 vs @1.0_2. Chrome can play the @0.7.13_2 videos but cannot play the @1.0_2 videos. VLC plays both. To sum up: atm I have no clue how to create HTML5 videos playing in browsers with ffmpeg 1.0, though 0.7.13 works, at least on Mac OS X; haven't tried on linux so far. Any ideas about this weird issue? Thanks a lot.
[15:28] <petep> I found now what causes the problem: the default -pix_fmt in 0.7.13 is yuv420p when looking at the console output of ffmpeg. Since v1.0 this seems to have changed. For ogg and h264 it is now
[15:28] <petep> yuv444p. When I set -pix_fmt yuv420p with ffmpeg v1.0 the browsers (chrome and safari) can play my video again. So I found a workaround. From my point of view it seems a bit bad to have default values that are not fine for web videos. Can anybody explain this to me: is it a bug or is it intended, and why?
[15:30] <JEEBsv> petep: after the libx264 wrapper enabled encoding of non-4:2:0 video there was a phase when ffmpeg defaulted to the colorspace closest to your input, which is kind of what you want usually... except most things only support 4:2:0 H.264
[15:30] <JEEBsv> it was then later fixed
[15:30] <JEEBsv> so I recommend you just update your ffmpeg to current git HEAD
[15:31] <JEEBsv> fixed as in a widely supported colorspace was made the default
[15:33] <petep> Ah ok, so it was a "bug" and is fixed now. In my case I will wait until the latest changes come with macports (package manager). Until then I'll stick to setting the 4:2:0 via cli parameter, which works for me.
[15:33] <petep> thx JEEBsv
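The workaround petep settled on, as a sketch (input pattern and output name are placeholders; `-pix_fmt yuv420p` is the essential flag):

```shell
# Force 4:2:0 chroma subsampling. ffmpeg 1.0's libx264 wrapper could otherwise
# pick yuv444p for PNG input, which Safari's and Chrome's decoders reject.
ffmpeg -i frames/%04d.png -c:v libx264 -preset ultrafast -pix_fmt yuv420p mymovie.mp4
```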
[15:36] <JEEBsv> petep: you should try to see if homebrew has a more up to date package
[15:36] <JEEBsv> also editing the rulesets shouldn't be too hard for homebrew
[15:41] <petep> JEEBsv: Hmm, I like macports, especially because it does not build everything. It often downloads already-built packages by using a checksum, which is part of a port. I have no experience with homebrew so far and avoid getting into it, cause I am more a user, don't want to have too much administration overhead. So I would like to have only one package system and not multiple, and it should not bother me with builds and broken packages
[15:41] <petep> too much. Am I a homebrew guy?
[15:44] <petep> In other words, should I switch?
[15:48] <JEEBsv> petep: I thought macports compiled the compiler too
[15:48] <JEEBsv> homebrew does compile the packages, but it tries to use the standard OS X development stuff as much as possible
[15:52] <petep> JEEBsv: e.g. the package ffmpeg is already compiled by the maintainer of that port. I don't know exactly how it works; I guess precompiled packages are available if the maintainer of a port compiles them for some common platforms. As a result updates of packages are often very quick. Though, there are still some packages that need to compile on an update.
[15:53] <JEEBsv> also I know that macports people at times did really dumb selections, like disabling asm optimizations from x264
[15:54] <petep> some years ago, macports compiled much more, but today there are many precompiled packages, especially if you do not use any special variants.
[15:58] <petep> hmm, on the dumb decisions: Ok, in this case I am lost. I am just a user, I want to solve higher-level tasks. I can only hope that the packages I use are set up nicely by the maintainers, until I detect an issue and have to ask aunty google :)
[15:59] <StaRetji> Howdy folks
[15:59] <StaRetji> I remember I used to send input video to ffmpeg with a pipe
[15:59] <StaRetji> but can't find working example
[16:00] <StaRetji> I would appreciate an example where I pull a stream from an external program and pipe to ffmpeg which will do the output
[16:00] <StaRetji> something like wget input.mp4 | ffmpeg output.mp4
[16:01] <StaRetji> but none of my tries was successful
[16:09] <Mavrik> StaRetji: ffmpeg -i - output.mp4
[16:10] <StaRetji> thx Mavrik, but this stream can't be run by ffmpeg
[16:10] <StaRetji> while cvlc runs it without a problem
[16:10] <Mavrik> - means take input from pipe
[16:10] <Mavrik> isn't that what you're trying to do?
[16:10] <StaRetji> it is rtsp live555 and I can't play it in ffmpeg for some reason
[16:11] <StaRetji> yes, i try cvlc rtsp-stream to pipe to ffmpeg and re-encode
[16:11] <StaRetji> I will try now your example :)
[16:12] <StaRetji> pipe: Invalid data found when processing input
[16:12] <salomartin> Is there some documentation about the ffprobe -show_frames output? I'm trying to understand what the variables exactly represent. I'm seeing pretty different values for  pkt_pts, coded_picture number etc if I compare the original source flv and the output mp4 - the pkt_pts_time seems to be the same though. I'm trying to understand if it's expected or not.
[16:13] <Mavrik> salomartin: PTS is a "presentation timestamp"
[16:14] <Mavrik> it tells the player when (in time) that frame has to be displayed
[16:14] <Mavrik> so it's pretty clear those should not change when transcoding unless timebase or FPS changes
[16:14] <salomartin> the last frame has pkt_dts_time N/A whereas for source it's set
[16:15] <salomartin> but what's pkt_pts? I was assuming it's a timestamp but those are wildly different between the two so I guess it's something else?
[16:16] <Mavrik> salomartin: pts is what I told you
[16:16] <Mavrik> probably your timebase changes
[16:16] <burek> JEEB great :)
[16:17] <burek> I'll try it and test a little bit as soon as I find some free time to do so :)
[16:17] <salomartin> So the cause for pkt_pts being different and pkt_pts_time being the same could be caused by timebase changes
[16:17] <StaRetji> hey Burek dude, waaazaaaa :)
[16:18] <burek> StaRetji :beer: :)
[16:18] <burek> petep, did you try with -profile
[16:19] <salomartin> So what should I add to keep the timebase same? Right now it's "ffmpeg -i input.flv -vsync vfr output%d.png"
[16:19] <salomartin> this command keeps the pkt_pts_time the same, but the pkt_pts changes
[16:20] <salomartin> * output.mp4
[16:21] <burek> salomartin, well you told it to change
[16:21] <salomartin> Strangely coded_picture_number also goes out of sync occasionally - i'm not sure if it's normal or not. pkt_pos is also out of sync.
[16:21] <burek> with -vsync
[16:22] <burek> vfr: Frames are passed through with their timestamp or dropped so as to prevent 2 frames from having the same timestamp.
[16:25] <salomartin> But even if I switch the -vsync to 0 to passthrough then I still get differences
[16:26] <burek> btw, how can you see timestamps in png image?
[16:27] <salomartin> I made a typo earlier- it was from mp4 actually, not png
[16:37] <petep> burek: petep, did you try with -profile
[16:37] <petep> sorry was away
[16:38] <pettter> under what circumstances can av_read_frame generate a segfault?
[16:38] <pettter> the packet is allocated on the stack and av_init_packet'd
[16:38] <petep> burek: U mean ffmpeg with option -profile? How should that help with the yuv 4:2:0 issue?
[16:40] <burek> I didn't say anything about 4:2:0
[16:41] <burek> I just asked if you tried using -profile option of libx264 encoder
[16:41] <petep> burek: no I did not
[16:47] <petep> burek: I am of in a minute, but I ll try to figure out what the option -profile is about. If you like you could provide a link to some specific docu to me. If not it s ok, i ll do some search. Now have to go. Thx anyways.
[16:48] <burek> or type: x264 --help
[16:49] <petep> burek: ok and is there a similar option for theora? Cause here I have the same problem.
[16:50] <burek> I don't use theora, so I don't really know..
[16:50] <petep> ok
[17:27] <salomartin> Ok - here are the commands and the output http://pastebin.com/w7xUzspJ i truncated the diff a bit, but all the patterns should be visible
[17:28] <salomartin> I tried all the combinations of vsync, copytb, copyts and tried to move them before input and after input, nothing made the values sync up
[17:53] <raptor67682> hi guys, would you know how to record with ffmpeg this stream video? http://direct.francetv.fr/regions/evt/medit-nice-direct.wsx?MSWMExt=.asf
[18:06] <StaRetji> folks, can someone give working example for wget mp4 file and pipe to ffmpeg for rtmp stream, so far whatever I do, I get
[18:07] <StaRetji> pipe:: Invalid data found when processing input
[18:18] <StaRetji> thx klaxa, will do it right now
[18:21] <StaRetji> Here it is klaxa http://pastebin.com/AGFw8FTd thx dude
[18:22] <mistym> Stupid question. Is there a license on the testsrc test pattern? Can I upload a video w/ it using CC0?
[18:24] <klaxa> StaRetji: i don't see any console output, that's the input only
[18:25] <StaRetji> really?
[18:26] <StaRetji> I mean, I thought wget is taking the file and piping it to ffmpeg
[18:26] <StaRetji> which does output to rtmp server
[18:26] <klaxa> what i mean is i don't see what ffmpeg outputs
[18:26] <klaxa> i don't see any error messages
[18:26] <klaxa> that's only the command you execute
[18:26] <StaRetji> I remember I had a working example before but I was stupid enough not to save it :/
[18:27] <StaRetji> yes, the error is only this: pipe:: Invalid data found when processing input
[18:27] <StaRetji> should I paste whole console text?
[18:27] <klaxa> yes
[18:29] <StaRetji> http://pastebin.com/ZcD2tJXe
[18:29] <StaRetji> here it is
[18:29] <StaRetji> thx one more time, really appreciated
[18:30] <klaxa> you see
[18:30] <klaxa> >wget: missing URL
[18:30] <klaxa> >Usage: wget [OPTION]... [URL]...
[18:30] <klaxa> that's the important error you get
[18:30] <StaRetji> yes
[18:30] <StaRetji> missing url
[18:30] <klaxa> you are using wget incorrectly, obviously it won't output any data
[18:30] <klaxa> that's why ffmpeg can't process the data (because it's not there)
[18:31] <klaxa> try wget -O - <url> instead
[18:31] <StaRetji> omg lol
[18:31] <klaxa> i.e.: wget -O - 'http://myurl/video.mp4'
[18:31] <StaRetji> I was typing 0 instead of O
[18:31] <klaxa> also i think ffmpeg can handle http downloads itself
[18:32] <klaxa> so ffmpeg -i http://myurl/video.mp4 should work too
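The two working variants from this exchange, side by side (URL and output name are placeholders):

```shell
# Pipe a download into ffmpeg: wget must write to stdout (-O -),
# and ffmpeg reads the pipe via "-i -".
wget -O - 'http://example.com/video.mp4' | ffmpeg -i - output.mp4

# Or skip wget entirely and let ffmpeg fetch over HTTP itself:
ffmpeg -i 'http://example.com/video.mp4' output.mp4
```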
[18:32] <StaRetji> well, to be honest, I don't need wget, I was just trying to find a working pipe example so that I can use it with openRTSP
[18:32] <StaRetji> as it fails there all the time
[18:32] <klaxa> ah
[18:32] <StaRetji> ffmpeg won't play rtsp live555
[18:33] <StaRetji> and I am sure openRTSP can play it
[18:33] <StaRetji> so, idea was to openRTSP the url
[18:33] <StaRetji> and pipe to ffmpeg, but this fails, so I start from the beginning, I need to get a working pipe example
[18:33] <StaRetji> btw, I was typing O after all :/
[18:34] <klaxa> you still need the '-' to tell wget to output the file to stdout
[18:34] <StaRetji> but -O - did the trick ;)
[18:35] <StaRetji> yes, thx klaxa,
[18:35] <StaRetji> is '-' needed as a wget option
[18:35] <StaRetji> or will I have to use it if I try to pipe openRTSP too?
[18:36] <klaxa> depends on openRTSP, i don't know it at all, i'll take a short look at the manpage
[18:36] <StaRetji> cheers mate
[18:37] <klaxa> you have to use -q , see: http://www.live555.com/openRTSP/ Outputting a ".mov", ".mp4", or ".avi"-format file section
[18:37] <klaxa> or -4
[18:38] <klaxa> i myself would tend to -4 because ewww .mov
[18:38] <StaRetji> tried both
[18:39] <StaRetji> let me try one more time
[18:39] <StaRetji> yes, openRTSP plays the file
[18:40] <StaRetji> but that is it, ffmpeg never shows up or takes over
[18:40] <StaRetji> all I got is this:
[18:40] <klaxa> can you paste the complete console log?
[18:40] <StaRetji> Started playing session
[18:40] <StaRetji> Receiving streamed data (signal with "kill -HUP 2992" or "kill -USR1 2992" to terminate)...
[18:40] <StaRetji> of course, give me a second
[18:44] <StaRetji> klaxa: http://pastebin.com/CnmfyD8P here you go mate
[18:46] <klaxa> StaRetji: can you do the same, but add 2> /dev/null before the first | ?
[18:46] <klaxa> and then paste output
[18:47] <StaRetji> ok, moment
[18:50] <StaRetji> http://pastebin.com/9zQBVdyb here ya go klaxa mate, seems no error, but stays like that
[18:50] <t4c0c4t> can anyone suggest a solution for looping a file? I have been trying movie=file.mpg:loop=0 but I receive an error stating that ffmpeg cannot find a suitable output format
[18:50] <StaRetji> and streaming doesn't work
[18:55] <klaxa> StaRetji: did you read the part about -w -h and -f being important?
[18:55] <StaRetji> ehm
[18:55] <StaRetji> not sure if I understand
[18:55] <klaxa> for openRTSP
[18:56] <klaxa> it says here: >If the session contains a video subsession, you should also use the "-w <width>", "-h <height>" and "-f <frame-rate>" options to specify the width and height (in pixels), and frame rate (per-second) of the corresponding video track. (If these options are omitted, then the values width=240 pixels; height=180 pixels; frame-rate=15 are used.) These values are important; if they are not correct, your file might not play at all!
[18:56] <StaRetji> oh, I've missed that
[18:56] <StaRetji> so, it should be openRTSP -4 and the w/h/f of the stream
[18:57] <StaRetji> hm, but, this changes
[19:02] <StaRetji> I did openRTSP -4 -w 720 -h 576 -f 25 and output is exactly the same as in previous pastebin
[19:04] <klaxa> try leaving out the ffmpeg command and see if it outputs anything to stdout at all
[19:04] <klaxa> (no '|' either this way your stdout will be filled with binary data though)
[19:08] <StaRetji> well, openRTSP pulls the stream, as always when I don't use ffmpeg pipe
[19:08] <StaRetji> once I try pipe (which we confirmed it is working for .mp4 file)
[19:08] <StaRetji> it fails
[19:08] <klaxa> so does it fill your stdout with binary garbage?
[19:09] <StaRetji> no
[19:09] <StaRetji> it stays at Started playing session
[19:09] <StaRetji> Receiving streamed data (signal with "kill -HUP 3037" or "kill -USR1 3037" to terminate)...
[19:09] <klaxa> you didn't add the 2> /dev/null then did you?
[19:10] <klaxa> because that kind of information should be sent to stderr and not to stdout
[19:10] <klaxa> in that case openRTSP doesn't output the stream to stdout correctly
[19:11] <StaRetji> yes, if I add 2> /dev/null
[19:11] <StaRetji> it doesn't output anything
[19:11] <klaxa> then openRTSP doesn't output anything to stdout
[19:11] <StaRetji> not even Started Playing session, it just stays
[19:11] <StaRetji> oh
[19:11] <StaRetji> got it, so that is why pipe is not working, right?
[19:11] <klaxa> and if openRTSP doesn't output anything, ffmpeg can't read anything
[19:11] <StaRetji> maaan lol, good conclusion
[19:11] <StaRetji> well done :)
[19:12] <StaRetji> I would have lost hours trying to make it
[19:12] <StaRetji> while it could never work
[19:12] <StaRetji> okay, so I have to find another way, probably cvlc
[19:12] <StaRetji> or something :)
[19:12] <StaRetji> anyway, thx klaxa, you are real pal
[19:12] <klaxa> or bug the live555 people about it
[19:13] <klaxa> they should know how openRTSP works (right?)
[19:13] <StaRetji> exactly, but now that it is like that, I am convinced this is done on purpose
[19:13] <StaRetji> so that can't be restreamed easily :)
[19:13] <klaxa> well it's kinda not working the way it's described to be working
[19:14] <StaRetji> hehehe, yeah
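For reference, the pipeline being attempted above would look roughly like this (a sketch; the RTSP URL is a placeholder, and per the live555 docs -4 is supposed to write an MP4 stream to stdout -- which, per this conversation, it did not):

```shell
# openRTSP writes its progress messages ("Started playing session", etc.)
# to stderr; with -4 the MP4 stream itself should go to stdout.
# The 2> /dev/null discards the status chatter so only stream data is piped.
# If stdout stays empty (as observed here), ffmpeg has nothing to read.
openRTSP -4 -w 720 -h 576 -f 25 rtsp://example.com/stream 2> /dev/null \
  | ffmpeg -i - -c copy out.mp4
```

The stderr/stdout separation is the crux: a pipe only carries stdout, so status text on stderr is invisible to the consumer, and an empty stdout stalls it.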
[19:50] <PaperWings> I'm not having good results playing this video in windows xp.  What can windows xp media player support?  -i "11062012050819353\avsscript.avs" -c:v libx264 -c:a aac -strict experimental -ac 1 -ar 44100 -b:v 512k -b:a 48k -y "11062012050819353\11062012050819353.mp4"
[19:55] <relaxed> PaperWings: use vlc
[20:03] <Techdeck> hey guys, I tried transcoding a video from m4v to mp4 and for some reason, the output video is really screwed. Every couple of seconds it keeps stalling and the speed of the video keeps changing - really weird. Any ideas why? The command and output are at: http://pastebin.com/kgsNHbcG
[20:04] <PaperWings> Vlc is cool, but not an option.  Should I try to do like MPEG2?
[20:10] <relaxed> mpeg4/mp3 should work
[20:11] <relaxed> ffmpeg -i input -c:v mpeg4 -vtag xvid -b:v 512k -c:a libmp3lame -ac 1 -ar 44100 -b:a 96k output.avi
[20:12] <relaxed> Do you need a specific size?
[21:05] <i42n> hey guys, I just installed pulseaudio on my archlinux system and try to record audio with ffmpeg now. unfortunately it does not work. can someone help me?
[21:06] <relaxed> did you build ffmpeg with libpulse support?
[21:07] <i42n> I did not build it myself. ffmpeg does actually not complain about any options. there is just an empty audio recorded
[21:07] <i42n> ok wait a second...
[21:08] <klaxa> try looking into pulseaudio volume control
[21:08] <klaxa> maybe it's set to the wrong recording device
[21:08] <i42n> there you go
[21:08] <i42n> http://pastie.org/5389239
[21:09] <klaxa> hmm... yeah looks correct, i'm pretty sure pulseaudio selects the wrong input device then
[21:09] <i42n> possible. I was not able to configure that yet.
[21:09] <klaxa> in the pulseaudio volume control see the recording tab and see if ffmpeg is assigned to the correct device
[21:10] <i42n> pavucontrol crashes on startup
[21:10] <i42n> doing a full system update now to have all packages in the same state
[21:10] <klaxa> heh... well you can always try the command line tool of pulseaudio (pacmd)
[21:12] <i42n> never used it before. so how can I set my mic as recording device?
[21:12] <i42n> in alsa it was hw:1,0
[21:12] <i42n> manpage does not help a lot
[21:12] <klaxa> run pacmd list-sources
[21:12] <klaxa> that should list all input sources you have
[21:12] <i42n> ah pacmd --help
[21:13] <relaxed> minor nitpick, but you don't need grep there- awk '/geometry/ {print $2}'
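relaxed's nitpick refers to a grep-then-awk pipeline not shown in this log; the point is that awk can match and print in one step. Illustrated with stand-in xdpyinfo-style input:

```shell
# Redundant: grep filters, then awk prints a field.
printf 'name: screen\ngeometry: 1920x1080\n' | grep geometry | awk '{print $2}'

# Equivalent, one process fewer: awk matches and prints itself.
printf 'name: screen\ngeometry: 1920x1080\n' | awk '/geometry/ {print $2}'
# both print 1920x1080
```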
[21:13] <klaxa> then look for your microphone
[21:14] <i42n> ok found it. its index is 5
[21:15] <klaxa> to make it easy you can set it as the default recording device, to do that run pacmd set-default-source 5
[21:15] <i42n> ok done
[21:15] <klaxa> then try recording with ffmpeg again
[21:15] <klaxa> that should set ffmpeg's recording sink to the default source (your mic in this case)
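The sequence klaxa describes, as commands (a sketch; the index 5 comes from this conversation, and ffmpeg must be built with libpulse for the pulse input device to exist):

```shell
# List capture sources; the trailing field of each "index:" line is the index
# (pacmd marks the current default with a leading "*").
pacmd list-sources | awk '/index:/ {print $NF}'

# Make source 5 (the mic found above) the default capture source:
pacmd set-default-source 5

# Record from the default PulseAudio source with ffmpeg:
ffmpeg -f pulse -i default -ac 1 -ar 44100 out.wav
```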
[21:17] <i42n> hm had no luck with that
[21:17] <i42n> also tried with audacity
[21:17] <i42n> recording pulse does not create any audio
[21:18] <klaxa> hmm... maybe ask in #pulseaudio ? they might know more
[21:19] <klaxa> with audacity, did you record stereo or mono?
[21:19] <klaxa> maybe one of the channels is muted
[21:19] <i42n> both
[21:19] <i42n> no audio on both
[21:19] <i42n> pulseaudio is running
[21:19] <i42n> checked that also
[21:20] <klaxa> the GUI for the pulseaudio volume control crashes when you start it right?
[21:20] <i42n> this is all fucked up... trying for hours now :/
[21:20] <klaxa> because checking with the gui would be a lot easier
[21:21] <i42n> actually it is not crashing any more - I installed the package updates in the background and now it works. seems to be that I had badly resolved dependencies
[21:22] <klaxa> you should have a "Input Devices" section, right?
[21:22] <klaxa> *an
[21:22] <klaxa> does your mic show up?
[21:22] <i42n> yes
[21:23] <i42n> i can see the bar jumping if i talk
[21:23] <klaxa> if you start recording with ffmpeg now, it should show up in the "Recording" tab
[21:23] <klaxa> next to it there should be shown from what source it's recording
[21:24] <klaxa> ah, you could also check the green tick next to your mic in the "Input Devices" tab
[21:25] <i42n> I checked it. when I start recording it used the wrong mic
[21:25] <klaxa> the pacmd output is a bit too verbose and kinda confusing, maybe you set the wrong index for the default recording device earlier
[21:26] <klaxa> checking the green tick next to your mic in the "Input Devices" section sets the default recording device, if you have set that you don't have to move ffmpeg's sink manually anymore
[21:26] <klaxa> or so it should work
[21:26] <i42n> ok I got what you mean
[21:26] <i42n> but actually the green tick is "set as fallback"
[21:27] <i42n> shouldn't it be "unticked" to make the other device default?
[21:27] <klaxa> hmm... good point
[21:27] <i42n> actually this did the trick :)
[21:27] <klaxa> until now i thought it's the default unless you specify something else before execution
[21:28] <i42n> it works now. but unfortunately the audio is about 5 seconds back in time
[21:29] <klaxa> that's pulseaudio for you :)
[21:29] <klaxa> either try offsetting it from the beginning or post process
[21:29] <klaxa> see -itsoffset in the ffmpeg manpage
[21:30] <i42n> but the offset will never be the same...
[21:30] <klaxa> yeah...
[21:30] <i42n> wunderfull
[21:30] <i42n> *ful
[21:31] <klaxa> if you don't plan to stream it, easiest would be to adjust the audio offset after recording
[21:31] <klaxa> or see if you can tweak pulseaudio for low latency
[21:31] <i42n> hm
[21:31] <i42n> I think the low latency in alsa blew up the alsa buffer before. that's why I am switching to pulse.
[21:32] <i42n> I have to tweak it after recording but that is a lot of work and no fun :(
[21:32] <klaxa> restarting pulse with a lower nice value *might* help
[21:33] <klaxa> in case you are using some pulseaudio modules for real-time processing (equalizer, mixing, whatever) see if you can avoid that, it increases latency
[21:33] <klaxa> had the same issue, in the end i just shifted the audio
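The post-recording shift klaxa settled on can be sketched with -itsoffset (the 5-second value and its sign are guesses based on this conversation -- measure the real offset and adjust; the filenames are placeholders):

```shell
# The same capture file is opened twice: video is mapped from input 0,
# audio from input 1, whose timestamps -itsoffset shifts 5 s earlier.
# -c copy avoids re-encoding; only container timestamps change.
ffmpeg -i capture.mkv -itsoffset -5 -i capture.mkv \
       -map 0:v -map 1:a -c copy synced.mkv
```

Note that -itsoffset applies to the input that follows it on the command line, which is why the audio source must be listed second here.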
[23:54] <sgfgdf> hello, guys! i have .vob video files (video: mpeg2video; audio: ac3) which are pretty big and I want to make them easily importable into video editing software. how should i convert them? which codecs (audio, video) and container are better to use? i want to keep the converted files as the originals, not the vob files. is there a standard (codecs, container) which the most famous DSLRs keep their files in?
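The question goes unanswered in this log; one common approach (a sketch, not a definitive recommendation) is to transcode into an intracoded intermediate that editing software handles well, e.g. ProRes video with PCM audio in a MOV container. The filename is a placeholder:

```shell
# DVD VOBs are interframe MPEG-2 + AC-3; NLEs generally prefer an
# intraframe codec with uncompressed audio. ProRes/PCM in MOV is one
# widely accepted combination (file sizes will grow, not shrink).
ffmpeg -i VTS_01_1.VOB -c:v prores -c:a pcm_s16le out.mov
```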
[00:00] --- Sat Nov 17 2012