[Ffmpeg-devel-irc] ffmpeg.log.20130512

burek burek021 at gmail.com
Mon May 13 02:05:01 CEST 2013


[03:49] <JRWR> I'm having some issues with RTMP streaming [Command Line: http://paste.debian.net/3689/ | Ver: http://paste.debian.net/3688/] the fps that ffmpeg is encoding is too fast, 44~fps, it's overloading the FMS server and other services I've tried, is there any way to fix this?
[05:02] <diyos> I have an MKV file that I would like to change the order of the tracks on.   |  + Track number: 5 (track ID for mkvmerge & mkvextract: 4)
[05:02] <diyos> This Track #5 I would like to be #2 instead
[05:02] <diyos> Video track 1, then this audio switched to Track 2, then everything else follows
[05:02] <diyos> Is this possible without fully remuxing?
[06:56] <tomlol> h
[06:56] <tomlol> hello*
[06:58] <tomlol> I'm trying to record video/audio together from my desktop and the audio is always very jittery.  I get a bunch of these messages:  real-time buffer 727% full! frame dropped!
[06:59] <tomlol> I haven't found any info on the web talking about how to troubleshoot this, is there a link with that sort of info?
[07:01] <tomlol> I've tried all sorts of random settings, not setting anything works as good or better than most that I've tried.
[07:37] <Mavrik> tomlol, that looks like the encoder can't keep up with recording
[07:54] <tomlol> http://pastebin.com/Hvk0fvEH
[07:56] <Mavrik> hmm
[07:56] <Mavrik> you are grabbing the input at a huge resolution
[07:56] <tomlol> 720x480
[07:56] <tomlol> that app that is piped into crop just gets the window dims from a game
[07:57] <Mavrik> no the picture is being grabbed at 2560x1440
[07:57] <Mavrik> which you can see in the input part of the ffmpeg output
[07:57] <Mavrik> it's being cropped later
[07:58] <tomlol> oh, I see
[07:58] <tomlol> it isn't a problem unless audio is going with it
[07:59] <tomlol> so I assumed
[08:06] <Mavrik> tomlol, well basically your problem is that ffmpeg doesn't process your frames fast enough
[08:06] <Mavrik> the bottleneck is probably (but not 100%) the input bandwidth (if you have a very fast processor)
[08:06] <Mavrik> so try using lower display resolution or finding how to limit capture just to one window
[08:06] <Mavrik> or you can drop fps
[08:22] <tomlol> I do have a fast proc
[08:23] <tomlol> so that is likely
[08:37] <tomlol> that is odd that it only grabs one screen, I run dual head.
[08:47] <tomlol> I cut the rate down to 10fps and 1024x768 and it still has issues.
[11:05] <MortenB> Hey guys. Is it a good idea in general when scaling and using x264 to keep to multiples of 16? I'm thinking trunc(/16)*16 style
[11:06] <Fjorgynn> vf=scale
[11:06] <Fjorgynn> I think
[11:06] <Fjorgynn> or what?
[11:07] <MortenB> yes, exactly
[11:08] <MortenB> Several websites recommend using trunc(/2)*2 to ensure a multiple of 2 (otherwise x264 reports error), but elsewhere I've seen the suggestion that x264 produces optimized output if the resolution is a multiple of 16, so I'm thinking, why not force that?
[11:08] <Mavrik> MortenB, do you actually need /16?
[11:08] <Mavrik> MortenB, forget about that
[11:08] <Mavrik> doing that won't give you a measurable difference
[11:08] <Mavrik> keep it /2 if you have 420 chroma subsampling
[11:08] <Mavrik> and that's it
[11:08] <MortenB> ok, so it's true there's a difference, but it's very small?
[11:09] <Mavrik> MortenB, let me rephrase that
[11:09] <Mavrik> farting in the general direction of the computer will make a greater difference?
[11:09] <Mavrik> ok?
[11:09] <MortenB> hm, might try that as well
[11:09] <Mavrik> ...
[11:10] <MortenB> thanks
[11:10] <Mavrik> MortenB, increasing the size of video to multiples of 16 will of course cause your quality to drop though ;)
[11:11] <MortenB> yeah
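The mod-2 rounding discussed above is usually written in ffmpeg's scale filter as scale=trunc(iw/2)*2:trunc(ih/2)*2. A minimal Python sketch of the same arithmetic (the helper name is made up for illustration):

```python
def snap_down(value: int, multiple: int = 2) -> int:
    """Round a dimension down to a multiple, like trunc(x/m)*m in an
    ffmpeg scale expression. 4:2:0 chroma subsampling needs mod-2 dims;
    mod-16 padding is not needed for x264, as discussed above."""
    return (value // multiple) * multiple

# 853x481 is not encodable with 4:2:0 chroma; snap both dims down to even.
print(snap_down(853), snap_down(481))          # mod-2 rounding
print(snap_down(853, 16), snap_down(481, 16))  # mod-16, for comparison
```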
[11:13] <luc4> Hi! I'm trying to mux a h264 raw stream with vfr into an mp4. I pass each frame with a timestamp to libavformat. The resulting mp4 anyway does not keep the same speed and vlc cannot determine the total length. Maybe I failed to set the timestamps correctly?
[11:14] <Mavrik> luc4, hmmm… didn't know raw h.264 streams even have timestamps
[11:15] <Mavrik> are you sure ffmpeg is listening to them?
[11:15] <luc4> Mavrik: no, I suppose it does not. I'm dumping those to a text file, and reading those manually.
[11:16] <Mavrik> hmm
[11:16] <luc4> Mavrik: I modified the muxing.c sample code to not encode and just write into the muxed file my frames taken from a stream, along with the timestamp, expressed in time_base units.
[11:17] <Mavrik> ah I see
[11:18] <Mavrik> luc4, make sure your timebases are correctly set
[11:18] <Mavrik> even though… I'm not entirely sure mp4 muxer is really aware of the vfr option
[11:18] <Mavrik> never really worked with those streams
[11:19] <luc4> Mavrik: shouldn't it take into consideration the timestamps and render accordingly? Or maybe my assumption is wrong?
[11:20] <luc4> Mavrik: if I pastebin my code can you have a quick look to my modifications? Maybe you can see something wrong...
[11:21] <Mavrik> luc4, sometimes the muxers are trying to be "smart" and have their own pts generation… not commonly though
[11:21] <luc4> Mavrik: it is essentially the muxing.c source with a couple of modifications to take data from 2 files.
[11:21] <Mavrik> luc4, paste it and let's see
[11:22] <luc4> Mavrik: this is my code: http://paste.kde.org/742016/
[11:24] <luc4> Mavrik: what I modify is simply I removed audio (I have no audio), and I'm parsing a file containing h264 and another file containing on each line the timestamp for the frame in us and its size in bytes. I used Qt to read. Then I modified the write_video_frame function to write that. The rest is kept as it was.
[11:25] <luc4> Mavrik: I'm not entirely sure on the AVPacket creation in line 390. I tried to provide timestamps both absolute and relative to the beginning of the video, but nothing changed.
[11:26] <luc4> Mavrik: also unsure on 537, where I compute the timestamp. I tried many ways but none changed anything...
[11:27] <Mavrik> ugh man
[11:27] <Mavrik> that's a lot of pointless code
[11:28] <luc4> Mavrik: it is a code done only with the purpose of creating the video correctly. I'll have to rewrite all for android.
[11:28] <Mavrik> why are you even calling audio writing when you said you have no audio?
[11:29] <luc4> Mavrik: you mean in 553? I took the code from muxing.c, and changed only what I noticed to be strictly necessary.
[11:29] <Mavrik> yeah, you should really read the docs first
[11:30] <luc4> Mavrik: audio_st is kept NULL, so I think audio is ignored.
[11:30] <Mavrik> the only thing you have to do in write_packet funcion is to initialize the AVPacket
[11:30] <Mavrik> put the raw h.264 payload as data
[11:30] <Mavrik> set pts AND MORE IMPORTANTLY dts!
[11:30] <Mavrik> (don't ever set DTS to 0, this will break everything)
[11:30] <Mavrik> do you have B-frames in your stream?
[11:31] <luc4> Mavrik: ok, I read the documentation and it was saying to put 0 if unknown. What should I put in there?
[11:31] <Mavrik> the decoding time stamp.
[11:31] <Mavrik> again, do you have B-frames in your H.264 stream?
[11:32] <luc4> Mavrik: I'm checking that.
[11:33] <luc4> Mavrik: I created the stream using Android classes, and I specified nothing about B-frames, so maybe not... I specified I-frame interval to be 5.
[11:34] <Mavrik> um, you will have to check that
[11:34] <Mavrik> if it's baseline profile then there are no B-frames
[11:34] <Mavrik> but when dealing with video you need to understand you have TWO timestamps for each encoded frame:
[11:34] <Mavrik> 1.) When the frame has to be decoded
[11:35] <Mavrik> 2.) When the frame has to be displayed
[11:35] <Mavrik> if you don't have B-frames then those are the same, if you do then those WON'T be the same
[11:35] <Mavrik> also, encoders really need decoding timestamp to be set (DTS)
[11:35] <Mavrik> PTS MAY be omitted (and MUST be set to AV_NOPTS_VALUE NOT 0!)
[11:35] <Mavrik> but it shouldn't be
[11:35] <Mavrik> since you're not setting those properly, that is your problem
[11:36] <Mavrik> also, throw away all the code dealing with encoding, setting up encoders and everything, you don't need that and will only mess with your understanding of concepts
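The PTS/DTS distinction Mavrik describes can be sketched in a few lines (pure illustration, not libav API): B-frames that appear before an anchor frame in presentation order can only be decoded after that anchor.

```python
def decode_order(frames):
    """Reorder (type, pts) pairs from presentation order to decode order:
    each I/P anchor must be decoded before the B-frames that precede it
    in presentation order, because those B-frames reference it."""
    out, pending_b = [], []
    for frame in frames:
        if frame[0] == "B":
            pending_b.append(frame)  # hold Bs until their anchor arrives
        else:
            out.append(frame)        # anchor decodes first...
            out.extend(pending_b)    # ...then the Bs shown before it
            pending_b = []
    return out + pending_b

# Presentation order I B B P (pts 0..3) decodes as I P B B, so PTS and
# DTS differ; without B-frames the two orders coincide, i.e. dts == pts.
print(decode_order([("I", 0), ("B", 1), ("B", 2), ("P", 3)]))
```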
[11:36] <luc4> Mavrik: ok, I'll try following your advice, thanks!!
[11:36] <Mavrik> then when you have that, make sure your timestamps are running correctly:
[11:36] <Mavrik> you're setting timebase to 1/25
[11:36] <Mavrik> which means that frame at time 00:00:01.000 has to have timestamp of "25"
[11:37] <Mavrik> frame on second 2 needs timestamp of "50" etc.
[11:37] <Mavrik> and basically to mux an encoded stream your checklist is:
[11:37] <Mavrik> 1.) open AVFormatContext
[11:37] <luc4> Mavrik: that is not ok... where did you see that timebase is 1/25?
[11:37] <Mavrik> 2.) create stream with proper codec_id and timebase
[11:37] <Mavrik> 3.) for each frame
[11:37] <Mavrik>  - initialize packet
[11:37] <Mavrik>  - set PTS and DTS
[11:38] <Mavrik>  - set data and size
[11:38] <Mavrik>  - set AV_PKT_FLAG_KEY for keyframes
[11:38] <Mavrik>  - write to format
[11:38] <Mavrik> luc4, um, where it says "time_base.den" and "timebase_num" maybe?
[11:39] <luc4> Mavrik: but frame rate in my stream is variable... so my frames should not be equally distributed, right?
[11:40] <Mavrik> yes.
[11:40] <Mavrik> you still need to set a proper timebase
[11:40] <Mavrik> so the decoder knows WHAT your timestamps even mean
[11:40] <luc4> Mavrik: oh ok so that is correct.
[11:40] <Mavrik> (usually H.264 streams have timebase of 1/90000 for enough granularity)
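The timestamp arithmetic above (a frame's pts is its time divided by the stream timebase) can be checked with a quick sketch; the helper name is invented for illustration:

```python
from fractions import Fraction

def to_pts(time_seconds, time_base):
    """Convert a time in seconds into timebase units: pts = t / time_base."""
    return int(Fraction(time_seconds) / time_base)

print(to_pts(1, Fraction(1, 25)))   # second 1 at timebase 1/25 -> 25
print(to_pts(2, Fraction(1, 25)))   # second 2 -> 50
# A VFR frame stamped 1,500,000 µs, in the finer 1/90000 timebase:
print(to_pts(Fraction(1_500_000, 1_000_000), Fraction(1, 90000)))
```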
[11:41] <luc4> Mavrik: thanks for your help! I'll follow your indications.
[11:43] <luc4> Mavrik: I'm missing just one thing, supposing I also have B-frames, is it the decoder who should be telling me the dts?
[11:43] <luc4> sorry, I meant the encoder, not the decoder
[11:44] <Mavrik> luc4, yes, the encoder has to tell you PTS and DTS
[11:44] <Mavrik> since the encoder decides where to put them
[11:44] <Mavrik> but if you're using HW encoder on Android
[11:44] <luc4> Mavrik: yes, hw encoder on Android.
[11:45] <Mavrik> those ones only support baseline profile which doesn't allow B-frames
[11:45] <luc4> Mavrik: in fact I see nothing related to that. So, in my case, as you said, dts = pts, correct?
[11:46] <Mavrik> mhm
[12:09] <luc4> Mavrik: sorry, was that a confirmation or unsure?
[18:01] <rsh`> in a streaming video setup, person A wants to receive video from person B via RTP (with SIP connection initiation). A tells B that he can send RTP to UDP port 5000 and RTCP sender reports to UDP port 5001. However, I am confused which port A should use to send RTCP receiver reports back to B. It doesn't make sense to me that B should be "listening" on 5001 as well, but I can't find anywhere in the SIP spec that specifies anything else,
[18:01] <rsh`> and I can't determine how it should work by reading through the ffmpeg source code for RTP muxers
[18:26] <Mavrik> rsh`, why doesn't it make sense? :)
[18:26] <Mavrik> RTCP is twoway
[18:28] <rsh`> true, but how can you guarantee that person B should be able to bind to port 5001?
[18:28] <rsh`> especially in a NAT scenario
[18:28] <Mavrik> bind to port?
[18:29] <Mavrik> RTP/RTCP is used almost exclusively over UDP
[18:29] <Mavrik> no binding really
[18:29] <Mavrik> and yeah, NAT isn't something that was taken into account when designing RTP/RTCP protocols :)
[18:29] <rsh`> I'm confused a bit by what you said. Wouldn't you still need to bind to "listen" (in the UDP sense) on port 5001?
[18:30] <rsh`> is it uncommon in typical RTP/RTCP setups to have two way feedback?
[18:31] <Mavrik> rsh`, yes, but what does listening on a port have to do with NAT?
[18:32] <Mavrik> and yes, the whole point of RTCP is to have two-way feedback
[18:33] <rsh`> so behind NAT, you actually may present port 12345 to the outside world, so they would somehow need to be told this port (they can't just assume to send to 5001)
[18:33] <rsh`> I'm not sure where that communication happens
[18:33] <rsh`> couldn't find it in the ffmpeg source
[18:33] <rsh`> or in the SIP protocol for that matter
[18:33] <rsh`> but I could be overlooking something
[18:34] <Mavrik> basically, RTP expects the sender not to be NATed
[18:34] <Mavrik> there is an RFC that describes use of STUN protocol with RTP
[18:34] <Mavrik> haven't had time to read it yet though
[18:35] <rsh`> do you know if ffmpeg supports RTP with STUN today?
[18:35] <Mavrik> remember, most players expect that RTCP port is RTP + 1
[18:35] <Mavrik> rsh`, very much doubt it :)
[18:36] <Mavrik> http://sipsorcery.wordpress.com/2009/08/05/nat-rtp-and-audio-problems/
[18:37] <diyos> It's an unfortunately long path towards IPv6 :-(
[18:38] <rsh`> thanks for that link, Mavrik
[19:19] <xlinkz0> i've got a broken file
[19:19] <xlinkz0> ffmpeg says 'moov atom not found'
[19:19] <xlinkz0> is there any way to salvage anything from the file?
[19:48] <durandal_1707> xlinkz0: depends on many factors
[19:48] <xlinkz0> what can i do to debug it?
[19:50] <tomlol> is there a way to only screen grab a specified portion of the screen?
[19:51] <durandal_1707> xlinkz0: recognize and extract packets in file....
[19:54] <ubitux> tomlol: https://ffmpeg.org/ffmpeg-devices.html#x11grab
[19:54] <tomlol> I'm not on x11
[19:54] <tomlol> would I need to do it with a device though?
[19:54] <ubitux> look at the documentation of your input device then
[19:55] <tomlol> I was using crop but my res is too high, so when I try to add audio there isn't enough buffer.
[19:58] <tomlol> http://pastebin.com/Hvk0fvEH
[19:58] <tomlol> I found some stuff in the input device, looks like I can restrict it to the hwnd I need.
[20:01] <tomlol> the symptom I was seeing was this: [dshow @ 000000000245dce0] real-time buffer 727% full! frame dropped!
[20:01] <tomlol> then the audio would stutter or pop all over.
[20:01] <tomlol> I tried a lot of different settings with about the same result.
[20:02] <tomlol> is there a way that I can increase that buffer?  I have ram to spare.
[20:04] <durandal_1707> tomlol: maybe machine is just too slow to encode h264 in real time
[20:05] <tomlol> I'd hate to think what sort of machine you'd need if this one didn't work.
[20:05] <tomlol> hehe
[20:05] <tomlol> I was talking with someone last night and they said that it was probably input bound.
[20:06] <durandal_1707> what is input bound?
[20:06] <tomlol> I tried running the app in fullscreen at 1024x768 at 10 fps instead of 30 fps and the buffer was 150% full.
[20:07] <tomlol> [01:05] <Mavrik> tomlol, well basically your problem is that ffmpeg doesn't process your frames fast enough
[20:07] <tomlol> [01:06] <Mavrik> the bottleneck is probably (but not 100%) the input bandwidth (if you have a very fast processor)
[20:07] <durandal_1707> what 150%? where is it?
[20:07] <tomlol> that error message I posted about the real-time buffer being 727% full
[20:08] <tomlol> I found some docs on reducing screen input, so I'll try that.  I was just wondering if there was some pedestrian way of giving ffmpeg more resources.  I assume that the default is to take as much as it can get.
[20:12] <durandal_1707> you can't use crop to reduce buffer usage, that will just slow things down more ....
[20:16] <burek> tomlol, did you try -s before -i
[20:16] <durandal_1707> burek: you cant remember that link/ dont have bookmarks?
[20:16] <burek> durandal_1707 i don't get it :)
[20:16] <tomlol> I am not sure, ill try it rq.
[20:17] <durandal_1707> seeking while recording doesn't make sense
[20:18] <burek> durandal_1707, -s is not seeking..?
[20:18] <durandal_1707> oh...
[20:18] <burek> as i figured a guy needs to restrict the area that he captures
[20:20] <durandal_1707> makes sense, but then it's -video_size and not -s
[20:21] <durandal_1707> http://ffmpeg.org/ffmpeg-devices.html#dshow
[20:21] <burek> dshow doesn't pick the input -s(ize) automatically?
[20:23] <durandal_1707> one needs to specify the size of the window that's going to be recorded
[20:23] <durandal_1707> otherwise, whatever is default (whole screen?) is used
[20:23] <burek> ffmpeg -f dshow -s 640x480 -i video="UScreenCapture" output.flv
[20:25] <burek> btw, when i run "ffmpeg -list_devices true -f dshow -i dummy", i get "[dshow @ 028b78c0] Could not enumerate video devices."
[20:25] <burek> do i need a specific device installed in order to capture the screen?
[20:25] <tomlol> yep
[20:26] <tomlol> https://github.com/rdp/screen-capture-recorder-to-video-windows-free
[20:27] <tomlol> both -s and -video_size options don't seem to parse
[20:27] <burek> with -video_size
[20:28] <durandal_1707> video_size must be first, input args
[20:31] <tomlol> ffmpeg -f dshow -video_size 720x480 -i video="screen-capture-recorder" ...
[20:32] <burek> now im not sure if 720x480 is a valid format for video size
[20:32] <burek> or 720:480 or 720,480
[20:33] <burek> documentation doesn't say anything on that
[20:33] <durandal_1707> what is error?
[20:33] <tomlol> could not set video options
[20:33] <durandal_1707> burek: stop
[20:33] <durandal_1707> WxH should work fine
[20:33] <tomlol> I tried all
[20:34] <burek> ok
[20:37] <tomlol> the registry settings seem to reduce the capture size
[20:38] <tomlol> so I will see if that allows me to add audio
[20:48] <tomlol> I still get dropped frames, but the video seems to work.
[20:48] <tomlol> avi's also won't let me signal out of them for some reason.  mkv doesn't have an issue with that.
[21:44] <xlinkz0> what's wrong with this filtergraph? http://codepad.org/9EWbYIHx
[21:47] <ubitux> the separator before the scale filter at least
[21:48] <xlinkz0> ubitux: i have a ;
[21:48] <ubitux> yes and it is wrong
[21:49] <xlinkz0> ah
[21:49] <xlinkz0> thanks
[22:16] <xlinkz0> i have multiple videos with the same codec and container but different fps
[22:16] <xlinkz0> should i 'normalize' them all to the lowest fps or it doesn't matter for concatenating?
[22:22] <xlinkz0> another question.. is it faster to add a watermark if i reduce the number of fps at the same time vs just applying the watermark?
[22:40] <Mavrik> xlinkz0, for fps depends on container, but I bet you'll have interesting problems with playback if you concat them at variable framerate ;)
[22:40] <Mavrik> err, depends on codec
[22:40] <Mavrik> xlinkz0, reducing fps and applying watermark will require video reencode
[22:41] <Mavrik> doing it in two passes will take a while, do it together ;)
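The single-pass approach Mavrik suggests would chain the stock fps and overlay filters in one ffmpeg invocation; this Python sketch only assembles the argv list (file names and overlay position are placeholders):

```python
def watermark_cmd(src, logo, out, fps=25):
    """Build a one-pass ffmpeg command that drops the framerate and
    overlays a watermark in a single re-encode. fps and overlay are
    standard ffmpeg filters; paths here are illustrative placeholders."""
    graph = f"[0:v]fps={fps}[v];[v][1:v]overlay=10:10[out]"
    return ["ffmpeg", "-i", src, "-i", logo,
            "-filter_complex", graph,
            "-map", "[out]", "-map", "0:a?",  # keep source audio if any
            "-c:v", "libx264", out]

print(" ".join(watermark_cmd("gopro.mp4", "logo.png", "marked.mp4")))
```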
[22:47] <xlinkz0> Mavrik: it's h264 from GoPro's
[22:47] <xlinkz0> Mavrik: so it's better to reduce fps to the lowest video if i'm going to concat them?
[22:48] <xlinkz0> i mean for h264 will there be any problems with playback?
[22:50] <Mavrik> depending on a player
[22:50] <Mavrik> I strongly suggest against having variable framerate video
[22:50] <Mavrik> especially if the encoder didn't create it
[23:25] <hikiko> hello! I am trying to record video from the framebuffer but I get very low quality, could you please tell me which parameters to use to improve it?
[23:26] <hikiko> atm I just type avconv -f fbdev -i /dev/fb0 myvid.type
[23:27] <sacarasc> If you're using avconv, you should ask in #libav
[23:28] <hikiko> ok thank you :) +sorry!
[23:29] <beastd> hikiko: If you can get FFmpeg and try with ffmpeg binary you could also ask here for help.
[23:31] <hikiko> beastd, i just tried the ffmpeg with the same parameters and got a low quality too
[23:32] <sacarasc> Actual ffmpeg, or the ffmpeg legacy binary from libav?
[23:32] <hikiko> I suppose that it's because I record a virtual screen (i have connected my laptop to a vga) is there anyway to set the resolution?
[23:32] <hikiko> mmm
[23:32] <hikiko> let me see
[23:34] <hikiko> sacarasc, i think I have the libav one
[23:34] <hikiko> apt-cache show returns this line: Source: libav and when I type -v i get a message that this package is deprecated :/
[23:36] <hikiko> sacarasc, is it fine to compile the git version given here: http://www.ffmpeg.org/download.html ?
[23:37] <sacarasc> You can get a build from one of the first links, and use that.
[23:37] <sacarasc> And you don't need to install it, just use ./ffmpeg (if you're in the same directory as the ffmpeg binary).
[23:38] <hikiko> ok I download the tarball :)
[23:41] <hikiko> wow sacarasc!! much better!!
[23:41] <hikiko> thank you!
[00:00] --- Mon May 13 2013


More information about the Ffmpeg-devel-irc mailing list