[Ffmpeg-devel-irc] ffmpeg.log.20160223

burek burek021 at gmail.com
Wed Feb 24 02:05:01 CET 2016


[00:03:51 CET] <kyleogrg> I have a Firewire input on my laptop, and I have an FFmpeg command that will pipe a live display of it into ffplay.  This works pretty well (just some latency).  Now I added a color key effect so that I could have a live "green screen" effect, and replace the background.  But since I added this effect, now it won't open in ffplay -- the encoding time code is always at 00:00:00.00.
[00:04:04 CET] <kyleogrg> Here's the command line: http://pastebin.com/hvYYxdVU
[00:05:18 CET] <kyleogrg> Any clue as to why this might happen?  Is it because I added a JPG as an input, and this has no "duration"?
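kyleogrg's pastebin is gone, so the following is only a hypothetical sketch: the usual cause of a stalled timecode with a still-image input is that the JPG yields a single frame and then EOF. Looping it with -loop 1 keeps the colorkey/overlay graph fed. The capture device, filenames, and colorkey parameters below are placeholders, not the original command:

```shell
# Hypothetical sketch -- device, filenames and colorkey values are
# placeholders. The key part is -loop 1 on the image input so it never
# hits EOF; the command is built as a string for inspection.
cmd='ffmpeg -f dv -i CAPTURE_DEVICE -loop 1 -i background.jpg
  -filter_complex "[0:v]colorkey=0x00FF00:0.3:0.1[fg];[1:v][fg]overlay[out]"
  -map "[out]" -f nut - | ffplay -'
printf '%s\n' "$cmd"
```

With a finite recording you would also add -shortest, so the looped image does not hold the output open forever.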
[01:43:31 CET] <Graduating> ffmpeg -ss 00:5:10 -t 1  is it possible to use frame number instead of time -ss offset?
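-ss only accepts timestamps, but with a constant frame rate the conversion is just a division. A small sketch (25 fps is an assumption; frame 7750 at 25 fps happens to land exactly on Graduating's 00:5:10):

```shell
# frame number -> seconds for -ss, assuming constant frame rate
frame=7750   # placeholder frame number
fps=25       # placeholder rate; use the stream's real one
ss=$(awk -v f="$frame" -v r="$fps" 'BEGIN { printf "%.3f", f / r }')
echo "$ss"   # seconds, usable as: ffmpeg -ss "$ss" -t 1 -i input.mp4 ...
```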
[02:06:38 CET] <FlorianBd> Hi there :)
[02:07:20 CET] <Wildefyr> what the hell does the -crf option do?
[02:07:26 CET] <FlorianBd> by default, saving a movie clip to png files creates a flattened image (no alpha channel). How do I force the alpha channel even though it won't be used yet?
[02:07:31 CET] <Wildefyr> I can't find it anywhere in the manpage
[02:08:28 CET] <FlorianBd> Wildefyr: https://superuser.com/questions/677576/what-is-crf-used-for-in-ffmpeg
[02:09:10 CET] <Wildefyr> it's great that there is an answer (google wasn't helping much) but something like that really should be in the manpages
[02:09:28 CET] Action: FlorianBd found this in 10s with google :)
[02:09:55 CET] <Wildefyr> dunno why it wasn't being forthcoming for me
[02:13:31 CET] <FlorianBd> ok I found the answer to my question:   -pix_fmt rgba -c:v png
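A sketch of FlorianBd's answer as a complete command (filenames are placeholders): -pix_fmt rgba combined with the png encoder keeps an alpha channel in every exported frame:

```shell
# export frames as RGBA PNGs; input/output names are placeholders
cmd='ffmpeg -i clip.mov -pix_fmt rgba -c:v png frame_%04d.png'
printf '%s\n' "$cmd"
```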
[02:14:10 CET] <furq> Wildefyr: it's in the x264 manpage
[02:14:18 CET] <Wildefyr> ah
[02:14:24 CET] <furq> or ffmpeg -h encoder=libx264
[02:18:53 CET] <Wildefyr> yeah but as a new person to ffmpeg I had no idea that -crf was an option of libx264 specifically
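For reference, -crf is the Constant Rate Factor of the libx264 wrapper: 0 is lossless, 51 is worst, 23 is the default, and 18-28 is the commonly used band. A minimal sketch with placeholder filenames:

```shell
# constant-quality H.264 encode; lower CRF = better quality, bigger file
crf=23
cmd="ffmpeg -i input.mp4 -c:v libx264 -crf $crf -preset medium output.mp4"
printf '%s\n' "$cmd"
```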
[03:56:54 CET] <k_sze[work]> For libx264 and ffv1 level 1, is the GOP size fixed or is it allowed to change throughout the encoding?
[03:57:30 CET] <J_Darnley> libx264 has variable
[03:57:39 CET] <J_Darnley> ffv1 is intra-only (I think)
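If a fixed GOP is actually wanted with libx264, the usual recipe is to pin the keyframe interval and disable scene-cut keyframes; a hedged sketch (interval and filenames are placeholders):

```shell
# force a constant GOP length with libx264 (values are placeholders);
# -sc_threshold 0 stops scene-cut detection from inserting extra keyframes
gop=250
cmd="ffmpeg -i input.mp4 -c:v libx264 -g $gop -keyint_min $gop -sc_threshold 0 output.mp4"
printf '%s\n' "$cmd"
```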
[04:12:55 CET] <t4nk404> I am trying to use ffprobe to determine if an in-progress download/save of an mpg file has completed. Is there any identifying information in the probe data which would indicate this, so I can skip currently-downloading mpgs in a tool I'm building?
[05:17:06 CET] <k_sze[work]> hmm, then I need a way to extract the index of keyframes from my .nut files.
[05:18:23 CET] <k_sze[work]> Efficiently, of course. I know I can basically get that information using ffprobe but it seems to require reading the whole file when in fact the index of keyframes should be stored at the end of the .nut file already.
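Short of writing code against the nut index itself, one ffprobe recipe is to let the decoder skip non-key frames; it still scans the file, so this is a workaround rather than an index read. The filename is a placeholder, and on 2016-era builds the field may be pkt_pts_time rather than pts_time:

```shell
# list keyframe timestamps; note it still reads the whole file
cmd='ffprobe -v error -select_streams v:0 -skip_frame nokey
  -show_entries frame=pts_time -of csv=p=0 input.nut'
printf '%s\n' "$cmd"
```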
[05:54:32 CET] <k_sze[work]> ugh, I just used ffprobe on a libx264/.nut file, somehow the display_picture_number for all frames are 0.
[10:04:50 CET] <smolleyes> hello
[10:05:59 CET] <smolleyes> no way to do/add a visualisation like this thru ffmpeg ? https://www.youtube.com/watch?v=JI0Ws6rOJws
[10:06:25 CET] <smolleyes> i use the waveform filter for the moment but spectrum might be cool
[10:54:22 CET] <smolleyes> no ideas ? the showcqt filter seems right without the letters and with the spectrograph... just want the "bars"
[11:10:18 CET] <smolleyes> ok found showfreqs in the doc :)
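A sketch of the kind of bar visualisation smolleyes was after, using showfreqs (filenames and size are placeholders; mode can also be line or dot):

```shell
# render audio as moving frequency bars (placeholder names and size)
cmd='ffmpeg -i audio.mp3 -filter_complex "showfreqs=s=1280x720:mode=bar"
  -c:v libx264 bars.mp4'
printf '%s\n' "$cmd"
```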
[12:23:22 CET] <bencc1> can ffmpeg play rtp from a file?
[12:23:26 CET] <bencc1> in what format?
[14:16:25 CET] <ented> Hello, I have some problems with decoding a video file. I decoded YUV frames and need to dump them. What is the fastest way to do so? I am using the C interface.
[14:17:56 CET] <ented> That is, I need to debug them. Does ffmpeg have an api to dump them to, say, jpegs for debugging purposes? I do not want to establish an entire avformat context just for temporary debugging
[14:18:17 CET] <ented> ?
[14:25:09 CET] <J_Darnley> fwrite
[14:26:11 CET] <J_Darnley> jpeg features are only available through avformat
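If a one-off dump from the CLI is acceptable for debugging (the C-API route J_Darnley describes is just fwrite() of each plane, or going through avformat for jpeg), the command-line equivalents would be roughly these, with placeholder names:

```shell
# dump the first 10 frames as raw YUV, or as jpegs via the image2 muxer
raw='ffmpeg -i input.mp4 -frames:v 10 -f rawvideo dump.yuv'
jpg='ffmpeg -i input.mp4 -frames:v 10 frame_%03d.jpg'
printf '%s\n%s\n' "$raw" "$jpg"
```

The raw dump can then be inspected with ffplay -f rawvideo -pixel_format yuv420p -video_size WxH dump.yuv (substituting the real pixel format and dimensions).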
[17:34:43 CET] <iGeni> how do i take screenshots of a .m3u8 file
[17:39:41 CET] <DHE> m3u8 still plays like most regular video files or live streams. same way you would anything else
[17:39:46 CET] <J_Darnley> Open it in a text editor and press printscreen?
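DHE's point as a concrete sketch (the URL is a placeholder): an m3u8 is just another input, so a single-frame grab works the same way as for any file:

```shell
# grab one screenshot from an HLS playlist (placeholder URL)
cmd='ffmpeg -i "https://example.com/stream.m3u8" -frames:v 1 screenshot.png'
printf '%s\n' "$cmd"
```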
[17:59:15 CET] <lroe> My goal here is to take an rtsp stream from an IP camera, buffer it locally, and reshare it to whoever wants to consume it via a web-friendly link.  I'm working on configuring FFserver as I think this is the right tool; however, I'm a bit hazy on which stream format I should use and how to get ffmpeg to 'feed' ffserver the rtsp stream from the IP camera.  For reference I'm using version 2.6.5
[17:59:43 CET] <lroe> Any pointers would be appreciated
[18:39:13 CET] <DHE> lroe: ffserver is actually not well supported. you are encouraged to try... well, anything else really. nginx with the rtmp module, HLS or MPEG-DASH are alternatives
[19:32:15 CET] Action: andrey_utkin wondering why generation of placeholder video from image and silence with filter sources "movie" and "aevalsrc" behaves so weird
[19:39:02 CET] <durandal_1707> andrey_utkin: command example?
[19:42:40 CET] <andrey_utkin> ffmpeg -f lavfi -graph "movie=filename=test.jpg:loop=1882,settb=100/2997,setpts=N[out0]; aevalsrc=0:sample_rate=48000[out1]" -i nullsrc -map 0 -t 62.81 -vcodec libx264 -acodec aac -strict -2 -y /tmp/placeholder.mp4
[19:42:51 CET] <andrey_utkin> durandal_1707: --^
[19:43:51 CET] <andrey_utkin> maybe i don't get something, but it's quite a bit harder to obtain than with just "testsrc"
[19:45:12 CET] <durandal_1707> andrey_utkin: weird?
[19:47:11 CET] <Mavrik> hmm.
[19:47:29 CET] <Mavrik> Is there a precedent for FDK-AAC not holding to low bitrate?
[19:47:45 CET] <Mavrik> (E.g. encoding what should be 64kbit audio to 90-100)?
[19:47:49 CET] <andrey_utkin> durandal_1707: it's weird how much time it takes, even if it works
[19:47:59 CET] <andrey_utkin> have you tried it?
[19:49:38 CET] <durandal_1707> No, but you can loop images also with loop filter, with correct pts
[19:55:07 CET] <andrey_utkin> Cool, but still this gives less than 1 FPS of output on Core i5
[19:55:10 CET] <andrey_utkin> ffmpeg -f lavfi -graph "movie=filename=test.jpg,loop,settb=100/2997,setpts=N[out0]; aevalsrc=0:sample_rate=48000[out1]" -i nullsrc  -t 62.81 -vcodec libx264 -acodec aac -strict -2 -y /tmp/placeholder.mp4
[19:55:35 CET] <andrey_utkin> test.jpg is just a default-sized testsrc frame
[19:58:01 CET] <durandal_1707> you haven't set any parameters for loop, with no params it does nothing
[19:58:33 CET] <durandal_1707> andrey_utkin: ^
[20:01:38 CET] <bencc1> is it possible to play rtp packets from a file?
[20:01:43 CET] <bencc1> or only from udp socket?
[20:02:37 CET] <andrey_utkin> durandal_1707: wow. As soon as I set all the params for "loop", it starts working fast.
[20:02:41 CET] <andrey_utkin> ffmpeg -f lavfi -graph "movie=filename=test.jpg,loop=loop=1882:start=0:size=32767,settb=100/2997,setpts=N[out0]; aevalsrc=0:sample_rate=48000[out1]" -i nullsrc  -t 62.81 -vcodec libx264 -acodec aac -strict -2 -y /tmp/placeholder.mp4
[20:03:35 CET] <andrey_utkin> I still wonder what is "Set maximal size in number of frames for loop filter" and why max is 32767
[20:04:17 CET] <durandal_1707> duration of loop, for 1 image input its always 1
[20:04:56 CET] <andrey_utkin> ok, but it is not clear at all from the doc
[20:05:00 CET] <andrey_utkin> thanks a lot
[20:06:07 CET] <andrey_utkin> another question: is there a ./configure option to include default set of filters after --disable-everything?
[20:11:14 CET] <durandal_1707> what's default set of filters?
[20:11:43 CET] <ethe> durandal_1707: the filters which are enabled with no options I'd presume
[20:11:45 CET] <andrey_utkin> the set which you get with ./configure without additional options
[20:12:34 CET] <durandal_1707> enable-filters?
[20:12:47 CET] <andrey_utkin> will try that
[20:16:03 CET] <andrey_utkin> yes, --enable-filters work. Thank you durandal_1707
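The combination that worked, spelled out; any further --enable-* options for the codecs, muxers and protocols a given build still needs are omitted here:

```shell
# strip the build down, then restore the default filter set
cmd='./configure --disable-everything --enable-filters'
printf '%s\n' "$cmd"
```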
[20:30:17 CET] <Rajko> the h264_mp4toannexb bsf will 'inject' SPS/PPS NALs if they arent present in the source stream but are in extradata, right ?
[20:55:57 CET] <Mavrik> yp
[20:57:08 CET] <Rajko> Mavrik, hi ?
[21:31:25 CET] <James123> Hi! Does anyone know how to extract keyframes from a video, say from timeA to timeB?
[21:34:35 CET] <J_Darnley> use the select filter, maybe twice.
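J_Darnley's suggestion can in fact be done with a single select expression, multiplying a time-window test by a picture-type test; a hedged sketch with placeholder times and filenames:

```shell
# keep only I-frames between t=30s and t=60s (placeholder values);
# -vsync vfr drops the gaps instead of duplicating frames to fill them
vf="select='between(t,30,60)*eq(pict_type,PICT_TYPE_I)'"
cmd="ffmpeg -i input.mp4 -vf $vf -vsync vfr keyframe_%03d.png"
printf '%s\n' "$cmd"
```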
[21:47:31 CET] <hunternet93> I can't get ffserver to stream h.264/aac over RTSP. I can start the sender just fine and receive over HTTP, but both ffplay and vlc just sit there when trying to receive over RTSP. Here's my ffserver config: https://gist.github.com/hunternet93/e85c2b805d7e2f73475f
[22:23:10 CET] <proxima> i have installed the ffmpeg libraries on ubuntu successfully, but I'm unable to compile the example code. When compiled with gcc it gives the following errors: http://pastebin.com/7MSQcQsX, and g++ gives this error: http://pastebin.com/73XnrjWz.  any help or suggestions? what changes or improvements do I need to make?
[22:24:54 CET] <J_Darnley> perhaps you should actually link to the libraries.
[22:26:27 CET] <proxima> the linking problem doesn't show up when compiled with g++
[22:26:55 CET] <J_Darnley> because it errors out long before linking
[22:27:40 CET] <J_Darnley> that in fact looks like the preprocessor
[22:27:51 CET] <proxima> okay
[22:41:17 CET] <proxima> thanks J_Darnley, its working now!!
[23:04:47 CET] <salviadud> I want to comment on my failed attempts at deciphering this hidden message
[23:05:31 CET] <salviadud> Since I'm at work I get all the electricity I can get, and therefore, all the attempts possible.
[23:06:51 CET] <salviadud> My latest attempt was to invoke ffmpeg with -cpuflags 0
[23:06:51 CET] <salviadud> That slowed everything down, and was not cool.
[23:06:52 CET] <salviadud> So, what I decided to do, was to compile the x264 library with the SSE flag, but disabling asm.  And to recompile ffmpeg without sse
[23:08:05 CET] <salviadud> So, the ffmpeg is now invoked without any cpuflag alterations.
[23:08:05 CET] <salviadud> I noticed that I already rendered 1 minute and 20 seconds worth of frames at 2448 kb
[23:08:05 CET] <salviadud> In 227 mins, which is 3.78 hours
[23:08:39 CET] <salviadud> So, I have sped up my rendering time by 4 hours.
[23:10:01 CET] <salviadud> The idea is that the video file itself should invoke the optimization when needed and only done by the library.
[23:10:02 CET] <salviadud> And that is the key to deciphering this thing.
[23:10:02 CET] <salviadud> I feel like Frankenstein, and that thing right there, is my monster.
[23:12:32 CET] <J_Darnley> I remember you
[23:17:01 CET] <salviadud> hehe
[23:17:01 CET] <salviadud> I'm trying to crack a really hard cookie
[23:19:44 CET] <salviadud> I also ran mediainfo on the file to make sure I used the same version of the library
[23:19:44 CET] <salviadud> All I could get was a date, so I had to approximate, hopefully that one is it.
[23:19:46 CET] <salviadud> Before I tried cpuflags 0, I tried it out by just disabling asm, and keeping sse on both ffmpeg and x264, it took a week to render, but I didn't get the result I was expecting.
[23:23:26 CET] <lroe> wow.  Resharing an rtsp stream should not be this difficult
[23:24:03 CET] <lroe> my goal is to serve an rtsp stream via a website so that the source of the rtsp stream doesn't get overloaded by 100s of people hitting it
[23:24:05 CET] <J_Darnley> You mean like copy-pasting the URL?
[23:24:19 CET] <furq> lroe: use nginx-rtmp and hls
[23:24:41 CET] <lroe> at the moment I have rtsp stream -> nginx rtmp but I can't get the rtmp stream to show up in a browser
[23:24:50 CET] <lroe> I can view it fine via vlc or mplayer
[23:25:46 CET] <furq> you're using flash, right
[23:25:53 CET] <lroe> no flash
[23:26:00 CET] <furq> rtmp is flash only
[23:26:09 CET] <furq> if you want browser support then use hls and hls.js
[23:26:21 CET] <lroe> ok, that's good to know
[23:26:21 CET] <furq> https://github.com/arut/nginx-rtmp-module/wiki/Directives#hls
[23:26:40 CET] <JEEB> oh, that thing supports hls now as well
[23:26:42 CET] <furq> you can serve rtmp and flash from the same location if you still want decent player support
[23:26:52 CET] <furq> s/flash/hls/
[23:27:15 CET] <lroe> how do I connect the rtsp stream to it?
[23:27:20 CET] <furq> exec_pull
[23:27:25 CET] <furq> https://github.com/arut/nginx-rtmp-module/wiki/Directives#exec_pull
[23:27:51 CET] <lroe> but instead of rtmp:// I use hls://?
[23:28:04 CET] <furq> hls is over http
[23:28:07 CET] <JEEB> woah, and that's under a nice license
[23:28:18 CET] <furq> it'll generate an m3u8 playlist, you just serve that
[23:29:42 CET] <JEEB> I always somehow thought that nginx-rtmp only supported rtmp(e)
[23:29:50 CET] <lroe> http://paste.debian.net/402869/
[23:29:56 CET] <lroe> so that's what I'm currently doing
[23:29:57 CET] <furq> it's supported hls for ages and it nominally supports dash as well
[23:30:00 CET] <JEEB> but I guess if it uses lavf dash/hls become rather simple (not really)
[23:30:07 CET] <furq> but i'm told (by someone in here whose name i forget) that the dash muxer is broken
[23:30:50 CET] <furq> lroe: that should still work
[23:30:59 CET] <furq> the server always takes rtmp as input
[23:31:23 CET] <JEEB> if it's lavf it should be possible to fix, since lavf's DASH by itself isn't broken. it uses positive CTS offsets which aren't liked by some livestreaming servers, but should otherwise be OK
[23:31:48 CET] <JEEB> because I'm dealing with some of those livestreamers I'm currently working on implementing those darn negative CTS offsets as well
[23:32:03 CET] <furq> i don't think it uses lavf at all but i could be wrong
[23:32:17 CET] <furq> i'm pretty sure the workaround was to exec ffmpeg for dash muxing
[23:32:20 CET] <JEEB> well of course you could implement your own stuff... but you'd have to put a lot more effort into it
[23:32:35 CET] <JEEB> and most probably the result wouldn't be as pretty
[23:32:55 CET] <lroe> http://paste.debian.net/hidden/263a82ef/
[23:33:00 CET] <JEEB> gesus christ
[23:33:02 CET] <lroe> and that is what my site looks like
[23:33:04 CET] <JEEB> https://github.com/arut/nginx-rtmp-module/blob/master/dash/ngx_rtmp_mp4.c
[23:33:12 CET] <JEEB> someone really went and implemented it
[23:33:46 CET] <furq> oh hey, rtmp_control
[23:33:47 CET] <furq> that's new
[23:34:05 CET] <furq> or maybe not. i guess i just didn't see it
[23:35:04 CET] <lroe> this is the whole rtmp {} section: http://paste.debian.net/hidden/8e76749d/
[23:35:21 CET] <furq> lroe: if you have rtmp working then you can just add "hls on;" to that application block
[23:35:32 CET] <lroe> right I did
[23:35:46 CET] <lroe> but how do I serve the hls?  that's the part I don't understand
[23:35:59 CET] <furq> it'll generate an m3u8 file in /tmp/hls
[23:36:07 CET] <furq> serve that over http and point a <video> tag to it
[23:36:36 CET] <furq> https://github.com/dailymotion/hls.js/
[23:36:38 CET] <furq> you'll need that as well
[23:36:40 CET] <lroe> should that m3u8 file exist?
[23:36:55 CET] <lroe> so if it doesn't exist something is wrong?
[23:37:14 CET] <furq> actually i wonder if exec_pull will work with hls because no play request is made
[23:37:41 CET] <furq> you might need to have the html player make a request to the rtmp_control handler or something
[23:37:58 CET] <furq> but yeah the m3u8 will only exist when a stream is active
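lroe's pastes are no longer available, so this is only a hypothetical sketch of the rtmp section being discussed; the camera URL, application name, and paths are placeholders. exec_static starts the pull when nginx starts, so the m3u8 appears without waiting for a play request:

```nginx
rtmp {
    server {
        listen 1935;
        application hawk {
            live on;
            hls on;
            hls_path /tmp/hls;
            # pull the camera on startup, independent of any clients
            exec_static ffmpeg -rtsp_transport tcp -i rtsp://CAMERA/stream
                        -c copy -f flv rtmp://localhost/hawk/live;
        }
    }
}
# the http side then serves /tmp/hls as static files, e.g.
# location /hls { root /tmp; add_header Cache-Control no-cache; }
```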
[23:38:54 CET] <lroe> this hls.js thing is a node.js package?
[23:38:56 CET] <Rajko> isnt it better to use video.js with the hls source ?
[23:39:01 CET] <Rajko> instead of just raw hls.js
[23:39:18 CET] <Rajko> you get an actual player and such
[23:39:19 CET] <furq> lroe: https://raw.githubusercontent.com/dailymotion/hls.js/master/dist/hls.min.js
[23:39:32 CET] <furq> Rajko: there's an actual player in my browser, why not use that
[23:39:35 CET] <Rajko> (its based on hls.js)
[23:39:47 CET] <Rajko> furq, it has things like flash fallback for browsers without MSE etc
[23:40:11 CET] <Rajko> controls on the video for pause and volume and such
[23:40:34 CET] <furq> the builtin player has controls, and it's not particularly difficult to add a fallback manually
[23:40:56 CET] <furq> i mean you can use video.js if you want but the builtin player works fine for me
[23:41:00 CET] <Rajko> it only has controls if youre playing from a mp4 file, not with MSE
[23:41:13 CET] <JEEB> why wouldn't it have controls with MSE?
[23:41:15 CET] <furq> it definitely does have controls with hls.js
[23:41:20 CET] <Rajko> ok then
[23:41:26 CET] <JEEB> I mean, I've used MSE with dash.js
[23:41:27 CET] <Rajko> their sample page doesnt
[23:41:31 CET] <furq> lroe: you can also use exec_static instead of exec_pull
[23:41:41 CET] <JEEB> (which sucks, but I've tried it)
[23:42:37 CET] <furq> that'll work regardless of whether there are any clients
[23:43:45 CET] <lroe> so with exec_static and hls_path /var/www/hawk/;  I should have an m3u8 file in /var/www/hawk?
[23:43:58 CET] <furq> i should have thought so
[23:44:01 CET] <furq> i've never used exec_* though
[23:45:03 CET] <furq> er
[23:45:13 CET] <furq> it'll be in /tmp/hls/hawk unless you changed it
[23:46:10 CET] <lroe> right, I changed it to /var/www/hawk/ but nothing was created there
[23:55:04 CET] <Rajko> can MSE decode intra-refresh streams ?
[23:55:26 CET] <JEEB> I think that depends more on the actual decoder than the MSE demuxing
[23:55:36 CET] <Rajko> because flash just waits for a intra frame
[23:55:38 CET] <JEEB> MSE is just the input/demuxing
[23:55:50 CET] <JEEB> it then feeds to the browser's decoder(s)
[23:56:03 CET] <JEEB> whatever they do is then up to the gods
[23:56:25 CET] <Rajko> ppapi in chrome exposes the decoder, you can feed it annex-b NALs
[23:56:50 CET] <Rajko> it uses ffmpeg-sumo.dll or DXVA internally
[23:56:56 CET] <JEEB> what I meant is that it depends on the decoder, not MSE
[23:57:15 CET] <Rajko> can you answer this
[23:57:16 CET] <Rajko> <Rajko> the h264_mp4toannexb bsf will 'inject' SPS/PPS NALs if they arent present in the source stream but are in extradata, right ?
[23:57:32 CET] <JEEB> it will convert the AVCc extradata to Annex B
[23:57:35 CET] <JEEB> which is in-band
[23:58:06 CET] <Rajko> so if the first frame isnt sps/pps but there is some in extradata... the first frame will become sps/pps ?
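A sketch of the conversion JEEB describes (filenames are placeholders): the bitstream filter rewrites the AVCC extradata as in-band Annex B parameter sets, so the SPS/PPS end up in the output stream itself:

```shell
# remux H.264 from mp4 to a raw Annex B stream (placeholder names)
cmd='ffmpeg -i input.mp4 -c:v copy -bsf:v h264_mp4toannexb -f h264 output.h264'
printf '%s\n' "$cmd"
```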
[00:00:00 CET] --- Wed Feb 24 2016

