[Ffmpeg-devel-irc] ffmpeg.log.20141112

burek burek021 at gmail.com
Thu Nov 13 02:05:02 CET 2014


[03:47] <danomite-> can ffplay show the frame rate?
[10:11] <ribasushi> for anyone interested - I solved my problem from yesterday by simply switching to matroska as intermediate containers (nothing changed in the codec/quality/pipeline)
[10:11] <ribasushi> with flv sources PiP desyncs stuff
[10:11] <ribasushi> with mkv sources - it works flawlessly
[10:12] <ribasushi> weird ;)
[11:13] <pentanol> ribasushi jap?
[11:13] <pentanol> or deu?
[11:16] <ribasushi> neither.
[11:45] <arkonova> Hi guys! Using libvorbis (if it has anything to do with it), do the options '-aq' and '-qscale:a' have the same effect? I can't find any authoritative doc on this, hope you can help me.
[11:50] <relaxed> arkonova: yes, use "-q:a" and look at --quality in "man oggenc" for the quality range.
[11:51] <arkonova> relaxed: Great, thank you
[12:59] <Safa_[A_boy]> Hello. Can someone explain the "-c copy" part in "ffmpeg -i www.sample.com/file.m3u8 -c copy file.ts" please? :)
[13:00] <iive> Safa_[A_boy]: it means that ffmpeg won't do any processing on the audio/video/subtitles etc. It would just demux the input, copy the packets verbatim and then mux the output
[13:00] <Safa_[A_boy]> I didn't find the -c parameter in the man page..
[13:01] <Safa_[A_boy]> Thanks :)
[13:01] <iive> if you want to do any processing, e.g. on the video (adding text, a box, deinterlacing, etc.), ffmpeg should first decode the video, then process the raw image and then encode it.
[13:01] <iive> if you want to change the codec, you also need to decode and encode with the new codec.
[13:02] <iive> hum... the full syntax is something like -c:v -c:a -c:s, for video, audio, subtitles etc... -c just assumes all of them.
[13:02] <iive> it is in there, it is short for the -codec option.
[13:03] <Safa_[A_boy]> Ok, thanks very much for this great information! :)
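[editor's note: a minimal sketch of the two equivalent forms of the command being discussed; the URL and filenames are placeholders from the question, and the commands are built as strings here rather than executed:]

```shell
# "-c copy" remuxes without re-encoding: packets are demuxed from the
# input and muxed into the output verbatim, with no decode/encode step.
# URL and filenames are placeholders taken from the question above.
INPUT="http://www.sample.com/file.m3u8"

CMD_ALL="ffmpeg -i $INPUT -c copy file.ts"
# Equivalent per-stream form: -c:v video, -c:a audio, -c:s subtitles.
CMD_PER_STREAM="ffmpeg -i $INPUT -c:v copy -c:a copy -c:s copy file.ts"

echo "$CMD_ALL"
echo "$CMD_PER_STREAM"
```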
[13:04] <DelphiWorld> Safa_[A_boy]: welcome from BH  to FFMPEG!
[13:04] <DelphiWorld> lol I thought I'm the only one from the MENA ;)
[13:06] <termos> I am having some issues when muxing to HLS, after calling avio_close the file descriptors to the .ts files are still open, the file descriptor to the m3u8 file has been closed. Is there a known bug here?
[13:09] <Safa_[A_boy]> DelphiWorld: O.o
[14:31] <waressearcher2> is "MX Player" for android based on MPlayer ?
[15:06] <ggVGc> can someone please tell me what parameters I need to produce this output stream? "Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709"
[15:29] <ribasushi> hmmmm have a tangentially-ffmpeg-related question
[15:29] <ribasushi> I have an rfbproxy capture
[15:30] <ribasushi> the following combo reproduces the colors *perfectly*: rfbproxy -l -p rec_vnc.rfb & xtightvncviewer 127.0.0.1:5910
[15:30] <relaxed> ggVGc: -c:v libx264 -profile:v high -pix_fmt yuv420p
[15:30] <ribasushi> however this pipeline generates "dull" colors (likely 601/709 loss or something): rfbproxy --framerate=30 -x rec_vnc.rfb | ppmtoy4m -v 1 -F 30:1 -S 420jpeg | yuvplay
[15:31] <ribasushi> my question is - how/if can I use ffmpeg's yuv4mpegpipe to reverse the incorrect colors
[15:31] <relaxed> why don't you pipe directly to ffmpeg to simplify what's going on
[15:32] <ggVGc> relaxed: thanks, I figured that out. But will that be completely lossless? I would like basically -crf 0, but not 4:4:4
[15:32] <ribasushi> relaxed: I am: rfbproxy --framerate=30 -x "$WORKDIR/rec_vnc.rfb" | ppmtoy4m -v 1 -F 30:1 -S 420mpeg2 | ffmpeg -y -hide_banner -f yuv4mpegpipe -i - .....
[15:32] <ribasushi> relaxed: the "disassembled pipeline" is because I am trying to debug where the color goes wonky
[15:32] <relaxed> lose ppmtoy4m
[15:33] <ribasushi> oh? how can I do that?
[15:33] <ribasushi> clearly there's a codec I missed experimenting with...
[15:34] <relaxed> try ... | ffmpeg -vcodec ppm -f image2pipe -r $framerate -i - ...
[15:34] <ggVGc> hm, seems profile:v is not compatible with lossless encoding
[15:35] <relaxed> lossless is high anyway
[15:36] <relaxed> for visually lossless use -crf 16 (or lower)
[15:36] <ribasushi> relaxed: indeed this works to remove ppmtoy4m (thanks for that): ... -f image2pipe -c:v ppm -r 30 -i -  ...
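[editor's note: the assembled pipeline from this exchange, shown as one command string; rec_vnc.rfb and out.mkv are placeholder names, and the string is echoed rather than run:]

```shell
# rfbproxy emits concatenated PPM frames; image2pipe plus the ppm
# decoder lets ffmpeg read them directly from stdin, so the ppmtoy4m
# stage can be dropped entirely.
FRAMERATE=30
PIPELINE="rfbproxy --framerate=$FRAMERATE -x rec_vnc.rfb \
| ffmpeg -y -hide_banner -f image2pipe -c:v ppm -r $FRAMERATE -i - out.mkv"
echo "$PIPELINE"
```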
[15:36] <ribasushi> the color problem is still there however
[15:37] <ggVGc> relaxed: yeah, I know, but this is source material for video editing, so I am trying to keep all quality. But after effects will not import h264 with profile High 4:4:4, so I guess I won't get all that I wished for
[15:37] <ggVGc> anyway, works not with profile:v High
[15:37] <ggVGc> thanks
[15:37] <ggVGc> works now*
[15:37] <relaxed> ggVGc: high444 is the lossless profile
[15:37] <ggVGc> yep, I read that
[15:37] <ggVGc> but yeah, after effecrs doesn't accept it
[15:38] <relaxed> ribasushi: can you have rfbproxy output to a file? If so, run "ffmpeg -i" on it and paste the results.
[15:39] <relaxed> effecrs?
[15:41] <ribasushi> relaxed: is this what you asked? http://paste.scsys.co.uk/439411
[15:42] <relaxed> can you view rfbexport with ffplay?
[15:42] <ribasushi> for the record - rfbproxy doesn't really have many knobs to tweak, -x is pretty much all of it
[15:42] <relaxed> Do the colors look right?
[15:43] <relaxed> does "ffmpeg -i -i rfbexport" return the same thing?
[15:44] <relaxed> minus an -i :)
[15:44] <ribasushi> relaxed: ffmpeg -i rfbexport errors
[15:44] <ribasushi> relaxed: still trying to see how to make ffplay run...
[15:44] <ribasushi> relaxed: I can upload the rfbexport somewhere, sec...
[15:45] <ribasushi> (compressing, will take a bit...)
[15:49] <relaxed> ggVGc: oh, after effects! It's hard to believe it doesn't take lossless h264
[15:50] <ggVGc> relaxed: I find that hard to believe also, but as far as I can tell, it doesn't support pixel format other than yuv420p, and it doesn't support h264 files with the profile "High 4:4:4", but with the profile High it works.
[15:50] <ggVGc> i.e. it refuses to load files with High 4:4:4, and if it's the wrong pixel format, it loads them but the image is complete garbage
[15:51] <ggVGc> but I am not well versed in video formats, so I might be misunderstanding something or doing something wrong
[15:51] <relaxed> ggVGc: did you use the .mov container?
[15:52] <ggVGc> hm, no, tried .m4v and .mp4. It loads them if I used High, but not High 4:4:4
[15:52] <ggVGc> why would the container matter?
[15:53] <relaxed> It might not matter, but it's worth a shot.
[16:00] <ggVGc> :( didn't work
[16:02] <relaxed> ggVGc: Ut Video is lossless, try ffmpeg -i input -codec:v utvideo -codec:a pcm_s16le output.avi
[16:02] <ggVGc> thanks, will give it a shot
[16:02] <relaxed> and it should work according to google
[16:03] <ggVGc> nice, thank you!
[16:03] <ggVGc> I am just now trying a .mov with vcodec huffyuv
[16:04] <ribasushi> relaxed: I pm-ed you a complete pipeline that plays for me, and all colors are desaturated
[16:04] <ribasushi> the magenta, the greens, etc
[16:04] <relaxed> ggVGc: it supports the following pixel formats rgb24 rgba yuv422p yuv420p, so use the one you need
[16:07] <ggVGc> thanks, I guess yuv422p is better then, since the real source is yuv444p
[16:07] <ggVGc> hm, that utvideo didn't work either :(
[16:08] <relaxed> ggVGc: add -pix_fmt yuv422p after the input
[16:10] <ggVGc> trying that now, but I don't think this was a pixel format error. When the pixel format was wrong before, it loaded but didn't show up right
[16:10] <ggVGc> this didn't even load
[16:10] <ggVGc> "cannot be opened, unsupported format" etc.
[16:10] <ggVGc> didn't work with the pix_fmt either
[16:22] <relaxed> ggVGc: hmm...my last guess, ffmpeg -i input -c:v rawvideo -pix_fmt yuv422p -codec:a pcm_s16le output.avi
[16:25] <ggVGc> yeah, I was just trying rawvideo
[16:25] <ggVGc> will see how it goes
[16:26] <ggVGc> can I make ffmpeg use one video as guide and output a new one with exactly the same stream config?
[16:26] <ggVGc> i.e read stream properties from video A to convert video B to C.out
[16:28] <waressearcher2> is "MX Player" for android based on MPlayer ?
[16:29] <ggVGc> relaxed: so, I exported a lossless video from after effects, and it gave me a stream with this. "rawvideo, bgr24, 960x576, 809758 kb/s, 60 fps, 60 tbr, 60 tbn, 60 tbc"
[16:29] <ggVGc> so if I can reproduce that in ffmpeg I should be okay
[16:30] <ggVGc> what does "SAR 1:1 DAR 5:3" mean?
[16:32] <relaxed> Sample Aspect Ratio 1:1 (square pixels) and Display Aspect Ratio 5/3
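[editor's note: a quick check of that arithmetic: with square pixels, DAR = SAR x (width/height), so 960x576 reduces to 5:3:]

```shell
# Display Aspect Ratio = Sample Aspect Ratio x (width/height).
# With SAR 1:1 (square pixels) a 960x576 frame has DAR 960:576 = 5:3.
gcd() {
  # greatest common divisor via Euclid's algorithm
  a=$1 b=$2
  while [ "$b" -ne 0 ]; do t=$((a % b)); a=$b; b=$t; done
  echo "$a"
}
W=960; H=576
G=$(gcd "$W" "$H")          # 192
DAR="$((W / G)):$((H / G))"
echo "$DAR"                 # prints 5:3
```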
[16:33] <relaxed> waressearcher2: it's based on ffmpeg https://sites.google.com/site/mxvpen/download
[16:34] <relaxed> ggVGc: did that import work?
[16:34] <ggVGc> yeah, I figured that out. How can I make it convert that? I am close now, but my source material is an h264 yuv444p .mkv, and the destination needs to be "rawvideo, bgr24, 960x576, 809758 kb/s, 60 fps, 60 tbr, 60 tbn, 60 tbc"
[16:34] <ggVGc> relaxed: no, but if I can make it match what I wrote above it will work
[16:35] <ggVGc> this almost gets me there "ffmpeg -i grab.mkv -pix_fmt bgr24 -codec:v rawvideo -s 960x576 -d 960x676 -acodec none test.avi"
[16:35] <ggVGc> but I get "rawvideo, bgr24, 960x576, 800891 kb/s, SAR 1:1 DAR 5:3, 60 fps, 60 tbr, 60 tbn, 60 tbc"
[17:06] <wh-hw> hi, everyone, how to detect a video's watermark position?
[17:11] <ggVGc> relaxed: .... so in the end I managed to get it to work with h264, .mp4 container, and -crf 1 instead of -crf 0
[17:11] <ggVGc> :(
[17:11] <ggVGc> I am so stupid
[17:11] <ggVGc> since -crf 1 gives the profile High, but is still basically lossless
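[editor's note: summing up where this thread landed, as command-string sketches with placeholder filenames: x264's -crf 0 engages true lossless mode, which implies the High 4:4:4 Predictive profile that After Effects rejected; -crf 1 stays within the High profile and is near-lossless:]

```shell
# -crf 0: true lossless, forces the High 4:4:4 Predictive profile
# (which After Effects refused to import in this thread).
LOSSLESS="ffmpeg -i grab.mkv -c:v libx264 -crf 0 out_lossless.mp4"
# -crf 1: stays in the High profile with yuv420p, visually
# near-lossless, and imports where High 4:4:4 is refused.
NEAR_LOSSLESS="ffmpeg -i grab.mkv -c:v libx264 -profile:v high -pix_fmt yuv420p -crf 1 out.mp4"
echo "$LOSSLESS"
echo "$NEAR_LOSSLESS"
```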
[17:30] <Isweet> Attempting to take an infinite stream of PNG files (png files being produced on the fly by a C program) and transcode them into video compatible with V4L2 devices. I'm confident that ffmpeg can do this, but could someone point me in the right direction?
[17:32] <relaxed> what does "compatible with V4L2 devices" mean?
[17:33] <Isweet> I need to be able to write the video to a video device in /dev/
[17:33] <Isweet> Such as a v4l2loopback device
[17:34] <Isweet> The ultimate goal is to get a stream of images to appear as the output of a webcam device
[17:37] <relaxed> cprogram | ffmpeg -vcodec png -f image2pipe -r $framerate -i - (encoding options) output
[17:38] <Isweet> I'll give it a try
[17:38] <Isweet> That's close to what I came up with on my first go
[17:38] <Isweet> but let me verify
[17:39] <relaxed> Isweet: are you trying to use another v4l2 program to read the loopback stream? (just curious)
[17:40] <Isweet> Yeah, I test it with VLC/MPlayer
[17:40] <Isweet> on Ubuntu 12 or 14 or something
[17:41] <Isweet> I don't have much experience with different encodings
[17:42] <Isweet> Off the top of your head, do you know what video encoding might be compatible with v4l2? Usually there's an explicit encoding option but I don't think ffmpeg has one
[17:42] <Isweet> let me check
[17:43] <relaxed> it depends, chould be mjpeg or h264 depending on the device
[17:43] <relaxed> coulr*
[17:44] <relaxed> ha, I can't type
[17:46] <relaxed> Isweet: https://trac.ffmpeg.org/wiki/Capture/Webcam
[17:48] <Isweet> Yeah, I'm trying to write rather than read
[17:48] <Isweet> Sorry for the delay, trying to get ffmpeg to install on 14.04
[17:49] <relaxed> right, but I thought you might get some ideas for your output there.
[17:49] <waressearcher2> relaxed: yes, I actually meant to ask "based on ffmpeg" but didn't replace the word MPlayer after asking the same question in #mplayer
[17:49] <waressearcher2> relaxed: but how is it "based" on ffmpeg if ffmpeg is not a player? is it based on ffplay?
[17:51] <relaxed> waressearcher2: without looking at it I assume it uses ffmpeg's libs to decode, demux, etc, just like ffplay and mplayer
[17:53] <Isweet> Currently installing ffmpeg from source for 14.04
[17:53] <Isweet> I'll test when I'm done
[17:53] <Isweet> I think the problem I was running into before is that the video encode wasn't compatible with my v4l2 loopback device
[17:53] <Isweet> Relaxed: Thanks
[19:46] <waressearcher2> is "MX Player" for android supported here ?
[19:54] <llogan> waressearcher2: we only provide support for FFmpeg stuff here
[19:58] <isweet> relaxed: Almost got the image -> video transcoding for v4l2 figured out
[19:58] <isweet> Using: cprogram | ffmpeg -vcodec png -f image2pipe -r 1/25 -i - -f rawvideo /dev/video0
[19:59] <isweet> But I'm getting "File 'dev/video0' already exists. Exiting."
[20:00] <isweet> But I'm getting "File '/dev/video0' already exists. Exiting."*
[20:01] <isweet> Is there a distinction between writing to a file and creating it?
[20:08] <waressearcher2> can someone confirm this is the right command to capture video in linux: "ffmpeg -threads 0 -async 30 -f alsa -i pulse -f x11grab -s 640x480 -r 30 -i :0.0+44,47 -vcodec libx264 -preset superfast -crf 16 -acodec libmp3lame -f mp4 /tmp/1.mp4" ?
[20:19] <danomite-> I'm having trouble getting my hls output to use the command line parameters: http://dpaste.com/3T97K4D
[20:29] <c_14> waressearcher2: looks about right
[20:31] <c_14> danomite-: what appears to be the issue?
[20:31] <c_14> ie, what ffmpeg is doing looks fine from where I'm sitting.
[20:32] <danomite-> c_14, the output of hls isn't honoring the command line args, for example hls_time is set to 2 seconds but the output is well over that
[20:49] <llogan> waressearcher2: do y-u really need the -async? i don't know what -threads 0 as an input option for x11 is supposed to do. you may want to add -pix_fmt yuv420p as an output option depending on your supported players
[20:49] <llogan> *you
[20:50] <llogan> and use -framerate instead of -r for x11grab input option
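[editor's note: putting llogan's suggestions together, the capture command from the question might be trimmed to something like this; geometry and paths are from the question, dropping -async and -threads is an assumption based on the advice above, and the command is built as a string rather than executed:]

```shell
# x11grab capture: -framerate (not -r) as the x11grab input option,
# and -pix_fmt yuv420p on the output for broad player compatibility.
CAPTURE="ffmpeg -f alsa -i pulse \
-f x11grab -framerate 30 -video_size 640x480 -i :0.0+44,47 \
-c:v libx264 -preset superfast -crf 16 -pix_fmt yuv420p \
-c:a libmp3lame /tmp/1.mp4"
echo "$CAPTURE"
```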
[20:50] <c_14> danomite-: add -force_key_frames 'expr:gte(n,n_forced+2)' as an output option
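[editor's note: the reasoning behind that option: HLS can only cut a segment at a keyframe, so -hls_time is honored only when keyframes land on segment boundaries. A sketch using the time-based form of the expression; filenames are illustrative and the command is echoed, not run:]

```shell
# Force a keyframe every 2 seconds so the HLS muxer can cut segments
# matching -hls_time 2. gte(t,n_forced*2) is the time-based form of
# the force_key_frames expression.
HLS="ffmpeg -i input.mp4 -c:v libx264 \
-force_key_frames expr:gte(t,n_forced*2) \
-f hls -hls_time 2 out.m3u8"
echo "$HLS"
```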
[20:50] <xanal0verlordx> Hello. How can I capture playback using alsa or pulse?
[20:50] <c_14> https://trac.ffmpeg.org/wiki/Capture/ALSA
[20:50] <llogan> http://ffmpeg.org/ffmpeg-devices.html#alsa
[20:51] <llogan> http://ffmpeg.org/ffmpeg-devices.html#pulse
[20:51] <xanal0verlordx> I saw all of that.
[20:51] <xanal0verlordx> There was nothing.
[20:51] <llogan> The Nothing took them.
[20:51] <c_14> Do you mean playback from an application? Ie audio output?
[20:51] <xanal0verlordx> Yes.
[20:52] <c_14> Either by using snd_aloop with alsa or by pushing funky knobs in pavucontrol
[20:52] <llogan> the wiki article could use an update mentioning that sort of stuff, IIRC
[20:53] <c_14> I could add the ALSA stuff to the wiki page, I just went through it with somebody a couple of days ago. Don't know squat about pulse though.
[20:54] <llogan> same here.
[20:54] <llogan> and what's the difference between -f alsa -i pulse, and -f pulse -i default or whatever?
[20:54] <llogan> i don't ever see people using -f pulse
[20:55] <xanal0verlordx> Isn't it obvious?
[21:05] <xanal0verlordx> Modprobed snd-aloop, used hw:1 for capture.
[21:05] <xanal0verlordx> Nothing.
[21:05] <xanal0verlordx> What should I do?
[21:05] <c_14> Sec, will have the wiki article up shortly.
[21:05] <xanal0verlordx> Thanks.
[21:06] <c_14> You'll be the first victi-- I mean volunteer.
[21:09] <danomite-> c_14, any pointers on reducing the stream delay with hls output?
[21:11] <c_14> xanal0verlordx: right, added 2 examples at the bottom of the page
[21:12] <xanal0verlordx> Capture/ALSA?
[21:12] <c_14> danomite-: no clue, sorry
[21:12] <xanal0verlordx> Thanks.
[21:12] <c_14> yep
[21:12] <c_14> If something doesn't work, ping me.
[21:13] <xanal0verlordx> What is pcm_substreams?
[21:14] <llogan> c_14: thanks. you could remove the ALSA suggestion from https://trac.ffmpeg.org/wiki/ArticlesForCreation when you're satisfied with the update
[21:18] <xanal0verlordx> It does not work without those substreams. And I can't rmmod it, it's in use.
[21:21] <danomite-> c_14, thanks, the key frame option worked
[21:23] <isweet> Currently using % cat abcd.png | ffmpeg -y -vcodec png -f image2pipe -r 1/25 -i - /dev/video0
[21:23] <isweet> To try to write images to a video device on linux
[21:24] <isweet> Getting "Unable to find a suitable output format for '/dev/video0'"
[21:24] <isweet> Any solutions?
[21:25] <c_14> xanal0verlordx: modprobe snd-aloop index=1 pcm_substreams=1 <- try that
[21:26] <c_14> isweet: add -f rawvideo
[21:26] <c_14> xanal0verlordx: you'll have to adjust the asoundrc though
[21:27] <c_14> Because you're using a different (new) loopback device
[21:27] <isweet> c_14: Tried that, yields a different error: "av_interleaved_write_frame(): Invalid argument"
[21:28] <c_14> isweet: you can try setting -f v4l2 as an output device, but I'm not sure ffmpeg'll like that either
[21:29] <isweet> c_14: "Unknown V4L2 pixel format equivalent for rgba"
[21:30] <c_14> What pixel formats does your device accept?
[21:31] <c_14> xanal0verlordx: or try `fuser -v /dev/snd/*' to find the processes using the sound output, kill them, `rmmod snd-aloop' and `modprobe snd-aloop pcm_substreams=1'
[21:31] <isweet> That's a question over my head. It's a video loopback device (https://github.com/umlaeute/v4l2loopback) based on v4l2. I assumed it would support at least the standard pixel formats listed in the V4L2 spec. Is that an incorrect assumption?
[21:32] <c_14> hmm, add -pixel_format yuv420p
[21:33] <isweet> % cat abcd.png | ffmpeg -y -vcodec png -f image2pipe -r 1/25 -i - -f v4l2 -pix_fmt yuv420p /dev/video0
[21:33] <isweet> like so?
[21:34] <c_14> ye
[21:34] <isweet> Brilliant! No errors that time
[21:34] <isweet> Let me actually read the devicenow
[21:34] <isweet> now*
[21:35] <isweet> It totally works.
[21:35] <isweet> c_14: Thanks so much. Been struggling with this for a few days.
[21:35] <isweet> The pixel format stuff is out of my wheelhouse
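[editor's note: the working invocation from this thread, generalized as a command string; /dev/video0 and the 1/25 rate are taken from the log above, "png_generator" stands in for the frame-producing program, and the string is echoed rather than executed:]

```shell
# Pipe PNG frames into a v4l2loopback device: force the v4l2 muxer
# with -f v4l2 and a pixel format the device accepts (yuv420p here).
# Note -r 1/25 means one frame every 25 seconds, as used in the log.
V4L2_CMD="ffmpeg -y -f image2pipe -vcodec png -r 1/25 -i - -f v4l2 -pix_fmt yuv420p /dev/video0"
echo "png_generator | $V4L2_CMD"
```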
[21:55] <xanal0verlordx> c_14: will try now.
[21:56] <xanal0verlordx> Killed pulseaudio, rmmoded snd_aloop.
[21:56] <xanal0verlordx> How to add this parameter to /etc/modules?
[21:56] <xanal0verlordx> Just write it with snd-aloop?
[21:57] <c_14> I think so.
[21:58] <c_14> Though you might have to add that to modprobe.d
[21:59] <c_14> with 'options snd-aloop pcm_substreams=1'
[21:59] <xanal0verlordx> I don't care about that right now. It still does not work.
[22:00] <xanal0verlordx> $ cat .asoundrc
[22:00] <xanal0verlordx> pcm.!default {
[22:00] <xanal0verlordx>   type plug slave.pcm "hw:Loopback,0,0"
[22:00] <xanal0verlordx> }
[22:00] <xanal0verlordx> Right?
[22:00] <c_14> ye, try adding a newline between plug and slave
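[editor's note: for reference, the corrected .asoundrc per c_14's pointer, with slave.pcm on its own line; the Loopback device name assumes snd-aloop is loaded:]

```
pcm.!default {
    type plug
    slave.pcm "hw:Loopback,0,0"
}
```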
[22:01] <waressearcher2> llogan: what option should I add to configure? --enable-x11grab? or is it only to grab the desktop? will it grab, say, a fullscreen opengl program?
[22:02] <xanal0verlordx> waressearcher2: yep, you should configure ffmpeg with --enable-x11grab.
[22:02] <xanal0verlordx> And some sound libs.
[22:03] <xanal0verlordx> Maybe, lame.
[22:03] <xanal0verlordx> It will grab everything you see.
[22:04] <waressearcher2> http://sprunge.us/eEcQ  these are options for configure I use, any other I should add ?
[22:04] <xanal0verlordx> c_14: still does not work.
[22:05] <waressearcher2> xanal0verlordx: by 'lame' you mean '--enable-libmp3lame' ?
[22:05] <xanal0verlordx> Yes.
[22:06] <waressearcher2> xanal0verlordx: are you into pr0n ?
[22:06] <xanal0verlordx> Wut?
[22:07] <c_14> Can you pastebin your .asoundrc and the output of aplay -l and the output of ffmpeg ?
[22:11] <xanal0verlordx> https://clbin.com/0GtWa
[22:11] <xanal0verlordx> c_14:
[22:13] <xanal0verlordx> news.microsoft.com/2014/11/12/microsoft-takes-net-open-source-and-cross-platform-adds-new-development-capabilities-with-visual-studio-2015-net-2015-and-visual-studio-online/
[22:13] <xanal0verlordx> OHWOW
[22:14] <xanal0verlordx> Today is 1st of April?
[22:17] <c_14> So what exactly isn't working?
[22:17] <xanal0verlordx> I do not hear any sound.
[22:17] <c_14> When playing the mp3?
[22:18] <xanal0verlordx> Yes.
[22:18] <c_14> Probably because your audio device is still set to the loopback device?
[22:18] <waressearcher2> xanal0verlordx: do you have 2 sound cards ?
[22:18] <waressearcher2> one Intel and other Yamaha ?
[22:18] <xanal0verlordx> waressearcher2: dunno.
[22:18] <waressearcher2> xanal0verlordx: lspci
[22:18] <xanal0verlordx> c_14: I hear all other sounds.
[22:19] <c_14> You have pulse installed?
[22:19] <xanal0verlordx> Yes.
[22:20] <c_14> I can think of 2 reasons it might not be working. 1: it's an avconv issue (avconv is from libav and is a fork of ffmpeg [and technically not supported here, if you're on debian you can get ffmpeg from unstable, otherwise you'll need to build from source]) or 2: pulse is being a bitch and breaking things
[21:21] <xanal0verlordx> I think the libav devs didn't break that stuff, so maybe it is pulse.
[22:22] <xanal0verlordx> Okay, do you know how to do loopback with pulse?
[22:23] <xanal0verlordx> waressearcher2: ~$ lspci | grep Audio
[22:23] <xanal0verlordx> 00:1b.0 Audio device: Intel Corporation 7 Series/C210 Series Chipset Family High Definition Audio Controller (rev 04)
[22:23] <c_14> Sadly no, I know it's possible but I don't know the details.
[22:26] <waressearcher2> xanal0verlordx: you have one card and its Intel
[22:27] <xanal0verlordx> waressearcher2: I see.
[22:28] <waressearcher2> now you know
[22:30] <xanal0verlordx> Will try tomorrow and tell you about the results, if you want to make your wiki better.
[22:30] <xanal0verlordx> Bye.
[22:48] <xanal0verlordx> Yet another question.
[22:49] <xanal0verlordx> Is there a way to grab video using Android?
[22:49] <xanal0verlordx> Smth like -f androidfb.
[22:51] <xanal0verlordx> Wait, it uses Linux fbdev.
[22:51] <xanal0verlordx> So I can grab video directly from /dev/fb0?
[00:00] --- Thu Nov 13 2014

