[Ffmpeg-devel-irc] ffmpeg.log.20130606
burek
burek021 at gmail.com
Fri Jun 7 02:05:01 CEST 2013
[00:00] <bunniefoofoo> sure, with png codec
[00:00] <pyBlob> I don't want to use mjpg as jpg uses lossy-compression
[00:00] <bunniefoofoo> Have you looked at FFMPEG-Java (or another java wrapper?) http://fmj-sf.net/ffmpeg-java/getting_started.php
[00:01] <smus> how can i tell what pix_fmts are supported? AV_PIX_FMT_YUV420P seems to work, but AV_PIX_FMT_RGBA doesn't.
[00:01] <bunniefoofoo> also there is JavaCV for that http://code.google.com/p/javacv/
[00:03] <pyBlob> yes, I already have thought about using something like this
[00:04] <pyBlob> but for now I want to use the ffmpeg application, not the library
[00:05] <pyBlob> so the correct command would be: "ffmpeg -i INPUT -f png OUTPUT" ?
[00:05] <pyBlob> -> reads from stdin and writes a png-sequence to stdout?
[00:05] <bunniefoofoo> smus, I'm not sure if ffmpeg command line can do it, but you can look at the codec sources. At the bottom of each _enc.c file under libavcodec is the declaration of supported formats
[00:06] <smus> bunniefoofoo: im doing this from C++
[00:06] <bunniefoofoo> smus, in that case you can iterate over pix_fmts array on the codec
[00:06] <smus> bunniefoofoo: AV_PIX_FMT_RGBA is in that list, but it pukes for some reason. maybe missing some compile flag?
[00:07] <bunniefoofoo> smus, does it fail on avcodec_open2 ?
[00:07] <smus> bunniefoofoo: yes, exactly.
[00:08] <bunniefoofoo> smus, raise the loglevel to 99 and see if you get any more info
[00:08] <relaxed_> smus: which encoder?
[00:08] <smus> relaxed_: AV_CODEC_ID_MPEG2VIDEO
[00:09] <bunniefoofoo> uh...
[00:09] <relaxed_> smus: "ffmpeg -h encoder=mpeg2video" will list the supported pixel formats
[00:10] <smus> bunniefoofoo: how do i do that?
[00:11] <bunniefoofoo> av_set_log_level()
[00:12] <smus> yeah indeed "Supported pixel formats: yuv420p yuv422p"
[00:16] <smus> and libav doesn't provide rgba packed => yuv422p conversion utils?
[00:41] <odites> hi to everyone
[00:44] <durandal_1707> its provided by libswscale
[00:45] <odites> if i want only extract audio from a video, i just do "ffmpeg -i file.mp4 -vn -acodec copy out.m4a" ? I found this and I want to know if it's correct. thank you in advance
[00:46] <durandal_1707> yes, its correct, but maybe you will need to extract specific stream, in which case read:
[00:47] <odites> like m4a for mp4, and mp3 for flv?
[00:48] <durandal_1707> m4a, mp4, mp3 and flv are extensions
[00:50] <odites> maybe I understood what you say :) thank you
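An aside on the stream-selection point durandal_1707 raised: when a file carries several audio streams, the one to copy can be chosen with `-map`. A sketch (filenames and the stream index are illustrative; the commands are printed rather than executed):

```shell
# Copy the single audio stream out of an MP4 without re-encoding;
# AAC audio fits naturally in an .m4a container:
extract="ffmpeg -i file.mp4 -vn -acodec copy out.m4a"

# With several audio streams, pick one explicitly with -map.
# 0:a:1 means: first input, audio streams only, second one (0-based):
extract_second="ffmpeg -i file.mp4 -map 0:a:1 -vn -acodec copy out.m4a"

echo "$extract"
echo "$extract_second"
```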
[00:51] <vlex> hello
[00:53] Last message repeated 1 time(s).
[00:53] <relaxed_> ask your question
[00:55] Action: durandal_1707 +o
[01:03] <pyBlob> is there a way to output a png-sequence using ffmpeg?
[01:03] <durandal_1707> yes, there is.
[01:04] <pyBlob> so which format do I have to specify to get this sequence onto stdout?
[01:06] <durandal_1707> you do not need to specify format: ffmpeg -i input out%4d.png
[01:07] <pyBlob> durandal_1707: that works, but is there a way to get that on stdout for readout in other programs?
[01:07] <durandal_1707> you mean to pipe images?
[01:07] <pyBlob> yes
[01:09] <durandal_1707> ffmpeg -i input -vcodec png -f image2pipe pipe: | ....
[01:10] <smus_> durandal_1707: thanks. checking it out. trying to figure out the right way to encode RGBA into an AVFrame. does the RGBARGBA... uint8_t go into the "data" array?
[01:11] <pyBlob> durandal_1707: thanks, this finally worked =)
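For reference, the pipe command that worked in the exchange above, together with the reverse direction (feeding a PNG sequence into ffmpeg on stdin). The commands are printed rather than executed; "consumer" and "producer" are placeholders for whatever sits on the other end of the pipe:

```shell
# Emit a PNG sequence on stdout (pipe:1 = stdout; bare "pipe:" also
# defaults to stdout for an output):
to_pipe='ffmpeg -i input -vcodec png -f image2pipe pipe:1 | consumer'

# Read a PNG sequence from stdin (pipe:0 = stdin):
from_pipe='producer | ffmpeg -f image2pipe -vcodec png -i pipe:0 output.mkv'

echo "$to_pipe"
echo "$from_pipe"
```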
[01:11] <durandal_1707> smus_: that's for encoding from rawvideo
[01:12] <smus_> yep, im encoding from raw images.
[01:12] <durandal_1707> what format?
[01:13] <smus_> multiple RGBAs into an MPEG2. seems i have to go to YUV420P first
[01:13] <durandal_1707> you could use rawvideo encoder/decoder, but that is slower path
[01:15] <durandal_1707> actually nope, the only sane way to convert between pixel formats is using libswscale
[01:24] <durandal_1707> or using libavfilter and its format filter (which wraps to libswscale)
[01:36] <hackeron> I have a weird problem, if I do curl 'http://admin:123456@192.168.1.40/cgi-bin/cmd/system?GET_STREAM' > test.h264 -- the stream is read correctly and I can open it with vim and see boundaries and h264. If I however run ffmpeg -i 'http://admin:123456@192.168.1.40/cgi-bin/cmd/system?GET_STREAM' -codec:v copy out.avi -- I get [http @ 0x2732dc0] HTTP error 401 Unauthorized -- why is that? - do I need to pass username/password in some other way ...
[01:36] <hackeron> ... with ffmpeg?
[01:48] <hackeron> I think the camera expects HTTP/1.0 and not 1.1 - can I tell ffprobe/ffmpeg to use HTTP/1.0?
[01:51] <burek> curl ... | ffmpeg -i - ... ?
[01:52] <hackeron> burek: yeh, just tried that but it doesn't recognise the content type.
[01:52] <hackeron> burek: just says: pipe:: Invalid data found when processing input
[01:53] <burek> curl ... | ffmpeg -f SPECIFY_CORRECT_FORMAT -i - ... ?
[01:53] <hackeron> burek: what's the right format for Content-Type: multipart/x-mixed-replace; boundary=GetStreamBoundary
[01:54] <burek> format = container = what are your audio/video streams wrapped into?
[01:54] <hackeron> no audio, just h264
[01:54] <hackeron> raw h264 separated by --GetStreamBoundary
[01:57] <burek> "I can open it with vim" :) seriously?
[01:57] <burek> what kind of media file is that?
[01:59] <hackeron> burek: http://pastie.org/8012428
[01:59] <hackeron> where it says RAW-H264-HERE is binary data with raw H264
[01:59] <hackeron> there's no container
[02:00] <hackeron> it's like MJPEG
[02:00] <burek> try with -f h264
[02:00] <hackeron> except instead of JPEG images, it is H264
[02:00] <hackeron> -f h264 says the same pipe:: Invalid data found when processing input
[02:00] <burek> well, i cant help more..
[02:00] <hackeron> :(
[02:00] <burek> what kind of camera is that anyway..
[02:01] <hackeron> acti E32
[02:01] <hackeron> there's RTSP too, but ffmpeg is not sending the required OPTION keep alive, so it stops streaming after 30 seconds :(
[02:02] <burek> what version of ffmpeg are you using
[02:02] <hackeron> burek: today's master
[02:04] <burek> http://ffmpeg.org/pipermail/ffmpeg-devel/2012-February/121386.html
[02:06] <hackeron> yeh, I saw that but the problem is all these IP cameras report that they support GET_PARAMETER so FFMPEG switches to that instead of using OPTION :(
[02:06] <hackeron> reading the camera with ffmpeg through live555ProxyServer gets around the keepalive, but for some reason the video keeps breaking then :(
[02:07] <burek> could you file a bug report for that issue perhaps?
[02:07] <hackeron> yeh, I did to the live555 mailing list :)
[02:07] <hackeron> need to get a tcpdump of the traffic with and without live555 to figure out exactly what is happening, then will file an ffmpeg bug too
[02:08] <hackeron> this is what I see with/without live555 (this is on another camera, one that ffmpeg does work with): http://www.youtube.com/watch?v=ShscnaNvNgw
[02:10] <hackeron> ok, figured out why ffmpeg can't open the http stream
[02:11] <hackeron> it sends Authorization: Basic YWRtaW46MTIzNDU2 at the very end instead of straight after the GET request
[02:12] <hackeron> oh no wait, that's not it, it's because it sends a Connection: Keep-Alive
[02:12] <hackeron> http://pastie.org/8012456
[02:13] <hackeron> how do I get ffmpeg not to send the Connection: Keep-Alive when opening an HTTP stream?
[02:19] <durandal_1707> hackeron: it's not sent by default
[02:19] <hackeron> durandal_1707: it is: https://github.com/FFmpeg/FFmpeg/blob/master/libavformat/http.c
[02:19] <hackeron> durandal_1707: unless I'm misunderstanding something?
[02:19] <durandal_1707> i look at same code, and its not
[02:19] <hackeron> http://pastie.org/8012474
[02:21] <durandal_1707> its sent only if multiple_request is not 0
[02:21] <hackeron> hmm, maybe something else is sent that the camera doesn't like. Is there any way to output what is sent?
[02:22] <durandal_1707> user-agent
[02:24] <hackeron> durandal_1707: I tried this: http://pastie.org/8012489 - and it streams
[02:24] <hackeron> but ffmpeg -i 'http://admin:123456@192.168.1.40/cgi-bin/cmd/system?GET_STREAM' shows 401 unauthorized, hmmm
[02:24] <hackeron> what could be different?
[02:27] <hackeron> hang on a second
[02:27] <hackeron> ffmpeg doesn't send the authorization at all :/
[02:27] <hackeron> doesn't show up in tcpdump
[02:28] <hackeron> I see this when I run ffmpeg -i 'http://admin:123456@192.168.1.40/cgi-bin/cmd/system?GET_STREAM' - http://pastie.org/8012500 -- any ideas?
[02:28] <durandal_1707> i first need to know what server needs
[02:30] <hackeron> durandal_1707: http://pastie.org/8012507
[02:34] <durandal_1707> hackeron: have you read documentation?
[02:34] <hackeron> durandal_1707: which documentation?
[02:35] <durandal_1707> http://ffmpeg.org/ffmpeg-protocols.html
[02:36] <hackeron> hmm? - but it works perfectly well with identical headers, except ffmpeg isn't generating the authorization line. Isn't it meant to? < http://pastie.org/8012523
[02:37] <hackeron> it says for ftp you can use ftp://[user[:password]@ - for rtsp it doesn't say but rtsp://[user[:password]@ works. Shouldn't it work for http?
[03:21] <durandal_1707> hackeron: i dunno, probably does not, as you experience problems, use options mentioned in documentation
[04:41] <smus> am i right that an AVFrame encoded with AV_PIX_FMT_RGBA should have a data[] array of size width*height*4 ?
[05:10] <schnarr> Hello
[05:16] <schnarr> Could anyone help me with this issue? http://ffmpeg.gusari.org/viewtopic.php?f=16&t=948
[07:25] <t4nk742> Originally, I use ffmpeg + ffserver to output http stream. the stream in ffserver.conf is : <Stream test.asf> Feed feed1.ffm Format asf AVOptionVideo flags +global_header VideoSize 1280x720 VideoFrameRate 30 VideoCodec libx264 NoAudio </Stream> the command line of ffmpeg is : ffmpeg -loglevel debug -f video4linux2 -r 30 -s 1280x720 -input_format h264 -i /dev/video1 -vcodec copy http://localhost:8090/feed1.ffm
[07:25] <t4nk742> how to modify ffserver.conf for rtsp/rtp stream? and how to modify ffmpeg command line for output rtsp/rtp stream?
[07:27] <yajiv> hello folks... so i have a pipeline where i capture raw data from the fbdev, encode it to h264 (zero latency settings) in an flv -> send it out as an rtsp stream and play it back on ffplay. Even on lan i'm noticing a 200-300ms latency.
[07:28] <yajiv> I would like to try to minimize this latency even further, and i was hoping to figure out where the latency hog was. Using -benchmark_all doesn't give me any actionable information
[07:28] <t4nk742> does anyone know how to output rtsp/rtp stream with ffmpeg+ffserver?
[07:29] <yajiv> t4nk742: You can potentially do it directly from ffmpeg as well.
[07:29] <yajiv> if you have a listener
[07:29] <yajiv> so i was considering perhaps looking at the frames that were encoded and trying to see at what time it arrives at ffplay. So my questions are 2 fold 1. Anybody know where I can plug into ffplay to print out frame numbers ?
[07:30] <yajiv> 2. If anyone has a better method to measure encode/mux/transport/decode/playback latency I'm all ears :-)
[07:33] <t4nk742> hi yajiv, do you have the example?
[07:33] <llogan> t4nk742: http://ffmpeg.org/ffmpeg-protocols.html#rtsp
[07:34] <t4nk742> in ffserver.conf, I can see the stream for rtsp
[07:34] <t4nk742> does anyone know how to create a stream in ffserver.conf
[07:38] <yajiv> if you were streaming directly from ffmpeg: -f rtsp -rtsp_transport udp rtsp://192.168.1.112:1234/live.sdp
[07:38] <yajiv> for the output portion
[07:40] <t4nk742> and if I use vlc to get the stream, do you know how to?
[07:42] <t4nk742> In ffserver.conf , I can see the stream for rtsp :
[07:43] <t4nk742> <Stream test1-rtsp.mpg> Format rtp File "/usr/local/httpd/htdocs/test1.mpg" </Stream>
[07:43] <t4nk742> does anyone know how to use feed1.ffm for ffserver to output rtsp stream?
[07:46] <yajiv> have you looked at sample ffserver.conf
[07:47] <t4nk742> yes, but I do not have idea how to modify the stream for my purpose
[07:48] <t4nk742> if the video data is capture by ffmpeg and the network stream is output by ffserver
[07:48] <t4nk742> then the command line for ffmpeg and conf file for ffserver
[07:48] <t4nk742> how to get correct setting for my purpose?
[16:55] <luc4> Hi! I created an mp4 from an h264 stream using libavformat. The resulting mp4 seems to be ok in most players, but for some strange reason totem cannot read it and android cannot create thumbnails for it. ffmpegthumbnailer instead reports: "Not a native file, thumbnailing will likely fail". Any idea how I can find out if something is wrong with it?
[19:22] <diverdude> I have a std::queue< std::vector<uint8_t> >, each vector represents a monochrome image. How can ffmpeg transform these std::vector into a stream which can be read by some mpeg4 reader or webm reader or similar?
[19:28] <diverdude> anyone?
[19:30] <llogan> diverdude: you need to wait a while and if someone has an answer they will help, otherwise try the appropriate ffmpeg mailing list.
[19:35] <saste> diverdude, ffmpeg the program or ffmpeg the API?
[19:43] <diverdude> saste, API
[19:44] <saste> diverdude, basically you need to encode and mux video frames, so it is doc/examples/muxing.c
[19:53] <diverdude> saste, is libav the same as ffmpeg?
[19:53] <saste> diverdude, !fork
[19:59] <diverdude> hehe ok....there was a fight
[20:03] <llogan> that name was in use before it was usurped by the fork
[20:26] <ulatekh> Pretty entertaining read about the ffmpeg/libav fight...I had never heard about any of this
[20:48] <diverdude> saste, hmm looking at doc/examples/muxing.c doesn't make me much wiser. they do avformat_alloc_output_context2(&oc, NULL, NULL, filename); but i think their source is already a video stream coming from a file
[21:31] <alisonken1home> anyone know how to use ffmpeg to convert from mp4 to ogv without dropping frames and making a jerky video?
[21:31] <alisonken1home> ffmpeg -i world_hunger_left.mp4 -acodec vorbis -vcodec libtheora -strict experimental world_hunger_left.ogv
[21:31] <alisonken1home> that's the command I used
[21:32] <alisonken1home> Encoder did not produce proper pts, making some up.
[21:32] <ulatekh> Did the debug spew indicate that the input specs (framerate, size, bitrate, etc.) were the same as the output specs? Also, does your target codec support all the features of your source codec?
[21:32] <alisonken1home> I get this after transcoding
[21:34] <alisonken1home> http://pastebin.com/E4kh9iiP <-- full output
[21:34] <ulatekh> Looking
[21:35] <ulatekh> "Video: theora, yuv420p, 1280x720, q=2-31, 200 kb/s, 29.97 tbn, 29.97 tbc" 200 kb/s is a really low bitrate....specify something higher
[21:39] <alisonken1home> ulatekh: what would be the option for that?
[21:39] <alisonken1home> (new to transcoding)
[21:41] <ulatekh> I think it's -b:v 2700k (to match your 2669k input)...let me go check
[21:42] <ulatekh> Yeah, that'd be it.
[21:43] <alisonken1home> on output stream?
[21:43] <ulatekh> It'd only be interpreted as something that affects the output stream, so you're safe.
[21:45] <alisonken1home> trying it now
[21:46] <alisonken1home> hmm - same issue
[21:47] <alisonken1home> http://pastebin.com/6b54nBSx
[21:48] <alisonken1home> ogv output file is only 3M still as well
[21:49] <ulatekh> Yeah, that's odd...the bitrate seems to have taken effect as far as the debug-spew is concerned, but then it wasn't used....odd....I'll keep looking
[21:51] <alisonken1home> ulatekh: http://www.namb.net/more_than_a_meal/ <-- this is the video I'm trying to convert (bottom - MPEG-4 or MPEG-1; either one is having the same issue)
[21:51] <ulatekh> FYI, the 2nd pastebin is for a different video.
[21:52] <alisonken1home> ack - sorry, but trying the same with both but the one I need for this weekend is the second paste
[21:52] <ulatekh> NP...it was just something I noticed :-)
[21:52] <ulatekh> So which distribution is still using ffmpeg 0.11.2? I'm on Fedora 18 and we have 1.0.7
[21:52] <alisonken1home> I tried the .mov file to see if I got the same results
[21:53] <alisonken1home> slackware 14.0
[21:54] <ulatekh> Ah, good old Slackware...my first distribution, way back in 1994....it came on floppies :-)
[21:54] <alisonken1home> yeah - http://slackware.dreamhost.com/slackware/ <-- all the slackware versions I've run
[21:54] <ulatekh> OK, I'm on that web page. Which video should I download? i.e. MPEG-4, MPEG-1, MPEG-4 HD, MPEG-1 HD
[21:55] <alisonken1home> mpeg-4 or mpeg-1
[21:55] <alisonken1home> don't think I need the HD versions for a projector
[21:55] <ulatekh> Ha, righteous...I switched from slackware to Red Hat as soon as I heard about rpm.
[21:56] <ulatekh> I'm going to try transcoding it here & see what happens.
[21:59] <ulatekh> So far, I seem to be getting better results...at least the bitrate isn't 200.
[22:01] <alisonken1home> the problem is gstreamer is used by libreoffice and on this machine it has issues with files other than ogv
[22:02] <alisonken1home> ah - ffmpeg is a slackbuild - not part of stock slackware
[22:03] <ulatekh> ffmpeg lets you build it with its own suffix, so you can have your stock version, and a separate version that doesn't interfere. I do that on my machine.
[22:05] <ulatekh> OK, my transcode isn't jerky, but it looks terrible...looks low bitrate. Let me try a few things.
[22:07] <nullie> how can I tell ffplay to treat file as annex b h264?
[22:08] <ulatekh> OK, one improvement...instead of "-sameq", I used "-dc 10 -qmin 1 -lmin '1*QP2LAMBDA'" and put that after the "-b:v" setting.
[22:08] <ulatekh> It doesn't look as good as the h264, but I can increase the bitrate, and my next step will be to do 2-pass transcoding.
[22:08] <ulatekh> But it's not jerky.
[22:09] <ulatekh> nullie: Doesn't ffplay auto-detect your input format?
[22:10] <nullie> ulatekh, I'd like to tell it beforehand, because it's a stream actually
[22:10] <nullie> no buffering would be good too
[22:10] <ulatekh> Ah. Haven't done much of that with ffplay, sorry. Hopefully someone more knowledgeable about that will pipe in.
[22:11] <ulatekh> alisonken1home, you lucked out that I've done a lot of transcoding :-)
[22:11] <alisonken1home> that's a big help - thanks
[22:11] <ulatekh> So far, the substitution I gave you for "-sameq" is having a good effect. Did you try that locally?
[22:12] <alisonken1home> yes - the video was the same low quality, and it also dropped the audio quality
[22:13] <alisonken1home> the fun part is both xine and mplayer play them just fine
[22:13] <ulatekh> Oh...what should I be using to play the result, then? I was using mplayer.
[22:14] <LithosLaptop> converting to theora?
[22:14] <alisonken1home> I'm running kde so I use dragonplayer - whatever one uses the gstreamer plugin (unless it's due to mplayer using gstreamer differently)
[22:15] <alisonken1home> ulatekh: hmm - someone suggested using -q:v 10 and it seems to work fine now
[22:15] <alisonken1home> other than file size jumped from 27M to 40M
[22:15] <LithosLaptop> the built-in vorbis encoder is broken. use libvorbis instead
[22:15] <alisonken1home> 48M
[22:16] <ulatekh> -q 10? That's pretty low...I like forcing my q as low as possible. That's what my -sameq substitution did.
[22:16] <alisonken1home> well, seems to work now
[22:16] <alisonken1home> at least dragonplayer works with the new one - time to try it in lo
[22:16] <ulatekh> Sweet, glad you got what you wanted. And apparently Fedora has a "dragon" media player, I'd never heard of it until now.
[22:17] <ulatekh> Since you're here, though, and doing transcoding, let me share my other main trick: 2-pass transcoding.
[22:18] <ulatekh> Run the transcode twice, once with "-pass 1 -passlogfile passlog$$" and once with "-pass 2 -passlogfile passlog$$" -- that'll let it calculate, in the first pass, where the bits are most needed, and in the 2nd pass, will use that information to increase the quality where it's needed.
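The two passes described above can be sketched as follows (filenames, passlog name, and bitrate are illustrative; the commands are printed rather than executed):

```shell
# Pass 1 analyses the input and writes per-frame stats to the passlog:
pass1='ffmpeg -i input.mp4 -b:v 2700k -vcodec libtheora -acodec libvorbis -pass 1 -passlogfile passlog out.ogv'

# Pass 2 re-reads those stats and spends bits where pass 1 found
# they are most needed:
pass2='ffmpeg -i input.mp4 -b:v 2700k -vcodec libtheora -acodec libvorbis -pass 2 -passlogfile passlog out.ogv'

echo "$pass1"
echo "$pass2"
```

Note that everything except the `-pass` number stays identical between the two invocations, so the stats from pass 1 remain valid for pass 2.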
[22:24] <LithosLaptop> alisonken1home: did you get the '-acodec vorbis -strict experimental' from a guide or does your ffmpeg not support libvorbis?
[22:24] <alisonken1home> hmm
[22:24] <alisonken1home> LithosLaptop: from a guide
[22:25] <LithosLaptop> yeah thats wrong :)
[22:25] <LithosLaptop> just use -acodec libvorbis
[22:26] <LithosLaptop> and specify bitrate for the audio with -b:a
[22:26] <alisonken1home> ffmpeg -i world_hunger_left.mp4 -b:v 2700k -acodec vorbis -vcodec libtheora -strict experimental -q:v 8 world_hunger_left-2.ogv
[22:26] <alisonken1home> that's the command I used
[22:26] <ulatekh> LithosLaptop: I think alisonken1home was having more trouble with jerky video than with audio...still, maybe the video problems distracted from any audio issues :-)
[22:26] <alisonken1home> the -b was from ulatekh
[22:26] <alisonken1home> the -q:v 8 was from someone else in another channel
[22:27] <LithosLaptop> ulatekh: ah ok
[22:27] <alisonken1home> yeah - the video _really_ sucked
[22:27] <LithosLaptop> the audio is also going to suck :)
[22:27] <alisonken1home> the audio seems to be fine now
[22:27] <alisonken1home> as well
[22:28] <LithosLaptop> but yeah focus on the video first
[22:28] <ulatekh> I still recommend the -qmin thing and the -pass {1,2} thing. I'm running them locally now.
[22:28] <ulatekh> But is the transcoding working better on your end now, at least?
[22:28] <alisonken1home> so far seems to be doing well
[22:29] <alisonken1home> have one more check - then see how it does on the laptop
[22:29] <ulatekh> OK, so you haven't tested playback yet?
[22:29] <alisonken1home> tested in dragonplayer - working on embedded video in LO now
[22:33] <ulatekh> Wow, the 2-pass-transcoded version looks like the original...so it comes down to your player. Let us know how it does with your latest transcode.
[22:35] <alisonken1home> ulatekh: first I'm getting a working LO presentation with what I have - then I'll try the 2-pass thing and see how it goes
[22:35] <ulatekh> Sensible enough.
[22:35] <alisonken1home> ok - to do the 2 pass, what would be the full command?
[22:37] <ulatekh> Same as before, just add the line I gave you, once with "-pass 1", once with "-pass 2".
[22:38] <alisonken1home> ok - general options so put them at the beginning
[22:39] <ulatekh> I tend to put them after my other video options...so to be safe, put the -qmin thing and the -pass thing after "-b:v", IMO.
[22:39] <alisonken1home> ok
[22:45] <alisonken1home> ok - missed the -qmin option
[22:47] <alisonken1home> ulatekh: ffmpeg -i world_hunger_left.mp4 -b:v 2700k -pass 1 -passlogfile world_hunger-left.log -acodec vorbis -vcodec libtheora -strict experimental -q:v 7 world_hunger_left-2.ogv
[22:47] <alisonken1home> Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height
[22:48] <ulatekh> Take the whole "-b:v 2700k -pass 1 -passlogfile world_hunger-left.log" part and put it after the "-q:v 7" part.
[22:49] <alisonken1home> same thing
[22:50] <ulatekh> Here was my command line: "ffmpeg -i world_hunger_2011-09_hd.mp4 -b:v 5400k -dc 10 -qmin 1 -lmin '1*QP2LAMBDA' -pass 1 -passlogfile world_hunger_passlog.txt -threads 2 -acodec libvorbis -vcodec libtheora -strict experimental world_hunger_2011-09_hd.ogv"
[22:50] <ulatekh> Followed by "-pass 2"
[22:52] <alisonken1home> ok - missed the -dc 10 and -qmin 1
[22:52] <alisonken1home> working now
[22:52] <alisonken1home> thanks
[22:53] <ulatekh> ffmpeg tends to have unreasonable built-in limits on q, like 2. I go through all the trouble to make my videos look perfect, and then ffmpeg wants to trash them. I found that "-qmin 1 -lmin '1*QP2LAMBDA'" 2-step a long time ago and have been using it ever since.
[23:03] <alisonken1home> the only problem so far is the filesize mp4=15.6M and ogv=50.4M
[23:03] <alisonken1home> other than that works fine
[23:04] <alisonken1home> ulatekh: thanks for all the help
[23:05] <ulatekh> You're welcome...and yeah, codecs vary widely with respect to efficiency.
[00:00] --- Fri Jun 7 2013