[Ffmpeg-devel-irc] ffmpeg.log.20141113

burek burek021 at gmail.com
Fri Nov 14 02:05:01 CET 2014


[03:28] <Mista-D> Can't repackage MP4 with MPEG4 to a TS. `./ffmpeg -i source1.mp4 -c:copy test.ts` -- executes without warnings or errors, while the resulting TS file's video track is unreadable. http://pastebin.com/yTChS3QV
[03:48] <relaxed> Mista_D: try -c copy -mpegts_m2ts_mode 1
[03:53] <relaxed> nevermind, m2ts doesn't support mpeg4
[04:04] <relaxed> Mista_D: I think you should file a bug report
[04:20] <llogan> Mista_D: why did you remove the ffmpeg version from your console output?
[04:21] <llogan> ah, i see it in the additional paste below
[04:22] <Mista_D> llogan: it's the latest release I think...
[04:23] <Mista_D> relaxed: will file a bug with a file sample shortly, thanks
[04:31] <relaxed> Mista_D: I was able to reproduce it
[04:35] <Mista_D> relaxed: with another mpeg4 file?
[04:37] <relaxed> yes
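[Editor's note: a minimal sketch for checking a remux like the one above, reusing the file names from the log and assuming only that ffprobe is installed alongside ffmpeg; note the stream-copy option is written "-c copy", not "-c:copy".]
    # remux every stream without re-encoding
    ffmpeg -i source1.mp4 -c copy test.ts
    # inspect what actually ended up in the TS container
    ffprobe -hide_banner test.ts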
[07:52] <aixenv> hey guys i just did a source compile of a new setup, and im trying to do a 2 pass on a video and getting the following
[07:53] <aixenv> http://pastebin.com/QJi161Zt
[07:53] <aixenv>  Could not find codec parameters for stream 0 (Video: h264 (avc1 / 0x31637661), none, 1920x1080, 20998 kb/s): unspecified pixel format
[07:54] <aixenv> im wondering if this movie got corrupt possibly on xfer off the cell phone
[07:55] <relaxed> aixenv: yeah, it looks like a decoding issue.
[07:55] <aixenv> any way to verify if the file is corrupt?
[07:55] <aixenv> if i do a "file" i get
[07:56] <aixenv> shells_cell_5572.mp4: ISO Media, MPEG v4 system, version 1
[07:56] <aixenv> so im a bit perplexed how it could be corrupt yet still look like an mp4, but again im not sure, this just seems weird
[07:57] <relaxed> aixenv: can you play it?
[07:57] <aixenv> no i was not able to play it via windows, (which is why i was going to try and see if i could make it happy via linux/ffmpeg)
[07:57] <aixenv> i tried in like 3 dif players too
[07:58] <aixenv> vlc player, wondershare player, and windows media player
[07:58] <relaxed> that's a pretty good sign it is.
[07:58] <aixenv> crap, its kinda important for her work, any way i could fix this?
[07:59] <relaxed> grab it off the phone again
[07:59] <aixenv> she xferred and nuked off phone :(
[07:59] <aixenv> since that's the default
[07:59] <aixenv> lame default btw
[08:00] <aixenv> oh well, ill just tell her the files are hosed, that stinks, one other question
[08:00] <aixenv> i just upgraded (src compiles and what not) all my ffmpeg stuff, everything looks to be working and happy
[08:00] <aixenv> i have a home project where i encode my videocam videos, and put them on a streaming site i made (just home use)
[08:00] <relaxed> hmm, try ffmpeg -pix_fmt yuv420p -i shells_cell_5572.mp4 -c copy output.mp4
[08:01] <aixenv> i havent updated/tweaked my ffmpeg command in about 3-4yrs, anything you'd advise me do dif than this?
[08:01] <aixenv> ok ill try that too , here's the command im using currently
[08:01] <relaxed> I saw it in the pastebin
[08:01] <aixenv> ffmpeg -y -i "$file" -vcodec libx264 -r 30000/1001 -deinterlace -s 1024x576 -crf 21 -maxrate 2M -bufsize 6M -vpre slow -threads 0 -acodec libfaac -ar 48000 -ab 128k ${file%.*}-1.mp4 >> "${file%}"-encoding.out 2>&1
[08:01] <aixenv> dif command
[08:01] <aixenv> that was me just trying to get this video working
[08:01] <aixenv> thats my "normal" encoder script command
[08:02] <relaxed> yeah, -vpre is outdated. read https://trac.ffmpeg.org/wiki/Encode/H.264
[08:02] <aixenv> go with preset now?
[08:02] <relaxed> yes
[08:03] <aixenv> ok thanks, anything else youd change ?
[08:03] <aixenv> btw that command gave "Option pixel format not found"
[08:03] <relaxed> -pixel_format yuv420p
[08:04] <aixenv> same
[08:06] <aixenv> it's ok, if it's hosed it's hosed; would that pastebin command give a better quality/size ratio than my above command? if so ill just switch to that 2 pass command, i notice that uses libfdk_aac versus libfaac too
[08:06] <aixenv> and i grabbed the 2 pass example from that link you had given me :)
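[Editor's note: a hedged, modernized single-pass version of the command quoted above, following the linked H.264 wiki; replacing -deinterlace with yadif and libfaac with libfdk_aac are assumptions about what the poster wants, not advice given in the log.]
    ffmpeg -y -i "$file" -vf "yadif,scale=1024:576" -r 30000/1001 \
        -c:v libx264 -preset slow -crf 21 -maxrate 2M -bufsize 6M \
        -c:a libfdk_aac -ar 48000 -b:a 128k "${file%.*}-1.mp4"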
[10:32] <danomite> is ffserver still actively developed?
[10:34] <relaxed> I don't think it's been actively developed for a long time.
[10:39] <danomite> Is there a recommendation for an open source alternative?
[10:56] <Pkunk> Are there any examples of using the fps filter in C ? I'm trying it but for some reason the output from buffersink returns duplicate frames, even though the source is 50 fps and the target is 25 fps
[12:38] <Pkunk> Why does using the fps filter with avfilter_graph_parse_ptr give duplicate frames ? I get the same frame repeated for 25 seconds instead of 25 frames every 1 second
[12:42] <c_14> Try inverting the number?
[12:45] <Pkunk> c_14: I'm doing fps=fps=25/1 .. I thought that's right according to the docs
[12:45] <c_14> In the api?
[12:45] <c_14> I'm pretty sure it's inverted in the API iirc.
[12:46] <Pkunk> Using the C API .. Thanks for the tip , lemme try it your way
[12:49] <Pkunk> That slows down the fps to 1 frame every 25 seconds .. Doesn't fix the issue I've been experiencing where the fps filter throws out duplicate frames when it's supposed to drop them
[12:55] <c_14> I have no clue. Never really used the api.
[13:01] <galex-713> Hello
[13:01] <Pkunk> c_14: Thanks anyway, thats one thing that I didn't try
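[Editor's note: the C API issue is not resolved in the log; one low-effort sanity check is to run the exact same filtergraph string through the command-line tool, since it is the syntax avfilter_graph_parse_ptr expects. A sketch, assuming a 50 fps test file named input.mkv.]
    # if this drops frames correctly, the problem is in the API plumbing, not in the graph string
    ffmpeg -i input.mkv -vf "fps=fps=25" -f null -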
[13:02] <galex-713> Two things: 1) I forgot the link of the ffmpeg binary... where is it? ^^" 2) how to shrink some audio stream by 24/23 ?
[13:03] <c_14> http://johnvansickle.com/ffmpeg/ <- this one?
[13:04] <galex-713> Yes thanks ^^
[13:04] <c_14> And what do you mean shrink by 24/23?
[13:06] <galex-713> Err, 25/23*
[13:07] <galex-713> c_14: I got a movie at 25fps which is ~\frac{25}{23} times bigger than the same movie in French at ~23fps
[13:07] <galex-713> s/at/with/
[13:07] <galex-713> s/with/of/
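[Editor's note: the question is never answered in the log. One common way to retime an audio track between a ~23.976 fps and a 25 fps cut of the same film is the atempo filter; the file names and the exact factor (25/23.976, roughly 1.0427) are assumptions here, a sketch rather than a recipe.]
    # speed the audio up by about 4.3% so it lines up with the 25 fps video
    ffmpeg -i french_audio.flac -filter:a "atempo=1.0427" retimed_audio.flac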
[13:08] <xanal0verlordx> Found monitor in pavucontrol.
[13:08] <xanal0verlordx> Will try.
[13:17] <xanal0verlordx> Yeah.
[13:17] <xanal0verlordx> I made that.
[13:18] <ribasushi> relaxed: ok, I got to the bottom of the issue (but don't yet know how to solve it)
[13:18] <ribasushi> http://i.imgur.com/Z9nIk4T.png <--- normal color
[13:18] <ribasushi> wget -qO- http://i.imgur.com/Z9nIk4T.png | ffplay -hide_banner -f image2pipe -i /dev/stdin <--- color loss
[13:18] <ribasushi> this is the basic effect I am trying to avoid, and not sure how :(
[13:20] <ribasushi> lots of my google hits suggest that this has to do with crappy rgb->yuv conversion, but even things like h264rgb do not help even though they should
[13:37] <xanal0verlordx> ffmpeg -f x11grab -framerate 25 -video_size `xrandr | grep '*' | cut -d' ' -f4` -i $DISPLAY -f pulse -i `pactl list sources | grep 'Name:' | grep 'monitor' | cut -d' ' -f2`
[13:37] <xanal0verlordx> c_14: is it right?
[13:37] <xanal0verlordx> Without output yet.
[13:45] <xanal0verlordx> Whoops, micro and playback does not mix.
[13:54] <xanal0verlordx> pavucontrol says that it sees 2 streams.
[13:54] <xanal0verlordx> ffmpeg too.
[13:54] <xanal0verlordx> Stream 0 from pulse default and stream 1 from pulse monitor.
[13:55] <xanal0verlordx> I hear my voice with this variant.
[13:55] <xanal0verlordx> If I put monitor at first, it records only the music.
[13:58] <relaxed> xrandr | awk '/*/{print $1}'
[13:59] <xanal0verlordx> relaxed: how much faster will it be?
[14:01] <relaxed> you won't notice a difference, but it looks more elegant
[14:02] <xanal0verlordx> Okay.
[14:04] <ribasushi> the more I read, the more I am convinced this is the result of a shit rgb->yuv conversion, and what's more irritating is that the yuv gamut is *wider* than rgb, so logically the trip ought to be lossless
[14:05] <ribasushi> aside from that I wonder how come this isn't a FAQ with a good answer  :(
[14:05] <relaxed> ribasushi: I downloaded the png- which command shows the issue?
[14:06] <ribasushi> relaxed: just ffplay-ing it through image2pipe
[14:06] <ribasushi> wget -qO- http://i.imgur.com/Z9nIk4T.png | ffplay -hide_banner -f image2pipe -i /dev/stdin
[14:06] <ribasushi> or ffplay -hide_banner -f image2pipe -i <png>
[14:07] <relaxed> so the png's color is already off?
[14:07] <ribasushi> it isn't
[14:07] <ribasushi> look at the png in a browser, or gimp
[14:07] <ribasushi> compare to what ffplay shows you
[14:09] <xanal0verlordx> relaxed: how can I tell awk that I need both entries?
[14:09] <xanal0verlordx> Name: and monitor.
[14:09] <xanal0verlordx> I know only or, |.
[14:09] <xanal0verlordx> & and && do not work.
[14:12] <xanal0verlordx> Found.
[14:12] <xanal0verlordx>  /Name/,/monitor/
[14:12] <xanal0verlordx> No?
[14:12] <xanal0verlordx> Ergh.
[14:12] <relaxed> ribasushi: ffmpeg -i Z9nIk4T.png ffmpeg_encoded.png; feh -d Z9nIk4T.png ffmpeg_encoded.png
[14:13] <relaxed> ribasushi: there's no difference ^^
[14:15] <relaxed> xanal0verlordx: I don't have `pactl` so I can't help. If what you have works, use it.
[14:15] <xanal0verlordx> But it's an awk-related question.
[14:16] <xanal0verlordx> As you said, it looks better. And it really looks better, so I want to change my second command to smth like this.
[14:20] <relaxed> sounds like a good exercise to learn awk
[14:23] <xanal0verlordx> Found.
[14:23] <xanal0verlordx>  /smth/&&/smthelse/
[14:28] <relaxed> ribasushi: ffplay doesn't playback the file in rgb24, try mpv instead
[14:30] <ribasushi> relaxed: I understand and see that
[14:31] <ribasushi> relaxed: what I want is to produce videos that will get *common players* to display the correct color (this includes ffplay, because vlc uses the same codebase)
[14:31] <ribasushi> relaxed: and given that rgb is a subset of yuv (gamut-wise) I believe I should be able to do this somehow, just haven't found the magic yet
[14:34] <xanal0verlordx> I still have a problem.
[14:34] <xanal0verlordx> I hear only 1 stream on my video.
[14:34] <xanal0verlordx> Should I mix them both into one?
[14:35] <xanal0verlordx> Audio streams.
[14:36] <xanal0verlordx> 1 video from x11grab, 1 mono audio from micro, 1 stereo from playback.
[14:36] <relaxed> ribasushi: mpv uses ffmpeg's libs too. ffplay is far from a "common player"
[14:48] <waressearcher2> I have ffmpeg-2.3 and I run that command to grab screen: "ffmpeg -f x11grab -s 640x480 -framerate 25 -i :0.0 -vcodec libx264 -preset superfast -pix_fmt yuv420p -y -f mp4 /tmp/1.mp4"       and it gave me "Segmentation fault"
[14:48] <waressearcher2> what option should I use to get verbose output ? -v debug ?
[14:49] <ribasushi> relaxed: ok let's try it from a completely different angle
[14:49] <xanal0verlordx> waressearcher2: try to check where it fails.
[14:49] <ribasushi> relaxed: given the very same png, I am doing this to create a "movie" out of the image: ffmpeg -y -hide_banner -f image2 -loop 1 -i out.png -t 5 -f matroska -c:v libx264 -preset ultrafast -qp 0 5_second_still.mkv
[14:50] <ribasushi> the resulting file has lost color on all players (mpv, vlc, ffplay, mplayer), and has the same color loss when sent to youtube
[14:50] <waressearcher2> xanal0verlordx: I changed codec x264 to mpeg4 and it works, could be a problem with x264 library
[14:50] <ribasushi> relaxed: that is what I am ultimately trying to fix
[14:52] <ribasushi> relaxed: even worse - if I take the resulting video, and extract a frame from it - the color is *fine*: ffmpeg -hide_banner -y -i 5_second_still.mkv -f image2 -vframes 1 out_frame.png
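[Editor's note: the thread never reaches a fix. The symptom described (the frame round-trips back to a correct PNG, but players show shifted colour) is commonly blamed on an untagged or mismatched YUV colour matrix rather than on gamut loss; below is a hedged sketch of tagging the conversion explicitly, with every added flag an assumption rather than something confirmed in the log.]
    ffmpeg -y -hide_banner -loop 1 -i out.png -t 5 \
        -vf "scale=out_color_matrix=bt709,format=yuv420p" \
        -color_primaries bt709 -color_trc bt709 -colorspace bt709 \
        -c:v libx264 -preset ultrafast -qp 0 -f matroska 5_second_still_tagged.mkv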
[14:54] <xanal0verlordx> What's the difference between -filter and -filter_complex?
[14:55] <xanal0verlordx> With first I can apply only 1 filter?
[14:56] <ribasushi> xanal0verlordx: yes
[14:56] <xanal0verlordx> And that is all the difference?
[14:57] <relaxed> -filter_complex is used for multiple inputs
[14:58] <xanal0verlordx> In order to use amix I should use filter_complex, right?
[15:10] <waressearcher2> how to grab sound also ?
[15:14] <blippyp> xanal0verlordx: save yourself the headache and just always use -filter_complex... ;)
[15:15] <waressearcher2> if I add options: "-f alsa -i pulse -vb 2000k -ab 128k -ar 44100 -ac 2 -acodec libmp3lame" it says "cannot open audio device pulse (No such file or directory)"
[15:15] <xanal0verlordx> So you do not have PulseAudio installed.
[15:15] <xanal0verlordx> Are you a retard?
[15:15] <xanal0verlordx> Use hw:0 instead.
[15:15] <xanal0verlordx> blippyp: I'm trying to understand it all.
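[Editor's note: a minimal sketch of the microphone-plus-monitor mix being discussed, assuming PulseAudio; the monitor source name is a placeholder that pactl or pavucontrol would reveal.]
    ffmpeg -f pulse -i default -f pulse -i MONITOR_SOURCE_NAME \
        -filter_complex "amix=inputs=2" -c:a libmp3lame -b:a 96k mixed.mp3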
[15:23] <waressearcher2> I can't use "-f alsa -i pulse", is it possible to grab audio without having "pulse" library ?
[15:24] <kepstin-laptop> waressearcher2: sure; try either the 'default' alsa device or one of the hardware devices.
[15:26] <waressearcher2> kepstin-laptop: I run that command: "ffmpeg -f x11grab -s 1024x768 -framerate 30 -i :0.0 -vcodec mpeg4 -vb 2000 -f alsa -i default -vb 2000k -ab 128k -ar 44100 -ac 2 -acodec libmp3lame -y -f avi /tmp/1.avi" and then started "wolfenstein 3D" and when I watched the video there was no audio from wolfenstein, so "-i default" isn't working, and "-i hw:0"
[15:26] <waressearcher2>  isn't working either
[15:26] <kepstin-laptop> oh, you want to capture the audio that an application is playing? That's harder.
[15:27] <waressearcher2> kepstin-laptop: yes, but what is it capturing otherwise ? from the microphone ?
[15:27] <xanal0verlordx> He is trying to do the same as me.
[15:27] <xanal0verlordx> Maybe.
[15:27] <waressearcher2> kepstin-laptop: harder but possible ?
[15:27] <kepstin-laptop> you'll either have to set up an alsa loopback (aloop) device or set up pulseaudio
[15:27] <waressearcher2> I want to capture some gameplay
[15:27] <xanal0verlordx> But I do not want to work with bare ALSA.
[15:27] <xanal0verlordx> PulseAudio is shit, but it works.
[15:28] <xanal0verlordx> So I am using PulseAudio.
[15:28] <xanal0verlordx> waressearcher2: you can use JACK.
[15:28] <kepstin-laptop> yeah, if the application is playing back audio into pulseaudio (either natively or via the alsa plugin), then you can record from the pulseaudio 'monitor' device to get the game audio
[15:30] <waressearcher2> any ideas how can I capture only one window ? I can use "-i 0:0+40,50" to capture a specific place on screen but if I move the window it will not be in the capture area anymore, so is there a way to capture a specific window ? I know there is an option "-i title=RecordWindow" for the windows version but is there a similar thing in linux ?
[15:31] <xanal0verlordx> waressearcher2: xwininfo.
[15:33] <kepstin-laptop> waressearcher2: no, nothing like that in the X display capture. For the moment, just don't move your window :/
[15:35] <xanal0verlordx> Yes, the only solution is just not moving the window.
[15:59] <waressearcher2> balls
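[Editor's note: a hedged sketch of the xwininfo approach suggested above: grab one window's current geometry and hand it to x11grab; the awk field handling is an assumption about xwininfo's output format, and the crop only stays correct while the window is not moved or resized.]
    # click the target window when xwininfo asks for it
    eval $(xwininfo | awk '/Absolute upper-left X/{x=$NF} /Absolute upper-left Y/{y=$NF} /Width:/{w=$NF} /Height:/{h=$NF} END{print "X="x" Y="y" W="w" H="h}')
    ffmpeg -f x11grab -framerate 25 -video_size ${W}x${H} -i :0.0+${X},${Y} \
        -c:v libx264 -preset ultrafast window.mkv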
[16:47] <santa> Hi there. We are developing a client for the AR Drone 2.0 and want to transcode and save the video stream with ffmpeg. If I use ffplay for video playback there is no problem. But with the transcoding I get a huge delay of ten seconds and more.
[16:48] <santa> This is what I tried so far:
[16:48] <santa> ffmpeg -re -i sourceFile -acodec copy -threads 8 -vcodec libx264 -b 160k -f mpegts udp://127.0.0.1:5000
[16:48] <santa> How can I speed this up?
[16:49] <santa> btw sourceFile is a tcp stream
[16:50] <iive> I suspect that x264 is buffering frames, for optimal average bitrate encoding. To rule it out, try -vcodec mjpeg -vqscale=4
[16:51] <iive> ops... vqscale 4
[16:51] <iive> i think.
[16:51] <xanal0verlordx> I made that.
[16:51] <xanal0verlordx> ffmpeg -f x11grab -framerate 25 -video_size `xrandr | awk '/*/ {print $1}'` -i $DISPLAY -f pulse -i `pactl list sources | awk '/Name:/ && /monitor/ {print $2}'` -f pulse -i default -filter_complex amix -f flv -c:v libx264 -pix_fmt yuv420p -g 50 -keyint_min 50 -b 600k -minrate 600k -maxrate 3500k -bufsize 1000k -crf 18 -preset ultrafast -c:a libmp3lame -ar 44100 -ac 2 -ab 96k test.flv
[16:52] <iive> santa: there is x264 option -tune zerolatency, that you might want to try.
[16:52] <xanal0verlordx> Are all the parameters right?
[16:52] <xanal0verlordx> Especially video encoding.
[16:52] <iive> also keep in mind that threaded encoding is usually frame based, so with 8 threads there would be 8 frames of lag.
[16:53] <santa> I already tried "zerolatency" but it doesn't work.
[16:55] <iive> well, try with a different codec, to rule out encoder lag. afaik mpegts is not going to buffer a lot of frames on its own.
[16:56] <iive> also, try without sound (-an)
[16:56] <xanal0verlordx> Why is my bitrate not constant?
[16:58] <santa> Well, we have to decode the video with h264. Is there any better way?
[17:04] <iive> santa: I asked you to test that, not run it in production.
[17:04] <iive> also... the lag might be caused by buffering in the video player
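[Editor's note: a hedged test command pulling together the suggestions above (drop audio, single thread, zerolatency tune) to see where the delay comes from; the tcp:// address is a placeholder for the drone stream, and -re is dropped because it only throttles file input.]
    ffmpeg -i tcp://192.168.1.1:5555 -an \
        -c:v libx264 -preset ultrafast -tune zerolatency -threads 1 -b:v 160k \
        -f mpegts udp://127.0.0.1:5000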
[17:07] <DelphiWorld> yo
[17:07] <DelphiWorld> is it possible to apply audio effects using FFMpeg?
[17:09] <blippyp> man ffmpeg-filters: the top half shows the audio filters you can apply
[17:10] <DelphiWorld> blippyp: i can't use man at all because it doesn't work well with tts
[17:12] <blippyp> https://www.ffmpeg.org/ffmpeg-filters.html
[17:12] <blippyp> it's the man page in html... ;)
[17:18] <DelphiWorld> blippyp: :P
[17:45] <waressearcher2> is there a way to generate two different files for video and audio ? instead of one video_audio.avi to get two video.avi and audio.mp3 ?
[17:52] <blippyp> waressearcher2: https://trac.ffmpeg.org/wiki/Creating%20multiple%20outputs
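[Editor's note: a minimal sketch of the split being asked about, based on the linked wiki page; copying the video stream and the 192k mp3 bitrate are assumptions, re-encode the video if the codec/container combination requires it.]
    ffmpeg -i video_audio.avi \
        -map 0:v -c:v copy video.avi \
        -map 0:a -c:a libmp3lame -b:a 192k audio.mp3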
[18:08] <koboNor> hi, is anyone able to tell me if it's possible to run ffmpeg on kobo arc without troubles? bcs i tried things like ffmpeg4android and it's crashing during h264 encoding, same with ffmpeg media encoder for android
[18:30] <blippyp> hey spidy
[18:30] <blippyp> oops
[20:17] <danomite-> Is ffserver being actively developed?
[20:18] <llogan> somewhat
[20:29] <danomite-> is ffserver still recommended for new projects?
[20:31] <ChocolateArmpits> danomite, last I heard - no
[20:32] <danomite-> Is there a recommendation  for new projects?
[20:53] <llogan> look at "git log ffserver.c" for development (or via the git browser at git.videolan.org)
[21:16] <danomite-> https://github.com/FFmpeg/FFmpeg/graphs/contributors
[21:16] <danomite-> There does not appear to be much activity
[21:19] <danomite-> I really just want to get a live flash stream going, working on a paste
[21:22] <BtbN> ffserver is basically dead, it's only left there because it works for some people. It was/is a candidate for removal.
[21:22] <BtbN> If you want an rtmp server, use nginx-rtmp
[21:29] <danomite-> BtbN, thanks for the straight answer
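[Editor's note: a hedged sketch of pushing a live stream to an nginx-rtmp server, since that is the suggestion above; the URL and stream key are placeholders, and the nginx rtmp module has to be configured separately.]
    ffmpeg -re -i input.mp4 -c:v libx264 -preset veryfast \
        -c:a aac -strict experimental -f flv rtmp://localhost/live/streamkey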
[22:05] <Sirisian|Work> I'm trying to run the command "ffmpeg -y -threads 1 -i kwt2.wmv -b:v 1200000 -b:a 128000 -acodec libvorbis -aq 100 -async 1 kwt2.mp4" to convert a wmv to mp4. I can convert any other format to mp4 using this method including most wmv files, but I have two wmv files that fail. There's no error message other than it saying "Killed" at the end even when using verbose logging. ffprobe for the file doesn't show anything useful other
[22:05] <Sirisian|Work>  than "Flip4Mac WMV Export Component for QuickTime (Mac)"  and "Stream #0:0(eng): Video: wmv3 (Main) (WMV3 / 0x33564D57), yuv420p, 1920x1080, 1233 kb/s, 25 fps, 25 tbr, 1k tbn, 1k tbc"
[22:05] <Sirisian|Work> It's interesting to note that even if I convert it successfully to ogv I can't then convert the ogv to mp4. I get the same "Killed" message.
[22:06] <waressearcher2> Sirisian|Work: you can use "-b:v 12000k"
[22:09] <Sirisian|Work> (Also this isn't related to audio. Any audio format produces the same issue.)
[22:10] <llogan> Sirisian|Work: can you provide an input sample file?
[22:10] <Sirisian|Work> One moment, I will.
[22:14] <Sirisian|Work> llogan, https://drive.google.com/file/d/0B1SdFF_bw3xManlpRkpyaDZfSk0/view?usp=sharing
[22:14] <Sirisian|Work> That should work ideally
[22:15] <Sirisian|Work> I'm thinking this might have something to do with the flip4mac. Both the files I can't convert were made with it.
[22:15] <waressearcher2> if I use option "-vcodec mpeg4" but add option "-vtag xvid" what will it change ? the codec stays the same "mpeg4"
[22:15] <waressearcher2> ?
[22:19] <llogan> Sirisian|Work: works for me. upgrade your ffmpeg.
[22:19] <Sirisian|Work> ooh good. Yeah mine is like 5 months old.
[22:19] <llogan> and vorbis in mp4?
[22:20] <llogan> do you need async?
[22:20] <DelphiWorld> llogan: wth? vorbis work with mp4?
[22:21] <Sirisian|Work> Seems fine. I use the same command to convert to ogv and mp4 for the web. Works on every device I've tested.
[22:22] <Sirisian|Work> or it ignores it and does its own thing.
[22:22] <llogan> ok, but you should refer to the specs
[22:22] <llogan> most people use AAC audio
[22:22] <Sirisian|Work> yeah faac. I saw that one being used a lot for mp4.
[22:22] <DelphiWorld> Sirisian|Work: use libfdk_aac
[22:23] <Sirisian|Work> Is that -c:a libfaac ?
[22:23] <llogan> get rid of the bitrates, (the default rate control should be fine), and add -movflags +faststart
[22:24] <llogan> since you did not include the complete ffmpeg console output from your command we can't know how your ffmpeg has been configured
[22:25] <Sirisian|Work> http://pastebin.com/6HmGq5Dd
[22:25] <llogan> it would be -c:a libfdk_aac, but you don't have support for this
[22:26] <Sirisian|Work> well I'm rebuilding it. I can include it.
[22:26] <llogan> so use "-c:a aac -strict experimental" instead (or compile with libfdk_aac support)
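[Editor's note: the advice above, collected into one hedged command; dropping the explicit bitrates and -async follows llogan's suggestions, and the rest mirrors the poster's original conversion.]
    ffmpeg -y -i kwt2.wmv -c:v libx264 \
        -c:a aac -strict experimental \
        -movflags +faststart kwt2.mp4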
[22:26] <DelphiWorld> hey llogan
[22:26] <DelphiWorld> when i do ffprobe
[22:26] <DelphiWorld> on some video or streams
[22:27] <DelphiWorld> sometime i dont see the audio bitrate if they use aac
[22:27] <DelphiWorld> what do that mean
[22:27] <llogan> i don't know
[22:28] <DelphiWorld> :P
[22:39] <waressearcher2> what is the "-bf" option ?
[22:44] <c_14> >  -bf                <int>        E..V.... set maximum number of B frames between non-B-frames (from -1 to INT_MAX) (default 0)
[22:46] <waressearcher2> how does it improve quality if I set "-bf 1" ?
[22:47] <c_14> You'll get more p frames and less b frames.
[22:47] <c_14> Whether or not that improves quality depends on the vidie.
[22:47] <c_14> *video
[22:47] <DelphiWorld> c_14: can ffm send to icecast2?
[22:47] <DelphiWorld> or you're unstreamable:P
[22:48] <waressearcher2> also does NUM in that option: "-q:v NUM" go from 0 to 100 ? I set it to "-q:v 4" and it's good quality
[22:49] <c_14> DelphiWorld: https://ffmpeg.org/ffmpeg-protocols.html#Icecast
[22:49] <c_14> waressearcher2: qscale depends on the codec
[22:55] <waressearcher2> c_14: I use "-vcodec mpeg4"
[22:57] <c_14> For mpeg4 qscale goes from 1 to 31 where 1 is the highest quality.
[23:03] <waressearcher2> c_14: is it the same as "-q" option ?
[23:03] <c_14> yep
[23:04] <c_14> -q is an alias for -qscale
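[Editor's note: a minimal example of the option under discussion; for the mpeg4 encoder the -q:v / -qscale:v value runs from 1 (best) to 31 (worst), and the file names are placeholders.]
    ffmpeg -i input.avi -c:v mpeg4 -q:v 4 -c:a copy output.avi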
[23:40] <Sirisian|Work> llogan, Interesting. I ran it once and it worked and created a file I could play. I tried to run it a second time and it failed. This was after rebuilding ffmpeg from scratch. So I uninstalled it then tried again and it still fails. I think I'm going to format this machine and try again.
[23:41] <llogan> Sirisian|Work: are you using ffmpeg from current git master?
[23:42] <Sirisian|Work> yeah
[23:42] <Sirisian|Work> Are there tags or something?
[23:43] <Sirisian|Work> git clone --depth 1 git://source.ffmpeg.org/ffmpeg
[23:43] <llogan> that's fine.
[23:43] <llogan> remove the --depth 1 if you're interested in checking out older revisions
[23:44] <llogan> such as if you want to use git bisect to find a regression
[23:44] <llogan> or i guess tools/bisect-create to be more accurate
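[Editor's note: a hedged sketch of the bisect workflow mentioned above; the known-good tag n2.3 is an assumption based on the release discussed earlier in this log.]
    git clone git://source.ffmpeg.org/ffmpeg && cd ffmpeg
    git bisect start
    git bisect bad master
    git bisect good n2.3
    # rebuild and retest at each step, then mark it with "git bisect good" or "git bisect bad"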
[00:00] --- Fri Nov 14 2014

