[Ffmpeg-devel-irc] ffmpeg.log.20121005
burek
burek021 at gmail.com
Sat Oct 6 02:05:01 CEST 2012
[06:25] <manitpaulose> Hi All, Has some issue when converting mov to flv. Below is the error
[06:25] <manitpaulose> Impossible to convert between the formats supported by the filter 'src' and the filter 'auto-inserted scaler 0'
[06:25] <manitpaulose> i am able to convert almost all other formats like mp4 to flv
[06:26] <manitpaulose> any ideas?
[06:28] <manitpaulose> the command i used is
[06:28] <manitpaulose> ffmpeg -i /home/rome/videos/abc.mov -ar 44100 -ab 96 -f flv /home/rome/converted/abc.flv
[06:29] <manitpaulose> sure
[06:33] <manitpaulose> http://pastebin.com/CfiqN3sJ
[06:33] <manitpaulose> here is the complete command
[06:34] <klaxa> hmm... i never came across that, maybe wait for the devs to answer
[06:35] <klaxa> is the file really just 1.92 seconds?
[06:37] <manitpaulose> its a small file
[07:11] <djapo> how can i make a video fit a smaller video size without distortions.
[07:11] <djapo> ?
[07:13] <manitpaulose> guys the issue was actually related to the video. I tried another mov file and it worked just fine.
[07:13] <manitpaulose> Thanks Klaxa
[07:13] <djapo> right now i am trying this... but the video is distorted
[07:13] <djapo> for i in *.flv; do avconv -i $i -b 250k -r 15 -s qvga vid/new/`echo $i| sed 's/.flv//g'`.mp4;done
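The sed pipeline in that loop can be written more robustly with bash parameter expansion. A minimal sketch; the helper name is hypothetical and the avconv invocation is left as a comment since it needs real input files:

```shell
# Hypothetical helper mirroring the sed call above: map an input .flv
# filename to the vid/new/ output path with an .mp4 extension.
flv_to_mp4() {
    printf '%s\n' "vid/new/${1%.flv}.mp4"   # ${1%.flv} strips the suffix
}

# The batch loop from the chat, rewritten with the helper (untested sketch):
# for i in *.flv; do avconv -i "$i" -b 250k -r 15 -s qvga "$(flv_to_mp4 "$i")"; done
```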
[07:14] <klaxa> manitpaulose: actually .mov is just the container and not the codec
[07:14] <manitpaulose> the issue was i was not able to convert .mov to .mp4 or flv
[07:14] <manitpaulose> the issue was actually wit the file
[07:14] <klaxa> and not exactly the most important part. the .mov you were using just now seems to be a raw video
[07:15] <klaxa> k
[07:15] <manitpaulose> tried another mov file and it worked fine
[07:15] <klaxa> djapo: you mean keeping the correct aspect ratio?
[07:17] <djapo> klaxa: no, can i scale a widescreen video to fit inside of a qvga screen without clipping or stretching, i am ok with adding more negative space
[07:17] <klaxa> you mean adding letterboxes so you have black bars on the top and the bottom?
[07:18] <djapo> yes that's what i mean
[07:21] <klaxa> djapo: http://superuser.com/questions/26416/how-to-convert-a-169-movie-to-a-43-letterbox-version
[07:22] <djapo> klaxa: thanks :D
[07:22] <klaxa> :)
[07:44] <djapo> klaxa: can it also be done with avconv
[07:45] <djapo> ?
[07:45] <klaxa> wasn't ffmpeg just renamed to avconv?
[07:45] <djapo> klaxa: i think so, but i tried with ffmpeg and it didn't work
[07:46] <djapo> klaxa: no, avconv does not recognize those options.
[07:48] <ubitux> avconv is part of a fork of ffmpeg
[07:48] <ubitux> and that fork is distributing an old broken binary of ffmpeg
[07:48] <klaxa> sounds awful
[07:48] <klaxa> i think one could use the drawbox video filter to achieve the same result?
[07:48] <ubitux> and ubuntu/debian are distributing that fork
[07:48] <ubitux> under the name of libav.
[07:49] <ubitux> erhm
[07:49] <ubitux> under the name of ffmpeg sorry
[07:50] <klaxa> ah djapo from the manpage:
[07:50] <klaxa> >All the pad options have been removed. Use -vf
[07:50] <klaxa> >pad=width:height:x:y:color instead.
[07:50] <ubitux> https://www.ffmpeg.org/ffmpeg.html#pad
[07:50] <ubitux> there are some examples a bit below here
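The letterboxing being discussed boils down to two chained filters, scale then pad. A sketch of the arithmetic for fitting a 16:9 source into QVGA; the 640x360 source size is an assumed example and the ffmpeg line at the end is an untested sketch:

```shell
src_w=640; src_h=360    # assumed 16:9 source
dst_w=320; dst_h=240    # QVGA target
scaled_h=$(( src_h * dst_w / src_w ))   # height after scaling to target width
pad_y=$(( (dst_h - scaled_h) / 2 ))     # top offset that centres the picture
echo "scale=${dst_w}:${scaled_h},pad=${dst_w}:${dst_h}:0:${pad_y}:black"
# which for these sizes gives: scale=320:180,pad=320:240:0:30:black
# Sketch of the full command:
# ffmpeg -i in.flv -vf "scale=320:180,pad=320:240:0:30:black" out.mp4
```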
[07:51] <djapo> klaxa: ubitux: thanks
[07:51] <ubitux> djapo: you should also upgrade to ffmpeg
[07:51] <ubitux> because that's not what you are using right now :p
[07:53] <djapo> ubitux: i am using ffmpeg but it could not set the options and it tells me that ffmpeg is deprecated, use avconv
[09:31] <Emmanuel_Chanel> Hello!
[09:48] <qwe> hey, I'm trying to resize some PAL content to square pixels. Unfortunately, ffmpeg adds small black borders to the sides of the output, instead of just scaling the input from 720x576 to 768x576. dump: http://pastebin.com/DhLvgzUx
[09:49] <qwe> I tried various settings with -sar and -aspect, but somehow ffmpeg seems to read the DAR from the source, which has non-square pixels, and so thinks it's necessary to calculate its own size instead of just friggin do what it's told ;)
[10:24] <qwe> nobody? :( I'm currently trying to muck around with the setdar and setsar filters, but whatever I do, ffmpeg always keeps doing its own "thinking". I just want it to resize the darn video and let me specify the sar and dar values
[10:27] <qwe> from the documentation: The scale filter forces the output display aspect ratio to be the same of the input, by changing the output sample aspect ratio.
[10:27] <qwe> how is one supposed to ever change the DAR then? when i specify a DAR, ffmpeg does a silent rescaling of the output
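For the record, the PAL case qwe describes works out as follows: 4:3 digital PAL has a sample aspect ratio of 16:15, so the 720-pixel-wide frame becomes 768 wide at square pixels. The scale+setsar command at the end is an untested sketch with placeholder filenames:

```shell
# 4:3 PAL: 720x576 storage, SAR 16:15 -> 768x576 at 1:1 pixels
w=$(( 720 * 16 / 15 ))
echo "$w"    # 768
# Sketch: resize, then declare the pixels square so ffmpeg stops
# re-deriving the display aspect ratio on its own:
# ffmpeg -i pal.mpg -vf "scale=768:576,setsar=1" out.mpg
```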
[11:12] <buu> Hey, quick question if anyone is alive, I'm trying to copy a mp3 to a new mp3 and the source mp3 has some weird mjpeg stream as stream '0' I'd like to drop
[11:12] <buu> What do I specify in the options to do so?
[11:13] <buu> Specifically it's "Stream #0:1: Video: mjpeg, yuvj420p, 450x600 [SAR 96:96 DAR 3:4], 90k tbr, 90k tbn, 90k tbc"
[11:17] Action: microchip_ didn't know mp3's can contain video tracks
[11:18] <buu> microchip_: Me either, I find it confusing
[11:18] <buu> But I think I'm closer to my solution except.. Invalid encoder type 'libmp3lame'
[11:20] <buu> microchip_: How do I drop a stream?
[11:20] <microchip_> i've no idea
[11:21] <buu> Success!
[11:21] <buu> -vn
[11:22] <knoch> yes
[11:23] <knoch> I'm having difficulties to create thumbnails from a video
[11:23] <knoch> i.e select only 1 frame every 10 minutes
[11:23] <knoch> the doc says "-r 1/600" but it doesn't work
[11:24] <buu> What does it do?
[11:26] <ubitux> what's your cmd line knoch?
[11:27] <knoch> ffmpeg -ss 00:12:00 -i /home/file.ts -r 1/600 -vframes 15 -s qvga tn/%02d.png
[11:28] <knoch> there are only a few seconds between the first three thumbnails
[11:31] <knoch> the fourth seems to be good, 10 minutes after the third
[11:35] <ubitux> ok just a min
[11:35] <ubitux> trying something
[11:36] <knoch> ok thank you ubitux :)
[11:40] <ubitux> knoch: can you try: ffmpeg -i /home/file.ts -ss 00:12:00 -vf 'select=isnan(prev_selected_t)+gte(t-prev_selected_t\,60*10)' -frames:v 15 -vsync vfr -s qvga th/out%02d.png ?
[11:41] <ubitux> btw, what's your final goal of this?
[11:42] <ubitux> knoch: oh and where did you see "-r 1/600"?
[11:42] <ubitux> (where in the doc)
[11:43] <knoch> wait
[11:43] <knoch> here : http://ffmpeg.org/trac/ffmpeg/wiki/Create%20a%20thumbnail%20image%20every%20X%20seconds%20of%20the%20video
[11:43] <ubitux> arh
[11:43] Action: ubitux hits burek
[11:43] <knoch> haha
[11:44] <ubitux> i guess i'll have to complete that page
[11:44] <ubitux> and fix it
[11:44] <knoch> my final goal is to create navigation in a video with thumbnails
[11:44] <knoch> you basically click on the thumbnail and it seeks
[11:45] <ubitux> knoch: did you try the scene detection? :)
[11:45] <knoch> to the corresponding timestamp
[11:45] <knoch> not at all
[11:45] <knoch> I have to keep the timestamp
[11:46] <ubitux> try replacing 'select=isnan(prev_selected_t)+gte(t-prev_selected_t\,60*10)' with 'gt(scene\,0.4)'
[11:46] <ubitux> oh and btw, you should replace the -s qvga with a scale filter at the end of the filtergraph
[11:46] <ubitux> anyway, tell me if it works first
[11:46] <knoch> you have a point, the scene detection would be better to avoid having bad thumbnails
[11:46] <knoch> the first cmd line ?
[11:47] <knoch> a scale filter ? I am a newbie at ffmpeg as you may have noticed I'm sorry
[11:48] <ubitux> ffmpeg -i /home/file.ts -ss 00:12:00 -vf 'select=isnan(prev_selected_t)+gte(t-prev_selected_t\,60*10),scale=320:240' -frames:v 15 -vsync vfr th/out%02d.png
[11:48] <ubitux> ffmpeg -i /home/file.ts -ss 00:12:00 -vf 'select=gt(scene\,0.4),scale=320:240' -frames:v 15 -vsync vfr th/out%02d.png
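What the select expression in the first command does, mirrored in plain shell over made-up timestamps: keep the first frame (prev_selected_t is NaN then) and afterwards any frame at least 600 seconds after the previously *selected* one:

```shell
prev=""          # empty stands in for NaN: nothing selected yet
selected=""
for t in 0 5 300 600 650 1200 1900; do   # made-up frame times in seconds
    if [ -z "$prev" ] || [ $(( t - prev )) -ge 600 ]; then
        selected="$selected $t"
        prev=$t                          # only selected frames move the anchor
    fi
done
selected="${selected# }"
echo "$selected"    # 0 600 1200 1900
```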
[11:48] <knoch> yes ok, I'll try these two
[11:49] <knoch> you move the -ss option after the input
[11:49] <knoch> why ?
[11:49] <knoch> moved*
[11:49] <ubitux> more efficient
[11:50] <ubitux> more accurate*
[11:50] <ubitux> (might be slower though)
[11:51] <knoch> ok :)
[11:52] <ubitux> you might be interested in the tile filter as well
[11:52] <knoch> yes but I don't want to bother you too much
[11:52] <ubitux> like: ffmpeg -i /home/file.ts -vf 'select=gt(scene\,0.4),scale=320:240,tile' preview.png
[11:53] <ubitux> bother me? mmh?
[11:53] <ubitux> you're on #ffmpeg, that's the purpose of the channel, to help you achieve what you want with ffmpeg :P
[11:54] <knoch> yes but.. it's great what you people do
[11:54] <knoch> so thank you !
[11:54] <ubitux> np :)
[11:54] <knoch> and I have tested the first command line
[11:56] <knoch> the first thumbnail is incorrect
[11:57] <knoch> it's not what the video shows at 00:12:00 but what it shows at 20:00
[11:59] <ubitux> mmh wait, try to move back the -ss before -i, just to be sure
[11:59] <knoch> ok
[12:00] <knoch> so I enter this : ffmpeg -ss 00:12:00 -i /mnt/backup/media/dance_nation.ts -vf 'select=isnan(prev_selected_t)+gte(t-prev_selected_t\,60*10),scale=320:240' -frames:v 15 -vsync vfr tn/%02d.png just to be sure
[12:01] <knoch> so you can follow exactly what I do
[12:03] <knoch> well it works like a charm
[12:03] <knoch> perfect
[12:04] <knoch> can you explain the filtergraph ?
[12:04] <knoch> please
[12:39] <knoch> ubitux: I tried the tile filter and.. it's amazing
[12:40] <knoch> same: an explanation would be great :D
[13:05] <burek> ubitux? :)
[13:05] <burek> knoch, what exactly didn't work with 1/600? :)
[13:06] <knoch> please don't get mad :(
[13:06] <knoch> [11:29:00] < knoch> there are only a few seconds between the first three thumbnails
[13:06] <knoch> [11:31:46] < knoch> the fourth seems to be good, 10 minutes after the third
[13:06] <burek> knoch can you please use a pastebin site (like www.pastebin.com) to show your ffmpeg command and the complete console output?
[13:07] <knoch> sure
[13:11] <knoch> burek: you're one of the ffmpeg dev ?
[13:12] <burek> no
[13:12] <burek> im just the noise :)
[13:16] <knoch> oh ok
[13:16] <knoch> I lost the previous console output
[13:16] <knoch> running it again
[13:18] <knoch> burek: http://pastebin.com/hCHv49rT
[13:18] <knoch> bash colors
[13:23] <burek> hm, I'm not sure what the issue might be, knoch, but those "skipped MB" and "invalid mb" tell me something is wrong :)
[13:23] <burek> did you try moving -ss after -i option
[13:24] <burek> like ffmpeg -i /mnt/backup/media/dance_nation.ts -ss 00:12:00 -r 1/600 -vframes 15 -s qvga tn/%02d.png
[13:24] <burek> it will run slower but be more precise
[13:24] <burek> although I'm not sure that will actually help..
[13:25] <burek> where did you get that dance_nation.ts from?
[13:27] <knoch> it's a live capture from a TV stream
[13:28] <knoch> moving -ss after -i causes my computer to freeze
[13:29] <knoch> it takes a lot of memory and CPU
[13:29] <burek> yeah, I was afraid of that
[13:30] <burek> it is now not skipping through your input fast, but rather decodes all frames and drops them until it reaches the specified -ss time
[13:32] <ubitux> < knoch> can you explain the filtergraph ? // select is filtering some frames, then the output is scaled with the scale filter; for more information look at http://ffmpeg.org/ffmpeg.html#select
[13:33] <ubitux> about the tile filter: http://ffmpeg.org/ffmpeg.html#tile
[13:35] <knoch> many thanks to both of you
[13:37] <knoch> so
[13:40] <knoch> the scene detection is better if we don't have a limited number of thumbnails I guess
[13:40] <chris2> hi. i'm trying to stream with ffserver, but i get: [tcp @ 0x1e63240] TCP connection to localhost:58180 failed: Connection refused
[13:40] <chris2> otoh, ffserver -d shows a POST, and i can access the port with nc
[13:41] <chris2> and the streaming quits after a second or so (at 85.8kbits/s) with av_interleaved_write_frame(): Connection reset by peer
[13:41] <chris2> ok
[13:41] <chris2> http://sprunge.us/YZUX ffserver.conf
[13:42] <chris2> http://sprunge.us/cZTL commandline + output
[13:42] <knoch> ubitux: is it possible to select only 15 thumbnails all over the video while taking advantage of the scene detection ?
[13:43] <burek> chris2, you don't need -acodec/-vcodec in the ffmpeg line, when using ffserver
[13:43] <burek> use something like this: ffmpeg -re -f video4linux2 -i /dev/video0 -isync -f alsa -i hw:0,0 http://localhost:58180/feed1.ffm
[13:43] <chris2> ok, that keeps running
[13:44] <burek> and define your vcodecs and acodecs in ffserver.conf
[13:44] <ubitux> knoch: nope
[13:44] <ubitux> knoch: you can somehow do it like this: use the scene detection to output all the "scene" pictures
[13:44] <chris2> ok
[13:44] <ubitux> knoch: and then filter out the pics you don't want yourself with a script or anything
[13:45] <burek> maybe something like: ffmpeg ... -vf ... -f nut - | ffmpeg -f nut -i - -r 1/10 .. output
[13:46] <burek> ^ knoch
[13:46] <ubitux> ?
[13:46] <ubitux> what are you answering to burek?
[13:46] <burek> to filter out every 10th frame
[13:46] <ubitux> ah no
[13:46] <ubitux> you can do that with -vf select that's not the problem
[13:46] <burek> or he would like some fancy selecting ?
[13:46] <ubitux> the problem is that you have N output pictures
[13:47] <ubitux> and you always want X of them
[13:47] <burek> oh
[13:47] <ubitux> so you need to pick using X/N, and so you need to know the number of output pics in advance
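ubitux's point in script form: scene detection yields some number N of images, and to always end up with exactly X you post-filter by index once N is known. A pure-shell sketch; the pick_evenly helper name is hypothetical:

```shell
# Print X items evenly spaced from the arguments (e.g. scene thumbnails).
pick_evenly() {
    want=$1; shift
    n=$#; i=0
    for item in "$@"; do
        # keep an item whenever the running fraction i*want/n advances
        if [ $(( (i + 1) * want / n )) -gt $(( i * want / n )) ]; then
            printf '%s\n' "$item"
        fi
        i=$(( i + 1 ))
    done
}
pick_evenly 2 a.png b.png c.png d.png   # keeps b.png and d.png
```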
[13:47] <burek> btw, why doesn't -r 1/600 work?
[13:47] <burek> it used to work for me at least
[13:47] <chris2> burek: is that related to this error? Fri Oct 5 13:47:18 2012 Codec for stream 0 does not use global headers but container format requires global headers
[13:47] <ubitux> burek: git bisect then, i don't know how that's supposed to work
[13:48] <burek> chris2, take a look at ffserver sample configuration
[13:48] <ubitux> burek: but vf select is appropriate for this, you should update the examples
[13:48] <burek> chris2 and type ctrl+f and then "global_header"
[13:48] <chris2> thanks
[13:48] <knoch> thank you ubitux
[13:49] <ubitux> np
[13:49] <burek> ubitux, the idea was to select 1 frame each 600 seconds
[13:49] <burek> hence -r 1/600
[13:49] <burek> dropping all the others
[13:49] <ubitux> i'm not sure how that could have worked but well
[13:49] <ubitux> if you say so..
[13:49] <ubitux> at the moment it doesn't anyway
[13:49] <ubitux> anyway, that's exactly the purpose of vf select
[13:50] <ubitux> and my cmd line is based on the vf select example from the documentation
[13:50] <ubitux> (http://ffmpeg.org/ffmpeg.html#select)
[13:50] <burek> what do you mean "i'm not sure how that could have work" ?
[13:51] <ubitux> because -r is to set the frame rate of the output, and it would have a special meaning for the image2 muxer
[13:51] <burek> specifying an output rate would make ffmpeg drop/dup frames to achieve it, no?
[13:51] <knoch> I should really learn how vf works, I have coded a basic streaming application which sends MPEG2-TS over RTP
[13:51] <chris2> hrm, this breaks down after a few seconds of streaming
[13:51] <knoch> and now I want to implement fast motion feature
[13:51] <ubitux> burek: the image muxer is always tricky, because they are still/standalone images
[13:51] <ubitux> so no timing stuff
[13:51] <ubitux> and -r is for setting the framerate of the container, not really filtering images
[13:52] <ubitux> knoch: look at vf setpts
[13:52] <ubitux> knoch: http://ffmpeg.org/ffmpeg.html#asetpts_002c-setpts
[13:52] <ubitux> (http://ffmpeg.org/ffmpeg.html#Examples-14)
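The setpts idea from those links, mirrored in shell on made-up timestamps: multiplying every presentation timestamp by 0.5 halves the distance between frames, so the video plays at 2x. The ffmpeg line at the end is an untested sketch:

```shell
speed_up() {    # halve each presentation timestamp (milliseconds, made up)
    for pts in "$@"; do
        printf '%s ' $(( pts / 2 ))
    done
}
speed_up 0 40 80 120; echo    # prints: 0 20 40 60
# The real filter: ffmpeg -i in.ts -vf "setpts=0.5*PTS" out.ts
```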
[13:52] <knoch> yeah I have seen setpts
[13:53] <knoch> but coding it in C is hard work when you don't have a clue how it works
[13:53] <ubitux> http://git.videolan.org/?p=ffmpeg.git;a=tree;f=doc/examples;hb=HEAD
[13:53] <knoch> AVFilterGraph etc
[13:53] <ubitux> look at the filtering_video.c example
[13:53] <ubitux> and just change the scale filter into a setpts one :)
[13:54] <ubitux> it *might* work :)
[13:54] <knoch> great thanks I love you :D
[13:54] <knoch> but I would like to understand how it basically works
[13:56] <knoch> it's a graph of connected filters, but how are they connected ? do they always expect decoded frames ?
[13:57] <ubitux> knoch: http://ffmpeg.org/filters.html
[13:57] <ubitux> knoch: some filters are "source" filter
[13:57] <ubitux> that don't expect any input
[13:57] <ubitux> example: try ffplay -f lavfi -i testsrc
[13:58] <ubitux> and you can then add a "normal" filter after: ./ffplay -f lavfi -i 'testsrc,hue=H=2*PI*t:s=sin(2*PI*t)+1'
[13:59] <ubitux> you can name some outputs, reuse them between filters taking more than one input, etc
[14:00] <knoch> waow
[14:00] <knoch> you are a developer ? :D
[14:00] <ubitux> somehow
[14:00] <ubitux> :)
[14:00] <knoch> anyway you master ffmpeg
[14:02] <ubitux> i'm just grepping examples :)
[14:02] <knoch> but you understand them
[14:05] <knoch> what is the hue filter ?
[14:05] <ubitux> http://ffmpeg.org/ffmpeg.html#hue
[14:05] <ubitux> "Modify the hue and/or the saturation of the input."
[14:05] <knoch> argh.. sorry
[14:05] <ubitux> no worry :)
[14:06] <knoch> I should have searched the documentation before asking
[14:08] <knoch> and another question, I would like to implement (fast) rewind but I believe it isn't covered anywhere, so we know that fast motion is based upon modifying PTS and DTS, is it possible to do it for rewind ?
[14:08] <knoch> I was thinking
[14:09] <knoch> buffering from a Iframe to another (so a GOP), change the PTS but not the DTS
[14:09] <knoch> and send the frames from the buffer to the decoder
[14:11] <knoch> the PTS would be changed by keeping the gap, then reversed to display the last frame in first
[14:14] Action: ubitux is lost
[14:29] <knoch> ubitux: tell me if you think this is completely absurd
[14:47] <ubitux> knoch: i just don't understand, but i'm a bit busy anyway now :p
[14:52] <RoyK> when was -padleft etc changed to pad?
[14:53] <ubitux> yes, pad filter
[14:55] <RoyK> but when was this changed?
[14:55] <RoyK> I have MediaMosa installed, and that uses the old parameter
[14:55] Action: RoyK wonders why the old format wouldn't be allowed - it wouldn't hurt...
[14:56] <RoyK> and wouldn't break things...
[14:56] <ubitux> it would
[14:57] <ubitux> because the processing is taken out of ffmpeg and moved to libavfilter
[14:57] <ubitux> keeping these -pad options would complicate the code a lot
[14:57] <ubitux> it was removed a long time ago
[14:57] <ubitux> http://ffmpeg.org/ffmpeg.html#pad
[14:57] <RoyK> well, any idea when it was changed?
[14:57] <knoch> ok ubitux, tell me when you can
[14:59] <ubitux> RoyK: going to look at the log
[15:00] <ubitux> RoyK: Date: Fri May 7 12:16:23 2010 +0000
[15:00] <ubitux> Remove messy pading hack in ffmpeg.c.
[15:01] <RoyK> thanks
[15:54] <rainmake11> Huh, is there a way I can extract program id from mpegts? I saw
[15:55] <rainmake11> http://ffmpeg.org/trac/ffmpeg/ticket/995 but I get very strange errors
[15:55] <rainmake11> Failed to compensate for timestamp delta of -33698.589720
[15:56] <rainmake11> More interestingly, I have a live stream via network and when I try to use -map I always see different ids for the same stream
[16:13] <chris2> i'm using this to stream: ffmpeg -v 2 -r 25 -s 640x480 -f video4linux2 -vcodec mjpeg -i /dev/video0 -f alsa -acodec copy -vcodec copy -f mpegts 'udp://10.153.59.22:1234?pkt_size=188&buffer_size=65535'
[16:14] <chris2> how do i tell ffplay these flags? it fails to detect the audio
[16:14] <chris2> "Could not find codec parameters for stream 0 (Audio: aac_latm ([6][0][0][0] / 0x0006), 0 channels, s16): unspecified sample rate"
[16:14] <klaxa> maybe you should encode the audio
[16:15] <chris2> i tried -acodec libvorbis too
[16:15] <chris2> but i always get above message
[16:16] <klaxa> wait...
[16:16] <klaxa> you have -f alsa, but no input
[16:16] <klaxa> or is the input from the video device?
[16:16] <klaxa> also i think the -vcodec copy is redundant
[16:17] <klaxa> if you have a separate audio device in alsa, try adding -i hw:0,0 or something
[16:17] <chris2> i tried that too
[16:17] <chris2> [aac_latm @ 0x7f066802af60] multiple layers are not supported
[16:17] <chris2> no idea why it thinks aac?
[16:18] <chris2> without -acodec copy it recodes to mpeg2...
[16:19] <chris2> can i even have mjpeg in a mpegts? what mux should i use over udp?
[16:21] <zap0> you can cram almost any codec into any container; the question really is whether the device you intend to play this mpegts on is going to understand mjpeg
[16:21] <chris2> well, i want to use ffplay
[16:22] <zap0> then TIAS!
[16:23] <chris2> tias?
[16:23] <microchip_> try it and see
[16:23] <chris2> ffplay thinks its aac_latm :(
[16:24] <chris2> Stream #0:0: Video: mjpeg, yuvj422p, 640x480, q=2-31, -4 kb/s, 90k tbn, 24 tbc
[16:24] <chris2> Stream #0:1: Audio: mp2, 32000 Hz, stereo, s16, 128 kb/s
[16:24] <chris2> that i send
[16:24] <chris2> Stream #0:0[0x100]: Audio: aac_latm ([6][0][0][0] / 0x0006), 0 channels, s16
[16:24] <chris2> Stream #0:1[0x101]: Audio: mp2 ([3][0][0][0] / 0x0003), 32000 Hz, stereo, s16, 128 kb/s
[16:24] <chris2> that i receive
[16:25] <microchip_> maybe there's a bug in ffplay
[16:26] <chris2> hrm, my mplayer2 uses ffmpeg too of course
[16:29] <chris2> an explicit -vcodec mjpeg results in Stream #0:0[0x100]: Unknown: none ([6][0][0][0] / 0x0006)
[16:42] <aleksm> I'm trying to create a "multiply" effect between two videos. How is this accomplished with the use of filters?
[17:26] <relaxed> aleksm: you want two videos side by side?
[17:28] <aleksm> relaxed: no, more along the lines of the "multiply" blend mode (http://en.wikipedia.org/wiki/Blend_modes)
[17:28] <aleksm> so just one video
[17:33] <relaxed> Did you look at the fade filter and the overlay filter in the man page?
[17:36] <relaxed> You should be able to overlay images to achieve alpha blending
[17:51] <aleksm> relaxed: thanks for the response, but would this achieve the same effect as multiplying? I'm not looking to simply overlay one video over the other.
[18:00] <Guest31191> any suggestions for "broken ffmpeg default settings detected. use an encoding preset (e.g. -vpre medium). preset usage: -vpre <speed> -vpre <profile>. speed presets are listed in x264 --help. profile is optional; x264 defaults to high"
[18:10] <Guest31191> any suggestions guys
[18:11] <relaxed> Guest31191: Yes, use a recent version of ffmpeg.
[18:11] <relaxed> It now uses libx264's internal presets. example: -preset veryslow
[18:14] <relaxed> Guest31191: you can get a list of valid presets from `x264 --fullhelp | less`
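For reference, the internal preset names relaxed means are fixed in x264 itself, ordered fastest to slowest; the usage line at the end is an untested sketch with a placeholder filename:

```shell
presets="ultrafast superfast veryfast faster fast medium slow slower veryslow placebo"
echo "$presets" | wc -w     # 10 presets, fastest to slowest
# Usage on a recent ffmpeg:
# ffmpeg -i in.mkv -c:v libx264 -preset veryslow out.mp4
```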
[18:14] <Guest31191> are you saying i should use this value at command prompt
[18:14] <relaxed> I'm saying after you upgrade to a recent version of ffmpeg you should use that value at the command prompt.
[18:15] <relaxed> Which version are you using?
[18:16] <relaxed> ffmpeg 2>&1| sed q
[18:16] <Guest31191> Ok Thanks
[18:17] <relaxed> aleksm: I'm really not sure.
[20:26] <t4nk651> i'm trying to convert mkv to mp4 with ffmpeg -i input.mkv -vcodec libx264 -sameq -b 2089k -acodec libfaac -ab 192k video.mp4 but the audio comes out heavily distorted
[20:27] <t4nk651> whats the best way to keep the audio intact
[20:28] <relaxed> didn't i help you yesterday?
[20:29] <t4nk651> yup
[20:29] <relaxed> 1) lose -sameq and never use it again
[20:29] <t4nk651> but the file from the command you gave me wouldn't play on my ps3
[20:29] <t4nk651> so i used ffmpeg -i input.mkv -vcodec libx264 -sameq -b 2089k -acodec libfaac -ab 192k video.mp4 and it played but the audio was messed up
[20:33] <relaxed> How was it messed up?
[20:34] <t4nk651> the sound was distorted
[20:34] <t4nk651> kinda low
[20:34] <t4nk651> but you could hear it
[20:36] <relaxed> remove -sameq, add "-ac 2 -level 41 -t 60" for a minute long sample and see if that works.
[20:37] <t4nk651> ffmpeg -i input.mkv -vcodec libx264 -ac 2 -level 41 -t 60 -b 2089k -acodec libfaac -ab 192k video.mp4 ?
[20:37] <relaxed> yes
[20:37] <relaxed> if that plays okay remove "-t 60" to encode the whole video.
[20:37] <t4nk651> gonna give it a try
[20:55] <t4nk651> that worked fine
[20:55] <t4nk651> gonna save this command
[20:55] <t4nk651> its a good one thanks relaxed
[21:31] <relaxed> t4nk651: you're welcome
[21:44] <aleksm> I'm trying to create a "multiply" blend between two videos. How is this accomplished with the use of filters?
[21:48] <llogan> aleksm: ffmpeg does not have its own multiply filter, but it does support frei0r which probably has one
[21:52] <aleksm> llogan: thank you :)
[21:57] <llogan> aleksm: ...or maybe not. i can't find one
[21:58] <llogan> oh, it does appear to have one
[21:58] <aleksm> llogan: yes, I can't either -- do you happen to know of any other way I can multiply blend two videos using CL tools?
[21:58] <aleksm> oh?
[21:59] Action: llogan builds with frei0r support
[22:03] Action: aleksm should clearly not rely on binary packages of ffmpeg
[22:23] <llogan> i don't know if ffmpeg can support frei0r multiply and i've never used frei0r with more than one input
[22:35] <llogan> aleksm: might be easier to use something like kdenlive. it also supports frei0r
[22:41] <aleksm> llogan: I really need to do this on the CL -- it's supposed to be a server process :)
[22:54] <llogan> aleksm: i can't figure it out but i'm also preoccupied at the moment. it would be nice to know if ffmpeg can even support the frei0r mixer2 "filters"
[22:57] <aleksm> llogan: I'm going to be playing around with it over the weekend, but I'll pop in here every now and then and share what I find with you
[22:57] Action: aleksm is clocking out
[23:05] <shadylog> Hi, i am using arch linux ffmpeg 1.0
[23:05] <shadylog> I was wondering if pthreads is enabled for this?
[23:06] <relaxed> shadylog: ldd `which ffmpeg`| grep thre
[23:07] <shadylog> [root at ion ~]# ldd `which ffmpeg`| grep thre
[23:07] <shadylog> libpthread.so.0 => /usr/lib/libpthread.so.0 (0x00007fbbfc466000)
[23:07] <shadylog> So I guess it's in there? :)
[23:26] <rainmaker1> Hi, I would like to filter incoming stream (live stream over network in mpegts) into more separate streams
[23:26] <rainmaker1> I use -map:p:program_name
[23:27] <rainmaker1> but it simply ignores the command, and if I use -map 0:0 -map 0:1 it runs into a circular buffer overrun error
[23:27] <rainmaker1> any thoughts?
[23:27] <rainmaker1> I use latest ffmpeg (as of today)
[00:00] --- Sat Oct 6 2012