[Ffmpeg-devel-irc] ffmpeg.log.20150209
burek
burek021 at gmail.com
Tue Feb 10 02:05:01 CET 2015
[00:26] <FirstContact> hi everyone, what would you recommend for lowering an 8GB MP4 down to about 2-3GB? The resolution is 1280x536. Would you recommend using the "-crf" option or reducing the resolution, as it will be played on a laptop?
[00:27] <FirstContact> would it be better to lower the resolution first and then play with the CRF? or just go with the CRF right away and leave the resolution?
[00:33] <FirstContact> hi zap0!
[00:34] <zap0> hello
[00:34] <FirstContact> w00t there's a #libav
[00:34] <FirstContact> going to ask there also
[00:35] <zap0> FirstContact, what are you trying to do anyway
[00:35] <FirstContact> for reducing an 8GB MP4 file to about 2 or 3GB: would it be better to lower the resolution first and then play with the CRF? or just go with the CRF right away and leave the resolution?
[00:35] <FirstContact> was going to post in the other channel so as not to repeat question but o well
[00:35] <FirstContact> no one seems to be around atm
[00:36] <c_14> Whichever is less painful for your use-case.
[00:36] <FirstContact> painful how?
[00:36] <FirstContact> haha
[00:36] <c_14> As in, take a short part of the video. Do both. Watch the result. Compare
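c_14's compare-the-two approach can be sketched as two short test encodes. The input name, the excerpt offset/length, the scale target, and the CRF values below are all placeholder assumptions to illustrate the trade-off:

```shell
# Encode the same 30-second excerpt two ways, then watch and compare.
# Option A: keep the resolution, use a higher CRF (more compression per pixel).
ffmpeg -ss 300 -t 30 -i input.mp4 -c:v libx264 -crf 26 -c:a copy sample_crf.mp4

# Option B: downscale first (fewer pixels), keep a lower CRF.
# scale=960:-2 picks a height that preserves aspect and stays even.
ffmpeg -ss 300 -t 30 -i input.mp4 -vf scale=960:-2 -c:v libx264 -crf 22 \
       -c:a copy sample_scaled.mp4
```

Comparing the two output sizes against their visual quality answers the question for this particular source better than any general rule.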
[00:36] <FirstContact> I just want to reduce the file size without losing that much in terms of quality
[00:36] <FirstContact> hmm...good idea...
[00:36] <FirstContact> I have done samples before, not sure why I didn't think of that this time
[00:37] <FirstContact> I guess I will find the answer myself
[00:37] <zap0> one method will reduce the number of pixels
[00:37] <FirstContact> trueright
[00:37] <FirstContact> err
[00:37] <FirstContact> meant to erase the second word
[00:37] <zap0> one method will reduce the colours/dynamics
[00:37] <zap0> which is more important to you?
[00:37] <FirstContact> hmm, is that what CRF does?
[00:37] <FirstContact> plays with the bitrate and other stuff I imagine?
[00:37] <FirstContact> merges similar colors, etc
[00:38] <zap0> first thing to do is stop saying abstract words like "quality". far too subjective.
[00:38] <FirstContact> well, on a laptop the 1280x536 pixels wouldnt be necessary
[00:38] <FirstContact> well, less blurriness, less artifacts, good motion
[00:38] <FirstContact> at least on the laptop a friend has
[00:39] <FirstContact> ill do a few samples
[00:39] <zap0> the opposite of less thing... is more XXXXX anti-thing. so you want more sharpness? more detail ?
[00:40] <zap0> more detail means more information. more data. bigger files.
[00:41] <zap0> you can't seem to describe what you want to do.
[00:41] <FirstContact> I can
[00:41] <FirstContact> but you are picking it apart
[00:41] <FirstContact> =)
[00:41] <FirstContact> I want to reduce an 8GB MP4 file down to 2-3GB
[00:41] <FirstContact> while keeping the best "quality"
[00:42] <zap0> so what do you plan to remove?
[00:42] <FirstContact> I think we can all agree that quality in this case would be so that it looks best
[00:42] <FirstContact> which is subjective of course but we can mostly agree that one file will look better than the other
[00:42] <FirstContact> well I have 3 options
[00:42] <FirstContact> reduce the resolution
[00:42] <zap0> "we" can't agree.
[00:43] <zap0> thats the whole point.... quality is too subjective.
[00:43] <FirstContact> I think if I showed you a video with more blurriness you would say that it is worse than the other
[00:43] <FirstContact> do you enjoy blurriness?
[00:43] <zap0> most motion is blurriness.
[00:43] <FirstContact> do you enjoy pixels being merged with other like pixels, or prefer sharpness
[00:43] <FirstContact> youre nitpicking
[00:44] <FirstContact> I understand what youre asking also, but its a simple question
[00:44] <zap0> if i see some thing blurred correctly at EXACTLY good pixel offsets each frame... it's going to look dead smooth which gets me why hard.. mmkay.. but if you paused it.... it'd look like blur-central!!!
[00:44] <FirstContact> what is better to keep as best quality as possible: reduce the resolution, reduce the CRF, or first reduce the resolution and then reduce the CRF
[00:45] <FirstContact> im not talking about smoothness while playing
[00:45] <FirstContact> im talking about sharpness of the image
[00:45] <zap0> the image is a moving image.. it's video.. it's not a still.
[00:45] <FirstContact> right
[00:45] <FirstContact> but one setting will produce less sharpness
[00:45] <zap0> smoothness IS the image.
[00:45] <FirstContact> my question is that
[00:45] <FirstContact> smoothness I would say is the transition of frame to frame
[00:46] <FirstContact> sharpness is how each still frame looks
[00:46] <zap0> which can be achieved by MORE blur
[00:46] <FirstContact> thanks zap0 but clearly you want to nitpick stuff
[00:46] <zap0> not less.
[00:46] <FirstContact> ill ask someone else
[00:47] <zap0> FirstContact, you fail to get real. talking abstract nonsense about keeping some mythical "quality".. yet you can't even describe what "quality" is
[00:47] <FirstContact> roger that zap0
[00:47] <FirstContact> move along now
[00:48] <zap0> you don't want help. you want someone to agree with you so you can feel fuzzy about it
[00:48] <FirstContact> not at all
[00:48] <FirstContact> I want someone to give me an answer
[00:48] <FirstContact> based on what ive explained
[00:48] <FirstContact> its simple
[00:48] <FirstContact> which video will look best
[00:49] <FirstContact> youre going into this whole tirade about "oh well it might not look best to me"
[00:49] <FirstContact> which is true
[00:49] <FirstContact> but I am specifying I want less blurriness in each frame
[00:49] <FirstContact> I dont care about the rest
[00:49] <FirstContact> if it looks horrible playing then so be it
[00:49] <zap0> so you want more detail in each frame
[00:49] <FirstContact> yes
[00:49] <FirstContact> I said that already
[00:50] <zap0> details hold more "information" <-- science. information encoded is data <-- computer science. data is bigger files.
[00:50] <FirstContact> my friend's resolution is actually higher than the video so I guess reducing the resolution is out for now
[00:50] <FirstContact> right I didnt ask that
[00:51] <FirstContact> https://trac.ffmpeg.org/wiki/Encode/H.264
[00:51] <zap0> so you want to throw away 2/3's of the data, yet you want it to have more detail
[00:52] <zap0> can you not here what you are saying?
[00:52] <zap0> hear/
[00:52] <FirstContact> no
[00:52] <FirstContact> im typing
[00:52] <FirstContact> in the case of a screen that was set to a lower resolution
[00:52] <robotbrain> ok
[00:52] <FirstContact> those extra bits wouldnt matter would they?
[00:52] <robotbrain> so im trying to play a tcp stream on android
[00:53] <robotbrain> i figure I could rebroadcast as udp on localhost
[00:53] <robotbrain> since android already supports that
[00:53] <robotbrain> is that a good option?
[02:16] <pzich> random question: anyone know if it's a bad idea to have ffmpeg read directly off of a NAS when encoding, trying to figure out if it would save time skipping copying the source file locally before encoding
[02:16] <pzich> ideally it'd read just in time and save time and storage
[02:17] <c_14> Depends if IO is your chokepoint or not.
[02:23] <c_14> If your bottleneck isn't IO, it won't matter.
[07:00] <t4nk884> hi
[07:02] <t4nk884> want to extract thumbnail images from videos which are in linux server
[07:03] <t4nk884> and we are running the code using php script
[07:03] <pzich> https://www.google.com/search?q=ffmpeg+thumbnails
[07:03] <t4nk884> i want to know the procedure
[07:03] <t4nk884> iam unaware about linux
[07:03] <t4nk884> what steps i need to follow
[07:04] <pzich> we're here to help with ffmpeg, if you need help with using linux I'd try ##Linux or similar
[07:04] <t4nk884> i did not get you
[07:19] <t4nk884> hi
[07:19] <t4nk884> need to know the steps
[07:19] <pzich> what steps?
[07:19] <t4nk884> want to extract thumbnail images from videos which are in linux server
[07:19] <t4nk884> and we are running the code using php script
[07:20] <t4nk884> we get the video files from linux server using php script
[07:20] <t4nk884> now want to extract thumbnail images from videos
[07:20] <t4nk884> unaware about linux commands
[07:21] <t4nk884> php we are aware
[07:21] <t4nk884> through php how can we do this
[07:21] <pzich> Are you aware of how to run linux commands from PHP?
[07:21] <t4nk884> this is what i want to know...please help
[07:22] <pzich> this is #ffmpeg, for help with ffmpeg, if you want help with PHP, try ##php, if you want help with linux, try ##Linux
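For the ffmpeg half of t4nk884's question, thumbnail extraction is a one-liner. The input path, output path, and the 10-second offset are assumptions; from PHP the same line can be run with exec()/shell_exec(), escaping paths with escapeshellarg():

```shell
# Seek 10 seconds in and write one high-quality JPEG frame.
# Putting -ss before -i makes the seek fast (keyframe-accurate).
ffmpeg -ss 00:00:10 -i /path/to/video.mp4 -frames:v 1 -q:v 2 /path/to/thumb.jpg
```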
[07:59] <__julian> b5
[09:57] <Dutch> hey! Have a question, capturing my webcam and trying to split it into multiple mp4 blocks for playback in browser. I can only get it to work with .ts fragments but I cannot play these back with the html5 video tag. Any idea how to split it into usable mp4 files (and upload them with http://xxxx/partx.mp4)
[09:59] <BtbN> ffmpeg won't upload files for you.
[09:59] <BtbN> Just use the hls output/muxer
[10:01] <BtbN> If you want to livestream stuff, i'd recommend setting up an nginx-rtmp server on the remote site.
[10:03] <Dutch> I got ffmpeg to upload files for me using HTTP POST
[10:03] <Dutch> -segment_list_flags +live -f ssegment -segment_time 5 http://127.0.0.1/prive/mp4/player/save.php?frame=ffmpeg%%04d.ts
[10:03] <Dutch> uploads 5 seconds chunks
[10:03] <Dutch> save them like this: file_put_contents( ''.time(0).$_GET['frame'], file_get_contents("php://input") );
[10:04] <Dutch> but I need .mp4 segments not .ts is this possible?
[10:06] <Dutch> (Wrote my own http server for subscribing to streams etc using Kqueue on FreeBSD supports up to 10000 connections up to 10gb/s)
[10:07] <Dutch> So I only need chunked mp4 stream now, I would like to use ffmpeg.. if this is not possible I am writing my own application using AVLib
[10:12] <BtbN> Just use the hls muxer, and mount the output dir via WebDAV or something like that.
[10:13] <Dutch> i'll give it a try!
[10:13] <BtbN> Why do you want mp4 so bad? It's not exactly a good container for live stuff
[10:14] <BtbN> nginx-rtmp is also really worth a look if you want to livestream stuff. You can stream to it via rtmp, and it does the hls muxing for you, and/or streams it via rtmp.
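BtbN's HLS suggestion, sketched for a Linux webcam. The v4l2 device, encoder settings, segment length, and output path are assumptions; Dutch's actual capture source and OS may differ:

```shell
# Encode the camera to H.264 and let the hls muxer cut 5-second .ts
# segments plus a continuously rewritten playlist for browser players.
# -g 50 forces a keyframe often enough that every segment can start cleanly.
ffmpeg -f v4l2 -i /dev/video0 \
       -c:v libx264 -preset veryfast -g 50 \
       -f hls -hls_time 5 -hls_list_size 6 \
       /var/www/stream/live.m3u8
```

Serving the output directory over plain HTTP (or mounting it via WebDAV, as suggested below in the log) then gives an HTTP-only stream without ffmpeg doing any uploading itself.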
[10:18] <Dutch> Because I need a http stream (requirement)
[10:19] <BtbN> You can't livestream with pure http/html5
[10:19] <Dutch> I am pretty sure I can do it, with ~150ms delay
[10:19] <BtbN> It's stupid, but all current desktop browsers(Except Safari on OSX) don't support live content.
[10:19] <BtbN> You are still stuck with flash for that.
[10:20] <Dutch> Not really
[10:20] <BtbN> sadly, you are
[10:20] <BtbN> HLS is the protocol of choice for it though
[10:20] <Dutch> Proved the concept, only need a proper encoder now.
[10:20] <BtbN> And it has way more delay than 150ms, more like 3 times segment length
[10:20] <Dutch> But I think I need to use avlib directly for my needs
[10:21] <BtbN> So you think you know better than all major live streaming sites out there?
[10:21] <BtbN> And want to use mp4? I don't think so.
[10:21] <BtbN> mp4 has the inherent problem that you need to read the end of the file before you can play it.
[10:21] <Dutch> I don't care what other sites use, my customer needs it like this.
[10:21] <BtbN> so 150ms is outright impossible with it
[10:21] <BtbN> You can move that to the front, but that is a post-processing operation, so if you upload it immediately, that can't happen.
[10:22] <Dutch> that's why I use 100ms chunks
[10:22] <BtbN> 100ms chunks wth?
[10:22] <BtbN> That won't work.
[10:22] <Dutch> It does
[10:22] <BtbN> You need at least one I Frame per segment
[10:22] <Dutch> Already proved concept
[10:22] <Dutch> Only need an encoder that uploads in segments
[10:22] <BtbN> so your stream would be pretty much I-Frame only, and use a ton of bandwidth just because of that.
[10:23] <BtbN> If you want low-latency live streaming, rtmp it is.
[10:53] <blubee> hello guys I am trying to compile ffmpeg on mac os x but I am getting a make error
[10:53] <blubee> Undefined symbols for architecture x86_64: "_ff_filters_ssse3", referenced from: _put_8tap_smooth_64hv_ssse3 in vp9dsp_init.o
[10:53] <blubee> any ideas what's going on?
[11:03] <minimoo> does -isma still exist as an option in ffmpeg?
[11:10] <durandal11707> minimoo: nope
[11:23] <minimoo> durandal11707: i've got something with "-isma -hint -add " options that don't seem to be liked anymore - what did they do/become? :)
[11:26] <relaxed> minimoo: read "man mp4box" and "ffmpeg -h muxer=mp4"
[11:26] <minimoo> always you :)
[11:28] <relaxed> yes :)
[11:28] <minimoo> let me guess - this is basically, i've got some old code trying to add framehints so you can run mp4box as a separate exe, whereas ffmpeg has for a while allowed -movflags faststart to do the same thing
[11:29] <relaxed> correct
[11:30] <relaxed> look at rtphint too
[11:38] <Filarius> Hello, I'm sending raw video (RGB bitmaps) to ffmpeg via stdin. Is it possible to also send sound, and if so, how?
[11:45] <blubee> bump on the question why does make cause these errors: Undefined symbols for architecture x86_64: "_ff_filters_ssse3", referenced from: _put_8tap_smooth_64hv_ssse3 in libavcodec.a(vp9dsp_init.o)
[11:51] <relaxed> Filarius: you could use a fifo or file as input for the audio
[11:52] <relaxed> blubee: no idea, are you trying to compile git? is xcode up to date?
[11:54] <Filarius> I want to be sure it will be synced and also i`m on windows =\
[12:06] <blubee> relaxed: I am compiling from source downloaded from http://ffmpeg.org/releases/ffmpeg-2.5.3.tar.bz2
[12:34] <edoloughlin> I'm trying to concatenate raw RGBA files and encode them (and add a WAV), but ffmpeg stops after the first RGBA file. Should this work? https://gist.github.com/edoloughlin/0c9608979a6392bdcb5f
[12:46] <edoloughlin> Does anyone know how to achieve this using filter_complex? I'm starting from scratch with this and the docs are either too high level or too detailed for what I want :(
[13:16] <relaxed> edoloughlin: try, cat *rgba | ffmpeg -pix_fmt rgba -s 480x360 -r 25 -vcodec rawvideo -f rawvideo -i - ...
[13:22] <edoloughlin> relaxed: Thanks, but I'm running ffmpeg in a browser (via emscripten) so I can't do any external processing. The reason I have multiple RGBA files is because I'm trying to avoid allocating too much memory at once.
[14:29] <Harzilein> hi
[15:42] <t4nk293> hi
[15:42] <t4nk293> someone read me?
[15:43] <durandal_1707> yes?
[15:44] <t4nk293> hi
[15:44] <t4nk293> i have a problem with ffmpeg in centos 6.6
[15:45] <t4nk293> using the codec x264
[15:45] <t4nk293> could you help me?
[15:46] <durandal_1707> what is problem?
[15:48] <t4nk293> i need to paste it in paste bin
[15:48] <t4nk293> give me a sec
[15:52] <t4nk293> http://pastebin.com/7h8mgFFB
[15:52] <t4nk293> this is the problem I have
[15:53] <t4nk293> i config libx264 and make make install ldconfig
[15:53] <t4nk293> also ffmpeg
[15:54] <t4nk293> but it is not working when I use the encoder
[15:54] <t4nk293> as you can see in the link
[15:55] <t4nk293> help?
[15:57] <klaxa> looks more like a shell issue
[15:57] <c_14> You're missing a space
[15:57] <c_14> after `
[15:57] <c_14> eh, ´
[15:57] <c_14> wait
[15:57] <c_14> eh '´' != '`'
[15:58] <klaxa> if you are on bash, using $() is less ambiguous
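klaxa's point about `$()` being less ambiguous than backticks, in a runnable form. `echo` and `basename` stand in for real commands here, since the verifier can't run vncrec:

```shell
# Backticks and $() both substitute a command's output, but $() nests
# cleanly and can't be confused with the quote characters ' and ´,
# which is exactly the mixup in the pasted command.
inner=$(echo marcos.vnc)
outer=$(echo "movie: $(basename "$inner" .vnc)")
echo "$outer"   # → movie: marcos
```

The same nesting with backticks would need awkward escaping (`` \` ``), which is why modern shell style prefers `$()`.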
[15:58] <c_14> But you are also missing a space after the last quote for the shell expansion.
[16:05] <t4nk293> what do you mean?
[16:06] <c_14> Your shell expansion (probably) isn't a shell expansion, you're also missing a space.
[16:06] <c_14> Try: `ffmpeg -f rawvideo `vncrec -ffinfo -movie marcos.vnc` -i -vcodec libx264 -crf 0 -present ultrafast marcos.avi'
[16:06] <t4nk293> http://pastebin.com/wT2CjztY
[16:07] <t4nk293> try it now
[16:07] <t4nk293> do i have the same problem?
[16:08] <c_14> That's the same problem. Try taking the command I just pasted and using that.
[16:13] <t4nk293> http://pastebin.com/Y5iqwynJ
[16:13] <t4nk293> similar error
[16:13] <t4nk293> present
[16:14] <t4nk293> do you know why?
[16:16] <klaxa> it's called preset not present
[16:18] <t4nk293> thanks but
[16:18] <t4nk293> http://pastebin.com/ueyAjmXk
[16:18] <t4nk293> -vcodec: No such file or directory
[16:19] <t4nk293> why this problem?
[16:23] <t4nk293> i install the codec but I always get the same problem
[16:26] <c_14> Because you never specified an input file
[16:27] <c_14> Is the output of that vncrec command supposed to be an input file?
[16:27] <c_14> If yes, then the -i needs to be in front of that.
[16:35] <t4nk293> http://pastebin.com/rhrkYyhj
[16:35] <t4nk293> better
[16:35] <t4nk293> i got this now
[16:36] <t4nk293> created a file marcos.avi
[16:36] <t4nk293> 1/4 size video before
[16:36] <t4nk293> but there are some errors
[16:39] <t4nk293> one file is 201841 marcos.vnc and the other one is 5682 marcos.avi
[16:39] <t4nk293> the output file is empty nothing was encoded
[16:43] <t4nk293> why?
[16:54] <dusted> ffmpeg -i myfile.tga test.mp4 #This works but: cat myfile.tga | ffmpeg -f image2 -i - test.mp4 #Gives me: pipe:: No such file or directory and: cat | ffmpeg -i - -f image2 test.mp4 #Gives me: pipe:: Invalid data found when processing input
[16:55] <dusted> i'm an idiot, sorry
[16:55] <dusted> image2pipe is the answer
[16:56] <dusted> but it is the wrong answer, I wonder if it can only read tga's from disk but not from pipe
[16:59] <klaxa> cat myfile.tga | ffmpeg -f image2pipe -i - test.mp4 ?
[16:59] <klaxa> is that what you are doing?
[17:00] <klaxa> alternatively: cat myfile.tga | ffmpeg -f image2pipe -i pipe:0 test.mp4 i think
[17:00] <klaxa> maybe look up pipe addressing again, i might be wrong
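The pipe form under discussion, written out once. Frame names and the frame rate are assumptions:

```shell
# -f image2pipe reads a stream of concatenated images from stdin;
# "-i -" and "-i pipe:0" are equivalent spellings of "read standard input".
cat frame_*.tga | ffmpeg -f image2pipe -r 25 -i - -c:v libx264 out.mp4
```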
[17:01] <dusted> klaxa, hmm, it works fine if I do it with jpeg files and jpeg_pipe as format
[17:01] <dusted> so i'm guessing there's something about tga it does not like
[17:02] <klaxa> maybe tga does not count as image2
[17:02] <dusted> klaxa, maybe, I've grepped the format list for tga and targa, nothing comes up, but some module must be eating it, since it works fine when doing ffmpeg -i filename.tga
[17:08] <t4nk293> klaxa why i have this problem http://pastebin.com/rhrkYyhj
[17:08] <dusted> hmm, I have control over the program outputting the data, I chose tga because it is very simple to write, but maybe I should look into streaming raw pixels out instead, if there's a way for me to specify width/height and pixel format to ffmpeg
[17:09] <klaxa> t4nk293: i don't know, i don't know anything about vncrec or what it produces
[17:24] <dusted> it is _WAY_ cooler to just popen ffmpeg and fwrite the pixels :D
[17:25] <kepstin-laptop> yeah, i wrote an app that does that; the only problem is that if you're just writing raw video frames, you can't do per-frame pts :)
[17:25] <dusted> kepstin-laptop, "pts" ?
[17:26] <kepstin-laptop> presentation timestamps, for variable framerate
[17:26] <dusted> kepstin-laptop, aah
[17:26] <dusted> My app is fixed framerate so that's okay for me, now I wonder if I can somehow get it to eat the audio as well
[17:26] <kepstin-laptop> I mean I could always do something like write a timestamps file alongside and use mkvmerge to apply it in a post-processing step.
[17:27] <dusted> but the audio is coming async from a callback function so I don't know about that
[17:27] <kepstin-laptop> you'd want to open two separate pipes to the ffmpeg process on different fds to feed in the audio and video separately
[17:27] <kepstin-laptop> it starts getting complicated, you'll have to do the fork yourself rather than just use popen :)
[17:28] <dusted> hmm, yes
[17:28] <dusted> I'll just merge :p
[17:43] <t4nk293> ok thanks klaxa, but at least that the normal procedure
[17:56] <bastiapple> Hey! o/ I need your help... I've been trying for hours to install ffmpeg on my debian-server, but every time I try to start the "configure.sh" it crashes and throws this config.log: http://pastebin.com/hG8ae7by
[17:57] <JEEB> there's no configure.sh, the configure script is called "configure" :)
[17:58] <JEEB> and the error is rather simple :P gcc not found
[17:58] <JEEB> do you want to use some other compiler?
[17:59] <bastiapple> ehhmm ok, then only "configure". :P
[18:00] <bastiapple> how could I use another compiler? :o or is there a way to get gcc?
[18:00] <__jack__> bastiapple: just aptitude install gcc
[18:00] <__jack__> or aptitude install build-essential (better)
[18:01] <__jack__> that will fetch you almost everything you need (except specific libs)
[18:01] <JEEB> and yasm
[18:01] <JEEB> since you most probably want the hand-written optimizations
[18:01] <JEEB> so for a basic build, yasm and build-essential
[18:02] <bastiapple> if this works... then fml. I've also tried to install EVERY SINGLE libav library by hand.
[18:02] <bastiapple> I'll try it now
[18:03] <JEEB> you don't want to replace your system libavcodec/-format etc with FFmpeg though, since most probably they are not API or ABI compatible
[18:03] <__jack__> you should remove all libav-* from your system (aptitude purge blabla), or not do
[18:04] <__jack__> make install after building*
[18:04] <__jack__> it will mess things up
[18:04] <JEEB> or just use a custom prefix if you want to have libraries installed
[18:04] <bastiapple> I already reinstalled debian after my failing. :D
[18:04] <JEEB> if you just need ffmpeg the cli app then you just do static as usual
[18:04] <JEEB> as in, just ./configure without any params
[18:05] <bastiapple> i want to install obs to, which need the shared-libs.
[18:05] <bastiapple> *too
[18:06] <JEEB> then just make sure you're not overwriting the prefix where package manager is handling things
[18:06] <JEEB> which should be OK'ish by default because the default prefix in configure is /usr/local while packaged things generally go to /usr
[18:23] <bastiapple> it worked. thaaaaank you! :3 \o/
[18:38] <dusted> it'd be really neat if libaw would burn and die, it's always been sucky for me and debian/ubuntu silently installs it instead of ffmpeg when you apt-get (or at least they used to)
[18:38] <dusted> *libav
[18:39] <Harzilein> dusted: i agree
[18:40] <Meow_meow_meow> sup guys
[18:40] <Meow_meow_meow> [subtitles @ 0x81443f0] Option 'stream_index' not found
[18:40] <Meow_meow_meow> wtf
[18:41] <Harzilein> dusted: it's relatively easy to install ffmpeg-dmo in parallel though, at least if you don't insist on development packages
[18:41] <dusted> Harzilein, I have usually just compiled it by hand and done make install :P
[18:42] <dusted> on archlinux I don't have that problem as they actually provide ffmpeg
[18:42] <Harzilein> dusted: it'd be so nice to have parallel-installable development packages though
[18:42] <Harzilein> dusted: yeah, same with freebsd ports, which is my "other" platform :)
[18:43] <dusted> :)
[18:48] <Meow_meow_meow> oh yes, forgot that 2.* masked in gentoo
[18:48] <c_14> You can probably just unmask that, it's unlikely to break anything. Or you can compile a separate local copy, or use a static build.
[18:50] <Meow_meow_meow> also there is any other way to select subs stream?
[18:51] <c_14> Not with the subtitles filter. Bar muxing a new version of the source file without all other subtitle tracks, or extracting the subtitle track and using that in the subtitles filter.
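c_14's extract-then-filter workaround as a sketch. The stream specifier 0:s:1 (second subtitle track) and the file names are example assumptions:

```shell
# Step 1: pull just the wanted subtitle track out of the mkv.
ffmpeg -i input.mkv -map 0:s:1 track.ass
# Step 2: burn that standalone file in with the subtitles filter,
# which takes a filename rather than a stream index.
ffmpeg -i input.mkv -vf subtitles=track.ass -c:a copy burned.mkv
```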
[19:03] <Meow_meow_meow> ffmpeg doesn't support embedded fonts?
[19:04] <c_14> It does
[19:05] <Meow_meow_meow> that's great
[19:05] <c_14> In modern versions with the subtitles filter and the file with the embedded fonts.
[19:05] <ChocolateArmpits> Why does amerge filter report about overlapping channels when I input 4 mono tracks ?
[19:06] <ChocolateArmpits> Or 2 stereo tracks for that matter
[19:07] <ChocolateArmpits> k
[19:08] <dusted> kepstin-laptop, how about mkfifo and using named pipes to get audio and video at the same time? :)
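A sketch of the named-pipe variant on a POSIX system (pixel format, geometry, and sample rates below are placeholder assumptions; on Windows the rough equivalent is a CreateNamedPipe-backed pipe, as mentioned later in the log):

```shell
# One ffmpeg process, two fifos: the producing app writes raw RGB video
# into one and raw PCM audio into the other, and ffmpeg muxes them.
mkfifo /tmp/video.pipe /tmp/audio.pipe
ffmpeg -f rawvideo -pix_fmt rgb24 -s 640x480 -r 25 -i /tmp/video.pipe \
       -f s16le -ar 44100 -ac 2 -i /tmp/audio.pipe \
       -c:v libx264 -c:a aac out.mkv
```

The producer must open and write both pipes concurrently (e.g. from two threads), since ffmpeg blocks until both inputs open.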
[19:14] <ChocolateArmpits> c_14: http://pastebin.com/ihQGe2xc
[19:14] <ChocolateArmpits> I don't understand what happens on "Input channel layouts overlap" message
[19:14] <c_14> Right, anytime you have a channel twice in both inputs you'll get that message.
[19:15] <c_14> You have (assuming stereo) FL+FR + FL+FR
[19:15] <c_14> FL overlaps FL and FR overlaps FR
[19:15] <ChocolateArmpits> So will I end up with a 4.0 channel layout or will some channels get thrown out ?
[19:15] <c_14> You will get 4.0
[19:15] <ChocolateArmpits> cool, thanks
[19:15] <c_14> >On the other hand, if both input are in stereo, the output channels will be in the default order: a1, a2, b1, b2, and the channel layout will be arbitrarily set to 4.0, which may or may not be the expected value.
[19:17] <ChocolateArmpits> Well I use the pan filter after that to downmix those four or more channels into 2, so it's expected
[19:24] <Filarius> hello, I'm sending raw video (RGB) to ffmpeg via stdin, how can I also send sound synced with the video? (it's on Windows btw)
[19:29] <Meow_meow_meow> oh, audio desync with second audio track ;_;
[19:31] <Harzilein> Filarius: do you expect the audio duration (by specified sampling rate) to not match up w/ the video duration (by specified framerate)?
[19:32] <Harzilein> Filarius: i.e. with 50 fps and 44100 samples/s, the muxer should just pack 882 samples for every frame
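That per-frame sample count is just the ratio of the two rates; at these particular rates it divides exactly:

```shell
# samples per video frame = audio sample rate / video frame rate
rate=44100
fps=50
samples_per_frame=$((rate / fps))
echo "$samples_per_frame"   # → 882
```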
[19:33] <Filarius> it must be live encoding. I have an image generator and found out how to use ffmpeg to push (with encoding) the image sequence to an rtmp streaming server
[19:34] <Filarius> and I wondered whether it is possible to add music here
[19:34] <Filarius> right now I just sending raw RGB bitmaps
[19:35] <Harzilein> Filarius: do you need to know ffmpeg -i foo -i bar -map 0:v:0 -map 1:a:0?
[19:37] <Filarius> well... I have not used that before, but I've read about similar options while searching for other stuff. "foo" and "bar" are files? I'm on Windows (just to be sure).
[19:39] <Harzilein> yes, you can specify multiple inputs and reference them in map options, the first value indicates which input you are referencing, zero based, by order of appearance on the command line
[19:40] <Harzilein> so foo would be the video, and bar would be the audio. a video would work too for bar, and a video with audio would work for foo as well, as we carefully select the first _video_ stream from the former and the first _audio_ stream from the latter
[19:41] <llogan> Filarius: http://ffmpeg.org/ffmpeg.html#Advanced-options
[19:41] <llogan> https://trac.ffmpeg.org/wiki/How%20to%20use%20-map%20option
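Put together for Filarius' case, the shape would be roughly as follows. The generator name, geometry, rates, and the rtmp URL are all placeholders:

```shell
# Raw RGB frames arrive on stdin (input 0); a music file is input 1.
# -map picks the video stream from input 0 and the audio from input 1.
generator.exe | ffmpeg -f rawvideo -pix_fmt rgb24 -s 1280x720 -r 30 -i - \
       -i music.mp3 -map 0:v:0 -map 1:a:0 \
       -c:v libx264 -preset veryfast -c:a aac \
       -f flv rtmp://server/live/stream
```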
[19:43] <Filarius> No way to send both raw video and audio via stdin? I thought I could use a muxer and then send both, but I have never coded this stuff; I'm just a little kid at coding.
[19:46] <kepstin-laptop> Filarius: you can send both in a single stream, but the issue is that you'd have to use a muxer to some format that ffmpeg can demux and which supports raw video
[19:46] <kepstin-laptop> and the easiest way to do that... would be to link ffmpeg into your app so you could use a ffmpeg muxer. which kind of defeats the point :)
[19:47] <Meow_meow_meow> ffmpeg with -map will utilize delay setting from mkv?
[19:47] <Filarius> I mostly write in Delphi, which doesn't have the latest ffmpeg headers or many examples to learn from :)
[19:48] <Meow_meow_meow> Filarius: http://en.wikipedia.org/wiki/Named_pipe#In_Windows
[19:48] <Filarius> oh, right, named one
[19:51] <Filarius> I found a nice module that makes working with anonymous pipes easy; all these examples of working with pipes on the internet look like black magic where you just copy the code and hope it works
[19:52] <Filarius> I will try it, THANKS
[19:59] <Meow_meow_meow> looks like ffmpeg ignores the audio delay setting from mkv ._.
[20:02] <Meow_meow_meow> maybe there is other way than map to switch audio stream?
[20:49] <Mista-D> Can't get video to go as track 0 into output. http://pastebin.com/bd0SpZyz
[20:53] <c_14> Try explicitly mapping the output pad from the filter_complex
[20:54] <Mista-D> c_14, yeah will do.
[22:05] <thekrzos> Hello, I can't reload text by drawtext using flag "reload". I've got error "Key 'reload' not found." Can you help me?
[22:06] <thekrzos> This is my log: http://pastebin.com/S0XGTvDP
[22:34] <llogan> thekrzos: that's an old version. is reload listed under drawtext in "man ffmpeg-filters"?
[22:35] <llogan> actually it doesn't matter. just use a recent build
[22:36] <thekrzos> Can you update it in repo? I've got only this version and i can't compile new ffmpeg
[22:37] <llogan> why not?
[22:42] <thekrzos> Okay, I've reinstalled my testing VPS with Ubuntu 14.04. Can I change the interval at which the file is reloaded? FFmpeg closes when I update the file...
[22:43] <thekrzos> The text file '/tmp/data.txt' could not be read or is empty
[22:43] <thekrzos> Failed to inject frame into filter network: Invalid argument
[22:43] <thekrzos> Sorry for my bad english, i'm from Poland ;)
[22:45] <llogan> it is reloaded before each frame. i do not believe you can change that interval.
[22:45] <llogan> how are you updating the file?
[22:46] <thekrzos> I'm testing by this bash script: http://wklej.org/id/1630443/
[22:49] <MadTBone> is it possible to adjust PTS without re-encoding (using -c:v copy -c:a copy)? I know that using a filtergraph won't work while copying.
[22:51] <c_14> thekrzos: try writing to a new file and using mv to overwrite the old one
[22:52] <techtopia> hey guys
[22:52] <llogan> ...since the docs say, "Be sure to update it atomically, or it may be read partially, or even fail."
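The write-then-rename pattern c_14 describes, as a sketch; the temp path is an assumption, and the trick works because rename is atomic within one filesystem:

```shell
# drawtext re-reads /tmp/data.txt before each frame; writing to a temp
# file first and renaming guarantees the filter never observes a
# partially written (or momentarily empty) file.
printf '%s\n' "new overlay text" > /tmp/data.txt.tmp
mv /tmp/data.txt.tmp /tmp/data.txt
```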
[22:52] <techtopia> im encoding h264 video, that is 29.970 fps to x264 with ffmpeg
[22:52] <techtopia> the end result is really jerky playback
[22:53] <techtopia> anyone know why? it's not present in the source
[22:54] <techtopia> http://pastebin.com/tzihrQRN
[22:55] <llogan> techtopia: where is the rest?
[22:56] <techtopia> i don't have the console output anymore unless it's logged somewhere
[22:56] <techtopia> would take about 2 hours to get again
[22:56] <techtopia> i can paste the media info
[22:56] <llogan> you can't re-run it with the -t option?
[22:56] <techtopia> what will -t do ?
[22:56] <techtopia> oh just encode a few seconds of it
[22:56] <techtopia> lemme try
[22:59] <techtopia> http://pastebin.com/SFe3igze
[22:59] <techtopia> console output
[23:00] <techtopia> bah missed the top
[23:00] <techtopia> http://pastebin.com/jhp5nkMR
[23:00] <techtopia> full output
[23:03] <techtopia> any ideas logan
[23:03] <llogan> thekrzos: if you want the date of real-time encoding you can use "text=%{localtime}"
[23:04] <llogan> techtopia: where's the command?
[23:05] <techtopia> it's part of a script with echo off but the command is
[23:05] <techtopia> ffmpeg -y -i file.m3u8 -sws_flags spline -sn -vf crop=1276:716:2:2 -s 1280x720 -c:v libx264 -preset slow -profile:v high -level 4.1 -crf 23 -r 29.970 -x264opts colormatrix=bt709 -acodec copy webripoutput.mkv
[23:06] <llogan> why are you setting 29.970?
[23:07] <techtopia> because thats the input framerate
[23:07] <techtopia> didn't want it to change
[23:07] <llogan> ffmpeg will use the same frame rate
[23:07] <llogan> unless you use filters or options to modify it
[23:07] <llogan> and the input is probably 30000/1001 and not 29.970
[23:08] <llogan> looks like you forgot some of the console output
[23:08] <techtopia> ok i will do a quick test without that
[23:09] <thekrzos> llogan: c_14: very thanks for help ;)
[23:10] <llogan> techtopia: try with a simple command and then add until you find the culprit. remove -sws_flags spline -vf crop=1276:716:2:2 -s 1280x720 -r 29.970 -x264opts colormatrix=bt709
[23:13] <techtopia> ok
[23:18] <Seb_Sob> Hey everyone, i have a video in portrait mode 720x1280. Is it possible to export it as landscape 1280x720, and add a black background left and right to fill up the space?
[23:18] <Seb_Sob> because now it just rotates
[23:19] <techtopia> -transpose will rotate it
[23:20] <Seb_Sob> so -vf transpose=1
[23:20] <Seb_Sob> ok thanks
[23:20] <techtopia> but there shouldn't be any black borders if you go from portrait to landscape
[23:20] <techtopia> only the other way
[23:21] <Seb_Sob> yeah its the other way :) my bad
[23:22] <Seb_Sob> well actually no
[23:22] <Seb_Sob> thats the thing
[23:22] <Seb_Sob> i dont want it to rotate
[23:22] <Seb_Sob> i want it to 'fit'
[23:23] <Seb_Sob> fit the portrait in landscape
[23:23] <Seb_Sob> if thats possible?
[23:25] <c_14> You have a video in portrait and you want to add black bars until it's in landscape?
[23:25] <Seb_Sob> yeah
[23:25] <c_14> Just use the pad filter
[23:26] <Seb_Sob> i want the video still to be on the correct orientation, but the output should always be landscape
[23:26] <c_14> Just use the pad filter
[23:26] <Seb_Sob> pad filter, ill check that
[23:30] <bAStek2> guys i am total noob on ffmpeg stuff and have some questions, hopefully someone will be able to answer them:
[23:30] <bAStek2> 1. can i use ffmpeg on iOS and Android?
[23:30] <bAStek2> 2. i need to do something like this: i have a video and i need to shrink it and duplicate it so i have the same video twice on screen (left side and right side), then save it as one file for someone to watch. Is it possible with ffmpeg?
[23:31] <c_14> 2: Yes. 1: not my area of expertise
[23:31] <bAStek2> thx c_14
[23:33] <bAStek2> anyone know if i can use ffmpeg on iOS and Android mobiles?
[23:34] <Seb_Sob> Bas
[23:34] <Seb_Sob> im using it on Android currently
[23:34] <Seb_Sob> check out this lib: http://androidwarzone.blogspot.be/2011/12/ffmpeg4android.html
[23:34] <Seb_Sob> not sure if official tho
[23:35] <bAStek2> thanks for the answer Seb_Sob
[23:35] <bAStek2> checking your link now
[23:35] <Seb_Sob> its for Android
[23:35] <Seb_Sob> not all functionalities are supported, but what you want to achieve is possible i think
[23:44] <Seb_Sob> c_14
[23:44] <Seb_Sob> any idea how i can do the filter pad?
[23:45] <c_14> https://ffmpeg.org/ffmpeg-filters.html#pad
[23:46] <c_14> Pick an aspect ratio, ih*ar:iw*ar:(oh-ih)/2:(ow-iw)/2
[23:46] <c_14> where ar is the aspect ratio
[23:48] <Seb_Sob> ok thanks
[23:50] <Seb_Sob> wait a sec
[23:50] <Seb_Sob> my input is landscape
[23:50] <Seb_Sob> but it is rotated (mobile meta tag)
[23:50] <c_14> Then put it through a transpose=1 filter first
[23:50] <Seb_Sob> so what i have to do is rotate it and center it
[23:51] <c_14> (I think it's 1, check the docs)
[23:51] <Seb_Sob> yeah 1 is
[23:51] <c_14> Also, that ar bit isn't quite right
[23:51] <c_14> You'll have to find the correct number for both of those...
[23:54] <Seb_Sob> okay but what is the input w and h then?
[23:54] <Seb_Sob> is it the one after the transpose or original?
[23:54] <c_14> After transpose
[23:56] <Seb_Sob> this might do it then transpose=1,pad=360:640:140:0:black
[23:59] <llogan> also make sure to strip the rotate metadata in the video stream: -metadata:s:v rotate=""
[23:59] <llogan> or else it may be preserved
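The pieces of this exchange combined into one hedged sketch; the 1280x720 target, transpose direction, and file names are assumptions that depend on the actual rotation tag:

```shell
# Rotate 90° clockwise, pad onto a landscape canvas with centred black
# bars, and clear the rotate tag so players don't rotate a second time.
ffmpeg -i portrait.mp4 \
       -vf "transpose=1,pad=1280:720:(ow-iw)/2:(oh-ih)/2" \
       -metadata:s:v rotate="" -c:a copy landscape.mp4
```

In pad's argument order (w:h:x:y), ow/oh are the padded output size and iw/ih the input size after transpose, so the two /2 expressions centre the video.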
[00:00] --- Tue Feb 10 2015