[Ffmpeg-devel-irc] ffmpeg.log.20140206

burek burek021 at gmail.com
Fri Feb 7 02:05:01 CET 2014


[00:23] <pw_> Hello, does anybody know if trac.ffmpeg.org is permanently down?
[00:24] <pw_> a lot of the google search results link to it
[00:27] <tsjiller> pw_: probably not, it hasn't been down for that long
[00:29] <beastd> pw_: will come up again. sorry for the inconvenience
[00:34] <pw_> ts,beast: thanks
[00:37] <dbro> eyo! Anyone know how to set the MPEG-4 Audio Object Type ID on an AVCodecContext?
[00:37] <dbro> I'm muxing audio/video with the libav* libraries to "flv" output for rtmp. However, the output file has an invalid Audio Object Type of 0
[00:38] <dbro> maybe AVCodecContext::profile is what I want...
[02:09] <SenseiV183> JEEB, Encoding to utvideo, my 2GB source turned into 20GB.  Are there quality settings to get the output under 10GB for a 2GB input file?
[02:12] <qbit_> join #ffmpeg-devel
[02:53] <Logicgate> hey guys
[02:54] <Logicgate> how would I resize a video that's 16:9 to be 1:1
[02:54] <Logicgate> I want to squish it, and not have black bars
[02:54] <Logicgate> I want the video to fill the 480x480 square.
[02:54] <Logicgate> Cropping does not seem to work.
[02:55] <Logicgate> whenever I crop a 16:9 video to 1:1 ratio, there's 2 black bars top and bottom.
[03:07] <DeadSix27> Logicgate, so you want to suish?.. eg deformed?
[03:08] <DeadSix27> +q
[03:09] <DeadSix27> well, if so, try -> ffmpeg -i [input] -aspect 1:1 [output]
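
For reference, -aspect only rewrites the display aspect ratio flag; actually squishing a 16:9 picture into a 480x480 square means rescaling the frame. A minimal sketch, with hypothetical file names:

    # input.mp4/output.mp4 are placeholders; scale ignores the source AR, setdar keeps players from re-stretching
    ffmpeg -i input.mp4 -vf "scale=480:480,setdar=1:1" -c:a copy output.mp4
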
[03:49] <x_> I want to pipe ffmpeg to another computer to do the bulk of the work, but I only have 100Mbit of bandwidth. What format do you think would be best in order to use the least amount of cpu?
[03:50] <x_> would libx264 -crf 0 with ultrafast be the best bet?
[03:53] <klaxa> least amount of cpu would be raw
[03:55] <relaxed> x_: what are you actually doing?
[03:55] <znf> can you fit 720p video on 100mbit in raw?
[03:55] <relaxed> I doubt it
[03:57] <znf> nope, you can't
[03:57] <znf> 276480.0kbits/s
[03:58] <Hello71> you could do it in gbit
[03:58] <Hello71> but you probably increase latency with the network decoding
[03:58] Action: znf wonders how much 1080p would be
[03:58] <Hello71> and possibly cpu if that's in software
[03:59] <x_> Yeah raw is simply too big.
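
For the record, the 276480 kbit/s figure above is just the raw yuv420p arithmetic for 720p, assuming 25 fps:

    1280 x 720 pixels x 1.5 bytes/pixel x 8 bits x 25 fps = 276,480,000 bit/s ≈ 276 Mbit/s

so raw 720p needs almost three times a 100 Mbit link, and raw 1080p25 lands around 622 Mbit/s, which would still fit on gigabit as Hello71 says.
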
[05:04] <aji> $chan
[05:35] <Logicgate> DeadSix27, you're the man.
[05:35] <Logicgate> I've been looking for this forever
[05:35] <Logicgate> Thank you
[09:25] <SenseiV183> Problem with -vf to speed up or slow down video http://pastebin.com/5g9fNrRJ
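
For reference, the usual way to change playback speed is setpts for the video plus atempo for the audio; atempo only accepts factors between 0.5 and 2.0 per instance, so larger changes need chained instances. A sketch for 2x speed, with hypothetical file names:

    # double the speed: halve the video timestamps, double the audio tempo
    ffmpeg -i input.mp4 -filter_complex "[0:v]setpts=0.5*PTS[v];[0:a]atempo=2.0[a]" -map "[v]" -map "[a]" output.mp4
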
[09:29] <shaun__> Nevermind!  I found the typo.  I forgot a white space!
[09:45] <shaun__> God bless you developers and helpers.
[09:54] <Keestu> does ffmpeg depend on the yasm library ?
[09:55] <Keestu> i am working on building the latest ffmpeg on android.
[10:13] <relaxed> Keestu: yasm is an assembler for x86 and amd64
[10:25] <Keestu> relaxed,  thanks. ;)
[11:51] <the1_> Hello, i'm trying to record a livestream. The output video pauses at the start while the audio plays right away, causing v:a to go out of sync. I guess this is caused by the audio not waiting for a video keyframe
[11:51] <the1_> how do i make the audio wait for a keyframe? my command looks something like this:
[11:52] <the1_> ffmpeg -i udp://broadcast-source:2220 -c:a libfdk_aac -ar 32000 -ab 48k -s 400x224 -c:v libx264 -b:v 300k -flags +loop+mv4 -cmp 256 -partitions +parti4x4+partp8x8+partb8x8 -subq 7 -trellis 1 -refs 5 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -b:v 110k -maxrate 300k -bufsize 300k -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -aspect 16:9 -r 15 -g 45 -async 2 -f mpegts /tmp/me.ts
[12:15] <SirCmpwn> can I get the duration of a video stream with ffprobe?
[12:15] <SirCmpwn> in seconds?
[12:20] <SirCmpwn> answer: yes, but not for all media formats
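
For reference, a duration in seconds can be read with something like this (sketch, hypothetical input name; it only works where the container actually records a duration):

    ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 input.mp4
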
[12:55] <relaxed> the1_: which ffmpeg version are you using
[13:03] <the1_> ffmpeg -v shows ffmpeg version git-2013-12-24-acafbb4
[13:06] <spaam> x-mas build
[13:08] <d-fens_> hi, i have a problem with the "-shortest" argument not cutting the audio to the video length: http://pastebin.kde.org/pr9dvbhmn
[13:09] <d-fens_> it's always 20 sec long, should be ~15
[13:09] <d-fens_> i've thrown -shortest into the graph, but no good
[13:09] <the1_> spaam indeed
[13:11] <d-fens_> what did i do wrong there?
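
For reference, -shortest is an output option rather than something that belongs inside the filtergraph; a minimal sketch with hypothetical inputs that trims the audio to the video length (older builds need -strict experimental for the native aac encoder):

    ffmpeg -i video.mp4 -i audio.mp3 -c:v copy -c:a aac -shortest output.mp4
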
[13:24] <Keestu> while building ffmpeg, where can i find which options get selected by default? for example, network, protocols, devices.. etc.?
[13:35] <Keestu> never mind, i found it ;). it enables everything by default.
[14:03] <RenatoCRON> Hello. I'm trying to lower CPU usage, so I tried to pass -profile ultrafast, but ffmpeg says it doesn't have that profile.
[14:03] <RenatoCRON> http://pastebin.com/raw.php?i=1cNkKabt
[14:03] <RenatoCRON> [libx264 @ 0xa8bc540] Possible profiles: baseline main high high10 high422 high444
[14:03] <RenatoCRON> so, anybody known why ultrafast is not available?
[14:03] <RenatoCRON> do I need to compile ffmpeg with some more flag?
[14:04] Action: RenatoCRON used http://pastebin.com/raw.php?i=mPtTL6Na
[14:04] <JEEB> profiles are feature levels in AVC/H.264
[14:04] <JEEB> you mean presets
[14:04] <JEEB> which are x264's speed vs compression defaults
[14:04] <JEEB> (as in, setting a preset sets defaults)
[14:06] <RenatoCRON> JEEB, oh!
[14:08] <RenatoCRON> JEEB, great! i set -profile main and -preset ultrafast and now my test (not the stream) takes half the time!
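
In other words, something along these lines; a sketch with hypothetical file names, where -crf 23 is just the libx264 default rate control and not part of the discussion above:

    ffmpeg -i input.mp4 -c:v libx264 -preset ultrafast -crf 23 -c:a copy output.mp4
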
[15:46] <SirCmpwn> how do I set a subtitle stream to be the default in an mkv file?
[15:46] <SirCmpwn> I'm using ffmpeg -y -i input.mkv -i input.ass -vcodec copy -acodec copy -map 0:a:0 -map 0:v:0 -map 1:s:0 output.mkv
[15:56] <SirCmpwn> I've tried -metadata:s:s:0 FlagDefault=1 to no avail
[15:56] <SirCmpwn> also tried FlagForced
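
A hedged note: ffmpeg's -disposition option is meant for exactly this, so assuming a build that has it, the command above would become something like:

    ffmpeg -y -i input.mkv -i input.ass -map 0:v:0 -map 0:a:0 -map 1:s:0 -c copy -disposition:s:0 default output.mkv
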
[16:12] <bacon1989> Hi, I was wondering how the parameter -q:v works, is it bigger values are better, or bigger values are worse?
[16:13] <bacon1989> and what's the scale?
[16:13] <bacon1989> is it 0 --> 10, or 0 --> 100?
[16:13] <bacon1989> I can't seem to find the documentation on it
[16:13] <JEEB> scale depends on the encoder
[16:13] <JEEB> and you only want to use constant quantizer with video formats that don't have anything better
[16:13] <JEEB> for example with libx264 there's CRF
[16:14] <JEEB> which is what should be used, and constant quantizer should only be used for development purposes by developers
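
To spell that out: with libx264 the CRF scale runs roughly 18-28 with lower meaning better quality, while -q:v maps onto each encoder's own quantizer scale, e.g. roughly 2-31 for MJPEG where lower is also better. Two sketches with hypothetical file names:

    ffmpeg -i input.mov -c:v libx264 -crf 20 output.mp4     # rate-controlled, the preferred way for H.264
    ffmpeg -i input.mov -c:v mjpeg -q:v 3 output.avi        # fixed quantizer, lower value = higher quality
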
[16:14] <BtbN> Does ffmpeg support external encoders which are not a part of ffmpeg itself?
[16:14] <JEEB> yes
[16:15] <JEEB> libx264, libxvid, libtheora, libvorbis to mention a few
[16:15] <BtbN> but those have special code in place in ffmpeg
[16:15] <JEEB> yes
[16:15] <BtbN> i mean something like a general interface
[16:15] <JEEB> no, there is no general interface
[16:15] <BtbN> hm, so i'd have to somehow hack it into ffmpeg itself...
[16:16] <JEEB> in most cases that shouldn't really be too hard, and there should be multiple examples for it already in the libavcodec code base
[16:16] <JEEB> (libutvideo and libfdk-aac being some of the latest ones you can take a look at)
[16:17] <BtbN> Yes, but it forces me to always use my patched version of it
[16:17] <JEEB> unless you upstream the wrapper, of course
[16:17] <BtbN> I don't think it would be accepted, because it doesn't work without a license key from nvidia
[16:18] <JEEB> it would prevent testing yes, but not like the external encoders are getting tested much to begin with :P
[16:18] <JEEB> wouldn't say it wouldn't get accepted purely on that basis
[16:19] <BtbN> ne, it needs a license key put into the code to make it initialize at all. And nvidia won't like seeing that key freely available in some open source git
[16:19] <JEEB> can't you make it be read from a file or something?
[16:20] <BtbN> i think that would be possible, but would require some processing on the key
[16:21] <JEEB> well, then it means it's more or less possible :)
[16:22] <JEEB> also I wonder which API from nvidia this is :)
[16:22] <BtbN> nvenc
[16:22] <BtbN> their hardware h264 encoder
[16:22] <JEEB> yeah, but is it all of it or just some specific part of it that's locked?
[16:22] <BtbN> the api itself is public: https://developer.nvidia.com/nvidia-video-codec-sdk
[16:23] <JEEB> yeah
[16:23] <JEEB> that's why I was kind of surprised
[16:23] <BtbN> But the init function wants a license key
[16:23] <JEEB> ah
[16:23] <BtbN> it works without a license key, but only on some (very expensive) Quadro/GRID cards
[16:25] <JEEB> in any case, that sounds like something that could go into libavcodec
[16:26] <JEEB> also you'd have to put the license key in a "resource file" loaded up by the code in any case, since if you were to distro your (say, LGPL) binaries you'd have to release the source code to that libavcodec
[16:32] <BtbN> or just an environment variable
[16:32] <JEEB> or that, yes
[16:35] <JEEB> from #elsewhere "JEEB: the latest beta driver for Windows allows NVENC on GeForce cards, if that's what you want"
[16:35] <BtbN> Yes, but i can't imagine that this is intentional
[16:36] <BtbN> or is it mentioned in some changelog?
[16:36] <JEEB> well, that was an nvidia employee noting that to me. No idea of course if he has an idea of the PC GPU business's real motivations
[16:41] <JEEB> <nevcairiel> apparently they removed this license GUID BS with the 3.0 version of the SDK though
[16:41] <JEEB> (another person)
[16:42] <JEEB> so yeah... it seems like it's getting better, too?
[16:42] <BtbN> No, they didn't. SDK 3 still has the license stuff in it
[16:43] <BtbN> it only works with an empty/zero license key with the latest beta driver
[16:45] <JEEB> <nevcairiel> JEEB: the readme clearly specifies that the license is no longer required, and instead the driver checks for support, and if it works with an empty key, thats exactly what it does. Who cares which driver it works with if it works with one :P
[16:45] <JEEB> <nevcairiel> JEEB: even more so if it works with a recent one, and not some old silly version
[16:46] <BtbN> huh? I'm using the 3.0 SDK since it was released, and it did require a license key to work before the current beta driver.
[16:47] <BtbN> It doesn't require a license to work on Quadro and Tesla cards
[16:47] <BtbN> but for GeForce it did
[16:47] <JEEB> what you just said doesn't contradict the lines I posted :)
[16:48] <BtbN> I'm just confused that it says so in some readme, i can't find anything about that
[17:28] <bparker> JEEB: in my experience with proprietary h264 encoders, they make using libx264 a magical walk in the park
[17:29] <bparker> if I had the time I'd like to make a stupid simple C++ wrapper for libx264 unless a good one already exists
[17:29] <JEEB> oh I've heard of those stories with proprietary things ;)
[17:29] <bparker> that abstracts away the hard-to-understand stuff
[17:30] <JEEB> which is why people were kind of smiling when the low-latency API for nvidia's ASICs came out
[17:30] <JEEB> because the API was clearly designed after someone had taken a look at libx264's similar features
[17:33] <bparker> we thought about using nvenc in the beginning, but honestly when I looked at it it was even more annoying
[17:33] <JEEB> yeah, I've only heard about the certain part of the API and limited similarities
[17:34] <JEEB> haven't looked into that stuff at all in detail yet
[17:34] <bparker> and we thought that maybe in the future we would have products that don't have a quadro card
[17:34] <bparker> and x264 was plenty fast enough
[17:34] <bparker> license fee was reasonable
[17:34] <bparker> so we went with it
[17:35] <bparker> I still ended up making a wrapper class for it just for one application
[17:36] <bparker> but it's not generic enough for general release, plus it's closed source anyway :/
[17:36] <bparker> but you can do like new Encoder(framesize, bitrate, etc.) and then just encoder->encode(framedata) and that's it
[18:19] <pw_> Hi, I'm trying to write a program that would take h264+ac3 wrapped in ts and spit out the first I frame as an image.  I've compiled the examples in doc/examples and am currently tracing it.  The problem I have is that the input is being probed, and I guess my short snippet containing the I frame is not long enough for the auto-probing to recognize that one of the streams is h264. Can you give me some hints as to which field/struct
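
The question is cut off, but the knobs that control that probing are the probesize and max_analyze_duration fields of AVFormatContext (set before avformat_find_stream_info); on the command line the same thing is -probesize/-analyzeduration. A sketch for pulling the first decoded frame out of a short TS snippet, with hypothetical values and file names:

    ffmpeg -probesize 5000000 -analyzeduration 5000000 -i snippet.ts -frames:v 1 first.jpg
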
[18:25] <Olivia_> hi what's the difference between rtbufsize and buffer_size with a udp source ?
[18:40] <llogan> RenatoCRON: (four hours later...) you generally don't need to set a profile unless you're encoding for a limited device
[18:43] <ChocolateArmpits> llogan, but wouldn't setting the profile to main from high realistically increase rendering speed ?
[18:48] <RenatoCRON> llogan, i'm encoding segments of 20 sec videos, 352x240 at 4 FPS; in fact, the goal is to reduce CPU usage while encoding. after that there's an online player made in HTML5 to play these files.
[18:49] <RenatoCRON> after setting -preset to ultrafast, it went from 25% cpu to 5% tops
[18:49] <RenatoCRON> now 1 AWS c1.medium machine is working fine (i anaylysing it) with 40 cameras
[18:50] <RenatoCRON> analysing*
[18:50] <pw_> renato: what's the latency of that encoding? just curious
[18:51] <RenatoCRON> pw_, from the input ?
[18:51] <RenatoCRON> I guess I didn't understand your question very well.
[18:52] <ChocolateArmpits> I think he's asking what time delay there is after the encoding
[18:52] <pw_> input of the encoder to the output.. you said you have it hooked up to cameras... so if somebody were to walk by the camera, when does the image of that person show up
[18:52] <RenatoCRON> pw_, I guess the input profile is 'main' or something like that, because it's an RTSP camera
[18:53] <RenatoCRON> pw_, it's almost realtime, but I can't test this... maybe if I connect using 3g and watch it in VLC and move my arms !
[18:54] <pw_> renatocron, just wondering.
[18:54] <RenatoCRON> it's much faster to use -c:v copy, but I need to change the framerate and segment, so I need to re-encode to match keyframes
[18:55] <RenatoCRON> storage is cheaper than CPU, so my team and I were wondering if it's possible to save RAW segmented,
[18:55] <RenatoCRON> but to play these files, the player may need to go 'back in time' through the files until it sees a keyframe
[18:56] Action: RenatoCRON can encode to a cpu-cheap format too, if you guys know one, I can do it.
[18:56] <RenatoCRON> maybe mpeg2video (i guess that's it), i'm very new as an FFMPEG 'dev' user
[19:01] <RenatoCRON> pw_, but just for info, i'm using a lot of external open cameras from second life for testing also! it's a little crazy universe.
[19:02] <pw_> renatocron, ah.  interesting.
[19:02] <RenatoCRON> I found this on http://h264.wordpress.com/live/
[19:03] <RenatoCRON> not all are online
[19:03] <RenatoCRON> and there are some that return invalid input, but .mov is working
[19:03] <RenatoCRON> I don't have the time/priority to find out why
[19:04] <RenatoCRON> http://i.imgur.com/dMUuSDR.png < all these cameras are working on the same machine. it's a dual Intel(R) Xeon(R) CPU E5506  @ 2.13GHz
[19:04] <RenatoCRON> uptime shows load average: 4.36, 3.55, 2.98
[19:05] <RenatoCRON> i dunno if this is much (I suppose it is) but the machine continues to be very fast (at least it's responding)
[19:06] <BtbN> bparker, huh? The nvenc api is really simple, i already used it for a few fully working encoders
[19:06] <BtbN> It doesn't have as many features as x264, but it's not complicated or annoying to use
[19:06] <Olivia_> what's the diff between rtbufsize and buffer_size for udp ?
[19:07] <BtbN> libva intel hw encoding or OpenMAX encoding is horrible, but nvenc is really great
[19:07] Action: RenatoCRON hmm, looks like nvenc is an h264 encoder!
[19:09] <BtbN> is the github ffmpeg repo up to date?
[19:16] <jnvsor> So what's up with the trac servers?
[20:15] <ChocolateArmpits> Does Yadiff when set to Auto recognize progressive input and hopefully skip deinterlacing ?
[20:16] <ChocolateArmpits> yadif*
[20:22] <ChocolateArmpits> Ok, A few google searches are suggesting it will skip deinterlacing
[20:28] <ChocolateArmpits> But what does "only deinterlace frames marked as interlaced " mean here ? Does it suggest a source input with different frame properties every other scene ?
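
It refers to the interlaced flag each decoded frame carries, not to per-scene source properties: with deint=interlaced, yadif deinterlaces only frames flagged as interlaced and passes the rest through untouched. A sketch, with a hypothetical input:

    ffmpeg -i input.ts -vf yadif=mode=send_frame:deint=interlaced -c:a copy output.mp4
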
[21:34] <pw_> is there a way to force ffmpeg to decode certain pids with a particular decoder?  The closest I found was something like -codec:a:0, but it seems to specify the stream index and not the pid
[22:59] <SirCmpwn> so are there any plans to fix the website
[22:59] <SirCmpwn> it's been a while
[23:18] <llogan> SirCmpwn: yes. should be back by Saturday.
[23:18] <SirCmpwn> excellent
[23:39] <SirCmpwn> can I convert a stero input to a mono output (encoding with libfdk_aac)
[23:40] <SirCmpwn> stereo*
[23:41] <JEEB> as usual, you can use -ac to set the amount of output channels
[23:41] <SirCmpwn> thanks, I just didn't know the option
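
i.e. something like this sketch, with hypothetical file names (assumes an ffmpeg built with libfdk_aac, as above):

    ffmpeg -i input.mp4 -vn -c:a libfdk_aac -ac 1 output.m4a
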
[00:00] --- Fri Feb  7 2014


More information about the Ffmpeg-devel-irc mailing list