[Ffmpeg-devel-irc] ffmpeg.log.20130620
burek
burek021 at gmail.com
Fri Jun 21 02:05:01 CEST 2013
[00:00] <vhann_> llogan: http://pastebin.com/ES4yvuiH
[00:01] <xeberdee> I tried compiling ffmpeg on ubuntu by the guide but it failed on libopus - I'm not sure what to do with the static build?
[00:01] <xeberdee> I just get install libavtools
[00:02] <llogan> which ubuntu version? works for me on 13.04
[00:03] <vhann_> llogan: Ah shit, let me try again with the video stream this time
[00:04] <vhann_> llogan: Same error: http://pastebin.com/CbRAEV72
[00:04] <xeberdee> llogan: Ubuntu 12.04.2 LTS
[00:04] <llogan> xeberdee: you're not the first person to experience the issue, but i couldn't duplicate it (yet). i'll try on 12.04
[00:06] <llogan> xeberdee: until then you can see static build instructions: http://askubuntu.com/a/270107/59378
[00:08] <llogan> xeberdee: can anything play "THE_DARK_NIGHT1-1.vob"?
[00:08] <llogan> oops... vhann_
[00:09] <vhann_> llogan: VLC plays it fine. Let me check mplayer and xine
[00:10] <llogan> the audio specifically
[00:11] <vhann_> llogan: xine and VLC play the file fine. I can switch audio tracks flawlessly in both
[00:11] <vulture> look at the stream details for it
[00:11] <vulture> and see what it says for the audio stream
[00:14] <vhann_> vulture: Me?
[00:15] <vulture> yes
[00:15] <vulture> see if vlc's stream info matches ffmpeg's ?
[00:16] <llogan> vhann_: can you create a sample and upload it somewhere? "dd if=THE_DARK_NIGHT1-1.vob of=output.vob bs=1024 count=10000"
[00:19] <vhann_> vulture: http://postimg.org/image/escpp4ogd/
[00:22] <vulture> so stream1 is the audio?
[00:23] <xeberdee> llogan: thanx for the help I got the static working.
[00:24] <llogan> xeberdee: i updated the guide so compiling should now also work on 12.04.
[00:24] <vulture> would be nice to see a more detailed stream list than w/e that app is spitting out
[00:25] <vulture> especially since the streams dont match up with what ffmpeg says
[00:36] <vhann_> vulture: llogan: I msg'ed you the output.vob file's URL
[00:37] <llogan> thanks
[00:40] <llogan> but i should have asked for a section with sound (using skip option in dd).
[00:41] <vhann_> llogan: Gimme a min, I'll send you the first 100Mb
[00:42] <elkng> what ? pirating ?
[00:45] <llogan> vhann_: 10 mb should be fine if you skip the warnings and assorted other crap.
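llogan's dd-with-skip suggestion might look like the sketch below; /dev/zero stands in for the real VOB, and the file names and offsets are placeholders, so the commands run self-contained:

```shell
# Stand-in for the real VOB: 100 MiB of zeros (substitute the actual file).
dd if=/dev/zero of=input.vob bs=1M count=100 2>/dev/null
# Cut a ~10 MB sample starting ~50 MiB in, past the leading menu/warning
# junk, so the sample is likely to contain real audio:
dd if=input.vob of=sample.vob bs=1024 skip=51200 count=10000 2>/dev/null
wc -c < sample.vob   # → 10240000
```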
[00:47] <vulture> Format : AC-3
[00:47] <vulture> Channel(s) : 6 channels
[00:47] <vulture> ffmpeg says: Stream #0:1[0x80]: Audio: ac3, 0 channels
[00:47] <vulture> so yeah ffmpeg fails
[00:49] <vulture> but anyway, maybe it doesnt matter, what's your goal with it anyway? to re-encode it? or just stream copy?
[00:50] <vhann_> vulture: I wanted to make a short derivative work to poke fun at a friend
[00:51] <vhann_> It's allowed in Canada as the clip is short enough
[00:51] <vhann_> s/is/would be/
[00:56] <vulture> different commandline shows 5.1, and channels=6, though I still get error
[00:56] <vulture> maybe you can specify a manual option to merge to 2 channels
[00:56] <vulture> idk
[01:02] <xeberdee> llogan: ffmpeg compiled fine on ubuntu 12.04 LTS now
[01:03] <xeberdee> thanx
[01:03] <llogan> no problem
[01:03] <vulture> vhann_: virtualdub has no problem opening/parsing/converting this vob
[01:03] <vulture> even with audio
[01:03] <vulture> so probably an ffmpeg issue idk
[01:03] <llogan> ffmpeg will recognize the correct number of channels (after the 0 channel junk) with -analyzeduration 10000000 -probesize 10000000
[01:03] <llogan> as input options of course
[01:04] <llogan> ...but it still ignores the other streams.
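As a concrete sketch of where those options go (file names and the output options are hypothetical; the command is echoed rather than executed since the sample file is not at hand): input options must appear before the -i they apply to.

```shell
# Echo the command to show option placement; remove "echo" to run it.
echo ffmpeg -analyzeduration 10000000 -probesize 10000000 \
     -i input.vob -map 0:1 -c:a copy audio.ac3
```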
[01:14] <vulture> just use virtualdub instead? :P
[01:14] <vhann_> vulture: ... you would make a terrible marketer :p
[01:15] <vulture> shrug, if it doesnt work out of box, try something else!
[01:16] <vulture> I wish I could get ffmpeg to work in my code but it breaks down occasionally =/
[01:17] <durandal_1707> normal way to report issue is to fill bug report
[01:24] <vulture> from the sheer number of open bug reports I'm guessing the normal way isnt very useful :D
[01:24] <durandal_1707> whatever, looks like you are not interested in helping project
[01:25] <vulture> I'm just being realistic
[01:25] <vulture> I've submitted dozens of bug reports for other projects as well, and some even from like 2001 are still unanswered :P
[01:27] <durandal_1707> vulture: you are not realistic, just extremely ignorant and lazy; you did not look at the list of closed bug reports
[01:32] <vulture> well I can put in a ticket after work and we'll see
[02:17] <Jan-> are you guys seriously trying to tell me that http://ffmpeg.org/libavcodec.html is the ENTIRE DOCUMENTATION to libavcodec?
[02:17] <Jan-> It's two sentences!
[02:17] <durandal_1707> no its not
[02:19] <vulture> Jan-: http://ffmpeg.org/doxygen/trunk/group__libavc.html
[02:20] <vulture> though google may be easier to navigate to where you want there
[02:20] <Jan-> Okay, that's reference, but it isn't so much documentation.
[02:20] <vulture> or actually it also has a search in the upper right
[02:20] <Jan-> is there a broad architectural overview anywhere as to how the thing is supposed to work, at a high level?
[02:20] <vulture> gl with that :D
[02:20] <vulture> at least they include sample programs now
[02:21] <vulture> in the -dev download, there's a doc/examples
[02:21] <Jan-> What I want to do is get from a file on disk to (presumably) a pointer to some audio samples.
[02:21] <Jan-> Presumably that's possible somehow.
[02:21] <Jan-> But nowhere is it actually written down what the basic steps are to do that.
[02:22] <vulture> demuxing.c in the doc/examples directory is what I used
[02:22] <vulture> it's decent
[02:22] <vulture> it's missing a couple key points though
[02:23] <Jan-> I've found lots of examples but none of them ever work as they're more than 7 seconds old, which in ffmpeg is.... ages...
[02:23] <durandal_1707> nonsense
[02:23] <vulture> the examples included work
[02:24] <vulture> demuxing.c
[02:24] <vulture> try that one
[02:24] <Jan-> In any case, my interest is in writing a C# interop layer for it, so I'd rather not have to dive into C if I can possibly avoid it.
[02:25] <vulture> well, I dont think that's entirely unavoidable if you want a complete wrapper
[02:25] <vulture> *avoidable
[02:26] <Jan-> well it *ought* to be, if there were any docs.
[02:26] <vulture> "open source" :P
[02:26] <Jan-> Yeah. I know.
[02:26] <durandal_1707> Jan-: there is doxygen
[02:26] <Jan-> Compiling a useful avcodec and avformat under windows is not easy.
[02:27] <Jan-> Wait, what am I saying, it's not easy under linux, it's downright impossible under windows
[02:27] <vulture> just use the prebuilt
[02:27] <Jan-> Sure, love to
[02:27] <durandal_1707> Jan-: its possible to build on windows just fine
[02:27] <durandal_1707> stop spreading lies
[02:28] <vulture> building on windows isnt as impossible as it was 5 years ago
[02:28] <vulture> where mingw was a big pile
[02:28] <Jan-> AVFormatContext seems to have... quite a lot of members.
[02:28] <vulture> anyway, there are legit prebuilts now :P
[02:28] Action: Jan- looks a bit alarmed
[02:28] <durandal_1707> you can compile with msvc now
[02:28] <vulture> can you really
[02:28] <vulture> how is that accomplished :P
[02:28] <Jan-> Quite a lot of those members are, er, a bit complex in themselves.
[02:28] <durandal_1707> there is fucking documentation
[02:29] <vulture> msvc doesnt support c99
[02:29] <vulture> and it never will afaik
[02:29] <durandal_1707> irrelevant
[02:29] <vulture> seems pretty relevant
[02:30] <Jan-> Eesh, you can have structs with structs in 'em?
[02:30] Action: Jan- looks a bit alarmed
[02:30] <vulture> well that's just basic programming :P
[02:31] Action: Jan- isn't really a C coder
[02:31] Action: Jan- knows enough to be dangerous
[02:31] <vulture> "FFmpeg can be built with MSVC using a C99-to-C89 conversion utility and wrapper."
[02:31] <vulture> slick
[02:32] <vulture> although, it still requires msys, so that almost defeats the purpose.. :P
[02:33] <vulture> Jan-: well in c# it has the same thing
[02:33] <vulture> class/struct within a class/struct is quite common
[02:33] <Jan-> I think if I wanted to directly invoke avformat, at least, I'd have to implement about 8 different structs with several hundred members overall.
[02:34] <Jan-> Which is likely to be a suckfest.
[02:35] <vulture> oh yes
[02:35] <vulture> well certain other mpeg apis have several thousand members
[02:38] <Jan-> I guess I'm just used to object oriented languages where "context" is implied and having to pass around this enormous AVFormatContext seems a bit, er, basic?
[02:40] <vulture> right it's basically just implied/hidden for you in a class
[02:40] <vulture> it's not really much different
[02:41] <Jan-> No, it's exactly the same! Except for this enormous, extremely complicated struct.
[02:41] Action: Jan- skritches her head
[02:42] <vulture> you arent rewriting the whole struct every time, it's just 1 extra parmaeter
[02:42] <vulture> *parameter
[02:43] <vulture> it's being passed by reference :P
[02:43] <Jan-> Sure, but if I'm gonna platform-invoke one of these APIs in C#, I need to create this struct. By hand. Manually. In my own code.
[02:43] Action: Jan- looks a bit uncomfortable
[02:43] <vulture> yeah have fun with that
[02:44] <vulture> maybe you can copypaste the c headers or write a generator to parse them
[02:44] <Jan-> It'd be nice to FIND the C headers.
[02:44] <vulture> ffmpeg-20130613-git-443b29e-win64-dev\include
[02:44] <Jan-> Oh.
[02:44] <Jan-> Looks like I can't even do that.
[02:44] <vulture> I'm just using this from the precompiled win64 package :P
[02:45] <Jan-> Oh.
[02:45] <Jan-> Seems there's a method (er, function) called avformat_open_input() which handles from-file input
[02:46] <Jan-> which internally creates said struct.
[02:46] <vulture> yeah
[02:46] <Jan-> That said I might still need the layout of the damn thing.
[02:47] <Jan-> I'm not sure if it's open_input or open_input_file
[02:48] <Jan-> hm no you still need to pass one in
[02:48] <vulture> I use avformat_open_input
[02:49] <vulture> with a custom AVIOContext to read from memory
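For reference, the custom-AVIOContext setup vulture mentions has roughly this shape, sketched here as C-like pseudocode from memory of the 2013-era API (names and signatures unverified against any particular release; error handling omitted):

```
/* read callback: copy up to buf_size bytes of our in-memory data into buf,
   return the number of bytes actually produced */
static int read_packet(void *opaque, uint8_t *buf, int buf_size) { ... }

unsigned char *iobuf = av_malloc(IOBUF_SIZE);
AVIOContext *avio = avio_alloc_context(iobuf, IOBUF_SIZE,
                                       0,           /* read-only */
                                       my_state,    /* passed to callbacks */
                                       read_packet,
                                       NULL,        /* no write callback */
                                       my_seek);    /* or NULL if unseekable */
AVFormatContext *fmt = avformat_alloc_context();
fmt->pb = avio;                        /* attach before opening */
avformat_open_input(&fmt, "in-memory", NULL, NULL);
```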
[02:49] <Jan-> I have "stream" or "file" in this particular set of docs
[02:49] <Jan-> but I have no idea when it's from
[02:49] <Jan-> also it seems to need you to tell it what format the file is in
[02:49] <Jan-> or is that an "out" parameter
[02:50] <Jan-> is ANY of this documented? ANYWHERE?
[02:50] <vulture> no idea
[02:50] <vulture> just step through the demuxing.c example I mentioned
[02:50] <vulture> it gives the basic process
[02:50] <vulture> it will figure out the format for you generally
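The basic process demuxing.c steps through is roughly the following, sketched as C-like pseudocode from memory of the 2013-era API (unverified signatures; error handling and cleanup mostly omitted):

```
AVFormatContext *fmt = NULL;
avformat_open_input(&fmt, filename, NULL, NULL);   /* probes the container */
avformat_find_stream_info(fmt, NULL);              /* fills in codec params */

int audio = av_find_best_stream(fmt, AVMEDIA_TYPE_AUDIO, -1, -1, NULL, 0);
AVCodecContext *dec = fmt->streams[audio]->codec;
avcodec_open2(dec, avcodec_find_decoder(dec->codec_id), NULL);

AVPacket pkt;
while (av_read_frame(fmt, &pkt) >= 0) {            /* demux: one packet */
    if (pkt.stream_index == audio)
        /* decode pkt into raw samples, e.g. avcodec_decode_audio4() */;
    av_free_packet(&pkt);
}
avformat_close_input(&fmt);
```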
[02:51] <Jan-> humph
[02:52] <Jan-> this doxygen stuff is great for producing a lot of verbosity
[02:52] <Jan-> but not actually very explanatory
[03:03] <Jan-> do I need to start looking somewhere else for information on this thing
[03:04] <Jan-> I mean, I'm looking at ffmpeg.org/libavutil.html
[03:04] <Jan-> the entire documentation is three lines.
[03:04] <vulture> yes I told you where
[03:04] <vulture> several times
[03:05] <Jan-> I'm talking about docs, not code. An actual overview of how it's supposed to operate.
[03:05] <Jan-> Not just a list of function names and their parameters. That doesn't tell me much.
[03:06] <vulture> yeah already answered that too, but I'm no expert
[03:06] <Jan-> Looking around I found http://ffmpeg.org/trac/ffmpeg/wiki/Using%20libav*
[03:07] <Jan-> but really it just goes on about how there isn't much of any documentation and what there is tends to be outdated, which I kinda figured out for myself.
[03:12] <Jan-> I'm looking at demuxing.c and there's almost no comments in it
[03:13] <Jan-> it's not actually explained anywhere what the code is *intended* to do, let alone how it does whatever it is
[03:14] <vulture> it wasnt that bad
[03:14] <vulture> wasnt great
[03:14] <Jan-> sure vulture
[03:14] <vulture> but was easy to follow imo
[03:14] <Jan-> but... I mean... "demuxing.c"
[03:14] <Jan-> what is it MEANT to do?
[03:14] <Jan-> It doesn't even explain the purpose of the code, how it fits into what you might want to do in a practical application.
[03:15] <Jan-> I don't think I'm particularly stupid but... I mean... how does this help?
[03:15] <vulture> takes a file, demuxes it, and then decodes the audio/video streams into raw data
[03:15] <vulture> * libavformat demuxing API use example.
[03:15] <vulture> *
[03:15] <vulture> * Show how to use the libavformat and libavcodec API to demux and
[03:15] <vulture> * decode audio and video data.
[03:15] <vulture> :P
[03:15] <Jan-> it doesn't define what "demux" actually means in that context
[03:16] <Jan-> demux as in unpack the compressed video frames?
[03:16] <Jan-> demux as in separate out the audio and video?
[03:16] <vulture> if you dont know that then you need to take a basic multimedia course
[03:16] <vulture> I can give you a nutshell
[03:16] <Jan-> I've been working in television production for ten years.
[03:16] <Jan-> "demux" is a very, very generic term.
[03:16] <Jan-> If there's a specific meaning of it that's useful with regard to ffmpeg, fine, but someone needs to WRITE THAT DOWN SOMEWHERE.
[03:17] <vulture> it's generic, and it also has a specific meaning for multimedia files
[03:17] <vulture> you have codecs, which are encoder/decoders for specific types of data, e.g. audio, or video
[03:17] <Jan-> one of the functions is called "decode_packet"
[03:17] <vulture> libavcodec handles this
[03:17] <Jan-> is this something to do with networking? ethernet packets? what?
[03:17] <vulture> then you combine or mux those encoded streams into a container format, which is what libavformat does
[03:18] <Jan-> I get the impression that avformat emits "packets" of some sort but I only got that by randomly inspecting some other file.
[03:18] <vulture> when you demux a video file, you're pulling back out those compressed streams
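As a one-line picture of what vulture is describing (a "packet" here is just a chunk of one stream's compressed data, nothing to do with networking):

```
file ──libavformat (demux)──▶ AVPackets (compressed) ──libavcodec (decode)──▶ raw frames / samples
```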
[03:19] <Jan-> Okay. Fine.
[03:19] <Jan-> Now if you'd like to imagine me screaming this into a megaphone at the top of my voice
[03:19] <Jan-> WHY DOESN'T IT SAY THAT IN THE GODDAMN DOCUMENTATION.
[03:20] <vulture> I think they expect you to know that going in
[03:20] <Jan-> How?!?!
[03:20] <Jan-> That's their own private meaning of the word "demux"!?!?
[03:20] <vulture> no thats the entire world's definition :P
[03:20] <Jan-> No, no, it really isn't. Go talk to a satellite engineer.
[03:20] <vulture> it's actually a very very common word
[03:21] <Jan-> Now on to this "packet" thing
[03:21] <Jan-> is this network code or what
[03:21] <vulture> dont think so
[03:21] <Jan-> so... what then...
[03:22] <vulture> just generic packets of data? idk :P
[03:22] <vulture> I wrote a wrapper around ffmpeg so that I dont have to touch any internals
[03:22] <vulture> and also so that I dont have to think about ffmpeg's weird api anymore
[03:22] <Jan-> I don't blame you :/
[03:22] <vulture> several years ago we hired a guy to do it for us
[03:23] <vulture> and it was highly broken
[03:23] <vulture> it's changed a lot since then though...
[03:23] <vulture> they include example source code now!
[03:23] <Jan-> *sigh* I think a lot of it is about that to be honest
[03:23] <vulture> and it even compiles out of box, and actually runs, without (too many) errors
[03:23] <Jan-> I know a few people make a LOT of money out of consulting on ffmpeg
[03:23] <Jan-> I have a feeling there's a bit of job protectionism going on
[03:26] <vulture> I've used several commercial video libraries
[03:26] <vulture> they're nice
[03:26] <vulture> open() read() close() :P
[03:27] <vulture> quite literally that easy
[03:27] <Jan-> in all honesty 95 plus per cent of cases would be pcm audio in either avi, or mp4/quicktime/whatever
[03:27] <Jan-> I'm just aware of how tricksy some of these file formats can be and how they can bite you in the ass
[03:27] <Jan-> so I'm not that anxious to start writing RIFF file parsers. :/
[03:27] <vulture> windows provides a lot of that for you
[03:28] <vulture> if you're relying on windows, maybe there's a directshow wrapper for c# ?
[03:28] <vulture> you can use ffmpeg through directshow
[03:28] <Jan-> There is.
[03:28] <vulture> (but directshow sucks)
[03:28] <Jan-> It does.
[03:28] <Jan-> It really does.
[03:28] <vulture> I did the wrapper thing for directshow too. lol :P
[03:28] <Jan-> To be fair it sucks because it's extremely lightweight and therefore pretty quick.
[03:28] <vulture> riiiiiiiiiiiiiiiiiight
[03:28] <Jan-> I think people expect too much of it
[03:29] <Jan-> Well, it depends what you try to do with it.
[03:29] <vulture> lightweight :P except really slow and lacks all precision
[03:29] <Jan-> Start running filters that do a lot of stuff and obviously it's gonna suck ass.
[03:29] <vulture> it's based on COM so it's automatically heavy bloat
[03:29] <Jan-> Precision I'll give you but in our case it was a driver fault with third party hardware.
[03:30] <Jan-> totally bent 10 bit RGB video.
[03:30] <Jan-> unusable.
[03:30] <Jan-> But really I don't want to have to go into a whole bunch of DS code just to get the first second of audio out of some files.
[03:31] <Jan-> *especially* as we then end up with complex end user requirements to install codecs for all their target files, which is a sucky thing to ask.
[03:32] <vulture> if it's wav or mp3 then it's trivial
[03:32] <vulture> (with windows)
[03:32] <Jan-> it's video files from cameras
[03:32] <vulture> if it's something else
[03:32] <vulture> ah
[03:33] <Jan-> now I don't actually need the video, I need the audio tracks
[03:33] <Jan-> so if it's prores or something it doesn't really matter
[03:33] <vulture> if you dont care about performance or latency, you could just shell to ffmpeg.exe
[03:33] <Jan-> I could. But I do care about those things a bit.
[03:33] <Jan-> And man, what a sucky solution :/
[03:34] <Jan-> It just seems crazy that they'd write all that code and then... you know... just sort of not bother writing it up in any useful way.
[03:34] <Jan-> it's absurd
[03:34] <Jan-> it's just completely stupid, why would you do that
[03:36] <vulture> busy, lazy, or the "open source" excuse
[03:37] <Jan-> Gah :(
[03:39] <vulture> alternatively, make your own wrapper with simple structures (in c), and then you can call just the wrapper from c# without having to copy all the complex crap
[03:39] <Jan-> That was how I first thought about doing it
[03:39] <vulture> you have to do some c work but imo it needs a wrapper anyway
[03:39] <Jan-> then I realised how often they change the API
[03:40] <vulture> heh
[03:40] <Jan-> I mean it changes several times a DAY, often
[03:40] <vulture> it's more stable now than it was
[03:40] <Jan-> it seemed like that layer would be a huge maintenance issue
[03:40] <vulture> and you'd have to change it MORE if you were to use it natively in c#
[03:40] <vulture> because your entire interop would become invalid
[03:40] <Jan-> every time we updated the libav stuff it would be basically certain we'd need to hack the C
[03:40] <vulture> whereas with a c wrapper it's mostly just the wrapper
[03:40] <Jan-> sure but the main issue is that the only IDE I have for C is, you know, notepad.exe and gcc.exe
[03:41] <vulture> I use notepad too :)
[03:41] <Jan-> I'd really rather be able to just take avcodec.dll and avformat.dll and go from there
[03:41] <Jan-> yes it will be tricky, but at least you're starting from a known place
[03:41] <Jan-> or at least you would be IF THERE WERE ANY DOCS.
[03:45] <durandal_1707> DOCS of what?
[03:45] <Jan-> from what I read most windows compiles of ffmpeg or the libraries are cross compiled from linux anyway
[03:45] <Jan-> so if we were to do a C wrapper we'd probably have to write that on linux, for windows, and cross compile it
[03:45] <Jan-> and I am NOT really into that thanks
[03:46] <vulture> you dont need linux at all
[03:46] <vulture> my wrapper is a single .c compiled via mingw(gcc) to a .dll/.lib
[03:47] <vulture> all windows
[03:47] <Jan-> well, we can't even get ffmpeg to build with none of the libraries that actually make it useful
[03:47] <Jan-> so that's a nonstarter
[03:48] <vulture> do you have a reason to build it yourself? (e.g. lgpl or something)
[03:48] <vulture> (or you want a stripped down version?)
[03:48] <Jan-> couldn't care less
[03:48] <vulture> if not save yourself the hassle and use the precompiled
[03:48] <Jan-> the licencing is probably irrelevant, the people who will be using the final product won't even know what "source code" is.
[03:49] <vulture> http://ffmpeg.zeranoe.com/builds/
[03:49] <Jan-> yeah, I know
[03:49] <Jan-> that's what we'd do
[03:50] <Jan-> in all honesty we probably would never need to update anyway
[03:50] <Jan-> 99 per cent of all commits on ffmpeg seem to be minor performance tweaks and network security, neither of which would bother us much
[03:51] <vulture> if only they could make avformat_open_input actually work :D
[03:51] <vulture> seems KIND OF IMPORTANT
[03:51] <Jan-> I have no clear idea what it's supposed to do.
[03:52] <Jan-> I have no idea what ANY of it is supposed to do, how it works, what you pass from where to where, what half the terms mean.
[03:52] <vulture> try to find some tutorial then
[03:52] <Jan-> I mean jesus, I know once you've been writing code for a project for years it all starts to become obvious TO YOU, but they need to get a goddamn clue.
[03:52] <Jan-> good docs are coding 101, ffs
[03:52] <vulture> great coders dont have time for documentation ;)
[03:52] <Jan-> I'm sure they think so.
[03:53] <durandal_1707> what docs is missing?
[03:54] <vulture> he wants a primer/overview doc
[03:54] <Jan-> *ahem* it's short for Janine :)
[03:54] <Jan-> all there is, is API reference. Which is fine. But there's no actual description of how it's supposed to work.
[03:54] <Jan-> There's no intro, no overview.
[03:55] <Jan-> The examples are practically uncommented and rely on a lot of knowledge complete newbies to ffmpeg won't have.
[03:55] <Jan-> There's no way to *start*.
[04:03] <vulture> sorry I know a lot of guys named Jan :P
[04:04] <Jan-> everyone assumes I'm a Swedish guy.
[04:04] Action: Jan- scowls
[04:04] <vulture> or any central euro country too :P
[04:04] <Jan-> mutter
[04:04] <Jan-> grumble
[04:09] <vulture> anyway, if there was real docs I might not be here either, since I cant seem to find how custom io contexts are supposed to handle pending data reads
[04:15] <Jan-> I need to sleep
[04:15] <Jan-> it's like 3am here
[04:15] <Jan-> thanks vulture
[04:15] <Jan-> ...no thanks ffmpeg people :/
[06:57] <praveenmarkandu> can someone explain when exporting a hls formatted m3u8 using the -hls commands in FFMPEG, why the quality is lower than using the segment format
[08:56] <ilove11ven> command line you use?
[09:57] <khali> ubitux: I have investigated my "Non-monotonous DTS in output stream" warning flood issue which I see with recent version of ffmpeg and not with 1.0.6
[09:58] <khali> ubitux: it only happens when input file has a damaged audio stream, and only when using -af aresample=async=24000:first_pts=0
[09:58] <khali> it takes the combination of async > 1 and first_pts=0 to trigger the warning flood
[09:59] <khali> ubitux: I hit this because my encoding script uses -async, which translates to these aresample options now
[09:59] <khali> in earlier versions it translated to different options, async and first_pts did not even exist
[10:07] <ubitux> well, maybe you should open a ticket then ;)
[10:08] <khali> ubitux: yes, I think I have enough information (and a sample) to open a decent ticket
[10:09] <khali> ubitux: I found two tickets related to -async (#2421 and #2309) but the warnings are different
[10:09] <khali> so I suppose these are different issues
[10:09] <khali> I'll open a separate ticket
[10:11] <khali> ubitux: one thing I don't get is that -async is deprecated and the man page says to use the asyncts audio filter instead
[10:11] <khali> still when using -async ffmpeg translates it to -af aresample, not -af asyncts
[10:12] <ubitux> the manpage hasn't said that for a while now
[10:12] <ubitux> make sure that's not the fork manpage
[10:13] <khali> ubitux: fork manpage?
[10:15] <ubitux> my manpage from 1.2 doesn't say so
[10:15] <khali> I was reading the 1.0.6 man page and you're right, the current version says -af aresample
[10:15] <ubitux> so you're using an old ffmpeg, or the fork
[10:15] <ubitux> ok, first choice then
[10:15] <khali> I did not know there was a fork of ffmpeg, who would be silly enough to do that?
[10:16] <ubitux> random drama
[10:29] <saste> khali, never underrate people's silliness
[10:31] <khali> saste: true enough :(
[10:36] <praveenmarkandu> @ubitix is it possible to point me to the person who committed the hls code from libav?
[10:40] <ubitux> praveenmarkandu: assuming you're talking to me, i think that's lu_zero, but as you can guess he's not contributing directly to this project
[10:45] <praveenmarkandu> @ubitux yeah im talking to him now on #libav
[10:54] <khali> what's the attachment size limit in trac? I have a bug sample but it's a bit large - 3.4 MB
[10:59] <khali> hmm, 2.5 MB that is
[11:20] <khali> ubitux: ticket created, #2693
[11:21] <khali> I don't know if I should start bisecting it right now or if developers will be smart enough to figure it out just by looking at the code
[11:23] <edoardo> How do I record audio with ffmpeg from microphone?
[11:24] <edoardo> I tried ffmpeg -f dshow -i video="VIDEO_DEVICE":audio="MICROPHONE" "out.flv"
[11:48] <keyzs> sirs for converting a ffmpeg -i input.flv output.mp4 with multiple files what would be the command syntax?
[11:51] <khali> keyzs: define "multiple files" and what output(s) you expect from them
[11:53] <keyzs> khali now that i'm thinking its more multiple inputs and multiple outputs, so, ffmpeg -i input1.flv input2.flv.. output1.mp4 output2.mp4
[11:54] <keyzs> i have 156 files on this
[11:55] <Mavrik> there is no command for that
[11:55] <Mavrik> use a bash script
[11:55] <Mavrik> or anything else like that
[11:56] <khali> keyzs: for i in `seq 1 156` ; do ffmpeg -i input$i.flv output$i.mp4 ; done
[11:56] <khali> or something similar
[11:57] <khali> (assuming a bash-like shell...)
[11:57] <keyzs> yes
[11:59] <orioni> hi to all . i'm trying to use ffprobe to get info about a multicast channel but the script doesn't time out in case there is no data on the multicast channel
[11:59] <orioni> any help
[11:59] <orioni> ?
[11:59] <keyzs> khali how long would ffmpeg take to convert 156 files from flv to mp4 assuming each file has like 3mb
[12:00] <khali> keyzs: it depends on too many factors... CPU speed, video and audio codecs, codec options...
[12:00] <keyzs> in average
[12:00] <khali> keyzs: just measure how much it takes for one, and multiply by 156
[12:00] <keyzs> have a quad core 2.66ghz with 8mb ram
[12:00] <orioni> so if there is no stream on the udp://239.123.10.40:58040 for whatever reason ( broken pipe , no link ) i want the ffprobe to timeout after X amount of time
[12:01] <Mavrik> orioni, you can pass timeout for udp as part of url
[12:01] <khali> (or convert two and multiply by 78, for better accuracy)
[12:01] <Mavrik> orioni, check input protocol documentation.
[12:01] <keyzs> khali ok
[12:01] <keyzs> thanks for the info
[12:01] <orioni> Mavrik: can you send me url .... i checked but nothing found
[12:02] <khali> keyzs: you can use -threads 2 to parallelize if encoding with libx264 (and probably other codecs too)
[12:02] <orioni> tried with probesize , timeout , max_delay , analyzeduration but it doesnt timeout
[12:02] <Mavrik> orioni, go to ffmpeg site, click "documentation", click "ffmpeg-protocols", click "udp"
[12:03] <orioni> ok , gime a sec
[12:03] <keyzs> khali by my calculations an average of 30m
[12:04] <orioni> i see , so it will timeout after 3 secs with this ffprobe udp://239.123.10.40:58040?timeout=3000 ?
[12:05] <keyzs> khali, the command will work nice if the files have different names?
[12:05] <Mavrik> orioni, yep :)
[12:05] <orioni> thanks , lets try
[12:05] <Mavrik> orioni, actually, it'll timeout after 0.3 secs
[12:05] <Mavrik> since unit is microseconds not milliseconds
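A sketch of the unit conversion Mavrik means (multicast address taken from the discussion above; verify the option's exact behavior against the protocol docs):

```shell
# The udp timeout option is in microseconds, not milliseconds,
# so multiply seconds by 1000000 when building the URL:
secs=3
url="udp://239.123.10.40:58040?timeout=$((secs * 1000000))"
echo "$url"   # → udp://239.123.10.40:58040?timeout=3000000
```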
[12:07] <orioni> not good .... i did use the command with 3000 timeout but it takes like 12-13 secs to "terminated"
[12:10] <orioni> http://pastebin.com/ibk5ytiV
[12:12] <Mavrik> orioni, avprobe isn't ffmpeg.
[12:12] <orioni> but on debian it says that ffprobe is replaced by avprobe
[12:13] <orioni> lrwxrwxrwx 1 root root 7 Mar 24 08:26 /usr/bin/ffprobe -> avprobe
[12:14] <Mavrik> *shrug*
[12:15] <Mavrik> orioni, sorry, but I have no idea how libav people messed that up and we really can't support everything distros mess up in their package managers :)
[12:15] <Mavrik> I think #libav is the channel for libav project support here on freenode
[12:15] <orioni> *** THIS PROGRAM IS DEPRECATED *** ................ This program is only provided for compatibility and will be removed in a future release. Please use avconv instead.
[12:16] <orioni> ok man , thanks
[12:24] <praveenmarkandu> @ubitux, would you know the difference between using the -f segment command and just using the newer hls commands
[12:25] <ubitux> hls muxer is simpler iirc
[12:25] <ubitux> it was a NIH from libav when they saw we updated their segment muxer to add HLS support
[12:27] <praveenmarkandu> yeah the command looks simpler
[12:34] <praveenmarkandu> thanks
[12:34] <praveenmarkandu> when running 3 ffmpeg transcodes at a time, i get around 15 fps per transcode
[12:35] <praveenmarkandu> when one process finishes, the fps for the remaining two do no increase
[12:38] <khali> praveenmarkandu: I suppose you have more than 3 logical CPUs (cores or HT sibling) and you did not enable threaded encoding in ffmpeg
[12:40] <praveenmarkandu> khali, if not specified, isnt it automatic threading?
[12:41] <khali> praveenmarkandu: to my knowledge, no, by default no threading takes place
[12:41] <praveenmarkandu> threads is set to 0
[12:41] <praveenmarkandu> -threads 0
[12:46] <durandal_1707> for what?
[12:49] <khali> keyzs: no, my example assumed uniform naming of input files
[12:50] <keyzs> khali could you give me a hint on non uniform?
[12:50] <khali> keyzs: if you need something more generic, try the following:
[12:51] <khali> for file in *.flv ; do ffmpeg -i "$file" $(echo "$file" | sed -e 's/\.flv$/.mp4/') ; done
[12:51] <khali> (might need extra quotes if file names contain spaces or special characters)
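A dry-run variant of khali's loop, using shell parameter expansion instead of sed and quoting "$f" so names with spaces survive; echo prints each command instead of running ffmpeg, so the sketch works against dummy files (drop the echo to convert for real):

```shell
mkdir -p flvdemo && cd flvdemo
touch "a b.flv" "c.flv"                  # dummy inputs, one awkward name
for f in *.flv; do
    echo ffmpeg -i "$f" "${f%.flv}.mp4"  # note: echo drops the quotes
done
```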
[12:53] <khali> praveenmarkandu: the man page says auto... could be that auto works OK for internal codecs but explicit -threads is needed for external ones
[12:56] <orioni> hi again , how can i get the stream name / provider name by using the Json output format with ffprobe
[12:57] <orioni> i mean , the "metadata"
[13:02] <praveenmarkandu> ok
[13:04] <praveenmarkandu> khali: i currently launch 3 separate ffmpeg processes
[13:04] <praveenmarkandu> should i launch 1 ffmpeg process but make multiple outputs?
[13:26] <MArcin2> hi everyone
[13:28] <MArcin2> may I ask a question about cutting h264 clip?
[13:33] <MArcin2> I have two files recorded by the hardware card (Sensoray 2253), when I try to cut it one cuts almost immediately and on the second ffmpeg counts zeros from the beginning of the clip to the point I want. I use that command line ffmpeg -ss 00:19:00 -t 00:00:45 -i "source.mpg" -qscale 0 -vcodec h264 -f mp4 -y dest.mpg
[13:34] <MArcin2> why for some h264 does it count those frames at the beginning and for some not?
[13:41] <MArcin2> anyone knows?
[14:35] <Mavrik> MArcin2, because your encoder probably puts I-frames too far between each other
[14:35] <Mavrik> and ffmpeg can't find one at that position
[14:35] <Mavrik> that is, if you ALWAYS use that command pasted
[14:38] <MArcin2> but when for example I frame is each 10 secs
[14:39] <MArcin2> will i find the on on 0:10:00 and then goes to 0:19:00
[14:41] <MArcin2> the files differ with tbn param
[14:42] <Mavrik> I don't understand what you are trying to say.
[14:42] <Mavrik> what does "I find on on 0:10:00…" mean?
[14:42] <Mavrik> and why do you have I-frames only every 10 seconds?
[14:42] <t4nk530> Hey guys anyone know why my interrupt_callback code isn't being fired with the latest version of ffmpeg?
[14:57] <MArcin2> 10 sec is the default value of the -g param, and i think the hardware produces the clip with an I-frame every 10 secs too
[15:13] <t4nk275> my interrupt code has stopped working with the latest version of ffmpeg - when i call avformat_open_input with no network connection the application hangs - the interrupt is never called - has anything changed here?
[16:35] <khali> is there anywhere I can upload large sample files which do not fit in trac?
[16:36] <khali> I seem to recall reading something about that a few days ago, but I can't find it again
[16:36] <JEEB> the mplayerhq ftp
[21:16] <jimjones> Does anyone know why this doesn't work? "cat animated.gif | ffmpeg -f gif -i - example.avi"? It gives a "pipe:: Input/output error" and "Output file is empty, nothing was encoded (check -ss / -t / -frames parameters if used)". If I just do "ffmpeg -i animated.gif example.avi", it works fine.
[21:17] <durandal_1707> because there is no gif parser
[21:18] <durandal_1707> and gif demuxer needs seekable input
[21:20] <durandal_1707> i have not explored whether this can be fixed, i.e. by writing a gif parser
[21:20] <durandal_1707> or whether it's possible at all
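Until a gif parser exists, a workaround sketch (the helper name is hypothetical, and the ffmpeg call is commented out): since the gif demuxer needs seekable input, spool the pipe into a temporary file first and hand that to ffmpeg.

```shell
# Buffer stdin into a seekable temp file, then (in a real run) feed
# it to ffmpeg instead of piping into ffmpeg directly.
spool_and_convert() {
  tmp=$(mktemp) || return 1
  cat > "$tmp"                 # spool all of stdin to disk
  # ffmpeg -f gif -i "$tmp" "$1"
  echo "$tmp"                  # hand the spooled path back to the caller
}
# usage: cat animated.gif | spool_and_convert example.avi
```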
[21:22] <jimjones> thanks
[21:42] <tomlol> I have a bunch of videos which I want to stream randomly. Is there a way to keep a stream always up and just push new content to it? Or a way to set up a callback to grab the next file whenever it's ready for it?
[21:43] <tomlol> I tried a named pipe, but twitch thinks it disconnects in between every file.
[22:04] <kwizart> http://paste.fedoraproject.org/20007/71758436/
[22:04] <kwizart> does this build error ring a bell for anyone? (from ffmpeg 1.2.1)
[22:05] <klaxa> >-marm
[22:05] <klaxa> building for arm architecture?
[22:06] <klaxa> afaik x86 asm doesn't work on arm :V
[22:06] <klaxa> did you run ./configure correctly?
[22:07] Action: kwizart checks the configure line
[22:09] <kwizart> + ../configure --prefix=/usr --bindir=/usr/bin --datadir=/usr/share/ffmpeg --incdir=/usr/include/ffmpeg --libdir=/usr/lib --mandir=/usr/share/man --arch=armv7hl '--optflags=-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -grecord-gcc-switches -march=armv7-a -mfpu=vfpv3-d16 -mfloat-abi=hard' --enable-bzlib --disable-crystalhd --enable-frei0r --enable-gnutls --enable-libass --enable-libcelt --enable-libdc1394 --disable-indev=jack --enable-libfreetype --enable-libgsm --enable-libmp3lame --enable-openal --enable-libopencv --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libv4l2 --enable-libvpx --enable-libx264 --enable-libxvid --enable-x11grab --enable-avfilter --enable-postproc --enable-pthreads --disable-static --enable-shared --enable-gpl --disable-debug --disable-stripping --shlibdir=/usr/lib --disable-runtime-cpudetect --arch=arm --disable-neon
[22:09] <kwizart> oops, I could have pastebin'd it
[22:10] <xreal> Can I set the episode of mp4 using ffmpeg ?
[22:16] <klaxa> kwizart: what compiler are you using?
[22:17] <klaxa> or rather, are you cross-compiling?
[22:24] <kwizart> klaxa, gcc 4.8.1 native compilation
[22:24] <klaxa> ah hmm... i have no arm computer at hand, sorry :(
[22:25] <kwizart> I'm about to switch to --enable-thumb for armv7-a; this file only supports arm/asm for armv6 without thumb and neon, as I understand
[22:26] <kwizart> well, actually I wonder if this is thumb or thumbee
[22:31] <dagerik> trying to downscale a video here
[22:31] <dagerik> The encoder 'aac' is experimental but experimental codecs are not enabled, add '-strict -2' if you want to use it.
[22:32] <dagerik> adding that option yielded the same error msg
[22:32] <xreal> Can I set the episode of mp4 using ffmpeg ?
[22:32] <klaxa> try libfdk-aac instead https://ffmpeg.org/trac/ffmpeg/wiki/AACEncodingGuide
[22:32] <klaxa> dagerik: ^
[22:33] <klaxa> xreal: what does "episode" mean in this context?
[22:39] <xreal> klaxa: TV series in iTunes. episode, season etc.
[22:39] <klaxa> sounds like metadata then
[22:40] <klaxa> http://wiki.multimedia.cx/index.php?title=FFmpeg_Metadata#QuickTime.2FMOV.2FMP4.2FM4A.2Fet_al.
[22:40] <klaxa> that maybe?
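A sketch of that approach, assuming the iTunes-style mov/mp4 keys from the wiki page (show, season_number, episode_id) are understood by the ffmpeg build in use; -c copy rewrites the metadata without re-encoding, and the file names are hypothetical (shown as a string, not executed):

```shell
# TV-series tags are plain -metadata key=value pairs on the mp4 muxer.
meta='ffmpeg -i in.mp4 -metadata show="My Series" -metadata season_number=2 -metadata episode_id=S02E05 -c copy out.mp4'
echo "$meta"
```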
[22:42] <xreal> klaxa: I can try
[22:45] <beginer_user> hey everybody, I have a "little" question, I'm a beginner though. I'm on Win 7 and trying to capture both video and audio. The command I use: "ffmpeg -f dshow -i audio="virtual-audio-capturer":video="screen-capture-recorder" -vcodec libx264 -r 25 -crf 1 -ac 1 -acodec aac -strict -2 -ar 44100 -pix_fmt yuv420p -q 1 -y -f flv output.flv". The video is perfect but the sound is lagging. If I record either
[22:45] <beginer_user> the video or the audio standalone, it works very well. They seem to conflict together. I have a "good" computer and am recording to disk, so I can't make it faster... I do get "real-time buffer 259% full! frame dropped!" though
[22:45] <dagerik> klaxa: okay. but why am i not able to use the builtin free aac encoder?
[22:45] <klaxa> you should be able to; it is discouraged though, since it's not very good
[22:46] <dagerik> help me use it
[22:46] <dagerik> i added -strict -2
[22:46] <dagerik> didnt work
[22:46] <klaxa> pastebin your output of ffmpeg
[22:48] <dagerik> klaxa: http://bpaste.net/show/108666/
[22:48] <beginer_user> klaxa, if its for me: http://pastebin.com/MCZPtKmG
[22:48] <klaxa> include your command and the encoding output; right now it's just input analysis
[22:52] <klaxa> beginer_user: so if you don't record audio it works well? sorry, i don't run windows, i don't know a thing about dshow
[22:53] <dagerik> klaxa: turns out i had to place -strict experimental after -i option but before -acodec
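That placement works because ffmpeg ties each option to the file that follows it on the command line, so output options like -strict experimental have to sit between -i and the output name. A sketch with hypothetical file names (shown as a string, not executed):

```shell
# -strict experimental and the codec choices are output options here,
# so they appear after the input (-i in.mkv) and before out.mp4.
cmd='ffmpeg -i in.mkv -strict experimental -acodec aac -vcodec libx264 -vf scale=-1:480 out.mp4'
echo "$cmd"
```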
[22:56] <beginer_user> klaxa: yes. And if I record audio only, it also works well. They fail together, but thanks anyway
[22:57] <klaxa> maybe you could try a different container... remove the "-f flv" and name the output file output.mkv or something ending in .mkv
[22:57] <klaxa> if you specify an output file ffmpeg will guess the container from the extension
[22:59] <beginer_user> klaxa, its still the same :)
[23:09] <beginer_user> WOW I can't believe it: I separated the video and audio parameters, and it works!
[23:27] <dagerik> i got a low-end laptop using the cpu for gfx. i'm unable to play this 1080p video (mpeg-4, aac). what is a smart way to make it playable? the convert process must take less than 1 min.
[23:28] <klaxa> what cpu? what mpeg-4 version?
[23:29] <klaxa> you are also probably asking for the impossible
[23:29] <dagerik> klaxa: Intel(R) Atom(TM) CPU Z520 @ 1.33GHz, Video: mpeg4 (Simple Profile) (mp4v / 0x7634706D), yuv420p, 1920x1080
[23:29] <sacarasc> dagerik: If your CPU can't decode it quick enough, you won't be able to reencode it in under 1 minute.
[23:29] <klaxa> does it have an integrated gpu chip?
[23:29] <sacarasc> Encoding generally takes longer than decoding.
[23:30] <dagerik> it has gma500. it is proprietary. no driver for accelerated gfx
[23:34] <klaxa> well it has hardware decoding for h264 :V
[23:34] <klaxa> at least according to wikipedia, no idea how to use vaapi though
[23:36] <dagerik> mpeg4 -> libx264
[23:37] <dagerik> is there an alternative to h264 here?
[23:37] <dagerik> for faster encoding
[23:37] <klaxa> no
[23:37] <dagerik> im able to encode 1s video in 5s real time
[23:37] <klaxa> like i said, you are asking for the impossible
[23:37] <dagerik> with 480p
[23:37] <vulture-> it's an Atom what do you expect :D
[23:38] <vulture-> it's built for low power not performance
[23:38] <klaxa> get your videos in h264 and buy a raspberry pi
[23:38] <klaxa> or something...
[23:40] <vulture-> http://www.techarp.com/article/x264_HD_Results/4_results_01.png you'll have to scroll to the bottom to see the Atoms
[23:50] <sacarasc> dagerik: how long is the video?
[00:00] --- Fri Jun 21 2013