[Ffmpeg-devel-irc] ffmpeg.log.20130212
burek
burek021 at gmail.com
Wed Feb 13 02:05:01 CET 2013
[00:50] <sam___> any suggestions .ffmpeg [libx264 @ 0x670520e0] Unable to parse option value "flags2 +bpyramid+wpred+mixed_refs-dct8x8"
[00:51] <sam___> o.ffmpeg [Eval @ 0x7f8f4fffd4c0] Undefined constant or missing '(' in 'flags2' o.ffmpeg [libx264 @ 0x522f8140] Unable to parse option value "+bpyramid+wpred+mixed_refs-dct8x8"
[00:53] <sam___> what's the problem with ffmpeg? am i missing something?
[00:53] <sam___> Undefined constant or missing '(' in 'flags2'.
[00:54] <saste> sam___, use libx264 presets
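(A rough sketch of what saste is suggesting: the flags2 constants in that command are no longer understood by this libx264 wrapper, and the same tuning is normally reached through a preset instead; the input/output names here are placeholders.)

    ffmpeg -i input.flv -c:v libx264 -preset slow -crf 23 -c:a copy output.mp4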
[02:44] <mishehu> greetings folks. I was wondering if openjpeg 2.0.x is supported in ffmpeg currently, or only the older 1.5.x branch.
[03:07] <kode54> fuck, more distributions switching to avconv?
[03:10] <llogan> kode54: which one(s) are you referring to?
[03:11] <kode54> friend on another network caught my whining
[03:12] <kode54> I spotted a capture and broadcast to twitch.tv tutorial
[03:12] <kode54> which I first complained about because it was a freaking youtube video instead of a document
[03:12] <kode54> then I complained because he was displaying a bash script in the video for viewers to transcribe, without actually linking to it in plain text form
[03:12] <kode54> then I noticed he was using avconv instead of ffmpeg
[03:13] <kode54> friend said he doesn't want to get into the whole ego war between the projects
[03:13] <sacarasc> Which distro is your friend using?
[03:13] <kode54> debian
[03:13] <kode54> but he says gentoo is "switching" to avconv as the primary of the two as well
[03:14] <kode54> his complaint is that ffmpeg gets more important features faster than avconv does
[03:14] <kode54> but he's stuck with people making decisions that favor using the other
[03:16] <demonjester> hey everyone, I was wanting to know if someone can help me with audio recording on my openwrt router
[03:19] <llogan> kode54: at least the ffmpeg package in gentoo is actually FFmpeg upstream unlike debuntu
[03:19] <kode54> that's cool, at least
[03:20] <llogan> debian has dropped the "ffmpeg" package from experimental apparently
[03:21] <llogan> so a real ffmpeg package may be one step closer to a reality...assuming the DDs don't simply disregard it
[03:28] <demonjester> can anyone help me? i am using these commands and it won't save the file after I ctrl + c: "ffmpeg -i /dev/audio test.mp3"
[03:35] <llogan> relaxed: interested in changing the CentOS guide to static build instructions?
[03:36] <defaultro> good evening folks. Is it possible to fix the color of my video? Looks like there is a little cyan cast. The blue sky is like not blue
[03:38] <llogan> defaultro: in all players?
[03:38] <defaultro> yes
[03:38] <defaultro> i remember reading an issue with this video camcorder
[03:39] <defaultro> but Panasonic never made a fix for it :(
[03:40] <llogan> there are probably several filters that can do this
[03:40] <llogan> http://ffmpeg.org/ffmpeg-filters.html#hue
[03:40] <defaultro> cool. Checking it out now
[03:40] <demonjester> here is one with using oss http://pastebin.com/pGg9CnYA
[03:41] <demonjester> default its not for you
[03:41] <demonjester> sorry
[03:41] <demonjester> here is the one without using oss http://pastebin.com/LfXsZ7rp
[03:41] <defaultro> k
[03:41] <llogan> demonjester: you have no mp3 encoder
[03:42] <llogan> missing --enable-libmp3lame
[03:42] <demonjester> so in my command i add --enable-libmp3lame?
[03:42] <llogan> no. in your ffmpeg ./configure
[03:42] <llogan> did you compile it?
[03:42] <demonjester> kk
[03:43] <demonjester> no, i used opkg install in openwrt; it did the configuring
[03:43] <llogan> if you have the lame binary you could possibly pipe to lame
[03:44] <llogan> ffmpeg -i input -f wav - | lame - output.mp3
[03:44] <demonjester> i downloaded a package that i thought was lame, but it said lame is not found, and idk what the package for lame is in openwrt
[03:46] <llogan> you'll need to install lame headers first before you compile ffmpeg with --enable-lame
[03:46] <llogan> *--enable-libmp3lame
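(A minimal sketch of where --enable-libmp3lame goes: it is a flag for ffmpeg's ./configure, run before building, not an option on the ffmpeg command line; any other configure flags depend entirely on the build.)

    # in the ffmpeg source tree, after installing the lame headers
    ./configure --enable-libmp3lame
    make && make install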
[03:48] <relaxed> llogan: Hey. What kernel does it ship with?
[03:48] <demonjester> i can't find headers for lame, so it seems ffmpeg is out of the question.
[03:48] <demonjester> thanks anyways
[03:49] <llogan> https://dev.openwrt.org/browser/packages/sound/lame/Makefile
[03:49] <defaultro> is it possible to send output to mplayer? This way I don't have to create a temporary file
[03:50] <llogan> why do you want to do that? so you can see the effects of the filter?
[03:50] <defaultro> yup
[03:51] <llogan> ffplay -i input -vf hue
[03:51] <defaultro> ah
[03:54] <defaultro> llogan, can ffmpeg manipulate each RGB channel?
[03:56] <defaultro> here is what I did. I took a snapshot of the problematic video and opened it in Photoshop. I modified the hue settings and was able to correct the sky. :) FFmpeg's hue filter should be able to do it too :)
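(A minimal sketch of doing the same correction with the hue filter while writing a new file; the hue shift value and the file names are placeholders to be tuned by eye.)

    ffmpeg -i input.MTS -vf "hue=h=10" corrected.mp4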
[03:59] <relaxed> llogan: I've been working on a distro-agnostic guide for compiling ffmpeg to a custom location, and avoiding package managers entirely. I could add a blurb about building it statically too.
[04:00] <llogan> relaxed: sounds good to me. are you going to add it to the FFmpeg wiki?
[04:01] <llogan> defaultro: i'm not sure.
[04:01] <relaxed> I may, at first it will be a simple txt file in dropbox.
[04:02] <relaxed> If work is slow tonight I'll finish it up and let you look at it.
[04:05] <llogan> relaxed: thanks.
[04:11] <defaultro> llogan, I can't get this to work with vf, mp=eq2=1.0:2:0.5
[04:11] <defaultro> it's contrast brightness
[04:13] <ivor> I'm looking at http://ffmpeg.org/trac/ffmpeg/wiki/UbuntuCompilationGuide and notice many errors making it obvious nobody uses it
[04:13] <ivor> Is there a better page for this?
[04:14] <defaultro> llogan, i got it to work
[04:17] <ivor> What does one do when one wants to compile the latest version of ffmpeg on linux mint?
[04:19] <llogan> ivor: what are the many errors?
[04:20] <llogan> defaultro: what did you do?
[04:20] <ivor> Well, take a look at the page at http://ffmpeg.org/trac/ffmpeg/wiki/UbuntuCompilationGuide and notice how it deletes yasm and then pulls down a new version but never sets the path
[04:21] <defaultro> i wrapped it with double quotes
[04:21] <llogan> yasm in the repo is too old
[04:21] <defaultro> however, i can't combine hue and contrast/brightness
[04:21] <ivor> llogan: exactly, so the script attempts to download a new version but then never uses it
[04:21] <llogan> the path defaults to /usr/local
[04:22] <ivor> llogan: well the version that is pulled down only gets installed into the ~/yasm-1.... directory
[04:23] <ivor> llogan: so if the script is used then it complains that there is no yasm on the system
[04:23] <ivor> llogan: that error can be fixed if one updates their path to include the location where the script installs yasm, but that's just the first error with the script/directions
[04:25] <defaultro> llogan, I can't get this to work. ffplay -i 00107.MTS -vf "mp=eq2=0.0:1:-.1:hue=h=26"
[04:25] <defaultro> the hue is not working
[04:26] <defaultro> ah. I got it to work using comma :)
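(For the record, the chain that apparently worked here, with the two filters separated by a comma rather than a colon:)

    ffplay -i 00107.MTS -vf "mp=eq2=0.0:1:-.1,hue=h=26"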
[04:27] <llogan> ivor: works for me. sounds like you skipped the checkinstall command.
[04:27] <llogan> that guide will be changed soon (relatively) so there is no system installation
[04:27] <llogan> and no screwing around with checkinstall/package manager.
[04:28] <llogan> defaultro: which Panasonic model?
[04:28] <defaultro> TM700
[04:28] <defaultro> Quality is amazing but the cyan/green cast is so annoying
[04:29] <defaultro> I'm not sure if I should sell it and buy another video camera
[04:29] <llogan> it occurs with default settings? are you sure it's not set to tungsten white balance when shooting in daylight?
[04:29] <defaultro> yup, it's AWB
[04:29] <llogan> do you have a small sample?
[04:29] <defaultro> so many people have the same issue
[04:29] <llogan> lame.
[04:29] <defaultro> i can extract few frames
[04:30] <defaultro> there are so many blogs about it
[04:30] <llogan> i have an HMC150 lying around here that seems fine
[04:30] <ivor> llogan: I'm not familiar with what checkinstall does but I did copy it in as shown...
[04:31] <ivor> llogan: I'm reading the man pages on checkinstall now
[04:32] <llogan> you can see if it installed with: dpkg -L checkinstall
[04:33] <llogan> or: dpkg -s checkinstall | grep Status
[04:33] <ivor> llogan: What would I be looking for in there, yasm, ffmpeg, or something else?
[04:33] <llogan> duh... i meant yasm, not checkinstall
[04:34] <llogan> dpkg -s yasm | grep Status
[04:34] <llogan> should show "Status: install ok installed"
[04:35] <ivor> llogan: no, it did not get installed
[04:35] <ivor> llogan: Do I need to run the script with sudo?
[04:35] <llogan> what script are you referring to?
[04:36] <ivor> llogan: oh, sorry, I put the directions into a script so I could be sure I got everything right
[04:36] <defaultro> llogan, where can I upload the 5 sec extracted clip?
[04:37] <llogan> you don't have your own server?
[04:37] <defaultro> not anymore, I shut it down last week
[04:37] <llogan> let me guess...you got rid of the neckbeard too?
[04:37] <defaultro> :D
[04:37] <llogan> get. out.
[04:37] <defaultro> let me turn on httpd here
[04:37] <llogan> datafilehost or mediafire are fine
[04:38] <defaultro> oh ok
[04:38] <llogan> i'd like to shut mine down, but i still have some legacy clients. im bored of server admining
[04:39] <defaultro> :)
[04:39] <defaultro> i didn't want to pay $120/year
[04:40] <defaultro> upload should be done soon
[04:40] <llogan> we're paying more than that per month
[04:40] <llogan> not for long though
[04:42] <llogan> ivor: as i mentioned earlier, it seems as if checkinstall was skipped or did not run successfully
[04:43] <defaultro> :)
[04:44] <defaultro> i'm only uploading 4.5mb but it's taking a long time
[04:44] <defaultro> http://www.datafilehost.com/download-48faa237.html
[04:45] <defaultro> maybe the reason why I'm having a hard time correcting the sky is because I used a circular polarizer
[04:45] <llogan> where is that?
[04:45] <ivor> llogan: I guess I have to go back and look at my script
[04:46] <defaultro> Playa del Carmen, Mexico
[04:47] <llogan> it looked familiar but i couldn't think of the name
[04:47] <defaultro> Blue something. Saw nude women there
[04:48] <llogan> were they cyan?
[04:48] <defaultro> yes
[04:48] <defaultro> beautiful water
[04:48] <defaultro> or turquoise
[04:49] <defaultro> but look at the sky, it's not blue but i remember it was a really nice blue sky
[04:50] <defaultro> i'll check some new videos. That mexico was from 2010
[04:54] <defaultro> some of my videos have green/cyan cast
[04:54] <defaultro> on some videos, I remember using custom white balance. The skies looked fine
[05:10] <fenduru> Anyone know how to avoid any padding at the end of each segment when using -f segment
[06:27] <bigmac> i have a question about hosting lots of large images on my local webserver... current format is jpg, can i convert them to load faster and not lose much quality?
[06:44] <klaxa> you could create thumbnails and have the user click on those before downloading the real thing
[06:46] <fenduru> is there a way to do a 2 step conversion in one command? i.e. mp3 > wav > ogg
[06:49] <klaxa> you want to have wav and ogg?
[06:49] <klaxa> in one command line, yeah with many pipes though
[06:50] <fenduru> well, my problem is that using -f segment
[06:50] <fenduru> playing the outputted segments in sequence has gaps
[06:51] <fenduru> but it seems to be an encoder or container issue, as straight PCM isn't giving me this issue
[06:51] <klaxa> ffmpeg -i mp3_file.mp3 -c:a pcm_s16le -f wav pipe: | tee somefile.wav | ffmpeg -i - -c:a libvorbis somefile.ogg
[06:52] <fenduru> so I'm trying to see if decoding to PCM.wav and then splitting that guy will alleviate my problems.
[06:52] <fenduru> klaxa, thanks. I had considered that; I was just trying to avoid multiple ffmpeg processes
[06:52] <fenduru> but I'll give that a shot, ty
[06:52] <klaxa> i didn't test it, but it should work
[06:57] <fenduru> well, the pipe worked. still getting gaps though
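(A sketch of the approach fenduru describes, assuming 10-second segments: decode to PCM and let the segment muxer split the decoded audio in one command; the segment length and output pattern are placeholders.)

    ffmpeg -i input.mp3 -c:a pcm_s16le -f segment -segment_time 10 out%03d.wav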
[10:56] <Bor0> could anyone help me with http://stackoverflow.com/questions/14452817/strange-sound-produced-by-ffmpeg-and-sdl ? I am experiencing the same exact problem
[11:59] <wahaha__> There is a file named input.txt containing: file '/home/1.flv' and file '/home/2.flv'
[12:00] <wahaha__> I can use this command to concatenate 1.flv and 2.flv into an mp4 file.
[12:00] <wahaha__> ffmpeg -f concat -i input.txt -c copy output.mp4
[12:00] <durandal_1707> and?
[12:00] <wahaha__> When i use the following command
[12:01] <wahaha__> ffmpeg -i "concat:1.flv|2.flv" -c copy output.mp4
[12:01] <wahaha__> it can run, but gives the wrong result: only 1.flv ends up in the output.mp4
[12:01] <wahaha__> ffmpeg -f concat -i input.txt -c copy output.mp4 is ok
[12:02] <wahaha__> ffmpeg -i "concat:1.flv|2.flv" -c copy output.mp4 is the problem
[12:02] <durandal_1707> perhaps it needs full path
[12:04] <wahaha__> wait a minute, let me try
[12:09] <wahaha__> using the full path does not solve it
[12:10] <durandal_1707> with -v debug
[12:11] <wahaha__> wait one more moment
[12:12] <wahaha__> i am in china, www.pastie.org is blocked by the chinese gfw
[12:15] <wahaha__> i have to set up a 'ladder' (a proxy) to get around the gfw block
[12:15] <durandal_1707> wahaha__: couldn't you just find paste site that is not blocked?
[12:25] <wahaha__> please wait a moment
[12:28] <relaxed> wahaha__: mkvmerge -o temp.mkv 1.flv +2.flv; ffmpeg -i temp.mkv -c copy done.mp4
[12:29] <wahaha__> let me try
[12:30] <relaxed> ffmpeg's concat only works well with mpeg containers from what I've seen.
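(A note on why the two syntaxes behave differently: the concat: protocol simply joins the input files byte-by-byte, which only works for formats that stay valid when concatenated at the byte level, such as MPEG-TS or MPEG-PS; FLV is not such a format, so only the first file is read. The concat demuxer (-f concat) works at the packet level and handles FLV fine. A common workaround when the protocol form is wanted, sketched with the file names from this log; the bitstream filters are only needed for H.264 video and AAC audio respectively.)

    ffmpeg -i 1.flv -c copy -bsf:v h264_mp4toannexb 1.ts
    ffmpeg -i 2.flv -c copy -bsf:v h264_mp4toannexb 2.ts
    ffmpeg -i "concat:1.ts|2.ts" -c copy -bsf:a aac_adtstoasc output.mp4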
[12:31] <durandal_1707> relaxed: but op issue is that one syntax works and another one do not
[12:32] <wahaha__> mkvmerge -o /tmp/temp.mkv /tmp/1.flv +/tmp/2.flv bash: mkvmerge: command not found
[12:32] <wahaha__> apt-get install mkvmerge Reading package lists... Done Building dependency tree Reading state information... Done E: Unable to locate package mkvmerge
[12:34] <wahaha__> now i can get to pastie.org, but i can't use it, it is difficult to use
[12:35] <wahaha__> do you have email? i can send you my screen-grab picture
[12:36] <relaxed> durandal_1707: oh :)
[12:38] <durandal_1707> wahaha__: pastebin is blocked too?
[12:39] <wahaha__> my screen-grab picture is on my desktop as a png file, how can i paste it into pastie.org
[12:39] <durandal_1707> paste text
[12:39] <durandal_1707> copy-paste text
[12:40] <wahaha__> i have break it with some tools for several minutes,i can link pastie now
[12:41] <wahaha__> had never used pastie.org
[12:41] <durandal_1707> pastie* sites are for copy-pasting text
[12:41] <durandal_1707> for uploading images there are a bunch of other hosts
[12:43] <wahaha__> here it is
[12:43] <wahaha__> http://pastie.org/6124324
[12:46] <wahaha__> http://pastie.org/6124377
[12:46] <wahaha__> please see http://pastie.org/6124377
[12:50] <wahaha__> are you still there durandal_1707?
[12:54] <leosaeba> hi to all. I'm trying to compile a self contained (not system installed) opencv (with ffmpeg video capabilities enabled) on raspberrypi to be used by an application I'm writing.
[12:54] <leosaeba> I have problems compiling it
[12:55] <leosaeba> I managed to compile and install x264, ffmpeg and opencv on x86_64
[12:55] <leosaeba> then I tried to do the same on the raspberry
[12:55] <leosaeba> but opencv fails with: lib//libavcodec.so.54: undefined reference to `ff_flacdsp_init_arm'
[12:56] <wahaha__> durandal_1707,what is the matter?
[12:57] <durandal_1707> wahaha__: open bugreport
[12:58] <burek> leosaeba, are you sure you really want to use libx264 on RPi?
[12:59] <leosaeba> I'm pasting the whole compilation process on http://pastebin.com/G2GQx1y7
[12:59] <burek> no need to look at that
[12:59] <leosaeba> burek: for now I'm just trying to get things up
[12:59] <burek> most probably you'll get unusable system
[12:59] <wahaha__> thank you
[12:59] <burek> and you probably shouldn't waste your time with it
[13:00] <burek> RPi is just too weak for any kind of video encoding
[13:00] <leosaeba> I just need opencv to append jpeg frames on a video (after some computations)
[13:00] <burek> find some better hardware for such purpose
[13:00] <leosaeba> (my main constraint is money....)
[13:00] <burek> well, ok, good luck with it, what can i say..
[13:01] <burek> i assume you'll use some usb webcam
[13:01] <burek> so most probably you'll have serious usb issues too
[13:01] <burek> im just trying to save your time=money here
[13:01] <leosaeba> burek:
[13:02] <leosaeba> no, actually I have remote nodes (with usb camera) that mount raspi filesystem
[13:02] <burek> your RPi will generate video frames?
[13:02] <leosaeba> they write frames there and the raspi does a little assembly.
[13:03] <burek> what do you mean by "remote nodes" ?
[13:03] <leosaeba> rpi will append received frames to a video
[13:03] <leosaeba> tcp/ip
[13:03] <burek> i see
[13:03] <burek> that might maybe work
[13:03] <burek> but then you dont need libx264 on RPi
[13:03] <leosaeba> I think so
[13:03] <burek> you can remove it
[13:04] <leosaeba> and ffmpeg will allow opencv to use VideoWriter?
[13:04] <burek> just so you know, on RPi, ethernet controller is usb-based, so you won't have 100 mbps if you expect that too
[13:04] <burek> the most you can hope for is < 5-6 mbps
[13:05] <leosaeba> the problem I got on raspi (not on pc) is about flac-arm something
[13:05] <burek> im not familiar with opencv, so you just need to try and test it
[13:05] <burek> flac is an audio codec
[13:05] <leosaeba> I've done some math, and for the bandwidth required it should suffice..
[13:05] <burek> avoid any kind of encoding on RPi..
[13:07] <burek> things might seem to be working at first, but you'll notice that at some point your RPi has frozen
[13:07] <burek> or rebooted itself and stuff
[13:07] <leosaeba> I actually could work with uncompressed video, but my problem now is just to set opencv up.
[13:07] <burek> so don't consider it as a cheap version of a less-powerful x86 machine
[13:07] <burek> what is the image size of the video
[13:08] <burek> it shouldn't be a problem to set up opencv as long as it doesn't require ffmpeg's encoding abilities
[13:08] <burek> if you only need muxing, that might work
[13:08] <leosaeba> burek: I'm not happy about broadcom gpu too....but I accepted the compromise on openness/cost
[13:09] <burek> low price comes at the cost of everything else
[13:09] <burek> you'll experience it sooner or later
[13:09] <burek> i just wouldn't want you to spend a lot of time setting it all up just to realize what i said above
[13:09] <leosaeba> image size is not a constraint... I will start with HD (720p I believe) and gracefully downgrade until the system works.
[13:10] <burek> forget hd..
[13:10] <burek> i had a camera with internal h264 encoder
[13:10] <burek> and all the RPi was supposed to do was mux the stream and re-stream it
[13:10] <burek> and it failed..
[13:10] <burek> so.. i think the math is pretty obvious
[13:11] <leosaeba> I see...
[13:11] <burek> cpu could manage it all, but the data bus couldn't
[13:11] <burek> everything was usb-based and all the data travels over that usb bus
[13:11] <leosaeba> My remote nodes feed it jpeg frames
[13:11] <burek> so conflicts are numerous at least
[13:11] <burek> that's why i noted the ethernet usb controller too
[13:11] <leosaeba> i understand
[13:12] <leosaeba> well.. anyway I will continue to try (at least until I reach the bottleneck you are referring to)
[13:12] <burek> ok
[13:12] <leosaeba> ..I've got no alternatives right now
[13:13] <Bor0> could anyone help me with http://stackoverflow.com/questions/14452817/strange-sound-produced-by-ffmpeg-and-sdl ? I am experiencing the same exact problem
[13:13] <burek> you have an rpi compilation guide on our wiki though
[13:13] <burek> it might help
[13:13] <leosaeba> actually I have set up slackwarearm, and everything is ok.
[13:14] <burek> this "lib//libavcodec.so.54: undefined reference to `ff_flacdsp_init_arm'" means that you need a specific version of ffmpeg
[13:14] <leosaeba> compilation is slow, but yesterday I compiled ffmpeg (stripped of a lot of things) in a few hours.
[13:14] <burek> which is the one the opencv programmers had in mind at the time they decided to use ffmpeg
[13:14] <leosaeba> I compiled ffmpeg-1.1.2
[13:14] <burek> and you probably got latest
[13:14] <burek> so it failed
[13:14] <leosaeba> ah!
[13:15] <leosaeba> I didn't think of that.
[13:15] <leosaeba> in the opencv documentation they refer to a minimum version of ffmpeg, so I thought I'd use the latest version.
[13:16] <burek> well see what they recommend
[13:16] <burek> if it is 0.x then use 0.x
[13:16] <burek> 1.x had a lot of changes
[13:16] <burek> so it might be no wonder that it doesn't work
[13:16] <leosaeba> I see
[13:17] <leosaeba> I got tricked by the fact that everything worked on x86_64....but the error is arm related.
[13:17] <burek> hmh.. most probably
[13:18] <burek> on your x64, you have both ffmpegs on your system
[13:18] <burek> one installed through package management
[13:18] <burek> and another (unneeded) your own compiled
[13:18] <leosaeba> Now I will try with the latest 0.x version (in the meantime I investigate which version of ffmpeg opencv 2.4.3 used)
[13:19] <leosaeba> ...to avoid that possibility I uninstalled the system ffmpeg (but I think something could still be present with vlc)
[13:19] <burek> well, dpkg -l | grep libav
[13:20] <leosaeba> dpkg is debian/ubuntu?
[13:20] <burek> yes
[13:20] <leosaeba> I will check slackware way, just a sec
[13:22] <leosaeba> I've looked in /usr/lib64 for any libav* (none present)
[13:22] <leosaeba> then I checked if vlc used some libav* somewhere else
[13:23] <leosaeba> and I found this: /usr/lib64/vlc/plugins/codec/libavcodec_plugin.la and /usr/lib64/vlc/plugins/demux/libavformat_plugin.la
[13:24] <leosaeba> but I don't think it is it.
[13:24] <burek> Bor0, it might be easier to ask the author of those tutorials, rather than us here, because i dont believe anyone here has got time to read those coding samples just in order to tell you what the issue was
[13:25] <burek> libavcodec_plugin.la is vlc's wrapper for libav* libraries
[13:25] <Bor0> I see. is there any official tutorial for this?
[13:25] <burek> installed libav* libraries
[13:25] <burek> Bor0, also see docs/examples in your ffmpeg dir
[13:26] <Bor0> alright, will check those, thanks.
[13:26] <burek> :beer: :)
[13:28] <Bor0> yoghurt with burek ;)
[13:28] <burek> +1 :)
[13:37] <leosaeba> burek: Ok, now I'm trying to check if using ffmpeg-0.11.2 (without x264) works on my pc, then I will try the same on raspi. In the meanwhile thank you for helping.
[15:49] <jarno> Hi all, I have a question about ffmpeg/libx264...am I in the right place to make such a question?
[15:50] <durandal_1707> yes
[15:52] <jarno> ...great...I'm using ffmpeg - libx264 to write video (actually it's OpenCV that's calling ffmpeg) and I get the infamous "broken ffmpeg default settings detected"
[15:53] <jarno> I'm using Kubuntu 12.10, and OpenCV, ffmpeg, x264 etc. are all from git as of today (should be stable), and I was wondering where (which source code file) the default settings get written from
[15:55] <jarno> ...so that I could experiment with different settings...I wonder if it is the static const AVCodecDefault x264_defaults[] in libx264.c
[15:56] <jarno> ....I would really appreciate if someone could set me on the right track.....
[15:56] <durandal_1707> you should see what OpenCV uses/calls and what it expects
[15:57] <durandal_1707> "because broken ffmpeg default settings detected" in nonsense
[15:57] <jarno> I was checking out the OpenCV part and as far as I can see, it would seem that so called "ffmpeg" defaults are being used
[15:57] <durandal_1707> then call ffmpeg with other options
[15:58] <jarno> ...hmmm, what do you mean that "broken ffmpeg default settings detected" is nonsense?
[16:00] <durandal_1707> broken in what sense? what is broken and why it is broken? ....
[16:00] <jarno> ...I mean that I get the following error message from libx264: [libx264 @ 0x12c7440] broken ffmpeg default settings detected
[16:04] <durandal_1707> what ffmpeg version are you using?
[16:06] <jarno> ffmpeg version N-38355-ge2b703f Copyright (c) 2000-2013 the FFmpeg developers
[16:06] <jarno> ...I just git:ted it today...
[16:08] <Mavrik> huh
[16:08] <Mavrik> that "broken ffmpeg settings" is a message from x264
[16:08] <Mavrik> very old
[16:08] <Mavrik> jarno: default settings aren't read from files
[16:08] <Mavrik> you set them via context options
[16:08] <jarno> ...yes, I know, but as far as I understand it, the default settings are written by libffmpeg, but I might be mistaken...
[16:08] <Mavrik> however if you get that error the OpenCV bindings must be ancient
[16:09] <jarno> ...ok...that might be the case
[16:10] <jarno> ...since I am not an expert (very far from it), where should I search for examples etc. in order to get a better idea of how to use libffmpeg?
[16:11] <Mavrik> hmm, no idea
[16:12] <Mavrik> since libffmpeg seems like some 3rd party product :\
[16:12] <jarno> really?
[16:12] <Mavrik> yep, ffmpeg libraries are named libav* (libavcodec, libavformat, libavutil)
[16:12] <Mavrik> and have such prefixes when calling them (avformat_, avcodec_)
[16:13] <Mavrik> libffmpeg seems like some wrapper around them to ease programming
[16:13] <jarno> ...ok...
[16:13] <Mavrik> sadly, i gotta run now, hopefully you find a solution :)
[16:13] <jarno> thanks!!!!
[16:13] <jarno> ...hmmm...perhaps I should stick to gstreamer then...it seems to be a more straightforward solution...
[16:22] <JEEB> jarno, depends on what you're doing
[16:23] <jarno> ...I'm reading images from uEye cameras...
[16:23] <jarno> ...and then I just need to write them to a video without losing too much information...
[16:24] <jarno> ...I'm not really an expert when it comes to gstreamer nor ffmpeg, but I have done some simple stuff using gstreamer and it seems to be fairly well documented
[16:24] <jarno> ...and "easy" to use...
[16:24] <JEEB> well, I've just seen way too many people derp at it over at #x264 , but I have no idea what's the best API to do the camera reading for you to be honest :s
[16:25] <jarno> ....actually, I'm getting the images using the API from uEye....
[16:25] <jarno> but writing videos does not function in Linux with the same API
[16:25] <JEEB> Basically I'd use whatever is the best API to grab the pictures, depending on the colorspace push it through swscale or something to convert it if your output needs to be something != input, then encode with libx264, and then mux into a container with libavformat or something
[16:27] <jarno> I'll check out what you are proposing
[16:27] <jarno> ...and see what is the "easiest" / "fastest" way of doing it for the time being
[16:29] <jarno> but I guess that what Mavrik was saying about the OpenCV bindings is right
[16:29] <jarno> ...they are just damn old...but I'll check that out tomorrow
[16:30] <JEEB> I have no idea of your use case, but I am kind of reading between the lines that you might get raw video frames from the camera, in some colorspace -> if the colorspace is something you don't want to encode as-is (f.ex. RGB with libx264 is possible, but you won't really be capable of playing it with anything but FOSS, no flash or hardware decoder will like that) you feed the raw frames to swscale, or another converter of your choice
[16:30] <JEEB> that converts the picture to the colorspace you want -> libx264 itself to encode the pictures into raw H.264 (Annex B) -> take the data that libx264 outputs and put it through libavformat to put the raw H.264 streams into a container
[16:30] <JEEB> and that would be it
[16:30] <JEEB> libswscale and libavformat are libraries that ffmpeg provides, and libx264 is well, WhatItSaysOnTheTin
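(A command-line equivalent of the chain JEEB describes, sketched under the assumption of 1280x720 RGB frames arriving on stdin at 25 fps; the sizes, rate and output name are placeholders. The -pix_fmt yuv420p conversion is the swscale step, libx264 does the encoding, and the mp4 muxer does the containerizing.)

    ffmpeg -f rawvideo -pixel_format rgb24 -video_size 1280x720 -framerate 25 -i - \
           -c:v libx264 -preset medium -pix_fmt yuv420p output.mp4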
[16:33] <jarno> ...thanx a lot for the description man!!!
[16:34] <jarno> I really appreciate it!!!
[16:43] <jarno> ...since I'm pretty noob when it comes to this, where should/could I start learning about using the libav* libraries?...
[16:44] <JEEB> depends on what exactly you want to do?
[16:44] <JEEB> I mean, the trunk doxygen is as good as it gets
[16:45] <JEEB> (you will rip your hair off at some point, but on the other hand that unfortunately generally is better than the alternatives)
[16:45] <jarno> ...just encode and write into a container...
[16:45] <jarno> ...so there are no good "tutorials" going around as normally?
[16:46] <JEEB> I would probably recommend just using libx264 itself for the encoding, thus x264.h is the thing to read through
[16:46] <JEEB> and then libavformat, well that is in the libavformat doxygen
[16:46] <jarno> ok
[16:46] <JEEB> and yes, there are samples but if they are out-of-date you get the fun time of screaming at the developers
[16:46] <JEEB> http://libav.org/doxygen/master/index.html
[16:47] <jarno> ...hahahahah :D
[16:47] <JEEB> the doxygen is auto-generated so you at least have it be generally up-to-date
[16:48] <JEEB> also do make sure what the picture format of what you're getting from the camera and what you want to feed to x264 are the same thing, otherwise you'll have to stick libswscale in the middle
[16:48] <JEEB> or something else
[16:48] <JEEB> which pain to take is up to you
[16:49] <jarno> ...ok...happily I know what the image format is and can actually change it using the API supplied by the vendor...
[16:49] <JEEB> okies, as long as that is something you a) can and b) want to stick to x264 :)
[16:50] <JEEB> http://git.videolan.org/?p=ffmpeg.git;a=tree;f=doc/examples;h=36f7e2da0aa858d15f8bd4ec633a5f14cad63666;hb=HEAD <- and the samples are in doc/examples btw
[16:50] <JEEB> I would guess scaling_video is swscale stuff and muxing is well, that
[16:51] <jarno> ...thanx...I'll check it up tomorrow...gotta go now...and once more, thanx for your help...
[16:51] <jarno> cheers!!!!!
[19:00] <Sam_> yesterday i asked a question about high CPU usage in ffmpeg; one fellow member said to update ffmpeg to the latest version. I did migrate to the latest version, but i still see CPU reach about 120% during flv transcoding
[19:01] <Sam_> any suggestions what could be the reason behind it?
[19:03] <Sam_> why is ffmpeg eating up so much CPU during transcoding?
[19:03] <sacarasc> Because it's trying to go as fast as possible.
[19:03] <sacarasc> You could make it go real time, which would presumably use less CPU...
[19:03] <stqn> you could use https://github.com/opsengine/cpulimit
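(A sketch of the two suggestions: -re makes ffmpeg read the input at its native rate instead of as fast as possible, and -threads caps the number of encoding threads; both trade throughput for lower instantaneous CPU use. The file names and preset are placeholders.)

    ffmpeg -re -i input.flv -threads 1 -c:v libx264 -preset veryfast output.mp4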
[19:03] <Sam_> <sacarasc> but this will reduce throughput right?
[19:05] <sacarasc> If you want it as fast as possible, it's going to use as much CPU as possible.
[19:06] <Mavrik> Sam_: well it's supposed to use ALL CPU
[19:09] <Mavrik> your question is... strange.
[19:09] <Sam_> i am limiting it to 1 thread
[19:09] <Sam_> <sacarasc> Correct, but while doing 20 parallel transcodes it reduces the throughput of the system by a significant amount
[19:09] <Sam_> <Mavrik> can you please shed some light on why the question is strange
[19:26] <framer99> I have a working ffmpeg cmdline that has jitter when run as part of initscripts on centos 6.3
[19:27] <framer99> I've tried many many things to figure it out, no luck. I can restart the process after bootup and it runs clean with no jitter.
[19:27] <framer99> full details here : http://pastebin.com/L2wwX3iP
[19:28] <framer99> I had it set up as an Upstart process/task and I could just ssh in and restart that Upstart task and it would run fine. I just can't figure out what is different about running during boot
[19:29] <framer99> thanks in advance for any advice
[19:29] <durandal_1707> that is so old, more than 2 years
[19:30] <framer99> yeah i was going to compile it myself eventually, just never got to it.. let me do that and report back
[19:55] <framer99> rrrr.. remaking with x11grab support
[20:22] <charleszivko> in a program like Audacity I can see the waveforms of each file and manually match their peaks. I'm probably using the wrong term, sorry.
[20:22] <charleszivko> I know programs like final cut pro auto match two audio files
[20:23] <charleszivko> trying to figure out how I can implement such a feature myself
[20:24] <durandal_1707> you mean same data in several bytes?
[20:25] <charleszivko> similar, sound files are recorded on different devices. A mic. B video from a iPhone
[20:25] <charleszivko> so it's not exactly the same data
[20:26] <durandal_1707> not currently, but nobody stops you from filing a feature request on the bug tracker for such an audio filter
[20:28] <charleszivko> cool. any pointers on an alternative library that may do this?
[20:28] <durandal_1707> if i knew one, i would have already told you; i do not think sox has anything like that
[20:29] <charleszivko> oh thanks, yeah I have been researching and already found sox. doesn't seem to handle what I am looking for either.
[20:29] <charleszivko> appreciate your input
[20:32] <durandal_1707> there is showwaves filter so you could just watch it ;)
[20:33] <charleszivko> haha, yeah I found that in the docs. Was hoping I missed something that I might be able to leverage.
[20:34] <durandal_1707> but you are basically looking for something like diff tool
[20:35] <charleszivko> yeah. I know audio A and audio B have a match within a 1 - 2 second timestamp. So I need to analyze those to get the exact timestamp from each where they first match, so I can use it as an offset to sync them up.
[20:36] <charleszivko> B would replace A (video audio) but needs to match the video
[21:00] <framer99> durandal_1707: build from latest git seems to be running smoother from initscripts now.... thanks for the pointer that I was way out of date, should have paid more attention
[22:09] <FFmpeg> Hello friends :)
[22:09] <fatpony> can somebody explain to me why cropdetect constantly yields values that very clearly overcrop the source?
[22:13] <FFmpeg> I am having some trouble with ffserver. I am trying to live stream via asf. Everything seems to be working, but when I try to open my stream in windows media player I get an error that it can't play the file.
[22:48] <llogan> fatpony: adjust the limit threshold?
[22:49] <fatpony> llogan: i tried to do that but i don't get realistic values
[22:49] <fatpony> low threshold = less aggressive crop, right?
[22:49] <llogan> yes, AFAIK
[22:54] <llogan> you could use drawbox to get an idea using the values from cropdetect
[22:54] <llogan> ffplay -f lavfi color=s=1024x720:c=blue -vf "pad=iw:ih+80:0:40,drawbox=0:40:1024:720:red at 0.5"
[22:54] <llogan> or similar
[22:54] <llogan> of course your input will be a video and not a color source from lavfi
[22:55] <llogan> ...and you won't need pad obviously
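(A sketch of the usual cropdetect round trip: lower the limit so fewer pixels count as black and the detected crop is less aggressive, then preview the reported rectangle with the crop filter; the limit value and the w:h:x:y numbers are placeholders to be taken from cropdetect's own output.)

    ffmpeg -i input.mkv -vf "cropdetect=limit=16:round=2" -f null -
    ffplay -i input.mkv -vf "crop=1904:800:8:140"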
[23:23] <fenduru> I have a .wav (pcm_s24le encoded) that ffprobe reports as 9.97s long, however when I reencode that to aac (.mp4), ffprobe reports 10.00 seconds
[23:24] <fenduru> How can I avoid this? I need the .mp4 to be precisely the same as the wav in terms of duration
[23:27] <fenduru> encoding to mp3 gives it a duration of 10.03s
[23:27] <fenduru> So I'm assuming it is a container issue (i.e. frame length is fixed, so the end is padded), but I need to force it to be precise
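(A note on the padding: AAC and MP3 encode audio in fixed-size frames, 1024 and 1152 samples respectively, plus some encoder delay, so the muxed duration gets rounded relative to the PCM source. A sketch that at least bounds the output with -t, though it still cannot be sample-exact; the duration value is a placeholder.)

    ffmpeg -i input.wav -strict experimental -c:a aac -t 9.97 output.mp4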
[00:00] --- Wed Feb 13 2013