[Ffmpeg-devel-irc] ffmpeg.log.20141120

burek burek021 at gmail.com
Fri Nov 21 02:05:01 CET 2014


[00:10] <b-p> hi, is there any anti-flicker filter in ffmpeg? thank you
[00:35] <llogan> b-p: can you elaborate?
[00:40] <b-p> llogan what?
[00:40] <b-p> i need something like the vlc player's anti-flicker
[00:50] <llogan> b-p: no dedicated antiflicker filter exists. perhaps you could port the one in VLC to FFmpeg
[00:51] <llogan> if you are unable, then you could submit a feature request at the bug tracker, and even post a bounty if you want to offer some monetary incentive
[02:17] <compstomp> Hi guys, what would be the cmake equivalent of "./configure --prefix=$TARGET_DIR --enable-static --disable-shared" Trying to compile x265 for static linking into ffmpeg. Thanks!
[02:19] <c_14> cmake -DCMAKE_INSTALL_PREFIX=$TARGET_DIR .
[02:19] <c_14> I think it builds static by default.
[02:21] <compstomp> so presumably -DBUILD_SHARED_LIBS:BOOL=OFF  and -DCMAKE_C_CREATE_STATIC_LIBRARY:BOOL=ON  are fruitless?
[02:22] <c_14> It might do something.
[02:22] <llogan> i dislike cmake
[02:24] <compstomp> I hate it so much. Would kill to just have a ./configure like every other codec's sources
[02:25] <c_14> Hmm, actually it looks like shared might be default.
[02:25] <c_14> So you might want to keep -DBUILD_SHARED_LIBS:BOOL=OFF
[02:26] <compstomp> kk will do. Would the lib dir automagically be set to PREFIX/lib, or does it need to be explicitly stated?
[02:26] <c_14> Should be
[02:26] <compstomp> great! thanks!
[02:27] <compstomp> I assume if shared needs to be disabled it probably wants to be explicitly told to be static? Via: -DCMAKE_C_CREATE_STATIC_LIBRARY:BOOL=ON ?
[02:31] <c_14> I can't find that string anywhere in the cmake folder...
[02:31] <c_14> Or anywhere in the repo for that matter.
[02:32] <c_14> tbh, all I did was use the ./make-Makefiles.bash script, hit generate and started browsing the CMakeCache.txt file
[02:33] <compstomp> hmm good idea, thanks
[02:33] <c_14> There seems to be an x265-static target in the Makefile though.
[02:34] <c_14> Worst case you could probably just call make x265-static
[02:35] <compstomp> what would you guess the output of that would be? Just the .a ?
[02:35] <c_14> Following the fun cmake tree down to CMakeFiles/x265-static.dir/build.make, it seems to generate the .a
[02:37] <c_14> And in my current conf, the cli.dir/build.make links against libx265.a
[02:37] <c_14> Which looks like it's called from the cli target.
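A minimal sketch of the static x265 build being worked out above, assuming the usual x265 source layout (cmake root in source/, helper scripts in build/linux/) and that ENABLE_SHARED is the x265-specific switch for the shared library:

    # configure x265 for a static-only install into $TARGET_DIR
    cd x265/build/linux
    cmake -DCMAKE_INSTALL_PREFIX="$TARGET_DIR" -DENABLE_SHARED=OFF ../../source
    # builds libx265.a and installs it plus x265.pc and the headers
    make && make install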
[02:55] <compstomp> Welp, ffmpeg is having difficulties linking x265 it seems.  Full console input / output: http://pastebin.com/sqnjJ3Gh  config.log: http://pastebin.com/8T7XQC7s
[02:55] <compstomp> Appreciate the help!
[03:01] <c_14> What does pkg-config --exists --print-errors x265 return?
[03:02] <compstomp> Package x265 was not found in the pkg-config search path.
[03:02] <compstomp> Perhaps you should add the directory containing `x265.pc'
[03:02] <compstomp> to the PKG_CONFIG_PATH environment variable
[03:02] <compstomp> No package 'x265' found
[03:02] <c_14> yep, do that
[03:03] <c_14> export PKG_CONFIG_PATH=wherever_that_was_again
[03:03] <compstomp> x265.pc exists in my $TARGET/lib/pkgconfig
[03:03] <c_14> yes, then that
[03:03] <compstomp> along with .pc's for many other codecs.  Why is it that only x265 is yelling at me?
[03:04] <c_14> configure doesn't use pkg-config for everything
[03:07] <compstomp> What is the best way to add this to the path in a bash script?
[03:08] <c_14> export PKG_CONFIG_PATH=$TARGET/lib/pkgconfig
[03:08] <c_14> Or rather
[03:08] <c_14> export PKG_CONFIG_PATH=$TARGET/lib/pkgconfig:$PKG_CONFIG_PATH
[03:08] <compstomp> ahh exactly that thanks!
[03:20] <Schnabeltierchen> since I'm very bad with video codecs, video container formats, audio codecs, subtitle formats and combining them all: is there a script like "here ffmpeg, look at this video file A, recode video file B like A, GO!"?
[03:23] <compstomp> Hey c_14, for some reason even with that PKG_CONFIG addendum it still can't find x265.  I tried calling pkg-config -exists in my script but I see no output in the console. Tried capturing it to a variable and echoing that but still nada.
[03:23] <c_14> It won't print anything if it works.
[03:23] <c_14> check $?
[03:24] <compstomp> returns 0
[03:24] <c_14> then it found it
[03:24] <c_14> But why isn't ffmpeg...
[03:24] <c_14> Can you pb again?
[03:24] <compstomp> yep
[03:24] <c_14> Schnabeltierchen: there might be some funky gui that can do something similar
[03:26] <Schnabeltierchen> i would prefer cmdline, because synology nas and stuff
[03:27] <Schnabeltierchen> mhm but k, any gui you would suggest for a remote running ffmpeg?
[03:27] <c_14> None I know of.
[03:27] <c_14> I tend to just use the commandline.
[03:28] <compstomp> input-output: http://pastebin.com/FM6yLwC5  config.log: http://pastebin.com/ZAb2RMvg
[03:31] <compstomp> Schnabeltierchen, It isn't nearly as one-click a solution as you might like, but just running ffmpeg -i on whatever input file you have will tell you all you need to know. You could probably write a script to parse that output and do it all based on that. Probably more work than just practicing your codecs, subs, and containers though :)
[03:32] <c_14> compstomp: just making sure, but this is git head, right?
[03:32] <compstomp> input file being the file whose attributes you want to emulate, that is.
[03:32] <compstomp> head of ffmpeg and x265
[03:34] <Schnabeltierchen> supplying the parameters once would be another solution; I tested a 1080p video with different audio streams and subtitles and it worked... now i need to recode all my videos like this sample video...
[03:38] <compstomp> Unabashed self-pitch here.  Here's a script I wrote a little while back that helps you recursively convert an entire directory with ffmpeg: https://github.com/srwareham/Linux-Scripts/blob/master/ffmpeg-scripts/ffmpegDir.bash  Would hardly take any retooling to modify to your specifications I assume
[03:40] <Schnabeltierchen> mhm this seems just like the thing I'm searching for
[03:41] <Schnabeltierchen> mhm only the subtitle thingy :P
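For Schnabeltierchen's "recode B like A" idea, a hedged sketch of how a script could read file A's stream parameters with ffprobe and reuse them when building the encode command for file B (the file name is a placeholder):

    # print codec, resolution and bitrate for every stream, one field per line
    ffprobe -v error -show_entries stream=codec_type,codec_name,width,height,bit_rate \
        -of default=noprint_wrappers=1 sampleA.mkv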
[03:42] <c_14> compstomp: just making sure, but you have x265.h in $TARGET/include ?
[03:42] <c_14> and x265.a is in $TARGET/lib ?
[03:43] <compstomp> in $TARGET/lib i have libx265.a but no x265.a
[03:44] <compstomp> I also have x265.pc in $TARGET/lib/pkgconfig
[03:44] <c_14> eh, right. and the header file?
[03:46] <compstomp> yep. In $TARGET/include i have x265_config.h and x265.h
[03:46] <c_14> Hmm, can you try adding --pkg-config-flags=--static to the ffmpeg configure line?
[03:47] <compstomp> Trying now.
[03:51] <compstomp> Holy _expletive_! it appears to be compiling!!
[03:51] <c_14> I just love obscure options.
[03:53] <compstomp> The more power you have, the more ways you have to shoot yourself in the foot lol.  Will post back if I can actually convert something.  POS old laptop this is running on so likely not tonight hah
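For reference, a hedged sketch of the configure invocation that ended up working here, assuming $TARGET_DIR still points at the static x265 install from above:

    # tell pkg-config where x265.pc lives and ask it for static link flags
    export PKG_CONFIG_PATH="$TARGET_DIR/lib/pkgconfig:$PKG_CONFIG_PATH"
    ./configure --prefix="$TARGET_DIR" --enable-gpl --enable-libx265 --pkg-config-flags=--static
    make && make install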
[07:12] <mohsen-rashidi> how can i rotate a mp4 video clip using ffmpeg?
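The question above goes unanswered in the log; one common approach is the transpose filter, which re-encodes the video (file names are placeholders):

    # transpose=1 rotates 90 degrees clockwise, transpose=2 counter-clockwise
    ffmpeg -i in.mp4 -vf "transpose=1" -c:a copy out.mp4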
[10:00] <anddam> hi, I have a list of URL of different kinds of audio stream I'd like to dump, is ffmpeg suited for the job? I've seen some rtsp:// with .ra, mp3 and wma among these
[10:00] <anddam> I'm looking for a "swiss army knife" of dumping tool, i.e. I'd like to just toss my url as argument and be done with it
[11:06] <Pkunk> Is it possible using the latest libavformat to upload encoded packets directly to an rtmp server URL? E.g. by setting AVFormatContext->filename to an rtmp URL?
[11:28] <reuf> hello - i need some utility to process an mp3 file - i want to remove a certain segment from the mp3 - trimming - any good tool/approach for this?
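This one also gets no answer here; a hedged sketch of doing it with ffmpeg itself, assuming the segment to drop runs from 1:00 to 2:30 (the timestamps are placeholders):

    # copy everything before and after the unwanted segment (cuts land on mp3 frame boundaries)
    ffmpeg -i in.mp3 -t 00:01:00 -c copy part1.mp3
    ffmpeg -ss 00:02:30 -i in.mp3 -c copy part2.mp3
    # join the two pieces without re-encoding
    printf "file 'part1.mp3'\nfile 'part2.mp3'\n" > list.txt
    ffmpeg -f concat -i list.txt -c copy out.mp3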
[14:23] <lookatmeyou> hello, can anyone help
[14:23] <lookatmeyou> me@ubuntu:~/ffmpeg-2.4.3/doc/examples$ ./decoding_encoding h264
[14:23] <lookatmeyou> Encode video file test.h264
[14:23] <lookatmeyou> Codec not found
[14:23] <lookatmeyou> I have already installed x264, but I still get this error
[14:23] <BtbN> Did you build your ffmpeg with x264 support?
[14:23] <lookatmeyou> yes
[14:25] <lookatmeyou> ./configure --enable-x264 --enable-shared --enable-gpl
[14:25] <lookatmeyou> I build ffmpeg using this command
[14:26] <lookatmeyou> ./configure --enable-shared --enable-gpl --enable-libx264
[14:26] <lookatmeyou> this, sorry
[14:43] <lookatmeyou> I haven't built ffmpeg with --enable-libx264
[14:44] <lookatmeyou> thank you very much
[14:53] <cyphase> anyone know of any info about mimicking the mp3 encoding output of adobe audition with ffmpeg? i'm automating some editing that someone's been doing with audition, and am trying to get my output as close as possible
[15:12] <hefest> "Only VP8 or VP9 video and Vorbis or Opus audio and WebVTT subtitles are supported for WebM."
[15:12] <hefest> trying to convert flv to webm
[15:12] <hefest> ffmpeg -y -i files/ET2GYGBFW4.flv -vf scale=320:240 -q:v 0 -profile:v baseline -r 25 -vcodec libx264 -ac 2 -strict -2 -b:a 49k files/RN8N5WT37RO.webm
[15:14] <BtbN> Well, that's because "Only VP8 or VP9 video and Vorbis or Opus audio and WebVTT subtitles are supported for WebM."
[15:14] <hefest> ok, removed vcodec, my bad
[15:14] <hefest> Encoder (codec vp8) not found for output stream #0:0
[15:14] <hefest> I'm getting this now
[15:14] <hefest> have to install it, I guess
[15:14] <BtbN> So your ffmpeg isn't compiled with vp8 encoding support
[15:16] <hefest> oh man that's a pain to compile on freakin' mac :D
[15:16] <hefest> why did i switch from linux, oh well
[15:16] <BtbN> Use macports or something like that.
[15:17] <hefest> I'm creating a service that's going to run on linux anyway. I'll just move it to the test server
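For completeness, a hedged sketch of the WebM-compatible version of the command above, assuming an ffmpeg built with --enable-libvpx and --enable-libvorbis (the bitrates are placeholders):

    ffmpeg -y -i files/ET2GYGBFW4.flv -vf scale=320:240 -r 25 \
        -c:v libvpx -b:v 500k \
        -c:a libvorbis -ac 2 -b:a 64k files/RN8N5WT37RO.webm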
[15:26] <muratmaman> hello
[15:29] <muratmaman> Hello everyone, I am having a problem with audio and video going out of sync. I am receiving an mpegts stream via UDP, decoding and re-encoding it, and finally muxing it out over RTMP. In general it works as expected, but after a while the audio and video drift out of sync.
[15:30] <muratmaman> I am using the latest ffmpeg release
[15:31] <muratmaman> on an Ubuntu 64-bit server
[15:31] <muratmaman> anyone have an idea?
[15:48] <david___> hi i needed a bit of help to convert files to divx
[15:48] <david___> to Divx Home Theater profile
[15:49] <david___> how to set Macroblocks
[15:49] <david___> VBV Buffer
[15:50] <david___> VBV Bitrate
[15:50] <david___> B-Frame
[15:50] <david___> how i can set those in ffmpeg
[15:54] <david___> @Athideus hi
[15:54] <Athideus> hello there!
[15:54] <david___> i needed help with converting to divx
[15:55] <david___> how to set Macroblocks, VBV Buffer, VBV Bitrate, B-Frame
[15:55] <Athideus> i am looking for a way to distribute a list of video files to many different servers for encoding, which then get sent back to the 'controller' server when completed
[15:55] <david___> wow i got no idea about that
[15:55] <Athideus> i tried googling "ffmpeg cluster" but i haven't found anything extremely useful
[15:56] <BtbN> DivX is just an mpeg4 encoder. There is no such thing as "encode as divx"
[15:56] <sacarasc> Athideus: I don't think there is a distributed ffmpeg...
[15:56] <BtbN> And as DivX is a closed source commercial encoder, you're not going to be able to use it with ffmpeg.
[15:57] <BtbN> h264 with x264 has better quality anyway, so that's not a huge loss
[15:57] <Athideus> it doesn't really need to be a distributed ffmpeg install, just a way to send the files to each encoder server, tell it to encode, and then tell it to transfer the result back to the controller server
[15:57] <BtbN> mount some samba/nfs share on all of them?
[15:58] <david___> well, is it possible to set those options in h264 or x264
[15:58] <david___> i wanted to play the file on my DVD player
[15:59] <BtbN> Check if it supports h264, and if it does, use that
[15:59] <Athideus> BtbN: hrm, that's a pretty good idea for the transfer portion, now i just have to figure out how to get the controller server to distribute the workload
[16:00] <david___> it plays it, but i wanted to get the file as close as possible to the DivX Home Theater profile
[16:00] <BtbN> Athideus, check out pdsh
[16:00] <BtbN> so you want it to look worse?
[16:01] <Athideus> Awesome, that looks like exactly what i am looking for. Thanks BtbN
[16:01] <david___> it's not about looking worse, it's that the player works better with the DivX Home Theater profile
[16:02] <BtbN> Well, if you want to encode with DivX, you have to use their software.
[16:03] <david___> so you're saying h264 and x264 don't have these options: Macroblocks, VBV Buffer, VBV Bitrate, B-Frames
[16:03] <BtbN> Athideus, that's exactly how clusters usually work. One control server which has the workdirs of all nodes mounted via nfs, which then distributes the data and launches the actual job via pdsh.
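A hedged sketch of that layout, assuming /mnt/shared is an NFS export mounted on every node and that each node's input file is named after its hostname:

    # run the same command on node01..node04; $(hostname) expands on each remote node
    pdsh -R ssh -w node[01-04] \
        'ffmpeg -i /mnt/shared/in/$(hostname).mkv -c:v libx264 -crf 23 -c:a copy /mnt/shared/out/$(hostname).mkv'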
[16:11] <david___> @BtbN so for h264 and x264 i can't set Macroblocks, VBV Buffer, VBV Bitrate, B-Frame?
[16:12] <klaxa|work> david___: see: http://mewiki.project357.com/wiki/X264_Settings
[16:12] <klaxa|work> and ffmpeg --help encoder=libx264
[16:13] <david___> oh didn't think about that sorry
[16:13] <david___> thanks for the info
[16:14] <david___> well i gtg now have a good day/night
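For the record, the knobs david___ asked about do exist as generic ffmpeg rate-control options; a hedged sketch with the native mpeg4 (ASP) encoder, using placeholder numbers rather than the official Home Theater limits:

    # -vtag DX50 sets the DivX fourcc; -maxrate/-bufsize are the VBV settings, -bf the max B-frames
    ffmpeg -i in.mkv -c:v mpeg4 -vtag DX50 -b:v 2000k -maxrate 4000k -bufsize 3000k -bf 1 \
        -c:a libmp3lame -b:a 128k out.avi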
[16:45] <sor_> is there a site where i can find the default settings for q:v (1-60)? also i notice -s XxY has an effect on quality whereas -vf "scale=X:-1" does not, is that correct? also what does -1 do?
[16:49] <relaxed> sor_: http://ffmpeg.org/ffmpeg-filters.html#scale-1
[16:49] <relaxed> sor_: the range for -q:v is 1 - 31 (for ffmpeg's encoders)
[16:50] <sor_> thanks
[16:57] <sor_> relaxed, so does -s XxY have an impact on quality whereas -vf "... does not?
[16:59] <kepstin-laptop> sor_: the result of using -s (as an output option) and -vf scale=... should be exactly the same.
[16:59] <sor_> thanks
[17:00] <kepstin-laptop> (in fact, if you read the docs, you'll find out that the -s option is just a shortcut that adds a scale filter to the end of the video filter chain)
[17:05] <sor_> kepstin-laptop, yeah that's what i understood; the problem is that my file size is considerably smaller with -s. hmmmm, thanks
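To make the equivalence concrete, a small sketch (assuming the input's aspect ratio makes 640:-1 come out at 640x360):

    # these two should produce the same result; -s is shorthand for appending a scale filter,
    # and -1 means "derive this dimension from the input aspect ratio"
    ffmpeg -i in.mp4 -s 640x360 out_a.mp4
    ffmpeg -i in.mp4 -vf "scale=640:-1" out_b.mp4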
[17:17] <Zeranoe> Are there any sort of licensing restrictions on using FFmpeg's decoders in a commercial application? Does licensing need to be purchased to use each decoder?
[17:19] <kepstin-laptop> Zeranoe: are you talking about copyright licensing or patent licensing?
[17:19] <kepstin-laptop> as long as you follow the rules of the LGPL or whatever licenses apply, you don't have to pay anything.
[17:20] <kepstin-laptop> as far as patent licenses go, talk to your lawyer
[17:22] <Zeranoe> kepstin-laptop: Because patents are held by each respective codec technology developer?
[17:25] <Zeranoe> Living in the U.S., I'm assuming if FFmpeg is compiled with H.264 decoder support, patent royalties would need to be paid to MPEG LA
[17:26] Action: kepstin-laptop isn't a lawyer, and isn't familiar with your jurisdiction, so he has nothing further to say on the matter.
[17:27] <iive> yes, mpeg la takes money from people using standard video codecs
[17:29] <Zeranoe> Well that's stupid.
[17:29] <iive> patents are stupid, but people demand that some mythical small inventors should be protected.
[17:30] <uex1fi> hey
[17:30] <uex1fi> I'm using a FFmpeg Windows build helper
[17:31] <kepstin-laptop> Zeranoe: anyways, the ffmpeg devs won't do anything to you if you violate patents, that's not their concern. But do follow the *copyright* license compliance checklist on http://ffmpeg.org/legal.html
[17:31] <uex1fi> And it successfully built FFmpeg with default options
[17:31] <kepstin-laptop> Zeranoe: there is some stuff on patents on that page too
[17:31] <uex1fi> But it doesn't understand one option: --high-bitdepth=y
[17:32] <uex1fi> Why is this an unknown option?  Is there a different option for specifying bit depth?
[17:33] <kepstin-laptop> uex1fi: I dunno what you're talking about; that's not an ffmpeg build or runtime option.
[17:33] <iive> uex1fi: is that libx264 option?
[17:33] <uex1fi> iive: yes, and libx265
[17:33] <kepstin-laptop> (ffmpeg's decoders/encoders and stuff like swscale are built with high bit depth supported by default where applicable)
[17:34] <uex1fi> hmm
[17:34] <uex1fi> i wonder if this is an option specific to this particular build script?
[17:35] <uex1fi> okay, but bit *depth* cannot be changed after ffmpeg is built, right?
[17:36] <kepstin-laptop> uex1fi: ffmpeg decoders/encoders that support multiple bit depths can have the bit depth selected at run time.
[17:36] <kepstin-laptop> x264 cannot have its bit depth changed except at build time
[17:36] <Zeranoe> Any idea why "--disable-decoders --enable-decoder=*264* --enable-decoder=*mp3*" is still enabling the mpeg2video decoder?
[17:37] <iive> uex1fi: i can't find the string "high-bitdepth" in libx264/5 source
[17:38] <relaxed> libx264 supports 8 and 10
[17:38] <kepstin-laptop> uex1fi: interestingly, if you build two copies of x264 (one 8bit, one 10bit), and dynamically link them to ffmpeg, you can switch them out at runtime by swapping which library is used.
[17:38] <uex1fi> okay... then i think this is something specific to this build script
[17:38] <uex1fi> kepstin-laptop: that is interesting
[17:41] <uex1fi> has anyone here used the windows build helper script recently?
[17:42] <uex1fi> I'm slightly confused by the instructions for enabling options like nonfree and bitdepth
[17:49] <Zeranoe> Seriously... I cannot get the mpeg2video decoder to disable
[17:53] <kepstin-laptop> looks like the only stuff that selects mpeg2video in the configure script is the hwaccel versions of the mpeg2 decoder.
[17:54] <relaxed> this works for me, --disable-decoders --enable-decoder=*264* --enable-decoder=*mp3* --disable-decoder=mpeg*
[17:55] <kepstin-laptop> Zeranoe: i suspect disabling hwaccel support might fix your issue?
[17:55] <Zeranoe> Passing --disable-decoders and I still have "bmp h264 mpeg2video h263 hevc vc1" enabled.
[17:55] <Zeranoe> relaxed: Why did you have to disable mpeg a 2nd time
[17:56] <kepstin-laptop> Zeranoe: that list of codecs looks suspiciously like the list of codecs that have hwaccel decoders available. Please try disabling hwaccel.
[17:57] <kepstin-laptop> (note that 'bmp' is pulled in if you have the 'gdigrab' screen capture enabled)
[17:59] <relaxed> I would file a bug report. It makes sense that the decoders should be parsed first.
[18:00] <kepstin-laptop> eh, it's just a quirk of the way the configure script works
[18:00] <kepstin-laptop> you disable all the decoders, then enable the hwaccels, then the hwaccels need some decoders so they re-enable them
[18:12] <uex1fi> Hey, when I encode a regular .mp4 file as HEVC .mp4, it plays fine in VLC.  But when I do DV-AVI to HEVC .mp4, there is no video, only audio.
[18:12] <uex1fi> Can anyone suggest a fix or a workaround?
[18:13] <uex1fi> okay
[18:14] <Zeranoe> kepstin-laptop: Running with '--disable-encoders --disable-decoders --disable-hwaccels' and I still get decoders: bmp, h264, vc1, h263, and hevc enabled.
[18:15] <uex1fi> relaxed: the console history doesn't go all the way back
[18:16] <kepstin-laptop> Zeranoe: well, that got rid of your mpeg2 :)
[18:16] <uex1fi> relaxed: http://pastebin.com/dZjntU0u
[18:16] <Zeranoe> kepstin-laptop: lol, still not predictable results
[18:17] <kepstin-laptop> Zeranoe: there's probably some filters or muxers/demuxers that you have enabled which are pulling in those codecs
[18:17] <kepstin-laptop> ah, you need to disable the parsers
[18:18] <kepstin-laptop> the h264, hevc, vc1 parsers all pull in the respective video codecs
[18:18] <Zeranoe> Jeez
[18:19] <kepstin-laptop> Zeranoe: you're not exactly doing something that's very common, most people want their builds with this stuff enabled :)
[18:20] <Zeranoe> What do the parsers do
[18:20] <kepstin-laptop> they allow some metainfo (stuff like frame timing, size, type, etc.) to be extracted from streams without running the full decoder.
[18:20] <kepstin-laptop> I think?
[18:24] <Zeranoe> lol, and of course --disable-parsers doesn't actually disable all parsers and the bmp decoder is still enabled
[18:27] <kepstin-laptop> Zeranoe: this is a windows build?
[18:28] <kepstin-laptop> like I said before, the bmp decoder is pulled in by the 'gdigrab' screen capture input device
[18:29] <kepstin-laptop> (that's actually an interesting bit of code; basically the bitmap datastructures are used internally by gdi, so rather than reimplementing bmp, I just made the gdigrab device pass bitmap images to the bmp decoder)
[18:34] <Zeranoe> kepstin-laptop: impressive
[18:34] <Zeranoe> kepstin-laptop: would that be possible/beneficial with other decoders?
[18:57] <Zeranoe> Can anyone explain where all the codecs are coming from when they aren't listed in -encoders or -decoders http://paste.ubuntu.com/9127939/
[18:58] <Zeranoe> As in, how are the codecs still included if they do not have decoder or encoder support?
[19:01] <DiegoMax_> hello, I'm doing some experiments with FFServer, and I see that one of the formats for streams is HLS (HTTP Live Streaming), however I can't really find a lot of documentation or examples on FFServer options for that format
[19:01] <DiegoMax_> is FFServer supposed to support realtime HLS segmenting and delivery? or am I misunderstanding this?
[19:02] <kepstin-laptop> Zeranoe: inside the configure script, there's dependency logic where if e.g. a filter or protocol or anything requires that a codec is enabled, it can list it in a special "select" list
[19:02] <kepstin-laptop> then if you enable that filter or whatever, it'll automatically also (recursively) enable everything in the select list too
[19:05] <kepstin-laptop> you can trace this backwards by just looking through the configure script, searching for encoder names; e.g. https://bpaste.net/show/6c863abeb256
[19:06] <kepstin-laptop> hmm. In theory, I guess you could use gdigrab without the bmp decoder; you could for example use -c:v copy and save the screen capture to (a series of) bmp files.
[19:07] <kepstin-laptop> but that sounds silly :)
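Pulling the whole thread together, a hedged sketch of a configure line for such a stripped-down build (the decoder names and exact set of --disable flags are illustrative, not a tested recipe):

    # parsers, hwaccels and the gdigrab input device all pull decoders back in,
    # so they have to be disabled alongside the decoders themselves
    ./configure --disable-decoders --disable-parsers --disable-hwaccels --disable-indev=gdigrab \
        --enable-decoder=h264 --enable-decoder=mp3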
[19:10] <owonoko> i have 2 cores on this system, should i be able to get more than one core for a libx264 encoding?
[19:11] <owonoko> 1 socket, 2 cores, 4 logical processors (i5-337U @ 1.8GHz), i can see it's only using one of them
[19:11] <c_14> owonoko: did you compile with pthreads ?
[19:11] <c_14> (it should be default)
[19:11] <owonoko> good question, it's a windows tablet
[19:12] <c_14> ugh, eh
[19:12] <c_14> Where did you get ffmpeg from?
[19:13] <owonoko> i just read that rendering wma2 is slow sometimes because of the wrong use of an assembler like yasm as well, my wma2 is super slow to encode
[19:13] <owonoko> the zeranoe builds
[19:13] <owonoko> it's been a couple of years since i did a windows cross compile from source
[19:13] <owonoko> wonder if it's worth it
[19:14] <owonoko> it's about 220 seconds to produce a 60 second wmv file at 640x480
[19:15] <owonoko> it has pthreads support http://ffmpeg.zeranoe.com/builds/
[19:17] <owonoko> basically i've got a situation where my camera on the tablet produces mjpeg at 1280x720 and i need to preview that in a flash player based user interface (after it has been recorded) then i need to output a truncated and cropped version of the recording as a wmv
[19:18] <owonoko> so i'm going mjpeg avi container -> disk, disk -> vp, disk -> [filter] ->  wmv
[19:18] <owonoko> not sure if that's the best way to do it
[19:18] <c_14> Can you pastebin your current commandline and output?
[19:19] <owonoko> sure
[19:19] <pmarty> When piping WAVE data to stdout such as in: "ffmpeg -i image.ape -f wav - >image.wav" ffmpeg sets length fields in the RIFF header to 0xFFFF...FF values for some reason (treats input as unknown length?). Is it possible to make it set correct length (get it from APE input file)?
[19:20] <pmarty> I'm piping to application that splits the wav into multiple files and would like to avoid temp files
[19:20] <kepstin-laptop> pmarty: maybe for some special cases, but what if you were using an audio filter or something?
[19:20] <kepstin-laptop> pmarty: why can't you just use headerless (raw) audio?
[19:21] <pmarty> kepstin-laptop: hmm, that's an idea, I wonder if this thing (shnsplit) can consume it
[19:22] <c_14> pmarty: if you're just trying to run shnsplit on the ape, shnsplit can handle ape, you don't need to convert to pcm
[19:23] <pmarty> c_14: yes but it expects me to have mac in my PATH
[19:23] <c_14> export PATH=/path/to/mac:$PATH ?
[19:23] <c_14> Or do you not have the program at all?
[19:23] <kepstin-laptop> is it having any particular problem with that length value? It's fairly common for applications which are streaming wav to use that as a de-facto "no length set"
[19:24] <pmarty> I wanted to use ffmpeg instead because mac has some license issues
[19:24] <pmarty> (I don't know, it's not in debian)
[19:24] <c_14> mhmk
[19:24] <pmarty> kepstin-laptop: yes it chokes on last track
[19:25] <c_14> Try adding -t (length-of-file)
[19:25] <pmarty> it treats header length literally
[19:25] <c_14> ffmpeg might be smart enough to write the header if it knows it'll only create x seconds of output
[19:27] <kepstin-laptop> c_14: there's no general way for it to know in advance, tho, thanks to stuff like filter chains
[19:28] <c_14> Not even if you explicitly set the max length for that output file?
[19:28] <kepstin-laptop> huh, strange. it doesn't look like there is support for raw cd audio format data input in shnsplit.
[19:29] <pmarty> yep, shnsplit wants WAVE delivered from its format backends
[19:29] <kepstin-laptop> I normally use 'bchunk' to do that, which does take raw audio input, but it doesn't support any fancy stuff like encoding audio for you :)
[19:34] <pmarty> This is how I imagined my clever command to do APE image -> *.flac: shnsplit -i 'ape ffmpeg -nostats -i %f -f s16le -' -o 'flac flac -8so %f -' -f cue image.ape
[19:36] <pmarty> s/s16le/wav/ makes it work for every track except the last
[19:56] <BtbN> What's the best way to concat a crazy number of .ts files (like 300k), with random format changes all over the place?
[19:56] <BtbN> without re-encoding, with a new part each time the format gets incompatible
[19:56] <BtbN> -f concat takes _ages_
[19:56] <c_14> cat *ts > concat.ts
[19:57] <c_14> You should be able to concat ts files like that.
[20:00] <BtbN> c_14, as long as they contain compatible formats
[20:00] <BtbN> the moment it gets incompatible (resolution change, for example), it ('it' being whatever youtube does with it) just continues with a gray video
[20:02] <BtbN> so i somehow have to identify those spots. running ffprobe on 300k files and analyzing its output might take a while
[20:04] <c_14> The only way I can think of is programmatical.
[20:10] <Hello71> -c copy?
[20:19] <BtbN> Hello71, doesn't help at all if the ts files don't belong to the same stream anymore
[20:22] <BtbN> the timestamps make a huge jump, and even the video format might change
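A hedged sketch of that identification step with ffprobe (segment names are placeholders; with 300k files it will still take a while, but it parallelizes easily):

    # one line per segment listing its video codec and resolution; consecutive runs with
    # identical format lines can then be cat'ed together into one output .ts each
    for f in seg*.ts; do
        printf '%s %s\n' "$f" "$(ffprobe -v error -select_streams v:0 \
            -show_entries stream=width,height,codec_name -of csv=p=0 "$f")"
    done > formats.txt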
[21:09] <moje> how can I record my desktop screen + virtual audio + (optional: microphone audio) ?
[21:10] <c_14> https://trac.ffmpeg.org/wiki/Capture/Desktop
[21:10] <c_14> https://trac.ffmpeg.org/wiki/Capture/ALSA
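Those wiki pages cover it; a hedged sketch along those lines for Linux (X11 + ALSA assumed; the screen size and device names are placeholders, and a second -f alsa input plus the amix filter would add the microphone):

    ffmpeg -f x11grab -video_size 1920x1080 -framerate 25 -i :0.0 \
        -f alsa -i default \
        -c:v libx264 -preset veryfast -c:a aac out.mkv
    # older builds may need "-strict -2" for the native aac encoder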
[21:11] <hannes3> hey, where can i find the logs for this channel? i want to follow up on a regression bug i had in march (really :) )
[21:12] <llogan> hannes3: !pb logs
[21:12] <llogan> damn it
[21:13] <hannes3> :)
[21:13] <llogan> http://lists.ffmpeg.org/pipermail/ffmpeg-devel-irc/
[21:13] <llogan> -devel and -user are stored there
[21:14] <hannes3> thanks
[21:16] <cyphase> anyone have any thoughts on encoding a 64 and 16 kbps mp3 from a 128kbps mp3 vs a ~700kbps wav? the 128kbps having been encoded from that wav
[21:18] <cyphase> how much difference would you expect in quality? this is for a talk show with intro/outro music beds
[21:18] <hannes3> i'd use opus if you can
[21:18] <kepstin-laptop> well, you're always gonna get better quality by reducing the number of generations of lossy encoding. But really, 64 and 16kbps mp3? that's just gonna be horrid.
[21:20] <llogan> ...and probably for stereo input too
[21:21] <cyphase> it's mono
[21:21] <kepstin-laptop> I suppose if you're doing mono, the 64kbps mp3 is probably acceptable.
[21:21] <cyphase> i was going to say, it's pretty good
[21:22] <cyphase> the 16kbps is certainly undesirable, but it's for low-bandwidth people
[21:22] <cyphase> the host keeps the 128kbps as the archival quality file
[21:22] <cyphase> there's little difference from the raw wavs
[21:23] <llogan> why not archive as flac?
[21:23] <cyphase> *i* would keep the wavs, just because, but it's not my call :)
[21:23] <cyphase> i thought about suggesting that; how do you think file size would compare?
[21:24] <llogan> i don't know. smaller.
[21:24] <cyphase> the 128kbps file is ~112MB. the wavs total just under 600MB
[21:24] <kepstin-laptop> depends; if it's mostly speech, I think the predictor in flac actually works pretty well for that.
[21:24] <kepstin-laptop> the closer your audio is to noise, the worse lossless compression works ;)
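A hedged sketch of the flac-archive idea, assuming a build with libmp3lame for the listening copies:

    # lossless archive; mp3/opus copies can be regenerated from it at any time
    ffmpeg -i show.wav -c:a flac archive.flac
    ffmpeg -i archive.flac -ac 1 -b:a 64k listen64.mp3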
[21:25] <owonoko> c_14: hey mate, here's that output http://dpaste.com/038SAG5
[21:26] <Devrim> cyphase any reason you are using mp3 and not something like aac?
[21:26] <kepstin-laptop> amusingly, opus compresses noise pretty well, since it just encodes "this frequency band has noise at this power level" :)
[21:26] <owonoko> c_14: so it's only using one of the 2 cores, i've tried various -threads settings, the input avi is coming from a dump from the camera, i'd love to improve its performance at any rate since it's affecting the other outputs too (vp6, h264)
[21:27] <cyphase> Devrim, that's what the show's been using forever. but i could talk to the host if there was a benefit i could bring to him
[21:28] <c_14> owonoko: the msmpeg4 encoder doesn't support threading
[21:29] <owonoko> c_14: right that was only part of my question originally
[21:29] <owonoko> c_14: can i do anything else to improve the transcode to wmv
[21:29] <owonoko> c_14: re the yasm cross compile
[21:31] <c_14> The build you have doesn't seem to be disabling yasm, so it should be fine.
[21:32] <owonoko> what i was thinking is that there may be some advantage to building against the features of this specific cpu
[21:34] <c_14> There _might_ be, but you probably wouldn't notice without benchmarking and it'll only really be worth it if you're planning on using it extensively.
[21:34] <owonoko> this tablet is going to produce hundreds of output movies, i need to sort out the performance any way i can
[21:35] <cyphase> Devrim, what would be the benefit of switching their archives/podcast to aac?
[21:35] <llogan> owonoko: why not use a different encoder?
[21:35] <Devrim> cyphase better compression, higher quality for the same filesize
[21:35] <owonoko> thinking about paying microsoft for their pro encoder thing, but haven't looked into it yet
[21:35] <llogan> (i didn't read channel history)
[21:35] <owonoko> llogan: for corporate internet capable wmv files?
[21:35] <Devrim> the disadvantage would be that mp3 is supported on more devices (very old mobile devices might not play aac)
[21:35] <llogan> owonoko: bleh, sounds awful.
[21:36] <llogan> you can't use H.264 in MP4 container?
[21:46] <owonoko> llogan: windows office users have a pretty limited wmp install
[21:46] <owonoko> but that's what i want long term, for us to come to an agreement about a codec
[22:13] <BtbN> Why can't i copy this mpegts file to another mpegts one, while flv output works fine?
[22:13] <BtbN> I get [mpegts @ 0x221b520] H.264 bitstream malformed, no startcode found, use the h264_mp4toannexb bitstream filter (-bsf h264_mp4toannexb)
[22:14] <BtbN> Adding said option only leads to "Failed to open bitstream filter h264_mp4toannexb for stream 0 with codec copy: Invalid argument"
[22:15] <BtbN> Oh, yeah. For flv i still need to add -bsf:a aac_adtstoasc
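For reference, a hedged sketch of the usual bitstream-filter incantations for stream copy (newer ffmpeg spells the option with a stream specifier, -bsf:v / -bsf:a):

    # MP4/FLV-style H.264 going into MPEG-TS needs Annex B start codes
    ffmpeg -i in.mp4 -c copy -bsf:v h264_mp4toannexb out.ts
    # ADTS AAC coming from MPEG-TS into FLV/MP4 needs the reverse conversion for audio
    ffmpeg -i in.ts -c copy -bsf:a aac_adtstoasc out.flv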
[23:23] <DiegoMax_> anyone here uses FFServer ?
[00:00] --- Fri Nov 21 2014

