[Ffmpeg-devel-irc] ffmpeg.log.20160517

burek burek021 at gmail.com
Wed May 18 02:05:01 CEST 2016


[00:32:49 CEST] <pfelt> vikash: /me thinks more data would be helpful.  what streams did ffmpeg parse out of video5.mp4 and video7.mp4 ?
[00:35:56 CEST] <vikash> pfelt: I am trying to join two videos side by side and also their audio into a single channel
[00:39:31 CEST] <vikash> pfelt: ffprobe http://pastebin.com/4P2uJM3R
[00:56:50 CEST] <pfelt> vikash: can you rerun that for video5 too ?
[01:10:15 CEST] <vikash> pfelt: video5 http://pastebin.com/AucBNUkP
[01:13:05 CEST] <pfelt> beyond me :(
[01:14:31 CEST] <vikash> pfelt: its ok. anyway thanks :)
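For reference, the side-by-side join vikash describes is usually done with the hstack and amix filters; a sketch only, assuming both inputs share the same height and pixel format and that mixing the two audio tracks into one is acceptable:

  ffmpeg -i video5.mp4 -i video7.mp4 \
    -filter_complex "[0:v][1:v]hstack=inputs=2[v];[0:a][1:a]amix=inputs=2:duration=longest[a]" \
    -map "[v]" -map "[a]" -c:v libx264 -c:a aac out.mp4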
[05:07:14 CEST] <Ben321> I have a question about AVI output by FFMPEG. When saving an AVI file that is <2GB does it use AVI 1.0 (the older type, that supports up to 2GB files), that has one index at the end of the file? Or does it always output AVI 2.0 (the newer type that supports over 2GB files) that has a bunch of additional overhead to it (such as an additional index), regardless of the size of the AVI file that...
[05:07:16 CEST] <Ben321> ...will be output?
[05:08:02 CEST] <c_14> ffmpeg doesn't know how big the output will be when it starts writing, so probably the second
[05:08:31 CEST] <Ben321> I was hoping it could output AVI 1.0 files, because I'm writing my own video processor, but it only works on the bare simplest AVI files, that is, version 1.0 AVI files that contain only raw RGB pixel-format frames.
[05:09:44 CEST] <Ben321> Is there some kind of filter I can use in FFMPEG to signal it to only output AVI 1.0 files, and to simply stop its conversion process and display an error in the console if it exceeds 2GB in size?
[05:10:03 CEST] <c_14> Didn't see an option listed for the avi muxer, so probably no
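For what it's worth, the generic -fs output option caps the output file size in bytes (ffmpeg finishes the file when the limit is reached rather than raising an error), and it does not change the AVI version, so it only helps if staying under 2 GB is the actual goal; a sketch:

  ffmpeg -i input.mkv -c:v rawvideo -pix_fmt bgr24 -fs 2000000000 output.avi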
[05:12:06 CEST] <Ben321> The great thing you see about this other program I have called VirtualDub, is you can set it to output AVI 1.0 files. Unfortunately VirtualDub doesn't accept as input the HUGE variety of video file formats that FFMPEG does. And with FFMPEG, while it does accept a wide variety of input file formats, its AVI output is limited to AVI 2.0, which makes it very difficult to actually write...
[05:12:08 CEST] <Ben321> ...software to process, even if the frames are raw RGB.
[05:12:57 CEST] <Ben321> Are you aware if the developers of FFMPEG have any intention to implement the ability to write AVI 1.0 files?
[05:13:29 CEST] <c_14> Unlikely
[05:16:22 CEST] <Ben321> Are you aware of any other program that actually accepts the huge variety of input formats that FFMPEG does, that also is capable of writing to AVI 1.0 files? I'm guessing there MIGHT be, but it's probably some kind of professional video editing software that costs thousands of dollars, and is only used by professionals in the movie and TV industries. That's why I was hoping that FFMPEG...
[05:16:23 CEST] <Ben321> ...could do it.
[05:16:46 CEST] <c_14> *shrug*
[05:16:59 CEST] <Ben321> I don't want to shell out big bucks for some specialized professional software.
[05:17:02 CEST] <c_14> You could always just use the libav* libraries to demux whatever you want in your program
[05:18:11 CEST] <Ben321> Does FFMPEG use libav for decompression, or does it use its own routines? And is there a standalone libav.dll file I can download, or would I have to compile it from source code?
[05:18:51 CEST] <c_14> FFmpeg is a project containing the libav* libraries (libavformat, libavcodec etc). ffmpeg is a commandline program that uses the libav* libraries
[05:19:25 CEST] <c_14> If you're on Windows, zeranoe has shared/dev builds that provide pre-built libraries
[05:21:04 CEST] <Ben321> Does that mean that libav can support all the same file formats (including exotic/rare ones) that FFMPEG can, as input to be decompressed?
[05:22:10 CEST] <c_14> Yes, the libav* libraries can support all the file formats the ffmpeg binary can. (Don't confuse libav with FFmpeg, libav is a separate project with similar functionality to what the FFmpeg project also provides)
[05:23:24 CEST] <c_14> https://ffmpeg.org/about.html
[05:23:26 CEST] <c_14> ^read that
[05:24:06 CEST] <Ben321> I'm looking at the shared (not static) version of FFMPEG now, and in the folder that has ffmpeg.exe, I see a number of DLL files, but none of them contain the name "libav". Some of them do start with "av" though, such as "avcodec-57.dll". Is these part of the libav library?
[05:24:39 CEST] <Ben321> Woops, typo. "Is" should have been "Are" in that last sentence.
[05:24:42 CEST] <c_14> There's several libraries, and yes each of those is one of them. The lib prefix is just stripped
[05:25:41 CEST] <Ben321> In the Zeranoe build for Windows, are these DLL files using the CDECL calling convention for their functions, or are they using the STDCALL calling convention?
[05:27:02 CEST] <c_14> I have no idea.
[05:28:00 CEST] <Ben321> I have a question on using FFMPEG. Is there a commandline switch that can be used to query statistics on a video file, without doing a conversion? For example, can I use it to find the Width, Height, and FrameRate of a video file?
[05:28:20 CEST] <c_14> https://ffmpeg.org/ffprobe.html
[05:28:24 CEST] <c_14> Short: Yes
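For example, something along these lines prints only the width, height and frame rate of the first video stream (the filename is a placeholder):

  ffprobe -v error -select_streams v:0 \
    -show_entries stream=width,height,avg_frame_rate \
    -of default=noprint_wrappers=1 input.mp4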
[05:30:29 CEST] <Ben321> Is there a place that I can find information on all the exported functions from the libav DLLs? I mean something that tells all the parameters, and return values for these functions? Are they documented anywhere? Are there any tutorials for using them to decompress a video file?
[05:31:15 CEST] <c_14> https://ffmpeg.org/doxygen/trunk/index.html
[05:32:02 CEST] <c_14> https://git.ffmpeg.org/gitweb/ffmpeg.git/tree/HEAD:/doc/examples
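As a rough illustration of what those examples cover, a minimal libavformat program that opens a file and prints the basic video parameters might look like the sketch below; the codecpar field assumes a reasonably recent FFmpeg, and error handling is reduced to the bare minimum:

  #include <stdio.h>
  #include <libavformat/avformat.h>
  #include <libavutil/rational.h>

  int main(int argc, char **argv)
  {
      AVFormatContext *fmt = NULL;
      unsigned i;

      if (argc < 2) {
          fprintf(stderr, "usage: %s <file>\n", argv[0]);
          return 1;
      }

      av_register_all();                 /* not needed on current FFmpeg versions */
      if (avformat_open_input(&fmt, argv[1], NULL, NULL) < 0)
          return 1;
      if (avformat_find_stream_info(fmt, NULL) < 0)
          return 1;

      for (i = 0; i < fmt->nb_streams; i++) {
          AVStream *st = fmt->streams[i];
          if (st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO)
              printf("video stream %u: %dx%d, %.3f fps\n", i,
                     st->codecpar->width, st->codecpar->height,
                     av_q2d(st->avg_frame_rate));
      }

      avformat_close_input(&fmt);
      return 0;
  }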
[05:33:43 CEST] <Ben321> Thanks. I'll take a look at those.
[07:21:23 CEST] <lesshaste> kepstin, I meant drop some frames. It is currently 60fps and I need <=  30
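Dropping a 60 fps source to 30 fps is typically a job for the fps filter (or -r 30 as an output option); a sketch:

  ffmpeg -i input.mp4 -vf fps=30 -c:a copy output.mp4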
[08:03:03 CEST] <taniey> hello ,
[08:03:11 CEST] <taniey> anybody here?
[08:04:15 CEST] <taniey> I have a question:
[08:04:33 CEST] <taniey> I am compiling the ffmpeg trunk dynamic library from source on Windows 10 64-bit, and the make command gives some errors.
[08:05:01 CEST] <taniey> I use msys2-w64 to build; the ffmpeg configure command:
[08:05:07 CEST] <taniey> ./configure --target-os=win64 --disable-sdl --disable-static --enable-shared
[08:05:23 CEST] <taniey> and that command executes successfully, but when I run make, there is an error:
[08:05:31 CEST] <taniey> ./compat/windows/makedef:L48: lib: *~0}ä
[08:05:31 CEST] <taniey> Could not create temporary library.
[08:05:31 CEST] <taniey> library.mak:97: recipe for target 'libavutil/avutil-55.dll' failed
[08:05:55 CEST] <taniey> I figured the path to 'lib' was not set, so I added it to PATH with this command:
[08:05:55 CEST] <taniey> export PATH=$PATH:'/c/Program Files (x86)/Microsoft Visual Studio 14.0/VC/bin'
[08:05:55 CEST] <taniey> then I ran make again, and the error still appeared:
[08:06:52 CEST] <taniey> I can't figure out how to solve this error.
[08:07:11 CEST] <taniey> can anybody help me?
[08:30:37 CEST] <yongyung> When rendering a video with -qp x, ffmpeg outputs multiple lines (normal encoder output) containing q=n with n being between around x-1 and x+2, why is that? Shouldn't -qp force a constant q?
[09:08:54 CEST] <kgkg> windows command line drives me nuts: trying to create a timelapse with:   ffmpeg.exe -i '%*.jpg' -r 30 -q:v 2 timelapse.mp4
[09:10:30 CEST] <kgkg> but it fails for the wildcard expression. I also tried '^%*.jpg' and '%%*.jpg' and of course '*.jpg' and *.jpg. nothing works
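The glob pattern type is generally not compiled into the Windows builds, so wildcards won't work there; the usual workaround is a printf-style numbered sequence (img%04d.jpg is a hypothetical naming scheme, and inside a .bat file the percent sign has to be doubled, i.e. img%%04d.jpg):

  ffmpeg.exe -framerate 30 -i img%04d.jpg -c:v libx264 -pix_fmt yuv420p timelapse.mp4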
[09:34:09 CEST] <DHE> yongyung: I believe libx264 will actually use different values for keyframes and bframes. you're probably seeing that
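If a truly flat QP matters, the per-frame-type offsets can, if memory serves, be flattened through x264's ratio settings, e.g.:

  ffmpeg -i input.mp4 -c:v libx264 -qp 23 -x264-params "ipratio=1.0:pbratio=1.0" output.mkv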
[10:36:11 CEST] <Macdeath> I want to read a live stream at an android client frame by frame. Does ffserver have a C API to support reading a live stream frame by frame?
[10:40:01 CEST] <AndroUser> I need help in receiving a live stream on an android client frame by frame
[10:40:17 CEST] <AndroUser> Does ffserver have a C API to do that ?
[10:42:34 CEST] <AndroUser> Or perhaps any other way to get individual frames using ffserver
[10:47:17 CEST] <JEEB> AndroUser: you're already out of support if you are using ffserver
[10:47:50 CEST] <JEEB> if it works for you, great. but in general it's a thing that is going to get removed during the next API bump as it's currently (mis)using some APIs that are going to get removed
[10:48:49 CEST] <AndroUser> JEEB: Is there a C API to read the stream published by ffmpeg frame by frame ?
[10:50:07 CEST] <JEEB> usually you receive or get a stream with a client and then you can read that with the libavformat/libavcodec APIs
[10:53:35 CEST] <AndroUser> JEEB: Should I then first read the stream using ffserver , then use clients on localhost
[11:22:52 CEST] <taniey>  How can I compile ffmpeg  dynamic library on windows ?
[11:23:18 CEST] <BtbN> --enable-shared
[11:24:53 CEST] <taniey> but when I use the '--enable-shared' option, there is an error when I run make
[11:25:03 CEST] <taniey> like this:
[11:25:23 CEST] <taniey> LD      libavutil/avutil-55.dll
[11:25:23 CEST] <taniey> gcc.exe: error: unrecognized command line option '-implib:libavutil/avutil.lib'
[11:26:00 CEST] <taniey> or like this:
[11:26:03 CEST] <taniey> Could not create temporary library.
[11:26:03 CEST] <taniey> library.mak:97: recipe for target 'libavutil/avutil-55.dll' failed
[11:59:02 CEST] <taniey> I know now why the error happened:
[11:59:44 CEST] <taniey> I had set the --target-os option incorrectly.
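For reference: --target-os=win64 selects the MSVC-style build rules (which is what was invoking lib.exe, makedef and -implib above). With a MinGW-w64 toolchain under MSYS2 the target OS is normally autodetected, so a shared build would look more like this sketch (untested here):

  ./configure --disable-sdl --disable-static --enable-shared
  make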
[13:15:34 CEST] <BucketNinja> Hello all!
[13:15:44 CEST] <BucketNinja> Anyone on yet?
[13:20:50 CEST] <BucketNinja> Hi Fahadash
[13:21:04 CEST] <fahadash> Yo
[13:21:14 CEST] <JEEB> just ask your question - nobody is going to reply to random questions on whether or not someone is "around"
[13:21:25 CEST] <JEEB> and then stick around and you might get a reply
[13:22:09 CEST] <BucketNinja> Oh, sorry, first time on, just trying to be courteous.
[13:24:17 CEST] <BucketNinja> I really don't have any questions at the moment. I am a user of FFMPEG and a big fan of the tech, I just wanted to be around the chat channel for any news or questions that might arise. Is that okay?
[13:26:02 CEST] <JEEB> most people here are lurking until they have something to ask etc
[13:29:02 CEST] <iive> it's ok to lurk around. and if you ever know the answer of somebody's question, feel free to share :D
[13:30:00 CEST] <JEEB> yeah, that's the other side of it
[13:30:08 CEST] <JEEB> there are masochists like me who might reply to people's questions
[13:30:27 CEST] <BucketNinja> I will reply if I know the answer
[13:47:57 CEST] <r3Mod> hi
[13:48:58 CEST] <r3Mod> my ffmpeg stops transcoding with error "invalid cbp -4 at 72 52". VLC doesn't raise this error and continues the playback. Is there an option to ignore this error?
[14:42:38 CEST] <sasos90> hey guys
[14:43:02 CEST] <sasos90> trying to merge audio with video. (video already has audio and i want to keep it) how can i put it at a specific location?
[14:43:10 CEST] <sasos90> i am using ffmpeg for android (java)
[14:43:36 CEST] <sasos90> i tried everything (adelay...); adelay delays everything and duplicates the sound
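For what it's worth, placing an extra audio track at a specific offset while keeping the original track is usually a combination of adelay (delay in milliseconds, one value per channel) and amix; a sketch, assuming the extra sound should start at 5 seconds and with placeholder filenames:

  ffmpeg -i video.mp4 -i extra.mp3 \
    -filter_complex "[1:a]adelay=5000|5000[d];[0:a][d]amix=inputs=2:duration=first[a]" \
    -map 0:v -map "[a]" -c:v copy -c:a aac out.mp4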
[15:31:41 CEST] <RoyK> hi all. any idea what sort of video format/codec is used over the HDMI connection?
[15:33:15 CEST] <furq> RoyK: rawvideo
[15:33:38 CEST] <furq> in yuv 4:4:4, yuv 4:2:2 or rgb24
[15:34:17 CEST] <RoyK> just having a discussion with a friend - what if she wants to view 4k with 60fps and 12bit colour depth, that's 19Gbps, how can the hardware really handle this?
[15:35:35 CEST] <jkqxz> It can't.  Use DisplayPort or something else instead.
[15:36:08 CEST] <furq> hdmi 2.0 can do 4k at 60fps
[15:36:26 CEST] <jkqxz> Not in 12-bit colour.
[15:36:26 CEST] <furq> i'm not sure whether it'll manage it at that kind of bit depth though
[15:36:45 CEST] <furq> the 2.0 spec allows 4:2:0 for 4k video, presumably because of bandwidth limitations
[15:37:15 CEST] <furq> but yeah according to wikipedia the maximum throughput is 18gbps
[15:37:20 CEST] <furq> which isn't far off
[15:38:19 CEST] <furq> actually nvm it's 14.4 once you remove the 8b/10b overhead
[16:07:38 CEST] <Timster> Hey, guys - I need to split a file at the blacks. I am using ffmpeg from earlier this month. Could somebody please tell me what's wrong with this: http://pastebin.com/xaZv98Vy
[16:07:58 CEST] <Timster> I set the path to the ffmpeg.exe and the path to the .mp4 file
[16:08:03 CEST] <Timster> But it doesn't seem to work
[16:08:39 CEST] <Timster> The script is from 2014 - maybe it uses features that are different now?
[16:12:26 CEST] <durandal_1707> what it does instead?
[16:14:30 CEST] <Timster> The windows console closes
[16:14:43 CEST] <Timster> Not sure where to find a log file if there is one
[16:15:12 CEST] <pgorley> I'm compiling against OS X 10.10 with FFmpeg master (pulled earlier today). Everything compiles, but when I run the ffmpeg tool, I'm getting an error that CoreImage.framework could not be found. Any ideas?
[16:39:42 CEST] <pihpah> I want to stream all videos from a directory but still can't figure out how I have to set up ffserver for that. The sample config file at https://ffmpeg.org/sample.html only shows streaming of individual files.
[16:42:21 CEST] <MrSassyPants> Ok the past couple of days I asked how to enable scene cut key frames in ffmpeg+libvpx for vp9, I think I figured it out
[16:42:47 CEST] <MrSassyPants> It is necessary to do a 2 pass encode, even if you don't define a bitrate.
[16:43:28 CEST] <MrSassyPants> This also means that the information I received a week ago from this channel, that 2 pass encodes supposedly did nothing for constant-quality encodes, is wrong. Just FYI.
[16:45:07 CEST] <iive> is that written somewhere or you discovered this by testing?
[16:46:08 CEST] <iive> because scene change is quite easy to detect on single pass.
[16:47:50 CEST] <Admin__> hey guys.. quick simple question.. converted videos into HLS ( h265 ) , is there something i can do to speed up channel changes ? every time i change the channel to that content it takes like 3 seconds to start the video.
[16:48:03 CEST] <Admin__> any way to insert something on video or something that will allow quick playback ?
[16:49:16 CEST] <DHE> reduce the -hls_time parameter
[16:49:26 CEST] <Admin__> its 6 seconds
[16:49:27 CEST] <DHE> though it varies by player
[16:49:39 CEST] <DHE> try 4 for experimentation then
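For example, roughly (hls_time and hls_list_size are hls muxer options; note that segments can only be cut at keyframes, so the encoder's keyframe interval matters as much as hls_time):

  ffmpeg -i input.mp4 -c:v libx265 -c:a aac -f hls -hls_time 4 -hls_list_size 6 out.m3u8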
[16:49:39 CEST] <Admin__> but with h264 it plays fast
[16:49:41 CEST] <Admin__> odd right ?
[16:49:44 CEST] <DHE> oh?
[16:49:50 CEST] <MrSassyPants> iive, discovered with testing
[16:49:53 CEST] <Admin__> ya h264 its quick start
[16:50:17 CEST] <MrSassyPants> iive, I would have thought too, that scene change is easy to detect in single pass, but alas the codec doesn't seem to enable them in single pass.
[16:51:31 CEST] <furq> libvpx generally seems quite bad
[16:53:35 CEST] <JEEB> yes, esp. vp9 is not a bad format. but libvpx is pretty bad
[16:53:46 CEST] <furq> yeah it's a shame
[16:54:58 CEST] <RoyK> furq: thanks - do you know where I can learn more about YUV differences?
[16:55:08 CEST] <furq> differences to what
[16:55:15 CEST] <RoyK> those numbers
[16:55:21 CEST] <RoyK> 444/420
[16:55:22 CEST] <RoyK> etc
[16:55:23 CEST] <furq> https://en.wikipedia.org/wiki/Chroma_subsampling
[16:55:42 CEST] <RoyK> thanks
[17:20:53 CEST] <pfelt> is there any way to improve the speed on changing the pixel format?  my incoming stream is yuv422p10le.  i'm trying to crop to 4 quarters, scale up, and output to 4 decklink cards which need uyvy422.  performance tanks when i do the pixel conversion
[17:23:27 CEST] <kepstin> yeah, the 10bit to 8bit conversion will be a bit slow, has to dither, etc.
[17:23:46 CEST] <kepstin> doing the cropping before conversion then scaling after would be fastest, i think
[17:24:34 CEST] <pfelt> i've tried variations with the format filter, but i don't believe i've tried crop -> format -> scale
[17:24:38 CEST] Action: pfelt will play with that
[17:25:02 CEST] <pfelt> one thing i don't actually know is if the crop filter supports input of uyvy422
[17:25:11 CEST] <pfelt> so it may be double converting just to be able to crop
[17:26:01 CEST] <kepstin> cropping is superfast, you want to convert as few pixels as possible, scaling is faster on 8bit.
[17:26:26 CEST] <kepstin> i'm pretty sure crop works on basically everything. you can up the log level to see if it autoinserted any convert filters if you want to be sure
[17:26:31 CEST] <pfelt> i don't suppose that crop can output 4 outputs?
[17:26:51 CEST] <pfelt> i'm toying with the idea of seeing if uyvy is supported via gdb
[17:27:20 CEST] <kepstin> crop is a ridiculously simple 1-in-1-out filter. All it does is adjust some pointers and numbers in the frame metadata.
[17:27:47 CEST] <pfelt> ah.  so i *do* then need the split
[17:30:03 CEST] <kepstin> hmm. if you're gonna be using all parts of the image, rather than only 1/4 of it, then convert,split,(crop,resize)×4 is probably best.
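A sketch of that shape of graph, with the input name, output sizes and DeckLink device names as placeholders:

  ffmpeg -i INPUT -filter_complex "format=uyvy422,split=4[a][b][c][d];[a]crop=iw/2:ih/2:0:0,scale=1920:1080[tl];[b]crop=iw/2:ih/2:iw/2:0,scale=1920:1080[tr];[c]crop=iw/2:ih/2:0:ih/2,scale=1920:1080[bl];[d]crop=iw/2:ih/2:iw/2:ih/2,scale=1920:1080[br]" \
    -map "[tl]" -f decklink "DeckLink 1" -map "[tr]" -f decklink "DeckLink 2" \
    -map "[bl]" -f decklink "DeckLink 3" -map "[br]" -f decklink "DeckLink 4"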
[17:31:21 CEST] <kepstin> kind of a pity all the ffmpeg swscale code is single-threaded cpu stuff, that sort of color conversion thing is quite parallelizable - and video players usually do it in gpu rather than cpu.
[17:31:47 CEST] <furq> you could try zscale for scaling and colourspace conversion
[17:31:51 CEST] <furq> i have no idea if it'll be faster though
[17:32:44 CEST] <pfelt> so best i can get atm with split, crop, scale, convert or split, crop, convert, scale is about 11fps
[17:32:50 CEST] <pfelt> not suitable for a live stream :(
[17:33:02 CEST] <pfelt> let me look up zscale.  i've not used it before
[17:33:28 CEST] <furq> it's not included by default but it is in relaxed's static builds if you're using those
[17:33:37 CEST] <furq> https://ffmpeg.org/ffmpeg-filters.html#zscale
[17:34:01 CEST] <furq> it should be higher quality if nothing else
[17:34:46 CEST] <pfelt> i'm in git so if it's not there i can add it
[17:35:05 CEST] <pfelt> either i'm not understanding some of these options, or it doesn't convert pixfmt
[17:36:26 CEST] <kepstin> pfelt: it automatically converts pixel format if needed, you can use "format=xxx" to constrain it, I think?
[17:36:55 CEST] <kepstin> you'll want to make sure you're running ffmpeg with verbosity that shows autoinserted filters to make sure it's working
[17:37:09 CEST] <durandal_1707> you need to add format filter after it
[17:37:17 CEST] <pfelt> kepstin: you mean something like -loglevel debug ?
[17:37:18 CEST] <furq> yeah iirc -vf zscale,format=uyvy422 will do the conversion in zscale
[17:37:37 CEST] <pfelt> ah.  format is a completely different filter
[17:37:42 CEST] <pfelt> (that's what the , is)
[17:37:49 CEST] <furq> If the input image format is different from the format requested by the next filter, the zscale filter will convert the input to the requested format.
[17:37:50 CEST] <durandal_1707> No, packed is not supported
[17:38:18 CEST] <pfelt> so decklink supports some other formats
[17:38:33 CEST] <durandal_1707> Use planar, then swscale to convert to packed
[17:38:34 CEST] <pfelt> but i was having a hard time converting from blackmagic speak to ffmpeg speak
[17:38:45 CEST] <pfelt> (in terms of what the format names actually are)
[17:39:07 CEST] <kepstin> if decklink can do a planar 4:2:2 format, you can probably convince ffmpeg to do something that matches :)
[17:39:35 CEST] <pfelt> so here's where i'm *way* behind the curve
[17:39:39 CEST] <pfelt> planar is yuv right?
[17:39:51 CEST] <kepstin> no, y, u, v are the color components
[17:40:13 CEST] <kepstin> planar means you have all the y, followed by all the u, followed by all the v - like 3 separate monochrome images
[17:40:30 CEST] <pfelt> oh,
[17:40:41 CEST] <pfelt> and uyvy is actually just that pixel-wise in the stream
[17:40:55 CEST] <pfelt> i was reading a link about that last night
[17:41:22 CEST] <kepstin> yeah, that's a packed format, it means there's a single image stream with all the y, u, v components mixed together (alternating)
[17:41:27 CEST] <pfelt> https://www.ptgrey.com/KB/10092
[17:41:30 CEST] <pfelt> ok.  yeah
[17:41:39 CEST] <pfelt> let me see if i can figure out a planar format
[17:42:01 CEST] <kepstin> it's kind of complicated with subsampled formats like 4:2:2 - that's why it has the funky "u, y, v, y" order rather than just "y, u, v" repeating
[17:42:21 CEST] <furq> i expect planar to planar is faster if they're both 4:2:2
[17:42:24 CEST] <pfelt> safe to say that planar != packed though
[17:42:40 CEST] <furq> or especially if they're both 4:2:2, rather
[17:43:21 CEST] <furq> actually nvm one is 10-bit isn't it
[17:44:19 CEST] <pfelt> here's what decklink says:
[17:44:20 CEST] <pfelt> http://pastebin.com/rz8XtNLM
[17:44:34 CEST] <pfelt> not the best fmt, sry
[17:44:49 CEST] <furq> "8-bit YUV" that's helpful
[17:45:44 CEST] <pfelt> that's uyvy422
[18:00:34 CEST] <furq> pfelt: apparently the 10-bit yuv input format is v210, which is what you're getting as input
[18:00:45 CEST] <furq> -c:v v210
[18:01:02 CEST] <furq> s/input format/pixel format/
[18:05:16 CEST] <pfelt> but 10bit isn't planar either
[18:05:22 CEST] <pfelt> so crop/scale is slow
[18:05:26 CEST] <pfelt> ?
[18:07:15 CEST] <furq> i assume the slow part is adding the dithering for 8-bit
[18:07:44 CEST] <kepstin> i'm pretty sure crop is fast, even on packed formats (for ones it supports)
[18:08:02 CEST] <furq> i don't have any yuv422p10le source to test with
[18:10:46 CEST] <pfelt> i suppose i could remove the scale and see how performance is
[18:13:10 CEST] <pfelt> looks like i'm getting about 60fps with no scale but a format change
[18:14:13 CEST] <pfelt> ah hah!  i'm only getting about 28 with no conversion but a scale
[18:14:28 CEST] <pfelt> (so split -> crop -> scale)
[18:14:28 CEST] <furq> is that converting to uyvy or v210
[18:14:41 CEST] <pfelt> no conversion.  input is yuv422p10le, output is too
[18:14:58 CEST] <pfelt> so unless there is a convert transparently in one of those filters, it's staying 10bit
[18:15:18 CEST] <furq> are you scaling each output after cropping
[18:15:26 CEST] <pfelt> yep.
[18:15:39 CEST] <furq> is that faster than scaling before cropping
[18:15:42 CEST] <pfelt> it's going from 640x360 up to 1920x1080
[18:16:02 CEST] <pfelt> let me try the scale then crop
[18:16:06 CEST] <furq> i'm not familiar enough with swscale's internals to know how it'd handle four concurrent scales
[18:21:11 CEST] <pfelt> so i'm still getting about 28fps.  now it's scale (3840x2160), split, crop (x4).  input is yuv422p10le as is the output
[18:34:37 CEST] <Fyr> is it possible to rip with ffmpeg?
[18:38:24 CEST] <furq> rip what
[18:38:29 CEST] <Fyr> BD
[18:38:57 CEST] <furq> if you decrypt it first then sure
[18:39:09 CEST] <Fyr> then, how to decrypt it?
[18:59:16 CEST] <pfelt> is there any sort of an option to kill ffmpeg after so many seconds or so many frames encoded?
[18:59:24 CEST] <furq> -frames:v
[18:59:50 CEST] <pfelt> sweet!
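For example, either of these stops after a fixed amount of output (a frame count or a duration in seconds):

  ffmpeg -i input -frames:v 300 output.mp4
  ffmpeg -i input -t 10 output.mp4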
[19:05:12 CEST] <pfelt> hmm.  fyi: yuv422p10le is def not the same as uyvy422
[19:05:29 CEST] <pfelt> cool effect on the image tho
[19:08:18 CEST] <pfelt> so.  anything i can do programmatically to thread this?
[19:11:17 CEST] <furq> pfelt: if that was directed at me then i must have phrased something really badly
[19:11:34 CEST] <pfelt> the threading question?
[19:11:39 CEST] <pfelt> (or the pixel statement)
[19:11:40 CEST] <furq> no the pixel formats thing
[19:12:11 CEST] <pfelt> oh ya know.  i think you said v210 was what the input was
[19:12:18 CEST] <pfelt> not that v210 and uyvy were the same
[19:12:20 CEST] <pfelt> sry
[19:12:43 CEST] <furq> as far as i can tell you can use v210 as an output format and avoid the 10-bit to 8-bit conversion
[19:12:58 CEST] <furq> also i think the conversion isn't done through swscale so it might be faster
[19:13:30 CEST] <furq> `-f rawvideo -c:v v210` shouldn't need any explicit pix_fmt conversion
[19:14:21 CEST] <pfelt> heh.  segfault
[19:14:24 CEST] Action: pfelt debugs it
[19:14:27 CEST] <furq> fun
[19:14:50 CEST] <furq> that information is pieced together from three or four separate google searches so it might be bullshit
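Spelled out, furq's suggestion would look roughly like this (filenames are placeholders; this just writes raw v210 to a file, since whether a given decklink output accepts v210 directly depends on the ffmpeg version and card):

  ffmpeg -i input.mov -vf "crop=iw/2:ih/2:0:0,scale=1920:1080" -c:v v210 -f rawvideo quarter.v210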
[19:19:53 CEST] <pihpah> I am getting this error when trying to stream my video:  [mp4 @ 0x562e928333f0]Tag avc1/0x31637661 incompatible with output codec id '28' ([33][0][0][0])
[19:20:03 CEST] <pihpah> Anyone knows what's the problem?
[19:20:47 CEST] <pihpah> I was able to stream that video using the Apache web server, but with ffserver it just doesn't work.
[19:23:20 CEST] <OmegaVVeapon> Does anyone know how to create a looped mpeg stream with ffmpeg?
[19:23:37 CEST] <OmegaVVeapon> I'm currently doing "ffmpeg -i /opt/myvideo.ts -f mpegts udp://239.69.69.69:5678"
[19:23:53 CEST] <OmegaVVeapon> but it will exit as soon as the input video is over :/
[19:30:42 CEST] <pfelt> OmegaVVeapon: i think the movie filter might do it
[19:30:55 CEST] <pfelt> depends on your input file contents
[20:04:10 CEST] <thebombzen> OmegaVVeapon:
[20:04:13 CEST] <OmegaVVeapon> pfelt: It's cool, I figured it out (really need to read the man pages more...)
[20:04:15 CEST] <thebombzen> try -loop 0 before 0i
[20:04:22 CEST] <thebombzen> before -i*
[20:04:29 CEST] <OmegaVVeapon> -stream_loop number -1 does the trick
[20:04:44 CEST] <thebombzen> ah. I use -loop 0 with ffplay but IDK about ffmpeg.
[20:05:54 CEST] <OmegaVVeapon> "ffmpeg -re -i /opt/myvideo.ts -f mpegts -c copy udp://239.69.69.69:5678?pkt_size=188&buffer_size=16777216 -stream_loop -1"
[20:06:19 CEST] <OmegaVVeapon> that is the entire command, gets to my endpoint and looks awesome on vlc
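One detail worth noting: -stream_loop is an input option, so the conventional placement is before -i, and the '&' in the UDP URL usually needs quoting in a shell; roughly:

  ffmpeg -re -stream_loop -1 -i /opt/myvideo.ts -c copy -f mpegts \
    "udp://239.69.69.69:5678?pkt_size=188&buffer_size=16777216"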
[20:11:46 CEST] <lavalike> kepstin: small update on stillimage vs touhou, the latter is twice as slow (=
[20:12:16 CEST] <klaxa> touhou? in my #ffmpeg?
[20:12:16 CEST] <kepstin> it uses twice as many reference frames.
[20:12:37 CEST] <kepstin> so, yeah. (i'd expect -tune animation to be similar)
[20:12:46 CEST] <furq> better compression is slower? never
[20:13:25 CEST] <kepstin> (you could switch to a faster preset, but it'll offset some of the benefits of the tune...)
[20:21:42 CEST] <lavalike> alright (:
[20:30:14 CEST] <arbi> Hi! I'm converting to '-target pal-dvd' but I get '[mpeg2video @ 0x3571680] rc buffer underflow'
[20:34:59 CEST] <thebombzen> arbi: paste your full command and output to a paste website like pastebin.com
[20:35:06 CEST] <thebombzen> otherwise we're not going to be able to help you much.
[20:38:15 CEST] <arbi> http://pastebin.com/RK1hGBpm
[20:42:08 CEST] <arbi> Any idea what the problem is?
[20:44:19 CEST] <klaxa> well, does the file play fine?
[20:44:37 CEST] <klaxa> it also says max bitrate may be too small
[20:44:59 CEST] <arbi> klaxa: The playback is in slow motion
[20:46:02 CEST] <klaxa> that's odd
[20:46:31 CEST] <klaxa> what happens if you remove -b:v 4000k?
[20:46:37 CEST] <arbi> klaxa: -b:v 1000k seems to work ok
[20:46:48 CEST] <klaxa> you can use -t 10 to encode only 10 seconds of video to check
[20:47:13 CEST] <klaxa> huh
[20:47:15 CEST] <klaxa> ok
[20:49:11 CEST] <klaxa> DVD *should* be able to handle up to 9 mbps for video iirc, so 1 mbps and 4 mbps are definitely within those bounds, weird that it works with a lower bitrate when ffmpeg complains about it being too low
[20:49:41 CEST] <arbi> mmm
[20:50:14 CEST] <arbi> I used same parameters with other files and it works...
[20:50:16 CEST] <kepstin> ffmpeg's mpeg2 encoder just prints that whenever it can't find a way to make the video big enough to fill the requested bandwidth, i think
[20:50:34 CEST] <kepstin> if you don't care about the min bandwidth, you can just ignore it
[20:51:19 CEST] <kepstin> s/bandwidth/bitrate/
[20:51:53 CEST] <arbi> kepstin: what about the slow playback?
[20:52:18 CEST] <kepstin> that's kind of odd, I haven't seen that before :/
[20:52:32 CEST] <klaxa> i wouldn't rule out a slow cpu
[20:53:43 CEST] <arbi> ok thanks
[20:53:45 CEST] <klaxa> the encoding parameters show 25 fps (same as input)
[20:54:00 CEST] <klaxa> check your cpu usage during playback
[20:54:27 CEST] <arbi> cpu is around 25%
[20:54:37 CEST] <klaxa> hmm ok
[20:55:07 CEST] <klaxa> what are you using for playback? does ffplay also slow down?
[20:55:20 CEST] <arbi> I've only tried with ffplay
[20:56:12 CEST] <thebombzen> arbi: try with a dedicated player like mpv
[20:56:31 CEST] <arbi> wait it seems ok with Vlc
[20:57:16 CEST] <arbi> I think it's when I do a fast forward (since the start of the 10 sec video is black) that this slow motion happens
[20:58:02 CEST] <arbi> I'm going to do some tests
[21:19:59 CEST] <Fyr> is there a way to make FFMPEG proceed even if the file is broken?
[21:28:36 CEST] <HarryHallman> hi
[21:28:58 CEST] <HarryHallman> has anyone recorded from the loopback device using a mac?
[22:45:10 CEST] <DelphiWorld> hello
[22:45:15 CEST] <DelphiWorld> i compiled libmfx
[22:45:23 CEST] <DelphiWorld> but can't find it in pkgconfig ?
[22:45:29 CEST] <DelphiWorld> ./configure --prefix=/usr
[22:45:50 CEST] <DelphiWorld> my configure: ./configure --enable-gpl --enable-libass  --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-nonfree --enable-librtmp --enable-libmfx
[22:55:01 CEST] <DelphiWorld> my error is that libmfx is not found using pkg-config!
[23:01:53 CEST] <jkqxz> DelphiWorld:  Can you find the libmfx.pc file somewhere?  (In /usr/lib/pkgconfig?)
[23:03:05 CEST] <jkqxz> Or, if you just want quicksync without all the funny libmfx extra stuff, use vaapi instead...
[23:03:35 CEST] <kbarry> Forgive my lack of clarity here, not sure what the right words are: How feasible is it for ffmpeg to be used as a homogenizing layer, meaning, as a way of taking audio (live streams) and changing the formats/configuration (i.e. take X/Y/Z in and spit out a very specifically formatted stream)? (also, this is audio only)
[23:05:12 CEST] <kbarry> Really looking for a mile-high view, "Yes, it's possible, and not a terrible idea" or "It's possible, but a terrible idea" etc.
[23:19:24 CEST] <Aerroon> kbarry,  i think you'd have to be more specific
[23:19:31 CEST] <Aerroon> what do you want to spit out
[23:19:44 CEST] <Aerroon> but if in terms of codecs and formats it's something ffmpeg supports then it's entirely possible
[23:20:15 CEST] <Aerroon> for instance some applications use ffmpeg to import files (eg audacity)
[23:21:57 CEST] <kbarry> Aerroon: Good point, So, starting points might be fairly close to one another as far as "format" (using that generally)
[23:22:32 CEST] <kbarry> Let's assume everything is audio only, all are either aac or mp3, and also, all are RTMP streams.
[23:23:41 CEST] <kepstin> kbarry: sure, ffmpeg cli tool can easily handle subscribing to an rtmp stream, transcoding audio to a particular format (and drop video if present, etc.), then send to a different rtmp stream.
[23:23:48 CEST] <kbarry> The desired output would be, for simplicity, RTMP stream, aac, stereo, 44100, FLTP, 128kbps.
[23:25:19 CEST] <kbarry> And I can be explicit about all aspects of the output stream?
[23:25:40 CEST] <kbarry> Originally i was facing a problem with different bitrates, but fixed that.
[23:25:50 CEST] <kbarry> The next problem I ran into was things like mono vs stereo,
[23:25:57 CEST] <kepstin> you'd be re-encoding it, so you can select a particular encoder, and whatever settings on that encoder you like
[23:25:59 CEST] <kbarry> or s16/p v fltp
[23:26:38 CEST] <Aerroon> kepstin, can ffmpeg do this in real time?
[23:26:41 CEST] <kepstin> s16 vs. fltp is a temporary working format, it's unrelated to how audio is stored in a lossy compressed stream
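Put together, the kind of command kbarry describes would look roughly like this (URLs are placeholders; fltp is only the encoder's internal working format, so it does not need to be requested explicitly):

  ffmpeg -i rtmp://source.example/app/stream -vn \
    -c:a aac -b:a 128k -ar 44100 -ac 2 \
    -f flv rtmp://destination.example/app/stream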
[23:26:42 CEST] <kbarry> kepstin: One question I was asked, and can't readily answer (lack of experience), what kind of "delay" does running the signal thru ffmpeg introduce
[23:27:02 CEST] <kepstin> Aerroon: given a sufficiently fast computer, ffmpeg can do anything in realtime. that question has no meaning.
[23:27:08 CEST] <kbarry> ie, in the above scenario, you might expect the output to be behind the original stream by X seconds?
[23:27:18 CEST] <kepstin> (in fact, for stuff this simple, it would normally be way faster than realtime)
[23:27:56 CEST] <kepstin> kbarry: depends mostly on network buffering, which is hard to control. Only way to minimize delay completely is do the transcoding inside the rtmp server itself.
[23:28:18 CEST] <kepstin> rtmp is tcp, so it's really dependent on network conditions
[23:28:24 CEST] <kbarry> right,
[23:28:38 CEST] <kbarry> assume the network isn't an issue, it sounds like "no delay"
[23:28:43 CEST] <kepstin> (doing it on localhost loopback is the next best thing)
[23:28:54 CEST] <kepstin> well, codec decoding and encoding has some inherent delay
[23:29:05 CEST] <kepstin> and aac in particular is a fairly high-delay format
[23:29:18 CEST] <kepstin> but that should still be well under a second
[23:29:26 CEST] <kbarry> If I pull RTMP from somewhere, "sanitize" it (what's the word for what I am talking about?), then push it out to my Server A,
[23:29:29 CEST] <DHE> yes, and x264 can buffer a lot of frames (40+ is absolutely doable). unless you explicitly set -tune zerolatency
[23:29:39 CEST] <kbarry> won't be any slower than pushing the original stream to server A?
[23:30:22 CEST] <kepstin> kbarry: the additional network latency, and encoder delay in the transcode step, means it will be slightly slower. Would probably be noticable if playing both at the same time.
[23:30:36 CEST] <kepstin> well, by slower I mean delayed
[23:30:45 CEST] <kepstin> the audio would obviously play back at the same speed
[23:30:57 CEST] <kbarry> kepstin:  behind < 5 seconds?, 1 second? 30 seconds?
[23:31:03 CEST] <kbarry> sounds like < 1second
[23:31:11 CEST] <kepstin> kbarry: no way to say, too many factors. you have to test and see.
[23:31:20 CEST] <DHE> I did a multicast-to-multicast transcode. delay was about 5 seconds. x264 codec
[23:31:38 CEST] <kepstin> DHE: sure, but we're talking about audio-only here
[23:32:33 CEST] <kepstin> and x264 with 'zerolatency' tune and slice-based parallel encoding should be able to do mere milliseconds of delay on a fast box.
[23:33:12 CEST] <kepstin> (obviously at substantial loss in encoding efficiency compared to the normal look-ahead mode)
[00:00:00 CEST] --- Wed May 18 2016

