[Ffmpeg-devel-irc] ffmpeg.log.20170722
burek
burek021 at gmail.com
Sun Jul 23 03:05:01 EEST 2017
[00:40:11 CEST] <Nixola> hi, quick question
[00:41:38 CEST] <Nixola> I'm trying to record both video and audio output using ffmpeg on Linux (using JACK), but capturing video and audio in the same command results in a lot of xruns, even though I'm not compressing either audio or video, whereas capturing audio alone works fine
[00:42:00 CEST] <Nixola> is there some way to prevent this from happening without having to run two separate ffmpeg instances?
[00:47:11 CEST] <DHE> you'd have to crank some buffers and make sure your CPU is up to the task
[00:51:11 CEST] <Nixola> my CPU is up to the task, and the buffer already is 512 which is quite a lot
[00:51:25 CEST] <Nixola> (6700K, not much else running except for Chrome)
[00:54:41 CEST] <Nixola> I'm encoding to a mkv file with libx264 (-preset ultrafast -qp 0) and pcm_s16le as codecs
[00:54:51 CEST] <Nixola> (if that's of any help)
[00:57:41 CEST] <Nixola> hi debian user (don't want to ping you)
[02:29:32 CEST] <agentsim> do I need to call av_register_all / avcodec_register_all once per thread, or just once?
[02:36:23 CEST] <kepstin> agentsim: I don't think it's a thread-safe function. Call it once in the main thread of your app before using other ffmpeg stuff.
[02:36:36 CEST] <agentsim> kepstin: thx
[02:49:49 CEST] <agentsim> I'm getting a segmentation fault in avcodec_open2 using ffmpeg 3.3.2. I've recently updated to 3.3.2 from 3.2.something, and didn't have the crash before.
[02:49:58 CEST] <agentsim> My code looks like this: https://pastebin.com/jP0Mv9ZC
[02:50:21 CEST] <agentsim> the codec is MJPEG and the size is 1920x1920, I'm resizing an image (in this case from 2048x2048)
[02:50:27 CEST] <agentsim> any ideas what I'm doing wrong?
[03:02:49 CEST] <kepstin> agentsim: well, that's obviously not all the code. Did you verify that 'codec' is not null?
[03:04:57 CEST] <kepstin> agentsim: also note the warning on 'avcodec_open2', "This function is not thread safe!"
[03:05:31 CEST] <agentsim> kepstin: yep, codec is not null and is MJPEG
[03:06:01 CEST] <agentsim> as for threads, I am using multiple threads, but the codec is created and used and ultimately freed all in the same thread, so that should be ok
[03:06:30 CEST] <agentsim> I'm rebuilding my library now with debug symbols so I can get some more info about the ffmpeg internals when it fails
[03:07:02 CEST] <kepstin> i think opening a codec can result in initializing some shared global state, which is why it's not threadsafe
[03:08:42 CEST] <agentsim> looks like the crash is in ff_mpv_common_init, for whatever reason I'm missing line numbers :(
[03:09:12 CEST] <kepstin> so it's not that the codec has to be created and freed in the same thread, it's that you have to make sure avcodec_open2 isn't run concurrently in multiple threads.
[03:18:31 CEST] <agentsim> that's not my problem, I've reproduced the crash on one thread
[03:18:57 CEST] <kepstin> that said, the internal mpeg decoder doesn't appear to have much if any global state, it's all per-context
[03:19:15 CEST] <kepstin> (or global tables initialized on open, etc.)
[03:22:18 CEST] <kepstin> the context alloc obviously didn't fail or you would have gotten a segfault much earlier
[03:38:52 CEST] <agentsim> kepstin: was caused by --enable-lto on the mac
[03:39:52 CEST] <kepstin> hmm, that's not cool. macs use clang nowadays, right?
[03:40:38 CEST] <kepstin> with gcc on linux, i seem to recall there were issues with ffmpeg and lto due to the assembly code
[03:40:40 CEST] <agentsim> yep, their own version though
[03:40:44 CEST] <kepstin> or maybe i'm getting confused
[03:41:16 CEST] <agentsim> since they've started on Swift I've noticed more ICE issues and general compiler weirdness from them
[03:41:32 CEST] <agentsim> so hopefully this LTO issue is isolated to mac :)
[03:42:52 CEST] <kepstin> i'd expect minimal speed gains from lto with ffmpeg anyways, since so much of the hot code is either assembly or already in the same file together.
[03:43:10 CEST] <kepstin> but I haven't seen any testing, so :/
[07:37:43 CEST] <awesomess3> I want to take all the songs in my Music folder (147 songs) and combine them into one track where they all play simultaneously (I can't spell) into one .ogg file. Possible with ffmpeg? 147 input songs all played in sync into one .ogg file?
[07:39:17 CEST] <awesomess3> I want to do this with a less than 200 character one-lined bash command.
[07:42:41 CEST] <Blubberbub> with 'in sync' you mean 'in parallel', so 'next to each other', so all hearable at the same time?
[07:43:32 CEST] <awesomess3> in parallel
[07:43:44 CEST] <awesomess3> all hearable at the same time
[07:46:33 CEST] <Blubberbub> amerge or amix filter might be able to do it...
[07:57:23 CEST] <Blubberbub> not sure if you can really improve the experience by listening to all your favorite songs at the same time, though ;)
[08:00:58 CEST] <awesomess3> the point is to hear what it sounds like and if I want it I am guaranteed to improve my experience at least until I actually get what I want.
[08:12:24 CEST] <c3r1c3-Win> Mixing that many finished/mastered tracks at one time would result in horribly blown out audio.
[08:13:37 CEST] <Blubberbub> but that is the desired result, isn't it?
[08:15:28 CEST] <awesomess3> yeah i need new speakers
[08:16:02 CEST] <awesomess3> but seriously, ok maybe I'll just start out with 3 songs together then
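Blubberbub's amix suggestion above can be sketched as a small shell script. This is a non-authoritative sketch: song1..song3.ogg are hypothetical filenames, the inputs= count must match the number of -i flags, and amix scales each input down, which partly addresses the blown-out-audio concern raised above.

```shell
# Sketch of the amix approach suggested above; song1..song3.ogg are
# placeholder names, and inputs= must match the number of -i flags.
build_amix() {
    # amix mixes N inputs into one stream; duration=longest keeps the
    # output running until the longest input ends.
    echo "amix=inputs=$1:duration=longest"
}
filter=$(build_amix 3)
# Only run ffmpeg if it exists and the placeholder inputs are present.
if command -v ffmpeg >/dev/null 2>&1 \
   && [ -f song1.ogg ] && [ -f song2.ogg ] && [ -f song3.ogg ]; then
    ffmpeg -y -v error -i song1.ogg -i song2.ogg -i song3.ogg \
           -filter_complex "$filter" -c:a libvorbis mixed.ogg || true
fi
echo "$filter"
```

Scaling up to all 147 songs would mean 147 `-i` flags and `build_amix 147`, which is why a generated command line (or starting with 3 songs, as suggested) is more practical.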
[10:26:04 CEST] <BtbN> Does ffmpeg nice itself to level 10? Or is something else doing that?
[10:52:14 CEST] <Nixola> hi, quick question
[10:52:17 CEST] <Nixola> I'm trying to record both video and audio output using ffmpeg on Linux (using JACK), but capturing video and audio in the same command results in a lot of xruns, even though I'm not compressing either audio or video, whereas capturing audio alone works fine
[10:52:21 CEST] <Nixola> is there some way to prevent this from happening without having to run two separate ffmpeg instances?
[10:52:26 CEST] <Nixola> my CPU is up to the task, and the buffer already is 512 which is quite a lot
[10:52:30 CEST] <Nixola> (6700K, not much else running except for Chrome)
[10:52:33 CEST] <Nixola> I'm encoding to a mkv file with libx264 (-preset ultrafast -qp 0) and pcm_s16le as codecs
[10:52:36 CEST] <Nixola> (if that's of any help)
[10:52:55 CEST] <Nixola> (I also tried changing the video codec to a raw format despite being incompatible with mkv, no dice)
[10:57:40 CEST] <JEEB> are you sure the video capture side up to the video encoder is not the bottleneck? same for the actual io.
[10:58:34 CEST] <JEEB> you can check with just video and putting -f null - or something as output for example
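JEEB's `-f null -` benchmarking idea might look like the sketch below. lavfi's testsrc stands in for the real JACK/screen capture input (an assumption, since the actual capture command never appears in the log); the null muxer discards the muxed output, so only the capture/encode path is timed.

```shell
# Benchmark sketch: encode synthetic video and throw the result away,
# so any slowdown must be in the capture/filter/encode path.
bench_cmd() {
    # Print the benchmark command line for a given input spec;
    # testsrc below is a stand-in for the real capture device.
    printf 'ffmpeg -v error -f lavfi -i %s -c:v libx264 -preset ultrafast -qp 0 -f null -\n' "$1"
}
cmd=$(bench_cmd "testsrc=duration=1:size=1280x720:rate=30")
# Run it only if ffmpeg is actually installed.
if command -v ffmpeg >/dev/null 2>&1; then
    $cmd || true
fi
echo "$cmd"
```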
[11:08:26 CEST] <Nixola> I tried recording audio only and it apparently doesn't have any crossruns in that case; still, the video is uncompressed and even if I write it to a SSD or to /tmp/ I get the same issue
[11:08:47 CEST] <Nixola> increasing JACK's buffer size to 1024 helps, but then latency's increased for the whole system
[11:11:09 CEST] <JEEB> well yes, I see you've tested audio only
[11:11:17 CEST] <JEEB> also video is not that simple as you might think
[11:11:21 CEST] <JEEB> your screen is RGB
[11:11:37 CEST] <JEEB> video generally is YCbCr (colloquially called 'yuv')
[11:11:56 CEST] <JEEB> so you most likely have a conversion (most likely in software) somewhere in the middle
[11:12:03 CEST] <JEEB> the amount of data is also completely different
[11:14:27 CEST] <JEEB> so yes, I would check if just video is being a bottleneck or not. of course if the case is that you're running out of buffer because the video work takes too long to accomplish (in which case the audio buffer increase helping would explain it), then I don't really see any other ways around it (you could in theory add another buffer in the audio component of lavd or something, but that would just move the buffer into another place)
[11:15:41 CEST] <JEEB> the jack module might be written not optimally or whatever video capture module you're using might have the same thing :P I just cannot know
[13:12:30 CEST] <yegortimoshenko> what is the shortest ffmpeg command to merge multiple flac files by glob?
[13:13:44 CEST] <yegortimoshenko> unfortunately neither `ffmpeg -f concat -i '*.flac' -c copy out.flac` nor `ffmpeg -i 'concat:*.flac' -c copy out.flac` work because it doesn't recognize the glob (*.flac: No such file or directory)
[13:15:18 CEST] <yegortimoshenko> %*.flac also doesn't work for some reason...
[13:15:29 CEST] <furq> https://trac.ffmpeg.org/wiki/Concatenate#demuxer
[13:16:17 CEST] <yegortimoshenko> furq: demuxer is definitely not the shortest or simplest way to do that
[13:16:26 CEST] <yegortimoshenko> s/simplest/easiest
[13:16:31 CEST] <furq> it is
[13:17:02 CEST] <yegortimoshenko> what about http://ffmpeg.org/ffmpeg-protocols.html#concat
[13:17:16 CEST] <furq> that only works for concatenatable formats
[13:17:27 CEST] <furq> which doesn't include flac
[13:17:50 CEST] <furq> the protocol is pretty much the same as doing cat 1.flac 2.flac > 3.flac
[13:18:27 CEST] <yegortimoshenko> i see. it's unfortunate, demuxer produces broken flac files for me for some reason
[13:19:21 CEST] <furq> did you try reencoding
[13:21:13 CEST] <yegortimoshenko> yes, sure. but i'd like to use fewer tools if possible. i could use sox or shntool and just reencode them.
[13:21:32 CEST] <furq> i mean using -c:a flac instead of -c:a copy
[13:21:40 CEST] <yegortimoshenko> oh, i didn't. thanks!
[13:22:05 CEST] <furq> it'll obviously take a lot longer but stream copying with concat is a bit flaky
[13:22:21 CEST] <furq> the concat filter is generally the most reliable in my experience but the syntax is even more annoying
[13:24:45 CEST] <JEEB> yes, the filter is most reliable because there's no format parsing specialities required. the content's already decoded at the point of concatenation
[13:40:47 CEST] <yegortimoshenko> furq: now it works as intended after i've switched to `-c flac`. thanks a lot! :-)
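The demuxer route that ended up working above might look like this sketch. The list-file format follows the concat demuxer's documented syntax; the re-encode (`-c:a flac`) follows furq's advice, and the filenames are simply whatever the glob expands to (filenames containing single quotes would need extra escaping).

```shell
# Build the list file the concat demuxer expects, then merge with a
# lossless re-encode (stream copy proved unreliable in this case).
make_concat_list() {
    # One "file '<name>'" line per argument (no escaping of quotes).
    for f in "$@"; do
        printf "file '%s'\n" "$f"
    done
}
set -- ./*.flac
# Skip everything if the glob matched nothing.
if [ -f "$1" ]; then
    make_concat_list "$@" > list.txt
    if command -v ffmpeg >/dev/null 2>&1; then
        ffmpeg -y -v error -f concat -safe 0 -i list.txt -c:a flac out.flac || true
    fi
fi
```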
[14:13:32 CEST] <faLUCE> Hello. How can I set VARIABLE frame rate for an AVCodecContext instance? I tried with ->time_base = AVRational{1, 0} but it doesn't work
[14:28:42 CEST] <meriipu> I created a combined sink for pulseaudio. Until something outputs to it, it does not seem to be possible to use it as an input. If something outputs to it and ffmpeg is using it, if the process outputting stops (and ffmpeg is using thread_queue_size ?) memory usage will skyrocket. Is this a pulse issue, ffmpeg issue, or wrong use of sinks? https://bpaste.net/show/d74c48e0be26 e.g should I have had some
[14:28:48 CEST] <meriipu> sort of option to keep the sinks output "always active" or some empty signal to achieve the same?
[14:30:39 CEST] <meriipu> I am not sure if it happens if I remove the video part; maybe it does, but audio is so small that it would take forever to build up.
[15:10:17 CEST] <Tzimmo> Due to ticket #2622 not being fixed and due to my TV not reliably displaying hdmv_pgs_subtitles, I'd like to extract the subtitles, OCR them, spellcheck/fix them and watch the videos using them as external .sub/.srt/.ass files, which works reliably.
[15:12:09 CEST] <Tzimmo> Any suggestions on how to achieve this goal, i.e., which steps can ffmpeg or other reliable tools (on Linux) do, and which steps are better for me to code myself?
[15:14:11 CEST] <Tzimmo> I've already made a tool to convert DVB subtitles in .son format (extracted by projectx) to text myself, so I'm familiar with converting bitmaps to text without using any already-made OCR software (which tends to be unreliable because it is so generic that it accepts text in any orientation)
[15:14:46 CEST] <Tzimmo> but just coded one on my own which works on subtitles that are always in perfect orientation.
[15:16:09 CEST] <Tzimmo> For example, what kind of output format can ffmpeg provide me from which I could easily read the bitmap and do the conversion myself (or use gocr if it works well; often it doesn't seem to, especially with non-ascii characters)
[15:22:57 CEST] <JEEB> Tzimmo: lavc decoder outputs rgb for the subpictures
[15:23:30 CEST] <JEEB> see libavcodec/pgssubdec (i think?) dot c
[15:23:59 CEST] <JEEB> the struct at the end shows the supported pix_fmts
[15:47:51 CEST] <faLUCE> Hello. How can I set VARIABLE frame rate for an AVCodecContext instance? I tried with ->time_base = AVRational{1, 0} but it doesn't work. The documentation says: "For fixed-fps content, timebase should be 1/framerate and timestamp increments should be identically 1." . But what about variable frame rate?
[15:49:41 CEST] <faLUCE> Should I set the vsync option to 2? (vfr) but how...?
[15:51:08 CEST] <JEEB> faLUCE: just set a large enough time base and set pts accordingly
[15:51:12 CEST] <JEEB> that's vfr
[15:51:49 CEST] <iive> faLUCE: mpeg uses 90kHz or 27MHz timer
[16:01:13 CEST] <faLUCE> [15:51] <JEEB> faLUCE: just set a large enough time base and set pts accordingly <--- I already did that. I tried with mVideoEncoderCodecContext->time_base = (AVRational){1, 1000000000}; But the player (gstreamer) parses the framerate, from the NAL unit, as 1/1000000000 and it doesn't understand that it is variable. Instead, when I encode with ffmpeg CLI, the player parses the framerate as {0/1} and it understands it's variable
[16:01:48 CEST] <faLUCE> so, I suspect there's some flag which ffmpeg sets, for variable framerate
[16:06:25 CEST] <faLUCE> this post says that there's a flag (fixed_frame_rate_flag) for that: http://forum.doom9.org/archive/index.php/t-126584.html
[16:06:41 CEST] <faLUCE> but I don't understand if libav can manage it
[16:08:01 CEST] <faLUCE> (the flag is in SPS)
[16:09:31 CEST] <faLUCE> More precisely: http://gstreamer-devel.966125.n4.nabble.com/How-is-a-video-frame-rate-calculated-in-a-transport-stream-td4658547.html
[16:26:10 CEST] <faLUCE> well, people from x264 staff confirmed that:
[16:26:13 CEST] <faLUCE> [16:19] <BugMaster> there is fixed_frame_rate_flag in VUI
[16:26:14 CEST] <faLUCE> [16:23] <faLUCE> BugMaster: what is VUI ?
[16:26:15 CEST] <faLUCE> [16:24] <BugMaster> "Video usability information" part of SPS
[16:26:17 CEST] <faLUCE> [16:24] <faLUCE> thanks BugMaster. How can I set it through libx264 ?
[16:26:18 CEST] <faLUCE> [16:25] <faLUCE> (more precisely: I use libav which wraps x264, but I can search for the equivalent option)
[16:26:20 CEST] <faLUCE> [16:25] <BugMaster> b_vfr_input = 0
[16:28:41 CEST] <faLUCE> they just told me that: [16:26] <BugMaster> for ffmpeg you probably would need to set --force-cfr
[16:37:11 CEST] <faLUCE> found the solution: av_opt_set(mVideoEncoderCodecContext->priv_data, "x264opts", "no-force-cfr",0);
[16:53:05 CEST] <kepstin> that's strange, because the ffmpeg cli wouldn't be doing that
[16:54:48 CEST] <faLUCE> kepstin: I just saw that cfr is set by zerolatency (which I set before)
[16:54:49 CEST] <kepstin> try setting the framerate field on the codec context to {1,0} to indicate vfr content
[17:12:58 CEST] <kepstin> i think if uninitialized as {0,0}, it'll try to guess the value or copy it from somewhere else, like the inverted time base
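For reference, the CLI-side equivalent of faLUCE's fix might look like the sketch below. The `no-force-cfr` spelling is taken directly from the log (and from BugMaster's `b_vfr_input` hint); I haven't verified it against every libx264 wrapper version, so treat the option name, and the in.mkv/out.ts filenames, as assumptions.

```shell
# Sketch: zerolatency reportedly turns force-cfr on, so it is switched
# back off via -x264opts (option spelling as reported in the log above).
x264opts='no-force-cfr'
if command -v ffmpeg >/dev/null 2>&1 && [ -f in.mkv ]; then
    ffmpeg -y -v error -i in.mkv -c:v libx264 -tune zerolatency \
           -x264opts "$x264opts" -vsync vfr out.ts || true
fi
echo "$x264opts"
```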
[17:45:50 CEST] <leif> Is there any way to redirect the ffmpeg log when using the libav* libraries?
[17:47:20 CEST] <kepstin> leif: yeah, av_log_set_callback
[17:47:29 CEST] <leif> kepstin: Cool, thanks
[18:34:23 CEST] <Azrael_-> kepstin: thx
[18:35:35 CEST] <rsegecin> to set the environment to compile ffmpeg for windows in Visual Studio I need to run "msys_shell.bat" in VS command prompt but I couldn't find any "msys_shell.bat" file
[18:36:15 CEST] <rsegecin> in C:\msys64 there's only autorebase.bat
[18:36:50 CEST] <rsegecin> does anyone know where it is?
[18:37:52 CEST] <Blubberbub> maybe its not a bat, but a *.cmd file?
[18:38:16 CEST] <rsegecin> ohh I see
[18:38:31 CEST] <Blubberbub> not sure if thats the correct file, though...
[18:38:42 CEST] <rsegecin> hmmm
[18:39:03 CEST] <rsegecin> sure there is msys2_shell.cmd file there
[18:39:21 CEST] <Blubberbub> just test it - i guess :D
[18:39:33 CEST] <Blubberbub> i think i actually use just one of the normal .exe files - to be honest
[18:40:26 CEST] <Blubberbub> but i don't know if that has any weird side effects
[18:40:26 CEST] <rsegecin> ok I'll try it, this will be my first attempt at compiling ffmpeg
[18:41:57 CEST] <rsegecin> np
[18:43:56 CEST] <rsegecin> not sure if it did anything, it just opened msys2 command prompt
[18:45:37 CEST] <Blubberbub> that's what it's supposed to do
[18:46:02 CEST] <rsegecin> also there's no yasm anywhere just nasm
[18:46:16 CEST] <rsegecin> may be it was a typo?
[18:46:48 CEST] <Blubberbub> i think i used pacman to install the packages in msys2
[18:49:17 CEST] <rsegecin> hmmm
[18:50:45 CEST] <rsegecin> do I need to have a good cpu to compile ffmpeg?
[18:51:12 CEST] <Blubberbub> its either that or time - i think
[18:51:22 CEST] <faLUCE> is there a way to encode h264 in BYTESTREAM format?
[18:51:55 CEST] <rsegecin> lol I guess you're right
[18:52:17 CEST] <Blubberbub> rsegecin, you only know it, when you try. You can make it parallel with `make -j 4` if you have 4 cores available
[18:53:55 CEST] <Blubberbub> it will not rebuild everything after the first time - if you are lucky and don't need to reconfigure
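Blubberbub's `make -j` advice, as a sketch. The core-count detection assumes `nproc` (coreutils, present in msys2) and falls back to a fixed guess of 4; the commented configure line is the generic flow, not MSVC-specific.

```shell
# Pick a parallel job count: one job per core if nproc is available,
# otherwise fall back to a fixed guess of 4.
jobs=$( (command -v nproc >/dev/null 2>&1 && nproc) || echo 4 )
echo "building with make -j$jobs"
# Typical flow inside the ffmpeg source tree (not run here):
#   ./configure && make -j"$jobs"
```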
[19:00:18 CEST] <rsegecin> Thank you Blubberbub but I think I'll put the ffmpeg compilation aside for now
[19:00:28 CEST] <Blubberbub> its not that hard.
[19:02:03 CEST] <rsegecin> the instructions to compile with Microsoft Visual C++ are not very clear, and all I want to know is how ffmpeg does the parsing from wavex to pcm of a wave file
[19:02:27 CEST] <Blubberbub> you can always take a look at the sourcecode on github
[19:02:35 CEST] <rsegecin> I want to be able to debug as well
[19:02:35 CEST] <Blubberbub> or any other mirror
[19:07:53 CEST] <kepstin> rsegecin: yeah, most of the ffmpeg windows builds around are actually cross-compiled from a linux box with mingw64 :/
[19:08:17 CEST] <yegortimoshenko> how do i fix that flac? ffplay warns me:
[19:08:19 CEST] <yegortimoshenko> [swscaler @ 0x7f84888a3400] deprecated pixel format used, make sure you did set range correctly
[19:08:28 CEST] <Blubberbub> i just spent half an hour trying to figure out, why my custom indev was not available even though all the configs appeared to be correct.... i was executing my system-ffmpeg and not my custom ffmpeg -.-
[19:08:45 CEST] <kepstin> yegortimoshenko: if you're using the ffmpeg cli tool, just ignore that warning - it's a message to the programmers.
[19:09:25 CEST] <yegortimoshenko> kepstin: i think it's a particular flac file problem, or at least some albums don't trigger it
[19:09:58 CEST] <kepstin> yegortimoshenko: that message would be triggered by the flac having an embedded jpeg file for cover art, yeah. There's nothing for you to fix, just ignore the message.
[19:10:58 CEST] <yegortimoshenko> kepstin: that's it! i plan to embed jpg cover art in quite a few albums, though. should i continue or is that considered to be deprecated?
[19:11:55 CEST] <yegortimoshenko> i use metaflac to embed cover art
[19:11:59 CEST] <kepstin> no, it'll work fine. the thing that's deprecated is a particular internal to ffmpeg representation of the color type that the jpeg file is using - it's just a reminder to programmers saying "please switch to using this other thing instead"
[19:12:30 CEST] <yegortimoshenko> got you. thanks!
[19:14:17 CEST] <kepstin> note that if you're copying cover art with ffmpeg, you probably want to use the "-c:v copy" option so it doesn't re-encode the images
[19:14:27 CEST] <kepstin> since that'll be a lossy conversion, losing quality with each generation
[19:14:38 CEST] <kepstin> with ffplay, not an issue
[19:15:23 CEST] <faLUCE> is there a way to encode h264 in BYTESTREAM format?
[19:15:49 CEST] <yegortimoshenko> i use metaflac to embed cover art because ffmpeg doesn't seem to support that for flac files. if i'm wrong, though, please do tell me: i'd like to manage all my audio needs using ffmpeg without any external tools :-)
[19:16:54 CEST] <kepstin> faLUCE: like, raw h264 video as a series of nal units? That's usually done with the "h264" muxer
[19:17:30 CEST] <faLUCE> kepstin: http://yumichan.net/video-processing/video-compression/introduction-to-h264-nal-unit/
[19:17:56 CEST] <faLUCE> "a three-byte or four-byte start code, 0x000001 or 0x00000001, is added at the beginning of each NAL unit. "
[19:18:05 CEST] <faLUCE> kepstin: is this the "h264" muxer?
[19:18:20 CEST] <kepstin> the h264 muxer should do that, yeah
[19:18:29 CEST] <faLUCE> tnx
[19:19:20 CEST] <faLUCE> kepstin: -f h264 ?
[19:19:48 CEST] <faLUCE> I need to put them into a mpegts container
[19:19:52 CEST] <kepstin> faLUCE: as an output option when using ffmpeg cli tool? yes. It's also auto-selected by a file with the ".h264" extension
[19:20:07 CEST] <kepstin> .. if you want mpegts, just write to mpegts? ffmpeg can do that directly...
[19:20:19 CEST] <faLUCE> kepstin: I need to use both mpegts+bytestream
[19:20:27 CEST] <faLUCE> together
[19:20:52 CEST] <kepstin> like, the same encoded video both as a raw bytestream and muxed into mpegts?
[19:20:59 CEST] <faLUCE> exactly
[19:21:31 CEST] <kepstin> ffmpeg can do that at the same time, but it's a bit tricky to set up. Look up the "tee" muxer
[19:22:20 CEST] <kepstin> that can send the same encoded video to multiple muxers, so you can pass it to the h264 muxer and mpegts muxer at the same time.
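A sketch of the tee setup kepstin describes: one encode feeding both the h264 and mpegts muxers. input.mkv and the output names are placeholders, and `-map` is spelled out because the tee muxer does no automatic stream selection.

```shell
# One encode, two muxers: raw Annex-B H.264 plus MPEG-TS.
build_tee_spec() {
    # tee output spec: per-output options in [brackets], outputs
    # separated by '|'.
    echo "[f=h264]$1|[f=mpegts]$2"
}
spec=$(build_tee_spec out.h264 out.ts)
if command -v ffmpeg >/dev/null 2>&1 && [ -f input.mkv ]; then
    ffmpeg -y -v error -i input.mkv -map 0:v -c:v libx264 -f tee "$spec" || true
fi
echo "$spec"
```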
[19:23:36 CEST] <faLUCE> kepstin: what if I do ffmpeg -i input output.h264; ffmpeg -i output.h264 -f mpegts output2.ts ?
[19:24:01 CEST] <kepstin> faLUCE: that would work too, but you have to wait for the whole encode to be done before writing the mpegts file :)
[19:24:15 CEST] <faLUCE> NP for that ;-)
[19:24:19 CEST] <kepstin> and it would also lose timestamps of course
[19:24:24 CEST] <dystopia_> why does video bitrate + audio bitrate not add up to the overall bitrate?
[19:25:00 CEST] <kepstin> dystopia_: containers have some overhead for things like seek indexes, and being able to actually separate the audio from the video
[19:25:14 CEST] <kepstin> usually in single-digit percents, but varies by format.
[19:25:16 CEST] <relaxed> faLUCE: don't forget -c copy
[19:25:47 CEST] <dystopia_> 36kb diff
[19:25:59 CEST] <dystopia_> thanks for the answer kepstin
[19:26:06 CEST] <dystopia_> i thought i was just failing at math
[19:28:03 CEST] <faLUCE> yes
[19:43:43 CEST] <cryptodechange> the research window in the nlmeans filter, does 'window' mean frames?
[20:05:10 CEST] <Blubberbub> is there any special way i need to signal EOF when writing a demuxer? because it appears that ffplay keeps executing my read_packet after it returned EOF?
[21:49:47 CEST] <Dovid> Hi. Can anyone help me with this error? https://pastebin.ca/3845626
[21:51:41 CEST] <ChocolateArmpits> Dovid, did you try using lowercase "8k" ?
[21:55:23 CEST] <Dovid> ChocolateArmpits: Yes I tried that. I think the issue is that the stream is not 8k but I thought -ar is the output
[21:56:01 CEST] <ChocolateArmpits> it's used as an output option
[21:56:07 CEST] <ChocolateArmpits> in your command
[21:56:30 CEST] <Dovid> correct which is why I am trying to figure out why I am getting the error
[21:58:04 CEST] <ChocolateArmpits> Dovid, looks like an old version, what about using plain number ? '8000'
[21:58:39 CEST] <furq> yeah you probably want to use an ffmpeg that isn't five years old
[21:58:41 CEST] <Dovid> ChocolateArmpits: A simple 8000 seems to work from the cli. let me try from my app
[00:00:00 CEST] --- Sun Jul 23 2017