[Ffmpeg-devel-irc] ffmpeg.log.20180305

burek burek021 at gmail.com
Tue Mar 6 03:05:01 EET 2018


[00:17:58 CET] <FartDaemon> Trying to re-encode a file, leaving the video as is, encoding all audio tracks as mp3, and keeping all subs as is.
[00:18:50 CET] <FartDaemon> I used -c:v copy -c:a mp3 -c:s copy
[00:19:12 CET] <FartDaemon> Only the first audio and subtitle tracks are present in the output file though :(
[00:19:31 CET] <BtbN> that's the default. Use -map if you want more.
[00:21:41 CET] <FartDaemon> I should add -map 0:a -map 0:s to my command?
[00:22:42 CET] <BtbN> Or just -map 0 if you want straight up everything
[00:23:42 CET] <FartDaemon> I tried that before and got some subtitle errors.
[00:24:09 CET] <FartDaemon> Only SUBTITLE_ASS type supported. Subtitle encoding failed
[00:27:25 CET] <kepstin> you shouldn't get that if you're using -c:s copy
[00:31:13 CET] <dustfinger> debianuser: Thanks for the basics on audio.
[00:33:35 CET] <dustfinger> debianuser: It looks like for this particular video I am being asked to use mp4 + H.265 encoding.
[00:36:38 CET] <FartDaemon> kepstin That's what I was missing, thank you! :D
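The full command the thread converges on might look like this (filenames are hypothetical; `-c:a mp3` selects ffmpeg's libmp3lame encoder, written out explicitly here):

```shell
# Keep every stream with -map 0, copy video and subtitles untouched,
# and re-encode only the audio tracks to MP3.
ffmpeg -i input.mkv -map 0 -c:v copy -c:a libmp3lame -c:s copy output.mkv
```

Without `-map 0`, ffmpeg's default stream selection keeps only one stream per type, which is why only the first audio and subtitle tracks appeared.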
[01:14:01 CET] <user342> How can I remove sections from a video without any re-encoding or writing of intermediate files? Like this: http://markheath.net/post/cut-and-concatenate-with-ffmpeg but without the intermediate files. I'm on windows.
[01:20:48 CET] <DHE> without transcoding is tricky. at very best the edits will be rough because you must start on a keyframe
[01:21:33 CET] <user342> Well cutting a single section to a file doesn't have problems with the keyframes, right?
[01:24:50 CET] <DHE> the start of the cut must be on a keyframe or at best decoding won't start properly until it does reach a keyframe
[01:25:56 CET] <user342> Because I tried some single-cuts to test that and didn't notice any problems with the keyframes. Is there a way to generate a single keyframe at the beginning instead of having to reencode everything?
[01:26:57 CET] <DHE> what I mean is if you want a cut to start at 122.341 seconds from the start of the video, it might actually end up being 123.500 seconds from the start. and any cut request between those two numbers always starts at 123.500
[01:27:12 CET] <user342> Oh I see
[01:27:44 CET] <user342> In my case it's not important that it's very exact
[01:27:51 CET] <DHE> then you have a problem
[01:28:13 CET] <DHE> (in this example, 123.500 is a keyframe... maybe 121.250 is also a keyframe)
[01:28:42 CET] <user342> It'll be fine if it's a second off or so
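The lossless cut DHE describes might be sketched as follows (timestamps and filenames are placeholders; with `-c copy` the actual start snaps to a keyframe, so expect it to be off by up to a second or so):

```shell
# Cut roughly 60 seconds starting near 122s, without re-encoding.
# The start point moves to the nearest keyframe.
ffmpeg -ss 122 -i input.mp4 -t 60 -c copy cut.mp4
```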
[02:16:43 CET] <kenav> How do I draw text on a video with certain words highlighted (different color)? I see that fontcolor takes an expression, but I don't see how I can show one color if the text is the text I want to highlight and another if it's not
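One workable approach to kenav's question, rather than a single fontcolor expression, is to chain two drawtext filters, one per color, and position the highlighted word after the leading text. The text, offsets, and sizes below are rough examples, and some builds need an explicit fontfile:

```shell
# First drawtext draws the plain text in white, the second draws the
# highlighted word in yellow at an adjacent x offset.
ffmpeg -i in.mp4 -vf "drawtext=text='Hello ':x=100:y=100:fontsize=36:fontcolor=white,drawtext=text='world':x=220:y=100:fontsize=36:fontcolor=yellow" out.mp4
```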
[03:16:44 CET] <FartDaemon> So i've been using "-map 0 -c:v hevc_nvenc -c:a mp3 -c:s copy" successfully for 2 videos so far but I just tried another video and got an error:
[03:16:46 CET] <FartDaemon> [hevc_nvenc @ 0000028ff0fcb640] OpenEncodeSessionEx failed: out of memory (10)
[03:16:46 CET] <FartDaemon> [hevc_nvenc @ 0000028ff0fcb640] No NVENC capable devices found
[03:16:46 CET] <FartDaemon> Error initializing output stream 0:1 -- Error while opening encoder for output stream #0:1 - maybe incorrect parameters such as bit_rate, rate, width or height
[06:47:13 CET] <Ivoah> I'm trying to set the language of an audio track in a video file. I did some internet digging and found that the proper command to use would be this: ffmpeg -i in.mp4 -metadata:s:a:0 language=jpn out.mp4
[06:47:24 CET] <Ivoah> This works, but it seems to reencode the entire video
[06:47:42 CET] <Ivoah> Is there any way to just set the language of the audio stream without reencoding the whole video?
[06:48:14 CET] <user342> I know that feel bro
[06:48:48 CET] <furq> Ivoah: -c copy
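Putting furq's answer together with Ivoah's original command (filenames as in the question):

```shell
# -c copy copies every stream bit-for-bit; only the metadata tag on
# the first audio stream (s:a:0) is rewritten, so no re-encoding happens.
ffmpeg -i in.mp4 -c copy -metadata:s:a:0 language=jpn out.mp4
```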
[12:24:29 CET] <leriano7> hello!
[12:25:05 CET] <leriano7> can I write here my shell command to check ?
[12:32:09 CET] <braw> Hello. I have a problem with concat: i've got mp4 files (h264+aac) and i'm trying to stream all files from a directory with -i playlist
[12:32:28 CET] <braw> Can't find anything useful on the web. Everything works, except it runs at this speed: bitrate=1112.7kbits/s speed=687x
[12:32:53 CET] <braw> ffmpeg -safe 0 -f concat -i '/root/playlist.txt' -c:v copy -c:a copy  -f flv rtmp
[12:35:13 CET] <leriano7> in this channel does anyone helps?
[12:35:17 CET] <furq> braw: add -re before -i
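With furq's fix applied, braw's command would look like this (the rtmp URL is a placeholder; `-re` makes ffmpeg read input at its native frame rate, so the stream is pushed in real time instead of at 687x):

```shell
# playlist.txt contains lines of the form: file '/path/to/clip.mp4'
ffmpeg -re -f concat -safe 0 -i /root/playlist.txt -c:v copy -c:a copy -f flv rtmp://example.com/live/streamkey
```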
[12:36:36 CET] <leriano7>  ffmpeg -y -thread_queue_size 512  -f alsa -i pulse -f x11grab -framerate $FRAMERATE -video_size $RESOLUTION -i :${DISPLAY_NUM} -c:a aac -c:v libx264 -preset ultrafast -crf 28 -refs 4 -qmin 4 -pix_fmt yuv420p -filter:v fps=$FRAMERATE "./recordings/${VIDEO_NAME}.mp4
[12:36:46 CET] <leriano7> I get next errors
[12:37:39 CET] <leriano7> Xlib:  extension "RANDR" missing on display ":99".
[12:37:39 CET] <leriano7> [1240:1283:0305/113017.208731:ERROR:bus.cc(394)] Failed to connect to the bus: Could not parse server address: Unknown address type (examples of valid types are "tcp" and on UNIX "unix")
[12:37:39 CET] <leriano7> (google-chrome:1240): LIBDBUSMENU-GLIB-WARNING **: Unable to get session bus: Unknown or unsupported transport 'disabled' for address 'disabled:'
[12:37:39 CET] <leriano7> ATTENTION: default value of option force_s3tc_enable overridden by environment.
[12:37:42 CET] <leriano7> [1287:1287:0305/113017.342195:ERROR:sandbox_linux.cc(375)] InitializeSandbox() called with multiple threads in process gpu-process.
[12:37:45 CET] <leriano7> [1240:1240:0305/113017.351096:ERROR:gpu_process_transport_factory.cc(1009)] Lost UI shared context.
[12:37:47 CET] <leriano7> [1240:1240:0305/113036.981719:ERROR:chrome_browser_main_extra_parts_x11.cc(62)] X IO error received (X server probably went away)
[12:37:50 CET] <leriano7> [1351:1351:0305/113036.981847:ERROR:x11_util.cc(86)] X IO error received (X server probably went away)
[12:37:52 CET] <leriano7> thats all my log
[12:39:30 CET] <braw> @furq thank u will try now!
[13:04:34 CET] <DHE> leriano7: use a pastebin for anything longer than 1 line. that's a very strict IRC rule
[13:05:09 CET] <leriano7> what is a pastebin
[13:05:52 CET] <DHE> a web site that lets you paste a bunch of text and it provides a short URL to the contents. eg: pastebin.com
[13:13:23 CET] <furq> what a nice young man
[13:13:43 CET] <furq> also i thought the convention was more than three lines
[13:13:51 CET] <furq> at least that's what they taught me in school
[13:14:25 CET] <DHE> I was worried he might have clicked my link and closed his IRC window... but usually it says "page closed" and the hostname indicates a browser client
[13:14:46 CET] <DHE> different channels have different rules. never heard someone yelled at for using a pastebin for a 2-line message..
[13:15:13 CET] <DHE> also, I'm curious how google chrome errors showed up in that output
[13:15:37 CET] <furq> isn't that just his Xorg.log
[13:16:13 CET] <furq> i'm not sure what we were supposed to divine from that
[13:18:00 CET] <DHE> maybe xsession errors... not the server itself...
[13:23:18 CET] <Nacht> I'm running into some issues with audio priming. I have, for example, 5 MPEGTS files. I wish to trim only the first file, and then concat all the TS files and transmux it afterwards. However, this causes a small gap in the audio. Is there a way to mimic the audio priming as done in the other TS files ?
[13:27:00 CET] <furq> Nacht: what are you remuxing to in the end
[13:33:20 CET] <Nacht> mp4
[14:01:23 CET] <gnarface> so, i have a video that lags on playback
[14:01:59 CET] <gnarface> i want to re-encode it so it plays smooth and doesn't choke the game UI responsiveness anymore
[14:02:28 CET] <gnarface> "ffmpeg -i" spits out this info about it: https://pastebin.com/Fyd5FNXf
[14:03:36 CET] <gnarface> i'm not sure if i should use -b or what
[14:04:05 CET] <gnarface> lower bitrate with -b, or does webm use -crf like libx264 or...
[14:04:18 CET] <gnarface> any suggestions appreciated
[14:04:54 CET] <furq> well definitely don't use vp9
[14:05:11 CET] <furq> libvpx-vp9 is really slow
[14:05:26 CET] <gnarface> furq: unfortunately i don't think i can change that part, it's the background video in the main menu of a video game
[14:05:51 CET] <gnarface> i think the resolution and encoding has to remain the same
[14:05:56 CET] <furq> are you hacking your own intro video into dirt rally or something
[14:06:00 CET] <gnarface> probably pixel format too
[14:06:11 CET] <furq> i figured you were capturing and it was lagging
[14:06:12 CET] <gnarface> no i don't want to change it actually i'd like to reduce the quality as little as possible
[14:06:28 CET] <gnarface> but yes basically i want to replace the intro video
[14:06:37 CET] <gnarface> no capturing is happening
[14:06:40 CET] <furq> right
[14:06:43 CET] <gnarface> and the game actually runs great on low settings
[14:07:04 CET] <furq> well yeah there's a lot you can do but whether it'll work depends on a ton of stuff about the player that we don't know
[14:07:07 CET] <gnarface> the menus lag like mad though, and i've discovered it's all the fault of this video
[14:07:11 CET] <furq> so you'd just have to experiment with resolution, framerate etc
[14:07:41 CET] <gnarface> well i was hoping i could just use the same settings and change -crf like with h264
[14:07:53 CET] <furq> crf doesn't really reduce decode complexity
[14:07:59 CET] <furq> you're thinking of preset/profile/level
[14:08:05 CET] <gnarface> hmmm
[14:08:07 CET] <gnarface> ok
[14:08:15 CET] <furq> there might be some equivalent in vp9 but i couldn't really tell you what it is
[14:08:21 CET] <gnarface> it says: vp9 (Profile 0),
[14:08:33 CET] <gnarface> is there a way to set a different profile?
[14:08:34 CET] <furq> 0 is the most basic
[14:08:37 CET] <gnarface> oh
[14:08:45 CET] <gnarface> so maybe i should just lower the bitrate after all
[14:09:00 CET] <furq> the higher profiles in vp9 just allow more bit depths and chroma subsampling methods
[14:09:08 CET] <BtbN> bitrate has pretty much no impact on decode performance
[14:09:18 CET] <gnarface> oh
[14:09:23 CET] <gnarface> well, what would?
[14:09:40 CET] <gnarface> OTHER than resolution and pixel format i guess
[14:09:45 CET] <furq> you should be able to change the resolution?
[14:09:53 CET] <furq> i mean it'll be rescaling it for different resolutions anyway
[14:10:03 CET] <BtbN> not using vp9
[14:10:12 CET] <furq> i assume it needs vp8 or vp9
[14:10:24 CET] <furq> and iirc vp8 doesn't really decode any quicker
[14:10:30 CET] <furq> i might be wrong about that though
[14:10:50 CET] <furq> but yeah the first thing i would try is making it 720p30 and see if that works
[14:11:34 CET] <furq> if it doesn't then you can get into messing with ref frames and such
[14:14:08 CET] <gnarface> hmmm
[14:14:12 CET] <gnarface> maybe i can change it
[14:15:07 CET] <gnarface> i just realized this guy said he used 1080p 30fps in OpenShot, the video is actually 1920x1056
[14:15:39 CET] <furq> it's also 60fps
[14:17:27 CET] <gnarface> yea
[14:32:06 CET] <gnarface> would -qscale work perhaps?
[14:32:21 CET] <gnarface> on vp9?
[14:44:47 CET] <gnarface> hmmm, this page says that libvpx-vp9 *does* obey -crf if you set -b:v to 0... weird https://trac.ffmpeg.org/wiki/Encode/VP9
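Combining the wiki page's constant-quality mode with furq's earlier 720p30 suggestion, a sketch might be (filenames and the CRF value are examples):

```shell
# -b:v 0 makes libvpx-vp9 use -crf as the sole rate control
# (per the FFmpeg VP9 wiki); scale=-2:720 keeps the width even,
# and fps=30 halves the frame rate to cut decode load.
ffmpeg -i menu.webm -vf "scale=-2:720,fps=30" -c:v libvpx-vp9 -b:v 0 -crf 33 menu_720p30.webm
```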
[14:51:30 CET] <gnarface> oh heh, i just realized... it says 30fps in the file name, that's ... dumb
[14:51:53 CET] <gnarface> and it plays at about 5fps
[14:56:44 CET] <gnarface> wait, so will libvpx-vp9 obey "-preset ultrafast" or is that a libx264 thing only
[14:56:45 CET] <gnarface> ?
[14:59:39 CET] <furq> that's a libx264 thing
[14:59:50 CET] <furq> the equivalent vpx thing is either speed or cpu-used
[15:00:04 CET] <furq> iirc -speed is the ffmpeg option that maps to vpxenc --cpu-used
[15:00:18 CET] <furq> that doesn't really affect decode speed though, just encode speed
[15:15:11 CET] <gnarface> i see
[15:15:18 CET] <gnarface> and the default is lowest already
[15:15:39 CET] <gnarface> -threads 4 will assign more threads to speed up encoding though, right?
[15:16:02 CET] <gnarface> heh, i just tried it and i was getting 0.2fps encoding with -deadline best
[15:16:33 CET] <gnarface> 1.4 with -deadline good
[15:21:48 CET] <furq> -threads doesn't work by itself iirc
[15:22:04 CET] <furq> i forget exactly what you need to do these days to get multithreading to work
[15:22:12 CET] <furq> vpx multithreading is a bit of a sorry affair
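For what it's worth, with a new enough libvpx and ffmpeg, row-based multithreading plus tiling is what actually engages extra cores in libvpx-vp9; a hedged sketch (flag names as in ffmpeg's libvpx-vp9 wrapper, values are examples):

```shell
# -row-mt 1 enables row-based multithreading; -tile-columns splits
# the frame so -threads has something to parallelize over.
ffmpeg -i in.mp4 -c:v libvpx-vp9 -b:v 0 -crf 33 -threads 4 -row-mt 1 -tile-columns 2 out.webm
```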
[15:23:22 CET] <Ivoah> furq: thanks, adding '-c copy' worked great
[16:03:23 CET] <gnarface> hmmm, you know what else is weird? seems like it must be something wrong with that player because it doesn't lag in vlc
[16:06:28 CET] <furq> maybe it's just using libvpx's decoder
[16:06:42 CET] <furq> i imagine vlc uses ffvp9 which is a lot faster
[16:35:11 CET] <Nacht> Hmm. It looks like ffmpeg adds a small audio gap at the end of the file, not really at the start of it.
[16:35:30 CET] <saml> is that good?
[16:35:35 CET] <saml> free audio gap for you
[16:36:21 CET] <Nacht> not if you concat it afterwards :/
[16:41:26 CET] <JEEB> it doesn't add anything in the end generally
[16:41:35 CET] <JEEB> encoder delay only happens at the beginning
[16:41:50 CET] <JEEB> and yes, if you concat it will  be in the beginning of *each* segment
[16:42:03 CET] <JEEB> you have to handle it with an edit list or so
[16:46:45 CET] <Nacht> Well that's the thing. I took 2 TS files. If I concat the originals, no gap. If I transcode both files and then concat them, I get a gap. Then I started testing: if I take the original first file and the transcoded 2nd file, I get no gap. If I take the transcoded first file and the original 2nd file, I do get a gap.
[16:47:06 CET] <Nacht> So I can only conclude that the first transcoded file, seems to have a gap.
[16:47:10 CET] <Nacht> at the end of it
[16:48:12 CET] <Nacht> I'm still trying to work out from the PTS timings where this happens
[17:17:22 CET] <alexpigment> Nacht: maybe just add -shortest?
[17:19:34 CET] <Bear10> is there a way to do something like: ffmpeg -i udp://.... -i udp://.... where one is audio the other video and have them sync up?
[17:25:07 CET] <King_DuckZ> hi I'm following yet another sample c program and it's obviously different from the one I've been following so far - it's called encode_video.c this time. I also found this very helpful blog page (seriously)
[17:25:11 CET] <King_DuckZ> https://web.archive.org/web/20161219185225/https://blogs.gentoo.org/lu_zero/2016/03/29/new-avcodec-api/
[17:25:55 CET] <King_DuckZ> but I still don't understand where the AVStream* thing is - don't I need one?
[17:26:24 CET] <DHE> an AVFormatContext contains a collection of AVStreams... one for video, one for audio, etc.
[17:28:10 CET] <King_DuckZ> DHE: I still don't see any https://www.ffmpeg.org/doxygen/trunk/encode_video_8c-example.html#a8
[17:29:00 CET] <King_DuckZ> but there are here https://www.ffmpeg.org/doxygen/trunk/transcoding_8c-example.html
[17:29:52 CET] <DHE> encode_video.c is not using an AVFormatContext. they're writing a raw stream to disk. so you couldn't do audio+video with that example
[17:31:36 CET] <King_DuckZ> so basically transcoding.c is outdated and encode_video.c is incomplete?
[17:33:13 CET] <DHE> different examples for different jobs
[17:33:58 CET] <King_DuckZ> and each wrong in its own unique way, apparently
[17:34:58 CET] <King_DuckZ> so, when the blog says: "You setup the decoder or the encoder as usual", what's the usual way? where can I find a non-deprecated example, possibly with some explanation of what's going on?
[17:35:45 CET] <King_DuckZ> is dranger.com/ffmpeg/tutorial01.html up-to-date?
[17:36:02 CET] <BtbN> Code examples are based off of FFplay, Copyright (c) 2003 Fabrice Bellard, and a tutorial by Martin Bohme.
[17:36:14 CET] <BtbN> I'd say they are "slightly" outdated.
[17:37:57 CET] <Nacht> alexpigment: Nope, didnt help either
[17:39:05 CET] <alexpigment> k
[17:39:15 CET] <King_DuckZ> the problem I have with dranger.com, assuming that it's not outdated, is that it only talks about decoding, while I'm trying to encode to a new file
[17:44:13 CET] <relaxed> King_DuckZ: have you seen https://github.com/leandromoreira/ffmpeg-libav-tutorial#learn-ffmpeg-libav-the-hard-way
[17:46:10 CET] <King_DuckZ> relaxed: not yet, is it still current? I just want to make sure because this must be tutorial number 4 or 5
[17:50:16 CET] <King_DuckZ> relaxed: it looks very well made, I hope it's going to lead me somewhere near the end of my task
[18:57:08 CET] <bodqhrohro> How can I make mjpeg from gif or from a sequence of jpegs? Whatever I tried, there is only one frame in output
[18:58:33 CET] <relaxed> bodqhrohro: did you try, ffmpeg -i input -c:v mjpeg output.mkv ?
[18:59:36 CET] <bodqhrohro> relaxed: does it support only the MKV container? Can I make something displayable in browsers (at least in some of them)?
[19:01:42 CET] <saml> bodqhrohro, ffmpeg -i 'yolo%02d.png' -an -vcodec mjpeg yolo.mkv
[19:02:14 CET] <furq> bodqhrohro: mjpeg doesn't playback in browsers
[19:02:16 CET] <saml> bodqhrohro, how are you including it in the html?  <img src="yolo.jpg"> ?
[19:04:02 CET] <saml> is mjpeg good?
[19:05:09 CET] <bodqhrohro> furq: so Wikipedia article is wrong about it? Or it means playback with NPAPI/ActiveX plugins? As I know, HTML5 Video standards do not cover MJPEG
[19:06:23 CET] <relaxed> why do you want to use mjpeg?
[19:07:32 CET] <furq> oh wow
[19:07:34 CET] <furq> yeah that's very wrong
[19:07:41 CET] <furq> firefox doesn't support it and chrome hasn't supported it since 2012
[19:07:48 CET] <furq> i assume safari doesn't either
[19:08:16 CET] <bodqhrohro> Safari for macOS supports probably everything QuickTime does
[19:10:23 CET] <bodqhrohro> relaxed: I'd like to fool the validator on a site where animated avatars are prohibited. It accepts only GIF/JPEG/PNG, and even an APNG detector was added after an accident. So MJPEG looks like the last resort, if it's possible to make the validator treat it as a valid JPG and make browsers display it animated when inserted in
[19:13:48 CET] <furq> oh wow
[19:14:06 CET] <furq> the mjpeg wikipedia page not only gets ffserver's name wrong ("ffmpeg-server") it refers to it as "robust"
[19:14:11 CET] <furq> this might be the worst article on the entire site
[19:17:45 CET] <relaxed> bodqhrohro: mjpeg requires a container, so you're probably out of luck
[19:27:39 CET] <furq> browsers used to support mjpeg elementary streams
[19:27:41 CET] <furq> at least firefox did
[19:27:45 CET] <furq> but that's been gone for a while
[19:28:05 CET] <furq> that's how webcams worked in the good old days
[19:30:49 CET] <atomnuker> they still do, mostly
[19:35:03 CET] <Mista_D> How is "-ac 2" better than "-af pan=" filter please?
[19:35:28 CET] <furq> it's easier to type
[19:36:09 CET] <Mista_D> furq: does "-ac 2" has any analysis/automation behind it?
[19:38:52 CET] <furq> at the very least -ac 2 uses the procedure outlined in the a/52 standards
[19:39:29 CET] <furq> you can presumably replicate that with pan if you wanted to but i don't know why you'd bother
[19:40:00 CET] <furq> like the filter docs say, don't use pan unless you have specific needs
[19:41:16 CET] <Mista_D> furq: understood. So it is a well tuned pan then
[19:41:26 CET] <furq> something like that
[19:41:32 CET] <Mista_D> Thank you
[21:00:25 CET] <Mista_D> Any way to expose the "-ac 2" filter's underlying "pan" settings please?
[21:00:36 CET] <Mista_D> -v 100 didn't show it
[21:36:03 CET] <solidus-river> hey all, i'm trying to compile ffmpeg with libx264 support in mingw-w64. i've compiled libx264 in the shell; how do i tell ffmpeg where my x264 lib / header files live?
[21:37:56 CET] <solidus-river> scratch that, i got a .a file but now i need to make a dll
[21:43:52 CET] <solidus-river> well, this will be an interesting day
[21:52:08 CET] <pgorley> solidus-river: ./configure --enable-shared when compiling libx264
[21:52:16 CET] <JEEB> solidus-river: you do enable-shared and set a prefix when configuring x264. then you build and make install into the prefix. then add PKG_CONFIG_PATH=/your/prefix/lib/pkgconfig when configuring FFmpeg
[21:52:32 CET] <JEEB> pkg-config should keep you nice n' comfy with regards to finding the library
[21:52:58 CET] <solidus-river> JEEB, thanks and ah darn, i need to set the prefix then and reconfigure
[21:52:59 CET] <JEEB> mingw-w64 has .a files even with .dlls, the DLLs are in the bin/ directory and then the library that links against that DLL is called a ".dll.a"
[21:53:16 CET] <solidus-river> is there a default prefix?
[21:53:19 CET] <JEEB> solidus-river: the default one is usually /usr/local - the main thing is to just use the "make install"
[21:53:31 CET] <JEEB> that way the pc file gets installed and the correct directory structure gets created
[21:53:53 CET] <JEEB> the pc file then is what pkg-config utilizes to know what on earth compiler or linker flags you need
[21:53:56 CET] <solidus-river> awesome, thanks! i'm going down that route now, just recompiling x264 to output a .def file and with enable-shared
[21:54:30 CET] <JEEB> after you install, you can try manually PKG_CONFIG_PATH=/your/prefix/lib/pkgconfig pkg-config --libs libx264 (or x264)
[21:54:34 CET] <JEEB> I don't remember which it was
[21:54:43 CET] <JEEB> it should give you the linker flags required for x264 usage
[21:54:51 CET] <JEEB> (--cflags should give you the include flags)
[21:54:59 CET] <solidus-river> if it goes to the default, shouldn't pkg-config find it when configuring ffmpeg in the same mingw-w64 env?
[21:55:01 CET] <JEEB> that is what FFmpeg uses under the hood in the configure script
[21:55:21 CET] <JEEB> depends on if /usr/local/lib/pkgconfig is in the default search path :P
[21:55:33 CET] <JEEB> (given /usr/local is the default prefix in many configure scripts)
[21:55:47 CET] <JEEB> personally, just set the bloody PKG_CONFIG_PATH
[21:55:50 CET] <JEEB> :)
[21:56:02 CET] <JEEB> (that actually appends instead of overriding, funny enough)
[21:56:12 CET] <JEEB> PKG_CONFIG_LIBDIR is the one that overrides the whole search path
[21:57:27 CET] <solidus-river> yeah, looks like it put it in /usr/local/lib/pkgconfig
[22:12:18 CET] <solidus-river> aside from setting PKG_CONFIG_PATH correctly, should I do anything else to get ffmpeg's configure script to pick up and enable the x264 codec?
[22:12:41 CET] <BtbN> enable it, and enable gpl
[22:14:24 CET] <solidus-river> --enable-shared --enable-gpl --enable-libx264
[22:46:39 CET] <FishPencil> Can FFmpeg detect a scene change and split the video up into parts based on each scene?
[22:47:52 CET] <FishPencil> I would think the filter would be quite straightforward: compare the similarity of the current and previous frames and cut if it's above a threshold
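There is no single-command scene splitter, but a two-step sketch is possible using the select filter's built-in scene-change score (the 0.4 threshold and the segment times below are examples):

```shell
# Step 1: print the timestamps of detected scene changes; showinfo
# logs a pts_time for each frame the select filter passes through.
ffmpeg -i input.mp4 -vf "select='gt(scene,0.4)',showinfo" -f null - 2>&1 | grep showinfo
# Step 2: feed those times to the segment muxer; with -c copy the
# actual cuts land on the nearest keyframes.
ffmpeg -i input.mp4 -c copy -f segment -segment_times 12.5,47.2,80.0 out%03d.mp4
```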
[22:49:29 CET] <solidus-river> cool stuff. wow, this configure script is slow on mingw64
[22:49:42 CET] <JEEB> solidus-river: yes it takes like 10min on native windows
[22:49:51 CET] <JEEB> which is what made me give up on building windows native
[22:49:55 CET] <solidus-river> 20 minutes so far and i have a threadripper
[22:49:59 CET] <solidus-river> i doubt its using all the cores though
[22:50:09 CET] <JEEB> it's just the slowness of the forking tests and the check thing
[22:50:14 CET] <JEEB> barely takes a minute on *nix
[22:50:25 CET] <JEEB> probably closer to 20s?
[22:50:52 CET] <JEEB> many encoders have some sort of "far enough and big enough of a difference with the lookahead buffer", but I don't think there's a general thing in ffmpeg.c FishPencil
[22:51:48 CET] <FishPencil> Hm, perhaps I'll write a "scenesplit" filter then
[22:51:59 CET] <FishPencil> Anyone know if deshake is looking ahead?
[22:52:02 CET] <FishPencil> or behind
[22:54:28 CET] <solidus-river> yay, configure done and it did link against libx264!
[22:54:53 CET] <solidus-river> do i need to add any additional options to output into a Matroska container?
[22:55:05 CET] <FishPencil> solidus-river: no
[00:00:00 CET] --- Tue Mar  6 2018


More information about the Ffmpeg-devel-irc mailing list