[Ffmpeg-devel-irc] ffmpeg.log.20161211
burek
burek021 at gmail.com
Mon Dec 12 03:05:01 EET 2016
[00:03:38 CET] <vans163> JEEB: Yeah, I think that's exactly what Steam did and Nvidia complained
[00:03:48 CET] <vans163> because if you look at Steam now, there is no mention of hardware capture
[00:03:52 CET] <vans163> They only mention they use DXHooks now
[00:04:25 CET] <vans163> So A, what Steam did was a violation of the Nvidia license. B, they wanted to use the DXHooks so players' chat messages won't show up
[00:04:51 CET] <vans163> But the DXhooks are so unstable that each game needs a custom approach
[00:04:59 CET] <vans163> I guess Steam has the resources to cover that though
[00:05:38 CET] <vans163> I'm guessing it's a bit of both
[00:05:38 CET] <BtbN> Shadowplay still exists in the GeForce Experience
[00:05:51 CET] <BtbN> and still uses the same NvFBC Hardware-Capture-API
[00:06:15 CET] <vans163> BtbN: Yeah, but it's not the same Shadowplay as before. I don't think it uses the NvFBC that captures directly from the video card hardware
[00:06:27 CET] <vans163> I think it's the NvFBC part that shims DX
[00:06:31 CET] <BtbN> It does, that's the entire point.
[00:06:41 CET] <vans163> BtbN: Can you provide a source of this information please then
[00:06:44 CET] <BtbN> It works seamlessly on Vulkan and DX12 as well.
[00:06:59 CET] <BtbN> And if you tab out, it happily continues capturing the Desktop
[00:07:06 CET] <BtbN> so it definitely does not hook the game.
[00:07:11 CET] <vans163> BtbN: interesting then..
[00:07:32 CET] <vans163> you've tested it, yeah?
[00:07:46 CET] <vans163> It was like this ~2 years ago but has since changed
[00:07:52 CET] <vans163> I think it was license changes
[00:08:12 CET] <BtbN> https://developer.nvidia.com/capture-sdk
[00:08:16 CET] <BtbN> Stuff still exists.
[00:08:25 CET] <BtbN> But those APIs are license-locked on Consumer Hardware
[00:08:39 CET] <BtbN> They do exist though, but not accessible to the public.
[00:08:41 CET] <vans163> yeah, so NvFBC does not work on a GTX 1060, for example
[00:08:49 CET] <vans163> but it'll work on a $2,000 Tesla card
[00:08:50 CET] <BtbN> It works great.
[00:08:52 CET] <vans163> hum..
[00:08:59 CET] <BtbN> But you need to pass a license key to the API.
[00:09:14 CET] <BtbN> nvenc used to be the same way
[00:09:16 CET] <vans163> Yeah... those license keys, as far as I'm aware, only come with the expensive cards
[00:09:27 CET] <BtbN> No, on expensive cards it works without a license key.
[00:09:50 CET] <vans163> I dunno what you're talking about, TBH
[00:09:58 CET] <vans163> since I've worked with the NvFBC API recently
[00:10:06 CET] <BtbN> A license key, some GUID, that you need to pass to the API initialization function.
[00:10:08 CET] <vans163> actually coded stuff for the latest version of it
[00:10:14 CET] <BtbN> Without it, it refuses to work on consumer hardware.
[00:10:17 CET] <vans163> and there was nothing like that from what I noticed
[00:10:36 CET] <vans163> There is only a check for whether you have a Quadro+ card; without one, NvFBC silently fails
[00:10:42 CET] <vans163> giving cryptic errors when you try to grab the buffer
[00:10:43 CET] <BtbN> Well, you are probably only using it on Quadro/GRID hardware then, if you don't pass a license key.
[00:10:56 CET] <vans163> I tried with a consumer 700 series card and with a quadro
[00:10:59 CET] <vans163> quadro works great
[00:11:18 CET] <vans163> From reading mountains of docs about it, I never once saw a mention that you can plug in a license key
[00:11:28 CET] <vans163> So I would like a source, if you don't mind, on how to plug in this license key
[00:11:34 CET] <vans163> and where to buy it
[00:11:51 CET] <BtbN> you probably need to sign an NDA for those, like it used to be with NVENC when it was still non-public.
[00:12:09 CET] <BtbN> But it got so popular they dropped all of those restrictions.
[00:12:13 CET] <vans163> yeah... "probably" and "really" are two different things.
[00:12:23 CET] <vans163> If it's "probably", then what you're talking about might not even exist
[00:12:26 CET] <vans163> for all anyone here knows
[00:12:38 CET] <BtbN> But you can be sure Nvidia's own software still has full access to those software-locked APIs.
[00:13:07 CET] <BtbN> There is barely any hardware difference between Quadro and GeForce cards. It's all just in the drivers.
[00:13:08 CET] <vans163> Yeah, if you RE it, sure, but it could be turtled all the way down
[00:13:20 CET] <BtbN> Why RE it? They are Nvidia, they know their own drivers.
[00:13:28 CET] <BtbN> So of course they can use APIs the general public can't.
[00:14:03 CET] <vans163> I dunno where this conversation is going now; sure, Nvidia can do anything. We are talking about the general population
[00:14:18 CET] <vans163> Sure, Nvidia can sign any NDA with anyone and grant anyone access to anything they have.
[00:14:22 CET] <vans163> But will they grant that to someone here?
[00:14:26 CET] <vans163> Very unlikely
[00:14:27 CET] <BtbN> No, we were talking about the GeForce Experience still being able to use those locked features.
[00:14:36 CET] <BtbN> And as it's from Nvidia themselves, of course it can.
[00:14:39 CET] <vans163> Ah, I think it can't anymore
[00:14:43 CET] <vans163> From what I tested
[00:14:51 CET] <vans163> maybe it can with a 1000-series card..
[00:15:04 CET] <vans163> Since I had a Kepler 700 series and Shadowplay would not capture the desktop, only a game window
[00:15:11 CET] <BtbN> It could on my old GTX760, and still can on my GTX1060
[00:15:39 CET] <BtbN> You had and still have to tick a checkbox for it to capture the desktop. As a security/privacy feature.
[00:17:38 CET] <vans163> http://superuser.com/questions/1140176/geforce-experience-shadowplay-desktop-capture-is-gone-in-version-3
[00:17:56 CET] <vans163> so they got rid of it for something else
[00:18:10 CET] <vans163> Just called it "desktop capture" i guess
[00:18:17 CET] <vans163> guess I need to test again :P
[00:22:39 CET] <BtbN> The new GeForce Experience is horrible, sadly
[00:22:49 CET] <BtbN> I uninstalled it after fidgeting with it for a while.
[00:23:21 CET] <furq> well you can't say it doesn't live up to its name
[06:41:24 CET] <roasted> hi friends.
[06:42:25 CET] <roasted> I have a situation where I'm trying to take an mkv, remux it, but retain the exact name of the original file, i.e. overwrite the input file. The best I could come up with was to prepend v2 to the output file name, thereby creating a copy. (for i in *.mkv; do ffmpeg -i "$i" -codec copy "v2-$i"; done). Could I do away with this?
[06:51:12 CET] <kurufu> output to v2, then replace the original with your copy.
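A minimal sketch of kurufu's suggestion, remuxing to a temporary name and only replacing the original when ffmpeg succeeds (the tmp- prefix is just an illustrative choice):

    for i in *.mkv; do
        ffmpeg -i "$i" -codec copy "tmp-$i" && mv -- "tmp-$i" "$i"
    done

ffmpeg cannot safely read and overwrite the same file in one step, so an intermediate file is unavoidable; the && guards against replacing the original with a broken output.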
[13:55:50 CET] <t4nk006> Hey there, I used this https://github.com/lutris/ffmpeg-nvenc to build ffmpeg with NVENC and it works an absolute treat with OBS, but tragtor seems to be relying upon an older version of FFMPEG that was pre-installed in Ubuntu 16.04... Where do I start?
[14:05:37 CET] <furq> probably by uninstalling ffmpeg from apt
[14:06:05 CET] <furq> if tragtor uses the libs rather than just calling the binary then you'll need to rebuild that as well
[14:06:23 CET] <furq> or, you know, you could just use the cli
[14:12:55 CET] <t4nk006> It actually appears to be working now.. strangely.. although I have no idea about the parameters, so even with the GUI I'm getting tripped up
[14:14:41 CET] <t4nk006> http://pastebin.com/RaVcYsvJ this is what I'm passing to ffmpeg...
[15:22:22 CET] <krokodilerian> I have a somewhat strange question. I need to loop a raw video frame and pass it in .mkv format to voctomix (a software mixer), with -c copy
[15:22:46 CET] <krokodilerian> (I tried this with just a PNG or whatever image with -loop, but that takes too much CPU for us)
[15:23:21 CET] <krokodilerian> so, how do I create an image file that contains a single raw 1280x720 yuv420p video frame that ffmpeg can read?
[15:24:21 CET] <durandal_1707> perhaps with the loop option it decodes the same image over and over
[15:25:19 CET] <durandal_1707> try the loop filter; it can loop a single frame forever just fine
[15:25:34 CET] <krokodilerian> durandal_1707: with -f lavfi?
[15:26:09 CET] <durandal_1707> -vf loop=options...
[15:26:24 CET] <durandal_1707> see documentation
[15:28:17 CET] <krokodilerian> durandal_1707: is this something that arrived after 2.6.9? I don't seem to have it in this one
[15:28:42 CET] <krokodilerian> (and because of some other dependencies, I can't upgrade to 3.2, where there's -stream_loop)
[15:29:16 CET] <durandal_1707> that option would consume cpu too
[15:29:31 CET] <durandal_1707> loop filter is in 3.1
[15:29:43 CET] <krokodilerian> durandal_1707: probably, but then I can have an mkv which is raw video and just do -c copy, which solves the issue
[15:32:11 CET] <durandal_1707> krokodilerian: pastebin the full command?
[15:34:08 CET] <krokodilerian> durandal_1707: http://pastebin.com/diB03qwZ
[15:34:14 CET] <krokodilerian> two of the things i tried
[15:36:41 CET] <durandal_1707> movie filter doesn't generate pts/dts when looping
[15:37:33 CET] <durandal_1707> both cases decode over and over again
[15:38:30 CET] <durandal_1707> if it's only a few frames the loop filter does the job, otherwise it consumes too much memory
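One way to build the file krokodilerian asked for above — a single raw 1280x720 yuv420p frame wrapped in an mkv that ffmpeg can read — is the rawvideo demuxer/muxer; the file names and frame rate here are illustrative:

    # turn an existing still into one raw yuv420p frame
    ffmpeg -i still.png -vf scale=1280:720,format=yuv420p -frames:v 1 -f rawvideo frame.yuv
    # wrap that single frame in Matroska as uncompressed video
    ffmpeg -f rawvideo -pixel_format yuv420p -video_size 1280x720 -framerate 25 \
        -i frame.yuv -c:v rawvideo -frames:v 1 frame.mkv

On 3.1+ the loop filter durandal_1707 mentioned (e.g. -vf loop=loop=-1:size=1:start=0) can repeat that one frame forever; on 3.2+, -stream_loop -1 on the input plus -c copy avoids re-decoding it at all.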
[15:50:17 CET] <user____> hi, does anyone know how I can prevent a corrupt mp4 file when ffmpeg is terminated while encoding?
[15:50:29 CET] <furq> you use something other than mp4
[15:50:31 CET] <JEEB> use movie fragments
[15:50:38 CET] <furq> or that, yeah
[15:51:05 CET] <user____> what would you recommend furq?
[15:51:09 CET] <user____> what container
[15:51:16 CET] <furq> if you want robustness then mpegts
[15:51:34 CET] <user____> I would like to have a container that writes the index and all the needed information
[15:51:40 CET] <user____> while encoding the image
[15:51:47 CET] <JEEB> yeah, mpeg-ts doesn't have an index
[15:51:51 CET] <JEEB> it's a bit stream
[15:52:11 CET] <user____> it has timestamps though right?
[15:52:17 CET] <JEEB> yes
[15:52:48 CET] <JEEB> it's the thing used in broadcast so it's not really meant for file playback but rather for streaming
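A minimal sketch of the two options mentioned above, so an interrupted recording stays readable; the encoder and file names are illustrative:

    # fragmented MP4: the moov is written up front and data arrives in fragments
    ffmpeg -i INPUT -c:v libx264 -movflags +frag_keyframe+empty_moov out.mp4
    # MPEG-TS: a plain bitstream with no index to lose
    ffmpeg -i INPUT -c:v libx264 -f mpegts out.ts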
[15:53:14 CET] <user____> I am encoding camera images to videos
[15:53:30 CET] <krokodilerian> user____: live images?
[15:53:34 CET] <user____> yes
[15:53:40 CET] <user____> and I need it to be as reliable as it gets
[15:53:40 CET] <furq> i'd recommend mkv for general usage, but that isn't as robust
[15:53:41 CET] <JEEB> and if you need mp4 then just find out why ffmpeg gets killed :P
[15:53:59 CET] <JEEB> or if *you* are killing it, don't send kill -9
[15:54:06 CET] <JEEB> but rather send normal kill and it will write the index
[15:54:10 CET] <user____> I am using ffmpeg on windows and I see in the logs that it received windows signal 2
[15:54:18 CET] <user____> however when I want to reproduce it my videos look okay
[15:54:57 CET] <user____> http://codepad.org/aELLR4QQ
[15:55:00 CET] <user____> this is all Ive got
[15:55:27 CET] <user____> oops, missing the "received windows signal 2"
[15:55:40 CET] <krokodilerian> user____: how do you get the signal from the camera? something over IP?
[15:57:14 CET] <user____> I am encoding my images to PNG and piping them to ffmpeg
[15:57:46 CET] <user____> using this: http://codepad.org/iqymvt9b
[15:58:08 CET] <user____> speaking of, somehow my image quality is a bit worse than when I look at the images directly
[15:58:18 CET] <user____> x264 is supposed to encode lossless this way I thought
[15:58:57 CET] <furq> it'll encode lossless but ffmpeg is probably converting the pixel format
[15:59:30 CET] <furq> depends what the source pixel format is
[15:59:43 CET] <furq> also you should use -qp 0, not -crf 0
[15:59:55 CET] <furq> -crf 0 isn't guaranteed to be lossless
[16:00:57 CET] <user____> ah okay thanks furq
[16:01:01 CET] <user____> Ill try qp 0
[16:01:12 CET] <furq> if the source is png then it'll be converting from rgb to yuv, probably to yuv444p
[16:01:28 CET] <furq> you can use libx264rgb to encode in rgb but i imagine you'll lose a lot of compatibility
[16:01:33 CET] <user____> can I mitigate that somehow
[16:01:36 CET] <user____> to get better quality
[16:01:59 CET] <furq> check the pixel format of the output with ffprobe or something
[16:02:19 CET] <furq> i'd be surprised if you could notice any degradation between rgb24 and yuv444p, but yuv420p will definitely look worse
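Putting furq's advice together, a sketch of the piped-PNG command under those assumptions (the output names and containers are illustrative): -qp 0 for guaranteed lossless coding, and an explicit -pix_fmt so the only remaining change is the RGB-to-YUV conversion itself:

    # lossless H.264 in yuv444p: only the RGB->YUV conversion rounds
    ffmpeg -f image2pipe -c:v png -i - -c:v libx264 -qp 0 -pix_fmt yuv444p out.mp4
    # fully RGB-lossless alternative, at the cost of player compatibility
    ffmpeg -f image2pipe -c:v png -i - -c:v libx264rgb -qp 0 out.mkv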
[16:02:52 CET] <user____> and speaking of formats, PNG is as good as it gets
[16:03:01 CET] <user____> for image2pipe right?
[16:03:03 CET] <furq> also make sure it's not your player which is making it look worse
[16:03:13 CET] <user____> ffmpeg supports PPM and JPG too as far as I know
[16:03:15 CET] <furq> check with mpv in opengl mode
[16:03:34 CET] <furq> also the best format depends what your camera outputs
[16:03:48 CET] <furq> if it's an mjpeg stream then going via png is probably bad
[16:03:59 CET] <user____> its h264
[16:04:35 CET] <furq> why are you using image2pipe for an h264 stream
[16:05:13 CET] <user____> but I receive it via opencv/directshow
[16:05:22 CET] <furq> oh
[16:05:25 CET] <user____> so I get matrix data
[16:05:29 CET] <user____> no idea if you know opencv
[16:05:43 CET] <user____> I am processing the data beforehand of course, that's why I can't just pipe to ffmpeg
[16:05:46 CET] <kerio> user____: if you're receiving h264, keep it as h264
[16:05:46 CET] <furq> well yeah h264 -> png -> h264 does two unnecessary colourspace conversions
[16:05:56 CET] <furq> at least try to keep it in yuv
[16:06:31 CET] <furq> hopefully opencv will output y4m or something similar
[16:07:20 CET] <kerio> is lossless h264 ever a good idea
[16:07:35 CET] <furq> why wouldn't it be
[16:07:45 CET] <furq> it seems pretty solid as a lossless codec
[16:07:56 CET] <kerio> is it much better than ffv1?
[16:08:06 CET] <furq> it was about on a par last time i tried it
[16:08:21 CET] <furq> faster with -preset ultrafast, better compression with veryslow
[16:08:32 CET] <JEEB> and I think decoding in general is more optimized
[16:08:41 CET] <furq> i imagine compatibility is much better as well
[16:08:42 CET] <JEEB> because the h264 decoder is getting love
[16:08:50 CET] <JEEB> well, not many things support lossless AVC
[16:08:58 CET] <JEEB> it's the thing no-one really implements
[16:09:18 CET] <user____> i fear that opencv only lets me work with BGR
[16:09:30 CET] <user____> I can convert the image before of course
[16:09:38 CET] <furq> well it's not that big of a deal
[16:09:51 CET] <kerio> >wants lossless video
[16:09:53 CET] <kerio> >not that big of a deal
[16:10:01 CET] <furq> but if you're noticing a slight image degradation, i expect that's why
[16:10:06 CET] <user____> yes this is experimental data that needs to be processed afterwards again
[16:10:14 CET] <user____> and live encoded
[16:10:18 CET] <furq> do make sure that x264 is using yuv444p though
[16:10:23 CET] <user____> ok ill
[16:10:34 CET] <user____> and qp 0
[16:12:23 CET] <furq> check your camera's output formats as well
[16:16:11 CET] <user____> this is a png encoded image furq http://codepad.org/DeTMDptj
[16:16:23 CET] <user____> so rgb24
[16:16:38 CET] <kerio> why png tho
[16:17:06 CET] <user____> what would be your recommendation
[16:17:13 CET] <user____> I can probably encode it in a better way
[16:17:19 CET] <kerio> raw yuv probably
[16:17:27 CET] <kerio> if it's going straight to ffmpeg anyway
[16:18:33 CET] <furq> well yeah png is always rgb24
[16:18:40 CET] <furq> i meant the output mp4
[16:18:52 CET] <furq> (also I meant rgb, shut up)
[16:21:16 CET] <user____> ok ill try to get yuv directly or to convert my cv::Mat to yuvs and pipe that to ffmpeg
[16:21:23 CET] <user____> lets see :)
[16:21:48 CET] <furq> well if opencv only works with rgb then it doesn't really make a difference
[16:22:14 CET] <techtopia> http://i.imgur.com/wfHdZPb.jpg
[16:22:18 CET] <techtopia> what happened here?
[16:23:25 CET] <user____> mh yes furq, but encoding to PNG probably gives some unnecessary overhead too
[16:23:34 CET] <user____> when ffmpeg could deal with raw YUV
[16:23:41 CET] <furq> well it can deal with raw rgb as well
[16:23:44 CET] <user____> maybe converting Mat to yuv is faster than png encoding
[16:23:52 CET] <user____> aha, I've never seen that
[16:23:59 CET] <user____> only PPM and JPG for the pipe interface
[16:24:19 CET] <furq> -c:v rawvideo -pix_fmt bgr -i -
[16:24:22 CET] <furq> or something like that
[16:24:24 CET] <user____> ah i am so stupid
[16:24:27 CET] <user____> thank you so much
[16:24:47 CET] <furq> you can encode the output as rgb with libx264rgb or ffv1
[16:25:12 CET] <furq> not to say it's better but it's something to try
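Spelled out fully, a sketch of that rawvideo pipe; bgr24 is the pix_fmt name matching an 8-bit, 3-channel OpenCV Mat, and the frame size and rate here are placeholders for the camera's actual values:

    ffmpeg -f rawvideo -pixel_format bgr24 -video_size 1280x720 -framerate 25 -i - \
        -c:v libx264rgb -qp 0 out.mkv

The rawvideo demuxer needs the size, rate and pixel format up front, because the pipe carries nothing but packed pixel bytes.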
[16:43:29 CET] <kerio> i'm still somewhat convinced that there's got to be a way to keep the original encoding
[16:43:39 CET] <kerio> in opencv itself i mean
[16:43:57 CET] <kerio> (it doesn't make much sense if you're encoding something different than the raw camera feed obviously)
[16:48:25 CET] <kerio> user____: is that the case?
[17:05:18 CET] <durandal_1707> techtopia: how that happened?
[17:26:52 CET] <DHE> aw damn, older versions of nvenc aren't supported anymore. I must use the included .h file. no other options?
[17:28:51 CET] <c_14> you can use another version, that one's just bundled for convenience
[17:29:00 CET] <c_14> though in the future higher versions might become required
[17:29:48 CET] <c_14> you'll probably have to replace the bundled header though
[17:29:57 CET] <DHE> my loaded kernel driver doesn't support what's in the ffmpeg repo, and the nvenc .h I have handy doesn't work if I put it into ffmpeg
[17:30:22 CET] <c_14> what version do you have?
[17:30:42 CET] <DHE> kernel version 358.xx and nvenc sdk version 6.0.1
[17:31:28 CET] <BtbN> No, it's hardcoded to use the bundled header now.
[17:31:44 CET] <DHE> I swapped the header file manually just to try. didn't work
[17:31:46 CET] <BtbN> And it uses features only present in that SDK. So replacing it with an older header won't work as well.
[17:31:53 CET] <DHE> yep...
[17:32:09 CET] <BtbN> Headers are also slightly modified.
[17:32:21 CET] <c_14> you could revert the commits which add the headers and use the new features
[17:32:27 CET] <BtbN> good luck
[17:51:45 CET] <DHE> well damn...
[18:57:17 CET] <Threadnaught> so I'm writing jpg images to imagepipe and trying to stitch them into a video, but it only shows the first frame and it's really low quality and doesn't exit (I have to use ctrl-c): cat imagepipe | ffmpeg -y -f image2pipe -vcodec mjpeg -r 24 -i - -vcodec mpeg4 -qscale 5 -r 24 video.avi
[18:57:39 CET] <Threadnaught> help
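One thing worth trying is letting ffmpeg read the named pipe directly instead of going through cat, with a lower -q:v for better MPEG-4 quality; whether this cures the single-frame symptom depends on what is feeding imagepipe, and ffmpeg will only exit once the writer closes the pipe:

    ffmpeg -y -f image2pipe -r 24 -c:v mjpeg -i imagepipe -c:v mpeg4 -q:v 2 video.avi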
[21:38:44 CET] <hurricanehrndz> Does anyone know if you can do a static compile of ffmpeg with mmal
[21:38:45 CET] <hurricanehrndz> ?
[23:30:09 CET] <BasisBit> I'm currently trying to port a quite famous FLOSS karaoke game to ffmpeg 3. I already got video playback working again, but I've been stuck for days on the audio part. Previously, the code called PaketDecodedSize := avcodec_decode_audio3(fCodecCtx, PSmallint(Buffer), DataSize, @fAudioPaket);. I tried to adapt this to the avcodec_decode_audio4 function, but had no success at all because I don't understand how ffmpeg works internally. Anyone here who could maybe show me
[23:30:09 CET] <BasisBit> some example implementation?
[23:31:21 CET] <BtbN> https://github.com/FFmpeg/FFmpeg/blob/master/doc/examples/demuxing_decoding.c
[23:31:45 CET] <JEEB> that's a good one :)
[23:32:05 CET] <JEEB> but basically avcodec_decode_audio4 removes you from some of the internals you shouldn't have to care about
[23:32:26 CET] <JEEB> so instead of you handling a random buffer
[23:32:30 CET] <JEEB> you get an AVFrame
[23:32:34 CET] <BtbN> Keep in mind that decode_audio4 is also kind of "deprecated" already
[23:32:44 CET] <JEEB> oh right, did we have 5 already? :D
[23:32:59 CET] <BtbN> the avcodec_send/receive_packet/frame calls are the latest API, and I'm not aware of any example beside ffmpeg.c
[23:33:16 CET] <JEEB> oh, right. the async API was it?
[23:33:17 CET] <BtbN> decoupling input and output
[23:33:37 CET] <BtbN> the API isn't hard to figure out from the headers though
[23:33:42 CET] <BtbN> it's pretty straight forward
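A minimal sketch of that decoupled decode loop, assuming an already-opened audio AVCodecContext; handle_audio_frame is a hypothetical application callback (e.g. feeding the game's audio buffer), not an FFmpeg function:

    #include <libavcodec/avcodec.h>
    #include <libavutil/frame.h>

    /* Hypothetical consumer supplied by the application; not part of FFmpeg. */
    void handle_audio_frame(const AVFrame *frame);

    /* Decode one demuxed packet with the send/receive API (FFmpeg 3.1+).
     * Returns 0 on success or a negative AVERROR code. */
    int decode_audio_packet(AVCodecContext *dec_ctx, AVPacket *pkt, AVFrame *frame)
    {
        int ret = avcodec_send_packet(dec_ctx, pkt);  /* feed compressed data */
        if (ret < 0)
            return ret;

        for (;;) {
            ret = avcodec_receive_frame(dec_ctx, frame);
            if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
                return 0;            /* needs more input, or fully flushed */
            if (ret < 0)
                return ret;          /* real decoding error */
            /* frame->nb_samples samples per channel are now available in
             * frame->extended_data[], in dec_ctx->sample_fmt. */
            handle_audio_frame(frame);
            av_frame_unref(frame);
        }
    }

At end of stream, send a NULL packet once and keep receiving until AVERROR_EOF; that replaces the got_frame bookkeeping of avcodec_decode_audio4.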
[23:34:06 CET] <hurricanehrndz> anyone compile ffmpeg with mmal?
[23:34:57 CET] <BasisBit> BtbN, yes, this function is already deprecated, but I still have to implement it because there are distributions out there which only provide ffmpeg 2.8 or 3.0, for example. (just the usual API hell when you want to support everything)
[23:35:31 CET] <BtbN> it has existed for quite a while, I think
[23:35:57 CET] <BasisBit> well, until 2.7, decode_audio3 worked well
[23:36:28 CET] <BtbN> that's the point of long deprecation cycles
[23:36:54 CET] <BtbN> No idea since when the latest API is in, but it seems like a reasonable dependency
[23:37:10 CET] <BasisBit> BtbN, "long" as in ~ 1.5 years? For many smaller projects that is not "long" at all.
[23:38:15 CET] <BasisBit> and of that 1.5 years, it took roughly one year for most people to update to an ffmpeg version which supports decode_audio4.
[23:38:45 CET] <BtbN> That's thanks to distributions being behind
[23:39:25 CET] <JEEB> the libav guy who was doing debian work used to try and help a lot of pretty much unmaintained packages update their API usage
[23:39:38 CET] <JEEB> he seems to have left that role a year (?) or so ago
[23:40:02 CET] <BasisBit> Yup. I am not saying that this is ffmpeg's fault, just that this is sort of hard to keep up with for smaller projects, which basically have to support various versions and have to use lots of compiler directives and glue.
[23:42:32 CET] <JEEB> also there's the case of just dropping support for older versions since after all distributions wouldn't update your applications either
[23:42:49 CET] <BasisBit> for example, I am lost on how to change the code (in a karaoke game I maintain which keeps an audio file, a video file and some lyrics data in sync) from Buffer to AVFrame.
[23:43:09 CET] <JEEB> BasisBit: the change in libav/ffmpeg was done ~2011 https://lists.ffmpeg.org/pipermail/ffmpeg-cvslog/2011-December/044245.html
[23:43:26 CET] <JEEB> that's a commit that did the change for avconv/ffmpeg (the command line app)
[23:43:27 CET] <BasisBit> lol^^
[23:43:39 CET] <JEEB> so yeah, it's been a few years
[23:44:25 CET] <JEEB> so you can look at how the change was done there as an example
[23:45:05 CET] <JEEB> and the decoding example of course
[23:45:12 CET] <JEEB> which was linked before
[23:45:13 CET] <BasisBit> and still, most tutorials out there use code based on avcodec_decode_audio3 ^^ I'll try some more to get the code to work. Thanks so far for your good support!
[23:45:51 CET] <JEEB> well yeah
[23:46:04 CET] <JEEB> old stuff just won't die :P
[23:46:18 CET] <JEEB> this API was interoduced more than five years ago
[23:46:27 CET] <JEEB> *introduced
[23:47:34 CET] <JEEB> anyways, my alternatives would be either to change to something that utilizes FFmpeg's libraries (ffms2, libmpv), or to keep more connected with FFmpeg development so these things don't come as a "surprise" five+ years later :)
[23:49:01 CET] <BasisBit> to be honest, the game I maintain regularly gets forum posts from people who ask for support on how to get it to work with some ffmpeg 1.x or 0.x version, because only exactly that version works flawlessly on their weird hardware and at the same time offers hardware-based decoding
[23:49:20 CET] <JEEB> in most cases that's a lie
[23:49:42 CET] <BasisBit> well, I guess they just have old systems and don't know how to update them
[23:49:45 CET] <JEEB> yes
[23:49:50 CET] <JEEB> most likely something like that
[23:50:12 CET] <JEEB> but you can't keep supporting those and staying sane to be honest :P
[23:50:31 CET] <JEEB> the only realistic way to do it is to tell them to use an older version if you ever supported those versions
[23:51:57 CET] <BasisBit> the game flawlessly supports all these versions: https://i.imgur.com/DmFmABW.png
[23:51:57 CET] <JEEB> or you start packaging lavf/lavc with your binaries, which is also not perfect, but f.ex. mpv provides a build script thing that builds you the latest libass, ffmpeg and mpv
[23:52:12 CET] <JEEB> jesus christ
[23:52:25 CET] <JEEB> I don't know if I should congratulate you or tell you you're mad
[23:52:42 CET] <BasisBit> well, I only revived the project last year...
[23:53:49 CET] <JEEB> but yeah, take a look at if something like this is worthwhile for you https://github.com/mpv-player/mpv-build
[23:53:58 CET] <BasisBit> and basically it is just a huge mess of compiler directives on which code to use. So with every new ffmpeg version, we adapt it so it works while not breaking older ffmpeg versions
[00:00:00 CET] --- Mon Dec 12 2016