[Ffmpeg-devel-irc] ffmpeg.log.20181019

burek burek021 at gmail.com
Sat Oct 20 03:05:02 EEST 2018


[01:14:52 CEST] <zamba> i'm digitizing some old tapes.. which container should i output to, to be able to view the result while i'm encoding?
[01:15:03 CEST] <zamba> using .mp4 means i can't preview it until it's done
[01:25:39 CEST] <furq> mkv
[01:25:47 CEST] <furq> you should use mkv anyway because it won't be unrecoverable if ffmpeg crashes
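
For zamba's case, a hedged example of what furq is suggesting (the capture input and codec choices below are placeholders, not from the log): writing to Matroska means the file can be opened in a player while ffmpeg is still writing it, and it stays readable if the encode dies halfway.

    ffmpeg -i <your-capture-input> -c:v libx264 -crf 18 -c:a flac capture.mkv
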
[03:32:14 CEST] <Spring> is there a likely cause why the 'blocksize' option for the deshake filter won't run if the value is higher than '10'? The default value according to the docs is '8', with a possible range of 4 to 128.
[03:40:08 CEST] <Spring> also ffmpeg.pastebin.com is broken for me
[08:21:08 CEST] <poutine> I'm experiencing a bug in software when PTS rolls over in a HTTP Live Stream where I have not captured the segments the rollover occurs in. Any ideas how I could generate that or get an example mpeg ts file that contains a PTS rollover?
[14:48:58 CEST] <Zexaron> So it was said MKV precision is 10k for VFR, what about MPEG-TS ?
[14:49:24 CEST] <BtbN> precision?
[14:49:44 CEST] <BtbN> mpeg-ts has a 1/90000 timebase, if that's what you mean.
[14:49:51 CEST] <Zexaron> some kind of timecode stuff
[14:50:06 CEST] <Zexaron> which is needed for accurate VFR
[14:50:08 CEST] <JEEB> Zexaron: the matroska time base is adjustable, unlike MPEG-TS
[14:50:22 CEST] <JEEB> but only in tens if I recall correctly
[14:50:35 CEST] <Zexaron> What about decimal?
[14:50:47 CEST] <JEEB> as in, you cannot set a time base like 1001/24000 for matroska
[14:50:58 CEST] <JEEB> which is why all 24000/1001 fps content in matroska is effectively VFR :P
[14:51:21 CEST] <JEEB> MPEG-TS is as BtbN said - 1/90000 time base (and that one's hard-coded)
[14:51:29 CEST] <atomnuker> afaik a lot of demuxers break when the matroska timebase isn't 1/1000
[14:51:51 CEST] <JEEB> hard-coding is best coding
[14:54:01 CEST] <Zexaron> JEEB: If you recall the previous discussion about capturing frames, I did more thinking and I've been theorizing why it has to be VFR with timecodes at all; why couldn't the video just be CFR at 60 FPS? ffmpeg would simply put the frames together like this: wait longer if a frame comes in late (just wait, no empty or dummy frames), or delay the frames if the source is faster than the fixed FPS
[14:54:48 CEST] <Zexaron> There's no need for the video to actually be VFR
[14:54:58 CEST] <JEEB> if your source is rendering faster than your expected rate then you can't just delay the frames can you?
[14:55:05 CEST] <JEEB> you can only do CFR if you know your source to be CFR
[14:55:23 CEST] <JEEB> or if you are ready to do hacky crap like dropping frames or generating frames out of thin air
[14:55:30 CEST] <JEEB> which I kind of thought were something you wouldn't like to do
[14:55:49 CEST] <BtbN> just using timecodes is usually easier anyway
[14:55:53 CEST] <JEEB> you mean timestamps
[14:56:07 CEST] <BtbN> In the simplest case, you set the timebase to 1/60 or whatever, and advance by 1 every frame
[14:56:11 CEST] <JEEB> matroska uses the wording "timecodes" which has driven people insane for years
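
A minimal sketch of what BtbN describes, using FFmpeg's C API from C++ (the 1/60 rate and the function name are placeholders, not Dolphin code):

    extern "C" {
    #include <libavcodec/avcodec.h>
    }

    // Before avcodec_open2(): declare that one tick is 1/60 of a second.
    //   enc->time_base = AVRational{1, 60};
    // Then each frame's presentation timestamp just counts ticks:
    static void push_frame(AVCodecContext *enc, AVFrame *frame, int64_t frame_index) {
        frame->pts = frame_index;        // frame_index * (1/60) s on that time base
        avcodec_send_frame(enc, frame);  // packet draining and error checks omitted
    }
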
[14:56:55 CEST] <Zexaron> Not exactly delay; delay as in queued, stored in a buffer if the encoding can't happen fast enough due to CPU constraints, and I was already hoping to set ffmpeg's encoding to low priority and make it use fewer threads
[14:58:22 CEST] <Zexaron> What I'm getting at is "construction" ... let's say the ffmpeg encoder is building a skyscraper: no matter how fast or slow the resources (materials, concrete, steel) come in, you have the same distance between floors
[14:58:41 CEST] <Zexaron> or some kind of analogy
[14:59:06 CEST] <JEEB> do you always have the same distance between floors?
[14:59:20 CEST] <JEEB> I'd expect the emulation rendering speed to differ per frame depending on various variables
[15:00:03 CEST] <Hello71> I think it assumes frame perfect rendering
[15:00:06 CEST] <Hello71> or at least no skipping
[15:00:37 CEST] <JEEB> yes if you are rendering a demo or something you can just make the emulator as a whole work at non-1x speed
[15:00:48 CEST] <JEEB> but I'd guess most use cases that's not the case :P
[15:00:56 CEST] <BtbN> timestamps are just easier
[15:01:01 CEST] <Zexaron> Yes it does differ, but game speed also differs; if it gets below 60 or 30, time moves slower and audio is slower, like a time machine, so there's no point in recreating that in the resulting video
[15:01:16 CEST] <BtbN> CFR is a special case of VFR anyway, so if you implement the latter, you get it for free
[15:01:21 CEST] <Zexaron> Because game time/speed is tied to FP
[15:01:22 CEST] <Zexaron> FPS*
[15:02:32 CEST] <Zexaron> And the game runs at its normal speed only at the predefined FPS it's supposed to run at; 30 FPS is the most common, followed by 60
[15:04:42 CEST] <Zexaron> So does that make sense?
[15:05:27 CEST] <Zexaron> I hope there's not some kind of thing I'm missing there that would make that theory invalid
[15:06:16 CEST] <Zexaron> BtbN: it has to be official-spec CFR for editors to have no problems with the resulting file, which is not the case currently
[15:06:36 CEST] <BtbN> What is official spec cfr?
[15:06:50 CEST] <Zexaron> so if it's VFR with timecodes that only approximately "look like CFR", that's not enough
[15:07:42 CEST] <Zexaron> well, if there's no such spec, I guess: if all the frames have the exact same time delay, that's CFR
[15:07:57 CEST] <Zexaron> and feeding that to a video editor should be bug free
[15:07:58 CEST] <JEEB> duration you mean :P
[15:08:08 CEST] <JEEB> and editors will always have issues, don't worry about that
[15:08:34 CEST] <JEEB> but given that editors can nowadays read both MPEG-TS and MP4 and MOV I'm pretty sure the basic support for things is there
[15:08:56 CEST] <BtbN> Just put in constantly incrementing timestamps yourself, if you want the video to match game fps, and not emulation fps
[15:09:02 CEST] <Zexaron> With dolphin's framedumping files they have a lot more than usual, it's one of the main issues, so I was told
[15:09:39 CEST] <JEEB> without further technical details I cannot comment on that whatsoever.
[15:09:40 CEST] <JEEB> but anyways
[15:09:50 CEST] <JEEB> I did ask you to look into some initial time base
[15:09:57 CEST] <JEEB> at which the emulator as a whole works :P
[15:10:06 CEST] <JEEB> that can be 1/60, or that can be something else
[15:10:14 CEST] <JEEB> figure that out if there's one
[15:10:28 CEST] <JEEB> then you can stick your rendered stuff onto that time line
[15:11:05 CEST] <Zexaron> BtbN: for what reason would I want it to match game fps? In my logic, at least in my opinion, I do not want a video that recreates the game speed; that's what the existing solution is already doing. It's not VFR even though everyone's saying it is, MPC-HC reports 60 FPS ... the VFR part might just be 0.9 FPS of jitter around 60
[15:11:34 CEST] <BtbN> If you dump every frame the game renders, that's what you get
[15:11:35 CEST] <Zexaron> It may technically be VFR, but in practice it's not trying to be VFR
[15:11:48 CEST] <BtbN> Didn't you just ask the exact opposite minutes ago?
[15:11:56 CEST] <BtbN> How to make it _not_ cfr?
[15:13:04 CEST] <Zexaron> Yes, because I was looking at that method, but I have simultaneously been thinking about going in a totally different direction; I have doubts that it has to be VFR
[15:13:31 CEST] <Zexaron> a different approach from what the developers familiar with this were saying needs to happen in the end
[15:13:51 CEST] <Zexaron> "We do timecode adjusting so the resulting video always plays back at normal speed, not the speed it was rendered at"
[15:14:11 CEST] <BtbN> The emulation running anywhere between full speed 60 fps and anything less because lagging is a prime example of VFR
[15:15:42 CEST] <Zexaron> With more info I started theorizing why bother getting timecodes if you're going to have to adjust them later anyway to make it look like CFR
[15:17:13 CEST] <Zexaron> So that's what I had in mind: why wouldn't FFmpeg just pick a bunch of frames and put them together with a constant frame time, the same way you would pick a folder full of PNG image files and put them together with -r 30? Then all this implementation would be a convenient way to do that on the fly, without the user having to run ffmpeg separately later
[15:18:15 CEST] <Zexaron> the rate just has to be exactly right for the specific game, or else it wouldn't play back at normal game speed.
[15:19:00 CEST] <Zexaron> Nobody even wants any other speed except normal, afaik.
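
For reference, the "folder full of PNGs at a fixed rate" case Zexaron mentions looks roughly like this (the file pattern and the 30 fps rate are placeholders); -framerate sets the input rate directly instead of letting -r drop or duplicate frames:

    ffmpeg -framerate 30 -i frame_%06d.png -c:v libx264 -pix_fmt yuv420p out.mp4
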
[15:19:24 CEST] <JEEB> well even if the renderer itself were slowed down it would still know when the frame is supposed to be shown, I'd think?
[15:19:49 CEST] <JEEB> so as long as that darn thing does its job, your job as the encoder is to just use that timestamp and pass it on to encoding
[15:20:26 CEST] <JEEB> doing any sorts of comparisons against -r is just fucking futile since generally you don't want to drop or duplicate any frames - which is exactly what that -r option does (it uses ffmpeg.c's dumb as fuck "vsync" logic)
[15:21:25 CEST] <JEEB> and hopefully you can get the audio in a 1x speed from the emulator's audio output :P
[15:21:26 CEST] <Zexaron> Indeed, that's exactly what I wanted to eventually make sure of ... whether ffmpeg supports the kind of "construction" I'm trying to get at
[15:21:56 CEST] <JEEB> FFmpeg's API literally just fucking lets you pass in raw video and audio frames in and codes them
[15:22:12 CEST] <BtbN> ffmpeg generally doesn't care about time at all
[15:22:17 CEST] <BtbN> it just passes through timestamps
[15:22:28 CEST] <BtbN> And converts them if the timebase can't be kept
[15:22:48 CEST] <Zexaron> And I already knew the next step if it doesn't: make support for that in FFmpeg first, if necessary. I could try, but my skills/time, not sure yet
[15:23:00 CEST] <JEEB> what the flying fuck are you speaking about at this point
[15:23:20 CEST] <Zexaron> A feature like that might also come useful elsewhere not just in this dolphin-emu case
[15:23:23 CEST] <JEEB> ok, if you have the emulation running at <1x then the frames will render slower than they'd be rendering
[15:23:37 CEST] <JEEB> but the renderer should still understand what the supposed shown timestamp should be, no?
[15:23:41 CEST] <JEEB> or at least fucking hopefully
[15:23:57 CEST] <JEEB> you should not be attempting to fucking filter the stuff if your idea is to get what the emulator is feeding you
[15:24:09 CEST] <JEEB> not dropping input frames nor duplicating them
[15:24:13 CEST] <Zexaron> the game speed changes proportionally, 60 is 100% game speed
[15:24:19 CEST] <JEEB> yes, I FUCKING UNDERSTAND THAT
[15:24:33 CEST] <JEEB> THAT IS EXACTLY WHY ALL YOU NEED IS THE FUCKING TIMESTAMP OF WHEN THE FRAME WAS SUPPOSED TO GET RENDERED
[15:24:48 CEST] <JEEB> even if it comes early or late the media framework does not fucking care
[15:24:56 CEST] <JEEB> as in, wallclock wise
[15:25:23 CEST] <BtbN> You don't even need that. The emulator dumps out images, and you know the framerate, as it's NTSC/PAL/...
[15:25:42 CEST] <BtbN> It only gets complicated when you want to match emu render speed, not intended game speed
[15:25:50 CEST] <JEEB> well I'm trying to not set up too many expectations depending on how the emulator will develop etc
[15:25:59 CEST] <BtbN> And since you apparently don't want that... I fail to see what you even want
[15:27:08 CEST] <JEEB> what can get funky is audio output if your game time goes <1x. but I hope the emulator can output audio at 1x speed in some way
[15:27:24 CEST] <JEEB> nothing to do with video since if you have a known maximum time base according to which all frames come through
[15:27:52 CEST] <JEEB> and the fact that the emulator does not drop frames by itself, of course - which is why timestamps are nice if it ever drops or whatever
[15:28:10 CEST] <Zexaron> Well, maybe I simply don't know how to explain it properly, so you guys don't understand what I mean by "construction" ... it may be something that's never been known to FFmpeg or similar, but I didn't know that
[15:28:38 CEST] <Zexaron> I confirmed from others that it never drops frames to keep going
[15:28:50 CEST] <BtbN> I don't even know what you are talking about
[15:28:57 CEST] <Zexaron> If it can't render it, it slows down, halts
[15:29:02 CEST] <BtbN> video timestamps are a pretty well established thing, nothing there to invent really
[15:29:06 CEST] <JEEB> ^
[15:30:24 CEST] <Zexaron> Sure, timestamps for a CFR 60 FPS video should all be the same distance apart, right, which means it's something fixed that FFmpeg can figure out on its own, no need to have the emulator feed it and do checks
[15:30:39 CEST] <JEEB> what the fuck are you once again talking about
[15:31:03 CEST] <JEEB> anyways, if your input never drops then you still do the same as usual. you get the supposed presentation time of that given frame and pass it along on the time base that you have. everyone's happy
[15:31:08 CEST] <JEEB> there's nothing to do there, literally
[15:31:16 CEST] <JEEB> with video that is
[15:31:57 CEST] <JEEB> with audio you just hope that the engine can output the audio samples as-is so that when you play them you get the original 1x speed :P
[15:32:22 CEST] <JEEB> if that is true, then the whole encoding and multiplexing part is very simple
[15:32:38 CEST] <Zexaron> Yes, see, this is what I have a hard time explaining: all FFmpeg would do is insert frames into a CFR video. The timestamps are all known, it starts at 0, and for 60 FPS they're 1/60 s (about 16.7 ms) apart; they're all the same distance apart, you don't even need to calculate them every time, you could even have a predefined list stored in a template
[15:33:02 CEST] <Zexaron> I can do that with Excel right now
[15:33:30 CEST] <JEEB> you're not having an issue explaining the fact that currently it seems like all your frames are fucking +1 on some fucking time base
[15:33:43 CEST] <JEEB> it's just that I don't want to make the design such that if the design ever changes you get butt-fucked
[15:33:51 CEST] <JEEB> and thus I am talking about timestamps and getting them from the renderer
[15:33:59 CEST] <Zexaron> You don't need any timebase either; when framedumping starts, it starts from zero for that video
[15:34:16 CEST] <JEEB> TIME BASE IS FUCKING TICKS PER SECOND YOU FUCKING IMBECILE
[15:34:18 CEST] <Zexaron> if I understood timebase correctly, heh still a few holes
[15:34:37 CEST] <JEEB> I'm pretty sure I went through this shit during all the nights I've been up trying to explain you shit
[15:35:40 CEST] <JEEB> so if the fucking video renderer knows it's supposed to be 1/60 or 1/50 or 1001/60000 it can signal that
[15:35:46 CEST] <JEEB> that is your fucking video time base
[15:35:57 CEST] <JEEB> then if nothing ever gets dropped all the presentation time stamps would be +1
[15:36:00 CEST] <JEEB> on that fucking time base
[15:36:10 CEST] <JEEB> this should not be a fucking hard premise to understand
[15:36:25 CEST] <Zexaron> maybe ffmpeg doesn't have an internal timer; it doesn't have to be in ffmpeg, of course, that timecode stuff can be in the main program, in the framedumping code, running on another thread, yes
[15:36:36 CEST] <JEEB> stop thinking about wallclock time
[15:36:38 CEST] <JEEB> STOP
[15:36:43 CEST] <JEEB> THIS HAS NOTHING TO DO WITH WALLCLOCK
[15:37:46 CEST] <Hello71> sounds like some proprietary crap where it has to have a bajillion adapters. can't possibly be something simple
[15:37:53 CEST] <Zexaron> Sorry, I'm just using one term to cover them all; I heard about them a couple of days ago for the first time, I'll get up to speed sooner rather than later though
[15:39:13 CEST] <JEEB> it's the presentation side wallclocks, sure. but literally the only thing you have to do with the multimedia APIs is that you pass the right timestamps on the correct time base along
[15:39:52 CEST] Action: Mavrik gives a cookie to JEEB . There, there.
[15:39:55 CEST] <Mavrik> Timebase is a hard concept :/
[15:40:23 CEST] <JEEB> I'm still trying to explain to him that he doesn't need anything from FFmpeg if his input gives him the time base and the presentation time stamp of both audio and video to him
[15:40:58 CEST] <Zexaron> JEEB: I understand that part, the renderer is supposed to know it has to be 1/60 and it can signal that, but why does it have to signal each frame? Before the framedumping starts, the emulator and the ffmpeg API code can negotiate the normal-speed frame rate, then FFmpeg would just fill things in based on what is predefined; in a CFR video there's no difference, what's there to figure out, it's all the same if it's fixed
[15:41:31 CEST] <JEEB> why the fuck would FFmpeg suddenly start touching your input in inappropriate ways?
[15:41:38 CEST] <JEEB> if you don't require any of that
[15:41:39 CEST] <Zexaron> Technically the emulator doesn't know anything; we just see it as slow game speed if the FPS dips
[15:42:00 CEST] <JEEB> also at the very fucking least you want some sort of fucking timestamps to start getting the video and audio synchronize'able
[15:42:26 CEST] <JEEB> unless your sources can 100% note that both start at exactly the same time and you don't need to adjust anything
[15:42:52 CEST] <JEEB> I just know way too many cases of where audio frames don't exactly align with video frames
[15:42:57 CEST] <Zexaron> Oh yes ... but as it is right now, video and audio are separate; audio is also tied to game speed the same way FPS is, it'll slow down proportionally
[15:43:31 CEST] <JEEB> anyways, just fucking deal in fucking presentation time stamps because that way if any of the fucking details changes in anywhere else you don't have to fucking rewrite your fucking bullshit
[15:43:49 CEST] <JEEB> you can have a fucking piece of shit code hard-coding +1 for each fucking frame
[15:43:52 CEST] <JEEB> as the fucking PTS
[15:43:58 CEST] <JEEB> from the fucking 1/frame_rate time base
[15:44:02 CEST] <JEEB> yes, you can have that
[15:44:03 CEST] <JEEB> SURE
[15:44:11 CEST] <JEEB> BUT IS THAT A GOOD FUCKING SOLUTION GODDAMNIT
[15:44:14 CEST] <Zexaron> But I am, I was willing to rewrite the whole frame dumping logic ... even parts of the rendering if necessary
[15:44:25 CEST] <Zexaron> It's like 10 years old
[15:44:26 CEST] <JEEB> IF THE FUCKING RENDERER KNOWS THE FUCKING PRESENTATION TIME OF A FRAME
[15:44:34 CEST] <JEEB> AND IF IT KNOWS THE VIDEO FUCKING TIME BASE
[15:44:38 CEST] <JEEB> fucking hell
[15:44:44 CEST] <JEEB> why do I even bother with you
[15:44:45 CEST] <TheAMM> damn bunglers
[15:45:05 CEST] <JEEB> sorry for the caps, at this point I fucking give up and maybe do something more productive
[15:45:36 CEST] <TheAMM> I haven't read this all but hasn't this been going on for a week
[15:45:59 CEST] <TheAMM> And the situation is exactly "the renderer has a timestamp"
[15:46:39 CEST] <Zexaron> Well, I wasn't assuming this method would be 100% perfect; it could simply be one option which may be useful for other reasons. There are debugging reasons which may seem like the wrong way of doing things, of course
[15:48:50 CEST] <pagios>  hello, can i serve an m3u8 file from server1, and include in that file the .ts files to be consumed from server2?
[15:49:32 CEST] <JEEB> have you tried looking at the AVOptions of the HLS "muxer" :P
[15:49:55 CEST] <JEEB> https://www.ffmpeg.org/ffmpeg-all.html#hls-2
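
For pagios' question specifically, the relevant HLS muxer option is hls_base_url, which prefixes every segment entry written into the playlist (the URLs and paths below are placeholders):

    ffmpeg -i <input> -c:v libx264 -c:a aac -f hls \
        -hls_base_url "https://server2.example.com/segments/" \
        -hls_segment_filename "/var/www/segments/seg_%03d.ts" \
        playlist.m3u8
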
[15:50:18 CEST] <Zexaron> I appreciate the fact that he's trying to explain the correct way things are generally done, but this isn't a general case, and I failed to explain in the beginning that this is just research and I'm creating a prototype for this method. The idea may not exist nor be possible with ffmpeg currently, and my disagreement is that everyone thinks it wouldn't work; I was looking for an explanation of why CFR wouldn't work, but nothing is
[15:50:18 CEST] <Zexaron> conclusive, so that has strengthened my interest in making a prototype to see how it would work versus the existing solution and others
[15:51:09 CEST] <JEEB> there is nothing that you've mentioned that isn't possible with FFmpeg's APIs
[15:51:19 CEST] <JEEB> it just doesn't make any fucking sense for your fucking encoding module
[15:51:38 CEST] <JEEB> to be something more than the receiving end of push_rendered_frame(EmulatedFrame)
[15:51:52 CEST] <JEEB> where EmulatedFrame would have the actual image data of course, but also the fucking timestamp
[15:52:11 CEST] <JEEB> because that way you leave the actual timestamp handling etc to the stuff that comes fucking before you
[15:52:18 CEST] <JEEB> you can, yes, just fucking hard-code things
[15:52:30 CEST] <JEEB> but that's fucking last resort when no-one in the project wants to work with you and you need to ship something
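
A sketch of the division of responsibilities JEEB is describing; EmulatedFrame and push_rendered_frame are hypothetical names from this discussion, not real Dolphin or FFmpeg symbols:

    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavutil/mathematics.h>
    }
    #include <cstdint>

    // Hypothetical interface: the renderer owns timing, the capture module only encodes.
    struct EmulatedFrame {
        const uint8_t *pixels;   // raw image data from the renderer
        int64_t pts;             // presentation time, in units of time_base
        AVRational time_base;    // e.g. {1, 60}, {1, 50}, {1001, 60000}
    };

    // The capture/encode side never invents timestamps; it just forwards them.
    static void push_rendered_frame(AVCodecContext *enc, AVFrame *frame, const EmulatedFrame &in) {
        // copying/converting in.pixels into frame->data[] is omitted in this sketch
        frame->pts = av_rescale_q(in.pts, in.time_base, enc->time_base);
        avcodec_send_frame(enc, frame);
    }
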
[15:52:37 CEST] <pagios> talking to me?
[15:52:45 CEST] <JEEB> no, only the HLS ffmpeg-all page was for you
[15:53:12 CEST] <Zexaron> Maybe the issue is that I don't understand why it doesn't make sense; I would need a working example or, more easily, some kind of explanation or animation. I can of course take a break, let this thing sit for a while, do more learning on other stuff I don't know yet, and then come back to it
[15:53:41 CEST] <JEEB> why would you not like whatever comes before you handle the "when is this picture supposed to be rendered" part?!
[15:53:51 CEST] <JEEB> if it can just pass you on that information from the fucking renderer
[15:53:58 CEST] <JEEB> there are things to try if that isn't possible, yes
[15:54:11 CEST] <JEEB> but generally a fucking capture thing should not do things it isn't required to
[15:54:52 CEST] <JEEB> and if the renderer only knows its own time base (50Hz or 60Hz or 60/1.001Hz)
[15:55:00 CEST] <JEEB> then it can do the +1 calculation on that time base
[15:55:14 CEST] <JEEB> but you as the capture component should not fucking be doing +1 calculations
[15:56:12 CEST] <JEEB> if your job is to take in video frames and push them forwards then that's your job. multimedia is already hard enough so if require the frames to come with a timestamp then that's just super
[15:56:23 CEST] <JEEB> *so if you require
[15:57:03 CEST] <JEEB> I really fail to understand what the fuck is so hard about grasping such a design that you have a component that does a single. thing.
[15:57:15 CEST] <Zexaron> So you mean you did understand my idea, and instead of having FFmpeg calculate or read from a template, you would have the emulator report the timestamp (or whatever) for when that frame was supposed to be rendered; at 60 FPS it would be rendered at XYZ, but we got it at YXX. That's one way of doing it, however I'm still not sure if we're on the same page
[15:57:48 CEST] <JEEB> FFmpeg shouldn't be calculating anything. the API client handles making sure there are timestamps at all.
[15:58:19 CEST] <JEEB> but yes, the part of the emulator feeding you frames should know the presentation timestamp on the time base it's rendering things at
[15:58:23 CEST] <JEEB> so if it knows
[15:58:28 CEST] <JEEB> just pass it on together with the frame
[15:58:38 CEST] <pagios> JEEB, say a client asks for an m3u8 and gets 1.ts , 2.ts , 3.ts in that file so he starts consuming, now 3.ts is reached, how does he know how to consume 4.ts?
[15:58:46 CEST] <pagios> it is not in the playlist that he initially opened
[15:59:01 CEST] <furq> i only briefly looked at the existing code you posted for this but it was making use of some gamestate ticks value that was provided from elsewhere in the emulator
[15:59:28 CEST] <furq> it shouldn't be difficult to derive your pts values from that
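
In other words, assuming the emulator hands over that ticks counter plus its ticks-per-second rate (the names here are hypothetical), the conversion furq mentions is a single rescale:

    extern "C" {
    #include <libavutil/mathematics.h>
    }
    #include <cstdint>

    // Hypothetical: 'ticks' counts emulated time, 'ticks_per_second' is its rate.
    static int64_t pts_from_ticks(int64_t ticks, int64_t start_ticks,
                                  int ticks_per_second, AVRational enc_time_base) {
        return av_rescale_q(ticks - start_ticks,
                            AVRational{1, ticks_per_second},  // tick time base
                            enc_time_base);                   // encoder time base
    }
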
[16:00:11 CEST] <JEEB> and if there were bugs due to you calling at the later stage (during encoding), then the logic passing completed frames from the renderer to you can just do that
[16:00:21 CEST] <JEEB> it shouldn't be the capture component coming up with the timestamps
[16:00:22 CEST] <JEEB> :P
[16:01:15 CEST] <furq> if you have absolute 100% faith in your heart that the emulator will definitely give you frames with monotonically increasing timestamps for audio and video then by all means ignore the timestamps it's giving you
[16:01:24 CEST] <furq> but if you did have that faith you probably wouldn't be here asking these questions
[16:01:50 CEST] <Zexaron> JEEB: Unfortunately I don't have enough dolphin-emu rendering experience yet to know if those time things are or can be known ahead like that. If you say that's the way to go, of course it will be considered, and support would be written to make that happen if it's not too hard; I'll surely pass the idea forward to the appropriate people. I apologize, as I'm a bit stubborn and sometimes want to reinvent the wheel
[16:02:41 CEST] <Zexaron> I'm also the wrong guy to be attempting this; others just don't have time, or they would have done it already
[16:02:56 CEST] <JEEB> Zexaron: just fucking think of the component you're supposedly working on. it takes frames in, and they are supposed to be presented at time X on the time line. a "capture" component would not start coming up with the presentation time at that point, no?
[16:03:28 CEST] <JEEB> you are supposed to get the image together with a timestamp. and just work with that.
[16:03:33 CEST] <JEEB> division of responsibilities
[16:05:06 CEST] <furq> https://github.com/dolphin-emu/dolphin/blob/master/Source/Core/VideoCommon/AVIDump.cpp#L349
[16:05:18 CEST] <furq> i assume this is what you want
[16:06:50 CEST] <JEEB> yea, that exactly seems like the wrong place for that. and that should only be done if the rest of the framework cannot give you a presentation time stamp
[16:07:09 CEST] <JEEB> that might have historical reasons for that to be done in the capture component
[16:08:06 CEST] <JEEB> and yes, I just checked that VideoInterface::GetTargetRefreshRate gives you a frame rate, although it seems like /1001 time bases are not supported
[16:08:15 CEST] <JEEB> so the video interface already knows something about timing :P
[16:08:59 CEST] <JEEB> so why wouldn't it (or the renderer) be the one giving you the timestamp together with the image data
[16:09:31 CEST] <Zexaron> VideoInterface has its own separate frame rate, if you caught it the last time I mentioned it; it's sort of like the analog TV output. VPS is the frame rate of the VideoInterface; sometimes game FPS will be lower, but usually they should both be the same
[16:10:08 CEST] <Zexaron> Yes that's completely fair and correct, for a true capture VFR video
[16:10:13 CEST] <JEEB> FUCKING STOP
[16:10:19 CEST] <JEEB> this has nothing to do with that
[16:10:47 CEST] <JEEB> also get out of your head real rendering frame rates (as in, the speed of the renderer that can be higher or lower than the emulator's 1x rate)
[16:11:07 CEST] <Zexaron> that might be useful for something, I might just have it as an option ... it's not?
[16:11:56 CEST] <JEEB> what we need to know is the time base of the emulation's video, and the time stamp.
[16:12:08 CEST] <JEEB> how quickly or slowly that fucking frame got rendered in real time we don't fucking care
[16:12:48 CEST] <JEEB> also this effectively means we don't have to care about if the emulator is VFR or not (aka does the game speed change depending on the video rendering speed)
[16:13:33 CEST] <JEEB> I hope this makes it clear?
[16:13:34 CEST] <Zexaron> I'm not sure how it is with other emulators, but I hope you are taking VI (VideoInterface) into account, because it would matter in some ways
[16:13:34 CEST] <Harzilein> with, say, vice, i'd expect starting media output, setting pause, and pressing "advance frame" to call ffmpeg every frame, telling it about new data
[16:16:04 CEST] <Zexaron> I of course agree that real time doesn't matter, yes
[16:16:05 CEST] <Hello71> stop trying to jam everything into your 80s video model
[16:16:35 CEST] <JEEB> Hello71: this has nothing to do with the 1980s. just that he seems to have a tough time understanding that the stuff could just give him the info he needs in a good design
[16:17:19 CEST] <Hello71> seems to me like Zexaron is insisting to write everything in terms of his proprietary interfaces (some of this proprietary stuff is just in Zexaron's head)
[16:17:59 CEST] <JEEB> Zexaron: anyways, don't you agree that as a capture component your starting point is to just get a frame, and its timestamp? no?
[16:18:10 CEST] <JEEB> as in, when it is supposed to be presented on the presentation time line
[16:18:53 CEST] <JEEB> and the presentation time stamp in this case is when it should be shown when the capture is being played, that is
[16:20:36 CEST] <JEEB> and whether that presentation time has anything to do with the actual time line of the thing you're capturing shouldn't be *your* concern
[16:21:03 CEST] <JEEB> so for example with something that runs at <1x speed and doesn't drop frames, you would get the 1x speed presentation time stamp
[16:21:13 CEST] <JEEB> do you understand this far?
[16:23:01 CEST] <Zexaron> Yes, I get that now, of course, I just didn't think it would be that easy
[16:23:37 CEST] <Zexaron> The question is then how accurate that time stamp for 1x would be
[16:23:54 CEST] <Zexaron> If we tried to compare it to CFR, but okay, not necessary, no need to then
[16:24:18 CEST] <w1kl4s> every time i see a question about frames with ffmpeg i'm glad i listened to JEEB and used VapourSynth :P
[16:24:45 CEST] <durandal_1707> i'm gonna ban w1kl4s right now
[16:24:55 CEST] <w1kl4s> rip
[16:24:56 CEST] <JEEB> Zexaron: if the thing creating those frames is giving you the timestamp, you trust that. if that has problems, it's not an issue that the *capture* component should be fixing
[16:25:08 CEST] <JEEB> do you understand this?
[16:25:33 CEST] <Zexaron> I would have eventually gotten to that point; I was trying to make a case for CFR, only to eventually figure out there are going to be minute differences that would put audio and video out of sync. JEEB, it's only on the list of TODOs to make ffmpeg also include audio in this, so of course the way you're saying is probably the way to go
[16:26:06 CEST] <JEEB> even if it was CFR I don't think it's the *capture* component's job to make up that timestamp
[16:26:11 CEST] <JEEB> like, seriously
[16:26:41 CEST] <JEEB> whatever gives you the frame upwards in the chain gives you the timestamp and the frame data
[16:26:42 CEST] <Zexaron> I suppose; I just hope that that's improvable on their end and that some kind of limitation isn't hit, because the emulator tries to be very accurate to how the console worked. But I might be jumping ahead, so let's just forget about this unless it's an issue
[16:27:48 CEST] <JEEB> Zexaron: and as I said, if there's no better approximation than the +1 on the frame rate time base then that's it. but that's not *your* problem if what you're handling is the *capture* part
[16:28:59 CEST] <Zexaron> I guess we're trying to do a lot of things there: make it 1x speed rather than real time, and make it glitch-free and video-editor friendly ...
[16:29:50 CEST] <JEEB> well, all of this is me trying to preach what in my opinion is good software design, because IIRC you were talking about rewriting the stuff
[16:29:55 CEST] <Zexaron> Well, I don't actually know the video editor issues right now, someone else mentioned that, but AVI of course has to go, so it would be MKV
[16:30:33 CEST] <JEEB> if you just want to do bug fixes then your shit's your shit and I have no idea what your issues are and if they are even on your side of the code
[16:30:52 CEST] <JEEB> and to be honest, it's usually a full-time support thing to figure those things out and I have enough projects I support :P
[16:31:21 CEST] <Zexaron> It was more of a "I'll give it a try"; it may take me several weeks because I'll probably take large breaks in between
[16:32:13 CEST] <Zexaron> No, none of the GitHub stuff is my code, that's still master; I haven't written much of my own code yet
[16:33:01 CEST] <Zexaron> I tried figuring it out before I go write some code in a rabbit hole for nothing
[16:33:20 CEST] <JEEB> also to be honest I would first just actually look at the issues at hand rather than grabbing at things randomly. knowing the problem space is good.
[16:33:43 CEST] <JEEB> also for the record, if your output is supposed to be supported by editors as-is then Matroska is not going to fly :P
[16:34:23 CEST] <Zexaron> Indeed, that was the next thing; now Matroska isn't going to fly ... so many factors
[16:34:44 CEST] <JEEB> just fucking look at your fucking problems first with the current solution and actually *technically* figure them out
[16:34:47 CEST] <Zexaron> That's why I speculated about MPEG-TS earlier
[16:34:54 CEST] <JEEB> not fix them, but understand what sort of issues those are
[16:38:10 CEST] <Zexaron> I should first go and fully figure out how the current thing works in detail, then; I of course expected to eventually have to do that
[16:40:42 CEST] <JEEB> a lot of it just seems like general libavformat/libavcodec usage and the doxygen as well as examples under docs in the FFmpeg code tree should generally help with understanding how things hopefully should be used that are there
[16:41:29 CEST] <JEEB> of course the module also seems to attempt to set a constant frame rate timestamp there which IMHO is not a capture component's job, but leaving that aside it would make sense for you to actually look into what are the issues people are having with this crap
[16:41:44 CEST] <JEEB> that way you then know if you need a full rewrite or for some issues you can just do bug fixes :P
[16:51:21 CEST] <Zexaron> The issues are all over the place: it puts too much load on the GPU, as it's doing some whatever-with-frames (FFmpeg obviously isn't actually encoding on the GPU), and generally it takes more resources than it should compared to running FFmpeg independently
[16:51:32 CEST] <Zexaron> I might not be the one fixing all of that, of course
[17:25:20 CEST] <kepstin> the extra load on the gpu would be downloading the image from vram into system ram. unavoidable with any software encoder
[17:25:58 CEST] <kepstin> theoretically avoidable when using on-gpu hardware encoders, depending on vendor and api. Doing that might require tighter integration with the renderer
[17:31:36 CEST] <Zexaron> Yeah, there's quite a lot of back-and-forth in Dolphin, including emulated RAM which resides in RAM or VRAM ... not sure how much the "GPU Texture Decoding" option counts in this case
[18:53:36 CEST] <Zexaron> I was also thinking of a raw I-frame-only output in ffmpeg; that should be straightforward to do in the API, right? I wasn't thinking about quality here, because this would be helpful for debugging if it has a lot less impact on performance, but it's also a good way to store each frame in its own file if one wanted to do that for some reason
[18:54:09 CEST] <Zexaron> and the file size isn't that much of a problem in a debugging case, as it's a temporary allocation
[19:11:43 CEST] <kepstin> Zexaron: doing that wouldn't make any difference in terms of actually writing the frame dumper, it's just a single configuration option that you set on the encoder.
[19:12:06 CEST] <kepstin> depending on the encoder, I-frame-only may be slower since it has to output at higher bitrate.
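
For reference, a sketch of the "single configuration option" kepstin mentions, on a libavcodec encoder context (not Dolphin's actual code); with common encoders such as libx264, a GOP size of 1 makes every frame a keyframe:

    extern "C" {
    #include <libavcodec/avcodec.h>
    }

    // Configure I-frame-only output; set these before avcodec_open2().
    static void make_intra_only(AVCodecContext *enc) {
        enc->gop_size = 1;      // keyframe every frame
        enc->max_b_frames = 0;  // and no B-frames
    }
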
[19:17:09 CEST] <Zexaron> Really? But it's an SSD, and it's not like it would take all the speed of a HDD imo
[19:51:30 CEST] <kepstin> no, in terms of cpu usage
[22:03:29 CEST] <poutine> I'm experiencing a bug in software when PTS rolls over in a HTTP Live Stream where I have not captured the segments the rollover occurs in. Any ideas how I could generate that or get an example mpeg ts file that contains a PTS rollover?
[22:04:21 CEST] <poutine> afaik, PTS is stored as a 33 bit signed integer, and most rollovers implement it that it just continues to increment as if it were an unsigned integer
[22:04:51 CEST] <JEEB> libavformat unless you disable wrap-around handling should at the very least handle it once (although I would hope for longer)
[22:05:58 CEST] <JEEB> HLS might or might not have to signal it separately to base MPEG-TS though
[22:05:59 CEST] <JEEB> not sure
[22:06:00 CEST] <poutine> JEEB, my bug is actually in converting embedded 608 captions to WebVTT in near real time with the timestamping; I really just need an example MPEG-TS file with a rollover, handled the way most encoders would deal with it (Wowza, Akamai, etc)
[22:06:16 CEST] <poutine> I know there's the setpts filter but that doesn't seem to modify the start_pts of the file
[22:06:57 CEST] <poutine> you'd think there'd be example files of weird conditions like this for OSS developers, but I'm having trouble finding it
[22:07:05 CEST] <poutine> think I might just have to capture ts chunks when it occurs
[22:07:47 CEST] <JEEB> I'm pretty sure there are, although for me libavformat itself handles this :P
[22:07:47 CEST] <poutine> afaik I'd just need to start pts ~8589934592
[22:08:08 CEST] <JEEB> yes, you can create one yourself with libavformat if you just start far enough and mux enough packets
[22:08:29 CEST] <poutine> Ok I will check out that angle, thanks
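
One hedged way to fabricate such a sample with the CLI (an untested sketch; input.ts is a placeholder): the 33-bit PTS counter on the 90 kHz clock wraps at 2^33 ticks, roughly 95443.7 seconds, so offsetting the output timestamps to just below that point should make any clip longer than about 44 seconds wrap partway through when remuxed to MPEG-TS:

    ffmpeg -i input.ts -c copy -output_ts_offset 95400 -f mpegts wrap_test.ts
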
[22:08:58 CEST] <JEEB> also is your input or output HLS?
[22:09:03 CEST] <JEEB> which is giving you the problems :P
[22:09:16 CEST] <poutine> input is HLS, output is WebVTT
[22:09:32 CEST] <JEEB> yea, then the HLS "demuxer" might not be marked for the discontinuity handling
[22:10:48 CEST] <JEEB> yea, I have a feeling it isn't
[22:11:15 CEST] <JEEB> yea, it would have to have AVFMT_TS_DISCONT
[22:11:22 CEST] <JEEB> in the flags
[22:12:06 CEST] <poutine> so I have not seen any discontinuities in these HLS streams, so I think they just let it overflow into the signed bit, and continue until it wraps to 0, so 0-8589934592 -> -4294967296 -> 0
[22:12:18 CEST] <poutine> w/ 2 rollovers
[22:12:44 CEST] <poutine> I will just capture some segments, I think that's the easiest path, thanks for the pointer to libavformat, but think this might be the easiest route
[22:12:54 CEST] <JEEB> the MPEG-TS thing is 33 bits unsigned I think
[22:13:07 CEST] <JEEB> so there's no negative part to fall into
[22:13:19 CEST] <JEEB> it just goes back to around zero
[22:13:29 CEST] <poutine> http://libav-users.943685.n4.nabble.com/How-to-handle-33-bits-rollover-in-MPEG-td944277.html <- kind of touches on it
[22:14:27 CEST] <JEEB> in 2012 libavformat added basic support for wrap-around handling
[22:14:34 CEST] <JEEB> and since then I think it's been improved a few times
[22:14:49 CEST] <JEEB> at least the samples with singular wrap-arounds have passed through fine in my case
[22:14:55 CEST] <JEEB> as in, libavformat handles them
[22:15:04 CEST] <JEEB> I get constantly rising timestamps
[22:15:21 CEST] <JEEB> some people say that the wrap-around handling in lavf only works once, but I'm not sure of that
[22:15:24 CEST] <poutine> I'm not actually using ffmpeg/libavformat for parsing NAL units and pulling the captions
[22:15:28 CEST] <poutine> just trying to generate a test case with it
[22:15:33 CEST] <JEEB> right
[22:16:00 CEST] <JEEB> I think for wrap-around handling MPEG-TS specifically I would probably take a look at upipe :P
[22:16:25 CEST] <JEEB> https://github.com/cmassiot/upipe/blob/master/lib/upipe-ts/upipe_ts_demux.c#L655.L698
[22:16:30 CEST] <JEEB> this function specifically
[22:16:45 CEST] <poutine> Thanks for that pointer that's very useful
[22:24:16 CEST] <Zexaron> Side research ... would it be possible to pack those timecodes into a file and have ffmpeg use it later?
[22:25:13 CEST] <Zexaron> I struck on one other idea: framedumping and running ffmpeg while trying to play the game is in itself the biggest issue that lowers the frame rate
[22:25:14 CEST] <furq> no but you can have other muxing tools use it later
[22:25:22 CEST] <furq> mkvmerge and l-smash both support timecode files
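
For reference, a minimal sketch of that workflow (filenames are placeholders): a v2 timecode file starts with the line "# timecode format v2" followed by one presentation time in milliseconds per frame, and mkvmerge applies it at mux time (the option is --timestamps in current versions, --timecodes in older ones):

    mkvmerge -o out.mkv --timestamps 0:timestamps.txt video.h264
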
[22:25:32 CEST] <JEEB> you are using the APIs
[22:25:40 CEST] <JEEB> how does not using timestamps there help you in any way or form?
[22:25:48 CEST] <JEEB> anyways, sorry for commenting
[22:25:49 CEST] <furq> i didn't want to ask that
[22:27:05 CEST] <Zexaron> Well, I kinda struck on the idea while I was making an image-sequence video with Vegas Pro earlier, at 60 FPS. I played while dumping to PNGs; PNG dumping is slow, only 2-8 FPS, and the resulting video is kinda weird: it seems like 60 FPS in-game time but it's sped up considerably
[22:28:00 CEST] <Zexaron> The emulator itself may not actually be trying to output a frame when it has to, that's another thing I don't know; I only know it doesn't skip frames, for whatever that counts
[22:28:17 CEST] <furq> you don't know that the game itself is running at 60fps though
[22:29:37 CEST] <Zexaron> Well, it's supposed to, it's programmed that way, normal speed at 60. Currently only PNG image dumping is supported, that's why it was so slow; I tried to see if I could make it output RAW and do a second step, at least for this experiment https://github.com/dolphin-emu/dolphin/blob/master/Source/Core/VideoCommon/ImageWrite.cpp
[22:30:02 CEST] <iive> you might want to capture to huffyuv, it's fast, lossless, and produces big files
[22:31:17 CEST] <Zexaron> there was another option there, yeah
[22:32:08 CEST] <Zexaron> I'll see if some of the lossless video options have a significant benefit; I might not need to do the image thingy I was playing with
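
A couple of hedged examples of the lossless route being discussed (the input and filenames are placeholders; the trade-off is mostly encoding speed versus file size):

    ffmpeg -i <capture-input> -c:v huffyuv -c:a pcm_s16le fast_but_huge.mkv
    ffmpeg -i <capture-input> -c:v ffv1 -level 3 -c:a flac slower_but_smaller.mkv
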
[22:32:38 CEST] <Zexaron> Someone said before that I-frame only would be even harder on the CPU, true?
[22:34:40 CEST] <Zexaron> JEEB: The idea is based on what you said before; it would just output timecodes to a file. One way is a big ZIP with the image frame files and a timecode for each, and it would all be processed later when gameplay stops, inside Dolphin, not as an external step
[22:35:21 CEST] <Zexaron> Now if there's a lossless video codec/format that would be faster than that, sure
[22:35:29 CEST] <iive> sorry, i thought you are using ffmpeg...
[22:35:46 CEST] <durandal_1707> Zexaron: JEEB left, he could not consume too much crap
[22:37:11 CEST] <Zexaron> The whole reason ffmpeg runs at the same time the emulator is dumping was file size issues, but with Dolphin people telling me that this is a niche feature and most average users use OBS for streaming, there is now more possibility to produce really good-looking, accurate videos
[22:37:32 CEST] <Zexaron> And for the small subset of users who are technical, the space is a non-issue
[22:39:01 CEST] <Zexaron> durandal_1707: If he had heard me out to the end ... he assumed wrong, but that's understandable for today, oh well
[22:39:30 CEST] <Zexaron> This idea doesn't go against his, but whatever
[22:58:02 CEST] <Hello71> you don't *have* an idea
[23:07:25 CEST] <kepstin> the way to get good looking accurate videos is 1. to get the emulator to push frames to the encoder as they're rendered, and also say what the time each frame corresponds to is, and 2. use ffmpeg with a configurable codec
[23:27:46 CEST] <angular_mike_2> I'm trying to simulate someone skipping through the video by tapping right arrow key. How can I drop N frames after every M frames? Or is there a way to programmatically modulate frame rate, the way you can apply jitter to effects in some video editors?
[23:33:15 CEST] <ChocolateArmpits> angular_mike_2, did you look into "select" filter?
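
A concrete form of ChocolateArmpits' suggestion, with hypothetical values of M=30 kept frames and N=120 dropped frames: select keeps a frame when mod(n, M+N) < M, and setpts regenerates contiguous timestamps so the output doesn't freeze during the gaps (audio is dropped here since it would no longer line up):

    ffmpeg -i in.mp4 -vf "select='lt(mod(n\,150)\,30)',setpts=N/FRAME_RATE/TB" -an out.mp4
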
[23:58:07 CEST] <Zexaron> kepstin: alright, so that would just automatically produce a CFR video?
[23:58:26 CEST] <Zexaron> in this case the emulator should be pushing out frames at 60 FPS
[00:00:00 CEST] --- Sat Oct 20 2018


