[Ffmpeg-devel-irc] ffmpeg.log.20191105
burek
burek at teamnet.rs
Wed Nov 6 03:05:01 EET 2019
[00:09:46 CET] <kingsley> c_14: I'm happy to report you saved me precious time, and I found the upstream bug, and seem to have fixed it. Thanks again.
[00:37:44 CET] <pbox> yeah weird, I use the atomicwrites library to write the file and I still get the same error.
[00:57:14 CET] <kepstin> pbox: are you writing an empty file (no text) at some point?
[00:57:28 CET] <kepstin> pbox: if so, you could try putting a space or something in the file instead
[01:39:40 CET] <edenist> heyo everyone
[01:50:02 CET] <void09> trying to stitch together losslessly 2 pieces of a tv capture (they overlap at some point, where i want to stitch them)
[01:50:31 CET] <void09> being the same transmission (.ts) i assume they have keyframes in the same spots
[01:51:40 CET] <void09> I have managed to cut them with ffmpeg, but the last frame of the first video and the first frame of the second video are the same. this is kind of confusing, i was expecting them to be one frame apart, no ?
[01:52:18 CET] <void09> as ffmpeg cuts to the closest keyframe. maybe it also includes a keyframe as the last frame?
[02:25:33 CET] <montana> i have a question
[02:26:12 CET] <montana> if i have 1280x720 source video and i am cropping 10 pixels on all 4 sides, how is it possible i can still create 1280x720 video at the end
[02:26:33 CET] <AmyMalik> you cannot
[02:26:36 CET] <montana> shouldn't i be able to only create 1260x700
[02:26:49 CET] <AmyMalik> you then upscale to 1280x720
[02:26:58 CET] <AmyMalik> also, you should be cropping a ratio appropriate amount of pixels
[02:27:24 CET] <montana> amymalik i could be cropping because of black bars
[02:27:37 CET] <AmyMalik> what are you trying to crop out
[02:27:41 CET] <AmyMalik> what's the actual source video
[02:27:46 CET] <montana> black bars
[02:27:56 CET] <montana> 640x480
[02:28:50 CET] <AmyMalik> what dimension are the black bars in
[02:28:59 CET] <AmyMalik> are you trying to create a 640x360 video
[02:29:06 CET] <montana> no 4:3 video
[02:29:19 CET] <AmyMalik> so the video itself is
[02:29:31 CET] <AmyMalik> once you have cropped the black bars what is the resolution you're upscaling from
[02:29:36 CET] <montana> let's say i have to crop 2 pixel on all 4 side
[02:29:57 CET] <AmyMalik> no, let's say you have to crop 4 pixels of height and 4 pixels of width
[02:30:13 CET] <montana> okay
[02:30:15 CET] <AmyMalik> per-side terminology is more difficult to mathematic
[02:30:23 CET] <montana> what should my final resolution be
[02:30:29 CET] <montana> if i have to you have to crop 4 pixels of height and 4 pixels of width
[02:30:41 CET] <AmyMalik> nice typo
[02:30:51 CET] <AmyMalik> 640x480 minus 4 on all sides is 636x476
[02:30:52 CET] <montana> if i have to crop 4 pixels of height and 4 pixels of width
[02:31:16 CET] <AmyMalik> which is a 1.34 aspect ratio
[02:31:42 CET] <montana> right that should be the right resolution but i can still choose 640x480
[02:31:52 CET] <AmyMalik> yes, because ffmpeg can upscale
[02:31:54 CET] <montana> i shouldn't be able to choose 640x480
[02:31:58 CET] <AmyMalik> yes you should
[02:32:07 CET] <AmyMalik> you can choose any output rez, as long as you know it is upscaling
[02:32:10 CET] <montana> upscale should be never allowed
[02:32:17 CET] <AmyMalik> yes it should, are you stupid?
[02:32:33 CET] <AmyMalik> but the target rez is 636x477 for this task to preserve the aspect ratio
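The arithmetic in this exchange can be double-checked with a short Python sketch (the `crop` helper is illustrative, not ffmpeg's API):

```python
# Crop arithmetic from the discussion above: 4 px of total width and height.
def crop(width, height, dx, dy):
    """Remove dx pixels of total width and dy pixels of total height."""
    return width - dx, height - dy

w, h = crop(640, 480, 4, 4)
print(w, h)              # 636 476
print(round(w / h, 2))   # 1.34
# Keeping an exact 4:3 frame at width 636 needs height 636 * 3 / 4 = 477.
print(636 * 3 // 4)      # 477
```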
[02:32:45 CET] <montana> you mean i can even upscale 640x480 to even 720x540
[02:33:08 CET] <AmyMalik> yes
[02:33:11 CET] <AmyMalik> why the hell would you want to
[02:33:13 CET] <AmyMalik> but yes
[02:33:35 CET] <montana> what is the point of upscaling
[02:33:41 CET] <montana> it would look like crap
[02:34:05 CET] <AmyMalik> the point of upscaling is to display the video on a larger screen than the video itself
[02:34:21 CET] <AmyMalik> if you have a HDTV but you plug a VHS machine into it, you have to do upscaling, or you can only use like 1/5 of the screen
[02:34:23 CET] <montana> i can do that during playback
[02:34:39 CET] <montana> i don't need to do that in encoding/transcoding
[02:34:53 CET] <AmyMalik> right, evaluate the video without and with transcoding level upscaling
[02:35:01 CET] <AmyMalik> if you're uploading to a platform you may need to upscale
[02:35:51 CET] <montana> and a lot of dvds show up as 720x480 but 4:3 aspect ratio
[02:36:22 CET] <montana> why is it 720x480 and not 640x480
[02:36:25 CET] <AmyMalik> yes, that's called anamorph
[02:36:52 CET] <montana> is anamorph downscaling?
[02:36:59 CET] <AmyMalik> both
[02:37:02 CET] <AmyMalik> it's also nonlinear
[02:37:04 CET] <AmyMalik> the 720x480 is scaled up or down (to 720x540, or to 640x480) to get the 4:3
[02:37:17 CET] <AmyMalik> like it's linear in the sense that it doesn't bend the image
[02:37:17 CET] <montana> i see
[02:37:42 CET] <montana> so how does it determine to use 720x540 or 640x480 during anamorphing
[02:37:54 CET] <AmyMalik> by what size display you have
[02:37:59 CET] <AmyMalik> if you have a 480 line display, it uses 640
[02:38:00 CET] <montana> 1920x1080
[02:38:05 CET] <AmyMalik> then it uses 1440x1080
[02:38:30 CET] <montana> i am talking about viewing on computer
[02:38:40 CET] <AmyMalik> then it uses 1440x1080 on a 1080p/i screen
[02:38:52 CET] <AmyMalik> in a 640x480 box it will use 640x480
[02:38:57 CET] <AmyMalik> in a 720x540 box it will use that
[02:39:26 CET] <montana> the 720x480 is scaled up or down (to 720x540, or to 640x480) to get the 4:3 : which one is it on my 1920x1080 computer screen
[02:39:32 CET] <AmyMalik> 1440x1080.
[02:39:40 CET] <montana> no it doesn't
[02:39:45 CET] <AmyMalik> yes it does
[02:39:48 CET] <AmyMalik> it goes directly to 1440x1080
[02:39:57 CET] <AmyMalik> an anamorph video will always scale to fit the smaller dimension of the display multiplied by the aspect ratio
[02:40:14 CET] <montana> i am not talking about "full screen"
[02:40:26 CET] <AmyMalik> in a 720x480 box it will scale to 640x480
[02:40:36 CET] <AmyMalik> in a 720x540 box it will scale to that
[02:40:47 CET] <AmyMalik> in a 1280x720 box it will scale to 960x720
[02:41:04 CET] <montana> i don't know what kind of video player you use
[02:41:09 CET] <montana> on your computer
[02:41:16 CET] <montana> i am not talking about "full screen"
[02:41:28 CET] <AmyMalik> you are being obstinate deliberately
[02:41:34 CET] <AmyMalik> it doesn't matter if you are talking about full screen or not
[02:41:45 CET] <AmyMalik> if the video display box is 1920x1080, then the video will scale to 1440x1080
[02:42:10 CET] <montana> 640x480 regular video will always show up as 640x480
[02:42:57 CET] <AmyMalik> if the video display box (as is typical on a non full screen 1080 display) is, say, 900 px tall, it will scale to 1200x900
[02:43:39 CET] <AmyMalik> if the video display box is 540 lines high with unlimited width it will show 720, and if it's 480 lines high with unlimited width it will show 640
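The fit-to-box behaviour AmyMalik describes can be sketched in Python; the `fit` helper is a hypothetical illustration of how a player scales an anamorphic 4:3 video, not code from any particular player:

```python
# Largest rectangle with display aspect ratio dar_w:dar_h inside a box.
def fit(dar_w, dar_h, box_w, box_h):
    if box_w * dar_h >= box_h * dar_w:   # box is at least as wide as the DAR
        return box_h * dar_w // dar_h, box_h
    return box_w, box_w * dar_h // dar_w

print(fit(4, 3, 1920, 1080))  # (1440, 1080) on a full 1080p screen
print(fit(4, 3, 1200, 900))   # (1200, 900) in a 900-px-tall window
print(fit(4, 3, 720, 480))    # (640, 480) in a 720x480 box
```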
[02:46:24 CET] <montana> okay if you have 720x480 anamorph source and let's say you have to crop 4 pixels of height and 4 pixels of width , would you use 640x480 with upscaling? or would you use 636x476
[02:47:41 CET] <AmyMalik> if you have a 720x480 source, you do not have a 640x480 source, you have a 720x480 source
[02:47:54 CET] <AmyMalik> are there 4 display pixels you are cropping, or 4 source pixels
[02:48:17 CET] <AmyMalik> because 4 display pixels are 4.5 (~=5) source pixels
[02:49:03 CET] <montana> then what would you do as final resolution?
[02:50:22 CET] <AmyMalik> i would upscale
[02:50:26 CET] <AmyMalik> to 720x480
[02:50:38 CET] <AmyMalik> i don't have time to fuss
[02:51:06 CET] <montana> but 720x480 is not right aspect ratio
[02:51:34 CET] <montana> why not use 640x480 with upscaling then?
[02:52:00 CET] <AmyMalik> because that's both up and downscaling
[02:52:56 CET] <montana> i don't understand your reason
[02:53:37 CET] <AmyMalik> i don't understand why you aren't doing what works for you and saying fuck you to the people here
[02:54:13 CET] <montana> lol because i respect people's opinion
[02:54:42 CET] <AmyMalik> honestly
[02:54:51 CET] <AmyMalik> i would just upscale all the way to 720x540 in transcoding
[02:55:37 CET] <montana> i see
[02:55:58 CET] <montana> not all programs allow that
[02:56:19 CET] <AmyMalik> those programs are thick as a brick
[02:56:21 CET] <AmyMalik> do not use them.
[03:11:48 CET] <edenist> hey, is anyone around who is familiar with libx264, and in particular tuning -crf values?
[03:12:46 CET] <DHE> what about it?
[03:13:11 CET] <kepstin> it's pretty simple - you look at the video, if it's not good enough, you make the crf value smaller
[03:13:20 CET] <kepstin> if it's too big, you make the crf value bigger
[03:13:25 CET] <void09> well, found a better way to join the two overlapping .ts files. https://serverfault.com/questions/350546/join-large-overlapping-files
[03:13:49 CET] <DHE> when in doubt, a good starting value is 20 for good quality and maybe 27 if space matters
[03:14:04 CET] <void09> oh nvm it doesn't work :(
[03:14:08 CET] <DHE> but everyone has their opinions
[03:14:28 CET] <edenist> yeah I get what it does, it's more about some inconsistencies in latency with various values
[03:14:42 CET] <DHE> latency?
[03:15:46 CET] <edenist> I'm using x11grab to do streaming of my main desktop machine to another system, along with -tune zerolatency -preset ultrafast
[03:16:27 CET] <DHE> oh dear...
[03:16:42 CET] <edenist> if I use -crf 18, I get maybe 20ms of display latency. But when I go up to, say, -crf 23, it goes up to around 80-100ms
[03:17:19 CET] <edenist> so I'm unsure why decreasing the quality has such a big impact on the latency
[03:19:00 CET] <montana> edenist why does 20ms latency matter so much to you, may i ask
[03:19:05 CET] <edenist> I've tried tuning -maxrate, -bufsize and -g as well, and I've got quite low latency on -crf 18....
[03:19:08 CET] <edenist> 20ms is fine!
[03:19:24 CET] <montana> edenist why does 80ms latency matter so much to you, may i ask
[03:19:38 CET] <edenist> but I don't want to use -crf 18, I'd like to decrease the quality down to say crf 23 or more. but it makes the latency much worse
[03:19:54 CET] <edenist> because high input latency makes the system difficult to use in real time
[03:20:04 CET] <montana> i don't understand
[03:20:15 CET] <montana> edenist do you use obs-studio?
[03:20:32 CET] <edenist> I'm basically implementing what steam does with its remote play. I want to use my main desktop machine from my laptop
[03:20:42 CET] <edenist> so I'm sending inputs from the laptop, and receiving a stream of the desktop
[03:20:59 CET] <edenist> particularly if I want to do something like gaming.... low latency is critical
[03:21:15 CET] <montana> no it's not
[03:21:17 CET] <edenist> montana: no I don't use obs-studio
[03:21:22 CET] <edenist> and yes... it is
[03:21:38 CET] <montana> then what is better than obs-studio
[03:22:47 CET] <edenist> I'm using ffmpeg from the command line. obs-studio is for people streaming their stuff to remote viewers. latency doesn't matter there.
[03:23:58 CET] <edenist> but anyway. it's all much of a muchness. I'm having success with ffmpeg, but I'm having some behaviour I don't understand. Hence I'm wondering if anyone in the [shock!] ffmpeg channel might have some insight as to why ;-)
[03:24:06 CET] <montana> edenist no , i use obs-studio for just recording
[03:24:20 CET] <edenist> ok, cool.
[03:24:37 CET] <edenist> I'm not recording
[03:24:54 CET] <montana> then what are you doing if you are not recording nor streaming?
[03:25:20 CET] <edenist> do you know what remote play is on steam? or the same on ps4?
[03:25:34 CET] <montana> no
[03:25:36 CET] <edenist> ok
[03:25:38 CET] <edenist> well
[03:26:35 CET] <edenist> my desktop machine is my main gaming machine. I also have a lower powered laptop. I can log into steam on both machines, and stream the game from my desktop to my laptop, with all inputs coming from the laptop itself.
[03:26:53 CET] <edenist> you can play games remotely [or, just use the desktop itself]
[03:27:46 CET] <montana> why not just play in your main gaming machine
[03:27:46 CET] <edenist> it's basically remote desktop but with streaming video instead of a protocol like RDP or vnc
[03:27:57 CET] <edenist> because that's sometimes not practical
[03:28:03 CET] <void09> edenist: main gaming pc is linux ?
[03:28:06 CET] <edenist> yes
[03:28:22 CET] <void09> nice, i used parsec for streaming from windows
[03:28:53 CET] <void09> wondering if nobody did an ffmpeg based solution for this as it seems a very common use case
[03:28:59 CET] <edenist> void09: oh ok. Yeah it's going well.
[03:29:20 CET] <edenist> and yeah, I've done quite a bit of analysis on valve's solution, and I'm almost certain its using ffmpeg in some form
[03:29:37 CET] <void09> parsec is closed source and does not support streaming from linux, just to linux
[03:29:41 CET] <edenist> I'm just trying to reverse engineer their parameters, heh
[03:30:12 CET] <void09> having such a thing and using software encoding would be nice. parsec uses the gpu's x264 encoder which results in crap quality
[03:30:22 CET] <edenist> yeah I've got a few friends that use that. I think it has better HW encoding on nvidia than nvidia's own solution?
[03:30:26 CET] <void09> but very low latency
[03:30:52 CET] <edenist> I mean, like I said, I've got good quality and low latency with -crf 18
[03:31:05 CET] <edenist> but for some reason when I change it to -crf 23 the latency quadruples
[03:31:09 CET] <void09> don't know nvidia's own solution. i did use it on nvidia though, 1080ti. and the scene change quality is bad
[03:31:10 CET] <edenist> and that's where my confusion lies
[03:31:27 CET] <void09> i mean, takes a bit for it to catch up to scene changes before it becomes clear
[03:31:45 CET] <void09> what about in between values?
[03:32:35 CET] <edenist> it's pretty similar. I can't remember exact values
[03:33:00 CET] <void09> so reducing quality = more lag ?
[03:33:07 CET] <edenist> correct!
[03:33:20 CET] <void09> tried x265 ?
[03:33:28 CET] <edenist> lag in the actual literal use of the word ;-)
[03:33:34 CET] <edenist> no I haven't
[03:33:48 CET] <edenist> I'm trying to stick to x264 because I've got hardware decode support on my client
[03:34:01 CET] <edenist> which again minimises latency ;-)
[03:34:39 CET] <void09> you are pretty smart for a gamer ;)
[03:35:10 CET] <edenist> I'm unsure how to feel about that comment... haha ;-)
[03:35:25 CET] <void09> feels like these days most gamers are mindless zombies
[03:35:45 CET] <void09> "gaaaames" (braainz)
[03:35:56 CET] <edenist> maybe you get a lot of kids wanting to know how to become youtube stars with their 1337 fortnite skillz?
[03:36:32 CET] <void09> even ignoring those, and the minecraft/gta v kids, still looks bad :d
[03:37:16 CET] <edenist> I mean.... I find the technical challenges of obscure setups to be more fun than the gaming itself
[03:37:36 CET] <void09> I just remembered i tried to do what you did myself
[03:37:45 CET] <edenist> otherwise, yeah, just go for lowest hanging fruit. install windows and lots of LED lighting...
[03:38:17 CET] <void09> was pretty surprised input forwarding just worked.. after i googled and tried a solution
[03:38:28 CET] <void09> what are you using for input ?
[03:38:42 CET] <edenist> I'm not using x11 for input. I'm just using synergy.
[03:38:48 CET] <edenist> super low latency
[03:39:06 CET] <void09> oh payware
[03:40:02 CET] <edenist> well, you can pay. but its mostly GPLd. you don't need a license if you don't want it
[03:40:51 CET] <edenist> I did pay though, because free software devs still need to pay the bills ;-)
[03:41:11 CET] <void09> oh did not look like a gpl thing from their webpage
[03:41:48 CET] <void09> but of course you can't make it too obvious it's free if you want people to pay
[03:42:02 CET] <edenist> well, they are a company and the developers work on it full time, so they probably don't want to advertise [hey, come get this for free over here!]
[03:42:10 CET] <edenist> yup ;-)
[03:43:22 CET] <edenist> hmmmm, ok, so it seems I'm out in the fringes of applications of ffmpeg then ;-) I'll keep plugging at it then and maybe write up my work and what parameters are best once I've finished tuning
[11:17:13 CET] <th3_v0ice> Hi guys. I was wondering why would CUDA encoder output different PTS and DTS of a packet from the software encoder (CPU)? I have exactly the same code and they produce different values for encoded packets.
[11:31:19 CET] <BeerLover> I am using a static build (https://johnvansickle.com/ffmpeg/releases/ffmpeg-release-amd64-static.tar.xz). When I execute it, I get an error -> /opt/ffmpeg: /opt/ffmpeg: cannot execute binary file
[11:52:23 CET] <pbox> kepstin: thanks a lot for the answer. the file is never empty, as what is written is actually a string + a new line
[11:53:07 CET] <pbox> so at least the new line should be there even if the input is an empty string
[12:01:05 CET] <BeerLover> nevermind
[12:01:07 CET] <BeerLover> figured out
[18:25:19 CET] <BLZbubba> hi guys, I'm trying to convert a ts to mkv, but it is complaining about either missing streams or missing timestamps depending on the options I've tried - what is the best way to tell ffmpeg what the input file format is? someone gave me a good one here a few weeks ago but I lost it
[18:56:00 CET] <GenTooMan> Good question. You tried ffprobe on the file first?
[18:58:30 CET] <BLZbubba> yes, ffprobe seems happy with it
[19:01:20 CET] <GenTooMan> TS is just short for Transport Stream. That means it could be anything. ffprobe just checks what it is. To make MKV/MP4 files one needs a set of indexes for time codes. The transport stream may not have the time codes hence "no time stamps". If you are adding a subtitle make sure the subtitle has time stamps also. A raw text file shan't work well.
[19:05:47 CET] <kepstin> hmm? ts almost always refers to mpeg-ts, which is a specific container format. But it has some trickiness, because it doesn't have a header to list streams at the start, and it can contain multiple concatenated streams with timestamps restarting each time.
[19:06:33 CET] <kepstin> ffmpeg tries to find streams in mpegts by probing the file up to either a byte limit or time limit - but it's possible that in some mpeg-ts inputs this won't find all the media streams.
[19:08:27 CET] <BLZbubba> right, is there a way to tell it to look ahead further for the stream info?
[19:09:51 CET] <kepstin> yes, -probesize and -analyzeduration. see https://www.ffmpeg.org/ffmpeg-formats.html#Format-Options
[19:46:02 CET] <GenTooMan> now that I think about it I actually have a question. I'm converting MP4 files copying the audio stream and re-encoding the video. So I keep getting GSOM (Grey Screens of Meh) in it, the encoding pauses for a long period of time, and the player pauses at the same points. I get a grey screen then delta data afterward. It acts like a key frame was deleted. Settings I used https://pastebin.com/7NRavQV4
[19:47:31 CET] <kepstin> GenTooMan: without any other info, the best guess is that your input file is corrupt.
[19:49:27 CET] <GenTooMan> kepstin not much I can add but specifically it happens in transitions. So I guess I should look at ways to find issues in the input file.
[19:49:57 CET] <kepstin> does it play back fine in mpv?
[20:01:15 CET] <GenTooMan> now that you ask I'll have to look. I am uncertain I tested it with mpv specifically
[20:13:05 CET] <feos> I have a question about using PTS with libav when writing video frames. say the footage has frames that need to repeat, and we only write unique frames to the dump. we use PTS to indicate where the frame is meant to be in time, in our case it's just the framecount, since resulting framerate is constant. is the first frame's PTS meant to be 0? the code I'm looking at (video dumper of Dolphin emulator) always assigns non-zero values to first
[20:13:05 CET] <feos> frame's PTS, and I'm suspecting that it's problematic when dealing with segmented video dump.
[20:15:48 CET] <ddubya> is there a node-based gui for ffmpeg filter graph?
[20:15:56 CET] <GenTooMan> kepstin no, mpv doesn't show that, so likely ye olde mplayer is having issues. Right, seems mplayer is not up to date in debian. mpv is likely better maintained.
[20:16:49 CET] <furq> feos: it's fine for the first pts to be non-zero
[20:18:28 CET] <feos> furq: in segmented video (like we have to split on resolution changes, then resize and splice back together) I think non-zero PTS may shift entire segment's timing, no?
[20:19:07 CET] <feos> I'm looking at the code and trying to understand if it's already shifting everything for no good reason https://github.com/dolphin-emu/dolphin/blob/master/Source/Core/VideoCommon/FrameDump.cpp#L336
[20:20:31 CET] <furq> it shouldn't make any difference
[20:20:39 CET] <furq> the concat demuxer or whatever has to rewrite the timestamps anyway
[20:20:47 CET] <feos> it's calculating delta based on last and current pts but it's actually dealing with the *previous* emulated frame, yet it uses new frame's timing for it
[20:22:29 CET] <feos> furq: my quest started from experiencing extra frames in segments that cause av desync. if I avoid segmentation altogether by forcing the same resolution, the frames are correct. so I have to understand what can be wrong with those timings when new segments are created
[20:23:17 CET] <kepstin> dunno why you're calculating deltas, if you just throw the timestamp the frame was rendered (from a monotonic clock of some sort) into the pts field with an appropriate timebase, you're done.
[20:23:19 CET] <feos> so far I noticed that it doesn't tell duration of the last frame at all. so it's always shown only once even if it's meant to repeat
[20:23:29 CET] <feos> yep
[20:23:54 CET] <furq> wasn't someone in here a while back talking about rewriting all of this code and we told him to use wallclock timestamps
[20:24:14 CET] <feos> it wasn't Fog, right?
[20:24:34 CET] <furq> i don't think so
[20:24:39 CET] <kepstin> if you have something like a emulator cycle count or whatnot, that would work fine too.
[20:24:41 CET] <furq> it was a while back though
[20:24:53 CET] <kepstin> depending what you're trying to sync with
[20:24:56 CET] <feos> there are talks about complete overhaul, but for now I just need my tiny bug fixed, since it's causing lots of problems
[20:25:36 CET] <feos> (I'm calling it mine even tho I haven't written any of that code lol)
[20:25:51 CET] <feos> dolphin people are clueless so I had to go here
[20:26:39 CET] <feos> kepstin: so if I understand it correctly, duration of the current frame is meant to be calculated based on next frame's pts. in libav that is. is it correct?
[20:27:14 CET] <kepstin> duration of the current frame is the timestamp of the next frame (or of end of file, in formats that support that) minus the current frame pts.
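kepstin's rule can be sketched with a hypothetical helper (plain Python, not libav code):

```python
# Each frame's duration is the gap to the next frame's PTS; the last frame
# needs an explicit end timestamp (or an audio track) to get a duration at all.
def durations(pts_list, end_pts=None):
    out = [b - a for a, b in zip(pts_list, pts_list[1:])]
    if end_pts is not None:
        out.append(end_pts - pts_list[-1])
    return out

# Unique frames at framecounts 0, 1, 5 (the frame at PTS 1 repeats 4 times):
print(durations([0, 1, 5]))             # [1, 4] (last frame has no duration)
print(durations([0, 1, 5], end_pts=8))  # [1, 4, 3]
```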
[20:27:59 CET] <feos> yeah. but what happens if the next frame is in another segment, yet the last frame of this segment is meant to last for a while?
[20:28:25 CET] <kepstin> then you need to either use a format that can indicate an end of file after the time of the last frame, or have an audio track
[20:28:27 CET] <feos> so that if I splice them together, they align
[20:28:49 CET] <feos> okay I don't have an audio track since it's dumper separately
[20:28:54 CET] <feos> *dumped
[20:29:04 CET] <kepstin> that's inconvenient :/
[20:29:12 CET] <feos> indeed
[20:29:43 CET] <feos> I could dump that frame several times, as much as needed, I think that'd make it "last" for how many frames it needs to
[20:30:06 CET] <feos> I also tried setting packet duration. I got even more extra frames when I spliced tho
[20:30:27 CET] <feos> so then I started suspecting wrong *starting* timings for the next segment too
[20:30:34 CET] <kepstin> you'd do that by dumping it one extra time, at the timestamp of when it should disappear
[20:30:40 CET] <kepstin> what file format are you using?
[20:30:58 CET] <feos> *cough* AVI+FFV1 *cough*
[20:31:09 CET] <kepstin> ok, so first thing to do is stop using avi
[20:31:31 CET] <feos> unfortunately that's unfeasible for now :(
[20:31:35 CET] <furq> i was going to suggest just mpegts and cat them together to preserve the timestamps
[20:31:46 CET] <furq> but i guess that won't work great with resolution changes (or ffv1)
[20:31:46 CET] <kepstin> avi doesn't really support vfr stuff, so this whole thing is just terribly broken with avi
[20:32:02 CET] <kepstin> mpegts has no issue with resolution changes :)
[20:32:05 CET] <kepstin> but players might
[20:32:31 CET] <feos> well I admit it works well for single-segment, since it's not arbitrary VFR but rather repeating frames a few times, still CFR overall
[20:33:01 CET] <feos> each output frame's duration is the same
[20:33:14 CET] <kepstin> yeah, avi is a terrible ancient format.
[20:33:22 CET] <furq> kepstin: yeah i just meant having a file with resolution changes in general is going to be a bad time
[20:33:37 CET] <feos> apparently even winapi importers like avisynth and virtualdub handle ffv1 very well
[20:34:02 CET] <furq> are people seriously still using virtualdub
[20:34:14 CET] <feos> <kepstin> you'd do that by dumping it one extra time, at the timestamp of when it should disappear -- should it have zero duration somehow?
[20:34:16 CET] <furq> not sure whether that's more of an indictment on them or the stuff that was supposed to have replaced it
[20:34:24 CET] <furq> of
[20:34:30 CET] <kepstin> feos: you can't do that with avi, i was assuming you had a real container format
[20:35:11 CET] <furq> yeah the only way to do this with avi is to write a bunch of dup frames
[20:35:24 CET] <feos> regarding virtualdub, we (tasvideos) are currently stuck at avisynth so there's no way out YET. the plan is moving over to vapursynth
[20:35:56 CET] <feos> <furq> yeah the only way to do this with avi is to write a bunch of dup frames -- I tried setting packet duration and it *seemed* to do the job?
[20:36:25 CET] <kepstin> hmm. the muxer might be smart enough to encode some NULL frames to fill the duration
[20:36:32 CET] <furq> 19:30:06 ( feos) I also tried setting packet duration. I got even more extra frames when I spliced tho
[20:36:45 CET] <furq> yeah that sounds like something is duping frames for you because it's avi
[20:36:54 CET] <Hello71> furq: there's still *active development* on virtualdub.
[20:36:59 CET] <furq> incredible
[20:37:01 CET] <Hello71> not the original, virtualdub2, but close enough
[20:37:03 CET] <feos> yeah I mean the endings looked correct now but starts probably still weren't. and I suspected dolphin's dumping code
[20:37:15 CET] <furq> have they actively developed support for more than one container format yet
[20:37:24 CET] <Hello71> and they use ffmpeg for most i/o now
[20:37:30 CET] <kepstin> the avi muxer does an unreasonably good job of pretending avi isn't as bad as it is ;)
[20:37:58 CET] <furq> there's a whole ecosystem of pretending avi isn't as bad as it is
[20:39:48 CET] <feos> what do you guys think of vapoursynth, if anything?
[20:39:53 CET] <furq> it's pretty good
[20:39:56 CET] <Hello71> the avi junk is still all there, but you can open/save modern formats reasonably easily
[20:40:01 CET] <furq> shame it's python but it's better than avs
[20:40:10 CET] <feos> avs is a nightmare
[20:40:25 CET] <furq> but the main reason it's better than avisynth is that you get multithreading that doesn't crash 100% of the time
[20:41:54 CET] <feos> from the perspective of a person who has to maintain tas-encoding-package since 2012 (VfW, avisynth, batch, scripts that change text files), having a REAL programming language for all this feels like a more important reason :)
[20:42:10 CET] <feos> with python it can actually be an all-in-one thing
[20:42:45 CET] <furq> yeah i rarely used avs for anything other than qtgmc, which definitely needs stable multithreading
[20:43:29 CET] <feos> we pipe it to x264 that does have MT
[20:44:15 CET] <furq> that doesn't really help me
[20:47:48 CET] <feos> another question. all emulators I've seen use avi for video dumping. and only a few use ffmpeg. yet they actually integrate it and use directly, with hardcoded commands. but there's another approach, involving nut and piping, that allows a given application to send frames to ffmpeg along with arbitrary commands that it supports https://github.com/clementgallet/libTAS/blob/master/src/library/encoding/AVEncoder.cpp
[20:47:48 CET] <feos> I've been advocating this approach for years, and it looks like the perfect solution to most problems emulator dumping may experience. does that make sense?
[20:48:57 CET] <kepstin> have to be careful with piping raw video, frames are big, pipe buffers are small, it's easy to make things block :)
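Some rough numbers behind that warning (the 64 KiB pipe capacity is a typical Linux default, assumed here):

```python
# A raw 8-bit 4:2:0 frame is width * height * 1.5 bytes.
PIPE_BUF_DEFAULT = 64 * 1024   # typical Linux pipe capacity, an assumption

def raw_frame_bytes(width, height, bits_per_pixel=12):
    """Bytes per raw frame; 12 bpp corresponds to 8-bit 4:2:0."""
    return width * height * bits_per_pixel // 8

frame = raw_frame_bytes(1920, 1080)
print(frame)                      # 3110400, about 3 MiB per 1080p frame
print(frame // PIPE_BUF_DEFAULT)  # 47 pipe buffers per frame
```

At 60 fps that is roughly 187 MB/s of raw data through the pipe, which is why the writer blocks unless the reader keeps up.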
[20:49:09 CET] <Hello71> pipe is not very fast compared to a fast codec.
[20:49:12 CET] <furq> not sure what you're saying is the alternative to this
[20:49:34 CET] <kepstin> so you need an encoder thread anyways, and it's honestly not much more work at that point to use ffmpeg libraries directly if it's license compatible
[20:49:39 CET] <feos> sending framebuffers to external ffmpeg
[20:49:43 CET] <Hello71> furq: tl;dr libav* vs pipe to ffmpeg
[20:49:52 CET] <feos> yeah
[20:50:06 CET] <furq> oh
[20:50:10 CET] <furq> i mean i'd probably just use the libs
[20:51:16 CET] <feos> the benefit is infinite flexibility. like it's tricky to set up full-blown libav support inside your app, while external ffmpeg can have all the libs and stream to twitch directly and whatnot
[20:56:50 CET] <kepstin> there's some major benefits to having frames in shared memory in terms of speed, and enough flexibility for most cases can be given by letting the user have some text fields for output filename/url, format, codec names, and avoptions per stream. (with some reasonable defaults or presets) worst case the user can set it up to pipe to an external tool if that's what's needed.
[21:01:56 CET] <furq> yeah piping huge amounts of data to an external tool is something i'd generally avoid in a widely-used multiplatform tool
[21:02:49 CET] <furq> unless you like getting hundreds of ffmpeg tickets on your issue tracker
[21:10:27 CET] <feos> what if we slightly compress before piping? for example with some fast lossless codec
[21:10:42 CET] <feos> "slightly"
[21:10:57 CET] <furq> well now you have your own implementation of nut and ffvhuff in your code
[21:11:04 CET] <furq> this isn't really simplifying things
[21:11:35 CET] <Hello71> furq: that's easy, i'll use ffmpeg!
[21:11:59 CET] <furq> damn that's smart
[21:12:43 CET] <feos> well honestly, having some super basic libav stuff built in and piping to a full-blown thing, why not
[21:13:28 CET] <furq> it seems weird to depend on the ffmpeg libs and also the ffmpeg cli
[21:13:33 CET] <feos> :D
[21:13:58 CET] <furq> especially on windows where you wouldn't get one of those for free
[21:14:10 CET] <feos> as an option that may not be too bad
[21:15:13 CET] <furq> well at this point you're encoding and muxing the file with the ffmpeg libs
[21:15:25 CET] <furq> is that not what you wanted to avoid
[21:16:09 CET] <feos> I wanted the flexibility, and previously I didn't know a lot about all those options
[21:17:02 CET] <feos> okay dolphin's code actually needs to calculate delta from last pts because it needs to set new segment's timings properly
[21:17:24 CET] <feos> they should act like it starts from scratch
[21:17:56 CET] <feos> I'm almost convinced repeating frames directly is the easiest solution
[21:18:18 CET] <feos> no need to worry about setting pts or anything
[00:00:00 CET] --- Wed Nov 6 2019