[Ffmpeg-devel-irc] ffmpeg.log.20190305

burek burek021 at gmail.com
Wed Mar 6 03:05:02 EET 2019


[00:00:06 CET] <furq> it was probably the mingw pthreads lib
[00:00:16 CET] <JEEB> (or if you can get a backtrace of that, go report that to whomever provided that pthreads wrapper)
[00:00:32 CET] <JEEB> yes, at this point of time in 2019 most likely the mingw-w64 pthreads wrapper
[00:00:45 CET] <GuiToris> hello, how would you get the right values for the perspective filter?
[00:00:56 CET] <GuiToris> I'd like to match two separate videos
[00:02:55 CET] <kevinnn> JEEB: would you happen to be able to link me to an example of how to manually implement this with the UDP protocol? Instead of relying on ffmpeg for receiving h264 units, decoding them and threading those processes?
[00:03:13 CET] <kevinnn> so that I don't have to depend on pthreads
[00:03:40 CET] <JEEB> I was meaning to do it within the UDP protocol within libavformat
[00:03:45 CET] <JEEB> but you can of course do it yourself as well
[00:04:03 CET] <kevinnn> I figured that might be a better approach
[00:04:14 CET] <kevinnn> but I can't find any nice examples to do it
[00:04:36 CET] <JEEB> also, no idea because I would look into stuff like that after I see it could be worth some effort :P
[00:08:15 CET] <kevinnn> so you don't think it's worth my effort?
[00:08:28 CET] <kevinnn> instead I should do something with libavformat?
[00:09:04 CET] <JEEB> no, I just mean that if I would start looking into it I would probably start my pay-o-meter or something :P (or someone would have to spring a very curious interest in things within me)
[00:09:23 CET] <JEEB> and my default solution would be to make it within libavformat so that everything using libavformat can gain from it
[00:09:57 CET] <kevinnn> hmm, I am not quite following how I could do this within libavformat
[00:10:15 CET] <JEEB> implement the separate I/O thread within libavformat/udp ?
[00:10:25 CET] <JEEB> just like the pthread variant is done right now
[00:10:47 CET] <JEEB> anyways, I don't remember if the pthreads version was just exiting normally or crashing for you
[00:11:06 CET] <JEEB> if it's crashing then you might be able to boink the interest of the party that provides the pthreads windows wrapper for you
[00:11:25 CET] <kevinnn> for some reason even after I compiled with pthreads the output still suggested that I hadn't linked it properly
[00:11:27 CET] <JEEB> you would just need to come up with a backtrace + bug report :P
[00:11:52 CET] <kevinnn> like it still had the circular buffer error
[00:12:04 CET] <JEEB> if things are not going fast enough then that will happen yes
[00:12:13 CET] <kevinnn> any simple tips to get it to compile correctly?
[00:12:23 CET] <kevinnn> oh so it may have compiled correctly still?
[00:12:34 CET] <JEEB> no idea
[00:13:09 CET] <kevinnn> so how do I get started on creating a separate I/O thread within libavformat/udp
[00:13:35 CET] <JEEB> you can already see the implementation of the pthreads version :P
[00:13:48 CET] <JEEB> it's in the UDP protocol layer in libavformat
[00:13:51 CET] <JEEB> udp.c most likely
[00:16:30 CET] <kevinnn> okay thank you, I'll take a look
[00:19:05 CET] <JEEB> also I recommend you try and find information on how Windows networking documentation recommends you handle high bandwidth UDP. just make sure it's applicable for modern windows (NT6+)
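(As an aside on the circular-buffer errors mentioned above: the udp protocol in libavformat also exposes URL options for enlarging its buffers, which can be a stopgap even without a separate receive thread. A rough sketch, with the address and sizes purely illustrative:)

    # buffer_size = socket receive buffer in bytes, fifo_size = circular buffer in 188-byte units
    ffmpeg -i "udp://0.0.0.0:1234?buffer_size=67108864&fifo_size=1000000&overrun_nonfatal=1" -c copy dump.ts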
[00:43:00 CET] <bencc> is there something like libass that is up to date?
[00:43:12 CET] <bencc> for example, I need to control text padding
[00:43:18 CET] <bencc> and auto line breaks
[00:43:38 CET] <JEEB> libass is pretty up to date?
[00:43:46 CET] <JEEB> it's relatively actively developed, even
[00:44:38 CET] <bencc> I see \fspy vertical spacing tag in the planned features of v5.0
[00:44:39 CET] <bencc> https://github.com/libass/libass/wiki/ASS-v5.0
[00:44:51 CET] <bencc> from 2017. is it possible to use it?
[00:45:15 CET] <bencc> last release is from 2017
[00:45:31 CET] <bencc> how is it up to date?
[00:46:15 CET] <furq> well afaik the spec is defined by libass so i don't see how anything else could be more up to date
[00:46:23 CET] <furq> or the same team
[00:47:01 CET] <bencc> I'm trying to put text box annotations to a video
[00:47:28 CET] <bencc> a semi transparent rect with a text in it at specific time/position
[00:49:36 CET] <bencc> maybe this: https://github.com/libass/libass/commit/a7807b03b0d1f25f448c14e88e0d8aa6ef4d0961
[00:49:41 CET] <bencc> but I'm not sure how to use it
[00:51:41 CET] <JEEB> bencc: I'd probably recommend #libass on this network for ASS stuff
[00:51:57 CET] <JEEB> also something called ASS v5 has come and gone :P
[00:52:13 CET] <bencc> come and gone?
[00:52:28 CET] <JEEB> as in, different people have come along and come up with something that they'd call ASS v5
[00:52:46 CET] <JEEB> the best bet is to just start adding extensions to libass if you want something. as long as it seems sanely specified
[00:53:10 CET] <JEEB> and for ASS basics there's the Aegisub tutorial :P
[00:53:19 CET] <JEEB> for what BorderStyles etc are
[00:59:25 CET] <furq> bencc: i take it you have way too many annotations to make it feasible to do this with drawbox/drawtext
[01:00:17 CET] <bencc> furq: drawbox/drawtext means I need to have a long command line command?
[01:00:30 CET] <furq> yeah or use -filter_complex_script
[01:00:36 CET] <furq> that's just putting the very long command in a file though
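(For the annotation case described above, a semi-transparent rectangle with text at a given time and position, a single drawtext per annotation can draw both the box and the text. A hedged sketch with placeholder values; a fontfile= option may be needed on builds without fontconfig:)

    ffmpeg -i input.mp4 \
        -vf "drawtext=text='First note':fontsize=24:fontcolor=white:box=1:boxcolor=black@0.5:boxborderw=8:x=100:y=80:enable='between(t,5,12)'" \
        -c:a copy output.mp4

(Many such drawtext instances can be chained, and a long chain can live in a file used with -filter_complex_script, with [0:v] ... [v] labels and -map, as mentioned above.)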
[01:20:03 CET] <cstk421> i am setting up a youtube live stream from a camera via rtsp.  What i am looking to do is have ffmpeg take snapshots of it every 60 secs or so and create a timelapse video that updates as more and more images are taken.  Is this possible ? any guides I can read to set this up?
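(A possible sketch for the timelapse question, untested and with placeholder URL and paths: sample roughly one frame per minute into numbered images, then rebuild the timelapse from whatever has been captured so far, e.g. from cron:)

    # grab about one frame every 60 seconds from the camera
    ffmpeg -rtsp_transport tcp -i rtsp://camera.example/stream -vf fps=1/60 -q:v 2 frames/img%06d.jpg

    # re-run this whenever an updated timelapse is wanted
    ffmpeg -y -framerate 30 -i frames/img%06d.jpg -c:v libx264 -pix_fmt yuv420p timelapse.mp4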
[02:03:28 CET] <fructose> When trying to cut between -ss and -t, is there a way to get ss to seek to the last DTS time while ensuring audio stays synced?
[02:15:25 CET] <cstk421> Csman so youtube has an 8 hour time limit on streaming live and then it auto archives the stream.  Does anyone know if, after the 8 hours are reached, it starts the stream again and goes for another 8 hours automatically?
[03:40:09 CET] <Mista_D> How would one add "days" to the timecode/timestamp with a drawtext filter please? Need to see how long a live feed can run without interruptions...
[06:17:48 CET] <psachin> Hi #ffmpeg. Can anyone help me convert a video from MKV to MP4 (PS4 compatible)? The closest I got was using the command `ffmpeg -i input.mkv -c:v mpeg4 -c:a libmp3lame -b:a 128K output.mp4`. But the output video is not identified by the PS4 console.
[06:21:34 CET] <CVJoshua> Hi, I'm reading the docs for -frame_drop_threshold, they say "The default is -1.1.". I'm wondering what the behaviour of negative values is. Does it cause ffmpeg to drop frames immediately, or never? And why is the default -1.1 instead of -1.0?
[06:35:42 CET] <another> psachin: try -tag:v xvid
[06:35:49 CET] <another> see also: https://trac.ffmpeg.org/wiki/Encode/MPEG-4
[06:36:02 CET] <another> but you probably want something more modern
[06:36:07 CET] Action: psachin checking...
[06:36:17 CET] <another> https://trac.ffmpeg.org/wiki/Encode/H.264
[06:38:14 CET] <another> psachin: according to the ps4 manual (https://manuals.playstation.net/document/en/ps4/music/mp_format_m.html) it should support mkv
[06:45:57 CET] <psachin> another: Hmm. I discovered that the output video doesn't match the exact MKV specs as described on the page. Anyways I'll try the H.264 as you suggested.
[06:46:52 CET] <another> have you tried the original video?
[06:58:38 CET] <beandog> psachin, here you go - https://dvds.beandog.org/doku.php?id=ps4
[06:58:52 CET] <beandog> somewhat helpful
[06:59:08 CET] <beandog> also https://manuals.playstation.net/document/en/ps4/videos/mp_format_v.html
[06:59:31 CET] <beandog> it's pretty fussy about what it'll accept
[07:54:35 CET] <psachin> another: Yes. It is not identified either.
[07:56:32 CET] <psachin> beandog: In your example command, was the original MKV video identified on PS4?
[09:33:49 CET] <Ariyasu> psachin
[09:35:56 CET] <Ariyasu> ffmpeg -i input.mkv -vcodec libx264 -preset slow -profile:v high -level 4.2 -crf ## -acodec aac -b:a 128k output.mkv
[09:36:17 CET] <Ariyasu> pick a crf value depending on the content, somewhere between 18 to 22 should do it
[09:36:27 CET] <Ariyasu> and that should play back fine on your ps4
[09:37:33 CET] <Ariyasu> if you do
[09:37:47 CET] <Ariyasu> ffmpeg -t 10 -i input.mkv -vcodec libx264 -preset slow -profile:v high -level 4.2 -crf 21 -acodec aac -b:a 128k output.mkv
[09:38:04 CET] <Ariyasu> it will just encode a 10second sample video, which you can use to test it on your ps4
[09:38:07 CET] <psachin> Ariyasu: Currently I'm trying `ffmpeg -i input.1080p.BluRay.x264.DTS-HDC.mkv -c:v libx264 -c:a libmp3lame -b:a 48K output.mkv`
[09:38:16 CET] <Ariyasu> if it works you can remove the -t 10 and rework all your files
[09:38:41 CET] <psachin> Ariyasu: Ohk. Using -t is new info for me
[09:38:44 CET] <Ariyasu> you haven't set the level to 4.2 so I'm pretty sure it will fail to play back
[09:39:00 CET] <Ariyasu> you can run another instance in parallel to check
[09:39:16 CET] <Ariyasu> ffmpeg -t -i input.1080p.BluRay.x264.DTS-HDC.mkv -c:v libx264 -c:a libmp3lame -b:a 48K test.mkv
[09:39:20 CET] <Ariyasu> ern
[09:39:25 CET] <Ariyasu> ffmpeg -t 10 -i input.1080p.BluRay.x264.DTS-HDC.mkv -c:v libx264 -c:a libmp3lame -b:a 48K test.mkv
[09:39:43 CET] <Ariyasu> like that, and test it, if it doesn't work, you can kill your running job and start over
[09:50:19 CET] <beandog> plus I don't think it'll playback mp3
[09:51:13 CET] <beandog> oh, nvm, docs say it does in mkv
[10:38:03 CET] <psachin> Ariyasu: beandog This works "ffmpeg -t 3 -i input.mkv -c:v libx264 -c:a libmp3lame -preset slow -profile:v high -level -crf 21 4.2 -b:a 128K output.mkv"
[10:38:46 CET] <psachin> Oops sorry --  "ffmpeg -t 3 -i input.mkv -c:v libx264 -c:a libmp3lame -preset slow -profile:v high -level 4.2 -crf 21 -b:a 128K output.mkv"
[10:39:00 CET] <psachin> Thanks guys!!
[11:11:57 CET] <beandog> cool :)
[11:11:59 CET] <beandog> have fun
[12:43:26 CET] <Shibe> hi i cant quite seem to figure out how to capture kmsgrab and pulseaudio at the same time
[12:43:28 CET] <Shibe> sudo ffmpeg -f kmsgrab -framerate 60 -i - -f pulse -i INPUT -vf 'hwmap=derive_device=vaapi,scale_vaapi=w=1280:h=720:format=bgr0' -c:v h264_vaapi /tmp/output669.mkv
[12:43:31 CET] <Shibe> heres what i have now
[12:43:35 CET] <Shibe> but it says input/output error
[12:50:02 CET] <durandal_1707> Shibe: you put "-f pulse -i INPUT" in the wrong position
[12:52:49 CET] <Shibe> durandal_1707: what would be the correct way?
[12:55:33 CET] <durandal_1707> Shibe: put it before "-f kmsgrab"
[12:57:46 CET] <Shibe> durandal_1707: sudo ffmpeg -f pulse -i INPUT -f kmsgrab -framerate 60 -i - -vf 'hwmap=derive_device=vaapi,scale_vaapi=w=1280:h=720:format=bgr0' -c:v h264_vaapi output.mkv
[12:57:50 CET] <Shibe> still seems to be giving input/output error
[13:04:37 CET] <Shibe> https://pastebin.com/raw/R4yaA337
[13:06:32 CET] <durandal_1707> Shibe: replace INPUT with 0
[13:06:53 CET] <durandal_1707> INPUT does not work here for me with pulse
[13:08:04 CET] <Shibe> durandal_1707: replacing it with -i default seems to be working
[13:08:06 CET] <Shibe> thanks!
[13:08:14 CET] <Shibe> though it's picking up on my mic not my speaker output but i'll figure it out
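(On that last point: with -f pulse, "default" is normally the default capture source, i.e. the microphone; recording what the speakers play usually means capturing a sink's monitor source instead. The device name below is only an example, the real name comes from pactl:)

    # monitor sources end in ".monitor"
    pactl list short sources

    ffmpeg -f pulse -i alsa_output.pci-0000_00_1b.0.analog-stereo.monitor \
        -f kmsgrab -framerate 60 -i - \
        -vf 'hwmap=derive_device=vaapi,scale_vaapi=w=1280:h=720:format=bgr0' \
        -c:v h264_vaapi output.mkv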
[15:37:09 CET] <Hello71> also instead of sudo I think you can add yourself to video group
[15:37:24 CET] <Hello71> wait, no, that's for v4l
[15:37:44 CET] <Hello71> wait, no it isn't.
[16:14:59 CET] <th3_v0ice> If I send a packet that has a smaller DTS than the previous DTS, is the muxer sending it or buffering it? (RTMP output)
[16:17:11 CET] <DHE> depends on if they're in different streams and if you use av_[interleaved]_write_frame with or without the interleaved bit
[16:20:35 CET] <th3_v0ice> It's the same stream. So interleaved will buffer it and av_write_frame will just send it. Do I need to arrange packets in order of increasing DTS in that case?
[16:22:18 CET] <DHE> for the same stream that is mandatory, yes
[16:22:35 CET] <DHE> you will get errors if you don't, and I'm not sure if it will actually send what you give it at all
[16:26:59 CET] <Mavrik> IIRC the out-of-order packets get dropped on muxer layer
[16:51:54 CET] <th3_v0ice> Ok guys, thanks!
[22:05:12 CET] <brimestone> hey guys, can FFMpeg use AMD GPU?
[22:05:49 CET] <JEEB> for decoding there's vaapi (lunix) and dxva2 and d3d11va (windows)
[22:06:08 CET] <JEEB> for encoding there's vaapi (lunix) and there was something for windows but I'm not sure if that ever got merged :P
[22:06:10 CET] <brimestone> Nothing for macOS?
[22:06:13 CET] <JEEB> no
[22:06:22 CET] <JEEB> unless the mac hwdec APIs work
[22:06:29 CET] <pink_mist> I thought macOS used nvidia anyway
[22:06:30 CET] <JEEB> those are generic
[22:06:42 CET] <brimestone> How would I test that?
[22:06:51 CET] <brimestone> If hwdec API is working?
[22:08:09 CET] <JEEB> either with an FFmpeg build with videotoolbox enabled, or an mpv that was built with FFmpeg that had vt enabled
[22:08:38 CET] <JEEB> (found my macos building documents that mention --enable-videotoolbox when building for macos)
[22:29:36 CET] <brimestone> JEEB: how do I test if video toolbox is enabled or not?
[22:29:50 CET] <JEEB> in what?
[22:30:17 CET] <brimestone> Like ffmpeg -protocols to see which protocols are supported
[22:31:00 CET] <JEEB> ffmpeg -hwaccel videotoolbox -i blah.mp4 -f null -
[22:31:25 CET] <JEEB> also it could be that ffmpeg -hwaccels lists them
[22:31:38 CET] <JEEB> yes, seems like that's the case
[22:33:07 CET] <brimestone> Hey! I get  videotoolbox opencl videotoolbox (2 video toolbox)
[22:34:24 CET] <Sesse> hi. when transcoding using the ffmpeg command line tool, is there any way I can override the autodetected input ycbcr information? I have JPEGs that I know for sure have left chroma placement, limited ycbcr range and rec709 coefficients
[22:35:14 CET] <Sesse> but I'm fairly certain libavcodec won't mark them as such, so I'd need to override :-)
[22:35:35 CET] <JEEB> yes. scale/zscale filters let you override what the input is, and you can use the normal override parameters on the encoding side if you just need to encode it in some way
[22:35:45 CET] <Sesse> let's see
[22:35:57 CET] <JEEB> (note: swscale might do funky things by default so always keep log level at least verbose)
[22:36:00 CET] <JEEB> -v verbose
[22:36:12 CET] <JEEB> that shows the generated filter chain if such was generated
[22:36:43 CET] Action: JEEB wonders if he should add an option to ffmpeg.c that tells libavfilter to not insert implicit format conversions
[22:36:44 CET] <Sesse> -vf scale looks great for this
[22:38:08 CET] <Sesse> hsub doesn't exist, though, even though the manual claims it should
[22:38:22 CET] <Sesse> neither does ohsub
[22:39:11 CET] <JEEB> not sure if swscale handles that
[22:39:19 CET] <Sesse> https://ffmpeg.org/ffmpeg-filters.html#scale-1
[22:39:34 CET] <Sesse> but it's fine, the subsampling is detected correctly, and I guess I can just do -pix_fmt to set the output
[22:39:56 CET] <JEEB> http://ffmpeg.org/ffmpeg-filters.html#zscale-1
[22:40:02 CET] <JEEB> zimg which is what zscale uses
[22:40:11 CET] <JEEB> does handle input/output chroma location being different
[22:40:16 CET] <JEEB> and other things
[22:40:30 CET] <Sesse> well, I want output to be left, too
[22:40:33 CET] <Sesse> so that's perfect :-)
[22:40:38 CET] <Sesse> ie., having in and out be the same
[22:40:42 CET] <JEEB> ok
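(An untested sketch of the kind of zscale chain being discussed, assuming an FFmpeg built with libzimg and using option names as documented for the zscale filter: tag the input as limited-range BT.709 with left-sited chroma, keep the same tags on output, and end the chain in 4:2:0:)

    ffmpeg -i input.jpg \
        -vf "zscale=rangein=limited:matrixin=709:chromalin=left:range=limited:matrix=709:chromal=left,format=yuv420p" \
        -c:v libx264 output.mp4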
[22:41:04 CET] <Sesse> my JPEGs are essentially made at H.264 spec
[22:41:08 CET] <brimestone> JEEB, videotoolbox only uses CPU :( both of my D500 are idle while CPU is high.
[22:41:42 CET] <Sesse> but ok, let's try zscale
[22:42:02 CET] <JEEB> Sesse: if you don't need to scale etc then just set the metadata with the general options that ffmpeg.c has
[22:42:08 CET] <JEEB> which sets the AVCodecContext values I think
[22:42:19 CET] <JEEB> brimestone: I am not wholly surprised :P
[22:42:35 CET] <Sesse> [AVFilterGraph @ 0x5647242df200] No such filter: 'zscale'
[22:42:36 CET] <Sesse> pah
[22:42:45 CET] <JEEB> yea, you need to have built with zimg for it
[22:42:52 CET] <Sesse> JEEB: I do need to scale, I want 420p in the end
[22:42:54 CET] <JEEB> very nice if you want gamma correct scaling among other things
[22:42:57 CET] <Sesse> even though my input is 422p
[22:43:05 CET] <JEEB> ok
[22:43:29 CET] <Sesse> well, OK, I guess I could live with 422 on disk, but it's kind of wasteful for my use
[22:44:24 CET] <Sesse> let me try to find the metadata options in ffmpeg.c
[22:45:03 CET] <shibboleth> anyone know how to force kodi to use tcp for rtsp?
[22:45:38 CET] <Foaly> i'd just cut the two pixels off rather than scaling
[22:46:26 CET] <Sesse> JEEB: I can't find any relevant options in ffmpeg_opt.c
[22:47:16 CET] <Sesse> JEEB: question: if I just do a normal transcode from e.g. mpeg-2 (center chroma) to h.264 (left chroma), is there nothing that will shift the chroma placement for me?
[22:47:30 CET] <JEEB> zscale seems like it could do it
[22:47:36 CET] <Sesse> no, I mean without any options
[22:47:37 CET] <JEEB> since it lets you set input and output of it
[22:47:55 CET] <Sesse> like, ffmpeg -i whatever_dvd.vob -c:v libx264 test.mp4
[22:48:30 CET] <JEEB> highly unlikely anything will touch that stuff if the pixel format matches. in the best case it will just set the values on the libx264 side accordingly :P
[22:48:37 CET] <Sesse> oh wow :-)
[22:48:39 CET] <Sesse> :-)
[22:48:45 CET] <JEEB> https://www.ffmpeg.org/ffmpeg-all.html ctrl+F "primaries"
[22:48:55 CET] <JEEB> and the other options nearby
[22:49:01 CET] <JEEB> those set the AVCodecContext values
[22:49:43 CET] <Sesse> ok... except no chroma placement
[22:49:50 CET] <JEEB> chroma_sample_location
[22:49:56 CET] <JEEB> you just had to scroll a bit more
[22:50:08 CET] <Sesse> huh, are we looking at the same file?
[22:50:12 CET] <Sesse> ah, we're not
[22:50:12 CET] <JEEB> yes
[22:50:18 CET] <JEEB> well I just linked ffmpeg-all.html
[22:50:19 CET] <JEEB> :P
[22:50:29 CET] <Sesse> yes, and somehow I managed to look in another tab nevertheless
[22:50:32 CET] <Sesse> so my mistake :-P
[22:50:53 CET] <JEEB> but do note that most likely any automated conversion will only look at the pixel format
[22:51:05 CET] <JEEB> so that's metadata forcing only
[22:51:08 CET] <JEEB> effectively
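(For reference, the metadata-only overrides being discussed look roughly like this on the ffmpeg command line; as noted, they only tag the output stream and do not make swscale resample anything:)

    ffmpeg -i input.mkv -c:v libx264 \
        -colorspace bt709 -color_primaries bt709 -color_trc bt709 \
        -color_range tv -chroma_sample_location left \
        output.mp4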
[22:52:06 CET] <Sesse> well, yes, it's a shame that ffmpeg doesn't really understand color
[22:52:33 CET] <JEEB> you basically have to do conversions like that with zimg via zscale and set things manually
[22:52:37 CET] <Sesse> mm
[22:53:01 CET] <JEEB> which I think in many cases is better than ffmpeg.c or libavfilter adding some auto-conversion somewhere because it feels like it
[22:53:13 CET] <Sesse> but something like 422 -> 420 necessarily has to assume something about chroma placement
[22:53:26 CET] <JEEB> probably
[22:53:54 CET] <JEEB> most likely swscale does it in the MPEG-2/H.264 way
[22:53:59 CET] <JEEB> as opposed to the older MPEG-1 way
[22:54:03 CET] <JEEB> although I might be 100% incorrect
[22:54:05 CET] <Sesse> MPEG-2 and MPEG-1 have the same
[22:54:07 CET] <Sesse> H.264 is different
[22:54:15 CET] <JEEB> huh, did I misremember that
[22:54:35 CET] <Sesse> MPEG-2 is mostly MPEG-1 + interlacing support
[22:54:43 CET] <Sesse> (there are some other minor differences
[22:54:44 CET] <Sesse> )
[22:55:26 CET] <JEEB> Sesse: also btw in my API usage I specifically disable any automated format conversions in libavfilter, which is why I thought about adding an option for that in ffmpeg.c. the last thing you want is swscale deciding to add a random swscale conversion somewhere >_>
[22:56:00 CET] <Sesse> =)
[22:56:26 CET] <JEEB> avfilter_graph_set_auto_convert(filter_graph, AVFILTER_AUTO_CONVERT_NONE);
[22:56:32 CET] <JEEB> "none, thank you"
[22:59:26 CET] <Sesse> so I converted MJPEG -> H.264 with -chroma_sample_location left and -chroma_sample_location center, decoded the first frame of both H.264 files to PNGs using ffmpeg, and the end result got the same md5sum :-/
[23:00:06 CET] <JEEB> yes, as I said - those options are just the metadata. all the default automagic conversions are based on swscale and that currently only looks if the pix_fmt matches or not
[23:00:21 CET] <Sesse> wonder if VLC does differently
[23:00:24 CET] <JEEB> you need something like zscale
[23:00:37 CET] <Sesse> JEEB: yeah, but it essentially means ffplay will play videos wrong
[23:00:38 CET] <JEEB> basically you need to use a filter that can do things proper :P
[23:00:39 CET] <Sesse> which I didn't know
[23:00:58 CET] <JEEB> most likely yes
[23:01:11 CET] <JEEB> I would only trust mpv's gpu renderer and VLC's new libplacebo based renderer for video rendering :P
[23:01:21 CET] <JEEB> (and with non-GPU stuff, zimg)
[23:01:29 CET] <JEEB> (which is the library that zscale uses)
[23:01:37 CET] <Sesse> for GPU stuff, I trust my own library (movit) :-)
[23:02:30 CET] <JEEB> libplacebo is basically mpv's stuff moved into a separate library (and VLC has since taken it into usage in vulkan at least)
[23:02:52 CET] <JEEB> and then zimg is a thing from a person I expected it the least from :P
[23:02:56 CET] <JEEB> but it has been very useful
[23:03:00 CET] <JEEB> https://github.com/sekrit-twc/zimg
[23:03:06 CET] <brimestone> Can OpenCL be use to accelerate filter?
[23:04:21 CET] <JEEB> Sesse: but most video things will most likely just ignore stuff like chroma location
[23:04:35 CET] <JEEB> I would guess most renderers in VLC, as well as the default Windows things
[23:09:32 CET] <Sesse> JEEB: sad, but probably true
[23:10:35 CET] <JEEB> they just expect top left for HD pretty much
[23:10:40 CET] <JEEB> I would guesstimate
[23:11:23 CET] <JEEB> anyways, the three things I check things against I've noted: mpv's gpu renderer, libplacebo stuff as long as the metadata is passed to the thing, zimg
[23:13:38 CET] <Sesse> let's try mpv, then
[23:14:33 CET] <Sesse> on visual inspection, I see zero difference between left and center (on yuv422, so unscaled)
[23:14:41 CET] <Sesse> that's with mpv's gpu renderer
[23:14:59 CET] <Sesse> wait, it was never set correctly according to ffprobe
[23:15:05 CET] <JEEB> and you can press i or shift+i to see the metadata passed
[23:15:27 CET] <JEEB> (or the logs)
[23:15:44 CET] <Sesse> it writes about primaries, color matrix, levels, gamma
[23:15:49 CET] <Sesse> but no chroma positioning
[23:16:12 CET] <Sesse> (on shift+i)
[23:16:19 CET] <JEEB> https://mpv.io/manual/master/#video-filters-vf
[23:16:26 CET] <JEEB> see the format filter
[23:17:01 CET] <JEEB> oh geez, does that not have the chroma location in there :P
[23:17:09 CET] <Sesse> indeed not :-P
[23:17:17 CET] <Sesse> chroma info is sort of an odd outlier anyway
[23:17:22 CET] <JEEB> video/filter/vf_format.c:    int chroma_location;
[23:17:26 CET] <JEEB> but it's in the sauce
[23:17:34 CET] <Sesse> ffmpeg conflates chroma format and pixel format
[23:17:42 CET] <Sesse> but they're really quite different things
[23:17:59 CET] <JEEB> nah, they're separate values and read from f.ex. H.264 or HEVC metadata
[23:18:07 CET] <JEEB> it's just that the conversion system is 100% pix_fmt based
[23:18:08 CET] <JEEB> ;P
[23:18:13 CET] <JEEB> it just ain't looking anywhere else
[23:18:26 CET] <Sesse> sure, but it should have put them in entirely different structs IMHO :-P
[23:18:39 CET] <Sesse> chroma upsampling and gamma handling are, well, not really related
[23:18:44 CET] <JEEB> wait what
[23:18:48 CET] <JEEB> where does it conflate them
[23:19:07 CET] <JEEB> chroma_sample_location is just left/topleft/blah/blah
[23:19:14 CET] <Sesse> for one, it calls ycbcr primaries colorspace
[23:19:16 CET] <JEEB> and color_trc
[23:19:21 CET] <JEEB> is the transfer function
[23:19:22 CET] <Sesse> where it isn't a color space at all
[23:19:34 CET] <JEEB> true
[23:19:44 CET] <JEEB> but that's still not pix_fmt
[23:19:48 CET] <Sesse> sure
[23:19:58 CET] <Sesse> but it should have lived together with ycbcr pix_fmt information
[23:20:14 CET] <Sesse> I mean, ffmpeg pretty much assumes you can do filters on ycbcr data
[23:20:16 CET] <Sesse> which isn't really true
[23:20:27 CET] <JEEB> it's there together with it in both AVCodecContext and AVFrames
[23:20:35 CET] <JEEB> in a perfect world the framework for filters would do it well
[23:20:37 CET] <Sesse> they can never ever be gamma-correct, then (except some special cases like mirror)
[23:20:54 CET] <JEEB> unfortunately, currently the automated filtering thing uses swscale
[23:20:57 CET] <JEEB> which is from early 2000s
[23:21:01 CET] <JEEB> you can guess how that goes :P
[23:21:07 CET] <Sesse> mm
[23:21:15 CET] <JEEB> the name for colorspace might be bad, but the values are all there :P
[23:21:32 CET] <Sesse> yeah, I know
[23:21:36 CET] <JEEB> and they are both in AVCodecContext as well as AVFrames which are each frame that come out of stuff
[23:21:40 CET] <Sesse> oh, and don't get me started on yuv422j :-P
[23:21:41 CET] <JEEB> so in theory the thing's set
[23:21:57 CET] <JEEB> someone just has to rewrite the whole giant big pile of legacy
[23:22:05 CET] <JEEB> which is the default conversion routines
[23:22:18 CET] <JEEB> until then, manual filter chain it is :P
[23:22:19 CET] <Sesse> sounds like a small and easy task with no chance of controversy or regressions :-)
[23:22:23 CET] <JEEB> ayup
[23:22:29 CET] <JEEB> esp. since swscale has some really esoteric stuff there
[23:22:44 CET] <Sesse> the mmx routines regressed 2%
[23:23:10 CET] <JEEB> thankfully at this point people really probably wouldn't care too much about MMX, but they probably would care about some paletted stuff or something if that were to go away - or something
[23:23:23 CET] <JEEB> (You could almost leave swscale there for the esoteric stuff)
[23:23:31 CET] <JEEB> and move to zimg for the normal stuff
[23:23:40 CET] <JEEB> unfortunately, piles of legacy
[23:23:45 CET] <JEEB> and people already can handle it manually
[23:23:54 CET] <Sesse> yeah, I remember trying to post patches to actually respect the color metadata given in the mux, instead of blindly assuming JPEGs were center chroma
[23:23:58 CET] <Sesse> and getting ignored
[23:24:07 CET] <Sesse> meanwhile people were optimizing cinepak decoding :-)
[23:24:27 CET] <JEEB> you did? at this point unless the thing did something wrong I'd say such a patch would go in just fine
[23:24:29 CET] <Sesse> (OK, I wasn't fully ignored -- I was refused, and my followup questions were ignored)
[23:24:33 CET] <JEEB> huh
[23:24:35 CET] <JEEB> funky
[23:24:47 CET] <JEEB> anyways, I'm getting brainderp'd at $dayjob
[23:24:50 CET] <Sesse> enjoy
[23:25:02 CET] <JEEB> so I often have less than normal amounts of time to give stuff
[23:25:10 CET] <JEEB> like there was a patch for the FFmpeg examples to update some of them :P
[23:25:15 CET] <JEEB> and I still haven't reviewed it
[23:25:19 CET] <JEEB> nor has anyone else seemingly cared
[23:25:51 CET] <JEEB> meanwhile the thing I do have time for - sometimes - is trying to get ARIB captions going internally (I already got an (L)GPLv3 wrapper in)
[23:26:02 CET] <JEEB> but there's a whole history about a certain text encoding handling not going into any upstream iconv
[23:26:11 CET] <Sesse> joy
[23:26:21 CET] <JEEB> Sesse: also I'm not sure if it's documented but chroma-location is in the format filter in mpv
[23:26:26 CET] <JEEB> I just looked at the sauce
[23:26:37 CET] <Sesse> mm, ok
[23:26:42 CET] <JEEB> so it can be overridden if not provided correctly
[23:26:47 CET] <Sesse> well, I just said -chroma_sample_location left and hoped it would be ok
[23:27:01 CET] <Sesse> if no players do it correctly anyway...
[23:27:27 CET] <JEEB> see if ffprobe -v verbose your_output_h264.mp4 shows it
[23:27:32 CET] <JEEB> if yes, it should get applied
[23:27:40 CET] <Sesse> it doesn't (which is funky)
[23:27:46 CET] <JEEB> then something didn't get somewhere
[23:27:53 CET] <Sesse> anyway, I'm converting to 420 now too, so who knows what it will do :-)
[23:27:56 CET] <Sesse> I'll just accept broken chroma
[23:28:06 CET] <JEEB> for future reference, build with zimg :)
[23:28:18 CET] <JEEB> that's the sw implementation I more or less trust
[23:28:50 CET] <JEEB> Sesse: ahahahaha
[23:28:51 CET] <Sesse> well, I don't generally build ffmpeg :-)
[23:28:56 CET] <JEEB> libx264.c
[23:29:05 CET] <JEEB> guess if that value from AVCodecContext is read
[23:29:14 CET] <Sesse> ha :-P
[23:29:20 CET] <JEEB> classic
[23:29:26 CET] <Sesse> well, in theory it should have been stored in the mux...
[23:29:49 CET] <JEEB> generally you don't want to trust container metadata unless you're against a wall
[23:30:01 CET] <JEEB> since it can be more easily mucked with
[23:30:02 CET] <JEEB> or lost
[23:30:04 CET] <JEEB> in remux
[23:30:07 CET] <Sesse> well, so in this case I'm doing MJPEG
[23:30:14 CET] <Sesse> and JPEG doesn't really have metadata
[23:30:20 CET] <Sesse> there's only JFIF, which specifies MPEG-2-like data
[23:30:29 CET] <Sesse> (rec. 601, center chroma, that kind of stuff)
[23:30:45 CET] <JEEB> also you are really making me want to check H.262
[23:30:58 CET] <Sesse> and then ffmpeg has a magic, private comment, where you can override one of those
[23:31:10 CET] <Sesse> CS=itu709 or something like that
[23:31:18 CET] <Sesse> CS=ITU601
[23:31:49 CET] <Sesse> which you'd think specified bt.601
[23:31:56 CET] <Sesse> but no, it specifies limited ycbcr range :-P
[23:32:06 CET] <Sesse> (I think it originates with mplayer)
[23:32:09 CET] <JEEB> I bet that's also from the depths of early 2000s
[23:32:10 CET] <JEEB> yea
[23:32:22 CET] <JEEB> anyways, can you give me the keyword to search for in H.262 (MPEG-2 Video)
[23:32:33 CET] <JEEB> I really want to double-check it with my eyes :P
[23:33:07 CET] <JEEB> ah-ha
[23:33:10 CET] <JEEB> it actually is in annex D.9
[23:34:10 CET] <JEEB> http://up-cat.net/p/e5b126c3
[23:38:13 CET] <JEEB> looks like left-mid according to the 4:2:0 image in figure 6-1 in 6.1.1.8
[23:39:19 CET] <Sesse> yeah, you're right, wikipedia confirms
[23:39:24 CET] <Sesse> In MPEG-2, Cb and Cr are cosited horizontally. Cb and Cr are sited between pixels in the vertical direction (sited interstitially).
[23:39:27 CET] <Sesse> In JPEG/JFIF, H.261, and MPEG-1, Cb and Cr are sited interstitially, halfway between alternate luma samples.
[23:40:08 CET] <JEEB> yea, I did remember JPEG and MPEG-1 being similar but dissimilar to MPEG-2 and further
[23:40:33 CET] <JEEB> I don't remember what started using top-left, was it BT.709?
[23:41:00 CET] <Sesse> unsure if the bt.* standards talk much about chroma placement
[23:41:12 CET] <Sesse> mostly about rgb color space and ycbcr coefficients
[23:41:37 CET] <JEEB> I think some of them surprisingly do
[23:41:43 CET] <JEEB> not sure which tho
[23:43:41 CET] <JEEB> jesus christ
[23:43:57 CET] <JEEB> already forgot the darn chroma sample loc type
[23:44:00 CET] <JEEB> image in H.264
[23:44:08 CET] <JEEB> that is probably one of the most confusing images I have ever seen
[23:44:30 CET] <Sesse> haha
[23:45:00 CET] <JEEB> under annex E.2 (VUI semantics)
[23:45:29 CET] <Sesse> I don't have the standard before me (I don't think I've ever read it)
[23:45:52 CET] <Sesse> thankfully never really needed to understand the details of the codec
[23:45:59 CET] <JEEB> https://www.itu.int/rec/T-REC-H.264
[23:46:09 CET] <JEEB> it's mostly the metadata fields etc
[23:46:28 CET] <JEEB> which generally interest me. my codec expertise as such ends at my baby's first Ut Video encoder
[23:46:39 CET] <JEEB> which was just left/median prediction and huffman
[23:47:09 CET] <Sesse> I did a better-than-JPEG still image codec at some point, for GPU purposes
[23:47:13 CET] <Sesse> but the encoder was too slow :-)
[23:48:30 CET] <JEEB> and yes, in that latest edition Figure E-1 is as flawless as ever
[23:49:46 CET] <JEEB> anyways, sleep for me. way overdue for that.
[23:50:27 CET] <JEEB> feel free to poke me with a link to the JPEG patch thread to see if it was just bikeshedding or actual possible issues
[23:50:27 CET] <Sesse> good night, and thanks
[23:50:36 CET] <JEEB> either in pm or here
[23:50:42 CET] <Sesse> yeah, no, I found a local workaround
[23:50:42 CET] <JEEB> o/
[23:50:48 CET] <Sesse> which was easier for my case :-)
[00:00:00 CET] --- Wed Mar  6 2019

