[Ffmpeg-devel-irc] ffmpeg-devel.log.20190620

burek burek021 at gmail.com
Fri Jun 21 03:05:03 EEST 2019


[00:22:58 CEST] <iive> jya, why not use vdpau in firefox?
[00:23:41 CEST] <jya> iive: one thing at a time, and there's far more people with an intel GPU on linux than nvidia
[00:23:59 CEST] <jya> and vdpau is now deprecated
[00:24:37 CEST] <iive> mesa3d supports vdpau for all radeons too. nvidia deprecated it, but it is still supported. i think they opensourced it too.
[00:25:34 CEST] <jya> AFAIK, it's always been open sourced 
[00:26:34 CEST] <iive> not always, but it was open sourced long ago.
[00:27:01 CEST] <iive> what i mean is, it is higher level and should be easier to support.
[00:27:39 CEST] <jya> ease of implementation isn't really a concern here. 
[00:28:11 CEST] <jya> i see no point in adding support for an already deprecated API. 
[00:37:19 CEST] <iive> jya, i can't find an official deprecation notice with google. could you provide me with a link?
[02:33:52 CEST] <vel0city> how do I use a decoder from another one?
[02:34:11 CEST] <vel0city> like, in my case I want to use ff_mjpeg_decode_frame from tiff.c's decode_frame
[02:34:58 CEST] <jlut> iive: vdpau is not officially deprecated afaik, just that nvidia released nvdec (for newer chipsets and with vp9) and amd has amx now. things don't look good for vdpau
[02:35:13 CEST] <jlut> amf*
[02:37:08 CEST] <iive> jlut, what is the point of video acceleration api, if each manufacturer is going to make its own incompatible api...
[02:37:34 CEST] <jlut> ik, it's quite sad :(
[02:39:07 CEST] <iive> vdpau is mostly a good API.
[02:43:11 CEST] <jlut> yeah, but hopefully in the future we can have nvdec/amf backends for vaapi. everyone will support vaapi, life will be good. oh well.
[02:47:41 CEST] <iive> is there somebody who likes vaapi?
[02:50:02 CEST] <jlut> idk, it is the most supported one, given how chrome/firefox also plan to implement it. it's the only open API standard left, if we assume vdpau will die.
[03:07:32 CEST] <iive> it is kind of circular logic
[03:07:57 CEST] <iive> whatever is supported would become the supported standard.
[03:13:46 CEST] <iive> n8 ppl
[03:16:51 CEST] <jlut> yeah tru. n8
[03:36:05 CEST] <kierank> jamrial: I think the UB stuff isn't too bad, but the timeout stuff is silly
[05:09:31 CEST] <cone-064> ffmpeg 03Bodecs Bela 07master:86f04b918c0d: avformat/hlsenc: enhanced %v handling with variant names
[10:43:40 CEST] <grosso> hi
[10:43:51 CEST] <grosso> Do you offer courses on ffmpeg developing?
[10:44:32 CEST] <JEEB> not yet at least
[10:44:53 CEST] <JEEB> although every now and then someone has some time to go through a bit of hand-holding on the user channel for API usage
[10:47:42 CEST] <grosso> I have a very difficult problem to solve. I've been looking into the code to figure out how things work, but it is really hard. I need a developer familiar with the flv format and the h264 and aac codecs to help me.. I can pay for it
[11:13:55 CEST] <willson> test
[14:34:21 CEST] <cone-047> ffmpeg 03Gyan Doshi 07master:91f5950f833f: avformat/segment: fix muxing tmcd tracks in MOV
[19:05:50 CEST] <vel0city> how do I use a decoder from another one?
[19:05:59 CEST] <vel0city> like, in my case I want to use ff_mjpeg_decode_frame from tiff.c's decode_frame
[19:06:46 CEST] <vel0city> but the former of course accepts an MJpegDecodeContext
[19:07:08 CEST] <vel0city> so how to bridge the gap?
[19:10:44 CEST] <JEEB> ok, so in a perfect world due to TIFF being essentially an image container you'd have it be an avformat thing, and then it would set the correct AV_CODEC_ID (which would be JPEG or raw or whatever)
[19:11:14 CEST] <JEEB> but atm it seems like tiff is a decoder in libavcodec only
[19:11:32 CEST] <JEEB> which is then utilized through the meta demuxer/muxer for images :P
[19:11:44 CEST] <JEEB> ("raw" images specifically)
[19:12:03 CEST] <JEEB> vel0city: I'm not sure how hard it would be to move tiff to be like that
[19:13:31 CEST] <JEEB> alternatively, you could initialize a sub-AVCodecContext for JPEG, and then call into that. but that is just :< since it breaks the abstraction of format vs actual image data
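(For reference, a rough sketch of the sub-AVCodecContext idea, using only the public libavcodec API; the function name decode_jpeg_tile is illustrative and not an existing helper, and an in-tree decoder would more likely call the internal mjpeg entry points directly:)

    #include <string.h>
    #include <libavcodec/avcodec.h>

    /* Open a nested MJPEG decoder and feed it one JPEG-compressed tile. */
    static int decode_jpeg_tile(const uint8_t *buf, int size, AVFrame *out)
    {
        const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_MJPEG);
        AVCodecContext *avctx = avcodec_alloc_context3(codec);
        AVPacket *pkt = av_packet_alloc();
        int ret = AVERROR(ENOMEM);

        if (!codec || !avctx || !pkt)
            goto end;
        if ((ret = avcodec_open2(avctx, codec, NULL)) < 0)
            goto end;

        /* wrap the tile bytes in an AVPacket (a real implementation would
         * avoid this extra copy, e.g. by refcounting the source buffer) */
        if ((ret = av_new_packet(pkt, size)) < 0)
            goto end;
        memcpy(pkt->data, buf, size);

        if ((ret = avcodec_send_packet(avctx, pkt)) < 0)
            goto end;
        ret = avcodec_receive_frame(avctx, out);

    end:
        av_packet_free(&pkt);
        avcodec_free_context(&avctx);
        return ret;
    }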
[19:14:21 CEST] <vel0city> hm
[19:15:31 CEST] <vel0city> idk, I'm definitely not familiar enough with the codebase that I'd feel comfortable converting tiff to libavformat
[19:16:06 CEST] <JEEB> basically the part which decides if it's raw or JPEG or whatever would have to be re-created on lavf side
[19:16:22 CEST] <JEEB> and then the decoder itself can be kept on the other side for as long as required
[19:17:09 CEST] <vel0city> oh I see
[19:18:08 CEST] <vel0city> so lavf side can only basically pick an enum?
[19:18:21 CEST] <vel0city> because it's not like I can direct it straight to mjpegdec
[19:18:32 CEST] <JEEB> no, it provides the data packets as well
[19:18:39 CEST] <JEEB> it's the part that parses the container
[19:18:40 CEST] <vel0city> it needs processing because TIFFs contain tiles of separate jpegs
[19:18:48 CEST] <vel0city> DNGs*
[19:18:50 CEST] <JEEB> and provides AVPackets that can be fed into a decoder
[19:19:05 CEST] <vel0city> so it can call into the decoder multiple times?
[19:19:15 CEST] <vel0city> for a single frame
[19:19:21 CEST] <JEEB> no, it just provides the interfaces to read the AVStreams' contents from a container
[19:19:24 CEST] <JEEB> for example if you know mp4
[19:19:34 CEST] <JEEB> a demuxer reads its index, generates all relevant AVStreams
[19:19:47 CEST] <JEEB> and then the user can read packets from the container and apply them to matching decoders
[19:19:58 CEST] <JEEB> demuxers read containers and provide their data
[19:20:05 CEST] <JEEB> decoders decode provided data
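(A minimal sketch of that split from the API user's side, with most error handling omitted for brevity; demux_and_decode is just an illustrative name:)

    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    int demux_and_decode(const char *filename)
    {
        AVFormatContext *fmt = NULL;
        AVCodecContext *dec = NULL;
        const AVCodec *codec;
        AVFrame *frame = NULL;
        AVPacket pkt;
        int vid, ret;

        /* demuxer side: open the container, figure out the AVStreams */
        if ((ret = avformat_open_input(&fmt, filename, NULL, NULL)) < 0)
            return ret;
        avformat_find_stream_info(fmt, NULL);

        /* pick a video stream and open a decoder matching its codec id */
        vid = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
        if (vid < 0)
            goto end;
        codec = avcodec_find_decoder(fmt->streams[vid]->codecpar->codec_id);
        dec   = avcodec_alloc_context3(codec);
        avcodec_parameters_to_context(dec, fmt->streams[vid]->codecpar);
        avcodec_open2(dec, codec, NULL);

        /* the demuxer hands out AVPackets, the decoder turns them into frames */
        frame = av_frame_alloc();
        while (av_read_frame(fmt, &pkt) >= 0) {
            if (pkt.stream_index == vid) {
                avcodec_send_packet(dec, &pkt);
                while (avcodec_receive_frame(dec, frame) >= 0)
                    ; /* do something with the decoded AVFrame here */
            }
            av_packet_unref(&pkt);
        }

    end:
        av_frame_free(&frame);
        avcodec_free_context(&dec);
        avformat_close_input(&fmt);
        return 0;
    }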
[19:20:27 CEST] <JEEB> so TIFF is a container that can have either raw or JPEG or whatever
[19:20:45 CEST] <JEEB> and then the actual image data within would be AVPackets
[19:21:31 CEST] <JEEB> and if there's multiple images in the container you can either provide them as separate AVStreams, or as separate AVPackets within a single AVStream
[19:21:45 CEST] <JEEB> of course, they should all be the same codec in the AVStream :P
[19:22:12 CEST] <JEEB> a decoder should not be concerned about container level things, which is what the tiff decoder currently is doing
[19:22:24 CEST] <JEEB> it's doing that just because it was originally implemented so that the lavf part is 100% pass-through
[19:22:27 CEST] <JEEB> :P
[19:22:46 CEST] <JEEB> but as soon as you get to something like this you actually can't do that any more
[19:23:00 CEST] <JEEB> which was a problem coming home to roost, because TIFF is a container, and not only for raw images
[19:23:11 CEST] <JEEB> for just raw images you can set the pixel format to whatever matches and provide those :P
[19:24:32 CEST] <JEEB> so currently what happens is libavformat/img2dec.c has an entry registered for tiff
[19:24:37 CEST] <JEEB> > IMAGEAUTO_DEMUXER(tiff,    AV_CODEC_ID_TIFF)
[19:24:57 CEST] <JEEB> which then calls tiff_probe
[19:25:27 CEST] <JEEB> and if that probe matches, it literally just provides the data as-is to the tiff "decoder"
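(For reference, that probe is essentially just a magic-number check; a simplified sketch, not the exact img2dec.c code:)

    #include <string.h>
    #include <libavformat/avformat.h>

    /* A TIFF file starts with "II" (little endian) or "MM" (big endian)
     * followed by the magic number 42. */
    static int tiff_probe_sketch(const AVProbeData *p)
    {
        if (p->buf_size < 4)
            return 0;
        if (!memcmp(p->buf, "II\x2A\x00", 4) ||   /* little-endian TIFF */
            !memcmp(p->buf, "MM\x00\x2A", 4))     /* big-endian TIFF */
            return AVPROBE_SCORE_EXTENSION + 1;
        return 0;
    }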
[19:27:06 CEST] <JEEB> while the correct way would be that lavf would have the thing that handles the TIFF container side and would either provide the raw images as the raw video avcodec id, or the compressed JPEG frames with the JPEG avcodec id
[19:27:57 CEST] <JEEB> vel0city: I hope this is all clear and unfortunately compressed stuff within TIFF was never a theme when the TIFF support was originally made. I'm sorry.
[19:29:51 CEST] <vel0city> @JEEB: it's clear, thanks for explaining
[19:31:28 CEST] <vel0city> I suppose I'll look into lavf then, I've never done anything with it
[19:31:49 CEST] <vel0city> so the solution would involve removing tiff stuff from img2dec right?
[19:32:37 CEST] <JEEB> yes
[19:32:54 CEST] <JEEB> you would build a libavformat/tiffdec.c
[19:33:30 CEST] <vel0city> "dec"? I thought it was supposed to demux, not decode
[19:33:31 CEST] <JEEB> which would parse the TIFF structure, decide streams and their types etc, and be the thing that provides the compressed or uncompressed frames to decoders
[19:33:38 CEST] <JEEB> vel0city: it's a wording in the code base :)
[19:33:48 CEST] <JEEB> dec/enc is used both in lavc and lavf
[19:34:08 CEST] <vel0city> right
[19:35:30 CEST] <vel0city> so, would it be feasible - as a first goal - to make this lavf file directly use tiff.c as it is now?
[19:37:49 CEST] <JEEB> I'd guess yes, as a kludge
[19:38:07 CEST] <JEEB> you set AV_CODEC_ID to TIFF and when packet is requested you give the whole thing or whatever
[19:38:15 CEST] <JEEB> (not sure how img2 exactly works)
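(A very rough sketch of that kludge variant: a standalone lavf demuxer that exposes the whole file as a single AV_CODEC_ID_TIFF packet, so the existing tiff.c decoder keeps doing all the parsing. File and symbol names are illustrative, not something that exists in the tree:)

    /* hypothetical libavformat/tiffdec.c */
    #include "libavutil/internal.h"
    #include "avformat.h"
    #include "internal.h"

    static int tiff_read_header(AVFormatContext *s)
    {
        AVStream *st = avformat_new_stream(s, NULL);
        if (!st)
            return AVERROR(ENOMEM);
        st->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
        st->codecpar->codec_id   = AV_CODEC_ID_TIFF;
        /* a proper demuxer would parse the IFDs here and pick e.g.
         * AV_CODEC_ID_RAWVIDEO or AV_CODEC_ID_MJPEG per stream instead */
        return 0;
    }

    static int tiff_read_packet(AVFormatContext *s, AVPacket *pkt)
    {
        int64_t size = avio_size(s->pb);

        if (size <= 0)
            return AVERROR(EIO);
        if (avio_tell(s->pb) >= size)
            return AVERROR_EOF;            /* single image, emit it only once */
        return av_get_packet(s->pb, pkt, (int)size);
    }

    AVInputFormat ff_tiff_demuxer = {
        .name        = "tiff",
        .long_name   = NULL_IF_CONFIG_SMALL("TIFF image"),
        .read_probe  = tiff_probe_sketch,  /* the probe sketch from earlier */
        .read_header = tiff_read_header,
        .read_packet = tiff_read_packet,
        .flags       = AVFMT_GENERIC_INDEX,
    };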
[19:39:21 CEST] <JEEB> while for the JPEG part you would have to parse the actual JPEG image
[19:39:27 CEST] <JEEB> and pass that on
[19:39:47 CEST] <vel0city> ah and it would create the correct AVCodecContext/TiffContext'es by itself?
[19:40:27 CEST] <JEEB> the format specific stuff (jpeg, "tiff", etc) is internal, but the API user would effectively then create an AVCodecContext of the right type
[19:40:38 CEST] <JEEB> since you export the AVStream with the correct one
[19:41:24 CEST] <JEEB> anyways, will try to do some jogging so be back in a while
[19:41:37 CEST] <vel0city> cool
[19:59:27 CEST] <cone-883> ffmpeg 03Andreas Rheinhardt 07master:a1a8815220fc: libavcodec: Reduce the size of some arrays
[22:15:10 CEST] <JEEB> https://developer.apple.com/av-foundation/HEVC-Video-with-Alpha-Interoperability-Profile.pdf
[22:15:13 CEST] <JEEB> what a lovely spec
[22:15:28 CEST] <JEEB> also TIL there's an alpha layer SEI
[22:47:30 CEST] <TD-Linux> can't decide if that is better or worse than webm alpha
[23:32:35 CEST] <jamrial> JEEB: well, the sample in the ticket above already isn't following that spec. it has one sps and one pps, when it's supposed to be two
[23:33:35 CEST] <JEEB> lol
[00:00:00 CEST] --- Fri Jun 21 2019

