[Ffmpeg-devel-irc] ffmpeg-devel.log.20190704

burek burek021 at gmail.com
Fri Jul 5 03:05:04 EEST 2019


[13:44:09 CEST] <cone-116> ffmpeg 03Jun Zhao 07master:9269bccbb340: doc/muxers: fix docs format for DASH muxer
[18:45:28 CEST] <prem> can I do decoding as well as inference using ffmpeg?
[22:12:19 CEST] <_bluez> Hey! I am currently working on implementing tiles in HEIF format... can I get some advice regarding that?
[22:13:23 CEST] <_bluez> First question is should I set each tile to a different stream?
[22:14:03 CEST] <_bluez> Or is it possible to store the tiles as multiple frames in a single stream?
[22:14:48 CEST] <durandal_1707> aren't tiles making one big image?
[22:14:57 CEST] <_bluez> Yeah they are
[22:15:18 CEST] <_bluez> but first I have to decode them individually, I guess
[22:15:34 CEST] <durandal_1707> then making multiple streams of it is bad
[22:15:44 CEST] <_bluez> How do I stitch them is another question in my mind...
[22:15:48 CEST] <Lynne> that's not a good solution at all
[22:16:05 CEST] <Lynne> better to just output each tile as a separate frame and let the decoder reconstruct them
[22:16:25 CEST] <Lynne> you'll just need some side data per-frame to tell the decoder what to do with it
[22:17:18 CEST] <Lynne> we already do this for interlaced streams for some codecs to produce 1 frame from 2 fields
[22:18:24 CEST] <_bluez> Lynne: Okay but how do I do that... can you elaborate a bit?
[22:18:46 CEST] <_bluez> what side data would that be?
[22:19:41 CEST] <Lynne> it should contain the final image resolution, the tile resolution, the tile pixel format and the tile location
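A minimal sketch of what such a per-tile payload could carry, based on the fields listed in the line above; the struct and its field names are hypothetical, not existing FFmpeg API:

    /* Hypothetical per-frame side data payload for a HEIF tile;
     * not an existing FFmpeg type. */
    #include <libavutil/pixfmt.h>

    typedef struct HEIFTileInfo {
        int canvas_width;                 /* final image resolution */
        int canvas_height;
        int tile_width;                   /* resolution of this tile */
        int tile_height;
        enum AVPixelFormat tile_pix_fmt;  /* pixel format of the tile */
        int tile_x;                       /* top-left corner of the tile on the canvas */
        int tile_y;
    } HEIFTileInfo;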
[22:20:48 CEST] <Lynne> is there a very specific order in which tiles appear in heif files or are they random?
[22:22:16 CEST] <_bluez> for the samples I have (and a bit of research I did) it looks like they are sequentially arranged from left to right and top to bottom (8x6 grid)
[22:23:41 CEST] <JEEB> thankfully I think the HEIF spec is public
[22:24:05 CEST] <jdarnley> Why do lots of "recent" formats do that?
[22:24:09 CEST] <_bluez> you mentioned we already do something similar... could you tell me where? It would be helpful.
[22:24:34 CEST] <JEEB> well it's not exactly the same
[22:24:45 CEST] <JEEB> but we do wait for multiple fields in H.264
[22:25:02 CEST] <JEEB> so that we create a picture with two fields interleaved, even with PAFF coding
[22:25:07 CEST] <Lynne> _bluez: it's similar but not exactly the same: we check if the stream is interlaced, then check each frame's field indicator
[22:26:41 CEST] <Lynne> _bluez: to begin with, just create a new side data type in libavutil/frame.h and make the demuxer produce one frame per tile with the metadata
[22:27:18 CEST] <Lynne> the decoder part shouldn't be too difficult to do
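A rough sketch of how the demuxer side of this could look, assuming a hypothetical AV_PKT_DATA_HEIF_TILE_INFO packet side data type that would mirror the new frame side data type added in libavutil/frame.h; av_packet_new_side_data() is existing API, the type value and helper are made up for illustration:

    #include <string.h>
    #include <libavcodec/avcodec.h>
    #include <libavutil/error.h>

    /* Attach the tile metadata to one packet emitted by the demuxer
     * (one packet per tile). AV_PKT_DATA_HEIF_TILE_INFO is hypothetical. */
    static int attach_tile_info(AVPacket *pkt, const void *info, size_t info_size)
    {
        uint8_t *sd = av_packet_new_side_data(pkt, AV_PKT_DATA_HEIF_TILE_INFO,
                                              info_size);
        if (!sd)
            return AVERROR(ENOMEM);
        memcpy(sd, info, info_size);
        return 0;
    }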
[22:28:55 CEST] <_bluez> Okay, I won't say I understood all of what you said, but I think I have an idea
[22:29:09 CEST] <Lynne> another idea I've had would be to write a bsf which takes in segmented tiles and combines them to make a single avpacket with a standard tiled hevc stream
[22:29:30 CEST] <Lynne> but I'm not too familiar with the hevc bitstream to know if that's possible
[22:30:43 CEST] <Lynne> the decoder wouldn't need to be modified in that case and would be able to use slice threads
[22:32:38 CEST] <_bluez> do you mean combining all tiles into a single packet first and then decoding it?
[22:34:29 CEST] <_bluez> Sorry if I am not making much sense here... I still have to learn more before I can... I'm just starting now :D
[22:34:31 CEST] <Lynne> yes, if the hevc spec is sane and defines that each tile needs to be aligned to the nearest byte it should be doable
[22:35:25 CEST] <_bluez> but I found somewhere that each tile is a separate hevc frame and hence needs to be decoded first... how accurate is that?
[22:35:27 CEST] <Lynne> especially since we already have an infrastructure to parse and write hevc headers
[22:35:54 CEST] <Lynne> yes, but since each tile in a standard hevc stream is also independent there's no difference
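A conceptual sketch of the combining step such a bitstream filter would perform, using only public packet API; the append_tile() helper is made up for illustration, and real code would also have to rewrite the HEVC parameter sets and slice headers so the result decodes as a single tiled picture:

    #include <string.h>
    #include <libavcodec/avcodec.h>

    /* Append one byte-aligned tile payload onto the packet being assembled;
     * the header rewriting a real bsf would need is omitted here. */
    static int append_tile(AVPacket *out, const AVPacket *tile)
    {
        int old_size = out->size;
        int ret = av_grow_packet(out, tile->size);
        if (ret < 0)
            return ret;
        memcpy(out->data + old_size, tile->data, tile->size);
        return 0;
    }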
[22:36:15 CEST] <_bluez> okay..
[22:36:35 CEST] <Lynne> for now you just need to write the side data type and make the demuxer output frames from tiles
[22:37:39 CEST] <_bluez> I am using av_append_packet to attach pics to each stream for now... am I doing it right?
[22:39:12 CEST] <_bluez> each attached pic is a tile... that is what I tried doing at first... maybe it's not correct, though
[22:42:43 CEST] <Lynne> yes, streams are a container level feature, codecs don't care (and they shouldn't) about stream indices and containers
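For reference, a hedged sketch of how a demuxer might use av_append_packet() to read one tile's coded data into a packet; read_tile_packet() is a hypothetical helper, and the offset/size would come from the HEIF item location (iloc) box:

    #include <libavformat/avformat.h>

    /* Read 'size' bytes of one tile's coded data at 'offset' into pkt;
     * av_append_packet() reads from the AVIOContext and grows the packet. */
    static int read_tile_packet(AVFormatContext *s, AVPacket *pkt,
                                int64_t offset, int size, int stream_index)
    {
        int64_t pos = avio_seek(s->pb, offset, SEEK_SET);
        int ret;

        if (pos < 0)
            return (int)pos;
        ret = av_append_packet(s->pb, pkt, size);
        if (ret < 0)
            return ret;
        pkt->stream_index = stream_index;
        return 0;
    }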
[22:43:13 CEST] <_bluez> Okay, thanks!
[22:44:37 CEST] <_bluez> I'll try to do what you said... 
[00:00:00 CEST] --- Fri Jul  5 2019

