[Ffmpeg-devel-irc] ffmpeg-devel.log.20150729

burek burek021 at gmail.com
Thu Jul 30 02:05:02 CEST 2015


[01:46:24 CEST] <cone-787> ffmpeg 03Michael Niedermayer 07master:15eda746e738: avcodec/proresenc_anatoliy: remove coded_frame use
[01:46:25 CEST] <cone-787> ffmpeg 03Michael Niedermayer 07master:9789595189e7: avcodec/utils: Set coded_frame.pict_type in generic code
[03:24:36 CEST] <cone-787> ffmpeg 03Michael Niedermayer 07master:0a6aa30f7c05: avcodec/h264_refs: extend RAP heuristic to multiple PPS
[03:24:37 CEST] <cone-787> ffmpeg 03Michael Niedermayer 07master:9ec17e45b28a: avcodec/h264_refs: Allow slightly larger pps_ref_count[0] in PAFF RAP detection heuristic
[09:11:03 CEST] <rcombs> why does yuv410p have `.log2_chroma_h = 2`?
[09:11:09 CEST] <rcombs> (AV_PIX_FMT_YUV410P)
[09:11:16 CEST] <rcombs> seems like that should be 1
[09:13:45 CEST] <rcombs> michaelni: ^ seems like it's been like that since it's existed
[09:13:51 CEST] <rcombs> (pixdesc.c, that is)
[10:08:27 CEST] <L0RE> Does someone know "#EXT-X-KEY:METHOD=AES-128,URI=" ?
[10:09:17 CEST] <L0RE> in m3u playlists?
[10:09:29 CEST] <JEEB> sounds like the usual 128bit AES encryption :P
[10:09:58 CEST] <JEEB> the URI is usually something that is behind some kind of authentication (possibly over HTTPS)
[10:10:05 CEST] <L0RE> why does ffmpeg not support it? 
[10:11:28 CEST] <L0RE> why I'm asking is, I found out how to decode it; maybe it would be useful for ffmpeg
[10:12:44 CEST] <L0RE> so I thought I'd ask here where to put it, if it's wanted. (I don't know ffmpeg's policy; maybe DRM isn't wanted in ffmpeg, so I thought I'd ask here before making any effort)
[10:13:45 CEST] <nevcairiel> it's just AES encryption, it's not hard to find out how to decode it =p
[10:14:39 CEST] <L0RE> nevcairiel: that's why I wondered why ffmpeg doesn't support it...
[10:15:49 CEST] <nevcairiel> it should be supported
[10:16:45 CEST] <L0RE> then I'm too stupid to use it. Can you give me a hint?
[10:17:30 CEST] <L0RE> how to give ffmpeg the keyfile and the salt, so the files can be decoded...
[10:17:50 CEST] <nevcairiel> the keyfile is supposed to be referenced in the URI parameter
[10:18:49 CEST] <L0RE> hmm, then I wonder why it is not working... since which version should it be working?
[10:22:53 CEST] <wm4> just use git
[10:24:43 CEST] <L0RE> ok, thanks, I'll try
[10:24:52 CEST] <L0RE> bye
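
For reference, the hls demuxer fetches the key from the #EXT-X-KEY URI itself and applies it, so a current git build should handle such playlists transparently. For anyone doing it by hand, segment payloads are plain AES-128 in CBC mode, which libavutil's AVAES API can undo. A minimal sketch (the key and IV are assumed to have been fetched already; the buffer length must be a multiple of 16):

    /* Decrypt one AES-128-CBC encrypted HLS segment in place. */
    #include <stdint.h>
    #include <libavutil/aes.h>
    #include <libavutil/mem.h>

    static int decrypt_segment(uint8_t *data, int size,
                               const uint8_t key[16], uint8_t iv[16])
    {
        struct AVAES *aes = av_aes_alloc();
        if (!aes)
            return -1;
        av_aes_init(aes, key, 128, 1);                   /* 1 = decrypt */
        av_aes_crypt(aes, data, data, size / 16, iv, 1); /* CBC, in place */
        av_free(aes);
        return 0;
    }
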
[11:15:29 CEST] <michaelni> rcombs, yuv410 has chroma for each 4x4 luma
[11:18:45 CEST] <rcombs> oh, hmm, so it's inconsistent notation?
[11:18:59 CEST] <rcombs> https://patches.libav.org/patch/27507/ looks like libav had some discussion on the topic
[11:20:57 CEST] <rcombs> but the comment on the AV_PIX_FMT_YUV410P definition is pretty clear about it, so I guess it's probably fine
[11:21:14 CEST] <nevcairiel> 4:1:0 should be one chroma for 2x4 pixels, shouldn't it?
[11:22:25 CEST] <nevcairiel> all these notations operate on a 2 rows 4 columns pixel grid
[11:39:00 CEST] <michaelni> there was one description of the yuv notation somewhere, from someone who knew the people on the actual committee writing this, but I can't find it anymore
[11:42:51 CEST] <rcombs> apparently some people call the 1-per-4x4 format "YUV9"
[11:43:23 CEST] <rcombs> nevcairiel: yeah, to keep with the common syntax, and that's also what e.g. Wikipedia says on the topic (though they don't have a specific source for it)
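
For context: log2_chroma_w/log2_chroma_h in pixdesc give the chroma plane dimensions as a right shift of the luma dimensions, so the 1-chroma-per-4x4-luma layout of AV_PIX_FMT_YUV410P really does want both fields set to 2, whatever the X:Y:Z notation is taken to mean. A small illustration (the helper function is made up, only the arithmetic matters):

    #include <libavutil/pixdesc.h>

    /* For AV_PIX_FMT_YUV410P, log2_chroma_w == log2_chroma_h == 2,
     * so 352x288 luma gives 88x72 chroma (one sample per 4x4 luma). */
    static void chroma_dimensions(enum AVPixelFormat fmt, int w, int h,
                                  int *cw, int *ch)
    {
        const AVPixFmtDescriptor *desc = av_pix_fmt_desc_get(fmt);
        *cw = (w + (1 << desc->log2_chroma_w) - 1) >> desc->log2_chroma_w;
        *ch = (h + (1 << desc->log2_chroma_h) - 1) >> desc->log2_chroma_h;
    }
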
[12:15:46 CEST] <kierank> pffft still more crashes with sliced threads
[12:15:51 CEST] Action: kierank starts doing more fuzzing
[13:27:36 CEST] <cone-116> ffmpeg 03Ganesh Ajjanagadde 07master:13d605e090bd: wavdec: make sample count check more precise
[14:06:31 CEST] <cone-116> ffmpeg 03Michael Niedermayer 07master:f40ec7047864: avformat/wavdec: Check for data_size overflow
[15:16:53 CEST] <Compn> goddamn "internet police"
[15:20:55 CEST] <Compn> j-b : has anyone threatened your hosts' internet backbone about your website? 
[15:21:09 CEST] <Compn> I thought it happened to ffmpeg/mplayer once but I can't remember.
[15:21:41 CEST] <Compn> sometimes they threaten the first host, then when that doesn't work they go bug the backbones haha
[15:21:46 CEST] <j-b> Compn: yes.
[15:22:05 CEST] <Compn> and your host handled it ok ?
[15:22:15 CEST] <Compn> they didn't freak out and shut you down etc
[15:24:05 CEST] Action: Compn afk
[15:34:44 CEST] <J_Darnley> Who the heck freaks out because 1 anti-virus scan by 1 scanner from 1 company found 1 possibly infected file elsewhere on the internet?
[15:38:09 CEST] <mathieu> I'd like to modify the H264 decoder and add it to ffmpeg, but I can't get it working
[15:38:47 CEST] <ubitux> J_Darnley: more like 10 AV
[15:39:01 CEST] <mathieu> How do I add the new decoder?
[15:39:23 CEST] <mathieu> in allcodecs.c, I added a REGISTER_DECODER line
[15:39:24 CEST] <J_Darnley> Are you looking for allcodecs.c?
[15:40:00 CEST] <J_Darnley> Okay, then what happens when you build?
[15:40:02 CEST] <mathieu> but I don't know where to put the other files (at this time, just a copy of the h264 is enough)
[15:40:35 CEST] <J_Darnley> codecs go in libavcodec.
[15:40:40 CEST] <mathieu> I get an error: undefined symbol ff_h264_custom_decoder
[15:40:55 CEST] <J_Darnley> you need to add files to the makefile if you want them built
[15:41:05 CEST] <mathieu> it's the H264 codec, but I'm going to modify the decoding code
[15:41:47 CEST] <mathieu> Makefile.am?
[15:41:51 CEST] <J_Darnley> No
[15:41:58 CEST] <J_Darnley> libavcodec/Makefile
[15:42:34 CEST] <J_Darnley> No auto-tools rubbish here
[15:45:16 CEST] <mathieu> okay, I'll have a look at that, thanks
[16:09:49 CEST] <mathieu> I can get my own copy compiled & working, but only if I disable the original h264
[16:10:29 CEST] <BtbN> if your copy defines colliding symbols, that's not too surprising.
[16:10:33 CEST] <BtbN> What are you even trying to do?
[16:10:45 CEST] <mathieu> I'd like to keep both, and have my custom.c file call functions in the original h264.c
[16:11:21 CEST] <mathieu> I'm going to modify the code of H264 to decode, but I need both
[16:11:49 CEST] <mathieu> so I want to have one extra decoder available, running my custom code, but calling the core functions of h264
[16:12:07 CEST] <mathieu> is that possible? feasible?
[16:12:49 CEST] <mathieu> Ultimately I need to create a custom gstreamer element based on the avdec_h264 decoder
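
To spell out the registration steps discussed above, here is a minimal sketch against the tree of the time (the names h264_custom and h264custom.c are made up for illustration; configure has to be re-run so it picks the new REGISTER_DECODER entry up from allcodecs.c):

    /* libavcodec/h264custom.c (hypothetical skeleton) */
    #include "libavutil/internal.h"
    #include "avcodec.h"

    static int h264_custom_init(AVCodecContext *avctx)
    {
        return 0; /* set up state, or call into the real h264 code here */
    }

    static int h264_custom_decode(AVCodecContext *avctx, void *data,
                                  int *got_frame, AVPacket *avpkt)
    {
        *got_frame = 0;
        return avpkt->size; /* stub: consume the packet, output nothing */
    }

    AVCodec ff_h264_custom_decoder = {
        .name      = "h264_custom",
        .long_name = NULL_IF_CONFIG_SMALL("custom H.264 wrapper"),
        .type      = AVMEDIA_TYPE_VIDEO,
        .id        = AV_CODEC_ID_H264,
        .init      = h264_custom_init,
        .decode    = h264_custom_decode,
    };

    /* libavcodec/Makefile:
     *   OBJS-$(CONFIG_H264_CUSTOM_DECODER) += h264custom.o
     * libavcodec/allcodecs.c:
     *   REGISTER_DECODER(H264_CUSTOM, h264_custom)
     */

Copying h264.c wholesale duplicates its exported ff_* symbols, which is why enabling both decoders at once collides; keeping the custom decoder in its own file with its own symbol names avoids that.
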
[16:38:05 CEST] <cone-116> ffmpeg 03Michael Niedermayer 07master:d64ba25a4dd7: ffplay: Use sws_scale to scale subtitles
[17:03:23 CEST] <cone-116> ffmpeg 03Nedeljko Babic 07master:902bfa5b2208: avcodec/aacdec_fixed: Fix preparation for resampler
[17:32:15 CEST] <cone-116> ffmpeg 03Michael Niedermayer 07master:f4ada6dc3ff7: ffmpeg: Implement support for seeking relative to EOF
[17:50:27 CEST] <ocrete> how am I supposed to know the final format of the decoded picture when get_buffer2() is called? for example, I can't find out if the stream is interlaced or not before a frame comes out, and I need that information to set up the sink that receives the frames; if I wait for a frame to come out it's too late, as I can't change the stride once decoding has started, as I understand it?
[17:55:49 CEST] <nevcairiel> interlacing is of no consequence to the memory requirements for the image buffer, and all other relevant properties are well known at that time
[18:01:55 CEST] <ocrete> and the height is incorrect in the AVFrame
[18:02:09 CEST] <ocrete> it's the coded height, not the display height
[18:02:21 CEST] <ocrete> so it can't be used to initialise the display device
[18:02:28 CEST] <nevcairiel> during get_buffer2, that's the crucial height
[18:02:35 CEST] <nevcairiel> since that's the size the buffer needs to be
[18:03:06 CEST] <nevcairiel> you can get the display height in most cases from AVCodecContext->height
[18:03:28 CEST] <ocrete> "in most cases" doesn't cut it.. I'm working on gst-ffmpeg, so we need something generic
[18:03:48 CEST] <nevcairiel> that's as generic as it gets
[18:04:04 CEST] <nevcairiel> and get_buffer2 is not a callback to initialize your display, it's a callback to get a memory buffer to decode the frame into
[18:04:16 CEST] <nevcairiel> if you use it for something else, you get to deal with problems from that :)
[18:04:19 CEST] <ocrete> on a lot of hardware, you need to initialise the display to get the RAM from it
[18:04:39 CEST] <BtbN> Is there any documentation on how to use the opencl stuff ffmpeg has?
[18:04:45 CEST] <BtbN> In terms of adding support to other filters.
[18:06:05 CEST] <ocrete> in particular, for anything that uses v4l2 output devices
[18:07:31 CEST] <wm4> you don't need to override get_buffer2
[18:08:35 CEST] <ocrete> ?
[18:08:55 CEST] <wm4> lavc will allocate the AVFrame for you
[18:19:14 CEST] <ocrete> wm4: yes, if I wanted memory from malloc(); I'm trying to make it work with externally allocated memory
[18:19:45 CEST] <wm4> why does it have to be externally allocated memory?
[18:20:33 CEST] <ocrete> wm4: because you want to avoid a memcpy? for example, when outputting to a device that requires contiguous memory?
[18:21:45 CEST] <wm4> are you sure software decoding would be fast enough to even reach realtime on such a crappy restricted device?
[18:23:40 CEST] <ocrete> wm4: yes
[18:23:55 CEST] <wm4> but a memcpy kills it?
[18:27:35 CEST] <cone-116> ffmpeg 03Michael Niedermayer 07master:1fc20af6af3e: avcodec/dvbsubdec: Implement display definition segment fallback from  ETSI EN 300 743 V1.3.1
[18:27:36 CEST] <ocrete> well it doesn't help
[18:27:50 CEST] <ocrete> wm4: you'd be surprised how fast ffmpeg's decoder is vs a memcpy
[18:28:37 CEST] <BtbN> You must be using a crappy memcpy implementation then.
[18:28:48 CEST] <wm4> at this point you could probably decode a frame to software memory, initialize the display, and then decode to video ram
[18:28:54 CEST] <wm4> by starting again
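
Since the question comes up regularly: a get_buffer2 override can hand lavc externally allocated, contiguous memory by wrapping it in an AVBufferRef. A rough sketch (my_acquire_buffer/my_release_buffer are hypothetical device-specific helpers; real code also has to satisfy the alignment and padding reported by avcodec_align_dimensions2(), and frame->width/height at this point are the coded size, not the display size):

    #include <libavcodec/avcodec.h>
    #include <libavutil/imgutils.h>

    /* hypothetical device allocator, declared only for the sketch */
    uint8_t *my_acquire_buffer(void *device, int size, void **handle);
    void     my_release_buffer(void *handle);

    static void free_cb(void *opaque, uint8_t *data)
    {
        my_release_buffer(opaque);  /* hand the memory back to the device */
    }

    static int my_get_buffer2(AVCodecContext *avctx, AVFrame *frame, int flags)
    {
        int ret, size = av_image_get_buffer_size(frame->format, frame->width,
                                                 frame->height, 32);
        void *handle;
        uint8_t *mem;

        if (size < 0)
            return size;
        mem = my_acquire_buffer(avctx->opaque, size, &handle);
        if (!mem)  /* fall back to lavc's own allocator */
            return avcodec_default_get_buffer2(avctx, frame, flags);

        frame->buf[0] = av_buffer_create(mem, size, free_cb, handle, 0);
        if (!frame->buf[0]) {
            my_release_buffer(handle);
            return AVERROR(ENOMEM);
        }

        ret = av_image_fill_arrays(frame->data, frame->linesize, mem,
                                   frame->format, frame->width,
                                   frame->height, 32);
        return ret < 0 ? ret : 0;
    }
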
[19:10:15 CEST] <cone-116> ffmpeg 03Michael Niedermayer 07master:42209eb95547: doc/ffmpeg: Use @code
[19:10:16 CEST] <cone-116> ffmpeg 03Michael Niedermayer 07master:0949869e7b80: ffmpeg_op: Print warning if duration isnt known when -sseof is used
[20:33:02 CEST] <cone-116> ffmpeg 03James Almer 07master:6c87b866285f: avformat/rtmpproto: use AVHMAC instead of a custom implementation
[21:34:45 CEST] <nevcairiel> f'ing mips, I hope the aac guy isn't distracted for months again trying to fix some silly mips problem
[21:35:59 CEST] <jamrial> you mean the recent aac_fixed failures?
[21:36:07 CEST] <nevcairiel> no
[21:36:08 CEST] <nevcairiel> aac encoder
[21:36:20 CEST] <nevcairiel> for some reason the encoder re-implements a lot of functions in C
[21:36:24 CEST] <nevcairiel> not even asm or simd
[21:36:25 CEST] <nevcairiel> just C
[21:36:30 CEST] <wm4> wat
[21:36:32 CEST] <nevcairiel> and it breaks occasionally when the encoder is patched
[21:36:42 CEST] <nevcairiel> because the mips C isn't updated
[21:36:56 CEST] <wm4> why does the aac encoder even have mips optimization
[21:37:07 CEST] <nevcairiel> more importantly, why special C code
[21:37:19 CEST] <jamrial> if they are plain c then they should be removed
[21:37:59 CEST] <jamrial> C code that outputs optimized asm with a specific compiler is silly
[21:38:10 CEST] <wm4> the code seems to have inline asm
[21:40:40 CEST] <jamrial> is c99-to-89 still maintained or in development?
[21:46:28 CEST] <rcombs> raise your hand if you give a shit about MIPS in 2015
[21:49:29 CEST] <jamrial> isn't it used in some smartphones?
[21:49:30 CEST] Action: J_Darnley raises hand
[21:49:42 CEST] <J_Darnley> I totally want to encode 4k video on my router
[21:50:27 CEST] <BtbN> The new Chinese Loongson thing is MIPS
[21:50:33 CEST] <BtbN> Which is also where most mips patches come from.
[21:50:35 CEST] <wm4> "return AVPROBE_SCORE_MAX + 1;"
[21:50:38 CEST] <wm4> sure looks funny
[21:51:31 CEST] <rcombs> wat
[21:52:00 CEST] <wm4> you'd think MAX is already the maximum
[21:56:02 CEST] <JEEB> up to eleven, son
[22:09:28 CEST] <rcombs> oh wow, it goes back a very long time
[22:09:29 CEST] <rcombs> 07c4ed85
[22:09:35 CEST] <rcombs> I'd advocate for changing that
[22:11:54 CEST] <wm4> heh
[22:12:49 CEST] <rcombs> looks like it could have easily been a mistake
[22:12:54 CEST] <wm4> it's this ffserver pseudo-protocol
[22:13:00 CEST] <wm4> it's a lost cause anyway
[22:30:07 CEST] <wm4> uh, int-size_t is size_t, right?
[22:38:13 CEST] <nevcairiel> int-size_t wat
[22:41:46 CEST] <BtbN> Is the opencl stuff really only used in two filters?
[22:43:20 CEST] <wm4> nevcairiel: typeof((int)1-(size_t)1)
[22:43:30 CEST] <wm4> BtbN: probably, why not
[22:44:01 CEST] <BtbN> Seems odd, it might be useful for a lot of other filters
[22:45:08 CEST] <jamrial> but someone needs to write them, and the guys behind the opencl stuff apparently only cared about those two
[22:46:11 CEST] <nevcairiel> wm4: then yes, it's size_t
[22:46:17 CEST] <nevcairiel> although... isn't size_t unsigned?
[22:46:33 CEST] <wm4> yes
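
A quick standalone illustration of the conversion rule above: with an int and a size_t operand, the usual arithmetic conversions promote the int to size_t, so the subtraction is unsigned and a "negative" result wraps around.

    #include <stdio.h>
    #include <stddef.h>

    int main(void)
    {
        int    have = 3;
        size_t need = 5;

        /* 3 - 5 is evaluated in size_t, so it wraps to SIZE_MAX - 1
         * instead of producing -2; the comparison below is always false. */
        if (have - need < 0)
            puts("never printed");
        printf("%zu\n", have - need);  /* 18446744073709551614 on LP64 */
        return 0;
    }
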
[22:53:33 CEST] <wm4> nevcairiel: did you ever see what crap modern mkvmerge versions write
[22:53:54 CEST] <wm4> tons of tags with muxer info
[22:54:46 CEST] <nevcairiel> yeah, but at least those are namespaces
[22:55:10 CEST] <BtbN> I might look into adding opencl support to the color/chroma key stuff. But the two filters that support it aren't exactly simple, so finding out how to use it is quite hard.
[22:55:29 CEST] <wm4> oh wait, somehow I thought this patch just used tags which mkvmerge already "standardized"
[22:55:32 CEST] <wm4> I should check this
[22:57:08 CEST] <nevcairiel> if you find out if there is one that mkvmerge already writes that just does this, i might not complain anymore
[22:58:14 CEST] <wm4> mkvmerge writes per-track DURATION entries, but as strings
[22:59:04 CEST] <wm4> http://sprunge.us/OieR
[23:21:52 CEST] <Guest35713> hi! how does the chapters functionality work? can I use the avpriv_new_chapter thing to write my own custom implementation of chapters? instead of chapters being defined by timestamps, I want to define chapters as data offsets in the underlying bytestream. is this possible?
[23:23:09 CEST] <wm4> why do they have to be data offsets?
[23:23:16 CEST] <Guest35713> also, while I am demuxing an input file into an output file, can I tell the demuxer / codec to print the current timestamp (using this timestamp value, I can create .cue files)?
[23:24:28 CEST] <Guest35713> wm4, that is how the underlying file format works, it seems
[23:25:14 CEST] <Guest35713> the bytestream undergoes some internal processing before it can be given to the decoder (codec)
[23:25:53 CEST] <wm4> in ffmpeg, packets must have timestamps
[23:26:02 CEST] <wm4> so the data has timestamps before it goes to the decoder
[23:26:21 CEST] <Daemon404> (NOPTS)
[23:26:22 CEST] Action: Daemon404 runs
[23:27:11 CEST] <Guest35713> wm4, oh, cool! I don't see "takdec.c" handling timestamps at all. how is this done? :)
[23:27:44 CEST] <wm4> not sure what the question is, at all
[23:27:51 CEST] <Guest35713> in my "read_packet" function, can I access the timestamp of the last / current packet? is this possible?
[23:28:03 CEST] <wm4> in tak, each packet generates a fixed number of samples I think
[23:28:30 CEST] <wm4> so you always know what timestamp a packet has by adding up the number of samples of the preceding packets
[23:29:24 CEST] <Guest35713> ah ok, AVPacket has some internal state which I need to take a look at then.
[23:30:14 CEST] <Guest35713> also, I don't really know how many samples are being written out from my "read_packet" function, hopefully there will be some AVPacket field keeping track of this?
[23:33:25 CEST] <cone-183> ffmpeg 03Michael Niedermayer 07master:f8b81a02c980: avformat/oggdec: ogg_read_seek: reset ogg after seeking
[23:34:48 CEST] <Guest35713> wm4, I know when a new chapter starts when I am in "read_packet" function, and I now am trying to print the corresponding timestamp (so that I can make a .cue file). also, I don't really know how many samples I have written out (samples per packet are quite variable too). any hints?
[23:40:01 CEST] <wm4> Guest35713: ffmpeg API users assume chapters are available after the file has been opened
[23:40:24 CEST] <wm4> and I'm not sure how the format you're dealing with works
[23:40:40 CEST] <wm4> if the classic demuxer/decoder separation doesn't work at all, you could move the decoder into the demuxer
[23:40:46 CEST] <wm4> (though we try to avoid such things)
[23:41:59 CEST] <Guest35713> wm4, the chapters are known by their data offsets (and not by any timestamps). weird but that is how it is. converting these data offsets into timestamps seems to be a real pain (after all this talk) ;( heh
[23:44:12 CEST] <Guest35713> the file format can actually use any decoder (codec), so I can't move all the decoders into the demuxer. one hack -> in read_packet, I can ask the underlying decoder to give me its timestamp (or something) maybe? then I print this out to stderr / a file to create a cue file? ugly but better than nothing.
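
For completeness: once a byte offset can be mapped to a timestamp somehow, a demuxer registers its chapters from read_header with avpriv_new_chapter(), since API users expect them to exist right after avformat_open_input(). A rough sketch (offset_to_ts() is hypothetical, and the 1/1000 time base is arbitrary):

    #include "avformat.h"
    #include "internal.h"

    /* hypothetical: map a byte offset in the stream to a timestamp in ms */
    int64_t offset_to_ts(AVFormatContext *s, int64_t offset);

    static int add_chapters(AVFormatContext *s, const int64_t *offsets,
                            int nb_chapters)
    {
        int i;
        for (i = 0; i < nb_chapters; i++) {
            int64_t start = offset_to_ts(s, offsets[i]);
            int64_t end   = i + 1 < nb_chapters ?
                            offset_to_ts(s, offsets[i + 1]) : AV_NOPTS_VALUE;
            if (!avpriv_new_chapter(s, i, (AVRational){1, 1000},
                                    start, end, NULL))
                return AVERROR(ENOMEM);
        }
        return 0;
    }
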
[23:45:09 CEST] <wm4> there are parsers
[23:45:18 CEST] <Guest35713> in oggdec.c there is something called ogg_calc_pts (it sounds promising) but I don't really know what it does
[23:45:54 CEST] <wm4> you can look at how mp3dec.c works
[23:45:59 CEST] <wm4> though it's not very intuitive
[23:46:32 CEST] <wm4> read_packet in mp3dec.c just reads relatively arbitrary 1024 bytes
[23:46:41 CEST] <wm4> and the parser rearranges this into actual mp3 packets
[23:47:05 CEST] <wm4> and libavformat/utils.c automagically determines the packet timestamp by calling the parser
[23:49:13 CEST] <Guest35713> wm4, sounds promising! I will take a look and come back, thanks! :)
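
The mp3dec.c pattern described above boils down to two things in the demuxer: hand out raw byte chunks from read_packet and mark the stream for full parsing, after which libavformat/utils.c runs the codec parser, re-splits the data into real frames and fills in their timestamps. A generic sketch against the tree of the time (the names are made up, and st->codec was still the way to set codec parameters then):

    #include "avformat.h"

    static int raw_read_header(AVFormatContext *s)
    {
        AVStream *st = avformat_new_stream(s, NULL);
        if (!st)
            return AVERROR(ENOMEM);
        st->codec->codec_type = AVMEDIA_TYPE_AUDIO;
        st->codec->codec_id   = AV_CODEC_ID_TAK;  /* whatever the payload is */
        st->need_parsing      = AVSTREAM_PARSE_FULL_RAW;
        return 0;
    }

    static int raw_read_packet(AVFormatContext *s, AVPacket *pkt)
    {
        /* arbitrary chunk size; the parser fixes up the framing and
         * timestamps later */
        return av_get_packet(s->pb, pkt, 1024);
    }
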
[00:00:00 CEST] --- Thu Jul 30 2015

