[Ffmpeg-devel-irc] ffmpeg.log.20180718

burek burek021 at gmail.com
Thu Jul 19 03:05:03 EEST 2018


[02:59:36 CEST] <bob123> Cracki thanks for the reply
[02:59:49 CEST] <bob123> I will check those links out
[03:00:19 CEST] <bob123> I found out that you can export color curves from adobe photoshop and load them in libavfilter
[03:00:30 CEST] <bob123> so I have been experimenting with that
[05:07:31 CEST] <killown> [mov,mp4,m4a,3gp,3g2,mj2 @ 0xc0a280] moov atom not found
[05:07:43 CEST] <killown> using ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1
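
A "moov atom not found" error usually means the MP4 is truncated or was never finalized: the moov atom holds the index and is typically written at the end of the file, so an interrupted write loses it. A file with no moov at all is generally unrecoverable, but when producing MP4s the moov can be moved to the front at mux time so partially transferred files still probe. A minimal sketch (file names are placeholders):

    ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4
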
[12:38:53 CEST] <King_DuckZ> hi, does anyone know if 32-bit float RGB is supported in ffmpeg? I want to convert from that to yuv444p using sws_scale
[12:41:21 CEST] <durandal_1707> nope
[12:41:44 CEST] <durandal_1707> only way is using zscale filter via libavfilter
[12:42:47 CEST] <King_DuckZ> hm, I just found an AV_PIX_FMT_FLAG_FLOAT defined in libavutil/pixdesc.h, is that what it's used with?
[12:44:43 CEST] <remlap> Possibly the right place to ask: is there any video editor for linux that can do smart h264 encoding, as in it only re-encodes from a cut to the next keyframe?
[12:45:38 CEST] <durandal_1707> King_DuckZ: yes, but that is just a helper; the pixel format exists, but there is no swscale support
[12:47:10 CEST] <King_DuckZ> durandal_1707: so what's the difference between that and the zscale thing you mentioned earlier? it's not like I must use swscale either...
[12:50:00 CEST] <durandal_1707> King_DuckZ: zscale filter interface is much simpler
[12:50:32 CEST] <durandal_1707> but you need to use a special filtergraph invocation
[12:51:55 CEST] <King_DuckZ> ok, I have no idea what you're talking about and in the docs I can only see zscale_class, zscale_options and ZScaleContext, is there a usage example somewhere? one with explanations, if possible?
[12:53:38 CEST] <durandal_1707> King_DuckZ: if you do not know how to use the libavfilter library, do not use it
[12:55:02 CEST] <King_DuckZ> I thought you just said that's what I should do
[12:55:15 CEST] <King_DuckZ> <durandal_1707> only way is using zscale filter via libavfilter
[13:00:40 CEST] <durandal_1707> King_DuckZ: have you ever used libavfilter?
[13:01:11 CEST] <durandal_1707> for zscale you will need to install/have the zimg library too
[13:05:31 CEST] <InTheWings> can I haz confirmation on mailing list plz
[13:36:30 CEST] <King_DuckZ> durandal_1707: I don't think I have, but if that's the way forward... I guess my alternative would be to convert to 8-bit RGB myself first, then pass that buffer on to swscale... which sounds very slow
[13:37:47 CEST] <King_DuckZ> or I could just do the conversion myself... I could check how slow that would be maybe
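
For reference, a minimal sketch of the zscale route being discussed, written as a command line; file names are placeholders, and it assumes an input that decodes to float RGB (e.g. gbrpf32) and a build with libzimg. zscale performs the conversion and the trailing format filter requests yuv444p; the same graph string can be fed to avfilter_graph_parse2() when driving libavfilter from C:

    ffmpeg -i input.exr -vf "zscale=matrix=709,format=yuv444p" output.mkv
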
[13:54:55 CEST] <zap0> ffprobe -gimme_all_details    outputs piles of text.... what character encoding is that?
[14:00:51 CEST] <JEEB> it should be utf-8
[14:01:08 CEST] <JEEB> except maybe with windows where it might be windows unicode (UCS-2/UTF-16)
[14:01:37 CEST] <JEEB> also I recommend if you're going to parse ffprobe's stuff that you utilize a specific output format
[14:01:40 CEST] <JEEB> -of json, for example
[14:02:20 CEST] <zap0> just been reading lots of the manual; -print_format flat is in "shell output escaped" form (what that means, i don't know)
[14:02:48 CEST] <zap0> the -print_format ini docs say it is in UTF-8... which i might be able to parse with similar efficiency
[14:03:34 CEST] <zap0> i was using flat, and i saw non-7-bit-ASCII in the output of 1 video file
[14:03:42 CEST] <zap0> (which is the cause of my problems)
[14:04:16 CEST] <zap0> there is json and xml..  but i'd rather eat glass than use those
[14:04:42 CEST] <JEEB> python etc. make usage of JSON pretty simple, and iterable entities are nice
[14:04:53 CEST] <JEEB> from json import loads as load_json
[14:05:02 CEST] <JEEB> load_json(ffprobe_output)
[14:05:05 CEST] <JEEB> and you get a dictionary
[14:05:16 CEST] <zap0> https://www.ffmpeg.org/ffprobe-all.html#flat    4.3 flat.  why does that not state the encoding more clearly?
[14:06:11 CEST] <zap0> (i'm using C++)
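
For anyone following along, a minimal sketch of the structured output JEEB suggests (the input name is a placeholder); the JSON is emitted as UTF-8 and parses into a single top-level object:

    ffprobe -v error -of json -show_format -show_streams input.mp4
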
[15:10:10 CEST] <zx_> I have downloaded the ffmpeg 4.0.1 source and built it on my Mac; it works. Then I tried the decode_video.c example from the doc dir. Building decode_video.c with "make examples" or with Xcode both work. But when I supply two mp4 files for input, the error info is https://pastebin.com/32SkNzYp
[15:14:20 CEST] <zx_> quit
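
For context: doc/examples/decode_video.c in that release reads a raw MPEG-1 video elementary stream, not a container, so feeding it an .mp4 fails at the parser. A hedged sketch of producing suitable input (file names are placeholders):

    ffmpeg -i input.mp4 -an -c:v mpeg1video -f mpeg1video test.m1v
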
[15:54:32 CEST] <shinsh> hello~
[15:57:25 CEST] <shinsh> When will I see my article about compiling on https://ffmpeg.zeranoe.com/forum?
[15:57:45 CEST] <DHE> huh?
[15:59:18 CEST] <shinsh> When will the article I wrote appear in the Compiling category of https://ffmpeg.zeranoe.com/forum?
[15:59:43 CEST] <zap0> why are you asking in here?
[16:01:15 CEST] <shinsh> I'm sorry, I thought this was the IRC channel for the forum at ffmpeg.zeranoe.com.
[16:02:46 CEST] <JEEB> no, this is the actual FFmpeg project's IRC channel
[16:02:58 CEST] <JEEB> Zeranoe is Zeranoe
[16:20:28 CEST] <Tu13es> hi all.  I've started ripping old family VHS tapes to mp4 via an Elgato Video Capture.  it just creates one large .mp4 file for each tape.  can I use ffmpeg to somehow split the file based on a scene change and/or black frames and/or static?
[16:29:17 CEST] <DHE> that's not really an ffmpeg feature in one shot. you can probably do it in two shots: use the blackdetect filter for pass 1, then run another pass with the segment output target and a timestamp list
[16:32:52 CEST] <Tu13es> i see.  what about using scene detection?
[16:33:05 CEST] <Tu13es> seems like i'd do what you mentioned, then do it again except with scene detection instead of blackdetect?
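
A hedged sketch of the two-pass split DHE describes; the thresholds and cut times are illustrative values to tune. Pass 1 logs black intervals to stderr, pass 2 copies the streams and splits at the timestamps gathered from that log (with -c copy the cuts snap to keyframes):

    ffmpeg -i tape.mp4 -vf blackdetect=d=0.5:pix_th=0.10 -an -f null - 2>black.log
    ffmpeg -i tape.mp4 -c copy -f segment -segment_times 314.2,1029.8 -reset_timestamps 1 part%03d.mp4

For the scene-change variant, a filter like select='gt(scene,0.4)',showinfo in pass 1 prints candidate timestamps instead.
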
[18:11:22 CEST] <ChocolateArmpits> Does ffmpeg cache a single still image input, or does it read it continuously?
[19:33:51 CEST] <kepstin> ChocolateArmpits: I'm pretty sure that ffmpeg's image input with the -loop option will re-read the image sequence
[19:34:38 CEST] <ChocolateArmpits> kepstin, heh don't remember if I used -loop or not
[19:34:41 CEST] <ChocolateArmpits> for the still
[19:34:54 CEST] <ChocolateArmpits> So the best I can hope for is that the drive has the file cached?
[19:34:59 CEST] <kepstin> if you don't use -loop, it'll decode a single frame then give end of file
[19:35:02 CEST] <kepstin> and exit
[19:35:19 CEST] <kepstin> (you can use the "loop" filter to repeat the frame in-memory if you prefer)
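
A minimal sketch of the loop-filter approach kepstin mentions (file name and duration are placeholders): the single decoded frame is repeated in memory, so the demuxer is done with the file after one read:

    ffmpeg -i still.png -vf loop=loop=-1:size=1 -t 10 out.mp4
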
[19:35:51 CEST] <ChocolateArmpits> what do you mean in-memory? would it not read the image from the disk under those circumstances?
[19:36:29 CEST] <kepstin> filters are completely separate from the input/demuxer
[19:36:43 CEST] <ChocolateArmpits> oh I missed the "filter" thing
[19:38:11 CEST] <ChocolateArmpits> hmm, wouldn't the process quit when end of file is reached, even with this filter?
[19:40:21 CEST] <ChocolateArmpits> just to state my situation, I want to overlay an image over a video
[19:41:02 CEST] <JEEB> I think the overlay filter had an option regarding what to do if one input EOFs
[19:41:03 CEST] <ChocolateArmpits> so the overlay is absolutely clear, I just don't want the drive to be pointlessly read
[19:41:31 CEST] <JEEB> https://www.ffmpeg.org/ffmpeg-all.html#overlay-1
[19:41:39 CEST] <JEEB> https://www.ffmpeg.org/ffmpeg-all.html#framesync
[19:41:49 CEST] <JEEB> > repeatlast=1
[19:42:18 CEST] <JEEB> also eof_action=repeat should be the same
[19:44:52 CEST] <kepstin> yeah, if you're doing overlay, the overlay filter's builtin eof stuff is the way to go for sure.
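
Putting the thread together, a minimal sketch (file names are placeholders): the PNG is decoded once, its input hits EOF, and overlay's default repeatlast=1 keeps compositing that last frame for the rest of the video, with no further disk reads:

    ffmpeg -i video.mp4 -i logo.png -filter_complex "overlay=W-w-10:10" out.mp4
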
[21:40:55 CEST] <ChocolateArmpits> ok it seems that overlay does it all right from the start
[00:00:00 CEST] --- Thu Jul 19 2018

