[Ffmpeg-devel-irc] ffmpeg.log.20190510

burek burek021 at gmail.com
Sat May 11 03:05:01 EEST 2019


[03:56:46 CEST] <sim590> Why does 'ffmpeg -f x11grab -framerate 25 -s 1920,0 -i :0.0' give me 'invalid argument'?
[04:11:04 CEST] <Hello71> one or more arguments are invalid
[04:12:36 CEST] <furq> sim590: -s is the size of the capture window
[04:12:43 CEST] <furq> so 1920,0 is 0px high
[04:14:06 CEST] <furq> also 1920,0 is invalid syntax for -s anyway, it'd be 1920x0
[04:18:54 CEST] <sim590> furq: Yes. Right. I mixed up two options.
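Putting furq's two corrections together, a hedged sketch of a working capture command (the 1920x1080 size and the `:0.0+0,0` offset are assumptions; adjust to your display):

```shell
# Capture a 1920x1080 region of X display :0.0 at 25 fps.
# -video_size (alias -s) takes WxH, not W,H, and height must be > 0;
# an offset into the display goes after the display name instead.
ffmpeg -f x11grab -framerate 25 -video_size 1920x1080 -i :0.0+0,0 \
       -c:v libx264 capture.mkv
```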
[10:46:34 CEST] <pagios> hello, is there any element that would give me the resolution/bandwidth being sent from a broadcasted stream
[10:55:35 CEST] <JEEB> as you decode frames you get width/height/aspect ratio in the AVFrame
[10:55:59 CEST] <JEEB> for bandwidth I think the AVIO stuff should be able to give you some info on that
[10:56:17 CEST] <JEEB> you can also implement your own custom AVIO callbacks and that way you can control the IO yourself (and get full stats on it)
[11:46:46 CEST] <pagios> JEEB, so I need to do it programmatically, I cannot get it using the ffmpeg command line?
[12:00:59 CEST] <bilboed> Hi all. Is there any chance the following patch could be backported to the 4.1 branch ? https://patchwork.ffmpeg.org/patch/11148/
[12:12:35 CEST] <kubast2> https://ffmpeg.org/doxygen/0.5/pixfmt_8h.html What does "PIX_FMT_PAL8 	8 bit with PIX_FMT_RGB32 palette" mean?
[12:12:48 CEST] <kubast2> Is it a 255-color palette tied to rgb32?
[12:12:58 CEST] <kubast2> *256
[12:17:57 CEST] <furq> kubast2: it's an 8-bit palette that contains rgb32 colours
[12:18:38 CEST] <furq> also those docs are ancient, you probably want https://ffmpeg.org/doxygen/trunk/pixfmt_8h.html
[12:34:01 CEST] <Cracki> cheers. question on AVCodecContext, time_base vs framerate, for _variable_ frame rate stuff. (1) is it ok to set frame rate to 25/1 but time_base to 1/1000 and emit pts that aren't on multiples of 1/25 sec? (2) the docs say about time_base: "decoding: the use of this field for decoding is deprecated. Use framerate instead." what exactly is one supposed to do?
[12:34:17 CEST] <Cracki> the sample code for decoding uses time_base as well
[12:34:49 CEST] <JEEB> uhh, I haven't used anything else than time base in general...
[12:34:54 CEST] <JEEB> since the frame rate value can be a lot of things
[12:35:09 CEST] <JEEB> like, you get a packet from avformat and you have the stream's time base and the packet's timestamps are on that
[12:35:16 CEST] <Cracki> good. then I should disregard the docs on that? :P
[12:35:57 CEST] <Cracki> my first question is basically... might it happen that a video player is confused or suddenly thinks it's got 1000 fps video when I do what I proposed?
[12:36:31 CEST] <Cracki> or is that not "supposed" to happen and it's supposed to believe the 25 fps claim while also presenting frames at the pts I specified
[12:36:48 CEST] <JEEB> players in general should only trust the decoded PTS
[12:36:58 CEST] <Cracki> good good
[12:37:00 CEST] <JEEB> also there's the time base AV_TIME_BASE_Q
[12:37:10 CEST] <JEEB> which you might want to utilize instead of 1/1000
[12:37:41 CEST] <Cracki> benefits in doing that?
[12:37:42 CEST] <JEEB> anyways, do note that I didn't give this too much thought :P I just mostly remember matching things between demuxer stream/decoder/filter chain/encoder/muxer stream
[12:37:55 CEST] <JEEB> Cracki: 1/1000 is not often good enough
[12:38:10 CEST] <JEEB> you can see that with the FLV container for example
[12:38:21 CEST] <JEEB> which has a hard-coded time base of 1/1000
[12:38:25 CEST] <Cracki> it'll do for my purpose. I have source data with millisecond precision at most
[12:39:10 CEST] <Cracki> so my stuff is _really not_ 25 fps but I think I should report something at least
[12:39:26 CEST] <Cracki> (or maybe 0/1 but dunno what adobe premiere does with that)
[12:41:10 CEST] <JEEB> thankfully mp4 f.ex. doesn't even make you try to guesstimate
[12:41:17 CEST] <JEEB> you just don't have a "frame rate" field in the container :P
[12:41:23 CEST] <Cracki> heh
[12:41:34 CEST] <JEEB> just DTS and CTS (~ PTS)
[12:42:14 CEST] <Cracki> yeah the video track will be ~1/1000 timebase because source data, and i have no audio track, so container timebase should be obvious
[12:42:23 CEST] <Cracki> cresentation?
[12:42:43 CEST] <Cracki> ah composition
[12:46:05 CEST] <JEEB> (almost) the same thing, just mp4 uses a different word :)
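JEEB's point about 1/1000 (FLV's hard-coded time base) vs the finer AV_TIME_BASE_Q boils down to rational rescaling of tick counts. A minimal Python sketch of the arithmetic (roughly what `av_rescale_q` does with nearest-tick rounding; the helper name is my own):

```python
from fractions import Fraction

def rescale_ts(ts: int, tb_src: Fraction, tb_dst: Fraction) -> int:
    """Convert a timestamp expressed in tb_src ticks into tb_dst ticks,
    rounding to the nearest destination tick."""
    return round(ts * tb_src / tb_dst)

MS = Fraction(1, 1000)                   # FLV's fixed time base
AV_TIME_BASE_Q = Fraction(1, 1_000_000)  # FFmpeg's internal time base

# A frame at 40 ms (second frame of a nominal 25 fps stream):
print(rescale_ts(40, MS, AV_TIME_BASE_Q))   # 40000
# Frame index 3 of a 1/25 time base, expressed in milliseconds:
print(rescale_ts(3, Fraction(1, 25), MS))   # 120
```

Going the other way (fine to coarse) is where 1/1000 loses precision: distinct fine-grained timestamps can collapse onto the same millisecond tick.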
[14:52:24 CEST] <kubast2> Hey, I have a question: which pixel formats are reversible losslessly to rgb24?
[14:58:28 CEST] <c_14> all the rgb/bgr formats with at least 8 bits per channel, xyz12 and potentially yuv444 with >8 bits per channel
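c_14's caveat that 8-bit yuv444 is generally *not* lossless against rgb24 is easy to demonstrate. A hedged sketch using full-range BT.601 coefficients (my choice of matrix; real conversions vary by standard and rounding, but the quantization loss is the same idea):

```python
def rgb_to_yuv8(r, g, b):
    # Full-range BT.601 RGB -> YCbCr, quantized to 8 bits per channel.
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.564 * (b - y) + 128
    cr = 0.713 * (r - y) + 128
    return round(y), round(cb), round(cr)

def yuv8_to_rgb(y, cb, cr):
    # Exact mathematical inverse, then re-quantize to 8 bits.
    r = y + (cr - 128) / 0.713
    b = y + (cb - 128) / 0.564
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    clamp = lambda v: min(255, max(0, round(v)))
    return clamp(r), clamp(g), clamp(b)

# White survives the round trip, but many triples do not:
print(yuv8_to_rgb(*rgb_to_yuv8(255, 255, 255)))  # (255, 255, 255)
print(yuv8_to_rgb(*rgb_to_yuv8(1, 0, 0)))        # (0, 0, 0) -- lossy
```

This is why lossless round trips need either more YCbCr bits than the RGB source or a lifting-style transform.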
[15:11:51 CEST] <garyserj>  When I try    ffmpeg -i blah.mp4 -acodec libmp3lame -vcodec libx264 -vf transpose="vflip" aaa.mp4     I get  "Failed to inject frame into filter network: Invalid argument"  https://pastebin.com/raw/TEskU0MB
[15:12:35 CEST] <durandal_1707> garyserj: invalid command
[15:12:44 CEST] <garyserj> why?
[15:13:01 CEST] <durandal_1707> transpose filter does not accept such an argument
[15:13:13 CEST] <garyserj> so should I skip acodec and vcodec?
[15:13:35 CEST] <durandal_1707> how could I know that?
[15:13:53 CEST] <garyserj> What I am trying to do is flip the video, that's all
[15:14:07 CEST] <durandal_1707> horizontal or vertical flip?
[15:14:12 CEST] <garyserj> vertical
[15:14:19 CEST] <durandal_1707> -vf vflip
[15:14:27 CEST] <garyserj> ok thanks
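Folding durandal_1707's fix back into garyserj's original command gives a hedged full sketch (filenames and codecs taken from the chat; `transpose` was the wrong filter since it rotates rather than flips):

```shell
# Vertical flip with re-encode; vflip is its own filter, not a
# transpose argument.
ffmpeg -i blah.mp4 -vf vflip -c:v libx264 -c:a libmp3lame aaa.mp4
```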
[15:40:28 CEST] <saml> what is bitrate ladder? is that good?
[15:51:16 CEST] <garyserj> can AAC audio codec go with libx264 in an mp4?
[15:51:30 CEST] <Mavrik> yp
[15:51:33 CEST] <garyserj> ta
[15:52:22 CEST] <BtbN> That's pretty much the accepted default of the internet by now
[17:29:39 CEST] <bartzy> Hey. I've been using the LosslessCut app on Mac to cut a long video into small parts (using the normal cut mode instead of near-keyframe, to get accurately timed cuts).
[17:30:01 CEST] <bartzy> I now have a bunch of short mp4 videos. I'm uploading them to YouTube (well, to Google Drive, but it seems like the player and the whole environment is the same as YouTube).
[17:30:22 CEST] <bartzy> Should I re-encode them? I know I should set faststart on them as per YouTube's guidelines, but is it smart to re-encode them as well?
[17:30:46 CEST] <bartzy> And if so, is this post (just found it with a Google search) still valid? https://www.virag.si/2015/06/encoding-videos-for-youtube-with-ffmpeg/
[17:37:37 CEST] <saml> why re-encode? let youtube do re-encoding
[17:37:49 CEST] <saml> youtube is good at re-encoding
[17:39:24 CEST] <saml> while ffmpeg is running,  I cannot play the output video. is it possible to play output video as ffmpeg is encoding?
[17:40:57 CEST] <BtbN> saml, mp4?
[17:46:24 CEST] <bartzy> saml: So why do they have the guidelines? And do you mean I shouldn't even set faststart without re-encoding?
[17:46:47 CEST] <bartzy> saml: And also, do you think this is true for Google Drive videos as well?
[17:46:56 CEST] <furq> bartzy: you don't even need to set faststart
[17:47:00 CEST] <saml> BtbN, yup mp4
[17:47:10 CEST] <BtbN> saml, that's a no then.
[17:47:12 CEST] <furq> the guidelines are broadly recommendations for stuff they'll have an easy time processing
[17:47:20 CEST] <BtbN> mp4 is useless without its index, which gets written at the end.
[17:47:21 CEST] <saml> BtbN, thanks
[17:47:28 CEST] <saml> ah i see
[17:47:31 CEST] <furq> and/or stuff they won't butcher like interlaced content which they'll just encode as progressive
[17:47:48 CEST] <saml> bartzy, I don't know. I just upload shitty videos to youtube so that their engineers can do some work
[17:47:52 CEST] <furq> faststart just means it'll start processing faster but it'll work fine either way
[17:48:11 CEST] <furq> and if you're uploading a ton of things it probably makes no difference
[17:48:41 CEST] <saml> i think you may get bounty if you come up with a crazy video that can shutdown their system or expose security hole
[17:48:52 CEST] <furq> their backend is just using ffmpeg afaik
[17:49:13 CEST] <furq> certainly for the decoding part
[18:06:27 CEST] <bartzy> furs: So no need to even do the faststart thing? And do you know if all that is probably true for videos uploaded to Google Drive?
[18:06:32 CEST] <bartzy> furq* ^
[18:07:30 CEST] <furq> if it's drive and not photos/youtube then faststart is probably a good idea
[18:07:45 CEST] <furq> assuming you want to watch it in a browser
[18:07:53 CEST] <furq> it's not necessary but i'd still do it
[18:08:08 CEST] <furq> also use a dedicated mp4 muxer like l-smash or mp4box because it'll be faster
[18:08:17 CEST] <bartzy> whats that?
[18:08:32 CEST] <bartzy> ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4
[18:08:35 CEST] <bartzy> is that not enough?
[18:08:41 CEST] <furq> that'll work but it'll write the entire file twice
[18:08:49 CEST] <DHE> the way it works is it basically writes the .mp4 without +faststart, then rebuilds it
[18:08:52 CEST] <furq> yeah
[18:09:14 CEST] <bartzy> I have like 80 files, each 20MB, I think it will be OK
[18:09:21 CEST] <furq> yeah that'll probably be fine
[18:11:21 CEST] <bartzy> ok yeah that was super fast
[18:12:15 CEST] <bartzy> thanks everyone
[18:52:00 CEST] <ossifrage> I'm playing around with streaming live h.265 generated with a hardware encoder that supports long term reference images
[18:53:43 CEST] <ossifrage> It seems like it would be useful to cache the long term reference images so they can be played back when a new user starts playing the stream
[20:19:39 CEST] <MatthewAllan93> Hey, if you don't specify a framerate for the output video, does it just use the same framerate as the input video?
[20:19:59 CEST] <JEEB> not frame rate, timestamps
[20:20:10 CEST] <JEEB> since many formats don't have a concept of frame rate to begin with :)
[20:20:29 CEST] <MatthewAllan93> Ah ok thanks JEEB :)
[20:41:07 CEST] <kepstin> note that there's a few cases where ffmpeg might detect the input file as having constant framerate, and it'll "fix up" the timestamps when outputting
[20:41:34 CEST] <kepstin> this can sometimes cause issues if the detection was wrong (this is one cause of the "past duration too large" messages)
[20:42:34 CEST] <TheAMM> Speaking of timestamps
[20:43:01 CEST] <TheAMM> Two files (say, mkvs) can have the same stream (ie. same data in packets) with different timestamps?
[20:43:42 CEST] <TheAMM> (same data in this case being -f data -c copy)
[20:44:09 CEST] <kepstin> i don't see why not. By default ffmpeg will do some timestamp cleanups, like making it start at 0, which apply even when stream copying.
[20:45:26 CEST] <TheAMM> hmmh
[20:45:45 CEST] <TheAMM> Guess I'll have to play it safe and process everything
[20:46:55 CEST] <TheAMM> Wish ffprobe had a flag to disable decoding the actual frame data
[20:47:53 CEST] <TheAMM> Maybe I'll add that myself, or just go without pict_type
[20:48:36 CEST] <kepstin> yeah, would be nice - that should be determinable from the container sometimes, and just require a parser at worst rather than a decoder.
[20:49:50 CEST] <DHE> just something to get the resolution, profile, etc?
[20:51:11 CEST] <TheAMM> I want to dump stream timestamps somewhat raw
[20:51:49 CEST] <TheAMM> ffmsindex can grab pict_type etc without decoding the frame data, patching ffprobe to skip that should be easy as well
[20:52:17 CEST] <TheAMM> although I don't know/haven't looked which part deduces the pict_type etc, I don't need resolution
[20:53:03 CEST] <kepstin> -show_packets should give you all the timestamps without decoding the video
[20:53:05 CEST] <TheAMM> Just want to future-proof a bit - -show_packets can display the K_ flag, but it'd be neat to know if the frame is I, P, B, etc
[20:53:37 CEST] <DHE> there's a function called avformat_find_stream_info (?) which does the bulk of the work. you just give it an open AVFormatContext and it fills it in with the stream types, resolutions, etc. but it's a heavy-handed process that does decode at least a little bit
[20:53:38 CEST] <TheAMM> -show_frames works, of course, but it's slooooow
[20:53:57 CEST] <TheAMM> kepstin: it does
[20:54:03 CEST] <TheAMM> but not pict_type
[20:54:10 CEST] <TheAMM> Just keyframe flag
[20:54:42 CEST] <TheAMM> I'm not too eager to start writing my own tool in C, just because
[20:54:50 CEST] <DHE> usually the packets have a keyframe flag, used mainly for seeking purposes
[20:55:05 CEST] <TheAMM> Yes, that they o
[20:55:07 CEST] <TheAMM> do, even
[20:55:34 CEST] <TheAMM> plus another bother is that h264 can apparently have just one field in a packet
[20:55:44 CEST] <TheAMM> courtesy of ffms source
[20:56:14 CEST] <kepstin> huh. x264 shouldn't ever be generating that, but I guess people do sometimes use other encoders :/
[20:56:33 CEST] <DHE> h264 can really screw you up though. it has I-frames that are not keyframes, and it has an alternative keyframe system called intra refresh. :)
[20:57:08 CEST] <TheAMM> https://github.com/FFMS/ffms2/blob/master/src/core/indexing.cpp#L336
[20:57:58 CEST] <TheAMM> I only really need the true duration of the stream, since the duration in -show_streams can be whatever
[20:58:25 CEST] <TheAMM> the pict_type and rest is "hmm, could use that in the future", since I'm not going to keep the data I'm processing, just metadata
[20:58:50 CEST] <TheAMM> Like looking at the index and going "Hey, this file uses B-frames, fancy"
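A hedged sketch of what kepstin's `-show_packets` suggestion looks like in practice: per-packet timestamps and the keyframe flag straight from the container, no decoding (the `input.mkv` filename is a placeholder; `pict_type` indeed only appears in the slower `-show_frames` output):

```shell
# Dump pts/dts/duration and flags ("K_" marks keyframes) for the first
# video stream, packet by packet, without touching a decoder.
ffprobe -v error -select_streams v:0 \
        -show_entries packet=pts_time,dts_time,duration_time,flags \
        -of csv input.mkv
```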
[21:04:22 CEST] <kepstin> yeah, we tried to use intra-refresh on a screen sharing stream, but it turned out that the flash decoder couldn't handle the resulting stream because it didn't have any frames marked as keyframes in the rtmp stream - that was fun
[21:04:43 CEST] <kepstin> it worked if it decoded from the start, but it couldn't join a stream in the middle because it hung waiting for a keyframe
[21:05:30 CEST] <kepstin> we considered hacking it to just mark every frame as a keyframe, but chose to do normal keyframes instead.
[21:13:58 CEST] <TheSashmo> does anyone know if there is a way to force FFPLAY to open on a specific screen on mac??
[21:14:34 CEST] <ekiro> what does [Parsed_subtitles_0 @ 0x558583dc9a80] Error opening memory font 'subtitles.ass' mean ?
[21:15:02 CEST] <ekiro> https://pastebin.com/46aGYcbj
[21:19:29 CEST] <kepstin> ekiro: you can ignore that for this use case. The 'subtitles' filter lets you use a full mkv file (with embedded subtitle track and fonts) as a source for the sub rendering, so that message is from it trying to load fonts from the input file.
[21:19:59 CEST] <kepstin> oh, hmm,
[21:20:10 CEST] <kepstin> i might need more context there - are you using an mkv source?
[21:21:10 CEST] <kepstin> i think it tries to load every attachment as a possible font, but for some reason the subtitles are present as an attachment? and the font loader can't load that, so you get a message.
[21:21:20 CEST] <kepstin> either way, not a big deal.
[21:21:38 CEST] <ekiro> kepstin, i dump the attachments into a dir, then tell ffmpeg to look into that dir for the fonts. usually it works, but in this case i see this error. instead of loading the font from the dir they were dumped in, it loads the font from the OS
[21:22:30 CEST] <kepstin> huh, weird, it gives up and doesn't use the other fonts?
[21:22:52 CEST] <ekiro> there is only one font in this video.
[21:23:05 CEST] <kepstin> does it work if you remove the subtitles.ass file from the attachments directory?
[21:26:17 CEST] <ekiro> in that case it wont encode any subs
[21:26:19 CEST] <ekiro> this is my cmd:
[21:26:20 CEST] <ekiro> ffmpeg -i file.mkv -c:v libx264 -tune animation -preset fast -profile high -crf 24 -movflags faststart -c:a copy -vf "subtitles=attachments/subtitles.ass:fontsdir='attachments/'" video.mp4
[21:27:00 CEST] <kepstin> alright, then move the subtitles.ass file out of the attachments directory to somewhere else, make sure the fonts directory has *only* fonts in it
[21:27:21 CEST] <ekiro> ill give that a try thx
[21:31:34 CEST] <ekiro> that worked, dont see the error.
[21:31:54 CEST] <ekiro> hmm
[21:32:47 CEST] <ekiro> yep looks good
[21:33:19 CEST] <ekiro> there was another message i was curious about, this: https://pastebin.com/xUR7uTGL
[21:33:48 CEST] <ekiro> [h264 @ 0x55ff251c0540] Invalid NAL unit 8, skipping.:19.55 bitrate=1099.7kbits/s dup=2 drop=0 speed=4.74x
[21:33:48 CEST] <ekiro> [h264 @ 0x55ff251c0540] error while decoding MB 10 10, bytestream -36
[21:33:54 CEST] <kepstin> that just looks like a corrupted input file :/
[21:34:47 CEST] <ekiro> the video does finish encoding and the output looks fine, i dont notice anything wrong with it
[21:35:18 CEST] <kepstin> yeah, ffmpeg's decoder conceals errors in h264 reasonably well, and it appears to have affected just one frame.
[21:37:13 CEST] <ekiro> i see. cool.
[21:37:23 CEST] <cehoyos> TheSashmo: This is an sdl-related question, there may be an environment variable iirc
[21:38:47 CEST] <ekiro> and one last question. i will be encoding a batch of files for web streaming that will take a few weeks to complete using the command i posted above. would you say it's optimal for web streaming? i would hate to have to re-encode the batch of files again. what are your thoughts?
[21:44:10 CEST] <kepstin> ekiro: it's ... fine? seeking won't be great since you're using the default gop size (which is quite large)
[21:49:42 CEST] <ekiro> kepstin, what would be a good gop size?
[21:52:01 CEST] <DHE> the general suggestion is "as high as possible", but any seek operation seeks to a keyframe. so you have to decide how fine you want seeking to be
[21:52:05 CEST] <DHE> 3 seconds? 5 seconds?
[21:53:28 CEST] <ekiro> 5s would be fine. what is the default set to?
[21:54:52 CEST] <DHE> if I'm reading this right, x264 specifically defaults to 250 frames
[21:58:09 CEST] <kepstin> hmm, that's actually not too bad, with a typical 30fps-ish stream that'll be <10s
[21:59:04 CEST] <DHE> yeah it looks like x264 uses the codec-interal default gop size rather than ffmpeg's default. so I had to go into x264 itself to find the default and I see 250.
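DHE's advice reduces to simple arithmetic: the x264 keyframe interval (`-g` in ffmpeg, `--keyint` in x264) is fps times the worst-case seek granularity you want. A small sketch (the helper name is mine):

```python
def gop_size(fps: float, seek_seconds: float) -> int:
    """Keyframe interval for a target worst-case seek granularity."""
    return max(1, round(fps * seek_seconds))

print(gop_size(25, 5))           # 125
print(gop_size(30000/1001, 5))   # 150  (29.97 fps NTSC)
print(250 / 25)                  # 10.0 -- x264's 250-frame default at 25 fps
```

So for ekiro's 5-second target, something like `-g 150` on ~30 fps material.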
[22:40:58 CEST] <SixEcho> starting a project here… would like to overlay some data-driven gauges via ffmpeg… curious if there are any recommendations where to start, e.g. for tech/formats to look at and how to get that into ffmpeg? svg? +html/css? other? would prefer something completely scriptable (not some other heavy gui tool exporting masked video; although if it pairs well for initial prototyping or gauge/layout design i would be interested)
[23:13:27 CEST] <klaxa> SixEcho: one way of going about it would be to render the gauges with whatever software you like and then pipe them as images into ffmpeg
[23:13:42 CEST] <klaxa> or create a number of single images and let ffmpeg assemble them
[23:14:41 CEST] <shadoxx> question for the channel. trying to compile an older version of ffmpeg. 3.4.2, on ubuntu 18.04
[23:15:00 CEST] <shadoxx> it's complaining about libfreetype2 not being found by pkg-config, but it 100% is there
[23:15:12 CEST] <shadoxx> all the prereqs are installed. hitting my head against a wall
[23:15:23 CEST] <shadoxx> I'm attempting to statically compile as well, if that changes things
[23:33:21 CEST] <JEEB> config.log under your build dir (ffbuild/config.log for a while now in newer versions)
[23:33:29 CEST] <JEEB> should contain all the full output of those checks
[23:33:39 CEST] <JEEB> also make sure you have the dev package for freetype2
[23:33:50 CEST] <JEEB> heck, 3.4.x is what ubuntu 18.04 packages, no?
[23:38:16 CEST] <SixEcho> klaxa: yeah i've seen people do that… however i'm going to be doing this with videos that are ~2hrs long @60fps, so making intermediate images on disk will not be an option… i would rather directly generate an alpha video at that point and merge with ffmpeg.
[23:39:41 CEST] <klaxa> you can do that as well, you will then want to generate the graphs and feed it as a video directly to ffmpeg, as well as the video you want to overlay over, then use the overlay filter
[23:39:57 CEST] <klaxa> https://ffmpeg.org/ffmpeg-filters.html#overlay-1
[23:40:33 CEST] <klaxa> you can also feed live "images"
[23:40:38 CEST] <klaxa> since that might be easier to create
[23:40:57 CEST] <klaxa> err "live" images
[23:42:36 CEST] <klaxa> i.e.: <some command that produces pngs to stdout> | ffmpeg -f image2pipe -i pipe:0 -i base_video.mp4 -vf "overlay=<however you want this>" -c:a copy [optional encoding settings] output.mkv
[23:43:54 CEST] <klaxa> ah, i think i have the inputs the wrong way, so it should be ffmpeg -i base_video.mp4 -f image2pipe -i pipe:0 ...
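With klaxa's input-order fix applied, a hedged sketch of the full pipeline (`render_gauges` stands in for whatever program emits PNGs to stdout; note that with two inputs the overlay belongs in `-filter_complex` rather than `-vf`):

```shell
# Base video first, piped PNG frames second; composite the gauges on top.
render_gauges | ffmpeg -i base_video.mp4 \
    -f image2pipe -framerate 60 -i pipe:0 \
    -filter_complex "[0:v][1:v]overlay=x=0:y=0" \
    -c:a copy output.mkv
```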
[23:45:08 CEST] <cehoyos> shadoxx: Remove --enable-libfreetype from your configure line
[00:00:00 CEST] --- Sat May 11 2019

