[Ffmpeg-devel-irc] ffmpeg.log.20190607
burek
burek021 at gmail.com
Sat Jun 8 03:05:04 EEST 2019
[01:06:38 CEST] <MoziM> is there a way to concatenate the images in 2 different directories together like "ffmpeg -f image2 -framerate 24 -i new_frames/%d.png new_frames2/%d.png -i audio.mp3 -r 24 -vcodec libx264 -crf 16 video.mp4" ?
[01:45:16 CEST] <steve___> MoziM: try adding a '-i' for the second dir and the option after the inputs '-filter_complex "concat=n=2"'
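(A rough sketch of steve___'s suggestion, reusing the directory names and numbering from MoziM's command; the crf value and -shortest are illustrative, and both sequences are assumed to have the same resolution and pixel format:)

    ffmpeg -framerate 24 -i new_frames/%d.png \
           -framerate 24 -i new_frames2/%d.png \
           -i audio.mp3 \
           -filter_complex "[0:v][1:v]concat=n=2:v=1:a=0[v]" \
           -map "[v]" -map 2:a -c:v libx264 -crf 16 -shortest video.mp4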
[02:09:01 CEST] <MoziM> steve___: i renamed the images instead and tried ffmpeg -f concat -safe 0 -framerate 24 -i output/* -i ~/audio.mp3 -r 24 -vcodec libx264 -crf 16 output.mp4
[02:09:16 CEST] <MoziM> does ffmpeg need a list of images from a text file instead?
[02:10:04 CEST] <MoziM> i also tried ffmpeg -f image2 -framerate 24 -i output/* -i ../audio.mp3 -r 24 -vcodec libx264 -crf 16 ~/inque/video.mp4
[02:10:27 CEST] <steve___> MoziM: if you want to use '*' then you'll need '-pattern_type glob'
[02:10:28 CEST] <MoziM> but it just gives me the prompt "File 'output/000002.png' already exists. Overwrite ? [y/N] ^C"
[02:10:41 CEST] <MoziM> steve___: where would i need to put that?
[02:10:56 CEST] <steve___> before -i output/*
[02:11:30 CEST] <another> also: put 'output/*' in single quotes
[02:11:49 CEST] <MoziM> may i ask why single quotes?
[02:11:50 CEST] <steve___> or else you need to use something like FILENAME%02d.png
[02:12:17 CEST] <another> because otherwise your shell might expand it
[02:12:41 CEST] <MoziM> there's no prefix name and each image name is 6 digits long
[02:12:46 CEST] <MoziM> so %06d?
[02:12:52 CEST] <steve___> yep
[02:12:58 CEST] <another> %06d.png
[02:13:08 CEST] <MoziM> thanks
[02:13:17 CEST] <MoziM> and use the glob option?
[02:13:50 CEST] <MoziM> thanks it's working
[02:13:56 CEST] <another> :)
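(For reference, the working command is roughly one of these two, assuming six-digit file names under output/ and, for the second form, an ffmpeg build with glob support:)

    ffmpeg -framerate 24 -i output/%06d.png -i ~/audio.mp3 -c:v libx264 -crf 16 -shortest video.mp4
    ffmpeg -framerate 24 -pattern_type glob -i 'output/*.png' -i ~/audio.mp3 -c:v libx264 -crf 16 -shortest video.mp4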
[09:10:16 CEST] <MoziM> how do i encode a video at the native fps of a video with ffmpeg?
[13:10:14 CEST] <kepstin> MoziM: if you use no special options, ffmpeg will preserve frame timing from the source when available
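(In other words, simply omitting -r and any fps filter keeps the source timing; a minimal sketch with assumed file names and an illustrative crf:)

    ffmpeg -i input.mp4 -c:v libx264 -crf 18 output.mp4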
[14:09:50 CEST] <dongs> hmm
[14:09:58 CEST] <dongs> should mpv.exe be using nvdec for 4k/8k decoding?
[14:11:03 CEST] <dongs> ah ok. i had to force -hwdec nvdec
[14:11:48 CEST] <dongs> damn, its just a little too slow for 8k decode
[14:11:58 CEST] <dongs> goes to 95-100% nvdec usage
[14:12:04 CEST] <dongs> while mpc-hc is at around 70%
[14:44:00 CEST] <kepstin> they don't even list 8k support in nvdec except for some pascal and newer cards, and then only with hevc.
[14:44:13 CEST] <kepstin> er, and vp9 apparently
[14:45:44 CEST] <kepstin> i would expect the cards to manage realtime decoding of 1 8k stream if they support it at all tho
[14:47:17 CEST] <dongs> kepstin: correct.
[14:47:20 CEST] <dongs> i have hevc 8k @ 60p
[14:47:25 CEST] <dongs> it plays fine on Turing GTX 1660
[14:47:34 CEST] <dongs> with mpc-hc, using nvdec
[14:47:58 CEST] <BtbN> It makes no sense that Decode-Unit-Load would be different between players, they are sending the exact same stuff to the card.
[14:48:10 CEST] <dongs> like i said, ~70% nvdec usage. but for some reason mpv (based on ffmpeg?) uses too much
[14:48:17 CEST] <dongs> i suspect its maybe due to HDR color scaling stuff
[14:48:26 CEST] <BtbN> That's not decode unit load.
[14:48:33 CEST] <kepstin> mpv runs that on the shader cores, completely independent
[14:48:37 CEST] <dongs> hmmm
[14:48:41 CEST] <dongs> so why its failing then
[14:49:01 CEST] <dongs> mpv.exe --hwdec nvdec lol.mkv < stuttery, but looks good
[14:49:11 CEST] <dongs> mpc-hc lol.mkv = looks not HDR, but smooth
[14:51:49 CEST] <kepstin> (assuming the scaling is linear with number of macroblocks or whatever the hevc equivalent is decoded, i'd expect a turing nvenc instance to be able to manage ~2 streams of hevc 8kp60)
[14:52:09 CEST] <dongs> no, even 1 stream takes up 70% decode
[14:52:20 CEST] <kepstin> s/nvenc/nvdec/
[14:52:25 CEST] <dongs> its unplayable on pascal
[14:52:31 CEST] <dongs> 100%, and around 30-40fps
[14:52:39 CEST] <kepstin> i guess it's not linear then
[14:52:58 CEST] <dongs> Video: HVC1 7680x4320 59.94fps [V: hevc main 10, yuv420p10le, 7680x4320 [default]]
[14:53:27 CEST] <dongs> actually 60% nvdec, not 70
[14:53:33 CEST] <kepstin> pascal can't manage very many streams of 4k even, so that's totally expected :)
[14:54:13 CEST] <dongs> hmm
[14:54:17 CEST] <kepstin> but yeah, i'd really expect that to work. no idea why you're seeing issues with mpv.
[14:54:17 CEST] <dongs> i'd like to know why mpv fails this tho.
[14:57:44 CEST] <kepstin> you've checked the other gpu load indicators right? it's not getting maxed out somewhere /other/ than the video decode unit?
[14:57:49 CEST] <dongs> correct
[14:58:34 CEST] Action: kepstin had that issue recently when he found out that his ryzen 2200g could decode 4k fine, but couldn't do the post-processing in hq preset at that res
[14:59:17 CEST] <kepstin> yeah, a 1660 has enough shader power that i wouldn't expect that to be an issue.
[14:59:36 CEST] <BtbN> sounds to me like you're decoding stuff twice
[14:59:43 CEST] <BtbN> Cause nothing else but decoding puts load on the decoder.
[15:00:38 CEST] <dongs> https://i.imgur.com/4VEGYtf.png
[15:00:48 CEST] <dongs> mpv, exit it, re-run same content in mpc-hc
[15:02:10 CEST] <kepstin> can you replicate this issue with ffmpeg or ffplay (with ffmpeg, try with -re, or just see what framerate you get at max)?
[15:02:51 CEST] <kepstin> that'll narrow down if it's an issue with ffmpeg or mpv
[15:02:58 CEST] <dongs> uh how do i make ffmpeg decode to screen or whatever
[15:03:23 CEST] <BtbN> No need to show it
[15:03:28 CEST] <BtbN> the decode load is there no matter what
[15:03:34 CEST] <kepstin> don't need to, just discard the decoded frames with e.g. '-f null -'
[15:03:35 CEST] <dongs> sure ok so how do i make it just decode
[15:03:37 CEST] <dongs> ok
[15:03:58 CEST] <dongs> its not using nvdec by default
[15:04:13 CEST] <angular_mike> i'm trying to use ffplay to play a file encoded with xvid
[15:04:14 CEST] <BtbN> -hwaccel cuda
[15:04:39 CEST] <angular_mike> `-vcodec libxvid` flag seems to fail tho with No codec could be found with name 'libxvid'
[15:04:45 CEST] <angular_mike> am I specifying it wrong?
[15:04:54 CEST] <dongs> BtbN: no
[15:05:14 CEST] <kepstin> dongs: https://trac.ffmpeg.org/wiki/HWAccelIntro might be a useful reference
[15:05:29 CEST] <dongs> [hevc @ 000001b90558e600] decoder->cvdl->cuvidCreateDecoder(&decoder->decoder, params) failed -> CUDA_ERROR_OUT_OF_MEMORY: out of memory
[15:05:33 CEST] <dongs> lol wut
[15:05:39 CEST] <kepstin> angular_mike: you shouldn't need to specify a codec for decoding - ffmpeg autodetects it, normally.
[15:06:53 CEST] <dongs> frame= 290 fps= 23 q=-0.0 size=N/A time=00:00:05.75 bitrate=N/A speed=0.448x
[15:07:01 CEST] <angular_mike> hm, well it's an undocumented binary data file that just happens to have some video streams embedded in it
[15:07:01 CEST] <dongs> well, it uses around 42% of nvdec
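(Putting the suggestions above together, a decode-only benchmark looks roughly like this; the file name is the one from earlier, and the second form forces the cuvid decoder instead of the hwaccel path:)

    ffmpeg -hwaccel cuda -i lol.mkv -f null -
    ffmpeg -c:v hevc_cuvid -i lol.mkv -f null -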
[15:07:15 CEST] <angular_mike> i am feeding it to ffplay and it seems to be able to play some of the videos but with artifacts and stuff
[15:08:39 CEST] <kepstin> angular_mike: that said, the decoders in ffmpeg are all named after the codec they decode - libxvid isn't the name of a codec but rather is the name of a specific encoder
[15:09:04 CEST] <kepstin> angular_mike: libxvid produces videos that can be decoded by ffmpeg's "msmpeg4" decoder, iirc.
[15:09:43 CEST] <angular_mike> kepstin: I'm just trying to find a way to specify otions that are optimal for DivX chunk playback
[15:12:15 CEST] <dongs> so eh?
[15:15:37 CEST] <dongs> why does ffmpeg nvdec not have enough memory to decode heh
[16:47:40 CEST] <short-bike> I'm looking for a way to tag / search mp4 and avi files from the command line. Would ffmpeg allow me to do this ?
[17:00:29 CEST] <DHE> not without rebuilding the files from nigh-scratch. ffmpeg isn't a metadata tweaking and browsing sort of tool
[17:04:11 CEST] <short-bike> Yeah - there's not a lot to choose from out there for what I'm looking for. But thanks anyway DHE.
[17:42:54 CEST] <brimestone> Hey guys, what version of ffmpeg and ffprobe can I use for a macOS 10.8.5 ?
[17:43:52 CEST] <Mavrik> What do you mean?
[17:45:30 CEST] <brimestone> The latest static build only works for macOS 10.9 and above, so I need an older version.
[17:46:06 CEST] <BtbN> That's up to whoever made that build.
[17:46:56 CEST] <brimestone> Im talking about the static builds from http://ffmpeg.org/download.html#build-mac
[17:47:13 CEST] <BtbN> That only links to external sites. There are no ffmpeg provided builds.
[17:48:00 CEST] <Mavrik> Yeah, just use homebrew
[17:48:07 CEST] <Mavrik> And grab it there
[17:48:25 CEST] <BtbN> Upgrading that ancient OS would be a very good idea though. It's horribly insecure and should not be online.
[17:48:54 CEST] <brimestone> The homebrew install fails to fetch from the repo due to SSL/TLS. GitHub doesn't support anything older than 1.1
[17:49:13 CEST] <BtbN> Yeah, your OS is just too old to still be supported by anything.
[17:49:22 CEST] <brimestone> Tell me about it! :)
[17:49:29 CEST] <BtbN> It's a 7 year out of date system
[18:53:06 CEST] <kepstin> looks like most 2012 or later macs are still supported by macos mojave (10.14)...
[18:53:39 CEST] <kepstin> and even if not you should be able to get at least 10.13 on there.
[19:38:56 CEST] <electrotoscope> what's the standard way to make ffmpeg feature requests?
[19:44:54 CEST] <kepstin> ... post a patch that implements your desired feature on the mailing list?
[19:45:15 CEST] <kepstin> other than that, i guess you can open a trac ticket, but absolutely no guarantee anyone will work on it :/
[19:45:58 CEST] <kepstin> what kind of feature?
[19:47:33 CEST] <kepstin> it never hurts to ask "how do I do X?" here, it might be that what you wanted is actually already there and you didn't know it :)
[19:51:13 CEST] <electrotoscope> I tried to ask on trac at https://trac.ffmpeg.org/ticket/7946 but it was closed as invalid
[19:59:21 CEST] <kepstin> trac is so slow that it hasn't loaded for me yet. mind restating it here?
[20:00:55 CEST] <kepstin> ah, nvm, there it goes
[20:03:23 CEST] <kepstin> yeah, that's probably not gonna get any further unless you either write a patch for it or find someone else to do so. your ticket was also a bit misleading because you were talking about metadata, but the things you wanted to use in drawtext were not metadata.
[20:04:11 CEST] <kepstin> "metadata" has a specific meaning in the ffmpeg code, referring to user-provided data attached to the file (sometimes called 'tags' e.g. in music files)
[20:04:31 CEST] <electrotoscope> also called TAG: in ffprobe apparently
[20:05:00 CEST] <electrotoscope> what would other points of information about the data be called?
[20:05:20 CEST] <kepstin> hmm. there's actually another challenge with your request
[20:05:26 CEST] <electrotoscope> like the timestamp or frame number? not packet metadata but packet info?
[20:05:34 CEST] <kepstin> the data you're asking about is packet properties, but filters don't see packets
[20:05:45 CEST] <kepstin> filters run on decoded frames
[20:06:06 CEST] <kepstin> some of the packet information is passed through, but not all
[20:06:51 CEST] <kepstin> anyways, your feature request should basically be "please expose packet/frame properties X, Y, Z as variables in the drawtext filter"
[20:07:20 CEST] <kepstin> there's no generic way to do it for all properties, they'd have to be individually coded, so you need to request the specific ones you want.
[20:08:40 CEST] <kepstin> and keep in mind that the answer for some might be "that property isn't available on decoded frames, so it's not possible". (I suspect packet size and position might be in this category)
[20:12:36 CEST] <kepstin> "ffprobe -show_frames" will give you a better idea of what filters see.
[20:20:04 CEST] <electrotoscope> how can I tell what a filter can see and can't see from the list?
[20:21:07 CEST] <electrotoscope> like it can obviously see pkt_pts_time
[20:21:18 CEST] <electrotoscope> but I guess not pkt_duration_time
[20:30:03 CEST] <kepstin> a filter should be able to see everything you see in the [FRAME] block with -show_frames
[20:30:31 CEST] <kepstin> not everything is necessarily exposed in the drawtext filter, but it could be added.
[20:38:35 CEST] <electrotoscope> ah okay! well yeah that's all I was looking for
[20:39:03 CEST] <electrotoscope> I guess my feature request would be if there was some way to go (for instance) %{packetinfo:pkt_pos}
[20:39:19 CEST] <electrotoscope> and then it would get the info called pkt_pos in ffprobe
[20:55:57 CEST] <kepstin> looks like pkt_pos does get passed from the decoder through to the avframe, so you might be lucky on that.
[20:59:32 CEST] <feliwir> Hey, where can i download a stable source tarball of libaom?
[20:59:41 CEST] <feliwir> The googlesource page of libaom is a bit weird...
[21:00:03 CEST] <electrotoscope> oh nice! Okay so can I rename the ticket on trac? Or ask someone to do that? Or should I make a new ticket?
[21:00:09 CEST] <kepstin> electrotoscope: adding some new variables that could be accessed as %{pkt_pos} and %{pkt_duration} would actually be pretty trivial, fwiw
[21:00:52 CEST] <kepstin> looks like pkt_size is available too actually
[21:01:04 CEST] <kepstin> (note that whether or not they have useful values depends on the decoder)
[21:02:14 CEST] <electrotoscope> fantastic!!
[21:02:56 CEST] <electrotoscope> would it be possible to have something that's universal? like %{metadata} allows you to define your own key
[21:03:19 CEST] <kepstin> no, because these have to be manually coded to access particular fields in the frame structure
[21:06:38 CEST] <electrotoscope> ah okay darn
[21:06:52 CEST] <electrotoscope> okay so should I make a new trac ticket, or should I put comments in requesting that it be changed?
[21:07:58 CEST] <kepstin> you know what, have a patch. https://www.kepstin.ca/dump/0001-vf_drawtext-Add-pkt_pos-pkt_duration-pkt_size-as-var.patch
[21:08:08 CEST] <kepstin> (i haven't even compile-tested that, but it's pretty trivial)
[21:08:21 CEST] <kepstin> to get that merged it would also need documentation updates
[21:08:47 CEST] <electrotoscope> oh man thanks! I don't have a build chain thing up right now
[21:09:25 CEST] <electrotoscope> would you be able to write one for pkt_duration_time as well?
[21:09:30 CEST] <electrotoscope> or wait is that included
[21:09:43 CEST] <electrotoscope> this is amazing!!
[21:11:18 CEST] <kepstin> ah, I should probably just make pkt_duration return time instead of pts values to match the 't' variable
[21:11:39 CEST] <kepstin> i'm kind of surprised there's not variables that return the frame pts, but whatever
[21:12:04 CEST] <electrotoscope> there's one:
[21:12:08 CEST] <electrotoscope> pts The timestamp of the current frame. It can take up to three arguments. The first argument is the format of the timestamp; it defaults to flt for seconds as a decimal number with microsecond accuracy; hms stands for a formatted [-]HH:MM:SS.mmm timestamp with millisecond accuracy. gmtime stands for the timestamp of the frame formatted as UTC time; localtime stands for the timestamp of the frame formatted as local time zone
[21:12:27 CEST] <electrotoscope> it seems to match what ffprobe [FRAME] calls "pkt_pts_time"
[21:13:20 CEST] <kepstin> oh, i see, pts is a special one, it has a formatting function instead of just being a variable.
[21:13:40 CEST] <electrotoscope> I thought it might be a special case
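(A small example of that documented pts function, with assumed file names and illustrative styling; the colon inside %{pts:hms} is escaped for the filter parser:)

    ffmpeg -i input.mp4 -vf "drawtext=text='%{pts\:hms}':x=10:y=10:fontsize=24:fontcolor=white" -c:a copy out.mp4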
[21:14:58 CEST] <electrotoscope> could i help by writing the documentation update?
[21:15:44 CEST] <kepstin> the way I did it, they're expression variables, so you'd have to use %{e:pkt_pos} to access them.
[21:16:18 CEST] <electrotoscope> oh okay cool! So would they also be accessible to the x and y function math then?
[21:16:25 CEST] <kepstin> yes
[21:28:43 CEST] <electrotoscope> I roughed it out http://www.gluce.ca/drawtext_documentation.txt but I'm happy to format appropriately, I don't know how
[21:33:37 CEST] <kepstin> huh, yeah, pict_type is available as a variable in expressions but it wasn't documented. That one's not very useful, since you get the numeric enumeration value; that's why there's a special function documented in the text expansion section that turns it into a character.
[21:35:48 CEST] <electrotoscope> oh right
[21:36:41 CEST] <electrotoscope> I guess someone might want to do positioning based on that... doesn't seem likely
[21:38:38 CEST] <electrotoscope> is the source that https://ffmpeg.org/ffmpeg-filters.html is generated from posted somewhere?
[21:38:47 CEST] <kepstin> it's generated from a file inside the ffmpeg code repository
[21:39:30 CEST] <kepstin> (so documentation updates are committed in the same patch as the code change is made, normally)
[21:48:42 CEST] <electrotoscope> Makes sense! Trying to update my format to match
[21:50:12 CEST] <kepstin> if you see any other properties in https://www.ffmpeg.org/doxygen/trunk/structAVFrame.html that you think would be useful, might as well add them at the same time.
[21:50:44 CEST] <kepstin> but stuff that's numbers is easier, anything that's an enum would need a helper function to render it as a nice string
[21:51:38 CEST] <kepstin> (it doesn't make sense to add audio stuff tho, because the filter only receives video frames)
[21:52:02 CEST] <electrotoscope> hmm looking
[21:53:22 CEST] <electrotoscope> Can't see anything! Maybe pix_fmt, for when things switch bit depth midway through a file, but that's pretty esoteric and only affects people using Avid MediaComposer basically
[21:53:55 CEST] <electrotoscope> as far as I've ever heard. And they just don't like it because it generates errors and messes with drawtext and then I have to complain to the editor to stop doing that, it's not something anyone's going to want to burn in
[21:54:28 CEST] <electrotoscope> I tried to format the documentation change like your patch was written but this is my first time writing something like that so I might have gotten things wrong
[21:54:28 CEST] <electrotoscope> http://www.gluce.ca/drawtext_documentation_2.txt
[21:55:11 CEST] <kepstin> you'd want to actually have a git checkout of the code, and use git's diff functionality to generate that.
[21:55:14 CEST] <electrotoscope> (also fixing a typo right before it, and adding clarification to the "@item metadata" section because from google searching I know I'm not the only one who got confused by that term)
[21:55:28 CEST] <kepstin> but having the desired text is a useful step anyways
[21:55:54 CEST] <electrotoscope> I figured that would be the 100% correct way to do it, but I don't have git set up
[21:56:53 CEST] <electrotoscope> Is there anything else I could do to help?
[21:59:15 CEST] <kepstin> hmm, at this point the last remaining step is getting everything into a properly formatted git diff and posting it to the mailing list. but collecting the proposed changes in a ticket might be useful so they don't get lost/forgotten.
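(A rough sketch of that last step, assuming the changes go into libavfilter/vf_drawtext.c and doc/filters.texi:)

    git clone https://git.ffmpeg.org/ffmpeg.git && cd ffmpeg
    # edit libavfilter/vf_drawtext.c and doc/filters.texi, then:
    git commit -a -m "avfilter/vf_drawtext: expose more frame properties as variables"
    git format-patch -1     # produces the patch file to send to the ffmpeg-devel mailing list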
[22:01:34 CEST] <electrotoscope> okay cool I'm happy to do that. Should I make a new ticket or add comments to the previous one do you think?
[22:02:41 CEST] <kepstin> I'd suggest reopening it and changing the title to "expose more frame properties as variables in drawtext filter" or something like that.
[22:03:03 CEST] <kepstin> if you're able to, i haven't used the ffmpeg trac much so i'm not sure what's permitted
[22:04:18 CEST] <electrotoscope> I can reopen but I can't change the description. Maybe I'll just make a new one
[22:12:38 CEST] <electrotoscope> I added it to trac at https://trac.ffmpeg.org/ticket/7947
[22:12:41 CEST] <electrotoscope> thank you so much!!!
[22:13:29 CEST] <electrotoscope> This will be a huge help to me, save me having to write everything out from ffprobe to an external text file with tons of line returns and then setting x=(n*[offset]) to scroll up every frame
[22:33:32 CEST] <void09> found this script that detects scene changes and makes screenshots at those points + text time log. I want something that also cuts the file at those points, with file names reflecting frame range. help ?
[22:33:34 CEST] <void09> ffmpeg -i inputvideo.mp4 -filter_complex "select='gt(scene,0.3)',metadata=print:file=time.txt" -vsync vfr img%03d.png
[22:46:08 CEST] <electrotoscope> @void09 maybe you could take those timecodes and create a concat list?
[22:48:06 CEST] <kepstin> void09: if you can get a list of timecodes or frame numbers, then you can pass them to the segment muxer to tell it when to generate breaks. There's not currently any way I know of to do it in a single pass.
[22:49:22 CEST] <kepstin> You'd probably want to make sure there are keyframes where you want breaks too if you do that, which I'm not sure how to do with ffmpeg cli. there might be encoder specific options?
[22:49:41 CEST] <kepstin> other than that, you could just script a loop of ffmpeg using -ss and -t to extract video sections.
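(Both approaches sketched with made-up timestamps and file names:)

    # segment muxer: stream-copy, splitting at keyframes at or after the listed times
    ffmpeg -i inputvideo.mp4 -map 0 -c copy -f segment -segment_times 38.789,75.2,120.5 -reset_timestamps 1 part%03d.mp4
    # per-section loop with -ss/-t, re-encoding each piece for frame-exact cuts
    ffmpeg -ss 38.789 -i inputvideo.mp4 -t 36.411 -c:v libx264 -crf 16 scene001.mp4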
[23:02:24 CEST] <void09> kepstin: thanks but that's what I wanted to find out : P
[23:02:31 CEST] <void09> I'm not good with scripting or ffmpeg
[23:04:11 CEST] <void09> already got the timecodes, but i'd rather have frame number
[23:04:30 CEST] <void09> frame:0 pts:38789 pts_time:38.789
[23:04:33 CEST] <void09> lavfi.scene_score=0.419210
[23:04:53 CEST] <kepstin> timecodes are what you want, you can seek to timecodes but you can't seek to frame number
[23:05:02 CEST] <void09> oh
[23:05:21 CEST] <void09> so I guess they are precise enough to not go to the wrong frame
[23:05:42 CEST] <void09> oh yeah, second with 3 decimals
[23:05:58 CEST] <void09> ok, got time codes
[23:06:10 CEST] <void09> I want to put together a distributed encoding thingie
[23:06:27 CEST] <void09> because av1 encoding is suuper slow
[23:06:53 CEST] <kepstin> hmm, that's tricky. to segment video at exact frame boundaries... you have to re-encode it
[23:07:18 CEST] <kepstin> but if you're doing a distributed encoding thing, you want to avoid multiple re-encoding steps
[23:07:19 CEST] <void09> re-encode it ?
[23:07:24 CEST] <void09> of course
[23:07:37 CEST] <void09> hm yes you are right, unless the cutting point is a keyframe, right?
[23:07:41 CEST] <void09> then no re-encoding is needed
[23:08:18 CEST] <kepstin> for this application, i'd recommend ignoring the whole scene detection thing, and just do segments of a fixed length (duration), which can be done easily enough with the segment muxer, cutting on keyframe boundaries without re-encoding
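(A minimal sketch of that, assuming 10-minute chunks and a stream copy:)

    ffmpeg -i bluray_source.mkv -map 0 -c copy -f segment -segment_time 600 -reset_timestamps 1 chunk%03d.mkv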
[23:08:20 CEST] <void09> and I am guessing the x264 encoder is sane enough to detect a radical scene change and start a new keyframe with it
[23:08:46 CEST] <void09> yes but this presents an additional problem..
[23:09:01 CEST] <void09> maybe the x264 keyframe will not be the av1 keyframe, if it were not cut
[23:09:18 CEST] <kepstin> void09: if your segments are long enough, it doesn't make a big difference
[23:09:38 CEST] <void09> ok so the best strategy is to do both then
[23:09:42 CEST] <kepstin> and really, there's not much point in transcoding h264 to av1 unless the original h264 was extremely high quality
[23:09:50 CEST] <void09> scene detection -> if keyframe then cut, else proceed
[23:09:52 CEST] <kepstin> you'll get lossy encoding losses :/
[23:10:06 CEST] <void09> it's for bluray sources only of course
[23:10:14 CEST] <void09> 25-35mbit
[23:10:24 CEST] <kepstin> bluray has a fixed keyframe interval that's fairly short
[23:10:30 CEST] <void09> oh it does?
[23:10:41 CEST] <void09> i had no idea about that
[23:10:51 CEST] <void09> ok then.. this is complicated :)
[23:11:00 CEST] <kepstin> to make seeking work reasonably on slow optical media, ... :/
[23:11:16 CEST] <void09> yes that makes sense, they also got lots of space to spare
[23:11:48 CEST] <void09> what is this keyframe interval ?depends on the bluray disc?
[23:11:57 CEST] <kepstin> anyways, just segment into chunks on keyframe boundaries, use fairly long segments (think 5-10 minutes maybe?), and the loss from not getting perfect keyframe placement will be small enough to not matter
[23:12:12 CEST] <void09> kepstin: I want "perfect" :/
[23:12:20 CEST] <void09> 10kb loss will upset me :P
[23:12:32 CEST] <kepstin> if you want perfect then don't segment the video before encoding.
[23:12:51 CEST] <void09> hmm
[23:12:57 CEST] <void09> yeah, or re-encode losslessly
[23:13:01 CEST] <void09> which would be a pretty big file
[23:13:03 CEST] <kepstin> to get "perfect" you have to run the encoder to find out where it wants to place keyframes in order to know where to segment
[23:13:19 CEST] <kepstin> and yeah, lossless transcode would work too
[23:13:51 CEST] <void09> or just try scene changes with low change value, and choose ones that intersect with the bluray encode keyframes
[23:13:53 CEST] <void09> if
[23:13:57 CEST] <kepstin> blu-ray is max of 2 second gop size (some video formats lower that to 1s)
[23:14:28 CEST] <kepstin> ffmpeg's scene change detection doesn't necessarily match the encoder's keyframe placement decisions
[23:15:08 CEST] <kepstin> an example would be a flashing scene where it goes blank then back to the previous image - that's detected as a "scene change" but a good encoder might notice that the previous frame comes back, so it'll want to keep it in the same gop to use the reference frame.
[23:17:28 CEST] <kepstin> if you want maximum efficiency and also segmented encoding, you just need to use big segments to amortize the cost of the extra keyframes :/ I suppose attempting to align it with scene changes may help slightly, but i'd expect it to be pretty minimal.
[23:40:21 CEST] <void09> ok I guess I could do a test run. get a file with keyframe times ? or frame numbers?
[23:41:15 CEST] <void09> and then a file with scene changes as detected by ffmpeg. and see how many match
[23:52:29 CEST] <void09> any help with getting indices/timeframes of keyframes in a video?
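(One way to get that, not mentioned above, is to list packets flagged as keyframes with ffprobe; the input name is assumed:)

    ffprobe -v error -select_streams v:0 -show_entries packet=pts_time,flags -of csv=p=0 input.mkv | grep ',K'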
[00:00:00 CEST] --- Sat Jun 8 2019
More information about the Ffmpeg-devel-irc mailing list