[Ffmpeg-devel-irc] ffmpeg.log.20171127

burek burek021 at gmail.com
Tue Nov 28 03:05:01 EET 2017


[00:16:04 CET] <JEEB> SortaCore: I'm pretty sure there's a thing on "if you don't have any user interface then blah blah"
[00:24:24 CET] <ZexaronS> Hello
[00:26:58 CET] <ZexaronS> I have trouble with MPEGTS muxing when doing a test, the original TS is a bit broken (doesn't start on a keyframe) but encoders can deal with it, so I put it into MKVToolNix, made sure DVBSUB is supported as of this year, good, out to MKV and then used ffmpeg to mux/copy everything back to TS
[00:27:21 CET] <ZexaronS> but it produces a laggy file, both VLC and MPC-HC look the same
[00:28:56 CET] <ZexaronS> I'll also try testing with a recoded MKV, which was origTS -> Handbrake -> MKV1 -> MKVToolnix Add DVBSub from origTS -> MKV2 -> ffmpeg-MPEGTS
[00:29:47 CET] <ZexaronS> So the laggy example was, origTS -> MKVToolnix do nothing (copy all streams) -> MKV - > ffmpeg-MPEGTS
[00:30:31 CET] <ZexaronS> ffmpeg -loglevel verbose -i input.mkv -vcodec copy -acodec copy -scodec copy -threads 0 -f mpegts output.ts
[00:35:58 CET] <ZexaronS> oh it may be framerate .. maybe i need to force, it's interlaced source
[00:36:05 CET] <ZexaronS> no dropped frames
[00:36:21 CET] <ZexaronS> 25, it should be 50 fps
[00:36:38 CET] <ZexaronS> hopefully it's an easy fix
[00:38:37 CET] <ZexaronS> I was trying to find which options ffmpeg has for mpegts that are for mux only
[00:40:26 CET] <ZexaronS> actually, maybe it just doesn't play right here but will on the TV, though I doubt it; it'll be played with the video player, not the TV's internal DVB player from the signal
[00:46:47 CET] <ZexaronS> well that's truncated fps, oh ok, 25 is right,
[00:50:06 CET] <ZexaronS> could be some timestamp issue, but i can see the audio sync is just fine, it feels as if half the frames get dropped
[00:59:00 CET] <ZexaronS> Oh
[00:59:25 CET] <ZexaronS> jitter is at 54 ms in the muxed one, original has 0 to 1 ms most of the time
[02:08:53 CET] <ZexaronS> OH the test where recoding is involved seems to fix it - Handbrake gets rid of most of the corrupted bits
[03:34:26 CET] <yukiup> this is my current command: "ffmpeg -i in.mkv -c:v hevc_nvenc -c:a libopus -b:a 320k -ac 2 -c:s copy out.mkv"
[03:34:42 CET] <yukiup> any suggestions to improve the quality of dark black scenes?
[04:08:56 CET] <whysohard> Hi guys. For years I couldn't understand this. Ffmpeg is very powerful, but it cannot handle a very simple trim case: trimming the end of a file. There are several commands -ss -sseof -t... but they're not sufficient
[04:09:34 CET] <whysohard> There is a question about that: https://stackoverflow.com/questions/31862634/i-need-to-cut-mp4-videos-a-part-at-the-beginning-and-a-part-at-the-end-in-a-bat
[04:10:13 CET] <whysohard> The solution is sooo complex. Why don't we have an easy single command :(
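A hedged sketch of the trim-the-end case being asked about: ask ffprobe for the container duration, subtract the amount to cut, and pass the remainder as `-t`. Filenames and the 10-second cut are placeholders, not anything from the discussion.

```shell
#!/bin/sh
# Sketch: remove the last 10 seconds of in.mp4 (filenames are placeholders).
# 1) ask ffprobe for the total duration in seconds
dur=$(ffprobe -v error -show_entries format=duration -of csv=p=0 in.mp4)
# 2) keep everything except the final 10 seconds
keep=$(awk -v d="$dur" 'BEGIN { printf "%.3f", d - 10 }')
# 3) stream-copy that span; no re-encode
ffmpeg -i in.mp4 -t "$keep" -c copy out.mp4
```

Note that with `-c copy` the cut lands on the nearest keyframe, so the result may be slightly longer than requested.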
[05:03:56 CET] <sado[de]> Hi
[05:05:55 CET] <sado[de]> I have video files in mp4 and I can't play them; the moov atom isn't found. Before this error they worked. Does anyone have an idea how to repair it with ffmpeg?
[05:07:48 CET] <sado[de]> mom reconnect
[05:23:31 CET] <TheRock> decoder=avi,wma,ogg,wmv,vp8,vp9,opus,vorbis,vc1,mpegvideo,flv,mp3,aac,h264,mpeg2video,mpeg4,asf,swf,mov,wav
[05:23:37 CET] <TheRock> which one enables h263 ?
[05:24:04 CET] <TheRock> i added some decoders and h263 is now listed in configure overview
[05:24:42 CET] <TheRock> --enable-demuxer=wmv,ogg,vc1,m4v,flv,mp3,aac,avi,h264,matroska,mov,m4v,rawvideo,wav,mpegvideo,wma,asf,swf,vp8,vp9
[05:24:50 CET] <TheRock> i know some of them dont exist
[05:25:00 CET] <TheRock> but it's just for safety
[05:40:16 CET] <sado[de]> sry for reconnect
[05:41:21 CET] <sado[de]> Is it possible to repair a mp4 with missed atom section with ffmpeg ?
[11:49:14 CET] <fps> can I call av_register_all more than once per process?
[12:39:01 CET] <fps> or, put more technically: are these functions idempotent?
[12:39:05 CET] <fps>     avcodec_register_all();
[12:39:05 CET] <fps>     av_register_all();
[12:39:05 CET] <fps>     avformat_network_init();
[12:39:59 CET] <DHE> it is, but it isn't thread-safe
[12:47:36 CET] <fps> DHE: ok, ty
[14:12:18 CET] <Nacht> Is it possible to get ffmpeg's output in JSON, just like ffprobe ?
[14:12:32 CET] <JEEB> no
[14:12:40 CET] <JEEB> it's not supposed to be machine-parsed
[14:12:54 CET] <Nacht> Hmm. That's a shame
[14:13:33 CET] <Nacht> ffprobe isn't giving me the correct clip length, due to it being an estimate, so I thought I'd just use ffmpeg to get the correct length
[14:13:56 CET] <JEEB> -show_frames is what you want then
[14:14:01 CET] <JEEB> which will decode the pictures
[14:14:09 CET] <Nacht> Ah cheers
[14:15:15 CET] <DHE> but it does raise CPU requirements and takes time to run since it literally decodes the video/audio in memory
[14:15:22 CET] <BtbN> that can take _a long time_ though, depending on what you are decoding there
[14:18:55 CET] <Nacht> So be it. I'm using it to calculate the -ss & -t time. So it has to be accurate
[14:55:44 CET] <furq> Nacht: -count_frames, not -show_frames
[16:26:59 CET] <kepstin> i think -count_frames doesn't decode, it only demuxes. But still, high io requirement since it has to read the whole file
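The `-count_frames` route being suggested can be sketched like this; the stream index, assumed constant frame rate, and filename are placeholders:

```shell
#!/bin/sh
# Sketch: get an exact frame count by reading the whole file
# (nb_read_frames, unlike the header duration, is not an estimate),
# then derive the true duration from frames / fps.
frames=$(ffprobe -v error -count_frames -select_streams v:0 \
  -show_entries stream=nb_read_frames -of csv=p=0 in.mkv)
fps=25   # assumed constant frame rate of the clip
dur=$(awk -v f="$frames" -v r="$fps" 'BEGIN { printf "%.3f", f / r }')
echo "exact duration: ${dur}s"
```

As noted above, this has to demux the entire file, so expect it to take I/O-bound time on large inputs.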
[16:38:17 CET] <yvi> playback with mplayer -benchmark seems to be the best way to reduce latency (coming from ffmpeg x11grab->x265 encode->pipe->mplayer) according to my (simplistic) measurements..can i replicate whatever it does with ffplay/ffmpeg
[16:38:59 CET] <yvi> using ffplay or ffmpeg (with -f sdl) i can get around 100ms best case
[16:39:47 CET] <yvi> also i can't seem to make kmsgrab work..none of the examples work for me
[16:43:43 CET] <yvi> more like 120ms+ actually..with mplayer its half that
[16:44:43 CET] <yvi> this is for sending desktop over the network..at first i thought i had delays at the encoder/network but the buffering of all the fucking players is just terrible
[16:45:25 CET] <JEEB> normal players generally aren't by default suited for low latency usage
[16:45:35 CET] <JEEB> nor is libavformat or libavcodec by default
[16:47:30 CET] <bencoh> libavcodec is fine actually
[16:59:30 CET] <JEEB> bencoh: for encoding you might have to configure things
[16:59:39 CET] <JEEB> for decoding yea
[16:59:44 CET] <JEEB> you generally get as good as you get
[17:00:55 CET] <xuing> Hello, could someone help me with https://github.com/openwrt/packages/issues/5181 ? Thanks.
[17:04:12 CET] <yvi> have zero problems with encoding latency
[17:04:30 CET] <fps> xuing: do you want to help, or do you want to get help?
[17:04:48 CET] <yvi> seems to be that mplayer is the only player that can actually do this for some reason
[17:05:10 CET] <yvi> i tried mpv also with untimed, no demuxer readahead, no caches etc.. not working
[17:05:32 CET] <xuing> sorry, my English is very poor
[17:05:40 CET] <xuing> I mean I need help
[17:10:31 CET] <bencoh> JEEB: ah, for encoding libavcodec is terrible yeah :)
[17:10:41 CET] <JEEB> well as terrible as any other thing to be honest
[17:10:50 CET] <JEEB> since most encoders just are by default not latency optimized
[17:10:53 CET] <bencoh> s/encoding/low latency encoding/ mybad
[17:11:23 CET] <JEEB> in theory I think you can get the lavc libx264 layer to as low latency as x264 permits, I think? although not sure about the separate slice outputs
[17:11:43 CET] <JEEB> you might need to wait until a picture is fully encoded so you can get the AVPacket
[17:11:54 CET] <atomnuker> bencoh: kierank has some patches to support the equivalent of draw_horiz_band in encoding
[17:12:01 CET] <bencoh> oh :)
[17:12:09 CET] <bencoh> JEEB: that, and some vbv issues
[17:13:02 CET] <JEEB> yea but that isn't avcodec specific, is it?
[17:13:12 CET] <bencoh> afaicr (that was a few years ago) avcodec doesn't pass along all the vbv information needed for muxing
[17:13:13 CET] <JEEB> unless it's one of the internal encoders which I have no idea of :D
[17:13:37 CET] <bencoh> (while directly calling x264 "just worked" fine)
[17:13:52 CET] <JEEB> yea, MPEG-TS mux rate is separate and UDP's bitrate are separate
[17:14:22 CET] <JEEB> if you set maxrate/bufsize those are set into the avcodec context and thus applied to libx264
[17:14:40 CET] <JEEB> (plus you have the x264-params thing if you need something not available through lavc's struct)
[17:14:42 CET] <bencoh> yep, but you're still missing per-frame information back from the encoder
[17:14:56 CET] <bencoh> (at least that's what I recall from back then)
[17:15:35 CET] <JEEB> if you need the nal-hrd vbr stuff that has to be enabled specifically in x264 cli , libx264 and of course lavc
[17:15:49 CET] <JEEB> it doesn't "just work" with x264 or libx264 either
[17:16:23 CET] <JEEB> and it's highly unlikely that libavcodec strips some things out of the bit stream given to you by libx264, although theoretically that is possible
[17:17:05 CET] <bencoh> no it doesn't strip anything, it was metadata available in the picture struct iirc
[17:19:41 CET] <bencoh> although I can't seem to find what, which has me confused now
[17:28:53 CET] <JEEB> bencoh: nal-hrd vbr/cbr output extra stuff
[17:29:01 CET] <JEEB> which contain the current state of the NAL-HRD stuff
[17:52:04 CET] <Cracki> cheers. where would I submit a feature request for the dshow input device? got a hardware-h264-capable webcam and would like to set its bitrate too. I've found some relevant places in the code
[17:52:35 CET] <Cracki> namely libavdevice/dshow.c:1284 (static const AVOption options[])
[17:53:02 CET] <Cracki> and anything that uses the VIDEO_STREAM_CONFIG_CAPS structure from dshow
[17:53:43 CET] <Cracki> that thing _has_ Min/MaxBitsPerSecond and a debug-guarded print output in the dshow module also lists these values
[17:53:55 CET] <Cracki> however I see nothing that sets them
[17:59:34 CET] <Cracki> seems the cli argument parsing puts the given values into a struct dshow_ctx, which might need extending too
[18:05:47 CET] <Cracki> I think 'dshow_cycle_formats' is where most of the work needs to be
[18:06:19 CET] <Cracki> I am not sure I am set up to build ffmpeg myself (on windows), but I might try...
[18:26:09 CET] <fps> yvi: do you have libshine installed?
[18:28:12 CET] <TheRock> player.audio()->setVolume(100);
[18:28:17 CET] <TheRock> is it normal that the sound is so loud
[18:28:24 CET] <TheRock> that my notebook vibrates?
[18:28:35 CET] <TheRock> i never heard such a loud sound from my notebook
[18:33:14 CET] <alexpigment> TheRock: if your notebook vibrates at max volume, then that's just something you have to accept about your notebook. There's nothing software-wise that can increase the volume of playback above what the driver already considers to be 100%
[18:34:20 CET] <alexpigment> also, keep in mind that most audio is not brickwalled at 0db. a lot of the content out there sits between -6db and -3db, maybe peaking near 0db at points
[18:34:51 CET] <alexpigment> if you have a highly compressed audio track and you normalize it to 100%, it's going to be louder than most things you hear
[18:35:06 CET] <TheRock> i'm just wondering, because if i don't set the volume and control it through windows, the sound is not as loud as when i set it to 100 in the software
[18:36:26 CET] <alexpigment> TheRock: out of curiosity, what is the scale of setVolume?
[18:36:54 CET] <TheRock> the library has 0-100
[18:37:03 CET] <TheRock> and if i don't set it, the windows volume is used
[18:37:11 CET] <alexpigment> what if you set it to 1?
[18:38:10 CET] <TheRock> setVolume(1) is quite loud
[18:38:11 CET] <alexpigment> if it works on the same scale as the audio volume filter in FFMPEG, then 1 is 100%
[18:38:24 CET] <alexpigment> try 0.5
[18:38:27 CET] <alexpigment> see if it's half volume now
[18:38:43 CET] <TheRock> and 100 is louder than a jet engine
[18:39:45 CET] <TheRock> you are right
[18:39:50 CET] <TheRock> 0.05 - 1
[18:39:53 CET] <TheRock> is normal
[18:40:58 CET] <alexpigment> ok, then 100 is simulating what I described above, which is a brick-walled track at 0db. kinda the theoretical max volume of your computer
[18:43:14 CET] <TheRock> yeah
[18:43:15 CET] <TheRock> at 100
[18:43:25 CET] <TheRock> you don't even understand the words of the movie
[18:43:48 CET] <alexpigment> yeah, intelligibility requires dynamics ;)
[18:44:06 CET] <TheRock> so i could shock my app users
[18:44:10 CET] <TheRock> by setting it to 100
[18:44:21 CET] <alexpigment> just call it "distortion" and sell it as a feature ;)
[18:44:42 CET] <alexpigment> "I've digitally recreated a guitar distortion pedal in this app"
[18:46:13 CET] <TheRock> :P
[19:08:35 CET] <devinheitmueller> Question:  if I need a generic FIFO for AVPackets, is there some existing implementation I should be using?  Should I be trying to reuse av_fifo_generic*() ?
[19:12:11 CET] <thebombzen> is there any easy way to combo -vf cropdetect with -vf crop? or does that require two passes and copy/pasting by hand?
[19:12:54 CET] <thebombzen> I mean I could always run -vf cropdetect and output to null, copy/paste from the terminal, but I was looking for a way to perhaps expedite the process (or rather, what I really want is to make it more scriptable)
[19:13:21 CET] <BtbN> no
[19:17:52 CET] <thebombzen> is there a way to have ffmpeg.c output the cropdetect info in a more machine-parseable way?
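Absent structured output, the stderr log can be parsed as-is with a two-pass script; a sketch, where the filenames and the exact cropdetect log line format (`... crop=W:H:X:Y`) are assumptions about current ffmpeg output:

```shell
#!/bin/sh
# Pass 1: run cropdetect to a null muxer; it prints lines ending in
# crop=W:H:X:Y on stderr, so keep the last one seen.
crop=$(ffmpeg -i in.mkv -vf cropdetect -f null - 2>&1 \
  | awk -F'crop=' '/crop=/ { c = $2 } END { print c }')
# Pass 2: apply the last detected crop and copy the audio.
ffmpeg -i in.mkv -vf "crop=$crop" -c:a copy out.mkv
```

Two invocations are still needed; this only automates the copy/paste step.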
[19:19:16 CET] <devinheitmueller> Feels like the sort of case where you would want cropdetect to set frame metadata, which can then be consumed by the crop filter.
[19:19:49 CET] <devinheitmueller> Oh look, cropdetect does set metadata!
[19:19:55 CET] <devinheitmueller>         SET_META("lavfi.cropdetect.x1", s->x1);
[19:20:17 CET] <BtbN> that won't work
[19:20:29 CET] <BtbN> The frame dimensions have to be known, you cannot change them on the fly
[19:21:22 CET] <devinheitmueller> BtbN: Presumably that would depend on the output format.  Some output formats/codecs do support frame dimension changes.
[19:21:50 CET] <devinheitmueller> Let me rephrase - the specs for some codecs support frame dimension changes.  No idea if ffmpeg's implementations of such codecs do.
[19:21:53 CET] <BtbN> none that are in ffmpeg
[19:22:06 CET] <BtbN> And I'm pretty sure filtergraphs and ffmpeg.c don't support it either
[19:23:11 CET] <devinheitmueller> Continues to amaze me how some things that happen all the time in broadcast television are seen as some weird/wacky edge case in the eyes of ffmpeg.  :-)
[19:23:29 CET] <JEEB> decoders can output different sized frames just fine
[19:23:39 CET] <JEEB> at least stuff like H.264
[19:23:48 CET] <JEEB> the issue really is the rest of the business like lavfi
[19:23:52 CET] <thebombzen> could you handle it with something like sendcmd?
[19:24:00 CET] <BtbN> no
[19:24:02 CET] <JEEB> ffmpeg.c is a completely separate thing
[19:24:08 CET] <JEEB> and other pile of lulz
[19:24:29 CET] <devinheitmueller> JEEB: Yeah, I can imagine things like resolution changing after the filtergraph has been setup can cause some confusion (i.e. in principle you may need to renegotiate the pipeline)
[19:24:30 CET] <BtbN> For filters, the output resolution has to be known when config is called on the outlink
[19:24:38 CET] <BtbN> which iirc happens before even a single frame is fed
[19:24:55 CET] <thebombzen> I say "scriptable" but realistically I'm just calling ffmpeg.c from a language that isn't C/C++
[19:25:08 CET] <JEEB> devinheitmueller: yea
[19:25:10 CET] <thebombzen> it's easier to call ffmpeg.c than it is to deal with bindings to libav* from high level languages
[19:25:58 CET] <thebombzen> either way, would it be possible to use vf cropdetect to generate a crop filter for one frame, and then use that for vf_crop without changing it?
[19:26:02 CET] <JEEB> devinheitmueller: that said if you control your own flow you can do that just fine. flush the filter graph and re-create it if necessary. of course it would be better if lavfi could handle it nicer but it is handle'able
[19:26:30 CET] <JEEB> devinheitmueller: for broadcast stuff the limitations start appearing in things like "you cannot add streams after you've done av_write_header"
[19:26:39 CET] <devinheitmueller> I continue to look for good libav* bindings in any high level language.  Recommendations welcome.
[19:27:05 CET] <JEEB> I think I tested some python bindings generator some time ago that was not as backwards as SWIG
[19:27:11 CET] <devinheitmueller> JEEB: you can add streams after av_write_header if a flag is set, but I think the problem is really there is no way to *remove* streams.
[19:27:33 CET] <JEEB> yea, that too
[19:27:34 CET] <devinheitmueller> Ugh SWIG.  I just threw up in my mouth a little.  :-/
[19:27:40 CET] <JEEB> yes, I tried that in 2014
[19:27:42 CET] <JEEB> it was awful
[19:27:52 CET] <JEEB> recently I tried rust and I liked bindgen
[19:28:11 CET] <JEEB> too bad I'm not sure of the rust ecosystem just yet
[19:28:19 CET] <JEEB> golang is OK until you hit an edge case
[19:28:41 CET] <devinheitmueller> That's the issue.  Sure, I could learn rust and write it, but there are so few developers I could hire to maintain the result.
[19:29:05 CET] <JEEB> one of my friends has been unlucky enough to be in multiple golang tickets about the C FFI
[19:29:30 CET] <JEEB> I think there's some stuff for C# and Java as well to generate the mappings during build time
[19:29:58 CET] <JEEB> of course you have to then create your own higher level things but that generally depends so much on what you're trying to do that I'm not sure if it's worth abstracting too much
[19:30:35 CET] <devinheitmueller> Would just be nice to be able to assemble pipelines without having to write my own version of ffmpeg.c.
[19:31:00 CET] <JEEB> no, you don't want to be anywhere near ffmpeg.c to be honest :D
[19:31:10 CET] <JEEB> ffmpeg.c is trying to be too many things and isn't dynamic at all
[19:32:02 CET] <devinheitmueller> I argued at the VDD conference that there is way too much being done by ffmpeg.c that should really be part of the framework itself, but I don't think we would ever see any consensus on what parts to abstract or how they should work.
[19:32:26 CET] <JEEB> I was around that table as well :)
[19:33:04 CET] <JEEB> but yes, I agree there are too many things in ffmpeg.c that fixup some weirdness in how lavf or so work (esp. wrt timestamps)
[19:33:59 CET] <devinheitmueller> Yeah, I've got some pretty nasty patches related to passing along the original timestamps to the output, which I suspect will result in some lively debate.
[19:34:58 CET] <JEEB> well, that's how for example the MPEG-TS demuxer is currently (supposed) to work, which from the other side is just pants-on-head stupid for something that most of the framework thinks should be a proper monotonically rising timestamp
[19:35:14 CET] <JEEB> which is why upipe has I think three timestamps
[19:35:33 CET] <JEEB> coded timestamp, aligned (?) timestamp and receival timestamp
[19:35:38 CET] <devinheitmueller> Right.  The TS demux passes along the original timestamps, which ffmpeg.c immediately throws away and re-timestamps everything before the frames hit the output.
[19:36:16 CET] <JEEB> first being the original, aligned being what a lot of the framework expects and receival being there just for the shits n' giggles (and for some heuristics)
[19:36:39 CET] <JEEB> I would say we could use at least two of those
[19:37:10 CET] <JEEB> because in some cases I want the original timestamps, but then in other cases I as the API user just want the monotonically rising stuff that is fixed from the original coded stuff
[19:37:10 CET] <devinheitmueller> From what I hear, proper timestamps is one of the strengths that upipe claims to have.
[19:37:16 CET] <JEEB> yea
[19:37:29 CET] <JEEB> I might or might not have noted that at the discussion
[19:37:45 CET] <JEEB> because both use cases are 100% valid but having them in the same DTS/PTS field makes IMHO no sense
[19:38:03 CET] <JEEB> because then you get disconnects like half of the framework not taking things in
[19:38:58 CET] <devinheitmueller> Yup.
[19:40:27 CET] <devinheitmueller> I'll be the first to acknowledge that the concept of timestamps in general is hard.  At this point I'm just trying to find low-impact ways to extend ffmpeg to be able to accomplish the use cases I care about.
[19:43:33 CET] <JEEB> for me my initial thought was to move the mpeg-ts specific timestamp fixing into the demuxer, and since some people wanted the original timestamps make it an option to make life simpler for the API (ab)user
[19:44:01 CET] <devinheitmueller> JEEB: I started injecting the original timestamp as side data.  No idea if that will get accepted upstream though.
[19:44:07 CET] <JEEB> :D
[19:44:19 CET] <JEEB> that's sad yet I understand why you're doing that
[19:44:21 CET] <JEEB> which is also sad
[19:44:28 CET] <devinheitmueller> That's the least invasive approach I could come up with.
[19:45:02 CET] <JEEB> yea, because you can ignore all the ffmpeg.c stuff and just check in the muxer
[19:45:09 CET] <JEEB> and suddenly you've got original stuff
[19:45:13 CET] <devinheitmueller> Yup.
[19:45:24 CET] <devinheitmueller> I've got a bitstream filter which acts on it.
[19:47:53 CET] <JEEB> I wonder if -copyts and -vsync passthrough was all that I was using at some point... (for some other reasons). and I bet there was some code in ffmpeg.c to poke even that
[19:48:03 CET] <JEEB> I remember at some point looking into the "vsync" code in ffmpeg.c
[19:48:13 CET] <JEEB> iä iä cthulhu f'thagn
[22:21:47 CET] <TheRock> Is XAudio2 thread friendly?
[22:22:22 CET] <TheRock> XAudio2 backend doesn't make any sound
[22:22:26 CET] <TheRock> in a thread
[22:22:29 CET] <TheRock> any ideas?
[22:22:47 CET] <TheRock> When DirectSound backend is activated all works as expected
[22:22:54 CET] <durandal_1707> whats XAudio2?
[22:23:43 CET] <TheRock> Looks like QtAV (lib for ffmpeg) uses it as backend
[22:25:57 CET] <BtbN> You should ask them about it then
[22:27:48 CET] <JEEB> TheRock: this is FFmpeg, not <random project utilizing FFmpeg in the background>
[22:31:36 CET] <TheRock> nah
[22:31:58 CET] <TheRock> if you guys link/recommend the library on the page, there could at least be a chance one of you guys knows something ;)
[22:32:43 CET] <JEEB> eh
[22:32:50 CET] <JEEB> do we recommend QtAv anywhere?
[22:32:56 CET] <JEEB> if we do please note so it can be removed
[22:33:35 CET] <durandal_1707> why?
[22:34:08 CET] <camjac251> Does anyone have any experience with converting variable framerate to constant framerate videos with ffmpeg? I have this command https://pastebin.com/ZBfAYdgF and was wondering if it is correct for conversion of VFR to CFR (shadowplay NVENC footage to dnxhd)
[22:34:25 CET] <TheRock> there is no need to remove it, QtAV took the author a lot of work. it can be found on your ffmpeg page https://trac.ffmpeg.org/wiki/Projects
[22:34:33 CET] <JEEB> that's the wiki
[22:34:37 CET] <JEEB> free rein pretty much
[22:34:39 CET] <JEEB> :P
[22:34:49 CET] <JEEB> you cannot take it as any sort of recommendation
[22:34:52 CET] <TheRock> ah, ok
[22:41:24 CET] <alexpigment> camjac251: have you tried just putting in -r 60 or something like that?
[22:42:06 CET] <alexpigment> i know there's a force-cfr param for x264, but I'm not sure about dnxhd
[22:42:19 CET] <alexpigment> either way, specifying the frame rate explicitly *should* convert it to CFR
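The suggestion above can be written out like this; placing `-r` after `-i` makes it an output option, which (together with `-vsync cfr`) forces a constant frame rate. Codec and filenames are placeholders:

```shell
#!/bin/sh
# Sketch: force constant 60 fps on the output side.
ffmpeg -i vfr_in.mp4 -r 60 -vsync cfr -c:v libx264 -c:a copy cfr_out.mp4
# at CFR every frame has the same duration, 1/60 s here:
ft=$(awk 'BEGIN { printf "%.5f", 1 / 60 }')
echo "frame time: ${ft}s"
```

Frames are duplicated or dropped as needed, so the output timestamps become evenly spaced.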
[22:42:59 CET] <camjac251> It's added right after the input
[22:45:34 CET] <alexpigment> camjac251: apparently I didn't look at your command close enough ;)
[22:45:43 CET] <alexpigment> at any rate, that should work fine
[22:47:07 CET] <camjac251> i noticed something weird. in mpc-hc with madvr it looks fine but in After Effects it's different. The start of the ffmpeg converted video and the same source converted with handbrake to constant framerate at 60fps starts on different frames
[22:47:19 CET] <camjac251> one of the files is 2 frames behind the other
[22:47:36 CET] <gon_> Hello, having some trouble with combining files. I'm trying to combine video clips with pictures using hstack. The clips and pictures are the same height but the image resolution seems off, it's too wide. Any ideas?
[22:49:39 CET] <camjac251> could be aspect ratio?
[22:50:28 CET] <camjac251> pixel format
[22:50:32 CET] <durandal_1707> use setsar, latest master has this limitation removed
[22:50:51 CET] <camjac251> thank you alexpigment for your help
[22:52:24 CET] <gon_> Yeah I seem to be missing something. The video clip is 720x960 so I made sure the image height was also 960, thinking the rest would be taken care of. Using hstack to merge them together
[22:53:41 CET] <durandal_1707> gon_: as said use setsar,  pastebin full ffmpeg output
[23:03:12 CET] <gon_> Sorry still very new to ffmpeg. Just trying this out. How could I use setsar instead of hstack? or along with hstack? Here's where I'm at so far https://pastebin.com/cj8zr4Zp
[23:03:22 CET] <bray90820> How would I use ffmpeg with audacity
[23:04:07 CET] <durandal_1707> gon_: before each hstack call setsar=1/1 on each input video
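The setsar-before-hstack advice as a concrete filtergraph; input names are placeholders, and the image is assumed to already match the clip's height:

```shell
#!/bin/sh
# Sketch: normalize the sample aspect ratio on both inputs so hstack
# accepts them, then stack the clip and the (looped) image side by side.
graph='[0:v]setsar=1[l];[1:v]setsar=1[r];[l][r]hstack'
ffmpeg -i clip.mp4 -loop 1 -i picture.png \
  -filter_complex "$graph" -shortest out.mp4
```

`-loop 1` keeps the still image feeding frames and `-shortest` ends the output with the clip.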
[23:04:51 CET] <durandal_1707> bray90820: by compiling audacity with ffmpeg support
[23:10:28 CET] <durandal_1707> gon_: i see nothing wrong with that pastebin output, perhaps disable shortest of hstack
[23:11:58 CET] <gon_> Yeah it's coming out but just having that aspect ratio problem with the image. The video AR seems to be ok but just having trouble keeping the AR on the image. Now trying to figure out the order for the call you mentioned
[23:13:01 CET] <alexpigment> camjac251: it's possible there's a cue sheet or something that tells the player to start at a particular time. that's pretty common for quicktime files
[23:13:13 CET] <alexpigment> not sure if it's what's going on, but I figured i'd mention it
[23:13:36 CET] <durandal_1707> gon_: well if aspect ratios do not match you will need to manually fix that with pad and scale
[23:18:31 CET] <gon_> So the same AR even though height is the same for both inputs?
[00:00:00 CET] --- Tue Nov 28 2017


More information about the Ffmpeg-devel-irc mailing list