[Ffmpeg-devel-irc] ffmpeg.log.20190514
burek
burek021 at gmail.com
Wed May 15 03:05:01 EEST 2019
[01:58:31 CEST] <le_chiffre> one more try?
[01:58:49 CEST] <le_chiffre> sorry 'bout that. wouldn't let me send before registering. ok here goes
[01:59:01 CEST] <le_chiffre> i'm having a problem with an m2ts->mkv->mp4->fcpx import. the first thing i discovered is that ffmpeg seems to be adding blank frames
[01:59:41 CEST] <le_chiffre> is there any chance ffmpeg does this? with 'ffmpeg -i foo.m2ts -c copy foo.mkv', i think the frame count grows from 189260 in the original m2ts to 189280 in the mkv
[02:00:28 CEST] <le_chiffre> this growth isn't falling on a whole second boundary
[02:03:41 CEST] <le_chiffre> if i do the same thing with a commercial tool, dvdfab, it creates an mkv with the original 189260 frames. both video streams in the 2 generated mkvs are frame-for-frame identical except that the ffmpeg one has 20 extra blank frames at the end.
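To reproduce that kind of count, ffprobe can tally decoded frames per stream; a minimal sketch (note that -count_frames decodes the whole stream, so it is slow on long files):

    ffprobe -v error -select_streams v:0 -count_frames \
        -show_entries stream=nb_read_frames -of csv=p=0 foo.m2ts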
[08:48:38 CEST] <lindylex> This is broken because it makes my video's play time too long and doesn't join the audio seamlessly. https://pastebin.com/g7bVmNZn
[09:34:46 CEST] <lindylex> Solved it. I needed to add asetpts=PTS-STARTPTS after each audio cut.
[09:35:03 CEST] <lindylex> Like this : [0:a]atrim=02:10,asetpts=PTS-STARTPTS,atempo=1/0.6[a2];
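For reference, a minimal sketch of that pattern with two cuts joined back together; the times and file names are hypothetical:

    ffmpeg -i in.mp4 -filter_complex \
        "[0:a]atrim=2:10,asetpts=PTS-STARTPTS[a1]; \
         [0:a]atrim=15:20,asetpts=PTS-STARTPTS[a2]; \
         [a1][a2]concat=n=2:v=0:a=1[aout]" \
        -map "[aout]" out.m4a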
[13:59:07 CEST] <ButtDog> Has anyone here used ffmpeg-python library for Python?
[14:39:38 CEST] <killown> can anyone help me cut 5 seconds from a video given the start time?
[14:40:07 CEST] <killown> I have this, which doesn't work: ffmpeg -y -ss 0:00:10 -i input -strict -2 -c copy -t 0:00:13 output
[15:35:55 CEST] <kepstin> killown: you've got two problems: 1. when using stream copy (-c copy), seeking is not exact - it will start at a nearby keyframe instead
[15:36:13 CEST] <kepstin> killown: and 2. the -t option specifies output duration, so to get 5 seconds, use -t 5
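Combining both fixes: the stream-copy version, which is fast but starts at the nearest preceding keyframe,

    ffmpeg -y -ss 0:00:10 -i input -t 5 -c copy output

and, if the cut must be frame-accurate, a re-encoding sketch (codec choices here are only an example):

    ffmpeg -y -ss 0:00:10 -i input -t 5 -c:v libx264 -c:a aac output.mp4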
[20:12:07 CEST] <cartman412> Hey peeps, I've been using ffmpeg to generate memes (by adding padding and overlaying an image over the input video) and I use palettegen/paletteuse. To achieve this I first run a command to generate the color palette and then use it as one of the inputs to generate my final gif. However, I was wondering if it was at all possible to chain these two processes? I tried this filter:
[20:12:09 CEST] <cartman412> fps=15,scale=720:720:force_original_aspect_ratio=decrease:flags=lanczos[x2];[x2][1:v]overlay=0:0[b];[b]palettegen[x3];[b][x3]paletteuse
[20:12:33 CEST] <cartman412> but it throws an error saying [b] is not a valid input
[20:12:51 CEST] <JEEB> &34
[20:13:04 CEST] <cartman412> I'm guessing I can only use the output from the previous filtergraph rather than any of the outputs in the chain?
[20:13:13 CEST] <cartman412> With the exception of the inputs?
[20:15:01 CEST] <cartman412> In order for me to chain them I guess I would have to call the same filters that I applied to generate the palette in the first place?
[20:15:10 CEST] <DHE> you'll have to use the split filter
[20:15:17 CEST] <DHE> [b]split[b1][b2]
[20:15:37 CEST] <cartman412> so it's because I already used that stream, I guess?
[20:15:45 CEST] <cartman412> rather than having to reapply the filters?
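A labeled filtergraph output can only be consumed once, which is why [b] is rejected as an input the second time; split duplicates the stream. A sketch of the repaired chain (the same shape as the working command further down):

    fps=15,scale=720:720:force_original_aspect_ratio=decrease:flags=lanczos[x2];
    [x2][1:v]overlay=0:0[b];[b]split[b1][b2];
    [b1]palettegen[x3];[b2][x3]paletteuse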
[20:20:03 CEST] <lazylemur> Hey y'all, trying to use 'apad' with the 'whole_dur' option and it's throwing an error. Paste: https://pastebin.com/GMSBRBwg
[20:20:09 CEST] <lazylemur> Thoughts?
[20:21:46 CEST] <lazylemur> It's just saying the 'whole_dur' option is not found. I'm using the version provided by: https://www.npmjs.com/package/ffmpeg-static
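whole_dur is a relatively recent addition to apad, so a prebuilt static binary may simply predate it. On older builds, the long-standing whole_len option (a total sample count) does the same job; a hypothetical equivalent padding to 10 seconds of 48 kHz audio:

    ffmpeg -i in.wav -af apad=whole_len=480000 out.wav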
[20:23:51 CEST] <cartman412> Thank you DHE that worked :)
[20:33:21 CEST] <cartman412> So let's say I want to generate a thumbnail-sized gif along with my main gif, how would I go about it? This is my current command for generating the gif:
[20:34:01 CEST] <cartman412> -y -t 0.67 -i videoPath -i overlayPath -filter_complex fps=15,scale=720:720:force_original_aspect_ratio=decrease:flags=lanczos,pad=720:720:(ow-iw)/2:(oh-ih)/2:0xFF0A64[x2];[x2][1:v]overlay=0:0[b];[b]split[b1][b2];[b1]palettegen[x3];[b2][x3]paletteuse outputPath
[20:36:00 CEST] <cartman412> I guess I can split the paletteuse output and use one stream to save to file and resize the other output and then save it?
[20:36:13 CEST] <DHE> that's what I would do
[20:36:36 CEST] <cartman412> better than splitting before the scale filter, right?
[20:37:01 CEST] <DHE> ;[b2][x3]paletteuse[pout];[pout]split[mainout][resizeme];[resizeme]scale=...[resizeout]
[20:37:28 CEST] <DHE> -map [mainout] bigoutput.gif -map [resizeout] smalloutput.gif
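Spliced into the earlier command, the whole thing would look roughly like this; the 180x180 thumbnail size is only an example:

    ffmpeg -y -t 0.67 -i videoPath -i overlayPath -filter_complex \
        "fps=15,scale=720:720:force_original_aspect_ratio=decrease:flags=lanczos,pad=720:720:(ow-iw)/2:(oh-ih)/2:0xFF0A64[x2]; \
         [x2][1:v]overlay=0:0[b];[b]split[b1][b2];[b1]palettegen[x3];[b2][x3]paletteuse[pout]; \
         [pout]split[mainout][resizeme];[resizeme]scale=180:180[resizeout]" \
        -map "[mainout]" bigoutput.gif -map "[resizeout]" smalloutput.gif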
[20:37:36 CEST] <cartman412> wow, thank you for the help again DHE!
[20:38:17 CEST] <cartman412> I'm guessing that this one won't significantly improve the overall time though?
[20:38:33 CEST] <cartman412> (as opposed to my previous chain)
[20:38:49 CEST] <DHE> I'm just appending to the end of your current chain. split the output into a main and smaller version, and save both copies
[20:39:02 CEST] <DHE> it'll be faster than running the whole pipeline twice with different options
[20:39:48 CEST] <cartman412> I mean, compared with just running that final filter in a new command, using the input from disk
[20:39:58 CEST] <cartman412> I guess the difference will be one needs to read from disk as opposed to from memory
[20:40:08 CEST] <DHE> and all the initial filtering is only done once
[20:40:51 CEST] <DHE> there is one thing I'm not sure of though. rescaling the output may ruin the work done by paletteuse. it may be necessary to split upstream of that, and then run paletteuse on both paths
[20:41:09 CEST] <DHE> unless you do a low quality nearest-neighbour scaling
[20:41:27 CEST] <cartman412> I mean, I currently don't run paletteuse on the rescaling, and while the quality drops a bit it's not substantial, so I think it's ok
[20:42:47 CEST] <cartman412> I've only been using ffmpeg for a couple of months now, but I get more and more impressed by it every day :D Plus it's the second time I'm asking stuff in this channel and people are always so quick to help :) Awesome community
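If the palette damage ever does become visible, the nearest-neighbour scaling DHE mentions is just a flags change on the thumbnail branch, since each output pixel then copies an existing palette pixel (size hypothetical):

    [resizeme]scale=180:180:flags=neighbor[resizeout]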
[20:45:51 CEST] <ossifrage> What is the advantage of column intra-refresh over row? It seems like row refresh would have less overhead
[20:47:33 CEST] <DHE> is that even an option?
[20:49:48 CEST] <ossifrage> DHE, it is with some encoders
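x264 is one such encoder: its Periodic Intra Refresh replaces keyframes with a vertical column of intra-coded blocks that sweeps across the picture, which is where the column behaviour comes from. A sketch of enabling it via ffmpeg:

    ffmpeg -i in.mp4 -c:v libx264 -x264-params intra-refresh=1 out.ts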
[21:06:10 CEST] <another> can anybody recommend a deinterlace filter? there are quite a few
[21:06:55 CEST] <DHE> it depends on your exact needs, but for simple watching try just yadif
[21:09:53 CEST] <another> any resources to read up on them?
[21:21:59 CEST] <DHE> the raw docs are at https://ffmpeg.org/ffmpeg-filters.html but for some of the more complex filters you may want to google for how people use them in filter-chains and stuff
[21:24:16 CEST] <another> hm
[21:25:28 CEST] <another> think i'll go with bwdif
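Both are drop-in -vf filters; a minimal sketch with the filter options left at their defaults:

    ffmpeg -i interlaced.ts -vf bwdif -c:a copy progressive.mp4

(swap bwdif for yadif to compare the two)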
[22:40:00 CEST] <ossifrage> /join #fedora
[22:40:17 CEST] <ossifrage> doh, bad finger, bad...
[00:00:00 CEST] --- Wed May 15 2019