[Ffmpeg-devel-irc] ffmpeg.log.20180810

burek burek021 at gmail.com
Sat Aug 11 03:05:01 EEST 2018


[00:04:17 CEST] <Maverick|MSG> anyone using the latest 'ffmpeg windows build helpers' script run into an issue on ubuntu where the compiler complains "libmfx not found"
[01:23:16 CEST] <botik101> guys, how can I get ffmpeg to stop searching for frames AFTER it found all the frames in the video? I have a 10 hr long video and I want to grab a few frames. If the frames are in the beginning of the video, ffmpeg keeps on scanning until the end of the video even after it found all the frames I have specified with -select
[01:26:38 CEST] <c_14> use -frames:v with the total number of frames you want
[01:27:07 CEST] <botik101> c_14 - makes total sense!
[01:27:21 CEST] <botik101> c_14 - I was using that, but there is a problem: ffmpeg -i /home/victor/Videos/FIFA/mundo-full.mp4 -filter:v "select='lt(prev_pts*TB\,(96.056835, u'church'))*gte(pts*TB\,(96.056835, u'church')) + 'lt(prev_pts*TB\,(96.967261, u'church'))*gte(pts*TB\,(96.967261, u'church'))'" -vsync vfr ./tmp/still_%06d.png
[01:28:14 CEST] <botik101> c_14: as you can see, because we cannot specify an exact timestamp (since ffmpeg might not find it) we are specifying the next best thing - LT/GT... so the total number is a big question mark. We do know the ranges though
[01:29:02 CEST] <botik101> arghhh, ignore anything other than values please....
[01:30:07 CEST] <c_14> use -t as an input option to limit the duration of the file to be used
[01:31:16 CEST] <botik101> -t will allow me to specify the maximum length? so if the file is 10 hrs long, and the last frame is at 2:34:21 then I should just specify -t to be 3:00:00?
[01:31:18 CEST] <botik101> ?
[01:31:36 CEST] <c_14> yeah
[01:32:01 CEST] <botik101> like this? ffmpeg -i /home/victor/Videos/FIFA/mundo-full.mp4 -t 97 -filter:v "select='lt(prev_pts*TB\,96.056835)*gte(pts*TB\,96.056835) + 'lt(prev_pts*TB\,96.967261)*gte(pts*TB\,96.967261)'" -vsync vfr ./tmp/still_%06d.png
[01:33:29 CEST] <c_14> -t 97 before -i
[01:33:50 CEST] <botik101> excellent! This makes life so much easier.
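A sketch of the reordered command with -t as an input option (same timestamps as in the command pasted above; the stray quotes in the pasted version are dropped):

    ffmpeg -t 97 -i /home/victor/Videos/FIFA/mundo-full.mp4 -filter:v "select='lt(prev_pts*TB\,96.056835)*gte(pts*TB\,96.056835) + lt(prev_pts*TB\,96.967261)*gte(pts*TB\,96.967261)'" -vsync vfr ./tmp/still_%06d.png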
[01:35:08 CEST] <botik101> Sometimes frames are 40 minutes, 1 hour or even 7 hours apart. For now, I am splitting the videos into smaller fragments and then searching in those fragments, but it feels very inefficient. Is there a way to tell ffmpeg to jump to the times where I want it to cut?
[01:35:42 CEST] <c_14> -ss before -i
[01:36:06 CEST] <poutine> isn't it typically recommended to do it both before and after as before seeks to iframe?
[01:36:22 CEST] <poutine> which might not be that exact point
[01:36:30 CEST] <c_14> poutine: nah, you don't have to do that anymore
[01:36:41 CEST] <poutine> good to know, thanks c_14
[01:36:53 CEST] <c_14> -ss is frame-exact when used as an input option these days (since a few years)
[01:37:16 CEST] <poutine> I got it from an old stackoverflow, makes sense, thanks for clearing that up
[01:37:55 CEST] <furq> c_14: is that true of all demuxers
[01:38:16 CEST] <furq> the manual still seems to think -ss as an input option is imprecise
[01:38:39 CEST] <c_14> furq: if you're reencoding, yes
[01:38:42 CEST] <c_14> if no, no
[01:38:59 CEST] <c_14> >When transcoding and -accurate_seek is enabled (the default), this extra segment between the seek point and position will be decoded and discarded. When doing stream copy or when -noaccurate_seek is used, it will be preserved.
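A small illustration of the two cases c_14 is quoting (input name and timestamps are placeholders):

    # re-encoding: the input seek is frame-accurate (accurate_seek is on by default)
    ffmpeg -ss 96.056 -i input.mp4 -frames:v 1 still.png
    # stream copy: output starts on the keyframe at or before the seek point
    ffmpeg -ss 96.056 -i input.mp4 -t 10 -c copy clip.mp4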
[01:39:55 CEST] <botik101> WHAT? I do not have to do this horrible lt/gte?!!!!! and can just specify ss?!!!
[01:40:31 CEST] <furq> i don't think you need to do the lt/gte anyway
[01:40:36 CEST] <furq> but -ss only works once
[01:41:02 CEST] <furq> so you'd need to decode the same part of the file multiple times
[01:41:03 CEST] <botik101> furq: so what if I am specifying hundreds of frames to extract?
[01:41:16 CEST] <furq> then you'd need to decode part of the file hundreds of times
[01:41:31 CEST] <botik101> furq - using -select is the only way right? and that would require lte and gte like this: ffmpeg -i /home/victor/Videos/FIFA/mundo-full.mp4 -filter:v "select='lt(prev_pts*TB\,96.056835)*gte(pts*TB\,96.056835) + 'lt(prev_pts*TB\,96.967261)*gte(pts*TB\,96.967261)'" -vsync vfr ./tmp/still_%06d.png
[01:41:52 CEST] <furq> i feel like there's a better way around timestamp imprecision
[01:43:15 CEST] <furq> i'd probably do something like eq(floor(n/1000),96.056)
[01:45:04 CEST] <furq> emphasis on "something like" because that won't work at all
[01:45:25 CEST] <furq> thank you all for being polite enough not to point that out until I noticed it
[01:48:59 CEST] <furq> ok i guess i'd do between(t, 96.967, 96.968)
[02:05:26 CEST] <botik101> furq - i was thinking about floor and then thought about between
[02:06:19 CEST] <botik101> furq: between presents a different problem because we are not guaranteed that the frame will be there unless we make between wide enough. At the same time, this makes it impossible for us to pull out frames that are very close to each other
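A sketch of furq's between() idea for the same two timestamps (the windows are roughly one frame wide at ~30 fps, so normally a single frame lands in each; widen or narrow them as needed):

    ffmpeg -i /home/victor/Videos/FIFA/mundo-full.mp4 -filter:v "select='between(t\,96.056\,96.090)+between(t\,96.967\,97.001)'" -vsync vfr ./tmp/still_%06d.png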
[04:53:14 CEST] <Wallboy> Is there a way to use the segment muxer to segment a video into exactly N segments? Each segment doesn't have to be the same duration, I just need the constraint of exactly N segments
[04:55:16 CEST] <Wallboy> for example if I have a 40 second video and i want to split it into 4 segments. Ideally it would be 4 x 10 second segments, but if there like (12, 10, 8, 10) that's fine
[08:29:40 CEST] <botik101> what is the maximum length of an ffmpeg command string allowed in Ubuntu?
[08:36:56 CEST] <botik101> I can't figure out how to prepare a -selectcomplexfile filter .... my call to ffmpeg is too long for my OS and I need to put it into a txt file, but the format inside the file is very confusing and I need a PhD in ffmpeg to do it.
[08:37:38 CEST] <botik101> can someone please explain? all I want to do is pull out 100 frames from a video and I specify it with times and lt and gt combinations
[09:10:28 CEST] <barhom> Can I find out what the GOP is for a certain input with ffprobe?
[09:57:59 CEST] <GuiToris> hey, does ffmpeg -i video.mkv images%d.png  save all the frames?
[09:58:52 CEST] <durandal_1707> it should
[09:59:31 CEST] <GuiToris> durandal_1707, I have to manually edit some frames, how can I convert them back to a video file?
[10:09:08 CEST] <kerio> ffmpeg -i images%d.png video.mkv
[10:11:55 CEST] <GuiToris> kerio, why is the video slower?
[10:12:36 CEST] <kerio> add -framerate 60 or whatever before -i
[10:12:49 CEST] <kerio> the images2 input format might default to 25
[10:13:34 CEST] <GuiToris> 29.970
[10:13:41 CEST] <GuiToris> -framerate 29.970?
[10:15:36 CEST] <GuiToris> it seems okay now
[10:15:39 CEST] <GuiToris> thank you kerio
[10:16:05 CEST] <kerio> GuiToris: nouuu
[10:16:17 CEST] <kerio> -framerate 30000:1001
[10:17:13 CEST] <GuiToris> kerio, does 30000:1001 mean 29.970?
[10:17:33 CEST] <kerio> not quite
[10:18:36 CEST] <kerio> the NTSC framerate is not 29.97, it's 30000/1001
[10:19:11 CEST] <kerio> ie 29.970029(970029)
[10:19:41 CEST] <kerio> because whoever added chroma to NTSC over-the-air transmissions was taking the fucking piss
[10:19:44 CEST] <GuiToris> oh, it's a division
[10:20:08 CEST] <GuiToris> yes, it's such a lunatic framerate
[10:20:12 CEST] <GuiToris> thank you for your help kerio
[10:20:26 CEST] <kerio> anyway a framerate of 2997:100 will result in a movie file that's not dvd-compliant, for instance
[10:21:43 CEST] <GuiToris> I assume the audio wouldn't have been okay with -framerate 29.970
[10:22:16 CEST] <GuiToris> 30000:1001 looks perfect
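So the reassembly command from earlier becomes, with the exact NTSC rational given before -i:

    ffmpeg -framerate 30000:1001 -i images%d.png video.mkv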
[10:23:29 CEST] <th3_v0ice> I have been streaming to Twitch with the FFmpeg API and the whole operation works for the first 6 - 8 hours, after that the video starts to freeze and then it completely disappears. Audio is fine and continues as if nothing has happened. I am sure that all of the packets are sent to the muxer. What could be the problem?
[10:30:02 CEST] <kerio> GuiToris: is this some heavy editing of all frames?
[10:30:11 CEST] <kerio> and/or is the source file uncompressed?
[10:32:00 CEST] <GuiToris> the source file must be compressed, I created it with my bridge camera
[10:32:09 CEST] <GuiToris> it can't even shoot raw
[10:32:53 CEST] <GuiToris> I'd like to combine two different videos, but the angles are different, so I have to align them first
[10:33:29 CEST] <GuiToris> I guess there's no way to align two videos, so I'm about to do it one by one
[10:33:40 CEST] <GuiToris> there are about 400 images
[10:33:52 CEST] <GuiToris> it can't take too long ... :/
[10:34:04 CEST] <Cracki> wat
[10:34:07 CEST] <Cracki> define "align"
[10:34:46 CEST] <GuiToris> we took 2 shots in the very same place, but the angle is a little bit different
[10:34:54 CEST] <GuiToris> I can't just cut them
[10:35:02 CEST] <GuiToris> it would be unnatural
[10:35:10 CEST] <Cracki> http://www.robots.ox.ac.uk/~vgg/hzbook/
[10:35:14 CEST] <Cracki> browse that
[10:35:20 CEST] <Cracki> you'll understand the scope of your problem
[10:36:04 CEST] <Cracki> define "very same place", define "angle different"
[10:36:39 CEST] <Cracki> and *that* book is for research
[10:37:03 CEST] <Cracki> if you need to do much fuckery with 3d composition, consider buying commercial video editing/composing software (aftereffects and such)
[10:37:57 CEST] <GuiToris> I wouldn't like to spend much money on editing a single clip. I don't do this regularly
[10:38:18 CEST] <durandal_1707> cant you use perspective filter?
[10:38:21 CEST] <Cracki> show me pics, show me videos
[10:38:41 CEST] <botik101> I am using the -ss flag wrong - it is not making ffmpeg stop ... I have a 5-hour-long video and I am pulling out a few frames. The frames are at the very beginning for this test. I want ffmpeg to stop scanning the rest of the file past 200 seconds... I am saying this: ffmpeg -i -ss 100 /home/victor/Videos/FIFA/mundo-full.mp4 -filter:v "select='lt(prev_pts*TB\,3.534736)*gte(pts*TB\,3.534736) + 'lt(prev_pts*TB\,4.561807)*gte(pts*TB\,4.561807) + 
[10:38:52 CEST] <Cracki> botik101, -ss is START, not stop
[10:38:55 CEST] <Cracki> use -t
[10:39:29 CEST] <Cracki> -t is length
[10:39:36 CEST] <botik101> Cracki: so I would put -t before the -i to tell it to stop scanning after 100 seconds? -t 100 -i ?
[10:40:04 CEST] <GuiToris> durandal_1707, Cracki, clip 1 : https://ptpb.pw/9mIZ  clip 2 : https://ptpb.pw/sesU
[10:40:05 CEST] <Cracki> if you put them before -i, they apply to undecoded video. you won't know the difference.
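So for botik101's case above, the fixed ordering would look something like this (the select expression is trimmed to the first two timestamps from the pasted command and its stray quotes are dropped; 200 s is the window mentioned):

    ffmpeg -t 200 -i /home/victor/Videos/FIFA/mundo-full.mp4 -filter:v "select='lt(prev_pts*TB\,3.534736)*gte(pts*TB\,3.534736) + lt(prev_pts*TB\,4.561807)*gte(pts*TB\,4.561807)'" -vsync vfr ./tmp/still_%06d.png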
[10:40:37 CEST] <Cracki> GuiToris, when you say clip, you mean video, right?
[10:40:42 CEST] <Cracki> are the cameras static at least?
[10:40:49 CEST] <botik101> Cracki - you are a genius!
[10:40:57 CEST] <Cracki> and yes, try the perspective filter
[10:41:05 CEST] <Cracki> botik101, no, I read the documentation
[10:41:11 CEST] <GuiToris> I'll take a look, I've never heard of it
[10:42:19 CEST] <Cracki> this is what a perspective filter can do https://szakrewsky.files.wordpress.com/2015/08/homography-example.png
[10:42:25 CEST] <Cracki> i.e. turn the camera virtually
[10:42:30 CEST] <Cracki> (and a bunch more)
[10:42:49 CEST] <Cracki> BUT if the optical center (i.e. position) moves, you can't compensate that
[10:43:42 CEST] <Cracki> it can tilt planes (image plane), which includes panning the camera
[10:44:05 CEST] <GuiToris> it sounds great
[10:44:26 CEST] <Cracki> https://docs.opencv.org/3.4.1/d9/dab/tutorial_homography.html
[10:44:47 CEST] <Cracki> https://ffmpeg.org/ffmpeg-filters.html#perspective
[10:44:58 CEST] <Cracki> your luck: you just have to give ffmpeg four corner pairs
[10:45:03 CEST] <Cracki> no math for you
[10:45:45 CEST] <GuiToris> what luck, ffmpeg is quite smart
[10:47:50 CEST] <botik101> what's the command that makes 3-4=0 but 6-4=2? in other words, only positive numbers without an if
[10:48:03 CEST] <botik101> I thought there was a ubiquitous command for it
[10:48:07 CEST] <botik101> i mean operator
[10:48:14 CEST] <botik101> wrong channel
[10:50:01 CEST] <botik101> what happens if -ss is a negative number? why I am asking - I am automating a script and it may add -s -20
[10:50:06 CEST] <botik101> iss -20
[10:50:15 CEST] <botik101> arghh  -ss -20
[10:53:30 CEST] <botik101> Cracki: hold on. if I do -ss would this throw off my -select filter? it is now defined as this: select='lt(prev_pts*TB\,3.534736)*gte(pts*TB\,3.534736)  ...would the values be relative to the start of -ss or to 0?
[10:54:09 CEST] <Cracki> don't ask me, I don't know
[10:57:28 CEST] <Cracki> GuiToris, those would be good coordinates: https://gist.github.com/crackwitz/3a07cdd457a3a88089e6f9a3ed010316
[11:02:04 CEST] <Cracki> ¯\_(ツ)_/¯
[11:19:43 CEST] <luc4> Hello! I'm doing some tests to determine the best compression algorithm to use with ffmpeg for my project. I find it a bit difficult to test, but it seems like creating an mjpeg stream is heavier than creating an h264 stream. I also tried to measure the compression of a single jpeg even with the GPU but I still got around 30 ms. Would you say that with ffmpeg it costs a lot of CPU to create an mjpeg or an h264 stream?
[11:24:13 CEST] <botik101> ok, there is some problem with the way I use -ss ... I do not think it is seeking to that starting point - ffmpeg -t 3609.401899 -ss 3499.02293 -i takes forever to get to that part of the video
[11:24:25 CEST] <botik101> I timed it with and without -ss and it is the same
[11:29:28 CEST] <Nacht> botik101: Try using -ss as an input param
[11:31:17 CEST] <botik101> Nacht: sorry... what do you mean? -ss is before -i as is....
[11:33:01 CEST] <Nacht> botik101: ss can be used as an input or output param. ss as input should be fastest
[11:34:45 CEST] <botik101> Nacht: I thought so. but I added -ss prior to -i with the time and it is the same...weird
[11:42:22 CEST] <botik101> Nacht: do you know if -select filter time is based on an offset from -ss ?
[11:44:25 CEST] <botik101> what is wrong with this ffmpeg statement? it claims it cannot find any frames in a file that is 2.5 hours long
[11:44:28 CEST] <botik101> https://pastebin.com/mbpSSPrp
[11:45:08 CEST] <botik101> the output is: Output file is empty, nothing was encoded (check -ss / -t / -frames parameters if used)
[11:48:19 CEST] <botik101> I can't figure out why it claims this.
[11:57:32 CEST] <botik101> ok, the video is 2.5 hours long and if I remove -ss and -to ffmpeg works as expected:
[11:57:34 CEST] <botik101> -i /home/victor/Videos/FIFA/mundo-full.mp4 -filter:v "select='lt(prev_pts*TB\,222.780169)*gte(pts*TB\,222.780169) + 'lt(prev_pts*TB\,224.734318)*gte(pts*TB\,224.734318) + 'lt(prev_pts*TB\,225.889945)*gte(pts*TB\,225.889945) + 'lt(prev_pts*TB\,231.318188)*gte(pts*TB\,231.318188) + 'lt(prev_pts*TB\,232.40512)*gte(pts*TB\,232.40512) + 'lt(prev_pts*TB\,233.568523)*gte(pts*TB\,233.568523) + 'lt(prev_pts*TB\,234.589855)*gte(pts*TB\,234.58
[11:57:56 CEST] <botik101> this works, but  ffmpeg -ss 212.780169 -to 333.011171 -i /home/victor/Videos/FIFA/mundo-full.mp4 -filter:v "select='lt(prev_pts*TB\,222.780169)*gte(pts*TB\,222.780169) + 'lt(prev_pts*TB\,224.734318)*gte(pts*TB\,224.734318) + 'lt(prev_pts*TB\,225.889945)*gte(pts*TB\,225.889945) + 'lt(prev_pts*TB\,231.318188)*gte(pts*TB\,231.318188) + 'lt(prev_pts*TB\,232.40512)*gte(pts*TB\,232.40512) + 'lt(prev_pts*TB\,233.568523)*gte(pts*TB\,233.568
[11:58:13 CEST] <botik101> does not. the only difference is -ss and -to
[11:59:06 CEST] <botik101> does -ss support the 220.3 format, which would mean it is in seconds?
[12:22:21 CEST] <botik101> ok, I confirmed the following - the -ss parameter in seconds PRIOR to -i does not work and causes ffmpeg never to find anything. -ss in hh:mm:ss format works, BUT, then all your time references inside the SELECT filter MUST BE offset! They are no longer absolute but relative to -ss....
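Illustrating that offset with the first timestamp from the earlier command (assuming default timestamp handling, i.e. no -copyts): with -ss as an input option the timestamps the select filter sees are shifted by the seek amount, so 222.780169 becomes 222.780169 - 212.780169 = 10.0.

    # absolute time, no input seek
    ffmpeg -i /home/victor/Videos/FIFA/mundo-full.mp4 -filter:v "select='lt(prev_pts*TB\,222.780169)*gte(pts*TB\,222.780169)'" -vsync vfr ./tmp/still_%06d.png
    # with input seeking, the same frame is now selected at ~10 s
    ffmpeg -ss 212.780169 -i /home/victor/Videos/FIFA/mundo-full.mp4 -filter:v "select='lt(prev_pts*TB\,10.0)*gte(pts*TB\,10.0)'" -vsync vfr ./tmp/still_%06d.png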
[12:54:15 CEST] <ksk> hola
[12:55:00 CEST] <ksk> if I take an audio file as input, ffmpeg says:
[12:55:06 CEST] <ksk>   Duration: 00:03:50.83, start: 0.007500, bitrate: 49 kb/s
[12:55:06 CEST] <ksk>     Stream #0:0(eng): Audio: opus, 48000 Hz, stereo, fltp
[12:55:40 CEST] <ksk> why are there like two bitrates given? I have seen great differences in between them, and am wondering why
[12:56:08 CEST] <ksk> like what is in there to increase the bitrate if not the single audio stream?
[12:56:33 CEST] <ksk> (Im talking about "bitrate: 49 kb/s" vs "Audio: opus, 48000 Hz")
[13:19:37 CEST] <c_14> ksk: 48000 Hz isn't the bitrate
[13:19:40 CEST] <c_14> It's the audio sampling rate
[13:19:58 CEST] <c_14> They are (mostly) unrelated to each other
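If it helps to see the two values side by side, ffprobe can print them per stream (a generic invocation, not taken from the log above):

    ffprobe -v error -select_streams a:0 -show_entries stream=sample_rate,bit_rate -of default=noprint_wrappers=1 input.opus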
[13:58:44 CEST] <pi--> I'm trying to embed ultrasonic content in the audio stream of a video.
[13:59:15 CEST] <pi--> I've split audio and video streams, performed the embedding.
[13:59:19 CEST] <pi--> https://www.dropbox.com/s/63v0nolpbwtd76o/Screenshot%202018-08-10%2012.57.20.png?dl=0
[13:59:33 CEST] <pi--> Top plot is the final WAV
[13:59:44 CEST] <pi--> But then I merge.
[13:59:52 CEST] <pi--> Bottom plot is the audio track from the merged video.
[14:00:05 CEST] <pi--> It seems to be performing HPF at 18kHz
[14:00:44 CEST] <furq> add -cutoff 22000
[14:00:52 CEST] <JEEB> many lossy audio formats' encoders do high cutting
[14:01:01 CEST] <furq> also you shouldn't need -strict experimental any more
[14:01:07 CEST] <pi--> Also it is introducing artifacts
[14:01:17 CEST] <furq> well yeah, it'll do that
[14:01:46 CEST] <pi--> Each horizontal bar is a 'tone', and I'm getting burring at the start of some of these that feeds its way all the way down into the audible spectrum
[14:01:59 CEST] <pi--> Is there any way I can minimise this?
[14:02:19 CEST] <furq> increase the bitrate
[14:02:49 CEST] <furq> or use fdk-aac
[14:03:02 CEST] <furq> which either means building ffmpeg with fdk or encoding the audio standalone and then merging that in
[14:03:04 CEST] <pi--> Thanks furq, you are a lifesaver
[14:03:28 CEST] <furq> idk if either of those will actually work
[14:03:35 CEST] <furq> but they're worth trying
[14:03:35 CEST] <pi--> What is fdk-aac?
[14:03:42 CEST] <furq> a different aac encoder
[14:03:53 CEST] <furq> it has a dumb license so you can't distribute ffmpeg binaries with it built in
[14:03:59 CEST] <pi--> Why do you think it may solve the problem?
[14:04:13 CEST] Action: pi-- toasts RMS
[14:05:02 CEST] <acresearch> for the 1,000,000,000th time apple changes the way it reads .mp4 videos!!!!! I have (again) 240 images that I am trying to fuse into a video, I am using the following command: ffmpeg -f image2 -i Movie/video%4d.png -r 30 -vcodec libx264 -b:v 480M -pix_fmt yuv444p -preset:v slower -level 4.2 video.mp4    the resulting video is perfect but cannot be viewed on iOS!!! please help?
[14:05:15 CEST] <furq> fdk is probably higher quality
[14:05:26 CEST] <furq> although nobody's really tested it against ffmpeg aac properly yet
[14:06:41 CEST] <furq> https://github.com/nu774/fdkaac
[14:06:51 CEST] <furq> there's a standalone encoder there if you just want to check the output
[14:07:01 CEST] <JEEB> uhh, mstorsjo's thing is the "upstream"
[14:07:09 CEST] <furq> that's just the library though
[14:07:12 CEST] <JEEB> no
[14:07:32 CEST] <JEEB> the fdk-aac repo has the cli apps etc
[14:07:59 CEST] <JEEB> https://github.com/mstorsjo/fdk-aac/commit/e45ae429b9ca8f234eb861338a75b2d89cde206a
[14:08:05 CEST] <JEEB> for example this :P it's the encoder
[14:09:10 CEST] <furq> i guess that's fine for this use case
[14:09:15 CEST] <furq> the nu774 one is a better frontend
[14:10:54 CEST] <pi--> furq: That is _amazing_. Just doubling the bit rate tidies up the tails.
[14:11:24 CEST] <furq> what ffmpeg version is that
[14:11:27 CEST] <pi--> And setting the cut-off to 22k works a treat.
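For reference, the remux step pi-- describes would look roughly like this (file names and the 256k figure are placeholders; -cutoff 22000 and the raised bitrate are the two changes discussed above):

    ffmpeg -i video.mp4 -i embedded_audio.wav -map 0:v -map 1:a -c:v copy -c:a aac -b:a 256k -cutoff 22000 merged.mp4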
[14:11:48 CEST] <pi--> 4.0.1
[14:11:54 CEST] <furq> that's fine then
[14:12:23 CEST] <pi--> Why do you ask? My deployment server is a Linux box, I might have to check the version number there...
[14:12:34 CEST] <furq> you had -strict experimental in the command
[14:12:53 CEST] <furq> which you still needed in <3.0 before the aac encoder was any good
[14:13:08 CEST] <pi--> I originally outsourced this task (I'm now recoding it) .. I'm not sure why my dev did that.
[14:13:18 CEST] <furq> probably pasted from stackoverflow
[14:13:34 CEST] <pi--> yupperRR
[14:13:36 CEST] <furq> but yeah if you're using ffmpeg aac on multiple boxes then make sure they're all at least 3.0
[14:13:39 CEST] <JEEB> at least he used strict experimental, instead of -2
[14:13:42 CEST] <JEEB> :V
[14:14:59 CEST] <pi--> I really need to find an expert willing to look over my ffmpeg calls :s
[14:15:09 CEST] <pi--> I will wager there is dodgy cut&paste going on.
[14:16:47 CEST] <pi--> https://www.dropbox.com/s/99h92wsaxnghc3j/Screenshot%202018-08-10%2013.16.20.png?dl=0 that's the difference from doubling my bitrate
[14:18:47 CEST] <ksk> c_14: mhhmkay, thanks.
[14:24:01 CEST] <pi--> https://paste.pound-python.org/show/ew13nMA2AgIdjPq4pXeK/ <-- does this code look sane?  The task is to add black to the start of a video (no-audio)
[15:02:44 CEST] <acresearch> for the 1,000,000,000th time apple changes the way it reads .mp4 videos!!!!! I have (again) 240 images that I am trying to fuse into a video, I am using the following command: ffmpeg -f image2 -i Movie/video%4d.png -r 30 -vcodec libx264 -b:v 480M -pix_fmt yuv444p -preset:v slower -level 4.2 video.mp4    the resulting video is perfect but cannot be viewed on iOS!!! please help?
[15:04:14 CEST] <furq> did iOS ever support 4:4:4 h264
[15:04:25 CEST] <Mavrik>  No.~
[15:04:32 CEST] <Mavrik> Also 480Mbit bitrate? :D
[15:04:51 CEST] <furq> also hwdec stuff generally doesn't like anything above level 4.1
[15:05:02 CEST] <furq> although i can't speak for apple devices in particular
[15:05:16 CEST] <acresearch> so what do you recommend I change?
[15:06:00 CEST] <furq> -pix_fmt yuv420p for starters
[15:06:32 CEST] <furq> also you probably want -framerate 30 before -i
[15:10:55 CEST] <Mavrik> Also lower video bitrate for about 10x :D
[15:19:51 CEST] <acresearch> furq: ok i will try these
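Putting those suggestions together, a sketch of the adjusted command (48M is simply ~10x lower as Mavrik suggests, and level 4.1 follows furq's note about hardware decoders; tune both to taste):

    ffmpeg -f image2 -framerate 30 -i Movie/video%4d.png -c:v libx264 -pix_fmt yuv420p -b:v 48M -preset:v slower -level 4.1 video.mp4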
[15:34:41 CEST] <Hello71> what's 1080p30 at 12bpp in raw video anyways
[15:35:59 CEST] <Hello71> apparently 746 Mb/s
[15:36:28 CEST] <Hello71> I dunno how good huffyuv is but I figure it does better than halving the bitrate
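For the record, the arithmetic behind that number: 1920 x 1080 pixels x 12 bits/pixel (yuv420p) x 30 frames/s = 746,496,000 bit/s, i.e. about 746 Mb/s of raw video, so a 480 Mb/s encode is within a factor of two of uncompressed 4:2:0.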
[15:49:07 CEST] <acresearch> furq: I finished the command but I got a GStreamer backend error (not sure what that means)
[15:49:16 CEST] <acresearch> furq: decoding error
[15:56:17 CEST] <acresearch> I am not sure what is wrong with the codec, I used this same command before and the videos worked on Apple devices, now they don't
[16:25:33 CEST] <Cracki> acresearch, have you considered NOT using yuv444p? some decoders don't like that.
[16:25:50 CEST] <Cracki> wew 480M
[16:26:27 CEST] <Cracki> huffyuv really needs P-frames...
[16:26:48 CEST] <Cracki> GuiToris, those would be good coordinates: https://gist.github.com/crackwitz/3a07cdd457a3a88089e6f9a3ed010316
[16:28:47 CEST] <acresearch> Cracki: I tried yuv420p but I got that error, I did not get an error with 444p
[16:31:36 CEST] <Cracki> how about you work on that error
[16:31:59 CEST] <Cracki> or... you share it first, this mysterious error
[16:35:01 CEST] <GuiToris> Cracki, let me show you what I'd like to do, here's a frame from video#1 https://ptpb.pw/9mIZ  and from video#2 https://ptpb.pw/sesU  as you see they aren't quite similar so I'd like to rotate the second one like this: https://ptpb.pw/19Oy
[16:35:22 CEST] <acresearch> Cracki: the only thing i got was decoder error, what else can i provide? where can i find more info?
[16:35:44 CEST] <Cracki> acresearch, why do you keep that exact error message a secret?
[16:35:51 CEST] <Cracki> do you not value what ffmpeg tells you?
[16:36:37 CEST] <Cracki> GuiToris, that third picture is absolute rubbish. try my coordinates, they're computed from keypoint matching
[16:37:19 CEST] <Cracki> acresearch, *what* threw a decoder error?
[16:37:32 CEST] <Cracki> and why do you make such crazy videos? 480 Mbit/s and whatnot?
[16:38:00 CEST] <Cracki> several people have given you advice on better encoding parameters
[16:40:31 CEST] <GuiToris> Cracki, are these x,y coordinates?
[16:41:08 CEST] <Cracki> yes
[16:46:03 CEST] <GuiToris> Cracki, are you sure ffmpeg can handle these coordinates?
[16:46:08 CEST] <Cracki> no idea
[16:46:13 CEST] <Cracki> don't see why not
[16:46:19 CEST] <Cracki> it computes a homography from them
[16:48:46 CEST] <Cracki> GuiToris, this is done with opencv and I expect ffmpeg will do exactly that too https://imgur.com/a/nLDvgLb
[16:48:58 CEST] <Cracki> click on the picture and use arrow keys to see the effect
[16:49:08 CEST] <Cracki> oh it doesn't support arrow keys
[16:49:52 CEST] <Cracki> it appears the *position* of the camera changed too!
[16:51:24 CEST] <GuiToris> Cracki, shoot, that's really good indeed!
[16:52:01 CEST] <GuiToris> how long does it take to compute and rotate an image?
[16:52:16 CEST] <Cracki> it's just a matrix multiplication per pixel
[16:52:25 CEST] <Cracki> no time essentially
[16:52:35 CEST] <Cracki> told you, that's the perspective filter of ffmpeg
[16:52:36 CEST] <Cracki> use it
[16:52:51 CEST] <Cracki> still, your videos can not be aligned properly because they aren't from the same position
[16:53:09 CEST] <GuiToris> there are days between them
[16:53:20 CEST] <GuiToris> but I can use masking later
[16:53:26 CEST] <Cracki> and hours, judging by the shadows
[16:54:19 CEST] <GuiToris> did you rotate the image with opencv?
[16:54:25 CEST] <Cracki> no comment
[16:54:37 CEST] <GuiToris> this is done with opencv
[16:54:50 CEST] <Cracki> unless you have background in computer vision, that library is ABSOLUTELY irrelevant to you
[17:00:40 CEST] <acresearch> Cracki: sorry i was checking something in the lab
[17:01:05 CEST] <acresearch> Cracki: i don't know, i seeked help in this channel 2 months ago and the command i pasted worked, but now it does not
[17:01:36 CEST] <acresearch> Cracki: i am not good in ffmpeg, it is too complicated for me, i am just trying to use it to generate simple 10 second videos of molecules for presentations
[17:02:00 CEST] <Cracki> no need to keep highlighting thx
[17:03:10 CEST] <GuiToris> Cracki, I'm sure I messed up something: ffmpeg -i sesU.png -vf perspective=149.37562561:76.51576996:58.29593277:1141.00280762:2017.33569336:1285.13061523:2023.41247559:173.2217865 great.png
[17:03:17 CEST] <GuiToris> but what is that?
[17:03:37 CEST] <GuiToris> I got a really odd image
[17:03:49 CEST] <Cracki> show me
[17:03:55 CEST] <Cracki> perhaps you ordered the points wrong
[17:04:18 CEST] <GuiToris> https://ptpb.pw/maMX
[17:04:50 CEST] <GuiToris> 'Set coordinates expression for top left, top right, bottom left and bottom right corners'
[17:05:15 CEST] <Cracki> ordered wrong
[17:05:23 CEST] <Cracki> my coords are not ordered like that
[17:05:33 CEST] <Cracki> figure where they go, then order them
[17:07:15 CEST] <Cracki> mine are left top, left bottom, right bottom, right top
[17:10:22 CEST] <Cracki> you can round the numbers to one decimal point, it's not that accurate
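Reordered into the top-left, top-right, bottom-left, bottom-right sequence the filter documentation asks for, GuiToris's command would become something like this (same coordinates as above, rounded as Cracki suggests):

    ffmpeg -i sesU.png -vf "perspective=149.4:76.5:2023.4:173.2:58.3:1141.0:2017.3:1285.1" great.png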
[17:10:49 CEST] <GuiToris> this one is much better
[17:10:52 CEST] <GuiToris> it works
[17:11:09 CEST] <GuiToris> now I need that program which can calculate these
[17:11:24 CEST] <GuiToris> that's just what I need
[17:14:10 CEST] <GuiToris> Cracki, where did you get that python file? I can't find it here: https://docs.opencv.org/3.4.1/d9/dab/tutorial_homography.html
[17:14:33 CEST] <Cracki> you saw my gist
[17:14:37 CEST] <Cracki> first line is a link
[17:14:50 CEST] <Cracki> I don't use C++ if I can avoid it
[17:16:00 CEST] <GuiToris> https://gist.github.com/crackwitz/3a07cdd457a3a88089e6f9a3ed010316
[17:16:02 CEST] <GuiToris> this gist?
[17:16:14 CEST] <GuiToris> https://docs.opencv.org/3.4/d1/de0/tutorial_py_feature_homography.html
[17:16:15 CEST] <Cracki> that gist
[17:17:42 CEST] <GuiToris> I still can't find 'download simple_homography.py now' anywhere
[17:17:58 CEST] <Cracki> lol
[17:18:24 CEST] <Cracki> you think that would be enough?
[17:18:34 CEST] <GuiToris> I hope so
[17:18:37 CEST] <Cracki> nope.
[17:18:40 CEST] <GuiToris> ahhh
[17:18:45 CEST] <Cracki> have you ever programmed in python?
[17:18:55 CEST] <GuiToris> nay
[17:18:58 CEST] <Cracki> you'd be in luck if opencv just installs on your system
[17:19:36 CEST] <Cracki> but if that's the case, your system is probably some open source thingy where they don't distribute patented algorithms, and then that example code will only run modified
[17:19:53 CEST] <Cracki> use photoshop. it has perspective stuff too
[17:20:01 CEST] <Cracki> and might even tell you the corner points
[17:22:21 CEST] <GuiToris> ahh :( I wish we'd used a tripod
[17:23:21 CEST] <Cracki> either that or a time-of-flight laser range scanner
[17:23:35 CEST] <Cracki> then you have a 3d picture
[17:26:12 CEST] <GuiToris> thank you for you help Cracki, I'm leaving now and I'm thinking how I can compute the rest
[17:27:11 CEST] <Cracki> have fun, come back again :P
[17:27:26 CEST] <GuiToris> see you later :)
[17:39:12 CEST] <luc4> Hello! I would like to use libavcodec to encode a sequence of images to h264 using vaapi (I see there is an encoder to do this in ffmpeg). I'm referring to this example to learn how to encode: https://ffmpeg.org/doxygen/trunk/encoding-example_8c-source.html. But is this still valid when I want to encode using vaapi?
[18:14:52 CEST] <jkqxz> luc4:  Mostly valid, but there is also doc/examples/vaapi_encode.c which may be better :)
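Before wiring this up through the API, a quick CLI sanity check of the VAAPI encoder can help; something along these lines (the device path assumes a typical Intel render node, adjust for your system):

    ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mp4 -vf 'format=nv12,hwupload' -c:v h264_vaapi output.mp4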
[18:29:01 CEST] <acresearch> people, I am using the following command: ffmpeg -f image2 -framerate 1 -i video/%1d.png -r 30 -vcodec libx264 -b:v 480M -preset:v slower -level 4.2 video.mp4    to fuse 20 images into a 20 second video; my images are 1000x632 but my final video is low quality. can anyone help me optimise my command?
[18:33:35 CEST] <Cracki> are you trolling?
[18:34:09 CEST] <acresearch> Cracki: why do you think i am trolling? i am asking a genuine question
[18:34:15 CEST] <Cracki> if you are actually serious, show a screenshot of the "low quality"
[18:34:34 CEST] <acresearch> Cracki: ok 1 moment
[18:34:46 CEST] <Cracki> you must be trolling if you actually encode an 1fps video that's not even HD, with FOUR HUNDRED AND EIGHTY MEGABITS PER SECOND
[18:35:06 CEST] <Cracki> that's more than uncompressed bitmaps would need
[18:36:28 CEST] <acresearch> Cracki: original image: https://pasteboard.co/Hyy7xkA.png
[18:36:34 CEST] <acresearch> Cracki: resulting video: https://pasteboard.co/Hyy7HSx.png
[18:36:53 CEST] <Cracki> so?
[18:36:56 CEST] <Cracki> what's the matter?
[18:37:20 CEST] <Cracki> except that it's not the same view
[18:37:35 CEST] <acresearch> the video is lower quality (a bit pixelated), it will be very prominent on a large presentation screen, I want it to be the original image quality
[18:38:06 CEST] <furq> you probably want to read up on chroma subsampling
[18:38:18 CEST] <Cracki> both "screenshots" resize the picture/video
[18:38:30 CEST] <Cracki> have you considered setting the video player to 1:1 scaling?
[18:38:37 CEST] <furq> yeah it isn't that
[18:38:49 CEST] <furq> 4:2:0 on images like that is always going to look bad
[18:38:51 CEST] <Cracki> I see pixelation mostly on the edges to transparent
[18:39:19 CEST] <acresearch> furq: so then what can i change?
[18:39:29 CEST] <furq> you already changed it from 4:4:4 to 4:2:0
[18:39:40 CEST] <furq> if you want it to playback on iOS then you're stuck on that front
[18:39:50 CEST] <acresearch> furq: I don't care if it takes a long time to render or if it is a large video file, I'm mostly interested in the final quality
[18:39:57 CEST] <Cracki> if it's really gonna be a mobile device...
[18:40:04 CEST] <furq> you could try scaling it to 2x resolution
[18:40:06 CEST] <Cracki> I think i remember quicktime making screen recordings in 444
[18:40:29 CEST] <acresearch> Cracki: oh i removed that 444 thing
[18:40:32 CEST] <Cracki> good
[18:40:47 CEST] <furq> like i said, you should probably read up on chroma subsampling
[18:40:59 CEST] <furq> https://en.wikipedia.org/wiki/Chroma_subsampling
[18:41:43 CEST] <acresearch> furq: probably after I read the article I will end up with the answer that I can't do it, right? :-(
[18:42:08 CEST] <furq> in summary if you want sharp coloured lines to look good then 4:2:0 isn't going to work
[18:42:15 CEST] <furq> but iOS will only play 4:2:0 h264
[18:42:30 CEST] <furq> like i said you can try scaling to 2x resolution but idk how effective that'll be
[18:42:40 CEST] <acresearch> furq: ok, let's forget about iOS for now and let me try and get a good video
[18:42:48 CEST] <Cracki> export the original images in higher resolution
[18:42:51 CEST] <furq> -vf scale=iw*2:ih*2:flags=lanczos
[18:43:01 CEST] <Cracki> or that
[18:43:08 CEST] <acresearch> Cracki: i can't, it took 2 days to generate these 20 images
[18:43:20 CEST] <acresearch> furq: ok let me try
[18:43:40 CEST] <Cracki> I don't believe that. the geometry in the pictures, maybe, but rendering it can't take that long
[18:44:03 CEST] <furq> if you don't care about iOS compat then just add -pix_fmt yuv444p again
[18:44:12 CEST] <acresearch> Cracki: no not rendering, the setup of the image, it is very slow and lots of manual work
[18:44:13 CEST] <furq> instead of scaling
[18:44:22 CEST] <acresearch> furq: ok let me try
[18:45:14 CEST] <acresearch> furq: 444 got me the same result,   but   -vf scale=iw*2:ih*2:flags=lanczos    made it much better
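For completeness, the scaled variant acresearch tried would look roughly like this (everything else as in the original command, minus the 480M bitrate, which the earlier advice was to drop by about 10x):

    ffmpeg -f image2 -framerate 1 -i video/%1d.png -vf "scale=iw*2:ih*2:flags=lanczos" -r 30 -c:v libx264 -pix_fmt yuv420p -preset:v slower -level 4.1 video.mp4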
[18:45:32 CEST] <acresearch> furq: maybe I can try x3?
[18:45:35 CEST] <furq> sure
[18:45:53 CEST] <furq> i mean you might start struggling to play it back at that point
[18:46:14 CEST] <acresearch> furq: i see
[18:48:31 CEST] <acresearch> yeh it looks good but it won't play on ios
[18:49:26 CEST] <acresearch> I'll just give up and tell my students to go and read about that algorithm themselves. too much time on making a 20 second video
[18:49:32 CEST] <acresearch> thanks guys, I appreciate your help
[22:26:08 CEST] <localhorse> hey
[22:26:15 CEST] <localhorse> how can i take the video (not audio!) from a video file and combine it with a given audio file (replace the video's audio track with the given audio file) at a given time offset (to sync them)?
[22:37:35 CEST] <Cracki> multiple inputs, use -map
[22:37:40 CEST] <Cracki> oh, offset too?
[22:38:26 CEST] <Cracki> if you want the easy way, kdenlive/adobe premiere
[22:39:00 CEST] <Cracki> google says https://superuser.com/questions/982342/in-ffmpeg-how-to-delay-only-the-audio-of-a-mp4-video-without-converting-the-au
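One common way to do this with -map and an input offset, as a sketch (file names and the 1.5 s offset are placeholders; a positive -itsoffset before the second -i delays that input's audio relative to the video):

    ffmpeg -i video.mp4 -itsoffset 1.5 -i audio.wav -map 0:v -map 1:a -c:v copy -c:a aac -shortest out.mp4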
[23:03:59 CEST] <johnnny22> is there a way to improve this conversion somehow, maybe through hardware: [scaler_out_0_0 @ 0x1ea2a80] w:1920 h:1080 fmt:bgr0 sar:0/1 -> w:1920 h:1080 fmt:uyvy422 sar:0/1 flags:0x4
[23:11:56 CEST] <Cracki> improve how
[23:12:13 CEST] <Cracki> what's good and bad
[23:15:41 CEST] <johnnny22> *did I miss an answer to my question while my internet connection was down? :P
[23:30:21 CEST] <kepstin> johnnny22: you missed Cracki asking what you think could be improved
[23:31:33 CEST] <kepstin> I mean, if you're converting from RGB to YUV 4:2:2, there's not really a wide selection of possible ways to do it...
[23:36:14 CEST] <Cracki> simd instructions, using the gpu, ... that perhaps
[23:37:08 CEST] <Cracki> both are packed, so no scattering
[23:59:01 CEST] <iive> johnnny22, there is a cost for transferring data to/from the GPU. since this transformation is quite simple, it may actually become slower.
[23:59:28 CEST] <iive> my question is, why is this conversion done at all? most final encodes are in yuv420.
[00:00:00 CEST] --- Sat Aug 11 2018

