[Ffmpeg-devel-irc] ffmpeg.log.20191028

burek burek at teamnet.rs
Tue Oct 29 03:05:01 EET 2019


[01:00:56 CET] <oblio> howdy
[01:01:36 CET] <oblio> is there a filter for adapting videos btw aspect ratios? e.g. i have a 9:16 i want fit within a 4:5 ratio with a blurred background
[01:03:38 CET] <klaxa> https://stackoverflow.com/questions/30789367/ffmpeg-how-to-convert-vertical-video-with-black-sides-to-video-169-with-blur
[01:16:02 CET] <oblio> klaxa: nice, thanks
[01:19:39 CET] <oblio> klaxa: do you have a suggestion if i dont want the video center cropped?
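
A minimal sketch of the blurred-background fit, assuming an 864x1080 (4:5) target and placeholder file names: the input is split, one copy is enlarged, cropped and blurred to fill the frame, the other is fitted inside it untouched, and the two are overlaid.

    ffmpeg -i in.mp4 -filter_complex \
      "[0:v]split[fg][bg]; \
       [bg]scale=864:1080:force_original_aspect_ratio=increase,crop=864:1080,boxblur=20[blurred]; \
       [fg]scale=864:1080:force_original_aspect_ratio=decrease[fitted]; \
       [blurred][fitted]overlay=(W-w)/2:(H-h)/2" \
      -c:v libx264 out.mp4

This also covers the follow-up question: only the background copy is cropped, the foreground video is scaled to fit and centered without losing anything.
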
[01:42:32 CET] <Cyberworm> what do you people think about x264 vs GPU encoding?
[01:42:57 CET] <pink_mist> x264 any day of the week
[01:45:11 CET] <Cyberworm> pink_mist why? you had bad luck with GPU encoding?
[01:49:23 CET] <JEEB> it really depends on your use case
[01:49:57 CET] <JEEB> x264 can be pretty fast with alright quality, or very good at slower speeds (although to be honest with modern CPUs even the slower presets are quite fast)
[01:50:20 CET] <JEEB> on the other hand GPU encoding is usually locked to a single quality level, and generally is meant for realtime encoding
[01:50:53 CET] <JEEB> and usually optimized for low latency situations (or just no intent was made to put more buffering and capabilities into the hardware ASIC)
[01:51:20 CET] <pink_mist> yeah, basically unless you need realtime encoding, I wouldn't even consider GPU
[01:51:42 CET] <JEEB> even with realtime on a scale it often ends up being cheaper to get a beefier CPU :P
[01:51:52 CET] <DHE> I hear the rtx2000 series cards have a good nvenc chip, but at the end of the day GPU encoding is for when you need realtime or faster on 1080p at 30fps or higher
[01:52:19 CET] <JEEB> since server GPUs are stupid expensive and your server vendor will kill you for putting a normal thing there
[01:52:25 CET] <JEEB> (like, losing support etc)
[01:53:57 CET] <Cyberworm> <JEEB> on the other hand GPU encoding is usually locked to a single quality level : what do you mean? i can choose higher or lower quality
[01:55:12 CET] <JEEB> if you have the hardware you can start making your own tests if you really want, but if you just want to offload it onto the GPU no matter what, then just use the GPU :P
[01:56:55 CET] <JEEB> DHE: yea the ASICs have been getting better but I'm still not sure how scalable those are. like, if you want 5 or more outputs per input, and then figuring out how many inputs you can stick onto a box :P of course if you really don't care about handling "actual servers" with support etc it might get much, much simpler.
[01:58:50 CET] <DHE> JEEB: unless you have the server grade GPUs, nvidia's drivers will limit you to 2 streams regardless of how many cards you have.
[01:59:00 CET] <JEEB> yea
[01:59:00 CET] <DHE> assholes
[01:59:24 CET] <JEEB> of course you can patch the drivers apparently, but that goes into the whole "is this thing legit" part of business
[01:59:40 CET] <DHE> that said, it is my understanding that said server grade GPUs scale horizontally nigh-perfectly. if it can do 320fps at 1080p, it can probably do 10x 30fps 1080p streams
[02:00:06 CET] <JEEB> right
[02:01:53 CET] <JEEB> also x264 could be optimized to share stuff between instances, like I think x265 does
[02:03:29 CET] <Cyberworm> i am encoding my computer screen  @  1280x720  using 5 fps
[02:03:48 CET] <Cyberworm> and i want smallest file possible
[02:04:04 CET] <JEEB> if you want best compression then just use x264 :P
[02:04:13 CET] <JEEB> although to be honest I would do capture lossless
[02:04:32 CET] <JEEB> unless you need that thing to be pushable to a service right away
[02:04:47 CET] <Reinhilde> vp9 if your system can run it
[02:04:50 CET] <Reinhilde> mine can't
[02:05:00 CET] <Cyberworm> capture lossless and compress to smaller file later?
[02:05:00 CET] <JEEB> too bad all OSS vp9 encoders still suck
[02:05:03 CET] <JEEB> Cyberworm: yea
[02:05:13 CET] <JEEB> that way you don't need to care about encoding speed during the capture
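
A sketch of that two-step workflow on an X11 desktop (display name, sizes and file names are placeholders; -qp 0 makes x264 lossless):

    ffmpeg -f x11grab -framerate 5 -video_size 1280x720 -i :0.0 \
      -c:v libx264 -preset ultrafast -qp 0 capture.mkv
    ffmpeg -i capture.mkv -c:v libx264 -preset veryslow -crf 23 final.mp4

Only the first command has to keep up in real time; the second can run as slowly as it likes.
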
[02:05:15 CET] <Cyberworm> how do i capture lossless using obs-studio
[02:05:31 CET] <JEEB> this is #ffmpeg, I'm sorry :P
[02:05:49 CET] <Cyberworm> reinhilde what is better quality/size?  vp9 or x265
[02:06:02 CET] <Cyberworm> jeeb sorry you are right
[02:06:05 CET] <Reinhilde> try both
[02:06:06 CET] <JEEB> VP9 is a format, x264 is an encoder
[02:06:09 CET] <JEEB> *x265
[02:06:20 CET] <Reinhilde> JEEB: libvpx-vp9 is an encoder
[02:06:22 CET] <Cyberworm> reinhilde what is better quality/size?  libvpx-vp9 or x265
[02:06:28 CET] <JEEB> Reinhilde: yes but that is libvpx
[02:06:33 CET] <Reinhilde> and it doesn't work on my vps
[02:06:35 CET] <JEEB> not just "VP9"
[02:06:51 CET] <Reinhilde> Cyberworm: try both and see where you go
[02:06:56 CET] <Cyberworm> does ffmpeg use same vp9 encoder that youtube uses?
[02:07:08 CET] <JEEB> Cyberworm: libvpx has awful psychovisual optimizations. x265 is not much better, but I'd probably expect it to do better
[02:07:15 CET] <JEEB> (also libvpx rate control is a joke)
[02:07:27 CET] <Cyberworm> what is "rate control"?
[02:07:41 CET] <JEEB> the algorithm that decides how many bits to use and where
[02:08:09 CET] <JEEB> and GOOG uses libvpx the last I knew, and that's a VP9 encoder available in FFmpeg at the moment, so yes.
[02:08:09 CET] <Cyberworm> then what encoder has good "rate control"
[02:08:30 CET] <JEEB> x264 is a classic example of good rate control. x265 I'm not sure. better than libvpx I'd guess.
[02:08:56 CET] <JEEB> note: x265 was not done by the community so it's quite a different thing compared to x264
[02:09:17 CET] <JEEB> they got the name rights from x264 LLC as part of a deal
[02:09:20 CET] <Cyberworm> i hate x264 version number system
[02:10:52 CET] <JEEB> but yea, it's sad that x264 is still one of the gold standards for sw video encoding :P all the new features etc in video formats make psychovisual optimizations a harder problem
[02:11:11 CET] <JEEB> and apparently companies in general are more OK with blurriness
[02:11:38 CET] <Cyberworm> even if you say the x264 encoder has great "rate control" and "psychovisual optimization", x264 is an old format now
[02:12:06 CET] <JEEB> H.264 is an old format, and x264 is an old encoder, yes (although it does still get changes)
[02:12:11 CET] <Cyberworm> or i should  say h264 is old format now
[02:12:16 CET] <JEEB> yes, you should
[02:12:28 CET] <JEEB> x264 is just an encoder that happened to be great :P
[02:12:57 CET] <Cyberworm> isn't x265 and libvpx  also done by community?
[02:12:58 CET] <JEEB> but yea, age still doesn't make it any worse compared to the newer encoders. it had a simpler/a bit different job to tackle, but it seemed to do it really well
[02:13:01 CET] <JEEB> no
[02:13:07 CET] <JEEB> libvpx = GOOG throw-over-wall
[02:13:16 CET] <JEEB> x265 = MCW throw-over-wall
[02:13:38 CET] <Cyberworm> what do you mean by "throw over wall"
[02:13:56 CET] <JEEB> open source that generally is just corporate work thrown over the wall into a public repo
[02:14:14 CET] <pink_mist> Cyberworm: unfortunately, x265 and libvpx are too new and so haven't had even a miniscule amount of the work that was put into x264 put into them yet, so they are still playing catchup for a lot of things, even if the format itself is an improvement
[02:14:15 CET] <JEEB> x265 I guess attempts to look more like open source by the patches being sometimes posted on the mailing list
[02:14:32 CET] <JEEB> pink_mist: I mean, they're not even that new any more.
[02:14:41 CET] <pink_mist> comparably I mean :P
[02:14:56 CET] <JEEB> like, x264 6 years after release (2009) was quite a different situation already
[02:15:09 CET] <JEEB> both libvpx's VP9 and x265 stem from circa 2013
[02:15:37 CET] <JEEB> but yea, you don't have someone hacking it off like when the encoder stemmed from community
[02:15:47 CET] <JEEB> the closest we have for a community encoder atm is rav1e
[02:16:00 CET] <JEEB> which is one that's new
[02:16:09 CET] <JEEB> so it's got its problems
[02:16:23 CET] <Cyberworm> jeeb would you be able to tell the difference between a video encoded with the 2019 version of x264 vs the 2009 version (using the same video settings)
[02:17:16 CET] <JEEB> Cyberworm: I'm not sure if you could utilize the exact same options. a lot of the base stuff would already be there, though. haven't tried so no idea :P
[02:17:35 CET] <JEEB> you did get some nice stuff post-2009 pretty sure
[02:17:42 CET] <JEEB> psy-rd/trellis at least?
[02:17:49 CET] <JEEB> MB-tree?
[02:18:06 CET] <JEEB> I'd have to double-check if I remember the dates right tho
[02:18:12 CET] <Cyberworm> i wonder if i can do a test:  not sure where i can get x264 2009 version
[02:18:37 CET] <JEEB> getting the version really isn't the problem; the problem is if you can make sure you're using the same options
[02:18:48 CET] <JEEB> anyways, sleep for me.
[02:24:07 CET] <Cyberworm> x264 core 65 = how do i tell what year this was released
[02:25:28 CET] <DHE> https://github.com/mirror/x264/commit/60455fff82906da0237a4f56b3686a588579e41f I believe this is your answer
[02:26:23 CET] <Cyberworm> Sep 30, 2008
[02:26:24 CET] <Cyberworm>  ?
[02:28:24 CET] <DHE> I'm showing october 1st, 2008, but I suspect there's some timezone quirkiness involved there
[02:29:17 CET] <DHE> let's pencil in a yes
[02:31:07 CET] <Cyberworm> how did you find that so fast
[02:31:14 CET] <Cyberworm> close enough
[02:31:42 CET] <DHE> I have the git repo on my local machine and asked for all the commits to x264.h with their diffs, and searched for "BUILD 65"
[02:31:54 CET] <Cyberworm> i see
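
Assuming a local clone of the x264 repository, the lookup DHE describes can also be done with git's pickaxe, which finds the commits that added and later removed that line (the build number lives in x264.h as the X264_BUILD macro):

    git log -S'#define X264_BUILD 65' -- x264.h
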
[02:32:16 CET] <Cyberworm> i see so many videos using that x264 core 65 version
[02:35:42 CET] <DHE> weird.. core 66 was in january 2009 so that's a rather narrow time range...
[02:36:45 CET] <furq> probably some windows build that didn't get updated for a while
[02:37:04 CET] <furq> probably shipped with megui or something like that
[02:37:12 CET] <DHE> or some troll who's faking the identification string. :)
[02:37:29 CET] <DHE> (I've done that, sorta)
[02:37:38 CET] <furq> it'd be weird to pretend to be using an older build
[02:39:08 CET] <furq> if this is scene stuff then they're not exactly famous for being adventurous with using new stuff
[02:41:11 CET] <DHE> maybe, but it doesn't directly impact playback so it's not illegal...
[05:02:17 CET] <AlexApps> Hello,  I'm just getting started with FFmpeg and I'm wondering what the difference is between using "-i pipe:" or "-i -"
[05:02:55 CET] <furq> there isn't one, but you can set an fd with pipe:0 or whatever
[05:03:17 CET] <AlexApps> Where is the "-" flag documented?
[05:03:48 CET] <furq> it's not a flag, that's a standard idiom meaning stdin or stdout
[05:03:59 CET] <AlexApps> Ok
[05:04:05 CET] <AlexApps> Thanks
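
To illustrate the equivalence with a hypothetical null decode of a piped file:

    cat input.mp4 | ffmpeg -i - -f null -        # "-" means stdin here
    cat input.mp4 | ffmpeg -i pipe:0 -f null -   # same thing, explicit fd 0
    ffmpeg -i pipe:3 -f null - 3< input.mp4      # arbitrary descriptor via pipe:N
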
[05:34:04 CET] <AlexApps> Hello again, I have used FFmpeg-Python to generate a command to make a sequence of images padded to a video, and although the padding works there is only one image contained in the video when I have multiple .png files in my directory
[05:34:16 CET] <AlexApps> ffmpeg -f image2pipe -framerate 1/5 -i pipe: -filter_complex "[0]scale=force_original_aspect_ratio=decrease:h=1080:w=1920[s0];[s0]pad=color=white:h=1080:w=1920:x=-1:y=-1[s1]" -map [s1] -r 30 -c:v libx264 -pix_fmt yuv420p video.mp4
[05:34:55 CET] <AlexApps> I know it's a lot to read, but even just a quick guess would be a lot of help to me :)
[05:35:53 CET] <AlexApps> I pipe in the image data with cat *.png in a directory with 3 png images
[05:38:35 CET] <pink_mist> AlexApps: I'd suggest you don't pipe them in, but instead allow ffmpeg to open them on its own
[05:39:13 CET] <AlexApps> The images I will provide in the finished version will be stored in memory instead of in a directory, so I would prefer to use pipes than temporary files
[05:39:15 CET] <pink_mist> AlexApps: it wouldn't surprise me if since you pipe them in, ffmpeg is convinced it's just a single image with some garbage on the end (that garbage being the other images)
[05:39:40 CET] <AlexApps> How would I fix that?
[05:39:43 CET] <pink_mist> AlexApps: since it can't tell when one file ends and the next begins
[05:39:49 CET] <AlexApps> The wiki has a similar example here: https://trac.ffmpeg.org/wiki/Slideshow#Pipe
[05:40:21 CET] <pink_mist> ok, then I'm probably mistaken, so disregard me
[05:45:04 CET] <AlexApps> If anybody else has any idea what I've done wrong, please lmk :)
[05:55:11 CET] <AlexApps> It worked once I replaced the images with different PNGs, and added "-vcodec png" to input
[05:56:09 CET] <AlexApps> I should have noticed, because in the logs of the previous runs it said "Stream #0:0: Video: mjpeg (Baseline)"
[05:57:13 CET] <AlexApps> There might be a mistake on the wiki page I was looking at, as it did not say that that was needed
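
For the record, a sketch of the corrected command: -vcodec png (an input option, so it goes before -i) tells the image2pipe demuxer what the piped frames are instead of letting it guess.

    cat *.png | ffmpeg -f image2pipe -vcodec png -framerate 1/5 -i - \
      -filter_complex "[0]scale=force_original_aspect_ratio=decrease:h=1080:w=1920[s0];[s0]pad=color=white:h=1080:w=1920:x=-1:y=-1[s1]" \
      -map "[s1]" -r 30 -c:v libx264 -pix_fmt yuv420p video.mp4
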
[06:24:10 CET] <AlexApps> Are there any docs for image2pipe?
[06:27:32 CET] <furq> they're the same as the docs for image2
[06:28:00 CET] <AlexApps> Thanks
[08:05:45 CET] <mtcdood> is there a way to detect whether or not a video clip is inside another video clip?
[08:05:55 CET] <mtcdood> I'm trying to deduplicate some files
[08:07:29 CET] <mtcdood> so if there's some way to do fingerprinting or something that might work
[08:08:30 CET] <AlexApps> Hello once again, if I'm making a video from an image sequence is there any way to loop an mp3 file over it for the entire duration of the generated video?
[08:11:21 CET] <furq> AlexApps: -stream_loop 1 -i foo.mp3 -shortest
[08:12:40 CET] <AlexApps> Thanks yet again, you've been really helpful :)
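
Putting furq's suggestion together with the slideshow input, a sketch with placeholder names: -stream_loop -1 loops the audio indefinitely rather than once, and -shortest then ends the file with the video. Note -stream_loop is an input option and applies to the -i that follows it.

    ffmpeg -framerate 1/5 -i img%03d.png -stream_loop -1 -i music.mp3 \
      -shortest -r 30 -c:v libx264 -pix_fmt yuv420p -c:a aac out.mp4
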
[08:12:57 CET] <mtcdood> what is "signature" ? https://amiaopensource.github.io/ffmprovisr/index.html#compare_video_fingerprints
[08:14:15 CET] <mtcdood> I see patches from three years ago, but it's not in my ffmpeg or the documentation
[08:14:47 CET] <mtcdood> looks like it was in 3.3
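
The signature filter computes MPEG-7 video signatures and can compare two inputs directly; the recipe on the ffmprovisr page linked above boils down to something like this (file names are placeholders):

    ffmpeg -i clip.mkv -i maybe-contains-clip.mkv \
      -filter_complex "[0:v][1:v]signature=nb_inputs=2:detectmode=full" -f null -

Whole or partial matches are reported in the log output.
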
[10:24:52 CET] <AiNA_TE> is there a way to apply filters during two timestamps instead of to a whole video?
[10:25:25 CET] <AiNA_TE> like i need to use deblur for a couple of mins and then for the rest of the job i dont want deblur applied
[10:26:03 CET] <AiNA_TE> or is cutting the video in two, processing each part and concat them the only way
[10:26:28 CET] <durandal_1707> AiNA_TE: see timeline support via enable option and expressions
[10:26:52 CET] <durandal_1707> which filter you use for debluring?
[10:33:31 CET] <AiNA_TE> sorry i meant delogo
[10:33:38 CET] <AiNA_TE> i do this at the moment
[10:33:40 CET] <AiNA_TE> delogo=x=1676:y=36:w=182:h=24
[10:34:00 CET] <AiNA_TE> to remove a serial number from my cable provider
[10:35:16 CET] <durandal_1707> "delogo=old_params:enable='between(t, 5, 10)'"
[10:35:37 CET] <durandal_1707> "delogo=old_params:enable='between(n, 5, 10)'"
[10:35:57 CET] <durandal_1707> first is seconds between 5 and 10
[10:36:12 CET] <durandal_1707> second is frame number between 5 and 10
[10:38:00 CET] <durandal_1707> http://ffmpeg.org/ffmpeg-filters.html#Timeline-editing
[10:47:22 CET] <AiNA_TE> thank you very much
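
Put together with the coordinates above, the whole command might look like this (the 0-120 second range is a placeholder):

    ffmpeg -i in.mp4 \
      -vf "delogo=x=1676:y=36:w=182:h=24:enable='between(t,0,120)'" \
      -c:a copy out.mp4
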
[11:50:39 CET] <Spring> can ffmpeg generate animated WebP images?
[12:13:08 CET] <w1kl4s> http://ffmpeg.org/ffmpeg-codecs.html#libwebp
[12:17:25 CET] <Spring> w1kl4s, I'm not seeing a note about animated support. Is it supported?
[12:33:03 CET] <Spring> nvm, apparently if video is the input it treats the output as such.
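
For reference, a sketch of an animated-WebP encode (rate, size and quality values are arbitrary; -loop 0 is the WebP muxer's "loop forever"):

    ffmpeg -i input.mp4 -vf "fps=15,scale=480:-1" \
      -c:v libwebp -lossless 0 -q:v 75 -loop 0 -an output.webp
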
[18:05:32 CET] <rtwld> hi to all, i'd like to checkpoint/resume ffmpeg on Linux, what have I to use? (i'm using debian unstable)
[18:05:51 CET] <JEEB> what does checkpoint/resume mean?
[18:06:12 CET] <JEEB> I think using the pause functionality in linux works with the process at least
[18:06:24 CET] <JEEB> and unless you're using something that might time out while you're at it, it /should/ JustWork
[18:18:00 CET] <rtwld> pause? did you mean hibernation?
[18:18:53 CET] <rtwld> i want to save the state of encoding between shutdown
[18:20:53 CET] <JEEB> pause as in using the pause button I think
[18:20:57 CET] <JEEB> that generally just stops the process
[18:21:15 CET] <JEEB> to save the state you'd have to dump the whole memory of the process and pull it back afterwards
[18:21:19 CET] <JEEB> and hope that nothing changed in the process
[18:51:59 CET] <kepstin> if you have some checkpoint/resume functionality that works on arbitrary linux programs, it should also work on ffmpeg.
[18:52:25 CET] <kepstin> there's nothing builtin to ffmpeg to do that (and it would be very hard to add)
[18:53:24 CET] <kepstin> honestly if you don't have any other options, probably the easiest way would be to run ffmpeg in a VM that supports saving state, tbh.
[18:57:18 CET] <rtwld> thx bye
[19:05:12 CET] <DHE> JEEB: there is a project called checkpoint/restore which aims to save the state of running processes to disk. imagine live migrating a docker container or such
[19:05:33 CET] <DHE> but yeah it should work unless there's some hardware interaction that can't be saved properly
[19:07:15 CET] <JEEB> DHE: or solibs unless they are fully loaded into that memory dump at that point
[19:13:05 CET] <DHE> there is the expectation that the filesystem is present on the receiver side and hence mmap() is easily taken care of
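
The project DHE mentions is presumably CRIU (Checkpoint/Restore In Userspace); a hedged sketch of what that looks like for a terminal-launched ffmpeg, assuming CRIU is installed and the kernel supports it:

    # freeze the process tree and dump its state to disk
    sudo criu dump -t "$(pidof ffmpeg)" -D /tmp/ffmpeg-ckpt --shell-job
    # later (same filesystem contents), resume it
    sudo criu restore -D /tmp/ffmpeg-ckpt --shell-job

As noted above, this only works if everything the process had open (files, libraries, devices) is still present on restore.
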
[20:53:44 CET] <cards> Not an ffmpeg question per se (unless it is), but what are some modern tools for dejuttering / stabilizing a video?  In particular, an old film, which are notorious for being very bouncy when copied digitally.
[20:55:23 CET] <cards> In many cases, this is what pre-2000's films primarily mean when they are "remastered" either professionally or by amateur pirates
[20:59:23 CET] <bluejaypop> hi guys, what is the correct way to apply more than one audio filter at the same time? I see some ways to do it, like -filter:a "volume=0.8", and on the ffmpeg.org homepage the Audio Filters section shows -filter_complex
[21:00:16 CET] <durandal_1707> bluejaypop: -af volume=0.8,volume=0.8
[21:01:04 CET] <bluejaypop> oh, only comma separated the filters then.
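
A slightly fuller chain, with illustrative filters and values:

    ffmpeg -i in.mp3 -af "volume=0.8,highpass=f=100,lowpass=f=8000" out.mp3

The filters run left to right; -filter_complex is only needed once multiple streams or labeled pads are involved.
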
[21:10:33 CET] <ironm> !include jpg into mp4
[21:15:51 CET] <ironm> Hello. Is it possible to convert a single jpg file to an mp4 stream with defined length (like 2s) with an ffmpeg command?
[21:16:01 CET] <ironm> like: ffmpeg -framerate 10 -i Path/To/File/filename%3d.jpg -r 5 -y Path/To/File/test.mp4
[21:41:55 CET] <ironm> Ok, it looks like it works: ffmpeg -loop 1 -framerate 10 -i in.png -r 50 -t 5 -c copy out.mp4
[21:42:48 CET] <ironm> However I am not sure about the -framerate (input) and -r (output)
[21:43:19 CET] <ironm> How do they correlate to each other?
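
Roughly: -framerate tells the image demuxer to stamp the looped frame at 10 fps on the input side, while the output -r 50 duplicates frames up to 50 fps. Note that -r only takes effect when re-encoding; with -c copy the PNG data is passed through untouched, so a sketch without stream copy would be:

    ffmpeg -loop 1 -framerate 10 -i in.png -t 5 -r 50 \
      -c:v libx264 -pix_fmt yuv420p out.mp4
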
[22:07:58 CET] <airking2> Hello!  I'm trying to stream a raspberry pi's desktop using the guide located at the bottom of this page: https://wiki.archlinux.org/index.php/Streaming_to_twitch.tv
[22:08:10 CET] <airking2> I get the error "No such process" when attempting to run the script
[22:08:53 CET] <airking2> Here is a paste of the output
[22:08:54 CET] <airking2> https://pastebin.com/utiF3bGm
[22:08:59 CET] <pink_mist> that doesn't sound like an error that ffmpeg would give
[22:09:08 CET] <pink_mist> so you should probably ask whoever wrote that script
[22:11:23 CET] <airking2> The script just takes in a single argument and passes a few other arguments to ffmpeg, I don't really know how that error could come from the script
[22:15:25 CET] <kepstin> airking2: well, try running the generated command manually
[22:15:31 CET] <pink_mist> oh, you missed the "0:" in "0: No such process" ... now it looks more like something ffmpeg would output
[22:15:32 CET] <kepstin> (and include the complete command line in the paste)
[22:15:36 CET] <pink_mist> I didn't check your link until just now
[22:16:04 CET] <airking2> my bad, I didn't realize I missed the 0
[22:20:42 CET] <airking2> kepstin https://pastebin.com/vCeFRfgT
[22:21:00 CET] <airking2> It outputs the same error
[22:21:12 CET] <kepstin> ok, your problem is the `-f pulse -i 0`
[22:21:25 CET] <kepstin> 0 is not a valid pulseaudio device name
[22:22:58 CET] <airking2> hmmm, how can I get a list of the valid ones?
[22:23:22 CET] <kepstin> `pacmd list-sources` and the names you use are the stuff between <> in the name field
[22:23:51 CET] <kepstin> if you're doing screen capture, you'd probably want to find the ".monitor" device attached to your main audio output
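
A sketch of what that ends up looking like (the source name below is only an example; take yours from the pacmd output):

    pacmd list-sources | grep 'name:'
    ffmpeg -f pulse -i alsa_output.pci-0000_00_1f.3.analog-stereo.monitor -t 10 test.wav
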
[22:24:00 CET] <airking2> This is a raspberrypi, unfortunately it isn't running arch
[22:24:13 CET] <kepstin> has nothing to do with arch?
[22:24:34 CET] <airking2> whoops, my bad saw pacmd and mixed it up with pacman
[22:25:52 CET] <kepstin> also lol, a raspberry pi almost certainly cannot do that command as specified
[22:26:02 CET] <kepstin> i expect it'll work, but your framerate will be much lower than 30fps
[22:26:23 CET] <airking2> If I just get rid of the "-f pulse -i 0" can I just stream without audiO?
[22:26:28 CET] <airking2> I just realized I don't actually need the audio
[22:26:39 CET] <airking2> It *appears* to work without that
[22:27:44 CET] <kepstin> most streaming services require an audio track, you can probably replace it with something like "-f lavfi -i aevalsrc=0"
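
Adapted to the script's screen grab, kepstin's suggestion might look like the following (grab geometry, bitrates and the RTMP URL are placeholders; aevalsrc=0 generates silence):

    ffmpeg -f x11grab -framerate 30 -video_size 1280x720 -i :0.0 \
      -f lavfi -i aevalsrc=0 \
      -c:v libx264 -preset veryfast -pix_fmt yuv420p -b:v 1500k \
      -c:a aac -b:a 64k -f flv rtmp://live.twitch.tv/app/STREAM_KEY
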
[22:33:31 CET] <ironm> [ffmpeg/video] png: Invalid PNG signature 0x1019E626A.
[22:36:36 CET] <kepstin> ironm: yep, that's not a png signature.
[22:37:51 CET] <kingsley> What's the fastest video format to encode? Lossless? mkv?
[22:38:09 CET] <kepstin> rawvideo
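
rawvideo does no compression at all, so encoding is essentially a memory copy, at the cost of enormous files, e.g.:

    ffmpeg -i input.mp4 -c:v rawvideo -an output.mkv

If raw is too large, fast lossless codecs (ffvhuff, utvideo) or x264 with -qp 0 -preset ultrafast are the usual middle ground.
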
[22:38:09 CET] <ironm> kepstin, thank you. What I did was:
[22:38:12 CET] <ironm> ffmpeg -loop 1 -framerate 10 -i in2.png -r 50 -t 5 -c copy out2.mp4
[22:38:33 CET] <kepstin> ironm: your png file is not a png file. what does "file in2.png" say?
[22:38:49 CET] <ironm> and then: ffmpeg -f concat -i mymp4.list -c copy  outX.mp4
[22:39:07 CET] <kepstin> oh, wait, you're copying png into an mp4 file?
[22:39:40 CET] <ironm> kepstin, in2.png: PNG image data, 3839 x 2159, 8-bit/color RGB, non-interlaced
[22:39:48 CET] <kepstin> if you're concatenating data from multiple files with the concat demuxer, they all have to have the same video/audio codecs
[22:40:08 CET] <kepstin> (and you might have trouble if they're made with different encoders or encoder settings even)
[22:40:27 CET] <ironm> kepstin, thank you. The screenshot does not have an audio codec at all
[22:40:38 CET] <kepstin> it sounds like your first file in your playlist has png video, then the following ones have something different
[22:40:51 CET] <ironm> other mp4 files are from GoPRo7
[22:40:55 CET] <kepstin> which then breaks when the png decoder gets handed the "something different"
[22:41:04 CET] <ironm> kepstin, yes
[22:41:23 CET] <kepstin> if you have files with different codecs, you need to use the concat filter and re-encode
[22:41:30 CET] <ironm> otherwise the mp4 files generated from png are ignored
[22:41:53 CET] <kepstin> you cannot concatenate videos that have different codecs with -c copy
[22:41:55 CET] <ironm> I see, thank you. I will have to check how to do it
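
A sketch of the concat-filter route for this case (video only; file names are placeholders, and the 1920x1080/50 fps normalization mirrors the GoPro properties mentioned below): each input is decoded, normalized to common properties, concatenated, and re-encoded.

    ffmpeg -i title.mp4 -i gopro.mp4 -filter_complex \
      "[0:v]scale=1920:1080,fps=50,format=yuv420p[v0]; \
       [1:v]scale=1920:1080,fps=50,format=yuv420p[v1]; \
       [v0][v1]concat=n=2:v=1:a=0[v]" \
      -map "[v]" -c:v libx264 out.mp4

If the GoPro clips carry audio, the title segment needs a matching (e.g. silent) audio leg before concat=...:a=1 will accept it.
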
[22:43:55 CET] <ironm> kepstin, could I create an mp4 file from a png file, setting the same codecs as in the original MP4 files from GoPro7?
[22:44:25 CET] <kepstin> ironm: it's very hard to make h264 encoded files match when you use two different encoders.
[22:45:25 CET] <ironm> maybe it is possible to specify the codec inside "ffmpeg -loop 1 -framerate 10 -i in2.png -r 50 -t 5 -c copy out2.mp4" ?
[22:46:00 CET] <nicolas17> you can't make ffmpeg use the same encoder as your gopro since that's likely proprietary
[22:46:17 CET] <kepstin> that said, if the new file you're making is the first file, and you use x264 settings that enable a lot of features and use the x264 'stitchable' option, there's a chance it might work :/
[22:47:52 CET] <ironm> That means I would have to convert GoPro7 MP4 files to x264 (I thought it happens automatically)
[22:48:25 CET] <ironm> when I run ffmpeg .. (however, with the "-c copy" option)
[22:48:48 CET] <kepstin> you used "-c copy" which is how you explicitly tell ffmpeg to do no conversion
[22:48:59 CET] <ironm> true
[22:50:08 CET] <ironm>  (+) Video --vid=1 (*) (h264 1920x1080 50.000fps)
[22:52:55 CET] <ironm> I use like: ffmpeg -i BA10_05_GH040086f00.mp4 -to 00:00:20 -c copy BA10_05_GH040086f00-20s.mp4
[23:00:34 CET] <ironm> I try now: ffmpeg -i GH060085.MP4 -c:v libx264 -preset ultrafast -crf 0 output.mp4
[23:04:15 CET] <ironm> kepstin, I run: ffmpeg -i BA10_04_GH030086f00-10s.mp4 -c:v libx264 -preset ultrafast -crf 0 output.mp4
[23:04:38 CET] <ironm> but the output.mp4 file is still:  (+) Video --vid=1 (*) (h264 1920x1080 50.000fps)
[23:04:59 CET] <kepstin> libx264 is an h264 encoder, so yes...
[23:04:59 CET] <ironm> I had expected x264 instead
[23:05:10 CET] <ironm> omm
[23:06:51 CET] <ironm>  (+) Video --vid=1 (*) (png 3839x2159 10.000fps)
[23:07:34 CET] <kepstin> you're also going to have to scale that png to the right size before you can concatenate the videos
[23:08:01 CET] <ironm> Yes, it looks like I'm missing one pixel
[23:08:37 CET] <ironm> thank you kepstin
[23:10:40 CET] <ironm> for some reason "scrot -s" cuts one pixel in every direction (4K display)
[23:21:44 CET] <ironm> Is there an easy way to create frames containing only text at a defined target resolution (FHD or 4K), based on a text file?
[23:22:17 CET] <ironm> like title pages before the scene
[23:29:35 CET] <ironm> quite a lot of stuff to read - https://trac.ffmpeg.org/
[23:38:11 CET] <DHE> there is a text renderer, but that's quite a lot...
[23:45:16 CET] <ironm> DHE thank you, I will check it
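
A sketch with the drawtext renderer DHE mentions (needs an ffmpeg built with libfreetype; duration, size and font values are placeholders, and a fontfile= may be required if fontconfig isn't available):

    ffmpeg -f lavfi -i color=c=black:s=3840x2160:d=2:r=50 \
      -vf "drawtext=textfile=title.txt:fontcolor=white:fontsize=120:x=(w-text_w)/2:y=(h-text_h)/2" \
      -c:v libx264 -pix_fmt yuv420p title.mp4
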
[00:00:00 CET] --- Tue Oct 29 2019

