[Ffmpeg-devel-irc] ffmpeg.log.20170929
burek
burek021 at gmail.com
Sat Sep 30 03:05:01 EEST 2017
[00:00:40 CEST] <JEEB> -af "pan=5.1|c0=blah|c1=blah" and so on
[00:00:42 CEST] <JEEB> I guess?
[00:01:12 CEST] <JEEB> for mapping the FL and FR channels as stereo it'd be -af "pan=stereo|c0=FL|c1=FR"
[00:02:03 CEST] <JEEB> so the "5.1" is the name of the 5.1 channel mapping
[00:02:03 CEST] <SpeakerToMeat> Yes but then I don't get the center to left and right or back to left and right that I get with the built in mapper
[00:02:26 CEST] <JEEB> I just gave the latter as an example
[00:02:46 CEST] <JEEB> you map the correct channels from your 12ch input as 5.1
[00:03:00 CEST] <JEEB> by making the first line I posted full with all the six channels
[00:03:10 CEST] <JEEB> and instead of blah having the correct identifiers
[00:03:17 CEST] <JEEB> then after that you can downmix to stereo
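A sketch of the two steps JEEB describes, as one command. Which of the 12 input channels carries which speaker is an assumption here (inspect the source first); the stereo downmix line follows the standard pan example in the ffmpeg docs:

```shell
# Step 1: take six of the 12 input channels as a 5.1 layout.
# Left of '=' are channels of the 5.1 output; the right side names
# input channel indices (c0..c11) -- the assignment below is a guess.
PAN51='pan=5.1|c0=c0|c1=c1|c2=c2|c3=c3|c4=c4|c5=c5'
# Step 2: fold center and back channels into a stereo downmix
# ('<' normalizes the gains so the sum cannot clip).
PANST='pan=stereo|FL<FL+0.5*FC+0.6*BL|FR<FR+0.5*FC+0.6*BR'
echo "ffmpeg -i in.mov -af \"$PAN51,$PANST\" out.wav"
```

Chaining both pan instances in one -af means the 5.1 intermediate never has to be written anywhere.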
[00:04:53 CEST] <SpeakerToMeat> Ok so the first line, the filter goes before the input file so it gets associated with it?
[00:05:00 CEST] <JEEB> no
[00:05:18 CEST] <JEEB> filtering is always in transcoding, post decoding
[00:05:38 CEST] <SpeakerToMeat> Ok,
[00:05:50 CEST] <JEEB> also I wonder if there's a better way... I remember utilizing filter_complex for this sort of stuff ages ago
[00:06:24 CEST] <SpeakerToMeat> and in that pan filter, c0 is the input or output? I mean: c0=FL is assigning a channel tagged FL to channel 0, or assigning channel 0 to the FL channel tag
[00:06:45 CEST] <JEEB> channel 0 being FL (from the input)
[00:06:54 CEST] <JEEB> probably other ways of notating things too
[00:07:53 CEST] <SpeakerToMeat> I'll see if I can kick firefox into responding to get the pan filter doc
[00:08:18 CEST] <JEEB> oh
[00:08:19 CEST] <JEEB> pan="5.1| c0=c1 | c1=c0 | c2=c2 | c3=c3 | c4=c4 | c5=c5"
[00:08:24 CEST] <JEEB> the examples had this :P
[00:08:53 CEST] <JEEB> so you can just use c0=c0 , and so forth
[00:09:22 CEST] <JEEB> https://ffmpeg.org/ffmpeg-filters.html#Remapping-examples
[00:09:29 CEST] <SpeakerToMeat> A thing of beauty
[00:09:39 CEST] <SpeakerToMeat> Thank you, sorry to ask so much but my FF is locked up right now
[00:10:09 CEST] <JEEB> also do note that if you copypasta what I pasted from the example as is, the first two channels get swapped
[00:10:27 CEST] <JEEB> just in case you didn't notice :p
[00:10:38 CEST] <JEEB> if you started typing it yourself again then it should be OK
[00:10:49 CEST] <SpeakerToMeat> Yes I gathered that
[00:12:12 CEST] <SpeakerToMeat> Thanks
[00:50:40 CEST] <moniker-> does anyone know what is VLD bitstream decoding
[00:51:13 CEST] <moniker-> when i turn it off in my potplayer i can play multiple 1080p60 streams without problem, right now im playing 4
[00:51:44 CEST] <moniker-> when it is on (by default) i can only play 1 stream if i open second then the other stream will be at 15-20 fps
[00:54:02 CEST] <pzich> moniker-: according to the internet, it's "variable-length decoding", which I could see being more expensive to decode
[00:54:40 CEST] <moniker-> oh wait im stupid
[00:54:44 CEST] <moniker-> well ofc
[00:54:59 CEST] <moniker-> if i turn off VLD then it's not hw accelerated anymore, it's sw
[00:55:08 CEST] <pzich> oh hah, that'd do it
[00:55:08 CEST] <moniker-> nvm -.-
[00:55:18 CEST] <moniker-> that explains why i can run so many streams
[01:03:31 CEST] <SpeakerToMeat> Is there any way I can tell ffmpeg that a video is not actually 25 fps but 24? constant frame count...
[01:04:10 CEST] <SpeakerToMeat> Especially when converting from discrete image frames...
[01:06:53 CEST] <pzich> I think you just need -r 24
[01:07:06 CEST] <pzich> Or maybe -framerate, I can't remember
[01:07:58 CEST] <SpeakerToMeat> pzich: Left of the input I think
[01:08:05 CEST] <pzich> Ah yes
[01:08:12 CEST] <SpeakerToMeat> I think that's it, thanks
[01:08:20 CEST] <pzich> something like ffmpeg -framerate 24 -i images%d.jpg ...
[01:13:24 CEST] <SpeakerToMeat> it's -r I think, let me try
[01:13:49 CEST] <SpeakerToMeat> Hmm no, doesn't seem to
[01:14:56 CEST] <pzich> pretty sure it's -framerate, at least according to https://en.wikibooks.org/wiki/FFMPEG_An_Intermediate_Guide/image_sequence#Making_a_video_from_an_Image_Sequence
[01:16:38 CEST] <SpeakerToMeat> Yep you're right
[01:16:44 CEST] <SpeakerToMeat> Thank you
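The distinction that tripped this up, as a sketch: -framerate is an input option of the image demuxer and must precede -i, while -r as an output option retimes by duplicating or dropping frames (filenames are placeholders):

```shell
# Input option: read the image sequence at 24 fps, so each original
# frame is kept and simply timestamped at 1/24 s intervals.
CMD='ffmpeg -framerate 24 -i frame_%04d.png -c:v libx264 -pix_fmt yuv420p out.mp4'
echo "$CMD"
```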
[01:17:53 CEST] <SpeakerToMeat> Ok time to run home, see you all.
[01:17:55 CEST] Action: SpeakerToMeat bows
[03:12:47 CEST] <hendry> is there an FFMPEG script to create like a 10 frame poster image like how Youtube does it?
[03:12:58 CEST] <hendry> Browsers seem hopeless at previewing a bunch of MP4s
[03:26:26 CEST] <rafasc> Is there a way to limit this just to the first couple of seconds? ffmpeg -f lavfi -i 'amovie=file.mp4,showspectrumpic=s=800x600' spec.jpg
[03:28:58 CEST] <furq> trim=end=2,showspectrumpic
[03:29:10 CEST] <furq> also i'm not sure why you'd use the movie filter there but i guess it still works
[03:30:39 CEST] <rafasc> I tried not using it, but wasn't able to make it work
[03:32:54 CEST] <rafasc> thanks furq, in my case needed atrim, but you pointed me to the right direction. :)
[03:35:34 CEST] <furq> -i file.mp4 -filter_complex [a:0]atrim=end=2,showspectrumpic
[03:35:37 CEST] <furq> but nvm
[03:36:04 CEST] <furq> 0:a rather
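Putting furq's pieces together, a sketch that draws the spectrum of only the first two seconds (filename and size are placeholders):

```shell
# atrim cuts the decoded audio to the first 2 s before the spectrum
# picture is drawn; no movie= source filter is needed.
FG='[0:a]atrim=end=2,showspectrumpic=s=800x600'
echo "ffmpeg -i file.mp4 -filter_complex \"$FG\" spec.jpg"
```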
[03:45:54 CEST] <rjp421> anyone with a pi and pi-cam, and updated packages+kernel+ffmpeg-git: can you please test piping raspivid to ffprobe or ffmpeg? please confirm whether or not ffmpeg crashes (pls let me know results so i can stop spamming this)
[03:57:08 CEST] <rafasc> showspectrumpic doesn't allow setting limits to the vertical resolution? I just wanted to plot lets say, between 0 and ~200hz
[06:03:27 CEST] <blap> showspec-trump-ic
[06:55:49 CEST] <lindylex> why does this fail : ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[0:v]crop=638:720, pad=640:720:0:0:green[tmp0]; [1:v]crop=638:720, pad=640:720:2:0:green[tmp1]; [tmp0][tmp1]hstack[v];" -y o.mp4 with this error : [AVFilterGraph @ 0x558bbf274f20] No such filter: ''
[06:55:49 CEST] <lindylex> Error initializing complex filters.
[06:55:49 CEST] <lindylex> Invalid argument
[07:05:50 CEST] <dystopia> what are you trying to do lindylex
[07:06:40 CEST] <lindylex> I am adding two videos side by side and placing a border color between them. Both videos have no audio and this is causing problems I think.
[07:06:45 CEST] <lindylex> No audio stream
[07:09:32 CEST] <lindylex> dystopia : I solved it ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[0:v]crop=638:720, pad=640:720:0:0:green[tmp0]; [1:v]crop=638:720, pad=640:720:2:0:green[tmp1]; [tmp0][tmp1]hstack[v]; anullsrc[silent1]; anullsrc[silent2]; [silent1][silent2]amerge=inputs=2[a]" -map [v] -map [a] -ac 2 -y o.mp4
[09:40:58 CEST] <dreamon_> hello. I have a 29GB UHD Video, converted using "ffmpeg -i input.mov -c:v libx265 -crf 28 output.mp4"; the file is now around 1.9GB. But I'm having issues playing it. It stutters, and vlc sometimes shows gray fragments.
[09:42:15 CEST] <dreamon_> I play it on an intel i5 notebook with nvidia support. The original 29GB file also stutters on this laptop, but runs perfectly on a UHD TV
[09:46:42 CEST] <dreamon_> how can I find out if my cpu/gpu is fast enough to play it in UHD?
[09:55:45 CEST] <JEEB> dreamon_: if mpv can play it then it's fast enough :P
[09:58:31 CEST] <rabbe> hi. i want to stream a webcam under linux, using ffmpeg and ffserver in (modified) mp4 and webm containers for support across major browsers in a html5 video tag. but since i don't have the hardware yet i want to use the screen as source for now
[09:59:08 CEST] <rabbe> someone have any premade ffmpeg parameters for this?
[10:00:15 CEST] <dreamon_> JEEB, mpv plays it same hooking. but Im not sure it uses hw support nvidia. tried windows10 and ubuntu ..
[10:00:24 CEST] <rabbe> am i on the right track that i first create a .ffm stream and then in ffserver-config i specify information of the mp4 and webm streams which will then be available to the web page?
[10:00:57 CEST] <JEEB> dreamon_: --hwdec=dxva2-copy with nvidia
[10:01:29 CEST] <JEEB> dreamon_: but basically that means that your CPU is not fast enough, and using dxva2 the dedicated decoder hardware will be utilized for HEVC
[10:04:05 CEST] <dreamon_> dxva2 is that a cpu or a gpu part?
[10:04:39 CEST] <JEEB> depends on which gets utilized by DXVA2 if you have a HW decoder chip in your CPU
[10:05:11 CEST] <JEEB> generally speaking it utilizes decoder hardware on graphics stuff (which both an intel CPU would have in addition to the nvidia GPU)
[10:05:23 CEST] <JEEB> most likely if you have an nvidia GPU stuck in there it will utilize it
[10:05:39 CEST] <JEEB> as you have to have a screen connected to the intel iGPU to get that poked, I think
[10:05:40 CEST] <dreamon_> JEEB, how can I find out?
[10:05:52 CEST] <JEEB> -v outputs it I think
[10:06:27 CEST] <JEEB> but it's not the CPU part of the CPU even if it uses the CPU
[10:06:39 CEST] <JEEB> it's the dedicated chip that does video decoding :P
[10:06:45 CEST] <JEEB> (which is part of the GPU on the CPU)
[10:07:01 CEST] <JEEB> (this all with --hwdec=dxva2-copy)
[10:07:12 CEST] <JEEB> without that it's all CPU
[10:07:27 CEST] <JEEB> as in, CPU CPU
[10:08:21 CEST] <dreamon_> cat /proc/cpuinfo | grep dxva2 no output.
[10:09:22 CEST] <dreamon_> can you give me the syntax?
[10:10:24 CEST] <JEEB> oh, you're on linux
[10:10:28 CEST] <JEEB> you just said windows 10 :P
[10:10:42 CEST] <dreamon_> Im on Linux, too
[10:10:58 CEST] <JEEB> DXVA2 is the windows hw decoding API
[10:11:53 CEST] <JEEB> on linux I'm actually not sure. it used to be vdpau for nvidia, vaapi for intel/amd
[10:11:56 CEST] <dreamon_> ffmpeg is only running on Linux.. at the moment. but playing video I tried on both OS
[10:12:16 CEST] <JEEB> but now vdpau is broken for some formats and doesn't support 10 bit :P
[10:13:07 CEST] <JEEB> anyways, I'll let you figure it out yourself on linux :P
[10:23:45 CEST] <dreamon_> JEEB, tried in Windows ffmpeg -v --hwdec=dxva2-copy.. but invalid
[10:23:52 CEST] <JEEB> I was speaking of mpv
[10:24:06 CEST] <JEEB> mpv --hwdec=dxva2-copy file
[10:26:33 CEST] <rabbe> please help me when you can. i need this for a robot project
[10:28:19 CEST] <dreamon_> JEEB, What the ****.. now it plays totaly smoothly.
[10:29:12 CEST] <dreamon_> Fast as hell.
[10:30:23 CEST] <dreamon_> JEEB, how can I tell windows to play every video in mpv with this option?
[10:37:19 CEST] <JEEB> dreamon_: basically your CPU is not fast enough but your GPU's hw decoder *is* fast enough :P
[10:37:31 CEST] <JEEB> you can add a hwdec=dxva2-copy to the config file if you really want to
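The config-file approach JEEB mentions would be a one-liner in mpv.conf (typically %APPDATA%/mpv/mpv.conf on Windows):

```ini
# mpv.conf: always try DXVA2 copy-back hardware decoding
hwdec=dxva2-copy
```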
[10:39:32 CEST] <dreamon_> JEEB, and the other players, like M$ or VLC, don't use this option?
[10:40:46 CEST] <JEEB> no fucking idea
[10:41:40 CEST] <dreamon_> JEEB, THANK YOU VERY VERY MUCH!!
[10:42:48 CEST] <dreamon_> do you think there is such a option for linux users too?
[10:43:01 CEST] <JEEB> vaapi or vdpau are the alternatives
[10:43:27 CEST] <JEEB> or I think they might have nvdec now for nvidia as well
[10:47:30 CEST] <rabbe> can someone help me getting my mp4 stream for html5 video to work? https://pastebin.com/vF7WzHb2
[13:19:05 CEST] <Kadigan> Hey, I have a question -- ffmpeg accepts two duration formats - the classic H:MM:SS.ms, and the S.ms format... but is there a way to escape the : when doing this for filters?
[13:19:44 CEST] <Kadigan> I was getting a fairly confusing error about unknown filter configuration 23.76, until I realized it was the : being parsed, splitting the time format and breaking things.
[13:21:12 CEST] <Kadigan> Or do I need to use strictly the seconds[.ms] format in there? (I was using this in conjuction w/ afade, ie. "afade=t=out:st={computed time}:d=01.00")
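For reference, inside a filtergraph ':' separates filter options, so an H:MM:SS value gets split unless escaped; the plain-seconds form sidesteps the parser entirely (times below are placeholders):

```shell
# Reliable: express the start time in seconds.
SECONDS_FORM='afade=t=out:st=83.76:d=1'
# Alternative (assumption: one level of filtergraph escaping suffices
# here): backslash-escape the colons inside the option value.
ESCAPED_FORM='afade=t=out:st=00\:01\:23.76:d=1'
echo "ffmpeg -i in.wav -af \"$SECONDS_FORM\" out.wav"
```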
[13:57:00 CEST] <dreamon> JEEB, For Linux "mpv --hwdec=vaapi-copy movie.mp4" works very well!!
[13:57:23 CEST] <JEEB> congrats
[13:57:31 CEST] <JEEB> I have vaapi set up on my intel laptop as well
[13:57:46 CEST] <JEEB> although I only need it in very specific cases because generally software decoding is more stable
[13:58:48 CEST] <dreamon> JEEB, my Linux laptop only has the intel gpu activated. But as you told me, that's part of the cpu.. ;) So it works perfectly on this device
[13:59:56 CEST] <dreamon> Yes. without -copy it gets stuck when seeking in the video. but with -copy it works well for me
[14:11:37 CEST] <JEEB> dreamon: it doesn't matter if the chip is on CPU or GPU
[14:11:43 CEST] <JEEB> it's hw decoding
[14:11:44 CEST] <JEEB> not sw
[14:13:03 CEST] <dreamon> JEEB, How can I check which one (cpu/gpu) has it implemented? -v?
[14:19:01 CEST] <titbang> lol wut JEEB
[14:32:12 CEST] <JEEB> dreamon: yes
[14:32:20 CEST] <JEEB> or well, it will list which driver it loads
[14:32:30 CEST] <JEEB> i965 for intel's, and otherwise for nvidia
[14:32:44 CEST] <JEEB> titbang: intel iGPU vs (any) dGPU.
[14:32:51 CEST] <JEEB> where the decoder ASICs are
[15:38:46 CEST] <lindylex> Greetings all. I am placing two videos with no audio side by side. I ran the following and it seems to be in a never ending loop. It is taking forever and still has not completed. ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[0:v]crop=638:720, pad=640:720:0:0:green[tmp0]; [1:v]crop=638:720, pad=640:720:2:0:green[tmp1]; [tmp0][tmp1]hstack[v]; anullsrc[silent1]; anullsrc[silent2]; [silent1][silent2]amerge=inputs=2[a]" -map [v]
[15:38:48 CEST] <lindylex> -map [a] -ac 2 -shortest -y o.mp4
[15:43:52 CEST] <isabella> hi, Can anyone help me with applying instagram-like filter on video, please?
[15:45:29 CEST] <lindylex> isabella : What do you want your video to look like? Show us a photo.
[15:45:58 CEST] <isabella> https://www.cssfilters.co/
[15:46:29 CEST] <isabella> like the first one
[15:46:32 CEST] <isabella> 1977 style
[15:47:20 CEST] <isabella> i'm currently doing a project enabling the user to add filters to their video
[15:47:34 CEST] <isabella> for the front end, it's easy to do using css filters.
[15:48:07 CEST] <isabella> but it's difficult to apply the exact same effect using ffmpeg, since i'm not familiar with it.
[15:50:52 CEST] <isabella> i do find some video filters corresponding like hue, with which i can configure brightness, saturation and hue (same with css filters)
[15:51:04 CEST] <lindylex> isabella : try this ffmpeg -i myvideo.mov -f lavfi -i "color=brown:s=1280x720" -filter_complex "[0:v]setsar=sar=1/1[s]; [s][1:v]blend=shortest=1:all_mode=screen:all_opacity=0.9[out]" -map [out] -map 0:a -y out.mp4
[15:51:28 CEST] <isabella> thank you. i'm trying it.
[15:59:49 CEST] <isabella> lindylex: it works if i change the size to 320x240 and opacity to 0.5
[16:03:22 CEST] <isabella> is there any clue that can help me to map the css filters to the ffmpeg video filters?
[16:07:22 CEST] <isabella> lindylex May i ask you how to do the 'Lofi'?
[16:13:11 CEST] <isabella> for the color, can i assign it with hex value instead of color name like 'brown'?
[16:13:54 CEST] <lindylex> yes
[16:14:49 CEST] <lindylex> isabella : Like this 0x000000 . This give you black.
[16:16:27 CEST] <isabella> lindylex: ok, thank you
[16:18:07 CEST] <lindylex> isabella : The granular control you desire will come from this >>> eq= I do not have enough experience to advise you on this, others do.
[16:20:56 CEST] <isabella> i tried eq by setting only the contrast, but it doesn't seem to produce exactly the same effect as the css filters.
[16:28:24 CEST] <isabella> lindylex how can i keep the same quality of video? its size decreased to 880KB from 1.1MB
[16:31:56 CEST] <lindylex> isabella : Quality is a difficult question to answer. Does it look terrible from the original? How will it be consumed? What is the quality of the video that is being modified?
[16:33:12 CEST] <titbang> 0xFFFFFFF
[16:35:40 CEST] <isabella> it's fine. it does not look terrible
[16:36:08 CEST] <lindylex> Congrats and best of luck have a great weekend.
[16:37:14 CEST] <isabella> lindylex thx. you too. but i have a question of processing mp4 video.
[16:37:45 CEST] <isabella> it throws an error if the input video is mp4
[16:38:03 CEST] <isabella> Stream map '0:a' matches no streams. To ignore this, add a trailing '?' to the map.
[16:38:31 CEST] <lindylex> It looks like your video has no audio. Am I correct?
[16:38:56 CEST] <isabella> exactly
[16:39:28 CEST] <lindylex> Your ffmpeg string has map 0:a in it and this is searching for a video with audio stream.
[16:39:52 CEST] <lindylex> You must modify it to handle videos without audio.
[16:40:41 CEST] <isabella> will adding a trailing '?' solve the problem? if not, how can i modify it
[16:40:49 CEST] <isabella> for the no-audio case
[16:42:50 CEST] <lindylex> Please post your ffmpeg string.
[16:43:16 CEST] <isabella> ffmpeg -i video4.mp4 -f lavfi -i "color=0x4c59ba:s=1280x720" -filter_complex "[0:v]setsar=sar=1/1[s]; [s][1:v]blend=shortest=1:all_mode=screen:all_opacity=0.5[out]" -map [out] -map 0:a? -y video4.2.mp4
[16:44:46 CEST] <isabella> it's in progress. not yet finished.
[16:45:05 CEST] <furq> !filter eq @isabella
[16:45:05 CEST] <nfobot> isabella: http://ffmpeg.org/ffmpeg-filters.html#eq
[16:45:36 CEST] <furq> presumably for 1977 you just want -vf eq=1.1:1.1:1.3
[16:45:58 CEST] <furq> !filter colorchannelmixer
[16:45:58 CEST] <nfobot> furq: http://ffmpeg.org/ffmpeg-filters.html#colorchannelmixer
[16:46:02 CEST] <furq> then you can do grayscale and sepia with that
[16:46:04 CEST] <lindylex> I am trying to solve the same problem with how do I handle a video with no audio. I did this : ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[0:v]crop=638:720, pad=640:720:0:0:green[tmp0]; [1:v]crop=638:720, pad=640:720:2:0:green[tmp1]; [tmp0][tmp1]hstack[v]; anullsrc[silent1]; anullsrc[silent2]; [silent1][silent2]amerge=inputs=2[a]" -map [v] -map [a] -ac 2 -y o.mp4 but it renders the video indefinitely. It is in some kind
[16:46:04 CEST] <lindylex> of loop.
[16:47:01 CEST] <furq> you can just do anullsrc=c=2 for stereo fyi
[16:47:08 CEST] <furq> and then add -shortest as an output option
[16:47:24 CEST] <lindylex> furq: thanks let me try this.
[16:47:27 CEST] <isabella> lindylex ok. so it's not gonna work with no audio video
[16:47:35 CEST] <furq> although anullsrc apparently defaults to stereo
[16:47:45 CEST] <isabella> furq, nfobot: thank you
[16:48:10 CEST] <isabella> i tried. but i don't know how to map the values
[16:48:14 CEST] <furq> just remove -map 0:a
[16:48:32 CEST] <lindylex> isabella : correct furq is much more experienced with this. I will try my best to help you. If I solve mine then it will apply to yours.
[16:48:40 CEST] <furq> you don't really need filter_complex if you only have one input stream
[16:49:00 CEST] <furq> just ffmpeg -i in.mp4 -vf eq=1.1:1.1:1.3 out.mp4
[16:49:32 CEST] <isabella> ok. for the moment single input feed my needs
[16:49:40 CEST] <lindylex> furq : was this for me? anullsrc=c=2
[16:49:52 CEST] <furq> lindylex: apparently it's anullsrc=stereo
[16:49:56 CEST] <furq> but also that should be the default
[16:50:26 CEST] <lindylex> This is what I need to modify : ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[0:v]crop=638:720, pad=640:720:0:0:green[tmp0]; [1:v]crop=638:720, pad=640:720:2:0:green[tmp1]; [tmp0][tmp1]hstack[v]; anullsrc[silent1]; anullsrc[silent2]; [silent1][silent2]amerge=inputs=2[a]" -map [v] -map [a] -ac 2 -y o.mp4
[16:50:46 CEST] <furq> just get rid of everything after [silent1]
[16:51:27 CEST] <furq> unless you actually want quadraphonic output
[16:51:59 CEST] <furq> and yeah add -shortest as an output option to stop it from running forever
[16:53:05 CEST] <isabella> furq https://www.cssfilters.co/ how to do the 'Lofi' filter with eq, please?
[16:53:22 CEST] <furq> 1.5:1:1.1
[16:53:50 CEST] <furq> it's probably not going to match exactly fwiw
[16:55:28 CEST] <isabella> furq i get all white.
[16:56:43 CEST] <isabella> ffmpeg -i video1.mp4 -vf "eq=1.5:1:1.1" video4.1.mp4
[16:57:03 CEST] <isabella> is it because the input video is mp4?
[16:57:15 CEST] <furq> oh nvm brightness is -1 to 1
[16:57:26 CEST] <furq> it should be 1.5:0:1.1 then
[16:57:52 CEST] <furq> but like i said the values won't map exactly between the sliders on that page and the eq filter
[16:57:55 CEST] <furq> you'll need to mess around with it a bit
[16:59:19 CEST] <isabella> ok
[16:59:22 CEST] <furq> eq and colorchannelmixer will get you most of the effects on there though
[16:59:24 CEST] <isabella> how to do the sepia?
[16:59:45 CEST] <furq> !filter colorchannelmixer
[16:59:45 CEST] <nfobot> furq: http://ffmpeg.org/ffmpeg-filters.html#colorchannelmixer
[16:59:47 CEST] <furq> see the examples on there
[17:01:15 CEST] <furq> there's also hue for hue rotate, and boxblur (and a million other blur filters) for blur
[17:02:03 CEST] <isabella> i don't understand it colorchannelmixer=.393:.769:.189:0:.349:.686:.168:0:.272:.534:.131
[17:02:19 CEST] <isabella> what are those values?
[17:04:53 CEST] <isabella> nfobot furq could you please explain it to me?
[17:05:45 CEST] <isabella> what does .393 correpond to?
[17:09:59 CEST] <isabella> ah. ok i understand.
[17:10:32 CEST] <isabella> <feColorMatrix type="matrix" values=" (0.393 + 0.607 * [1 - amount]) (0.769 - 0.769 * [1 - amount]) (0.189 - 0.189 * [1 - amount]) 0 0 (0.349 - 0.349 * [1 - amount]) (0.686 + 0.314 * [1 - amount]) (0.168 - 0.168 * [1 - amount]) 0 0 (0.272 - 0.272 * [1 - amount]) (0.534 - 0.534 * [1 - amount]) (0.131 + 0.869 * [1 - amount]) 0 0 0 0 0 1 0"/>
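So the mapping: colorchannelmixer's values are the same matrix rows read left to right (rr:rg:rb:ra, then gr:gg:gb:ga, then br:bg:bb), with the feColorMatrix offset column dropped. A sketch using amount=1, i.e. exactly the sepia coefficients quoted above (filenames are placeholders):

```shell
# Each output channel is a weighted sum of the input channels,
# e.g. output_R = .393*R + .769*G + .189*B  (first row below).
SEPIA='colorchannelmixer=.393:.769:.189:0:.349:.686:.168:0:.272:.534:.131'
# -pix_fmt yuv420p keeps the re-encoded result playable in most players.
echo "ffmpeg -i in.mp4 -vf \"$SEPIA\" -pix_fmt yuv420p out.mp4"
```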
[17:14:33 CEST] <rhinoTMT> Hello good people. I'm trying to join two image sequences together in two different folders. I see that one way to do it is to create a separate text file containing the paths, i.e. "file './intro/frame_%04d.png'". I don't want my image sequences to start at 0, though. Is there any way to incorporate the -start_number and -vframes options into that text file before the concat occurs?
[17:17:29 CEST] <relaxed> rhinoTMT: might be easier to write a script and `ln` exactly what you want to a temp dir
[17:18:46 CEST] <cryptodechange> I tried aq-mode 3 on anime
[17:18:56 CEST] <rhinoTMT> relaxed: thanks. I think I might be over my head here, so I'm just going to bulk rename the sequences I need.
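relaxed's ln idea as a sketch: link both sequences into one temp dir with continuous numbering, then feed that single sequence to ffmpeg. The directory and frame names are invented, and the setup step creates placeholder files so the demo runs standalone:

```shell
# Demo setup: two short sequences, each numbered from 0 (empty
# placeholder files stand in for real frames).
mkdir -p intro main joined
for n in 0 1 2; do : > "intro/frame_$n.png"; : > "main/frame_$n.png"; done

# Re-link intro then main into one continuously numbered sequence.
i=0
for f in intro/frame_*.png main/frame_*.png; do
  ln -sf "$PWD/$f" "$(printf 'joined/frame_%06d.png' "$i")"
  i=$((i+1))
done
echo "ffmpeg -framerate 24 -i joined/frame_%06d.png -c:v libx264 out.mp4"
```

Symlinks avoid copying the frames, and the joined numbering starts at 0 so no -start_number is needed.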
[17:18:57 CEST] <cryptodechange> Not sure if it's the aq, but the colours seem to be a bit different
[17:20:52 CEST] <isabella> furq ffmpeg -i video4.mp4 -vf "colorchannelmixer=.393:.769:.189:0:.349:.686:.168:0:.272:.534:.131" video4.1.mp4
[17:21:07 CEST] <isabella> it does not work with this command
[17:21:44 CEST] <isabella> how can i pass the sepia value?
[17:24:48 CEST] <isabella> nfobot ?
[17:46:17 CEST] <ubitux> isabella: -vf curves=vintage? :p
[17:48:34 CEST] <lindylex> furq : ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[0:v]crop=638:720, pad=640:720:0:0:green[tmp0]; [1:v]crop=638:720, pad=640:720:2:0:green[tmp1]; [tmp0][tmp1]hstack[v]; anullsrc[silent1]" -map [v] -map [silent1] -ac 2 -shortest -y o.mp4
[17:48:34 CEST] <lindylex> this runs forever; it looks like it is in a loop.
[18:04:23 CEST] <isabella> ubitux thx.
[18:07:32 CEST] <cryptodechange> what can cause a somewhat overall color palette/tone shift in anime encodes?
[18:08:59 CEST] <cryptodechange> using deblock=0 aq=3:0.8 psy-rd=0.7:0.1
[18:31:54 CEST] <dystopia> did you have "-tune animation" set cryptodechange?
[18:36:57 CEST] <kepstin> cryptodechange: probably colour matrix issues, x264 preserves colours quite well in general
[18:51:15 CEST] <lindylex> This runs forever. I need to place these two videos side by side; they have no audio and it is creating a problem. ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[0:v]crop=638:720, pad=640:720:0:0:green[tmp0]; [1:v]crop=638:720, pad=640:720:2:0:green[tmp1]; [tmp0][tmp1]hstack[v]; anullsrc[silent1]" -map [v] -map [silent1] -ac 2 -shortest -y o.mp4
[19:15:51 CEST] <Johnjay> Hrm, when I run volumedetect on these two audio files, both are max 0dB
[19:16:08 CEST] <thebombzen> lindylex: it's probably a buffering issue that's making it take so long. I'd recommend using a color= video source to create a 4x720 green video
[19:16:19 CEST] <Johnjay> the quieter one is -20db mean vol and the louder one is -9.8db
[19:16:23 CEST] <thebombzen> and then hstack the three videos together. it's probably faster than padding it and you might not run into buffering issues
[19:16:34 CEST] <thebombzen> when you have enormous filtergraphs it can hang
[19:16:41 CEST] <Johnjay> how do I go about normalizing the audio in this case?
[19:17:32 CEST] <thebombzen> Johnjay: are you looking for the loudnorm filter?
[19:17:36 CEST] <thebombzen> !filter loudnorm
[19:17:36 CEST] <nfobot> thebombzen: http://ffmpeg.org/ffmpeg-filters.html#loudnorm
[19:18:20 CEST] <Johnjay> possibly. I also need to understand what the hell I'm doing as well
[19:18:32 CEST] <thebombzen> I have no idea how to use that, but that does do loudness normalization
[19:18:45 CEST] <Johnjay> it says it resamples to 192Mhz. Is that a problem?
[19:18:52 CEST] <Johnjay> er Khz sorry
[19:19:30 CEST] <thebombzen> well you can just add an aresample=48k afterward
[19:19:50 CEST] <thebombzen> but also you could consider acompressor
[19:20:17 CEST] <thebombzen> also checkout dynaudnorm
[19:20:19 CEST] <thebombzen> !filter dynaudnorm
[19:20:19 CEST] <nfobot> thebombzen: http://ffmpeg.org/ffmpeg-filters.html#dynaudnorm
[19:21:55 CEST] <Johnjay> no such filter 'loudnorm'
[19:21:56 CEST] <Johnjay> huh?
[19:22:49 CEST] <Johnjay> does the windows port not have it?
[19:24:47 CEST] <Johnjay> well it's not in ffmpeg -filters so...
[19:25:07 CEST] <thebombzen> it might be recent
[19:26:05 CEST] <cryptodechange> dystopia: I used the animation settings from -fullhelp, and dialed it a bit closer to that of film.
[19:26:15 CEST] <cryptodechange> kepstin: how would I identify/rectify the colour matrix issues?
[19:26:30 CEST] <Johnjay> well at any rate I think I manually solved it with brute force with -af volume=5.0
[19:27:10 CEST] <Johnjay> but basically i have two sets of audio files and one is much louder than the other
[19:27:23 CEST] <Johnjay> not sure what the best way is to fix that in general without that loudnorm thing you said
[19:28:16 CEST] <thebombzen> Johnjay: loudnorm is recent
[19:28:29 CEST] <thebombzen> grab a recent build from Zeranoe https://ffmpeg.zeranoe.com/builds/ and it should be there
[19:28:45 CEST] <Johnjay> ok but can you explain in layman's terms if it's even what I need?
[19:29:14 CEST] <Johnjay> I'm listening to the volume=5.0 file now and it sounds a little tinny
[19:29:31 CEST] <thebombzen> that's because the volume filter just multiplies all the samples by 5.0
[19:30:04 CEST] <Johnjay> loudnorm will do it more intelligently i guess?
[19:30:59 CEST] <thebombzen> you probably are looking for acompressor though
[19:32:15 CEST] <thebombzen> !filter acompressor
[19:32:15 CEST] <nfobot> thebombzen: http://ffmpeg.org/ffmpeg-filters.html#acompressor
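The usual loudnorm recipe is two passes: measure first, then normalize with the measured values. The measured_* numbers below are placeholders; in practice you copy them from the JSON the first pass prints:

```shell
# Pass 1: analyze only (-f null - discards output), printing the
# measured loudness statistics as JSON at the end of the run.
PASS1='ffmpeg -i in.wav -af loudnorm=I=-16:TP=-1.5:LRA=11:print_format=json -f null -'
# Pass 2: feed the measured values back so loudnorm can apply a linear
# gain, then resample down again (loudnorm works internally at 192 kHz).
PASS2='loudnorm=I=-16:TP=-1.5:LRA=11:measured_I=-20.0:measured_TP=-5.0:measured_LRA=6.0:measured_thresh=-30.0:linear=true,aresample=48000'
echo "$PASS1"
echo "ffmpeg -i in.wav -af \"$PASS2\" out.wav"
```

Running both files through the same target I/TP/LRA is what evens out a quiet file against a loud one.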
[19:32:46 CEST] <Johnjay> apparently my ffmpeg is located in a folder called "FFmpeg for Audacity"
[19:33:00 CEST] <thebombzen> the FFmpeg for Audacity is very old
[19:33:22 CEST] <Johnjay> i thought i installed it separately though. wtf
[19:33:29 CEST] <thebombzen> download a build from Zeranoe and put it in something in your path (I prefer ~/bin)
[19:33:44 CEST] <Johnjay> well i'm on windows
[19:33:57 CEST] <Johnjay> I've been using C:\Users\Public\bin for that
[19:34:02 CEST] <thebombzen> sure, okay
[19:34:06 CEST] <Johnjay> no idea what that is but it seems to work
[19:34:10 CEST] <stephan_> what's the size of the buffer allocated with av_samples_alloc ?
[19:35:21 CEST] <Johnjay> google tells me it is what it is, a folder for all users to share files with each other
[19:41:26 CEST] <Diag> Wow guys, the epa is looking to outlaw R718(dihydrogenmonoxide) refrigeration systems by 2020 because of how harmful it is to the environment.
[19:50:01 CEST] <redrabbit> you old troll
[20:03:11 CEST] <lindylex> thebombzen : it's probably a buffering issue that's making it take so long. I'd recommend using a color= video source to create a 4x720 green video << This and then overlay it?
[20:03:27 CEST] <thebombzen> no, don't use overlay
[20:03:41 CEST] <lindylex> I have no idea what to do.
[20:04:08 CEST] <lindylex> It is balking because of the missing audio. It works well when the video has audio.
[20:04:24 CEST] <thebombzen> you said it's going forever
[20:04:35 CEST] <thebombzen> do you mean it's hanging? or do you mean it just won't stop encoding null audio?
[20:04:56 CEST] <lindylex> Wont stop encoding I believe.
[20:05:03 CEST] <thebombzen> oh, then you have a different issue
[20:05:10 CEST] <thebombzen> but -shortest should fix that, though.
[20:05:52 CEST] <thebombzen> emphasis on complete console output (as you already pasted your command)
[20:06:04 CEST] <pzich> if necessary, it's also pretty easy to add a silent audio track
[20:06:19 CEST] <pzich> -shortest sounds like it should fix it based on what you've said though
[20:06:20 CEST] <thebombzen> that's what it looks like they're doing with anullsrc
[20:06:40 CEST] <thebombzen> although if your goal is to add silence, you should probably be using aevalsrc=0, rather than anullsrc
[20:06:44 CEST] <thebombzen> because anullsrc doesn't zero out memory
[20:07:00 CEST] <thebombzen> unless I'm wrong about that, it just allocates and passes uninitialized data
[20:07:03 CEST] <pzich> I think I missed the pasted command, is it pretty far back?
[20:07:19 CEST] <thebombzen> it was: ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[0:v]crop=638:720, pad=640:720:0:0:green[tmp0]; [1:v]crop=638:720, pad=640:720:2:0:green[tmp1]; [tmp0][tmp1]hstack[v]; anullsrc[silent1]" -map [v] -map [silent1] -ac 2 -shortest -y o.mp4
[20:07:20 CEST] <lindylex> https://pastebin.com/wy5XQ74y
[20:07:32 CEST] <thebombzen> lindylex: emphasis on complete console output
[20:08:10 CEST] <pzich> incase it's unclear: run it and copy everything that ffmpeg spits out too
[20:09:17 CEST] <lindylex> https://pastebin.com/qtL9QsAR
[20:10:18 CEST] <pzich> was it sitting at 799 for a while before you ^C-ed?
[20:11:03 CEST] <lindylex> It was working and I had to ctrl-c to stop it.
[20:11:46 CEST] <ianbytchek> greetings. can anyone tell me what AVMEDIA_TYPE_NB means, the NB abbreviation in particular? the documentation doesn't say anything about this. it's not narrowband, is it?
[20:11:51 CEST] <pzich> and this works when both files have audio, but not one? or neither?
[20:12:49 CEST] <lindylex> thebombzen : This is correct >> ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[0:v]crop=638:720, pad=640:720:0:0:green[tmp0]; [1:v]crop=638:720, pad=640:720:2:0:green[tmp1]; [tmp0][tmp1]hstack[v]; anullsrc[silent1]" -map [v] -map [silent1] -ac 2 -shortest -y o.mp4
[20:14:14 CEST] <lindylex> thebombzen : The solution you were suggesting was to dynamically create a video with those dimensions and then stack the three videos horizontally.
[20:14:32 CEST] <pzich> ianbytchek: it's hard to say for sure, but based on other uses of _NB and _nb, it may indicate the number of occurences or type of something? https://github.com/FFmpeg/FFmpeg/blob/183fd30e0f6fdc762fd955a24cfc7e6a49e1055c/libavformat/segment.c
[20:15:45 CEST] <ianbytchek> pzich: interesting… :D
[20:16:42 CEST] <thebombzen> lindylex: ignore what I said before, I had misunderstood your problem
[20:16:51 CEST] <lindylex> Got it.
[20:17:08 CEST] <thebombzen> either way, what happens if you don't use the anullsrc and don't use -map [silent1]
[20:17:10 CEST] <pzich> ianbytchek: yeah, I'm also seeing functions like "av_get_channel_layout_nb_channels", which "Return the number of channels in the channel layout." https://www.ffmpeg.org/doxygen/2.2/group__lavu__audio.html#gac95619abeb32e4a3daa18e3ff3419380
[20:17:19 CEST] <thebombzen> that is, what happens if you don't add an extra audio track?
[20:18:19 CEST] <pzich> it seems like with -shortest added it should work with or without audio tracks
[20:18:27 CEST] <lindylex> Let me try to construct this.
[20:22:23 CEST] <lindylex> thebombzen : I hate my life. What you said works : ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[0:v]crop=638:720, pad=640:720:0:0:green[tmp0]; [1:v]crop=638:720, pad=640:720:2:0:green[tmp1]; [tmp0][tmp1]hstack[v]" -map [v] -y o.mp4
[20:22:34 CEST] <pzich> yup
[20:22:38 CEST] <pzich> you didn't want the audio, did you?
[20:22:59 CEST] <thebombzen> what I find strange is that -shortest did nothing
[20:23:15 CEST] <thebombzen> perhaps it's a bug in FFmpeg that has been fixed since last may? seems recent for a bug like that though
[20:23:26 CEST] <pzich> I see that hstack has its own shortest flag? maybe that needs to be set? https://ffmpeg.org/ffmpeg-filters.html#toc-hstack
[20:23:35 CEST] <thebombzen> hstack does, but the audio was the issue
[20:23:36 CEST] <pzich> err https://ffmpeg.org/ffmpeg-filters.html#hstack
[20:23:37 CEST] <thebombzen> not one of the videos
[20:24:17 CEST] <thebombzen> The only reason -shortest wouldn't do anything would be if the video was also running forever, but it wasn't, which is why I asked that question
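[editor's note] A common workaround for the runaway anullsrc problem discussed above is to supply the silence as a separate lavfi input rather than inside -filter_complex, so -shortest can trim it to the video's length. A minimal sketch; the testsrc-generated file is a hypothetical stand-in for a real video-only clip:

```shell
# Make a 1-second video-only input as a stand-in for a real clip
ffmpeg -y -v error -f lavfi -i testsrc=duration=1:size=64x64:rate=10 silent.mp4

# Feed anullsrc as its own input instead of inside -filter_complex;
# -shortest then stops the otherwise-infinite silence at the video's end
ffmpeg -y -v error -i silent.mp4 \
  -f lavfi -i anullsrc=channel_layout=stereo:sample_rate=44100 \
  -map 0:v -map 1:a -c:v copy -c:a aac -shortest padded.mp4
```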
[20:24:19 CEST] <pzich> do you need the audio to be mixed? or passed through at all? if not, just leave it off, way easier
[20:25:08 CEST] <lindylex> pzich : I am writing a Python wrapper around this and I needed to handle situations where there is no audio.
[20:25:46 CEST] <pzich> right, but do you need the audio in the resulting file? if not, just don't map an audio track and it'll be ignored
[20:25:58 CEST] <pzich> or add -an
[20:27:00 CEST] <thebombzen> using -map will automatically unmap anything else. so if you're using -map [vout] or whatever you called it, you won't have audio by default
[20:27:08 CEST] <pzich> right
[20:27:40 CEST] <lindylex> I tested to see if both videos have audio. If one of them was missing audio I just sent the videos to the ffmpeg command that handles no audio track
[20:27:42 CEST] <thebombzen> if you want to preserve the audio from the left file, you can add -map 0:a? -c:a copy which will map the audio stream(s) from the left input file but only if they exist
[20:28:16 CEST] <thebombzen> using -map 0:a? will map audio tracks from input 0, if they exist. similarly, -map 1:a? will map audio stream(s) from input 1, but only if they exist
[20:28:32 CEST] <thebombzen> then you can use -c:a copy to avoid re-encoding, or you can use -c:a aac -b:a 128k
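[editor's note] thebombzen's optional-mapping suggestion, put together into one command. A sketch only; the inputs here are generated stand-ins for the real left/right clips:

```shell
# Video-only stand-ins for the real left/right clips
ffmpeg -y -v error -f lavfi -i testsrc=duration=1:size=64x64:rate=10 left.mp4
ffmpeg -y -v error -f lavfi -i testsrc=duration=1:size=64x64:rate=10 right.mp4

# 0:a? maps input 0's audio stream(s) only if they exist, so the same
# command works whether or not the left file carries an audio track
ffmpeg -y -v error -i left.mp4 -i right.mp4 \
  -filter_complex "[0:v][1:v]hstack[v]" \
  -map "[v]" -map "0:a?" -c:a copy out.mp4
```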
[20:28:33 CEST] <pzich> if anything, I'd do the opposite. If it has audio, run `ffmpeg -i in.mp4 -an -c copy out.mp4` and handle them both as files with no audio
[20:28:43 CEST] <pzich> ok bye then
[20:28:57 CEST] <thebombzen> rip
[20:53:08 CEST] <BytesBacon> Question about the -map switch. If I'm using mediainfo to look at the track information, I'm assuming that it's like an array in programming, where 0 is the first one? (do I have that right?)
[20:57:31 CEST] <DHE> BytesBacon: yes, but also keep in mind that -map supports things like 0:v for "first video stream" to assist in decision making
[20:58:05 CEST] <BytesBacon> DHE, thanks.
[20:58:52 CEST] <JEEB> if you want parseable info about the file, use ffprobe with -of json and -show_streams
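[editor's note] JEEB's ffprobe suggestion as a concrete command; the sample file is generated here only so the example is self-contained:

```shell
# A small sample file to probe
ffmpeg -y -v error -f lavfi -i testsrc=duration=1:size=64x64:rate=10 sample.mp4

# Emits one JSON object with a "streams" array; index, codec_type,
# codec_name, width/height etc. are all there for scripting
ffprobe -v error -of json -show_streams sample.mp4 > streams.json
```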
[21:00:07 CEST] <Diag> redrabbit: if you were talking to me water DOES in fact have an R number assigned XD
[21:01:47 CEST] <BytesBacon> JEEB, thank you! That'll make it even easier for me.
[21:02:38 CEST] <JEEB> 3/25
[22:16:06 CEST] <cryptodechange> is there any documentation regarding aq-mode 3 and 4? can't seem to find it on x264 fullhelp
[22:17:13 CEST] <cryptodechange> oh nvm found it
[22:33:36 CEST] <cryptodechange> Getting some banding (look at top right), and the colour tone shift > https://imgur.com/a/QWq34
[22:34:59 CEST] <cryptodechange> With the aq-strength being at 0.8, a little higher than animation, what can be done to reduce banding? if that is even banding
[22:42:05 CEST] <lindylex> Sorry about that I had some hardware failure.
[22:55:08 CEST] <thebombzen> cryptodechange: try -aq-mode:v 3
[22:55:26 CEST] <thebombzen> you should use that by default for your encodes anyway, it's a good ide
[22:55:35 CEST] <thebombzen> idea*
[22:57:09 CEST] <thebombzen> aq-mode:v 3 weights darker areas as greedier for bits, so color banding in darker areas is less of a problem. the cost is color banding in lighter areas, which is far less noticeable because of the way human eyes work
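[editor's note] The options discussed here map onto ffmpeg's libx264 wrapper like this. A sketch with a generated stand-in input; the aq values are the ones from the conversation:

```shell
# aq-mode 3 (auto-variance AQ with bias to dark frames) plus the
# aq-strength 0.8 discussed above
ffmpeg -y -v error -f lavfi -i testsrc=duration=1:size=64x64:rate=10 \
  -c:v libx264 -aq-mode:v 3 -aq-strength:v 0.8 out_aq.mkv
```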
[22:57:47 CEST] <cryptodechange> I have an aq-mode:3 in that example, unless you mean something different?
[22:58:17 CEST] <thebombzen> rip, didn't see it :O
[22:58:22 CEST] <thebombzen> ignore me, I'm being dumb right now
[22:59:26 CEST] <thebombzen> cryptodechange: in that case, try increasing psy-rd
[22:59:32 CEST] <thebombzen> perhaps to 0.8. see what happens
[22:59:41 CEST] <cryptodechange> there's so much conflicting advice when it comes to anime that deviates from the animation tune defaults
[22:59:58 CEST] <cryptodechange> People stating it softens the linework too much, etc
[23:00:01 CEST] <thebombzen> correct, the usual advice is that "tune animation is bad, don't use it"
[23:00:11 CEST] <thebombzen> it sets the psy-rd too low
[23:00:29 CEST] <thebombzen> also aq-strength=0.8 is fairly standard for high quality sources, so I would leave it at 0.8
[23:00:35 CEST] <thebombzen> you can up it if you want to preserve grain though
[23:00:44 CEST] <cryptodechange> I've used deblock -3 and aq=0.8 for most of my film streams
[23:00:55 CEST] <thebombzen> I would not use deblock -3 for anime though
[23:01:02 CEST] <thebombzen> I usually leave it at -1:-1
[23:01:02 CEST] <cryptodechange> even -1 seemed too much
[23:01:21 CEST] <thebombzen> -1 is good for anime
[23:01:24 CEST] <cryptodechange> e.g. I just downloaded some encoded anime that uses deblock 1,-1
[23:01:32 CEST] <thebombzen> ew
[23:01:39 CEST] <thebombzen> should try to keep the numbers the same
[23:01:56 CEST] <cryptodechange> aq .85 and default psy-rd 1,0
[23:02:10 CEST] <thebombzen> either way, in your case, leave aq-mode at 3, and aq-strength at 0.8 and try upping the psy-rd slightly. perhaps to 0.8, and see how that affects the artifacts in the upper right
[23:02:27 CEST] <cryptodechange> What's your opinion on the overall color?
[23:02:34 CEST] <cryptodechange> the browns specifically
[23:02:48 CEST] <cryptodechange> There's a tint of difference in both aq 1 and 3
[23:03:20 CEST] <thebombzen> it's hard for me to tell without diff.pics
[23:04:14 CEST] <thebombzen> yea, you're right. that's probably a colorspace or range issue
[23:04:22 CEST] <thebombzen> rather than an issue with x264 itself
[23:04:57 CEST] <thebombzen> make sure the colorspace stuff is correct (probably bt.709)
[23:05:22 CEST] <thebombzen> are you converting full to partial range?
[23:05:30 CEST] <cryptodechange> lemme compare on mediainfo
[23:06:00 CEST] <cryptodechange> -c:v libx264 -preset veryslow -pix_fmt yuv420p10le -profile:v high10 -level 4.1
[23:06:31 CEST] <thebombzen> don't set the level or the profile manually
[23:06:36 CEST] <thebombzen> unless you *really need* to
[23:06:48 CEST] <thebombzen> but for 10-bit video I find it hard to believe that you would really need to
[23:07:02 CEST] <thebombzen> and that's probably libswscale screwing with the conversion
[23:07:13 CEST] <thebombzen> I would try this: -vf zscale,format=yuv420p10le
[23:07:30 CEST] <cryptodechange> original = BT.709
[23:07:33 CEST] <thebombzen> you're upping to 10-bit with libswscale. I would recommend using zimg instead
[23:07:53 CEST] <cryptodechange> mediainfo for encode test doesn't have anything regarding BT
[23:08:28 CEST] <thebombzen> cryptodechange: rather than using -pix_fmt yuv420p10le, try using -vf zscale,format=yuv420p10le
[23:08:51 CEST] <cryptodechange> will run a test now
[23:08:58 CEST] <cryptodechange> ty
[23:09:13 CEST] <JEEB> also you most likely want to actually make sure you are tagging your output as the same colorspace as well
[23:09:21 CEST] <JEEB> so if BT.709 in then BT.709 out as well
[23:10:56 CEST] <cryptodechange> How would I do that JEEB?
[23:13:10 CEST] <thebombzen> cryptodechange: if you want it to be exact, you can use -vf zscale=p=709:t=709:m=709:r=full:agamma=false,format=yuv420p10le
[23:13:37 CEST] <cryptodechange> sweet christmas
[23:13:49 CEST] <cryptodechange> alright i'mma do it!
[23:13:52 CEST] <thebombzen> but I believe JEEB meant to tag the metadata
[23:13:57 CEST] <JEEB> thebombzen: that doesn't necessarily tag the output stream
[23:14:21 CEST] <JEEB> and I think if you don't override it, and the input is set and the colorspace family is the same, then the filter should keep it
[23:14:35 CEST] <thebombzen> probably
[23:14:45 CEST] <thebombzen> as for how to tag the output as 709? I don't know
[23:15:08 CEST] <JEEB> -colorspace bt709
[23:15:16 CEST] <thebombzen> ah, that does it lol
[23:15:26 CEST] <cryptodechange> so back to my original?
[23:15:35 CEST] <cryptodechange> pixfmt and add colorspace?
[23:15:56 CEST] <thebombzen> no, keep -vf zscale,format=yuv420p10le
[23:16:01 CEST] <JEEB> https://www.ffmpeg.org/ffmpeg-all.html => ctrl+F "color_primaries"
[23:16:08 CEST] <JEEB> and then all that are under it until whatchamacallit
[23:16:20 CEST] <feliwir> hey, question: when i use swr_convert_frame, the docs say that the output frame must already be initialized (with format set). The format is already set when I create my resampler, so why that additional step?
[23:16:21 CEST] <JEEB> until color_range are relevant
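[editor's note] Tagging the output as BT.709 per JEEB's pointers. A sketch with a generated stand-in input; these flags only label the stream metadata, they do not convert anything:

```shell
# -colorspace / -color_primaries / -color_trc write the BT.709 tags
# into the encoded stream without touching the pixels
ffmpeg -y -v error -f lavfi -i testsrc=duration=1:size=64x64:rate=10 \
  -c:v libx264 -colorspace bt709 -color_primaries bt709 -color_trc bt709 \
  tagged.mkv
```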
[23:16:36 CEST] <JEEB> feliwir: just for the record when I was doing resampling I just used avfilter
[23:16:39 CEST] <thebombzen> cryptodechange: what -pix_fmt does is automatically add "-vf scale,format=yuv420p10le", but using -vf zscale,format=yuv420p10le should help
[23:16:47 CEST] <JEEB> since I could just give it avframes and om nom nom
[23:16:56 CEST] <thebombzen> aresample filter yea
[23:16:56 CEST] <JEEB> also avfilter let me get N samples
[23:17:14 CEST] <JEEB> which was very useful when a decoder returned a different amount of samples than an encoder wants
[23:17:21 CEST] <JEEB> which is a real issue
[23:17:29 CEST] <feliwir> hm, so there are 2 libraries to do the same thing?
[23:17:40 CEST] <JEEB> no, underneath avfilter uses swresample
[23:17:46 CEST] <JEEB> but it offers a more "convenient" interface
[23:17:51 CEST] <JEEB> with the filtering
[23:18:10 CEST] <JEEB> you create the audio filter chain and push/pull to/from it
[23:23:30 CEST] <cryptodechange> hopefully once I've tuned this right, it will be a one-size-fits-all for all my anime encodes
[23:23:34 CEST] <Mista_D> vdpau advice needed, can't use Nvidia in CentOS to decode.
[23:23:45 CEST] <cryptodechange> apart from the super grainy 'remastered' blurays...
[23:25:04 CEST] <Mista_D> https://pastebin.ca/3880077 VDPAU errors
[23:27:10 CEST] <BtbN> Is h264_vdpau even a thing? oO
[23:29:37 CEST] <Mista_D> BtbN: apparently not... )
[23:30:32 CEST] <BtbN> Well, it does not fail, so it has to exist
[23:30:57 CEST] <BtbN> try whether the vdpau -hwaccel just works. And if not, it might just not be wired up in ffmpeg.c
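[editor's note] A way to check what BtbN describes: list the hwaccels this ffmpeg build was compiled with, then try the generic hwaccel path (input name hypothetical, and actual decoding requires suitable NVIDIA/VDPAU drivers):

```shell
# Which hwaccels were compiled into this build? vdpau should be listed
# if it is available at all
ffmpeg -hide_banner -hwaccels > hwaccels.txt
cat hwaccels.txt

# If vdpau appears, hardware decoding is requested like this:
#   ffmpeg -hwaccel vdpau -i input.mp4 -f null -
```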
[23:39:19 CEST] <cryptodechange> thebombzen a bit of an improvement https://imgur.com/a/VjTaC
[23:40:07 CEST] <cryptodechange> but even with the colorspace parameter there is still a tonal shift
[23:40:14 CEST] <CoreX> never seen somebody spend so much time wanting to get the colours right
[23:40:40 CEST] <JEEB> then you have not seen the mpv opengl renderer guy
[23:40:58 CEST] <CoreX> OCD about it i bet
[23:41:25 CEST] <JEEB> well, you just want to get things right after you've spent enough time reading the specifications
[23:41:28 CEST] <JEEB> lol
[23:41:45 CEST] <cryptodechange> this has ruined my life
[23:41:55 CEST] <cryptodechange> honestly, it started with me just enjoying content I downloaded
[23:42:14 CEST] <cryptodechange> then encoding my own media to try and get a consistent quality
[23:42:25 CEST] <cryptodechange> then buying my own blurays and here I am
[23:42:46 CEST] <cryptodechange> can't binge watch anything because I deem the quality not to my standards :D
[23:43:37 CEST] <JEEB> if I watch I don't encode
[23:43:44 CEST] <JEEB> after I've watched I can encode
[23:43:50 CEST] <JEEB> (if needed)
[23:44:18 CEST] <cryptodechange> Yeah, I am happy just storing raw remuxes, but I travel a lot and use plex
[23:44:47 CEST] <cryptodechange> Plex uses ffmpeg's veryfast preset if I'm not mistaken, which has a noticeable quality degradation vs. encoding it myself at that bitrate or thereabouts
[23:45:41 CEST] <cryptodechange> I'm not too fussed about getting the colours perfect, but flicking back and forth between the sample images (which I normally do for films, etc.), there's quite a noticeable change in the overall area
[23:45:58 CEST] <cryptodechange> I'm more concerned what causes it if there's something I've missed
[23:46:01 CEST] <cryptodechange> about*
[23:47:01 CEST] <JEEB> what are you utilizing to check the clip btw?
[23:48:24 CEST] <cryptodechange> VLC, stepping frames with E and taking snapshots with shift+S
[23:48:31 CEST] <JEEB> I recommend mpv
[23:48:32 CEST] <thebombzen> http://mpv.io/
[23:48:37 CEST] <thebombzen> lol you ninjad me on that one
[23:49:17 CEST] <cryptodechange> That being said, the original was played on a network drive whilst I'm connected to my VPN as it's ~4gb
[23:49:37 CEST] <cryptodechange> Not sure if that would affect anything
[23:49:44 CEST] <cryptodechange> switching to mpv naow
[23:49:48 CEST] <thebombzen> mpv has a cache feature so you should be fine
[23:50:35 CEST] <cryptodechange> will resample both images
[23:51:16 CEST] <JEEB> also you should always check if swscale is stabbing you in the back
[23:51:22 CEST] <JEEB> I think -v verbose does output that
[23:51:33 CEST] <JEEB> (hopefully; I remember it should, probably for the filter chains)
[23:52:13 CEST] <cryptodechange> I also did some comparisons with psy-trellis
[23:53:01 CEST] <cryptodechange> 5 tests between 0 to 0.2, it ended up being 'eeny meeny miny moe'
[23:54:08 CEST] <cryptodechange> as there was only subtle variations in the artifacts and grains
[23:55:53 CEST] <cryptodechange> In relation to grains, apart from denoising filters, what should I focus on with much grainier sources?
[23:56:23 CEST] <redrabbit> deblock
[23:56:52 CEST] <cryptodechange> with my 'anime preset' at deblock -1, would you recommend 1 instead?
[23:57:03 CEST] <cryptodechange> I suppose a higher AQ too? possibly 1?
[23:57:36 CEST] <redrabbit> there's a preset for grain
[23:57:58 CEST] <redrabbit> lower deblock values to preserve grain
[23:58:25 CEST] <redrabbit> if you put enough bitrate of course
[23:58:34 CEST] <Mista_D> anyway to improve FFmpeg's efficiency in terms of using more cores with libx264?
[23:58:49 CEST] <cryptodechange> With grain, I suppose 2-pass would be better than CRF (CRF is what I normally use)
[23:59:18 CEST] <notdaniel> Mista_D, run 3 instances of ffmpeg and split your inputs between them ;)
[23:59:26 CEST] <cryptodechange> I wouldn't mind flattening the grain a little if it doesn't hurt the quality too much
[23:59:32 CEST] <BtbN> two pass encoding is only good if you target a specific file size, nothing else
[23:59:36 CEST] <Mista_D> notdaniel: concat is not too accurate there...
[23:59:59 CEST] <cryptodechange> E.g. with this test encode of One Punch Man I had an average bitrate of 8.5mbps
[00:00:00 CEST] --- Sat Sep 30 2017