[Ffmpeg-devel-irc] ffmpeg.log.20181001
burek
burek021 at gmail.com
Tue Oct 2 03:05:02 EEST 2018
[00:04:38 CEST] <svanheulen> I'm trying to get kmsgrab+vaapi to work and I'm running into some errors, hoping someone can give me some pointers
[00:04:48 CEST] <svanheulen> I got x11grab+vaapi to work fine
[00:05:14 CEST] <svanheulen> and then I just tried kmsgrab+libx264 and I got this error: https://trac.ffmpeg.org/ticket/7375
[00:05:55 CEST] <svanheulen> then I tried the settings given in the response to that issue and got this error instead: https://pastebin.com/kBDWV0Ds
[00:09:27 CEST] <jkqxz> svanheulen: Does it work if you have DRM master?
[00:09:54 CEST] <jkqxz> libva has a weird thing where it refuses to work with a non-master fd if it isn't a render node.
[00:10:17 CEST] <jkqxz> If that's your problem, you can get around it by creating the device separately on a render node and mapping to that.
[00:11:55 CEST] <svanheulen> hmmm, well I don't even know what "DRM master" is, so no idea :/
[00:12:23 CEST] <jkqxz> Try running it not in X or Wayland (or similar).
[00:14:13 CEST] <svanheulen> ah, do i need X to be down completely?
[00:14:36 CEST] <jkqxz> Switching to a different VT with ctrl-alt-fN is sufficient.
[00:19:08 CEST] <svanheulen> yeah, that works
[00:19:28 CEST] <svanheulen> but then I can't seem to access X
[00:21:24 CEST] <jkqxz> Ok, if that works then what I said is probably your problem.
[00:22:03 CEST] <jkqxz> You should be able to work around it using something like "ffmpeg -device /dev/dri/card0 -f kmsgrab -i - -init_hw_device vaapi=v:/dev/dri/renderD128 -filter_hw_device v -vf 'hwmap,hwdownload,format=bgr0' -c:v libx264 out.mp4".
[00:23:38 CEST] <svanheulen> awesome! that works :)
[00:25:34 CEST] <jkqxz> Does using that route actually gain much? kmsgrab is really meant for cases where you keep everything on the GPU side; once you download to CPU memory it ends up doing something pretty similar to x11grab (though it can capture outside X).
[00:27:43 CEST] <svanheulen> won't pairing kmsgrab with h264_vaapi do the entire operation on the GPU? I was just trying to get kmsgrab to work at all first, now I need to test it with h264_vaapi
[00:29:15 CEST] <jkqxz> Yeah, that will. (No download in that case.)
[00:34:21 CEST] <svanheulen> and I think I have that working too :)
[00:34:50 CEST] <svanheulen> does this look right? "ffmpeg -device /dev/dri/card0 -f kmsgrab -i - -init_hw_device vaapi=v:/dev/dri/renderD128 -filter_hw_device v -vf 'hwmap=direct,scale_vaapi=w=1920:h=1080:format=nv12' -c:v h264_vaapi -bf 0 -profile:v main output.mp4"
[00:35:16 CEST] <svanheulen> it recorded so i'm guessing it's doing what i want haha
[00:37:01 CEST] <jkqxz> Yeah. You might want to set some size/quality option (bitrate or qp); the default is pretty high fixed quality so it makes large files.
[00:37:42 CEST] <svanheulen> ah yeah
[00:47:22 CEST] <svanheulen> jkqxz: thanks for all the help :)
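For reference, jkqxz's size/quality suggestion applied to the working command above might look like the following sketch; the -qp value of 25 is an arbitrary illustration (h264_vaapi also accepts -b:v for bitrate-based control):

    ffmpeg -device /dev/dri/card0 -f kmsgrab -i - \
        -init_hw_device vaapi=v:/dev/dri/renderD128 -filter_hw_device v \
        -vf 'hwmap=direct,scale_vaapi=w=1920:h=1080:format=nv12' \
        -c:v h264_vaapi -qp 25 -bf 0 -profile:v main output.mp4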
[02:10:22 CEST] <none2give> hi everyone. i'm trying to compile ffmpeg 4.0.2 under mingw32 on windows 10 x64 in order to get a minimal 32-bit build of ffmpeg. i intend to distribute it under the LGPLv2.1 license as stated on the website, so i want to compile it with all optional external libs stripped, as i will be using it to encode/decode mpeg only.
[02:10:46 CEST] <none2give> everything is configuring and compiling fine, but i don't think i entirely understand how i should be configuring it, because when i attempt to run what i've built, i get errors for missing external dependencies
[02:10:51 CEST] <none2give> (libiconv etc)
[02:11:12 CEST] <none2give> i'm assuming there's something glaring i'm missing here so any assistance would be appreciated
[02:36:35 CEST] <none2give> it's looking like this might be a mingw32 issue and not an ffmpeg issue so i will direct my question to the appropriate board/channel
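For anyone hitting the same missing-DLL errors (libiconv etc.): those typically come from autodetected external libraries, which can be turned off at configure time. A minimal configure sketch, assuming only MPEG-2 video support is wanted (the component list here is illustrative, and extra filters such as scale may need enabling depending on use):

    ./configure --disable-autodetect --disable-everything \
        --enable-protocol=file \
        --enable-demuxer=mpegvideo --enable-muxer=mpeg2video \
        --enable-decoder=mpeg2video --enable-encoder=mpeg2video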
[10:14:39 CEST] <th3_v0ice> Can ffmpeg handle multiple input files and create one hls stream from them? What I am really interested in is: does it create multiple hls muxers and then just generate the master file, or does it use one muxer for everything?
[10:36:01 CEST] <JEEB> th3_v0ice: yes, ffmpeg.c can handle multiple input files, and if you want to use the master playlist stuff in hlsenc.c then you need to use a single muxer instance
[10:37:46 CEST] <th3_v0ice> JEEB: So then the process is just opening an AVFormatContext with multiple video and audio streams and just feeding the data to it?
[10:38:48 CEST] <JEEB> which part?
[10:38:58 CEST] <JEEB> the muxing? yes, and then you have to define the map
[10:39:10 CEST] <JEEB> see the hlsenc muxer docs
[10:40:33 CEST] <th3_v0ice> I will be doing this via the API, I was just wondering whether or not AVFormatContext and the HLS muxer can take multiple video streams.
[10:40:41 CEST] <JEEB> yes
[10:40:44 CEST] <JEEB> yes they can
[10:40:49 CEST] <th3_v0ice> Cool, thanks!
[10:40:50 CEST] <JEEB> specifics are hlsenc specific
[10:41:03 CEST] <th3_v0ice> Yeah, I am looking at it right now
[10:41:45 CEST] <JEEB> if a muxer doesn't take in multiple video/audio streams or video/audio streams at all, generally it herps a derp at you at init. but I've been able to use hlsenc.c for a while and it has its kinks but kind of seemed to work
[10:43:02 CEST] <th3_v0ice> Haha, ok. Any particular kink that comes to mind of which I should be aware?
[10:51:07 CEST] <JEEB> at least with brief testing, re-using tracks in multiple profiles (like having all the MPEG-TS profiles share the same default audio track in the mux)
[10:51:33 CEST] <JEEB> and then multiple audio languages, for which I have a patch locally that I need to figure out if that's how it's meant to be used
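To make the mapping concrete: a single hlsenc instance fed from two input files, producing two variant playlists plus a master playlist, could be set up roughly like this (a sketch using the hls muxer's var_stream_map and master_pl_name options; filenames, codecs and segment length are placeholders):

    ffmpeg -i high.mp4 -i low.mp4 \
        -map 0:v -map 0:a -map 1:v -map 1:a \
        -c:v libx264 -c:a aac \
        -f hls -hls_time 6 \
        -var_stream_map 'v:0,a:0 v:1,a:1' \
        -master_pl_name master.m3u8 \
        out_%v.m3u8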
[11:06:25 CEST] <linux> durandal: for voice disguise you advised "ffmpeg -i voice.wav -f lavfi -i sine=400 -lavfi amultiply out.wav", but I get "no such filter amultiply", any idea??
[16:32:31 CEST] <trashPanda_> Hello, this is a more general video question, but if someone could direct me to a source it would be much appreciated. When setting PIDs in an MPEG-2 TS, are there "correct" id values?
[16:34:25 CEST] <JEEB> you have PAT that is hard-coded to some PID value, which then contains the PIDs for the different PMTs for different programs
[16:34:39 CEST] <JEEB> and the PMTs then contain the PIDs for the video/audio/whatever within that program
[16:35:19 CEST] <JEEB> as long as your PID selections match what the PAT and PMT advertise, you're fine
[16:36:03 CEST] <trashPanda_> Is that taken care of behind the scenes as long as I set the correct values in the individual AVStreams?
[16:37:28 CEST] <trashPanda_> but I suppose my general question was: are there "colloquial" usages of IDs, such as mapping h264 to 0x34, etc.?
[17:24:20 CEST] <JEEB> trashPanda_: yes, lavf should take care of signaling your pids
[17:24:55 CEST] <JEEB> as for the a/v/s stream identifiers those should be set by the muxer according to the spec
[17:29:42 CEST] <trashPanda_> thank you
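As an illustration of steering the PIDs from the CLI, lavf's mpegts muxer exposes options for the PMT and elementary-stream starting PIDs (the values shown match the muxer's defaults and are otherwise arbitrary choices):

    ffmpeg -i input.mp4 -c copy -f mpegts \
        -mpegts_pmt_start_pid 0x1000 -mpegts_start_pid 0x0100 out.ts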
[18:46:18 CEST] <lays147> does anyone here know about the left parent of mpeg2video?
[19:43:56 CEST] <safinaskar> how do I keep one frame out of every 9?
[19:44:35 CEST] <safinaskar> i. e. i have video with frames 0, 1, 2, 3, 4, ... . i want to output video with the following frames: 0, 9, 18, ...
[19:45:47 CEST] <ChocolateArmpits> safinaskar, between 0 and 9 there's 9 frames, but between 9 and 18 there are 8 frames. Are you sure the indexes are correct?
[19:46:52 CEST] <safinaskar> ChocolateArmpits: i am sure. between 0 and 9 there are 8 frames (not counting 0 and 9 themselves), and between 9 and 18 there are 8 frames, too (not counting 9 and 18 themselves, again)
[19:47:08 CEST] <ChocolateArmpits> oh snap
[19:47:13 CEST] <ChocolateArmpits> can't count ;-;
[19:47:38 CEST] <kepstin> safinaskar: you can either use the select filter with an expression that matches the frames you want to keep, or use the fps filter with a rounding mode that picks the frames you want.
[19:47:56 CEST] <safinaskar> kepstin: please, give me an example
[19:48:13 CEST] <kepstin> there's some examples of the select filter in the docs
[19:48:32 CEST] <safinaskar> please give me at least a command line for throwing away every second frame
[19:48:53 CEST] <kepstin> safinaskar: there's an example of keeping 1 out of every 100 frames in the docs, it would be easy to adapt
[19:49:26 CEST] <safinaskar> kepstin: where is it?
[19:49:34 CEST] <ChocolateArmpits> https://ffmpeg.org/ffmpeg-filters.html#select_002c-aselect
[19:49:47 CEST] <ChocolateArmpits> specifically https://ffmpeg.org/ffmpeg-filters.html#toc-Examples-122
[19:51:48 CEST] <safinaskar> ChocolateArmpits: kepstin: thanks
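For the archives, kepstin's suggestion adapted to this case (keep frames 0, 9, 18, ...) would look something like this sketch; -vsync vfr keeps the surviving frames from being duplicated to fill the original frame rate:

    ffmpeg -i input.mp4 -vf "select='not(mod(n,9))'" -vsync vfr output.mp4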
[21:04:20 CEST] <GuiToris> hey, vp[8-9]'s -cpu-used is equivalent to h26[4-5]'s -preset?
[21:04:44 CEST] <JEEB> libvpx, yes. somewhat. look into libvpx's documentation
[21:05:21 CEST] <GuiToris> I've read it through, I just didn't quite understand it
[21:05:50 CEST] <ChocolateArmpits> GuiToris, the parameter's function depends on the quality parameter used
[21:06:04 CEST] <kepstin> but in general, it's an adjustment for speed vs. encoding efficiency tradeoff
[21:07:01 CEST] <GuiToris> I thought I'd give it a try: ffmpeg -i input -c:v libvpx-vp9 -crf 15 -b:v 0 -deadline best -cpu-used 0 output.webm, well this encoding hasn't finished
[21:07:11 CEST] <ChocolateArmpits> cpu-used has no function for -quality best
[21:07:30 CEST] <ChocolateArmpits> For -quality good it for the most part regulates both the speed and the accuracy of ratecontrol, e.g. at -cpu-used 4-5 your ratecontrol will be a mess
[21:07:57 CEST] <ChocolateArmpits> -quality good -cpu-used 0 is comparable to -quality best but slightly faster
[21:08:16 CEST] <kepstin> GuiToris: if you're using -quality best, you're asking for the best quality - speed it up would make it not best, right? ;)
[21:08:22 CEST] <GuiToris> 'When the deadline/quality parameter is good or best, values for -cpu-used can be set between 0 and 5.'
[21:08:36 CEST] <ChocolateArmpits> for -quality realtime, -cpu-used regulates how much of the processor speed should be expended to render the video in realtime. There's a formula in the documentation for that
[21:08:42 CEST] <kepstin> (but that said, best seems to be like x264's placebo, basically not generally useful and doesn't really improve quality)
[21:09:14 CEST] <ChocolateArmpits> GuiToris, did you read this document ? https://www.webmproject.org/docs/encoder-parameters/
[21:09:34 CEST] <ChocolateArmpits> Specifically @2. Encode Quality vs. Speed
[21:10:16 CEST] <GuiToris> no I didn't, thanks ChocolateArmpits, it looks more detailed
[21:10:31 CEST] <ChocolateArmpits> yeah read that, it should clear up all the confusion
[21:11:36 CEST] <kepstin> note that google's recommendations for VOD services (youtube-like stuff) say to use -quality good and then two-pass, with -speed 4 on the first pass, and -speed 1 or 2 on the second pass.
[21:15:39 CEST] <GuiToris> ChocolateArmpits, hmmm is this statement inaccurate then? 'When the deadline/quality parameter is good or best, values for -cpu-used can be set between 0 and 5.' (source : https://trac.ffmpeg.org/wiki/Encode/VP9 ) According to webmproject.org --best has nothing to do with --cpu-used, just as you mentioned
[21:15:45 CEST] <GuiToris> ffmpeg didn't complain though
[21:16:15 CEST] <ChocolateArmpits> well it's not ffmpeg documents developing the codec so...
[21:19:26 CEST] <GuiToris> oh, it comes to mind, I have an error message. When I'd like to save an image from a video I use: ffmpeg -ss 2 -i input -framerate 1 output.png, but it says: "Could not get frame filename number 2 from pattern 'output.png': either set update or use a pattern like %03d within the filename pattern", followed by "av_interleaved_write_frame(): Invalid argument"
[21:19:40 CEST] <GuiToris> what should I do differently?
[21:19:47 CEST] <GuiToris> I only need one image
[21:20:34 CEST] <ChocolateArmpits> GuiToris, there's a -frames parameter to specify how many frames to write to the output
[21:20:39 CEST] <ChocolateArmpits> don't use -framerate
[21:21:00 CEST] <GuiToris> oh, just frames? My bad :S
[21:21:04 CEST] <ChocolateArmpits> besides, that's an input option
[21:21:38 CEST] <ChocolateArmpits> I sometimes read through ffmpeg commands, there's always something hidden
[21:22:48 CEST] <GuiToris> yes, it works well with frames, I messed it up, I must have used -frames but it was a while ago and I've forgotten it since then
[21:22:49 CEST] <GuiToris> thank you
[21:23:18 CEST] <GuiToris> back to the reading, thank you for your help :)
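For the record, the single-image grab with the fix applied (one video frame, so no filename pattern is needed):

    ffmpeg -ss 2 -i input.mp4 -frames:v 1 output.png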
[00:00:00 CEST] --- Tue Oct 2 2018