[Ffmpeg-devel-irc] ffmpeg.log.20190730

burek burek021 at gmail.com
Wed Jul 31 03:05:05 EEST 2019


[03:59:07 CEST] <renatosilva> hi, how can I make the audio stream the first one when mapping streams?
[03:59:44 CEST] <renatosilva> something like -map 0:a -map 0:rest
[04:07:06 CEST] <DHE> I think you're gonna have to actually check the stream ahead of time and manually map all but the one you want.
[04:15:38 CEST] <renatosilva> ok thanks
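A minimal sketch of the manual mapping DHE describes, assuming the goal is audio first, then video, then any subtitles (file names are hypothetical; there is no 0:rest selector, but a trailing ? makes a mapping optional so it won't fail if the stream is absent):

    ffmpeg -i in.mp4 -map 0:a -map 0:v -map 0:s? -c copy out.mp4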
[05:22:10 CEST] <ossifrage> Wow, the live555 guy is a bit of a dick
[08:21:24 CEST] <kab0m> hi @ all
[08:23:28 CEST] <kab0m> when i run "ffmpeg -i in.mp4 -c copy -c:s srt out.mkv" the styling of the subtitles gets broken and the text is very small and has a different font...how does one copy an mp4 with subtitles to mkv and keep the subtitle styling?
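A hedged sketch, not a confirmed fix: mp4 subtitles are usually mov_text (tx3g), and converting them to SRT discards most styling because SRT supports very little. Converting to ASS instead may preserve more of what the mov_text decoder recovers, though rendering still depends on the player:

    ffmpeg -i in.mp4 -c copy -c:s ass out.mkv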
[09:40:55 CEST] <suryajagtap> how to stream a local video file using ffmpeg and rtsp?
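ffmpeg has an RTSP muxer but no built-in RTSP server, so the usual approach is to push the file to a separate server. A sketch, assuming an RTSP server (e.g. mediamtx) is already listening on port 8554 and the file name and path are hypothetical; -re paces reading at the native frame rate so the stream plays in real time:

    ffmpeg -re -i input.mp4 -c copy -f rtsp rtsp://localhost:8554/mystream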
[10:42:08 CEST] <Hackerpcs> is it possible to hide just metadata from output?
[10:44:32 CEST] <cehoyos> From console output or from output file?
[10:44:41 CEST] <Hackerpcs> just console
[10:44:51 CEST] <cehoyos> Not that I know of
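There is indeed no switch that hides only the metadata dump, but the whole informational printout can be suppressed while keeping the progress line. A sketch with hypothetical file names; -hide_banner drops the build banner, -v error suppresses the stream/metadata dump, and -stats keeps the progress line:

    ffmpeg -hide_banner -v error -stats -i input.mp4 output.mkv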
[10:55:28 CEST] <koz_> I read this page for VP9 encoding: https://developers.google.com/media/vp9/settings/vod/, and for what I have, they recommend these settings: http://paste.debian.net/1093545/ . However, I have a GPU with hardware encoding for VP9. Is it possible to use the vp9_vaapi with these settings? Do I have to change how these calls are spelled?
[10:55:33 CEST] <koz_> (aside from the obvious)
[10:58:14 CEST] <koz_> s/the vp9_vaapi/the vp9_vaapi encoder/
[11:00:46 CEST] <cehoyos> The settings you pasted are specifically for libvpx encoding. If you want to use hardware encoding, they do not apply
[11:01:38 CEST] <koz_> cehoyos: Let me rephrase my question then. What settings should I feed to vp9_vaapi to get the same result, in terms of what they describe here: https://developers.google.com/media/vp9/settings/vod/ ?
[11:01:53 CEST] <koz_> (assuming that my goal is 1920x1080 at 60fps)
[11:02:05 CEST] <cehoyos> -s 1920x1080 -r 60
[11:02:52 CEST] <cehoyos> Note that nothing in your pasted command enforces the frame rate (neither 60 fps nor anything else)
[11:03:29 CEST] <koz_> I'm just going by what they say in the link.
[11:04:13 CEST] <cehoyos> Just use the bitrate mentioned in the table and test if your hardware encoder can reach it.
[11:05:15 CEST] <koz_> So the rest of the stuff they mention (tile columns, threads, min and max bitrate, etc.) doesn't apply?
[11:05:45 CEST] <cehoyos> threads make little sense with hardware encoding
[11:06:26 CEST] <cehoyos> I don't know if tiles are supported by your hardware encoder, it should support min framerate (but you should not set it imo), you will find out if max bitrate is supported or not
[11:06:42 CEST] <koz_> Why do you think setting the minimum bitrate is a bad plan?
[11:06:51 CEST] <cehoyos> Because it reduces overall quality
[11:07:18 CEST] <koz_> Wait, _reduces_? Could you explain why? Not questioning you, but I think I don't understand bitrates properly.
[11:07:22 CEST] <cehoyos> You spend bits on a scene that does not need the bits, and you cannot spend them elsewhere
[11:08:26 CEST] <cehoyos> (Just ignore it and test if your hardware encoder supports two-pass bitrate encoding at all, it is possible that this does not work, testing will tell you)
[11:08:35 CEST] <koz_> cehoyos: OK, thanks.
[11:09:40 CEST] <cehoyos> If it does not work, you either have to use one-pass constant bitrate (which definitely has worse quality for real world content) or use constant quality where it is difficult to reach a target bitrate
[11:10:04 CEST] <koz_> I've got an Intel GPU and an AMD GPU both.
[11:10:14 CEST] <koz_> Wait never mind, my AMD can't hardware encode VP9.
[11:10:23 CEST] <koz_> So it's an Intel.
[11:10:30 CEST] <koz_> (if that tells you anything)
[11:10:43 CEST] <koz_> I'm guessing two-pass encoding is a driver thing though?
[11:32:23 CEST] <cehoyos> I suspect support in the whole infrastructure is necessary.
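A hedged single-pass sketch of the vp9_vaapi invocation discussed above, taking the target and maximum bitrate from the 1080p60 row of the linked Google table; the device path is a common default on Intel systems, file names are hypothetical, and whether -maxrate is honored depends on the driver:

    ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mp4 \
        -vf 'format=nv12,hwupload' -c:v vp9_vaapi \
        -b:v 3000k -maxrate 4350k output.webm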
[12:02:42 CEST] <kab0m> ffmpeg -i input.mp4 -c copy -c:s webvtt output.mkv  How do i tell ffmpeg that the subtitle stream should be disabled by default when opening the mkv file?
[12:06:32 CEST] <kab0m> i could only find the -disposition parameter, but i didn't find a way to clear the default flag on the subtitle
[12:09:07 CEST] <kab0m> ok i think i got it... -disposition:s:0 0 was the trick
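Putting kab0m's finding into one command: a disposition value of 0 clears all flags (including default) on the first subtitle stream:

    ffmpeg -i input.mp4 -c copy -c:s webvtt -disposition:s:0 0 output.mkv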
[15:08:29 CEST] <Ua-Jared> Hey all, I have a quick question about general ffmpeg use (from the command line). So I have an IP camera that streams an H.264 encoded stream. I've used OpenCV (with the Java 8 wrappers) to successfully grab the stream from the camera, frame by frame, and paint it on screen. This works well, but unfortunately when I run my code, it doesn't use the
[15:08:29 CEST] <Ua-Jared>  GPU at all. And I know that OpenCV (and like everything else, haha) uses ffmpeg in the background.
[15:08:30 CEST] <Ua-Jared> Is there a way I could use a GPU-accelerated ffmpeg in my Java program somehow, so that the decoding is done by the GPU-accelerated ffmpeg, but I still have access to the images in Java? I was thinking something like using ffmpeg to convert the stream into an .avi or .mp4 (in realtime), and then at the same time reading in images from that still-being-written video file. But this seems a little janky haha.
[15:08:31 CEST] <Ua-Jared> Fundamentally I just want to use gpu-accelerated ffmpeg to decode this H264 stream and read the decoded images into my Java program for further processing.
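One common pattern (a sketch, not a confirmed recipe): let a hardware-accelerated ffmpeg child process do the decoding and pipe raw frames to the Java program, which reads width*height*3 bytes per BGR frame from the child's InputStream (the camera URL here is hypothetical):

    ffmpeg -hwaccel dxva2 -i rtsp://camera/stream -f rawvideo -pix_fmt bgr24 pipe:1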
[15:24:19 CEST] <GuiToris> hey, does ffmpeg mind if it has been suspended?
[15:36:42 CEST] <DHE> not directly, but if it's getting a realtime video feed there will be dropped packets and time warping which could result in a thoroughly entertaining result after resuming
[15:37:17 CEST] <JEEB> yea, also hardware components might have weird stuff
[15:37:23 CEST] <JEEB> if you are using hw dec/enc
[16:07:48 CEST] <lofo> Hello! New ffmpeg user here. I'd like to figure out ways to debug my ffmpeg command
[16:08:39 CEST] <JEEB> -v verbose and if you need more -v debug are your friends
[16:08:55 CEST] <kepstin> lofo: in general, you want to read the complete ffmpeg output for messages, and reference the command line that you ran
[16:08:57 CEST] <lofo> i'm trying to make a video out of a series of PNGs. it works fine when i use PNGs that i produced with ffmpeg (from a video). But it fails when i try to use PNGs produced elsewhere. I can't find more info about what could cause that
[16:09:15 CEST] <kepstin> lofo: to the point where we have a bot macro in this channel to ask people for those things
[16:09:36 CEST] <JEEB> paste the command line and full failing terminal output, preferably with -v verbose onto a pastebin or gist or so
[16:09:39 CEST] <JEEB> and then link that here
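Concretely, for the PNG-sequence case a capture might look like this (the pattern and file names are hypothetical); 2> redirects the log to a file for pasting:

    ffmpeg -v verbose -framerate 25 -i frame%04d.png -c:v libx264 out.mp4 2> ffmpeg.log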
[16:13:52 CEST] <GuiToris> DHE, JEEB sorry I wasn't listening.
[16:14:02 CEST] <GuiToris> So I had to ctrl+z my encoding
[16:14:21 CEST] <GuiToris> it isn't a live stream
[16:14:40 CEST] <GuiToris> pngs to x265
[16:14:56 CEST] <DHE> well that's fine
[16:15:23 CEST] <lofo> here! output might be weird since i use mobile-ffmpeg https://pastebin.com/1vQqmJqR
[16:15:27 CEST] <GuiToris> x265 conversion is super slow and unpredictable
[16:15:30 CEST] <GuiToris> thank you for your help
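For reference, a job stopped with ctrl+z can be resumed either in the foreground or in the background:

    fg %1    # resume in the foreground
    bg %1    # or let it keep encoding in the background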
[16:16:31 CEST] <lofo> erratum: i ran it with -v debug, not -v verbose
[16:19:05 CEST] <kepstin> lofo: oh, wow, i've never seen that. when ffmpeg is calling zlib to do part of the PNG decompression, it's getting an error code back
[16:20:40 CEST] <kepstin> lofo: error code 3 appears to be "data error", which means corrupt input
[16:21:43 CEST] <lofo> i suspected the png to be corrupted so i ran pngcheck on it. i got "ff-595.png  illegal (unless recently approved) unknown, public chunk iDOT"
[16:21:58 CEST] <lofo> which is a chunk added to PNGs made using Apple's code
[16:22:31 CEST] <lofo> kepstin How did you get the info about error code 3 being a data error ?
[16:23:00 CEST] <lofo> could an unrecognized chunk make PNG decompression fail?
[16:23:26 CEST] <kepstin> i read zlib.h, since it said that was the error returned from the inflate function
[16:23:44 CEST] <kepstin> no, it just skips the chunk. this means the data inside the iDAT couldn't be decoded.
[16:23:58 CEST] <kepstin> er, IDAT
[16:24:26 CEST] <durandal_1707> lofo: what application can successfully display such PNGs?
[16:24:28 CEST] <lofo> oh ok, and inflate is part of zlib ?
[16:24:55 CEST] <lofo> durandal_1707 i can display it on a Mac and an iPhone
[16:25:06 CEST] <DHE> yes, PNGs are basically deflate-compressed (same algorithm as gzip) so ffmpeg makes use of zlib to decode/encode them
[16:25:08 CEST] <kepstin> from other knowledge, I know that one step in png compression is zlib, and so it followed that that function was probably from zlib.
[16:26:20 CEST] <lofo> oh ok. just trying to find ways to debug on my own
[16:27:11 CEST] <kepstin> either your system zlib is broken, or your png is broken :/
[16:27:21 CEST] <kepstin> given that you said you can decode other pngs, it's probably the latter.
[16:28:43 CEST] <lofo> this is a more verbose output of pngcheck https://pastebin.com/RvwBunR3
[16:29:21 CEST] <lofo> yes i can recompose a video out of PNGs if i produced the PNGs by decomposing a video (all the time using ffmpeg)
[16:29:45 CEST] <durandal_1707> lofo: apple devs like to fck with everything
[16:29:55 CEST] <lofo> durandal_1707 sure they do
[16:30:44 CEST] <kepstin> lofo: as far as I can tell, that pngcheck output is basically saying the same thing as ffmpeg - the png is corrupt and can't be decoded.
[16:31:42 CEST] <durandal_1707> hey, we will make incompatible PNGs for our people, they will probably never notice, because we can ...
[16:33:03 CEST] <lofo> thats crazy
[16:33:03 CEST] <kepstin> the iDOT thing appears to be bad, but benign (it just stores info about the screen scaling factor)
[16:33:22 CEST] <lofo> but that could make the uncompression fail altogether ?
[16:33:30 CEST] <kepstin> no, it'll be ignored
[16:33:40 CEST] <kepstin> the issue is that the actual image data in your png file is corrupt
[16:34:00 CEST] <lofo> but pngcheck won't tell me that
[16:34:07 CEST] <kepstin> pngcheck did say that
[16:34:14 CEST] <kepstin> that's what "ERRORS DETECTED in ff-595.png" means
[16:34:46 CEST] <kepstin> (it has a different last line if the image data could be decoded, which looks like "No errors detected in XXX (N chunks, ~M% compression).")
[16:34:56 CEST] <lofo> i thought it was linked to the iDOT a line above
[16:36:29 CEST] <durandal_1707> hadn't he said it works for him in other apps?
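Since macOS displays these files fine, one possible workaround (untested here) is to rewrite them with Apple's own tooling before feeding them to ffmpeg, e.g. with the sips utility that ships with macOS:

    sips -s format png ff-595.png --out ff-595-fixed.png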
[17:23:55 CEST] <Ua-Jared> Hey all, can anyone tell me why this command: ./ffmpeg -hwaccel dxva2 -threads 1 -i <streamURIforAnH264Stream> output.mp4 would only be using 2% of my GPU? I can see this in task manager, and I'm not even sure if it's faster than using no hw acceleration. I know offloading tasks to the GPU is certainly not always going to be beneficial, but I figured
[17:23:55 CEST] <Ua-Jared> H264 decoding would benefit from it :P. Here's the full command and output: https://pastebin.com/x1R2i9jB
[17:25:03 CEST] <Ua-Jared> And I'm on Windows 10, 64-Bit, with an integrated graphics card for reference. It's an Intel HD Graphics 630 with Direct3D support
[17:31:54 CEST] <jkqxz> Seems about right?  720p video will happily decode at >1000fps on that sort of decoder, so given that it's bounded by something else at only 26fps (either the encode or the input stream?) 2% GPU use is totally plausible.
[17:35:01 CEST] <Ua-Jared> Ohhhhh.... I didn't know that! That kinda makes sense. So, maybe this is a daft question, but when I run that command I get about 15% CPU usage and 2% GPU usage in task manager. Would there be any way to offload more of that to the GPU? My goal is to reduce the load on the CPU more than anything else
[17:51:25 CEST] <BtbN> Video Decoding also does not load the GPU, but the Video Decoder.
[17:55:44 CEST] <Ua-Jared> Perhaps a dumb question but... wouldn't the video decoder be in the GPU? It's at least listed under the GPU tab in task manager. So I'd think video decoding would load the GPU / would be something the GPU would be good for
[18:27:27 CEST] <DHE> Ua-Jared: typically the video decoder/encoder is a discrete part of the GPU separate from the computation/3d graphics components
[18:30:36 CEST] <Ua-Jared> Ahh, well that makes sense. I guess I'd need to check if my Intel HD Graphics 630 supports H264 decoding in its hardware to see if there'd be any real benefit to using the GPU, right?
[19:37:46 CEST] <kepstin> fwiw, 630 is kaby lake, and can do h264, hevc(+10bit), vp8, and vp9(+10bit) decoding
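Since the remaining CPU load is most likely the software encode rather than the decode, a hedged sketch of a full hardware transcode on that Intel GPU, assuming an ffmpeg build with QSV support (the stream URL is hypothetical):

    ffmpeg -hwaccel qsv -c:v h264_qsv -i rtsp://camera/stream -c:v h264_qsv -b:v 4M output.mp4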
[00:00:00 CEST] --- Wed Jul 31 2019


More information about the Ffmpeg-devel-irc mailing list