[Ffmpeg-devel-irc] ffmpeg.log.20190112

burek burek021 at gmail.com
Sun Jan 13 03:05:01 EET 2019


[00:13:49 CET] <scriptease> is there a qsv h265 encoder for ffmpeg?
[00:14:00 CET] <scriptease> ffmpeg -hwaccel qsv -c:v h265_qsv    doesnt work
[00:29:51 CET] <analogical> Hi. Does anyone know how to download all the videos from a youtube playlist?
[00:30:45 CET] <iive> analogical, `youtube-dl` has such feature
[00:31:41 CET] <iive> scriptease, is qsv the intel acceleration? is h265 encoding supported by your hardware? does h264 work?
[00:31:47 CET] <analogical> iive, I've tried youtube-dl but I don't understand how to download all the videos in the playlist
[00:32:48 CET] <scriptease> yes
[00:33:46 CET] <scriptease> i tried qsv hevc with xmedia recode
[00:33:49 CET] <scriptease> and it worked
[00:34:04 CET] <scriptease> its a new bean canyon intel nuc
[00:35:55 CET] <iive> scriptease, do you get an error? do you see the encoder in `ffmpeg -encoders | grep qsv`
[00:36:15 CET] <iive> analogical, just give the url of the playlist? just the playlist.
[00:37:55 CET] <scriptease> ok i got it
[00:38:08 CET] <scriptease> ffmpeg -hwaccel qsv -i "imput.mkv" -c:v hevc_qsv
[00:38:09 CET] <scriptease> was it
[00:38:30 CET] <iive> scriptease, great !
[00:38:35 CET] <scriptease> *input
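[Editor's note: for readers skimming the log, the command scriptease arrived at, assembled into one line. The output filename and bitrate below are invented for completeness and were not part of the exchange.]

```shell
# Quick Sync HEVC encode: hwaccel qsv decodes on the iGPU, hevc_qsv encodes.
# input/output names and the 4M bitrate are placeholders.
cmd='ffmpeg -hwaccel qsv -i input.mkv -c:v hevc_qsv -b:v 4M output.mkv'
printf '%s\n' "$cmd"
```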
[00:38:56 CET] <analogical> iive, https://www.youtube.com/watch?v=AIbzmHBw4NY&list=PLT0hfPWJS6_ssS2DrtMQ7zhvNU-11JY6i
[00:39:42 CET] <iive> analogical, i think old youtube-dl help was more useful... let me try myself. I've done it before
[00:40:31 CET] <scriptease> *damn...cuda is 4 times faster!
[00:40:51 CET] <Retr0id> When I try to use ffmpeg to download a CENC encrypted DASH stream, it a) doesn't decrypt anything b) doesn't preserve the "senc" atoms or other atoms required to decrypt later. e.g.:
[00:41:08 CET] <Retr0id> ffmpeg -i 'https://bitmovin-a.akamaihd.net/content/art-of-motion_drm/mpds/11331.mpd' -decryption_key 100b6c20940f779a4589152b57d2dacb -c copy -map 0:6 foo.mp4
[00:41:15 CET] <Retr0id> (note, that is a public testing key)
[00:41:39 CET] <Retr0id> if I run the same command, but on a single .ts file from the stream, it decrypts fine
[00:41:52 CET] <Retr0id> anyone know what I'm doing wrong?
[00:42:10 CET] <iive> scriptease, new nvidia special encoder is a beast. pro card can encode over 16 HDTV channels at once.
[00:45:56 CET] <scriptease> ?
[00:46:05 CET] <scriptease> what do i need?
[00:46:56 CET] Action: scriptease got an old gtx 960
[00:48:02 CET] <iive> analogical, yes, just giving the URL seems to work, there is however a catch. if you don't escape the '&' bash would accept it as command and drop the rest of the url, including the part with the list
[00:49:09 CET] <analogical> iive, please pase a working link
[00:49:16 CET] <analogical> paste*
[00:49:45 CET] <iive> analogical,  you can also remove the v=xxx& part entirely. youtube-dl will pick it.
[00:50:44 CET] <analogical> iive, please paste a working link
[00:51:04 CET] <iive> just a moment.
[00:51:04 CET] <analogical> so I can see what you did
[00:51:46 CET] <scriptease> iive what do you mean with "new nvidia special encoder"?
[00:52:21 CET] <scriptease> ah ic
[00:52:28 CET] <scriptease> new OBS and RTX encoder
[00:52:41 CET] <iive> analogical, `youtube-dl -s https://www.youtube.com/watch?list=PLT0hfPWJS6_ssS2DrtMQ7zhvNU-11JY6i`
[00:55:25 CET] <iive> scriptease, might be the same one you call cuda. the thing is that it doesn't use general purpose shaders, but a dedicated silicon for encoder.
[00:55:44 CET] <iive> analogical, oops... I used "-s" to avoid downloading the videos
[00:55:59 CET] <analogical> np
[00:56:08 CET] <iive> replace it with proper "-f" for the video quality you want.
[00:56:09 CET] <analogical> iive, you removed v=AIbzmHBw4NY& and now it works
[00:56:28 CET] <analogical> strange that youtube-dl doesn't understand this
[00:56:35 CET] <iive> analogical, as I said, usually the problem is bash using & as command.
[00:56:45 CET] <analogical> I'm on windows
[00:56:50 CET] <iive> oh...
[00:56:57 CET] <analogical> but the same is true here
[00:57:01 CET] <analogical> obviously
[00:58:46 CET] <analogical> iive, thanks! :D
[00:59:19 CET] <iive> analogical, i'm glad I could help.
[00:59:31 CET] <Retr0id> Sooo, anyone have advice on how to download a .mpd DASH stream while preserving the encryption metadata (senc atoms etc.)???
[01:00:03 CET] <iive> btw the syntax for -f is  "-f <video>+<audio>" in case you want better than 1280x720
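[Editor's note: putting iive's advice together — quote the URL (or keep only the list= part) so the shell does not treat '&' as an operator, and replace -s with a format selection. `bestvideo+bestaudio` is youtube-dl's standard spec for best video plus best audio.]

```shell
# Quoting the URL protects '&' from the shell; on Windows cmd.exe the
# same quoting applies. The playlist ID is the one from the log.
cmd='youtube-dl -f "bestvideo+bestaudio" "https://www.youtube.com/playlist?list=PLT0hfPWJS6_ssS2DrtMQ7zhvNU-11JY6i"'
printf '%s\n' "$cmd"
```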
[01:00:35 CET] <iive> Retr0id, unfortunately I don't have an idea. the regular helpers seem to be away.
[01:01:18 CET] <Retr0id> haha, what times of day are the regular helpers usually here?
[01:01:41 CET] <scriptease> the rtx 2060 is still expensive and i dont use the pc for gaming anymore. wouldnt make sense for me to buy such a card just for encoding. i guess there would be better hardware like xilinx cards or streaming boxes. im a bit confused about the low speed of ffmpeg qsv so far. with xmedia recode it was much much faster
[01:02:09 CET] <iive> Retr0id, if you use latest ffmpeg release you might want to fill a bug. or ask again later.
[01:02:36 CET] <iive> Retr0id, to be honest... no idea (about time)...
[01:02:52 CET] <iive> scriptease, rtx is overkill.
[01:02:57 CET] <Retr0id> well, I don't know if it's a bug or not tbh
[01:03:08 CET] <Retr0id> it's probably just me doing something wrong
[01:04:22 CET] <iive> scriptease, on nvidia site, they say maxwell (2gen) can do hevc encoding.( strangely, no decoding).
[01:04:45 CET] <scriptease> but the new encoder is only working with the rtx cards?!
[01:05:13 CET] <iive> scriptease, it's relatively new... few years.
[01:05:56 CET] <scriptease> and does ffmpeg make use of it?
[01:06:08 CET] <scriptease> same encoder?
[01:06:13 CET] <iive> scriptease, yes
[01:07:03 CET] <scriptease> well iv got a gtx960
[01:07:10 CET] <scriptease> thats the 1st gen maxwell
[01:07:21 CET] <scriptease> 2nd gen would be gtx 970
[01:08:09 CET] <iive> 1st gen gets only h264 encoding. and decoding mpeg1 to 4 (h264 included)
[01:08:25 CET] <scriptease> nah
[01:08:49 CET] <scriptease> i can encode h265 with my gtx 960
[01:08:57 CET] <scriptease> using cuda
[01:09:19 CET] <iive> that's what i'm saying. cuda is using general purpose shaders
[01:09:31 CET] <iive> the encoder is specialized module
[01:09:48 CET] <iive> try encoding h264_nvenc and see if it is faster.
[01:10:13 CET] <scriptease> mhm but i wont get the vids so small with h264
[01:10:24 CET] <furq> the gtx960 is second-gen maxwell
[01:10:24 CET] <iive> just try it :D
[01:10:29 CET] <furq> gm206
[01:10:42 CET] <scriptease> *ah ok
[01:10:55 CET] <furq> pascal and turing both claim efficiency improvements but who knows how significant they are
[01:11:28 CET] <iive> furq, there are actually two cards named gtx960
[01:11:36 CET] <scriptease> https://www.nvidia.com/en-us/geforce/news/geforce-rtx-streaming/
[01:11:46 CET] <furq> well one is 206 and one is 204
[01:11:49 CET] <furq> they're both second-gen
[01:12:01 CET] <furq> first-gen is GM1xx
[01:12:04 CET] <iive> one is OEM and is GM204, the other is just 960 and is GM206
[01:12:13 CET] <iive> the gm206 is like 3rd gen.
[01:12:27 CET] <furq> second gen nvenc, then
[01:13:57 CET] <scriptease> but i wonder why qsv is so slow with ffmpeg and my new bean canyon (nuc5i7bek)
[01:14:11 CET] <scriptease> with xmedia recode it was about 3 or 4 times faster
[01:14:45 CET] <furq> i take it you're using hardware decoding
[01:15:01 CET] <furq> and not doing anything that would cause a bunch of copying
[01:16:09 CET] <scriptease> hardware encoding
[01:16:38 CET] <scriptease> i try different hardware...intel quicksync on my barebone and the 960 gtx on my desktop pc
[01:17:08 CET] <scriptease> with xmedia recode the quicksync was faster than the cuda encoding
[01:17:21 CET] <scriptease> with ffmpeg it just turned
[01:17:24 CET] <iive> scriptease, if you card is GM206, then you have hevc encoding and decoding.
[01:17:33 CET] <iive> you/your
[01:17:38 CET] <scriptease> i have hevs encoding
[01:17:40 CET] <scriptease> i know that
[01:17:53 CET] <scriptease> as i said...i used xmedia recode before
[01:17:57 CET] <scriptease> and everything worked
[01:18:06 CET] <scriptease> *hevc encoding
[01:18:26 CET] <iive> actually, 2gen Maxwell does not have hevc decoding..
[01:18:33 CET] <scriptease> cuda encoding also works now on my dektop pc using ffmpeg
[01:18:50 CET] <scriptease> exactly iive ....decoding didnt work
[01:20:41 CET] <scriptease> there is something wrong with intel qsv in ffmpeg
[01:20:44 CET] <iive> furq, in the wikipedia list, only GT945A is listed as GM108, no other GM1xx cards.
[01:20:59 CET] <furq> scriptease: i meant were you doing both decoding and encoding in hardware with ffmpeg
[01:21:08 CET] <furq> software decoding will slow down hardware encoding quite a bit
[01:21:15 CET] <furq> maybe not 3x but shrug
[01:21:33 CET] <furq> i can't think of any other reason why ffmpeg would be that much slower unless it's setting very different defaults
[01:21:50 CET] <scriptease> furq...no...as iive said...hevc-decoding didnt work...but hevc-encoding works with the gtx 960
[01:22:19 CET] <scriptease> nah i talk about different hardware here
[01:22:29 CET] <iive> scriptease, furq means, that you should be using hardware decoding (e.g. h264) AND hardware encoding.
[01:22:38 CET] <scriptease> i made a comparison using my barebone and my desktop pc
[01:23:02 CET] <scriptease> the barebone uses intel hardware acceleration and the desktop pc uses nvenc
[01:23:19 CET] <iive> intel is "meh"
[01:23:21 CET] <furq> yeah i meant with qsv
[01:23:27 CET] <furq> that's the one you said was much slower with ffmpeg right
[01:23:36 CET] <scriptease> yep
[01:23:44 CET] <scriptease> and with xmedia recode it was MUCh faster
[01:24:20 CET] <scriptease> thats why i say there is something wrong with the utilisation of intel quicksync in ffmpg
[01:24:54 CET] <scriptease> *ffmpeg
[01:25:36 CET] <scriptease> yeah iive....hardware decoding AND encoding would be better ofc
[01:32:22 CET] <scriptease> http://prntscr.com/m62vwg
[01:34:51 CET] <scriptease> http://prntscr.com/m62wqv
[01:37:20 CET] <iive> scriptease, https://developer.nvidia.com/ffmpeg :D
[01:40:50 CET] <scriptease> yep
[01:40:53 CET] <scriptease> know that site
[01:40:56 CET] <scriptease> :-)
[07:15:26 CET] <Gambit-_> Hello everyone again
[07:16:23 CET] <Gambit--> I am trying to add keyframes to a video so as to get smaller segmentation on HLS generation.  Right now, despite all of my efforts, the HLS generation process generates 8 fragments and reports "frame I:8" in the summary output.
[07:17:10 CET] <Gambit--> I tried to create more keyframes using this command line: ffmpeg  -I in.mp4 -force_key_frames "expr:eq(mod(n, ${GOPSZ}),0)" -x264opts rc-lookahead=${GOPSZ}:keyint=${GOPSZ_2X}:min-keyint=${GOPSZ} out.mp4
[07:17:20 CET] <Gambit--> (with GOPSZ=30 and GOPSZ_2X=60)
[07:17:27 CET] <Gambit--> this was as per a SO post I dug out.
[07:17:59 CET] <Gambit--> the resulting file, when I count how many keyframes there are using show_entries frame=pict_type, has the expected 30 frames of type "I".
[07:18:50 CET] <Gambit--> But that doesn't sync with the output from the HLS generation command: ffmpeg -I out.mp4 -hls_playlist_type vod -hls_flags single_file -hls_init_time 2 -hls_time 2 -hls_list_size 0 -hls_segment_filename main.ts test.m3u8
[07:19:38 CET] <Gambit--> (those -I should be -i, in those command lines, fwiw)
[07:19:48 CET] <Gambit--> Thoughts?  Suggestions?
[07:20:25 CET] <c_14> well, you're reencoding the hls output
[07:20:39 CET] <Gambit--> How's that?
[07:20:51 CET] <furq> why are you setting keyint to gopsz*2
[07:20:52 CET] <c_14> the command that generates your hls reencodes the input
[07:21:35 CET] <Gambit--> furq: magic incantation specified in a SO post - but the summary was that min_keyint (at that point in time) was hardcoded to be no less than keyint/2+1
[07:23:07 CET] <Gambit--> Here's the thread: https://superuser.com/questions/908280/what-is-the-correct-way-to-fix-keyframes-in-ffmpeg-for-dash
[07:25:25 CET] <Gambit--> The stream I'm working with appears to be the right type:
[07:25:26 CET] <Gambit-->     Stream #0:0(und): Video: h264 (libx264), yuv420p, 1280x720 [SAR 1:1 DAR 16:9], q=-1--1, 30 fps, 90k tbn, 30 tbc (default)
[07:33:01 CET] <Gambit--> c_14: I'm still missing what you mean.  Are you suggesting I need to put all of the force_key_frame stuff in the same command that generates the hls?
[07:33:17 CET] <c_14> either that (and then don't reencode beforehand) or use -c copy
[07:34:31 CET] <Gambit--> Huh.
[07:34:47 CET] <Gambit--> Well how about that.
[07:34:52 CET] <Gambit--> Thanks c_14!
[07:35:04 CET] <Gambit--> I would never have guessed that would have been necessary.
[07:35:55 CET] <Gambit--> which has actually described every solution to a FFmpeg problem I've encountered to date :D
[07:38:13 CET] <c_14> ffmpeg will by default always reencode your content unless you explicitly tell it not to
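[Editor's note: a minimal sketch of c_14's suggestion, reusing the filenames from Gambit--'s commands above — stream-copy the already-encoded out.mp4 into HLS with -c copy, so the forced keyframes survive instead of being thrown away by a second encode.]

```shell
# -c copy disables the default re-encode, so segmentation follows the
# keyframes placed by the earlier force_key_frames/x264opts encode.
cmd='ffmpeg -i out.mp4 -c copy -hls_playlist_type vod -hls_flags single_file -hls_time 2 -hls_list_size 0 -hls_segment_filename main.ts test.m3u8'
printf '%s\n' "$cmd"
```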
[12:31:54 CET] <Kadigan> Hey. I'm looking through the "building ffmpeg" guides, and googling... and I'm having trouble finding a guide how to build ffmpeg -for- Windows. Not -on- Windows, -for- it. Under Debian, specifically. Does anyone have a good guide handy, perchance? For now I'm following a guide to build it for Linux, as I'll want the experience + a workable latest there.
[12:32:20 CET] <Kadigan> If it's further down in the official guide, don't hesitate to call me a blind hobo. :)
[12:33:13 CET] <JEEB> you just need to set/enable cross-compile, target-os and cross-prefix
[12:33:49 CET] <JEEB> you can see if the mingw-w64 toolchain in debian is new enough, the one in fedora was generally OK but the headers and CRT were too old for a lot of newer APIs
[12:34:13 CET] <JEEB> also you want to make sure that only the static versions of the CRT and C++ stdlib are available to the toolchain
[12:34:24 CET] <JEEB> as otherwise it will end up requiring the gcc DLLs on runtime :P
[12:35:56 CET] <JEEB> then just make sure you have new enough nasm installed and the very basic FFmpeg should be buildable (which means 99% of all decoders etc)
[12:36:19 CET] <JEEB> external libraries are generally just required for encoding, and you generally should know which encoders you need, if any
[12:37:54 CET] <JEEB> for example for 64bit windows I usually use something like --arch=x86_64 --enable-cross-compile --target-os=mingw64 --cross-prefix="x86_64-w64-mingw32-"
[12:38:24 CET] <JEEB> which tells FFmpeg's build system to utilize the tools that are prefixed with x86_64-w64-mingw32-
[12:55:45 CET] <Retr0id> Anyone have advice on how to download a .mpd DASH stream while preserving the encryption metadata (senc atoms etc.)?
[12:57:21 CET] <Kadigan> JEEB: I'm following the guide to build under Ubuntu, and I'm generally building things directly instead of installing them (Debian is known to stick to stable, read - old, stuff). So I should have stuff new enough... hopefully. Thank you for the advice, appreciate it.
[12:58:41 CET] <Kadigan> I'm building to encode (mainly x264 and, newly, x265), but yeah - I generally know what I need for x264... we'll see what I need for x265 :D
[12:59:20 CET] <Kadigan> Mainly because my Windows build is from 2014 :D
[13:09:50 CET] <JEEB> Kadigan: x265 is just cmake + C++ + nasm
[13:21:08 CET] <Kadigan> Hm... I obviously have no idea what I'm doing, sadly. I have the Linux binary up and going, but building for Windows isn't something I've ever done under Linux... So yeah, way out of my depth here. :)
[13:22:09 CET] <JEEB> you just need a mingw-w64 toolchain aiming either for i686 or x86_64 windows
[13:22:32 CET] <JEEB> also if you want a thing with a toolchain that some other place is already using, look into videolan's docker registry
[13:22:41 CET] <JEEB> they are using docker images to build their windows etc binaries
[13:22:57 CET] <JEEB> and/or you can look at their dockerfile and see which things they build and which they just install
[13:26:20 CET] <Kadigan> Hm... Well, https://ffmpeg.org/platform.html#Cross-compilation-for-Windows-with-Linux-1 says I only need to reconfigure ffmpeg, but it fails finding libass, so I obviously need to do something about that as well.
[13:26:35 CET] <Kadigan> Unless I don't need libass for the Win64 build.
[13:26:41 CET] <JEEB> uhh
[13:27:04 CET] <JEEB> what libraries you need depends on your needs, and you need those for the matching mingw-w64 target
[13:27:10 CET] <JEEB> lunix binaries naturally will not work
[13:27:17 CET] <Kadigan> Yeah, figured.
[13:27:43 CET] <JEEB> what I generally do is I have some sort of custom prefix for each cross-compilation target
[13:27:57 CET] <JEEB> then I set --prefix towards it when configuring things
[13:28:11 CET] <JEEB> and then PKG_CONFIG_LIBDIR=/path/to/that/prefix/lib/pkgconfig tells pkg-config to look for things there
[13:28:28 CET] <JEEB> PKG_CONFIG_PATH *appends*, PKG_CONFIG_LIBDIR *replaces*
[13:28:37 CET] <JEEB> and when you're cross-compiling you don't want your system libraries to pop up :)
[13:28:57 CET] <JEEB> (which are of course lunix, not mingw-w64 win32 or win64)
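[Editor's note: JEEB's flags and the PKG_CONFIG_LIBDIR advice combined into one configure invocation. The prefix path is hypothetical; the cross-prefix and target-os values are the ones quoted above.]

```shell
# Cross-compile FFmpeg for 64-bit Windows from Linux (sketch).
# PKG_CONFIG_LIBDIR *replaces* the search path, so only libraries
# installed into the mingw prefix are found -- never system ones.
PREFIX=$HOME/mingw-prefix   # hypothetical per-target prefix
cmd="PKG_CONFIG_LIBDIR=$PREFIX/lib/pkgconfig ./configure --arch=x86_64 --enable-cross-compile --target-os=mingw64 --cross-prefix=x86_64-w64-mingw32- --prefix=$PREFIX"
printf '%s\n' "$cmd"
```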
[13:34:49 CET] <Kadigan> https://hub.docker.com/r/moiamond/ffmpeg-windows-build-helpers/ here's something that might work. Once I have a running build, I'll look into how it works.
[13:35:01 CET] <Kadigan> Assuming I get one.
[13:53:09 CET] <JEEB> Kadigan: for docker containers I recommend the videolan ones for a toolchain base
[13:53:18 CET] <JEEB> then you can run of course an FFmpeg build on top of that
[14:24:51 CET] <kolesovdv> Hello. Can anybody help me? I need to use option "aac_seq_header_detect" via Dictionary (av_dict_set).
[14:25:09 CET] <kolesovdv> For ffplay it is used in the next form "-flvflags aac_seq_header_detect". Can someone show example how to use it via av_dict_set? I think about using it in the avio_open2 but I am not sure.
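[Editor's note: kolesovdv's question went unanswered in the log. A sketch of the usual pattern, untested here: demuxer private options like flvflags go into the options dictionary of avformat_open_input(), not avio_open2(). The wrapper function name is invented.]

```c
#include <libavformat/avformat.h>
#include <libavutil/dict.h>

/* Sketch: open an FLV input with the "aac_seq_header_detect" flvflag,
 * the dictionary equivalent of ffplay's "-flvflags aac_seq_header_detect". */
int open_flv_input(AVFormatContext **fmt_ctx, const char *url)
{
    AVDictionary *opts = NULL;
    int ret;

    av_dict_set(&opts, "flvflags", "aac_seq_header_detect", 0);
    ret = avformat_open_input(fmt_ctx, url, NULL, &opts);
    av_dict_free(&opts);  /* any unconsumed entries remain in opts */
    return ret;
}
```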
[16:23:17 CET] <Kadigan> Hm... it seems it fails on AX_SPLIT_VERSION for me, something to do w/ libtesseract from what I found on a (Chinese? Korean?! xD) website... Do I need Tesseract for something specific?
[18:16:26 CET] <pixelou> Hi, I'm trying to use the C API to read video frames but I can't decode the frames with the new decoding API (av_read_frame, avcodec_send_packet). I have taken inspiration from a few examples online but it fails on avcodec_send_packet. Could anybody have a look at my code please? The error message is written at the bottom: https://pastebin.com/raw/evXQXmmz
[18:20:56 CET] <DHE> pixelou: does the input file contain audio as well as video?
[18:21:14 CET] <kepstin> yeah, that's my guess as well - you're passing an audio frame to the video decoder.
[18:21:36 CET] <pixelou> yes, it has both (big buck bunny trailer).
[18:21:56 CET] <DHE> and you're not discriminating what comes out of av_read_frame() before feeding it to the video decoder
[18:22:37 CET] <DHE> the AVPacket has a stream_index (sp?) that will tell you if it's the audio or video, which matches the stream_idx from your call to https://pastebin.com/raw/evXQXmmz
[18:22:42 CET] <DHE> oops, copy/paste fail...
[18:22:54 CET] <DHE> av_find_best_stream
[18:23:46 CET] <pixelou> Ok thanks, I will fix that.
[18:44:36 CET] <pixelou> Well it now fails with 'Resource temporarily unavailable' after reading one empty packet: https://pastebin.com/raw/2v6qmGNx
[18:46:36 CET] <pixelou> Note that I'm not sure I have clearly understood the api documentation about memory management with av_packet_unref.
[18:50:13 CET] <kepstin> looks like a logic error - when the avcodec_receive_frame function returns AVERROR(EAGAIN), you need to loop back and get another packet to feed to the decoder
[18:50:22 CET] <kepstin> but your code treats that as an error case and exits
[18:55:51 CET] <pixelou> Indeed, I misinterpreted the documentation, I think I will do a more careful reread of the doc before asking another question. Thank you very much for your help.
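[Editor's note: the two fixes from this exchange — filter on AVPacket.stream_index before avcodec_send_packet(), and treat AVERROR(EAGAIN) from avcodec_receive_frame() as "feed more input" — combined into one loop. A sketch only; flushing the decoder with a NULL packet at EOF is omitted.]

```c
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

/* Decode all frames of one stream; stream_idx is the index returned
 * by av_find_best_stream(). */
static int decode_stream(AVFormatContext *fmt, AVCodecContext *dec, int stream_idx)
{
    AVPacket *pkt  = av_packet_alloc();
    AVFrame *frame = av_frame_alloc();
    int ret = 0;

    while (av_read_frame(fmt, pkt) >= 0) {
        if (pkt->stream_index != stream_idx) {  /* skip audio etc. */
            av_packet_unref(pkt);
            continue;
        }
        ret = avcodec_send_packet(dec, pkt);
        av_packet_unref(pkt);
        if (ret < 0)
            break;
        /* drain everything the decoder has ready */
        while ((ret = avcodec_receive_frame(dec, frame)) == 0) {
            /* ... use frame ... */
            av_frame_unref(frame);
        }
        if (ret != AVERROR(EAGAIN))
            break;  /* AVERROR_EOF or a real error */
    }
    av_frame_free(&frame);
    av_packet_free(&pkt);
    return ret == AVERROR(EAGAIN) ? 0 : ret;  /* EAGAIN just means "send more" */
}
```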
[19:53:15 CET] <Kadigan> Okay...
[19:53:20 CET] <Kadigan> How the hell do I pass deblock to x265? :D
[19:54:20 CET] <Kadigan> I'm aware I need to pass them in -x265-params "string", in the format of option=value separated by :
[19:54:29 CET] <Kadigan> but how do I pass deblock, which itself is in format x:x?
[19:56:01 CET] <c_14> escape it, either with \ or ''
[19:56:12 CET] <c_14> probably anyway
[19:56:34 CET] <Kadigan> Yes, that worked. I didn't think of that. Thank you.
[20:19:56 CET] <Kadigan> (specifically, \ worked -- I failed to mention this)
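[Editor's note: the escaping that worked for Kadigan, spelled out. ':' separates options inside -x265-params, so the ':' within deblock's own value is backslash-escaped. Filenames and the deblock/aq values are placeholders.]

```shell
# The \: keeps x265's option parser from splitting deblock's value.
cmd='ffmpeg -i input.mp4 -c:v libx265 -x265-params "deblock=-1\:-1:aq-mode=2" output.mp4'
printf '%s\n' "$cmd"
```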
[22:27:40 CET] <BtbN> Does ffmpeg have some automatic way to do stinger transitions?
[22:28:35 CET] <BtbN> As in, play one video, overlay the end of it with a transparent video, in the middle of that overlay-video cut to the next video, also overlaid with the stinger video
[22:30:39 CET] <BtbN> All I can think of would end up with a _massive_ non-generic complex filter chain
[22:30:57 CET] <BtbN> And I want to do this for hundreds of videos
[22:32:41 CET] <furq> what about concat,overlay=enable
[22:33:56 CET] <BtbN> Hm yeah, I could just concat them, and then overlay the stinger at the right time
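[Editor's note: one possible shape of furq's concat + overlay=enable idea. Everything concrete here is invented — filenames, a cut at t=10s, a 2-second stinger delayed to t=9 with -itsoffset — and whether this matches BtbN's exact use case is untested.]

```shell
# Concatenate two clips, then overlay the stinger video across the join.
cmd="ffmpeg -i a.mp4 -i b.mp4 -itsoffset 9 -i stinger.mov -filter_complex \"[0:v][1:v]concat=n=2:v=1:a=0[main];[main][2:v]overlay=enable='between(t,9,11)'[out]\" -map '[out]' out.mp4"
printf '%s\n' "$cmd"
```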
[00:00:00 CET] --- Sun Jan 13 2019


More information about the Ffmpeg-devel-irc mailing list