[Ffmpeg-devel-irc] ffmpeg.log.20191218
burek
burek at teamnet.rs
Thu Dec 19 03:05:02 EET 2019
[00:04:33 CET] <jokoon> I don't understand how to make a letterbox
[00:08:15 CET] <jokoon> It re-encoded a 1280x540 video to 480x270, but when I open it the letterbox doesn't show; I'm guessing VLC has a feature to hide it?
[00:10:54 CET] <DHE> codecs can indicate non-square pixels. by default ffmpeg will set this field in order to maintain the aspect ratio of the original video unless you manually override
[00:11:18 CET] <jokoon> you mean SAR vs DAR?
[00:16:17 CET] <jokoon> im so confused
[00:16:54 CET] <jokoon> what's this non square pixel?
[00:17:29 CET] <jokoon> input says [SAR 1:1 DAR 64:27]
[00:17:37 CET] <jokoon> is that what you're talking about?
[00:18:19 CET] <jokoon> I use 'scale=480:270:force_original_aspect_ratio:1,pad=480:270:(ow-iw)/2:(oh-ih)/2',
[00:18:34 CET] <jokoon> shouldn't this have a letterbox?
[00:24:33 CET] <jokoon> 64:27 is 16/9 multiplied by 4/3
[00:24:42 CET] <jokoon> who invents those things?
[00:25:08 CET] <DHE> sample aspect ratio, or the ratio of the pixels. 1:1 means perfectly square pixels. DAR is display aspect ratio, or the ratio of the screen/window it plays in
[00:25:57 CET] <jokoon> so the video player will stretch the video to match the DAR
[00:26:08 CET] <jokoon> but why can't I have a letterbox?
[00:26:29 CET] <DHE> because the default is to preserve the DAR when encoding by playing with the SAR if needed. you must override
[00:28:45 CET] <jokoon> with force_original_aspect_ratio ?
[00:29:32 CET] <jokoon> no not that
[00:36:51 CET] <jokoon> setdar=0 doesn't work either
[00:49:39 CET] <jokoon> going to sleep
[00:49:42 CET] <jokoon> not giving up
[00:49:57 CET] <jokoon> I wish I could letterbox any video and have a generic solution
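A generic letterboxing chain for the 480x270 target discussed above (a sketch; the file names are placeholders, and setsar=1 is added here to stop the encoder from compensating with a non-square SAR, which is the behaviour DHE describes):

    # input.mp4 and output.mp4 are placeholders
    ffmpeg -i input.mp4 -vf "scale=480:270:force_original_aspect_ratio=decrease,pad=480:270:(ow-iw)/2:(oh-ih)/2,setsar=1" output.mp4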
[00:50:20 CET] <jokoon> I gather it's currently night in california and in the US
[00:50:30 CET] <jokoon> actually it's evening
[00:50:41 CET] <jokoon> I'll see
[00:50:44 CET] <jokoon> good night
[00:51:00 CET] <jokoon> no echelog :(
[01:02:47 CET] <shibboleth> i was unable to test ffmpeg with qsv support compiled in, since support for the old sdk (which covers ivy bridge and haswell) has been deprecated in ffmpeg
[01:03:08 CET] <shibboleth> and i'm unable to test the new sdk since it doesn't support ivy/haswell.
[01:03:24 CET] <shibboleth> anyway: using vaapi i'm getting these errors from some sources:
[01:04:59 CET] <shibboleth> Failed to get HW surface format for vaapi_vld
[01:05:00 CET] <shibboleth> Available HW surface format was vdpau.
[01:05:03 CET] <shibboleth> Available HW surface format was yuvj420p.
[01:05:04 CET] <shibboleth> Failed to get HW surface format for vaapi_vld.
[01:05:06 CET] <shibboleth> Available HW surface format was vdpau.
[01:05:08 CET] <shibboleth> Available HW surface format was yuvj420p.
[01:05:13 CET] Last message repeated 1 time(s).
[01:05:13 CET] <shibboleth> oops, was supposed to be comma-sep
[01:05:14 CET] <shibboleth> sorry
[01:06:50 CET] <shibboleth> ffprobe: Stream #0:0: Video: h264 (Baseline), yuvj420p(pc, progressive), 2032x1536, 30 fps, 30 tbr, 90k tbn, 60 tbc
[01:07:33 CET] <shibboleth> ffprobe for the working sources: Stream #0:0: Video: h264 (High), yuvj420p(pc, bt709, progressive), 1920x1080 [SAR 189:190 DAR 168:95], 25 fps, 25 tbr, 90k tbn, 50 tbc
[01:08:50 CET] <shibboleth> does anything jump out as to why vaapi can't handle the first variant? vainfo output: https://paste.debian.net/hidden/35bd7059/
[01:16:21 CET] <BtbN> The resolution could simply be too high for that old of a GPU
[01:16:35 CET] <shibboleth> just beat me to it
[01:16:42 CET] <shibboleth> i was looking at vdpauinfo
[01:16:49 CET] <shibboleth> more verbose:
[01:16:59 CET] <shibboleth> Video surface:
[01:17:00 CET] <shibboleth> name width height types
[01:17:00 CET] <shibboleth> -------------------------------------------
[01:17:00 CET] <shibboleth> 420 1920 1080 NV12 YV12 UYVY YUYV Y8U8V8A8 V8U8Y8A8
[01:17:00 CET] <shibboleth> 422 1920 1080 NV12 YV12 UYVY YUYV Y8U8V8A8 V8U8Y8A8
[01:17:00 CET] <shibboleth> 444 1920 1080 NV12 YV12 UYVY YUYV Y8U8V8A8 V8U8Y8A8
[01:17:16 CET] <BtbN> vdpau info is pretty meaningless for an Intel GPU though, since it does not support vdpau.
[01:17:43 CET] <shibboleth> libvdpau-va-gl
[01:17:51 CET] <BtbN> That stuff is seriously old and broken, don't use it.
[01:18:09 CET] <shibboleth> under decoder capabilities: H264_BASELINE 51 16384 2048 2048
[01:18:10 CET] <shibboleth> H264_MAIN 51 16384 2048 2048
[01:18:10 CET] <shibboleth> H264_HIGH 51 16384 2048 2048
[01:18:29 CET] <shibboleth> still, suggests that this is, in fact, caused by the resolution
[01:18:35 CET] <BtbN> the vdpau-vaapi wrapper hasn't seen work in 5 years or so. No degree of brokenness would surprise me about it.
[01:18:45 CET] <BtbN> That whole thing itself could be limited to 1080p
[01:18:58 CET] <shibboleth> sure, i mention it only because vainfo output is less verbose
[01:20:36 CET] <shibboleth> would it be possible to offload some cpu usage by using opencl hwaccel?
[01:21:14 CET] <BtbN> no
[01:21:24 CET] <DHE> x264 does this, but whether it helps or hurts performance varies. I've seen it do both. 20% was the best improvement I saw though
[01:21:25 CET] <shibboleth> drm?
[01:21:40 CET] <BtbN> GPUs are bad at video de/encoding. They are not helpful in any way.
[01:22:03 CET] <shibboleth> x264 uses opencl?
[01:22:12 CET] <BtbN> It has an option to, but it sometimes makes it slower.
[01:22:27 CET] <BtbN> so it's generally pretty pointless
[01:22:57 CET] <shibboleth> ok. clinfo finds the device, CL_DEVICE_TYPE_ACCELERATOR not found however
[01:23:23 CET] <BtbN> Again, GPUs suck at video de/encoding. OpenCL does not help you with it.
[01:23:32 CET] <BtbN> Video filtering however it's very useful for
[01:23:41 CET] <shibboleth> anyway, i've been able to shave off about 50% prev cpu usage after switching to vaapi for the 1080 streams
[01:24:06 CET] <shibboleth> might have to settle for no accel for the last source
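For reference, a typical all-vaapi transcode of the kind discussed here (a sketch; the render node path, bitrate, and file names are assumptions):

    # /dev/dri/renderD128, the bitrate and the file names are placeholders
    ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format vaapi \
        -i input.mp4 -c:v h264_vaapi -b:v 4M output.mp4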
[01:40:59 CET] <shibboleth> how well does ffmpeg handle v4l-devices (bttv)?
[03:48:09 CET] <whoareU> using "ffprobe my.mp4", one line is "Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'MDF.mp4':". Why are there so many formats listed for a video which is obviously mp4?
[04:46:08 CET] <Spring> With animated WebP, how similar is it to the VP8 it's based upon? In my tests with libwebp in ffmpeg, using the -qscale quality option, I found file sizes were still pretty large for the resolution and quality, while VP8 can handle higher resolutions and quality at smaller file sizes.
[04:54:37 CET] <matto312> I've been stuck for a while on an issue; any help that could push me in the right direction would be awesome. Input is a live RTMP stream and output is multi-variant HLS. Most of the time it works as expected, but sometimes the output is frozen video: TS files are still being generated but each is frozen on a single frame (and the video files are ~1/3 the size). Here are the cmd and output when the issue happens. https://pastebin.com/PQMQVzXZ Thanks!
[04:55:14 CET] <matto312> I recently upgraded ffmpeg to 4.2, but it still happens
[06:42:06 CET] <BeerLover> bitrate * duration of song should give me the size of audio file right? But it's not the case: https://dpaste.org/f3vH
[06:42:19 CET] <BeerLover> Song's size = 53 mb
[06:43:31 CET] <BeerLover> but bitrate 128kbps (0.128 mbps) * 3478.308562 = 424.59 mb
[06:56:45 CET] <Spring> BeerLover, I think it depends whether it's constant bitrate or variable bitrate.
[07:00:18 CET] <BeerLover> Spring i was doing wrong calculation
[07:00:20 CET] <BeerLover> never mind
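The corrected arithmetic, for the record: 128 kbps is kilobits per second, not kilobytes, so the expected file size is bitrate x duration / 8:

    128,000 bit/s x 3478.3 s ~= 445.2 Mbit ~= 55.7 MB, i.e. about 53 MiB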
[07:00:35 CET] <BeerLover> I want to know if I can estimate the encoding time beforehand?
[07:00:50 CET] <BeerLover> since i have a 57min audio file
[07:01:13 CET] <BeerLover> I want to know the time ffmpeg might take to encode it into a specific bitrate
[10:07:54 CET] <jokoon> ok I finally managed to make that letterbox
[10:08:00 CET] <jokoon> youhouuuu
[10:15:24 CET] <bencc1> kepstin: I've used SSD on the server with much more iops and bandwidth and still getting freezes during the capture
[11:08:14 CET] <ePirat> what's the replacement for the deprecated avcodec_register_all ?
[11:08:33 CET] <ePirat> the changelog and API docs are quite unhelpful in that regard
[11:09:06 CET] <JEEB> I think you don't need to register them at all any more
[11:09:49 CET] <ePirat> why isn't the function a no-op then?
[11:10:14 CET] <ePirat> in current git master, that is
[11:10:50 CET] <JEEB> good question :P
[11:13:54 CET] <BeerLover> is there any way to know how much time ffmpeg will take to encode a song before starting to encode?
[11:14:24 CET] <JEEB> it calls av_codec_init_next which just creates a linked list
[11:14:31 CET] <JEEB> but the list is supposed to be static now
[11:14:47 CET] <jokoon> I'm planning to open 35 videos which, combined, weigh 900MB. I have 12GB of RAM; should I expect problems? (using xstack)
[11:22:02 CET] <ePirat> JEEB, Ah ok, thanks
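For anyone reading later: with the static codec list there is nothing to register, and codecs can be enumerated with av_codec_iterate() instead (a minimal C sketch):

    #include <stdio.h>
    #include <libavcodec/avcodec.h>

    int main(void) {
        void *iter = NULL;                /* opaque iteration state */
        const AVCodec *c;
        while ((c = av_codec_iterate(&iter)))
            printf("%s\n", c->name);      /* no avcodec_register_all() needed */
        return 0;
    }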
[11:34:17 CET] <meepmeep> :)
[12:18:47 CET] <jokoon> when doing those [3:v] identifiers, can I use any integer value?
[12:19:21 CET] <jokoon> I mean they don't have to be contiguous like 0 1 2 3
[12:19:33 CET] <jokoon> I can use 101 102 201 202...
[12:22:12 CET] <jokoon> mmh seems I can't
[12:22:28 CET] <jokoon> the -i are always numbered 0 1 2 3
[12:24:13 CET] <JEEB> yes
[12:25:51 CET] <jokoon> yes? you mean I can't
[12:26:02 CET] <jokoon> or yes there's another way?
[12:26:10 CET] <JEEB> they are numbered as you input them
[12:26:13 CET] <jokoon> ok
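So for the 35-input xstack plan above, the [N:v] labels simply mirror -i order; a four-input sketch (the file names and the 2x2 layout are placeholders):

    # a.mp4..d.mp4 are placeholders; inputs are numbered 0..3 in -i order
    ffmpeg -i a.mp4 -i b.mp4 -i c.mp4 -i d.mp4 \
        -filter_complex "[0:v][1:v][2:v][3:v]xstack=inputs=4:layout=0_0|w0_0|0_h0|w0_h0[v]" \
        -map "[v]" out.mp4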
[13:05:14 CET] <LFSVeteran> https://pastebin.com/TezruxRt
[13:05:44 CET] <LFSVeteran> trying to cross-build FFmpeg for aarch64, but fighting with pkg-config
[13:06:03 CET] <JEEB> for cross-compilation you should utilize PKG_CONFIG_LIBDIR
[13:06:05 CET] <JEEB> _PATH appends
[13:06:09 CET] <JEEB> _LIBDIR overrides
[13:06:46 CET] <JEEB> and the actual error should be within ffbuild/config.log
[13:18:00 CET] <LFSVeteran> undefined reference to `cabs'
[13:18:08 CET] <LFSVeteran> https://pastebin.com/iN75PEcp
[13:20:05 CET] <JEEB> cexp in the vorbis test, yup
[13:20:23 CET] <JEEB> if all of your dependencies are static libs, then add the --static option to pkg-config options
[13:20:35 CET] <JEEB> --pkg-config-flags=--static
[13:20:49 CET] <JEEB> that tells pkg-config to also utilize Libs.private
[13:20:54 CET] <JEEB> in addition to Libs
[13:21:11 CET] <JEEB> since vorbis's pc file seems to have Libs.private: -lm there
[13:24:08 CET] <LFSVeteran> just parse --pkg-config-flags=--static to configure right?
[13:24:47 CET] <JEEB> pass, yes
[13:24:58 CET] <JEEB> that tells configure to use --static when calling pkg-config
[13:29:06 CET] <LFSVeteran> hmm still the same
[13:30:11 CET] <LFSVeteran> https://pastebin.com/vwET1892
[13:34:10 CET] <JEEB> PKG_CONFIG_LIBDIR=.. pkg-config --static vorbis
[13:34:14 CET] <JEEB> what does this give?
[13:34:53 CET] <JEEB> because clearly something is derp
[13:35:23 CET] <LFSVeteran> If I just do that on the command line, I get no result
[13:35:35 CET] <JEEB> also apparently it doesn't even get to testing that? since the test doesn't contain vorbis_info_init
[13:35:53 CET] <JEEB> LFSVeteran: sorry, add --libs as well
[13:36:44 CET] <LFSVeteran> PKG_CONFIG_LIBDIR=.. pkg-config --static --libs vorbis
[13:36:58 CET] <LFSVeteran> Package vorbis was not found in the pkg-config search path.
[13:38:16 CET] <LFSVeteran> PKG_CONFIG_LIBDIR=/opt/aarch64-unknown-linux-gnu/usr/lib/pkgconfig pkg-config --static --libs vorbis
[13:38:22 CET] <LFSVeteran> -L/opt/aarch64-unknown-linux-gnu/usr/lib -lvorbis -lm -logg
[13:38:28 CET] <LFSVeteran> PKG_CONFIG_LIBDIR=/opt/aarch64-unknown-linux-gnu/usr/lib/pkgconfig pkg-config --libs vorbis
[13:38:35 CET] <LFSVeteran> -L/opt/aarch64-unknown-linux-gnu/usr/lib -lvorbis
[13:38:55 CET] <LFSVeteran> strange...since I have native vorbis installed as well
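Putting JEEB's two points together, a cross-build configure sketch (the pkgconfig path is from the paste above; the cross prefix and remaining options are assumptions):

    # the cross prefix and extra flags are assumptions for this sysroot
    PKG_CONFIG_LIBDIR=/opt/aarch64-unknown-linux-gnu/usr/lib/pkgconfig \
    ./configure --enable-cross-compile --arch=aarch64 --target-os=linux \
        --cross-prefix=aarch64-unknown-linux-gnu- \
        --pkg-config-flags=--static --enable-libvorbis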
[13:46:28 CET] <bencc1> what does "-chunk_duration 1000k" do?
[13:46:55 CET] <bencc1> does it have any effect when converting mkv to mp4 while copying audio and video?
[13:47:29 CET] <bencc1> ffmpeg -i in.mkv -c:a copy -c:v copy -chunk_duration 1000k -movflags +faststart -threads 1 out.mp4
[13:47:50 CET] <JEEB> ffmpeg -h muxer=mp4
[13:47:55 CET] <JEEB> and see the chunk_duration option's description
[13:53:19 CET] <bencc1> JEEB: there is no chunk_duration option
[13:53:26 CET] <bencc1> in the output
[13:53:35 CET] <bencc1> does that mean it's irrelevant?
[13:57:19 CET] <LFSVeteran> interesting...https://pastebin.com/gXAs4zL6
[13:57:37 CET] <JEEB> bencc1: ah no, it seems like a global option visible in `ffmpeg -h full |grep chunk_duration`
[13:58:44 CET] <bencc1> -chunk_duration <int> E....... microseconds for each chunk (from 0 to 2.14748e+09) (default 0)
[13:59:01 CET] <bencc1> what does it mean? where can it be used?
[13:59:12 CET] <JEEB> it seems to be a global option for the muxing logic. whatever that means :P
[13:59:22 CET] <bencc1> I have the above command in a script and trying to decide if it should be removed
[13:59:30 CET] <JEEB> probably related to interleaving logic
[13:59:31 CET] <bencc1> ffmpeg -i in.mkv -c:a copy -c:v copy -chunk_duration 1000k -movflags +faststart -threads 1 out.mp4
[13:59:41 CET] <bencc1> relevant here?
[14:00:02 CET] <JEEB> no effing idea unfortunately :P
[14:00:10 CET] <JEEB> I've never had to adjust that option for any purposes
[14:00:25 CET] <bencc1> thanks
[14:09:23 CET] <LFSVeteran> can't see the error...
[15:13:06 CET] <lesshaste> is there a tool to speed up a video but keep the audio at the right frequency and synchronised?
[15:24:01 CET] <pink_mist> librubberband?
[16:37:35 CET] <kepstin> lesshaste: ffmpeg can do it, but note that you have to scale the speed of the audio and video separately in matching amounts using filters. you have a bunch of choice on how exactly you'd like to deal with each stream.
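The usual shape of that for a 2x speed-up (a sketch; setpts rescales the video timestamps while atempo time-stretches the audio without changing pitch):

    # older builds cap atempo at 2.0 per instance; chain atempo=2,atempo=2 for more
    ffmpeg -i in.mp4 -filter_complex "[0:v]setpts=PTS/2[v];[0:a]atempo=2[a]" \
        -map "[v]" -map "[a]" out.mp4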
[16:39:43 CET] <phobosoph> hi
[16:41:10 CET] <phobosoph> so I got a h264 video in full HD streamed from HTTP
[16:41:20 CET] <phobosoph> is there bandwidth shaping tech supported by modern browsers?
[16:42:24 CET] <kepstin> player side bandwidth control has to be done in javascript, but it is doable (most big video streaming sites do it)
[16:42:50 CET] <kepstin> typically done by having multiple encodes at different rates available via dash or hls, can switch on segment boundaries.
[16:45:47 CET] <phobosoph> kepstin: so it can't dynamically just send only parts of it? It has to be a wholly differently encoded video stream thing?
[16:45:54 CET] <phobosoph> kepstin: any js player solutions out there?
[16:46:21 CET] <phobosoph> I can use Adobe Premiere, Encoder, ffmpeg
[16:46:21 CET] <phobosoph> can these programs encode a video file for different resolutions?
[16:46:21 CET] <phobosoph> similar to <picture> and srcset for img in html5?
[16:46:27 CET] <kepstin> if you build your own sidechannel thing where the player can notify the server to change rate, you can do it in-stream i suppose.
[16:46:44 CET] <phobosoph> kepstin: in-stream, so a HTTP server thing has to transcode it on-demand?
[16:46:53 CET] <phobosoph> or does it simply pick something else?
[16:46:56 CET] <phobosoph> :/
[16:47:00 CET] <kepstin> it's up to how you build it
[16:47:04 CET] <kepstin> browser does nothing automatically
[16:47:28 CET] <phobosoph> I see
[16:47:37 CET] <phobosoph> youtube seems to encode it at different resolutions, 1080p, 720p, 480p?
[16:47:39 CET] <kepstin> as i said, most places do it by having dash or hls with multiple rates available, and then having a javascript player switch between them.
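ffmpeg's hls muxer can emit that multi-rate layout in one go; a two-rendition sketch (all file names and bitrates are assumptions):

    # in.mp4, the bitrates and the playlist names are placeholders
    ffmpeg -i in.mp4 \
        -filter_complex "[0:v]split=2[s0][s1];[s0]scale=-2:1080[v0];[s1]scale=-2:720[v1]" \
        -map "[v0]" -map 0:a -map "[v1]" -map 0:a \
        -c:v libx264 -c:a aac -b:v:0 5M -b:v:1 2.5M \
        -f hls -var_stream_map "v:0,a:0 v:1,a:1" \
        -master_pl_name master.m3u8 stream_%v.m3u8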
[16:49:43 CET] <phobosoph> kepstin: hm, so DASH claims to be much more efficient than HLS, but less supported?
[16:49:56 CET] <phobosoph> https://caniuse.com/#feat=http-live-streaming
[16:49:58 CET] <phobosoph> ofc, the IE lol
[16:50:02 CET] <BtbN> DASH is HLS with more XML.
[16:50:06 CET] <phobosoph> hm, chrome also
[16:50:07 CET] <BtbN> Otherwise... about the same
[16:50:24 CET] <BtbN> If you do HLS with fragmented mp4 it's virtually identical to DASH
[16:50:25 CET] <phobosoph> BtbN, kepstin: any ready solutions for this? except hosting on youtube or similar?
[16:50:33 CET] <BtbN> Any http server?
[16:50:44 CET] <phobosoph> yes
[16:50:49 CET] <kepstin> phobosoph: https://caniuse.com/#feat=mediasource is the one that matters
[16:50:54 CET] <phobosoph> ah
[16:50:58 CET] <kepstin> the actual dash or hls is implemented in javascript
[16:51:07 CET] <phobosoph> support looks much better
[16:51:18 CET] <phobosoph> kepstin: any JS library/player thing that supports this that I should look into?
[16:51:22 CET] <BtbN> It's still a horrible mess from my experience, and only really works reliably in Chrome.
[16:51:24 CET] <phobosoph> I also want to use it for background
[16:51:44 CET] <BtbN> Luckily every browser except firefox is chrome nowadays.
[16:51:48 CET] <kepstin> apple devices can natively play hls without a javascript player
[16:51:59 CET] <kepstin> but that's a historical artifact
[16:52:04 CET] <BtbN> Edge can also do that
[16:52:24 CET] <kepstin> i bet chromium edge can't tho :)
[16:52:30 CET] <BtbN> It did when I tested it
[16:52:38 CET] <phobosoph> I use video backgrounds and I want to make it more efficient
[16:52:52 CET] <phobosoph> which means I am interested in adaptive video stuff for <video> backgrounds, too
[16:52:55 CET] <BtbN> That sounds inefficient no matter what
[16:52:56 CET] <kepstin> phobosoph: I have autoplay disabled, that makes it more efficient :)
[16:53:15 CET] <phobosoph> kepstin: and then I want to offer optional audio on/off (opt-in as chrome enforces it anyway)
[16:53:30 CET] <phobosoph> is it possible to download the audio part only when the user enables it?
[16:53:31 CET] <BtbN> Playing a video for no particular reason is a waste
[16:53:34 CET] <phobosoph> and otherwise skip the audio?
[16:53:42 CET] <kepstin> moving backgrounds is an accessibility problem too, for motion sensitive people
[16:53:45 CET] <BtbN> Audio is entirely irrelevant in size compared to video
[16:53:47 CET] <phobosoph> BtbN: in this very particular case it is a short looping background video :)
[16:53:51 CET] <phobosoph> ah
[16:54:00 CET] <BtbN> No matter if it's short looping
[16:54:03 CET] <phobosoph> kepstin: I need to look for a css selector for motion sensitive
[16:54:04 CET] <phobosoph> hm
[16:54:08 CET] <BtbN> it still needlessly keeps the GPU and CPU busy for no reason
[16:54:12 CET] <BtbN> specially mobile users will hate you
[16:54:51 CET] <phobosoph> hm
[16:54:57 CET] <phobosoph> alternative to a short movie thing? gifv?
[16:55:13 CET] <kepstin> gifv isn't a real format, that's just a short movie
[16:55:28 CET] <furq> gifv is just mp4
[16:56:09 CET] <BtbN> How about just not permanently playing a video?
[16:56:12 CET] <furq> or more precisely it's just a fake file extension imgur uses to give you a bare html page with a video tag with looping set
[16:56:14 CET] <kepstin> my recommendation is use a static background, and if something you have would be best explained with motion video, then have a click-to-play video element for that.
[16:56:20 CET] <BtbN> Browsers just won't let you autoplay a video anyway
[16:56:31 CET] <BtbN> It's all forced click to play
[16:56:32 CET] <furq> also yeah video backgrounds don't work in chrome any more
[16:56:34 CET] <furq> for pretty good reasons
[16:56:51 CET] <BtbN> Only a certain huge video platform has an exception
[16:57:01 CET] <furq> can't imagine how they swung that
[16:58:03 CET] <phobosoph> ok
[16:58:15 CET] <phobosoph> BtbN: well, it does autoplay with mute attribute
[16:58:41 CET] <phobosoph> https://developers.google.com/web/updates/2017/09/autoplay-policy-changes
[16:58:42 CET] <kepstin> depends on the browser, and in chrome it depends on weird heuristics about how often you access the site
[16:58:51 CET] <phobosoph> "Muted autoplay is always allowed."
[16:59:00 CET] <furq> oh right yeah
[16:59:02 CET] <kepstin> that's from 2017
[16:59:03 CET] <phobosoph> :D
[16:59:07 CET] <phobosoph> but it is like that
[16:59:10 CET] <kepstin> 2017 is a long time ago :)
[16:59:36 CET] <phobosoph> they haven't changed it
[16:59:46 CET] <LFSVeteran> https://pastebin.com/CbVqWSk0
[17:00:34 CET] <phobosoph> https://github.com/Dash-Industry-Forum/dash.js/blob/5f0cb17824bc121123952b1a7bf86d3bc7926bf1/contrib/videojs/README.md
[17:00:43 CET] <kepstin> ah, the heuristics stuff only affects video with sound
[17:00:45 CET] <phobosoph> dash.js looks like a good player for dash/hls/browser stuff
[17:01:03 CET] <furq> i've used hls.js before and it's fine
[17:01:04 CET] <phobosoph> now I need to setup a nginx with rtmp/dash/hls support for video files?
[17:01:15 CET] <furq> only for live streaming
[17:01:20 CET] <phobosoph> furq: hls.js vs dash.js? any differences? it seems that dash.js can fall back to hls and be more efficient?
[17:01:25 CET] <kepstin> any plain http server can host vod dash or hls content
[17:01:30 CET] <furq> hls is just a playlist and a bunch of segments in a directory
[17:01:37 CET] <phobosoph> furq: ok, so now I have to provide multiple video files in different resolution, like youtube does?
[17:01:42 CET] <phobosoph> kepstin: that's nice
[17:01:49 CET] <BtbN> DASH and HLS are effectively the same thing, just one with and one without XML
[17:02:02 CET] <furq> or multiple playlists in this case
[17:02:08 CET] <bencoh> (and dash allows mp4 segments)
[17:02:08 CET] <phobosoph> https://github.com/videojs/videojs-contrib-dash
[17:02:12 CET] <kepstin> does hls still mandate mpeg-ts, or do they support mp4 now?
[17:02:14 CET] <BtbN> They both support multiple quality lists
[17:02:18 CET] <phobosoph> this seems to offer the best of both worlds, dash and hls
[17:02:24 CET] <furq> hls nominally supports mp4 but not all implementations do
[17:02:28 CET] <bencoh> oh
[17:02:39 CET] <phobosoph> "A video.js source handler for supporting MPEG-DASH playback through a video.js player on browsers with support for Media Source Extensions."
[17:02:40 CET] <furq> also you don't really need both dash and hls
[17:02:41 CET] <phobosoph> nice!!!
[17:02:47 CET] <phobosoph> furq: so what should I prefer?
[17:02:58 CET] <phobosoph> dash claims to be up to 200% or whatever more efficient than hls?
[17:03:00 CET] <furq> i forget what the deal is with apple stuff and dash
[17:03:01 CET] <phobosoph> however this should work
[17:03:03 CET] <phobosoph> lol
[17:03:10 CET] <furq> also idk what their basis for claiming dash is more efficient is
[17:03:13 CET] <phobosoph> ah right, ios seems to natively do hls?
[17:03:14 CET] <phobosoph> hm
[17:03:23 CET] <kepstin> apple stuff should be fine with dash, assuming recent updates, you get mse with current ios versions
[17:03:27 CET] <bencoh> I see no proper reason for it to be more efficient
[17:03:35 CET] <BtbN> Neither DASH nor HLS are in any way efficient/inefficient. They are just playlists.
[17:03:36 CET] <furq> a lot of livestreaming stuff used hls for years because safari and ios stuff doesn't do mse
[17:03:45 CET] <furq> like youtube used dash for vod and hls for live
[17:03:47 CET] <kepstin> mpeg-ts has more muxing overhead than other formats, and hls normally uses mpeg-ts
[17:03:53 CET] <furq> because apple stuff would fall back for http for vod
[17:03:56 CET] <kepstin> so that might be what they mean? shouldn't be "200%" tho
[17:03:59 CET] <furq> but i guess apple finally caved
[17:04:06 CET] <furq> kepstin: does it have much more overhead than fmp4
[17:04:13 CET] <furq> i guess for sufficiently large segments it does
[17:04:34 CET] <furq> but yeah that's like 10% overhead compared to non-fragmented mp4
[17:04:44 CET] <phobosoph> hm, other sites recommend HLS
[17:04:52 CET] <phobosoph> because apple,right
[17:04:55 CET] <furq> probably
[17:05:04 CET] <kepstin> always check dates on this stuff, things change quickly in this space.
[17:05:06 CET] <furq> i used hls because hls.js was way easier to get running when i last looked into this
[17:05:11 CET] <furq> but that's like two years ago now
[17:05:41 CET] <BtbN> HLS is just way nicer and more readable of a format
[17:05:48 CET] <furq> yeah that as well
[17:05:48 CET] <BtbN> While DASH is horribly over-engineered
[17:05:56 CET] <furq> i still don't really understand dash
[17:06:01 CET] <furq> granted because i've never needed to
[17:06:14 CET] <phobosoph> https://github.com/Viblast
[17:06:21 CET] <phobosoph> they claim to be even better than dash.js and video.js
[17:06:22 CET] <phobosoph> hmmmm
[17:06:27 CET] <phobosoph> so many different players
[17:06:29 CET] <furq> whereas hls is just an m3u8 file full of mpegts segments
[17:06:39 CET] <BtbN> Or mp4 fragments
[17:06:40 CET] <furq> or full of other m3u8 files
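i.e. a media playlist is just plain text along these lines (illustrative values only):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:6
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:6.000000,
    seg000.ts
    #EXTINF:6.000000,
    seg001.ts
    #EXT-X-ENDLIST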
[17:07:03 CET] <phobosoph> hm, now I tend towards HLS because it requires less js fallback
[17:07:18 CET] <furq> they're both fine
[17:07:28 CET] <furq> as far as client experience goes
[17:07:39 CET] <phobosoph> ah, so the client doesn't notice a difference
[17:07:44 CET] <furq> like i said youtube used both and nobody ever noticed
[17:08:06 CET] <furq> and i think they now use dash for live streams and nobody noticed
[17:08:35 CET] <BtbN> Don't most sites use some kind of self-baked WebSocket-Straight-to-MSE construct nowadays?
[17:08:50 CET] <furq> i'd rather not think about that
[17:09:04 CET] <BtbN> At least when you inspect the players, you end up with a wss:// url for the video
[17:09:47 CET] <furq> yeah nowadays if youtube-dl doesn't already support it i just assume it's not worth it
[17:10:28 CET] <furq> someone smarter than me has probably already tried and given up
[17:10:35 CET] <phobosoph> https://bitmovin.com/status-mpeg-dash-today-youtube-netflix-use-html5-beyond/
[17:11:01 CET] <phobosoph> This looks like the best opensource player solution out there: https://github.com/videojs/videojs-contrib-dash
[17:11:12 CET] <phobosoph> video.js for DASH and HLS fallback
[17:11:13 CET] <furq> didn't safari actually secretly support mse for years but only for whitelisted sites, and the whitelist was netflix.com
[17:11:25 CET] <phobosoph> furq: woa, microsoft-style competition practices :/
[17:11:28 CET] <furq> or the ios native video element thing
[17:11:34 CET] <furq> which i assume is some safari component
[17:11:40 CET] <phobosoph> furq: they didn't get in trouble with that whitelisting competition thing?
[17:11:49 CET] <furq> idk i heard that once from someone in here
[17:12:06 CET] <furq> you could tell me anything about apple doing that sort of thing and i'd uncritically believe it without checking
[17:12:11 CET] <furq> it's like the dprk at this point
[17:13:31 CET] <phobosoph> ah
[17:13:45 CET] <phobosoph> furq: so let's say I want to use that https://github.com/videojs/videojs-contrib-dash thing
[17:13:56 CET] <phobosoph> I have to set up a directory on the web server to host the video files, right?
[17:14:00 CET] <furq> yeah
[17:14:03 CET] <phobosoph> and a m3u and dashwhatever playlist thing?
[17:14:11 CET] <phobosoph> can I autogenerate them? or is it just some piece of text?
[17:14:19 CET] <furq> ffmpeg will generate them if you use the hls muxer
[17:14:24 CET] <furq> along with all the segments
[17:14:38 CET] <furq> or the dash muxer i guess
[17:14:49 CET] <phobosoph> ah cool
[17:14:56 CET] <phobosoph> furq: and it can generate both, for dash and for hls?
[17:15:13 CET] <furq> i don't think so but i've never tried
[17:15:54 CET] <phobosoph> ah
[17:16:00 CET] <furq> at least not for the same set of segments
[17:17:18 CET] <furq> https://www.ffmpeg.org/ffmpeg-formats.html#dash-2
[17:17:19 CET] <furq> hls_playlist: Generate HLS playlist files as well.
[17:17:21 CET] <furq> cool
[17:17:24 CET] <phobosoph> https://github.com/videojs/http-streaming
[17:17:24 CET] <phobosoph> ah
[17:18:10 CET] <furq> i don't think there's any reason to use both though
[17:18:26 CET] <furq> if the browser supports mse and javascript then it supports both hls and dash
[17:19:21 CET] <phobosoph> furq: so I just use dash.js
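For completeness, the single-command variant furq found above, where the dash muxer also writes HLS playlists over the same segments (a sketch; the codecs and file names are assumptions):

    # writes out.mpd plus HLS playlists (master.m3u8 etc.) next to the segments
    ffmpeg -i in.mp4 -map 0 -c:v libx264 -c:a aac -f dash -hls_playlist 1 out.mpd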
[18:53:51 CET] <phobosoph> In Audacity one can store the stereo channel difference instead of each channel separately - this should yield the same results, right?
[18:53:57 CET] <phobosoph> because it is digital?
[18:58:29 CET] <kepstin> phobosoph: need more context. what codec?
[18:59:05 CET] <phobosoph> mp3
[18:59:06 CET] <kepstin> note that most lossy codecs automatically decide whether to use stereo or mid/side (stereo difference), sometimes even on a per-frame basis
[18:59:11 CET] <kepstin> there shouldn't be any config needed
[18:59:40 CET] <kepstin> with mp3 (lame encoder), you normally want to use one of the preset modes, and it'll automatically use mid/side as appropriate
[18:59:52 CET] <furq> lame uses joint stereo by default
[19:01:10 CET] <furq> it yields better results with lossy codecs because it takes less space
[19:01:34 CET] <kepstin> it takes less space on signals where there's correlation between the two channels, anyways
[19:01:41 CET] <kepstin> which is true of most musc
[19:02:01 CET] <furq> well like you said it'll only be enabled for frames where it's of benefit
[19:03:39 CET] <kepstin> phobosoph: anyways, it's an internal implementation detail of how the codec works, and isn't something most people need to look at let alone touch :/
[19:04:27 CET] <furq> i assume it's some encoder configuration checkbox
[19:05:01 CET] <furq> there's no point messing with anything other than -V or -b for lame
[19:05:09 CET] <kepstin> in some special cases where you have unusual audio signals you might notice reduced stereo separation, in which case having the option to manually force dual-mono encoding could be useful.
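In ffmpeg terms that override lives on the libmp3lame wrapper (a sketch; -q:a 2 is just an example VBR level):

    # joint_stereo 0 forces plain L/R coding instead of letting lame choose mid/side
    ffmpeg -i in.wav -c:a libmp3lame -q:a 2 -joint_stereo 0 out.mp3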
[20:09:08 CET] <phobosoph> ah
[20:09:23 CET] <phobosoph> another thing: When I got a normal headset microphone, the input is mono, right? :)
[20:09:30 CET] <phobosoph> so I could just use a mono channel in e.g. audacity?
[20:11:11 CET] <pink_mist> completely depends on the microphone in question
[20:12:21 CET] <JEEB> aughey: anyways not sure if you can get much more help unless you post some code onto a pastebin or so
[20:12:49 CET] <JEEB> because it feels like something is awry or expecting something else (since I've been able to convert YCbCr to RGB just fine)
[20:15:31 CET] <aughey> Here's my code. https://github.com/aughey/ffmpegtest/blob/master/FFMPEGPipeC2.cpp minimally modified from the standard examples to output the video buffers to stdout.
[20:16:25 CET] <aughey> There are at least 3 attempts to get it to work correctly. All have the same results.
[20:21:35 CET] <JEEB> good ol' swscale
[20:22:06 CET] <kepstin> aughey: i don't have any context for what's wrong, but at a quick glance at your code, that's using the old apis, hmm. And you generally don't want to preallocate the buffers for video frames when decoding, they'll normally be automatically allocated.
[20:22:36 CET] <JEEB> aughey: I used to use swscale ages ago
[20:22:38 CET] <JEEB> https://github.com/jeeb/matroska_thumbnails/blob/master/src/matroska_thumbnailer.cpp#L321..L338
[20:22:43 CET] <JEEB> that's the left-over history of that one :P
[20:23:51 CET] <kepstin> if you're using swscale, you need to use the allocation methods that add a bit of extra space to the lines/frames for alignment reasons. i forget what those are offhand.
[20:24:14 CET] <JEEB> apparently that he is doing
[20:24:21 CET] <JEEB> also what on earth is that avpicture_fill doing
[20:25:34 CET] <JEEB> also thankfully we've gained a lot of helpers for this stuff
[20:26:05 CET] <aughey> I don't need to use swscale; it's somewhat left over from modifying other examples in an attempt to get this to work. I should be able to write out the planes directly from avcodec_decode_video2
[20:26:35 CET] <aughey> I've tried that too, similar results
[20:26:54 CET] <JEEB> then something's weird
[20:27:21 CET] <aughey> I know my GPU rendering of the frame buffer is correct, because if I extract out raw video frames using the command line ffmpeg.exe, I get correct results
[20:27:33 CET] <kepstin> i'm missing something here, what's the bad results that you're getting?
[20:27:44 CET] <aughey> I'm writing my own frame extractor so that I can seek and do other processing.
[20:28:05 CET] <JEEB> note: if you are planning on wanting frame-exact seeking in the end you might want to look at ffms2
[20:28:06 CET] <aughey> "-hide_banner -loglevel panic -fflags nobuffer -flags low_delay -ss " + start_offset + " -i " + filename + " -f rawvideo -vcodec rawvideo pipe:1"
[20:28:11 CET] <JEEB> that does indexing on the lavf level
[20:28:24 CET] <aughey> This is the command line I generate to generate raw frames with ffmpeg
[20:28:32 CET] <kepstin> note that ffmpeg's rawvideo output is *not* the same as ffmpeg's in-memory video representation
[20:28:33 CET] <JEEB> yea, that gives you no extra stride I think
[20:28:43 CET] <aughey> this outputs native yuv420p frames which my shader on the gfx side handles correctly
[20:29:11 CET] <aughey> So if I write out each line of the frame (in the commented out section of my code), I get the same skewed results
[20:29:25 CET] <kepstin> ffmpeg's native representation of planar videos requires using separate pointers for the different planes, and using stride arithmetic to handle the extra line padding for alignment reasons
[20:29:33 CET] <aughey> Here's the distorted frame I get https://github.com/aughey/ffmpegtest/blob/master/skew.JPG
[20:30:14 CET] <JEEB> aughey: I think you should first minimize the example to only have demux+decode there. you will get an AVFrame from lavc
[20:30:45 CET] <aughey> in https://github.com/aughey/ffmpegtest/blob/master/FFMPEGPipeC2.cpp around line 179 I should be doing the stride math
[20:30:49 CET] <JEEB> lavc is able to generate the buffers itself, and you should just have an AVFrame (structure, not buffers) allocated for it
[20:31:17 CET] <kepstin> yeah, drop all your manual buffer allocation stuff
[20:31:23 CET] <JEEB> I've clearly gotten the same stuff to work just fine, and various GPU based things also work
[20:31:44 CET] <JEEB> as in, things that then move the buffer to the GPU and render with a shader
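The stride handling kepstin describes comes down to this when dumping a decoded yuv420p AVFrame (a C sketch; assumes lavc allocated the frame's buffers itself):

    #include <stdio.h>
    #include <libavutil/frame.h>

    /* Write one yuv420p frame, honoring each plane's linesize (stride)
     * rather than assuming the lines are packed back to back. */
    static void write_yuv420p(const AVFrame *f, FILE *out) {
        for (int p = 0; p < 3; p++) {
            int w = p ? f->width  / 2 : f->width;   /* chroma is subsampled 2x */
            int h = p ? f->height / 2 : f->height;
            for (int y = 0; y < h; y++)
                fwrite(f->data[p] + y * f->linesize[p], 1, w, out);
        }
    }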
[20:34:55 CET] <ry> Does anyone have an idea of what syntax I would use with ffmpeg to send an IP stream to broadcast (so all hosts on the same subnet would receive it)? (This is an experiment; I know multicast is the way to go for production use.)
[20:43:09 CET] <JEEB> no idea, I've just used multicast on the interface generally
[20:45:28 CET] <aughey> When you say to use lavc, what do you mean?
[20:50:03 CET] <JEEB> aughey: libavcodec's short hand
[20:50:06 CET] <JEEB> lavf is libavformat
[20:50:16 CET] <JEEB> lavfi is libavfilter
[20:53:35 CET] <kepstin> ry: it should in theory be as simple as using the subnet broadcast address as the destination address
[20:53:51 CET] <kepstin> with udp, of course.
[20:57:05 CET] <ry> kepstin, I get permission denied trying to use udp://10.27.0.255:4900 -- as an example: "sudo ffmpeg -stream_loop -1 -re -i CNBC-WDC-180213-0901.ts -map 0 -c copy -f mpegts udp://10.27.0.255:4900" ends up with the following error: "av_interleaved_write_frame(): Permission denied", "Error writing trailer of udp://10.27.0.255:4900: Permission denied"
[20:58:26 CET] <ry> Am I doing anything wrong here?
[20:59:22 CET] <kepstin> looks like you need to enable the broadcast option on the udp protocol handler, https://www.ffmpeg.org/ffmpeg-protocols.html#udp
[21:00:55 CET] <ry> Ah, that did it. Thanks kepstin
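i.e. the working form is the original command with the broadcast flag added to the URL:

    ffmpeg -stream_loop -1 -re -i CNBC-WDC-180213-0901.ts -map 0 -c copy \
        -f mpegts "udp://10.27.0.255:4900?broadcast=1"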
[21:05:03 CET] <aughey> So I have an example that uses demuxing_decoding.c straight from the ffmpeg examples. This should just be using lavc. This is the stock code and my only changes are to neuter the printfs and audio writing, and to direct the fwrite for video to stdout. I get the same thing. And again, the same draw code using the output of ffmpeg rawvideo gives correct results.
[21:05:20 CET] <aughey> I've been on this issue for days, I'm at my wits end. :-)
[21:14:58 CET] <JEEB> aughey: I really can't speak of the examples but since so many can get it right, there is something that's utilizing the buffer wrong, one some side
[21:30:05 CET] <nicolas17> ry: note there's a multicast address that means "all hosts", I think that should work the same
[21:34:42 CET] <ry> This started with me talking to a network engineer who was completely unable to utilize multicast on their network (not for technical reasons, but due to not-well-thought-out corp policy). They are streaming to broadcast, and using IP Directed Broadcast on their Cisco hardware to get those broadcast IP streams across VLANs. They have this working and demo'ed it for me - that was the first I'd even heard of IP Directed Broadcast, let alone doing IP streams with broadcast. I'd like to try it out in my lab, and see what I can get it working with.
[21:38:03 CET] <ddubya> is there a better way to detect errors in files (e.g. bitrot) besides -err_detect ?
[21:41:25 CET] <nicolas17> keep checksums of your files :P
[21:41:52 CET] <TheAMM> (let zfs keep checksums and repair your files)
[21:45:21 CET] <pink_mist> yeah, zfs will detect that
[21:45:39 CET] <pink_mist> and if you have raidz it will repair it
[21:47:40 CET] <ddubya> thanks
[21:51:00 CET] <ddubya> was wondering if there was a filter combination that might indicate a problem
[21:51:22 CET] <ddubya> most formats/codecs don't implement a crc for some reason
[21:53:29 CET] <ddubya> maybe a noise detector?
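One common full-decode integrity check, short of filesystem-level checksums (a sketch; decodes every stream, discards the output, and prints only errors):

    # any decode-level corruption ends up in decode-errors.log
    ffmpeg -v error -i file.mkv -map 0 -f null - 2>decode-errors.log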
[21:54:43 CET] <srandrews> Hi everyone, I'm a novice user of ffmpeg (well, save for audio) and am attempting to get it to input RTSP from a hardware HEVC encoder that supports multiple streams. It appears to set up properly and gets the RTSP SDP, but then times out after mapping the input to the output. A hardware decoder and VLC both receive the RTSP stream fine, which rules out the encoder's configuration as the root cause. https://pastebin.com/xYERHWhL
[22:16:29 CET] <srandrews> facepalm. Ignore my Q. -rtsp_transport udp is obviously wrong since the encoder is streaming tcp.
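For the archive, the corrected shape of that command (a sketch; the URL and output name are placeholders):

    # force TCP interleaved RTSP transport instead of UDP
    ffmpeg -rtsp_transport tcp -i rtsp://encoder/stream -c copy out.mp4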
[00:00:00 CET] --- Thu Dec 19 2019