[Ffmpeg-devel-irc] ffmpeg.log.20190817

burek burek at teamnet.rs
Sun Aug 18 03:05:03 EEST 2019


[05:02:43 CEST] <argent0> hi, why is this command, intended to generate white noise audio, also generating video when I pipe it to ffplay? ffmpeg -ar 44100 -ac 2 -f s16le -i /dev/urandom -c:a copy -t 2 -f nut - | ffplay -f nut -
[05:03:26 CEST] Action: argent0 wants to generate white noise audio and play it at the same time
[05:04:56 CEST] <furq> argent0: ffplay -nodisp
[05:05:27 CEST] <argent0> furq: but is the output audio only?
[05:05:30 CEST] <furq> also you can just use ffmpeg -f lavfi -i anoisesrc
[05:05:39 CEST] <furq> and yes
[05:06:22 CEST] <nicolas17> you could output to a file and check with ffprobe :P
[05:08:24 CEST] <argent0> furq: nicolas17: Thanks
[05:12:36 CEST] <argent0> now ffplay keeps running
[05:13:01 CEST] Action: argent0 ffmpeg -f lavfi -i anoisesrc -c:a copy -t 2 -f nut - | ffplay -f nut -nodisp -
[05:16:36 CEST] Action: argent0 i used `ffmpeg ... | mpv -`
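A minimal sketch of the workflow settled on above, assuming two seconds of noise is enough; ffplay's -autoexit flag makes it quit when the stream ends, which is what was missing when "ffplay keeps running":

    # generate 2 s of white noise and play it with no video window, exiting at end of stream
    ffmpeg -f lavfi -i anoisesrc=d=2 -c:a pcm_s16le -f nut - | ffplay -nodisp -autoexit -f nut -

    # or hand the stream to mpv instead, as argent0 did
    ffmpeg -f lavfi -i anoisesrc=d=2 -c:a pcm_s16le -f nut - | mpv --no-video -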
[12:20:30 CEST] <Foaly> i'm using the blend filter with addition
[12:20:35 CEST] <Foaly> but it comes out purple
[12:20:40 CEST] <Foaly> why is that?
[12:49:07 CEST] <Foaly> :(
[13:26:28 CEST] <goel> Hi, is it possible to pipe compressed video data from an RTMP stream into ffmpeg and have it output the decoded images? The compressed data from an RTMP stream would be FLV1.
[13:27:49 CEST] <goel> I accidentally disconnected.
[13:28:12 CEST] <goel> I just wanted to know if ffmpeg supported piping in data to decode?
[13:28:45 CEST] <klaxa> i don't see why not
[13:28:46 CEST] <DHE> yes, you can use stdin for a source
[13:29:29 CEST] <goel> Ok, so I would just have to specify the type of video data coming in for the decoder? E.g. H263/FLV1 or H264?
[13:29:56 CEST] <goel> That's brilliant, thank you anyway!
[13:30:55 CEST] <DHE> ffmpeg attempts autodetection if it's detectable and wrapped in something parsable
[13:31:28 CEST] <goel> I mean it wouldn't really be in a container, it would just be raw bytes.
[13:31:42 CEST] <goel> A bytestream which needs to be decoded as FLV1 or H264
[13:32:04 CEST] <DHE> h264 elementary stream is a thing... not sure about flv1
[13:32:59 CEST] <goel> Ok, that's fine, it's not a problem at all. Thank you everyone.
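A hedged sketch of what DHE describes: a raw H.264 elementary stream has no container, so the demuxer must be named explicitly with -f; the file names are placeholders. FLV1 (Sorenson Spark) normally travels inside an FLV wrapper, so piping the FLV container and letting ffmpeg demux it is the safer route there.

    # decode a raw H.264 elementary stream from stdin into numbered PNG frames
    cat stream.h264 | ffmpeg -f h264 -i - frames_%05d.png

    # for FLV1, pipe the FLV container itself
    cat stream.flv | ffmpeg -f flv -i - frames_%05d.png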
[14:43:13 CEST] <Foaly> is there a way to make the "blend" filter run in rgb color format?
[14:44:12 CEST] <Foaly> even when i convert both inputs to rgb24 or rgba, it still adds an "auto_scaler" that converts it to yuv
[14:45:24 CEST] <c_14> looking at the query_formats list in vf_blend.c makes it look like bgra should work
[14:45:37 CEST] <c_14> eeh, gbr
[14:45:48 CEST] <Foaly> thx, i will try
[14:47:06 CEST] <c_14> gbrp or gbrap
[14:47:47 CEST] <fling> Is there an ultimate guide on muxing multiple live sources together?
[14:48:00 CEST] <Foaly> great, it works!
[14:48:02 CEST] <Foaly> thank you!
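A sketch of the fix that worked here, assuming two same-size inputs with placeholder names: forcing both branches to gbrp before blend keeps the addition in RGB instead of letting an auto-inserted scaler drop back to YUV (the purple output from earlier).

    ffmpeg -i a.mp4 -i b.mp4 -filter_complex \
      "[0:v]format=gbrp[a];[1:v]format=gbrp[b];[a][b]blend=all_mode=addition,format=yuv420p" \
      -c:v libx264 out.mp4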
[14:48:04 CEST] <fling> Like multiple v4l devices into different streams in a single file
[14:48:09 CEST] <fling> Maybe with sound too
[14:48:17 CEST] <fling> Can't I sync two pulse devices together btw?
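fling's question gets no answer in this log; a hedged sketch of one way to do it, with placeholder device names: give each capture its own -f/-i pair and use -map to keep every stream in the output. Keeping live devices in sync is the hard part and may need per-input timestamp options.

    # two v4l2 cameras plus one pulse capture, kept as separate streams in one Matroska file
    ffmpeg \
      -f v4l2 -i /dev/video0 \
      -f v4l2 -i /dev/video1 \
      -f pulse -i default \
      -map 0:v -map 1:v -map 2:a \
      -c:v libx264 -preset veryfast -c:a aac out.mkv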
[15:15:37 CEST] <snatcher> is there such a thing as "volume level" of audio?
[15:18:13 CEST] <DHE> relatively speaking yes. there is a "maximum volume" that can be produced, though the measurement is normalized since it doesn't directly correspond to real-world volume levels
[15:23:28 CEST] <snatcher> what's the unit of measurement in such a case? for example: audio1 "volume level" is 1, audio2 is 2, so audio2 is louder than audio1. is there a filter/tool?
[15:25:11 CEST] <furq> snatcher: https://en.wikipedia.org/wiki/DBFS
[15:26:48 CEST] <furq> not to be confused with dB SPL
[15:28:58 CEST] <snatcher> is there a way to extract it with ffmpeg?
[15:29:53 CEST] <furq> https://ffmpeg.org/ffmpeg-filters.html#volumedetect
[15:47:44 CEST] <fling> audio snatcher!
[17:01:07 CEST] <kepstin> snatcher: note that if you're concerned about perceptual loudness - i.e. if a person would think A or B is louder - you want https://en.wikipedia.org/wiki/LKFS (also called LUFS). The ebur128 filter in ffmpeg can measure this.
[17:08:08 CEST] <DHE> sounds like a competitor to ZFS or btrfs...
[17:08:12 CEST] <DHE> (joke)
[17:16:20 CEST] <durandal_1707> really bad one
[17:23:37 CEST] <pink_mist> it wasn't particularly good, but at least it wasn't offensive ... saying it was really bad is a bit harsh
[17:24:41 CEST] <durandal_1707> i feel insulted
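A sketch of the two measurements mentioned above, on a placeholder input: volumedetect reports peak and mean sample levels in dBFS, ebur128 reports integrated loudness in LUFS.

    # peak and mean sample levels (dBFS)
    ffmpeg -i input.wav -af volumedetect -f null -

    # EBU R128 integrated loudness, loudness range and true peak
    ffmpeg -i input.wav -af ebur128=framelog=verbose -f null -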
[18:14:24 CEST] <Foaly> is there a trick to correctly overlay one video with alpha on top of another
[18:14:27 CEST] <Foaly> ?
[18:14:30 CEST] <Foaly> or rather, png sequences
[18:14:58 CEST] <durandal_1707> tried trick with overlay filter?
[18:15:24 CEST] <Foaly> i use overlay=format=gbrp:alpha=straight
[18:17:14 CEST] <Foaly> wait, i might be really dumb
[18:21:28 CEST] <Foaly> yeah, i just had the order wrong :(
[18:21:36 CEST] <Foaly> sorry, it works fine of course
[18:22:20 CEST] <durandal_1707> i thought ffmpeg was wrong all these years
[18:23:41 CEST] <durandal_1707> just for one second
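For reference, the input order that tripped things up: overlay draws its second input on top of the first, so the background comes first and the alpha PNG sequence second. File names and frame rate are placeholders.

    ffmpeg -i background.mp4 -framerate 25 -i fg_%04d.png -filter_complex \
      "[0:v][1:v]overlay=format=gbrp:alpha=straight" \
      -c:v libx264 -pix_fmt yuv420p out.mp4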
[19:17:55 CEST] <Oldest> why is the ffprobe.exe file as big as ffmpeg.exe?
[19:18:54 CEST] <durandal_1707> because it is static build
[19:19:12 CEST] <Oldest> i understand that but ffprobe doesn't do anything like ffmpeg
[19:19:14 CEST] <DHE> because it has all the core ffmpeg features. really the encoders and muxers are about the only things you could omit
[19:19:36 CEST] <Oldest> ffmpeg is the file that does everything, ffprobe file is just for probing
[19:20:02 CEST] <DHE> you say that, but you can write a filter chain and have ffprobe tell you about the resulting output
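A sketch of what DHE means, using ffprobe's lavfi virtual input; the movie source and scale filter here are just an illustration.

    # report the properties of the filtered stream rather than the raw file
    ffprobe -f lavfi -i "movie=input.mp4,scale=1280:720" -show_streams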
[20:03:58 CEST] <Krock> Hello everyone. Is there an option to playback or process videos with GLSL using ffmpeg/ffplay?
[20:04:21 CEST] <furq> have you tried mpv
[20:05:23 CEST] <Krock> yes, but is it also capable of processing videos?
[20:05:36 CEST] <BtbN> It's purely render shaders, as far as I'm aware.
[20:05:39 CEST] <BtbN> ffmpeg has no such support
[20:06:08 CEST] <Krock> so it's up to the front end software to add shader support on top of ffmpeg?
[20:06:44 CEST] <BtbN> ffmpeg generally does not interface with OpenGL at all
[20:06:56 CEST] <BtbN> There used to be opengl stuff, but I think it's all gone?
[20:07:00 CEST] <Oldest> krock what is GLSL and why do you need that
[20:07:03 CEST] <BtbN> And it was only some weird renderer
[20:07:03 CEST] <Krock> originally came here from VLC, but apparently only mpv supports that so far
[20:07:15 CEST] <BtbN> FFmpeg focuses on OpenCL
[20:09:51 CEST] <Krock> Oldest: GLSL shaders can be applied to videos for visual effects such as sharpening or upscaling
[20:11:39 CEST] <BtbN> They require OpenGL though, as the name suggests
[20:12:29 CEST] <furq> there are thirdparty glsl filters for ffmpeg
[20:12:37 CEST] <furq> i assume they won't get merged because of dependencies
[20:13:00 CEST] <BtbN> There is no OpenGL interop at all, no compatible frames. So those are bound to be pretty inefficient
[20:14:03 CEST] <Krock> I see. Thanks for the answers :)
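Since FFmpeg's GPU filtering goes through OpenCL rather than OpenGL, a hedged sketch of running one of the *_opencl filters; the device index and the choice of unsharp_opencl are illustrative.

    ffmpeg -init_hw_device opencl=ocl:0.0 -filter_hw_device ocl -i in.mp4 \
      -vf "format=yuv420p,hwupload,unsharp_opencl=lx=7:ly=7:la=1.5,hwdownload,format=yuv420p" \
      -c:v libx264 out.mp4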
[20:25:14 CEST] <jph> howdy. anyone know a way to get ffmpeg (or really, ffprobe) to show if an audio stream is VBR?
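jph's question doesn't get answered below; one hedged heuristic is to have ffprobe print per-packet sizes and see whether they vary, which works reasonably well for frame-based codecs like MP3 (some codecs vary packet size even at nominal CBR, so treat it as an indication, not proof). The file name is a placeholder.

    # a wide spread of packet sizes strongly suggests VBR
    ffprobe -v error -select_streams a:0 -show_entries packet=size -of csv=p=0 input.mp3 | sort -n | uniq -c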
[20:26:04 CEST] <der_richter> don't ask in mpv, it can't use the render output as encoding input
[20:26:25 CEST] <der_richter> your best bet, if you can program, might be libplacebo
[20:27:38 CEST] <der_richter> i was assuming encoding since you asked here, if it's just playback, yeah mpv..
[23:07:38 CEST] <poutine> Is there a guaranteed order with the optional PES header's data? if pts/dts indicator, escr flag, and es rate information had their proper bits set, would I expect them to be in that order?
[23:08:17 CEST] <DHE> mpegts headers?
[23:08:36 CEST] <poutine> DHE: yes, MPEG-2 PES header's Optional PES Header
[23:13:14 CEST] <Thomas_J> Is there a way to get ffmpeg to import part of its command line from files or external input?
[23:13:58 CEST] <nicolas17> I don't think so, but if you're on non-Windows you can let the shell do that for you
[23:14:42 CEST] <nicolas17> ffmpeg $(cat args.txt)
[23:15:00 CEST] <durandal_1707> you can set filtergraphs from file
[23:15:54 CEST] <Thomas_J> I was just thinking that. I should be able to build the ffmpeg command string with a bash script. Is there any documentation on using args?
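A sketch of both approaches mentioned here, with placeholder file names: splicing arguments in via the shell (fragile if an argument contains spaces; a bash array avoids that), and loading a filtergraph from a file.

    # shell-level splicing, as suggested above
    ffmpeg $(cat args.txt)

    # safer: one argument per line, read into a bash array
    mapfile -t args < args.txt
    ffmpeg "${args[@]}"

    # filtergraphs can be read from a file
    ffmpeg -i in.mp4 -filter_complex_script graph.txt out.mp4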
[23:18:24 CEST] <Aerroon> i don't understand what loudnorm does
[23:18:31 CEST] <Aerroon> i read about it and it all makes sense to me
[23:18:39 CEST] <Aerroon> but when i use it the results are nothing like what i read about
[23:18:59 CEST] <Aerroon> the idea is great: it makes audio sound roughly at a similar loudness, right?
[23:19:21 CEST] <Aerroon> but when i actually test it, i have two audio clips with almost the same integrated loudness, yet one sounds MUCH louder
[23:19:48 CEST] <Aerroon> and even the peak loudness is lower on the louder clip according to loudnorm
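For what it's worth, the documented way to get predictable results from loudnorm is the two-pass workflow: measure first with print_format=json, then feed the measured values back in on the second pass; single-pass loudnorm normalizes dynamically and can change the character of a clip, which may be part of what is being observed here. The target and measured numbers below are placeholders.

    # pass 1: measure only
    ffmpeg -i in.wav -af loudnorm=I=-16:TP=-1.5:LRA=11:print_format=json -f null -

    # pass 2: apply, plugging in the values printed by pass 1
    ffmpeg -i in.wav \
      -af loudnorm=I=-16:TP=-1.5:LRA=11:measured_I=-20.1:measured_TP=-3.2:measured_LRA=6.5:measured_thresh=-30.6:linear=true \
      -ar 48000 out.wav    # loudnorm resamples internally, so set the output rate explicitly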
[00:00:00 CEST] --- Sun Aug 18 2019

