[Ffmpeg-devel-irc] ffmpeg.log.20191215
burek
burek at teamnet.rs
Mon Dec 16 03:05:02 EET 2019
[00:36:18 CET] <BtbN> grosso, both of those functions are deprecated.
[00:37:05 CET] <BtbN> And generally, feeding a de/encoder with an empty input signals EOF.
[00:37:36 CET] <BtbN> So after it has dumped all queued frames out, it is not supposed to be used again, and behaviour is undefined.
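A minimal sketch of the drain sequence BtbN describes, using the current send/receive API (error handling trimmed; a decoder is shown, an encoder drains the same way via avcodec_send_frame(ctx, NULL)):

    #include <libavcodec/avcodec.h>

    static void drain_decoder(AVCodecContext *dec, AVFrame *frame)
    {
        avcodec_send_packet(dec, NULL);            /* NULL packet = enter draining mode */
        while (avcodec_receive_frame(dec, frame) >= 0) {
            /* ... consume the remaining buffered frames ... */
            av_frame_unref(frame);
        }
        /* receive has now returned AVERROR_EOF; don't feed the context again
         * unless it is first reset with avcodec_flush_buffers(dec) */
    }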
[02:16:44 CET] <bodqhrohro_> Why don't I have a libx264 options section in ffmpeg -h?
[02:18:33 CET] <BtbN> Because libx264 options are not ffmpeg options.
[02:18:55 CET] <BtbN> If all the encoder/decoder/filter/... options were in there, it'd be a mess.
[02:19:24 CET] <bodqhrohro_> BtbN: so where did they go now?
[02:19:40 CET] <BtbN> Read the top of the -h output...
[02:19:50 CET] <BtbN> Also, that's been like that for forever, not sure if it ever was different.
[02:20:07 CET] <bodqhrohro_> Oh, full. Thanks.
[02:20:42 CET] <BtbN> full is several thousand lines, you usually don't want that.
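For a single component's private options there is also per-component help, which avoids the huge full dump, e.g.:

    ffmpeg -h encoder=libx264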
[02:42:18 CET] <bodqhrohro_> The -chromaoffset seems to be ignored even if I set -subq 5 -psy 0 -trellis 0 :/ Why?
[06:04:48 CET] <byte4byte_> wtf?
[06:04:54 CET] <byte4byte_> is there an op around?
[06:05:01 CET] <byte4byte_> I'm being harassed via PM by a user named blb
[06:20:58 CET] <f00lest> I am generating images from a video using the transcoding example and the image2 oformat, but I am not able to generate jpg images
[06:21:41 CET] <f00lest> is there any way I can ask ffmpeg to convert to png images?
[06:22:33 CET] <f00lest> I've tried using `avformat_alloc_output_context2(&ofmt_ctx, av_guess_format("image2", NULL, NULL), NULL, "image%d.jpg");`
[06:23:04 CET] <f00lest> but the image viewer says file is not jpeg image
[06:54:35 CET] <f00lest> I've also tried this on line 128 of the transcoding.c example `encoder = avcodec_find_encoder(AV_CODEC_ID_MJPEG);`
[06:54:45 CET] <f00lest> but that throws a segmentation fault
[09:08:31 CET] <f00lest> http://dranger.com/ffmpeg/tutorial01.html
[09:08:58 CET] <f00lest> Can I use dranger's first tutorial
[09:09:23 CET] <f00lest> replace the AV_PIX_FMT from RGB to something else and get jpeg pictures
[09:12:52 CET] <f00lest> has anyone tried to get jpeg pictures using dranger's tutorial and a video file?
[09:16:52 CET] <f00lest> does anyone know how to write headers for a jpeg file? I think AV_PIX_FMT_YUVJ420P is what I need.
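For reference, the MJPEG encoder expects full-range AV_PIX_FMT_YUVJ420P input, and each packet it emits is already a complete JPEG file, so no extra headers are needed. A rough sketch, assuming the decoded frame has already been scaled/converted to YUVJ420P (null checks and some error handling trimmed):

    #include <stdio.h>
    #include <libavcodec/avcodec.h>

    /* encode one YUVJ420P frame as a standalone JPEG file */
    static int write_jpeg(const AVFrame *frame, const char *filename)
    {
        const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_MJPEG);
        AVCodecContext *ctx  = avcodec_alloc_context3(codec);
        AVPacket *pkt        = av_packet_alloc();
        FILE *f;
        int ret;

        ctx->width     = frame->width;
        ctx->height    = frame->height;
        ctx->pix_fmt   = AV_PIX_FMT_YUVJ420P;   /* full-range YUV, what MJPEG wants */
        ctx->time_base = (AVRational){1, 25};   /* must be set; value is arbitrary here */

        if ((ret = avcodec_open2(ctx, codec, NULL)) < 0 ||
            (ret = avcodec_send_frame(ctx, frame)) < 0 ||
            (ret = avcodec_receive_packet(ctx, pkt)) < 0)
            goto end;

        f = fopen(filename, "wb");              /* the packet is the whole JPEG */
        fwrite(pkt->data, 1, pkt->size, f);
        fclose(f);
    end:
        av_packet_free(&pkt);
        avcodec_free_context(&ctx);
        return ret;
    }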
[16:28:49 CET] <void09> anyone know how to make a black&white image mask representing the logo position in a video?
[16:29:28 CET] <void09> automagically, that is : ) so I can use this in a script to remove the tv logo in a video at a keypress
[16:30:31 CET] <void09> something like, take some screenshots at various points in the video, and see what pixels match in all of them
[16:31:00 CET] <void09> and somehow excluding black bars
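One rough way to approximate this with ffmpeg alone (the filters are real, but the sampling interval and thresholds below are guesses, and the black-bar problem is not handled): average widely spaced frames so everything except the static logo blurs out, edge-trace the result into a black&white mask, then feed that mask to the removelogo filter:

    # average 32 frames sampled one per minute; only static overlays survive
    ffmpeg -i in.mkv -vf "fps=1/60,tmix=frames=32,select='eq(n,31)'" -frames:v 1 avg.png
    # trace the surviving edges into a rough mask (touch up in an image editor)
    ffmpeg -i avg.png -vf "edgedetect=low=0.1:high=0.4" mask.png
    ffmpeg -i in.mkv -vf removelogo=mask.png out.mkv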
[16:33:56 CET] <scramblez> I've converted a webm UHD/HDR video from YouTube to mp4 and see the "Side data:" field is not carried across into mp4. Is this how it is meant to work? Is the mp4 container not suitable for HDR content?
[16:34:18 CET] <scramblez> This is the output from the two videos: http://dpaste.com/39EK36A
[16:34:50 CET] <JEEB> there's a colr atom but generally formats have their own headers in place so you don't need the info on the container side
[16:35:07 CET] <JEEB> google did the thing it did with webm because they had standardized color spaces not-so-well with VP9
[16:35:18 CET] <JEEB> so they more or less were like "uhh, yea. we totally didn't think this through back then"
[16:35:45 CET] <scramblez> JEEB: I C ... so mp4 will still carry across the full HDR luminance and colors?
[16:36:10 CET] <JEEB> depends on what you did, if it's just remux the content is the same
[16:36:18 CET] <JEEB> and it seems like you got the colorspace values through?
[16:36:28 CET] <JEEB> since you have BT.2020 and SMPTE ST.2084 marked there
[16:36:33 CET] <scramblez> Yes, just remux into mp4 container. The video stream was just copied over.
[16:36:54 CET] <scramblez> Yes, the gamut, but not sure about luminance?
[16:37:18 CET] <JEEB> the only thing you lost is metadata that says that the content will not have more than 1000 nits
[16:37:33 CET] <JEEB> which is a helper thing for things trying to play it
[16:37:41 CET] <scramblez> I C
[16:37:44 CET] <JEEB> mostly because that metadata in most formats is in the video stream
[16:37:55 CET] <JEEB> and I don't think the colr atom in mp4 has that
[16:38:07 CET] <JEEB> (for formats that don't have that in the video stream itself)
[16:38:18 CET] <scramblez> So why is this "Side data" added in the stream? How is it meant to be used?
[16:38:55 CET] <JEEB> the mastering display one is mostly informational and probably useless for playback. the content light level thing is a helper for playback
[16:39:09 CET] <JEEB> so that when you're tone mapping the content to the screen, you know that you don't need to consider stuff higher than 1000 nits
[16:39:14 CET] <JEEB> thankfully a lot of stuff is dynamic anyways :P
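As an aside, that tone mapping can also be baked in at transcode time; a commonly cited recipe, assuming an ffmpeg build with libzimg (zscale) and placeholder file names:

    ffmpeg -i hdr_in.mp4 -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p" -c:v libx264 sdr_out.mp4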
[16:39:24 CET] <scramblez> Thanks JEEB, what you say makes sense.
[16:39:34 CET] <JEEB> like the video renderer in mpv which dynamically calculates the brightness
[16:40:31 CET] <scramblez> Sadly I played it on the TV, not a PC. The TV comes with its own video player and limited formats compatibility.
[16:41:03 CET] <JEEB> yes, most things ain't gonna take VP9 in MP4
[16:41:07 CET] <JEEB> I would guess?
[16:41:19 CET] <JEEB> not that I've tested :P
[16:43:20 CET] <scramblez> Another Q, perhaps not relevant to ffmpeg - I created a series of sinusoidal tones for testing different frequencies. I would like to measure (with a laptop) the SPL output from the speakers as these frequencies are played back. Could I use ffmpeg to do this?
[16:44:24 CET] <scramblez> So something which shows the dB output would be what I'm after.
[16:45:49 CET] <scramblez> arecord has a VU meter, but was wondering if ffmpeg can display dB on sound as captured by a mic.
[16:52:16 CET] <pink_mist> scramblez: ffmpeg doesn't really do display stuff, there's ffplay, but it's mainly a debugging tool
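That said, the plain CLI can print capture levels without any display; a sketch assuming a Linux build with ALSA capture:

    # record 5 seconds from the default mic and print mean/max volume in dB
    ffmpeg -f alsa -i default -t 5 -af volumedetect -f null -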
[16:59:08 CET] <Hello71> also it's meaningless.
[16:59:17 CET] <Hello71> you need to calibrate either the speaker or the microphone first
[17:05:10 CET] <scramblez> Hello71: I was going to play back each frequency and see what dB I get out of the speakers for each one of those.
[17:05:23 CET] <Hello71> ... how.
[17:05:49 CET] <scramblez> Sorry, I don't understand. :-/
[17:09:16 CET] <scramblez> I play a file I prepared with various frequencies, set the AVR to a loud enough level so I can hear it, and then use a laptop with a mic to display the level of each frequency using some app (e.g. arecord)
[17:10:03 CET] <scramblez> If some frequencies show up with a much lower dB level, then I have a speaker/subwoofer tuning issue
[17:12:10 CET] <Hello71> no
[17:12:21 CET] <Hello71> if the speaker is not flat, why would the microphone be flat
[17:13:38 CET] <Hello71> if you have some professional microphone with calibrated frequency response curves then you can normalize it
[17:14:00 CET] <Hello71> and then you have reinvented speaker calibration
[17:22:26 CET] <scramblez> Hello71: yes, I see what you're saying.
[17:24:30 CET] <scramblez> I guess I'd be better off using my ears. :-)
[17:34:26 CET] <scramblez> Thank U all, need to run now.
[22:11:31 CET] <rigid> ahoy
[22:12:26 CET] <rigid> is there a cpu-friendly way to switch encoding between two v4l2 devices without restart?
[22:14:40 CET] <rigid> i'm currently offering a stream using "ffmpeg -f video4linux2 [...] -i /dev/video0 [...] -f flv -listen 1 rtmp://hostname:8080/cam" it would be nice if I could switch that stream to /dev/video1 and back without interruption
[22:40:18 CET] <kepstin> rigid: the ffmpeg cli tool isn't designed for that sort of thing, but it is technically possible if you make the command read both inputs in parallel and use the "streamselect" filter to pick one. You can then use filter commands (e.g. via the zmq filter) to tell it to switch which input is selected.
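A rough, untested sketch of that setup, assuming an ffmpeg build with libzmq and the device/URL names from above:

    ffmpeg -f video4linux2 -i /dev/video0 -f video4linux2 -i /dev/video1 \
        -filter_complex "[0:v][1:v]streamselect=inputs=2:map=0[sel];[sel]zmq[out]" \
        -map "[out]" -c:v libx264 -f flv -listen 1 rtmp://hostname:8080/cam

    # then, from another shell, switch to the second camera
    # (zmqsend is built from the ffmpeg source tree with 'make tools/zmqsend'):
    echo Parsed_streamselect_0 map 1 | tools/zmqsend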
[22:41:16 CET] <rigid> ah, hm... i thought a way would be more on the v4l2 side, but that's even better
[22:41:22 CET] <durandal_1707> only designed to work with files
[22:41:54 CET] <kepstin> oh, does streamselect not discard frames from the non-selected source?
[22:42:10 CET] <rigid> i failed to find "streamselect". with that it's probably straightforward
[22:42:13 CET] <rigid> kepstin: thanks
[22:42:15 CET] <kepstin> huh, guess that won't work then
[22:42:30 CET] <rigid> kepstin: so it reads both sources at the same time?
[22:42:43 CET] <rigid> that'd probably be cpu-hungry
[22:42:52 CET] <rigid> but better than nothing
[22:43:06 CET] <durandal_1707> yes, and it discards the unused one; designed to work with files only
[22:43:14 CET] <kepstin> oh, no, streamselect uses framesync, it should work ok with this use case
[22:43:36 CET] <durandal_1707> but such stuff was never tested
[22:44:00 CET] <rigid> i will test it
[22:44:25 CET] <rigid> if really "everything is a file" it should work :-P
[22:44:42 CET] <kepstin> it's one of those "i don't see why it wouldn't work" kind of things
[22:45:21 CET] <kepstin> v4l cameras are kernel timestamped, i think, so you shouldn't be getting timestamp drift between multiple cameras
[22:45:45 CET] <rigid> i could imagine that there's a lag if a video device needs some time to come up, but i wouldn't care as long as the stream doesn't drop
[22:46:43 CET] <kepstin> the main issue would be that ffmpeg opens inputs sequentially, so it'll start one camera before the other. dunno if this'll actually cause any issues other than lag.
[22:47:03 CET] <rigid> suppose not
[22:47:23 CET] <rigid> in theory i could also restart ffmpeg, but it'd be nice if I could avoid that lag
[22:47:47 CET] <kepstin> also if one camera is slower framerate than the other, the fact that ffmpeg doesn't have input threads might cause issues.
[22:47:48 CET] <rigid> i'm playing it back with ffplay and it needs some quirks to be low-latency
[22:48:09 CET] <kepstin> or might not, i dunno if there's kernel buffering :/
[22:48:13 CET] <rigid> wouldn't it switch framerate in-stream?
[22:48:19 CET] <rigid> i suppose that's not possible
[22:48:46 CET] <kepstin> no, the issue is that the ffmpeg cli tool as a whole is single-threaded, so if it gets blocked waiting for a frame from one camera, it might not be able to read from the other.
[22:48:59 CET] <kepstin> i've never tried multiple v4l inputs, i dunno if that's actually an issue with v4l
[22:49:39 CET] <rigid> we'll see, i'll try tomorrow. thanks a lot for your suggestions!
[22:50:16 CET] <kepstin> this is all in all kind of a hack, ffmpeg cli tool is not designed to work as a realtime video mixer :)
[23:04:36 CET] <Arnob> test
[23:04:46 CET] <Arnob> Hi, how can I change the color of the waveform in this command: ffmpeg -y -i 1.wav -loop 1 -i b.jpg -filter_complex "[0:a]showwaves=s=1280x720:mode=line,colorkey=0x000000:0.01:0.1,format=yuva420p[v];[1:v][v]overlay[outv]" -map "[outv]" -pix_fmt yuv420p -map 0:a -c:v libx264 -c:a copy -shortest output1.mkv Thank you.
[23:13:25 CET] <furq> Arnob: showwaves=s=1280x720:mode=line:colors=0xff0000|0x0000ff
[23:13:49 CET] <furq> also you shouldn't need colorkey and format, showwaves draws on a transparent background
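Putting that into the original command (single color shown; colorkey and the format conversion dropped, since showwaves already outputs transparency):

    ffmpeg -y -i 1.wav -loop 1 -i b.jpg -filter_complex "[0:a]showwaves=s=1280x720:mode=line:colors=0xff0000[v];[1:v][v]overlay[outv]" -map "[outv]" -pix_fmt yuv420p -map 0:a -c:v libx264 -c:a copy -shortest output1.mkv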
[23:46:51 CET] <Kaedenn1> I'm trying to compile a program using mingw (on x86_64 Linux) that uses the libav* libraries. Can I just use the prebuilt libraries and header files for Windows, or do I have to compile it from source?
[23:47:17 CET] <Kaedenn1> I was told not to try compiling ffmpeg for Windows.
[23:48:38 CET] <johnjay> are regular non-bluray DVDs typically interlaced?
[23:49:15 CET] <DHE> Kaedenn1: you can use the prebuilt libraries if you have the right headers, which it sounds like you do...
[23:49:37 CET] <Kaedenn1> I don't. I'd need to grab those too.
[23:49:47 CET] <Kaedenn1> The entire ffmpeg-dev stuff; includes and libs.
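For the record, linking against a prebuilt Windows dev package from mingw on Linux is the usual -I/-L/-l dance; a minimal sketch, assuming the dev package (headers plus import libraries) is unpacked at ./ffmpeg-dev:

    x86_64-w64-mingw32-gcc myprog.c -o myprog.exe \
        -I ffmpeg-dev/include -L ffmpeg-dev/lib \
        -lavformat -lavcodec -lavutil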
[23:50:26 CET] <kepstin> johnjay: it really varies. most dvds of hollywood film releases in the past 10 years won't be interlaced
[23:50:30 CET] <kepstin> tv shows usually are
[23:50:57 CET] <kepstin> weird stuff from the mid-2000s is sometimes mixed interlaced/progressive/telecined because lol
[23:51:23 CET] <johnjay> i looked on wikipedia to figure out if dvds were progressive scan or not
[23:51:30 CET] <johnjay> it says.. 240p? that doesn't seem right
[23:52:00 CET] <kepstin> dvds can encode progressive scan content at 24fps (along with instructions on how to apply telecine to display on an analogue video signal)
[23:52:31 CET] <johnjay> yeah i thought a plain dvd movie could be 720p
[23:52:34 CET] <johnjay> is that wrong?
[23:52:37 CET] <kepstin> 30fps "progressive" is possible by doing a 1:1 pulldown, i.e. splitting the frame into a top field and bottom field
[23:52:40 CET] <johnjay> or is it 720i?
[23:52:55 CET] <kepstin> that's wrong, dvd only encodes standard definition
[23:53:10 CET] <johnjay> ok
[23:53:39 CET] <johnjay> i found this old dvd player with component cables on it, was wondering if it's worth using
[23:53:40 CET] <kepstin> so 480i60 or 480p24 or mixed/switching is possible on ntsc dvds
[23:53:59 CET] <kepstin> better to rip the dvd to digital video on a pc
[23:54:02 CET] <johnjay> so it can achieve higher framerate by interlacing? radical
[23:54:12 CET] <kepstin> higher field rate, not frame rate
[23:54:22 CET] <kepstin> (well, it's 30 frames/s, so technically higher frame rate too)
[23:55:44 CET] <kepstin> (i'm not sure exactly what frame/field rates are supported on pal dvd, i suspect it might be only 576i50)
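When ripping such discs, the fieldmatch documentation suggests a chain for mixed telecined/interlaced NTSC material: match fields back into progressive frames, deinterlace only what is still combed, then drop the duplicate frames (file names here are placeholders):

    ffmpeg -i dvd_title.vob -vf "fieldmatch,yadif=deint=interlaced,decimate" -c:v libx264 out.mkv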
[23:56:18 CET] <johnjay> interesting.
[23:56:38 CET] <johnjay> i guess if i'm going to watch an SD (i.e. standard dvd) movie then there's no point in getting a dvd player with hdmi
[23:56:47 CET] <johnjay> i already have this old one in the den where the tv is anyway
[23:57:06 CET] <johnjay> the only reason to buy something would be to buy a blu-ray, correct?
[23:57:07 CET] <kepstin> you'll always get a better picture using a digital cable instead of analogue
[23:57:22 CET] <kepstin> and most (all?) blu-ray players can also play dvds
[23:57:30 CET] <johnjay> because of the digital to analog conversion?
[23:58:02 CET] <kepstin> yeah. composite video (single wire, yellow cable ends) is particularly bad.
[23:59:01 CET] <johnjay> i vaguely knew that the component cables are better than the yellow one yeah
[23:59:05 CET] <johnjay> but didn't know the reason
[23:59:09 CET] <kepstin> if you're using an lcd tv, then using an analogue cable means it's going from digital (on the dvd) to analogue (on the cable) back to digital (in the tv)
[23:59:36 CET] <johnjay> ouch
[23:59:55 CET] <kepstin> you will get noticeably better picture using hdmi instead of composite
[00:00:00 CET] --- Mon Dec 16 2019