[Ffmpeg-devel-irc] ffmpeg.log.20191004
burek
burek at teamnet.rs
Sat Oct 5 03:05:02 EEST 2019
[00:05:56 CEST] <c_14> not that I know of
[00:06:32 CEST] <brimestone> Not a book, but I found this awesome tutorial. https://github.com/brimestoned/ffmpeg-libav-tutorial
[00:12:38 CEST] <pink_mist> found? the link makes it look like something you created
[00:13:01 CEST] <brimestone> I forked it just in case it goes away..
[00:13:08 CEST] <pink_mist> ahh
[00:13:15 CEST] <brimestone> Originally https://github.com/leandromoreira/ffmpeg-libav-tutorial
[03:01:43 CEST] <analogical> does the mp4 container support the vp9 video codec?
[03:03:09 CEST] <furq> yes
[03:03:13 CEST] <furq> do players support it? probably not
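If a VP9 stream already exists (say in a WebM), remuxing just the video into MP4 is a quick way to test player support; with a reasonably recent ffmpeg no re-encode should be needed (filenames here are placeholders):

    ffmpeg -i input.webm -an -c:v copy output.mp4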
[03:53:41 CEST] <Kaedenn> I have an mjpeg video file that I'd like to add a title card to.
[03:53:57 CEST] <Kaedenn> Text that fades in, displays for a few seconds, then fades out
[03:54:57 CEST] <Kaedenn> Should I just generate the frames using some image program or can ffmpeg generate all this for me?
[03:55:16 CEST] <Kaedenn> White text, black background, fixed durations for the fade in, display, and fade out.
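ffmpeg can generate such a card itself with the color source plus the drawtext and fade filters; a rough sketch, assuming a build with drawtext enabled (size, rate, durations and text are placeholders, and fontfile= may be needed if the build lacks fontconfig):

    ffmpeg -f lavfi -i "color=c=black:s=1280x720:r=30:d=5" -vf "drawtext=text='My Title':fontcolor=white:fontsize=72:x=(w-text_w)/2:y=(h-text_h)/2,fade=t=in:st=0:d=1,fade=t=out:st=4:d=1" -c:v mjpeg -q:v 3 title.avi

Matching the size, frame rate and pixel format of the existing clip makes it easier to concatenate the two afterwards (e.g. with the concat demuxer).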
[13:52:38 CEST] <DanielTheKitsune> Hi there, what should be the ./configure options to allow me to compile/install ffmpeg sources into a custom directory? I already have the binary package provided by Debian, but it lacks some functions I need, and so, I want to have two ffmpeg builds (the Debian one and this custom one).
[13:53:02 CEST] <DanielTheKitsune> let's say it's going to sit at /home/joe/ffmpeg/
[13:58:13 CEST] <DHE> well by default you get a static link (of the ffmpeg libraries), so you can just take the ffmpeg binary produced, rename it if need be, and copy it wherever you want and it should just work
[13:58:39 CEST] <DHE> do you need more than just the main ffmpeg (and if applicable, ffprobe and ffplay) binaries installed?
[13:59:56 CEST] <DanielTheKitsune> do I need to change the prefix or something?
[14:00:26 CEST] <DanielTheKitsune> ffmpeg and the included libraries (of which I installed their -dev packages), that's all I need, I think
[14:00:26 CEST] <kepstin> DanielTheKitsune: that depends on your answer to DHE's question
[14:00:51 CEST] <DanielTheKitsune> only ffmpeg
[14:01:18 CEST] <DanielTheKitsune> (and its corresponding tools)
[14:01:49 CEST] <kepstin> right, then DHE's answer will work for you.
[14:02:04 CEST] <DanielTheKitsune> ok, so no need to touch prefixes at all
[14:02:18 CEST] <DanielTheKitsune> just ./configure <and link the proper libraries>, make and make install
[14:03:28 CEST] <DHE> or just ./configure, make, cp ffmpeg /usr/bin # or whatever
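A minimal sketch of that flow, with /home/joe/ffmpeg used as a placeholder prefix (by default configure links ffmpeg's own libraries statically, so --prefix only controls where make install copies things):

    ./configure --prefix=/home/joe/ffmpeg
    make -j"$(nproc)"
    make install
    /home/joe/ffmpeg/bin/ffmpeg -version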
[14:04:17 CEST] <DanielTheKitsune> when I set the prefix to /home/joe/ffmpeg, the files get installed into that directory during make install
[14:04:43 CEST] <DanielTheKitsune> however, when I attempt to execute that specific binary, it appears to clash with the existing Debian build
[14:05:51 CEST] <DanielTheKitsune> like, WARNING: Library configuration mismatch
[14:06:05 CEST] <DanielTheKitsune> first it shows the configure options correctly
[14:06:54 CEST] <DanielTheKitsune> after the warning, it shows the configure options of the Debian build instead, repeated exactly 8 times
[14:07:06 CEST] <DanielTheKitsune> and it behaves like the Debian build instead
[14:07:26 CEST] <DanielTheKitsune> what did I do wrong, exactly?
[14:07:45 CEST] <DHE> sounds like a dynamic library build of ffmpeg found the libs from the wrong version it was built against
[14:08:28 CEST] <DanielTheKitsune> how can I force it to use its own libraries instead?
[14:08:41 CEST] <DanielTheKitsune> (so as not to use the Debian build ones)
[14:08:53 CEST] <DHE> well did you run ./configure --enable-shared ?
[14:08:58 CEST] <DanielTheKitsune> no
[14:09:11 CEST] <pink_mist> you'll either want to build it statically or change the order ld looks through directories for libraries to load
[14:09:25 CEST] <DanielTheKitsune> the only "strange" thing I do is to add --prefix=/home/joe/ffmpeg
[14:09:27 CEST] <pink_mist> or perhaps use rpath magic, but that would also require rebuilding afaik
[14:09:47 CEST] <DHE> I believe it requires a relink at least
[14:09:47 CEST] <DanielTheKitsune> how do I build statically?
[14:09:57 CEST] <DanielTheKitsune> yeah, I won't mind having to recompile
[14:10:02 CEST] <DHE> well that's the thing. I figured it would have been static if you didn't specify --enable-shared or --disable-static
[14:10:14 CEST] <DHE> the actual installed ffmpeg binary should be fairly large, like 20 megabytes
[14:10:58 CEST] <DanielTheKitsune> the executable itself I just compiled is 275 KB, dang
[14:11:06 CEST] <DanielTheKitsune> no wonder it's failing
[14:11:07 CEST] <DHE> well that's definitely a shared lib
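A quick way to confirm what a given binary actually links against is ldd (the path is a placeholder); if it prints libav* paths under /usr/lib, the binary is resolving the distro's shared libraries rather than the freshly built ones:

    ldd ./ffmpeg | grep libav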
[14:11:51 CEST] <DanielTheKitsune> now, how do I enforce static linking? somehow the shared lib thing was enabled without notice
[14:12:36 CEST] <DanielTheKitsune> in fact, neither --enable-shared nor --disable-static appears in the configure options I used when I compiled the thing
[14:12:39 CEST] <DHE> in your build dir, is there a .so file or a .a file named libavcodec.* under the directory libavcodec
[14:12:45 CEST] <DanielTheKitsune> yeah
[14:12:50 CEST] <DHE> this might be the linker selecting system libs over specified libs
[14:13:22 CEST] <DanielTheKitsune> wait
[14:13:24 CEST] <DanielTheKitsune> hrm
[14:13:30 CEST] <DanielTheKitsune> cd ..
[14:13:32 CEST] <DanielTheKitsune> ls
[14:13:35 CEST] <DanielTheKitsune> oops
[14:13:38 CEST] <pink_mist> lol
[14:13:44 CEST] Action: DHE needs to have that XKCD on macro
[14:14:19 CEST] <DanielTheKitsune> such a dir only exists under the include/ dir
[14:14:48 CEST] <DanielTheKitsune> and there, only .h files are present
[14:15:18 CEST] <DHE> that's not the source code to ffmpeg, that's the ffmpeg-dev package contents. are you looking in /usr/include?
[14:15:23 CEST] <DanielTheKitsune> libavcodec.so is in the lib directory
[14:15:36 CEST] <DanielTheKitsune> DHE: nah, in my own build
[14:15:46 CEST] <DanielTheKitsune> so where should I stare at
[14:16:16 CEST] <DHE> the directory where you unpacked ffmpeg or checked out some git code. there should be a configure script in front of you
[14:16:39 CEST] <DanielTheKitsune> right
[14:17:07 CEST] <DHE> ls libavcodec/libavcodec.*
[14:17:24 CEST] <DanielTheKitsune> yes, both .so and .a files exist
[14:17:52 CEST] <DHE> okay, that's probably why
[14:18:07 CEST] <DanielTheKitsune> what should I do with them
[14:18:28 CEST] <DHE> what's the timestamps on the files?
[14:18:48 CEST] <DHE> are they identical? or is the .so much older?
[14:18:55 CEST] <DanielTheKitsune> for the .a file, October 2
[14:19:18 CEST] <DanielTheKitsune> the .so.58 file (which the .so is symlinked to) is Sep 26
[14:19:47 CEST] <DHE> rm */*.so* ffmpeg_g ffmpeg ; make ffmpeg
[14:19:49 CEST] <DanielTheKitsune> the .a file is 103 MB, the .so is 62 MB
[14:20:09 CEST] <DanielTheKitsune> oh, should I really destroy every .so file?
[14:20:23 CEST] <DHE> in the ffmpeg build directory, yes
[14:20:28 CEST] <DanielTheKitsune> ok
[14:20:38 CEST] <DHE> obviously not from /usr/lib[64] or anything
[14:21:02 CEST] <DanielTheKitsune> hrm
[14:21:12 CEST] <DanielTheKitsune> ok, removed the specified files
[14:21:18 CEST] <DanielTheKitsune> so only make ffmpeg?
[14:21:49 CEST] <DHE> for now, yes
[14:21:56 CEST] <DanielTheKitsune> ok, nice
[14:21:57 CEST] <DHE> just needs a relink
[14:22:11 CEST] <DanielTheKitsune> and then all I need is to run ./ffmpeg (and do my own business there)
[14:22:52 CEST] <DanielTheKitsune> yeah, no library clashing anymore, now I just need to test if it encodes amr-nb files (which is what I actually compiled it for)
[14:24:24 CEST] <DanielTheKitsune> huh, I appear to have forgotten the syntax, -encoders will help
[14:25:14 CEST] <DanielTheKitsune> oh, I just needed _ instead of -
[14:26:00 CEST] <DanielTheKitsune> yeah, it works, thanks, really, you saved my day :D
[14:26:18 CEST] <DHE> /usr/share/Adobe/doc/example/android_vm/root/sbin/ls.jar: Error: Device is not responding
[14:26:21 CEST] <DHE> cool that works
[14:27:04 CEST] <DanielTheKitsune> how do I disable smart resample dither/noise?
[14:27:31 CEST] <DanielTheKitsune> so that ffmpeg does a dumb resample instead of smartly introducing noise to hide the effects of resampling
[14:29:49 CEST] <durandal_1707> see aresample and swresample options
[14:29:55 CEST] <furq> -dither_scale 0
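Put together, disabling the dither during a downsample might look like the line below (filenames and rate are placeholders); dither_scale is a libswresample option that the ffmpeg CLI passes through to the resampler, and it can equally be set on the filter as -af aresample=dither_scale=0:

    ffmpeg -i input.flac -ar 8000 -ac 1 -dither_scale 0 output.wav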
[14:30:10 CEST] <DanielTheKitsune> yeah, indeed, amr_nb definitely does what I want: hyper low bitrate and still audible quality
[14:30:28 CEST] <furq> did you try opus
[14:30:57 CEST] <DanielTheKitsune> yeah, unsupported by my dumbphone
[14:30:57 CEST] <furq> or is 6kbps too much
[14:31:00 CEST] <furq> oh fun
[14:31:31 CEST] <DanielTheKitsune> the same dumbphone records .AMR from microphone, presumably hardware-encoded by the 2G controller
[14:32:02 CEST] <DanielTheKitsune> look for "NYX xyn306", that's my dumbphone, appears to be exclusively sold in one country tho
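For reference, an AMR-NB encode along these lines should work, assuming a build configured with --enable-libopencore-amrnb --enable-version3; the codec only accepts 8 kHz mono and one of its fixed bitrates (4.75k, 5.15k, 5.9k, 6.7k, 7.4k, 7.95k, 10.2k, 12.2k):

    ffmpeg -i input.wav -ar 8000 -ac 1 -c:a libopencore_amrnb -b:a 7.4k output.amr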
[15:16:11 CEST] <phobosoph> hi channel
[15:16:15 CEST] <phobosoph> you were very helpful btw
[15:16:27 CEST] <phobosoph> the webcam got h264+, but sadly youtube doesn't like the keyframe frequency
[15:16:36 CEST] <phobosoph> although it is set to 60, which at 30fps would be a keyframe every 2 seconds
[15:16:40 CEST] <phobosoph> youtube tells me it wants 4 seconds or less
[15:16:51 CEST] <phobosoph> well, I disabled h264+ and now I can manually specify the keyframe freq
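If the camera itself can't be talked into a sane interval, re-encoding in ffmpeg is another option; a sketch for a 30 fps source (the camera URL and stream key are placeholders), where -g 60 forces a keyframe every 2 seconds, comfortably inside YouTube's 4-second limit:

    ffmpeg -i rtsp://CAMERA_ADDRESS/stream -c:v libx264 -preset veryfast -g 60 -keyint_min 60 -c:a aac -f flv rtmp://a.rtmp.youtube.com/live2/STREAM_KEY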
[15:45:25 CEST] <king09x> is it normal for the length of the adaptation field of an mpegts packet to be reported as larger than the packet itself? Can that field span multiple packets or must it fit within the 188 bytes?
[17:41:26 CEST] <snatcher> what's the modern alternative to phash for video comparison?
[17:43:32 CEST] <kepstin> snatcher: (joking, but) upload to youtube and see whether you get a content-id match
[17:52:44 CEST] <snatcher> https://amiaopensource.github.io/ffmprovisr/index.html#generate_video_fingerprint hmm
[17:56:11 CEST] <snatcher> how does it work? what's the right way to set the compression level / get a more compressed result?
[17:57:38 CEST] <kepstin> huh, that's this filter https://www.ffmpeg.org/ffmpeg-filters.html#signature-1 I've never used it :/
[17:57:45 CEST] <kepstin> it has nothing to do with compression...
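For reference, that recipe boils down to the signature filter writing a fingerprint file alongside a null output (names are placeholders; format can be binary or xml):

    ffmpeg -i input.mp4 -an -vf signature=format=xml:filename=input_signature.xml -f null -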
[18:05:02 CEST] <snatcher> is there such a thing for audio?
[18:13:41 CEST] <kepstin> snatcher: take a look at https://acoustid.org/ maybe
[18:20:51 CEST] <snatcher> seems so, ffmpeg supports chromaprint
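A minimal chromaprint run might look like this, assuming ffmpeg was built with --enable-chromaprint (by default it writes a base64 fingerprint, here to stdout; the filename is a placeholder):

    ffmpeg -i input.flac -vn -f chromaprint -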
[19:21:57 CEST] <AndyAndyBoBandy> Hello! I'm using -ss and -to to extract segments of video files (with -c copy) and it's all great. But sometimes I'd like to cut a segment that is at the end of the video, and I'd rather not have to try to specify the final timestamp. I thought maybe `-to '-0'` would do the trick, but no. Is there another way to specify the final timestamp?
[19:23:52 CEST] <TheAMM> Just don't specify it?
[19:25:05 CEST] <AndyAndyBoBandy> TheAMM: ohh, without '-to' at all? Thanks!
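So a cut that runs to the end of the file is just a start time with no end time, e.g. (times and names are placeholders):

    ffmpeg -ss 00:42:00 -i input.mp4 -c copy tail.mp4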
[19:29:40 CEST] <Diag> can/will ffmpeg do a video source type detection from input devices?
[19:42:44 CEST] <kepstin> Diag: depends on the input device
[19:43:02 CEST] <Diag> Yknow that's a good-ass point
[20:12:07 CEST] <Diag> kepstin: is it possible to get it to list what the current mode of a dshow device is?
[20:16:25 CEST] <Diag> oh god im retarded
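For reference, dshow can enumerate devices and the modes they advertise, though as far as I know it doesn't report which mode is currently selected (the device name is a placeholder):

    ffmpeg -f dshow -list_devices true -i dummy
    ffmpeg -f dshow -list_options true -i video="USB Capture Device"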
[21:56:23 CEST] <snatcher> what's the difference between filter and muxer concepts? signature is filter while chromaprint is muxer, why?
[22:01:45 CEST] <klaxa> a muxer works on bytestreams, a filter works on raw frames
[22:03:29 CEST] <furq> you could probably implement a signature muxer or a chromaprint filter if you wanted to
[22:04:01 CEST] <furq> i'm guessing chromaprint is a muxer because it sends the data to an external lib rather than doing it internally
[22:04:19 CEST] <furq> not like that's a requirement obviously
[22:14:27 CEST] <snatcher> https://ffmpeg.org/ffmpeg-formats.html#chromaprint-1 is fp_format option broken?
[22:15:14 CEST] <snatcher> doesn't work for me in this case: ffmpeg -i FILE -vn -f chromaprint -fp_format raw FILE.raw
[22:16:17 CEST] <snatcher> the related error output: https://clbin.com/XaxGN
[22:17:23 CEST] <snatcher> other options like -algorithm work (or at least don't raise an error), I think
[22:28:55 CEST] <Diag> another stupid question
[22:29:03 CEST] <Diag> can you set how the video is scaled in ffplay
[22:33:42 CEST] <snatcher> hmm, is chromaprint even worth using to compare audio? I tried with 1.flac converted to 1.opus and ended up with different fingerprints for 1.opus and 1.flac
[22:46:19 CEST] <snatcher> what's the shortest way to check if a video has an audio stream?
[22:46:34 CEST] <BtbN> define shortest
[22:46:44 CEST] <BtbN> least time? Shortest commandline?
[22:46:51 CEST] <snatcher> BtbN: both
[22:47:05 CEST] <BtbN> You can always just play the video and listen.
[22:47:13 CEST] <BtbN> pretty fast, pretty short commandline
[22:48:00 CEST] <BtbN> Or just ffprobe it and see if it lists any audio streams.
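A short ffprobe form of that check (the filename is a placeholder); it prints one line per audio stream and nothing at all if there is none:

    ffprobe -v error -select_streams a -show_entries stream=codec_type -of csv=p=0 input.mp4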
[23:04:24 CEST] <kepstin> Diag: no, it's a super-simple player, it just uses SDL for the video window.
[23:04:34 CEST] <kepstin> Diag: if you care, use a real player like mpv :)
[23:04:40 CEST] <Diag> wow rude
[23:04:44 CEST] <Diag> i want a refund
[23:04:58 CEST] <kepstin> Diag: you can of course software scale the video with a filter and then display it with ffplay
[23:05:03 CEST] <Diag> thats legit what i did lol
[23:05:14 CEST] <Diag> it adds a small amount of latency though which, eh
[23:05:35 CEST] <Diag> when im using it for what i want to use it for it wont matter if its scaled with nearest neighbor, so yaknow
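For the record, that software-scale setup looks roughly like this (device name and size are placeholders; flags=neighbor picks nearest-neighbour scaling, and -fflags nobuffer trims some input buffering latency):

    ffplay -fflags nobuffer -f dshow -i video="USB Capture Device" -vf scale=1920:1080:flags=neighbor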
[23:06:40 CEST] <kepstin> ffplay is intended mostly as a combination of ffmpeg example application and a test program to reproduce issues and narrow down if they're in ffmpeg or a player application. It just happens to also work as a minimal standalone player.
[23:07:13 CEST] <Diag> yee, i was just trying to get the lowest dshow latency as possible
[23:07:20 CEST] <Diag> which it actually does
[23:19:22 CEST] <cehoyos> Diag: I don't see the beginning of the discussion but you can also software-scale with ffplay
[23:59:07 CEST] <snatcher> why is the output different for "-i INPUT -vn -map_metadata -1 -c:a copy -fflags +bitexact -flags:a +bitexact -f chromaprint OUTPUT" and "-i INPUT -vn -f chromaprint OUTPUT"?
[00:00:00 CEST] --- Sat Oct 5 2019