[Ffmpeg-devel-irc] ffmpeg.log.20180625

burek burek021 at gmail.com
Tue Jun 26 03:05:01 EEST 2018


[00:30:02 CEST] <teratorn> asked this a couple times so sorry if I missed any answers...
[00:30:11 CEST] <teratorn> any tips or tricks to profile an ffmpeg filter-graph execution?
[01:27:50 CEST] <kepstin> teratorn: what are you trying to accomplish that makes you think profiling an ffmpeg filter-graph is the solution?
[01:30:23 CEST] <teratorn> I want to sanity-check my particular solution that combines multiple videos, still images and various effects using a fairly complicated filter-graph... just want to make sure I'm not screwing up badly somewhere and making it much slower than a straight transcode
[01:31:04 CEST] <Cracki> >much slower than a straight transcode
[01:31:10 CEST] <Cracki> they're filters. they cost something.
[01:31:28 CEST] <teratorn> knowing how much cpu time each node takes would give me some insight
[01:31:30 CEST] <Cracki> and you use them because you have to.
[01:31:42 CEST] <teratorn> sure, I get that
[01:31:45 CEST] <Cracki> yes true, I understand the desire to see which costs what
[01:32:20 CEST] <Cracki> I have no clue, but perhaps looking at the threads and their call stacks could be insightful.
[01:32:31 CEST] <Cracki> that is, if filter graphs are modeled as a thread per filter...
[01:33:14 CEST] <teratorn> could be.. I'm not sure what the graph execution model is
[01:37:09 CEST] <kepstin> they're not really modeled like that. for the most part, ffmpeg filters are all single-threaded and run in the thread of whatever created/calls into the filter graph
[01:38:34 CEST] <teratorn> I'm using the front-end
[01:40:11 CEST] <teratorn> I guess I can just time a transcode of my output, vs how long it took my invocation of ffmpeg to produce that output
[01:40:59 CEST] <kepstin> anyways, ffmpeg doesn't contain any code for measuring execution time per filter instance. Profiling it with a regular code profiling tool would be tricky, particularly if you have multiple instances of a filter.
[01:42:25 CEST] <kepstin> something that might actually be useful is running ffmpeg with -v verbose and making sure that you're not getting a lot of auto-inserted scale filters for pixel format conversions between filters.
[01:43:14 CEST] <kepstin> another thing to keep an eye out for is avoiding overlay filters when you can use hstack or vstack.
[01:44:44 CEST] <kepstin> other than that, presumably you put in a filter because you want to use it. So if it makes things slow, well, :/
[01:54:49 CEST] <furq> teratorn: run it with a null output
[01:54:56 CEST] <furq> -f null -
[01:55:32 CEST] <furq> it's not really a great profiling tool but it will at least tell you how fast the full filterchain processing is
[01:55:37 CEST] <teratorn> kepstin: ok, I /am/ doing some overlay filters
[01:56:10 CEST] <teratorn> this is what I mean... yeah as far as /I know/ i'm not using any filters I don't have to.. but maybe there are other ways to skin the same cat
[01:56:31 CEST] <teratorn> furq: good tips, thanks
[01:56:41 CEST] <furq> if you're using overlay to put videos side-by-side then that's wrong
[01:56:46 CEST] <furq> otherwise it's probably fine
[01:57:02 CEST] <teratorn> ah, cool
[01:57:18 CEST] <furq> just pastebin the filterchain if you want people to look over it
[02:05:28 CEST] <teratorn> sure if you're curious, I'm doing something like this: https://hastebin.com/raw/tahuyeyupe which produces output something like this: https://docs.zoho.com/file/0cpqed0d0433728d14865b43654a09eed8059
[02:29:01 CEST] <Cracki> that is one huge complicated filter
[02:30:28 CEST] <Cracki> maybe you want to use a video editing program for that
[02:40:42 CEST] <furq> you know you can have spaces and newlines in the filterchain, right
[02:40:46 CEST] <furq> this seems like a good time to make use of that
[03:01:30 CEST] <teratorn> Cracki, furq: it's programmatically generated to produce effects like in the sample output I pasted :)
[03:01:40 CEST] <teratorn> furq: no I wasn't aware of that; noted
[03:04:00 CEST] <kepstin> for large programmatic filters you can also write them to a separate file then use the -filter_complex_script option. That avoids command line length issues, and sometimes makes it easier to read/debug.
[03:19:09 CEST] <teratorn> kepstin: cool beans
[05:13:26 CEST] <galex-713_> hi
[05:13:37 CEST] <galex-713_> is stereo3d filter interleaved mode notably broken?
[05:13:44 CEST] <galex-713_> because I've the impression it is
[05:13:52 CEST] <galex-713_> I just tested bino, and it worked where mpv didn't
[06:37:19 CEST] <TheAMM> nvenc_h264 takes a small moment to warm up, and on realtime sources this results in a small pause/lag at the start of the output video
[06:37:40 CEST] <TheAMM> Is there some way I could initialize the encoder without using the first output frame?
[06:38:40 CEST] <TheAMM> I'm not at the machine to test it right now, but would vf trim work, or do I have to go wild with nut-piping into another ffmpeg that would drop the first (premature) frame?
[06:40:14 CEST] <TheAMM> Actually nevermind, that won't work, since I can't drop the frame without re-encoding which defeats the purpose of using nvenc for realtime encoding
[08:29:32 CEST] <pragomer> I have a video file called webm.dat7  any chance to convert this to something readable? "mediainfo" tells me nothing about that file
[08:34:11 CEST] <Cracki> where is it from
[08:34:27 CEST] <Cracki> be forthcoming with relevant info
[08:34:39 CEST] <Cracki> or else it's the spanish inquisition for you
[08:35:58 CEST] <pragomer> LOL ... it's a web video in Chrome's browser cache...
[08:37:33 CEST] <Cracki> share it if you want. or open it with a hex editor and show the first page
[08:37:48 CEST] <Cracki> ffprobe it
[08:38:23 CEST] <pragomer> mm. unfort. I cannot share it as it is part of an investigation in our company :-(... but I surely can post you the first hex-view page
[08:38:30 CEST] <pragomer> just a moment (and thank you already)
[08:39:52 CEST] <pragomer> hex-view
[08:39:53 CEST] <pragomer> https://snag.gy/f6oJ3A.jpg
[08:40:02 CEST] <pragomer> I know.. this doesn't look like any video format
[08:41:02 CEST] <Cracki> indeed. could be a fragment.
[08:41:11 CEST] <Cracki> I'd expect dat6 and dat8 to exist?
[08:43:37 CEST] <Cracki> take the file size, represent in hex, see if you can find this (or the reverse) in the hex view
[08:43:50 CEST] <Cracki> the first 1-2 lines look not-very-random
[08:44:08 CEST] <pragomer> I think too, that it is just a fragment.. so... I think it wont be worth the efforts.
[08:44:15 CEST] <Cracki> potentially some kind of header. that's all I would hope to get out of that file alone.
[08:44:16 CEST] <pragomer> but thank you in any way for your support
[08:45:25 CEST] <Cracki> there might be ways to reassemble that browser cache
[08:54:10 CEST] <furq> pragomer: does ffprobe give anything useful
[08:55:11 CEST] <pragomer> webm:  Invalid data found when processing input
[08:56:05 CEST] <pragomer> ps: originally the file has no extension, just "webm"... the dat7 extension was "defined" from our forensics tool "x-ways"
[11:35:13 CEST] <Celmor> "No such filter: 'zscale'" meh
[11:43:34 CEST] <haystack> Hi, I would like to share my screen using ffmpeg but i'm not able to host the streaming server within this company. Has anyone solved this problem already? Or can recommend a service that accept encrypted ffmpeg streams?
[11:45:00 CEST] <JEEB> Celmor: yeah, it's based on a separate library
[11:45:00 CEST] <JEEB> zimg
[11:45:05 CEST] <JEEB> https://github.com/sekrit-twc/zimg/
[11:50:31 CEST] <Celmor> which I need to first "install"?
[11:50:49 CEST] <JEEB> yes, then you build FFmpeg with it
[11:51:01 CEST] <Celmor> according to my package manager I have zimg installed, does ffmpeg not recognize that?
[11:52:11 CEST] <JEEB> if it was not built with it, then it will not have it
[11:52:23 CEST] <JEEB> since the module only gets built if it was enabled during build time
[13:51:59 CEST] <YokoBR> hi there
[13:52:14 CEST] <YokoBR> please, how can I get a remote m3u8 (hls) and re-stream it?
[13:52:37 CEST] <YokoBR> i've tried -c copy but it didn't work
[13:53:09 CEST] <YokoBR> I want to receive an hls stream and push it to a rtmp server
[13:59:41 CEST] <kepstin> assuming the codecs are compatible, i'd expect that to work.
[14:04:26 CEST] <YokoBR> https://pastebin.com/4KA6CFwm
[14:04:31 CEST] <YokoBR> I've done this so far
[14:05:14 CEST] <ozette> can ffmpeg follow that link?
[14:08:12 CEST] <ariyasu> don't need the "ffmpeg -i ffmpeg -i", just one of them
[14:08:27 CEST] <ariyasu> other than that it looks ok i guess, can you pastebin the console output
[14:11:46 CEST] <YokoBR> ariyasu: it's only one of them at once
[14:11:55 CEST] <YokoBR> but I can't forward it yet
[14:25:09 CEST] <Franciman> Hi
[14:25:23 CEST] <Franciman> av_frame_get_best_effort_timestamp is now deprecated, what should I use instead?
[15:27:48 CEST] <King_DuckZ> hi, I'm porting some old code that does something based on a codec id, eg if (codec_id == AV_CODEC_ID_DNXHD) { //blah }, but in my new code I only have the codec string name and no direct access to the ffmpeg library to convert it to a codec_id
[15:28:26 CEST] <King_DuckZ> my question is is it safe to do the same if over the string name? is there anything I should watch out for?
[15:30:27 CEST] <andreas> Hi, I want to make a slideshow from a number of images (AMB_00001.gif, AMB_00002.gif, ...). When using this command:
[15:30:27 CEST] <andreas> ffmpeg -i AMB_%05d.gif output.mp4
[15:30:27 CEST] <andreas> I get the following output:
[15:30:27 CEST] <andreas> ffmpeg version 2.8.14 Copyright (c) 2000-2018 the FFmpeg developers
[15:30:27 CEST] <andreas>   built with gcc 4.8.5 (GCC) 20150623 (Red Hat 4.8.5-16)
[15:30:28 CEST] <JEEB> just keep in mind that there can be multiple decoders or encoders for a format, although their names tend to be different :)
[15:30:29 CEST] <andreas>   configuration: --prefix=/usr --bindir=/usr/bin --datadir=/usr/share/ffmpeg --incdir=/usr/include/ffmpeg --libdir=/usr/lib64 --mandir=/usr/share/man --arch=x86_64 --optflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic' --extra-ldflags='-Wl,-z,relro ' --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libvo-amrwbenc --enable-versio
[15:30:34 CEST] <andreas> n3 --enable-bzlib --disable-crystalhd --enable-gnutls --enable-ladspa --enable-libass --enable-libcdio --enable-libdc1394 --disable-indev=jack --enable-libfreetype --enable-libgsm --enable-libmp3lame --enable-openal --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libv4l2 --enable-libx264 --enable-libx265 --enable-libxvid --enable-x11g
[15:30:41 CEST] <andreas> rab --enable-avfilter --enable-avresample --enable-postproc --enable-pthreads --disable-static --enable-shared --enable-gpl --disable-debug --disable-stripping --shlibdir=/usr/lib64 --enable-runtime-cpudetect
[15:30:44 CEST] <andreas>   libavutil      54. 31.100 / 54. 31.100
[15:30:46 CEST] <andreas>   libavcodec     56. 60.100 / 56. 60.100
[15:30:48 CEST] <andreas>   libavformat    56. 40.101 / 56. 40.101
[15:30:50 CEST] <andreas>   libavdevice    56.  4.100 / 56.  4.100
[15:30:52 CEST] <andreas>   libavfilter     5. 40.101 /  5. 40.101
[15:30:54 CEST] <andreas>   libavresample   2.  1.  0 /  2.  1.  0
[15:30:56 CEST] <andreas>   libswscale      3.  1.101 /  3.  1.101
[15:30:58 CEST] <andreas>   libswresample   1.  2.101 /  1.  2.101
[15:31:00 CEST] <andreas>   libpostproc    53.  3.100 / 53.  3.100
[15:31:02 CEST] <andreas> *.gif: No such file or directory
[15:36:28 CEST] <King_DuckZ> geez why are you spamming like that
[15:38:37 CEST] <andreas> realized that it would be annoying when I saw it myself, sry
[16:01:32 CEST] <King_DuckZ> I'm looking at how find_codec_by_name() is implemented, it seems to be just looping over a list and stopping at the first match, so I should be ok right?
[16:01:53 CEST] <King_DuckZ> I can't really find where that list comes from, but I don't think it matters at this point
[16:19:52 CEST] <King_DuckZ> anyone who can help with my question please?
[16:20:22 CEST] <JEEB> I actually replied in the middle of that spam by that other guy
[16:21:02 CEST] <King_DuckZ> JEEB: ah I didn't see that, it got lost in the spam sorry!
[16:39:18 CEST] <King_DuckZ> idk if this is correct but I ran this: ffmpeg -encoders | ag 'V.{5}\s' | sed -E 's/^\s+//' | sed -E 's/\s+/\t/g' | cut -sf2 | sort | column
[16:40:02 CEST] <King_DuckZ> and I do see several codecs with the same name
[16:40:05 CEST] <King_DuckZ> or maybe not :s
[16:42:19 CEST] <King_DuckZ> uhm not sure what I did earlier but I can't see any duplicates right now
[16:44:14 CEST] <JEEB> yea the names shouldn't have any
[16:48:47 CEST] <King_DuckZ> JEEB: I'm confused... that commands prints h264_nvenc h264_omx h264__v4l2m2m h264_vaapi, would the codec_id be the same for all them? is that what you meant?
[16:49:11 CEST] <JEEB> yes, that is what I meant. that if you ask for a decoder or encoder for a codec id, you might get something you don't expect
[16:49:25 CEST] <King_DuckZ> dammit
[16:52:25 CEST] <King_DuckZ> my problem is that I have this case: my_program -> my_lib -> ffmpeg, and ffmpeg is private to my_lib, but this logic has to be inside my_lib
[16:52:42 CEST] <King_DuckZ> so I can't access any ffmpeg function from there, not even the AVCodecID type
[17:11:13 CEST] <tuna> Hello again, I previously was here asking about hardware encoding...that works great now, however I'd like to implement hardware decoding now. My question is where should I start with the decoding...I have a software decoder setup and running...will I need a hardware frame context like with the encoder?
[17:17:48 CEST] <atomnuker> yes, it should be identical to the encoder contexts
[17:19:14 CEST] <tuna> So, I should be able to essentially copy paste that part (maybe some minor changes like size/format) but then the hardware frame that I alloc with ffmpeg, I can then expect to be filled by the nvdec?
[17:46:01 CEST] <Celmor> if only there were a downloadable ffmpeg build with zimg and other optional libs included; tried looking for docker images for ffmpeg but they don't list zimg as far as I can see
[17:46:20 CEST] <Celmor> recompiling just for library support is a hassle
[18:01:10 CEST] <Hello71> install gentoo
[18:04:25 CEST] <Celmor> arch is hard-core enough for me
[19:17:41 CEST] <Celmor> can I interrupt a conversion, play the output file to see if it's as I want it and then continue the conversion afterward?
[19:18:41 CEST] <klaxa> in some shells ctrl+z puts a process in pause, depending on your output format you can just play the output, that wouldn't require pausing the encode though
[19:19:03 CEST] <klaxa> you can put the process back in the foreground by running the "fg" command
[19:19:06 CEST] <Celmor> yeah, I know about that, but often the output isn't playable at that point
[19:19:34 CEST] <klaxa> like i said, depends on the format, e.g. mp4 is usually not playable until it is fully written
[19:19:51 CEST] <Celmor> usually using mkv
[19:20:16 CEST] <klaxa> you should be able to play that without problems i think
[19:20:24 CEST] <klaxa> (maybe with some obscure settings not)
[19:22:07 CEST] <ntd> anyone know of a no-nonsense x/xorg program that can/will just display N video inputs in a grid view?
[19:22:16 CEST] <ntd> had a look at mythtv, not a fan of the forced ui/background
[19:23:07 CEST] <ntd> been using a cfgraph, works swell if all inputs have the same fps, surely there must be some software that simply does this?
[19:23:08 CEST] <Celmor> I'd use i3wm + mpv
[19:24:05 CEST] <ntd> will mpv handle say twelve diff inputs (rtsp, http mjpeg, v4l), diff fps at the same time?
[19:24:18 CEST] <Hello71> maybe google i3wm first
[19:24:28 CEST] <Celmor> just use different mpv instances
[19:24:28 CEST] <ntd> i had a look at it before
[19:25:50 CEST] <Celmor> then what made you drop it as the solution
[19:28:16 CEST] <ntd> grid was out of sync
[19:28:48 CEST] <Celmor> you mean the video outputs?
[19:29:00 CEST] <Celmor> or mpv instances
[19:29:31 CEST] <ntd> latter
[19:29:47 CEST] <benlieb> is it possible to change the seek duration (left / right keys) using ffplay? For what I'm doing 10 second jumps are way too big...
[19:29:48 CEST] <Hello71> --video-sync=display-x
[19:29:59 CEST] <Hello71> benlieb: consider using a real media player
[19:31:00 CEST] <Cracki> Real[tm] Media Player
[19:31:21 CEST] <benlieb> @Hello71 I need to capture time frames down to the millisecond the way ffplay allows. Do you know of a media player that does this (and then allows the time to be copied or written to a file)?
[19:31:28 CEST] <benlieb> I'm using the ffplay log.
[19:31:53 CEST] <Cracki> any video editor where you can set keyframes or markers
[19:31:56 CEST] <benlieb> Am I to assume that the answer is that this 10 second value isn't configurable.
[19:32:10 CEST] <Hello71> did you read the ffplay man page
[19:32:19 CEST] <benlieb> Hello71: yup
[19:32:25 CEST] <Hello71> well then I guess not
[19:32:58 CEST] <Cracki> you can single step frames
[19:33:03 CEST] <benlieb> @Hello71 Is that because we can always assume that documentation is fully representative of what's possible?
[19:33:13 CEST] <Cracki> just singlestep the frames
[19:33:17 CEST] <Celmor> ntd, maybe this video helps somewhat https://youtu.be/TtOqqj7RhZY
[19:34:18 CEST] <ntd> scripting this is just asking for trouble afaik. some sw must be able to read N inputs, decide on a common "start" and just display them?
[19:35:09 CEST] <Cracki> if you have a "wheel/jog" input device (e.g., contour shuttle, or some DIY USB HID), map the singlestep keys to your wheel
[19:35:10 CEST] <Celmor> well, how do you decide on a "common start" if all you get are video feeds? they would need a marker; there are hardware solutions that do that
[19:35:33 CEST] <Hello71> I mean, if you want them all to be in the same picture, then just use n fps filters
[19:54:44 CEST] <tuna> For decoding (via hardware) off of an RTSP stream... Earlier you (I forgot the user name and I got detached so I lost the history) said that I need the same context setup as the hardware encode. However, to init the hwframescontext I need the width and height of the frame... but I do not have that until I receive the NAL unit from the stream... is it possible to init the hwframe without knowing the size until I get that NAL unit?
[19:56:37 CEST] <ntd> Hello71, go on?
[19:57:18 CEST] <Hello71> tuna: probably not
[19:57:19 CEST] <ntd> https://unix.stackexchange.com/questions/434868/play-4-rtsp-streams-in-fullscreen <--- googling ""linux" " video" "grid view""
[19:57:28 CEST] <ntd> one relevant result, no stack answers..
[19:57:29 CEST] <Hello71> I mean, you also don't have the codec information
[19:57:49 CEST] <Hello71> usually video-sync=audio works pretty well
[22:05:04 CEST] <acos> Howdy all.  Trying to find out if a blackmagic hardware capture card will work in this software. What's the cheapest one? Thanks in advance.
[22:23:55 CEST] <ThugAim> hey guys. Trying to get 4.0 installed on Tahrpup 6.0.6
[22:25:41 CEST] <ThugAim> compiled and installed with checkinstall, but even when installing the 'supposedly' 64 bit version it gives the /usr/bin/sensible-pager no such file or directory error.
[23:21:19 CEST] <BtbN> That sounds like you are using some random script, and it wants you to replace that with, well, a sensible pager.
[23:55:30 CEST] <Cracki> acos, I hear the decklinks work just fine in linux. I would not recommend the "intensity shuttle". check out magewell (chinese), they make proper stuff to capture analog, hdmi, sdi, either over usb, or via pcie
[23:56:45 CEST] <Cracki> as for "will it work", that's a matter of device interoperating with the OS. ffmpeg uses standard apis (linux v4l2, windows dshow/msmf/...)
[00:00:00 CEST] --- Tue Jun 26 2018


More information about the Ffmpeg-devel-irc mailing list