[Ffmpeg-devel-irc] ffmpeg.log.20200108

burek burek at teamnet.rs
Thu Jan 9 03:05:02 EET 2020


[00:22:16 CET] <kepstin> cpplearner: do you mean, you want a list of all the possible keys ffprobe can output in the [FORMAT] section, regardless of file type?
[04:06:24 CET] <unixabg> lastlog unixabg
[04:06:36 CET] <unixabg> oops sorry
[04:08:48 CET] <unixabg> kepstin: first ty for the response. Is there an example page or is that information in the docs somewhere?
[04:16:25 CET] <kepstin[m]> for the zmq filter commands stuff? There's docs with examples at https://ffmpeg.org/ffmpeg-filters.html#zmq_002c-azmq
[09:01:52 CET] <Beam_Inn> is it possible to make animated gifs with ffmpeg from image files?
[09:02:43 CET] <Beam_Inn> I wanted to follow up on something I read the other day. Basically, someone was talking about videos using the same image over multiple frames rather than multiple instances of the same photo over multiple frames. Is that really possible?
[09:02:57 CET] <Beam_Inn> I don't know anything about how videos are coded.
[10:06:41 CET] <Lypheo> Beam_Inn: yes. something like ffmpeg -f image2 -framerate 1 -i image%d.jpg animated.gif should do it
[10:07:33 CET] <Lypheo> also what you read might have been referring to interframe compression which is indeed possible
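A fuller version of Lypheo's suggestion, with a generated palette for better GIF colors (the `image%d.jpg` input names are placeholders for whatever files you have):

```shell
# Assumed inputs: image1.jpg, image2.jpg, ... in the current directory.
# GIFs are limited to 256 colors; generating a palette from the inputs
# usually looks much better than ffmpeg's default GIF quantization.
if ls image[0-9]*.jpg >/dev/null 2>&1; then
  ffmpeg -framerate 1 -i image%d.jpg -vf palettegen palette.png
  ffmpeg -framerate 1 -i image%d.jpg -i palette.png \
         -lavfi paletteuse animated.gif
fi
```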
[10:13:17 CET] <lofo> I'd like to stream video from a device to another on my local network. I want my receiver to be able to connect/disconnect multiple times without having the emitter to re-create a stream every time. is that possible ?
[14:54:00 CET] <AlexanderBrock> identify AlexanderBrock [password redacted]
[14:56:19 CET] <AlexanderBrock> I want to compile FFmpeg on Debian testing with mp3 encoder support. It complains that libmp3lame is too old or missing ("ERROR: libmp3lame >= 3.98.3 not found").
[14:56:21 CET] <DHE> good password but I think you need a new one
[14:56:30 CET] <AlexanderBrock> DHE: already changed it :-)
[14:56:53 CET] <BtbN> Well, is libmp3lame too old?
[14:57:17 CET] <DHE> or, as is often the case, you need some kind of libmp3lame-dev package installed as well
[14:57:26 CET] <AlexanderBrock> "apt-cache madison libmp3lame0" shows "3.100-3"
[14:58:21 CET] <AlexanderBrock> ah, -dev package was missing
[15:42:19 CET] <tMH> hello ffmpeg gurus, I want to ask a question. I'm user of videoredo software that has very nice feature 'quickstream fix', what this feature does I did paste here: https://justpaste.it/77nef
[15:43:19 CET] <tMH> the question is - how to do the same with ffmpeg codec ? I really need to make this quickstream fix on many files while VideoRedo software is not easy to use in batch mode... that is why I came to this channel to ask you about ffmpeg ability to do the same as videoredo does.
[15:43:28 CET] <tMH> ffmpeg program I mean
[15:45:07 CET] <tMH> videoredo itself tells me that it uses ffmpeg libraries for video operations...
[15:46:34 CET] <DHE> sounds like it's just a remuxer. ffmpeg can do that
[15:46:42 CET] <tMH> dhe: how to ?:)
[15:47:10 CET] <DHE> ffmpeg -i INPUTFILE -c copy OUTPUTFILE
[15:47:16 CET] <tMH> just it ?
[15:47:19 CET] <tMH> hm.
[15:47:32 CET] <DHE> maybe. having not used this software I can only go by what this brief text blub says
[15:47:36 CET] <DHE> blurb
[15:48:02 CET] <tMH> and ffmpeg cannot operate files in batch mode, I suppose..
[15:48:10 CET] <DHE> that's what your shell is for
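DHE's point can be sketched as a small shell loop (the directory and file names are hypothetical; note that each output is a distinct file, since ffmpeg cannot remux in place):

```shell
# Batch remux: copy every stream untouched (-c copy), writing each
# result into a fixed/ subdirectory rather than overwriting the input.
mkdir -p fixed
for f in *.mp4; do
  [ -e "$f" ] || continue   # skip the loop when no .mp4 files match the glob
  ffmpeg -i "$f" -c copy "fixed/$f"
done
```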
[15:48:37 CET] <tMH> btw
[15:48:53 CET] <tMH> dhe: no difference encountered between INPUTFILE and OUTPUTFILE
[15:49:10 CET] <DHE> ... good?
[15:49:15 CET] <tMH> no work proceeded.
[15:49:19 CET] <BtbN> It cannot do it in place.
[15:49:21 CET] <tMH> nothing has changed.
[15:49:24 CET] <BtbN> They need to be distinct files.
[15:49:38 CET] <tMH> so what about real remuxing with ffmpeg ?
[15:49:44 CET] <BtbN> ?
[15:49:56 CET] <tMH> btbn: https://justpaste.it/77nef
[15:50:03 CET] <tMH> I need to do the same with ffmpeg.
[15:50:18 CET] <BtbN> Yes, and that command is how you remux.
[15:50:22 CET] <tMH> doing ffmpeg -i input -c copy output gives no difference between two files
[15:50:32 CET] <BtbN> Like I said, they need to be distinct.
[15:50:36 CET] <BtbN> It cannot modify a file in-place.
[15:50:57 CET] <tMH> don't get what you mean, sorry.
[15:51:22 CET] <DHE> you are providing a file with a bad initial timestamp I assume?
[15:52:21 CET] <tMH> dhe: no, a youtube downloaded video. I'm trying to play it on some oldie media player and it stops the video while the audio continues playing. after I do quickstream fix in videoredo - the media player plays that video ok
[15:52:24 CET] <tMH> video and audio.
[15:53:15 CET] <tMH> e65fb32da7af19e6e3e283877e312e31 *ANCIENT BARDS-2018-Impious Dystopia (Official Video 4K).fixed.mp4
[15:53:15 CET] <tMH> e65fb32da7af19e6e3e283877e312e31 *ANCIENT BARDS-2018-Impious Dystopia (Official Video 4K).mp4
[15:53:16 CET] <DHE> how long does it take to fix it in videredo?
[15:53:18 CET] <tMH> no difference.
[15:53:25 CET] <tMH> dhe: pretty fast, no recoding at all
[15:53:36 CET] <tMH> it does 'remux' as I understand.
[15:54:16 CET] <th3_v0ice> How can I init hardware scaler? I am currently getting this error https://pastebin.com/XHsrNLAk
[15:55:20 CET] <BtbN> Use scale_cuda instead of scale. And build ffmpeg with support for it, of course.
[15:58:43 CET] <th3_v0ice> BtbN: Let me test if that will work.
[16:01:43 CET] <lofo> I'd like to stream video from a device to another on my local network. I want my receiver to be able to connect/disconnect multiple times without having the emitter to re-create a stream every time. is that possible ?
[16:08:28 CET] <th3_v0ice> BtbN: This is what I get now https://pastebin.com/CgAsfwyB
[16:09:58 CET] <DHE> lofo: a live/realtime stream?
[16:10:37 CET] <lofo> yup
[16:15:14 CET] <DHE> what's the device? have you considered multicast?
[16:15:41 CET] <lofo> a smartphone
[16:16:00 CET] <lofo> what do you mean by multicast ?
[16:18:00 CET] <DHE> UDP to IP addresses from 224.0.0.0 through 239.255.255.255 is multicast. users on the local LAN can "join" a stream and watch a live feed, and "leave" at any time; the sender is largely unaware
[16:18:18 CET] <DHE> this is how I watch over-the-air TV with this little coax-to-ethernet box I have
[16:21:04 CET] <lofo> UDP wasn't performing well and delay is an acceptable thing on my setup. Would that work with TCP as well ?
[16:21:27 CET] <DHE> no. the nature of multicast doesn't work with TCP
[16:22:05 CET] <DHE> so if you want something TCP based, this isn't really something ffmpeg handles directly. you probably want to get something like nginx-rtmp involved, or HLS or DASH with a static HTTP content server (apache is fine)
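A minimal sketch of the multicast setup DHE describes (the group address, port, and input file are placeholder assumptions):

```shell
GROUP=239.255.0.1   # placeholder group in the multicast range 224.0.0.0-239.255.255.255
PORT=1234

if [ -e input.mp4 ]; then
  # Sender: remux into MPEG-TS and push to the group; ttl=1 keeps it on the LAN.
  ffmpeg -re -i input.mp4 -c copy -f mpegts "udp://$GROUP:$PORT?ttl=1"
fi
# Any receiver on the LAN can join or leave at any time, independently, e.g.:
#   ffplay "udp://$GROUP:$PORT"
```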
[18:12:49 CET] <zerodefect> I'm trying to build ffmpeg in Ubuntu 18.04.3 with --enable-gnutls but when I do ./configure, I get error 'ERROR: gnutls not found using pkg-config'
[18:13:02 CET] <zerodefect> Now I've done an install of gnutls-dev but to no avail
[18:13:19 CET] <zerodefect> I've also tried setting PKG_CONFIG_PATH to the location of .pc file
[18:15:42 CET] <DHE> I'd check the config.log (possibly in the ffbuild directory) to see what went wrong. could be a problem with gnutls itself, version mismatch, etc...
[18:20:21 CET] <zerodefect> Brilliant. Thanks. I wasn't aware of that file. It tells me that I was missing libunistring.so. Installed and good to go.
[20:43:52 CET] <garoto> does anyone know why these configure options fail to generate a static ffmpeg binary? -> http://sprunge.us/L0VUkb
[20:45:09 CET] <garoto> don't mind the extra spaces, they were inserted by my console app
[20:56:12 CET] <DHE> --enable-static is for the libraries, not the binary itself. if you want a static link of the ffmpeg/ffprobe binaries you need --extra-ldflags=-static
[20:56:50 CET] <DHE> though you may find that hard to do on many distros...
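Putting DHE's advice together, a configure invocation for a fully static build might look like this. This is a sketch: it assumes static `.a` archives are installed for every enabled dependency (libass, etc.), which is exactly the hard part on many distros, and `--pkg-config-flags=--static` is needed so each pkg-config check also pulls in the libraries' private dependencies:

```shell
# Run inside the FFmpeg source tree.
if [ -x ./configure ]; then
  ./configure --pkg-config-flags=--static --extra-ldflags=-static
  make -j"$(nproc)"
fi
```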
[20:58:15 CET] <garoto> ah ok, gonna give it a try with --extra-ldflags=-static
[21:15:44 CET] <garoto> yep, doesn't work. `ERROR: libass not found using pkg-config`
[21:15:48 CET] <Stryker> Hi, I need help using FFMPEG. I want to create a clip from a video at a smaller resolution, then append the same clip but at half the playback speed to it. For some reason I can't get the latter to work, ie. I only get the clip but without the replay. https://pastebin.com/HT3kMrYk Can anyone tell me what I'm doing wrong? Thanks in advance!
[21:16:36 CET] <garoto> and once you remove that configure option, another `pkg-config` error pops out
[22:53:42 CET] <ManDay> How do I concatenate a large set of ts files into one? Specifying them all by a sequence of "-i ..."s fails with an error about "too many open files"
[22:54:21 CET] <pink_mist> make a script that does a few at a time
[22:54:28 CET] <ManDay> I tried putting their names into an m3u file, but "-i list.m3u" gives "Invalid data" as an error
[22:54:41 CET] <ManDay> pink_mist: thanks, but how awful
[22:55:22 CET] <nicolas17> ManDay: using multiple -i isn't the way to concatenate anyway
[22:55:39 CET] <ManDay> nicolas17: What would be correct instead?
[22:56:17 CET] <nicolas17> -i video1.ts -i video2.ts would create a file with multiple video streams (if the output even supports that), not concatenate them in time
[22:56:24 CET] <ManDay> I mean the synopsis in ffmpeg(1) seems to kind of suggest -i ... times X is the way to go
[22:57:22 CET] <ManDay> hm I see, there is a wiki entry
[22:57:24 CET] <nicolas17> afaik .ts files can be concatenated byte-wise due to how the file format works, so you can use 'cat' :)
[22:57:27 CET] <ManDay> thanks for pointing that out
[22:57:38 CET] <nicolas17> or you can use the concat protocol as the wiki says
[22:57:39 CET] <ManDay> nicolas17: heh, okay - didn't know that :D
[22:58:05 CET] <nicolas17> "Certain files (MPEG-2 transport streams, possibly others) can be concatenated"
[23:05:24 CET] <ManDay> fwiw, trying the "concat:" protocol, the error is "Filename too long"
[23:06:38 CET] <BtbN> You can concat .ts files by just using cat.
[23:06:56 CET] <BtbN> And maybe follow that with a remux via ffmpeg for some fixups
[23:07:13 CET] <ManDay> yes, thanks. I just wanted to mention it
[23:07:24 CET] <ManDay> using the concat demuxer also works
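A minimal sketch of the concat-demuxer approach from the wiki, which avoids both the "too many open files" problem and the concat protocol's filename-length limit (the `seg_*.ts` segment names are placeholders):

```shell
# Write one "file '...'" line per segment into a list file.
# The demuxer opens segments one at a time, so the list can be huge.
for f in seg_*.ts; do
  [ -e "$f" ] || continue
  printf "file '%s'\n" "$f"
done > list.txt

# Remux losslessly; -safe 0 permits arbitrary (e.g. absolute) paths in the list.
if [ -s list.txt ] && command -v ffmpeg >/dev/null; then
  ffmpeg -f concat -safe 0 -i list.txt -c copy output.mp4
fi
```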
[23:23:38 CET] <AiNA_TE> https://trac.ffmpeg.org/wiki/Concatenate
[23:23:53 CET] <AiNA_TE> echo the files to a txt file and then use that for concat
[23:24:02 CET] <AiNA_TE> oh bleh, i was scrolled up
[23:24:20 CET] <saber1> hey! I have a video which has negative frames, because I cut it from the original video. And then I used a command like this to add timestamps to the video `ffmpeg -nostdin -i after-remove-orange.mp4 -c:v libx264 -vf "drawtext=x=(w-(max_glyph_w*13))/2: y=H-h/10:fontcolor=white:fontsize=h/14:box=1:boxcolor=0x000000AA:text='%{pts\\:hms}'" -y ./after-remove-timestamp.mp4`, the mystery to me is the output video isn't the same as
[23:24:20 CET] <saber1> the input video
[23:25:02 CET] <saber1> Looks like a few frames got removed, and I don't know why. I wonder if it could be related to the negative frames
[23:48:29 CET] <Soni> so uh, how do you even encode DSD?
[23:48:46 CET] <Soni> this is for Sega 32X development btw
[23:56:51 CET] <ddubya> I'm supplying my own execute() and execute2() in AVCodecContext, and it seems to work for some codecs, but h264 codec will not use it. It seems to use execute(): https://github.com/FFmpeg/FFmpeg/blob/95e5396919b13a00264466b5d766f80f1a4f7fdc/libavcodec/h264_slice.c#L2837
[00:00:00 CET] --- Thu Jan  9 2020


More information about the Ffmpeg-devel-irc mailing list