[Ffmpeg-devel-irc] ffmpeg.log.20170422

burek burek021 at gmail.com
Sun Apr 23 03:05:02 EEST 2017


[01:41:15 CEST] <Ekho> does anyone know how to add metadata to a specific stream (as with -metadata:s:a:0 TITLE="$title") using an ffmetadata file? it seems to apply everything globally and I'm unsure how to get it to apply to only one stream. also, dumping a file's metadata seems to only dump the encoder value? (working with .opus files)
[01:42:33 CEST] <Ekho> Running into issues with argument list length when trying to add METADATA_BLOCK_PICTURE tags via the command line.
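
A minimal sketch of the per-stream ffmetadata route, with hypothetical file names (in.opus, meta.txt): the ffmetadata format does have [STREAM] sections, but a simple workaround is to keep the tags global in the file and map them onto just the one output stream with -map_metadata plus an output stream specifier.

    meta.txt:
      ;FFMETADATA1
      TITLE=Some Track Title

    ffmpeg -i in.opus -f ffmetadata -i meta.txt -map_metadata:s:a:0 1 -c copy out.opus
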
[01:52:24 CEST] <yui^^_> hey, does anyone know if i can copy the mebx part of an iphone .mov file using ffmpeg?
[03:10:25 CEST] <Tatsh> qtgmc, video jumping up and down :/
[03:29:02 CEST] <Tatsh> nnedi is working way better for me for this particular video
[03:39:47 CEST] <haasn> Is there any way to make libvpx-vp8 encoding faster? I'm encoding a 2 minute video but it's only using a single thread :(
[03:39:55 CEST] <haasn> colossal waste of CPU power
[03:40:35 CEST] <c_14> Is not using vp8 an option?
[03:40:36 CEST] <haasn> Seems like -threads and -tile-columns can maybe do something
[03:40:38 CEST] <haasn> c_14: no
[03:41:17 CEST] <c_14> tile-columns and threads should in theory make it multithread
[03:44:30 CEST] <furq> the latest libvpx has row multithreading which should be less hopeless
[03:45:03 CEST] <haasn> [encode-lavc] ovcopts: key 'tile-columns' not found.  # I guess that means I screwed something up
[03:45:06 CEST] <furq> or not the latest but git head
[03:46:52 CEST] <furq> also tile-columns is specific to vp9
[03:47:11 CEST] <furq> looks like row-mt is as well so bad luck
[03:51:27 CEST] <Erick3k> Hi everyone, I am trying to stream to YouTube Live 24/7 but it randomly crashes, sometimes after a day, sometimes after a few hours. Where can I find the cause?
[03:52:06 CEST] <Erick3k> If anyone can help i appreciate it
[04:02:15 CEST] <TD-Linux> haasn, best way is to write the gop parallel encoder that I've always threatened to do
[04:37:03 CEST] <kepstin> haasn: if you can, use vp9 instead of vp8, and grab a recent git build for some multithreaded encoding improvements
[04:37:39 CEST] <kepstin> (git build of libvpx, not ffmpeg)
[04:38:10 CEST] <kepstin> (git build of ffmpeg is fine too, but it won't make your encodes faster)
[05:20:35 CEST] <haasn> kepstin: unfortunately I wanted to post this on a mongolian basket weaving website which only supports vp8
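
A rough sketch of the options discussed above, with made-up file names. libvpx-vp8 stays essentially single-threaded, so the realistic lever is trading quality for speed via -deadline/-cpu-used; -tile-columns and -row-mt only apply to libvpx-vp9, and -row-mt needs a recent git libvpx/ffmpeg.

    # vp8: mostly single-threaded; speed comes from cheaper encoding settings
    ffmpeg -i in.mkv -c:v libvpx -b:v 2M -deadline good -cpu-used 4 out.webm
    # vp9 (if it were an option): tiles + row multithreading actually use several threads
    ffmpeg -i in.mkv -c:v libvpx-vp9 -b:v 2M -tile-columns 2 -row-mt 1 -threads 8 out.webm
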
[05:34:49 CEST] <johnjay> hey I'm trying to stream video over a LAN and I can't figure it out
[05:34:59 CEST] <johnjay> I've been reading this thread so far: http://ffmpeg.gusari.org/viewtopic.php?f=12&t=562
[05:35:45 CEST] <johnjay> does anybody know how to stream say an mp4 file with ffmpeg over a LAN?
[06:28:21 CEST] <damdai> if i want to convert ac3 to opus, what container does opus use?
[06:28:49 CEST] <furq> ogg
[06:29:40 CEST] <damdai> ok thanks
[06:30:00 CEST] <furq> just use .opus
[06:32:13 CEST] <damdai> you just said it uses .ogg
[06:32:27 CEST] <furq> .opus is ogg
[06:32:49 CEST] <furq> they'll be exactly the same but .opus is more common
[06:33:14 CEST] <johnjay> ah i finally had success using udp in vlc player
[06:33:24 CEST] <johnjay> so i don't need ffmpeg after all i guess
[06:33:26 CEST] <furq> the only difference as far as ffmpeg is concerned is that the .opus muxer defaults to the opus codec, so you don't need to specify it
[06:33:28 CEST] <damdai> ffmpeg -i 2.ac3 -acodec libopus -b:a 128k -vbr on -compression_level 10 R:\2.ogg
[06:33:39 CEST] <johnjay> you have to type the udp addr of the machine you're using to view the media which is weird af
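
For reference, a minimal sketch of doing the same LAN stream with ffmpeg instead of (or alongside) VLC; the address and port are hypothetical, and stream copy assumes the mp4 holds mpegts-friendly codecs such as h264/aac.

    # sender: push an mpegts stream to the viewing machine
    ffmpeg -re -i input.mp4 -c copy -f mpegts udp://192.168.1.50:1234
    # viewer (VLC): listen on that port
    vlc udp://@:1234
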
[06:33:52 CEST] <damdai> what does -compression_level 10   do
[06:36:37 CEST] <furq> it should make it compress better
[06:36:57 CEST] <furq> i'm not really sure how it works with opus
[06:36:59 CEST] <damdai> what is the default if you don't put -compression_level 10?
[06:37:02 CEST] <furq> no idea
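
One way to check rather than guess: ask ffmpeg to print the encoder's private options and their defaults. For libopus, -compression_level trades encoding CPU time for quality at a given bitrate (higher is slower but better).

    ffmpeg -h encoder=libopus
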
[06:53:28 CEST] <Tatsh> nnedi is so slow
[06:53:30 CEST] <Tatsh> wow
[06:53:35 CEST] <Tatsh> 0.155x on a 480p
[06:53:41 CEST] <Tatsh> with no other filters
[06:53:42 CEST] <furq> it's singlethreaded
[06:59:25 CEST] <Tatsh> getting some output i don't get
[06:59:26 CEST] <Tatsh> https://gist.github.com/Tatsh/60b3ab945ab9e5f24c5784768d60b260
[06:59:34 CEST] <Tatsh> [Parsed_concat_35 @ 0xd620a0] Buffer queue overflow, dropping
[06:59:47 CEST] <Tatsh> trying to encode from an original source that has extra junk in between
[06:59:58 CEST] <Tatsh> pretty sure i have my filters correct
[07:15:53 CEST] <Tatsh> so the issue is that nnedi is too slow
[07:15:58 CEST] <Tatsh> for this sort of thing
[07:33:16 CEST] <Tatsh> the answer: fifo and afifo
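
A sketch of where fifo/afifo fit, with made-up trim points: buffering each branch before the concat filter keeps one input from overflowing its queue while the slower branches catch up.

    -filter_complex \
      "[0:v]trim=0:30,setpts=PTS-STARTPTS,fifo[v0]; \
       [0:a]atrim=0:30,asetpts=PTS-STARTPTS,afifo[a0]; \
       [0:v]trim=60:90,setpts=PTS-STARTPTS,fifo[v1]; \
       [0:a]atrim=60:90,asetpts=PTS-STARTPTS,afifo[a1]; \
       [v0][a0][v1][a1]concat=n=2:v=1:a=1[v][a]"
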
[08:31:55 CEST] <LanDi> hey guys, I'm using this command to record the audio from my desktop apps; it works fine, but I can't hear anything while it's recording... ffmpeg -f pulse -ac 2 -ar 44100 -i auto_null.monitor -filter_complex amix=inputs=1 -ar 44100 -q:a 1 out.wav
[08:31:55 CEST] <LanDi> ffmpeg -f pulse -ac 2 -ar 44100 -i auto_null.monitor -filter_complex amix=inputs=1 -ar 44100 -q:a 1 out.wav
[08:32:15 CEST] <LanDi> sorry for duplicate the command
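
Likely explanation, as a hedged guess: auto_null is a null sink, so audio routed to it never reaches the speakers. Recording the monitor of the sink the apps actually play through (the source name below is just an example; list yours with pactl) lets you hear the audio while capturing it, and amix=inputs=1 can simply be dropped.

    # find the ".monitor" source belonging to your real output sink
    pactl list short sources
    # record from that monitor instead of auto_null.monitor
    ffmpeg -f pulse -ac 2 -ar 44100 -i alsa_output.pci-0000_00_1b.0.analog-stereo.monitor out.wav
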
[14:39:56 CEST] <paul_uk> hi, I'm trying to overlay multiple images and have them scaled. I'm just trying with 2 right now. here's what I have so far: https://pastebin.com/raw/mAAV5wFE. I can always see the first image, but never the second. also I'm confused about the labels, [1:v] and so on.
[15:02:27 CEST] <paul_uk> nevermind worked it out.
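
For anyone finding this later, a sketch of the usual pattern with hypothetical file names: [1:v] means the video stream of the second input (inputs are numbered from 0), and each overlay's output gets its own label that feeds the next overlay.

    ffmpeg -i base.mp4 -i one.png -i two.png -filter_complex \
      "[1:v]scale=320:-1[img1];[2:v]scale=320:-1[img2]; \
       [0:v][img1]overlay=10:10[tmp];[tmp][img2]overlay=10:200" \
      -c:a copy out.mp4
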
[18:22:07 CEST] <echelon> hey
[18:22:20 CEST] <echelon> how do i disable the encoder tag?
[18:22:27 CEST] <echelon> encoder: Lavf57.56.101
[18:22:40 CEST] <echelon> i'm just copying the streams, i'm not encoding anything
[18:24:28 CEST] <furq> maybe -fflags bitexact
[18:24:34 CEST] <furq> that might just remove the version number though
[18:24:54 CEST] <echelon> -metadata encoder='whatever'
[18:25:03 CEST] <echelon> but i don't want the tag to show up at all
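
A sketch of the usual combination (file names hypothetical): -map_metadata -1 drops metadata copied from the input, and -fflags +bitexact asks the muxer for reproducible output; depending on the muxer and version, the encoder tag is then omitted entirely or reduced to a bare "Lavf" without the version number.

    ffmpeg -i in.mp4 -c copy -map_metadata -1 -fflags +bitexact out.mp4
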
[18:53:17 CEST] <andai> opus encoding is disabled in my ffmpeg. Is that the default setting? I thought it would be enabled by now; it came out years ago
[18:54:02 CEST] <andai> aha! I've upgraded it and now it's enabled. :) (although I installed it only a few weeks ago, so I am still surprised it wasn't included back then)
[18:54:25 CEST] <andai> hmm, i see! The encoder 'opus' is experimental but experimental codecs are not enabled, add '-strict -2' if you want to use it.
[18:55:16 CEST] <furq> andai: that's the builtin opus encoder, not libopus
[19:00:19 CEST] <DHE> andai: yeah, you probably have a version that comes with an external encoder included but the new one doesn't have it
[19:01:33 CEST] <furq> i assume this distro doesn't include libopus in ffmpeg and the old version predates the internal opus encoder
[19:01:39 CEST] <furq> but who knows
[19:02:17 CEST] <furq> there's no good reason to not include libopus in ffmpeg, so if this is true then it would be the first recorded instance of a bad decision in distro ffmpeg packaging in history
[19:02:38 CEST] <furq> and by history i mean 24 hours
[19:12:40 CEST] <andai> :)
[19:13:02 CEST] <andai> is there a way to configure it so i don't have to type -strict -2 every time
[19:13:14 CEST] <andai> short of making an alias
[19:13:26 CEST] <furq> you should probably get a build with libopus anyway
[19:13:46 CEST] <furq> if an encoder is marked as experimental that's generally a sign it's not ready yet
[19:14:34 CEST] <andai> idk, it works fine. I converted an m4a because it was taking 5-10 seconds to seek; the opus version does it instantly
[19:15:00 CEST] <andai> is there an official mac build? i'm using homebrew
[19:15:57 CEST] <andai> ohh i see, when i install it i need --with-opus
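
For the record, a sketch of that Homebrew route (formula options change over time, so treat the flag as an example), plus using the non-experimental external encoder:

    brew reinstall ffmpeg --with-opus
    # then prefer libopus over the experimental built-in opus encoder
    ffmpeg -i in.m4a -c:a libopus -b:a 128k out.opus
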
[19:18:05 CEST] <andai> the things we can do nowadays!
[19:25:48 CEST] <andai> thanks, bye :)
[20:17:14 CEST] <djk> If I make a time-lapse video from collected jpegs, is there a way to add more jpegs to the video every 15 minutes, or do I have to rebuild it?
[20:20:17 CEST] <dystopia_> you want to add jpg's to an already existing video?
[20:20:57 CEST] <djk> yes or combine videos is another option
[20:20:58 CEST] <dystopia_> you can split the video at 15m intervals, add what you want and merge them all again
[20:21:12 CEST] <dystopia_> but it would be better to start over imo if you have all the source jpg's
[20:22:44 CEST] <djk> this will be 2-3 days of per-second stills turned into a time lapse. Ideally I would like to have a growing video to share as updates throughout the days, and a final product
[20:25:32 CEST] <BtbN> you'd have to re-generate the entire video every time
[20:26:10 CEST] <c_14> or just use mpeg-ts and concatenate?
[20:26:12 CEST] <furq> i would probably just use the segment muxer with mpegts, and then have a cron job concat and remux all the segments once an hour or so
[20:26:26 CEST] <djk> why not gen 15 and concat?
[20:26:35 CEST] <furq> 15 what
[20:26:41 CEST] <djk> minute
[20:27:58 CEST] <djk> so every 15m make a video (mp4 for web sharing?) and then concat it onto the growing base?
[20:28:13 CEST] <furq> how are you generating the timelapse
[20:43:41 CEST] <djk> This is the last way I did it. Getting ready to script a process to automate it, now that the Facebook live stream seems to work with their time-limited feed. They still randomly, for no reason, expire the unlimited one.
[20:43:41 CEST] <djk> ffmpeg -f image2 -pattern_type glob -framerate 1 -i 'wmstill*.jpg' -crf 35 -preset veryslow -c:v libx264 -pix_fmt yuv420p out2.mp4
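
A sketch of the segment-and-concat idea from above, with hypothetical file names: encode each new batch of stills to an mpegts segment, then rebuild the shareable mp4 cheaply by concatenating the segments with the concat demuxer (no re-encode).

    # encode the latest batch of jpegs as one segment
    ffmpeg -f image2 -pattern_type glob -framerate 1 -i 'batch042/wmstill*.jpg' \
           -c:v libx264 -crf 35 -preset veryslow -pix_fmt yuv420p -f mpegts seg042.ts
    # concatenate all segments so far into the growing video
    printf "file '%s'\n" seg*.ts > segments.txt
    ffmpeg -f concat -safe 0 -i segments.txt -c copy timelapse.mp4
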
[21:04:57 CEST] <mattwj2002> hey guys
[21:05:08 CEST] <mattwj2002> I have a quick question
[21:05:29 CEST] <mattwj2002> can ffmpeg do accelerated h265 encoding using a graphics card?
[21:05:46 CEST] <mattwj2002> I see talk about accelerated h264
[21:06:11 CEST] <c_14> with nvenc_hevc yes
[21:06:45 CEST] <mattwj2002> hi c_14 thank you
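
A minimal sketch, assuming an NVIDIA GPU whose NVENC block supports HEVC and an ffmpeg build with NVENC enabled; recent builds call the encoder hevc_nvenc (nvenc_hevc is an older alias), and the file names are placeholders.

    ffmpeg -i input.mp4 -c:v hevc_nvenc -preset slow -b:v 5M -c:a copy output.mp4
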
[23:11:26 CEST] <petecouture> Happy Saturday All. Question for folks who've done a bit of image2 frame capture from mp4 videos. I have video of a simple text-based animation sequence on a white background. In the video everything is very crisp and clean, but when I render out a JPG using image2 I get a lot of aliasing artifacts that degrade the quality a bit. Does anyone have any tips on how to improve this? I'm rendering out stills
[23:11:26 CEST] <petecouture> at the same framerate and resolution as the movie.
[23:12:17 CEST] <durandal_1707> add qscale 0?
[23:13:03 CEST] <petecouture> Cheers, I'll give it a shot. This isn't 100% a big issue. I think I could render out a highly crisp image sequence using After Effects, only I don't know the process and this is easier.
[23:15:02 CEST] <durandal_1707> if it's rgb, use png
[23:15:45 CEST] <petecouture> Oh you're right, I didn't even think of that. Thanks durandal!
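
A sketch of both suggestions with placeholder names: PNG avoids JPEG's lossy artifacts around sharp text entirely, and if JPEG is required, a low -q:v value keeps the quantizer near its best.

    # lossless PNG stills, one per frame
    ffmpeg -i input.mp4 frame_%05d.png
    # or high-quality JPEG (lower -q:v is better quality; 2 is close to the top)
    ffmpeg -i input.mp4 -q:v 2 frame_%05d.jpg
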
[00:00:00 CEST] --- Sun Apr 23 2017

