[Ffmpeg-devel-irc] ffmpeg.log.20170602
[00:12:14 CEST] <ac_slater_> ugh guys! Timestamps/timebases ...
[00:13:14 CEST] <ac_slater_> when I do `ffprobe -show_streams` on an mpegts file, the timebase for EACH stream (one is h264, the other is just raw data) shows up as 1/90000, even though I set the raw data stream to 1/15
[00:13:50 CEST] <ac_slater_> what's happening there
[00:15:22 CEST] <furq> mpegts always has a timebase of 1/90000
[00:15:49 CEST] <ac_slater_> I wish I could get that timebase from the AVFormatContext ...
[00:15:58 CEST] <ac_slater_> but it's just nowhere to be found
[00:16:10 CEST] <furq> you don't need to
[00:16:12 CEST] <furq> it's 1/90000
[00:16:17 CEST] <ac_slater_> I mean, I know the constant
[00:16:26 CEST] <ac_slater_> it just makes it hard to write generic muxer interfaces :p
[00:16:42 CEST] <ac_slater_> I would think that AVFormatContext would carry around the muxer/demuxer timebase
[00:19:28 CEST] <furq> https://ffmpeg.org/doxygen/trunk/structAVStream.html#a9db755451f14e2bf590d4b85d82b32e6
[00:19:41 CEST] <furq> In avformat_write_header(), the muxer will overwrite this field with the timebase that will actually be used for the timestamps written into the file (which may or may not be related to the user-provided one, depending on the format).
[00:22:20 CEST] <JodaZ> anyone got me a hail mary commandline to fix timestamps/seeking?
[00:36:48 CEST] <ac_slater_> furq: awesome!
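In C, the pattern furq is quoting looks roughly like the sketch below (untested, error handling omitted; the 1/15 hint and mpegts are from the conversation, everything else is a placeholder):

    #include <libavformat/avformat.h>

    /* sketch: let the muxer pick the real time base, then rescale into it */
    static void mux_setup_and_write(AVFormatContext *oc, AVPacket *pkt, AVRational src_tb)
    {
        AVStream *ost = avformat_new_stream(oc, NULL);
        ost->time_base = (AVRational){1, 15};    /* only a hint to the muxer */

        avformat_write_header(oc, NULL);         /* mpegts overwrites this with 1/90000 */

        /* later, for every packet: convert from the time base the timestamps
         * are currently in to whatever the muxer actually chose */
        av_packet_rescale_ts(pkt, src_tb, ost->time_base);
        pkt->stream_index = ost->index;
        av_interleaved_write_frame(oc, pkt);
    }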
[02:10:54 CEST] <sea> http://sprunge.us/NcJX <- What's the explanation for this odd behavior?
[02:11:36 CEST] <sea> Somehow, ffmpeg removes characters from stdin
[02:12:16 CEST] <c_14> add -nostdin
[02:12:40 CEST] <sea> AhA! brilliant.
[02:12:59 CEST] <sea> ..should it really be eating a character when that's not supplied though? I gave it -y
[02:13:15 CEST] <c_14> ffmpeg supports certain commands on stdin
[02:13:56 CEST] <sea> I notice it eats the b, but not really any other character. I guess b is a command of some kind
[02:14:07 CEST] <sea> Okay it makes sense then.
[03:33:21 CEST] <ac_slater_> quit
[03:33:23 CEST] <ac_slater_> sorry!
[03:33:25 CEST] <ac_slater_> quit
[03:33:27 CEST] <ac_slater_> fuuuu
[03:33:32 CEST] <furq> bye
[03:34:09 CEST] <furq> there's probably a "saved by the bell" joke i could've made there, but i don't remember anything anyone said in that show
[03:37:22 CEST] <johnjay> hey furq, sup
[03:37:46 CEST] <johnjay> do you know why i can't install ffmpeg on my rpi?
[03:55:18 CEST] <petecouture> johnjay what error do you get when you build it?
[03:59:11 CEST] <johnjay> petecouture: I meant literally install as in, apt-get install ffmpeg fails
[03:59:20 CEST] <johnjay> i haven't tried compiling it myself
[03:59:40 CEST] <DHE> then that's a debian/ubuntu/whatever fault
[04:00:11 CEST] <johnjay> ah ok. i thought maybe since i'm on a rpi maybe there's a problem with getting ffmpeg to work on it
[04:00:55 CEST] <furq> probably because your distro is still running debian jessie, which is still on libav
[04:01:17 CEST] <furq> the ffmpeg from repos would be useless on rpi though because it doesn't have omx or mmal
[04:03:34 CEST] <DHE> I wouldn't want software decoding/encoding on a Pi...
[04:12:04 CEST] <johnjay> furq: ah ok I didn't realize that fork thing was still relevant even in jessie
[04:13:00 CEST] <furq> well jessie is over two years old now
[04:13:10 CEST] <furq> stretch is coming out any day now
[04:13:39 CEST] <furq> then we'll never have to worry about this ever again, he lied
[04:14:04 CEST] <johnjay> lol speaking of forks I read about something with mplayer
[04:14:20 CEST] <johnjay> when I tried to apt-get install mplayer it said "warning selecting mplayer2 because that other one is deprecated"
[04:14:32 CEST] <johnjay> and I was like what so I googled and it was the same thing, mplayer2 is a dead fork
[04:15:03 CEST] <furq> this is why you shouldn't use a stable distro
[04:16:16 CEST] <DHE> mpv is the new hotness anyway....
[04:16:24 CEST] <furq> also yeah this is why you shouldn't use mplayer
[04:16:31 CEST] <johnjay> lol apparently i was foolishly trying to though
[04:16:35 CEST] <DHE> and I say "new" relative to people who are asking for mplayer in the first place. :)
[04:17:16 CEST] <johnjay> well once you go sid you can never go back
[04:17:23 CEST] <johnjay> so maybe i'll use a different sd card for that
[04:17:35 CEST] <furq> i went sid and went back
[04:17:39 CEST] <furq> testing for life
[04:47:42 CEST] <k_sze[work]> Does anyone know whether the JPEG standard specifies maximum sizes for the quantization and Huffman tables? Apparently the (Huffman) table size can vary, with optimizers like jpegmini.
[04:47:51 CEST] <JEEB> .2
[04:51:32 CEST] <johnjay> I did google a bit beforehand though
[04:51:48 CEST] <johnjay> I randomly found someone on the mailing list talking about a patch for ffmpeg on the rpi
[04:52:07 CEST] <johnjay> something about hand-coded assembly always being better, and gcc problems with FATE
[04:56:55 CEST] <johnjay> btw it's odd you say rpi doesn't have mmal. when I googled for it, it says mmal is specifically designed for the videocore which the pi has
[05:00:06 CEST] <furq> i didn't say that
[05:00:13 CEST] <furq> i said the ffmpeg builds in the debian repos don't have mmal enabled
[05:00:35 CEST] <johnjay> oh
[05:01:08 CEST] <johnjay> based on my experience compiling hostapd yesterday to make my pi into an AP
[05:01:14 CEST] <johnjay> those distro ppl must have a lot of work though
[05:01:19 CEST] <johnjay> you have to compile... everything
[05:01:59 CEST] <furq> well they have scripts and build farms for all that
[05:02:36 CEST] <tdr> or just cross compile it on a beefy box
[05:03:08 CEST] <johnjay> sounds like a nightmare. but i guess if you have scripts and beef then it's fine
[05:03:43 CEST] <tdr> it's a few steps to set up but not super hard, just a lot of checks and testing
[05:04:08 CEST] <tdr> cross compiling now is a lot easier than say 10 yrs ago
[05:04:16 CEST] <furq> it's piss easy if you have a debian testing box
[05:04:19 CEST] <furq> or a recent ubuntu
[05:04:51 CEST] <furq> the armhf toolchain is part of the repos now, plus you can just install all the deps with multiarch
[05:05:09 CEST] <furq> apparently it's similarly easy on fedora but i don't touch rpm
[05:07:38 CEST] <johnjay> furq: maybe i'll learn how to do it. for now i'm stuck with this pi until I get a new fan for my PC
[05:07:56 CEST] <johnjay> so i tried installing ffmpeg naturally and wondered why it failed
[05:11:35 CEST] <tdr> how did you install?
[05:11:50 CEST] <tdr> could be missing deps or be build against different library versions
[05:12:33 CEST] <johnjay> tdr: through apt-get. apparently jessie still has libav according to furq
[05:13:03 CEST] <johnjay> but yeah that's a thing too. I tried installing an n64 emulator and I got a message saying the package doesn't exist but another package depends on it
[05:38:35 CEST] <tdr> johnjay, check the location where it added the files. it could be somewhere "odd"
[05:39:59 CEST] <johnjay> tdr: the n64 thing? on apt-get it refuses to download if it doesn't satisfy dependencies
[05:40:04 CEST] <johnjay> although there's a way to force it
[05:40:14 CEST] <furq> have you been mixing and matching repos
[05:40:35 CEST] <furq> granted i wouldn't expect a pi to be able to passably emulate an n64
[05:41:26 CEST] <johnjay> furq: more like distros. I have raspbian on this card and retropi on the other one
[05:41:49 CEST] <furq> i meant within one distro
[05:42:04 CEST] <furq> apt shouldn't be throwing dependency errors unless you have unofficial repos or something
[05:42:13 CEST] <johnjay> nope. but am I to understand that running testing is a better option than mixing and matching repos?
[05:42:27 CEST] <furq> testing is generally just better
[05:42:42 CEST] <johnjay> well maybe i misspoke. it just said that the package didn't exist but was referred to by another package
[05:42:48 CEST] <furq> mixing and matching official repos is a surefire way to end up in hell
[05:42:49 CEST] <johnjay> some kind of n64 emulator called mupen64plus
[05:43:20 CEST] <furq> that's amd64 and i386 only
[05:43:24 CEST] <furq> so no wonder it doesn't exist
[05:44:29 CEST] <furq> https://www.libretro.com/index.php/retroarch-2/
[05:44:32 CEST] <furq> you probably want that anyway
[05:50:43 CEST] <johnjay> i think I searched all the names
[05:50:49 CEST] <johnjay> retroarch, retropi, emulation
[05:51:29 CEST] <furq> retroarch isn't in the debian repos but they provide deb packages iirc
[05:51:49 CEST] <furq> i've got it running on testing on my laptop, so you should be able to massage it into working
[05:53:26 CEST] <johnjay> i'm trying to follow these rules
[05:53:38 CEST] <johnjay> so mixing and matching repos is bad. testing is good. downloading .debs is ok
[05:53:48 CEST] <furq> yup
[05:53:59 CEST] <furq> downloading debs is fine because it'll either work or it won't
[05:54:38 CEST] <furq> mixing and matching repos is not fine because it'll download some dependency from the unofficial repo that ends up causing unresolvable dependency issues two months later
[05:55:05 CEST] <furq> now the manually installed deb might also break two months later, but it can only really break itself
[05:56:15 CEST] <johnjay> like you download package2.7 and later find out you have to have package2.8 in 2 months and it's not allowed to have both?
[05:58:29 CEST] <furq> like package1 depends on libdep > 2.0 and < 2.1, package2 depends on libdep 2.1, then package1 gets updated and now it depends on libdep 2.2
[05:59:08 CEST] <furq> imagine that except with a thousand packages which potentially have 100 dependencies
[06:00:09 CEST] <furq> you'd think i would have either used numerals or words for both of those numbers. don't be silly
[06:02:33 CEST] <johnjay> yeah sounds horrible
[06:02:43 CEST] <johnjay> i don't like having to use packages in the first place but i guess it's better than the alternative
[06:09:19 CEST] <johnjay> by the way why don't you think the pi can emulate an n64?
[06:09:39 CEST] <johnjay> idk how you would estimate something like that
[06:14:41 CEST] <furq> because i know it struggles with the psx
[06:15:02 CEST] <furq> even an overclocked pi 3 has difficulty with 3d consoles
[06:15:13 CEST] <furq> and n64 emulation was much less mature than psx last i checked
[06:15:28 CEST] <furq> not that i care, i never had one
[06:20:33 CEST] <johnjay> interesting, i thought it would have come along since the days of UltraHLE
[08:23:48 CEST] <ac_slater> guys, when I do the following on the command line, `ffmpeg -i whatever -f mpegts -f rtp udp://0.0.0.0:9999`, what kind of AVFormat muxer chain is being constructed? I know it creates both mpegts and rtp muxers, but how are they bound via AVFormatContexts?
[08:24:15 CEST] <ac_slater> I guess, how do I bind two AVOutputContexts
[08:24:17 CEST] <ac_slater> ?
[08:29:33 CEST] <ac_slater> I guess I should just breakpoint the hell out of the command line app
[08:33:24 CEST] <ac_slater> maybe I'm really binding an output to an input at that point...
[08:34:14 CEST] <ac_slater> or stream really. I think I got it
[08:51:50 CEST] <kerio> how do i downsample from 96khz to 48khz
[08:52:01 CEST] <furq> -ar 48000
[08:53:22 CEST] <furq> or -af aresample=resampler=soxr if ya nasty
[08:53:30 CEST] <furq> er
[08:53:40 CEST] <furq> -af aresample=48000:resampler=soxr
[08:53:56 CEST] <kerio> will -ar use sox
[08:54:00 CEST] <furq> no
[08:54:03 CEST] <kerio> :<
[08:54:04 CEST] <furq> also you need a build with --enable-libsoxr
[08:54:17 CEST] <furq> i think you can do -ar 48000 -resampler soxr
[08:54:35 CEST] <furq> i'm pretty sure that just uses aresample
[08:54:48 CEST] <kerio> Requested resampling engine is unavailable
[08:55:02 CEST] <furq> rip
[08:55:24 CEST] <furq> sox's resampler was much better than ffmpeg's at the time the site that did comparisons made their nice images
[08:55:28 CEST] <furq> but that was with ffmpeg 1.1
[08:55:41 CEST] <furq> i have no idea whether it's worth it any more
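For completeness, the same request through libswresample looks roughly like this (untested sketch; it assumes a build configured with --enable-libsoxr, otherwise swr_init() fails with the "resampling engine is unavailable" error kerio hit):

    #include <libswresample/swresample.h>
    #include <libavutil/channel_layout.h>
    #include <libavutil/opt.h>

    /* sketch: 96 kHz stereo s16 in, 48 kHz out, soxr engine requested */
    static SwrContext *make_soxr_resampler(void)
    {
        SwrContext *swr = swr_alloc_set_opts(NULL,
                AV_CH_LAYOUT_STEREO, AV_SAMPLE_FMT_S16, 48000,   /* output */
                AV_CH_LAYOUT_STEREO, AV_SAMPLE_FMT_S16, 96000,   /* input  */
                0, NULL);
        av_opt_set_int(swr, "resampler", SWR_ENGINE_SOXR, 0);    /* default is SWR_ENGINE_SWR */
        if (swr_init(swr) < 0) {                 /* fails here if soxr isn't compiled in */
            swr_free(&swr);
            return NULL;
        }
        return swr;                              /* then swr_convert() as usual */
    }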
[08:56:17 CEST] <furq> i'm sure durandal_1707 will be along shortly to make explicit threats against my life for suggesting that sox is any good
[08:58:37 CEST] <kerio> anyway new radiohead ;o
[08:58:41 CEST] <kerio> it's actually old radiohead tho ;o
[09:04:37 CEST] <durandal_1707> furq: same images can be made with the ffmpeg resampler, it just has bad defaults
[09:04:55 CEST] <furq> why doesn't it have good defaults
[09:05:44 CEST] <durandal_1707> complain to michaelni, he needs proof that when listening to it one can hear a difference
[09:06:36 CEST] <furq> so what are the good settings for swresample then
[09:06:51 CEST] <ac_slater> something I always wished existed within ffmpeg is a filtergraph capable of showing stuff like streaming loss or a decoding timeline including malformed input
[09:07:02 CEST] <ac_slater> so like, you can see in realtime some decoding visuals
[09:07:05 CEST] <ac_slater> would be cool
[09:07:10 CEST] <ac_slater> anyway, carry on
[09:07:54 CEST] <durandal_1707> furq: search mailing list for topic
[09:08:15 CEST] <furq> i don't care that much, i don't have anything above 48khz anyway
[09:09:46 CEST] <furq> maybe if someone was on irc asking how to downsample some good music that is worth listening to, i would check
[09:09:49 CEST] <furq> but that has never happened
[09:10:14 CEST] <ac_slater> so, If I wanted to emulate the `-f mpegts -f rtp ...` command line via libavformat, would I construct the mpegts to be an output format (avformat_alloc_output_context2) and the rtp be an input context via avformat_open_input?
[09:10:20 CEST] <furq> is that even a thing
[09:11:05 CEST] <ac_slater> ?
[09:11:27 CEST] <furq> i'm pretty sure that's just -f rtp
[09:11:35 CEST] <ac_slater> hmm maybe
[09:11:44 CEST] <ac_slater> but it does do the right thing
[09:11:57 CEST] <furq> i assume you want rtp_mpegts for mpegts over rtp
[09:12:40 CEST] <ac_slater> wait
[09:12:44 CEST] <ac_slater> wtf is rtp_mpegts
[09:13:10 CEST] <kerio> furq: >:c
[09:13:38 CEST] <ac_slater> furq: the docs for rtp_mpegts are basically nonexistent
[09:13:49 CEST] <furq> that sounds about right
[09:13:55 CEST] <ac_slater> ;)
[09:14:43 CEST] <ac_slater> but either way furq I'm a little puzzled about how to take an AVFormatContext I alloc'ed with avformat_alloc_output_context2 (ie - an output context), and tell it to send somewhere else rather than its default
[09:15:06 CEST] <ac_slater> I can do it with context->pb if I wanted to override it with avio manually... but it should be easy to "chain" these things right?
[09:18:21 CEST] <ac_slater> it's actually really bothering me that I cant seem to figure it out
[09:19:38 CEST] <ac_slater> furq: you're my only hope
[09:49:05 CEST] <ac_slater> furq: well... jeeze rtp_mpegts should be documented
[09:49:22 CEST] <ac_slater> it's weird how it doesn't spit out an sdp file or anything ... like it's a special thing
[09:50:17 CEST] <ac_slater> I can't even find which encoder/format source files it belongs to
[09:50:29 CEST] <ac_slater> oh I did find it
[09:50:34 CEST] <ac_slater> but wtf seriously
[09:50:53 CEST] <ac_slater> wouldn't `-f mpegts -f rtp` make more sense?
[10:06:41 CEST] <ac_slater> I'm going to make a small assertion here to say that it's near impossible to easily "chain" a muxer with another muxer
[10:08:21 CEST] <ac_slater> I just don't think it's supported
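For what it's worth, a rough, untested sketch of the libavformat side of rtp_mpegts (the URL is a placeholder). Chaining two muxers inside one AVFormatContext doesn't appear to be supported, as concluded above; rtp_mpegts is the already-combined muxer, and the place to redirect or intercept its output is the AVIOContext you put into ctx->pb:

    #include <libavformat/avformat.h>

    /* sketch: open the combined rtp_mpegts muxer and point its IO where you want */
    static AVFormatContext *open_rtp_mpegts(const char *url)   /* e.g. "rtp://203.0.113.5:9999" */
    {
        AVFormatContext *oc = NULL;
        avformat_alloc_output_context2(&oc, NULL, "rtp_mpegts", url);

        /* for muxers without AVFMT_NOFILE you open ctx->pb yourself; a custom
         * AVIOContext from avio_alloc_context() could be dropped in here instead
         * if you want the muxed bytes handed to your own write callback */
        if (!(oc->oformat->flags & AVFMT_NOFILE))
            avio_open2(&oc->pb, url, AVIO_FLAG_WRITE, NULL, NULL);

        /* add streams with avformat_new_stream(), then avformat_write_header(oc, NULL) */
        return oc;
    }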
[10:35:07 CEST] <ac_slater> eeek getting a SIGBUS on av_packet_free_side_data
[11:25:02 CEST] <zerodefect> I'm generating mpeg2video wrapped in a TS. Can you suggest a tool/method to determine if my video contains B-frames? MediaInfo says N=1. Does that indicate intra-only?
[14:25:15 CEST] <eightfold> it is a known problem that video filmed with many mobile phones has variable frame rates (vfr). most solutions involve transcoding to cfr and h.264 using handbrake.
[14:25:48 CEST] <eightfold> i should also add that this results in audio desync in premiere, which is why people editing video transcode in the first place...
[14:27:20 CEST] <eightfold> i would like to transcode from the mobile vfr files to prores cfr files. does anybody know if there's a switch i could use to make the audio and video stay in sync?
[15:18:49 CEST] <thebombzen> eightfold: if you transcode with ffmpeg it'll handle VFR properly
[15:19:24 CEST] <eightfold> thebombzen: hmm, i'll have to give it a spin. last time i tried, video and audio were out of sync.
[15:19:34 CEST] <thebombzen> last time you tried it with handbrake?
[15:20:12 CEST] <thebombzen> just use ffmpeg directly. try this: ffmpeg -i input.mp4 -c:v prores -c:a copy -vsync cfr output.mov
[15:20:30 CEST] <thebombzen> you might want to set a bitrate with -b:v <bitrate>
[15:20:46 CEST] <thebombzen> otherwise it'll default to something *really low*
[15:28:28 CEST] <nelder> hello all, i have this error "[ffmpeg] https: the user-agent option is deprecated, please use user_agent option". why?
[15:29:15 CEST] <nelder> how i can use user_agent, where it may be pasted?
[15:31:40 CEST] <thebombzen> nelder: you probably have something that's setting -user-agent
[15:31:41 CEST] <thebombzen> don't do that
[15:32:21 CEST] <thebombzen> nelder: I'm guessing you probably wouldn't be asking this question if you didn't have "user-agent" on the actual command line
[15:33:58 CEST] <JonnyG> Hi everyone
[15:34:04 CEST] <nelder> well, i just want to see youtube online videos with mpv (it uses ffmpeg) and after i type at console #mpv <some link> i see this error
[15:35:13 CEST] <JonnyG> Sorry to disturb you, just wanted to know if someone knows about iOS, or could help me on an issue. I cannot add any audio to an mp4 video
[15:35:33 CEST] <JonnyG> Not sure if the video is not supported by iOS, but I don't see anything wrong with it
[15:35:35 CEST] <thebombzen> nelder: I don't recommend running mpv as root
[15:35:39 CEST] <thebombzen> you have no reason to do that
[15:35:50 CEST] <thebombzen> nelder: you probably have an environment variable issue. either way, I'd ask in #mpv rather than #ffmpeg
[15:36:02 CEST] <nelder> thebombzen: i use only user console for it
[15:36:20 CEST] <thebombzen> nelder: then don't say: # mpv <some link>
[15:36:25 CEST] <thebombzen> that implies you're running it as root
[15:36:40 CEST] <thebombzen> JonnyG: some audio codecs like ALAC need -f ipod
[15:36:53 CEST] <thebombzen> that is if you want to put them inside an mp4 container like .m4a with ALAC
[15:37:24 CEST] <thebombzen> JonnyG: If you use `-f ipod` then FFmpeg will use a special case of the mp4 muxer that's Apple-Compatible
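A concrete, untested example of that special case: `ffmpeg -i input.wav -c:a alac -f ipod output.m4a` should produce ALAC inside the Apple-compatible mp4 variant.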
[15:37:39 CEST] <nelder> http://dpaste.com/1RJ467K
[15:37:53 CEST] <nelder> ^^^ this all that i see
[15:38:00 CEST] <thebombzen> what version of ffmpeg do you have?
[15:38:18 CEST] <thebombzen> and what version of mpv? It sounds like there's some old versions clashing b/c of deprecated stuff
[15:38:35 CEST] <nelder> 3.2.4
[15:38:39 CEST] <thebombzen> also you need to have youtube-dl in your path to make that work
[15:38:44 CEST] <JonnyG> thebombzen : Thanks for the info! Actually the mp4 video doesn't have an audio track, and I want to add an AAC track. I can add the AAC track to many other videos, but iOS gives me an error when I add it to this particular video
[15:39:13 CEST] <thebombzen> JonnyG: well iOS isn't adding audio, so what is your command that produces a file that iOS complains about?
[15:39:58 CEST] <thebombzen> nelder: what happens if you run: youtube-dl -g 'https://youtu.be/CmELf8DJAVY'
[15:40:27 CEST] <nelder> versions -> http://dpaste.com/2HSG69R
[15:40:42 CEST] <thebombzen> that mpv is very old
[15:40:43 CEST] <thebombzen> update it
[15:41:33 CEST] <thebombzen> youtube-dl is also a few months old. youtube-dl is constantly being updated to keep up with YouTube's internal changes and to unbreak videos that used to work, so keep that in mind as well. I recommend updating youtube-dl
[15:41:38 CEST] <nelder> youtube-dl -> http://dpaste.com/1DH5R79
[15:41:53 CEST] <thebombzen> update your youtube-dl as well
[15:42:02 CEST] <nelder> eh... i see
[15:42:09 CEST] <thebombzen> it works with 2017-05-29
[15:42:41 CEST] <thebombzen> in fact, it even tells you this in the error message: Make sure you are using the latest version; see https://yt-dl.org/update on how to update.
[15:44:09 CEST] <thebombzen> nelder: what distro are you using? because if you're on an LTS distro that doesn't update its packages regularly, I'd recommend building some of these from source, like mpv or
[15:44:36 CEST] <nelder> i use gentoo
[15:44:49 CEST] <thebombzen> well then you have no excuse not to have your packages updated ^_^
[15:45:02 CEST] <JonnyG> thebombzen, fflogger : I'm using an iOS official SDK (AVFoundation), to add audio. I don't use ffmpeg so I don't have ffmpeg commands
[15:45:19 CEST] <JonnyG> But I thought maybe you guys know why this particular video doesnt work
[15:45:26 CEST] <JonnyG> maybe it's corrupted or something
[15:45:33 CEST] <nelder> )
[15:45:33 CEST] <JonnyG> could I send you a link to the video (dropbox)?
[15:45:48 CEST] <thebombzen> JonnyG: this is the ffmpeg help channel, so if AVFoundation fails at doing something, I'm going to tell you to use FFmpeg to do the same thing
[15:46:21 CEST] <thebombzen> I cannot possibly answer why something that isn't related to FFmpeg doesn't work. I don't really want a dropbox link, I recommend just using FFmpeg
[15:46:24 CEST] <JonnyG> Of course, I use ffmpeg to do the same thing on Android, but it's hard to use ffmpeg on iOS ;)
[15:46:39 CEST] <thebombzen> are you trying to create the video *on* the phone?
[15:46:44 CEST] <JonnyG> because command lines are not accepted from a program
[15:47:06 CEST] <JonnyG> I could use the ffmpeg C api
[15:47:09 CEST] <thebombzen> Execv is sandboxed I believe, but you could always bundle libavcodec and libavformat yea
[15:47:13 CEST] <JonnyG> but I have no idea how it works
[15:47:27 CEST] <thebombzen> if you're a developer dealing with audio/video it's a good investment to learn how it works
[15:47:38 CEST] <thebombzen> I don't actually know but there's some examples somewhere
[16:01:22 CEST] <eightfold> thebombzen: no, last time i tried with ffmpeg
[16:01:43 CEST] <eightfold> Ill try that line, thanks
[16:05:24 CEST] <thebombzen> the -vsync cfr does what you probably think it does... force cfr. Alternatively, you could use -r or -vf fps if you need to force a very particular framerate
[16:05:30 CEST] <thebombzen> otherwise it'll use the average framerate of the input
[16:06:03 CEST] <eightfold> thebombzen: trying now. it spits out a lot of these: Past duration 0.710045 too large 1278609kB time=00:00:19.07 bitrate=549071.1kbits/s dup=9 drop=0 speed=0.251x
[16:06:39 CEST] <eightfold> i need to force a particular framerate
[16:06:48 CEST] <eightfold> 25 fps
[16:07:09 CEST] <eightfold> should i just add -vf 25
[16:07:16 CEST] <eightfold> to the exact line you gave above?
[16:09:04 CEST] <eightfold> thebombzen: obviously not, i got [AVFilterGraph @ 0x7f7ffa415580] No such filter: '25'
[16:09:05 CEST] <eightfold> :)
[16:09:21 CEST] <thebombzen> -vf fps means use the fps filter
[16:09:25 CEST] <thebombzen> -vf fps=25
[16:12:09 CEST] <eightfold> thebombzen: thanks! is this something something i should care about: [swscaler @ 0x7fafa304a800] deprecated pixel format used, make sure you did set range correctly
[16:23:45 CEST] <ac_slater> when I open an AVFormatContext (input or output), does specifying an output file like udp://0.0.0.0:9999?pkt_size=752 actually pass pkt_size = 752 to the udp output format?
[16:24:19 CEST] <thebombzen> eightfold: that means it probably is "yuvj444p" or "yuvj420p"
[16:24:28 CEST] <thebombzen> that means fullrange rather than partial
[16:24:48 CEST] <thebombzen> however it's required for mjpeg and perhaps prores so I don't know. it can be safely ignored and it's kind of annoying
[16:25:17 CEST] <thebombzen> ac_slater: I believe that libavformat will handle the URL
[16:25:24 CEST] <thebombzen> including the params
[16:25:28 CEST] <thebombzen> you might want to double-check that though
[16:25:36 CEST] <ac_slater> thebombzen: yea I will
[16:25:49 CEST] <ac_slater> I just don't see another way to pass a dict of options to an explicit demuxer
[16:26:22 CEST] <ac_slater> like calling avformat_alloc_output2 (or whatever it is) creates an output context
[16:26:31 CEST] <ac_slater> but no real way to craft a dict ahead of time and pass it in
[16:26:40 CEST] <ac_slater> unless the "metadata" field is that
[16:27:02 CEST] <eightfold> thebombzen: audio and video is in sync! awesome, thanks
[16:28:05 CEST] <thebombzen> I doubt it
[16:28:16 CEST] <thebombzen> this might go under "just try it, I guess?"
[16:28:22 CEST] <ac_slater> yea
[16:28:24 CEST] <ac_slater> thanks mate!
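The udp protocol does parse query-string options like ?pkt_size=752 out of the URL, and the same options can also be passed as an AVDictionary instead of being packed into the URL. A rough, untested sketch (URLs and the muxer option are placeholders); protocol options go to avio_open2(), muxer-private options to avformat_write_header():

    #include <libavformat/avformat.h>
    #include <libavutil/dict.h>

    static void open_udp_mpegts(void)
    {
        AVFormatContext *oc = NULL;
        avformat_alloc_output_context2(&oc, NULL, "mpegts", "udp://0.0.0.0:9999");

        AVDictionary *io_opts = NULL;
        av_dict_set(&io_opts, "pkt_size", "752", 0);              /* udp protocol option */
        avio_open2(&oc->pb, "udp://0.0.0.0:9999", AVIO_FLAG_WRITE, NULL, &io_opts);
        av_dict_free(&io_opts);            /* anything left over was not recognized */

        AVDictionary *mux_opts = NULL;
        av_dict_set(&mux_opts, "mpegts_flags", "resend_headers", 0);  /* example muxer option */
        avformat_write_header(oc, &mux_opts);     /* streams must be added before this */
        av_dict_free(&mux_opts);
    }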
[16:28:31 CEST] <eightfold> thebombzen: and no, the default bitrate for prores does not seem to be low. 10.2 GB for 2 minutes of 4k video :)
[16:39:49 CEST] <thebombzen> usually ffmpeg defaults to 200k or 2000k I think. perhaps prores is different? idk
[16:42:49 CEST] <Prelude2004c> hey everyone good day. I am doing HLS with AES encryption. Does FFMPEG currently support key rotation ?
[16:43:07 CEST] <Prelude2004c> Do i simply need to replace the key file? But then how does the m3u8 know to insert the new key?
[16:55:47 CEST] <DHE> Prelude2004c: replace the whole keyinfo file atomically
[16:57:22 CEST] <Prelude2004c> thank you DHE, so if i just replace the whole key file with a new one, the HLS m3u8 will automatically be modified ?
[17:02:50 CEST] <DHE> when new .ts files are made the keyinfo is re-read
[17:03:17 CEST] <Prelude2004c> ahh it looks like it works :0 fantastic
[17:04:36 CEST] <Prelude2004c> last question that you may be able to help with.. in the shell script that i built i am generating a key then starting ffmpeg.... how do i put the key on say a 1 hour rotation.. do i just put a while 1; do < key commands > ; sleep 3600 ; done ... but how do i then continue while running that in background .. like do i just add something like && at the end of done so the script keeps executing while the loop for the key keeps generating ?
[17:05:39 CEST] <DHE> that's shell scripting... you can run a while command in the background with & like you would any other shell command by placing it after the word 'done'
[17:06:38 CEST] <Prelude2004c> ok... that makes sense.. only problem is... what happens if i kill the script.. does the & stay running.. just a question i can ask in #bash but maybe you're more savvy about that sort of thing
[17:09:49 CEST] <DHE> no, that is definitely a #bash thing
[17:09:51 CEST] <DHE> :)
[17:16:40 CEST] <Prelude2004c> ok.. so you know.. i just put the whole thing in a screen and then put the & at the end and it works like a charm.. thank you
[17:16:54 CEST] <Prelude2004c> funny, there isn't very much documentation online about key rotation .. very odd :) .. DHE, thank you again
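For reference, the file handed to the hls muxer's -hls_key_info_file option is three lines: the key URI that gets written into the playlist, the local path ffmpeg reads the key from, and an optional IV in hex. For example (URI and paths made up):

    https://example.com/keys/live-20170602.key
    /var/www/keys/live-20170602.key
    0123456789abcdef0123456789abcdef

Rotation then amounts to writing the new key and a new key info file under temporary names and mv-ing the key info file over the old one (a rename on the same filesystem is atomic, which is the "replace atomically" DHE describes); the next segment picks up the new key and the playlist gets a new EXT-X-KEY entry from there on.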
[18:36:45 CEST] <ChocolateArmpits> Does anyone know whether, with multiple outputs, frames get written to the outputs at the same time even if one of the encoders buffers longer ?
[18:40:31 CEST] <ChocolateArmpits> I'm testing for encoder+network+decoding latency. One of the outputs encodes and sends the frames over network, other output gets piped to ffplay to be compared with the prepared first stream. I fear this may be flawed and not have encoding latency in check
[18:43:53 CEST] <c_14> I'm pretty sure they're written sequentially
[19:02:53 CEST] <Papi> Hello
[19:03:35 CEST] <Papi> I am using ffmpeg with the -i parameter. Can ffmpeg be stopped automatically when no data is incoming through the given URL? It causes problems when the rtmp stream ends and ffmpeg is still running...
[19:04:28 CEST] <ChocolateArmpits> Papi, check timeout setting in the docs
[19:05:54 CEST] <Papi> Timeout won't help me, because it's an rtmp stream in a livestream service
[19:12:44 CEST] <Papi> Does anyone know?
[19:14:08 CEST] <zerodefect> I'm trying to generate an MPEG2 SD video stream @ 25fps with it all being wrapped in TS from within my own application using the C-API. At the moment, it all works well. I can view the video in ffplay. The problem I have at the moment is that I can't seem to get B-frames working. I've set 'gop_size' to 12 and 'max_b_frames' to 2 in the AVCodecContext. Are there any other settings I should tweak?
[19:14:47 CEST] <ChocolateArmpits> Papi, check -progress and if there's no incoming data kill the process
[19:15:31 CEST] <BtbN> zerodefect, -bf?
[19:15:44 CEST] <BtbN> But I think that maps to max_b_frames
[19:17:40 CEST] <Papi> I thought ffmpeg could do it automatically. With your solution I have to pair the PID with the correct streamer
[19:17:44 CEST] <zerodefect> Okay, thanks for the tip. I'll see what the source does with the mapping. Is a value of 2 a valid value?
[19:19:24 CEST] <zerodefect> Is explicitly setting the profile or level required?
[19:20:17 CEST] <kepstin> zerodefect: I don't think there are any profile/level options in the mpeg1/2 encoder?
[19:20:42 CEST] <Papi> Only full output settings is required. I am using ffmpeg for transcoding
[19:20:51 CEST] <kepstin> make sure you don't have any 'low latency' flags enabled, those obviously would disable b-frames.
[19:22:25 CEST] <Papi> Here's part of my command... ffmpeg -i rtmp://localhost:1935/$1/$2 -vcodec libx264 -vprofile high -preset ultrafast -x264opts keyint=40 -vf scale=640x360 -minrate 600k -maxrate 800k -acodec aac -strict -2 -f flv rtmp://localhost:1935/hls/$STREAM_NAME_low
[19:22:41 CEST] <zerodefect> kepstin: Ok. I'll double check. Thanks
[19:29:12 CEST] <zerodefect> kepstin: There is a 'profile' and 'level' field in the AVCodecContext --> http://ffmpeg.org/doxygen/trunk/structAVCodecContext.html
[19:32:57 CEST] <kepstin> looks like in mpeg12enc it's just used to put a number in the header, and doesn't actually do anything to the video itself
[19:35:29 CEST] <zerodefect> Yeah, that is what I thought. It is why I hadn't set it.
[19:35:36 CEST] <zerodefect> I guess the other point is that MediaInfo says that 'N=1' for my GOP structure. Can I trust it?
[19:37:59 CEST] <Papi> Is anyone here who can help me with my problem?
[19:39:16 CEST] <Mista-D> force_key_frames every 90th frame ? can't use "-g 90".
[19:41:24 CEST] <Papi> How can forcing a key frame help me?
[19:43:22 CEST] <Papi> hm?
[19:43:34 CEST] <Papi> @Mista-D
[19:45:32 CEST] <Mista-D> Papi: I'd like to know how to force a keyframe every 90 frames instead of using "-g 90".
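(Untested suggestion for the question above: `-force_key_frames expr:gte(n,n_forced*90)` should force a keyframe every 90 frames without relying on -g, assuming the build's force_key_frames expression supports the `n` constant.)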
[19:46:13 CEST] <Papi> Ah... I thought that was a reaction to my question
[19:46:45 CEST] <Mista-D> As per your problem, I'd look into using vlc as live transcoder - it can have a playlist input, when live stream ends it switches to next item (a looped video, static image, etc)
[19:51:38 CEST] <Papi> I need to kill the ffmpeg process when the rtmp stream ends, because these "sleeping" ffmpeg processes consume RAM and cause problems
[19:52:21 CEST] <Mista-D> -xerror ??
[19:52:24 CEST] <Papi> Causing an error in ffmpeg that makes it exit would also be a solution for me
[19:52:51 CEST] <Papi> I'll try
[19:54:26 CEST] <Papi> Doesn't work :/
[19:55:03 CEST] <Papi> It seems an empty rtmp stream won't cause an error
[20:04:31 CEST] <zerodefect> kepstin: Is there more than one mp2v encoder? Could I be using a version that doesn't support B-frames?
[20:05:00 CEST] <kepstin> zerodefect: ffmpeg only has one supported mpeg2 video encoder, the built-in one, and it does b-frames just fine.
[20:05:48 CEST] <zerodefect> I'll keep digging :)
[20:08:35 CEST] <kepstin> with the default B-frame strategy, it should just be using the 'max_b_frames' value as-is and encoding that many B-frames every time (aside from the end of gop, when there's sometimes weird behaviour)
[20:20:08 CEST] <Papi> I can't believe my problem has no solution
[20:20:09 CEST] <zerodefect> Yeah, the examples are pretty easy to follow, but I had to make some alterations to get them going because they use deprecated APIs. When I initially ran into this problem, I considered there was now additional work to get B-frames working.
[21:06:53 CEST] <DHE> My video decoding application using avcodec_send_packet/receive_frame is losing a huge amount of data. On a test stream that's only 30 seconds long (29.97fps) I lose ~59 frames when decoding.
[21:07:35 CEST] <DHE> Yes I send a null packet and collect as much output as possible from receive frame when the input ends. I get a couple extra frames out but not nearly enough for the whole video
[21:36:11 CEST] <kepstin> DHE: where are the frames getting "lost"? all missing from the end? are there gaps in timestamps scattered throughout?
[21:38:29 CEST] <DHE> kepstin: trying to determine that now... was hoping someone might have an idea off the top of their head of what I might have missed...
[21:39:14 CEST] <kepstin> the first thing to check is whether the file actually has all the frames you think it does, e.g. compare the output of ffprobe -show_frames
[21:39:27 CEST] <DHE> I'm outputting the video twice - once as a remux and once as a transcode - in the same process from the same AVFormatContext doing the reading. they're producing dramatically different lengths
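For comparison, the send/receive pattern including the end-of-input drain is meant to look roughly like the untested sketch below (`dec`, `frame` and `pkt` come from elsewhere, and each stream's decoder needs its own drain). If frames still go missing with this shape, the usual suspects are packets of that stream never reaching avcodec_send_packet(), or a single receive call per packet instead of a loop:

    #include <libavcodec/avcodec.h>

    /* sketch of the full receive loop per packet, plus the drain at EOF */
    static void decode_packet(AVCodecContext *dec, AVFrame *frame, AVPacket *pkt)
    {
        avcodec_send_packet(dec, pkt);               /* pkt == NULL enters drain mode */
        while (avcodec_receive_frame(dec, frame) == 0) {
            /* use frame */
        }
        /* receive_frame returns AVERROR(EAGAIN) when it just wants more input,
         * and AVERROR_EOF only after the NULL packet once everything is flushed */
    }

    /* at end of input, once per decoder: decode_packet(dec, frame, NULL); */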
[21:46:57 CEST] <imperito> Any suggestions on how I might make a web compatible live-stream out of a low-rate data source?
[21:47:43 CEST] <imperito> (I'm not very familiar with ffmpeg, the folks at #python recommended it)
[21:48:08 CEST] <DHE> HLS or MPEG-DASH might be usable straight-up, depending on player capabilities. these have the advantages of being served by an HTTP server. otherwise you need something like an RTMP server
[21:48:18 CEST] <kepstin> imperito: use ffmpeg to convert to the HLS (or DASH) segment muxer, re-encode to h264/aac if it's in some other codec. Serve files via HTTP server.
[21:49:00 CEST] <ChocolateArmpits> kepstin, he's talking about livestreaming, I don't think ffmpeg supports dash livestreaming
[21:49:24 CEST] <kepstin> hmm, it doesn't? pity. It's not something I've tried tho.
[21:49:29 CEST] <kepstin> the HLS should work at least.
[21:49:41 CEST] <imperito> Serve files... I guess I was assuming I'd emit some stream of whatever video format is in vogue and that would be my endpoint. Is that not how it is done?
[21:49:59 CEST] <furq> not if you want to stream to a browser without flash
[21:50:01 CEST] <ChocolateArmpits> kepstin, seems only webm dash
[21:50:10 CEST] <kepstin> imperito: ffmpeg isn't a server, so it needs some other distribution server to send the media to clients
[21:50:39 CEST] <kepstin> so e.g. HLS can be served by a web server, or another alternative is to send media via rtmp to the nginx-rtmp module, which can serve both HLS and rtmp streams to clients.
[21:51:13 CEST] <furq> if your encoder needs to be separate from the endpoint then use nginx-rtmp
[21:51:15 CEST] <DHE> in a local subnet you can multicast out a video stream. unicast streams are also possible but a receiver needs to be set up ahead of time.
[21:51:19 CEST] <furq> otherwise just use ffmpeg's hls muxer directly
[21:51:43 CEST] <furq> DHE: i don't know if that qualifies as "web-compatible"
[21:51:55 CEST] <imperito> OK, yes, I would definitely want to be served to a normal web browser. Like you open example.com/video in chrome or on an iphone and a video plays
[21:52:21 CEST] <DHE> yeah, HLS can make that work. I have some videos like that on live webcams.
[21:52:37 CEST] <furq> imperito: https://github.com/video-dev/hls.js
[21:52:42 CEST] <furq> you'll need that for desktop browsers
[21:52:43 CEST] <imperito> Yeah, that's pretty much the idea I have, is something like a webcam
[21:53:12 CEST] <DHE> yeah. I got a livestreaming webcam pointed at the construction site outside my office. running ffmpeg for the last ~2 weeks or so streaming
[21:53:20 CEST] <imperito> except instead of a camera producing the input it is a function
[21:53:56 CEST] <imperito> So would the input be like a directory that it could watch for frames, or a FIFO or something like that?
[21:54:16 CEST] <ChocolateArmpits> imperito, what's your input ?
[21:54:19 CEST] <DHE> fifo could work. a directory full of files works but not so well for live-updates
[21:54:23 CEST] <furq> if you're generating input programmatically then you probably just want to pipe it into ffmpeg
[21:54:47 CEST] <furq> python foo.py | ffmpeg -f rawvideo -i - ...
[21:54:48 CEST] <kepstin> if your input is actually a webcam, ffmpeg can probably read it directly
[21:54:54 CEST] <imperito> ChocolateArmpits: ultimately it is a radiometer on a satellite. but there is so much conversion I have to do that it might as well be programmatic
[21:55:41 CEST] <kepstin> yeah, piping it to ffmpeg is probably the best option then. Make sure you're using an IO thread on the writer side, since the writes might block.
[21:58:05 CEST] <imperito> So if I did something like python foo.py | ffmpeg [stuff] it would output these HLS files which if I put in an HTTP server it would have the effect of a live video?
[21:58:20 CEST] <furq> it generates a bunch of segments and an m3u8 playlist
[21:58:29 CEST] <imperito> Like, someone coming in later would get the current video, as opposed to starting from the beginning?
[21:58:37 CEST] <furq> point a video tag at the m3u8 and it'll play them back
[21:58:44 CEST] <furq> by default it'll start two or three segments from the latest
[21:58:57 CEST] <imperito> Oh, I've seen m3u8 files before. On totally legitimate live sports streaming sites...
[21:59:14 CEST] <furq> filthy
[22:00:12 CEST] <imperito> I guess I'd need to delete old... segments? Or does it do that automatically as like a rolling store?
[22:00:24 CEST] <furq> it does that automatically if you ask it to
[22:00:30 CEST] <imperito> clever
[22:00:34 CEST] <furq> !muxer hls
[22:00:34 CEST] <nfobot> furq: http://ffmpeg.org/ffmpeg-formats.html#hls-1
[22:03:10 CEST] <imperito> Thanks. piped input, HLS output, http server. Sounds reasonable
[22:06:27 CEST] <imperito> What about frame rate? Is it going to be a problem if my input only updates every 1-2 seconds?
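Pulling the thread together, the invocation would look something like this (untested; names, sizes and rates are placeholders): `python foo.py | ffmpeg -f rawvideo -pixel_format rgb24 -video_size 640x480 -framerate 1 -i - -c:v libx264 -pix_fmt yuv420p -g 50 -f hls -hls_time 4 -hls_list_size 6 -hls_flags delete_segments /var/www/stream/live.m3u8`. The delete_segments flag is the rolling cleanup furq mentions, and a slow source is fine as long as its rate is declared with -framerate; -vsync cfr or -vf fps=25 can duplicate frames if the output needs a constant rate.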
[22:10:29 CEST] <TMM> hi all
[22:11:31 CEST] <TMM> I'm working on completing some support for the interplay mve format (outside of ffmpeg at the moment). I'm using this information: https://wiki.multimedia.cx/index.php/Interplay_MVE. I haven't quite figured out how opcode 0x6 works, but I have more information
[22:11:56 CEST] <TMM> does someone here happen to know if multimedia mike is on irc? I have some questions and additions to that page
[23:07:08 CEST] <echelon> hi :)
[23:07:54 CEST] <echelon> i have an mpeg-ts container with h.264/aac encoded video, and i want to convert it to a .m4v container
[23:08:22 CEST] <echelon> will it need to be re-encoded?
[23:08:55 CEST] <ChocolateArmpits> echelon, no
[23:09:04 CEST] <echelon> cool
[23:09:30 CEST] <ChocolateArmpits> use this : ffmpeg -i input.mp4 -vcodec copy -an output.m4v
[23:09:44 CEST] <ChocolateArmpits> this will strip the audio and copy video frames only
[23:09:44 CEST] <sfan5> umm
[23:09:45 CEST] <echelon> why no -an?
[23:09:48 CEST] <sfan5> that will omit audio
[23:09:52 CEST] <sfan5> just use -c copy
[23:10:00 CEST] <ChocolateArmpits> isn't m4v strictly for video ?
[23:10:02 CEST] <furq> no
[23:10:03 CEST] <echelon> why do i wanna strip audio again? lol
[23:10:26 CEST] <ChocolateArmpits> ok then remove -an
[23:10:27 CEST] <sfan5> pretty sure .m4v is just a fancy way of saying .mp4
[23:10:30 CEST] <sfan5> i might be wrong though
[23:10:36 CEST] <furq> it is
[23:10:36 CEST] <sfan5> ChocolateArmpits: then he would be re-encoding the audio
[23:10:39 CEST] <furq> but then so is m4a
[23:10:52 CEST] <ChocolateArmpits> so he should use -codec copy
[23:10:56 CEST] <sfan5> indeed
[23:11:16 CEST] <furq> m4v and m4a are mostly used by itunes
[23:11:48 CEST] <echelon> another thing.. the .ts file lacks seekability, can't skip around easily as there's no timestamp when it plays
[23:11:54 CEST] <furq> m4a will use -f ipod with ffmpeg, but m4v and mp4 are identical
[23:12:04 CEST] <echelon> so will it be indexed and added to the m4v?
[23:12:07 CEST] <furq> echelon: that's an inherent property of mpegts
[23:12:15 CEST] <furq> mp4 will be seekable
[23:12:20 CEST] <echelon> oh, cool
[23:12:59 CEST] <echelon> furq: i copied the mpegts from an encrypted stream, so i don't know how it's able to let me seek on the web browser, but not in any local video player
[23:13:20 CEST] <furq> there's no seek index
[23:13:32 CEST] <furq> you can bruteforce seek, but some players just don't do that
[23:25:33 CEST] <notbrontosaurusr> can you please unban me?
[23:26:28 CEST] <notbrontosaurusr> freshly compiled ffmpeg on Debian with libxcb: How do I screencapture with mouse pointer?
[23:26:53 CEST] <notbrontosaurusr> this: ffmpeg -hide_banner -f x11grab -r $fps -s 1920x1200 -i :0.0 \
[23:26:53 CEST] <notbrontosaurusr> -vcodec libx264 -preset medium -tune fastdecode -pix_fmt yuv420p -crf 30 -an -y
[23:26:58 CEST] <notbrontosaurusr> is not working.
[23:27:26 CEST] <notbrontosaurusr> I mean, it is working, only without the pointer.
[23:29:01 CEST] <DHE> https://www.ffmpeg.org/ffmpeg-all.html#x11grab parameter "-draw_mouse 1"
[23:29:06 CEST] <DHE> start with that
[23:29:18 CEST] <notbrontosaurusr> DHE: thanks.
[23:31:28 CEST] <notbrontosaurusr> ffmpeg -hide_banner -f x11grab -draw_mouse 1 .... ?
[23:31:36 CEST] <DHE> looks right
[23:31:50 CEST] <notbrontosaurusr> Didn't do the trick, should i blame nvidia drivers?
[23:36:22 CEST] <DHE> uhh, what?
[23:36:32 CEST] <DHE> what's nvidia got to do with anything?
[23:37:06 CEST] <notbrontosaurusr> dunno, guessing, the other intel machine is working as it should (older ffmpeg compile though)
[23:39:38 CEST] <notbrontosaurusr> ok, it is working with debian version; ffmpeg version 3.2.5-1
[23:40:10 CEST] <notbrontosaurusr> DHE, thanks, later. And if you ppl can unban me, that would be great ;)
[23:52:50 CEST] <DHE> silly silly people asking nicely to be unbanned...
[00:00:00 CEST] --- Sat Jun 3 2017