[Ffmpeg-devel-irc] ffmpeg.log.20200114
burek
burek at teamnet.rs
Wed Jan 15 03:05:02 EET 2020
[00:31:42 CET] <zumba_addict> can ffmpeg be used to analyze and guess if there were dropped frames in a video?
[00:40:46 CET] <void09> depends on how they were dropped
[00:41:09 CET] <void09> dropped from a stream because of packet loss/insufficient bandwidth?
[00:42:15 CET] <zumba_addict> dropped from a stream
[00:43:33 CET] <cehoyos> In general, ffmpeg is not an analysis tool but a transcoder; for "real" (compressed) video, you will get (many) error messages for damaged streams
[00:43:47 CET] <zumba_addict> got it
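Note: a minimal, untested sketch of the kind of check cehoyos describes (the input filename is a placeholder) is to decode to the null muxer with the log level restricted to errors, which surfaces whatever error messages a damaged stream produces:

    ffmpeg -v error -i input.mp4 -f null -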
[03:36:01 CET] <javashin> hello
[03:36:03 CET] <javashin> amigos
[03:36:45 CET] <javashin> i have a problem: i made a large video with simple screen recorder
[03:36:54 CET] <javashin> and it's all trashed
[03:37:07 CET] <javashin> with a lot of flickering
[03:37:15 CET] <javashin> how can i fix the video?
[03:37:28 CET] <javashin> i really don't want to redo it
[03:37:42 CET] <javashin> is there any solution?
[03:40:26 CET] <nicolas17> javashin: hard to tell you how to fix it if we don't know how it's broken
[03:42:16 CET] <javashin> i made it on freebsd
[03:42:37 CET] <javashin> and it looks like the compositor caused the flickering
[03:43:04 CET] <javashin> without the compositor, the videos from simple screen recorder look fine
[03:43:20 CET] <javashin> the video is complete, it's not corrupted or anything
[03:44:00 CET] <javashin> it's just that the image in the video flickers a lot
[03:44:17 CET] <javashin> like an unwanted visual effect
[11:04:22 CET] <lofo> hi
[11:07:36 CET] <lofo> i was tweaking parameters in OBS and ended up using their "Custom Output (FFmpeg)" panel. In the "container format" field, which is a dropdown list, there is an "mpegts" but also an "rtp_mpegts", and i was wondering what it corresponds to in ffmpeg
[11:07:53 CET] <lofo> and what the difference is between "mpegts" and "rtp_mpegts"
[11:08:08 CET] <lofo> people from OBS told me this list was directly pulled from ffmpeg
[11:10:58 CET] <JEEB> it's some muxer available through lavf, you'd have to look it up specifically to figure out what it actually does and what it's useful for.
[11:11:19 CET] <lofo> lavf = libavformat ?
[11:11:41 CET] <JEEB> yes
[11:11:42 CET] <JEEB> .name = "rtp_mpegts",
[11:11:42 CET] <JEEB> .long_name = NULL_IF_CONFIG_SMALL("RTP/mpegts output format"),
[11:12:07 CET] <lofo> thanks JEEB
[11:12:20 CET] <JEEB> seems to just use mpegts and rtp inside
[11:12:40 CET] <JEEB> probably to pass MPEG-TS through RTP, since RTP itself is on the lavf level as well
[11:13:07 CET] <lofo> i don't understand why the two are mangled together
[11:13:39 CET] <lofo> one is a transport protocol the other a container
[11:14:27 CET] <JEEB> as far as I can see, RTP has a separate packetizer and a protocol thing
[11:15:18 CET] <JEEB> and rtp_mpegts is then the packetizer that utilizes MPEG-TS
[11:15:30 CET] <JEEB> not perfect
[11:15:35 CET] <JEEB> but that's how it is atm
[11:22:05 CET] <lofo> ok :)
[11:26:48 CET] <JEEB> of course the fun part with these that utilize other things internally is how you handle option passing :)
[11:27:04 CET] <JEEB> we've had plenty of fun with things like MPEG-DASH or HLS
[11:27:20 CET] <JEEB> where you have the "muxer" doing the playlist writing etc
[11:27:32 CET] <JEEB> and then calling internally mpeg-ts or mp4 muxers
[11:32:24 CET] <lofo> it looks like ffmpeg hits the conceptual limits of "muxer" :)
[11:32:41 CET] <lofo> quite confusing for a noob like me
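Note: to make the difference above concrete, an untested sketch (the input file and multicast address are placeholders): -f mpegts writes a raw transport stream straight to the socket, while -f rtp_mpegts wraps the same transport stream in RTP packets before sending it:

    ffmpeg -re -i input.mp4 -c copy -f mpegts udp://239.0.0.1:1234
    ffmpeg -re -i input.mp4 -c copy -f rtp_mpegts rtp://239.0.0.1:1234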
[13:17:27 CET] <faLUCE> hello. does ffmpeg need to be compiled from scratch for ogg/theora support, or are there binaries for linux?
[13:28:25 CET] <tablerice> faLUCE: Seems like ffmpeg should support ogg by default. However, in the general docs it looks like Theora decoding is supported, but encoding is only supported via external libraries
[13:30:16 CET] <faLUCE> tablerice: I need to do screen capture with theora codec. And I don't want to recompile ffmpeg
[13:30:45 CET] <faLUCE> should I switch to another tool?
[13:31:48 CET] <faLUCE> tablerice: solved. I just found recordmydesktop app
[13:32:03 CET] <tablerice> My first thought was OBS
[13:32:20 CET] <faLUCE> what's OBS?
[13:32:21 CET] <tablerice> https://obsproject.com
[13:32:34 CET] <tablerice> Screen recorder / streaming platform
[13:32:41 CET] <tablerice> it runs ffmpeg under the hood
[13:32:46 CET] <pink_mist> afaik it's what most twitch streamers use
[13:33:11 CET] <faLUCE> does it support theora encoding?
[13:33:26 CET] <tablerice> I'm rummaging through the settings right quick...
[13:33:59 CET] <tablerice> Yea, it looks to support theora and ogg
[13:34:31 CET] <faLUCE> tablerice: thanks
[13:35:10 CET] <tablerice> No problem. I just went to settings > output > advanced and chose 'custom output (ffmpeg)'
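Note: for the screen-capture use case above, a hedged sketch assuming a Linux/X11 desktop and an ffmpeg build that includes libtheora (the external encoding library tablerice mentioned); the display, capture size, and output name are placeholders:

    ffmpeg -f x11grab -video_size 1920x1080 -framerate 25 -i :0.0 -c:v libtheora -q:v 7 screencast.ogv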
[14:29:05 CET] <EmmaT> ffmpeg seems to output the files in a different directory than the working directory, or i am doing something wrong
[14:30:15 CET] <pink_mist> if you specified a different directory for ffmpeg to output the files into, sure
[14:30:23 CET] <pink_mist> but otherwise it would definitely use the working directory
[14:30:40 CET] <pink_mist> or maybe you're running a script that changes the working directory for ffmpeg
[14:31:59 CET] <EmmaT> pink_mist i booted up an old project ... on mac now ... and am using a different static build which is behaving very strangely in terms of output ... but no ... the command lists the output correctly ... i get the playlist file correctly ... the hls files not
[14:33:26 CET] <EmmaT> command
[14:33:27 CET] <EmmaT> ./ffmpeg -reconnect 1 -reconnect_at_eof 1 -reconnect_streamed 1 -reconnect_delay_max 3000 -i "http://user:password@192.168.0.11:9981/play/stream/service/2ec9eb7f5285a9345c59929084fccc8f" -vf "scale=floor(iw*min(1\,min(1024/iw\,720/ih))/2+0.4999999)*2:-2" -vf setdar=16/9 -crf 24 -preset veryfast -threads 1 -c:v libx264 -r 24 -g 96 -x264opts keyint_min=96 -x264-params scenecut=0 -map 0:v:0 -map 0:a:0 -f hls -hls_list_size 100 -hls_time 4 -hls_flags
[14:33:28 CET] <EmmaT> delete_segments -sn /c/projects/momomo/.generated/Tv/streams/KANAL9_2346076453/
[14:37:18 CET] <pink_mist> well you're specifically setting the output to a dir there
[14:37:29 CET] <pink_mist> I'll presume that means it'll just use that dir
[14:37:50 CET] <pink_mist> but I'm far from an expert ... I would have expected it would complain that it wasn't a filename
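Note: the usual pattern for the situation above, as a hedged sketch with hypothetical paths, is to give the hls muxer a playlist filename and set the segment location explicitly with -hls_segment_filename instead of passing a bare directory:

    ffmpeg -i INPUT -c:v libx264 -c:a aac -f hls -hls_time 4 -hls_list_size 100 -hls_segment_filename /path/to/out/segment_%05d.ts /path/to/out/playlist.m3u8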
[14:50:58 CET] <lofo_> on the commandline tool how does one specify a muxer explicitely ? instead of implicitely choosing it with the output file extension ?
[14:51:31 CET] <JEEB> -f
[14:51:47 CET] <JEEB> -f FORMAT, and formats can be listed with -formats
[14:51:52 CET] <EmmaT> pink_mist maybe it has to do with .generated ... since the output is going to the parent folder
[14:52:36 CET] <lofo_> oh, i'm confused with how the -f option work. take this command for example
[14:52:36 CET] <lofo_> ffmpeg -f avfoundation -i "0" -c:v mpeg2video -q:v 20 -pix_fmt yuv420p -f rtp rtp://192.168.1.47:1234
[14:52:44 CET] <EmmaT> but the strange thing is that the playlist is getting in there
[14:52:45 CET] <lofo_> there are two -f options
[14:53:15 CET] <JEEB> before/after
[14:53:17 CET] <JEEB> input/output
[14:53:28 CET] <lofo_> so you can only specify -f two times ?
[14:53:29 CET] <JEEB> -i INPUT being the differentiator
[14:53:36 CET] <JEEB> for each input and output it can be set
[14:53:45 CET] <DHE> ffmpeg [options for file] [input-or-output file]
[14:53:50 CET] <DHE> and that pattern repeats
[14:54:05 CET] <EmmaT> pink_mist it was that
[14:54:11 CET] <EmmaT> bug!
[14:54:16 CET] <JEEB> ffmpeg -f blah -i blah -f blah -i INPUT_STREAM -f outblah output1 -f outblah output2
[14:54:17 CET] <EmmaT> ffmpeg can not handle .folders
[14:54:20 CET] <lofo_> DHE so i could put as many -f as i want
[14:54:42 CET] <JEEB> there, I have an example with two inputs and outputs set and f set for all of them :P
[14:54:48 CET] <DHE> lofo_: ffmpeg -f image2 -i "%04d.png" -f mp4 -i input.mp4 -filter_complex hstack -f matroska output.mkv
[14:54:59 CET] <DHE> 2 inputs, 1 output, formats for each explicitly set
[14:55:18 CET] <DHE> I'm sure the filter is wrong but you get the idea
[14:55:39 CET] <lofo_> so -i is the only way to differentiate output from input as JEEB stated
[14:55:57 CET] <DHE> "-i" in front of a filename is an input file. else it's an output file
[14:56:08 CET] <lofo_> i get it
[14:56:21 CET] <lofo_> then for -f rtp rtp://192.168.1.47:1234
[14:56:35 CET] <lofo_> rtp is the format ? so i should replace it with rtp_mpegts ?
[14:57:04 CET] <lofo_> if i want to use rtp_mpegts as a muxer
[14:57:24 CET] <DHE> ffmpeg -muxers # 'D' means input, 'E' means output (demuxing, muxing)
[14:57:35 CET] <DHE> err, -formats
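Note: applied to lofo_'s earlier command, swapping the output format would look like this (untested; the avfoundation device index and destination address are carried over from the original command):

    ffmpeg -f avfoundation -i "0" -c:v mpeg2video -q:v 20 -pix_fmt yuv420p -f rtp_mpegts rtp://192.168.1.47:1234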
[15:11:07 CET] <lofo_> thanks i get the syntax now
[15:11:51 CET] <lofo_> are twitch commands printed to everyone
[15:11:59 CET] <lofo_> oops sorry :/
[16:59:08 CET] <pagios> hi, what is the use of "-g",90 ?
[17:00:08 CET] <pagios> trying to transcode rtmp -> HLS, i am using -vf scale, -vb bitrate, -g 90, -framerate 30, -hls_time 3 <-- can i optimize this to make it work on a slow machine?
[17:03:22 CET] <cehoyos> Your (second) question is missing _all_ necessary information to be answered in a useful way...
[17:03:52 CET] <cehoyos> The option "g" specifies the distance between key frames; I believe it has a very small impact on performance
[17:06:19 CET] <pagios> cehoyos, https://pastebin.com/EA5aL7AJ i am getting these errors continuously when the transcoding happens
[17:07:17 CET] <cehoyos> (No errors are shown in your paste) Please never paste extracts of random console output; always show us the command line you used together with the complete, uncut console output.
[17:08:00 CET] <cehoyos> Note that your console output (the little that can be seen) indicates that your "slow machine" is fast enough for the task you gave it.
[17:11:33 CET] <pagios> cehoyos, https://pastebin.com/mT9YtJZT this is the command from node and the output
[17:29:39 CET] <cehoyos> Apart from an old binary (that may have less optimizations and more bugs but gets less support), everything looks fine.
[17:35:35 CET] <pagios> cehoyos, is it normal to get these stderr errors?
[17:37:01 CET] <pink_mist> what error?
[17:37:23 CET] <pink_mist> stderr is often used to output information to the user, none of that needs to be an error
[17:46:58 CET] <pagios> pink_mist, cehoyos is there a way to optimize that to make it work on a slower cpu
[17:53:47 CET] <cehoyos> Start with a current binary, but note that depending on what "slower cpu" means, wonders are typically difficult to achieve...
[17:56:17 CET] <cehoyos> And remember that you can tell the x264 encoder how fast you need it to run (placebo, veryslow, slower, slow, medium, fast, faster, veryfast, superfast, ultrafast)
[17:59:24 CET] <pagios> cehoyos, on a slow pc , better to run placebo?
[18:02:01 CET] <DHE> no. you want a fast setting. veryfast is as far as I'd go because I find quality takes a serious hit for superfast and ultrafast
[18:02:16 CET] <DHE> $ ffmpeg -i filename .... -preset:v veryfast ... output
[18:09:36 CET] <cehoyos> But depending on what a "slower cpu" is, you may have to test superfast and ultrafast...
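Note: putting the advice above together, a hedged sketch for the rtmp -> HLS case (input URL and output path are placeholders): with 30 fps output, -g 90 places a keyframe every 3 seconds, which lines up with -hls_time 3 so each segment can start on a keyframe, and a fast preset keeps the CPU load down:

    ffmpeg -i rtmp://example.com/live/stream -c:v libx264 -preset veryfast -g 90 -r 30 -c:a aac -f hls -hls_time 3 -hls_list_size 100 /var/www/hls/out.m3u8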
[23:40:36 CET] <tomb^> Hi, I have a file with audio stream: "Stream #0:1: Audio: pcm_s24le, 48000 Hz, 4.0", I'm trying to make 1 stereo stream out of channel 1&2 (don't need 3 and 4), I tried: "channelsplit=channel_layout=4.0[FL][FR][BL][BR];[FL][FR]amerge=inputs=2[aout]" but I get an error saying "Filter channelsplit:BC has an unconnected output", what am I doing wrong?
[23:46:03 CET] <furq> tomb^: "pan=stereo|c0=c0|c1=c1"
[23:51:03 CET] <tomb^> furq, thanks for the quick reply, it seems to be working, now I'm trying to figure out why :)
[23:58:10 CET] <furq> the error was because you had BL and BR labels not connected to anything
[23:58:26 CET] <furq> you could have also done channelsplit=channel_layout=4.0:channels=FL|FR,amerge=inputs=2
[23:58:43 CET] <furq> but pan is simpler
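Note: in full, furq's suggestion might be applied like this (untested; filenames are placeholders). pan builds a stereo output whose channel 0 is input channel 0 and whose channel 1 is input channel 1, so the two rear channels are simply dropped:

    ffmpeg -i input.mov -af "pan=stereo|c0=c0|c1=c1" -c:a pcm_s24le output.wav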
[00:00:00 CET] --- Wed Jan 15 2020