[Ffmpeg-devel-irc] ffmpeg.log.20190731

burek burek021 at gmail.com
Wed Aug 21 14:07:15 EEST 2019


[00:18:28 CEST] <Saccarab> when you merge a lot of files using the concat demuxer, can you make sure every audio file starts on a specific timestamp?
[00:19:16 CEST] <Saccarab> I've tried adding silent audio between each file but the precision gets lost after the first 30-40 files
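(A minimal sketch of one way to pin those start times down, assuming hypothetical file names: the concat demuxer accepts a per-entry "duration" directive, and subsequent entries are timestamped from it, so small length differences shouldn't accumulate into drift.)

    # list.txt
    file 'clip1.wav'
    duration 10.000
    file 'clip2.wav'
    duration 10.000

    ffmpeg -f concat -safe 0 -i list.txt -c copy merged.mka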
[00:54:46 CEST] <^Neo> hi! I'm curious if anyone can recommend a player or application to easily manage playing out multicast streams?
[00:55:00 CEST] <^Neo> I was using VLC on an NVIDIA shield, but typing in the multicast address for each one is a pain
[03:02:36 CEST] <NubNubNub> hi, i wanted to know if it is possible for ffmpeg to switch inputs in a complex filter sequence ... my goal is to have two live streams, where one is displayed in full and the 2nd is displayed smaller; this works fine, but now i want to be able to swap the two sources, eg make the small one large and the large one small
[03:04:19 CEST] <klaxa> use obs
[03:04:38 CEST] <klaxa> that'll save you a lot of headaches
[03:04:40 CEST] <furq> yeah i'm pretty sure lavfi can't do that
[03:04:51 CEST] <furq> you could fake it but not without wasting some cpu time
[03:05:00 CEST] <klaxa> and adding latency
[03:05:41 CEST] <furq> also yeah if the inputs are live streams then the ffmpeg cli generally won't do well with that
[03:05:52 CEST] <furq> there's no mechanism for reconnecting one stream if it drops, etc
[03:06:11 CEST] <NubNubNub> the streams are proxied in my case
[03:08:53 CEST] <NubNubNub> i thought that the streamselect filter might be the one to go for, but looks like i am too dumb to use it
[03:14:48 CEST] <NubNubNub> how would i use streamselect properly with having two input sources defined? eg ffmpeg -i source1 -i source2 -filter_complex "<do voodoo streamselect>" [-other options]
[03:42:13 CEST] <Prelude2004c> hey everyone.. good evening. I have a question. I have a source with a bunch of program IDs being pushed to the server, but i can only get the first one, udp://localhost:3000 for example, because the second ones always say address in use. Using multicast it seems to be ok, but how do i copy over, like ffmpeg udp://localhost:3000 -f mpegts udp://235.255.x.x:xxxx .. i tried -c copy but it only pushes out the 1st program ID... i also tried -map 0 for all but
[03:42:13 CEST] <Prelude2004c> it's overloading the server. I just want to pass input to output over udp. What gives ?
[04:19:57 CEST] <kode54> Prelude2004c: UDP has no packet regulation
[04:20:11 CEST] <kode54> it fires as fast as the link supports, with no acknowledgement or retry
[04:24:46 CEST] <DHE> trying to use ffmpeg as a transport stream demuxer?
[04:37:44 CEST] <ossifrage> lol, I was thinking about using clapper as a web video player. I came home tonight to find the github.com/clappr/clappr page had grown to >2GB on chrome (without ever clicking on the embedded player at the bottom of the page)
[04:41:05 CEST] <ossifrage> It seems like all the github pages have started growing (potentially without bound, but the clapper one was the worst)
[08:06:17 CEST] <Harzilein> hi
[12:46:40 CEST] <termos> i'm trying to queue up some AVFrames in an AVThreadMessageQueue, but the frames coming out are rejected by my filter graph with av_strerror just being "Invalid argument". It seems like all metadata such as width/height is set correctly in the AVFrame coming out of the queue, any ideas?
[13:03:25 CEST] <durandal_1707> termos: not enough info to help, sorry
[13:05:30 CEST] <termos> hmm yes I realize, I just can't find examples of using that thread queue for AVFrames, only AVPackets. Wondering if I'm pushing and pulling them correctly
[13:05:58 CEST] <termos> I expect the issue is some reference counting problem
[13:10:22 CEST] <machtl> hi Guys again!
[13:11:27 CEST] <machtl> when i run "ffprobe -v error -select_streams v:0 -show_entries stream=duration -of default=noprint_wrappers=1:nokey=1 -i test.mp4", what unit is the returned duration in??
[13:12:04 CEST] <machtl> i'm asking because i have a problem working with the returned ffprobe duration when atrim-ing audio
[13:15:15 CEST] <machtl> so what i'm doing is visible in this pastebin https://pastebin.com/h6KMkWxx later. My problem now is that somehow the second audio stream, tagged aa7 and aa8, is not outputting audio
[13:16:39 CEST] <machtl> this could be because this is a livestream and i'm trying to get to the "live" position and i cannot read -re of the stream, but why does it work with audio stream aa6?
[15:15:44 CEST] <Freneticks> hello is there a way to record live stream (mpeg-ts) to pure mp4 format ?
[15:16:06 CEST] <Freneticks> actually if i use -c copy myfile.mp4 the file is in mpeg-ts format
[15:18:52 CEST] <DHE> mp4 isn't stream friendly. you can do it, but it will be basically unplayable until you cleanly shut down ffmpeg
[15:19:29 CEST] <DHE> well, under certain interpretations of "stream friendly"
[15:19:57 CEST] <Freneticks> And what if I segment it into small mp4 files? like 10 min each?
[15:20:08 CEST] <Freneticks> (that's what I'm trying to do)
[15:20:35 CEST] <DHE> the rule applies to each file. when it's complete it's playable
[15:21:19 CEST] <Freneticks> DHE: okay, but is there a special command to accomplish that ? i specify .mp4 and i get mpeg-ts in the media header
[15:21:37 CEST] <ritsuka> if you want to play it back while recording you can make a fragmented mp4
[15:21:54 CEST] <Freneticks> ritsuka: yeah i'm looking for that
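(A sketch of the fragmented-mp4 approach ritsuka mentions, with a hypothetical input address: the mov/mp4 muxer's fragmentation flags write the file as small self-contained fragments, so it stays playable while still being recorded.)

    ffmpeg -i udp://239.0.0.1:1234 -c copy -movflags +frag_keyframe+empty_moov output.mp4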
[15:25:07 CEST] <Harzilein> btw, am i missing something about -f hls in that when working from a non-streaming source it will _instantly_ overwrite the playlist?
[15:25:38 CEST] <JEEB> I think you have to specifically set the type of playlist
[15:26:54 CEST] <DHE> the hls muxer isn't directly aware of the source material type. it writes a new playlist file every X seconds of input video. for streaming sources that will literally be every X seconds. otherwise it will depend on the performance of ffmpeg (encoding, etc)
[15:26:59 CEST] <JEEB> yea
[15:27:28 CEST] <Harzilein> DHE: yeah, i was kind of hoping that it'd write them out in realtime.
[15:27:38 CEST] <JEEB> although I think it might be calculating it based on PTS
[15:27:41 CEST] <JEEB> not realtime
[15:27:54 CEST] <JEEB> hls_list_size etc limiting the amount of segments
[15:28:04 CEST] <JEEB> and hls_time being how long the segments should be
[15:28:18 CEST] <Harzilein> i.e. what i end up with is a lot of segment files (which i thought would be deleted automatically too) and a playlist containing only the final few segments
[15:28:28 CEST] <JEEB> ah yes
[15:28:32 CEST] <JEEB> by default it keeps the old segments
[15:28:33 CEST] <DHE> yeah then your list size is too small
[15:28:40 CEST] <JEEB> there's a flag "delete_segments"
[15:28:42 CEST] <DHE> there is an option -hls_flags delete_segments or such
[15:28:52 CEST] <JEEB> which tells the muxer that it is OK to remove segments
[15:32:21 CEST] <Harzilein> argh, i was scanning the "hls_delete_threshold size" paragraph and overlooked the part where it says it's conditional on the delete_segments flag. i assumed that because of the default at the end i should end up with 5 lines in the playlist and 6 files remaining.
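(Putting the options mentioned above into one hedged example, with a hypothetical local input file: hls_time is the target segment length, hls_list_size how many segments the playlist keeps, and the delete_segments flag lets the muxer remove segment files that have dropped out of the playlist. -re throttles a file input to realtime, which relates to the point below about the input speed not being limited.)

    ffmpeg -re -i input.mp4 -c copy -f hls -hls_time 6 -hls_list_size 5 -hls_flags delete_segments out.m3u8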
[15:35:16 CEST] <Harzilein> still, apparently it was wishful thinking on my side in general... so i have to write them out _without_ deleting, then parse the playlist, then schedule the rotation myself? just because the input speed is not limited? :/
[15:54:34 CEST] <DHE> Harzilein: what exactly are you trying to accomplish?
[16:03:01 CEST] <ocx32> hi community, reading about webrtc and RTP , can i think of the workflow of RTP as the same for a webrtc endpoint? offer/answer etc? or is it different? Thanks
[16:05:24 CEST] <Harzilein> DHE: say i have a static file on my server that i want to watch on my pc, then have that pc go offline (i.e. no channel to notify everyone else of my playing position) and be able to "nearly resume" the realtime stream on another device. my mental model is that ffserver circular buffer thing...
[16:29:31 CEST] <Harzilein> DHE: hmm... but maybe with ffserver that only worked by sheer happenstance as well...
[16:29:57 CEST] <Harzilein> DHE: clearly the source producing data in realtime is the common case
[16:57:17 CEST] <Harzilein> DHE: i think i used the -re option with ffserver
[17:45:02 CEST] <blb> durandal_1707: err, thanks for handling that. though it wasn't me spamming, sorry for that. i had that guy ignored, but unfortunately he could still bug the channel.
[18:24:43 CEST] <someAlex> Hello. I want to upgrade from ffmpeg-2.6.8 to ffmpeg-3.4.6 , but first I would like to check, are there any significant CLI changes (so I won't break any compatibility in my or other users' scripts). I'm looking at ffmpeg changelog ( https://github.com/FFmpeg/FFmpeg/blob/master/Changelog ), but I don't really see anything significant between mentioned versions (a couple of removed codecs/libs).
[18:25:31 CEST] <someAlex> The question is: any ideas, where I can find information about significant CLI changes? Or, were there such changes at all?
[18:28:22 CEST] <cehoyos> 3.4.6 is old and unsupported. If you find cli incompatibilities between 2.6 and current FFmpeg git head, please tell us!
[18:46:57 CEST] <kepstin> someAlex: in general the ffmpeg cli has stayed quite compatible over the years.
[18:47:05 CEST] <kepstin> most of the gotchas are codec related
[18:47:42 CEST] <kepstin> e.g. if you select an encoder by codec name "-c:a opus" instead of picking a specific implementation "-c:a libopus", then stuff might break
[18:47:47 CEST] <kepstin> not that 2.6 had opus support lol
[18:48:36 CEST] <kepstin> someAlex: and please use a version newer than 3.4 if you're updating anyways :)
[18:51:47 CEST] <someAlex> kepstin: 3.4 is the latest version available for centos, as far as I can see (rpmfusion repo, official ffmpeg site leads there for fedora/RHEL). I would like to avoid building ffmpeg from source, and also avoid using (unpopular) 3rd party repos.
[18:52:13 CEST] <kepstin> someAlex: you should probably build it yourself or use the static linux build
[18:56:46 CEST] <kepstin> i ended up building a private ffmpeg package at work to use on ubuntu boxes :/
[18:57:09 CEST] <someAlex> kepstin: ok, thank you
[20:10:51 CEST] <whitestone> hello people
[20:12:10 CEST] <dastan> is someone around to help me with nvidia acceleration?
[20:12:20 CEST] <saml> is nvidia good?
[20:12:27 CEST] <saml> i'm here to help but i noob
[20:12:35 CEST] <dastan> i have it installed and compiled ok
[20:12:37 CEST] <saml> i only know node.js
[20:12:49 CEST] <dastan> yes, i am working with it
[20:12:57 CEST] <dastan> but i want to do something else
[20:13:00 CEST] <saml> and is it not using nvidia acceleration ?
[20:13:11 CEST] <dastan> yes, its working
[20:13:16 CEST] <saml> i thought ffmpeg supports gpu acceleration out of the box
[20:13:30 CEST] <dastan> but i want to use hardware acceleration with decklink cards
[20:19:20 CEST] <dastan> is it possible to accelerate the insertion of frames in decklink cards?
[20:27:30 CEST] <Ua-Jared> Hey all, I have a performance question related to ffmpeg if I may. I've got an IP camera that streams an H264-encoded stream. If I play the stream with VLC, I see about 1% CPU usage and 3-4% GPU usage in task manager. But if I "play" the stream with the ffmpeg command (i.e., make the input the stream URI over the local network, and the output an mp4 file),
[20:27:31 CEST] <Ua-Jared> I get around 17% CPU usage and 2% usage. Why would the ffmpeg command use so many more resources? I know VLC uses ffmpeg in the background, so it's confusing to me why the pure ffmpeg command would give worse performance..... System Specs, command output, VLC log and such here: https://pastebin.com/MNZx7amu
[20:29:16 CEST] <durandal11707> Ua-Jared: ffmpeg does not use hardware decoders/encoders by default, you need to enable that explicitly
[20:30:13 CEST] <Ua-Jared> durandal11707: does the following command tell it to use the hardware decoder? I thought I had gotten this right, haha: ./ffmpeg -hwaccel dxva2 -threads 1 -loglevel verbose -i "rtsp://admin:admin@192.168.15.20/media/video2" sampleOutputWithGPUAcelAndThreads1.mp4
[20:31:35 CEST] <durandal11707> dunno, never used hwaccel stuff
[20:32:34 CEST] <jkqxz> Yes, but that is transferring the decoded frames from the GPU and encoding them on the CPU as well.  VLC is only decoding and displaying on the GPU, with no frames on the CPU.
[20:33:07 CEST] <jkqxz> The nearest comparison with ffmpeg only would be to decode and throw away the result (since it has no display).
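(A sketch of the comparison jkqxz describes: decode with dxva2 and discard the result via the null muxer, so neither encoding nor writing a file enters the measurement. The URL is the one from Ua-Jared's earlier command.)

    ffmpeg -hwaccel dxva2 -i "rtsp://admin:admin@192.168.15.20/media/video2" -f null -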
[20:35:18 CEST] <Ua-Jared> Ohh.... wait so, with ffmpeg, I'm reading in the H264 encoded stream, decoding it on the GPU, transferring those frames back to the CPU, REENCODING them, and then saving them in the mp4 file? That's a little silly. The mp4 file could just take the encoded frames right away, right? I'm unfamiliar with how mp4 works but my boss
[20:35:19 CEST] <Ua-Jared> seemed to hint that it just stores the base picture and then deltas from there, like encoded H264 does
[20:36:09 CEST] <durandal11707> Ua-Jared: that is what -c:v copy is for
[20:37:00 CEST] <durandal11707> your original command just does a transcode
[20:42:37 CEST] <kepstin> Ua-Jared: note that I do not recommend saving a live stream to an mp4 file, because if something goes wrong and file writing is interrupted, the mp4 file format will get corrupted
[20:42:55 CEST] <Ua-Jared> How exactly would I use -c:v flag? i'm still a little confused about it. Is it that I'm effectively decoding the stream, only to then re-encode it for the mp4 file?
[20:43:17 CEST] <durandal11707> -c:v copy just demuxes and remuxes, nothing else
[20:43:33 CEST] <kepstin> Ua-Jared: the "-c copy" option disables ffmpeg's decoding/encoding functionality, and makes it copy the already encoded video without modification
[20:44:02 CEST] <Ua-Jared> kepstin: noted. Would an .avi or something be better? To be honest I didn't know the best way to approach it; my real goal is to use ffmpeg to do the gpu-accelerated decoding, and then (hopefully) read in those decoded frames in my Java program to do some further processing
[20:44:16 CEST] <kepstin> Ua-Jared: consider writing the live data to e.g. an mkv file; you can always convert to a different format later.
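(Combining the two suggestions above into one hedged sketch: stream-copy the already-encoded H264 from the camera into an mkv container, with no decoding or re-encoding at all.)

    ffmpeg -i "rtsp://admin:admin@192.168.15.20/media/video2" -c copy recording.mkv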
[20:47:04 CEST] <furq> Ua-Jared: if you want to do further processing then you probably want -c:v rawvideo
[20:49:58 CEST] <kepstin> ffmpeg's yuv4mpeg writer actually doesn't take rawvideo annoyingly. you use it without any -c:v option.
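(Illustrating kepstin's point with a sketch, assuming a hypothetical input file: let the yuv4mpeg muxer pick its own frame format by omitting -c:v entirely; explicitly adding -c:v rawvideo here gets rejected.)

    ffmpeg -i input.mkv -f yuv4mpegpipe out.y4m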
[20:50:12 CEST] <Ua-Jared> Ayyyy thank yall for the guidance! That makes a ton of sense. I actually had tried JavaCV to do this too (and OpenCV itself earlier), but it's a little difficult to specify that you want GPU decoding with those. So I figured I'd just use the underlying tech itself, haha
[20:50:29 CEST] <kepstin> (it actually uses wrapped avframes, which leads to this weirdness)
[20:51:12 CEST] <furq> fun
[20:52:24 CEST] <kepstin> someone ran into that issue here a week or two ago, was pretty puzzling :/
[21:28:54 CEST] <Ua-Jared> So, to be honest I'm still a little confused on how I'd use ffmpeg to decode the video and then pipe it into my java program... What would the command look like? I thought something like ffmpeg -hwaccel dxva2 -threads 1 -i "myInputStreamURL" -f matroska - | java myProgram , but I am a little confused how it works.
[21:28:54 CEST] <Ua-Jared> Would this pipe the output of ffmpeg, which is decoded images from the stream, into my Java program?
[21:31:55 CEST] <durandal11707> how do I know, i never looked at your java program
[21:34:05 CEST] <Ua-Jared> Assuming my Java program reads from stdin and all. Like I understand piping, right: the stdout of one program is piped into the stdin of another program / utility. But I am a little confused here what the stdout of ffmpeg would be
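(One hedged way to answer that: ask ffmpeg to write raw decoded frames in a fixed pixel format to stdout, and read that byte stream on the Java side; width, height and pixel format have to be agreed on out of band. The URL and class name are the ones from the question above.)

    ffmpeg -hwaccel dxva2 -i "myInputStreamURL" -f rawvideo -pix_fmt bgr24 - | java myProgram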
[21:37:23 CEST] <BtbN> Piping in raw decoded video via stdin would be incredibly inefficient
[21:37:38 CEST] <BtbN> If you want to decode video in your Application, use the libraries, not the cli tool
[21:38:54 CEST] <Ua-Jared> Oh, I see. Isn't ffmpeg just C though? How could I use it with Java? I was trying to avoid using JNI if possible, haha, though it is a cool technology
[21:39:45 CEST] <Hello71> we try to avoid java if possible.
[21:42:49 CEST] <Ua-Jared> Hmm, I see. I kinda don't have a choice unfortunately, lol. And I don't mind Java actually, it's pretty nice I find
[22:02:05 CEST] <kepstin> note that javacv is, among other things, a jni wrapper of ffmpeg
[22:02:12 CEST] <kepstin> er, of libav*
[22:45:59 CEST] <NubNubNub> can anybody provide a sample of how to use filter_complex "streamselect" properly, please?
[22:54:34 CEST] <kepstin> NubNubNub: it's basically useless with the ffmpeg cli command, use the "-map" options instead
[22:55:12 CEST] <kepstin> the only real use it has is to toggle between streams at runtime by sending a filter command. this is possible to do with ffmpeg-cli, but kind of a pain
[22:56:01 CEST] <NubNubNub> this is exactly what i am trying to achieve
[22:58:09 CEST] <kepstin> i thought there was a way to send filter commands to ffmpeg cli's stdin, but i've never used that. probably the easiest way to set this up is to use the zmq filter: https://ffmpeg.org/ffmpeg-filters.html#zmq_002c-azmq which can inject filter commands into the filter graph.
[22:58:39 CEST] <NubNubNub> from what i understood it can map -input SRC1 -input SRC2 to various output pads, is that correct?
[22:59:36 CEST] <kepstin> streamselect has multiple inputs, one output. it forwards data from one input to the output and discards the other streams
[22:59:53 CEST] <kepstin> you can switch which stream it's using as an input at runtime.
[23:01:06 CEST] <NubNubNub> like in "[src1][src2] streamselect=input=2:map=1 [out]" ?
[23:01:35 CEST] <kepstin> yeah, in that example it would be forwarding [src2] to [out] and discarding [src1]
[23:01:54 CEST] <kepstin> if you send it a command "map 0" later, then it would start forwarding [src1] to [out] instead
[23:02:27 CEST] <kepstin> i think
[23:02:37 CEST] <kepstin> you know, I actually need to double-check this now, I might have it wrong
[23:02:43 CEST] <kepstin> docs aren't super clear
[23:02:50 CEST] <NubNubNub> the official docs are quite sparse for this.
[23:04:55 CEST] <kepstin> yes, what I said looks correct, it uses framesync internally to sync the inputs.
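(A sketch of the setup being discussed, with hypothetical sources and output: two inputs feed streamselect, the zmq filter sits in the same chain to receive commands, and a second shell switches which input reaches the output. This needs an ffmpeg built with --enable-libzmq, and the "Parsed_streamselect_0" target follows lavfi's automatic instance naming, so it may differ in a real graph.)

    ffmpeg -i src1.mp4 -i src2.mp4 \
      -filter_complex "[0:v][1:v]streamselect=inputs=2:map=1,zmq[out]" \
      -map "[out]" -f mpegts out.ts

    # from another shell, using the zmqsend tool shipped in the FFmpeg source tree:
    echo "Parsed_streamselect_0 map 0" | tools/zmqsend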
[23:05:38 CEST] <NubNubNub> thank you very much. i'll try it right away
[23:05:43 CEST] <kepstin> i was unsure about whether it discarded the other inputs, or just let them buffer.
[23:08:24 CEST] <NubNubNub> for a better understanding: i am building a live streaming source which consolidates two or more inputs, where one is always large and the others are scaled down. in my UI i want to be able to pass information to the ffmpeg process doing the work and switch sources, so that a different one becomes large while the rest stay small
[23:09:13 CEST] <NubNubNub> all inputs are put together in a final overlay which is then streamed
[23:27:32 CEST] <kepstin> right, so use it as a video switcher, that's its intended purpose
[23:28:18 CEST] <kepstin> note that the ffmpeg cli tool itself is not really intended for live stuff, it's primarily a batch processing tool. You can probably make it work but it won't be ideal.
[23:30:09 CEST] <BtbN> You want OBS or something for that kinda stuff
[23:30:47 CEST] <NubNubNub> from what i've seen OBS is a desktop app
[23:32:17 CEST] <kepstin> hmm, i know stuff like GDQ's streaming uses it as an api-controlled live video switcher and compositor
[23:40:17 CEST] <kepstin> but yeah, they're probably still running it as a gui app and then using remote control plugins
[00:00:00 CEST] --- Thu Aug  1 2019

