[Ffmpeg-devel-irc] ffmpeg.log.20190606
burek
burek021 at gmail.com
Fri Jun 7 03:05:02 EEST 2019
[00:33:28 CEST] <eschie> Hi all, I'm trying to troubleshoot an issue and am unsure where to start: When seeking past the video buffer of my HLS stream on iOS, the audio cuts out but the video continues. I am unable to sync them back up again unless I start the video over from the beginning. Is this a problem with framerates and segments? How can I compare why some of my videos do this when others do not?
[00:40:42 CEST] <CarlFK> another: mind reviewing my params: https://github.com/CarlFK/voctomix-outcasts/blob/8a70888036f29b34682f010775070398eb7ab432/record-timestamp.sh
[02:42:41 CEST] <CarlFK> Trailing options were found on the commandline.
[02:43:07 CEST] <CarlFK> ah, missed a \
[02:57:54 CEST] <Yasuragikara> Hi everyone. I'm trying to build ffmpeg with standard configuration on a win7 host, using the current mingw-get base sets of mingw-dev, mingw32 (+pkg-config) (+libpthreadgc/e) and mingw32-g++ with msys (+yasm) as the make shell.
[02:58:01 CEST] <Yasuragikara> Configuring and making ffmpeg in msys works out fine and I got the static .exe to use, but when I try to pass this cmd line (ffmpeg.exe -i a.aac a.wav) to ffmpeg, I receive "[aac @ 04350bd8] Warning: not compiled with thread support, using thread emulation" as output and the software crashes. Could someone please tell me how I can fix this? Thanks in advance for all feedback.
[03:02:17 CEST] <Yasuragikara> I'm using the source of ffmpeg version 4.1.3.
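For reference, that warning usually means configure detected neither pthreads nor w32threads, so libavcodec falls back to thread emulation. A minimal sketch of what to check, assuming a MinGW/MSYS shell; the flag is a real configure option, but whether it resolves this particular toolchain setup is untested:

    ./configure --enable-w32threads

The summary configure prints at the end should then show a threading backend under "threading support"; if it still reports none, the MinGW toolchain is missing a usable threads library (e.g. winpthreads) and that needs installing before rebuilding.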
[03:31:18 CEST] <anony11> Hi
[03:31:56 CEST] <anony11> Is it possible to output a buffer? Like ffmpeg -i continousrawdata filethatkeepsbeing400kb.webm
[03:50:54 CEST] <void09> hi. trying to test the SVT-AV1 encoder, and their sample usage line with ffmpeg stream led me to the following: ffmpeg -i /scratchpad/Aferim.2015.1080p.BluRay.REMUX.AVC.DTS.HDMA.5.1-BiTHD.mkv -nostdin -f rawvideo -pix_fmt yuv420p - | ./SvtAv1EncApp -i stdin -n 200 -w 1920 -h 1080 -b out.bin
[03:56:48 CEST] <void09> uh nvm i think it works
[07:39:47 CEST] <pk08> Hi,
[07:39:47 CEST] <pk08> i want to overlay 16 video streams into one video stream and output that video to some udp multicast.
[07:39:47 CEST] <pk08> and i am doing this in two steps: 1) scale the original input to a fixed-resolution video stream and 2) take the video stream from the first step, overlay it on a nullsrc filter and do udp multicast
[07:39:49 CEST] <pk08> so in the first step i am using this command to scale the video stream https://pastebin.com/Tt20J25N
[07:39:51 CEST] <pk08> and i am taking the above command's output as the input for the 2nd step, overlaying for the final output, and the command is https://pastebin.com/36bqQJQb
[07:39:54 CEST] <pk08> now in the 2nd step i am getting slow encoding speed, between 0.300x and 0.700x, and that's why i am getting bad video output, and i am even getting some errors like this: https://pastebin.com/v4sTQnXG
[07:40:45 CEST] <pk08> can anyone please tell me if my command is wrong or my input is bad
[07:40:56 CEST] <pk08> or what i am doing wrong here
[07:41:01 CEST] <pk08> thanks in advance
[07:41:09 CEST] <pk08> thank
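For reference, a minimal sketch of the nullsrc + overlay graph pk08 describes, cut down to two inputs; the input names, resolutions and multicast address are placeholders, not taken from the pastebins:

    ffmpeg -i in0.ts -i in1.ts \
      -filter_complex "nullsrc=size=960x270:rate=25[base];[0:v]scale=480:270[a];[1:v]scale=480:270[b];[base][a]overlay=0:0[tmp];[tmp][b]overlay=480:0:shortest=1[out]" \
      -map "[out]" -c:v libx264 -preset veryfast -f mpegts udp://239.0.0.1:1234

shortest=1 on the last overlay is only there so a file-based test terminates, since nullsrc itself never ends.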
[08:18:44 CEST] <CarlFK> pk08: I would start by reading and writing files - then you don't have to worry about performance
[08:19:53 CEST] <CarlFK> also instead of 16, do 2.
[08:20:04 CEST] <CarlFK> get that working, then you can do more
[08:20:18 CEST] <pk08> CarlFK: i need real time input and output, so it needs to be udp multicast
[08:20:35 CEST] <pk08> and yes, i tried to add one by one
[08:20:48 CEST] <CarlFK> pk08: you don't need real time while you are testing
[08:20:49 CEST] <pk08> and i was able to get good output up to 10 inputs
[08:21:23 CEST] <pk08> yes, correct
[08:21:33 CEST] <pk08> it doesn't need to be realtime while testing
[08:21:50 CEST] <pk08> but even with files, the performance will be the same
[08:22:14 CEST] <pk08> and when i do real time, it won't work as expected
[08:23:05 CEST] <pk08> do you think ffmpeg's filter_complex has a bottleneck on input bitrate or something?
[08:23:31 CEST] <CarlFK> I have no idea - I was here getting some help with deinterlacing flags
[08:23:59 CEST] <CarlFK> it sounds like you have the commands right, but you are running out of some resource
[08:26:49 CEST] <pk08> i think it's an ffmpeg-side issue or i need some improvement in the command, because i have 32 cores and 64 gb of RAM so i don't think i have a resource issue
[08:27:26 CEST] <pk08> so i'm not sure what i need to do...
[08:29:07 CEST] <CarlFK> could be memory bandwidth
[08:29:21 CEST] <CarlFK> again... try files... see if it breaks
[08:30:47 CEST] <CarlFK> 16 network streams is a bit of bandwidth, and maybe cores decompressing them
[08:38:40 CEST] <pk08> maybe you are right about network bandwidth, but i need to be sure about it
[08:38:56 CEST] <pk08> so let me test everything as file
[08:39:08 CEST] <pk08> like all input and output in file format
[08:39:19 CEST] <pk08> and check performance
[09:38:14 CEST] <th3_v0ice> When is the value of ts_offset set in FFmpeg.c code?
[09:57:39 CEST] <th3_v0ice> Found it.
[11:20:31 CEST] <keglevich> hello all...
[11:22:04 CEST] <keglevich> I have a question regarding ffmpeg concat demuxer... simple scenario: I try to join (concat) two files into single one...both are perfect constant 25fps... after concat the output is variable with minimum framerate (1fps - 20fps) and max framerate still 25fps... same issue is posted here https://trac.ffmpeg.org/ticket/7939
[11:22:28 CEST] <keglevich> why does this happen and what can I do to avoid it and get true constant 25fps end result without the need to re-encode again?
[11:23:01 CEST] <keglevich> ffmpeg -f concat -safe 0 -i clip-list.txt -c copy clip-concat.mp4
[11:23:09 CEST] <keglevich> that's the command I'm using
[11:38:15 CEST] <th3_v0ice> keglevich, Maybe try to produce those two input files with -stitchable, if you are using x264 for encoding that is. You can also encode input files with -stitchable and output to .h264. Concat those, that would work.
[11:38:41 CEST] Last message repeated 1 time(s).
[11:38:41 CEST] <th3_v0ice> Sorry for double posting.
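For reference, stitchable is an x264 parameter rather than a first-class ffmpeg option, so one way to set it when (re)encoding a source clip with libx264 would be roughly this (filenames are placeholders):

    ffmpeg -i clip1-src.mp4 -c:v libx264 -x264-params stitchable=1 -c:a aac clip1.mp4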
[11:42:32 CEST] <keglevich> th3_v0ice: hmm, what if the files are already done and I wouldn't like to re-encode them again?
[11:43:07 CEST] <keglevich> all the files I have are already mp4, libx264, aac, 25fps ... same resolution, same ar... I'd just like to join them somehow and get 25fps result
[11:44:31 CEST] <keglevich> maybe it's worth mentioning, I tried the option with creating two .ts files as described here https://trac.ffmpeg.org/wiki/Concatenate (Using intermediate files) and the result was correct in the end...but I'm not sure if this method would work for other formats as well (for instance libx264 with mp2 audio?)
[12:10:08 CEST] <keglevich> and again... also tried with different .ts files and the final output sometimes is correct and sometimes is variable... so this option also doesn't work as expected
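For reference, the "intermediate files" method from the wiki page mentioned above boils down to this for H.264-in-MP4 sources (filenames are placeholders):

    ffmpeg -i clip1.mp4 -c copy -bsf:v h264_mp4toannexb -f mpegts clip1.ts
    ffmpeg -i clip2.mp4 -c copy -bsf:v h264_mp4toannexb -f mpegts clip2.ts
    ffmpeg -i "concat:clip1.ts|clip2.ts" -c copy -bsf:a aac_adtstoasc clip-concat.mp4

The aac_adtstoasc bitstream filter is only needed for AAC audio and can be dropped for other audio codecs.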
[12:33:36 CEST] <anony11> Hi, how to: "ffmpeg -i rawdatastream streamwebmof2seconds.webm". Regards
[12:34:45 CEST] <pk08> CarlFK: i have tried taking all inputs as files and writing the output to a file too, and it still gives me 0.7x encoding speed
[12:35:14 CEST] <pk08> and to make a 5 min output video, it takes 6:30 mins
[12:35:39 CEST] <pk08> so it's something related to an ffmpeg-side issue
[12:36:24 CEST] <anony11> Oh, to be clear: the input is ok, only the output(to use as little size as possible)
[12:37:31 CEST] <anony11> Like: "ffmpeg -i rawdatastream -fs 200k streamwebmof2seconds.webm". Regards
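If the goal is fixed-length output chunks rather than one file held at a fixed size, the segment muxer is the usual tool; a minimal sketch, assuming ffmpeg can actually decode the raw input (the input name is anony11's placeholder):

    ffmpeg -i rawdatastream -c:v libvpx-vp9 -b:v 400k -f segment -segment_time 2 -segment_format webm out%03d.webm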
[12:39:20 CEST] <pk08> can anyone please tell me, how can i increase encoding speed in this command https://pastebin.com/36bqQJQb , I am getting around 0.5 to 0.7x speed while encoding
[12:39:40 CEST] <anony11> platform?
[12:43:15 CEST] <a_c_m> Hi
[12:51:13 CEST] <a_c_m> I have a perhaps interesting use case, which i can't quite get to work https://pastebin.com/b03e9URc - TL:DR; i want to start and stop a stream, but with the output not knowing i did. Anyone got any ideas on how i might do this? I have a basic idea working, but it's not right.
[12:53:03 CEST] <anony11> lmao
[12:53:49 CEST] <a_c_m> @anony11 - :) to who's question ?
[12:54:24 CEST] <anony11> Haha yours
[12:54:58 CEST] <a_c_m> :) Because it's brain-dead simple to fix? (i hope :) ) or some other reason?
[12:56:33 CEST] <anony11> Here
[12:56:34 CEST] <anony11> https://imgur.com/L7rtCvb
[12:57:54 CEST] <a_c_m> lol - someone else asked the same thing? woah. and there was me thinking it was novel :) - So you're saying pipe the output to another instance of ffmpeg and all will be well?
[12:58:32 CEST] <anony11> Thought of that, but > and after that? You have to continue to pipe
[12:59:10 CEST] <a_c_m> will give it a try
[12:59:33 CEST] <anony11> People suggested making a little program with a while loop, but then, since it's streaming, the file "changes", and it's not a buffered output
[12:59:49 CEST] <a_c_m> i'm actually doing it all from node - so the streaming output is easy to control
[13:00:46 CEST] <a_c_m> it's just the timecodes that are the problem - as they confuse the players into showing nothing / the last frame for that "paused" section
[13:01:14 CEST] <anony11> I'm above, doing in C haha
[13:05:56 CEST] <Yasuragikara> @pk08 Check this out: https://pastebin.com/Bj6TjqSL
[13:06:17 CEST] <Yasuragikara> Maybe this can help you.
[13:54:51 CEST] <anony11> Regards
[14:13:26 CEST] <pk08> @Yasuragikara thank you, let me test it, i will get back with results
[14:14:17 CEST] <furq> that's not going to work
[14:14:19 CEST] <furq> -vf isn't per-input
[14:14:50 CEST] <furq> pk08: keep the filterchain you already had but replace all the overlays with one xstack
[14:14:58 CEST] <furq> https://ffmpeg.org/ffmpeg-filters.html#xstack
[14:15:37 CEST] <pk08> furq: yes, i know
[14:15:50 CEST] <pk08> thank you :)
[14:16:46 CEST] <pk08> but this is a temporary fix, even if it works
[14:17:24 CEST] <pk08> because in the future i will need videos of different resolutions as input
[14:17:43 CEST] <pk08> and the stack filter only works with same-resolution inputs
[14:18:03 CEST] <pk08> does anyone know any alternative or fix for that?
[14:18:18 CEST] <furq> change fifo,setpts to fifo,setpts,scale
[14:19:07 CEST] <furq> also you can put newlines in filterchains (or load the filterchain from a text file) so it's probably worth doing that for readability
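Putting furq's two suggestions together, a sketch of a 2x2 grid with per-input scaling and the graph kept in a file; input names, sizes and the layout are illustrative, and a 16-input version just extends the layout string:

    grid.txt:
      [0:v]scale=480:270[a];
      [1:v]scale=480:270[b];
      [2:v]scale=480:270[c];
      [3:v]scale=480:270[d];
      [a][b][c][d]xstack=inputs=4:layout=0_0|w0_0|0_h0|w0_h0[out]

    ffmpeg -i in0.ts -i in1.ts -i in2.ts -i in3.ts -filter_complex_script grid.txt -map "[out]" -c:v libx264 -preset veryfast -f mpegts udp://239.0.0.1:1234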
[14:19:43 CEST] <keglevich> I have a question regarding ffmpeg concat demuxer... simple scenario: I try to join (concat) two files into single one...both are perfect constant 25fps... after concat the output is variable with minimum framerate (1fps - 20fps) and max framerate still 25fps... same issue is posted here https://trac.ffmpeg.org/ticket/7939
[14:20:10 CEST] <keglevich> why does this happen and what can I do to avoid it and get true constant 25fps end result without the need to re-encode again?
[14:20:20 CEST] <keglevich> ffmpeg -f concat -safe 0 -i clip-list.txt -c copy clip-concat.mp4
[14:20:30 CEST] <keglevich> this is the command I was using
[14:22:33 CEST] <pk08> furq: i have started testing with 2 inputs at a time, command: https://pastebin.com/q4jBUN51 and its working fine
[14:23:03 CEST] <pk08> i will add more and more, and bother you guys if i will have any issue ;)
[14:36:49 CEST] <keglevich> pk08: I checked your pastebin... what do you actually plan to achieve? multiscreen UDP-ts maybe?
[14:38:42 CEST] <pk08> keglevich: i want to scale input video streams to 480x270 and make single multicast of 16 video streams
[14:39:37 CEST] <keglevich> that's something I was trying to achieve a while ago... and I gave up in the end
[14:39:51 CEST] <keglevich> one multicast stream drops... everything drops
[14:40:18 CEST] <JEEB> yea you need to do your own fallback handling in your own ffmpeg api code
[14:40:33 CEST] <JEEB> ffmpeg.c won't help you as it's a generic tool
[14:40:51 CEST] <JEEB> you can of course add it into ffmpeg.c if you want
[14:41:07 CEST] <pk08> yes, but i guess shortest filter will help in that case keglevich
[14:41:31 CEST] <JEEB> well that will kill the whole thing :p
[14:41:41 CEST] <keglevich> I was also looking for an already-made alternative, and I was out of luck as well...although I've seen some pretty nice demo examples on youtube, but never got any real solution to this
[14:41:41 CEST] <JEEB> not dynamically fall back to backup
[14:41:49 CEST] <keglevich> so I'm still interested if there's "working" one
[14:42:10 CEST] <JEEB> upipe has dynamic source/backup source switching but also requires your own code
[14:42:58 CEST] <pk08> keglevich: i will let you know if i make it working...
[14:43:27 CEST] <furq> pk08: there's not really any way around that problem
[14:43:36 CEST] <furq> i figured all your sources are from the same ip so it should probably work
[14:43:37 CEST] <pk08> JEEB: i was trying to make this by coding but i gave up because i wasn't getting output at all...
[14:44:00 CEST] <furq> barring issues on the source side
[14:44:35 CEST] <keglevich> it's called "mosaic", now I remember... there were some talks about VLC being able to do this...also ffmpeg... but as said, I haven't found anything in the end...
[14:44:41 CEST] <keglevich> that's how it should look: http://mib.pianetalinux.org/blog/images/stories/t/tvmosaic020-1.jpg
[14:44:41 CEST] <pk08> furq: it's just an example; if this works i will have different ips and different url formats like hls, rtmp or even ts files
[14:44:53 CEST] <furq> yeah that probably won't work well then
[14:45:05 CEST] <furq> there's no way of handling reconnection etc
[14:46:11 CEST] <keglevich> ^ this
[14:46:55 CEST] <keglevich> but as said...there are already made solutions for this kind of mosaic screens...I just haven't found one for free
[14:47:27 CEST] <pk08> keglevich: image looks cool, i need something like this
[14:47:43 CEST] <pk08> but with some more features
[14:47:48 CEST] <keglevich> so I'm just using multiple VLC instances ... each instance one UDP stream
[14:48:07 CEST] <keglevich> yeah I know exactly what you need...but I really think you're wasting time with ffmpeg to do this
[14:49:08 CEST] <keglevich> more or less all (paid) playout solutions have something like this...but again, it's not free
[14:49:24 CEST] <furq> you can do most of this with ffmpeg, just not the network input handling
[14:49:33 CEST] <furq> and you could probably do that well enough with the ffmpeg libs
[14:50:12 CEST] <keglevich> maybe try to do something with this: https://wiki.videolan.org/VLC_HowTo/Make_a_mosaic/
[14:51:01 CEST] <keglevich> I pretty much believe he needs a solution "out-of-the-box" as soon as possible and can't spend time or doesn't have knowledge to code his own with ffmpeg...or maybe I'm wrong?
[14:51:29 CEST] <furq> i suspect most people don't want to write C in general
[14:52:07 CEST] <keglevich> with me lack of programming knowledge was always an issue
[14:52:32 CEST] <pk08> well, i spent at least a month on C coding for this issue, but somehow i couldn't get output
[14:52:38 CEST] <pk08> so i moved to ffmpeg command line
[14:53:41 CEST] <pk08> thank you guys for your feedback, i need to rethink about this...
[14:54:21 CEST] <keglevich> try with VLC as I said...it works pretty well...but again, you'll need to restart (stop/play) some stuck UDP streams sometimes as we do
[14:55:09 CEST] <keglevich> and you'll also have EPG, teletext, interlacing, etc available out-of-the-box
[14:55:17 CEST] <keglevich> to control each UDP stream separately
[14:55:30 CEST] <pk08> yeah, i got you
[14:56:00 CEST] <pk08> because of packet loss or audio/video issue or other issue
[14:56:07 CEST] <pk08> we might need restart
[14:56:23 CEST] <keglevich> yeah, sometimes it happens, without a proper reason
[14:57:02 CEST] <pk08> i can understand, i have been through this
[14:57:59 CEST] <keglevich> I also wasted a lot of time trying to achieve the same as you try now, without any real success in the end...so we just do everything manually now, using multiple VLC instances...it works ok more or less
[14:58:04 CEST] <keglevich> and it's free
[14:58:36 CEST] <keglevich> if you find a solution, please drop me an email to suljotron at gmail.com
[14:58:47 CEST] <keglevich> I'd be glad to reply if I can help with something
[15:02:39 CEST] <pk08> sure, keglevich
[15:03:03 CEST] <pk08> i will mail you, if i find any working solution
[15:03:19 CEST] <pk08> and my email is parbatvgec at gmail.com
[15:05:53 CEST] <EvanCarroll> minor documentation bug: https://unix.stackexchange.com/q/523188/3285
[15:10:19 CEST] <furq> the latest openssl release is still under the old license
[15:10:39 CEST] <furq> so i'd say the docs are still correct for now
[15:41:28 CEST] <Anill> if i provide different segments of a file to ffmpeg, will it write (after transcoding) to the same file, or overwrite the segments, or create different files?
[16:18:06 CEST] <th3_v0ice> keglevich, I am sorry but -stitchable is the only thing that comes to mind. On a side note I think that the video is perfectly fine but it's only reported as variable framerate because the last packets from concatenating may not have proper packet durations set. Mediainfo was giving me variable framerate whenever my packet durations were off even by a little.
[16:18:56 CEST] <th3_v0ice> You can use -debug_ts to actually test if the packets are spaced equally.
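For reference, two ways to eyeball whether the packets in the concatenated file are evenly spaced, without re-encoding (the ffprobe variant is an alternative to the -debug_ts route mentioned above):

    ffmpeg -debug_ts -i clip-concat.mp4 -c copy -f null -
    ffprobe -select_streams v:0 -show_entries packet=pts_time,duration_time -of csv clip-concat.mp4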
[22:34:19 CEST] <electrotoscope> Does anyone have an example of -vf "drawtext=text=%{metadata:_____}" actually working? I'm absolutely stumped. I'm pretty sure I've escaped the : properly, because I can get it to switch to the "use if metadata not found" second argument, but I can't get it to display anything at all from the first argument
[22:35:31 CEST] <electrotoscope> https://stackoverflow.com/questions/24744608/ffmpeg-and-timecode-from-movie-metadata says that it has to be something that shows up in an ffprobe frame under "TAG:____" but I can't find an example file that has that
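One self-contained way to see drawtext's metadata expansion actually produce output is to chain it after a filter known to inject per-frame metadata, e.g. signalstats; a sketch, with the font path as a placeholder:

    ffmpeg -i in.mp4 -vf "signalstats,drawtext=fontfile=/path/to/font.ttf:x=10:y=10:text='avg luma %{metadata\:lavfi.signalstats.YAVG\:NA}'" out.mp4

If that works but the real file's tag still falls through to the default, the tag in question is likely stream/container metadata rather than the per-frame metadata that drawtext's metadata function reads.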
[23:48:05 CEST] <another> hmm.. trac down?
[23:52:22 CEST] <steve___> another: something has been funky with the webserver. Just keep trying and the page should load
[00:00:00 CEST] --- Fri Jun 7 2019