[Ffmpeg-devel-irc] ffmpeg.log.20170814
burek
burek021 at gmail.com
Tue Aug 15 03:05:01 EEST 2017
[00:20:43 CEST] <sh4rm4^bnc> what's the trick to get ffprobe to report length for all of a dvd's VOB files ?
[00:21:21 CEST] <sh4rm4^bnc> i tried using concat:file1.VOB|file2.VOB, and looping over all vobs and adding up the size
[00:21:47 CEST] <sh4rm4^bnc> the concat thing just returns length for the last VOB in the list
[00:22:27 CEST] <sh4rm4^bnc> and looping over single files gets erroneous results, for example VTS_01_4.VOB is reported to have 90000 seconds
[00:54:09 CEST] <dystopia_> ffprobe -i "concat:VTS_01_1.VOB|VTS_01_2.VOB|VTS_01_3.VOB|VTS_01_4.VOB|VTS_01_5.VOB|VTS_01_6.VOB"
[00:54:10 CEST] <dystopia_> maybe
[01:29:08 CEST] <sh4rm4^bnc> dystopia_, thanks, but as i said, this returns only the length of the last file
[01:46:51 CEST] <c_14> sh4rm4^bnc: there's no good solution besides iterating over each file and getting the duration there (if necessary by actually counting frames/pts)
[01:47:20 CEST] <c_14> Well, I guess you could use concat and ffmpeg to create a file and grab the reported output duration of that
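The per-file iteration c_14 describes could look roughly like this Python sketch (assuming ffprobe is on PATH; the glob pattern and the 24-hour sanity cutoff for bogus durations like the 90000-second one above are illustrative choices, not anything ffprobe provides):

```python
import subprocess

def probe_duration(path):
    """Ask ffprobe for the container-level duration of one file, in seconds."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_entries", "format=duration",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True).stdout
    return float(out.strip())

def total_duration(durations):
    """Sum per-file durations, skipping obviously bogus outliers (> 24 h)."""
    return sum(d for d in durations if d < 24 * 3600)

# usage, on a system with ffprobe installed (hypothetical filenames):
#   import glob
#   print(total_duration(probe_duration(f)
#                        for f in sorted(glob.glob("VTS_01_*.VOB"))))
```

The outlier filter is a crude workaround for the mis-reported VOB durations; the robust fix is counting frames or pts as discussed below.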
[01:56:32 CEST] <sh4rm4^bnc> c_14, is there an advantage to using ffmpeg for that instead of cat *.VOB > foo.vob ?
[01:57:21 CEST] <sh4rm4^bnc> (that's what i wanted to avoid to begin with; i need the length for automatic framerate calculation to encode the dvd as mp4)
[01:59:23 CEST] <c_14> If you're using the concat protocol, not really. Using tccat might be a better alternative
[01:59:50 CEST] <sh4rm4^bnc> btw i would have expected that dvd metadata files have the length info, but my attempt to use dvd://path didn't work out...
[02:00:20 CEST] <c_14> they should
[02:01:02 CEST] <sh4rm4^bnc> so there's probably a trick to make ffprobe read the info from the metadata, instead of scanning the vob's
[02:01:23 CEST] <c_14> Probably a deficiency in the code if anything
[02:01:56 CEST] <sh4rm4^bnc> well, i find it quite deficient that the concat:foo|bar input doesn't work as expected
[02:02:42 CEST] <c_14> the duration reported by ffprobe is only metadata
[02:02:48 CEST] <c_14> and for that it just uses the first file
[02:03:00 CEST] <c_14> you could use -count_frames and some extra options to get all the frame pts
[02:03:06 CEST] <c_14> but those'll probably wrap
[02:03:13 CEST] <c_14> So you'd have to catch that and add it all up yourself
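The wrap c_14 mentions comes from MPEG pts being 33-bit counters at 90 kHz; "catching it and adding it up yourself" could be sketched like this (a hedged sketch: it assumes pts arrive roughly increasing, so B-frame reordering would need to be handled before this step):

```python
PTS_WRAP = 1 << 33   # MPEG pts are 33-bit counters
TIMEBASE = 90_000    # MPEG-PS/TS pts tick at 90 kHz

def unwrap(pts_values):
    """Turn a wrapping 33-bit pts sequence into a monotonic one."""
    offset, prev, out = 0, None, []
    for pts in pts_values:
        if prev is not None and pts < prev:   # counter wrapped around
            offset += PTS_WRAP
        prev = pts
        out.append(pts + offset)
    return out

def duration_seconds(pts_values):
    """Duration spanned by a (possibly wrapping) pts sequence."""
    mono = unwrap(pts_values)
    return (mono[-1] - mono[0]) / TIMEBASE
```

Feeding it the per-frame pts dumped by ffprobe would give a duration that survives the wrap.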
[02:04:48 CEST] <sh4rm4^bnc> haha, -count_frames makes ffmpeg segfault
[02:05:00 CEST] <sh4rm4^bnc> or ffprobe, rather
[02:05:08 CEST] <c_14> version?
[02:05:22 CEST] <sh4rm4^bnc> ffprobe version 3.2.3 Copyright (c) 2007-2017 the FFmpeg developers
[02:06:02 CEST] <c_14> That's relatively new I guess
[02:06:04 CEST] <c_14> weird
[02:06:08 CEST] <c_14> linux?
[02:06:12 CEST] <sh4rm4^bnc> yes
[02:06:16 CEST] <c_14> Try
[02:06:18 CEST] <c_14> http://johnvansickle.com/ffmpeg/
[02:06:20 CEST] <c_14> A build from there
[02:06:54 CEST] <sh4rm4^bnc> nope sorry - i use only sw built myself from source
[02:07:13 CEST] <c_14> can you build from git then?
[02:07:13 CEST] <sh4rm4^bnc> also i use musl libc, so that one is almost certainly not compatible anyway
[02:07:16 CEST] <c_14> git master head?
[02:07:25 CEST] <sh4rm4^bnc> i guess i could try that
[02:07:28 CEST] <c_14> I just recommended that because most people are lazy
[02:36:08 CEST] <sh4rm4^bnc> no segfault with latest git snapshot... though it's either dead-slow, or stuck in an endless loop
[02:36:35 CEST] <sh4rm4^bnc> i started the command like 10 mins ago, still not finished
[02:36:56 CEST] <c_14> count_frames can take a while
[02:36:57 CEST] <c_14> though not _that_ long usually
[02:37:34 CEST] <sh4rm4^bnc> PID USER PRI NI VIRT RES SHR S CPU% MEM% TIME+ Command
[02:37:34 CEST] <sh4rm4^bnc> 25885 user 20 0 95612 22264 12012 R 99.8 0.1 11:52.75 ffprobe -v error -show_entries format=dura
[02:39:28 CEST] <sh4rm4^bnc> it's on the first vob file, which without count_frames reports a length of 29 secs
[02:40:00 CEST] <c_14> the thing with count_frames is that it won't necessarily make your result better if you don't work with it, if your content is cfr you can grab the total number of frames with format=nb_read_frames, otherwise you'll have to dump info on every frame with show_frames or similar to get the pts of each of them
[02:48:08 CEST] <sh4rm4^bnc> still running, i think we can officially declare it to be an endless loop
[02:48:22 CEST] <sh4rm4^bnc> let's see if i can get a backtrace
[02:59:51 CEST] <sh4rm4^bnc> for some reason gdb can't associate the program counter with the source code addresses, so bad luck i guess
[03:00:32 CEST] <sh4rm4^bnc> only thing i can tell is that memset is called in the loop with small buffers of maybe 32 bytes max
[03:01:16 CEST] <sh4rm4^bnc> anyway, let me know if you need further info, c_14 (assuming you're a dev interested in fixing this)
[03:02:28 CEST] <c_14> I'm not much of a dev and I'm busy these days anyway, but you could open a bug report on trac, though without a reproducer getting a fix would be hard
[03:02:41 CEST] <c_14> To fix your actual problem, I would recommend using tccat or something though
[05:32:01 CEST] <c3r1c3-Win> danieel: i need a log, not a screenshot.
[05:32:28 CEST] <c3r1c3-Win> Sorry... I somehow ended up in this chat.
[06:22:49 CEST] <sh4rm4^bnc> hmm, when i encode stuff with libx264, the resulting video can only be watched if it is completely downloaded, unlike e.g. youtube videos where you can watch while you download. i get this error:
[06:22:52 CEST] <sh4rm4^bnc> [ffmpeg/demuxer] mov,mp4,m4a,3gp,3g2,mj2: moov atom not found
[06:22:52 CEST] <sh4rm4^bnc> [lavf] avformat_open_input() failed
[06:23:17 CEST] <sh4rm4^bnc> is there a trick to produce a "streamable" video ?
[06:23:47 CEST] <relaxed> you need -movflags faststart
[06:24:02 CEST] <sh4rm4^bnc> cool, thanks
[06:24:15 CEST] <furq> you shouldn't need that any more unless your browser is really old
[06:24:22 CEST] <furq> although it's still a good idea in general
[06:24:37 CEST] <thebombzen> sh4rm4^bnc: keep in mind that if you encode with -movflags +faststart, you still need to write the whole file. it's not streamable in the sense that you can view partial files
[06:24:48 CEST] <furq> right
[06:24:52 CEST] <furq> you'd need to use a different muxer for that
[06:24:53 CEST] <sh4rm4^bnc> i encode stuff on a fast server, then download it, trying to watch it with mpv while it plays
[06:25:01 CEST] <relaxed> the + isn't needed
[06:25:02 CEST] <furq> while it encodes, you mean?
[06:25:09 CEST] <furq> in that case use mkv
[06:25:12 CEST] <sh4rm4^bnc> no, after it is encoded
[06:25:17 CEST] <furq> oh
[06:25:18 CEST] <thebombzen> relaxed: it should be used though
[06:25:33 CEST] <furq> tbh you should probably still use mkv unless you have a good reason to prefer mp4
[06:25:40 CEST] <furq> but if you want to use mp4 then do what relaxed said
[06:26:00 CEST] <relaxed> it should be used for multiple flags
[06:26:08 CEST] <thebombzen> sh4rm4^bnc: mp4 files have a table of contents (called the moov atom) that by default is at the end of the file, and it's written when the file is finalized
[06:26:42 CEST] <thebombzen> you can't read the mp4 file without the moov atom. but -movflags +faststart moves it to the front of the file when writing to a seekable device, which means that browsers can start it immediately
[06:26:43 CEST] <sh4rm4^bnc> i see
[06:26:49 CEST] <furq> browsers will usually seek to the end of the file to download the moov atom now
[06:26:58 CEST] <furq> but if you're using scp or something then obviously that's not going to work
[06:27:11 CEST] <thebombzen> this also requires the http server to support seeking, which it might not if it's old
[06:27:36 CEST] <sh4rm4^bnc> i download with wget :)
[06:27:50 CEST] <thebombzen> well there you go
[06:28:16 CEST] <thebombzen> if you use a primitive browser like wget or curl to download it sequentially, you have the issue described above.
[06:28:56 CEST] <sh4rm4^bnc> can libx264 be used with mkv ? and if so, what's the advantage ?
[06:29:15 CEST] <furq> yes and the advantage is you don't need to rewrite the entire file when it's done encoding to put the moov atom at the start
[06:29:30 CEST] <furq> more or less everything is compatible with mkv
[06:29:46 CEST] <furq> (except timebases that aren't a power of 10, but ignore that for now)
[06:29:50 CEST] <sh4rm4^bnc> oh great. i just use foo.mkv as output filename instead of foo.mp4 ?
[06:29:51 CEST] <thebombzen> most browsers will not autoplay matroska files
[06:30:00 CEST] <furq> well he's not using a browser so that's fine
[06:30:02 CEST] <furq> sh4rm4^bnc: yeah
[06:30:12 CEST] <c3r1c3-Win> Just remux if you need the mp4 file.
[06:30:14 CEST] <thebombzen> don't use -movflags +faststart though, that's a mp4-specific option
[06:30:34 CEST] <sh4rm4^bnc> ok, thanks
[06:30:35 CEST] <thebombzen> (er, mov, mp4, m4a, 3gp, 3g2, mj2, but whatever)
[06:30:53 CEST] <furq> also if you already encoded these files then you don't need to reencode them
[06:31:06 CEST] <furq> you can just copy the streams into mkv or whatever
[06:32:00 CEST] <sh4rm4^bnc> ah, how can i do that ?
[06:32:23 CEST] <furq> -i foo.mp4 -map 0 -c copy bar.mkv
[06:33:51 CEST] <sh4rm4^bnc> nice
[08:15:56 CEST] <thunfisch> is it possible to print an AVFormatContext property with the ffmpeg command line? need to get the 'rtcp_ts_offset' value that is being set in libavformat/rtsp.c:2188 @master
[08:16:02 CEST] <thunfisch> want to avoid writing custom code
[08:17:53 CEST] <thunfisch> or even better: set dts to that value
[13:35:48 CEST] <Vonor> getting on latest git checkout: x86_64-pc-linux-gnu-gcc is unable to create an executable file. using gcc 4.8.5 -> http://paste.opensuse.org/65205285
[13:38:39 CEST] <durandal_1707> what is error?
[13:39:12 CEST] <Vonor> that is the error
[13:39:19 CEST] <BtbN> there's no error in that log
[13:39:33 CEST] <bencoh> Vonor: you should paste the full configure log
[13:39:46 CEST] <Vonor> the log shows only gcc -v
[13:40:36 CEST] <Vonor> erm, the output of the configure script has that as its first line, then suggests it might be cross compiling (which is not the case) and says "C compiler test failed". that's it
[13:41:34 CEST] <Vonor> ah. there's a build log. heh. didn't see that before. my bad. here's the actual line: x86_64-pc-linux-gnu-gcc -march=k8 -mtune=core2 -O2 -pipe --static -fPIE -march=core2 -c -o /tmp/ffconf.cFXFnAgx/test.o /tmp/ffconf.cFXFnAgx/test.c
[13:41:34 CEST] <Vonor> ./configure: line 892: x86_64-pc-linux-gnu-gcc: command not found
[13:42:47 CEST] <furq> pastebin the full configure log
[13:47:23 CEST] <Vonor> http://arin.ga/4AdqDq
[13:47:57 CEST] <furq> --cc=x86_64-pc-linux-gnu-gcc
[13:48:12 CEST] <ScottSteiner> I have a video that I sped up 8x. I want to put a timestamp on it, but the drawtext filter doesn't support a framerate of 400/1. How do I generate a transparent video with the timestamp so I can speed it up and then merge it with the original video?
[13:48:24 CEST] <furq> you probably don't want to set --cc etc if you're not cross compiling
[13:48:30 CEST] <furq> especially not to something that doesn't exist
[13:50:45 CEST] <Vonor> ehh.. d'oh. that configure line was from a gentoo environment i had before. thanks for the hint.
[13:52:50 CEST] <furq> ScottSteiner: what do you mean "doesn't support a framerate of 400/1"
[13:52:54 CEST] <furq> that seems to work fine here
[13:53:17 CEST] <ScottSteiner> I get error message "[Parsed_drawtext_0 @ 0x1e18d00] Timecode frame rate 400/1 not supported"
[13:53:40 CEST] <ScottSteiner> Could it be a codec/container issue? I'm putting it into a libvpx webm
[13:54:02 CEST] <Vonor> hmm. missing some libraries i guess. it's complaining now that ld cannot find -lpng and -lc. i got libpng12, libpng16 and their respective -devel packages installed.
[13:54:03 CEST] <furq> that shouldn't matter
[13:55:45 CEST] <furq> ScottSteiner: pastebin the command and full output
[13:56:14 CEST] <ScottSteiner> https://paste.debian.net/plain/981285
[14:02:03 CEST] <furq> oh right i actually get a similar error
[14:02:19 CEST] <furq> timecode_rate only works here if i set it to 30 or 60
[14:02:43 CEST] <ScottSteiner> yeah, i was just looking at the code https://www.ffmpeg.org/doxygen/2.1/libavutil_2timecode_8c_source.html supported_fps[] = {24, 25, 30, 50, 60};
[14:03:39 CEST] <ScottSteiner> so I need to create a blank image with just the timer on an alpha channel...How would I do that?
[14:04:04 CEST] <furq> you can use text=${pts\:hms} for a millisecond timestamp
[14:04:12 CEST] <furq> that'll at least give you a unique value per frame
[14:04:21 CEST] <furq> i don't see how you'd get a proper timecode at 400fps though
[14:04:34 CEST] <furq> er, %{}
[14:04:42 CEST] <ScottSteiner> I can already generate the timestamp
[14:04:55 CEST] <ScottSteiner> My plan is to create a video of the full length and then speed that one up too
[14:05:13 CEST] <ScottSteiner> it should be a bit faster since I'll be doing it at a different bitrate
[14:05:53 CEST] <furq> what are you actually doing
[14:06:00 CEST] <furq> i just noticed your output video is 60fps and now i'm confused
[14:06:01 CEST] <ScottSteiner> putting a timer on a video
[14:06:49 CEST] <ScottSteiner> I have a video that I sped up 8x. I want to put a timestamp on it, but preserve the original timing
[14:07:14 CEST] <ScottSteiner> I don't want to re-encode the entire thing because it's a rather large file that took a few hours to encode initially
[14:14:44 CEST] <furq> ffmpeg -f lavfi -i color=white at 0.0:d=1234,format=rgba,drawtext="timecode='00\:00\:00\.000':timecode_rate=60" -c:v ffv1 out.nut
[14:14:52 CEST] <furq> something like that will draw the timecode on a transparent background
[14:15:00 CEST] <furq> replace d=1234 with the actual duration
[14:26:59 CEST] <ScottSteiner> thanks furq, I'll try that out
[14:33:17 CEST] <furq> actually that might be a dumb way of doing it
[14:33:29 CEST] <Vonor> using this as configure line -->> ./configure --pkg-config-flags="--static" '--extra-cflags=--static' '--extra-cxxflags=--static' --disable-shared --enable-static --disable-gpl --disable-nonfree <<-- I would expect to get a static binary. But I'm getting a dynamically linked one :-( -->> ffmpeg: ELF 64-bit LSB executable, x86-64, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 3.0.0, BuildID[sha1]=3f1b52e39bab32a7d454a06113e
[14:33:29 CEST] <Vonor> 2c29b5015aa2f, stripped
[14:33:39 CEST] <furq> if you're encoding it again anyway then the right thing is probably to do -vf drawtext=[options],fps=400 on the source video
[14:33:47 CEST] <furq> that'll give you a 60fps timecode on a 400fps video
[14:33:57 CEST] <furq> or whatever the source framerate is
[14:53:34 CEST] <ScottSteiner> I did " ffmpeg -i input.webm -vf drawtext="fontcolor=black:fontsize=48:timecode='00\:00\:00\.000':timecode_rate=60:text=''",fps=400 output.webm" but the timestamp was shown in realtime, but sped up
[14:53:52 CEST] <ScottSteiner> with input.webm being the sped up video
[15:09:10 CEST] <ScottSteiner> maybe ill just bite the bullet and re-encode the whole thing. trying to find workarounds seems to be more painful than just letting it run in the background
[15:11:03 CEST] <dystopia_> wrong framerates maybe?
[15:11:33 CEST] <ScottSteiner> not that I can see
[15:32:38 CEST] <ScottSteiner> it ended up not being too bad; the frames being dropped (since I chained my atempo/setpts filters) really sped things up
[18:19:57 CEST] <pgorley> hi, how would i put AVFrame->data into an std::vector<uint8_t>?
[18:20:08 CEST] <c_14> memcpy?
[18:21:53 CEST] <pgorley> c_14: i was trying the iterator ctor approach, but it seems to segfault, is there a way to get a pointer to the last item in AVFrame->output?
[18:25:34 CEST] <c_14> You mean AVFrame->data ?
[18:26:52 CEST] <c_14> use the width * height?
[18:39:44 CEST] <BtbN> height * linesize
[18:45:01 CEST] <c_14> eeh, right
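BtbN's correction matters because each row of an AVFrame plane is padded out to linesize bytes; a tight copy has to go row by row. A minimal sketch of that stride-aware copy, with a plain bytes object standing in for the real plane pointer:

```python
def copy_plane(plane, width_bytes, height, linesize):
    """Copy a padded image plane into a tightly packed buffer.

    `plane` is the raw plane laid out as `height` rows of `linesize`
    bytes each; only the first `width_bytes` of every row are real
    pixels, the rest is alignment padding.
    """
    out = bytearray()
    for row in range(height):
        start = row * linesize
        out += plane[start:start + width_bytes]
    return bytes(out)
```

Using height * linesize as the total size (rather than width * height) keeps the padding; this copy drops it.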
[18:51:56 CEST] <pgorley> BtbN, c_14: thank you
[18:52:13 CEST] <pgorley> but yea, AVFrame->data
[19:43:53 CEST] <pgorley> i think i'm going about this wrong
[19:45:06 CEST] <pgorley> i'm working with an application that takes an array of audio samples for each channel (each channel is an int16_t*) and i want to put my audio file in there
[19:46:07 CEST] <durandal_1707> AVFrame stores channels in extended_data[x]
[19:46:27 CEST] <durandal_1707> where x is 0 for packed sample format
[19:46:54 CEST] <durandal_1707> or x is channel number in case of planar sample format
[19:47:27 CEST] <pgorley> so i would just append extended_data to my array each time i decode a frame?
[19:48:28 CEST] <durandal_1707> something like that
[19:48:34 CEST] <pgorley> hmm
[19:49:14 CEST] <durandal_1707> but make sure it's still the same endianness; the sample fmt is in native endianness
[19:49:29 CEST] <pgorley> and how do i know if my format is packed or planar?
[19:49:35 CEST] <pgorley> oh, nvm found it
[19:50:12 CEST] <pgorley> av_sample_fmt_is_planar and then use av_samples_get_buffer_size to get the size of extended_data?
[19:51:35 CEST] <pgorley> durandal_1707: i'll try it out, thanks!
[19:51:44 CEST] <durandal_1707> size of extended data is nb of decoded samples * sizeof( sample ) in bytes
[19:53:50 CEST] <pgorley> durandal_1707: how do i get the size of a sample? since i'm putting it in an array of int16_t, would it be 2?
[19:54:19 CEST] <durandal_1707> if decoded sample format is int16
[19:54:45 CEST] <durandal_1707> if not then you need to resample using libswresample
[19:56:06 CEST] <pgorley> i see, so i would need to check if the sample format is int16 and if not call swr_convert on it?
[19:56:29 CEST] <durandal_1707> yes
[19:57:02 CEST] <pgorley> this is starting to make sense, thanks a lot!
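For an API that wants one int16_t* per channel, the packed/planar distinction durandal_1707 describes boils down to a de-interleave step. A sketch with plain Python lists standing in for AVFrame->extended_data (assuming the samples are already int16, per the discussion above):

```python
def to_per_channel(extended_data, channels, nb_samples, planar):
    """Return one sample list per channel.

    Planar input: extended_data[ch] already holds one whole channel.
    Packed input: extended_data[0] holds interleaved samples
    L R L R ..., which get split out per channel here.
    """
    if planar:
        return [list(extended_data[ch][:nb_samples]) for ch in range(channels)]
    packed = extended_data[0]
    return [list(packed[ch::channels])[:nb_samples] for ch in range(channels)]
```

Appending each returned list to the application's per-channel arrays after every decoded frame matches the "append extended_data each time" plan above.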
[19:57:26 CEST] <CyBerNetX> hello
[19:57:42 CEST] <durandal_1707> hello
[19:58:05 CEST] <CyBerNetX> i searched for info on this page https://trac.ffmpeg.org/wiki/RemapFilter
[19:58:24 CEST] <CyBerNetX> i have built the project binary
[19:58:55 CEST] <CyBerNetX> but i don't understand this
[19:59:17 CEST] <durandal_1707> what exactly?
[19:59:25 CEST] <CyBerNetX> i tested on a file which has 2 spherical views
[19:59:54 CEST] <CyBerNetX> 1 front and
[20:00:04 CEST] <CyBerNetX> 1 back side
[20:00:34 CEST] <voip_> Hello guys, please help me with problem: Selected ratecontrol mode is not supported by the QSV runtime.
[20:00:42 CEST] <voip_> https://pastebin.com/fQ2xGN6M
[20:00:51 CEST] <BtbN> Use a rate control mode that it supports
[20:00:54 CEST] <CyBerNetX> i think the binary is used to do the remap on 1 spherical view ?
[20:01:20 CEST] <JEEB> voip_: you have not set a rate control mode and I guess the global default is 200kbps bit rate
[20:01:29 CEST] <JEEB> that is not supported by that encoder
[20:01:45 CEST] <CyBerNetX> but my camera has 2 fisheyes, one on the front and one on the back side
[20:01:47 CEST] <JEEB> also I'm pretty sure the QSV encoder has no knowledge of x264 presets
[20:02:58 CEST] <durandal_1707> CyBerNetX: map files work only for layouts mentioned in wiki
[20:03:27 CEST] <CyBerNetX> ok thanks
[20:08:09 CEST] <voip_> JEEB, how can i fix this and set rate control ?
[20:14:26 CEST] <voip_> JEEB, i tried with h264_qsv -b:v 2500k, same problem
[20:36:30 CEST] <voip_> guys, how can i set up rate control ? i also tried with h264_qsv -b:v 2500k, same problem
[20:37:52 CEST] <pgorley> in multichannel audio, are all AVFrame->extended_data arrays the same size?
[20:38:11 CEST] <BtbN> voip_, try also specifying a maxrate
[20:39:57 CEST] <voip_> tried, same problem :(
[20:46:13 CEST] <jkqxz> voip_: Making libmfx work on Linux is painful. Make sure you are using the modified kernel and all of the modified libraries (and that your distribution isn't overriding them with normal versions).
[20:47:45 CEST] <BtbN> Or just use libva
[20:47:49 CEST] <jkqxz> If transcoding locally in CBR mode (like "-b:v 1M -maxrate 1M") doesn't work then there is probably something wrong with your setup. The whole thing is hopelessly undebuggable, "strace" is about the best tool for working out what it's actually doing.
[20:49:23 CEST] <voip_> guys i think it's a setup problem, because i followed the instructions and the test command (ffmpeg -y -i test.mp4 -vcodec h264_qsv -acodec copy -b:v 8000K out.mp4) gives me the same error
[22:15:37 CEST] <paveldimow> Hi, I have two incoming (live) rtmp streams, I would like pick the video from one stream and audio from another stream and to sync them. Is this possible with ffmpeg?
[22:16:08 CEST] <BtbN> syncing two unrelated streams is most likely close to impossible
[22:17:06 CEST] <JEEB> it's not impossible but you will have to make concessions/have a buffer if they are offset by enough time
[22:18:44 CEST] <BtbN> one slight network-stutter and they will be permanently out of sync again until you restart
[22:19:17 CEST] <paveldimow> what do you mean by unrelated?
[22:19:38 CEST] <JEEB> BtbN: wouldn't the PTS matching still be the same
[22:19:55 CEST] <BtbN> I don't think ffmpeg cares about PTS for separate input sources
[22:20:00 CEST] <BtbN> it treats them as unrelated
[22:20:03 CEST] <JEEB> yes
[22:20:08 CEST] <JEEB> which is why you have to write the code yourself
[22:20:22 CEST] <JEEB> you take in two inputs, you know the approximate difference and start matching PTS
[22:20:31 CEST] <JEEB> (using a buffer for the one that's ahead)
[22:20:40 CEST] <BtbN> That does assume that the PTS are identical though
[22:20:44 CEST] <BtbN> which they most likely won't be
[22:21:08 CEST] <JEEB> not identical, but that the offset will not change - you would have to make an interface that takes in change requests for it if it is variable
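The buffer-and-match idea JEEB sketches (known constant offset, buffer the stream that is ahead) could look roughly like this in Python, with (pts, payload) tuples standing in for demuxed packets; all names and the exact-match policy are hypothetical simplifications:

```python
from collections import deque

def sync_streams(video_pkts, audio_pkts, audio_offset):
    """Pair video and audio packets whose offset-adjusted pts match.

    `audio_offset` is the known constant pts offset of the audio input
    relative to the video input. Audio that falls behind the current
    video pts is dropped; audio ahead of it stays buffered in the deque.
    """
    audio = deque((pts - audio_offset, data) for pts, data in audio_pkts)
    pairs = []
    for vpts, vdata in video_pkts:
        # discard audio that is already behind this video pts
        while audio and audio[0][0] < vpts:
            audio.popleft()
        if audio and audio[0][0] == vpts:
            apts, adata = audio.popleft()
            pairs.append((vpts, vdata, adata))
    return pairs
```

A real implementation would match within a tolerance window rather than exactly, and would re-estimate the offset when an input restarts, per the discussion that follows.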
[22:21:26 CEST] <paveldimow> hmmm
[22:21:48 CEST] <paveldimow> so, to sum up, it's possible but very hard
[22:22:11 CEST] <JEEB> possibly not very hard, but "not something achievable with the ffmpeg.c command line app as-is"
[22:22:16 CEST] <paveldimow> ok, does it matter if the input streams are created with the same application?
[22:22:27 CEST] <JEEB> why would it?
[22:23:04 CEST] <paveldimow> I don't know, beacuse of "[22:20] <BtbN> That does assume that the PTS are identical though [22:20] <BtbN> which they most likely won't be"
[22:24:22 CEST] <paveldimow> well ok, what do you think, how much time/money would it take to make this solution with ffmpeg?
[22:24:43 CEST] <paveldimow> we are talking about days/months/years?
[22:25:11 CEST] <paveldimow> just asking, I don't know if you are developers
[22:26:12 CEST] <JEEB> depends on your experience with the libav* APIs
[22:26:33 CEST] <JEEB> and in general your grasp on the idea of having two inputs and buffering one until you get to the same point in your other stream
[22:27:24 CEST] <paveldimow> how about your experience? :)
[22:27:53 CEST] <JEEB> the dynamic buffer would be the thing for me to watch out for, otherwise it sounds relatively straightforward
[22:28:17 CEST] <BtbN> until someone restarts one of the input streams, and everything's messed up
[22:28:32 CEST] <JEEB> yes, and that's when you have to then re-adjust the dynamic buffer again
[22:28:39 CEST] <JEEB> and the desync rate
[22:28:55 CEST] <JEEB> I mean, far from being perfect but that's what you get when you want to sync two unrelated things together
[22:28:58 CEST] <BtbN> I'd have no idea how to do that in any reliable way
[22:29:08 CEST] <paveldimow> the buffer can be static as well, or no? i mean, give it enough size
[22:29:33 CEST] <paveldimow> well, does it help if we consider one "master" stream to be super reliable?
[22:29:41 CEST] <BtbN> someone would have to constantly babysit it, and manually adjust the delay
[22:30:07 CEST] <paveldimow> well, manual adjust is not an option... at least for now :)
[22:30:21 CEST] <BtbN> The only reliable way to do things like that is realtime timestamps and synced clocks. But I don't think rtmp even supports that
[22:30:36 CEST] <JEEB> yea, unless you code it into special SEI messages or something
[22:30:40 CEST] <JEEB> and add parsing of that to lavf
[22:31:06 CEST] <JEEB> or use the UTC timestamp as FLV timestamp? not sure how big the values can be
[22:31:15 CEST] <JEEB> only remember how it was a time base of 1/1000
[22:31:18 CEST] <paveldimow> 32bit
[22:31:35 CEST] <paveldimow> which is quite enough
[22:31:36 CEST] <BtbN> rtmp timestamps wrap around after roughly 45 hours
[22:31:43 CEST] <BtbN> realtime timestamps wouldn't work
[22:31:54 CEST] <paveldimow> no no, no need to more then 2-3 hours
[22:32:10 CEST] <BtbN> Well, the unix time started 1970 though
[22:32:38 CEST] <JEEB> anyways, if you can sync the source clocks and use timestamps as your PTS then, uh
[22:32:44 CEST] <JEEB> suddenly the problem becomes a bit simpler
[22:33:10 CEST] <BtbN> If you manage to ensure the PTS are identical on all inputs, you can hack something, yes. But ffmpeg.c still can't do it.
[22:33:36 CEST] <paveldimow> what do you mean byt PTS are identical?
[22:33:48 CEST] <JEEB> PTS = presentation time stamp
[22:34:27 CEST] <paveldimow> yes yes, but why? if i understand correctly, rtmp does not support absolute timestamps, only relative
[22:34:47 CEST] <JEEB> RTMP = FLV, FLV AFAIK supports PTS just fine
[22:34:56 CEST] <JEEB> go read the FLV specs if you care enough
[22:34:57 CEST] <BtbN> There are no relative timestamps
[22:35:06 CEST] <BtbN> They are always absolute to a random timebase
[22:35:52 CEST] <paveldimow> Timestamps in RTMP are given as an integer number of milliseconds relative to an unspecified epoch. Typically, each stream will start with a timestamp of 0, but this is not required, as long as the two endpoints agree on the epoch. Note that this means that any synchronization across multiple streams (especially from separate hosts) requires some additional mechanism outside of RTMP.
[22:36:23 CEST] <paveldimow> hmmmm
[22:36:39 CEST] <BtbN> I think RTP/RTSP uses absolute global timestamps by default
[22:36:47 CEST] <BtbN> meaning those streams sync themselves without any further effort
[22:37:08 CEST] <BtbN> only depending on an accurate clock on all machines involved
[22:37:28 CEST] <paveldimow> would ntp be an option in that case?
[22:37:33 CEST] <pgorley> do i have to set the output frame's nb_samples before calling swr_convert_frame? it's always 0 afterwards
[22:37:35 CEST] <BtbN> it's required.
[22:50:35 CEST] <Cracki> ntp would be a good enough option. consider gps-derived time for more precision. be aware that gps time isn't UTC time, and google time servers aren't UTC either.
[23:12:01 CEST] <paveldimow> Thank you for your answers. I'll have to discuss my idea in my head few times :)
[23:14:01 CEST] <paveldimow> How they did something like this? https://www.youtube.com/watch?v=xQkapLhuGV0
[23:15:36 CEST] <Cracki> timestamps I'd guess
[23:15:50 CEST] <Cracki> audio and video can be bad enough to not be usable for synchronization
[23:16:01 CEST] <paveldimow> :D
[23:16:25 CEST] <paveldimow> you mean in this particular video from snapchat?
[23:16:29 CEST] <Cracki> yes
[23:16:43 CEST] <paveldimow> yes, that makes sense
[23:17:15 CEST] <Cracki> perhaps even gps time, but I don't know if gps receivers typically used in smartphones give that info
[23:18:22 CEST] <paveldimow> not sure
[23:18:30 CEST] <paveldimow> but, what's better ntp or gps?
[23:18:36 CEST] <Cracki> ntp/sntp is easy enough for them to do, and something on the order of <30 ms is imperceptible to humans
[23:19:16 CEST] <Cracki> as long as you have a link, you can always do sntp/ntp. gps only works outdoors or with very good receivers.
[23:19:31 CEST] <Cracki> gps would give you a very phase-stable clock signal
[23:20:36 CEST] <paveldimow> thank you Cracki
[23:53:31 CEST] <SpLiC3> If using ffmpeg as a remuxer with avconv, what commands would i need to introduce to my .json to maintain the buffer of the original .ts stream, or is that a rather obtuse question?
[23:54:33 CEST] <SpLiC3> source is already encoded h.264 in a .ts with aac audio, i just select copy for those though, is that correct?
[00:00:00 CEST] --- Tue Aug 15 2017