[Ffmpeg-devel-irc] ffmpeg.log.20180304

burek burek021 at gmail.com
Mon Mar 5 03:05:01 EET 2018


[10:26:47 CET] <Celmor> I'm trying to concatenate 2 mp4 files together without re-encoding, getting 'Unsafe file name' and 'Operation not permitted' errors
[10:27:23 CET] <Celmor> using the 'concat demuxer' example from https://stackoverflow.com/questions/7333232/concatenate-two-mp4-files-using-ffmpeg
[10:28:07 CET] <Celmor> log: https://ptpb.pw/AhIl
[10:34:13 CET] <Celmor> or '1.mp4: Invalid data found when processing input'
[10:34:39 CET] <Celmor> when running `ffmpeg -f concat -i 1.mp4 -i 2.mp4 -c copy output.mp4`
[10:47:35 CET] <BtbN> that's not how the concat format works.
[10:47:45 CET] <BtbN> also, mp4 is not concat friendly
[10:47:54 CET] <BtbN> remux them both to .ts, concat those, and then remux the result to mp4
[10:48:00 CET] <BtbN> if you really need mp4, that is
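A minimal sketch of that .ts round trip, assuming H.264 video and AAC audio (the bitstream filters below only apply to those codecs; filenames follow Celmor's example):
    ffmpeg -i 1.mp4 -c copy -bsf:v h264_mp4toannexb 1.ts
    ffmpeg -i 2.mp4 -c copy -bsf:v h264_mp4toannexb 2.ts
    ffmpeg -i "concat:1.ts|2.ts" -c copy -bsf:a aac_adtstoasc output.mp4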
[10:54:07 CET] <shtomik> Hi guys, I know that libavfilter isn't thread-safe, but I need to split the filtering and encoding process for each output stream; otherwise the video output processing takes a very long time and it spoils the sound. What are your recommendations?
[11:01:14 CET] <kerio> BtbN: is there a format to parse a raw h264 stream?
[11:01:33 CET] <BtbN> shtomik, i don't see what would stop you from doing that.
[11:01:36 CET] <BtbN> kerio, what?
[11:02:37 CET] <kerio> a stream of NAL units
[11:02:51 CET] <BtbN> what's with that? raw h264 is a bad idea in any case.
[11:03:33 CET] <kerio> but mpegts is patented :<
[11:04:05 CET] <BtbN> h264 is patented.
[11:04:12 CET] <kerio> o no :<
[11:04:20 CET] <kerio> ye i guess my question was pretty silly
[11:04:28 CET] <kerio> is nut concatenable?
[11:04:57 CET] <BtbN> no idea, but I know mpegts works fine, and it's only an intermediate container anyway
[11:05:07 CET] <shtomik> @BtbN How do I properly push and pull frames in threads?
[11:06:49 CET] <furq> kerio: -f h264
[11:06:54 CET] <furq> not that you should use it
[11:07:02 CET] <kerio> you can't tell me what to do >:C
[11:07:08 CET] <furq> i didn't
[11:07:11 CET] <furq> i told you what you should do
[11:07:16 CET] <kerio> you can't tell me what to not do >:C
[11:07:17 CET] <BtbN> I'd just make one thread for filtering and one for encoding. The frames coming out of the filters should be yours and not used by ffmpeg anymore, so you can pass them on to another thread.
[11:07:38 CET] <kerio> i mean what's so bad about it, other than the loss of timing?
[11:07:48 CET] <kerio> nothing deals with variable framerate correctly anyway
[11:07:58 CET] <BtbN> It does not even have a framerate though
[11:08:07 CET] <BtbN> it's just a row of pictures
[11:09:04 CET] <kerio> speaking of which
[11:09:19 CET] <kerio> i would pay dozens of cents for a rawvideo-like format that also allows me to specify the timestamp of each frame
[11:09:35 CET] <furq> what's wrong with nut or mkv
[11:09:48 CET] <kerio> how do you even write those
[11:09:53 CET] <kerio> i was going to use ffmpeg to write those
[11:09:57 CET] <furq> with lavf
[11:10:16 CET] <kerio> libav has no decent python bindings tho
[11:10:17 CET] <BtbN> if you want to concat, you really should use .ts though. It's perfect for that purpose. No idea if nut works equally well
[11:10:18 CET] <furq> what about y4m
[11:10:50 CET] <furq> actually y4m doesn't do vfr
[11:10:50 CET] <shtomik> @BtbN thanks!!! so much
[11:11:12 CET] <Celmor> got concat working with mp4's, apparently I have to have the filenames in a text file for some reason, this works `ffmpeg -f concat -safe 0 -i mux.txt -c copy output.mp4`
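For reference, the mux.txt that the concat demuxer reads is just a list of file directives, one per line; -safe 0 is only needed when the paths are absolute or contain characters the demuxer treats as unsafe:
    file '1.mp4'
    file '2.mp4'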
[11:12:07 CET] <kerio> how does that even work
[11:12:14 CET] <kerio> mp4 has all sorts of headers and footers
[11:12:23 CET] <furq> because -f concat is a demuxer
[11:12:33 CET] <kerio> oh :o
[11:12:56 CET] <BtbN> From my experience using the concat demuxer with mp4 has a high chance of running into issues
[11:13:04 CET] <BtbN> which are fixed by remuxing to .ts first
[11:13:17 CET] <furq> not all of them are
[11:13:49 CET] <kerio> surely the concat demuxer should be at least as good as concatenating the output of the remuxed .ts
[11:13:54 CET] <furq> the most common thing i run into is audio gaps and that doesn't get fixed by remuxing to ts
[11:14:09 CET] <furq> at least it didn't last time i tried
[11:15:58 CET] <BtbN> I still have an ongoing issue with splitting an incoming rtmp stream into 4-second .ts chunks, and then concatenating them with the concat demuxer. I get a ton of errors about duplicate timestamps that way. The concatenated output still plays fine though.
[11:16:14 CET] <BtbN> So I wonder if that's just harmless noise, or an actual issue somewhere.
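A rough sketch of that kind of split-and-rejoin workflow, assuming the segment muxer does the chunking (the rtmp URL, chunk name pattern, and chunks.txt list are placeholders, not BtbN's actual pipeline):
    ffmpeg -i rtmp://host/app/stream -c copy -f segment -segment_time 4 -segment_format mpegts chunk_%05d.ts
    ffmpeg -f concat -safe 0 -i chunks.txt -c copy joined.ts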
[11:30:26 CET] <kerio> check with ffprobe
[11:31:17 CET] <BtbN> every individual file seems fine, the resulting file seems fine
[11:31:31 CET] <BtbN> and the input stream also has no issues
[11:32:00 CET] <kerio> no i mean, check the timestamps
[13:52:55 CET] <GamleGaz> if I set frame->data manually (instead of with sws_scale) do I have to set frame->data[0] = y_buf frame->data[1] = u_buf frame->data[2] = v_buf?
[13:53:20 CET] <GamleGaz> or will a frame->data[0] = yuv_buf do?
[18:35:04 CET] <Li> according to a quick peek into the man page, ffmpeg is an encoding tool ... I'm wondering if it's also capable of directly recording from a webcam?
[18:38:00 CET] <Li> seems like everyone is high on sunday
[18:41:19 CET] <JEEB> FFmpeg is a set of various libraries as well as a set of command line tools utilizing those libraries
[18:41:23 CET] <JEEB> such as ffmpeg.c
[18:41:45 CET] <JEEB> you can access various webcam-like interfaces through v4l2 etc
[18:42:13 CET] <JEEB> see what your webcam provides as an interface and see if the libraries contain support for it
[18:42:29 CET] <JEEB> also your definition of "recording" may sway some responses
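A minimal capture sketch on Linux, assuming the webcam is exposed through v4l2 as /dev/video0 (the size and rate depend on what the camera actually offers):
    ffmpeg -f v4l2 -framerate 30 -video_size 640x480 -i /dev/video0 -c:v libx264 webcam.mkv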
[18:46:23 CET] <techkid6> Heya, I'm looking to see if it's possible to have ffmpeg pull from a file of videos and stream them over RTMP, where the file is a dynamically changing list, or if there's some terminal app that could do something similar
[19:00:04 CET] <BtbN> you mean take HLS as input?
[19:01:58 CET] <techkid6> BtbN: Nah, like, a list of MP4s, piping into nginx-rtmp
[19:03:02 CET] <BtbN> HLS is pretty much exactly that
[19:04:17 CET] <techkid6> BtbN: Yeah, I'm grossly aware of HLS, but HLS requires me to transcode ahead of time to .ts files, then segment them, and there are potentially times when I want to not stream anything without just freezing the playlist.m3u8 file
[19:04:18 CET] <DHE> or mpeg-dash, but still. that is literally what HLS is for
[19:04:32 CET] <BtbN> HLS works with mp4
[19:04:35 CET] <DHE> that's true of older versions of the HLS spec. new ones do allow for fragmented MP4
[19:04:38 CET] <techkid6> I know, and nginx-rtmp does transcode to HLS, I'm talking just about the ingest part
[19:04:42 CET] <techkid6> Oh interesting
[19:05:08 CET] <techkid6> Right now I use a windowed HLS playlist made up of many smaller HLS playlists concatenated together
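One possible approach, not spelled out in the log: play a concat demuxer list in real time and push it to the RTMP ingest (playlist.txt and the URL are placeholders). Note the list is read once at startup, so a dynamically changing list still needs something external, e.g. regenerating the file and restarting ffmpeg between runs:
    ffmpeg -re -f concat -safe 0 -i playlist.txt -c:v libx264 -c:a aac -f flv rtmp://localhost/live/streamkey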
[19:06:19 CET] <TheWild> hello
[19:06:32 CET] <TheWild> is there a good way to handle "moov atom not found"?
[19:06:42 CET] <TheWild> Google wasn't very helpful
[19:06:53 CET] <JEEB> that usually means that your mp4 file is not fully available
[19:07:08 CET] <JEEB> unfortunately that thing contains information required for decoding a whole bunch of formats
[19:07:26 CET] <JEEB> and unless you use either movie fragments or 2pass muxing that thing is going to be at the end of the file
[19:07:49 CET] <kerio> a fragmented mp4 requires the playlist to be able to seek, right
[19:08:01 CET] <JEEB> no
[19:08:14 CET] <JEEB> lavf can seek in fragmented isobmff quite alright as far as I've seen
[19:08:24 CET] <JEEB> it's just usually been used with DASH or similar formats on the internet
[19:08:38 CET] <TheWild> that was a copy of a file that is currently being downloaded by youtube-dl, but if I ever get such a broken video (e.g. from a power interruption), I would like to know how to fix that.
[19:08:56 CET] <kerio> you can't, afaik
[19:08:57 CET] <JEEB> if you are creating the file yourself then use movie fragments or another container
[19:09:06 CET] <kerio> yeah, use mkv
[19:09:07 CET] <JEEB> if it's something you're downloading then just download again
[19:09:21 CET] <kerio> *continue downloading
[19:09:22 CET] <TheWild> grr... it's a live stream
[19:09:26 CET] <JEEB> because it's the index and if the index is not there then you're basically fucked
[19:09:48 CET] <TheWild> can't the index be rebuilt by scanning the file?
[19:09:48 CET] <JEEB> the original live stream is highly unlikely to be normal mp4 so I guess you want to be remuxing to something else than mp4 then :P
[19:09:49 CET] <kerio> then use some other container
[19:09:55 CET] <JEEB> no, it cannot be
[19:10:04 CET] <JEEB> as it contains the initialization data for example for H.264
[19:10:25 CET] <JEEB> you can only do it with things like cameras etc. which always use specific values in the header
[19:10:27 CET] <techkid6> Pretty much, I have a list of files in the form of (filename, start_point, end_point, start_time), and I'm just trying to figure out how to play those out to a stream without having to manually deal with hls playlisting
[19:11:21 CET] <TheWild> in case of a power outage youtube-dl might not be able to fix it later :(
[19:12:23 CET] <JEEB> well if it's a live stream as I noted the thing originally most likely isn't a non-fragmented mp4 file
[19:12:38 CET] <JEEB> so switch what youtube-dl does when getting the stream?
[19:18:22 CET] <TheWild> sorry, can't stop it now and restart with different parameters.
[19:18:35 CET] <TheWild> Kinda disappointed. I thought those things were smarter.
[19:19:46 CET] <JEEB> it probably is doing some sort of realtime remuxing of the original stream, I guess? which kind of makes sense for compatibility. unless you suddenly suffer an issue with the muxing process
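To illustrate JEEB's two suggestions for when you control the muxing yourself (input/output names are placeholders):
    # write a fragmented mp4, which stays decodable even if writing is cut short
    ffmpeg -i input.mp4 -c copy -movflags +frag_keyframe+empty_moov fragmented.mp4
    # or pick a container that doesn't rely on an index at the end of the file
    ffmpeg -i input.mp4 -c copy output.mkv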
[19:43:39 CET] <dada78641> Hi everyone. I'm trying to use ffmpeg to make a gif out of a series of static images. It works, but I'm wondering if it's possible to set specific frame durations? https://pastebin.com/qbiWgQPX Couldn't figure out a way with setpts
[19:45:04 CET] <BtbN> you set the framerate
[19:47:22 CET] <dada78641> I've got a list of data, e.g. frame 0 needs 120ms, frame 1 needs 130ms, etc., without uniformity
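One way to get non-uniform frame durations, not mentioned in the replies here, is the concat demuxer's duration directive (frame names are placeholders; repeating the last file is a common workaround so its duration is honored):
    # frames.txt
    file 'frame0.png'
    duration 0.12
    file 'frame1.png'
    duration 0.13
    file 'frame1.png'
    # then:
    ffmpeg -f concat -safe 0 -i frames.txt output.gif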
[20:04:00 CET] <InTheWings> why do RGB chromas depend on host endianness?
[20:05:25 CET] <atomnuker> because they're single plane with interleaved pixels, { R, G, B, A } <- 32 bits
[20:14:25 CET] <InTheWings> If I memcpy a 4-byte sample in RGBA order, I don't see why the #ifdef changes to BGR depending on endianness; that's still RGBA contiguous and I don't think it reads each pixel as an integer
[20:17:22 CET] <atomnuker> why do you care? I don't think there's anything big-endian which matters nowadays
[00:00:00 CET] --- Mon Mar  5 2018

