[Ffmpeg-devel-irc] ffmpeg.log.20170430

burek burek021 at gmail.com
Mon May 1 03:05:01 EEST 2017

[00:44:30 CEST] <mosb3rgler> hey guys, i'm trying to see if it's possible to connect to an HLS master playlist, and grab the entire set of PIDs in 1 go, and reproduce them to another system to maintain the HLS ABR style playlist + mapping to some degree?
[00:45:13 CEST] <mosb3rgler> i realize i can choose the feeds i want to include and so on.. but is it possible to do multiple videos and such the way i'm saying, keeping B-frames intact so it will correctly switch between bitrates?
[03:02:35 CEST] <LoopHoldYoaBrown> Hello, is scc muxer working with Quicktime mov?
[06:03:01 CEST] <zap0> when i do ffmpeg -i file.mp4   is there another option to get similar to the  ffprobe -print_format flat ?
[06:03:09 CEST] <zap0> no, i don't want to just use ffprobe.
[06:06:13 CEST] <furq> -print_format doesn't do anything on its own
[06:14:00 CEST] <zap0> i know, that's not my question.
[06:14:47 CEST] <furq> well that code is all specific to ffprobe.c so i suspect the answer is no
[06:26:55 CEST] <zap0> ok.  next question:   with ffprobe flat..   you see things like:  streams.stream.0.width=480     and  streams.stream.0.duration="10.000000"      see how the 2nd one is in quotes?  is there any reasoning about when quotes are used?
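The quoting rule in ffprobe's flat writer is that values ffprobe emits as strings (duration, codec names, tags) are double-quoted and escaped, while integer-typed fields such as width come out bare. A toy parser sketch, using sample lines modeled on the ones quoted above:

```python
# Toy parser for ffprobe -print_format flat lines. The quoting rule:
# string-typed values are double-quoted, integer-typed values are bare.
def parse_flat(line):
    key, _, value = line.partition("=")
    if value.startswith('"') and value.endswith('"'):
        return key, value[1:-1]   # string-typed field: strip the quotes
    return key, int(value)        # integer-typed field

k1, v1 = parse_flat('streams.stream.0.width=480')
k2, v2 = parse_flat('streams.stream.0.duration="10.000000"')
```

So the quotes are a type marker, not decoration: duration is formatted as a string even though it looks numeric.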
[11:34:01 CEST] <schelleursli> hi there, quick question, is it possible to change the cache value for http input streams? I seem to run out of buffer on my local network streams and can't find anything regarding cache on the ffmpeg manual
[11:34:40 CEST] <schelleursli> funny thing is if I do the same ffmpeg commandline but pipe it via vlc first and then input it things work just fine
[14:56:44 CEST] <xeche> at which point when opening a file for decoding is possible to know the pixel format of a decoded image?
[14:56:59 CEST] <xeche> do you actually have to decode a frame before you can get that information?
[14:57:56 CEST] <atomnuker> yes
[14:58:05 CEST] <xeche> *sigh
[14:58:08 CEST] <xeche> okay thanks man
[14:58:14 CEST] <atomnuker> since pixel format and image size can change in between frames
[14:58:23 CEST] <xeche> i was afraid that was the case
[14:59:07 CEST] <atomnuker> if you control everything you can guarantee it, but if its some unknown mpegts stream anything can change
[14:59:56 CEST] <xeche> dealing with multiple files, i'm guessing i shouldn't make that assumption, as I didn't generate the files
[15:00:59 CEST] <xeche> so, what's a good way to set up an sws_context then. should you really call sws_getCachedContext every time you work with a decoded frame? depending on how fast that cache look up is, that seems like it would be slow. but maybe it's negligible
[15:02:12 CEST] <atomnuker> I'd keep a state and reinit if pix_fmt/width/height change
[15:02:41 CEST] <atomnuker> but sws_getCachedContext seems to already do that for you
[15:03:55 CEST] <atomnuker> yeah, just call sws_getCachedContext on every frame, it'll be fine
[15:03:56 CEST] <xeche> atomnuker: it does, except for filter settings
[15:04:08 CEST] <xeche> i guess so. thanks again
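The keep-state-and-reinit pattern atomnuker describes (which is essentially what sws_getCachedContext does for you) can be sketched outside of C. Here `Converter` is a hypothetical stand-in for an SwsContext, not real libav API:

```python
# Keep-state-and-reinit pattern: reuse the converter while the frame's
# (width, height, pix_fmt) triple is unchanged, rebuild it otherwise.
# "Converter" is a placeholder for an SwsContext; not real libav API.
class Converter:
    def __init__(self, width, height, pix_fmt):
        self.key = (width, height, pix_fmt)

class CachedConverter:
    def __init__(self):
        self._ctx = None

    def get(self, width, height, pix_fmt):
        key = (width, height, pix_fmt)
        if self._ctx is None or self._ctx.key != key:
            self._ctx = Converter(*key)   # reinit only on a format change
        return self._ctx

cache = CachedConverter()
a = cache.get(1920, 1080, "yuv420p")
b = cache.get(1920, 1080, "yuv420p")   # same format: context reused
c = cache.get(1280, 720, "yuv420p")    # format changed: fresh context
```

The per-frame cost of the cached path is just a tuple comparison, which is why calling it on every frame is fine.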
[15:23:32 CEST] <xeche> how on earth can av_frame_get_buffer return a single buffer of size 6266911
[15:23:54 CEST] <xeche> for 1920x1080 AV_PIX_FMT_RGB24 buffer?
[15:24:01 CEST] <xeche> the single part is fine
[15:24:12 CEST] <xeche> but an odd number of bytes? what the fuck?
[15:31:35 CEST] <xeche> i guess the buffer sizes are work space, and the data[] array is what i actually want.
[16:13:52 CEST] <faLUCE> xeche: you have to use data[] array
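The arithmetic behind the surprise above is easy to check: the tight size of a 1920x1080 RGB24 image is 6,220,800 bytes, so any allocation beyond that is alignment padding and workspace. av_frame_get_buffer rounds each row up to the requested alignment, which is why you should always walk `data[]` using `linesize` rather than assuming a packed layout. The align value below is illustrative:

```python
# Tight vs padded size for an RGB24 frame. Allocators round each row up
# for alignment, so the allocation can exceed width*3*height; the align
# value (64) is just an illustration, not what libav necessarily uses.
def rgb24_sizes(width, height, align=64):
    tight_stride = width * 3
    padded_stride = -(-tight_stride // align) * align  # round up to align
    return tight_stride * height, padded_stride * height

tight, padded = rgb24_sizes(1920, 1080)   # tight size: 6,220,800 bytes
t2, p2 = rgb24_sizes(1918, 1080)          # odd width: padding kicks in
```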
[21:35:11 CEST] <faLUCE> hello, I would like to make a http-mpegts-h264 player for chrome... what do you suggest to use? is there a way to integrate libav on chrome?
[21:35:38 CEST] <sfan5> chrome already includes a video player?
[21:35:53 CEST] <furq> yeah chrome already plays h264
[21:36:06 CEST] <furq> and there's already at least one bit of code which remuxes mpegts to fmp4 in javascript
[21:36:23 CEST] <faLUCE> furq: sfan5, but what about http?
[21:36:37 CEST] <furq> are you asking if chrome does http
[21:36:42 CEST] <sfan5> i think he is
[21:37:03 CEST] <BtbN> Just use hls.js?
[21:37:13 CEST] <furq> yeah that's the javascript remuxer to which i refer
[21:37:20 CEST] <BtbN> It plays HLS, like the name may suggest, which is mpegts
[21:37:23 CEST] <furq> if you really don't want to use hls for some reason then you could probably adapt that
[21:37:47 CEST] <BtbN> If you give it a HLS playlist with one huge chunk, it'll probably play it just fine
[21:37:53 CEST] <faLUCE> I have to demux a HTTP-h264 stream, not a hls stream
[21:38:09 CEST] <BtbN> what is a "HTTP-h264 stream"?
[21:38:14 CEST] <BtbN> mpeg-ts over http?
[21:38:18 CEST] <furq> yeah
[21:38:18 CEST] <faLUCE> BtbN: yes
[21:38:29 CEST] <BtbN> so, use hls.js then. Most likely the easiest solution
[21:38:58 CEST] <faLUCE> BtbN: do you mean that hls.js plays a generic http mpegts stream?
[21:38:58 CEST] <furq> well hls.js expects a playlist, which i assume he doesn't have
[21:39:05 CEST] <furq> so you'd probably need to adapt it
[21:39:12 CEST] <BtbN> Just write a dummy one with just your one file
[21:39:19 CEST] <furq> it's a live stream
[21:39:24 CEST] <BtbN> shouldn't matter
[21:39:26 CEST] <faLUCE> a playlist where? in the header?
[21:39:44 CEST] <furq> you might be right actually
[21:39:48 CEST] <BtbN> It expects a HLS playlist, so give it one, which just points to your stream
[21:39:53 CEST] <furq> just serve an m3u8 that points to your stream
[21:40:25 CEST] <BtbN> I think m3u8 even has a defined mode for that special case
[21:40:33 CEST] <BtbN> But too lazy to look up the spec
[21:40:39 CEST] <furq> it's worth a try anyway
[21:40:53 CEST] <faLUCE> but where does this playlist have to be put? in the stream header?
[21:41:22 CEST] <furq> well hls.js is javascript so you'd need to embed the video in a webpage
[21:41:25 CEST] <BtbN> somewhere, it doesn't matter.
[21:41:34 CEST] <furq> so just serve the playlist from the same httpd
[21:41:48 CEST] <BtbN> can probably even generate it on the fly from JS
[21:41:50 CEST] <furq> point a <video> tag to the m3u8
[21:41:50 CEST] <faLUCE> I don't understand exactly, but I want to try
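The suggestion above boils down to serving a one-entry playlist whose single URI is the live mpegts endpoint. A minimal sketch (the stream URL is hypothetical, and whether hls.js accepts a single unbounded entry like this is exactly what needs testing):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:3600
#EXTINF:3600.0,
http://example.com/live.ts
```

Serve this as e.g. stream.m3u8 from the same httpd as the page, then hand that URL to hls.js (or directly to a <video> tag on browsers with native HLS support).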
[21:44:07 CEST] <faLUCE> so, basically hls.js adds a demuxer to the browser native decoder?
[21:44:22 CEST] <furq> it remuxes mpegts to fmp4 and then feeds that to the native player
[21:44:23 CEST] <BtbN> hls.js is a mpegts to mp4 remuxer written in JavaScript
[21:44:37 CEST] <faLUCE> I see, but I don't need to remux to mp4
[21:44:40 CEST] <BtbN> you do
[21:44:43 CEST] <faLUCE> why?
[21:44:43 CEST] <BtbN> browsers only play mp4
[21:44:45 CEST] <furq> ^
[21:44:46 CEST] <faLUCE> I see
[21:44:57 CEST] <furq> it's an awful hack but it's the easiest way to do it
[21:45:03 CEST] <faLUCE> I see
[21:45:15 CEST] <faLUCE> what if I mux mp4? Is it possible for a live stream?
[21:45:16 CEST] <furq> and it works in any browser on any platform
[21:45:35 CEST] <BtbN> mp4 can't be easily live streamed. But if you generate a fragmented mp4 stream, it might work as well
[21:45:36 CEST] <furq> fragmented mp4 should work
[21:45:41 CEST] <furq> although i wouldn't bother
[21:45:44 CEST] <BtbN> if you are not bound to mpeg-ts, I'd try that
[21:45:56 CEST] <furq> i'd trust mpegts more for non-browser players
[21:46:01 CEST] <faLUCE> I can try mp4 as well
[21:46:02 CEST] <furq> and hls.js should work for browsers
[21:46:06 CEST] <BtbN> should just be able to play a fragmented mp4 file from a <video> tag
[21:46:07 CEST] <furq> but yeah either should be ok
[21:46:19 CEST] <BtbN> But it will always play from the beginning, even if it's live
[21:46:21 CEST] <faLUCE> but why browsers need mp4? I don't understand
[21:46:30 CEST] <furq> because browsers suck
[21:46:31 CEST] <BtbN> It's the only container they support
[21:46:38 CEST] <furq> for h264, yeah
[21:46:51 CEST] <BtbN> or rather, all of them support
[21:46:56 CEST] <BtbN> some individual browsers support more
[21:46:56 CEST] <furq> yeah
[21:47:10 CEST] <faLUCE> [21:46] <BtbN> But it will always play from the beginning, even if it's live <--- I add a header as soon as the client makes the http request
[21:47:19 CEST] <furq> annoyingly, mobile browsers have native mpegts
[21:47:21 CEST] <BtbN> ...?
[21:47:27 CEST] <BtbN> How would you do that?
[21:47:39 CEST] <faLUCE> BtbN: I created a library
[21:47:44 CEST] <BtbN> You have a plain single video file on the server
[21:47:53 CEST] <BtbN> it'll serve it from the beginning
[21:47:54 CEST] <faLUCE> BtbN: no, I made a live video library
[21:48:07 CEST] <faLUCE> https://github.com/paolo-pr/laav
[21:48:21 CEST] <faLUCE> it works for mpegts but I can adapt it to mp4
[21:48:57 CEST] <faLUCE> the http stream part of the code is made with libevent and handles multiple requests, by adding the container's header for each one
[21:50:11 CEST] <faLUCE> I thought that mp4 was not good for live streaming, so I did not use it
[21:50:21 CEST] <BtbN> mp4 is horrible for live streaming
[21:50:27 CEST] <BtbN> which is why HLS and DASH exist
[21:50:46 CEST] <faLUCE> BtbN: but it's the only way to see a stream on the browser
[21:50:53 CEST] <BtbN> exactly
[21:51:00 CEST] <BtbN> so, browsers are bad for live streaming.
[21:51:12 CEST] <BtbN> Can just use Flash and play an rtmp stream...
[21:51:12 CEST] <faLUCE> I thought it could not be used at all (mp4) for live streams
[21:51:18 CEST] <furq> it's ok i'm sure there's no demand for internet livestreaming these days
[21:51:24 CEST] <BtbN> it can't. You cut it into a lot of short fragments
[21:51:39 CEST] <BtbN> So each fragment can be played individually
[21:52:12 CEST] <faLUCE> BtbN: do you mean that fragmented mp4 is somewhat a hack?
[21:52:20 CEST] <BtbN> It's a horrible hack, yes.
[21:52:34 CEST] <faLUCE> BtbN: but that way I introduce latency
[21:52:38 CEST] <furq> it's not so much a hack as a solution to a problem which shouldn't exist
[21:52:48 CEST] <furq> we already have perfectly good streaming containers
[21:53:01 CEST] <BtbN> If browsers would just support mpeg-ts or mkv natively, there wouldn't be an issue
[21:53:07 CEST] <furq> this is just a way to backdoor a streamable container into the warring factions of browser makers
[21:53:39 CEST] <BtbN> if you are Ok with only supporting MS Edge, that one does play mpegts just fine.
[21:53:44 CEST] <furq> it has no other utility as far as i can tell
[21:53:57 CEST] <faLUCE> let me think
[21:54:00 CEST] <BtbN> Well, HLS/DASH also make things easier for CDNs
[21:54:08 CEST] <james999> hey furq given i got streaming to work to my xbox over wifi with a special command line
[21:54:16 CEST] <faLUCE> my goal is to create a low latency http-mpegts player
[21:54:16 CEST] <james999> is there some way to submit a documentation update or patch for it?
[21:54:32 CEST] <BtbN> low latency streaming to a browser is not going to happen
[21:54:41 CEST] <faLUCE> BtbN: I see
[21:54:42 CEST] <BtbN> there will always be 3 times the gop length of latency at minimum
[21:54:53 CEST] <faLUCE> BtbN: then I have to switch to a standalone player
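BtbN's rule of thumb (segment-based players buffer roughly three segments before starting, and each segment must span at least one GOP) makes the latency trade-off concrete. A back-of-the-envelope calculation, with purely illustrative numbers:

```python
# Rough minimum startup latency for segment-based playback (HLS/DASH):
# buffered segments x segment duration, where each segment must cover at
# least one GOP. Numbers are illustrative, not measured.
def min_latency(gop_seconds, buffered_segments=3):
    return gop_seconds * buffered_segments

two_second_gop = min_latency(2.0)    # a common 2 s GOP: ~6 s at best
half_second_gop = min_latency(0.5)   # shrinking the GOP shrinks latency
```

This is why shrinking the GOP (at a bitrate-efficiency cost) is the main lever for latency in browser playback.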
[21:55:45 CEST] <BtbN> There are some sites utilizing WebRTC or something to have ultra-low-latency Remote-Desktops
[21:55:53 CEST] <BtbN> But I have no idea how they manage to do that
[21:57:27 CEST] <faLUCE> ok, now, if I want to make a SIMPLE http-mpegts player with libav, should I use libcurl for the http part? Or do I have to use libav for it as well?
[21:57:40 CEST] <faLUCE> (if I want to control the latency)
[21:57:55 CEST] <BtbN> what?
[21:58:02 CEST] <BtbN> Just use mpv?
[21:58:21 CEST] <faLUCE> BtbN:  vlc, mpv and ffplay introduce a latency which I can't control
[21:58:50 CEST] <faLUCE> better: I can't control it completely
[21:59:05 CEST] <james999> faLUCE: ah low latency video streaming, exactly what I'm intersted in. :D
[21:59:24 CEST] <faLUCE> james999: I can completely control the server part, in my library
[22:00:03 CEST] <james999> my goals are much more modest than writing a library
[22:00:12 CEST] <james999> i just want to stream youtube to my xbox one over wifi
[22:00:21 CEST] <BtbN> Can't it play YouTube on its own?
[22:00:26 CEST] <james999> finally figured it out, and wanted to ask furq if i could add it to the ffmpeg documentation somewhere
[22:00:34 CEST] <james999> yeah but you can't select the quality
[22:00:38 CEST] <james999> plus i can stream my desktop this way too
[22:01:04 CEST] <james999> i ended up having to use a udp url which specified a max packet size
[22:03:55 CEST] <BtbN> I wonder why nobody has come up with an easy way to Stream VP8 via srtp to browsers yet
[22:04:01 CEST] <BtbN> That should work great
[22:06:49 CEST] <james999> i'm trying to udp stream to the xbox with vlc udp legacy atm but it's not working
[22:06:52 CEST] <james999> prob cuz of the packet size thing
[22:18:16 CEST] <faLUCE> so, basically, I don't understand how to demux a http mpegts stream with libav: do I have to manage http with avio or with avformat?
[22:18:39 CEST] <james999> well that was a complete failure
[22:18:56 CEST] <james999> tried using both udp and http to my xbox with vlc and it wouldn't stream
[22:19:08 CEST] <james999> but when i picked samba share or UPNP server it worked
[22:21:02 CEST] <faLUCE> any tip about that?
[22:33:12 CEST] <kode54> how cute, deadbeef wants to continue to support ffmpeg 0.x
[22:33:22 CEST] <kode54> apparently, distributions living in the past still matter
[22:39:33 CEST] <kode54> apparently, my changes to their ffmpeg input cause a deadlock in playback on several years old versions of ffmpeg
[23:42:18 CEST] <alphabitcity> Hi all, I'm trying to use ffmpeg to delay the audio on an mp4 (to fix audio desync issue). I was able to figure that out, but I can't seem to figure out how to copy over the 3rd stream of the input mp4 to the output, which is a data stream (amf) data. Is that possible with ffmpeg? https://pastebin.com/EK5vHawT
[23:45:23 CEST] <alphabitcity> Sorry, here's the correct CLI cmd I've been using: https://pastebin.com/ZN6qTYjh
[23:46:52 CEST] <thebombzen> alphabitcity: use -map d in order to map all data streams
[23:47:38 CEST] <thebombzen> just like 'v' is video and 'a' is audio, there's 'd' for data, 's' for subtitle and 't' for attachment
[23:47:55 CEST] <thebombzen> or 't' for text, idk. but that one is used for attached fonts
[23:48:30 CEST] <thebombzen> so you could -map 0:d or -map 1:d depending on where you want it. in your case, using -map d will map the data stream from both files, so you probably want to use -map 0:d
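The selection rule thebombzen describes can be modeled in a few lines: a specifier like `0:d` picks the data streams of input 0, and a bare type letter matches that type across inputs. A toy model of the matching logic, not ffmpeg code:

```python
# Toy model of ffmpeg -map stream specifiers: "FILE:TYPE" selects streams
# of the given type from the given input. This mimics the selection rule
# only; it is not ffmpeg's actual implementation.
def map_streams(inputs, spec):
    file_idx, _, stype = spec.partition(":")
    streams = inputs[int(file_idx)]
    return [i for i, t in enumerate(streams) if not stype or t == stype]

# input 0: video, audio, data (like the mp4 above); input 1: the same file
inputs = [["v", "a", "d"], ["v", "a", "d"]]
picked = map_streams(inputs, "0:d")   # the data stream of input 0
```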
[23:49:05 CEST] <alphabitcity> thebombzen: thx so much. i tried that but am getting "Data stream encoding not supported yet (only streamcopy)" .. perhaps i'm using an old version of ffmpeg?
[23:49:27 CEST] <thebombzen> is it a warning? and yea that's not a surprise
[23:49:45 CEST] <thebombzen> what that means is that you cannot convert between different codecs for the data stream... which you shouldn't be able to
[23:49:56 CEST] <thebombzen> to suppress the warning you can use -c:d copy, which streamcopies the data stream
[23:50:06 CEST] <thebombzen> but it should do that by default anyway
[23:50:29 CEST] <thebombzen> otherwise it tries to autoselect a codec... but for data there isn't any so it'll just default to copy. same with 't' streams
[23:50:39 CEST] <alphabitcity> hmm i see
[23:51:56 CEST] <alphabitcity> getting a "Could not write header" error now: https://pastebin.com/xAGXcrQ3 (with -map 0:d and -c:d copy)
[23:53:35 CEST] <thebombzen> you're using an ancient version of ffmpeg
[23:53:52 CEST] <alphabitcity> ok good to know, will update :)
[23:54:04 CEST] <thebombzen> that sounds like an issue that should be fixed
[23:54:18 CEST] <thebombzen> I mean that looks like a bug but check a recent version first
[23:54:23 CEST] <thebombzen> fyi you can install ffmpeg with homebrew
[23:54:33 CEST] <alphabitcity> yea, i have it installed with homebrew. updating it now
[23:54:39 CEST] <alphabitcity> on 3.3 now
[23:55:16 CEST] <thebombzen> try again on 3.3. that should work
[23:55:30 CEST] <thebombzen> also you should use -ss, not -itsoffset
[23:55:39 CEST] <thebombzen> -ss says "seek" and -itsoffset just shifts the timestamps
[23:56:17 CEST] <thebombzen> although -itsoffset is useful for streamcopy, you could run into issues
[23:56:25 CEST] <thebombzen> compatibilty issues with bad players I mean
[23:56:42 CEST] <thebombzen> if you don't care then you don't care, I just figured I'd warn you
[23:56:59 CEST] <alphabitcity> @thebombzen: ty. i switched to ss and upgraded to 3.3. it looks like it's having trouble looking for a codec for the data stream: https://pastebin.com/3G34rrgY
[23:57:29 CEST] <thebombzen> ss is not a drop-in replacement by the way
[23:57:35 CEST] <thebombzen> it'll effectively work in the other direction
[23:57:44 CEST] <alphabitcity> ok, good to know :) i'll read the docs on it
[23:57:46 CEST] <thebombzen> so keep using itsoffset until you fix the data issue. one step at a time
[23:57:57 CEST] <alphabitcity> ok
[23:58:36 CEST] <thebombzen> that's... interesting. what if you write it to matroska?
[23:59:29 CEST] <alphabitcity> @thebombzen: looks like mkv doesn't support data tracks? "Only audio, video, and subtitles are supported for Matroska."
[23:59:37 CEST] <thebombzen> what about nut?
[00:00:00 CEST] --- Mon May  1 2017

More information about the Ffmpeg-devel-irc mailing list