[Ffmpeg-devel-irc] ffmpeg.log.20170917

burek burek021 at gmail.com
Mon Sep 18 03:05:01 EEST 2017


[00:06:17 CEST] <MelchiorGaspar> Thx for the info.. I use Avanti-GUI
[01:25:14 CEST] <wondiws> hello, what should I use instead of AVFrame coded_frame, now that has been deprecated?
[01:30:33 CEST] <jkqxz> What are you using it for?  You probably want something in the packet or side data.
[01:31:58 CEST] <wondiws> jkqxz, I'm not exactly sure; it's not my code, I think they want to get a timestamp or something
[01:32:32 CEST] <wondiws> jkqxz, nah, I don't need it anyway
[03:09:30 CEST] <notdaniel> Schwarzbaer, you're using it in order to use dash/hls on the fly?
[03:10:20 CEST] <Schwarzbaer> notdaniel, I don't have a use case, so... Sure, why not?
[03:11:46 CEST] <notdaniel> i'm currently looking for the reverse
[03:12:02 CEST] <notdaniel> serving up mp4s from a dash manifest
[03:12:42 CEST] <Schwarzbaer> TBH I don't even know what a dash manifest is.
[03:14:16 CEST] <notdaniel> so nginx-rtmp is just there for your amusement
[03:14:24 CEST] <Schwarzbaer> Yup.
[03:14:27 CEST] <notdaniel> rad
[03:14:58 CEST] <notdaniel> well, dash/hls are the specs used most now for adaptive video streaming, like how a 720p youtube video will kick down to 480 if your internet slows but back up if it speeds up
[03:15:37 CEST] <notdaniel> we recently switched our video platform to _only_ use dash/hls and delete the original mp4 for cost reasons
[03:16:30 CEST] <notdaniel> we then discovered that probably 8% of our traffic will never be able to view videos again, but we've already made the switch, so i'm now attempting to figure out a hilarious way to remux them back into mp4s on demand for those specific users
[03:18:08 CEST] <Schwarzbaer> Yyyyeah, that's the fun bit about hobby projects, "If it works for me, that's good enough."
[03:19:11 CEST] <DHE> does ffmpeg not have a dash decoder? I only see an encoder
[03:19:14 CEST] <notdaniel> yeah you have that luxury
[03:19:21 CEST] <notdaniel> in our case, we get 3000 angry emails
[03:19:53 CEST] <notdaniel> DHE, dash isnt a format in itself. the media is typically fragmented mp4s
[03:20:11 CEST] <notdaniel> dash is the spec telling the player which media to use at which bandwidth levels
[03:20:20 CEST] <DHE> yes, I know. I'm more an HLS guy myself but still. converting an HLS stream back into an .mp4 (by selecting a single bitrate) is easy
[03:20:31 CEST] <notdaniel> oh, no, we can totally do it
[03:20:55 CEST] <notdaniel> but doing it on demand, without killing performance, while also not incurring the storage costs we started deleting the mp4s to avoid in the first place
[03:21:18 CEST] <notdaniel> hls/dash are essentially the same anyway
[03:21:38 CEST] <DHE> if you need both, are you sure you can't get the additional storage needed?
[03:21:50 CEST] <DHE> clearly you have a need
[03:21:56 CEST] <notdaniel> it's why it's great. we use the same media for hls and dash, and they arent segmented. if only we hadn't given the finger to ios9 and ie11 users in the process, it wouldve been perfect
[03:22:28 CEST] <notdaniel> DHE, we're a funded but very small startup running a free video platform
[03:22:38 CEST] <notdaniel> just deleting the mp4 they uploaded was like a $35k/month savings
[03:22:48 CEST] <DHE> amazon s3? (or similar)
[03:22:53 CEST] <notdaniel> yup
[03:23:17 CEST] <notdaniel> we _thought_ dash/hls was supported enough that this was fine. and in most respects this is still the case -- new iphone release will see upgrades and the problem will be lessened
[03:23:48 CEST] <notdaniel> issue is there are sporadic problems with ie and firefox, even in builds that claim to support dash/hls, due to nuances and minor spec changes and such
[03:26:59 CEST] <Schwarzbaer> Apparently I'll have a lot of learning to do... Anybody please give me pointers? For instance, AFAIK DASH is an HTTP-based protocol? I assume it's about the data going over a websocket?
[03:31:14 CEST] <Schwarzbaer> Also, what is HLS, how does it differ from DASH?
[03:33:19 CEST] <notdaniel> hls is a similar concept, but is implemented on ios devices and the likes of IE/Edge
[03:34:00 CEST] <notdaniel> same concept, very slightly different implementation, and as ive mentioned, theyve now merged to the point where you can use the same video files with both, the only difference is the format of the text file with the manifests
[03:34:40 CEST] <notdaniel> google favors dash, apple favors hls, both are supported by most platforms and browsers that existed after 2015
[03:34:40 CEST] <Schwarzbaer> I see. Suddenly I'm not interested in HLS anymore, but thank you. ^^
[03:34:44 CEST] <notdaniel> haha
[03:34:53 CEST] <notdaniel> well, cant give the finger to ios users :)
[03:35:13 CEST] <Schwarzbaer> Not if you're paid, sure. Freedom of the hobbyist... ^^
[03:35:23 CEST] <notdaniel> and again, the videos are the same. in order to spit out hls alongside dash, i literally just add an --hls flag to the dash command
[03:35:31 CEST] <Schwarzbaer> How much work is involved in providing both, though?
[03:35:52 CEST] <Schwarzbaer> I see.
[03:35:55 CEST] <notdaniel> actual work? one long command to a free open source tool
[03:36:05 CEST] <notdaniel> costs of bandwidth and such are a whole other ball game
[03:36:25 CEST] <notdaniel> but we're still saving 50% because hls and dash share the same video files
[03:36:43 CEST] <notdaniel> bento4 is the tool to use for this
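For context, the bento4 workflow notdaniel describes looks roughly like the following sketch (tool names are bento4's; file and directory names here are hypothetical):

```shell
# fragment the mp4 first, then generate a DASH manifest with an HLS
# playlist alongside it (--hls); both manifests reference the same media
mp4fragment input.mp4 input-frag.mp4
mp4dash --hls -o output_dir input-frag.mp4
```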
[03:37:00 CEST] <Schwarzbaer> At what part does DASH come into play, anyway? I assume that it's yet another protocol, thus one more module in nginx?
[03:37:30 CEST] <notdaniel> much simpler
[03:37:54 CEST] <notdaniel> dash/hls are the frameworks that tell the player/browser which media to retrieve based on the user's connection speed
[03:38:04 CEST] <notdaniel> after that, it's just all basic http requests for the one it picks
[03:38:48 CEST] <notdaniel> "use media file A if bandwidth is lower than 8mbit. switch to media B if bandwidth is measured to be at least 14mbit." etc
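That "pick media A below this bandwidth, media B above it" rule lives in the manifest itself. A minimal HLS master-playlist sketch (bitrates, resolutions, and file names made up for illustration):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=842x480
480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
720p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
1080p.m3u8
```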
[03:38:48 CEST] <Schwarzbaer> That means I need to provide multiple streams?
[03:39:09 CEST] <notdaniel> well sure, so from the original video file, you go make a 1080p, 720p, 480p version
[03:39:14 CEST] <notdaniel> (or whatever you want to support)
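Producing those rungs with ffmpeg can be sketched as below (a dry run that prints one encode command per rung — drop the `echo` to actually run them; encoder settings and file names are illustrative, not notdaniel's actual pipeline):

```shell
# one H.264/AAC encode per target height; scale=-2:H keeps the aspect
# ratio and forces an even width
for h in 1080 720 480; do
  echo ffmpeg -i source.mp4 -vf "scale=-2:$h" -c:v libx264 -c:a aac "rung_${h}p.mp4"
done
```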
[03:39:17 CEST] <Schwarzbaer> And here I was hoping it was a matter of on-the-fly recoding. Which probably wouldn't scale, though.
[03:39:42 CEST] <notdaniel> nginx-rtmp does do this on the fly supposedly
[03:39:49 CEST] <Schwarzbaer> Right now, I have a 640x480 webcam as my media source. ^^
[03:39:52 CEST] <notdaniel> we are doing this all ourselves
[03:40:04 CEST] <notdaniel> haha well then you dont even need dash anyway :P the lowest resolution is the only choice
[03:40:46 CEST] <notdaniel> we have people uploading shit in like 4k 60p needlessly
[03:41:09 CEST] <notdaniel> so we go make a bunch of versions of it and then dash/hls will switch to a different format every few seconds if bandwidth is too low/high
[03:41:23 CEST] <Schwarzbaer> Actually, 640x480 is high for my circumstances. It works within my LAN (although there's a 7 sec delay, the source of which I haven't found yet), but if I'd be serving that over my 60kb/s connection, I'd still have to downsample that.
[03:41:44 CEST] <notdaniel> yeah i suppose it could still be a high bitrate despite the resolution
[03:41:55 CEST] <notdaniel> if you want to "see" this, here: http://dashif.org/reference/players/javascript/latest/samples/dash-if-reference-player/index.html
[03:42:17 CEST] <notdaniel> just pick anything from the dropdown and hit "load" and then look at the charts, and youll see as the format picks the correct one based on your internet speed
[03:42:36 CEST] <notdaniel> force a slowdown in chrome tools and a few seconds later itll adapt. that is the point, and it's absolutely vital
[03:43:22 CEST] <notdaniel> especially for, say, mobile, where you might have 5mbits and walk ten feet and have nothing for a few seconds and then get 2mbits and etc
[03:43:43 CEST] <notdaniel> video would be unbearable on mobile had this not been implemented years ago by youtube and such
[03:47:27 CEST] <notdaniel> webtorrent will soon make all of this irrelevant anyway
[03:50:07 CEST] <Schwarzbaer> ... Let me guess, ad hoc p2p data distribution networks in everybody's browser?
[03:56:21 CEST] <notdaniel> if you go watch a video on bitchute, if someone else also goes to watch it, you'll be seeding it to them as long as youre on it
[03:56:30 CEST] <notdaniel> reducing the cost to the video platform
[03:56:54 CEST] <Schwarzbaer> I assume that won't just be for video?
[03:56:59 CEST] <notdaniel> but the thing that really astounded me is, go open the web inspector and look at all the xhrs bitchute makes
[03:57:15 CEST] <notdaniel> it's like several http requests per second, but somehow performance is stellar
[03:58:10 CEST] <notdaniel> webrtc can be fun. but this is also only usable in latest browsers, isnt compatible with existing standards, and would also only benefit videos that have multiple concurrent viewers at any given time
[03:58:17 CEST] <notdaniel> but still interesting
[03:59:11 CEST] <Schwarzbaer> Well, I'm interested in serving large data blobs at some time in the future, so it's still of interest to me... if it's not just for video.
[04:00:44 CEST] <notdaniel> webtorrent still implies multiple seeders simultaneously and such
[04:01:47 CEST] <notdaniel> but if your use-case is data blobs that are shared by lots of users at any given time, man, stop paying that bandwidth
[04:01:50 CEST] <Schwarzbaer> To re-summarize... RTMP is for feeding audio/video to a server, and streaming it from it; DASH is for streaming it, and re-negotiating the video source on the fly, and it's the protocol between player and server. HLS is DASH in Appleish. And webtorrent is the near future. So far, so correct?
[04:01:55 CEST] <notdaniel> (once all your users switch to latest firefox and chrome of course)
[04:02:36 CEST] <notdaniel> rtmp is a misnomer in the context of nginx-rtmp
[04:02:49 CEST] <notdaniel> rtmp was the protocol that video used when everything was flash-based
[04:03:15 CEST] <JEEB> no, it's still the feeding protocol
[04:03:53 CEST] <JEEB> so it's not a misnomer, and he understands correctly that it looks like <feeder>->RTMP-><nginx><-DASH/HLS over HTTP<-<client>
[04:04:14 CEST] <Schwarzbaer> Is it a misnomer for the streaming bit? Because I've pointed a player at the same endpoint, got the stream, and was very confused.
[04:04:30 CEST] <JEEB> left of nginx being feeding to the server, and right being what's served to clients
[04:04:41 CEST] <JEEB> although of course it also can provide you RTMP as well
[04:04:47 CEST] <JEEB> for the client that is
[04:05:00 CEST] <JEEB> it's just that usually only software players and flash support RTMP
[04:05:07 CEST] <notdaniel> yeah you are correct, but if i read "rtmp" my brain thinks "flash" not "live remuxing to dash/hls"
[04:06:43 CEST] <notdaniel> and this is not correct, but it's habit. the practical result is a dash/hls manifest
[04:09:04 CEST] <notdaniel> still so much fragmentation out there. IE Edge supports new dash/hls specs with shared fragmented mp4s, but it seems to choke if the manifest is provided as a data blob
[04:09:32 CEST] <notdaniel> it's all basically a big mess and we are now paying the price for having tried to save money by deleted our mp4s
[05:00:18 CEST] <JC_Yang> questions about avc1 streams: in this MSDN doc, https://msdn.microsoft.com/zh-cn/library/dd757808(v=vs.85).aspx, it is stated that the size of the length field of an avc1 frame can vary. how can I read the size of the length field of an avc1 frame with libavformat? which parameter should I read? AVStream->what?
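JC_Yang's question never gets answered in-channel. For the record: with libavformat the avcC configuration record for an avc1 stream is exposed in AVStream->codecpar->extradata, and the NAL length-field size is the low two bits of the record's fifth byte, plus one (lengthSizeMinusOne in the AVCDecoderConfigurationRecord). A quick sketch decoding that from a hex dump — the hex string below is a made-up example, not real extradata:

```shell
# avcC layout: version, profile, compat, level, then 6 reserved bits
# plus the 2-bit lengthSizeMinusOne in byte index 4
avcc="01640028ffe1"                # hypothetical extradata hex dump
byte5=$((16#${avcc:8:2}))          # fifth byte (0xff here)
echo "length field size: $(( (byte5 & 0x03) + 1 ))"
```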
[11:43:15 CEST] <wondiws> hi there
[11:48:05 CEST] <wondiws> Just to make sure, I know this is probably a superfluous question, but ffmpeg uses LGPL, right?
[11:49:50 CEST] <JEEB> it uses LGPL as long as it's configured with LGPL, there are some components that require additional licensing parameters such as '--enable-version3' or '--enable-gpl'
[11:50:14 CEST] <JEEB> or '--enable-nonfree' which is very rare thankfully, but will make binary distribution impossible due to licensing incompatibility
[11:50:44 CEST] <JEEB> if you just do ./configure --disable-autodetect -- that should result in an LGPL configured FFmpeg
[11:50:51 CEST] <iive> also linking it to GPL library like libx264 turns it into GPL too.
[11:51:01 CEST] <JEEB> iive: that's why you require --enable-gpl for libx264 :P
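A quick way to confirm what license a given configuration actually produced — ffmpeg's `-L` flag prints the build's license (configure flags as discussed above; build steps are the usual sketch):

```shell
./configure --disable-autodetect   # plain LGPL configuration
make
./ffmpeg -L                        # prints the license this binary was built under
```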
[11:52:03 CEST] <wondiws> I read something along the lines that LGPL libraries are usually dynamically linked. Is there some clause that forbids statically linking ffmpeg to your proprietary executable?
[11:53:12 CEST] <JEEB> no, but since the LGPL requires you to make it possible to switch the libraries shared linking is more often preferred
[11:53:25 CEST] <JEEB> because with static linking you'd have to provide the object files for the proprietary parts
[11:53:33 CEST] <JEEB> for the final linking
[11:53:43 CEST] <wondiws> JEEB, let me try to digest that for a second
[11:53:54 CEST] <wondiws> switch the libraries?
[11:54:11 CEST] <JEEB> yes, in other words if someone builds FFmpeg just like you did, he should be able to switch the libraries
[11:54:26 CEST] <wondiws> switch to what?
[11:54:30 CEST] <JEEB> his own binaries
[11:54:36 CEST] <JEEB> not that you have to give support for that, just that it is technically possible
[11:54:42 CEST] <JEEB> that's what's in the LGPL
[11:54:55 CEST] <JEEB> you give out the source code for the LGPL component (in this case, FFmpeg)
[11:55:10 CEST] <JEEB> and make sure that you can technically replace the libraries linked into your application
[11:55:14 CEST] <wondiws> but I don't make changes to the ffmpeg code
[11:55:18 CEST] <JEEB> that doesn't matter
[11:55:29 CEST] <JEEB> the license is the same whether you make changes or not
[11:55:55 CEST] <JEEB> you publish sources for the LGPL part + make it possible to change the FFmpeg in your proprietary app
[11:56:04 CEST] <JEEB> or whatever is under LGPL I mean
[11:56:11 CEST] <iive> the user should be able to change the ffmpeg library
[11:56:26 CEST] <JEEB> that's why people opt for shared libraries, because the user can just change the solib/dylib/DLL
[11:57:06 CEST] <wondiws> oh, so you should be able to update the ffmpeg, and have the proprietary program still be able to work?
[11:57:11 CEST] <JEEB> no
[11:57:30 CEST] <JEEB> but if the person builds the same version in the same way
[11:57:36 CEST] <JEEB> he should be able to technically switch
[11:57:51 CEST] <JEEB> since FFmpeg's API changes any larger updates are out of scope (But people can try if they really want to)
[11:58:20 CEST] <wondiws> so if you just release a proprietary binary with ffmpeg statically linked into it, you're are not compliant to the ffmpeg license?
[11:58:32 CEST] <JEEB> s/ffmpeg license/LGPL/
[11:58:35 CEST] <JEEB> since this is not FFmpeg specific
[11:58:45 CEST] <wondiws> yes, the LGPL license
[11:58:48 CEST] <JEEB> also no, you can be compliant with static linking
[11:58:59 CEST] <JEEB> but you will then have to provide the object files
[11:59:03 CEST] <JEEB> for the final linking
[11:59:23 CEST] <JEEB> because without them the user cannot replace the LGPL component if he wishes to
[11:59:31 CEST] <JEEB> that's why people generally don't do it
[11:59:47 CEST] <JEEB> because it's more effort than just having the shared libraries there
[11:59:50 CEST] <wondiws> so the end user must have the possibility to do the linking process himself?
[11:59:59 CEST] <JEEB> with static libraries yes
[12:00:03 CEST] <wondiws> ah ok
[12:00:05 CEST] <JEEB> since that's how you replace the library
[12:00:22 CEST] <JEEB> with dynamic libraries it's often enough to just replace the solib/dylib/DLL
[12:00:41 CEST] <wondiws> ah I see
[12:00:48 CEST] <Fyr> guys, I experience some troubles in conversion. please, take a look at this:
[12:00:48 CEST] <Fyr> https://pastebin.com/PyCyevgD
[12:01:34 CEST] <Fyr> all the 23 episodes were converted successfully, only the final failed.
[12:01:48 CEST] <wondiws> I'm using ffmpeg in this program, I use a MJPEG stream. Is it possible to use OMX (raspberry pi) acceleration on MJPEG?
[12:02:44 CEST] <JEEB> Fyr: seems like it stumbled on invalid chapters?
[12:03:01 CEST] <Fyr> JEEB, so, it's only the wrong metadata?
[12:03:11 CEST] <Fyr> I don't really need chapters.
[12:03:23 CEST] <JEEB> that's what it looks like, since the failed write happens right after the chapter error in the matroska muxer
[12:03:34 CEST] <JEEB> there was a way to ignore the metadata or so I think?
[12:03:40 CEST] <Fyr> ok, how do I avoid metadata?
[12:03:44 CEST] <JEEB> also rip shana. that BD upscale was bad :<
[12:03:57 CEST] <JEEB> they warpsharp()'d it to hell and back
[12:03:58 CEST] <Fyr> JEEB, I'm ripping.
[12:04:12 CEST] <Fyr> I've downloaded and trying to convert the video.
[12:04:21 CEST] <JEEB> basically the upscaling they did for the TV was OK, but the blu-rays are goddamn awful
[12:04:23 CEST] <Fyr> (to watch it on my ipad)
[12:04:34 CEST] <Fyr> I know
[12:04:42 CEST] <JEEB> well, then you know :P
[12:04:46 CEST] <Fyr> I'm trying to make the bitrate right.
[12:04:55 CEST] <JEEB> wondiws: there's OMX hwaccel for some formats but not sure about MJPEG
[12:05:05 CEST] <Fyr> JEEB, what is the way to exclude metadata?
[12:05:30 CEST] <JEEB> it should be mentioned somewhere on ffmpeg-all.html, I just don't fucking know out of the top of my mind :D
[12:05:36 CEST] <JEEB> I just know it exists
[12:05:41 CEST] <JEEB> I think it had to do with mapping
[12:06:26 CEST] <Fyr> -map_metadata -1 doesn't work.
[12:06:43 CEST] <Fyr> FFmpeg still tries to parse the metadata even when told to skip it.
[12:07:19 CEST] <Fyr> and stops the conversion as soon as finds out that it's invalid.
[12:07:30 CEST] <JEEB> rip
[12:07:39 CEST] <Fyr> I tried to pipe the video, however that failed for the same reason.
[12:07:50 CEST] <JEEB> use mkvmerge or something to pre-process the file first
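A sketch of that pre-processing step — dropping the broken chapters with mkvmerge before handing the file back to ffmpeg (the `--no-chapters` and `--no-global-tags` flags are mkvmerge's; file names are hypothetical):

```shell
# remux without chapters (and without global tags, which can also carry
# broken metadata), then convert the cleaned file
mkvmerge -o cleaned.mkv --no-chapters --no-global-tags broken.mkv
ffmpeg -i cleaned.mkv -c:v libx264 -c:a aac output.mp4
```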
[12:13:02 CEST] <wondiws> JEEB, where do I need to search for in ffmpeg-all? :)
[12:14:00 CEST] <wondiws> I know I can encode h264 using "ffmpeg -i whatever.vob -vcodec h264_omx output.mp4"
[12:14:10 CEST] <JEEB> uhh, that was for Fyr
[12:14:36 CEST] <wondiws> :(
[12:15:26 CEST] <wondiws> gstreamer does support mjpeg, but gstreamer doesn't use ffmpeg code, right?
[12:15:34 CEST] <wondiws> (I've never used gstreamer before)
[12:16:51 CEST] <JEEB> it uses FFmpeg in the background in various cases
[12:16:58 CEST] <JEEB> also omx.c seems to mostly/only encode
[12:17:03 CEST] <JEEB> under libavcodec
[12:17:15 CEST] <wondiws> JEEB, and probably only h264, right?
[12:17:19 CEST] <wondiws> or perhaps h263?
[12:17:27 CEST] <wondiws> I think I recently looked there
[12:17:28 CEST] <JEEB> it also has mpeg-4 part 2 and AVC
[12:17:37 CEST] <wondiws> yeah, h263
[12:17:37 CEST] <JEEB> so yes, two formats that I can see
[12:17:41 CEST] <JEEB> no, H.263 is different
[12:17:49 CEST] <wondiws> I thought that was part 2?
[12:18:13 CEST] <JEEB> part 2 based on H.263 but removed features
[12:18:17 CEST] <JEEB> like in-loop deblocking
[12:18:28 CEST] <JEEB> which is why ITU-T never adopted MPEG-4 Part 2
[12:18:54 CEST] <wondiws> but MPEG-4 part 2 does comply with h263, just not the other way around? ;)
[12:19:19 CEST] <JEEB> not sure about that
[12:19:34 CEST] <wondiws> if part2 is a subset of h263, but nevermind
[12:19:39 CEST] <JEEB> yes, IFF it is
[12:19:52 CEST] <JEEB> I am not sure of the bit stream
[12:23:33 CEST] <JC_Yang> there's no known-to-me way to read this piece of information, the length header size... help please
[12:29:05 CEST] <Fyr> I've posted the bug report.
[12:34:28 CEST] <furq> wondiws: last time i checked ffmpeg's mmal implementation only supports h264
[12:34:56 CEST] <wondiws> furq, we just established it also supports MPEG-4 part 2 ;)
[12:35:04 CEST] <furq> https://github.com/FFmpeg/FFmpeg/blob/master/libavcodec/mmaldec.c#L376-L388
[12:35:15 CEST] <furq> i'm probably remembering wrong but i did remember there's no mjpeg
[12:35:27 CEST] <JEEB> furq: right, the decoder was MMAL
[12:35:31 CEST] <furq> yeah
[12:35:32 CEST] <JEEB> omx.c had just the encoders
[12:35:40 CEST] <JEEB> which were mpeg-4 part 2 / AVC
[12:36:19 CEST] <furq> also i assume you still need to buy the m2v/vc-1 decoders for ffmpeg to be able to use them
[12:39:48 CEST] <wondiws> oh, so I can decode MPEG2 using mmal?
[12:40:00 CEST] <wondiws> and mpeg4, vc1, h264?
[12:40:59 CEST] <Fyr> what is "h264_mmal"?
[12:41:12 CEST] <Fyr> I can't find an explanation for this.
[12:43:04 CEST] <furq> mmal is the raspberry pi's hardware decoder
[12:43:14 CEST] <furq> wondiws: you sure can
[12:43:32 CEST] <furq> mmal does definitely support mjpeg but it looks like it's not made it into libavcodec yet
[12:43:59 CEST] <furq> which is odd considering 90% of support questions in here about an rpi not working are about usb2 webcams
[12:48:28 CEST] <wondiws> "unknown decoder mpeg2_mmal" :(
[12:49:00 CEST] <wondiws> furq, also the dedicated RPi camera is tied in to the omx chip
[12:56:41 CEST] <wondiws> wow, I disabled audio encoding, and I go from 26fps to 110fps encoding
[12:59:18 CEST] <furq> nice
[12:59:36 CEST] <furq> what codec
[13:00:17 CEST] <wondiws> furq, h264_omx
[13:00:24 CEST] <furq> i mean what audio codec
[13:00:40 CEST] <wondiws> oh, previously I got aac
[13:00:56 CEST] <furq> you might want to try with fdk if your build has it
[13:01:04 CEST] <furq> or lame
[13:01:28 CEST] <wondiws> furq, I think AVC should be accompanied by AAC ;)
[13:01:40 CEST] <furq> fdk is an aac encoder
[13:01:56 CEST] <furq> it's optimised for android so maybe it'll be faster on arm, idk
[13:02:07 CEST] <wondiws> furq, in due time ;)
[13:02:11 CEST] <furq> but yeah lame will definitely be faster than either
[13:10:18 CEST] <wondiws> furq, great I got a 200kbps video... why in the world would it default to such a rate...
[13:10:30 CEST] <wondiws> with HEVC it might be watchable though :P
[13:15:09 CEST] <doslas> Hi
[13:15:15 CEST] <doslas> https://imgur.com/O4BZK4J
[13:15:19 CEST] <doslas> Opera
[13:23:20 CEST] <JEEB> wondiws: that's the libavcodec default if the encoder doesn't override it :)
[15:51:26 CEST] <arpu> can i use drawtext on a 360 video equirectangular? i think i need to remap the rectangualr font to equirectangular?
[17:45:35 CEST] <timofonic> Hello
[17:48:55 CEST] <timofonic> Is it true FFmpeg has some specific features related to streaming? They reiterate FFmpeg is very bad at downloading HLS streams for different reasons (lack of http keep-alive support and bad bitrate selection). They also say it has very basic MPEG-DASH and RTMP support. Has this changed? Any hope?
[19:45:18 CEST] <DHE> timofonic: ffmpeg generally doesn't select a bitrate. if you're making a player, you likely want to make your own DASH parser and just have ffmpeg parse/decode the real video files
[19:46:09 CEST] <DHE> RTMP is supported, but ffmpeg is not an RTMP server. It is an endpoint. Use something like nginx-rtmp and have ffmpeg feed it
[21:04:36 CEST] <panzerknacker> Hello everyone!
[21:07:55 CEST] <panzerknacker> I have a question: I'm using Dave Rice' Filter from here (http://dericed.com/2012/display-video-difference-with-ffmpegs-overlay-filter/) to compare files, and just noted that if I compare the same file (using same -i both times) that it does show differences in the output, while I understand that it should not show anything at all since there is no difference. The line I'm using is: https://pastebin.com/ynxuuFvQ
[21:09:40 CEST] <panzerknacker> The question being, what am I doing wrong :)
[21:13:14 CEST] <furq> you can just use lut2 for that
[21:13:19 CEST] <furq> !filter lut2 @panzerknacker
[21:13:19 CEST] <nfobot> panzerknacker: http://ffmpeg.org/ffmpeg-filters.html#lut2_002c-tlut2
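The lut2 approach furq points to can be sketched as below, using the difference-highlighting expression from the filter documentation (file names hypothetical); pixels where the two inputs differ go to full intensity, identical pixels to black:

```shell
# compare two videos: identical pixels -> 0, differing pixels -> max value
ffmpeg -i a.mp4 -i b.mp4 -filter_complex \
  "lut2='ifnot(x-y,0,pow(2,bitdepth)-1):ifnot(x-y,0,pow(2,bitdepth)-1):ifnot(x-y,0,pow(2,bitdepth)-1)'" \
  diff.mp4
```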
[21:26:08 CEST] <DHE> shouldn't you use a semicolon in a filter_complex when using [labeledoutputs] ?
[21:26:33 CEST] <DHE> I mean the one before the [0:v] labeled input
[21:27:39 CEST] <SavinaRoja> so I am working on trying to prepare a video file for MPEG-Dash streaming, and I've managed to effectively allocate keyframes for stream fragmentation, but the tool I'm trying to use to prepare the manifest appears to be complaining that the audio stream is not fragmented
[21:30:32 CEST] <SavinaRoja> my general question is: is there an audio codec reference I should look at to see how I might set up a fragmented stream for audio?
[21:54:59 CEST] <SavinaRoja> I don't even seem to google anything up about how to fragment an audio stream in mp4, except that it can be done by a tool in bento4
[22:17:34 CEST] <SavinaRoja> finally realized I was being daft, the fragmentation is configured on the container, not the stream
[22:18:16 CEST] <SavinaRoja> -frag_duration in https://ffmpeg.org/ffmpeg-formats.html#Options-8 is what I wanted
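What SavinaRoja landed on can be sketched as follows — fragmentation is a property of the mp4 muxer, and `-frag_duration` takes microseconds (movflags and file names here are illustrative):

```shell
# write a fragmented mp4: empty moov up front, a new fragment roughly
# every 4 seconds (-frag_duration is in microseconds)
ffmpeg -i input.mp4 -c copy -movflags +empty_moov+default_base_moof \
  -frag_duration 4000000 fragmented.mp4
```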
[22:52:53 CEST] <timofonic> DHE: RTMP server why?? Why not put that code into FFmpeg and avoid code duplication?
[22:53:41 CEST] <BtbN> Because ffmpeg is not a server, not suited to act as one.
[23:04:30 CEST] <DHE> timofonic: a server needs to handle a large number of simultaneous sockets. that's not the intended use of the ffmpeg RTMP muxer. nginx is much better suited to this
[00:00:00 CEST] --- Mon Sep 18 2017

