[Ffmpeg-devel-irc] ffmpeg.log.20141216

burek burek021 at gmail.com
Wed Dec 17 02:11:15 CET 2014


[00:49] <Eric______> can someone suggest where I can find open-source ULE encapsulator source code?
[00:49] <Eric______> I saw this posting but it's not accessible any more
[00:49] <Eric______> http://www.erg.abdn.ac.uk/users/Gorry/ipdvb/archive/0602/msg00010.html
[00:50] <Eric______> please help me?
[00:54] <Eric______> can someone suggest where I can find open-source ULE encapsulator source code?
[00:54] <pzich> nope
[00:55] <iive> i don't know what ule is...
[00:55] <iive> i know this http://obe.tv/ address has some interesting dvb stuff.
[00:56] <iive> Eric______: btw, have you tried webarchive ?
[00:56] <Eric______> ULE is Unidirectional Lightweight Encapsulation (ULE)
[00:57] <Eric______> @iive where is that?
[00:58] <tnelsond> When trying to hardcode a bitmap subtitle stream into a mkv using overlay I got very very dimmed subtitles. So I tried using blend=all_mode=overlay.
[00:58] <iive> Eric______:  https://web.archive.org/web/20070630203514/http://nrg.cs.usm.my/ule.htm
[00:58] <tnelsond> Now I get "First input link top parameters (size 720x480, SAR 0:1) do not match the corresponding second input link bottom parameters (720x480, SAR 8:9)"
[01:00] <Eric______> @<iive> let me see that
[01:00] <pzich> tnelsond: maybe use setsar?
[01:00] <Eric______> many Thanks
[01:01] <tnelsond> pzich: Ok, yes, setsar works, thanks.
[01:03] <tnelsond> Now what blend mode do I use to put bitmap subtitles over top of the video?
[01:04] <iive> tnelsond: it would be a good idea to use something with alpha channel.
[01:04] <iive> aka, true transparency
[01:07] <tnelsond> Well, the bitmap subtitle stream came with the dvd. I just want to hardcode it over top. So how would I do something with the alpha channel? Shouldn't the dvdsubs already have an alpha?
[01:07] <iive> so, vobsub?
[01:07] <tnelsond> I think they're vobsub.
[01:09] <tnelsond> They're definitely raster and not text.
[01:09] <pzich> is it just a video stream?
[01:09] <iive> yes they are raster. and they do have alpha,
[01:10] <iive> have you tried just -vf subtitles ? I have no idea if that would work with vobsub...
[01:10] <tnelsond> They have alpha when played on a dvd screen.
[01:12] <pzich> can you -filter_complex '[mov][subs]overlay[out1]'?
[01:13] <tnelsond> pzich: When I do that the subtitles show overtop very very dimly. But if I output just the subtitles they're easy to see.
[01:15] <tnelsond> So basically I have a .vob file that has a video stream and a    Stream #0:5[0x20]: Subtitle: dvd_subtitle
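For reference, the working form of that filtergraph as a single command looks roughly like the following; the file name, stream indices and CRF value are assumptions, and the setsar on the subtitle branch is only needed when a filter (such as blend above) rejects mismatched SAR:

    ffmpeg -i input.vob -filter_complex \
      "[0:s:0]setsar=8/9[subs];[0:v:0][subs]overlay[v]" \
      -map "[v]" -map 0:a:0 -c:v libx264 -crf 20 -c:a copy hardsub.mkv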
[01:19] <iive> well... google search says to use overlay, so i guess you are probably doing the right thing.
[01:20] <iive> i have no idea what might be wrong... maybe the alpha channel is wrong and letters are also transparent, when they should not be.
[01:20] <pzich> that's what I was thinking too
[01:22] <iive> colors are usually held in a *.idx or *.ifo file...
[01:25] <tnelsond> What program does one use to read a .ifo?
[02:49] <k_sze> Good day.
[02:49] <k_sze> Are intra-frame-only codecs generally less efficient than inter-frame codecs in terms of compression ratio?
[02:52] <llogan> impatient
[03:00] <k_sze> grrr, this wifi is pissing me off. Disconnects so easily.
[03:07] <llogan> k_sze: yes
[03:09] <k_sze> And I'm not sure if I'm doing something wrong, but I tried encoding hevc with -preset ultrafast and even that can't keep up with 30 fps real-time speed.
[03:09] <k_sze> Whereas libx264 with -preset superfast or even veryfast can.
[03:11] <c_14> I'd wager that's hevc just being Hugely Efficient at using Valuable Cpu-time.
[03:12] <c_14> I've never tested with those presets, but x265 is several orders of magnitude slower than x264.
[03:12] <c_14> For now, anyway.
[03:14] <klaxa> ha, hevc real-time in software, dream on
[03:21] <k_sze> And what's the difference between yuv422p and yuvj422p?
[03:23] <k_sze> yuvj has a full dynamic range whereas yuv is limited to 16-235, is that correct?
[03:42] <k_sze> oh god... turns out there are so many YUV variants.
[03:42] <k_sze> yuva, yuv, yuvj, and then yuva444p16le, etc.
[03:42] <k_sze> How do I make sense of all of them?
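For what it's worth, the names decompose fairly mechanically: "yuv", then an optional "a" for an alpha plane, an optional "j" for the deprecated full-range ("JPEG") variants as opposed to the limited 16-235 range, the chroma subsampling (444, 422, 420, ...), "p" for planar, and finally the bit depth plus endianness; so yuva444p16le is planar 4:4:4 YUV with alpha, 16 bits per component, little-endian. The full list is printed by:

    ffmpeg -pix_fmts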
[04:05] <Nosomy|off> k_sze
[07:54] <bencc> I'm trying to concatenate two mp4 files but when reaching the second part the video doesn't play right
[07:54] <bencc> do I need to make sure that the parameters of the two parts match?
[07:54] <bencc> the resolution and frame rate are the same but the bitrate doesn't match
[08:00] <bencc> now I'm using the same preset and the second part plays but the audio jumps between the videos
[08:03] <bencc> the audio bitrate and sample rate are different
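One way to make the parts match, sketched here with placeholder file names, is to re-encode both to identical video and audio settings and then join them with the concat demuxer:

    ffmpeg -i part1.mp4 -c:v libx264 -preset veryfast -r 30 -c:a aac -strict -2 -ar 44100 -ac 2 fixed1.mp4
    ffmpeg -i part2.mp4 -c:v libx264 -preset veryfast -r 30 -c:a aac -strict -2 -ar 44100 -ac 2 fixed2.mp4
    printf "file 'fixed1.mp4'\nfile 'fixed2.mp4'\n" > list.txt
    ffmpeg -f concat -i list.txt -c copy joined.mp4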
[09:30] <K4T> is it possible to calculate the average video bitrate when I use only the crf parameter in the transcoding process?
[09:31] <K4T> I have to document the stream parameters and I don't know what to type in the field called "Video bitrate: " because I used "-crf 23"
[10:02] <pzich> K4T: you could specify a bitrate, or read it off the file after
[10:06] <BtbN> If you want to stream video, and it needs a fixed bitrate, crf based encoding is not what you want to do.
[10:07] <BtbN> It doesn't have a fixed bitrate. It's fixed quality, and the bitrate is based on how complex it is to achieve that quality.
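If the encode is already done, the resulting average can be read back with ffprobe (the output name here is a placeholder); when the per-stream value reports N/A, the container-level bitrate or size/duration still gives a usable figure:

    ffprobe -v error -select_streams v:0 -show_entries stream=bit_rate -of default=noprint_wrappers=1 output.mp4
    ffprobe -v error -show_entries format=bit_rate,duration,size -of default=noprint_wrappers=1 output.mp4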
[10:16] <a-amason> hey all, got a video that didn't finish recording properly and I'm trying to extract some frames from it but I'm not having much luck... I'm trying the commands and getting the output in this paste >> http://pastebin.com/SjQ5Vkzh << any ideas, or am I just SOL?
[10:36] <BtbN> a-amason, mp4 writes the header at the end.
[10:36] <BtbN> So if recording is aborted, the file is basically lost
[10:37] <BtbN> It's a horrible container
[10:37] <BtbN> don't use it if you can
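A hedged sketch of how a recording can be made to survive an aborted write: fragmented MP4 writes its index in small pieces instead of one moov atom at the end, and recording to MKV (or MPEG-TS) avoids the problem entirely; the RTSP URL is a placeholder:

    ffmpeg -i rtsp://camera/stream -c copy -movflags +frag_keyframe+empty_moov recording.mp4
    # or simply record into a more robust container:
    ffmpeg -i rtsp://camera/stream -c copy recording.mkv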
[10:38] <bencc> how can I create a white background video?
[10:38] <bencc> this gives me a transparent video: nullsrc=size=1280x720
[10:40] <BtbN> -f lavfi -i "color=color=white:size=1280x720"
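Expanded into a complete command, with the duration, frame rate, codec and output name chosen as examples:

    ffmpeg -f lavfi -i "color=color=white:size=1280x720:rate=25" -t 10 -c:v libx264 -pix_fmt yuv420p white.mp4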
[10:51] <Elirips> Hello. Are there also static builds for ffserver available ?
[11:29] <bencc> where can I find documentation about -itsoffset?
[11:29] <bencc> I need to sync audio and video
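-itsoffset is described in the ffmpeg man page among the main options: it shifts the timestamps of the input that follows it. A minimal sketch for delaying the audio by half a second, with the file names and the 0.5 value as assumptions:

    ffmpeg -i video.mp4 -itsoffset 0.5 -i audio.wav -map 0:v -map 1:a -c copy synced.mkv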
[11:44] <a-amason> BtbN, unfortunately it seems to be all that comes out of my Motorola Defy's camera :/
[11:45] <a-amason> and it's running CM9 so it must be woven into Android
[11:51] <pierre_> hi, can i write aac stream, frames with adts header in mp4 container, will the file be playable ?
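For what it's worth: MP4 stores raw AAC with an AudioSpecificConfig rather than ADTS frames, so the ADTS headers have to be stripped on the way in; ffmpeg has a bitstream filter for this (the input name is a placeholder, and recent builds insert the filter automatically when remuxing):

    ffmpeg -i input.aac -c:a copy -bsf:a aac_adtstoasc output.mp4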
[12:33] <a-amason> BtbN, ok, so after  some Googling, I was able to get some video data, but it was really crap quality... guess I'll have to call it a miss.  Shame, I managed to get some nasty bits of driving in that lost footage :(
[12:39] <pierre_> hi, where can i send audio file which possibly corrupts ffmpeg aac decoder ?
[12:41] <b_jonas> pierre_: read the instructions here: http://ffmpeg.org/bugreports.html
[12:43] <pierre_> thanks
[13:30] <Elirips> A very basic question: I have an ffserver running which serves an swf stream. To feed ffserver I do it as simply as 'ffmpeg -i rtsp://..my-ip-cam http://ffserver:port'. Do I already need to tell ffmpeg to convert stuff to flash, or will ffserver do this for me?
[13:31] <Elirips> And can I also use ffserver to stream a hls? Currently I use ffmpeg to create the segments and serve them using apache
[13:38] <Elirips> Or asked the other way round: Can ffserver also serve static files from a 'Webroot' and send the correct mime-types for m3u8 and ts files?
[13:39] <c_14> As long as your ffserver.conf is correct you don't need to tell ffmpeg anything. I don't know. No.
[13:39] <BtbN> ffserver is basically deprecated
[13:39] <BtbN> Don't use it in new setups
[13:40] <Elirips> BtbN: hm, what are the alternatives to ffserver for streaming flash that work as nicely and easily as the combination of ffserver and ffmpeg?
[13:41] <BtbN> nginx with rtmp plugin for example
[13:42] <BtbN> also, what do you mean by "stream flash"?
[13:42] <BtbN> a hls stream? Plain flv files?
[13:42] <BtbN> or rtmp?
[13:47] <Elirips> BtbN: Currently I have the following setup: One ffmpeg reading the rtsp stream and converting it to hls. The segments get served by apache. Another ffmpeg instance reads the rtsp stream and forwards it to ffserver, which will serve it as 'Format swf', which is then displayed on a webpage using the flash plugin
[13:47] <Elirips> Like this I can serve the same stream to iOS / Android devices using the fancy html 5 video-tag, and for the dumb IEs I still have the Flash alternative
[13:48] <BtbN> Leave the hls stuff to nginx-rtmp
[13:48] <Elirips> and for mozilla/chrome I will set up a third ffmpeg to convert to ogg which can be handled by firefox / chrome native
[13:53] <Elirips> BtbN: Why? Don't get me wrong, I'm just wondering what the advantage of nginx-rtmp over ffmpeg+apache is.
[13:53] <BtbN> apache can handle rtmp streaming?
[13:54] <Elirips> BtbN: no, apache handles hls
[13:54] <BtbN> Nope, it doesn't.
[13:54] <BtbN> nginx-rtmp does.
[13:54] <Elirips> BtbN: So, I would have only one server (nginx-rtmp) that can handle hls and rtmp?
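The usual division of labour is that ffmpeg pushes one RTMP stream per camera and nginx-rtmp re-serves it as both RTMP and HLS; a sketch, assuming the camera already delivers H.264 (and AAC or no audio), with the host and stream names as placeholders:

    ffmpeg -rtsp_transport tcp -i rtsp://camera/stream -c copy -f flv rtmp://nginx-host/live/cam1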
[14:06] <Elirips> BtbN: Indeed nginx looks interesting. Do you know if nginx-rtmp can also serve RTMPT (RTMP over HTTP)?
[14:06] <Elirips> this is a must have for us, we can only use the http port, everything else is not reliable
[14:07] <Elirips> due to wicked firewalls, vlans, etc.
[14:07] <BtbN> You can configure whatever port you like, I'd guess
[14:08] <BtbN> you most likely won't be able to serve hls from the same ip then
[14:12] <Elirips> BtbN: that would be one option - just add two network interfaces. But that means I would just change the RTMP port to 80, and *not* change the protocol to RTMPT (which is an extension of RTMP to tunnel over HTTP)?
[14:13] <BtbN> no idea?
[14:13] <bencc> is it possible to balance two speakers in a video, one speaking louder than the other?
[14:14] <BtbN> Never heard of issues with the standard rtmp port
[14:14] <BtbN> And if there are, hls will work
[14:15] <Elirips> BtbN: yes, but hls will not work in IE
[14:15] <Elirips> stupid IE as always
[14:16] <BtbN> hls will also not work in firefox
[14:16] <BtbN> even chrome doesn't support it without JavaScript magic
[14:17] <Elirips> yes, but there, streaming an ogg-vorbis stream over http works out of the box
[14:17] <Elirips> I know, its a pain
[14:17] <Elirips> we need a working solution to stream to all browsers using html5 video-tag, and as fallback using a flash-plugin
[14:17] <BtbN> Just use something like video.js and stop caring
[14:18] <Elirips> video.js is not usable for live-streams coming in via rtsp... I still need a server to convert the live-stream
[14:18] <BtbN> you need that anyway
[14:19] <BtbN> if video.js doesn't support livestreams, jwplayer does
[14:19] <Elirips> and even with video.js I have seen no working solution that would display a live-stream in IE, as IE is simply not capable
[14:19] <Elirips> not for IE imho
[14:19] <BtbN> It will just use flash there
[14:21] <Elirips> And of course, everything runs in a WAN, with no access to the internet
[14:23] <BtbN> Well, you won't manage to magically support everything
[14:23] <Elirips> BtbN: anyway, thx for all the inputs :)
[14:23] <BtbN> Firefox alone causes trouble because of its lack of MSE support
[14:23] <BtbN> And if you want to support IE, especially stuff older than IE11, that probably just won't work
[14:24] <BtbN> video.js or jwplayer with hls only should work absolutely everywhere
[14:24] <Elirips> yes, but we will get a working solution by converting the streams to a) ogg-vorbis b) hls c) flash and serve them all using http. I basically have all that, I just need to "put it together"
[14:24] <BtbN> as it will just fallback to flash
[14:24] <Elirips> no need for flash if you have ogg-vorbis
[14:25] <BtbN> you want to stream only audio?
[14:25] <Elirips> only video
[14:25] <BtbN> vorbis is an audio codec
[14:26] <Elirips> err, theora
[14:27] <BtbN> will probably cause way more trouble than just leaving it all to jwplayer
[14:27] <Elirips> it would also be okay if firefox uses flash
[14:27] <Elirips> unfortunately, almost no one is using firefox in this environment
[14:28] <Elirips> the admins actually hate it, because they say they can't fine-tune it like IE
[14:30] <Elirips> firefox is more or less my personal goal
[14:33] <BtbN> firefox is a major pain. It's the only usable browser, and they have been lacking MSE support for a full 2 years now
[14:36] <JEEBsv> MSE is pretty well working in the nightlies now
[14:36] <JEEBsv> so they are getting somewhere
[14:36] <Elirips> IE is also a major pain, but in general
[14:36] <JEEBsv> (I used to actually disable MSE after they enabled it by default in the nightlies, but now I could re-enable it)
[14:37] <JEEBsv> still has some issues, but much better than not having it at all
[14:37] <JEEBsv> no idea at which stable version it will be enabled
[14:38] <Elirips> the whole html5 video stuff is actually a pain
[14:39] <Elirips> nice idea, but as long as mozilla/ms/apple/<and all the others> can't agree on a minimal common denominator... it's just annoying
[14:39] <Elirips> but still I need it
[14:58] <saste> Elirips, what are you trying to do?
[15:05] <Elirips> saste: I need to stream incoming *live* rtsp-streams to a webpage. Whenever possible using plain html5 video tag, no activeX, no plugins. Except Flash is allowed as fallback. Everything must run on port 80. It must work on android, i-stuff, and normal browsers like IE, chrome, ff, safari
[15:05] <Elirips> for the i-devices I need to convert to hls
[15:05] <Elirips> that works fine
[15:06] <Elirips> using ffmpeg and apache
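That leg presumably looks something like the following, with the paths, segment length and encoding settings as assumptions; Apache then only has to serve the .m3u8 and .ts files with the right MIME types:

    ffmpeg -rtsp_transport tcp -i rtsp://camera/stream -c:v libx264 -preset veryfast -g 50 \
      -c:a aac -strict -2 -f hls -hls_time 4 -hls_list_size 6 /var/www/html/live/cam1.m3u8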
[15:06] <saste> Elirips, webrtc?
[15:06] <Elirips> maybe, but this is just the frontend
[15:06] <saste> alternatively Media Source Extensions, but as far as I've tried it, it tends to be very buggy or incomplete
[15:07] <Elirips> the hard stuff is to set up all the servers and converters
[15:07] <Elirips> and at the end, it will be interesting to monitor the hardware resources required, as there will be ~70 cameras
[15:07] <saste> Elirips, is it possible to directly stream with RTSP to a web client?
[15:08] <Elirips> saste: what would be the benefit of that? no browser supports rtsp
[15:08] <Elirips> nor the i-devices, nor android
[15:08] <saste> Elirips, indeed, it's what I supposed
[15:08] <Elirips> no, the only way is to use HLS for the i-devices and android, and flash for the dumb IE
[15:09] <saste> and webrtc is a mess to setup server side (and it's not really supported yet on many platforms)
[15:09] <Elirips> so atm I have apache + ffmpeg for the HLS and ffserver + ffmpeg for flash
[15:09] <Elirips> and now I'm looking into nginx
[15:09] <Elirips> as suggested by BtbN
[15:10] <saste> Elirips, about ffserver, do you really need it? couldn't you use just ffmpeg?
[15:11] <Elirips> saste: I need ffserver only to serve the flash-stuff
[15:12] <Elirips> of course I would prefer it if ffmpeg could take the incoming rtsp-stream, convert it to flash, and serve that using apache
[15:12] <saste> Elirips, ffmpeg can work as rtsp receiver and rtmp client
[15:12] <Elirips> but I need an rtmp server
[15:12] <Elirips> and I would prefer an rtmpT protocol
[15:13] <Elirips> rtmpT = RTMP tunneled over http
[15:13] <Elirips> I guess with everything else we will run into firewall issues sooner or later
[15:14] <saste> and do you really need ffserver for that? ffmpeg can do -i rtsp:///... rtmpt:///...
[15:14] <saste> the only reason to use ffserver is if you need to have a server listening for connections and serving streams when a random client connects
[15:15] <Elirips> saste: So ffmpeg can act as a server??
[15:16] <Elirips> I need a server, as I have a website, and if a client connects to that website it starts loading the stream
[15:17] <BtbN> nginx-rtmp is the server
[15:17] <saste> Elirips, yes in that case you may need ffserver
[15:17] <BtbN> ffmpeg sends the stream there via rtmp
[16:00] <fmax30> Hey it seems that ffmpeg isn't preserving the metadata for the video that is being compressed.
[16:01] <fmax30> I am using the following parameters: ffmpeg -y -i input_file -strict -2 -b:v 700k -s 640x360 -r 30 -vcodec libx264 -acodec aac -ac 1 -b:a 64k -ar 44100 -preset ultrafast -crf 24 output_file
[16:04] <BtbN> That's going to look horrible. But just check what it says in the output.
[16:04] <BtbN> metadata mapping mostly depends on the input and output containers.
[16:20] <c_14> fmax30: try adding -map_metadata 0
[16:20] <fmax30> @Btbn : Rotation information is present in the input files but not present in the output file.
[16:21] <BtbN> rotation information doesn't sound like it's normal metadata
[16:21] <fmax30> @c_14 : what does -map_metadata do ?
[16:22] <c_14> It maps metadata
[16:22] <BtbN> You still haven't revealed what container formats you are using.
[16:22] <BtbN> Possible that your target format just doesn't support that kind of metadata
[16:23] <fmax30> as far as I know, x264 in an mp4 container should have at least a transformation/affine matrix or something like that
[16:33] <Mavrik> fmax30, ffmpeg's mp4 muxer doesn't support matrix transform... or at least didn't when I checked the last time as of 2.2
[16:33] <Mavrik> so it doesn't write the rotation metadata
[16:34] <Mavrik> fmax30, even so, looking at ffmpeg 2.5 there is support in there now
[16:34] <Mavrik> so try updating? :)
[16:40] <bencc> I did a screen capture with "-acodec pcm_s16le"; when extracting the audio from the file, what extension should I use?
[16:41] <bencc> I want to open it with audacity
[16:41] <kepstin-laptop> probably wav then
[16:42] <bencc> kepstin-laptop: thanks
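A minimal sketch, assuming the capture went into something like an MKV (the input name is a placeholder); since the audio is already PCM it can be copied straight into a .wav that Audacity will open:

    ffmpeg -i capture.mkv -vn -c:a copy audio.wav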
[16:49] <fmax30> @Mavrik: I am using the latest ffmpeg 2.5. Is there a param for preserving it or something?
[16:49] Action: fmax30 looks through the ffmpeg man page
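A hedged sketch of the two knobs mentioned above: -map_metadata for the global tags, plus a per-stream rotate tag, which builds with display-matrix support turn into the mp4 rotation; the rotation value and file names are assumptions:

    ffmpeg -i input_file.mp4 -map_metadata 0 -metadata:s:v:0 rotate=90 \
      -c:v libx264 -crf 24 -c:a aac -strict -2 -b:a 64k output_file.mp4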
[19:02] <kaotiko> hi
[19:50] <justinX> hello kaotiko
[20:14] <PaperWings> Anyone work with Kurento? Know anything about recording in .mp4?  My attempt is pretty much recording at like 1 fps
[20:15] <justinX> hehe 1 frame/s would be a slideshow not a video :-)
[20:17] <kepstin-laptop> PaperWings: kurento's based on gstreamer, not directly on ffmpeg.
[21:19] <bencc> when replacing an audio track, should I expect it to finish very fast or can it take several minutes to process?
[21:19] <bencc> ffmpeg -i video.mkv -i audio.wav -map 0:v -map 0:a:0 -map 1:a -c:v copy -c:a copy output.mkv
[21:23] <pzich> it seems like it should be relatively fast with copy
[21:23] <bencc> a 1-hour video takes about 1 minute
[21:25] <pzich> seems pretty good
[21:25] <pzich> how big are the input files?
[21:27] <bencc> 5GB
[22:19] <anshul_mahe> how do I specify the codec / how do I give ffmpeg an input file whose extension has no meaning, so things need to be explicitly specified?
[22:21] <c_14> ffmpeg -c:v magic -i weirdfile
[22:21] <c_14> Though, the extension shouldn't matter.
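More commonly the knob for a meaningless extension is -f before -i, which forces the demuxer; for truly raw data the stream parameters have to be spelled out as well (file names and parameters here are assumptions):

    ffmpeg -f mpegts -i weirdfile -c copy out.mkv
    ffmpeg -f rawvideo -pixel_format yuv420p -video_size 1280x720 -framerate 25 -i weirdfile out.mp4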
[22:40] <sabkaraja> Hi, I have a problem with cropping & exporting videos from hikvision mp4. The audio is lost in the process. I have been trying to find a solution for the last 2-3 weeks.
[22:41] <sabkaraja> Hi, I have a problem with cropping & exporting videos from hikvision DVR videos. The audio is lost in the process. I have been trying to find a solution for the last 2-3 weeks.
[22:44] <sabkaraja> Hi, heres the command and output http://pastie.org/9784956
[22:45] <pzich> your input file does not appear to have an audio stream
[22:56] <llogan> sabkaraja: this is a known issue: https://trac.ffmpeg.org/ticket/4182
[23:05] <sabkaraja> yea. reported by me
[23:06] <cu2014> hi I was wondering if anyone else has had difficulty installing libopenjpeg with ffmpeg from source?
[23:10] <cu2014> more specifically, when configuring with --enable-libopenjpeg I get the error message "ERROR: libopenjpeg not found", but the header is there.
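One place to look, assuming an in-tree build: configure records which header or link test failed in config.log at the top of the build directory (ffbuild/config.log in newer trees), e.g.:

    grep -n -A 20 libopenjpeg config.log | tail -n 60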
[23:27] <sabkaraja> @llogan @pzich is there a way I can get some developer to fix it for me?
[23:27] <pzich> you'd have to ask them
[23:44] <sabkaraja> thanks
[00:00] --- Wed Dec 17 2014

