[Ffmpeg-devel-irc] ffmpeg.log.20160320
burek
burek021 at gmail.com
Mon Mar 21 02:05:01 CET 2016
[00:50:42 CET] <ShalokShalom> hi there
[00:51:10 CET] <ShalokShalom> is it correct, that Chrome breaks the law, since they ship the browser with ffmpeg on board?
[00:51:30 CET] <J_Darnley> I don't think so.
[00:52:02 CET] <J_Darnley> Which particular section of the law were you thinking of?
[00:52:48 CET] <ShalokShalom> LGPL?
[00:52:58 CET] <J_Darnley> Copyright then.
[00:54:09 CET] <J_Darnley> As long as they let you change the libraries and provide the exact source they used, then they don't have to release their own source
[00:54:48 CET] <ShalokShalom> thanks
[00:55:17 CET] <iive> it's lesser or library gpl
[00:55:28 CET] <ShalokShalom> ok :)
[00:56:20 CET] <ShalokShalom> http://en.swpat.org/wiki/GPLv2_and_patents#Prohibition_on_royalties:_section_7
[00:56:54 CET] <iive> it talks about court there, afair
[00:58:43 CET] <furq> do you think this irc channel has better lawyers than google
[00:59:07 CET] <J_Darnley> I would be surprised if it has any.
[01:01:44 CET] <furq> besides, it's well known that google is based in luxembourg, where there are no software patents
[01:01:51 CET] <furq> or corporation tax
[01:02:00 CET] <furq> but mostly they're there for the nice scenery
[02:15:55 CET] <Hello71> [23:51:10] < ShalokShalom> is it correct, that chrome break the law, since they ship the browser with ffmpeg on board?
[02:16:29 CET] <Hello71> ... never mind.
[05:10:16 CET] <geusebio> I've been banging my head on a wall all day, and ffmpeg argument strings are making my head hurt. Can I borrow one of you nice folk to help me work out why this is failing? http://pastebin.com/Q7xpUzYz It's ffmpeg consuming a video stream from a security camera, which I'm attempting to push to an ffserver instance
[05:15:01 CET] <furq> geusebio: does it give a better error message if you run it with -v debug
[05:16:14 CET] <geusebio> furq: thanks for asking: http://pastebin.com/6BqFVS8R
[05:19:30 CET] <furq> i don't see anything obviously wrong but then i've never encoded mpeg-1
[05:19:55 CET] <furq> [mpeg1video @ 0xfef2e0] MPEG1/2 does not support 3/1 fps
[05:20:04 CET] <furq> that looks like it's being overridden but maybe that's causing some issue with ffserver
[05:20:45 CET] <furq> it's not particularly hard to cause issues with ffserver
[05:25:06 CET] <geusebio> Changed it to 10, same story. Changed it to 24 and it went away... now to see if it's actually streaming anything...
[05:25:55 CET] <geusebio> sweet jesus it works
[05:26:03 CET] <geusebio> it's literally the smallest image in the world, but it works
[05:28:23 CET] <relaxed> geusebio: mpeg1video only supports certain framerates. you can see the supported ones with "ffmpeg -h encoder=mpeg1video | less"
[05:30:39 CET] <relaxed> or use -strict experimental for a non-standard framerate
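For reference, MPEG-1/2 only permits the framerates enumerated in the standard. A small shell sketch of that whitelist; the list below is an assumption based on the MPEG-1 rate table, so verify it against "ffmpeg -h encoder=mpeg1video":

```shell
# Hypothetical helper: check a framerate against the MPEG-1/2 standard
# rates before handing it to the mpeg1video encoder.
is_mpeg1_rate() {
  case "$1" in
    23.976|24|25|29.97|30|50|59.94|60) return 0 ;;
    *) return 1 ;;
  esac
}

is_mpeg1_rate 24 && echo "24 is a legal MPEG-1 rate"
is_mpeg1_rate 10 || echo "10 is not (use -strict experimental to force it)"
```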
[05:33:45 CET] <geusebio> Mm, it was just to try and get it going. What's recommended for streaming video from, say, a security camera?
[05:34:45 CET] <relaxed> I would copy the video stream
[05:37:00 CET] <geusebio> I couldn't get that working.
[05:38:20 CET] <Prelude2004c> hey guys.. good evening
[05:38:47 CET] <Prelude2004c> i have a problem maybe someone can assist me with .. http://pastebin.com/DT0zwcfj
[05:39:00 CET] <Prelude2004c> options are : NVENC_OPTS="-preset hq -numB 1 -goplength 250 -rcmode 32 -qp 23"
[05:39:35 CET] <Prelude2004c> so all works well on the output with vlc if i leave the encoding source as 59fps .. but if i set it to 30fps, it plays for the first few seconds and then vlc starts to freeze and lose video frames and starts to cut out and die
[05:39:39 CET] <Prelude2004c> not sure why.. does not make any sense
[05:41:04 CET] <furq> geusebio: https://github.com/arut/nginx-rtmp-module
[05:41:11 CET] <furq> that should be able to directly restream the input video
[05:41:18 CET] <furq> you'll need to transcode the audio to aac
[05:41:54 CET] <furq> https://gist.github.com/fur-q/d7028f51c38f7d0bb56e
[05:41:58 CET] <furq> that's pretty much the config i use
[05:42:33 CET] <relaxed> Prelude2004c: is there a reason you're not using ffmpeg with nvenc support?
[05:42:42 CET] <geusebio> I don't really want to run another project's stuff - I've got other tasks for this system to perform
[05:42:51 CET] <Prelude2004c> yes was having some major issues with it
[05:42:51 CET] <geusebio> when all i really need is the config :P
[05:42:55 CET] <furq> i would advise against using ffserver really
[05:43:05 CET] <geusebio> What would you suggest?
[05:43:05 CET] <furq> it's never been particularly reliable and it's pretty much abandoned now
[05:43:08 CET] <Prelude2004c> could not decode the video and audio.. stream issues.. NvTranscoder doesn't complain about it
[05:43:13 CET] <furq> i would suggest the thing i just suggested
[05:43:40 CET] <relaxed> geusebio: yes, avoid ffserver
[05:43:59 CET] <geusebio> :/
[05:44:24 CET] <relaxed> FFmpeg should have purged it long ago
[05:44:32 CET] <geusebio> So how do I even use that?
[05:44:46 CET] <geusebio> download nginx source, add a module, compile it?
[05:44:54 CET] <furq> pretty much
[05:45:02 CET] <geusebio> :|
[05:45:03 CET] <furq> there's a sample ./configure string at the bottom of the gist i linked
[05:45:50 CET] <geusebio> Nobody has a docker container that does this knocking about, do they?
[05:48:41 CET] <geusebio> Does anyone have a tutorial or something I can follow for getting this going?
[05:49:10 CET] <furq> that gist should have everything you need to set up nginx
[05:49:50 CET] <furq> you can ignore the auth.lua bits if you don't need authentication
[05:49:57 CET] <furq> then: ffmpeg -i rtsp://rtspurl -c:v copy -c:a aac rtmp://rtmpurl/live/streamname
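For reference, the nginx side of this is small. A hedged sketch of a minimal nginx-rtmp application block (the linked gist is the authoritative version; the allow/deny publish rules here are placeholders):

```nginx
rtmp {
    server {
        listen 1935;
        application live {
            live on;
            # only let localhost publish; anyone may play
            allow publish 127.0.0.1;
            deny publish all;
        }
    }
}
```

With this, the ffmpeg command above publishes to rtmp://server:1935/live/streamname.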
[05:50:28 CET] <Prelude2004c> can anyone help ?
[05:51:16 CET] <relaxed> Prelude2004c: we might be able to help with ffmpeg's nvenc issues
[05:52:11 CET] <Prelude2004c> ya i know, but i have issues with that thing :( .. and it's working well (as long as i keep the 60fps) with video .. why would setting -fps 30 start encoding ok, but after a few seconds the video starts to get choppy and messed up and the audio starts to cut out
[05:52:16 CET] <Prelude2004c> any suggestion there.. am i missing something ?
[05:53:06 CET] Action: geusebio sits and watches a docker container download build-essential, dies a bit on the inside.
[05:54:16 CET] <relaxed> no idea, we don't support NvTranscoder here
[05:55:20 CET] <ParkerR> Anybody know of a way to stream from the HD PVR directly to twitch? I've tried this but it stalls after about ten seconds. http://ix.io/tZA
[05:55:20 CET] <ParkerR> /dev/video0 is a straight H.264 and AAC stream
[05:56:06 CET] <relaxed> ParkerR: did you see https://trac.ffmpeg.org/wiki/StreamingGuide ?
[05:56:18 CET] <ParkerR> Yeah that's what I started with
[06:02:08 CET] <relaxed> ParkerR: try encoding the stream with -vcodec libx264 -tune zerolatency
[06:04:55 CET] <ParkerR> ffmpeg -y -f mpegts -i /dev/video0 -c:v libx264 -tune zerolatency -ar 44100 -f flv "rtmp://live.twitch.tv/app/live_blah" still stalls
[06:07:35 CET] <geusebio> furq: I cloned nginx, there is no configure in the root of it >>>
[06:08:14 CET] <furq> i guess you need to run autogen.sh or autoreconf then
[06:08:26 CET] <furq> it's pretty standard for configure to only be generated for releases
[06:10:41 CET] <geusebio> This is why I was askin' for a tutorial :p
[06:15:33 CET] <relaxed> ParkerR: try copying the stream again with -re before the input
[06:18:21 CET] <ParkerR> ffmpeg -y -re -f mpegts -i /dev/video0 -c:v libx264 -tune zerolatency -ar 44100 -f flv "rtmp://live.twitch.tv/app/live_blah"
[06:18:27 CET] <ParkerR> [mpegts @ 0x561556475d40] Could not detect TS packet size, defaulting to non-FEC/DVHS
[06:18:37 CET] <ParkerR> /dev/video0: could not find codec parameters
[06:19:59 CET] <ParkerR> Power cycled the HD PVR and it acted like it was working but just quit out
[06:20:19 CET] <ParkerR> With "rtmp://live.twitch.tv/app/live_blah: Input/output error
[06:20:19 CET] <ParkerR> "
[06:20:34 CET] <relaxed> -c:v copy
[06:21:23 CET] <geusebio> ok, to hell with this for tonight. I wasn't expecting to have to do crazy things like compile nginx tonight. Night! Cheers for the help though!
[06:24:31 CET] <ParkerR> Stops for about 10 seconds right after listing the stream properties, device lights up, but then ends in that input/output error
[06:25:21 CET] <furq> can you save the input to a file
[06:25:33 CET] <furq> and/or can you stream a file to that twitch url
[06:25:45 CET] <furq> or something like -f lavfi -i smptebars
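Something like the following would exercise the RTMP leg with a generated source instead of the capture device. A sketch only; the stream URL placeholder is the one from the log, and the encoder settings are assumptions:

```shell
# Stream SMPTE bars plus a test tone instead of /dev/video0, to check
# whether the twitch/RTMP side works independently of the HD PVR.
ffmpeg -re -f lavfi -i smptebars=size=1280x720:rate=30 \
       -f lavfi -i sine=frequency=440 \
       -c:v libx264 -preset veryfast -tune zerolatency -pix_fmt yuv420p \
       -c:a aac -ar 44100 \
       -f flv "rtmp://live.twitch.tv/app/live_blah"
```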
[06:31:48 CET] <[mbm]> ParkerR: have you tried running wireshark to see if the remote end is somehow killing the connection?
[06:32:05 CET] <ParkerR> [mbm], Umm
[06:32:07 CET] <ParkerR> No
[06:32:12 CET] <ParkerR> And hey
[07:15:53 CET] <satinder___> hi, how can we draw text on a video?
[07:16:01 CET] <satinder___> please, can anyone help me?
[08:11:36 CET] <mrec> hi, is anyone familiar with the RTP protocol? what if the client suddenly disappears, is there any mechanism available to stop flooding the network with UDP data?
[08:12:03 CET] <mrec> I had a look at some servers, and they keep sending the video data until they receive a proper shutdown request
[08:14:58 CET] <TD-Linux> mrec, yes, RTCP
[08:19:49 CET] <mrec> I will have a closer look at that, so it seems like the server which I had a look at doesn't fully support the RTP protocol
[11:42:37 CET] <bencc1> is there a tool to overlay text in a video that *moves* with the camera?
[11:43:14 CET] <bencc1> for example, if there is a bus, place an ad on the side of the bus that looks real
[11:59:39 CET] <andrey_utkin> mrec, another option would be to restrict support of RTSP transports to TCP only. Thus you have RTSP signaling and RTP data interleaved in a single TCP conn. When it breaks, you're done.
[12:00:22 CET] <andrey_utkin> mrec, RTCP data may be "safely ignored" and things "still work", this may be the case with your server :)
[13:04:01 CET] <bencc1> how can I drawtext for only seconds 5-10 in the video?
[13:04:03 CET] <bencc1> ffmpeg -i in.mp4 -vf drawtext="text='my test':fontsize=50:fontcolor=white:x=100:y=100:n=250" -strict -2 out.mp4
[13:05:48 CET] <J_Darnley> I think there's an enabled option which takes an expression
[13:06:35 CET] <J_Darnley> Use something like: gt(t,5)*lt(t,10)
[13:06:56 CET] <DHE> Using the "moving text" method, you could set 'y' to a formula that evaluates to 100 for the desired window and something way off screen otherwise
[13:07:26 CET] <bencc1> J_Darnley: thanks. trying
[13:07:57 CET] <bencc1> DHE: by "moving text" do you mean that the x and y will be an expression of the time t?
[13:08:06 CET] <DHE> yes
[13:08:48 CET] <DHE> I skimmed the docs and didn't see an 'enabled' variable. but I've done scrolling text in the past, so my method (with J_Darnley's equation as a starting point) should work
[13:09:35 CET] <bencc1> this works: drawtext="enable='between(t,0,10)
[13:10:27 CET] <J_Darnley> http://ffmpeg.org/ffmpeg-filters.html#Timeline-editing
[13:11:09 CET] <J_Darnley> Oh, there's a between now
[13:11:22 CET] <bencc1> nice
[13:11:44 CET] <DHE> interesting that timeline note...
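Put together, the command from the question above becomes something like this sketch (testsrc input so it runs without a file; on systems without fontconfig a fontfile= option would also be needed):

```shell
# Draw the text only between seconds 5 and 10 via timeline editing
# (the 'enable' option from the docs linked above).
ffmpeg -f lavfi -i testsrc=duration=15:rate=25 \
       -vf "drawtext=text='my test':fontsize=50:fontcolor=white:x=100:y=100:enable='between(t,5,10)'" \
       -y out.mp4
```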
[13:18:30 CET] <bencc1> can I replace green pixels with a box of text?
[13:21:25 CET] <DrSlony> Hello, which audio codec is recommended for general mp3 player support?
[13:21:55 CET] <DHE> ummm... MP3?
[13:21:58 CET] <relaxed> libmp3lame
[13:22:07 CET] <DrSlony> thanks relaxed
[13:22:10 CET] <J_Darnley> There is only one option
[13:23:05 CET] <DHE> while I'd love to say something like Vorbis or AAC, you can't be sure they're supported on older devices (ie. before android phones)
[13:25:12 CET] <DrSlony> i actually decided to go with vorbis because i know rockbox supports it
[13:25:39 CET] <DrSlony> it's amazing how many broken mp3s there are floating about
[15:31:23 CET] <andrey_utkin> how could I write a wav file with a RIFF header, instead of RIFF2k, with ffmpeg?
[15:33:19 CET] <andrey_utkin> speech synthesis software refuses to work with wav file ffmpeg creates by default
[15:34:03 CET] <andrey_utkin> because it expects "data" tag, and instead there's LIST
[15:40:18 CET] <bencc1> how can I use fontcolor_expr to fade-in text?
[15:40:27 CET] <bencc1> I'm displaying text with drawtext
[15:41:27 CET] <bencc1> or maybe use alpha with an expression
[15:42:13 CET] <andrey_utkin> bencc1, 1 sec
[15:44:37 CET] <J_Darnley> http://ffmpeg.org/ffmpeg-filters.html#drawtext-1
[15:44:46 CET] <J_Darnley> the alpha option
[15:45:40 CET] <bencc1> J_Darnley: can I use it to create fade-in and fade-out?
[15:46:01 CET] <bencc1> from frame 10 to 20 increase the alpha from 0 to 1
[15:46:10 CET] <J_Darnley> "Draw the text applying alpha blending"
[15:46:17 CET] <bencc1> from frames 50 to 60 decrease the alpha from 1 to 0
[15:46:19 CET] <J_Darnley> or does alpha mean nothing to you?
[15:46:35 CET] <andrey_utkin> i was doing it kinda so fontcolor_expr=ffffff%{eif\\: max(0\, min(255\, 255*(1*between(t\, 10\, 20) + ((t)/10)*between(t\, 0\, 10) + ((-(t - 20))/10)*between(t\, 20\, 100)) )) \\: x \\: 2 }
[15:46:40 CET] <andrey_utkin> bencc1, --^
[15:47:31 CET] <bencc1> andrey_utkin: what max and min do?
[15:47:47 CET] <andrey_utkin> clamp the value between 0 and 255
[15:48:27 CET] <andrey_utkin> i think now you can use alpha parameter without concatenating that with color code
[15:48:48 CET] <andrey_utkin> maybe at the time i invented this formula, there was no evaluatable alpha parameter
[15:49:08 CET] <bencc1> why do you need between three times?
[15:49:50 CET] <J_Darnley> Ramp up, constant, ramp down?
[15:51:00 CET] <andrey_utkin> bencc1, not really needing between in all three
[15:51:19 CET] <bencc1> andrey_utkin: how can the expression be simplified?
[15:52:24 CET] <andrey_utkin> bencc1, leave color code in fontcolor, move expression to "alpha", dropping multiplication by 255 and "%{eif:" stuff
[15:52:58 CET] <andrey_utkin> further it is ok although you can improve it of course
[15:53:04 CET] <bencc1> andrey_utkin: I still need: ramp up, constant, ramp down
[15:53:16 CET] <bencc1> so don't I need the {eif?
[15:53:40 CET] <bencc1> J_Darnley: very useful. thanks. trying to simplify it
[15:55:01 CET] <andrey_utkin> Reposting my Q: How could I write a wav file with a RIFF header, instead of RIFF2k, with ffmpeg?
[15:55:44 CET] <J_Darnley> no idea other than "edit the source"
[15:56:31 CET] <bencc1> J_Darnley: what this part does? \\: x \\: 2
[15:56:43 CET] <J_Darnley> NFI
[15:56:53 CET] <andrey_utkin> bencc1, part of eif command
[15:56:54 CET] <andrey_utkin> drop it
[15:57:03 CET] <bencc1> ok
[16:00:03 CET] <wodim> hello, how do I seek faster? I'm using -ss/-to and it takes ages.
[16:01:10 CET] <J_Darnley> Let me guess, you are telling the encoder to wait rather than instructing the decoder to seek.
[16:01:39 CET] <wodim> well I don't know
[16:01:47 CET] <wodim> ffmpeg -i file -ss xx -to xx file
[16:01:49 CET] <wodim> what's wrong
[16:01:59 CET] <J_Darnley> exactly what I thought
[16:02:29 CET] <J_Darnley> read the second line of the help
[16:02:49 CET] <wodim> I can't use it before -i
[16:02:49 CET] <J_Darnley> ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...
[16:03:05 CET] <wodim> not -ss but -to
[16:03:18 CET] <wodim> I mean, -ss can be used before -i but -to can't
[16:04:00 CET] <wodim> am I supposed to use -t instead and having to calculate the length by hand?
[16:04:18 CET] <wodim> s/and//
[16:08:29 CET] <wodim> well better than having to wait 5 minutes it is
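The upshot of the above as a sketch (filenames and timestamps are placeholders): put -ss before -i so the demuxer seeks instead of decoding and discarding everything up to the cut point, and since -to is only an output option here, convert the end time into a -t duration:

```shell
# HH:MM:SS -> seconds (one leading zero stripped so 08/09 don't parse as octal)
to_seconds() {
  set -- $(echo "$1" | tr ':' ' ')
  echo $(( ${1#0} * 3600 + ${2#0} * 60 + ${3#0} ))
}

start=00:05:00
end=00:07:30
dur=$(( $(to_seconds "$end") - $(to_seconds "$start") ))

# input seek (fast), then cut $dur seconds
echo "ffmpeg -ss $start -i file.mkv -t $dur out.mkv"
```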
[16:15:24 CET] <bencc1> I'm trying this:
[16:15:28 CET] <bencc1> -vf drawtext="enable='between(n,0,5):alpha=n*0.2:text='test':fontsize=110:fontcolor=white:x=100:y=100"
[16:15:45 CET] <bencc1> but the alpha changes instantly, instead of gradually
[16:18:01 CET] <andrey_utkin> bencc1, too fast fadein?
[16:18:13 CET] <andrey_utkin> what about using t instead of n also
[16:18:16 CET] <J_Darnley> And what does the help say about the range of alpha?
[16:18:44 CET] <bencc1> J_Darnley: "Draw the text applying alpha blending. The value can be either a number between 0.0 and 1.0 The expression accepts the same variables x, y do. The default value is 1. Please see fontcolor_expr "
[16:18:57 CET] <J_Darnley> Yes. 5 frames isn't a very long time
[16:19:08 CET] <bencc1> andrey_utkin: tried more frames. still instant. I'll try t
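One way to get a gradual fade, as a hedged sketch: drive alpha from t rather than n, with a clamped min/max ramp. The 10-second window with 2-second fades below is made up for illustration, and the awk helper only evaluates the same curve numerically to show its shape:

```shell
# drawtext alpha expression: fade in over 0-2 s, hold at 1, fade out over 8-10 s
alpha_expr='max(0,min(1,min(t/2,(10-t)/2)))'
echo "-vf \"drawtext=text='test':fontsize=110:fontcolor=white:x=100:y=100:alpha='$alpha_expr'\""

ramp() {  # evaluate the same curve at time t=$1
  LC_ALL=C awk -v t="$1" 'BEGIN {
    a = t / 2; b = (10 - t) / 2;
    m = (a < b) ? a : b;        # min(t/2, (10-t)/2)
    if (m > 1) m = 1;           # clamp to [0, 1]
    if (m < 0) m = 0;
    printf "%.2f\n", m
  }'
}

ramp 1   # halfway through the fade-in
ramp 5   # fully opaque
ramp 9   # halfway through the fade-out
```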
[16:19:46 CET] Action: J_Darnley wonders whether he has a build with drawtext
[16:20:46 CET] <J_Darnley> oh yes, zeranoe's
[16:22:25 CET] <J_Darnley> what the shit is this?
[16:22:41 CET] <J_Darnley> Fontconfig error: Cannot load default config file
[16:22:41 CET] <J_Darnley> [Parsed_drawtext_0 @ 0000000002895f20] impossible to init fontconfig
[16:42:56 CET] <Duality> hi
[16:43:01 CET] <Duality> can i force a speed?
[16:43:21 CET] <J_Darnley> Speed of what exactly?
[16:43:24 CET] <Duality> like force a constant speed; i am running ffmpeg and piping the images into my program
[16:44:29 CET] <J_Darnley> Why? The writing should block and wait if your program is too slow
[16:44:49 CET] <J_Darnley> And similar for reading if ffmpeg is too slow.
[16:44:55 CET] <Duality> well when i run the pipe it shows me this: frame= 20 fps=8.2 q=-0.0 Lsize= 270kB time=00:00:20.00 bitrate= 110.6kbits/s dup=0 drop=542 speed=8.17x
[16:45:00 CET] <Duality> and it runs way too fast
[16:45:24 CET] <J_Darnley> Then stop telling ffmpeg to drop frames!
[16:45:43 CET] <Duality> how ?
[16:45:55 CET] <J_Darnley> Stop giving it an -r option
[16:46:27 CET] <Duality> ha
[16:46:45 CET] <DHE> or maybe you want to put the -r option somewhere else, like on the input rather than the output?
[16:47:03 CET] <Duality> i had used the -r to fix something else, namely that the speed keeps dropping over the length of the video
[16:47:47 CET] <J_Darnley> -r tells ffmpeg to change the framerate of the output video
[16:47:57 CET] <J_Darnley> it has nothing to do with encoding speed.
[16:48:31 CET] <DHE> ffmpeg keeps the video consistent such that the same image is shown at, say, the 1 minute interval regardless of framerate. it duplicates or drops frames to meet that requirement
[16:48:54 CET] <J_Darnley> How about we stop groping around in the dark like blind people trying to fuck and you post your exact command line.
[16:49:48 CET] <Duality> my command is this: ffmpeg -i /home/robert/Videos/bad.mkv -vf "scale=96:48" -f image2pipe -pix_fmt rgb24 -vcodec rawvideo -
[16:50:01 CET] <DHE> use pastebin, and include the output
[16:50:07 CET] <DHE> (see the topic)
[16:50:11 CET] <Duality> ah
[16:50:13 CET] <Duality> sorry :)
[16:51:29 CET] <Duality> http://pastebin.com/KuupQMxp
[16:53:16 CET] <J_Darnley> That is surprisingly slow
[16:53:53 CET] <Duality> yes
[16:53:56 CET] <Duality> :)
[16:54:01 CET] <Duality> I am wondering why
[16:54:29 CET] <DHE> easily explained by the program on the receiving end of stdout reading the data slowly...
[17:09:09 CET] <Duality> it's written in python and does a lot of things though, so uh yeah, maybe i'll have to rewrite it in c/c++ :)
[17:10:45 CET] <J_Darnley> Before rewriting the whole thing, benchmark, profile, and see where the problem is.
[17:34:20 CET] <Duality> good idea will do :)
[17:47:07 CET] <bencc1> can I change the angle when using drawtext?
[17:47:19 CET] <bencc1> so the text will have a little tilt?
[17:48:00 CET] <J_Darnley> At this point you should be using ass so you can do full 3d effects
[17:48:22 CET] <J_Darnley> And no, I don't think you can
[17:48:37 CET] <bencc1> what is ass?
[17:48:54 CET] <J_Darnley> The glorious subtitle format.
[17:49:02 CET] <bencc1> so it's better to draw the text with imagemagick to an image and then place it on the video?
[17:49:22 CET] <J_Darnley> I guess that depends on what you really want to do.
[17:49:33 CET] <bencc1> ass will work?
[17:52:33 CET] <J_Darnley> Since fansubbers use it to produce infinitely varied styles of karaoke and infinitely varied sign translation I'm sure it can render text with a little tilt.
[17:53:37 CET] <J_Darnley> It does require learning new things so if you know how to do what you want with imagemagick then stick with that.
[17:54:10 CET] <bencc1> this is the only docs about ass? https://ffmpeg.org/ffmpeg-filters.html#subtitles-1
[17:54:28 CET] <J_Darnley> About the ffmpeg subtitle filter, yes
[17:54:39 CET] <kepstin> bencc1: if you're doing ass stuff, it would be best to use aegisub to do the text layout etc.
[17:54:40 CET] <J_Darnley> If you want an editor: http://www.aegisub.org/
[17:54:45 CET] <kepstin> it has visual layout editor
[17:54:54 CET] <bencc1> I need it in a script
[17:55:15 CET] <J_Darnley> And an ass file is a plaintext file you can edit with a script
[17:55:53 CET] <kepstin> the aegisub website has docs of the commands you can use for layout in the ass script
[17:56:07 CET] <bencc1> I'll look at it. thanks
[17:57:53 CET] <bencc1> ffmpeg can take ass script and render it into the video?
[17:59:50 CET] <kepstin> bencc1: yes, with the subtitles or ass filters
[18:00:09 CET] <bencc1> nice
[18:00:22 CET] <bencc1> I'll try to create a simple example with tilted text to see if it works
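A minimal sketch of such an .ass script, under the assumption that libass accepts a trimmed field list in the Format lines: \frz rotates the text around the z axis (degrees), \fad fades in/out (milliseconds); values here are made up for illustration.

```
[Script Info]
ScriptType: v4.00+
PlayResX: 1280
PlayResY: 720

[V4+ Styles]
Format: Name, Fontname, Fontsize, PrimaryColour, Alignment
Style: Default,Arial,48,&H00FFFFFF,5

[Events]
Format: Layer, Start, End, Style, Text
Dialogue: 0,0:00:05.00,0:00:10.00,Default,{\frz10\fad(500,500)}my test
```

Burned in with the subtitles filter, e.g. ffmpeg -i in.mp4 -vf "ass=tilt.ass" out.mp4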
[18:33:36 CET] <bencc1> J_Darnley: aegisub is cool. now trying to burn the subtitle
[18:34:47 CET] <JEEB> aegisub is the least retarded open source subtitle editor
[18:34:51 CET] <JEEB> I like it <3
[18:36:28 CET] <bencc1> :)
[18:47:39 CET] <andrey_utkin> bencc1, still there with your issue?
[18:55:22 CET] <bencc1> andrey_utkin: yes
[18:55:39 CET] <bencc1> I'm able to use aegisub to fade-in and fade-out text
[18:55:45 CET] <bencc1> now I'm trying to animate the position
[18:55:47 CET] <andrey_utkin> could you share full minimized example of what you're trying?
[18:55:55 CET] <andrey_utkin> preferrably with -f lavfi -i testsrc
[18:56:00 CET] <andrey_utkin> so that i can try it instantly
[18:56:17 CET] <bencc1> I'm not sure it's possible with ffmpeg without ass
[18:56:28 CET] <bencc1> I'm trying to drawtext with several effects
[18:56:36 CET] <bencc1> fade-in, fade-out (possible)
[18:56:43 CET] <bencc1> movement (possible)
[18:56:51 CET] <bencc1> rotation (not possible?)
[18:56:52 CET] <andrey_utkin> ass should be irrelevant
[18:57:08 CET] <bencc1> can you rotate text with drawtext?
[18:57:41 CET] <andrey_utkin> rotation... maybe if you rotate the background picture with dedicated filter, drawtext, then rotate it back :)
[18:58:01 CET] <c_14> I'm 99% sure you can rotate ass
[18:58:09 CET] <bencc1> ass you can
[18:58:14 CET] <bencc1> but drawtext filter in ffmpeg?
[18:58:27 CET] <andrey_utkin> c_14, you mean ass or ass? :)
[18:59:06 CET] <andrey_utkin> bencc1, not 100% sure, i'd check the doc section of drawtext
[18:59:31 CET] <c_14> I don't think you can rotate drawtext
[19:07:24 CET] <geusebio> wtf, you're all talking about ass
[19:07:42 CET] <geusebio> and being able to rotate ass.
[19:10:44 CET] <furq> how do i throw that ass in a circle
[19:12:10 CET] <bencc1> geusebio: ass is a format for subtitles
[19:13:06 CET] <furq> also i'm pretty sure andrey_utkin already made the same joke
[20:28:39 CET] <geusebio> furq: I just wasn't expecting ass everywhere.
[20:28:50 CET] <geusebio> Also, trying to compile nginx a la last night's discussion
[20:28:53 CET] <geusebio> ./configure: error: no modules/rtmp/config was found
[21:34:48 CET] <courrier__> Hey guys, I shot a video using a video gain of 25dB; on the camcorder it looked great, but now that I'm viewing the files on an HD screen the image is somehow "pixelated". Any idea of a filter that could rectify this?
[21:36:01 CET] <J_Darnley> I understand all of those words but cannot get understanding from this particular order.
[21:36:14 CET] <J_Darnley> Perhaps you can give us a screen shot.
[21:36:31 CET] <BtbN> what is a 25db video gain? oO
[21:39:08 CET] <J_Darnley> Now that I see "video" there I guess it means a little over 16x brightness gain
[21:40:27 CET] <courrier__> Here's a capture: http://www.cjoint.com/data3/FCuuNTCU4JG_Capture.jpeg
[21:40:45 CET] <courrier__> Look at the sofa (top left), this is really ugly
[21:41:06 CET] <J_Darnley> Right
[21:41:14 CET] <J_Darnley> Camera sensor noise
[21:41:30 CET] <courrier__> Not sure exactly what this video gain is; it's the "gain" key of the Sony NEX VG10
[21:42:31 CET] <J_Darnley> A denoiser would blur it out but obviously make it blurrier
[21:43:03 CET] <J_Darnley> you can try hqdn3d
[21:47:06 CET] <courrier__> J_Darnley: mmmmh OK
[21:47:16 CET] <courrier__> any clue about the parameters?
[21:47:20 CET] <courrier__> just tweaking?
[21:47:36 CET] <J_Darnley> start with none
[21:48:44 CET] <c_14> And obviously test on part of your capture first (using -ss and/or -t) to see if you like it before trying it on the whole capture.
[21:53:39 CET] <courrier__> c_14: thanks for the good advice I'll see how hqdn3d and -ss work :)
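Combining both suggestions, a sketch (timestamps and filenames are placeholders; as far as I know hqdn3d's first parameter is the spatial luma strength, default 4):

```shell
# Try the denoiser on a 10-second slice first, then on the full file.
ffmpeg -ss 00:01:00 -i input.mts -t 10 -vf hqdn3d preview.mp4
# variants to compare, e.g.:
#   -vf hqdn3d=2   (milder)
#   -vf hqdn3d=8   (more aggressive, blurrier)
```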
[22:04:31 CET] <courrier__> Yes, that seems better with the filter
[22:32:52 CET] <lx2> Hello 2all. I'm looking for someone with nvenc_h264 usage experience. Looking at the ffmpeg -h full output I see that there's some kind of 2 pass encoding support there, but I can't figure out the correct sequence to use it. With x264 we've got a working -pass option that generates a log file on the 1st pass and updates it on subsequent passes.
[22:32:53 CET] <lx2> But for nvenc it seems that using -pass does nothing: the log file ends up 0 bytes in size and the encoding results produced on the 1st and subsequent passes are byte-for-byte the same. What am I doing wrong? Thanks in advance for clarification.
[22:33:51 CET] <c_14> nvenc's "2-pass" mode is internal to the hardware encoder
[22:34:20 CET] <c_14> It only actually runs 1 pass of the whole file
[22:34:23 CET] <lx2> Ough, so it's "fake" 2 pass and has nothing to do with full blown two pass encoding scheme?
[22:34:28 CET] <c_14> yes
[22:34:38 CET] <lx2> Are there any benefits in using it?
[22:34:47 CET] <BtbN> It makes encoding slower but it looks a bit better
[22:35:01 CET] <BtbN> Nobody knows what it actually does internally.
[22:35:22 CET] <J_Darnley> Black box encoders, aren't they great?
[22:37:10 CET] <J_Darnley> I wonder how much it buffers since it obviously isn't the whole video.
[22:39:18 CET] <BtbN> it does not add any noticeable latency, just reduces the maximum fps
[22:39:31 CET] <lx2> Thanks a lot for your help. Surprisingly enough, I wasn't able to find this info online on the ffmpeg forums or in the mailing list archives. I had even tried to look into the source code, but video encoding stuff is far from my area of expertise so I failed miserably. From my testing, for 720p I see next to no encoding speed difference with 2pass enabled vs disabled. Testing on Win7 x64 with the latest nVIDIA drivers, latest ffmpeg compiled from git head and GeForce G
[22:40:08 CET] <BtbN> With ffmpeg you run into limits way before the actual nvenc encoder is capped
[22:40:21 CET] <BtbN> for 720p it easily encodes 800~1300 FPS
[22:40:32 CET] <lx2> Most probably that's the case.
[22:40:47 CET] <BtbN> Copying around the frames to/from the GPU is way too slow to reach that limit
[22:48:15 CET] <lx2> In my case the source stream is also H.264 720p, and one of the bottlenecks was the decoding speed. Tried different variants with "-f null - -benchmark", and the fastest ones were single-threaded dxva2 or the multi-threaded default CPU decoder with a 2x overcommit ratio vs. CPU cores. I've got an AMD FX-8350 on the encoding workstation with its 8 "fake" cores; specifying 16 threads resulted in ~810 FPS decoding speed.
[22:48:47 CET] <lx2> dxva2 with single thread ended up around ~500 fps.
[22:49:31 CET] <lx2> No hwaccel + default threads setting (i.e. no threads specified) - ~760 FPS.
[22:51:53 CET] <lx2> Multithreaded dxva2 seems to perform at the same speed as software decoding but requires a way lower number of threads: even 2 threads gives around 780 FPS. But there's a sound warning about using hwaccel with multiple threads, so I haven't tried to use it to encode anything large yet.
[22:53:05 CET] <JEEB> multithreaded hwaccel in general is a bad idea
[22:53:42 CET] <JEEB> and it really shouldn't bring you any speed-up, honestly. wouldn't be surprised if that benchmark didn't work properly in that case
[22:54:27 CET] <JEEB> the only reason why some people used or use multithreading with hwaccels is because they are too lazy to recreate the decoder in case the hwaccel fails
[22:54:42 CET] <JEEB> (because it will fall back to software decoding with that many threads)
[22:56:27 CET] <lx2> Yep, I also have concerns about thread safety vs. dxva, and doubt that it is possible to properly support multiple threads accessing the same dxva context and the same hardware at the same time. And I don't see a reason why CPU multithreading should result in faster performance when we've got dedicated hardware doing the actual decoding.
[23:04:50 CET] <BtbN> because the dedicated hardware is designed for real time decoding of media
[23:04:59 CET] <BtbN> anything beyond that is a pure bonus
[23:05:36 CET] <BtbN> So modern hw decoders are designed in a way that allows them to barely reach 60 FPS for 4K.
[23:10:18 CET] <lx2> Exactly. I've seen tests of hardware decoders comparing Intel vs nVIDIA vs AMD, and it looks like Intel's solution is the fastest among them. On the other hand, it is possible to achieve several orders of magnitude higher decoding rates on some special hardware. My everyday work is as an HPC *nix engineer, and recently I've seen benchmarking of the native ffmpeg build for the Intel Xeon Phi (a.k.a. knc - Knight's Corner). It does wonders as long as you are able to feed it wit
[23:12:05 CET] <lx2> Infiniband FDR + lustre + proper cluster storage solution definitely helps but unfortunately it is not something I'd expect to see people using at their homes :-).
[00:00:00 CET] --- Mon Mar 21 2016