[Ffmpeg-devel-irc] ffmpeg.log.20160706
burek
burek021 at gmail.com
Thu Jul 7 02:05:01 CEST 2016
[00:01:26 CEST] <ChocolateArmpits> Threads: yeah
[01:41:18 CEST] <dystopia> ffmpeg is only using about 50% of my cpu, the rest is idle
[01:41:28 CEST] <dystopia> how can i get it to max out the cpu performance?
[01:43:46 CEST] <dystopia> http://i.imgur.com/T4K50Ea.jpg
[01:46:12 CEST] <iive> libx264 usually manages to max out 8 cores
[01:46:47 CEST] <iive> so... what are you doing exactly?
[01:49:16 CEST] <dystopia> encoding 1080i mpeg2 to 720x404 hdtv
[01:49:24 CEST] <dystopia> to x264*
[01:49:30 CEST] <dystopia> http://pastebin.com/BYJEKuEx
[01:49:32 CEST] <dystopia> encoding line
[01:53:04 CEST] <iive> one possibility is that yadif and scale make feeding frames slow enough for x264 to not fully load.
[01:54:06 CEST] <iive> try adding -threads 4 before the filters... maybe mpeg2 decoder would start using threads for decoding.
[01:54:39 CEST] <dystopia> ok will give it a shot
[01:56:22 CEST] <iive> try it before the "-i" too.
[01:57:51 CEST] <iive> sorry, gtg. maybe somebody else could help.
[02:07:38 CEST] <dystopia> it's slower now
[02:07:54 CEST] <dystopia> averaging at 35-40% cpu usage :(
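A minimal sketch of the `-threads` placement iive suggests above. All file names are placeholders, and a synthetic lavfi source stands in for the real 1080i MPEG-2 capture; `-threads` before `-i` is passed to the decoder, `-threads 0` (auto) after the input applies to the encoder. As the exchange shows, the filters can still be the bottleneck, so this does not guarantee full CPU load.

```shell
# Generate a short synthetic MPEG-2 clip as a stand-in input (in.ts is a placeholder name).
ffmpeg -hide_banner -loglevel error -y \
  -f lavfi -i testsrc2=duration=1:size=640x480:rate=25 \
  -c:v mpeg2video -q:v 5 in.ts

# -threads before -i applies to the decoder; -threads 0 after the input
# lets the encoder auto-pick a thread count. The yadif/scale filters run
# in between and can still limit throughput.
ffmpeg -hide_banner -loglevel error -y \
  -threads 4 -i in.ts \
  -vf "yadif=0:-1:0,scale=320:240" \
  -threads 0 out.avi
```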
[04:23:47 CEST] <ThatTreeOverTher> Is there a way to change the volume of a file during playback? Is it possible to send new commands to FFserver as it streams?
[04:26:55 CEST] <jnorthrup> i think i saw this question last week
[04:27:07 CEST] <jnorthrup> yes, dtdin is live, you can adjust volume through stdin
[04:27:27 CEST] <jnorthrup> oh sry, ffserver, nm
[04:27:51 CEST] <ThatTreeOverTher> no, either one is good! I just want to change volume "interactively".
[04:27:59 CEST] <ThatTreeOverTher> what is "dtdin" ?
[04:31:30 CEST] <ThatTreeOverTher> Oh, I understand now. Thanks!
[04:35:08 CEST] <ThatTreeOverTher> One more question: is there a way to buffer the output of FFmpeg? I'd like to stream data to a pipe, but give the playback some leeway so it doesn't start and stop.
[04:35:18 CEST] <ThatTreeOverTher> (or is that what FFserver is for?)
[04:41:33 CEST] <klaxa> buffering data is the job of the player
[04:42:50 CEST] <ThatTreeOverTher> Darn, okay. Thanks.
[04:45:07 CEST] <klaxa> i mean, it's not like you can tell the user what to do with your data, right?
[04:45:27 CEST] <klaxa> what you can do is send some data from the past if it is a livestream
[04:45:49 CEST] <klaxa> i think ffserver even has a setting for that, or rather it is a request parameter for the url requested by the client
[04:48:00 CEST] <ThatTreeOverTher> I'm trying to pipe data through FFmpeg into a Discord stream, and I was wondering if FFmpeg could hold off on sending some of the bytes through the pipe until a certain amount were already ready
[04:48:15 CEST] <ThatTreeOverTher> But I guess I could just start FFmpeg early and then pick up the bytes later or something
[05:28:44 CEST] <fling> Could ffserver be used to save video to files?
[05:30:45 CEST] <fling> ThatTreeOverTher: you could use mbuffer
[05:31:26 CEST] <ThatTreeOverTher> oh that looks nice
[05:31:53 CEST] <ThatTreeOverTher> Instead of writing to a tape, I am writing to people's ears
[05:32:00 CEST] <fling> haha
[05:32:13 CEST] <fling> can I see your ffserver config file?
[05:33:09 CEST] <ThatTreeOverTher> I stopped using ffserver, I'm now playing with ffmpeg and its interactive stdin
[05:33:18 CEST] <ThatTreeOverTher> although it's not quite working...
[05:33:26 CEST] <fling> what is wrong with it?
[05:34:11 CEST] <ThatTreeOverTher> I'm trying to send "all -1 volume " and then a random number, and then enter every half a second
[05:34:17 CEST] <ThatTreeOverTher> the volume remains unrandomized
[05:35:14 CEST] <ThatTreeOverTher> I can't find any documentation on the interactive stdin, I have no idea what I'm supposed to do or even if ffmpeg is getting my input
[05:40:24 CEST] <pzich> what are you using to send that to stdin? you may need to flush the pipe between each of those
[05:43:32 CEST] <ThatTreeOverTher> pzich, I'm using node.js... you're right, I hadn't considered flushing the pipe
[05:44:05 CEST] <fling> D'oh!
[06:20:48 CEST] <ThatTreeOverTher> is there a way for me to interactively change volume during output with ffmpeg? I know typing "c" will open a console, but I don't understand how to say "change the volume" through it
[07:34:07 CEST] <ThatTreeOverTher> Okay. I figured it out.
[07:34:39 CEST] <ThatTreeOverTher> Somewhere at the beginning of your ffmpeg invocation you must include "-af" and "volume" with a value I think
[07:35:17 CEST] <ThatTreeOverTher> then you send it "cvolume -1 volume <modifier>" via stdin to modify its volume during the rendering process
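What ThatTreeOverTher worked out can be sketched as follows (file names are placeholders; the test input is synthesized with lavfi). The key point is that a `volume` filter must already be in the graph so there is a command target; pressing `c` at the console, or writing the same line to ffmpeg's stdin, then sends a filter command.

```shell
# Make a short test input (in.wav is a placeholder name).
ffmpeg -hide_banner -loglevel error -y \
  -f lavfi -i sine=frequency=440:duration=1 in.wav

# Start with a volume filter in the graph so there is a command target.
ffmpeg -hide_banner -loglevel error -y -i in.wav -af volume=1.0 out.wav

# While a long-running job like this is encoding, press 'c' and enter a
# command of the form:
#   <target> <time> <command> [<argument>]
# e.g.  all -1 volume 0.5
# to halve the volume mid-run. A wrapper process can write the same line
# (terminated by a newline, and flushed, as discussed above) to stdin.
```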
[09:32:02 CEST] <soulshock> is there a recommendation for which deinterlacing to use? i.e. which one offers a decent tradeoff between speed and quality
[09:32:21 CEST] <furq> probably yadif
[09:33:34 CEST] <furq> yadif in mode 1 is pretty good with a clean source
[09:33:56 CEST] <furq> otherwise i tend to use qtgmc with avisynth/vapoursynth
[09:34:09 CEST] <furq> you could use nnedi in ffmpeg but it's incredibly slow
[09:34:42 CEST] <soulshock> ok. the source is very clean, 1080i xdcam 60mbit so yadif might be best
[09:34:47 CEST] <furq> yeah yadif will be fine
[09:35:14 CEST] <soulshock> and it handles motion? i.e. football matches
[09:35:33 CEST] <furq> it should be fine
[09:35:49 CEST] <furq> you can preview it with ffplay or vlc
[09:35:58 CEST] <furq> probably mpv too
[09:36:08 CEST] <soulshock> ah cool
[09:36:54 CEST] <furq> for reference i get about 15fps on ntsc dvd with nnedi
[09:37:01 CEST] <furq> so you'd probably be getting <5 on 1080i
[09:38:24 CEST] <Nem> Hello, First time in this IRC
[09:38:53 CEST] <Nem> Anyway to see a list of active members using the web client?
[09:41:18 CEST] <soulshock> furq ah I'm resizing to 720p though. so it would probably be faster than 5fps
[09:41:33 CEST] <furq> well you'd want to deinterlace before resizing
[09:41:50 CEST] <soulshock> yeah I put -filter_complex "yadif=0:-1:0,scale=1280x720"
[09:42:14 CEST] <furq> that should be 1280:720
[09:42:37 CEST] <soulshock> seems to produce same result
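The corrected chain from this exchange, as a sketch: deinterlace first, then scale, with `scale=1280:720` (colon-separated `w:h`). A synthetic interlaced source generated with `tinterlace` stands in for the 1080i XDCAM footage; all names are placeholders.

```shell
# Synthesize an interlaced 1080i-like clip as a stand-in source
# (50p frames interleaved into 25i).
ffmpeg -hide_banner -loglevel error -y \
  -f lavfi -i testsrc2=duration=1:size=1920x1080:rate=50 \
  -vf tinterlace=interleave_top \
  -c:v mpeg2video -q:v 5 -flags +ilme+ildct src.ts

# Deinterlace before resizing: yadif first, then scale=1280:720.
ffmpeg -hide_banner -loglevel error -y -i src.ts \
  -vf "yadif=0:-1:0,scale=1280:720" out720.avi
```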
[10:10:40 CEST] <k_sze[work]> So apparently it's possible to *push* a video stream via rtmp.
[10:11:25 CEST] <k_sze[work]> Does anybody know how libraries to implement the receiving side?
[10:11:32 CEST] <k_sze[work]> s/how//
[10:12:01 CEST] <k_sze[work]> Do ffmpeg's libraries support acting as the receiving side of an rtmp push stream?
[10:49:40 CEST] <whald> hi! could someone lend me a hand figuring out how to use VAAPI to decode h264 when using libav*? I managed to do the initialization dance with wayland and now sit here with an AVHWDeviceContext and AVHWFramesContext, but I have no clue what to do with them. looking at the h265 decoder source, it seems to look for a hwaccel filled in in its AVCodecContext
[10:51:46 CEST] <BtbN> look at ffmpeg_vaapi.c
[10:52:39 CEST] <whald> i don't really see what's up next. my ultimate goal would be to obtain frames in the AV_PIX_FMT_VAAPI format, which could then be displayed using vaGetSurfaceBufferWl: https://cgit.freedesktop.org/libva/tree/va/wayland/va_wayland.h#n91
[10:53:21 CEST] <whald> BtbN, that file was the reference for most of what I currently have, but I don't see how to kick off the actual decoding part
[10:53:43 CEST] <BtbN> you set the relevant fields, and then use the normal h264 decoder.
[10:56:07 CEST] <whald> BtbN, that would be pix_fmt and hwaccel_context it seems, it was in plain sight all the time... thanks! :-)
[10:56:49 CEST] <BtbN> you also have to implement the get_format callback and select the right hwaccel pix fmt there.
[10:58:22 CEST] <whald> BtbN, i'll try that. thanks again.
[15:13:48 CEST] <hero_biz> hi all
[15:14:24 CEST] <hero_biz> guys, I use a command like this for encoding audio of a mkv file: ffmpeg -i a.mkv -vn -c:a libfdk_aac -b:a 48k audio.aac
[15:14:41 CEST] <hero_biz> this is on for when there is 1 audio track.
[15:15:16 CEST] <hero_biz> I wonder how I can do a similar thing when there are 2 audio tracks and make 2 different .aac files.
[15:17:59 CEST] <whald> hero_biz, you can specify multiple outputs and map input channels to those
[15:18:30 CEST] <BtbN> Don't use .aac though
[15:18:36 CEST] <whald> hero_biz, there's a nice wiki entry about this: https://trac.ffmpeg.org/wiki/Creating%20multiple%20outputs
[15:18:55 CEST] <BtbN> That's raw AAC, you want to use some container, like m4a, which is essentially just mp4, for example
[15:20:55 CEST] <hero_biz> BtbN: I will use it with mkvmerge later. It won't be a standalone file, otherwise I would have used the .mka container
[15:21:18 CEST] <hero_biz> whald: ty I will check it out
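A sketch of what whald and BtbN describe: one input, multiple outputs selected with `-map`. The built-in `aac` encoder is used here since `libfdk_aac` requires a non-free build; a two-track input is synthesized with sine sources, and all file names are placeholders.

```shell
# Build a test input with two audio streams (a.mka is a placeholder name).
ffmpeg -hide_banner -loglevel error -y \
  -f lavfi -i sine=frequency=440:duration=1 \
  -f lavfi -i sine=frequency=880:duration=1 \
  -map 0:a -map 1:a -c:a pcm_s16le a.mka

# Extract each audio stream to its own file:
# 0:a:0 is the first audio stream of input 0, 0:a:1 the second.
# m4a containers per BtbN's advice, rather than raw .aac.
ffmpeg -hide_banner -loglevel error -y -i a.mka -vn \
  -map 0:a:0 -c:a aac -b:a 48k audio1.m4a \
  -map 0:a:1 -c:a aac -b:a 48k audio2.m4a
```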
[18:42:01 CEST] <satinder___> Is there anyone here who has expertise in ffmpeg rtsp streaming?
[18:42:21 CEST] <satinder___> I am using following command but that is not working
[18:43:05 CEST] <satinder___> ffmpeg -i video.ts -vcodec copy -f mpegts rtsp://227.40.50.60:1234
[18:43:15 CEST] <satinder___> please anyone can help me
[18:46:54 CEST] <DHE> mpegts over multicast? do you maybe just want UDP mode instead?
[18:48:55 CEST] <satinder___> DHE : thank you for the reply
[18:49:08 CEST] <satinder___> I want to use the rtsp protocol for streaming
[18:49:20 CEST] <satinder___> I used udp that is working fine
[18:49:42 CEST] <satinder___> but when I change udp:// to rtsp:// that is not working
[18:50:01 CEST] <satinder___> please give some idea how I can do rtsp streaming with ffmpeg
[18:50:08 CEST] <satinder___> DHE : ??
[18:51:39 CEST] <DHE> sorry, I don't do rtsp. I do use udp though
[18:52:07 CEST] <satinder___> ok thanks for giving your time :)
[18:52:29 CEST] <c_14> -rtsp_transport udp_multicast ?
[18:54:38 CEST] <satinder___> ok sir I will try it
[18:54:49 CEST] <satinder___> c_14 : will give you an update as soon as I can
[18:54:53 CEST] <satinder___> thanks :)
[18:55:55 CEST] <satinder___> c_14 : I used the following and got a header error
[18:56:58 CEST] <satinder___> ffmpeg -i ../video.mp4 -f rtsp -rtsp_transport tcp rtsp://localhost:8888
[18:56:58 CEST] <satinder___> Could not write header for output file #0 (incorrect codec parameters ?): Connection refused
[18:56:58 CEST] <satinder___> [tcp @ 0x28c3ba0] Connection to tcp://localhost:8888?timeout=0 failed: Connection refused
[18:56:59 CEST] <satinder___> these two red errors come up after running the above command
[18:57:28 CEST] <satinder___> c_14 : what I am doing wrong sir ??
[18:57:30 CEST] <c_14> eeh
[18:58:20 CEST] <satinder___> Sorry
[18:58:45 CEST] <satinder___> Do you want me to paste it on pastebin now
[18:58:48 CEST] <satinder___> ??
[18:58:53 CEST] <satinder___> please help
[18:59:03 CEST] <c_14> I want everything, the command you're using along with the complete console output
[18:59:09 CEST] <c_14> On a pastebin service of your preference
[18:59:17 CEST] <c_14> (as long as it's not annoying to use for me)
[18:59:19 CEST] <satinder___> ok sir give me 2 mins
[19:03:47 CEST] <satinder___> c_14 : there is my console : http://pastebin.com/6T19sgRW
[19:04:21 CEST] <satinder___> please see
[19:06:59 CEST] <c_14> Do you have a process listening on tcp on port 8888 localhost?
[19:07:25 CEST] <satinder___> no
[19:07:32 CEST] <satinder___> you meaning any player
[19:07:50 CEST] <satinder___> which have already waiting for packets ??
[19:07:56 CEST] <c_14> tcp is a connection oriented protocol, you need to have something listening on the port if you want to send something there
[19:08:05 CEST] <satinder___> ok
[19:08:15 CEST] <satinder___> can I use vlc for that
[19:08:20 CEST] <satinder___> or ffplay
[19:08:23 CEST] <c_14> sure
[19:08:27 CEST] <c_14> probably
[19:08:39 CEST] <satinder___> then that will work ? :)
[19:08:52 CEST] <c_14> ffplay rtsp://localhost:8888 -listen 1
[19:08:56 CEST] <c_14> eh
[19:09:03 CEST] <c_14> -rtsp_flags listen
[19:09:58 CEST] <satinder___> the 1st one I understand, but what is the second one?
[19:10:08 CEST] <c_14> use that instead of -listen 1
[19:10:16 CEST] <satinder___> where I can use second one or for which purpose ?
[19:10:19 CEST] <satinder___> ok
[19:10:22 CEST] <satinder___> thanks
[19:11:27 CEST] <satinder___> thank you so much sir that is working fine
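The working setup from this exchange, as a sketch (port and file names are placeholders). RTSP over TCP is connection-oriented, so the listener must be started before the sender, in a separate terminal:

```shell
# Terminal 1: start a listener first (ffplay in RTSP listen mode,
# per c_14's correction: -rtsp_flags listen, not -listen 1):
ffplay -rtsp_flags listen rtsp://localhost:8888

# Terminal 2: push the stream to the waiting listener over TCP:
ffmpeg -re -i video.mp4 -f rtsp -rtsp_transport tcp rtsp://localhost:8888
```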
[19:11:43 CEST] <satinder___> but I want know little more info
[19:11:54 CEST] <satinder___> I read some forms
[19:12:04 CEST] <satinder___> sorry forums
[19:12:25 CEST] <satinder___> there is said , use ffserver for rtsp stream
[19:12:31 CEST] <satinder___> any reason ??
[19:12:58 CEST] <c_14> I wouldn't recommend using ffserver, when it works it works but when it doesn't it doesn't and nobody really knows how to fix it and nobody really maintains it.
[19:14:15 CEST] <satinder___> sir then what was the main purpose of developing ffserver
[19:15:14 CEST] <c_14> Some people wrote it because they wanted it/used it, now it lingers because nobody wants to delete it.
[19:15:27 CEST] <satinder___> ok
[19:15:51 CEST] <satinder___> thanks for giving user valuable time sir
[19:15:57 CEST] <satinder___> :)
[19:16:12 CEST] <satinder___> sorry your ;)
[19:17:34 CEST] <DHE> next major version of ffmpeg will probably remove it
[21:27:43 CEST] <defusix> Hey hey. I'm using the ffmpeg C libraries to read audio from a video/audio file and it works fine for most files. But with WAV files (that the ffmpeg CLI accepts perfectly), I get the following message: "[pcm_s16le @ 0x1e50480] PCM channels out of bounds" and avcodec_open2(...) fails. The function av_dump_format(...) gives the right information, including the number of channels. Demonstrating code: http://pastebin.com/kEuiXshG I'm using ffmpeg 3.1 I
[21:27:44 CEST] <defusix> hope someone here is able to help me out
[23:12:11 CEST] <defusix> I think I found the proper solution to my problem. If I copy the codec context from the right stream from the format context to the allocated codec context it works without errors.
[23:12:25 CEST] <defusix> So I don't need anyone's help anymore.
[23:50:36 CEST] <__raven> hi
[23:50:45 CEST] <__raven> how to do rolling shutter correction of cmos footage?
[00:00:00 CEST] --- Thu Jul 7 2016