[Ffmpeg-devel-irc] ffmpeg.log.20160103

burek burek021 at gmail.com
Mon Jan 4 02:05:01 CET 2016


[00:15:11 CET] <t4nk767> hey guys another question
[00:15:28 CET] <t4nk767> is it possible to stream with ffmpeg a folder with a lot of videos
[00:15:30 CET] <t4nk767> randomly
[00:41:16 CET] <c_14> yes, but it's a pain
[00:41:26 CET] <c_14> and requires at least mild scripting
[00:44:38 CET] <t4nk767> is there any guide
[00:44:40 CET] <t4nk767> somewhere
[00:45:09 CET] <c_14> It's basically the concat demuxer and some scripting for the shuffling
[00:45:14 CET] <c_14> https://trac.ffmpeg.org/wiki/Concatenate
[00:46:21 CET] <t4nk767> ok and for the shuffling is there anything online
[00:46:52 CET] <t4nk767> or any tip
[00:47:52 CET] <c_14> ls | shuf
[00:48:32 CET] <t4nk767> but it takes as input a txt file
[00:48:51 CET] <c_14> ffmpeg -f concat <(ls | shuf)
[00:48:53 CET] <t4nk767> i guess it can be auto generated with grep somehow, but how to dynamically change it
[00:49:00 CET] <t4nk767> aha
[01:02:19 CET] <t4nk767> its working somehow :)
[01:02:26 CET] <t4nk767> after the playlist finishes
[01:02:30 CET] <t4nk767> does it loop?
[01:04:14 CET] <c_14> no
[01:04:57 CET] <Deihmos> I am using ffmpeg binaries for a project I found on github that converts videos
[01:05:10 CET] <Deihmos> problem is ffmpeg runs slow as molasses
[01:06:01 CET] <Deihmos> taking forever to remux a video. Other programs can do it in less than a minute. Is there something i can do to make it run full speed?
[01:06:07 CET] <Deihmos> cpu use is like 10%
[01:07:42 CET] <c_14> depends, what's the command?
[01:08:25 CET] <Deihmos> i have no clue
[01:08:37 CET] <Deihmos> i don't see this info
[01:08:42 CET] <Deihmos> let me check the logs
[01:10:34 CET] <t4nk767> @c_14 , sorry to ask again, but this concatenate just finishes transcoding the videos and that's it. I am looking more for a solution to create a channel from the videos inside a folder. Any ideas on how to do that, maybe?
[01:10:49 CET] <c_14> channel?
[01:11:09 CET] <t4nk767> yes, from a folder that has music videos let's say, I want to loop those randomly all the time
[01:11:23 CET] <t4nk767> and stream them to a server
[01:12:51 CET] <c_14> t4nk767: to loop, you can put the ls in a while true loop
[01:12:59 CET] <c_14> Otherwise it's just regular streaming.
[01:14:01 CET] <t4nk767> yes but ffmpeg is just converting the video from what I see at 5x speed. I need it to play at normal speed and broadcast it
[01:14:10 CET] <c_14> add -re before -i
[01:14:19 CET] <t4nk767> ill try
[01:15:06 CET] <t4nk767> ah
[01:15:33 CET] <t4nk767> seems good now
[01:15:55 CET] <t4nk767> ffmpeg -f concat -re -i <(printf "file '$PWD/%s'\n" ./*.mp4 | shuf) -c copy -f flv rtmp://publishing_server:8001/input/stream
[01:16:12 CET] <t4nk767> can I put the while somewhere in the -i command there?
[01:16:26 CET] <c_14> <(while true; do blah; done
[01:16:28 CET] <c_14> )
[01:16:41 CET] <t4nk767> oki thanks, will try
[01:25:31 CET] <t4nk767> while true doesnt work, ffmpeg just hangs
[01:25:52 CET] <t4nk767> i guess it needs a limit or I am doing something wrong again
[01:29:53 CET] <c_14> mhm, ye it might read to eof
[01:30:15 CET] <c_14> then just pick a really big number and for i in `seq <big number>`;
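For the record, the pieces discussed above (the concat demuxer, `shuf`, and a big-but-finite loop instead of `while true`) can be put together roughly like this. This is an untested sketch: it is bash-specific because of the `<( )` process substitution, and `publishing_server` is a placeholder host.

```shell
# Emit concat-demuxer "file" directives, reshuffled on every pass.
# A large finite count is used instead of `while true` so that ffmpeg,
# which may read the playlist to EOF, does not hang forever.
make_playlist() {
  for i in $(seq 100000); do
    printf "file '%s'\n" "$PWD"/*.mp4 | shuf
  done
}

# Read the playlist at native speed (-re) and remux to RTMP without
# re-encoding (left commented here since it needs a live endpoint):
# ffmpeg -re -f concat -safe 0 -i <(make_playlist) \
#        -c copy -f flv rtmp://publishing_server:8001/input/stream
```

Newer ffmpeg builds may reject absolute paths in the playlist without `-safe 0`, hence its presence in the sketch.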
[01:44:46 CET] <t4nk767> yep its what I did
[01:44:48 CET] <t4nk767> thanks
[02:13:54 CET] <t4nk767> hi again
[02:14:27 CET] <t4nk767> concatenation seems to fail for some reason. Once the first video is finished I get the following error
[02:14:28 CET] <t4nk767> invalid dropping st:0
[03:45:02 CET] <fluter> hi
[06:35:06 CET] <prelude2004c> hey guys. having trouble with vdpau... someone told me to decode using " vc ffh264vdpau   " . not sure where to put that.. i am using ffmpeg
[06:35:08 CET] <prelude2004c> mpeg2video sources seem to decode ok and then i can transcode it to something else.. but h264 sources don't seem to decode at all
[06:35:09 CET] <prelude2004c> i am using M4000 card
[06:35:11 CET] <prelude2004c> running latest 258.16 version
[06:38:43 CET] <furq> prelude2004c: that's an mplayer option
[06:39:32 CET] <prelude2004c> but ffmpeg has the option too i think
[06:39:41 CET] <prelude2004c> is there some way to use mplayer and then pipe to ffmpeg ?
[06:39:56 CET] <prelude2004c> i tried with mplayer before but it would not take in the udp source stream
[06:41:12 CET] <prelude2004c> even tried with ffplay
[06:58:46 CET] <SSE4>  Hi, I am multiplexing an MP4 file, and I need to add track reference information (tref box). I've searched the ffmpeg sources and found that movenc.c has a function that does exactly what I need - mov_write_tref_tag. it will be called for a track if it has tref_tag set. however, it's not clear to me how to set tref_tag from the API? tref_tag is a member of the MOVTrack structure, but I am not sure how I can access it from my code?
[09:41:04 CET] <Mavrik> SSE4, if it's a muxer option, pass it as part of options dict when opening the muxer (or then writing header, don't remember ATM which call has the options param)
[09:41:52 CET] <SSE4> Mavrik it doesn't seem like a muxer option, it's rather an internal variable of the muxer internals
[09:53:13 CET] <Mavrik> Hrmf, yeah I see it in source, the tref_tags are set only in code
[09:53:29 CET] <Mavrik> CHAP, HINT and TMCD if I read this correctly
[09:56:46 CET] <SSE4> yep exactly
[09:57:00 CET] <SSE4> is there a way to set that from the outside?
[09:59:12 CET] <Mavrik> Do you want to set one of those three or a custom one?
[10:43:33 CET] <SSE4> custom one
[14:04:50 CET] <clb> hey, I wonder if someone could help me out for a sec with command line flags to use for converting a sequence of .tifs to a h265 video?
[14:05:28 CET] <clb> I am trying with "ffmpeg -start_number 557 -i DSC00%%3d.tif -vcodec libx265 -x265-params --input-csp i420 -preset veryslow out.mp4", but I'm getting "[NULL @ 000001eef5d5cb80] Unable to find a suitable output format for 'i420'"
[14:06:18 CET] <clb> in particular, it's looking like this: http://pastebin.com/XGWmzXqz
[14:09:20 CET] <clb> hmm, "ffmpeg -start_number 557 -i DSC00%%3d.tif -vcodec libx265 -strict experimental -preset veryslow out.mp4" starts running, though I am a bit unsure about the -strict experimental, I wonder if that's a bad thing to use with respect to getting out a file that'll play in most browsers e.g. in youtube?
[14:11:03 CET] <sfan5> can't you omit -strict experimental?
[14:11:15 CET] <sfan5> also i'm pretty sure that no browser can play H.265 atm
[14:11:25 CET] <sfan5> and -x265-params takes a string
[14:11:39 CET] <clb> if I do omit -strict experimental, I get "[libx265 @ 0000027a90a5bae0] 4:2:2 and 4:4:4 support is not fully defined for HEVC yet. Set -strict experimental to encode anyway."
[14:12:00 CET] <sfan5> you probably need to do -x265-params "input-csp=i420" if you really want to pass that to x265
[14:12:10 CET] <clb> ah, didn't know browsers don't have H.265, I'm just kinda going from "for 4k video, H.265 is recommended over H.264"
[14:12:20 CET] <sfan5> in that case i'd suggest using -pix_fmt yuv420p
[14:13:08 CET] <sfan5> telling x265 directly that the input is yuv420p will probably cause problems as ffmpeg doesn't know about that and will pass whatever your input pixfmt was (probably yuv422p) to x265
[14:13:41 CET] <clb> thanks, let me try that in a moment after this -strict experimental encode finishes
[14:14:01 CET] <clb> I think my input format is RGB24, since it's a sequence of tiffs(?)
[14:14:13 CET] <clb> or perhaps that is different
[14:14:18 CET] <sfan5> ffmpeg will convert that to YUV for encoding
[14:14:22 CET] <clb> ah, check
[14:14:22 CET] <sfan5> probably yuv444p
[14:18:38 CET] <clb> ooh ffmpeg is pretty efficient at saturating 16 logical cores on my system, neat
[14:19:53 CET] <JEEB> btw, depending on your use case libx265 might not be "ready" yet
[14:20:00 CET] <furq> clb: if you're uploading to youtube then it doesn't matter what you transcode it to (unless bandwidth is a concern)
[14:20:19 CET] <JEEB> it seems to randomly lose parts of stuff in the picture
[14:20:21 CET] <furq> youtube will transcode it to h264/vp9 regardless
[14:20:27 CET] <JEEB> on relatively high bit rate scenarios
[14:21:12 CET] <JEEB> if it's just to push stuff down while having it not look too great, that is where it already more or less works
[14:21:22 CET] <clb> furq, JEEB: thanks, very good to know, perhaps I'll be better off running the final export with h264 then. I suppose h264 is fine with 4K and not outright horrible?
[14:21:40 CET] <furq> it's fine with enough bitrate
[14:21:57 CET] <furq> youtube will transcode it again anyway
[14:22:17 CET] <furq> even if you upload h264/aac it'll transcode it to h264/aac
[14:22:20 CET] <JEEB> and if it's to youtube you might as well use some lossless format as the thing that goes up and let youtube take it behind the barn
[14:22:33 CET] <furq> unless bandwidth is an issue
[14:22:45 CET] <furq> if it's not then yeah, there's no point compressing it
[14:22:54 CET] <JEEB> there's a point to compressing it
[14:23:07 CET] <JEEB> I don't like getting masters as uncompressed shit :P
[14:23:16 CET] <JEEB> (and it takes longer to upload those)
[14:23:36 CET] <JEEB> in which I want to say that lossless compression != uncompressed
[14:33:18 CET] <clb> ok, switched to h.264
[14:35:54 CET] <BtbN> Doesn't YT have an upload size limit?
[14:36:10 CET] <BtbN> Like, won't they shout at you if you upload a 500GB lossless master file?
[14:36:13 CET] <furq> yes but it's 128GB
[14:36:25 CET] <furq> or 11 hours
[14:37:37 CET] <clb> perhaps an odd question, but I wonder if there is a way to duplicate the last frame of the sequence for several seconds?
[14:39:34 CET] <furq> clb: http://video.stackexchange.com/a/10833
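The linked answer aside, one way to hold the final frame is the `tpad` filter (assuming a build of ffmpeg recent enough to have it; the 5-second duration and filenames are illustrative, and this is untested):

```shell
# tpad with stop_mode=clone repeats the last frame for stop_duration seconds.
cmd="ffmpeg -framerate 24 -start_number 557 -i DSC00%3d.tif -vf tpad=stop_mode=clone:stop_duration=5 -c:v libx264 -pix_fmt yuv420p out.mp4"
printf '%s\n' "$cmd"
```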
[14:40:04 CET] <clb> thanks!
[15:00:36 CET] <clb> thanks for the help, it worked out well! If you want to take a peek of the result, see https://dl.dropboxusercontent.com/u/40949268/dump/DragonHeartless.mp4 (89MB, 60 seconds)
[15:16:52 CET] <clb> need to get a bigger memory card so that I can shoot more frames to get a 30fps timelapse, and with a slightly smaller aperture so that the whole model will be in focus
[16:50:24 CET] <Disturbed1> anyone know what this means? been fighting with ffmpeg compile (nvenc/nvresize/patch1511) got it to work... now when i try to transcode a file i get this msg... ->
[16:50:28 CET] <Disturbed1> [nvenc @ 0x3c9df80] NVENC with Codec H264 Not Available at requested GPU 0
[17:13:55 CET] <BtbN> Add -pix_fmt yuv420p
[17:14:39 CET] <BtbN> Also, "nvresize" looks like you applied that hacky nvidia patch-set. That's entirely unsupported and you are on your own when using that.
[17:15:23 CET] <DHE> Disturbed1: sounds like your GPU doesn't support nvenc
[17:15:31 CET] <DHE> you need a card without shadowplay support
[17:16:00 CET] <BtbN> That entire error message doesn't look like it comes from the ffmpeg nvenc encoder, at least I don't remember seeing it before.
[17:21:40 CET] <Disturbed1> i'm using a nvidia s870 ..... :(
[17:29:12 CET] <Disturbed1> couldn't find any documentation if the tesla s870 is supported by ffmpeg or nvenc5sdk so i was just hoping it would be... kinda the only reason i bought it...
[17:45:44 CET] <DHE> those are not intended for video processing.. no outputs. I'm suspecting they don't have the nvenc chip on them
[17:46:13 CET] <DHE> nvenc actually uses a distinct hardware h264 encoding processor. the idea is that encoding has no effect on using the board for ordinary 3d gaming
[17:52:01 CET] <BtbN> Just don't use strange patches from nvidia, and it might actualy work
[18:45:45 CET] <Disturbed1> i tried compiling before without patch and was getting error -> " No CUDA capable devices found " , with patch i get this new msg NVENC with Codec H264 Not Available at requested GPU 0... after reading a bit more nvenc5 requires nvidia driver 346.22, but my hardware (s870) only works with 340.xx.... i'm going to retry using nvenc4 since it supports 340.22...    the joys of repurposing old legacy
[18:45:45 CET] <Disturbed1> hardware... o_O
[18:47:14 CET] <BtbN> I don't think the old SDK v4 is still supported.
[18:47:26 CET] <BtbN> And "No CUDA capable devices found" happens before any NVENC interaction
[18:47:42 CET] <BtbN> it means your GPU simply does not support NVENC at all, no matter which driver is driving it
[18:47:57 CET] <BtbN> The fact that it needs a legacy driver alone is enough evidence for that
[18:48:12 CET] <BtbN> Only Kepler or newer supports NVENC, and those are still supported
[18:57:15 CET] <Disturbed1> hmmm... thanks for your input BtbN....
[19:05:11 CET] <Disturbed1> guess what i need to do now is double check my hardware... i'll report back with what i learn shortly...
[19:53:17 CET] <DHE> Disturbed1: I'm pretty sure you'll need a more consumer level GPU. one intended for use with graphics. pure GPGPU processing cards sound like they wouldn't have that feature.
[19:57:13 CET] <Bombo> is there qtgmc or something like it in ffmpeg? http://avisynth.nl/index.php/QTGMC
[20:00:01 CET] <Disturbed1> cd ..
[20:01:23 CET] <Bombo> or this https://www.youtube.com/watch?v=vSyupaW1jY8 TempGaussMC
[20:16:41 CET] <Disturbed1> BtbN / DHE : yup you guys right.... i went back to nvenc 4 sdk, used the nvida apps (nvEncoderApp) , failed output said
[20:16:46 CET] <Disturbed1> >> GetNumberEncoders() has detected 2 CUDA capable GPU device(s) <<
[20:16:54 CET] <Disturbed1> [ GPU #0 - < Tesla C870 > has Compute SM 1.0, NVENC Not Available ]
[20:17:04 CET] <Disturbed1> [ GPU #1 - < Tesla C870 > has Compute SM 1.0, NVENC Not Available ]
[20:18:22 CET] <Disturbed1> sooo... time to try and find another purpose for this piece of hardware.....
[21:41:57 CET] <adko> I'm following: https://trac.ffmpeg.org/wiki/Create%20a%20video%20slideshow%20from%20images -- how can I specify a video duration?  The problem being - I specify 1/12 for the input frame rate and I need my video to be 12 seconds long
[21:42:08 CET] <adko> -t 12 doesn't seem to have an effect
[21:43:12 CET] <c_14> Shouldn't the output duration be automatically determined from the number of images and the framerate?
[21:43:35 CET] <adko> c_14: would be nice :D
[21:43:50 CET] <c_14> What's your current commandline?
[21:44:39 CET] <adko> ffmpeg -r 1/12.480000000000004 -pattern_type glob -i '/tmp/1451852979055159028/1451852979055159028-*.png' -c:v libx264 -r 30 -pix_fmt yuv420p /tmp/1451852979055159028/out.mp4
[21:44:47 CET] <adko> I have also tried -vf fps=30
[21:44:58 CET] <adko> (per note on the trac wiki)
[21:45:08 CET] <c_14> use -framerate instead of -r
[21:45:31 CET] <adko> (in this case there is only 1 image in /tmp/1451852979055159028/ )
[21:45:44 CET] <adko> It seems to just cut the frame short
[21:45:49 CET] <adko> last* frame
[21:45:53 CET] <adko> ok will try
[21:46:24 CET] <c_14> Also, is the output too short or too long?
[21:46:30 CET] <adko> boom -- that seems to work
[21:46:47 CET] <adko> the output is too short... I didn't realize -r didn't work
[21:46:50 CET] <adko> you da man
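The fix, spelled out: for image-sequence inputs, `-r` before `-i` is not honored as the demuxer's input rate, while `-framerate` is. adko's working command would look roughly like this (untested sketch, paths taken from the log above):

```shell
# -framerate sets the input rate of the image sequence (one frame per 12 s);
# the -r 30 output option resamples the result to 30 fps.
cmd="ffmpeg -framerate 1/12 -pattern_type glob -i '/tmp/1451852979055159028/1451852979055159028-*.png' -c:v libx264 -r 30 -pix_fmt yuv420p /tmp/1451852979055159028/out.mp4"
printf '%s\n' "$cmd"
```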
[21:51:18 CET] <Disturbed1> so....
[21:52:00 CET] <Disturbed1> since i've learned my S870 tesla can NOT be used to transcode..... my next question to the ffmpeg community is this...
[21:53:27 CET] <Disturbed1> what GPU should i get and is compatible with nvenc and use multiple GPU in one system.... at the moment i'm looking at Quadro....
[21:53:54 CET] <Disturbed1> on budget . . .  lol
[21:54:41 CET] <Disturbed1> need to build a system to transcode live mpeg2 ts -> live 264 hls
[21:55:39 CET] <c_14> Have you tried just using x264?
[21:56:23 CET] <Disturbed1> i have 32 live streams to transcode....
[21:56:52 CET] <c_14> x264 on the faster presets uses very little cpu
[21:58:50 CET] <Disturbed1> ka, i will try it out when i get a chance...  wanna push as many streams into one server as possible to cut back on the amount of machines required for all of them....
[22:01:44 CET] <Mavrik> Disturbed1, use CPU.
[22:01:51 CET] <Mavrik> Buying GPUs is a huge waste of money for that.
[22:02:00 CET] <Mavrik> 32 streams should be doable on a 24core Xeon machine.
[22:05:15 CET] <Disturbed1> wow... ka.. ummm maybe....
[22:06:01 CET] <Disturbed1> maybe i should explain a little more what i'm trying to do here before i confuse everyone and they think i'm a crazy person...
[22:08:59 CET] <Disturbed1> i have 32 mpeg2 ts streams coming in at 20mbps per stream at 1080p60... i need each stream transcoded into 3 streams in 264/hls 1080p24/720p24/480p24...
[22:10:51 CET] <Disturbed1> plan to use adaptive bitrate to scale up or down depending on viewers net connection
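One worker in such a setup might look like the sketch below: a single input decoded once, fanned out to three scaled 24 fps H.264/HLS renditions. The UDP address, bitrates, and output paths are all placeholders, and the assembled command is untested.

```shell
# Build one ffmpeg command with three scaled HLS outputs per input stream.
cmd="ffmpeg -i udp://@239.0.0.1:1234"
for spec in 1080:5M 720:3M 480:1M; do
  h=${spec%:*}      # rendition height
  rate=${spec#*:}   # target video bitrate for that rendition
  cmd="$cmd -vf scale=-2:$h,fps=24 -b:v $rate -c:v libx264 -preset veryfast -c:a aac -f hls /var/www/hls/${h}p.m3u8"
done
printf '%s\n' "$cmd"
```

The `-preset veryfast` choice follows c_14's point above that x264's faster presets keep CPU use low.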
[22:11:19 CET] <Mavrik> Ah.
[22:11:24 CET] <Mavrik> And? :)
[22:13:03 CET] <adko> c_14: Strangely my video still has a reported framerate of 1/2
[22:13:06 CET] <adko> er.. 1/12
[22:13:19 CET] <adko> Which seems to be messing up later "drawtext" filtering
[22:13:30 CET] <Disturbed1> last time i tried to transcode 1 stream into 5 (1080/720/480/360/240) i maxed out the server i was using
[22:13:54 CET] <Mavrik> Yeah, you can get like 20 outputs on a 24 core dual socket Xeon
[22:13:59 CET] <Mavrik> depending on your speed settings
[22:14:16 CET] <Disturbed1> so this is why i'm looking at maybe using gpu power to help cut back on the number of machines required
[22:14:24 CET] <c_14> adko: is the drawtext filter after the fps filter?
[22:14:25 CET] <Mavrik> So I guess you'll need 8 machines or so
[22:14:36 CET] <adko> c_14: yeah it's totally separate commands
[22:14:37 CET] <Mavrik> GPUs have terrible P/P for that
[22:14:51 CET] <adko> do I need -framerate AND -vf fps=30 ?
[22:15:05 CET] <c_14> adko: yes
[22:15:15 CET] <adko> k will try thanks
[22:15:17 CET] <JEEB> also ASIC encoders usually can handle one or two streams and that's it
[22:15:17 CET] <c_14> if you want the output to be in 30fps, yes
[22:15:25 CET] <JEEB> while a proper CPU can do much more
[22:15:52 CET] <Mavrik> Also if I see correctly, currently nvidia driver freezes if you try to initialize several nvenc sessions
[22:16:19 CET] <Disturbed1> ka. thx mavrik... i'll continue doing my home work before i buy anything... thx to ya'll
[22:16:34 CET] <adko> c_14: didn't help
[22:16:43 CET] <Mavrik> I'm just looking at a machine that's crunching through 11 SD streams at 30% CPU utilization across the board
[22:17:47 CET] <c_14> adko: try putting the fps filter directly before the drawtext filter?
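That is, everything stays in the one `-vf` chain, with `fps` ahead of `drawtext` so the text filter sees 30 fps frames. A sketch (the text and coordinates are placeholders, and the full command is untested):

```shell
# fps first, then drawtext, in a single -vf filtergraph.
vf="fps=30,drawtext=text='hello':x=10:y=10:fontcolor=white"
printf '%s\n' "$vf"
# ffmpeg -framerate 1/12 -pattern_type glob \
#   -i '/tmp/1451852979055159028/1451852979055159028-*.png' \
#   -vf "$vf" -c:v libx264 -pix_fmt yuv420p out.mp4
```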
[22:18:15 CET] <Disturbed1> mavrik:  what gear you looking at? (model/make/etc)
[22:18:58 CET] <adko> ok
[22:20:54 CET] <BtbN> NVENC has a hard limit of 2 encoding sessions per GPU
[22:21:06 CET] <BtbN> So for your 32 streams, you'd need 16 Nvidia GPUs. Good luck.
[22:21:54 CET] <Disturbed1> lol
[22:22:23 CET] <Mavrik> Disturbed1, model name	: Intel(R) Xeon(R) CPU E5-2697 v2 @ 2.70GHz x 2
[22:22:43 CET] <adko> c_14: I did like this: -vf "[in]fps=30,drawtext...."
[22:22:46 CET] <adko> right ?
[22:22:48 CET] <Mavrik> About 12GB RAM used
[22:22:53 CET] <c_14> without the [in]
[22:22:53 CET] <adko> It didn't seem to help
[22:22:56 CET] <fritsch> BtbN: vaapi encoding should be able to cope with that
[22:22:57 CET] <c_14> but ye
[22:23:04 CET] <fritsch> BtbN: "in theory" :-) of course
[22:23:08 CET] <adko> Well I have to use [in] and [out] because I have lots of filters here
[22:23:22 CET] <adko> (drawtext filters) with quotes and junk
[22:23:23 CET] <BtbN> I don't think you'll get much more than 2 parallel streams through VAAPI either
[22:23:25 CET] <c_14> you can't have more than one -vf per command
[22:23:32 CET] <adko> yeah it's just one vf
[22:23:33 CET] <Mavrik> Stuffing 16 GPUs in a chasis is problematic in so many levels
[22:23:35 CET] <BtbN> It's another hardwired encoder after all
[22:23:49 CET] <fritsch> BtbN: let me find it on the ML - there are some guys also holding a speech in belgium conference soon - about exactly that
[22:23:53 CET] <fritsch> let me have a look
[22:24:17 CET] <Disturbed1> nice CPU.... lol LGA2011 rocks....
[22:24:36 CET] <adko> I think my biggest issue right now is that my initial video is 1/12th fps instead of 30 fps
[22:24:44 CET] <adko> I have an idea.. will try
[22:27:14 CET] <Mavrik> Even Cisco and others are moving away from ASICs to branded x86 boxes
[22:27:21 CET] <adko> c_14: my bad I had -r and -framerate switched... Sorry.. got lost :D
[22:35:59 CET] <adko> Well damn.. that didn't fix my ultimate problem
[22:36:26 CET] <adko> c_14: skipping backwards in the video gives me a blank video output (in my player) -- unless I get back to the start of the video
[22:36:43 CET] <Zeranoe> Does FFmpeg support adding chapters to a MP4?
[22:36:52 CET] <adko> sometimes even skipping forward
[23:10:15 CET] <c_14> adko: sounds like not enough keyframes
[23:14:09 CET] <c_14> Zeranoe: if the input has chapters they'll be copied, adding maybe if you used an ffmetadata file as second input and mapping the metadata from that to the output
[23:14:23 CET] <c_14> (or at least mapping the chapter metadata)
[23:14:31 CET] <c_14> Don't know if ffmpeg has another method of adding chapters
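c_14's suggestion, sketched: write the chapters in an FFMETADATA file and map its metadata from a second input. The chapter timestamps and titles below are invented, and the muxing line is left commented since it needs a real input file.

```shell
# Write an FFMETADATA file describing two chapters (millisecond timebase).
cat > chapters.txt <<'EOF'
;FFMETADATA1
[CHAPTER]
TIMEBASE=1/1000
START=0
END=60000
title=Intro
[CHAPTER]
TIMEBASE=1/1000
START=60000
END=180000
title=Main part
EOF

# Mux the chapters into the MP4 by mapping metadata from the second input:
# ffmpeg -i in.mp4 -i chapters.txt -map_metadata 1 -codec copy out.mp4
```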
[23:31:41 CET] <adko> c_14: yep I forced some key frames and I'm good to go now.
[23:32:11 CET] <adko> Have some work to do on figuring out when to add key frames so I don't artificially bloat my file. But for now I'm good. Thanks again.
[23:32:27 CET] <c_14> you could just lower the gop
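Both approaches from this exchange, sketched with illustrative values (untested; `in.mp4`/`out.mp4` are placeholders):

```shell
# Smaller GOP: a keyframe at least every 30 frames (1 s at 30 fps).
cmd_gop="ffmpeg -i in.mp4 -c:v libx264 -g 30 out.mp4"
# Or force keyframes on a schedule, here every 2 seconds:
cmd_force="ffmpeg -i in.mp4 -c:v libx264 -force_key_frames 'expr:gte(t,n_forced*2)' out.mp4"
printf '%s\n' "$cmd_gop" "$cmd_force"
```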
[00:00:00 CET] --- Mon Jan  4 2016

