[Ffmpeg-devel-irc] ffmpeg.log.20160331
burek
burek021 at gmail.com
Fri Apr 1 02:05:02 CEST 2016
[00:18:52 CEST] <Taoki> In that case: Is it not possible to define the length of the video in hours:minutes:seconds, instead of the frame rate? So FPS is based on the time you want.
[00:19:19 CEST] <Taoki> Actually, I might be able to find a converter for this online if not....
[00:19:31 CEST] <J_Darnley> No
[00:19:36 CEST] <explodes> Hello! Can I use libpostproc to skip sws_scale? sws_scale is giving me glorious segfaults, and I have no idea why;
[00:19:49 CEST] <J_Darnley> Usually ffmpeg has no clue how many frames a video has nor how long it actually is.
[00:20:05 CEST] <Taoki> Online converter it is then
[00:20:12 CEST] <J_Darnley> explodes: to do what exactly?
[00:20:14 CEST] <explodes> I've only seen the case where ffmpeg knows exactly how long a video is, in how many frames
[00:20:16 CEST] <explodes> But that's just me
[00:20:30 CEST] <explodes> J_Darnley: to convert YUV420p to BGR32
[00:20:33 CEST] <J_Darnley> no
[00:20:40 CEST] <explodes> YUV*something
[00:20:45 CEST] <explodes> Dang
[00:20:45 CEST] <J_Darnley> still no
[00:20:52 CEST] <c_14> explodes: if it's segfaulting, try updating?
[00:21:10 CEST] <explodes> Yea, the jump from 2.8 -> 3.0 blew up all of the things
[00:21:17 CEST] <fritsch> memory alignment
[00:21:24 CEST] <fritsch> does it crash in ASM code?
[00:22:25 CEST] <fritsch> a backtrace would be nice to have - even better: part of the code you are using
[00:22:34 CEST] <fritsch> context setup + sws_scale
[00:23:42 CEST] <fritsch> i had a similar issue after upgrading to 3.0 with yuv420p to rgb32 code with neon asm
[00:23:59 CEST] <fritsch> which required 16 byte aligned memory addresses
[00:24:03 CEST] <explodes> I'm using asm mode for ARM, not for X86 ; but I have no data on X86 anyways.
[00:24:10 CEST] <fritsch> same here
[00:24:20 CEST] <explodes> Nice. let me get some snippets
[00:24:36 CEST] <fritsch> you got a backtrace?
[00:24:50 CEST] <fritsch> you supply the data "from external"?
[00:25:02 CEST] <fritsch> e.g.set frame->data[0] manually?
[00:26:06 CEST] <fritsch> bool aligned = (((uintptr_t)(const void *)(pixels)) % (16) == 0); if that is false
[00:26:13 CEST] <explodes> http://pastebin.com/ZQbUS3Wz
[00:26:25 CEST] <explodes> I don't set it manually, no
[00:26:42 CEST] <explodes> Also I have no backtrace, JNI isn't playing very nice for us
[00:26:51 CEST] <fritsch> deprecated API
[00:26:53 CEST] <fritsch> :-)
[00:26:57 CEST] <fritsch> but I see the issue
[00:27:10 CEST] <fritsch> avpicture_fill uses av_image_ and only aligns to 1 byte
[00:27:17 CEST] <fritsch> replace avpicture_fill with:
[00:27:38 CEST] <fritsch> av_image_fill_arrays
[00:27:43 CEST] <fritsch> and use 16 alignment
[00:27:45 CEST] <explodes> You're awesome
[00:27:52 CEST] <explodes> If this works, Nora says she's gonna marry you
[00:27:59 CEST] <fritsch> i am already married
[00:28:03 CEST] <explodes> double up.
[00:28:53 CEST] <explodes> Ah, the problem here is that we need pixels to be written to (at least eventually)
[00:29:09 CEST] <explodes> av_image_fill_arrays doesn't take a pixel pointer of sorts, and so nothing in that method would write the data to that block
[00:29:29 CEST] <fritsch> av_image_fill_arrays(stream->pFrameOut->data, stream->pFrameOut->linesize, pixels, PIXEL_FORMAT, width ,height, 16)
[00:29:57 CEST] <explodes> Ohhh
[00:30:11 CEST] <fritsch> https://ffmpeg.org/doxygen/2.4/group__lavu__picture.html#ga5b6ead346a70342ae8a303c16d2b3629
[00:30:25 CEST] <fritsch> btw. nevertheless this is an ffmpeg 3.0 bug
[00:30:34 CEST] <fritsch> this asm makes alignment assumptions that aren't actually guaranteed
[00:30:36 CEST] <fritsch> and crashes
[00:30:55 CEST] <fritsch> i had to revisit kodi's complete texture memory allocation
[00:30:57 CEST] <fritsch> :-)
[00:31:00 CEST] <fritsch> cause of that
[00:31:06 CEST] <explodes> So if I disable ASM, the issue would be fixed?
[00:31:10 CEST] <explodes> Not going to, but
[00:31:21 CEST] <fritsch> try the replacement first
[00:31:29 CEST] <fritsch> width and height you need to replace with yours
[00:31:34 CEST] <fritsch> the rest should work as is
[00:31:43 CEST] <explodes> fritsch, I don't know who you are or where you came from, but you've made my day. We've been working hard, fruitlessly, for weeks.
[00:31:58 CEST] <fritsch> does it work?
[00:32:13 CEST] <fritsch> btw. don't use SWS_FAST_BILINEAR
[00:32:24 CEST] <fritsch> it is not really faster than normal bilinear but the quality sucks like hell
[00:32:25 CEST] <fritsch> :-)
[00:32:40 CEST] <varu-> hi, i'm trying to receive/process an mpts stream from a commercial ird
[00:33:18 CEST] <varu-> what it does is fling a ts mux via udp to an IP specified in its interface, in this case a box with ffmpeg
[00:33:34 CEST] <varu-> how can i get ffmpeg to 'tune in'?
[00:35:45 CEST] <J_Darnley> ffmpeg -i SOME_URL maybe
[00:36:03 CEST] <c_14> ffmpeg -i udp://localhost:port probably
[00:36:09 CEST] <varu-> tried doing ffmpeg -i udp://@:<porthere>
[00:36:24 CEST] <c_14> may need ?listen
[00:36:32 CEST] <c_14> ie udp://localhost:port?listen
[00:37:01 CEST] <c_14> And you may need to listen on either a broadcast address or the address it's being sent to
[00:37:12 CEST] <c_14> So <my ip> instead of localhost
[00:37:59 CEST] <varu-> ok, tried all of the above. sits there until i ctrl-c out of it, reporting "could not find codec parameters"
[00:38:04 CEST] <explodes> fritsch: P.S. I love you, but we also have another segfault happening in yuva2rgba.c, but I have no stack trace or idea about why
[00:39:02 CEST] <fritsch> explodes: did it work?
[00:39:17 CEST] <fritsch> so - then show me that code, too
[00:40:02 CEST] <varu-> running with -v debug now, seems it *is* receiving the stream: Statistics: 36817732 bytes read
[00:40:24 CEST] <c_14> Is the server sending plain udp or something like rtp?
[00:40:54 CEST] <varu-> the setting in the server is 'mpts', it's sending a mux of several services
[00:41:15 CEST] <varu-> [mpegts @ 0x260e600] Format mpegts probed with size=2048 and score=100
[00:41:28 CEST] <explodes> fritsch: yep! it's working, we haven't been able to replicate the segfault reliably, but we will see how it behaves in the wild.
[00:41:38 CEST] <varu-> [mpegts @ 0x260e600] Before avformat_find_stream_info() pos: 0 bytes read:5000800 seeks:0
[00:41:58 CEST] <explodes> fritsch: I have no idea which code is causing the yuv2rgb problem; it's either in our decode function or vsp_getFrame
[00:42:05 CEST] <varu-> [mpegts @ 0x260e600] After avformat_find_stream_info() pos: 508081280 bytes read:508081280 seeks:0 frames:0 - and couldn't find codec parameters (this is after ctrl-c)
[00:42:11 CEST] <c_14> varu-: what codec is the source stream[s]?
[00:42:25 CEST] <fritsch> explodes: then that's the first thing to find out :-)
[00:42:37 CEST] <fritsch> does that also go through that part of the code you linked?
[00:42:59 CEST] <explodes> fritsch: http://pastebin.com/gFRN787i
[00:43:01 CEST] <varu-> not entirely sure, most likely mpeg4 avc
[00:43:34 CEST] <explodes> nope, the flow of the program is: probe, load, decode, decode, decode...
[00:44:39 CEST] <fritsch> explodes: your pts handling is not that smart :-)
[00:44:47 CEST] <explodes> lol
[00:44:55 CEST] <explodes> What is a better way to handle that?
[00:46:19 CEST] <fritsch> is that your code?
[00:46:25 CEST] <fritsch> still looking for yuv2rgb
[00:46:28 CEST] <fritsch> in that one
[00:46:47 CEST] <fritsch> http://dranger.com/ffmpeg/tutorial05.html <- good read
[00:46:53 CEST] <explodes> It was a contractor's, but we have extensively modified it to fix the bugs that we've found
[00:47:11 CEST] <fritsch> it looks like some pimped ffmpeg beginners howto :-)
[00:47:13 CEST] <fritsch> hehe
[00:47:21 CEST] <explodes> Yea that's where he got most of the code
[00:47:28 CEST] <fritsch> oki - so for which company I am currently working? :-)
[00:47:45 CEST] <explodes> I've tried to not disturb the code too much, as we're not much of C experts
[00:48:11 CEST] <fritsch> yeah ... no problem. see if the other issue comes back
[00:48:15 CEST] <zamba> hi guys! i'm digitizing some old vhs for a friend of mine.. i have an easycap video capture device set up.. and i have the audio routed through my microphone jack (using alsa).. i don't want to encode/transcode on the fly, since my computer can't handle it, so i just need to dump the streams as quickly as possible to disk.. what options do you recommend for this?
[00:48:57 CEST] <zamba> as a PoC i did: ffmpeg -f video4linux2 -i /dev/video1 -f alsa -i default output.mkv
[00:49:13 CEST] <zamba> and i got both video and audio, but with lags here and there - which i suspect is due to the computer not keeping up
[01:00:01 CEST] <llogan> zamba: i would be surprised if your computer couldn't keep up encoding SD content. unless it's some ancient jalopy.
[01:00:20 CEST] <zamba> it's a P8400
[01:00:36 CEST] <zamba> llogan: but i'd rather just dump with the best quality initially and just worry about the conversion afterwards
[01:00:55 CEST] <zamba> but preferably in a container that will keep the a/v in sync
[01:00:57 CEST] <llogan> ok. you'll need to show the complete console output from that command. you can use a pastebin site.
[01:01:10 CEST] <zamba> from what command?
[01:01:18 CEST] <llogan> the one you just showed here
[01:02:29 CEST] <zamba> yeah, but again.. i want to dump in raw format initially
[01:02:48 CEST] <zamba> and then do the conversion
[01:02:52 CEST] <llogan> i know that
[01:04:42 CEST] <zamba> but you still want the output of the first command?
[01:05:39 CEST] <zamba> http://pastebin.com/1vtL3kty
[01:05:40 CEST] <zamba> there you go
[01:08:59 CEST] <llogan> the input is rawvideo so if you want to just mux that into the output it will make a huge file
[01:09:05 CEST] <zamba> sure
[01:10:04 CEST] <zamba> llogan: or do you have some other suggestions? i'm wondering if the "Past duration X too large" messages are the ones that indicate that my computer isn't fast enough?
[01:10:46 CEST] <zamba> what about some of the lossless codecs? x264 for instance?
[01:11:12 CEST] <zamba> of course the source is crappy to begin with, but i at least want to preserve the "quality" from that
[01:14:13 CEST] <llogan> you can measure encoding speed with "ffmpeg -f video4linux2 -i /dev/video1 -f alsa -i default -c:v libx264 -c:a aac -f null -", but the lags may be coming from v4l2 or alsa instead of any encoding overhead
[01:15:25 CEST] <llogan> ...which may be related to the "Thread message queue blocking" warning
[01:17:13 CEST] <zamba> i'd rather not experiment too much with the source.. as this is fragile tape..
[01:18:01 CEST] <zamba> you can't provide me with an output format that contains some compression and still is a format that can be easily worked with afterwards if i want to go even smaller in size?
[01:18:10 CEST] <zamba> maybe not a -raw- dump, but close enough?
[01:19:52 CEST] <llogan> what's the output being used for? just viewing on computer? DVD? etc?
[01:20:07 CEST] <zamba> it'll probably end up on a DVD, yeah
[01:20:25 CEST] <zamba> but i would still like to keep the source for archiving purposes
[01:20:35 CEST] <zamba> as a file
[01:21:14 CEST] <zamba> flac for video, basically.. :)
[01:21:58 CEST] <furq> huffyuv, ffv1, x264 lossless
[01:22:04 CEST] <zamba> furq: now we're talking
[01:22:06 CEST] <llogan> add -crf 0 and -c:a copy to your previous command
[01:22:22 CEST] <zamba> llogan: that will give me a -raw- dump?
[01:22:33 CEST] <furq> -crf 0 is x264 lossless (assuming you're using 8-bit x264)
[01:22:59 CEST] <zamba> furq: how much cpu does that need?
[01:23:08 CEST] <furq> it depends which preset you're using
[01:23:27 CEST] <zamba> well.. eh.. "the best"? :)
[01:23:31 CEST] <zamba> i know none of them
[01:23:31 CEST] <furq> it should probably be fine with -preset veryfast
[01:23:53 CEST] <zamba> and how much cpu does that need?
[01:24:36 CEST] <furq> i don't know how to answer that question
[01:25:36 CEST] <furq> the only reliable answer you'll get is to try it and find out
[01:25:51 CEST] <zamba> will i get any feedback if the computer is not keeping up?
[01:25:56 CEST] <zamba> and if so, what kind?
[01:26:31 CEST] <furq> the progress report shows the encoding speed in fps
[01:26:45 CEST] <furq> if that drops below your target fps then your cpu isn't fast enough
[01:27:08 CEST] <furq> recent versions will have "1.000x" (or the current relative speed) at the end of the progressbar as well
[01:27:21 CEST] <furq> i'm not sure how accurate that is though
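Putting llogan's and furq's suggestions together, the capture command would look roughly like this; it is only a sketch reusing the device names, channel count and output file from zamba's earlier command, not a verified recipe:

  # lossless capture: -crf 0 is lossless with 8-bit x264, -preset veryfast keeps the CPU load down,
  # and -c:a copy muxes the ALSA audio in untouched
  ffmpeg -f video4linux2 -i /dev/video1 -f alsa -ac 2 -i default -c:v libx264 -preset veryfast -crf 0 -c:a copy output.mkv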
[01:29:05 CEST] <petecouture> What is the best practice to restart ffmpeg while it's encoding a live stream? I'm accepting a live RTP stream and encoding to HLS. If the broadcaster's bandwidth drops or cuts out for a few seconds, ffmpeg keeps encoding but nothing is processing. The timemark increments but the frame number doesn't go up. Here's an output of the frame/progress and the file listings at the bottom so you can see how it's chunking. http://pastebin.com/vubxsHjd
[01:29:33 CEST] <petecouture> The goal would be to continue the HLS segment and m3u8 generation without the end user having to refresh
[01:31:31 CEST] <zamba> furq: 25 fps.. but it dropped down to 20 fps
[01:34:38 CEST] <furq> you can try huffyuv, ffvhuff, ffv1 and ut video as well
[01:35:12 CEST] <zamba> looks like the problem is the disk write performance
[01:35:24 CEST] <zamba> ffmpeg -f video4linux2 -i /dev/video1 -f alsa -ac 2 -i default -c:a copy -crf 0 -preset veryfast -f null -
[01:35:34 CEST] <zamba> if i do this, then i'm dead solid on 25 fps and 1x
[01:35:49 CEST] <furq> surely the problem isn't the disk then
[01:36:01 CEST] <zamba> oh?
[01:36:05 CEST] <furq> oh nvm that's going to stdout
[01:36:08 CEST] <zamba> yeah
[01:36:34 CEST] <furq> that's also rawvideo though
[01:36:40 CEST] <zamba> oh?
[01:36:55 CEST] <furq> you're not supplying a codec, and it won't default to x264 with no output file extension
[01:37:09 CEST] <zamba> ah, ok
[01:37:11 CEST] <furq> specify -c:v libx264
[01:37:45 CEST] <zamba> like so: ffmpeg -f video4linux2 -i /dev/video1 -f alsa -ac 2 -i default -c:a copy -crf 0 -c:v libx264 -f null -
[01:37:52 CEST] <furq> sure
[01:38:05 CEST] <zamba> actually even worse performance then
[01:38:06 CEST] <furq> you forgot -preset veryfast
[01:38:07 CEST] <zamba> 12 fps
[01:38:21 CEST] <zamba> ah, that i did
[01:38:28 CEST] <zamba> yeah.. then i'm back to the same speed
[01:38:31 CEST] <zamba> so it's not the disk
[01:38:45 CEST] <furq> you could maybe try -preset ultrafast but that'll lose a lot of efficiency
[01:38:57 CEST] <zamba> measured 66.3 MB/s, so it can't be the disk
[01:39:19 CEST] <zamba> 32 MB/s raw write speed (no memory involved)
[01:39:21 CEST] <TD-Linux> ffv1 is probably faster (?)
[01:39:37 CEST] <furq> yeah one of the other lossless yuv codecs is probably a better bet
[01:39:40 CEST] <zamba> -c:v ffv1?
[01:39:49 CEST] <furq> yeah, and get rid of -preset and -crf
[01:40:04 CEST] <furq> there's also huffyuv, ffvhuff and utvideo
[01:40:08 CEST] <zamba> nope
[01:40:18 CEST] <zamba> actually worse than libx264
[01:40:21 CEST] <zamba> down to 12 fps
[01:41:03 CEST] <furq> you might need to specify -threads with ffv1
[01:41:25 CEST] <zamba> tried -threads 2
[01:41:33 CEST] <zamba> huffyuv looks promising
[01:43:00 CEST] <zamba> with huffyuv i'm able to maintain 25 fps
[01:43:16 CEST] <furq> iirc huffyuv is pretty fast but not very efficient
[01:43:31 CEST] <zamba> .... :)
[01:43:38 CEST] <zamba> so which one should i select? :)
[01:43:48 CEST] <furq> well for you i guess it's whichever one runs at full speed
[01:44:13 CEST] <furq> there are a bunch more options for ffv1 but i've not used it much
[01:44:23 CEST] <furq> https://trac.ffmpeg.org/wiki/Encode/FFV1#FFV1version3
[01:44:33 CEST] <furq> maybe -slices will speed it up
[01:45:05 CEST] <zamba> i'll have to look at this tomorrow again..
[01:45:18 CEST] <zamba> too many choices
[01:45:22 CEST] <zamba> i have no idea what i'm doing :)
[01:47:30 CEST] <furq> i would suggest overclocking but that's a mobile cpu isn't it
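For comparison, the other lossless codecs mentioned above are a straight -c:v swap in the same command; this is only a sketch, and huffyuv was the one that actually kept up at 25 fps here (ffvhuff or utvideo can be substituted the same way):

  # huffyuv: fast to encode but produces large files
  ffmpeg -f video4linux2 -i /dev/video1 -f alsa -ac 2 -i default -c:v huffyuv -c:a copy capture-huffyuv.mkv
  # ffv1 version 3: smaller files, more CPU; slices allow multi-threaded encoding
  ffmpeg -f video4linux2 -i /dev/video1 -f alsa -ac 2 -i default -c:v ffv1 -level 3 -slices 4 -threads 4 -c:a copy capture-ffv1.mkv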
[03:35:13 CEST] <explodes> what is the 'align' argument in av_image_get_buffer_size ?
[03:39:34 CEST] <explodes> and av_image_fill_arrays for that matter..
[04:22:38 CEST] <J_Darnley> At a guess: the byte alignment you want arrays to be allocated with
[04:22:48 CEST] <J_Darnley> possibly line alignment too
[04:35:01 CEST] <explodes> What if I do not know? I think I can have them aligned any way: alignment of 1
[06:47:40 CEST] <varu-> fyi i figured out what the issue was, ird problem. tcpdump'd the udp stream into a pcap and looked at it with wireshark, all null packets! ffmpeg had nothing to decode
[06:48:31 CEST] <varu-> the rest was just trial and error, now works just fine. amazing software!
[07:52:37 CEST] <zamba> furq: it's a P8400 CPU, yeah
[07:52:42 CEST] <zamba> dunno if it's a mobile one or not
[08:43:40 CEST] <rrva> is there anything like http://gopchop.sourceforge.net/ but for h264 ?
[10:09:43 CEST] <Keshl> rrva: Tell me if you find out. I've been looking for something like this for /ages/. o_o
[10:18:42 CEST] <rrva> Keshl: avidemux or ffmpeg -i ${1} -f segment -reset_timestamps 1 -segment_list out.ffcat -segment_times 172.16,202.56 -c copy -copy_unknown -map 0 ${1}_02%d.ts
[10:18:56 CEST] <rrva> Keshl: both of those can cut streams without re-encoding
[10:19:11 CEST] <rrva> Keshl: but still not as unmodified as gopchop
[10:31:52 CEST] <Keshl> rrva: I think what you just threw at me does what I need it to do. o_o.
[10:32:31 CEST] <Keshl> Thank you. o_o. Why won't that work for you, though? o_o?
[10:38:46 CEST] <momomo> how do you create multiple outputs at once using one ffmpeg connection to an input ?
[10:38:48 CEST] <GNUbahn> Hi. I'm trying to get ffmpeg to work on my LinuxMint. It seems that I can't add the necessary PPA but instead have to download, extract and execute the linuxbuild from ffmpeg.org. But how and what do I execute?
[10:43:54 CEST] <GNUbahn> VB
[10:43:56 CEST] <c_14> GNUbahn: do you want the static build or are you trying to build ffmpeg yourself?
[10:45:13 CEST] <GNUbahn> c_14: I just need to install the program itself
[10:45:41 CEST] <c_14> Then just download the static build, extract it and place the "ffmpeg" file in your PATH somewhere (probably /usr/local/bin)
[10:47:32 CEST] <GNUbahn> c_14: there is no ffmpeg file. There is ffmpeg.s ffmpeg.h and six ffmpeg.*
[10:47:57 CEST] <furq> GNUbahn: http://johnvansickle.com/ffmpeg/
[10:48:06 CEST] <c_14> You downloaded the sources, you want the build from what furq just posted
[10:51:36 CEST] <GNUbahn> according to the ffmpeg site I downloaded the program itself
[10:54:11 CEST] <c_14> Did you press the big green download button? (ffmpeg-3.0.1.tar.bz2) ?
[10:54:28 CEST] <c_14> Because those are the sources
[10:54:50 CEST] <c_14> FFmpeg itself doesn't host binary builds, those are hosted by third parties.
[10:55:29 CEST] <GNUbahn> c_14: That's what I did, believing it to be the program. Thanks for helping me out of the mess
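For anyone following along, c_14's static-build route boils down to something like the following; the archive and directory names vary by release and architecture, so treat this as a sketch:

  # download a static build (e.g. from johnvansickle.com/ffmpeg), then:
  tar xf ffmpeg-release-64bit-static.tar.xz
  cd ffmpeg-*-static
  sudo cp ffmpeg ffprobe /usr/local/bin/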
[10:58:43 CEST] <GNUbahn> I was helped to find this page: https://launchpad.net/~mc3man/+archive/ubuntu/trusty-media
[11:05:44 CEST] <edkamsalp> I use this command to capture the desktop:
[11:05:47 CEST] <edkamsalp> sleep 2s; ffmpeg -video_size 1024x768 -framerate 25 -f x11grab -i :0 -f alsa -ac 2 -i hw:0 output.mkv
[11:06:13 CEST] <edkamsalp> to stop capturing, I use the key q
[11:06:30 CEST] <edkamsalp> Is it possible to pause/resume?
[11:09:16 CEST] <edkamsalp> Can ffmpeg be paused and resume during screen capturing?
[11:09:37 CEST] <relaxed> edkamsalp: you can try "ctrl z" and then "fg" to resume, but I'm not sure how well it will work.
[11:10:07 CEST] <GNUbahn> it still doesn't work though. When I type 'ffmpeg -f concat -i mylist.txt -c copy HTS1.sl1.r.mp4' I'm told the command wasn't found
[11:11:34 CEST] <edkamsalp> GNUbahn: ffmpeg -f concat -i input.txt -c copy output.mp4 works for me
[11:11:48 CEST] <edkamsalp> How do you write your mylist.txt?
[11:13:50 CEST] <GNUbahn> edkamsalp: Thanks, but it doesn't for me. I'm still told that the commands can't be found
[11:15:21 CEST] <edkamsalp> GNUbahn: type and Enter this:
[11:15:24 CEST] <edkamsalp> hgsfyugcbshmgv
[11:17:41 CEST] <edkamsalp> relaxed: It does not stop the screen capture! The capture just freezes! It stops capturing new activity, but the time keeps going!
[11:22:52 CEST] <GNUbahn> edkamsalp: ?
[11:23:47 CEST] <edkamsalp> ffmpeg -f concat -i mylist.txt -c copy HTS1.sl1.r.mp4 | curl -F c=@- https://ptpb.pw
[11:24:03 CEST] <edkamsalp> GNUbahn: try it and share the link
[11:24:17 CEST] <GNUbahn> c_14 & edkamsalp: I think I solved the issue. It turned out to be me being an idi*t
[11:24:32 CEST] <GNUbahn> Not installing correctly
[11:24:50 CEST] <GNUbahn> edkamsalp: What does this mean: curl -F c=@- https://ptpb.pw
[11:25:01 CEST] <zamba> TD-Linux: lo! do you have more details/suggestions to using the ffv1 codec?
[11:25:26 CEST] <edkamsalp> GNUbahn: it's a pastebin
[11:26:58 CEST] <GNUbahn> edkamsalp: so if I write that, you'll be able to see the output on https://ptpb.pw?
[11:32:17 CEST] <GNUbahn> edkamsalp: I tried but can't open the link I get in firefox
[11:32:59 CEST] <edkamsalp> GNUbahn:
[11:33:07 CEST] <edkamsalp> alias ptpb='curl -F c=@- https://ptpb.pw'
[11:33:10 CEST] <edkamsalp> then
[11:33:23 CEST] <edkamsalp> cat foo.txt | ptpb
[11:34:57 CEST] <termos> trying to run the example from https://github.com/FFmpeg/FFmpeg/blob/master/doc/examples/transcoding.c using libx264 it fails and tells me that "broken ffmpeg default settings detected"? Is this a known bug?
[11:35:06 CEST] <edkamsalp> GNUbahn: it uploads the stdout to its servers and returns a link to you. A link in this format: https://ptpb.pw/9orb
[11:35:35 CEST] <GNUbahn> edkamsalp: Yeah I got that working but for some reason firefox won't open that link
[11:36:28 CEST] <edkamsalp> They open well here in FF.
[11:36:56 CEST] <GNUbahn> edkamsalp: try https://ptpb.pw/2AcJ
[11:37:14 CEST] <GNUbahn> It's obsolete now though
[11:38:04 CEST] <GNUbahn> I got it to work and am now concatenating two files.
[11:38:34 CEST] <edkamsalp> GNUbahn: https://ptpb.pw/2AcJ --> it's empty! <command> | ptpb --> the output of your <command> is nothing!
[11:39:04 CEST] <GNUbahn> edkamsalp: Yeah really empty. I wonder what went wrong!?
[11:40:49 CEST] <furq> if the command was ffmpeg then you didn't redirect stderr to stdout
[11:41:29 CEST] <GNUbahn> furq: So what should the command have been?
[11:41:48 CEST] <furq> ffmpeg [opts] 2>&1 | ptbp
[11:42:27 CEST] <GNUbahn> furq: this was my command: ffmpeg -f concat -i mylist.txt -c copy HTS1.sl1.r.mp4 | curl -F c=@- https://ptpb.pw
[11:42:43 CEST] <furq> yeah that doesn't do anything
[11:43:10 CEST] <GNUbahn> furq: because?
[11:43:21 CEST] <furq> because it doesn't have 2>&1 after the ffmpeg command
[11:43:43 CEST] <GNUbahn> furq: I'm sorry, I don't know what you mean
[11:45:00 CEST] <furq> ffmpeg -f concat -i mylist.txt -c copy HTS1.sl1.r.mp4 2>&1 | curl -F c=@- https://ptpb.pw
[11:46:07 CEST] <momomo> I am trying to generate multiple outputs, and looking at this:
[11:46:07 CEST] <momomo> https://trac.ffmpeg.org/wiki/Creating%20multiple%20outputs
[11:46:28 CEST] <momomo> should I be using "no filtering" or "filtering for all outputs" ?
[11:46:49 CEST] <momomo> what is the diffference? performance benefits ?
[11:47:22 CEST] <furq> do you want to apply filters to the outputs
[11:48:14 CEST] <GNUbahn> furq & edkamsalp: Now I got the pastebin thing working (though I still don't know what 2>&1 is good for) What do you get from this: https://ptpb.pw/Rmw-
[11:48:22 CEST] <momomo> may I ask what is meant by filters? if I want to apply an alpha image ?
[11:48:26 CEST] <momomo> or something ?
[11:48:28 CEST] <momomo> :S
[11:48:41 CEST] <momomo> I would like to add an image to the stream ( a logo )
[11:48:49 CEST] <furq> momomo: https://ffmpeg.org/ffmpeg-filters.html#toc-Video-Filters
[11:48:55 CEST] <momomo> as well as generate one output for desktop and one for mobile
[11:49:08 CEST] <furq> if you want to add an image then you'll need to filter all outputs
[11:49:25 CEST] <edkamsalp> GNUbahn: https://en.wikipedia.org/wiki/Standard_streams + http://www.cyberciti.biz/faq/redirecting-stderr-to-stdout/
[11:50:25 CEST] <momomo> furq, thanks, may I ask, I want to apply a logo as well at some corner, would I do that using filters or is that for other things such as manipulating colors and others ?
[11:51:48 CEST] <GNUbahn> edkamsalp & furq: I've got to go now, but will probably return with further questions. Thanks a lot for your patience and help.
[11:51:57 CEST] <edkamsalp> GNUbahn: didn't you say already your work is done?!
[11:52:45 CEST] <furq> momomo: https://ffmpeg.org/ffmpeg-filters.html#overlay-1
[11:53:10 CEST] <momomo> furq, that's one video over the other ?
[11:53:20 CEST] <furq> it works just as well with images
[11:53:28 CEST] <momomo> ook
[11:53:41 CEST] <furq> ffmpeg usually considers an image to be a 1-frame video
[11:53:58 CEST] <momomo> furq, this is the best way to do it?
[11:54:11 CEST] <furq> probably
[11:54:14 CEST] <furq> i've never had to do it
[11:54:24 CEST] <momomo> furq, ook, good to know.. learning more :)
[11:54:29 CEST] <furq> there's certainly no way which doesn't involve filters
[11:55:45 CEST] <momomo> i have another issue that I've been pushing for some time .. and that is ... normally I would have streams generating hls videos ... at some point, I would like to push in an advert into an existing running stream ... is that possible? to manipulate an existing ffmpeg command ?
[11:56:05 CEST] <momomo> or pause it
[11:56:26 CEST] <momomo> so I can manipulate the playlist.m3u8 file and push my own hls segment
[12:13:57 CEST] <momomo> I am getting an error for this command:
[12:14:02 CEST] <momomo> ffmpeg -i http://user:pass@domain.host.ca:9981/play/stream/channel/50690c661e5c524dffffacca7cf133e0 -x264-params scenecut=0 -x264opts keyint_min=125 -g 125 -r 25 -framerate 25 -b:v 1024k -bufsize 1024k -minrate 1024k -maxrate 1024k -c:v libx264 -c:a libvo_aacenc -b:a 64k -ac 2 -f hls -hls_list_size 3 -hls_time 5 -hls_flags delete_segments /momomo/Generated/Tv/2488934447TV4/c -s 736x306 -x264-params scenecut=0 -x264opts keyint_min=125
[12:14:02 CEST] <momomo> -g 125 -r 25 -framerate 25 -b:v 1024k -bufsize 1024k -minrate 1024k -maxrate 1024k -c:v libx264 -c:a libvo_aacenc -b:a 64k -ac 2 -f hls -hls_list_size 3 -hls_time 5 -hls_flags delete_segments /momomo/Generated/Tv/2488934447TV4/m
[12:14:19 CEST] <momomo> does this command look right? do I have to repeat both commands like that ?
[12:14:27 CEST] <momomo> error: [NULL @ 0x2232600] Unable to find a suitable output format for ''
[12:15:21 CEST] <momomo> the command on multiple lines: http://hastebin.com/ceparekuna.sm
[12:16:18 CEST] <momomo> should I perhaps be piping the first command to the second to generate the smaller size ?
[12:17:24 CEST] <momomo> or the tee pseudo-mixer ?
[12:33:13 CEST] <momomo> I am getting the same error when using tee:
[12:33:13 CEST] <momomo> http://hastebin.com/gahizareta.sm
[12:40:06 CEST] <relaxed> momomo: omit -framerate 25
[12:40:44 CEST] <momomo> maybe, but then it probably doesn't matter .. or do you mean it would resolve my issue ?
[12:41:53 CEST] <relaxed> no, but you should still do it
[12:42:12 CEST] <momomo> hmm .. there is an error i am unable to spot .. i even reverted to older version and still getting same error
[12:42:39 CEST] <momomo> maybe the extra spaces are not allowed ?
[12:50:43 CEST] <tommy``> hello
[12:51:48 CEST] <tommy``> which is the best gui for ffmpeg?
[12:52:44 CEST] <BtbN> I prefer Cygwin MinTTY.
[12:53:24 CEST] <ADenizA> Hello, I have an IP camera and I want to create a live-stream from a location inside the video (for example a motion tracked object). How can I create this live-stream using ffmpeg? In more detail: how can I push the frames I crop from the input video into an HLS stream using ffmpeg from code?
[12:54:15 CEST] <sagax> hi all!
[12:54:38 CEST] <sagax> how to change sample rate for libvorbis codec in ogg format?
[12:55:08 CEST] <tommy``> BtbN: how do you do with commands? You remember the syntax every time?
[12:55:40 CEST] <BtbN> Well, it doesn't change, so I only remembered it one time.
[12:56:51 CEST] <tommy``> for example if i want to extract the audio from an .avi without knowing the audio codec inside of it, what do i have to do?
[12:57:12 CEST] <sagax> -acodec copy
[12:57:14 CEST] <sagax> maybe
[12:57:22 CEST] <BtbN> -i file.avi -c:a copy -vn -sn out.whatever
[13:09:14 CEST] <Thor________> hi. Does anyone know if FFServer supports RTSP or HLS as streaming formats ?
[13:11:16 CEST] <Thor________> or is it better to use FFMpeg as a "vod" server ?
[13:43:56 CEST] <tommy``> is it possible to increase the speed of extraction? currently I've got 20-21x
[13:44:46 CEST] <c_14> Get a faster hard drive?
[13:45:05 CEST] <tommy``> it's 7200rpm
[13:45:15 CEST] <c_14> Get an ssd
[13:45:24 CEST] <tommy``> i've ssd on C:
[13:45:34 CEST] <c_14> But no, there's not really a faster way.
[13:45:40 CEST] <c_14> You're most likely IO-limited
[13:45:48 CEST] <c_14> Short of getting a faster IO interface, there's not much you can do.
[13:46:36 CEST] <tommy``> I think also my cpu/ram are crap for ffmpeg (dual core 2.60 ghz with 6GB ddr2)
[13:46:56 CEST] <c_14> Shouldn't be that relevant if you're using -c copy
[13:47:09 CEST] <tommy``> the command I'm using is:
[13:47:18 CEST] <tommy``> ffmpeg -i file.avi -vn audio.mp3
[13:48:06 CEST] <furq> that's encoding the audio
[13:48:17 CEST] <tommy``> not only plain extraction?
[13:48:19 CEST] <furq> no
[13:48:22 CEST] <tommy``> ah
[13:48:24 CEST] <furq> -c:a copy will copy the audio stream
[13:49:01 CEST] <tommy``> ok i'll try
[13:53:09 CEST] <tommy``> furq: now i launched this: ffmpeg -i file.avi -c:a copy -vn -sn audio.mp3
[13:53:11 CEST] <tommy``> good?
[13:53:29 CEST] <furq> sure
[13:54:19 CEST] <tommy``> what does -sn mean?
[13:54:26 CEST] <furq> no subtitles
[13:54:35 CEST] <furq> you probably don't need -vn or -sn if the output container is audio-only
[13:54:38 CEST] <tommy``> so -vn i suppose "video no"
[13:54:52 CEST] <furq> it would be useful if you were extracting to m4a or some other container which is also used for video
[13:55:23 CEST] <tommy``> yes the output is only .mp3, i extract .mp3 to mux into a 1080p web-dl
[13:58:58 CEST] <tommy``> furq i think i'll enjoy making some .bat for autoextracting audio with this string :D
[14:15:12 CEST] <tommy``> something wrong here: ffmpeg -i audio08.mp3 -c:a libfdk_aac -b:a audio08.aac ?
[14:29:49 CEST] <J_Darnley> Yes, you are encoding one lossy format to another
[14:34:14 CEST] <furq> you didn't give an argument to -b:a
[14:34:26 CEST] <furq> also you probably don't have libfdk_aac unless you compiled ffmpeg yourself
[14:34:33 CEST] <tommy``> i changed the syntax because it seems i don't have libfdk_aac on windows
[14:34:36 CEST] <tommy``> now i'm using:
[14:34:41 CEST] <J_Darnley> and if you want a minor speed up don't read and write to the same disk
[14:34:42 CEST] <tommy``> ffmpeg -i audio08.mp3 -c:a copy audio.aac
[14:34:49 CEST] <furq> well that's not going to work at all
[14:35:14 CEST] <tommy``> ffmpeg -i audio08.mp3 -c:a aac copy audio.aac <<--- sorry
[14:35:34 CEST] <furq> i'm going to assume you don't actually have "copy" in there
[14:35:45 CEST] <furq> also use audio.m4a, not audio.aac
[14:36:35 CEST] <tommy``> furq: "ffmpeg -i audio.mp3 aac audio.m4a" better?
[14:37:34 CEST] <J_Darnley> If that is your literal command, no
[14:37:45 CEST] <tommy``> yes i suppose yes
[14:38:01 CEST] <tommy``> furq says that i don't have "copy" so i removed -c:a
[14:38:19 CEST] <J_Darnley> Then what the heck does that "aac" belong to?
[14:40:21 CEST] <tommy``> ah yes: ffmpeg -i audio.mp3 audio.m4a
[14:40:28 CEST] <tommy``> this seems working
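Spelled out explicitly, the conversion tommy`` was after would look something like this; the bitrate is only an example value, and as J_Darnley noted, mp3 to aac is a lossy-to-lossy conversion either way:

  # re-encode mp3 to AAC in an m4a container using the native aac encoder
  ffmpeg -i audio08.mp3 -c:a aac -b:a 192k audio08.m4a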
[15:07:01 CEST] <zamba> furq: i'm performing some additional tests with the following command now: ffmpeg -f video4linux2 -i /dev/video1 -f alsa -ac 2 -i default -c:a copy -vcodec ffv1 -level 3 -threads 2 -coder 1 -context 1 -g 1 -slices 24 -slicecrc 1 output.mkv
[15:07:17 CEST] <zamba> furq: and that runs at around 25 fps (though, to begin with it's at around 30)
[15:09:45 CEST] <zamba> but i'm getting bursts of these: Past duration 0.997215 too large
[15:10:02 CEST] <furq> i'd have thought slicecrc would slow it down
[15:11:33 CEST] <zamba> furq: so i should omit that?
[15:11:41 CEST] <zamba> furq: but what is that "Past duration" message?
[15:11:43 CEST] <furq> probably. i've never used ffv1
[15:12:23 CEST] <furq> you can usually ignore that warning
[15:13:14 CEST] <zamba> i don't get it if i do '-f null -' instead of 'output.mkv'
[15:13:25 CEST] <zamba> but then i instead get [null @ 0x9f152c0] Application provided invalid, non monotonically increasing dts to muxer in stream 0: 36 >= 36
[15:13:47 CEST] <zamba> every 70 or so frames
[15:14:06 CEST] <zamba> and then the fps is 15 instead of 25.. so this makes no sense
[15:18:00 CEST] <zamba> http://pastebin.com/03Gcg4N8
[15:31:11 CEST] <varu-> so i've got one ffmpeg instance listening to a udp stream sent to it by a device, it's an mpegts stream carrying multiple services/PIDs
[15:32:06 CEST] <varu-> i'm successfully selecting one, however i'd like to run multiple ffmpeg sessions to be able to capture and decode all of them separately (they're being re-encoded and streamed via rtmp)
[15:32:50 CEST] <varu-> when i try, ffmpeg throws a socket binding error, i'm assuming only one app can sit and listen on the udp port the stream is coming in on
[15:33:38 CEST] <varu-> is there a way to have multiple ffmpeg sessions 'reading' the same udp port?
[15:33:50 CEST] <c_14> varu-: just map separate outputs
[15:34:08 CEST] <c_14> i.e. ffmpeg -i udp:// -map 0:1 out.mkv -map 0:2 -map 0:3 out2.mkv
[15:34:10 CEST] <c_14> etc
[15:34:30 CEST] <c_14> If that doesn't fit your usecase (you don't want them running all at the same time) you're going to have to get some sort of multiplexer in the middle
[15:35:04 CEST] <varu-> i can do that, but will it use a separate thread for encoding each?
[15:35:22 CEST] <Mavrik> yes
[15:35:24 CEST] <c_14> yep
[15:35:36 CEST] <c_14> It might even use several threads for each
[15:35:48 CEST] <c_14> depending on the threading settings of the encoders used
[15:37:06 CEST] <BtbN> isn't the ffmpeg "pipeline" single-threaded? Meaning that it will demux sequentialy, sending each frame/packet to the de/encoder in order, and not in parallel?
[15:37:59 CEST] <Mavrik> Um, not sure about the question?
[15:38:20 CEST] <Mavrik> Components that can be multithreaded are multithreaded.
[15:38:30 CEST] <Mavrik> Ones where it makes no sense (muxers, demuxers) aren't.
[15:41:12 CEST] <BtbN> The entire pipeline isn't though.
[15:41:23 CEST] <BtbN> So one slow encoder will slow everything down.
[15:41:50 CEST] <Mavrik> Uhm, why would it?
[15:42:11 CEST] <BtbN> because that's what happens if something is single-threaded and some part of it only handled 5fps.
[15:42:19 CEST] <varu-> my only concern with doing this is, should the stream IDs in the mux change (added or removed), i have to take the whole thing down to adjust
[15:42:31 CEST] <varu-> running separate sessions would resolve that
[15:42:47 CEST] <BtbN> you can't listen on the same port with multiple applications, so you have no other option if you want to use ffmpeg.
[15:43:08 CEST] <c_14> Besides hanging some sort of multiplexer in front of ffmpeg
[15:43:21 CEST] <BtbN> you could of course start one ffmpeg that just does -c copy to a local http/tcp server
[15:43:29 CEST] <BtbN> and then connect multiple other ffmpeg to that.
[15:43:47 CEST] <varu-> that's certainly an idea!
[15:44:01 CEST] <Mavrik> BtbN, yes, but that's not completely the case with ffmpeg
[15:44:07 CEST] <Mavrik> It depends on encoders though.
[15:44:26 CEST] <BtbN> As far as I'm aware, ffmpeg is only multithreaded within codecs/filters
[15:44:37 CEST] <BtbN> the API itself is strictly single-threaded and synchronous
[15:45:06 CEST] <Mavrik> Yes.
[15:45:13 CEST] <Mavrik> Hence "depends on encoders"
[15:45:28 CEST] <Mavrik> Because some queue input files, dispatch to threads and then give you output on next invocation.
[15:45:32 CEST] <Mavrik> Same for filters.
[15:45:56 CEST] <BtbN> Yes, but if one encoder in the entire single thread is capped at 5 fps, the entire pipeline can't run faster than that.
[15:46:46 CEST] <BtbN> Even if there is an entirely separate branch of the filter/encode graph that in theory could run faster on its own
[15:46:56 CEST] <Mavrik> Yes. That's why it depends on encoders :)
[15:47:19 CEST] <BtbN> Uhm, not really
[15:47:21 CEST] <Mavrik> On a decently fast machine encoding x264 isn't problematic
[15:47:26 CEST] <Mavrik> It'll saturate cores well.
[15:47:35 CEST] <Mavrik> Other encoders are a bit more problematic.
[15:47:37 CEST] <BtbN> But encoding with x264 5 times in the same thread could very well be
[15:47:51 CEST] <BtbN> even if all 5 of them could still run 30 fps on their own, even in parallel
[15:54:36 CEST] <hanshenrik_> how can i ask ffmpeg if the input is a static image or something else?
[15:58:29 CEST] <J_Darnley> You want to know if a video is the same frame repeated? Or at least very similar?
[15:58:52 CEST] <hanshenrik_> no, i want to know if the input file is a .jpg or a .mp4
[15:58:56 CEST] <hanshenrik_> or something else
[15:59:10 CEST] <hanshenrik_> .jpg / .png / .bmp etc are static
[15:59:18 CEST] <hanshenrik_> .mp4 would not be
[15:59:24 CEST] <hanshenrik_> as for .gif... dunno
[15:59:32 CEST] <J_Darnley> Number of video frames then.
[15:59:34 CEST] <hanshenrik_> yeah
[15:59:37 CEST] <hanshenrik_> that!
[15:59:39 CEST] <J_Darnley> Perhaps ffprobe
[15:59:59 CEST] <J_Darnley> and its show frames option
[16:00:26 CEST] <hanshenrik_> and it has a json output format! fantastic
[16:00:43 CEST] <hanshenrik_> thanks
[16:20:26 CEST] <varu-> hmm, looking at ffserver, i wonder if there's a way to get it to copy the feed to multiple streams
[16:20:49 CEST] <varu-> looking at the stream config parameters, you *have* to do encoding on them, you can't just copy the raw feed... or can you?
[16:41:21 CEST] <hanshenrik_> is that correct?
[16:41:28 CEST] <hanshenrik_> wrong window
[16:43:08 CEST] <hanshenrik_> the -print_format json seems to be bugged
[16:43:39 CEST] <hanshenrik_> it printed { [mjpeg @ 0x47d60] changing bps to 8 \n Input #0, image2~~t
[16:43:45 CEST] <hanshenrik_> that's not like any json i've ever seen
[16:43:49 CEST] <hanshenrik_> ohhh wait
[16:43:56 CEST] <hanshenrik_> maybe its just stdout and stderr mixed
[16:48:59 CEST] <hanshenrik_> ffprobe -print_format json -count_frames 'jpeg.jpg'
[16:49:08 CEST] <hanshenrik_> its returning an empty json
[16:49:09 CEST] <hanshenrik_> and -1
[16:49:14 CEST] <hanshenrik_> in a valid .jpeg file
[16:49:47 CEST] <hanshenrik_> valid .mp4 file* x.x
[16:50:25 CEST] <hanshenrik_> ffprobe -print_format json -count_frames 'mp4.mp4'
[16:50:42 CEST] <hanshenrik_> valid mp4 file, i can play it back just fine,
[16:50:48 CEST] <hanshenrik_> it gives me empty json and returns -1
[16:50:58 CEST] <hanshenrik_> it does detect video stream and auto stream
[16:51:12 CEST] <hanshenrik_> says "video: h264" etc etc
[16:51:20 CEST] <hanshenrik_> any idea what im doing wrong?
[16:53:53 CEST] <hanshenrik_> ffprobe -print_format json 'mp420.mp4' also empty json
[16:58:46 CEST] <hanshenrik_> apparently i need to add -select_streams v:0
[16:58:56 CEST] <hanshenrik_> then it started giving sensible output ^^
[17:04:33 CEST] <hanshenrik_> .. and -show_streams
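Assembled into one call, the ffprobe invocation hanshenrik_ ended up with would be roughly the following; with -count_frames the stream section gains an nb_read_frames field, so a still image reports 1 while a video reports its real frame count:

  ffprobe -v error -select_streams v:0 -count_frames -show_streams -print_format json input.mp4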
[17:39:52 CEST] <hanshenrik_> https://trac.ffmpeg.org/wiki/Create%20a%20thumbnail%20image%20every%20X%20seconds%20of%20the%20video
[17:40:07 CEST] <hanshenrik_> nvm
[17:43:32 CEST] <momomo> is this -tee command flawed?
[17:43:35 CEST] <momomo> tee -map 0:v -map 0:a /momomo/Generated/Tv/3220225315SVT1/c|[s=480x320]/momomo/Generated/Tv/3220225315SVT1/m
[17:43:47 CEST] <momomo> do I need to escape the path ?
[17:44:35 CEST] <J_Darnley> | is a special char in most shell so I would expect "yes"
[17:58:06 CEST] <momomo> J_Darnley, I am trying to create multiple outputs
[17:58:12 CEST] <momomo> here is my full command:
[17:58:15 CEST] <momomo> ffmpeg -i http://user:pass@domain.ca:9981/play/stream/channel/9dd70a30d7f544e23950f7ee0e6e1f86 -x264-params scenecut=0 -x264opts keyint_min=125 -g 125 -r 25 -framerate 25 -b:v 1024k -bufsize 1024k -minrate 1024k -maxrate 1024k -c:v libx264 -c:a libvo_aacenc -b:a 64k -ac 2 -f hls -hls_list_size 3 -hls_time 5 -hls_flags delete_segments -f tee -map 0:v -map 0:a "/momomo/Generated/Tv/3220225315BBB/c|[s=480x320]/momomo/Generated/Tv/3220225315AAA/m"
[17:58:30 CEST] <momomo> http://hastebin.com/etucufutup.sm
[17:58:34 CEST] <momomo> but it is not working
[17:58:43 CEST] <momomo> not sure what is wrong here
[18:02:34 CEST] <momomo> is it possible to generate multiple hls outputs from one command? or is this only for mp4 and others ?
[18:08:17 CEST] <explodes> Repeated uses of av_probe_input_format fails with a segfault, I have no idea what could be causing it - none of the data I'm passing in is novel
[18:08:23 CEST] <J_Darnley> What does "not working" mean?
[18:09:11 CEST] <rrva> how can I get ffprobe to display TS packet number and byte position for each i-frame?
[18:09:16 CEST] <J_Darnley> Am I going to have to allow javascript to read that?
[18:09:25 CEST] <J_Darnley> It better contain ffmpeg's output.
[18:09:42 CEST] <podman> Are there any benefits to having one ffmpeg process produce multiple outputs vs multiple ffmpeg processes producing a single output running concurrently?
[18:09:52 CEST] <J_Darnley> ... it doesn't
[18:10:34 CEST] <J_Darnley> podman: reading input once might be better than several times.
[18:10:50 CEST] <J_Darnley> perhaps an input device can only be opened once.
[18:10:59 CEST] <J_Darnley> perhaps it is a network stream
[18:11:20 CEST] <J_Darnley> perhaps it is a colossal uncompressed 4k file so all the time is spent reading it from disk
[18:12:01 CEST] <podman> Right now I think memory is a constraint. Testing encoding a 8K file into multiple resolutions. 32GB of ram and it runs out :\
[18:14:47 CEST] <J_Darnley> Well encoding 8k isn't going to be easy on memory
[18:15:35 CEST] <J_Darnley> First of all make sure you are using a 64bit binary
[18:15:48 CEST] <J_Darnley> Then perhaps look at the encoding settings
[18:16:22 CEST] <J_Darnley> If you were using libx264 I would tell you to reduce the lookahead and save some memory
[18:16:41 CEST] <J_Darnley> Other encoders, libx265 for instance, may have a similar option.
[18:19:40 CEST] <momomo> anyone knows how to create several hls segments files? one for mobile ( smaller size ) and one for computers
[18:20:22 CEST] <J_Darnley> Use several outputs, not the tee muxer
[18:20:46 CEST] <momomo> I was hoping that I didn't have to
[18:21:19 CEST] <J_Darnley> The tee muxer is for writing the same thing to several places (if I understand it correctly)
[18:21:51 CEST] <podman> Yeah, I switched the preset from faster to veryfast and at least I didn't run out of memory (i know that reduces lookahead)
[18:22:27 CEST] <podman> it'll be interesting to see what kind of gains there may or may not be by using one FFMPEG process instead
[18:23:00 CEST] <furq> well you only have to decode it once if you do it all in one process
[18:23:03 CEST] <J_Darnley> What format and codec is the input?
[18:23:15 CEST] <podman> i think it's just mp4 h264
[18:23:26 CEST] <podman> but ultimately it could be anything
[18:23:29 CEST] <furq> with an 8K source i'd imagine that's not insignificant
[18:24:06 CEST] <podman> furq: yeah, I'll have to test it out. It would be a major rewrite, but could be worth it in the end
[18:24:08 CEST] <furq> i'd be doing that regardless of the source, though
[18:24:20 CEST] <furq> there's no point duplicating work even if it's only decoding
[18:27:10 CEST] <momomo> furq, is hls segmenting into differnt output formats using tee going to work ?
[18:27:26 CEST] <furq> 17:21:19 ( J_Darnley) The tee muxer is for writing the same thing to several places (if I understand it correctly)
[18:27:29 CEST] <furq> doesn't look like it
[18:27:41 CEST] <momomo> tried this: http://hastebin.com/iyalagogen.sm
[18:28:36 CEST] <momomo> but i am getting: Opening an output file: c.m3u8|[s=480x320]m.m3u8.
[18:28:51 CEST] <momomo> I also tried to escape that part but the quotes just got included
[18:29:00 CEST] <furq> yeah that's not what the tee muxer does
[18:29:12 CEST] <furq> it's for writing the same output to multiple files
[18:29:18 CEST] <furq> if you want different outputs then use multiple outputs
[18:29:32 CEST] <J_Darnley> http://ffmpeg.org/ffmpeg-formats.html#tee
[18:29:40 CEST] <furq> https://trac.ffmpeg.org/wiki/Creating%20multiple%20outputs#Samefilteringforalloutputs
[18:29:44 CEST] <furq> i'm pretty sure you linked that earlier
[18:29:46 CEST] <furq> that's what you want
[18:29:55 CEST] <momomo> i thought because it could take more options it could do that: -map 0:a "output.mkv|[f=mpegts]udp://10.0.1.255:1234/"
[18:29:57 CEST] <momomo> from https://trac.ffmpeg.org/wiki/Creating%20multiple%20outputs
[18:30:19 CEST] <furq> yes that is what it does
[18:30:25 CEST] <furq> but that's not what you said you want to do
[18:30:54 CEST] <momomo> well, i though i could pass another size in there
[18:31:00 CEST] <furq> oh nvm i see what you mean. no it won't rescale
[18:31:11 CEST] <momomo> so should I use filtering or no filtering ?
[18:31:20 CEST] <furq> didn't i already answer this
[18:31:30 CEST] <momomo> not sure, don't remember
[18:31:38 CEST] <momomo> that was regarding the logo i think then
[18:31:50 CEST] <furq> 10:49:08 ( furq) if you want to add an image then you'll need to filter all outputs
[18:32:22 CEST] <furq> there's not really any clearer way to say it. if you want to apply filters then apply filters, otherwise don't
[18:32:58 CEST] <podman> J_Darnley: furq: yeah... so it uses WAY less memory, so that's an improvement. might be way slower though
[18:32:58 CEST] <momomo> ook, i probably will, but I meant this specific case, generating sizes .. but I guess it doesn't have to do with filters or not
[18:42:36 CEST] <momomo> it works now using this huge command, do I have to repeat all of it in the second?
[18:42:36 CEST] <momomo> ffmpeg -v 9 -loglevel 99 -re -i http://user:pass@domain.com:9981/play/stream/channel/9dd70a30d7f544e23950f7ee0e6e1f86 -s 1280x720 -x264-params scenecut=0 -x264opts keyint_min=125 -g 125 -r 25 -framerate 25 -b:v 1024k -bufsize 1024k -minrate 1024k -maxrate 1024k -c:v libx264 -c:a libvo_aacenc -b:a 64k -ac 2 -f hls -hls_list_size 3 -hls_time 5 -hls_flags delete_segments /momomo/Generated/Tv/3220225315BBB/c.m3u8 -s 480x320 -x264-params scenecut=0 -x264opts keyint_min=125 -g 125 -r 25 -framerate 25 -b:v 1024k -bufsize 1024k -minrate 1024k -maxrate 1024k -c:v libx264 -c:a libvo_aacenc -b:a 64k -ac 2 -f hls -hls_list_size 3 -hls_time 5 -hls_flags delete_segments /momomo/Generated/Tv/3220225315AAA/m.m3u8
[18:42:58 CEST] <momomo> they go from -s to -s
[18:43:28 CEST] <momomo> more readable format: http://hastebin.com/ujuliyohiv.sm
[18:43:35 CEST] <momomo> do you see any potential improvements ?
[18:44:41 CEST] <momomo> for some reason, my mobile output with a smaller screen resulted in larger ts files
[18:44:44 CEST] <momomo> :S
[18:47:24 CEST] <momomo> is -re option any good?
[18:48:23 CEST] <furq> it makes no difference for your use case
[18:48:41 CEST] <momomo> ook good to know
[18:48:54 CEST] <furq> also your outputs are different aspect ratios
[18:49:52 CEST] <furq> and you're using the same bitrate for both streams, which is probably why the smaller one generated larger outputs
[18:50:17 CEST] <momomo> oh, should i perhaps change to crf 23 or similar ?
[18:50:28 CEST] <furq> i thought you wanted a fixed bitrate
[18:50:59 CEST] <momomo> i am confused at this point. what i really want is fixed segment length
[18:51:15 CEST] <furq> you already have that
[18:51:16 CEST] <momomo> i think your keyint_min and -r 25 did that
[18:51:23 CEST] <furq> i thought you also wanted fixed segment size
[18:51:35 CEST] <momomo> no, i had mistakenly imagined that was important
[18:51:38 CEST] <furq> oh ok
[18:51:46 CEST] <momomo> it was actually the time that is important to avoid lags
[18:51:49 CEST] <furq> well yeah crf is much better than b:v
[18:51:57 CEST] <furq> you can use the same value for both
[18:52:11 CEST] <furq> the same crf value for both output streams, i mean
[18:52:12 CEST] <momomo> ooh much prettier picture now
[18:52:26 CEST] <furq> 1mbit is very low for 720p
[18:52:29 CEST] <momomo> yes, I am repeating the command
[18:52:29 CEST] <furq> so i'm not surprised
[18:53:32 CEST] <momomo> size on mobile is now around 300 k but the desktop one has grown bigger up to 1.1 mb from around 600
[18:53:48 CEST] <momomo> cpu is way down too
[18:54:17 CEST] <momomo> or maybe not.. was a temporary flux
[18:59:04 CEST] <momomo> i removed the -s 1280x720 and size got smaller too
[18:59:12 CEST] <momomo> maybe the source is not that big
[19:02:00 CEST] <momomo> ffmpeg is logging the source format
[19:02:02 CEST] <momomo> 720x576
[19:02:28 CEST] <momomo> is there a way to set a max size ? because I think now ffmpeg is upscaling the video
[19:02:38 CEST] <momomo> if it is less, then it can leave it as is
[19:02:46 CEST] <momomo> no point in upscaling it
[19:06:16 CEST] <momomo> is this it: -vf "scale=iw*min(1\,if(gt(iw\,ih)\,640/iw\,(640*sar)/ih)):(floor((ow/dar)/2))*2" ?
[19:06:18 CEST] <momomo> from: http://superuser.com/questions/566998/how-can-i-fit-a-video-to-a-certain-size-but-dont-upscale-it-with-ffmpeg
[19:08:17 CEST] <furq> as long as the aspect ratio remains intact that should be fine
[19:09:37 CEST] <momomo> the aspect ratio can vary ?
[19:10:18 CEST] <furq> 720*576 is 1.25:1, so it's always anamorphic
[19:11:10 CEST] <fatpelt> afternoon all. i've got a command i'm running that opens 16 http streams and then stacks them all up. when i look at the processor utilization i've got one core maxed and 29 others empty. i'm on a build out of current git. is there something i'm missing to use all my other cores?
[19:11:33 CEST] <furq> if you're downscaling, it'll look nicer if you scale it to the actual AR
[19:12:14 CEST] <furq> if you leave it at the native resolution and don't manually set the AR it should get passed through to the player
[19:12:28 CEST] <furq> it works with hls.js in firefox if nothing else
[19:12:57 CEST] <momomo> furq, yes, i am just worried that the sources might have very high resolution or size and I would like to cap it at the top
[19:13:12 CEST] <furq> yeah you have some nontrivial scale command to write
[19:13:13 CEST] <momomo> am I scaling it wrong now then ? the 640 here refers to the width ... i am guessing the height gets the original aspect .
[19:13:24 CEST] <momomo> -vf "scale=iw*min(1\,if(gt(iw\,ih)\,640/iw\,(640*sar)/ih)):(floor((ow/dar)/2))*2" ? ?
[19:13:31 CEST] <momomo> no good?
[19:13:38 CEST] <furq> you can just scale it proportionately and keep it anamorphic
[19:14:13 CEST] <furq> that's the simplest way to do it
[19:14:45 CEST] <momomo> furq, unfortunately i am not sure how to go about writing such a function ... i figured this function already did that .. maybe it does something else
[19:15:09 CEST] <furq> i believe that function will do what you want if you want the smaller outputs to be anamorphic
[19:15:16 CEST] <furq> or rather if you're happy with that
[19:15:59 CEST] <momomo> isn't anamorphic always preferable ? let the device rescale it
[19:16:05 CEST] <furq> i'd expect the picture quality to be slightly worse
[19:16:13 CEST] <furq> and i'm not totally confident that every player will scale it correctly
[19:16:24 CEST] <momomo> furq, i am thinking mobile devices
[19:16:31 CEST] <momomo> iphone should be able to ..
[19:16:36 CEST] <furq> give it a try
[19:16:57 CEST] <furq> it'll probably be fine
[19:17:25 CEST] <furq> it might be preferable all round if you want to keep the bandwidth low
[19:17:45 CEST] <furq> it's certainly preferable for the native res stream
[19:18:33 CEST] <momomo> i am not sure that command does it after all .. it seems 640x640 refers to height and width .. he wanted a square ... i would like to specify the width and let the height scale down appropriately
[19:18:36 CEST] <momomo> 1:1
[19:18:51 CEST] <furq> change the bit after the : to -2
[19:20:04 CEST] <momomo> ill try that
[19:20:12 CEST] <momomo> i found this scale=720x406,setsar=1:1 on : http://video.stackexchange.com/questions/9947/how-do-i-change-frame-size-preserving-width-using-ffmpeg
[19:20:40 CEST] <furq> scale=if(gt(iw\,640),640,iw):-2
[19:20:40 CEST] <furq> i think that's all you need? i've not used scale expressions much
[19:21:05 CEST] <furq> don't use setsar if you want to keep it anamorphic
[19:21:13 CEST] <momomo> ook
[19:21:31 CEST] <momomo> so discard: -vf scale=iw*min(1\,if(gt(iw\,ih)\,1280/iw\,(1280*sar)/ih)):-2
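An alternative spelling of the same idea, capping the width without ever upscaling while keeping the height even and the aspect ratio intact; 1280 here is just the cap momomo mentioned, and this should be equivalent to furq's gt() expression:

  -vf "scale='min(iw,1280)':-2"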
[19:22:53 CEST] <momomo> furq, can I donate some money since you have helped me so much ? :D
[19:24:11 CEST] <furq> sure
[19:25:22 CEST] <momomo> paypal?
[19:44:46 CEST] <zamba> furq: are you able to help me out a bit more?
[19:45:15 CEST] <furq> i can try
[19:45:26 CEST] <furq> i don't think i know any more than you about ffv1 though
[19:51:17 CEST] <zamba> yeah, but you know way more about ffmpeg
[19:52:13 CEST] <apocalipsis> hi, my name is ariel and I'm an android developer, does anybody work with android and ffmpeg to compress at low level? (without command line, using NDK)
[19:57:14 CEST] <Mavrik> you'll get more useful answers if you actually ask the question
[19:59:46 CEST] <whald> hi! i'm trying to transcode video segments suitable for http live streaming (HLS) on-demand. i'm currently prototyping that stuff using the ffmpeg commandline tool and a bash script, and this is what I currently have: http://pastebin.com/RidVwtdw
[20:00:27 CEST] <whald> to my surprise, this actually plays in firefox and chrome video tags (through the help of hls.js)
[20:00:51 CEST] <Mavrik> um
[20:01:00 CEST] <Mavrik> Is there a reason why aren't you using ffmpeg's segmenter?
[20:01:01 CEST] <whald> yet, the generated segments are still off regarding the contained PTSs. is what I'm trying to do even sensible?
[20:01:47 CEST] <whald> Mavrik, yes, because i'd like to provide the capability to do bitrate switching and i have neither the computing nor storage capabilities to create the segments offline.
[20:01:59 CEST] <whald> Mavrik, though that would be very nice indeed.
[20:02:14 CEST] <Mavrik> I don't understand.
[20:02:52 CEST] <Mavrik> Thing is, doing it like you do will have issues with keyframes, PAT/PMT tables and exact PTS limits
[20:02:56 CEST] <Mavrik> especially on more broken players
[20:03:12 CEST] <zamba> furq: i believe we can forget about the ffv1 codec.. it just doesn't work performance wise
[20:03:20 CEST] <Mavrik> That's why you have a segmenter in ffmpeg that does segmenting properly for you and can do it pretty much live if necessary
[20:03:24 CEST] <zamba> furq: do you have yet another alternative?
[20:04:05 CEST] <whald> Mavrik, to my understanding, the ffmpeg segmenter creates segments 1) in the order they appear in the source file and 2) for a single resolution/bitrate. and i'd really like to be able to adapt bitrate and jump forward to parts not transcoded yet.
[20:04:25 CEST] <Mavrik> So?
[20:04:39 CEST] <Mavrik> I mean, you can tell ffmpeg when to start :)
[20:05:08 CEST] <Mavrik> The way you want to do is... problematic.
[20:05:19 CEST] <Mavrik> Since players will expect segments to have synchronized keyframes and PTS.
[20:05:28 CEST] <Mavrik> Which may or may not be doable by how you want to do it.
[20:05:49 CEST] <JEEB> do note: I have seen services not match up their IRAPs for years
[20:05:56 CEST] <JEEB> it's usually third party streaming servers
[20:05:59 CEST] <apocalipsis> Mavrik, my question was about if anyone knows a wrapper to make video compression in android, detect video orientation, size, bitrate, to create trim functionality in my app
[20:06:03 CEST] <Mavrik> JEEB, yp
[20:06:09 CEST] <Mavrik> Most players handle it, but not all.
[20:06:10 CEST] <JEEB> and maybe some semi-broken HLS implementations
[20:06:22 CEST] <Mavrik> We had a bunch of issues with Androids and some STBs
[20:06:24 CEST] <JEEB> basically the HLS spec says that you shouldn't depend on the segments for anything
[20:06:30 CEST] <JEEB> android has been OK as far as I know
[20:06:35 CEST] <JEEB> maybe some older versions
[20:06:42 CEST] <Mavrik> Yeah, 4.0 or so
[20:06:56 CEST] <JEEB> well the services I can think of started at around 2011 or so...
[20:07:13 CEST] <JEEB> STBs and other plastic boxes of course are a completely separate tale
[20:07:22 CEST] <Mavrik> apocalipsis, I'd just compile ffmpeg binary and invoke it.
[20:07:28 CEST] <Mavrik> Wouldn't even touch NDK.
[20:07:45 CEST] <JEEB> NDK isn't too bad, but it requires knowledge of the libav* APIs
[20:07:58 CEST] <JEEB> also I don't think libavcodec has encoding yet on android HW
[20:08:07 CEST] <JEEB> although not sure
[20:08:20 CEST] <Mavrik> MediaCodec patches I've seen were player only.
[20:08:25 CEST] <JEEB> yes
[20:08:26 CEST] <Mavrik> Encoding with that is hell anyway.
[20:08:34 CEST] <whald> Mavrik, to my understanding, each segment must start with a keyframe. this is the case with my script, and the timestamps should be copied from the source. even this is mostly ok, except that the segments i'm generating are a tad too short at times. i feel like this might be a problem with my "select" filter: instead of the 10 seconds i'm going for, i always end up with 8.5 seconds for the first segment. i don't understand why.
[20:08:36 CEST] <JEEB> I have been dealing with mediacodec with mpv-on-android
[20:08:55 CEST] <Mavrik> I've just been dealing with it directly and it was... fun.
[20:09:16 CEST] <Mavrik> But for general video processing just doing Runtime.exec() on a ffmpeg binary was way easier
[20:09:23 CEST] <JEEB> sure
[20:09:24 CEST] <Mavrik> Especially since JNI has a nasty overhead
[20:09:35 CEST] <Mavrik> And requires people to learn C which is usually an issue :P
[20:09:58 CEST] <JEEB> anyways, re: HLS - unless you are basically having your major client being an STB that you know has issues with mismatching GOPs, that is not a problem as long as your maximum GOP length is properly noted in the playlist
[20:10:21 CEST] <JEEB> at least if you're generating the HLS yourself
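For context, the segmenter route being described would look roughly like the sketch below, run once per variant; the bitrate numbers, filenames and 10-second target are illustrative. Keyframes are forced on the segment interval so the variants stay aligned, and the master playlist tying the variants together would still need to be written separately:

    ffmpeg -i source.mp4 \
        -c:v libx264 -b:v 2000k -maxrate 2200k -bufsize 4000k \
        -force_key_frames "expr:gte(t,n_forced*10)" \
        -c:a aac -b:a 128k \
        -f hls -hls_time 10 -hls_list_size 0 \
        -hls_segment_filename 'seg_2000k_%05d.ts' playlist_2000k.m3u8

To start transcoding from a later point in the source, as Mavrik notes, an -ss before the -i can be added.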
[20:11:00 CEST] <apocalipsis> it's true, I made some tests, but I don't have any knowledge of low-level programming with the ffmpeg libs
[20:11:17 CEST] <apocalipsis> and it's a hard work
[20:11:21 CEST] <JEEB> yes
[20:11:49 CEST] <JEEB> I recommend you look for someone to do some consultancy work
[20:12:04 CEST] <JEEB> usual prices probably go around $100/h
[20:12:59 CEST] <apocalipsis> and have them make a wrapper, and not break my head, sounds good :)
[20:14:19 CEST] <apocalipsis> thanks JEEB and Mavrik!!
[20:22:02 CEST] <netw1z> is there a way to convert AIFC to wav with ffmpeg?
[20:25:43 CEST] <petecouture> netw1z: aiff files?
[20:25:48 CEST] <rjp421> is there a way to get the ffmpeg cmd that would output the same codec/bitrate etc of a given file?
[20:26:16 CEST] <netw1z> AIFF in general, but I havent been able to convert AIFC files
[20:26:51 CEST] <netw1z> soxi and ffmpeg complain about unknown codec in24 for AIFC (compressed AIFF)
[20:27:11 CEST] <durandal_1707> sample?
[20:27:35 CEST] <durandal_1707> aifc should be supported
[20:29:05 CEST] <netw1z> a sample aifc file? i have one
[20:29:34 CEST] <durandal_1707> Yes, upload it somewhere
[20:30:52 CEST] <netw1z> khttp://mvgen.com/test.aifc
[20:30:55 CEST] <netw1z> http://mvgen.com/test.aifc
[20:32:23 CEST] <netw1z> the latter one durandal_1707
[20:33:47 CEST] <petecouture> I can hear on the last link
[20:34:07 CEST] <petecouture> rjp421: You're looking for ffprobe I think
[20:36:07 CEST] <rjp421> petecouture, i can see the details with ffprobe but am unsure of the args to reproduce them sufficiently.. like to output something that's the same as an mp4 from youtube-dl, so i can pipe into castnow
[20:36:32 CEST] <durandal_1707> netw1z: works fine here, what ffmpeg version you use?
[20:36:54 CEST] <petecouture> what do you mean unsure of arges?
[20:36:59 CEST] <petecouture> args*
[20:37:29 CEST] <rjp421> petecouture, for the ffmpeg cmd
[20:38:00 CEST] <rjp421> in particular bitrate and fps/gop etc
[20:39:18 CEST] <rjp421> i assume chromecasts are picky and will expect those particulars
[20:40:28 CEST] <netw1z> wow
[20:40:40 CEST] <netw1z> 1.0.10
[20:40:53 CEST] <rjp421> hopefully im wrong and can just use defaults?
[20:41:24 CEST] <netw1z> durandal_1707: ffmpeg version 1.0.10
[20:44:36 CEST] <durandal_1707> netw1z: 3.0.1 is latest release
[20:45:03 CEST] <netw1z> yow
[20:46:04 CEST] <netw1z> thanks - let me get on that now
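The conversion netw1z is after is a plain decode to PCM WAV, so with a current ffmpeg something like the sketch below should be enough (filenames are placeholders; in24 is, as far as I know, 24-bit PCM inside AIFF-C):

    ffmpeg -i input.aifc output.wav
    # or, to keep 24-bit samples instead of the default 16-bit:
    ffmpeg -i input.aifc -c:a pcm_s24le output.wav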
[20:46:40 CEST] <rjp421> petecouture, "Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1280x720 [SAR 1:1 DAR 16:9], 1890 kb/s, 29.97 fps, 29.97 tbr, 30k tbn, 59.94 tbc (default)" - other than using x264 with profile high at 720p and 29.97 fps, and the aac 192kb 44100khz 2channel audio, should i further specify bitrate etc?
[20:47:25 CEST] <rjp421> to match that ffprobe output
[20:56:56 CEST] <rjp421> i just want to output a known clean stream that shouldn't give me any bs further down the line
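As a rough sketch, the ffprobe line rjp421 pasted translates into something like the command below; the input name is a placeholder, and details the probe doesn't show (GOP length, x264 preset) are left at their defaults:

    ffmpeg -i input \
        -c:v libx264 -profile:v high -b:v 1890k -r 30000/1001 -pix_fmt yuv420p \
        -c:a aac -b:a 192k -ar 44100 -ac 2 \
        output.mp4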
[21:35:58 CEST] <podman> any idea why CBR is so much slower than VBR? I'm trying to start transcoding video for HLS but it seems prohibitively slow.
[21:36:38 CEST] <c_14> cbr is vbr...
[21:36:41 CEST] <c_14> eh, no
[21:36:43 CEST] <c_14> sorry
[21:36:45 CEST] <c_14> misread
[21:36:47 CEST] <c_14> For what codec?
[21:36:53 CEST] <podman> h.264
[21:37:15 CEST] <c_14> hmm, cbr should be faster
[21:38:08 CEST] <podman> seems to be much slower
[21:38:17 CEST] <furq> what are you calling cbr
[21:38:20 CEST] <JEEB> what's your definition of CBR and VBR, first of all
[21:38:43 CEST] <c_14> Hmm, -b:v does seem slower than -crf
[21:38:44 CEST] <furq> probably just pastebin your "cbr" and "vbr" commands
[21:39:12 CEST] <podman> furq: sure, can do
[21:39:20 CEST] <c_14> hmm, nvmd might just be slight bitrate differences
[21:39:38 CEST] <furq> i don't know why i put "vbr" in quotes when it will definitely be vbr
[21:41:08 CEST] <podman> http://pastebin.com/a7wQfEQ6
[21:42:23 CEST] <furq> not that it has anything to do with your issue, but -b:a will be being ignored there
[21:42:55 CEST] <podman> furq: ah, true
[21:43:10 CEST] <furq> in the first one, that is
[21:43:19 CEST] <c_14> podman: is the average bitrate of both commands the same?
[21:43:38 CEST] <c_14> rather, is the average bitrate of the first command 2400k
[21:43:46 CEST] <podman> c_14: not sure. can check
[21:43:53 CEST] <furq> yeah if the average bitrate of the second one is higher then that'd account for the slowdown
[21:44:26 CEST] <furq> is there a good explanation of how minrate/maxrate/bufsize work somewhere
[21:44:44 CEST] <furq> particularly bufsize
[21:44:55 CEST] <podman> overall bitrate, as reported by mediainfo of that one is 2339 Kbps
[21:45:04 CEST] <podman> for the first command
[21:45:05 CEST] <c_14> bufsize is the number of bits over which the average bitrate is calculated
[21:45:18 CEST] <furq> bits or bytes
[21:45:37 CEST] <c_14> I think bytes
[21:45:43 CEST] <podman> bit rate for the video track, as reported by mediainfo, is 2222 Kbps
[21:45:50 CEST] <furq> ...weird
[21:46:02 CEST] <c_14> podman: how much faster is the crf one?
[21:48:08 CEST] <podman> the crf one took 5:33 for a 4:31 8K video, the other one took...15:55
[21:48:27 CEST] <podman> same hardware, same load
[21:48:32 CEST] <furq> that is weird
[21:49:06 CEST] <furq> is it that slow with all the vbv stuff disabled
[21:49:10 CEST] <c_14> Can you try actually setting a bitrate on the second with -b:v ?
[21:49:21 CEST] <furq> oh that's a good point
[21:49:58 CEST] <c_14> I also just noticed that the second command has -r and -g while the first doesn't
[21:50:04 CEST] <c_14> try removing them or adding them to the first?
[21:50:13 CEST] <c_14> ditto sc_threshold
[21:50:49 CEST] <podman> kinda need those for HLS
[21:51:04 CEST] <podman> doesn't really alter the transcoding speed though
[21:52:57 CEST] Action: c_14 is mostly out of ideas. Does removing the audio track on both change anything?
[21:53:17 CEST] <podman> that would be interesting
[21:59:13 CEST] <podman> doesn't seem to do anything
[22:00:33 CEST] <gnome1> can someone hilight me? I'm trying to test one thing here...
[22:00:58 CEST] <hurstly> gnome1
[22:01:57 CEST] <gnome1> hurstly: thanks! and it worked!
[22:02:59 CEST] <podman> furq: c_14: so nothing I'm doing looks weird?
[22:02:59 CEST] <hurstly> no problem :)
[22:03:24 CEST] <podman> so, the different machines are using different versions of FFMPEG and possibly libx264
[22:03:39 CEST] <podman> seems unlikely that newer versions would be slower though?
[22:04:59 CEST] <podman> need to find a way to make this reasonably fast. trying to do this on elastic transcoder or zencoder would be prohibitively expensive.
[22:05:01 CEST] <c_14> Wait, this is running on different machines?
[22:05:06 CEST] <c_14> Could be instruction sets
[22:05:10 CEST] <c_14> More optimizations on one cpu etc
[22:07:21 CEST] <podman> c_14: both jobs are running on the same version of linux on the same hardware (EC2 c4.4xlarge)
[22:07:32 CEST] <podman> just different instances
[22:08:14 CEST] <c_14> Try using the same version of ffmpeg/libx264 on both? Maybe with a static build.
[22:08:59 CEST] <kepstin> podman: hmm, I think c4 machines should all be using the same (or at least similar) cpus. But you might get varying cpu performance based on what other users are sharing the host.
[22:13:20 CEST] <podman> kepstin: they're both running on dedicated tenancy
[22:14:08 CEST] <podman> and they all are running on Intel Xeon E5-2666 v3 processors
[22:16:06 CEST] <kepstin> oh, you're using vbv?
[22:16:33 CEST] <kepstin> my impression is that encodes with vbv will be slower than crf or straight vbr, since there's more constraints the encoder has to satisfy.
[22:17:12 CEST] <podman> what's vbv?
[22:17:36 CEST] <kepstin> the minrate/maxrate/bufsize options
[22:18:46 CEST] <podman> ah, well you need to do that to approximate cbr for libx264
[22:18:55 CEST] <kepstin> (they map to a set of options on x264 that start with 'vbv', see https://en.wikipedia.org/wiki/Video_buffering_verifier )
[22:19:12 CEST] <kepstin> yeah, and it makes the stream harder (aka slower) to encode.
[22:19:34 CEST] <JEEB> minrate is ffmpeg-specific btw, and I don't recommend using it
[22:19:43 CEST] <JEEB> maxrate and bufsize map to libx264's stuff
[22:20:16 CEST] <podman> seems like every tutorial recommends it if the output is going to be hls
[22:22:51 CEST] <JEEB> not required and possibly harmful
[22:23:21 CEST] <JEEB> or well, would make it use more bits so that you have less buffer to use in case the scene suddenly became more demanding
[22:23:43 CEST] <JEEB> you only need maxrate and some other rate control mode.
[22:23:52 CEST] <JEEB> uhh, maxrate+bufsize I mean of course
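What JEEB describes (a normal rate-control mode plus a cap, and no minrate) might look like the sketch below for an HLS target; the numbers are illustrative, not taken from podman's pastebin:

    ffmpeg -i input.mp4 \
        -c:v libx264 -crf 21 -maxrate 2400k -bufsize 4800k \
        -c:a aac -b:a 128k \
        -f hls -hls_time 10 -hls_list_size 0 out.m3u8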
[22:27:51 CEST] <podman> hmm
[22:27:53 CEST] <podman> i'll test
[22:28:19 CEST] <hanshenrik__> can i tell ffmpeg to output to a specific file descriptor? 0 is stdin, 1 is stdout, 2 is stderr, and 3 is another pipe opened by the caller process
[22:28:27 CEST] <hanshenrik__> i want ffmpeg to output to pipe 3
[22:29:28 CEST] <c_14> pipe:3
[22:29:39 CEST] <hanshenrik__> nice
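A minimal sketch of writing to that extra descriptor from a shell, assuming the caller opens fd 3 (here via bash exec); note that -f has to be given explicitly because pipe:3 carries no file extension:

    exec 3> out.ts                                  # caller opens fd 3
    ffmpeg -i input.mp4 -c copy -f mpegts pipe:3    # ffmpeg writes to fd 3
    exec 3>&-                                       # caller closes fd 3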
[22:34:08 CEST] <podman> JEEB: made no difference
[22:34:24 CEST] <JEEB> no idea what you were comparing, just noted in general
[22:41:35 CEST] <podman> is there anything crazy i can do to speed up transcoding? split the file up into little chunks and have lots of servers process them at the same time?
[22:42:38 CEST] <c_14> use a faster preset
[22:43:15 CEST] <podman> in terms of net gains, that does very little
[22:43:43 CEST] <J_Darnley> You cannot do that with just ffmpeg.
[22:44:16 CEST] <J_Darnley> There are some distributed encoders that come up now and again.
[22:44:17 CEST] <netw1z> is there a compiled version of ffmpeg 3.0 for wheezy/debian i can install from somewhere?
[22:44:23 CEST] <netw1z> i had an issue on compile
[22:44:25 CEST] <J_Darnley> Or at least there were.
[22:45:21 CEST] <c_14> http://johnvansickle.com/ffmpeg/ <- netw1z
[22:46:04 CEST] <podman> I'm just trying to find a way to make sure a 4:30 video doesn't take 25 minutes to transcode into all of the formats i need for HLS
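There is no built-in distributed mode, but the chunking podman describes can be approximated by splitting on existing keyframes, encoding the chunks in parallel (locally or across machines), and concatenating the results. A single-machine sketch, with all names and durations made up; chunk borders will not necessarily line up with the HLS segment borders you want, and audio joins may not be sample-exact:

    # split into ~60 s chunks without re-encoding (cuts land on keyframes)
    ffmpeg -i source.mp4 -c copy -f segment -segment_time 60 \
        -reset_timestamps 1 chunk_%03d.mp4
    # encode all chunks in parallel
    for f in chunk_*.mp4; do
        ffmpeg -i "$f" -c:v libx264 -crf 21 -c:a aac "enc_$f" &
    done
    wait
    # stitch the encoded chunks back together
    for f in enc_chunk_*.mp4; do echo "file '$f'"; done > list.txt
    ffmpeg -f concat -safe 0 -i list.txt -c copy joined.mp4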
[22:48:43 CEST] <netw1z> thanks @c_14
[22:48:46 CEST] <netw1z> checking now
[22:48:58 CEST] <hanshenrik__> when i tell ffmpeg to output to a .jpg file
[22:49:09 CEST] <hanshenrik__> and i dont specify a -f format
[22:49:14 CEST] <hanshenrik__> it will take a best guess
[22:49:21 CEST] <c_14> it uses the file extension
[22:49:23 CEST] <hanshenrik__> what format does it default to for out.jpg exactly?
[22:49:27 CEST] <hanshenrik__> "mjpeg" ?
[22:50:34 CEST] <kepstin> hanshenrik__: probably 'singlejpeg', but I'd have to check to make sure.
[22:51:45 CEST] <netw1z> @c_14 do I just overwrite the existing binaries with these from johnvansickle.com
[22:52:44 CEST] <kepstin> hanshenrik__: actually, no, it would be 'image2'
[22:52:55 CEST] <hanshenrik__> thanks
[22:53:03 CEST] <hanshenrik__> best name for a jpg format
[22:53:05 CEST] <kepstin> which then internally does some stuff to select the real format for the images.
[22:53:05 CEST] <hanshenrik__> "image2"
[22:53:09 CEST] <hanshenrik__> oh
[22:53:10 CEST] <hanshenrik__> ok
[22:53:25 CEST] <kepstin> image2 is the generic format for outputing (sequences of) individual images.
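So a single-frame grab, with the format picked from the .jpg extension or forced explicitly, looks something like this (input name and timestamp are placeholders):

    ffmpeg -ss 00:00:05 -i input.mp4 -frames:v 1 -f image2 out.jpg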
[22:53:26 CEST] <c_14> netw1z: put it in your PATH somewhere in front of your current binaries (like /usr/local/bin)
[22:54:00 CEST] <c_14> You _can_ replace them, but if the current binaries are provided by your OS then that can create issues
[22:54:36 CEST] <netw1z> my current binaries are in /usr/local/bin already
[22:54:41 CEST] <c_14> then just replace them
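The install being discussed amounts to unpacking the static build and copying the binaries into place; the exact tarball name depends on architecture and release, so treat this as a sketch:

    cd /tmp
    # exact filename is a guess, check the download page for the current one
    wget http://johnvansickle.com/ffmpeg/releases/ffmpeg-release-64bit-static.tar.xz
    tar xf ffmpeg-release-64bit-static.tar.xz
    sudo cp ffmpeg-*-static/ffmpeg ffmpeg-*-static/ffprobe /usr/local/bin/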
[22:54:45 CEST] <netw1z> sweet!
[22:57:53 CEST] <netw1z> thanks @c_14 its rocking
[23:04:41 CEST] <podman> what provider would you recommend for hardware for FFMPEG? I think we're reaching the limit of what EC2 can do cost effectively
[23:06:06 CEST] <podman> i also tried using their GPU instances but it didn't seem to be any faster
[23:06:53 CEST] <podman> using NVENC
[23:07:40 CEST] <furq> just buy the cheapest dedicated server you can get which isn't an atom
[23:12:11 CEST] <sagax_> hi all!
[23:12:45 CEST] <sagax_> is there anything for running the ffmpeg rendering process over the network?
[23:12:53 CEST] <sagax_> like a render farm
[23:34:57 CEST] <VVelox> Is there anything like -s, but will use that as the bounds for scaling, keeping it in the aspect ratio?
[23:35:30 CEST] <furq> -s 640:-2
[23:35:34 CEST] <furq> or whatever you want the width to be
[23:35:42 CEST] <c_14> eeeh
[23:35:47 CEST] <c_14> don't you need to use the scale filter for that?
[23:36:00 CEST] <furq> doesn't -s just append the appropriate scale filter
[23:36:18 CEST] <c_14> I think it does some strange parsing
[23:36:29 CEST] <furq> well yeah either that or -vf scale=640:-1
[23:36:36 CEST] <furq> or -2 if you want it to be mod2
[23:37:01 CEST] <kepstin> if you're using yuv420, you want -2, otherwise you'll just get errors due to the subsampling.
[23:37:45 CEST] <VVelox> Nice. Thanks!
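For the bounding-box behaviour VVelox asked about, the scale filter's force_original_aspect_ratio option may be closer to the intent than hand-written expressions; a sketch, with the 1280x720 box and filenames as placeholders (the second scale just forces even dimensions for yuv420p):

    ffmpeg -i input.mp4 \
        -vf "scale=1280:720:force_original_aspect_ratio=decrease,scale=trunc(iw/2)*2:trunc(ih/2)*2" \
        -c:a copy output.mp4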
[23:43:59 CEST] <podman> so, i just tested the same jobs on AWS GPU instances and it was even slower
[23:44:05 CEST] <podman> using nvenc
[23:44:08 CEST] <podman> ugh
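For reference, the GPU route podman tried would be along these lines; the encoder name differs between builds (nvenc, nvenc_h264 or h264_nvenc depending on the version), so check ffmpeg -encoders first, and the rates here are illustrative:

    ffmpeg -i input.mp4 -c:v nvenc -b:v 2400k -maxrate 2400k -bufsize 4800k \
        -c:a aac -b:a 128k output.mp4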
[00:00:00 CEST] --- Fri Apr 1 2016