[Ffmpeg-devel-irc] ffmpeg.log.20140925

burek burek021 at gmail.com
Fri Sep 26 02:05:01 CEST 2014


[00:15] <relaxed> MikeJoel: http://johnvansickle.com/ffmpeg/
[00:35] <w00ds> howdy
[00:36] <w00ds> think this would be a good command to convert mkv/h.264 to mp4/h.265?
[00:36] <w00ds> ffmpeg -i INPUTFILE.MKV -y -c:v libx265 -c:a copy -preset ultrafast -qp 0 OUTPUTFILE.mp4
[00:38] <c_14> It'll work, if that's what you're asking.
[00:38] <c_14> Not sure if it'll be any smaller than the h.264 file though, also the quality won't be better.
[00:39] <microchip_> wodim: x265 is still not mature yet so don't expect wonders
[00:40] <wodim> what?
[00:40] <w00ds> c_14: will it be ALOT worse quality you think?
[00:40] <w00ds> it'll definitely do a MUCH smaller file size though
[00:40] <microchip_> wodim: sorry, meant w00ds
[00:41] <w00ds> i worked it out, will be closer to like a 2.3GB output file compared to the 11.5GB source file
[00:41] Action: c_14 hasn't done a lot of H.265 encoding, but you probably won't see the difference.
[00:41] <w00ds> c_14: thats awesome, thanks
[00:41] <w00ds> microchip_: it seems to be pretty awesome though
[00:41] <w00ds> especially considering Plex Server is able to read it ...
[00:42] <w00ds> here's a question though ...
[00:42] <c_14> It'll burn your cpu.
[00:42] <microchip_> w00ds: you think now it's awesome? Wait until it's ready.
[00:42] <w00ds> if i convert this file to be for example 2.3GB file size using h.265 ... when it transcodes to work on a non-compliant device, will it transcode it back to being 11.5GB?
[00:43] <w00ds> or just convert it to be h.264 at the size 2.6GB still?
[00:43] <c_14> That depends on what plex does.
[00:43] <w00ds> good point ...
[00:43] <w00ds> wondering if converting my stuff to x265 would be worth it
[00:43] <Nosomy|off> hevc still is experimental
[00:44] <w00ds> well im thinking in a better remote streaming outcome ...
[00:44] <w00ds> a client would be able to download a 2.3GB file faster than a 11.5GB file
[00:44] <c_14> You also have to take into account if that client can then play the H.265 file.
[00:44] <w00ds> err not faster ... but keep up with a movie that is a smaller size.
[00:44] <w00ds> there are no clients that play h.265 (aside from PC)
[00:45] <w00ds> just don't know if it converts it back to 11.5GB before it streams to a client
[00:45] <c_14> Then they will at some point have to encode to H.264 (or something else) which will burn cpu and increase the file size (or ruin the quality).
[00:45] <w00ds> so the transcoding back to h.264 will jump the size back up to 11.5GB?
[00:46] <w00ds> in theory.
[00:46] <Nosomy> lol
[00:46] <c_14> Depends on how the source was encoded and how it's encoded back, but it could.
[00:46] <Nosomy> lossy is always lossy...
[00:46] <w00ds> hmm
[00:46] <c_14> And the quality will be worse, it might not be _visually_ worse, but it'll be worse.
[00:46] <w00ds> true
[00:47] <w00ds> well im trying it out ... see how it works out
[00:47] <w00ds> because then a lot of homes could easily remote stream from home with their 10mbit upload speed
[00:47] <w00ds> not needing a full 100mbit/1gbit to get fast enough speed ...
[00:47] <c_14> If they could encode H.265.
[00:47] <c_14> In a reasonable time frame.
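The upload-speed reasoning in this exchange is easy to sanity-check with back-of-envelope math: a file's average bitrate is its size times eight over its duration, and realtime streaming just needs that bitrate to fit in the uplink. A minimal sketch, using the hypothetical 2-hour duration and the file sizes from the discussion:

```python
# Back-of-envelope check: can a file of a given size stream in realtime
# over a given upload link? (1 GB taken as 10**9 bytes here.)

def avg_bitrate_mbps(size_gb: float, duration_s: float) -> float:
    """Average bitrate in megabits per second."""
    return size_gb * 8 * 1000 / duration_s  # GB -> gigabits -> megabits

def streams_in_realtime(size_gb: float, duration_s: float, uplink_mbps: float) -> bool:
    return avg_bitrate_mbps(size_gb, duration_s) <= uplink_mbps

two_hours = 2 * 3600
print(avg_bitrate_mbps(11.5, two_hours))  # ~12.8 Mbit/s: too much for a 10 Mbit uplink
print(avg_bitrate_mbps(2.3, two_hours))   # ~2.6 Mbit/s: fits comfortably
```

So the 2.3GB H.265 file would indeed fit a 10 Mbit upload in realtime where the 11.5GB source would not, provided the client can decode it.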
[00:47] <w00ds> decode you mean?
[00:47] <w00ds> the client decode...
[00:48] <c_14> you were talking about upload speed, the person who uploads needs to encode
[00:48] <w00ds> well if it was already encoded in h.265
[00:48] <w00ds> be quicker to send 2.3GB than 11.5GB ...
[00:48] <c_14> Ye, you'll need to check if the clients can decode H.265 in realtime though.
[00:49] <c_14> Especially at higher resolutions.
[00:49] <w00ds> true, when a h.265 client is out
[00:49] <w00ds> heh
[00:49] <c_14> Try it out though and see if it works.
[00:49] <w00ds> will do
[00:49] <w00ds> im also going to try out plex's "cloud sync" feature
[00:49] <c_14> The main reason I haven't bothered that much with it is because it takes forever to encode.
[00:50] <w00ds> which transcodes video to google drive for example
[00:50] <w00ds> c_14: ya it does take a long time if you don't do ultrafast
[00:50] <c_14> I can't remember what settings I was using, but afaik I calculated 20 or so days for a 2 hour 1080p bluray rip.
[00:51] <w00ds> weee ... 76 degrees celsius temperature for my CPU
[00:51] <w00ds> lol
[00:51] <c_14> That was 4 or 5 months ago.
[00:51] <w00ds> http://i.imgur.com/MVDTdyV.png
[00:51] <c_14> yep, and now it'll stay like that. For a while.
[00:53] <w00ds> aren't there encoding/transcoding companies?
[00:53] <c_14> Companies that'll encode video for you?
[00:53] <w00ds> ya
[00:54] <c_14> No clue. There might be. But tbh all you need to do is buy/rent a bunch of cpu time.
[00:54] <c_14> But I like having control of my encodes.
[00:55] <w00ds> tue true
[00:55] <w00ds> $26/month http://myskyhost.com/NL-Encoding-RDP.html
[00:57] <w00ds> maybe cheaper just get a kimsufi box
[00:57] <w00ds> heh
[00:58] <c_14> I usually just use one of my computers.
[00:58] <c_14> I have a couple standing around so if one of them is busy it doesn't bother me that much.
[00:58] <c_14> Plus it makes for a great heater.
[00:58] <w00ds> ya true.
[00:59] <w00ds> would more cpus be better or more threads?
[00:59] <w00ds> example, 4c/8threads @ 2.6GHz or 4c/4threads @ 3.1GHz
[01:00] <c_14> Probably more cores.
[01:00] <c_14> Hyperthreading is mainly for desktop loads.
[01:00] <c_14> ie doing lots of things at once
[01:01] <c_14> Ie desktop loads are lots of short-running threads, while encoding is several long-running threads.
[01:01] <w00ds> so faster cores will matter more than threads, k got it.
[01:12] <vmBenLubar> Does -c:v libvpx-vp9 -lossless 1 work? It seems to wash out the colors, at least when I play it back in vlc.
[02:04] <kryo_> hey guys
[02:05] <kryo_> i'm trying to encode an MPEG-TS as flv and send it to a rtmp:// url
[02:07] <kryo_> any ideas on how to get ffmpeg to do this?
[02:10] <c_14> ffmpeg -i file -c copy -f flv rtmp://foobar
[02:10] <c_14> And the word you are looking for is remux, not encode. Unless you actually want to change the audio/video data and not just the format.
[02:13] <kryo_> ah sweet
[02:13] <kryo_> remux is even better
[02:13] <kryo_> thought i might need to encode cause of the TS format
[02:23] <c_14> Not sure you have to, but never streamed to rtmp either.
[02:28] <kryo_> c_14: one more question
[02:28] <kryo_> can i make it loop the video?
[02:29] <c_14> Not ffmpeg-internally. You said the input file was mpeg-ts, right?
[02:29] <c_14> If it is, you can just use: `while true; do cat file.ts; done | ffmpeg -i - -c copy -f flv rtmp://foobar'
[02:31] <kryo_> looks good to me
[02:31] <kryo_> thanks a lot
[02:38] <Riviera> I hardcoded subtitles by using "-vf subtitles=file.srt", yet could not find an easy way of specifying the font size (I want a larger font).  Is there any?
[02:44] <c_14> Short of converting the srt to ass and changing the fontsize there, you can mess around with the original_size setting.
[02:44] <c_14> Though, tbh. Just convert to ass and change the fontsize.
[02:52] <Riviera> c_14: hm, funny hack, i'll try that; thanks :)
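For reference, the srt-to-ass route suggested here is mechanical: `ffmpeg -i subs.srt subs.ass` does the conversion, and the font size then lives in the `Style:` lines of the `[V4+ Styles]` section. A hedged sketch of the edit step, assuming the usual field order (`Name, Fontname, Fontsize, ...`) given by the file's `Format:` line — check yours if it differs:

```python
# Bump the Fontsize field (third comma-separated value) of every Style: line
# in an .ass subtitle file produced by `ffmpeg -i subs.srt subs.ass`.

def bump_fontsize(ass_text: str, new_size: int) -> str:
    out = []
    for line in ass_text.splitlines():
        if line.startswith("Style:"):
            fields = line.split(",")
            fields[2] = str(new_size)  # Fontsize, per the usual Format: order
            line = ",".join(fields)
        out.append(line)
    return "\n".join(out)

sample = "Style: Default,Arial,16,&H00FFFFFF,&H000000FF"
print(bump_fontsize(sample, 28))  # Style: Default,Arial,28,&H00FFFFFF,&H000000FF
```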
[02:54] <circ-user-iNQ84> Hi, I'm building ffmpeg to include libvorbis and libvpx for Android. On configuration I get 'ERROR: libvorbis not found'. How do I tell ffmeg where to find libvorbis? My configuration command is:
[02:54] <circ-user-iNQ84> ./configure --prefix=${DIR_SYSROOT} --arch=${CPU} --target-os=linux --extra-ldflags="-L${DIR_SYSROOT}lib" --extra-cflags="-I${DIR_SYSROOT}include" --enable-cross-compile --cross-prefix=${PREFIX}- --sysroot=${DIR_SYSROOT} --disable-shared --enable-static --enable-small --disable-all --enable-ffmpeg --enable-avcodec --enable-avformat --enable-avutil --enable-swresample --enable-avfilter \
[02:54] <c_14> Where is libvorbis installed?
[02:55] <circ-user-iNQ84> ${DIR_SYSROOT}/lib/libvorbis.a
[02:55] <circ-user-iNQ84> ${DIR_SYSROOT} is the android system root
[02:56] <circ-user-iNQ84> I don't know if this means that libvorbis was not found or if it was not built correctly
[02:56] <c_14> What does `pkg-config --exists libvorbis' return?
[02:56] <circ-user-iNQ84> Nothing
[02:57] <c_14> echo $?
[02:57] <c_14> The actual return code, not output.
[02:57] <circ-user-iNQ84> How do I get that? $$ echo $
[02:57] <circ-user-iNQ84> ?
[02:57] <dahat> Are there any known 'newer' RTMP servers that spit out either RTMPE or RTMPS that libRTMP is known not to work with at this time?
[02:57] <c_14> `echo $?'
[02:58] <circ-user-iNQ84> I mean where should I put that
[02:58] <circ-user-iNQ84> it just prints a $
[02:58] <circ-user-iNQ84> Oh! It's 1
[02:59] <c_14> Right, how did you install libvorbis?
[02:59] <circ-user-iNQ84> make install
[02:59] <circ-user-iNQ84> I looked through the logs
[02:59] <circ-user-iNQ84> It was put in the android sysroot
[02:59] <c_14> do you have ${DIR_SYSROOT}/include/vorbis/vorbisenc.h ?
[03:00] <circ-user-iNQ84> Yes
[03:00] <circ-user-iNQ84> I mean pkg-config shouldn't find libvorbis
[03:00] <circ-user-iNQ84> pkg-config is the host machine's pkg-config
[03:00] <circ-user-iNQ84> I don't have a pkg-config for android
[03:01] <c_14> It doesn't matter anyway, the configure script doesn't use pkg-config for libvorbis.
[03:01] <c_14> does ${DIR_SYSROOT} have a trailing slash?
[03:01] <circ-user-iNQ84> Yes
[03:02] <c_14> aaah
[03:02] <circ-user-iNQ84> Should I remove it?
[03:02] <c_14> you might need --extra-cxxflags="-I${DIR_SYSROOT}include"
[03:02] <c_14> I think libvorbis is c++, not c.
[03:02] <circ-user-iNQ84> So I need that for compiling ffmpeg or libvorbis?
[03:03] <c_14> for compiling ffmpeg
[03:03] <c_14> Just add it to above configure line.
[03:03] <c_14> *the above
[03:04] <circ-user-iNQ84> Nothing
[03:04] <circ-user-iNQ84> The same error
[03:04] <c_14> Can you pastebin config.log ?
[03:05] <circ-user-iNQ84> One sec
[03:07] <circ-user-iNQ84> http://pastebin.com/ftuEEK0Q
[03:10] <circ-user-iNQ84> There are these lines at the end.
[03:11] <circ-user-iNQ84> something about ranlib
[03:11] <c_14> yep
[03:11] <c_14> Saw them as well, trying to find out why that might be happening.
[03:12] <circ-user-iNQ84> Do I need any environment variables setup in order to build?
[03:13] <c_14> I'm guessing the libvorbis configure cannot correctly detect the ranlib binary and therefore doesn't generate the indexes.
[03:13] <circ-user-iNQ84> http://www.mega-nerd.com/erikd/Blog/CodeHacking/MinGWCross/pkg-config.html
[03:13] <circ-user-iNQ84> Is this related?
[03:14] <c_14> probably not
[03:18] <c_14> You could probably either directly run ranlib on the vorbis libraries, or you could check the Makefile that ./configure (for libvorbis) outputs, check what it sets RANLIB as, and then make sure it's set correctly so it references the android ranlib
[03:20] <circ-user-iNQ84> It's using the wrong ranlib in libvorbis Makefile
[03:20] <circ-user-iNQ84> RANLIB = ranlib
[03:20] <c_14> You can probably just correct it so it uses android ranlib.
[03:20] <circ-user-iNQ84> So, is there an environment variable I can set to fix this when building libvorbis?
[03:20] <circ-user-iNQ84> Or should I fix it manually and run make?
[03:21] <c_14> I'd fix it manually.
[03:21] <c_14> Not sure if there's a variabl.
[03:21] <c_14> *variable
[03:22] <c_14> Might want to file a bug with libvorbis though, since that would probably be a bug in their configure script.
[03:22] <circ-user-iNQ84> Oh! Cool!
[03:22] <circ-user-iNQ84> Let me try if this works first.
[03:25] <circ-user-iNQ84> Not working.
[03:25] <circ-user-iNQ84> I'll check config.log again.
[03:26] <circ-user-iNQ84> Let me send a pastebin, it's something about ogg.
[03:27] <circ-user-iNQ84> http://pastebin.com/FMdaD9jL
[03:29] <c_14> >ogg: no archive symbol table (run ranlib)
[03:29] <c_14> looks like the same thing, just for ogg this time
[03:29] <circ-user-iNQ84> Oh! Nice!
[03:36] <circ-user-iNQ84> Making sure it works. Will be back in a bit.
[04:11] <circ-user-iNQ84> Now, I'm having a problem with libvorbis, turns out it's not built correctly either.
[04:11] <circ-user-iNQ84> Should I ask about this here?
[05:02] <MrJoestar> I'm trying to use " ffmpeg -i input.mkv -filter_complex "[0:v][0:s]overlay[v]" -map [v] -map 0:a <output options> output.mkv ", but I got the message " no matches found: [v]".
[06:16] <gcl5cp> Overlay area (0,0)<->(720,480) not within the main area (0,0)<->(480,270) or zero-sized
[06:18] <gcl5cp> I'm trying to overlay an image larger than the video, overlay=-120:-100, to concatenate a small video with another that has a higher resolution.
[06:19] <gcl5cp> is this possible?
[06:35] <pentanol> gcl5cp hi, video overlay on the video?
[06:36] <gcl5cp> ffmpeg -i w.png -i v.mp4
[06:40] <gcl5cp> i need a 720x480 video. The original is 480x270, and i don't want to use -s, it will look "pixelated". So i want to use a png watermark (border frame) and insert (overlay) the video in the center.
[06:41] <pentanol> you must use scale
[06:45] <pentanol> format=rgba,scale=720x480
[06:53] <gcl5cp> so, it is not possible
[06:55] <gcl5cp> preserve original(low) resolution
[07:12] <pentanol> as i remember, in ffmpeg the swscaler can expand the picture
[07:15] <pentanol> ffmpeg -i w.png -vf format=rgba,scale=720x480  -i v.mp4
[07:15] <pentanol> pardon, ffmpeg -i w.png -vf format=rgba,scale=720:480  -i v.mp4
[08:38] <kryo_> hey i want to remux an mpeg-ts to flv and play it back to a rtmp:// address
[08:38] <kryo_> can i do that with ffmpeg?
[08:39] <pentanol> sure
[08:39] <kryo_> i'm using -c copy
[08:39] <kryo_> but it goes much too fast... :P
[08:40] <kryo_> how to slow down? XD
[08:44] <pentanol> -c ? for subtitles?
[08:44] <pentanol> need to go...
[08:44] <kryo_> lol
[09:00] <EvolE> xD
[09:01] <EvolE> kryo_: what goes too fast?
[09:01] <kryo_> ffmpeg is sending the stream over rtmp
[09:01] <kryo_> and it's going at 2500+ fps
[09:02] <kryo_> i believe i need it to go 30 fps
[09:04] <EvolE> that's strange. doesn't it recognize source fps ?
[09:05] <kryo_> i think it's essentially just remuxing the video and sending it over rtmp asap
[09:05] <kryo_> setting -r 30 does nothing o_O
[09:05] <EvolE> hmm... maybe you are right. did you try -re option?
[09:06] <EvolE> before the input
[09:09] <kryo_> well that worked excellently, thanks!
[09:09] <kryo_> however, the audio isn't being sent o_O
[09:09] <kryo_> ffmpeg -re -i fil.mp4 -bsf:a aac_adtstoasc -acodec copy -c copy -f flv rtmp://
[09:15] <EvolE> if you are using "-c copy" then you probably don't need -acodec copy
[09:16] <EvolE> or you can use both "-acodec copy" and "-vcodec copy"
[09:16] <EvolE> but i'm not sure if it's the case why audio isn't sent
[09:16] <kryo_> the audio is still detected
[09:17] <kryo_> maybe it's being sent in the wrong format
[09:17] <EvolE> maybe.
[09:19] <kryo_> ha just had to use -acodec libmp3lame
[09:21] <kryo_> thanks EvolE :D
[09:22] <EvolE> kryo_: np, good it helped)
[14:44] <csepulvedab> hi guys!
[14:47] <csepulvedab> guys, do you know how i can transcode a video using the same x264 options used in another video?
[14:47] <csepulvedab> is there an option for this or must i parse the output of ffprobe?
[16:23] <McSalty> Hi, I would like to use a stream from an ipcam (http://x.x.x.x/mjpg/video.mjpg) and map it to a video loopback device (via v4l2loopback) so that I can access the stream via an application requiring a valid webcam. I am only allowed to use ffmpeg directly (no wrappers or other tools). How can I do this?
[16:23] <McSalty> I already have a virtual device and I can pipe a video to it, but the stream doesn't work yet.
[16:24] <disconnected> hey, I am compressing mp4 files using: ffmpeg -y -i input_file -qscale 31 -vcodec libxvid -acodec copy output_file. Why does this create bigger files sometimes? For 7.4 MB file it created 36.5MB output file and for file 42.5MB -> 15.1MB. Is there any property that determines if it produces bigger file or not?
[16:42] <klaxa|work> disconnected: the bitrate/quality of the input files?
[16:42] <klaxa|work> those feel like properties that will determine whether or not the resulting file (with constant quality) will be smaller or bigger
[16:43] <disconnected> from mediainfo, the overall bit rate is 184 Kbps (with variable bit rate mode), the video bit rate is 55.6 Kbps
[16:44] <disconnected> the created bigger file has 771 Kbps video bit rate with Constant bit rate mode
[16:44] <disconnected> is there any universal command that would just create a lower quality file somehow?
[16:44] <disconnected> (or rather universal parameters...)
[16:45] <klaxa|work> i don't think so, you can parse mediainfo or ffprobe and generate command line parameters from that
[16:45] <disconnected> if not - how can I calculate, or is it defined somewhere, the threshold above which the generated file will be bigger?
[16:49] <disconnected> (btw. the input file is reported by ffmpeg as: Stream: Video h264 (Baseline) (avc1), yuv420p(tv), <resolution, which varies - 1280x720 and 1920x1080>, and probably the bitrate, which varies: 1874 kb/s and 55 kb/s)
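As a rough model of what klaxa|work describes below (container overhead aside), file size is just bitrate times duration, so a fixed-quality xvid encode that lands at a higher bitrate than a very-low-bitrate h264 input will come out larger. A sketch using the mediainfo figures quoted above; the clip duration is a made-up placeholder:

```python
# Rough size estimate: size ~= bitrate * duration (ignoring container overhead).

def size_mb(bitrate_kbps: float, duration_s: float) -> float:
    """Approximate stream size in megabytes for a given average bitrate."""
    return bitrate_kbps * duration_s / 8 / 1000  # kbit/s -> MB

duration = 330  # hypothetical clip length in seconds
print(size_mb(55.6, duration))  # ~2.3 MB: the low-bitrate h264 input stays small
print(size_mb(771, duration))   # ~31.8 MB: a 771 kb/s xvid re-encode is far bigger
```

This is why the same command shrinks some files and inflates others: only inputs whose bitrate is above the encoder's output bitrate get smaller.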
[16:53] <wintershade> hi everyone
[16:57] <McSalty> Hi guys, I'm getting closer to the solution. Right now the only problem is that the output format (video4linux2) is not suitable for video1
[16:57] <McSalty> ffmpeg -i http://username:password@x.x.x.x/mjpg/video.mjpg -f video4linux2 /dev/video1
[17:01] <ubitux> what version of ffmpeg?
[17:02] <ubitux> McSalty: ah i forgot to add the video4linux2 alias
[17:02] <ubitux> McSalty: use -f v4l2
[17:05] <McSalty> ubitux, version 0.8.16-6:0.8.16-1 it doesn't work with -f v4l2
[17:05] <ubitux> you're probably not using ffmpeg
[17:05] <ubitux> we are in 2.4 btw, just upgrade
[17:06] <McSalty> that's the problem, I'm not allowed to upgrade. I have to stick to this version. If it keeps failing, I'll have to suggest to upgrade
[17:06] <wintershade> McSalty: you can just download the binary, extract it somewhere in your /home directory and run it from there
[17:07] <wintershade> McSalty: which OS are you using anyway?
[17:07] <McSalty> wintershade, the system is running crunchbang
[17:08] <McSalty> wintershade, so I don't have to load new modules. then I guess I will just download the binary and try that first. thanks a lot, I'll try it right away
[17:08] <wintershade> McSalty: odd. can you paste(bin) the console output when you just type "ffmpeg" in it? tia
[17:11] <McSalty> wintershade, sure: http://pastebin.com/tg79eBdi
[17:11] <wintershade> McSalty: it *could* work. Linux 2ndGen package management system is there to make things easier for you, but it's not always the only way - just the most convenient one for most users. OTOH, I know people who always install just the bare bones of their Linux distro, the last thing being X and perhaps WindowMaker or OpenBox. All the other applications, they just download the binary in .tgz/.rpm/.deb/.whatever, extract it somewhere in
[17:11] <wintershade> their /home and run it from there.
[17:11] <wintershade> McSalty: whoa, you really do have the 0.8.16 version of ffmpeg. how old is that???
[17:12] <wintershade> and I thought my Gentoo stable has some old packages...
[17:13] <McSalty> wintershade, ha, I know now, I didn't realize that. is it really that old? I'll just try the updated version, things will probably be much better:)
[17:14] <wintershade> McSalty: as ubitux said, the current branch is 2.4. that's way above 0 :D I've just updated to 2.3.3 on my Gentoo the other day. will probably return to 2.2, until guys at oDesk decide to compile their oDeskTeam against newer ffmpeg libs.
[17:41] <bluepr0> hello! I'm trying to find out if it's possible with ffmpeg to start transcoding at a certain time position for a HTTP stream. For example a stream given by peerflix
[17:54] <McSalty> wintershade, I've sent you a pm
[18:19] <gcl5cp> thanks pentanol, "scale=720:480" works, but i am looking to enlarge the resolution without scaling the video; it's like adding a frame (border) to a photo, making the canvas bigger but keeping the same image resolution.
[19:44] <alexblzn> hello, is it possible to convert XESC to AVI format with ffmpeg?
[19:50] <sacarasc> From the ffmpeg -codecs output, I would say maybe.
[19:52] <alexblzn> I'm taking a look in the list.
[19:53] <alexblzn> Thanks.
[21:19] <sheshkovsky> Hello guys, I'm trying to build an application to burn subtitles onto videos. The specific task is burning Persian (a right-to-left language, like Arabic) subtitles. But I have a problem with it. When I get several words simultaneously without any Enter between them, ffmpeg pushes the lines up. for example if I have three lines, the third line comes on the first line, then the second line, and then the first line prints on the 
[21:19] <sheshkovsky> Can anyone help me with this problem?
[21:33] <sheshkovsky> I've posted a question on stackoverflow about my problem, if anyone could help me please send the answer there. Thanks. Here's the link: http://stackoverflow.com/questions/26046586/how-to-fix-ffmpeg-mencoder-pushing-persian-rtl-subtitles-reversed
[21:35] <benlieb> of
[21:35] <benlieb> How do I cut a section of movie between two time stamps?
[21:35] <benlieb> it seems like -t and -to both represent durations...
[21:36] <c_14> You want the middle part or you don't want the middle part?
[21:39] <benlieb> c_14: I want the part between the two times
[21:39] <c_14> ffmpeg -ss start time -t duration -i input output
[21:39] <c_14> Or, ffmpeg -i input -ss start_time -to end_time output
[21:40] <c_14> s/start time/start_time
[21:40] <benlieb> c_14: the -to docs say "Stop writing the output at position"
[21:40] <benlieb> this is position of the output file, no?
[21:41] <c_14> depends
[21:42] <c_14> If you use -to without -ss or with -ss as an output option, it's the timestamp of the input file.
[21:42] <benlieb> if I wanted to get from 03:00 to 04:00 would I use -to 04:00 or -to 01:00
[21:42] <benlieb> so when -ss and -to are use as output options they refer to the input file?
[21:43] <benlieb> That's weird
[21:43] <c_14> ffmpeg -i file -ss 03:00 -to 04:00 output or ffmpeg -ss 03:00 -t 01:00 -i input output or ffmpeg -ss 03:00 -i input -t 01:00 output or ffmpeg -ss 03:00 -i input -to 01:00 output
[21:43] <benlieb> about -ss docs say When used as an output option (before an output filename), decodes but discards input until the timestamps reach position
[21:43] <c_14> yep
[21:43] <benlieb> this clip is actually towards the end of a 2 hour vi
[21:43] <benlieb> vid
[21:44] <c_14> https://trac.ffmpeg.org/wiki/Seeking%20with%20FFmpeg
[21:44] <c_14> That might be a bit more helpful.
[21:44] <c_14> If you don't understand something, tell me and I'll fix it/help you.
[21:47] <benlieb> c_14: ffmpeg -i video.mp4 -ss 00:01:00 -to 00:02:00 -c copy cut.mp4
[21:47] <benlieb> will do what I want
[21:47] <benlieb> but will it also decode 0:00 to 1:00
[21:47] <benlieb> ?
[21:47] <c_14> yes
[21:48] <benlieb> so there's no way to do a fast seek and use two timestamps
[21:48] <benlieb> which is odd
[21:48] <c_14> Fast seeking throws away the timestamps.
[21:48] <benlieb> because it's my most common use case
[21:49] <benlieb> as of now I use a duration library to interpret two time stamps and calculate the -t
[21:49] <c_14> Ie if I fast-seek 20 minutes into a video, the first timestamp I encounter after seeking is 00:00:00 not 00:20:00
[21:49] <benlieb> but it doesn't handle milliseconds
[21:49] <c_14> It does _unless_ you use -c copy.
[21:49] <c_14> Or do you mean the library?
[21:49] <benlieb> the lib
[21:50] <benlieb> c_14: I actually can't find a single ruby lib that handles duration w milliseconds.
[21:50] <benlieb> except ones that represent 'time', which is linked to the unix epoch
[21:51] <c_14> Where do you get the timestamps from? Because you can just give ffmpeg timestamps as floats in seconds.milliseconds
[21:51] <c_14> "floats", that is
[21:51] <benlieb> c_14: that wouldn't solve the problem of needing a duration
[21:52] <benlieb> That's what the lib was doing.
[21:52] <benlieb> I'm getting them from aegisub, a
[21:52] <benlieb> subtitling program, which has a nice interface for mapping sections of video, even though i'm not using subtitles ;)
[21:53] <c_14> you have a from and a to timestamp, right?
[21:53] <c_14> In the form xx:xx:xx.xx ?
[21:53] <benlieb> it also uses the easily parseable .ass format
[21:53] <benlieb> yep
[21:54] <c_14> You might need to make your own function to calculate duration with milliseconds...
[21:55] <c_14> That, or find a way to get ffmpeg not to throw away timestamps when keyframe-seeking.
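The duration function c_14 suggests is small enough to sketch; this is a hypothetical helper, not an existing library: it parses aegisub's `h:mm:ss.cs` timestamps and produces a seconds value with millisecond precision that can be passed to `-t` (or `-ss`):

```python
# Millisecond-precise duration between two ASS-style timestamps,
# e.g. for `ffmpeg -ss <start> -t <duration> -i input output`.

def to_seconds(ts: str) -> float:
    """Parse a timestamp like '0:03:00.00' into seconds."""
    h, m, s = ts.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

def duration(start: str, end: str) -> str:
    """Return the span between two timestamps as a seconds string for -t."""
    return f"{to_seconds(end) - to_seconds(start):.3f}"

print(duration("0:03:00.00", "0:04:12.50"))  # 72.500
```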
[22:00] <benlieb> c_14: yeah
[22:01] <benlieb> prob will end up writing my own duration class
[22:01] <benlieb> grr
[22:02] <benlieb> all good though I guess
[22:09] <benlieb> c_14: I wonder if ffmpeg is the right tool for what I'm doing
[22:10] <benlieb> I have fallen into this odd service of converting instructional dance DVDs into a digitally distributable format.
[22:11] <benlieb> It requires adding intros, and splitting the dvd at various points, adding music, etc. But I'm not super happy with the product I've achieved. I've managed to get a relatively automated system together, but it's buggy and unwieldy.
[22:11] <benlieb> But here's an example of the 'finished' product:
[22:11] <benlieb> c_14: http://www.idance.net/en/packs/465-the-22-foundation-patterns-of-west-coast-swing
[22:11] <benlieb> do you have any tools in mind that might be better suited for this?
[22:12] <benlieb> using a gui would take way too long and be more tedious.
[22:14] <ChocolateArmpits> Did you look into Avisynth?
[22:15] <benlieb> ChocolateArmpits: hm. Hadn't heard of it.
[22:15] <benlieb> ChocolateArmpits: how does it compare to ffmpeg?
[22:16] <ChocolateArmpits> Avisynth allows you to use video filters to process video; you can write filters, share filters
[22:16] <ChocolateArmpits> It's a filter processor
[22:16] <ChocolateArmpits> of sorts
[22:17] <benlieb> a filter processor?
[22:17] <ChocolateArmpits> well, ok, Wiki gives a better description of a "GUI-less video editor"
[22:17] <benlieb> I've found it extremely unintuitive to work with ffmpeg to do basics of cropping, cutting, concatenating etc.
[22:18] <benlieb> is it more intuitive?
[22:18] <ChocolateArmpits> Well if "vid1 ++ vid2" to get a concatted video is intuitive, then yes
[22:19] <ChocolateArmpits> http://en.wikipedia.org/wiki/AviSynth
[22:19] <ChocolateArmpits> Just read the page
[22:19] <benlieb> that sounds good.
[22:20] <benlieb> considering this is what I do to add an intro (already made) in ffmpeg. https://gist.github.com/pixelterra/c46cfc699caba052d543
[22:21] <benlieb> ug
[22:21] <ChocolateArmpits> You can load Avisynth scripts into ffmpeg as a regular input, but only 32bit version is supported
[22:22] <ChocolateArmpits> 32bit compile of ffmpeg that is
[22:23] <ChocolateArmpits> There is also Vapoursynth, similar thing, but newer and uses Python http://www.vapoursynth.com/
[22:24] <benlieb> Those look really cool.
[22:24] <ChocolateArmpits> You can generate text in ImageMagick, save as an image file with alpha and then apply it in Avisynth over a video via Layer filter
[22:25] <ChocolateArmpits> I use Avisynth myself to generate videos with different video/subtitle combinations
[22:27] <benlieb> ChocolateArmpits: Awesome. Is Vapoursynth more or less powerful / popular?
[22:27] <ChocolateArmpits> Vapoursynth doesn't have as many filters, so obviously less functionality, but it should be faster than Avisynth, at least from my few tests
[22:28] <benlieb> ChocolateArmpits: speed isn't my main concern
[22:29] <ChocolateArmpits> In general Vapoursynth tries to position itself as a logical continuation to Avisynth
[22:29] <ChocolateArmpits> even though there are projects, such as Avisynth+, which try to bring Avisynth more up to date by adding features such as 64bit support and proper multithreading
[22:29] <benlieb> ChocolateArmpits: since you're in ffmpeg chanel, you clearly still need / use ffmpeg?
[22:30] <ChocolateArmpits> I'd probably go insane having to use an NLE every time I want to transcode a video
[22:30] <benlieb> nle?
[22:30] <ChocolateArmpits> Non-Linear Editor
[22:30] <ChocolateArmpits> You're not familiar with broadcast terms?
[22:31] <ChocolateArmpits> or rather, video in general
[22:31] <benlieb> ChocolateArmpits: lol, nope
[22:31] <ChocolateArmpits> It stands for video editor software
[22:31] <ChocolateArmpits> editing*
[22:31] <ChocolateArmpits> Such as Premiere, Final Cut
[22:31] <benlieb> I'm a web developer, and somehow stumbled into video accidentally with a project
[22:32] <benlieb> now I have thousands of videos on my hands and a huge stack of dvds to process.
[22:32] <ChocolateArmpits> I see
[22:35] <ChocolateArmpits> Any more questions ?
[22:35] <benlieb> so for my general use case I need to chop a dvd into pieces, add an intro, make previews, and stills
[22:35] <benlieb> ChocolateArmpits: ^
[22:35] <benlieb> well I need to make the intro too
[22:36] <benlieb> I'm using .ass, a background video, and an mp3 to make a pretty simple intro
[22:36] <ChocolateArmpits> The intro consists of just an overlayed text that fits the topic the video is for ?
[22:37] <benlieb> I grab the text from the db and use an .ass template file to output another .ass file that I use for the ffmpeg filter.
[22:37] <benlieb> could avisynth do all of this?
[22:38] <ChocolateArmpits> Well, in this case Avisynth will only overlay the subtitle
[22:38] <ChocolateArmpits> via a subtitle plugin, Assrender or VsFilter(mod)
[22:38] <benlieb> can it do cropping / cutting?
[22:38] <ChocolateArmpits> Yes
[22:38] <ChocolateArmpits> Trim(in_point,out_point)
[22:39] <benlieb> does it do audio noise reduction?
[22:39] <ChocolateArmpits> Eh, it's not really audio-oriented
[22:39] <benlieb> ChocolateArmpits: can it layer multiple videos with alpha channels ?
[22:40] <ChocolateArmpits> Yes, Layer(bg_video,top_video)
[22:40] <ChocolateArmpits> You can use video as a mask for another video
[22:40] <benlieb> that sounds like more what I need.
[22:40] <benlieb> but no audio processing?
[22:41] <ChocolateArmpits> Well there are filters, both internal and external http://avisynth.nl/index.php/External_filters#Audio_Filters http://avisynth.nl/index.php/Internal_filters#Audio_processing_filters
[22:41] <ChocolateArmpits> Externals seem to offer SoX, so maybe that has some noise correction feature
[22:41] <ChocolateArmpits> You can also use SoX CLI outside avisynth
[22:42] <ChocolateArmpits> At least I do all my sound processing outside Avisynth and then mux with ffmpeg
[22:43] <benlieb> yeah, this is all immensely complex. i probably bit off more than is reasonable
[22:43] <benlieb> I have also been storing data in a db that corresponds to all of the parameters needed to (re)generate all resources from their source file
[22:43] <benlieb> this includes intro music, font size and color etc, video background, dimensions...
[22:44] <ChocolateArmpits> Is the noise quantifiable?
[22:44] <benlieb> so if something goes wrong or missing and is noticed 1 year from now, I'd be able to regenerate all videos, thumbs etc from the db.
[22:44] <benlieb> ChocolateArmpits: hrn?
[22:44] <benlieb> hm?
[22:44] <benlieb> ChocolateArmpits: the noise is different for every recording
[22:45] <benlieb> I've been manually using audacity
[22:45] <ChocolateArmpits> Oh, so different frequencies and levels ?
[22:45] <benlieb> I'm basically converting dvds to purchasable sections
[22:48] <ChocolateArmpits> ok, SoX has a noise reduction filter, http://sox.sourceforge.net/sox.html search for "noisered"
[22:48] <ChocolateArmpits> But if the noise pattern isn't systematic it will be hard to apply it
[22:49] <Diogo> Hi, is it possible, when generating ts segments for HLS, to change the duration of each segment?
[22:49] <benlieb> ChocolateArmpits: you usually have to sample it.
[22:49] <benlieb> ChocolateArmpits: but iMovie has a great noise filter that is very smaret
[22:49] <benlieb> smart
[22:56] <ChocolateArmpits> Hmm, do the videos have the same audio mix level ?
[22:58] <ChocolateArmpits> benlieb ^
[22:59] <benlieb> nope, but it's not really that important
[22:59] <benlieb> I mostly am concerned with making attractive intros, and cutting videos easily. maybe some filters for contrast etc
[23:00] <ChocolateArmpits> Well, you can detect audio silence, when it goes below a threshold level, then use the value to sample silent section and then use the sampled noise for the noisered
[23:00] <ChocolateArmpits> So it's possible to some degree to automate
[23:00] <ChocolateArmpits> You would have to average the levels beforehand so the silence is at relatively the same level through the recordings
[23:01] <ChocolateArmpits> Silence detection can be done with ffmpeg https://www.ffmpeg.org/ffmpeg-filters.html#silencedetect
[23:01] <ChocolateArmpits> from there you would move with the extracted value to SoX
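The hand-off ChocolateArmpits describes can be scripted. As a hedged sketch: the silencedetect filter (run e.g. as `ffmpeg -i in.wav -af silencedetect=n=-30dB:d=0.5 -f null -`) logs lines containing `silence_start:` and `silence_end:` on stderr, which are easy to scrape before passing an interval on to SoX:

```python
import re

# Pull (start, end) silence intervals out of ffmpeg's silencedetect log.
# The line layout below matches the filter's usual output; adjust the
# patterns if your build formats the log differently.

def silence_intervals(log: str):
    starts = [float(m) for m in re.findall(r"silence_start: ([\d.]+)", log)]
    ends = [float(m) for m in re.findall(r"silence_end: ([\d.]+)", log)]
    return list(zip(starts, ends))

sample = (
    "[silencedetect @ 0x1] silence_start: 12.5\n"
    "[silencedetect @ 0x1] silence_end: 14.0 | silence_duration: 1.5\n"
)
print(silence_intervals(sample))  # [(12.5, 14.0)]
```

From an interval like this you could cut a noise sample and feed it to SoX's noisered profile step, as suggested above.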
[23:03] <benlieb> ChocolateArmpits: would it be wisest to start with avisynth or go directly to vapoursynth?
[23:03] <ChocolateArmpits> If you don't know Python, then Avisynth
[23:04] <benlieb> ChocolateArmpits: I know basic python
[23:04] <benlieb> do you think the features of VS are catching up to AS?
[23:06] <ChocolateArmpits> I have only started following it recently, but some of the essential plugins are getting native rewrites, rather than using a plugin to load avisynth native plugins, so there is some implied migration
[23:08] <ChocolateArmpits> I would probably choose Avisynth, as I'm not familiar with Python, I only installed Vapoursynth to see how it performs and general interest
[23:08] <ChocolateArmpits> My future outlook on Vapoursynth is better than on Avisynth though
[00:00] --- Fri Sep 26 2014

