[Ffmpeg-devel-irc] ffmpeg.log.20161219
burek
burek021 at gmail.com
Tue Dec 20 03:05:01 EET 2016
[00:01:53 CET] <MoonOwl> done
[00:02:00 CET] <MoonOwl> You were right
[00:05:46 CET] <MoonOwl> The output for part of what I'm trying to compress http://pastebin.com/Zv0Q7xNR
[00:08:32 CET] <MoonOwl> Thanks
[00:12:51 CET] <furq> isn't 20 a really low crf for x265
[00:13:17 CET] <furq> apparently 28 is equivalent to x264 23
[00:13:23 CET] <MoonOwl> I am backing up my DVD's
[00:13:26 CET] <furq> so i guess you want 24-25
[00:13:50 CET] <MoonOwl> I thought the CRF's for x264 and x265 corresponded with each other
[00:13:53 CET] <furq> they don't
[00:14:38 CET] <furq> i assume -preset ultrafast was just for that paste
[00:14:38 CET] <MoonOwl> Pink Floyd's Pulse DVD has very poor quality and trying to rip it is a painful experience
[00:14:45 CET] <furq> ultrafast will compress worse than x264 at the same encoding speed
[00:14:53 CET] <MoonOwl> It was just for the paste
[00:14:56 CET] <furq> ok good
[00:15:16 CET] <furq> anything below -preset medium with x265 is a waste of time, x264 will do the same or better at equivalent speed
[00:15:26 CET] <furq> usually, anyway
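For reference, an invocation along the lines being discussed might look like this (file names are hypothetical; the CRF follows furq's suggestion of 24-25 for x265):

    ffmpeg -i title01.vob -c:v libx265 -preset medium -crf 24 -c:a copy backup.mkv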
[00:22:07 CET] <MoonOwl> A higher average bitrate was obtained with -preset medium all else being the same http://pastebin.com/qwX2vAUf
[00:23:07 CET] <MoonOwl> I thought the slower the preset, the better the compression under normal circumstances
[00:26:42 CET] <MoonOwl> Is x265 like the video counterpart of HE-AACv2, the codec Nokia used to ship for syncing to symbian phones in how it relates to its predecessor
[00:29:56 CET] <DHE> super sport version, x265 is an encoder for the H265 codec, intended to be the successor to H264.
[00:33:53 CET] <furq> yeah it's more akin to the difference between xvid and x264
[00:34:25 CET] <furq> maybe not quite that pronounced though
[00:34:30 CET] <furq> x264 is still pretty competitive
[00:35:09 CET] <MoonOwl> "-preset fast" compressed better than "-preset medium" but still did not perform like "-preset ultrafast". It seems HEVC's implementations, x265 in particular, are still work in progress
[00:35:18 CET] <furq> well x264 is the same way
[00:35:40 CET] <furq> slower presets will generally look better rather than compress better at the same crf
[00:35:42 CET] <MoonOwl> Nero's AVC codec was beautiful
[00:36:30 CET] <MoonOwl> But furq, I thought crf implied constant quality
[00:36:56 CET] <MoonOwl> And all you could ever do in terms of fiddling with settings would determine how strong the compression is
[00:37:06 CET] <furq> constant given the same settings other than crf
[00:37:39 CET] <furq> some of the settings defined by the presets make a massive difference, e.g. cabac is disabled with -preset ultrafast
[00:39:00 CET] <furq> qp is true constant quality, but it's not recommended
[00:39:21 CET] <MoonOwl> I saw in the docs
[00:39:23 CET] <furq> i guess you could use it for benchmarking how much difference the presets make, but it's probably not useful information
[00:39:54 CET] <furq> i'll admit it's unintuitive
[00:41:13 CET] <MoonOwl> Do you think video encoding will ever have a FLAC
[00:41:25 CET] <furq> it already does
[00:41:50 CET] <MoonOwl> A lossless standard that makes some lossy transcoding a waste
[00:41:54 CET] <MoonOwl> Which one?
[00:42:00 CET] <furq> well not in terms of usability
[00:42:08 CET] <furq> most lossless video codecs beat flac in terms of average compression ratio
[00:42:17 CET] <furq> but rawvideo is a few orders of magnitude bigger than cd audio
[00:43:21 CET] <furq> 1080p60 yuv420p rawvideo is about 1.5gbps
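The arithmetic behind that figure, with yuv420p at 12 bits per pixel:

    1920 \times 1080 \times 12\,\mathrm{bit} \times 60\,\mathrm{fps} = 1{,}492{,}992{,}000\ \mathrm{bit/s} \approx 1.49\ \mathrm{Gbit/s}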
[00:44:14 CET] <MoonOwl> Most cheap consumer hardware even with flash storage is incapable of such bitrates
[00:44:39 CET] <furq> well you'd need a 90% compression ratio for me to be able to stream that on my internet
[00:44:44 CET] <furq> and my internet is pretty good
[00:45:46 CET] <MoonOwl> HD Audio and 12 bit 4k... some claim these are marketing ripoffs
[00:45:49 CET] <furq> x264 lossless and ffv1 can do better than 2:1 on a good day
[00:46:04 CET] <furq> but yeah, you'd need more like 20:1 really
[00:46:10 CET] <llamapixel> Just to clarify, no uncompressed standard at the moment is really uncompressed.. The lens and capture rate is a form of compression when it is captured. Just be happy if the content is crisp enough and double the intended usage if you are compositing.
[00:46:17 CET] <furq> christ not this again
[00:46:43 CET] <MoonOwl> I'm sorry furq for bringing it up. I'll withdraw it
[00:47:00 CET] <furq> not you
[00:47:03 CET] <furq> remind me what hd audio is
[00:47:20 CET] <llamapixel> It should be understood so there is no confusion with people who assume it is the ultimate.
[00:47:38 CET] <phillipk> I updated my question that was also discussed here--http://ffmpeg.gusari.org/viewtopic.php?f=11&t=3329&p=9970#p9970
[00:48:21 CET] <llamapixel> uncompressed is a misnomer that can confuse people entering the subject.
[00:48:45 CET] <furq> you're the only person i've ever seen get confused about it
[00:48:59 CET] <phillipk> it appears to me, -filter_complex_script (which points to a "command file" --text file with complex filter) has an upper limit for the size of the command file
[00:49:05 CET] <llamapixel> I have no confusion about it. hence why I am not asking questions that relate to it.
[00:49:30 CET] <furq> phillipk: post a bug report i guess
[00:50:46 CET] <phillipk> @furq you think that's appropriate? I noticed some optimizations for captions must address this because you can have a huge data file containing the captions (right?)
[00:51:07 CET] <furq> people normally use libass for captions
[00:51:20 CET] <furq> i've never seen anyone doing it with -filter_complex_script, but who knows
[00:51:33 CET] <phillipk> well, I'm not doing captions but it seems similar
[00:52:05 CET] <MoonOwl> HD Audio is marketed as audio with a sample rate of 96 or 192 kHz and a bit depth of 24 bits
[00:52:09 CET] <phillipk> I am wanting to overlay a png on any frame in any location--with data drawn from a text file. (though originally, I thought maybe I could cram it into the command itself)
[00:52:12 CET] <furq> oh
[00:52:15 CET] <furq> yeah that's snake oil
[00:52:34 CET] <furq> 24-bit is only useful for mastering, and >48khz is completely useless
[00:53:03 CET] <furq> high-bit-depth 4k isn't really a marketing thing but very few people have a good enough display to notice the difference
[00:53:40 CET] <furq> i mean it's a marketing thing in that it's a way to get you to buy a new expensive tv and buy all your films again, but there is actually a noticeable difference
[00:55:29 CET] <furq> phillipk: i take it you couldn't get sendcmd to work
[00:55:43 CET] <MoonOwl> This reminds me of how people were supposed to get excited about paying extra for 720p broadcasts, when what they wanted was an upgrade from 576p to 1080p, not a downgrade to 480p so that 720p encoded content could be sold. This happened with one of our local satellite providers
[00:57:01 CET] <MoonOwl> I always ask myself the question: How many ordinary people are willing to fork out extra for 4k tv when almost no one is streaming their content at that resolution
[00:57:17 CET] <furq> is anyone anywhere broadcasting 4k yet
[00:57:32 CET] <thebombzen_> yea. it's somewhat unfair to compare to x264 because it's so good
[00:58:03 CET] <thebombzen_> given that HEVC has like 10 years fewer of development
[00:58:18 CET] <thebombzen_> you can't expect something new like x265 to be anywhere as good relative to the bitstream spec that x264 is
[00:58:44 CET] <DHE> x264 is over a decade old now. x265 not so much
[00:58:55 CET] <iive> x264 got quite good quite fast
[00:58:59 CET] <thebombzen_> my guess is in a few years x265 will be far far better than x264. nowadays x264 is still preferable over x265 in many cases because x265 is so damn slow
[00:59:05 CET] <furq> tbh i would probably stick with x264
[00:59:17 CET] <furq> x265's future isn't as certain as i'd like
[00:59:18 CET] <DHE> I could disagree. frame-based threading took a while.
[00:59:43 CET] <thebombzen_> although tbh HEVC seems like it's a transitional technology
[00:59:56 CET] <thebombzen_> H.264 is like mpeg2 as in, super standard around forever
[00:59:58 CET] <MoonOwl> http://www.directv.com/technology/4k
[01:00:06 CET] <furq> i mean av1 will probably not be ready for a good few years
[01:00:15 CET] <furq> and there's no other path forward atm
[01:00:19 CET] <thebombzen_> whereas mpeg4-asp was never fully adopted
[01:00:31 CET] <thebombzen_> I feel like HEVC will be like mpeg4asp
[01:00:42 CET] <furq> yeah the licensing situation is very unhelpful to its cause
[01:00:46 CET] <thebombzen_> ideally the next generation after HEVC and vp9 will work.
[01:00:51 CET] <furq> but also x264 is so embedded now
[01:00:54 CET] <thebombzen_> specifically Daala
[01:00:55 CET] <furq> s/x/h/
[01:01:10 CET] <thebombzen_> Daala looks very promising
[01:01:13 CET] <furq> aren't the daala guys working on av1 now
[01:01:15 CET] <MoonOwl> ASP was bombed on us in Southern Africa as THE codec
[01:01:27 CET] <thebombzen_> well I can't speak for southern africa
[01:01:32 CET] <thebombzen_> I'm in North America
[01:01:37 CET] <furq> i heard some daala tech was going to make it into av1, but then i also heard it wasn't going to any more
[01:01:59 CET] <thebombzen_> well the whole point of daala was that they invented a deblocking filter with a Right Inverse
[01:02:01 CET] <thebombzen_> which is a big deal
[01:02:12 CET] <furq> i thought the whole point was that it didn't use IDCT
[01:02:23 CET] <furq> which is a big patent troll vector
[01:02:35 CET] <thebombzen_> well the DCT is just math
[01:02:40 CET] <thebombzen_> that's not patented
[01:03:13 CET] <thebombzen_> DCT is just a change of basis
[01:03:26 CET] <thebombzen_> it's essentially just linear algebra
[01:03:49 CET] <thebombzen_> either way afaik the point of daala is that there's a deblocking filter P with a right inverse (P')
[01:03:53 CET] <furq> In addition to the technical freedom of starting fresh, this new design consciously avoids most of the patent thicket surrounding mainstream block-DCT-based codecs. At its very core, for example, Daala is based on lapped transforms, not the traditional DCT.
[01:04:01 CET] <MoonOwl> BBC should go back into the game of codec development
[01:04:05 CET] <thebombzen_> huh
[01:04:17 CET] <thebombzen_> either way the right-invertible deblocking filter is a big deal
[01:04:20 CET] <furq> sure
[01:04:32 CET] <thebombzen_> cause if P' is a right inverse of P then you can take P' and then do the transform stuff and then take P
[01:04:37 CET] <furq> you get that for free with the lapped transform don't you
[01:04:41 CET] <thebombzen_> I don't know
[01:04:52 CET] <furq> well i'm no expert on this stuff but that's the impression i got from this article
[01:04:52 CET] <thebombzen_> isn't "lapped transform" like the MDCT used in vorbis
[01:05:47 CET] <furq> something like that
[01:05:55 CET] <thebombzen_> I just find it very weird that P has a right inverse. it's much harder to find a right inverse than a left inverse
[01:05:57 CET] <thebombzen_> so that's nice
[01:06:43 CET] <thebombzen_> but either way I still don't see the issue with DCT
[01:06:47 CET] <thebombzen_> it's just a change of basis
[01:06:57 CET] <thebombzen_> I'm not sure how you can patent that
[01:07:00 CET] <furq> MoonOwl: some of the bbc r&d guys are apparently working on av1
[01:07:32 CET] <furq> thebombzen_: i imagine you can patent using a DCT to compress video in some way
[01:08:20 CET] <furq> oh what
[01:08:25 CET] <furq> http://www.bbc.co.uk/rd/blog/2016/09/turing-codec
[01:08:28 CET] <furq> this is news to me
[01:08:28 CET] <MoonOwl> I hope daala matures and grows in acceptance.
[01:09:11 CET] <thebombzen_> but furq the original idea to use DCT to compress video was actually JPEG
[01:09:17 CET] <thebombzen_> and everything's a derivative of that
[01:09:20 CET] <thebombzen_> afaik the JPEG patent expired
[01:09:22 CET] <furq> shrug
[01:09:24 CET] <furq> i'm not a patent lawyer
[01:09:33 CET] <thebombzen_> patenting an algorithm seems really weird to me
[01:09:37 CET] <furq> or a video encoder writer
[01:09:54 CET] <thebombzen_> especially since the algorithm is simple enough that I could explain it to a college student after one semester of linear algebra
[01:10:15 CET] <MoonOwl> Thank God patents expire
[01:10:26 CET] <furq> but yeah it looks like some daala stuff is on the table for av1
[01:10:29 CET] <thebombzen_> I tutor linear algebra students at uni and I could explain to them after the semester is over how jpeg works
[01:10:35 CET] <MoonOwl> MP3's patents have expired in most places
[01:10:47 CET] <furq> well it was never patented in most places because most places don't honour software patents
[01:10:55 CET] <furq> the last patent expires in the US at the end of next year
[01:10:57 CET] <thebombzen_> well fraunhofer wanted to charge royalties for anything distributed in the mp3 format
[01:11:04 CET] <thebombzen_> which is just stupid
[01:11:15 CET] <thebombzen_> what a great way to get people to not use your technology
[01:11:26 CET] <furq> isn't that pretty much what you have to do for h264
[01:11:29 CET] <thebombzen_> no
[01:11:55 CET] <thebombzen_> Fraunhofer wanted to make it so if I put an mp3 on my website of music I made in my garage, they'd get royalties
[01:12:12 CET] <thebombzen_> even if I paid a licensee to encode it
[01:12:21 CET] <thebombzen_> and it could only be played on a licensed player
[01:12:26 CET] <furq> oh
[01:12:32 CET] <furq> yeah that is a bit more restrictive then
[01:12:34 CET] <MoonOwl> It's a fair thing to ask for given what mp3 did for the consumer world
[01:12:39 CET] <thebombzen_> well they never ended up doing it
[01:12:47 CET] <thebombzen_> because 98 percent of people refused to pay it
[01:13:09 CET] <furq> they probably only decided that would be a good idea after it already took off
[01:13:23 CET] <thebombzen_> maybe. but they never ended up with that stuff
[01:13:37 CET] <thebombzen_> the problem with software patents is that patents are supposed to encourage innovation
[01:13:37 CET] <furq> if only hevc had the same foresight
[01:13:39 CET] <MoonOwl> MP3 became the dreaded infrastructure technology of audio distribution
[01:14:03 CET] <MoonOwl> Has anyone ever used dirac?
[01:14:18 CET] <thebombzen_> patents are supposed to encourage innovation by making it so there's no risk in publishing an invention
[01:14:19 CET] <furq> did dirac ever get beyond the experimental stage
[01:14:34 CET] <thebombzen_> the issue is that software and codec patents in general don't serve to encourage innovation
[01:14:38 CET] <thebombzen_> they essentially do nothing but hinder it
[01:14:57 CET] <MoonOwl> It's all about intellectual property
[01:15:01 CET] <thebombzen_> the fact that the mp3 patents are a pain in the ass makes it not worth it
[01:15:10 CET] <MoonOwl> Not innovation
[01:15:35 CET] <thebombzen_> patents in theory are to encourage innovation. because the idea is to have no fear of publishing an invention or making it public
[01:15:45 CET] <thebombzen_> they just don't work that way in practice with respect to software patents and codecs
[01:15:47 CET] <MoonOwl> I always wondered how LAME got away with implementing MP3 and distributing the binaries of their codec
[01:15:59 CET] <thebombzen_> I don't think they do
[01:16:07 CET] <thebombzen_> afaik LAME was source-only
[01:16:11 CET] <thebombzen_> which is weird
[01:16:12 CET] <thebombzen_> cause like
[01:16:29 CET] <thebombzen_> patent laws are such that LAME is source-only
[01:16:37 CET] <thebombzen_> like who cares if it's in C or in machine code
[01:16:57 CET] <MoonOwl> Anything can be disassembled
[01:16:57 CET] <thebombzen_> how is that any different. but apparently someone managed to convince the US gov't that source code was protected first amendment speech
[01:17:20 CET] <thebombzen_> which is IMO wrong
[01:17:25 CET] <thebombzen_> like that's not what speech is
[01:17:31 CET] <c_14> Why not?
[01:17:48 CET] <c_14> It's just like making a movie.
[01:17:52 CET] <c_14> And that's free speech.
[01:17:54 CET] <thebombzen_> it's a bit complicated. but not all "text" is "speech"
[01:18:02 CET] <thebombzen_> source code in particular is not english text
[01:18:05 CET] <c_14> So?
[01:18:10 CET] <c_14> Does it have to be?
[01:18:14 CET] <MoonOwl> I think 'speech' is now being used where 'expression' could have been used
[01:18:24 CET] <thebombzen_> well source code is an algorithm description
[01:18:36 CET] <thebombzen_> there's really no difference between source and binaries from a speech perspective
[01:18:39 CET] <c_14> So are books that describe how to build bombs.
[01:18:46 CET] <c_14> Those are protected under freedom of speech.
[01:19:00 CET] <thebombzen_> my point is that source and binary code shouldn't be different
[01:19:03 CET] <thebombzen_> but they are for some reason
[01:19:05 CET] <c_14> There is no difference between source code and binaries from a speech definition
[01:19:11 CET] <thebombzen_> actually yes there is
[01:19:13 CET] <c_14> But lawmakers/lawyers etc don't understand computers
[01:19:21 CET] <thebombzen_> that's my point
[01:19:37 CET] <thebombzen_> someone convinced lawmakers that source code is speech but binaries aren't
[01:19:38 CET] <thebombzen_> which is weird.
[01:19:56 CET] <thebombzen_> I can buy an argument that source code is speech only if binaries are too
[01:20:04 CET] <thebombzen_> but that's not how the law works for some weird reason
[01:20:15 CET] <thebombzen_> of course it doesn't matter that europe doesn't have protected speech
[01:20:22 CET] <thebombzen_> cause europe also doesn't have software patents ^_^
[01:20:29 CET] <c_14> Because the people who convinced the lawmakers thereof thought it would be easier to just do code.
[01:20:34 CET] <thebombzen_> but like
[01:20:39 CET] <thebombzen_> disassembly exists
[01:20:45 CET] <c_14> Because that looks more like speech
[01:20:45 CET] <thebombzen_> and some languages are even more disassemblable
[01:20:54 CET] <thebombzen_> Java can even be decompiled
[01:20:57 CET] <thebombzen_> not just disassembled
[01:21:06 CET] <furq> https://upload.wikimedia.org/wikipedia/commons/thumb/f/fd/Sample_09-F9_protest_art%2C_Free_Speech_Flag_by_John_Marcotte.svg/800px-Sample_09-F9_protest_art%2C_Free_Speech_Flag_by_John_Marcotte.svg.png
[01:21:08 CET] <MoonOwl> The assumption causing problems is that programming languages are all human languages
[01:21:11 CET] <furq> never forget
[01:21:17 CET] <thebombzen_> what about Brainfuck
[01:21:21 CET] <thebombzen_> is that human language
[01:21:45 CET] <thebombzen_> bf is essentially Turing Machines The Language
[01:21:45 CET] <furq> yeah any brainfuck program translates into english as "i fucking suck ass, don't talk to me"
[01:21:55 CET] <MoonOwl> Well, hieroglyphics were a human language
[01:22:04 CET] <thebombzen_> well bf can be compiled with a one-line bash script
[01:22:24 CET] <thebombzen_> because you can translate bf to C with a regular expression and a bit of boilerplate.
[01:23:02 CET] <thebombzen_> does that make it speech
[01:23:10 CET] <thebombzen_> if you can compile it to C
[01:23:13 CET] <thebombzen_> which is 'human readable'
[01:23:26 CET] <thebombzen_> (despite that making it significantly less readable)
[01:23:44 CET] <MoonOwl> Every language is human readable
[01:23:55 CET] <furq> what about java
[01:23:55 CET] <c_14> Well, no
[01:24:01 CET] <c_14> Every Human language is human readable
[01:24:03 CET] <MoonOwl> Otherwise, we wouldn't classify it as a language
[01:24:06 CET] <thebombzen_> furq: yes
[01:24:15 CET] <thebombzen_> not sure what you mean by that
[01:24:29 CET] <furq> me?
[01:24:36 CET] <thebombzen_> yea. what do you mean "what about java"
[01:24:41 CET] <furq> i'm saying that java sucks. it's a clever and original joke
[01:24:51 CET] <thebombzen_> well what you said is that java isn't human readable
[01:24:53 CET] <thebombzen_> which isn't true
[01:24:58 CET] <MoonOwl> For us to classify an object as a language, don't we first have to have an understanding of it? And by understanding a language, the language becomes readable to us
[01:25:19 CET] <thebombzen_> also it's unfair to say that Java sucks
[01:25:25 CET] <thebombzen_> at least it was designed
[01:25:42 CET] <thebombzen_> you may dislike it. I hate python but I don't think it sucks
[01:25:42 CET] <MoonOwl> Java is actually pretty compared to some of the things that I've tried to write
[01:26:06 CET] <furq> is there a distinction between "i dislike it" and "it sucks"
[01:26:11 CET] <thebombzen_> Yes
[01:26:14 CET] <thebombzen_> PHP sucks
[01:26:33 CET] <thebombzen_> I dislike Javascript
[01:26:40 CET] <thebombzen_> but it doesn't "suck" in the same way PHP does
[01:26:50 CET] <MoonOwl> Languages may suck but the beauty about them is that if you work hard enough you can generate them and even translate to the languages that are descriptions of the produced languages
[01:27:05 CET] <thebombzen_> Java in particular is actually a very well-designed language and does what it's supposed to do very well
[01:27:10 CET] <furq> there is no objective standard against which to define php as a bad language though
[01:27:15 CET] <furq> even though it obviously is
[01:27:23 CET] <thebombzen_> php doesn't do what it's designed to do
[01:27:29 CET] <thebombzen_> it creates net positive work for the programmer
[01:27:30 CET] <MoonOwl> At the end of the day whichever language we decide to use, the end-user could not care if we wrote in C++ or Lisp
[01:27:44 CET] <furq> it's designed to get a website up with minimal effort
[01:27:48 CET] <furq> if anything it does that too well
[01:27:54 CET] <c_14> Unless they happen to not have enough RAM to compile the program in C++ or don't have a Lisp REPL
[01:27:54 CET] <thebombzen_> not really
[01:28:08 CET] <furq> i said "up", not "working to an acceptable level"
[01:28:10 CET] <thebombzen_> c_14: I think they meant the binaries
[01:28:25 CET] <MoonOwl> PHP causes great fatigue
[01:28:45 CET] <thebombzen_> but either way the thing about Java is that it's very double-edged
[01:28:58 CET] <thebombzen_> there's a lot of things that people find very strange about it
[01:29:07 CET] <MoonOwl> Wasn't Java supposed to bring Lisp to C programmers?
[01:29:11 CET] <thebombzen_> no
[01:29:18 CET] <thebombzen_> Java has literally nothing to do with lisp
[01:29:27 CET] <furq> maybe you're thinking of javascript
[01:29:32 CET] <c_14> Well, they're both programming languages
[01:29:36 CET] <MoonOwl> No... Im referring to Java
[01:29:37 CET] <thebombzen_> Javascript is also nothing like lisp
[01:29:41 CET] <furq> which also has little to do with lisp, but it has an association
[01:29:46 CET] <thebombzen_> Java is an object-oriented C-syntax language
[01:29:56 CET] <thebombzen_> not sure how Lisp is relevant there
[01:30:08 CET] <thebombzen_> or rather. it's got a C-like syntax.
[01:30:08 CET] <furq> it's relevant to js because js was almost lisp
[01:30:16 CET] <furq> i have no idea what the association with java is
[01:30:20 CET] <thebombzen_> but I'm not sure how Java has anything do with lisp
[01:30:24 CET] <furq> either with lisp or javascript
[01:30:32 CET] <thebombzen_> Javascript was named after Java
[01:30:36 CET] <thebombzen_> which was a huge mistake
[01:30:46 CET] <MoonOwl> JavaScript to me is like a Scheme imitation that aspired to be like some relatively esoteric programming language and found itself looking like C to Mr Magoo and Stevie Wonder
[01:30:58 CET] <thebombzen_> JavaScript is an extension of ECMA
[01:31:12 CET] <thebombzen_> I dislike ECMA
[01:31:14 CET] <furq> brendan eich got hired to embed scheme into netscape
[01:31:25 CET] <thebombzen_> to be honest one of the reasons I like Java is that I hate weak typing
[01:31:31 CET] <furq> but then sun got involved and decided that it should be a language with c-family syntax
[01:31:37 CET] <furq> so eich hacked some bullshit together in a couple of weeks
[01:31:50 CET] <furq> and now, 20 years later, we get to reap the benefits of good planning
[01:32:02 CET] <thebombzen_> and Java is a strongly-typed object-oriented language
[01:32:08 CET] <thebombzen_> of which there are extremely few
[01:32:35 CET] <c_14> I mean, I like strong typing as well but I'd much rather write lisp than Java.
[01:32:49 CET] <c_14> The Java syntax is so excessive
[01:32:59 CET] <thebombzen_> Not really. depends on what you're trying to do
[01:33:02 CET] <MoonOwl> Where I took the quote https://news.ycombinator.com/item?id=2323963
[01:33:13 CET] <thebombzen_> If you're trying to write hello world? Yea it takes a lot of lines. Because it's not a scripting language.
[01:33:20 CET] <furq> don't forget the jvm
[01:33:28 CET] <thebombzen_> If you're trying to do something /really simple/ in Java then you're using the wrong tool.
[01:33:50 CET] <MoonOwl> Vala seems nicer than Java to me
[01:33:50 CET] <thebombzen_> so it's a bit silly to say that the syntax is so excessive
[01:34:06 CET] <MoonOwl> Java is not excessive
[01:34:10 CET] <thebombzen_> also I happen to like the fact that Java tends to have keywords rather than nonalphanumeric symbols
[01:34:24 CET] <furq> you'd love lua then
[01:34:24 CET] <thebombzen_> it makes for more characters typed but it's easier to type and it's easier to read
[01:34:54 CET] <MoonOwl> Java just happens to be a language for everyone
[01:35:01 CET] <llamapixel> lua is horrible spaghetti, had to manage some smart device games for McDonalds with that in Corona.
[01:35:12 CET] <thebombzen_> although a big thing about Java is that it's got a lot of keywords but it doesn't give up grouping symbols
[01:35:14 CET] <thebombzen_> that's a big deal
[01:35:18 CET] <MoonOwl> Where each individual has to tolerate everyone's preferences
[01:35:19 CET] <DHE> I actually like lua. a few ambiguous syntax issues, but mostly okay
[01:35:20 CET] <furq> lua does not seem like the weak link in that scenario
[01:35:37 CET] <furq> ambiguous syntax?
[01:35:38 CET] <thebombzen_> because people have a bad habit of making languages with keywords and forgetting that grouping symbols like () [] and {} are still extremely nice
[01:35:53 CET] <DHE> furq: with optional semicolons, it's possible to have ambiguous syntax when omitting them
[01:35:58 CET] <thebombzen_> what I mean for example is "extends" versus "::"
[01:36:03 CET] <furq> you never actually run into that though
[01:36:14 CET] <MoonOwl> A Go and C++ hybrid would be nice
[01:36:17 CET] <thebombzen_> lol you guys would love perl
[01:36:25 CET] <thebombzen_> Perl before Perl6 is parsed in realtime
[01:36:34 CET] <MoonOwl> Perl is not human readable
[01:36:35 CET] <thebombzen_> in some rare cases it can't be parsed
[01:36:35 CET] <furq> it's not doing asi like js where you have to remember all the bullshit asi rules
[01:36:45 CET] <thebombzen_> asi?
[01:36:54 CET] <furq> that only happens in one place in lua and it's if you end a line with ) and start the next one with (, which you never do in practice
[01:36:54 CET] <thebombzen_> automatic semicolon insertion?
[01:36:57 CET] <furq> yeah
[01:37:03 CET] <thebombzen_> oh lol fuck that
[01:37:10 CET] <thebombzen_> it should just give you a compiletime error
[01:37:21 CET] <thebombzen_> asi sounds like it's a kind of thing to reward incompetent programmers who don't care enough
[01:37:22 CET] <furq> if you mean lua, it does
[01:37:36 CET] <thebombzen_> like if someone wants a garage website
[01:37:39 CET] <furq> oh nvm you mean not terminating a statement with ; in js
[01:37:46 CET] <furq> i always just use the semicolons in js
[01:37:49 CET] <llamapixel> smells like measuring chat.
[01:37:52 CET] <thebombzen_> haha
[01:38:01 CET] <thebombzen_> I'm not sure why javascript inserts semicolons for you
[01:38:03 CET] <thebombzen_> I just
[01:38:09 CET] <furq> because it's a terrible mistake of a language
[01:38:12 CET] <thebombzen_> don't understand why they would allow you to be that lazy
[01:38:30 CET] <thebombzen_> furq: I like how if you want to convert a string to a number
[01:38:33 CET] <thebombzen_> you use the unary +
[01:38:37 CET] <furq> ;_;
[01:38:38 CET] <thebombzen_> +"5" === 5
[01:38:45 CET] <thebombzen_> yea
[01:38:52 CET] <furq> the worst thing about that line of code isn't what you just said
[01:38:53 CET] <furq> it's ===
[01:39:02 CET] <thebombzen_> that is what I just said
[01:39:13 CET] <thebombzen_> oh yea
[01:39:15 CET] <MoonOwl> JavaScript is a beautiful language waiting to be discovered... just like C++
[01:39:21 CET] <thebombzen_> the fact that they have === and ==
[01:39:28 CET] <furq> yeah exactly like C++
[01:39:33 CET] <furq> there is a good language in there somewhere
[01:39:40 CET] <c_14> Are you sure?
[01:39:40 CET] <furq> it's just surrounded by seven or eight garbage languages
[01:39:53 CET] <furq> and nobody quite agrees on which is the good subset
[01:39:53 CET] <thebombzen_> in Java they thought of that and the non-identity equals is just an instance method
[01:40:10 CET] <thebombzen_> which can (and should) be overridden as appropriate
[01:40:23 CET] <furq> i generally end up writing js as if it was lua
[01:40:32 CET] <thebombzen_> the most annoying thing about js
[01:40:34 CET] <MoonOwl> Prototype-based OOP vs class-based OOP... what do you prefer
[01:40:37 CET] <furq> and it's tolerable, even with the fucked-up type system
[01:40:46 CET] <thebombzen_> class-based is more readable
[01:40:51 CET] <thebombzen_> more maintainable
[01:40:55 CET] <thebombzen_> and much stricter
[01:41:08 CET] <thebombzen_> it's far easier to mash together a quick script in a prototype OOP language
[01:41:19 CET] <thebombzen_> but it's much easier to maintain a codebase in a class-based OOP language
[01:41:35 CET] <furq> composition is better than inheritance
[01:41:41 CET] <thebombzen_> composition?
[01:41:56 CET] <thebombzen_> what do you mean by composition
[01:42:08 CET] <furq> https://en.wikipedia.org/wiki/Composition_over_inheritance
[01:42:35 CET] <furq> usually expressed with interfaces, like in go
[01:42:50 CET] <thebombzen_> well
[01:42:59 CET] <thebombzen_> they're not mutually exclusive
[01:43:07 CET] <thebombzen_> you can have interfaces and inheritance
[01:43:36 CET] <MoonOwl> Interfaces can inherit each other
[01:43:44 CET] <furq> sounds awful
[01:43:50 CET] <thebombzen_> why is that awful
[01:43:51 CET] <MoonOwl> I know
[01:44:26 CET] <thebombzen_> what's wrong with interfaces having parent interfaces
[01:45:49 CET] <thebombzen_> I happen to like Java a lot which is why I use it for a lot of things
[01:45:53 CET] <phillipk> @furq, correct, -filter_complex_script works until I make the contents of the specified file sort of huge--around 3000 lines (but I suspect the limit depends on how long those 3000 lines are)
[01:45:58 CET] <thebombzen_> but it's not the best tool for everything
[01:46:15 CET] <thebombzen_> phillipk: why would you want a 3000 line filter_complex
[01:46:30 CET] <MoonOwl> Has anyone used OCaml?
[01:46:33 CET] <furq> different overlay for every frame
[01:46:35 CET] <thebombzen_> no
[01:46:39 CET] <furq> i've briefly used ocaml
[01:46:50 CET] <thebombzen_> furq: sounds like it's easier to extract pngs and do it in bash
[01:46:52 CET] <furq> it looks interesting but it has the D problem of having several stdlibs
[01:47:04 CET] <thebombzen_> I don't even know what OCaml is
[01:47:10 CET] <furq> and half the docs/libs/etc are for one and half are for the other
[01:47:28 CET] <thebombzen_> the thing about Java is that people have a hard time getting used to it
[01:47:33 CET] <furq> i think there's more than two ocaml stdlibs but only two are widely-used afaik
[01:47:34 CET] <thebombzen_> so most people either love it or hate it
[01:47:46 CET] <thebombzen_> stuff like the fact that all linking is dynamic
[01:47:47 CET] <furq> but yeah i've been meaning to learn some standard ML dialect properly
[01:47:49 CET] <thebombzen_> people find that very weird
[01:47:51 CET] <furq> i just don't have a project to use it for
[01:48:01 CET] <furq> it seems like a neat language though
[01:48:04 CET] <thebombzen_> furq: if you need a markup language use LaTeX
[01:48:08 CET] <thebombzen_> best markup language
[01:48:10 CET] <furq> not markup
[01:48:17 CET] <thebombzen_> ML = markup language?
[01:48:17 CET] <furq> https://en.wikipedia.org/wiki/Standard_ML
[01:48:25 CET] <furq> what ocaml is based on
[01:48:49 CET] <thebombzen_> oh it's a functional langauge
[01:49:01 CET] <thebombzen_> I use Mathematica for that but it's proprietary
[01:49:10 CET] <thebombzen_> my uni pays for it tho
[01:49:18 CET] <furq> i guess there's also F# if you're a .net guy
[01:49:22 CET] <thebombzen_> eww
[01:49:23 CET] <furq> that seems very similar to ocaml
[01:49:42 CET] <thebombzen_> .net is like
[01:49:55 CET] <thebombzen_> what happens when you could write something but you decide to make it windows only for no reason
[01:49:57 CET] <furq> there's also that other language that starts with an H whose name we can't say or someone who's really enthusiastic about it will come and ruin the channel
[01:50:07 CET] <thebombzen_> like C# is just fake java for windows-only
[01:50:47 CET] <c_14> furq: well, that's just mean
[01:50:55 CET] <thebombzen_> something that seems weird to me
[01:50:58 CET] <thebombzen_> is how huge GHC si
[01:51:01 CET] <thebombzen_> is*
[01:51:07 CET] <thebombzen_> it's like a 2 GB package
[01:51:25 CET] <furq> i tried it once and didn't even write a line of code
[01:51:34 CET] <furq> cabal shat the bed when i tried to install the library i wanted to use
[01:51:45 CET] <furq> i'm told that hasn't got any better in the past few years
[01:51:46 CET] <thebombzen_> sorry it's 1146 MiB
[01:51:50 CET] <thebombzen_> GHC is weird
[01:52:14 CET] <thebombzen_> cause for some reason the people who wrote Haskell didn't realize that maybe there's better things to do than include it four times
[01:52:23 CET] <furq> you said it
[01:52:25 CET] <furq> you said the H word
[01:52:28 CET] <thebombzen_> you'd think the authors of Haskell would have thought of a workaround to including it four times
[01:52:29 CET] <furq> you know they all have it on highlight
[01:52:45 CET] <c_14> The worst part about haskell isn't the language, it's the ecosystem.
[01:52:48 CET] <furq> now they're all coming to tell us about monads
[01:52:49 CET] <furq> or monoids
[01:52:53 CET] <furq> or whatever the fuck it is this week
[01:53:02 CET] <thebombzen_> A monad is like a burrito that takes in a cat and spits out another burrito
[01:53:09 CET] <furq> wow! it's so clear now!
[01:53:16 CET] <thebombzen_> that's the best description I've ever heard of a monad
[01:53:23 CET] <furq> every other language i've ever used is now worthless
[01:53:24 CET] <thebombzen_> I hope that clears it up
[01:53:43 CET] <furq> i shall shave my head and embark on a pilgrimage to the church of never shutting the fuck up about haskell on irc
[01:54:05 CET] <furq> only my head, though. not my beard
[01:54:50 CET] <furq> i'm debating picking this ocaml book back up now
[01:55:44 CET] <thebombzen_> so what a Monad actually is, it's a functor from a category to itself, coupled with two natural transformations
[01:56:01 CET] <furq> oh no
[01:56:05 CET] <furq> the killer was inside the house all along
[01:56:26 CET] <thebombzen_> whereas a natural transformation in this context is a morphism in the category of functors between two other categories
[01:56:31 CET] <thebombzen_> I have no idea what it means in Haskell
[01:56:35 CET] <thebombzen_> I'm just a mathematician
[01:57:02 CET] <thebombzen_> I'm not just a mathematician. I'm a weird mathematician who likes mathematical logic
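For what it's worth, the standard category-theory definition being paraphrased above (independent of Haskell) is:

    T\colon \mathcal{C}\to\mathcal{C}, \qquad \eta\colon \mathrm{Id}_{\mathcal{C}} \Rightarrow T, \qquad \mu\colon T\circ T \Rightarrow T,
    \quad\text{with}\quad \mu\circ T\mu = \mu\circ\mu T \quad\text{and}\quad \mu\circ T\eta = \mu\circ\eta T = \mathrm{id}_T.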
[02:03:37 CET] <phillipk> @furq and @thebombzen_ are you thinking create ~3000 png files and turn them into single alpha channel video that I then overlay?
[02:03:55 CET] <furq> i didn't suggest that, but it would probably work
[02:04:14 CET] <phillipk> right I think thebombzen_ did
[02:04:25 CET] <phillipk> or something that sounded like that to me.
[02:04:32 CET] <thebombzen_> by the way it doesn't ping me if you precede my name with an @
[02:04:42 CET] <phillipk> oh--thanks
[02:05:17 CET] <thebombzen_> although if you're trying to do 3000 different overlays for 3000 different frames
[02:05:29 CET] <thebombzen_> I do not see any reason not to do it separately
[02:06:02 CET] <furq> what i will say is that you'll definitely want to use -v error if you do that
[02:06:08 CET] <thebombzen_> ha
[02:06:17 CET] <phillipk> I just had visions that my filter complex could access a subroutine right inside the dynamic expression
[02:06:38 CET] <thebombzen_> I mean it's possible you could do it with one filter_complex
[02:06:46 CET] <thebombzen_> but it seems like you really don't want to
[02:06:54 CET] <furq> he tried that, it didn't work
[02:06:59 CET] <thebombzen_> especially since if you do it separately you can make it faster with GNU Parallel
[02:07:01 CET] <phillipk> I'm fine changing tack--already did a million times anyway
[02:07:02 CET] <furq> ffmpeg bailed out on loading the file
[02:07:10 CET] <furq> also use xargs, not parallel
[02:07:18 CET] <thebombzen_> xargs doesn't parallelize afaik
[02:07:20 CET] <thebombzen_> parallel does
[02:07:21 CET] <furq> it does
[02:07:28 CET] <furq> and you already have xargs
[02:07:37 CET] <thebombzen_> true but xargs also doesn't support find-like syntax
[02:07:38 CET] <phillipk> what does xargs apply to?
[02:07:44 CET] <thebombzen_> like {.} and {/}
[02:07:49 CET] <thebombzen_> parallel does which is nice
[02:08:10 CET] <thebombzen_> also how do you even do that with xargs
[02:08:17 CET] <phillipk> yeah, I need to learn all that syntax--but there's no syntax to "go get line 200 of my 20,000 line file"?
[02:08:38 CET] <thebombzen_> xargs just takes several tokens on stdin and builds one command out of them
[02:08:51 CET] <thebombzen_> parallel actually executes the one command per input token
[02:09:13 CET] <phillipk> I wonder if that can help increase my 2000 limit
[02:09:36 CET] <phillipk> because, my filter_complex_script thing does work...
[02:10:16 CET] <furq> i can do everything with xargs except increment the output filename
[02:10:25 CET] <furq> which there is probably some way of doing but i've never needed it before
[02:10:44 CET] <thebombzen_> furq: http://hastebin.com/uzidafafav.cpp
[02:10:46 CET] <thebombzen_> they don't do the same thing
[02:10:56 CET] <furq> printf "overlay=10,20\0overlay=30,40" | xargs -0 -n1 -P2 -I{} ffmpeg -i foo -i bar -filter_complex "{}" ??.png
[02:11:14 CET] <furq> thebombzen_: -P
[02:11:54 CET] <furq> also obviously you wouldn't actually use printf there, you'd cat a text file
[02:12:32 CET] <thebombzen_> well clearly I don't know xargs syntax
[02:12:52 CET] <thebombzen_> but I'm not sure how you can use xargs to reproduce the behavior of parallel in the above paste
[02:12:55 CET] <furq> cat overlays | xargs -d'\n' -n1 -P2 -I{} ffmpeg -i foo -i bar -filter_complex "{}" ??.png
[02:12:58 CET] <furq> something like that
[02:13:04 CET] <furq> idk how you'd increment the output filename though
[02:13:22 CET] <thebombzen_> oh that? I'd use bash haha
[02:13:52 CET] <phillipk> I think I'll try making 1000s of pngs and then assemble into a single video
[02:13:55 CET] <thebombzen_> cat overlays | while read line; do ...; i=$((i+1)); done
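A fleshed-out sketch of that while-read pattern (input names, the "overlays" file, and the single-frame output are all made up for illustration):

    #!/bin/sh
    # Run one ffmpeg per overlay expression read from "overlays", numbering the outputs.
    i=0
    while read -r filt; do
        ffmpeg -v error -i base.mp4 -i logo.png \
            -filter_complex "$filt" -frames:v 1 "$(printf 'out_%04d.png' "$i")"
        i=$((i+1))
    done < overlays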
[02:14:23 CET] <furq> thebombzen_: http://vpaste.net/rh1np
[02:14:39 CET] <furq> -d is gnu only, hence gxargs (this is on a bsd box)
[02:15:32 CET] <furq> maybe parallel is better tbh, i tend to try to make my scripts work with a normal base system
[02:16:02 CET] <furq> i tried it a couple of times and couldn't get my head round it
[02:16:20 CET] <furq> but yeah most people don't know that xargs will run jobs in parallel
[02:16:37 CET] <thebombzen_> parallel is nice even if you're not trying to parallelize
[02:16:50 CET] <thebombzen_> suppose I have a whole bunch of .wavs and I want to convert them to .flacs
[02:17:15 CET] <furq> what's wrong with find -exec
[02:17:17 CET] <thebombzen_> I can do find -name '*.wav' | parallel ffmpeg -i {} {.}.flac
[02:17:36 CET] <thebombzen_> -exec doesn't understand {.}.flac
[02:17:40 CET] <furq> i guess
[02:18:00 CET] <thebombzen_> you could use -exec sh -c "" and basename
[02:18:01 CET] <furq> i would normally just use `for f in **/*.wav` there
[02:18:03 CET] <thebombzen_> or you could not
[02:18:10 CET] <thebombzen_> same thing tbh
[02:18:28 CET] <thebombzen_> the above command I wrote also keeps the directory structure
[02:18:31 CET] <thebombzen_> which is really nice.
[02:18:46 CET] <thebombzen_> find is also very powerful so like if you wanted to say
[02:18:51 CET] <furq> is {.} like ${f%.*}
[02:19:13 CET] <thebombzen_> how does ${f%.*} interact with filenames with more than one period
[02:19:52 CET] <thebombzen_> ffmpeg-3.1.2.tar.gz might be considered a 1.2.tar.gz file extension
[02:19:58 CET] <thebombzen_> you have to be careful with that
[02:20:04 CET] <furq> http://vpaste.net/pb0YG
[02:20:25 CET] <furq> % is shortest, %% is longest
[02:21:22 CET] <furq> you can also use ${f%.wav} if you know the extension
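Illustrative only, showing how those expansions behave on a name with several dots, plus a shell-loop equivalent of the parallel one-liner above:

    f=ffmpeg-3.1.2.tar.gz
    echo "${f%.*}"     # ffmpeg-3.1.2.tar  (strip the shortest trailing .*)
    echo "${f%%.*}"    # ffmpeg-3          (strip the longest trailing .*)

    # bash loop for the wav->flac case (needs globstar for **)
    shopt -s globstar
    for f in **/*.wav; do
        ffmpeg -i "$f" "${f%.wav}.flac"
    done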
[02:23:53 CET] <thebombzen_> huh
[02:23:55 CET] <thebombzen_> I didn't know that
[02:23:59 CET] <thebombzen_> but that still doesn't work with exec
[02:24:19 CET] <thebombzen_> the benefit of using parallel over -exec is that ffmpeg is singlethreaded when you're converting pcm to flac usually
[02:24:26 CET] <thebombzen_> and I don't know if you can change that
[02:24:50 CET] <thebombzen_> parallel is a tool that's far more powerful but also not as simple or easy to use as find -exec
[02:25:04 CET] <furq> well xargs will do that except for stripping the extension
[02:26:40 CET] <thebombzen_> unfortunately
[02:26:59 CET] <thebombzen_> if you write to /dev/null ffmpeg and many other programs say "do you want to overwrite"
[02:27:02 CET] <thebombzen_> which is very annoying
[02:27:23 CET] <furq> yeah it's weird that you have to use -y for that
[02:27:29 CET] <thebombzen_> well it exists
[02:27:36 CET] <furq> although if you're using -f null you can just write to stdout
[02:27:48 CET] <thebombzen_> yea true
[02:28:16 CET] <thebombzen_> although it's also nice because you can test a muxer with like -f matroska -y /dev/null
[02:28:27 CET] <thebombzen_> -f null disables muxing essentially
[02:29:00 CET] <thebombzen_> tbh ffmpeg should check and if you're writing to /dev/null and /dev/null is a character special device, then it should not ask.
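Side by side, the two variants being compared (input name hypothetical):

    # -f null: decode and encode, but skip muxing entirely
    ffmpeg -i input.mkv -c:v libx264 -f null -
    # -f matroska to /dev/null: exercise the muxer too; -y suppresses the overwrite prompt
    ffmpeg -i input.mkv -c:v libx264 -f matroska -y /dev/null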
[02:42:42 CET] <thebombzen_> huh that's weird
[02:42:50 CET] <thebombzen_> not sure why ffmpeg distributes its source tarballs as bz2
[02:43:00 CET] <thebombzen_> afaik there's essentially no reason to use bz2
[02:59:23 CET] <furq> yeah that is weird
[02:59:35 CET] <furq> is there anyone left who can't use xz
[02:59:52 CET] <c_14> It's "always" been bz2 and changing it is effort
[03:02:43 CET] <c_14> Does ancient CentOS support xz?
[04:03:09 CET] <thebombzen> furq: anyone who can't use xz should be using gzip and bz2
[04:03:15 CET] <thebombzen> so it doesn't matter
[04:28:40 CET] <thebombzen> wow I don't think I've encoded a video so slowly
[04:29:10 CET] <thebombzen> I just transcoded a 0.88 second video at 0.1% speed and it took more than 10 minutes
[04:29:15 CET] <thebombzen> to encode a video at <1 second
[04:29:36 CET] <furq> NICE
[04:29:38 CET] <furq> er
[04:29:39 CET] <furq> nice
[04:29:45 CET] <thebombzen> I can see why people don't like -vf minterpolate:mi=esa
[04:29:53 CET] <thebombzen> "exhaustive search algorithm"
[04:30:07 CET] <furq> yeah esa is what x264 uses in -preset placebo
[04:30:52 CET] <furq> you should combine that with nnedi
[04:31:53 CET] <thebombzen> nnedi?
[04:32:21 CET] <furq> the extremely slow deinterlacer
[04:32:30 CET] <thebombzen> but it's not interlaced
[04:32:38 CET] <thebombzen> so should I artificially interlace it with tinterlace
[04:32:45 CET] <furq> i don't think it matters
[04:32:56 CET] <thebombzen> what happens if you try to deinterlace a progressive video
[04:33:37 CET] <thebombzen> but you're right that I don't think combining these will do anything.
[04:34:11 CET] <furq> i don't actually know
[04:34:12 CET] <thebombzen> cause it's a pipeline. if I feed it to nnedi and then to minterpolate and then into libx264 -preset placebo
[04:34:17 CET] <thebombzen> it'll just use 3 threads instead of 1
[04:34:24 CET] <thebombzen> and there'll just be a bottleneck at the slowest
[04:34:47 CET] <furq> shouldn't you be using libvpx if you want it to be extremely slow
[04:35:01 CET] <furq> also isn't filterchain processing singlethreaded
[04:54:35 CET] <thebombzen> idk
[04:54:50 CET] <thebombzen> I was considering actually trying that to see how slowly it would encode
[04:54:57 CET] <thebombzen> but I also don't want to wait 20 minutes
[04:55:00 CET] <thebombzen> eating cpu time
[07:00:46 CET] <damdai> x265 2.1:[Windows][GCC 5.4.0][64 bit] 8bit
[07:00:46 CET] <damdai> Encoding settings : wpp / ctu=64 / min-cu-size=8 / max-tu-size=32 / tu-intra-depth=1 / tu-inter-depth=1 / me=3 / subme=3 / merange=57 / rect / no-amp / max-merge=3 / temporal-mvp / no-early-skip / rskip / rdpenalty=0 / no-tskip / no-tskip-fast / no-strong-intra-smoothing / no-lossless / no-cu-lossless / no-constrained-intra / no-fast-intra / open-gop / no-temporal-layers / interlace=0
[07:00:46 CET] <damdai> / keyint=300 / min-keyint=30 / scenecut=40 / rc-lookahead=25 / lookahead-slices=0 / bframes=4 / bframe-bias=0 / b-adapt=2 / ref=4 / limit-refs=3 / limit-modes / weightp / no-weightb / aq-mode=1 / qg-size=32 / aq-strength=1.00 / cbqpoffs=0 / crqpoffs=0 / rd=4 / psy-rd=2.00 / rdoq-level=2 / psy-rdoq=1.00 / log2-max-poc-lsb=8 / no-rd-refine / signhide / deblock=0:0 / sao / no-sao-non-deblock /
[07:00:46 CET] <damdai> b-pyramid / cutree / no-intra-refresh / rc=crf / crf=22.0 / qcomp=0.60 / qpmin=0 / qpmax=69 / qpstep=4 / ipratio=1.40 / pbratio=1.30
[07:00:46 CET] <damdai> Default : Yes
[07:01:19 CET] <damdai> x265 2.1:[Windows][GCC 5.4.0][64 bit] 8bit
[07:01:19 CET] <damdai> Encoding settings : wpp / ctu=64 / min-cu-size=8 / max-tu-size=32 / tu-intra-depth=1 / tu-inter-depth=1 / me=3 / subme=3 / merange=57 / rect / no-amp / max-merge=3 / temporal-mvp / no-early-skip / rskip / rdpenalty=0 / no-tskip / no-tskip-fast / no-strong-intra-smoothing / no-lossless / no-cu-lossless / no-constrained-intra / no-fast-intra / open-gop / no-temporal-layers / interlace=0
[07:01:19 CET] <damdai> / keyint=300 / min-keyint=30 / scenecut=40 / rc-lookahead=25 / lookahead-slices=0 / bframes=4 / bframe-bias=0 / b-adapt=2 / ref=4 / limit-refs=3 / limit-modes / weightp / no-weightb / aq-mode=1 / qg-size=32 / aq-strength=1.00 / cbqpoffs=0 / crqpoffs=0 / rd=4 / psy-rd=2.00 / rdoq-level=2 / psy-rdoq=1.00 / log2-max-poc-lsb=8 / no-rd-refine / signhide / deblock=0:0 / sao / no-sao-non-deblock /
[07:01:19 CET] <damdai> b-pyramid / cutree / no-intra-refresh / rc=crf / crf=23.0 / qcomp=0.60 / qpmin=0 / qpmax=69 / qpstep=4 / ipratio=1.40 / pbratio=1.30
[07:01:19 CET] <damdai> Default : Yes
[08:38:01 CET] <TMan459> I salvaged a bunch of videos off a dying drive. Some of them are corrupt. I'd like to run a command to read them all and determine which ones are corrupt. Any thoughts?
[08:58:11 CET] <Samuel235> Morning all!
[09:01:54 CET] <Samuel235> Would anyone be able to give me a push in the right direction? I'm attempting to stream the output of Raspivid on a Raspberry Pi as an h264 stream. I have FFMPEG installed and compiled myself, took about 4 hours to do so. It's not the pre-packaged one in Raspbian. I'm getting a few little errors with my server.conf file and some issues with opening the stream file.
[09:06:46 CET] <JEEB> just use ffmpeg with nginx-rtmp or something, ffserver is basically black magic. also the first gen raspbi is not capable of doing anything in software, so make sure your chain is 100% hw (video and audio)
[09:09:19 CET] <Samuel235> Right okay, so you would advise me to take the nginx-rtmp route. Would you advise me against trying to use something like gStreamer or even https://github.com/mpromonet/v4l2rtspserver then? To be honest, i wanted to use FFMPEG just because of the support around it, the community looks pretty large compared to the other options. I am running the first gen Pi, but i have a Pi 3 that i was going
[09:09:19 CET] <Samuel235> to use for the final product.
[09:13:36 CET] <JEEB> nginx-rtmp means that you have nginx with that module somewhere and then you push rtmp into it with normal ffmpeg cli. then nginx-rtmp will serve hls/dash out of it
[09:14:27 CET] <Samuel235> Right ok
[09:14:42 CET] <Samuel235> I have found this, do you have any idea of crtmpserver?
[09:14:46 CET] <Samuel235> http://www.linux-projects.org/uv4l/tutorials/rtmp-server/
[09:15:31 CET] <JEEB> no
[09:16:28 CET] <Samuel235> Okay, and one last question if you don't mind: If i were to use the linux section of "https://trac.ffmpeg.org/wiki/Capture/Webcam" would i still be using the FFserver?
[09:17:00 CET] <Samuel235> Or is that along the lines of what i would need to do to push the data into the nginx-rtmp module to then stream?
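A minimal sketch of the push JEEB is describing, assuming nginx-rtmp is already listening on port 1935 with an application named "live" (application/stream names and camera settings are placeholders; raspivid already produces H.264, so the video is only remuxed here):

    raspivid -t 0 -w 1280 -h 720 -fps 30 -o - | \
        ffmpeg -f h264 -framerate 30 -i - -c:v copy -an -f flv rtmp://localhost/live/stream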
[09:58:58 CET] <jubalh> using the ffmpeg activation_bytes thing i was finally able to backup my audible audiobooks and also play them offline on linux. thats very nice, since i paid for them i also want to use them not only online. i wondered if ffmpeg can also decrypt the amazon video files?
[10:11:23 CET] <Samuel235> JEEB - Does this look like the process you would advise for nginx-rtmp?
[10:11:24 CET] <Samuel235> https://github.com/arut/nginx-rtmp-module/wiki/Getting-started-with-nginx-rtmp
[10:18:06 CET] <Samuel235> Or even https://www.vultr.com/docs/setup-nginx-rtmp-on-ubuntu-14-04 - Would this still work on a debian raspberry pi, or would you not touch it since its for Ubuntu 14-04, different build and config processes maybe or?
[10:59:22 CET] <Samuel235> Does anyone have any experience with Nginx that could help me with a little error i get when trying to install?
[11:02:13 CET] <Samuel235> Would it be best to install the generic nginx package and then install the RTSP module for it or would that be too much for what is needed for a ffmpeg/nginx stream config?
[14:00:08 CET] <Samuel235> Is there anyone running a nginx server with RTMP module?
[14:14:19 CET] <Samuel235> [tcp @ 0x27acf60] Connection to tcp://localhost:1935 failed (Connection refused), trying next address
[14:15:25 CET] <Samuel235> I don't understand why this is giving me an error, i thought the server IP should be that of localhost as ffmpeg and the nginx server are on the same machine.
[14:16:23 CET] <kerio> are you sure that's not rtmp?
[14:16:33 CET] <kerio> tcp:// is a raw stream
[14:17:07 CET] <Samuel235> I'm running ffmpeg -f video4linux2 -i /dev/video0 -c:v libx264 -an -f flv rtmp://localhost/myapp/mystream
[14:17:27 CET] <Samuel235> and its throwing that error at me, along with codec issues, but one error at a time
[14:17:41 CET] <Samuel235> but i need to sort one error at a time*
[14:41:25 CET] <Kadigan_KSB> Hey. Can I have ffmpeg/ffprobe detect whether the content is really Interlaced, or Progressive Segmented Frame?
[14:44:27 CET] <relaxed> Kadigan_KSB: look at the idet filter
[14:48:09 CET] <Kadigan_KSB> Ah, okay. So it CAN detect it. Cool.
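A hedged example of the idet approach (input name hypothetical); the filter prints single- and multi-frame interlacing statistics in the log at the end of the run:

    ffmpeg -i input.ts -vf idet -an -f null - 2>&1 | grep Parsed_idet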
[14:48:25 CET] <Kadigan_KSB> Now, how do I get rid of the "Truncating likely oversized PPS" spam?
[14:48:27 CET] <fling> How do I capture two webcams keeping streams in sync?
[14:52:14 CET] <relaxed> fling: are you using one ffmpeg command to do it?
[14:52:58 CET] <fling> relaxed: I tried to use one command and also tried piping with three ffmpegs
[14:56:01 CET] <fling> relaxed: http://dpaste.com/08ETS1T
[14:59:03 CET] <relaxed> do they begin at different times or do they drift?
[14:59:33 CET] <fling> With one command the output is not playable because it is getting corrupted because ffmpeg is single threaded
[14:59:48 CET] <fling> With pipes it could not be in sync because ffmpegs are starting with a delay
[15:00:15 CET] <fling> and yes atleast audio should drift
[15:01:55 CET] <relaxed> did you try transcoding the video streams with one instance of ffmpeg?
[15:02:42 CET] <relaxed> using pipes I guess you could play with -itsoffset to sync up when they start
[15:08:35 CET] <thebombzen_> fling: what do you want to do with the two streams?
[15:08:57 CET] <thebombzen_> if you're doing something like hstack or vstack then you can get ffmpeg to respect the timestamps
[15:11:44 CET] <thebombzen_> fling: in particular, the overlay filter respects the PTS (timestamps) of the input video
[15:12:53 CET] <thebombzen_> this can work for or against you, depending on how important the timestamps are. in your case it means that you might want to sync them up with the overlay filter.
[15:13:52 CET] <thebombzen_> that will allow ffmpeg to keep the streams in line
[15:37:39 CET] <adgtl> Hi guys
[15:37:58 CET] <adgtl> I have an mp4 of 15 minutes.. but first 10 minutes there is silence
[15:38:23 CET] <adgtl> How can I trim mp4 to just 5 minutes, where person starts talking?
[15:38:34 CET] <adgtl> I need ffmpeg command for that?
[15:38:42 CET] <adgtl> Is this simply possible?
[15:40:48 CET] <relaxed> adgtl: use -ss to seek to the desired start position
[15:41:06 CET] <adgtl> relaxed how will know what's desired start position?
[15:41:18 CET] <adgtl> I don't know when the person starts talking
[15:41:18 CET] <relaxed> https://trac.ffmpeg.org/wiki/Seeking
[15:41:26 CET] <adgtl> I am looking for programmatic solution
[15:41:32 CET] <adgtl> where I don't know what time to seek?
[15:42:41 CET] <relaxed> there's a filter called silencedetect
[15:44:26 CET] <adgtl> relaxed there is silenceremove .. but does this work with mp4 or webm?
[15:46:53 CET] <durandal_1707> adgtl: it works for any audio but there is no way to drop video too
[15:47:04 CET] <adgtl> durandal_1707 :(
[15:47:12 CET] <adgtl> anyone here has some recommendation?
[15:47:30 CET] <adgtl> I want to basically cut video parts where audio was silent for certain decibels
[15:48:14 CET] <durandal_1707> you could use silencedetect to detect silence and in another pass trim it with the trim filters
[15:49:39 CET] <relaxed> you can do just about anything when scripting with ffmpeg
[15:51:34 CET] <adgtl> durandal_1707 hmm
[15:51:47 CET] <adgtl> durandal_1707 any example that you know of? I am googling for it anyway
[15:54:21 CET] <relaxed> learn how the filters you need work, write a script, ???, profit
[15:55:25 CET] <__jack__> that's my boy :3
[15:58:41 CET] <durandal_1707> adgtl: it's not hard to write scripts in any language you want
[15:58:50 CET] <furq> https://ffmpeg.org/ffmpeg-filters.html#silenceremove
[15:58:54 CET] <furq> that looks like what you want
[15:59:29 CET] <furq> oh nvm, video
[15:59:31 CET] <relaxed> furq: what about the video stream?
[15:59:40 CET] <furq> yeah, silencedetect is your best bet then
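One way to script it, as a rough sketch (thresholds, file names and the 600-second cut point are placeholders; the real cut point comes from the silence_end value printed in the first pass):

    # Pass 1: log where silence starts and ends
    ffmpeg -i input.mp4 -af silencedetect=noise=-30dB:d=2 -f null - 2>&1 | grep silence_
    # Pass 2: cut from the detected start of speech onwards
    ffmpeg -ss 600 -i input.mp4 -c copy output.mp4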
[16:00:41 CET] <fling> thebombzen_: I don't know what hstack and vstack are
[16:01:00 CET] <furq> hstack stacks two videos horizontally
[16:01:04 CET] <furq> you can probably guess what vstack does
[16:01:20 CET] <furq> s/two/multiple/
[16:01:23 CET] <fling> thebombzen_: I don't want to stack the streams.
[16:01:42 CET] <fling> furq: but I want this feature in mpv :P
[16:01:49 CET] <thebombzen> fling: what do you mean by "I want to align them"
[16:02:04 CET] <furq> i don't think mpv supports complex filterchains
[16:02:05 CET] <fling> furq: or better separate/multiple windows/tiles per video stream
[16:02:41 CET] <fling> thebombzen: >> How do I capture two webcams keeping streams in sync?
[16:02:41 CET] <thebombzen> furq: it does support weird stream notation but it's only the stuff in -vf.
[16:02:55 CET] <thebombzen> fling: what does it mean to keep them in sync if you're not displaying them next to each other.
[16:03:01 CET] <thebombzen> what are you doing with the two outputs
[16:03:32 CET] <fling> thebombzen: there are not two outputs. I have a single output http://dpaste.com/08ETS1T
[16:04:05 CET] <relaxed> he means when you watch them
[16:04:34 CET] <thebombzen> if you use -vsync 0, then it should pass all the timestamps from the v4l2 to the muxer.
[16:04:57 CET] <thebombzen> that should preserve the timestamps in the resulting file which you can then use to sync them up
[16:05:04 CET] <fling> I mean this example does not work ^ but I _will_ have a playable and not corrupted single output piping multiple ffmpegs to a single muxing ffmpeg. But this way I will lose not only audio but also video sync
[16:05:57 CET] <fling> >> which you can then use to sync them up
[16:05:59 CET] <thebombzen> well the timestamps are the most important thing
[16:06:00 CET] <fling> How? :P
[16:06:13 CET] <thebombzen> -copyts before -i preserves input timestamps
[16:06:28 CET] <thebombzen> and -vsync 0 prevents them from being recreated by the encoder/muxer
[16:06:47 CET] <thebombzen> so as long as you always use a timestamp-friendly container format like matroska, all the information will be there.
[16:06:59 CET] <thebombzen> that's how you can preserve the data that would be used to sync them up
[16:07:10 CET] <thebombzen> the issue is that it's all meaningless until you go to watch them both at the same time
[16:07:59 CET] <fling> thebombzen: are the timestamps going to be taken from the v4l devices? That would break sync with everything because each device is using a separate clock. But if the timestamps are local then everything is going to be fine.
[16:08:19 CET] <thebombzen> afaik they will be
[16:08:32 CET] <thebombzen> or at least. there's some option to make it that way
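A minimal sketch of the capture step being described for a single webcam (device path, codec, preset and output name are placeholders; -copyts is usually given as an output option, so exact placement may differ from the suggestion above and between ffmpeg versions):

    ffmpeg -f v4l2 -i /dev/video0 -copyts -vsync 0 -c:v libx264 -preset veryfast cam0.mkv

Run one such command per device and line the resulting files up afterwards using their preserved timestamps.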
[16:08:51 CET] <klevin> hello
[16:08:58 CET] <klevin> anybody?
[16:09:14 CET] <fling> why?
[16:10:11 CET] <thebombzen> fling: so I just looked at it. some v4l drivers use the wall clock (UTC) and some use "uptime" which is a timestamp based on when the computer booted. but it should be consistent between the iterations as long as you don't reboot
[16:10:21 CET] <klevin> when i use this option "-r 1/90 images/tematv%04d.jpg" in the ffmpeg command i do not see any frame log in the linux terminal console and i see a lot of dropped frames
[16:10:40 CET] <thebombzen> well of course you're dropping frames
[16:10:48 CET] <kerio> timestamps from cameras are a crapshoot most of the time
[16:10:48 CET] <thebombzen> you set the framerate to 1 frame every 90 seconds
[16:11:02 CET] <kerio> LOL
[16:11:09 CET] <kerio> he did, the absolute madman
[16:11:16 CET] <thebombzen> I think I know why
[16:11:27 CET] <thebombzen> it's a way to extract a thumbnail every 90 seconds
[16:11:38 CET] <relaxed> klevin: paste your whole command
[16:11:42 CET] <klevin> yes to take a screenshot every 90 seconds
[16:11:50 CET] <klevin> but is it normal that the frames drop?
[16:11:53 CET] <thebombzen> well yes
[16:11:55 CET] <furq> yes
[16:11:59 CET] <thebombzen> you're dropping 89 out of every 90 frames.
[16:12:03 CET] <thebombzen> or rather
[16:12:09 CET] <thebombzen> all but 1 frame every 90 seconds
[16:12:17 CET] <klevin> the system is streaming as well, from one source to another
[16:12:18 CET] <thebombzen> so you should drop the vast majority of your frames
[16:12:35 CET] <klevin> i just need to take 1 out of every 90 frames and store it as an image
[16:12:45 CET] <furq> well then congratulations
[16:12:45 CET] <klevin> the rest should proceed normally
[16:12:48 CET] <fling> thebombzen: ok so the first step is to add all these timestamp flags to the capturing ffmpegs, piping this to the single muxing ffmpeg. Should I somehow preserve the timestamps in the pipes for aligning in the muxing ffmpeg?
[16:13:02 CET] <thebombzen> as much as possible yes
[16:13:02 CET] <klevin> how?
[16:13:11 CET] <fling> will it also work for alsa?
[16:13:26 CET] <thebombzen> I don't know about alsa timestamps I'm afraid
[16:13:31 CET] <thebombzen> Sorry about that.
[16:13:37 CET] <fling> I was never able to sync multiple alsa inputs without special hardware.
[16:13:53 CET] <thebombzen> I've never been able to do that either without using something very low-level
[16:14:04 CET] <thebombzen> but I've also never had to so I generally gave up when I couldn't
[16:14:41 CET] <klevin> any open source script for monitoring the rtmp links?
[16:14:46 CET] <klevin> in node js?
[16:15:10 CET] <fling> klevin: ffprobe
[16:15:31 CET] <klevin> i am building one, but it would be nice if the client could have the possibility to check the frame content every 90 sec
[16:15:48 CET] <klevin> ffprobe stops when it detects black frames
[16:16:05 CET] <klevin> whereas ffmpeg continues displaying
[16:16:24 CET] <klevin> developing via the linux terminal, so the node js will receive live data
[16:16:38 CET] <fling> What is linux terminal?
[16:16:40 CET] <klevin> but i am not able to work with the images without dropping
[16:16:49 CET] <klevin> kind of like windows cmd
[16:16:49 CET] <fling> >> whereas ffmpeg continues displaying
[16:16:52 CET] <klevin> but in linux
[16:16:56 CET] <fling> ^ what are you doing?
[16:17:09 CET] <klevin> if you have an rtmp link
[16:17:13 CET] <fling> Give us some command examples and their output
[16:17:22 CET] <klevin> you need to check the signal all the time
[16:18:38 CET] <klevin> ffmpeg -i blacknormalblack.mp4 -s 720x576 -y -strict -2 -acodec aac -ab 128k -ac 2 -ar 48000 -vcodec libx264 -x264opts keyint=5:min-keyint=10 -g 60 -minrate 2000k -maxrate 4000k -bufsize 9000k -preset ultrafast -vf blackframe=100:84 -f flv rtmp:link 2>&1
[16:18:59 CET] <klevin> this detect the black frame
[16:19:02 CET] <klevin> all the time is on
[16:19:22 CET] <klevin> now i need to implement the image screenshots in the same command as well
[16:19:36 CET] <klevin> to check the source content periodically
[16:19:41 CET] <klevin> any idea?
[16:20:55 CET] <fling> klevin: do you want to save screenshots from blacknormalblack.mp4 the same time you are streaming to rtmp:link?
[16:21:13 CET] <klevin> yes
[16:21:14 CET] <fling> klevin: just add another jpeg output saving to files
[16:21:24 CET] <klevin> save images without stopping the signal
[16:21:25 CET] <fling> klevin: or you could save to the same jpeg file
[16:21:49 CET] <klevin> i tried but i have a lot of drops
[16:22:02 CET] <klevin> -r 1/90 images/tematv%04d.jpg
[16:22:24 CET] <klevin> also some data (frames, size, speed) have no results if i insert this
[16:22:55 CET] <fling> You will drop frames anyway
[16:23:29 CET] <klevin> so theoretically i will have two rtmp: source, first output and second input
[16:23:56 CET] <klevin> the function will transmit from the first to the second rtmp and take a screenshot every 90 sec
[16:24:05 CET] <fling> klevin: no you will have blacknormalblack.mp4 input and rtmp output and jpeg output
[16:24:35 CET] <klevin> the blacknormalblack.mp4 is just for testing purpose
[16:25:13 CET] <klevin> so fling what is your suggestion?
[16:27:49 CET] <fling> klevin: ffmpeg -i rtmp:input -c copy -f flv rtmp:output -f mjpeg -r 1/90 screenshot.jpg
[16:28:12 CET] <fling> klevin: this will copy the input stream to the output stream and save a screenshot to screenshot.jpg each 90 seconds
[16:28:59 CET] <klevin> so you are suggesting to add a second command only for this purpose?
[16:29:13 CET] <fling> this is a single command
[16:29:19 CET] <fling> I'm not suggesting using two commands.
[16:30:01 CET] <klevin> -vf blackframe=100:84 need to added as well
[16:32:23 CET] <klevin> so the command transmits, checks the source for black frames, and takes a screenshot every 90 sec
[16:32:34 CET] <klevin> ffmpeg -i blacknormalblack.mp4 -s 720x576 -y -strict -2 -acodec aac -ab 128k -ac 2 -ar 48000 -vcodec libx264 -x264opts keyint=5:min-keyint=10 -g 60 -minrate 2000k -maxrate 4000k -bufsize 9000k -preset ultrafast -vf blackframe=100:84 -f flv rtmp:link 2>&1
[16:35:15 CET] <fling> klevin: why are you reencoding video and audio?
[16:35:35 CET] <fling> klevin: ffmpeg -i rtmp:input -c copy -f flv rtmp:output -vf blackframe=100:84 -f mjpeg -r 1/90 screenshot.jpg
[16:37:31 CET] <klevin> ok checking this command
[16:41:26 CET] <teratorn> anyone have suggestions for how I might remux (-c copy) an rtmp source stream to hls on disk? I'm having issues and wondering if this is even possible or not
[16:41:48 CET] <klevin> where can i find the options for blackframe detection, things like: pblack, pts
[16:42:33 CET] <DHE> teratorn: loosely speaking, ffmpeg -i $RTMPURL -c copy -f hls [hls-options] /path/to/output.m3u8
[16:42:40 CET] <teratorn> sorry, rtsp stream
[16:42:45 CET] <DHE> still...
[16:43:07 CET] <teratorn> DHE: yeah... that's basically what I'm doing. let me check some things and I'll come back with more details :)
[16:43:22 CET] <DHE> do note that HLS is rather strict in its list of supported codecs. some players will refuse anything except AAC and H264 because that's all the spec allows for. other players will be more tolerant
[16:43:39 CET] <teratorn> DHE: it's h264 and no audio so im in luck
[16:44:04 CET] <DHE> well, when in doubt..
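Expanding DHE's one-liner into a hedged example (the URL, segment length and paths are placeholders; -rtsp_transport tcp is only useful if UDP packet loss is a problem):

    ffmpeg -rtsp_transport tcp -i rtsp://camera/stream -c copy \
           -f hls -hls_time 4 -hls_list_size 0 /path/to/output.m3u8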
[16:44:33 CET] <klevin> can someone explain what this log means?
[16:44:34 CET] <klevin> frame:826pblack:100pts:7434000t:82.600000type:Plast_keyframe:800
[16:50:17 CET] <thebombzen> fling: it's -f image2 -c mjpeg
[16:50:30 CET] <thebombzen> image2 is the format for 'sequence of images'
[16:52:12 CET] <furq> you shouldn't need either
[16:52:20 CET] <thebombzen> correct, but I prefer to
[16:52:26 CET] <furq> just `-update 1 screenshot.jpg` should do it
[16:52:40 CET] <thebombzen> although fling had written -f mjpeg
[16:52:43 CET] <thebombzen> which afaik is wrong
[16:52:46 CET] <furq> yeah that's definitely wrong
[16:52:56 CET] <furq> that'll write an mjpeg stream
[16:53:02 CET] <thebombzen> yea that's what I thought
[16:53:15 CET] <thebombzen> but what is -update 1
[16:53:35 CET] <furq> image2 will bail out if the output filename isn't a pattern
[16:53:48 CET] <furq> -update 1 forces it to just keep overwriting the output file
[16:54:32 CET] <furq> it'll bail out of writing multiple frames, that is
[16:55:11 CET] <thebombzen> oh so update 1 will write the first frame, and then the second frame will just write it again, etc
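Putting furq's correction together with fling's earlier command, it would look roughly like this (stream URLs are placeholders; the rtmp output is stream-copied while the jpeg is re-encoded, and the fps filter keeps one frame every 90 seconds):

    ffmpeg -y -i rtmp://input \
           -c copy -f flv rtmp://output \
           -vf fps=1/90 -update 1 screenshot.jpg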
[16:55:14 CET] <klevin> where can i find the description for this log? frame:1281pblack:100pts:11529000t:128.100000type:Plast_keyframe:1250
[16:55:25 CET] <thebombzen> klevin where did you find that?
[16:55:34 CET] <thebombzen> post the full log where you found that on a paste site
[16:55:40 CET] <furq> i'm guessing that's the output of -vf blackframe
[16:55:57 CET] <thebombzen> it looks like it's missing some whitespace though
[16:56:09 CET] <klevin> i removed all the white space
[16:56:23 CET] <klevin> so i can extract data with regular expressions
[16:56:34 CET] <klevin> yes the blackframe output
[16:56:51 CET] <thebombzen> why would you keep the whitespace removed when you show it to a human
[16:56:59 CET] <furq> i mean
[16:57:11 CET] <furq> why would you remove the whitespace to write a regex
[16:57:16 CET] <thebombzen> also have you considered using tr instead: tr ' ' ','
[16:57:23 CET] <thebombzen> that'll change it to CSV
[16:57:45 CET] <klevin> i remove the whitespace, and then i extract specific data with a regular expression
[16:57:49 CET] <klevin> to display in web
[16:58:08 CET] <klevin> another story
[16:58:11 CET] <thebombzen> no you don't
[16:58:21 CET] <klevin> why?
[16:58:23 CET] <thebombzen> regular expressions can work across whitespace
[16:58:25 CET] <adgtl> found this command to trim video for silence parts
[16:58:27 CET] <adgtl> ffmpeg -i final.mp4 -filter_complex "[0:a]silencedetect=n=-90dB:d=0.3[outa]" -map [outa] -f s16le -y /dev/null | F='-aq 70 -v warning' perl -ne 'INIT { $ss=0; $se=0; } if (/silence_start: (\S+)/) { $ss=$1; $ctr+=1; printf "ffmpeg -nostdin -i final.mp4 -ss %f -t %f $ENV{F} -y %03d.mkv\n", $se, ($ss-$se), $ctr; } if (/silence_end: (\S+)/) { $se=$1; } END { printf "ffmpeg -nostdin -i final.mp4 -ss %f $ENV{F} -y %03d.mp4\n", $se,
[16:58:27 CET] <adgtl> $ctr+1; }' | bash -x
[16:58:48 CET] <klevin> i found it simpler to work without spaces
[16:58:49 CET] <furq> i have no idea why that's using -filter_complex instead of -af but sure
[16:58:53 CET] <adgtl> but looks like it's not finding enough silence parts.. tweaked the decibels.. but it did not help much
[16:58:57 CET] <thebombzen> furq: it's also using perl
[16:59:05 CET] <klevin> i am working with node js
[16:59:20 CET] <furq> well my brain shuts off when i see perl so i can't comment on that bit
[16:59:39 CET] <thebombzen> klevin: well javaScript regular expressions work with whitespace
[16:59:44 CET] <thebombzen> so I guess you're in luck
[17:00:10 CET] <klevin> depending on the number, the spaces between the numbers changed and it could not display properly
[17:00:14 CET] <klevin> trust me i tested this part
[17:00:19 CET] <klevin> an entire day
[17:00:40 CET] <furq> \s*
[17:01:43 CET] <fling> Try this ^
[17:01:50 CET] <thebombzen> for reference: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions#special-white-space
[17:02:04 CET] <thebombzen> \s matches any single whitespace character
[17:02:42 CET] <thebombzen> * is "zero or more of that"
[17:02:47 CET] <efface> I am trying to create a monitor for our internal video network and I am wondering if there is a way to make the encoder strict without failing? For instance, if I pulled the input UDP into VLC when there is a problem with the source, I will see macroblocking and other problems. I would like the x264/ac3 encoder to not conceal any problems with the source and continue encoding
[17:02:49 CET] <thebombzen> so \s* is zero or more whitespace characters
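For reference, something like this should pull the pblack values out of the normal, whitespace-separated blackframe log without stripping the spaces first (the input file and thresholds are klevin's; grep -o is a GNU extension):

    ffmpeg -i blacknormalblack.mp4 -vf blackframe=100:84 -f null - 2>&1 | grep -o 'pblack:[0-9]*'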
[17:03:30 CET] <JEEB> efface: that's more up to the decoder, and most formats really can't conceal issues too much
[17:03:50 CET] <JEEB> you can get a partial decoded picture with artifacts, or no picture at all until one can be decoded
[17:04:17 CET] <efface> I would like to have the artifacts included
[17:04:19 CET] <thebombzen> yea. if you see artifacts caused by a bad bitstream (which in turn is caused by packet loss) then that's a decoder issue
[17:04:30 CET] <thebombzen> well that's not the encoder's decision
[17:04:32 CET] <efface> ok
[17:04:44 CET] <thebombzen> the encoder takes raw video input and encodes it.
[17:05:01 CET] <JEEB> whether or not the whole chain breaks because of the input being considered utterly broken is a separate issue.
[17:05:04 CET] <thebombzen> if that raw video had ugly macroblocky things then so should the encoded video
[17:05:10 CET] <efface> Any recommendations for an HLS website plugin that does such? I am currently using clappr
[17:05:15 CET] <JEEB> and it depends on the decoder's error cases :P
[17:05:26 CET] <thebombzen> yea different decoders will behave differently
[17:05:33 CET] <thebombzen> as will different players.
[17:05:47 CET] <thebombzen> some players will discard garbage and some players will attempt to display it
[17:06:02 CET] <efface> clappr seems to be fairly limited, especially since it's built on flashls
[17:06:23 CET] <thebombzen> fling: another idea might be to do the syncing separately
[17:06:52 CET] <thebombzen> using the system clock with "date +%s.%N" means you can do it manually
[17:06:56 CET] <thebombzen> but with a bit of work.
[17:07:28 CET] <thebombzen> %s is unix time in seconds and %N is fractional time in nanoseconds
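A sketch of that manual approach (device and duration are placeholders): record the wall-clock start time next to each capture and align the files afterwards.

    START=$(date +%s.%N)
    ffmpeg -f v4l2 -i /dev/video0 -t 60 "cam0_${START}.mkv"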
[17:08:14 CET] <fling> thebombzen: this will only work if the streams are running at the same speed, which is not the case for alsa afaik
[17:08:31 CET] <thebombzen> speeds?
[17:08:36 CET] <thebombzen> wouldn't it be 1.0x speed?
[17:08:51 CET] <fling> sound cards are using their own clocks
[17:09:18 CET] <fling> There will be drift anyway with multiple alsa inputs :<
[17:09:25 CET] <fling> I need to test this once again
[17:12:42 CET] <thebombzen> also
[17:12:55 CET] <thebombzen> have you looked at trying to sync stuff with -map
[17:13:05 CET] <fling> no
[17:13:26 CET] <fling> I thought map is only for specifying what is where
[17:13:35 CET] <thebombzen> I thought that too until 3 minutes ago
[17:13:41 CET] <thebombzen> when I saw something in the docs
[17:13:49 CET] Action: fling is opening the doc
[17:15:24 CET] <thebombzen> "With -map you can select from which stream the timestamps should be taken. You can leave either video or audio unchanged and sync the remaining stream(s) to the unchanged one."
[17:15:35 CET] <thebombzen> https://ffmpeg.org/ffmpeg.html
[17:15:46 CET] <thebombzen> unfortunately, under -map's options it says very little about how that works
[17:19:08 CET] <fling> thebombzen: so I hope this will allow me to sync multiple pipe inputs together.
[17:19:19 CET] <fling> Which container to use for piping? nut?
[17:20:17 CET] <thebombzen> I don't know. I think matroska or nut are fine
[17:20:25 CET] <thebombzen> I'd try both and see which one works better
[17:20:53 CET] <thebombzen> furq: so apparently libavfilter is multithreaded
[17:21:06 CET] <thebombzen> -filter_complex_threads nb_threads (global): "Defines how many threads are used to process a filter_complex graph. Similar to filter_threads but used for -filter_complex graphs only. The default is the number of available CPUs."
[17:21:23 CET] <fling> >> With -map you can select &
[17:21:36 CET] <fling> But I don't see how exactly I could select!
[17:22:09 CET] <BtbN> filters can be multi threaded
[17:22:38 CET] <BtbN> Not a filter graph though
[17:35:39 CET] <fling> The question is why the inputs are not multi threaded.
[17:36:41 CET] <thebombzen> idk
[17:37:24 CET] <thebombzen> threading in ffmpeg be like https://i.imgur.com/XuRZsVQ.jpg
[17:41:01 CET] <BtbN> ffmpeg.c is not multi threaded
[17:42:27 CET] <thebombzen> sometimes I feel that threads are just easier to use programmatically
[17:42:38 CET] <thebombzen> given that ffmpeg is like a pipeline I'd do it asynchronously
[17:42:41 CET] <thebombzen> even on a single-core cpu
[17:42:44 CET] <thebombzen> is there any reason ffmpeg doesn't?
[18:02:46 CET] <BtbN> Because it's everything but easy.
[18:03:45 CET] <furq> threads are very eas to usye
[18:04:18 CET] <BtbN> Patches without regressions are always welcome then.
[18:04:59 CET] <furq> noth couing ld be splimer
[19:07:03 CET] <fahadash> Can we build a one gigantic filter-graph and call it a project?
[19:08:25 CET] <fahadash> I have about 20 different shots of a scene, now I have to pick and choose portions from each, time-lapse some portions and add captions and what not as well. Can I use FFMPEG with 20 -i FILE statements and one big filter-graph for this purpose to build my output?
[20:03:31 CET] <kepstin> fahadash: yep, but at that point it might be easier to put a bunch of 'movie' filters in your filter script as sources rather than keeping track of input indexes.
[20:04:22 CET] <kepstin> (note that ffmpeg inputs and 'movie' filter inputs have slightly different behaviour wrt timestamps when seeking)
[20:43:03 CET] <fahadash> What are the 'movie' filters?
[20:50:38 CET] <phillipk> I keep getting a white background but expected alpha. I have two alpha pngs and this command:
[20:51:39 CET] <phillipk> ffmpeg -y -r 25 -t 2 -loop 1 -s 4cif -i background.png -i overlay.png -filter_complex overlay=x=10:y=200 -pix_fmt rgba -vcodec png output.mov
[20:51:52 CET] <phillipk> I'm not sure if -loop 1 is necessary... or -s 4cif
[20:52:27 CET] <phillipk> comes out with a white background every time.
[20:54:43 CET] <DHE> fahadash: they let you use a file on-disk as a video/audio source. kinda like the nullsrc except rather than solid black they'll do video playback
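A small sketch of kepstin's suggestion, naming the clips inside the graph instead of as -i inputs (filenames are placeholders):

    ffmpeg -filter_complex \
      "movie=shot01.mp4[a]; movie=shot02.mp4[b]; [a][b]concat=n=2:v=1:a=0[v]" \
      -map "[v]" out.mp4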
[20:56:56 CET] <fahadash> Can we put the whole filter graph, and possibly -i FILE switches in a text file and just do /usr/bin/ffmpeg /path/to/filethathasmyswitches ?
[20:57:33 CET] <BtbN> just use your shell to expand it
[20:58:48 CET] <fahadash> I would've loved to put at least the filter graph in a file so I can do ffmpeg -WHATEVERSWITCH /path/to/file
[21:02:49 CET] <llogan> fahadash: -filter_complex_script
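In other words, the graph goes in a file and the command references it (paths and the graph contents are placeholders):

    # graph.txt contains, e.g.:
    #   [0:v]scale=1280:720,hflip[v]
    ffmpeg -i shot01.mp4 -filter_complex_script graph.txt -map "[v]" out.mp4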
[21:03:30 CET] <phillipk> my current issue is that when -filter_complex_script is crazy long, it fails.
[21:03:51 CET] <phillipk> of course it fails earlier if I just put the filter inline using -filter_complex
[21:04:16 CET] <llogan> phillipk: you probably need to add "format=rgb" to your overlay
[21:04:22 CET] <llogan> regarding the alpha junk
[21:04:32 CET] <phillipk> I'll try, thanks
[21:04:49 CET] <llogan> and remove -r 25 and -s 4cif
[21:05:29 CET] <phillipk> oky, but I want framerate 25
[21:05:49 CET] <llogan> that's the default
[21:06:20 CET] <llogan> if you want it to be different then use -framerate, not -r, when using the image file demuxer.
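So phillipk's command, with the image-demuxer options renamed as suggested, might look roughly like this (whether the alpha actually survives depends on the overlay filter's pixel-format negotiation, which is the issue still being debugged below):

    ffmpeg -y -loop 1 -framerate 25 -t 2 -i background.png -i overlay.png \
           -filter_complex "overlay=x=10:y=200" -pix_fmt rgba -c:v png output.mov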
[21:06:39 CET] <phillipk> ok
[21:07:11 CET] <furq> png in mov?
[21:12:25 CET] <phillipk> I'm just trying to assemble some pngs to create a video that contains transparency--so that I can overlay THAT video with my other content
[21:13:26 CET] <llogan> why not use multiple overlay?
[21:14:25 CET] <phillipk> nearly every frame has to overlay the png in a different location (based on a user's mouse movement). So, there are ~7000 overlays
[21:15:05 CET] <furq> phillipk: i imagine it's less overhead to just write pngs
[21:15:20 CET] <furq> using mov just confuses the matter
[21:16:33 CET] <phillipk> except I figured I could make a filter graph that specifies a few seconds worth of x/y locations--output a 2 second .mov. Then, assemble those. That way, I'd have a few thousand movs vs. 30x that (in pngs)
[21:16:56 CET] <furq> 7000 pngs isn't that many
[21:17:18 CET] <phillipk> how about 210,000?
[21:17:32 CET] <furq> that should probably be fine
[21:18:07 CET] <furq> by all means use an intermediate format if that many pngs causes an issue
[21:18:14 CET] <furq> but i don't think it will, and it's much simpler to do it that way
[21:19:11 CET] <phillipk> ok, let me try... BTW, changing "overlay=x=10:y=300:format=rgb" didn't exactly work--just made the edges of my png sharp
[21:19:57 CET] <furq> i'd have thought you'd need format=rgba
[21:20:12 CET] <furq> actually nvm there is no such thing
[21:21:04 CET] <thebombzen> I'm pretty sure rgba is a thing
[21:21:24 CET] <furq> not as an argument to overlay
[21:21:35 CET] <llogan> phillipk: which player are you using?
[21:21:50 CET] <phillipk> potplayer--yeah, I thought maybe that just wasn't showing the alpha
[21:22:11 CET] <phillipk> but it shows alpha as black when I take an old .flv I have with alpha
[21:22:36 CET] <llogan> i'll be lazy and blame the player
[21:22:55 CET] <furq> the flv is probably using yuva rather than rgba, if that makes any difference
[21:24:53 CET] <phillipk> okay, I'll just make pngs and see how that plays out
[21:26:38 CET] <fahadash> I have to rotate the video 90 degrees clockwise, transpose=1 is rotating it 180 degrees clockwise. What am I missing?
[21:29:42 CET] <thebombzen> fahadash: how does one rotate a video 180 degrees clockwise
[21:29:47 CET] <thebombzen> that's just 180 degrees
[21:31:47 CET] <fahadash> Ok. You are right. But that is not my issue
[21:31:58 CET] <thebombzen> and transpose is working fine for me
[21:31:58 CET] <fahadash> My issue is to rotate only 90 degrees not 180
[21:32:07 CET] <thebombzen> post exact command?
[21:32:25 CET] <thebombzen> the usual drill. post exact command and output on a paste site
[21:33:26 CET] <fahadash> working on windows currently, have to figure out how to stream the output directly to a paste site
[21:33:44 CET] <llogan> ffmpeg -i input ... output 2> log.txt
[21:33:54 CET] <llogan> I guess...for Windows. Not a Windows user.
[21:34:05 CET] <fahadash> there you go: http://pastebin.com/raw/XzU10dgY
[21:34:28 CET] <llogan> why do you have two inputs?
[21:34:37 CET] <thebombzen> yea I was just about to ask that
[21:34:43 CET] <thebombzen> you do realize you have two inputs right
[21:34:57 CET] <fahadash> I have to overlay them side by side; but at this time I am only concerned with rotating them. I am building it brick by brick
[21:35:16 CET] <thebombzen> well first let's do the rotation
[21:35:23 CET] <thebombzen> which one are you trying to rotate?
[21:35:25 CET] <fahadash> I was also going to ask how could I rotate both
[21:35:42 CET] <thebombzen> you'd do that one at a time.
[21:35:49 CET] <thebombzen> that's how you'd rotate both.
[21:35:56 CET] <fahadash> Can't put that in a filter graph?
[21:36:01 CET] <thebombzen> you can. but you're not.
[21:36:14 CET] <thebombzen> and the filter graph would involve applying the transpose filter to two separate streams
[21:36:24 CET] <thebombzen> either way you're still doing it one at a time.
[21:36:31 CET] <llogan> for each rotation you can chain two transpose: transpose=1,transpose=1. or use vflip,hflip, or use rotate=PI:bilinear=0
[21:36:54 CET] <thebombzen> well llogan that would rotate it 180 degrees
[21:37:10 CET] <fahadash> Let me get this right... you guys do not want me to put transpose as part of my filter graph because it will complicate things?
[21:37:10 CET] <llogan> oh, i transposed what he wanted.
[21:37:14 CET] <furq> vstack,transpose=1
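For reference, the transpose values, with the chained form being what llogan meant for a full 180°:

    ffmpeg -i in.mp4 -vf transpose=1 out.mp4                  # 90 degrees clockwise
    ffmpeg -i in.mp4 -vf transpose=2 out.mp4                  # 90 degrees counter-clockwise
    ffmpeg -i in.mp4 -vf "transpose=1,transpose=1" out.mp4    # 180 degrees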
[21:37:37 CET] <thebombzen> fahadash, no I want to know why your transpose filter is rotating it 180 degrees
[21:37:42 CET] <thebombzen> and the extra stuff is just getting in the way
[21:37:47 CET] <thebombzen> so let's diagnose one problem at a time.
[21:37:55 CET] <thebombzen> and then we can worry about the fact that you have two videos later.
[21:37:55 CET] <fahadash> ok. Let me drop one input
[21:38:26 CET] <furq> Side data:
[21:38:27 CET] <furq> displaymatrix: rotation of -90.00 degrees
[21:38:32 CET] <furq> that probably has something to do with it
[21:38:43 CET] <kuroro> hello, does anyone have experience w/ Separating Background Music from Vocals given an mp3 ?
[21:38:49 CET] <fahadash> alright, I got one video input and transpose=1 is still doing 90 deg clockwise
[21:38:51 CET] <thebombzen> oh yea that's a good point
[21:38:57 CET] <furq> fahadash: ^
[21:39:01 CET] <furq> your input has rotation metadata set
[21:39:08 CET] <thebombzen> so iPhone records have EXIF data that say "rotate this video"
[21:39:08 CET] <furq> you probably want to get rid of that
[21:39:23 CET] <thebombzen> kuroro: that's generally impossible.
[21:39:32 CET] <furq> kuroro: not without an instrumental version of the same song
[21:40:07 CET] <fahadash> How do I get rid of that rotation metadata or override that?
[21:40:09 CET] <kuroro> just seeing some papers on people using deep learing to separate vocals, so i thought there might be new tools/advancements
[21:40:14 CET] <thebombzen> -map_metadata -1
[21:40:15 CET] <furq> -map_metadata -1
[21:40:21 CET] <kuroro> https://arxiv.org/abs/1504.04658
[21:40:23 CET] <thebombzen> says "discard all metadata"
[21:40:41 CET] <furq> although if the input is rotated by -90 degrees, you might not need to transpose
[21:41:16 CET] <furq> there's probably some tool which can get rid of that inplace, but `ffmpeg -i foo.mp4 -map_metadata -1 -c copy -map 0 bar.mp4` works
[21:41:16 CET] <thebombzen> lol kuroro did you read the abstract
[21:41:21 CET] <thebombzen> "However, it is not yet known whether these methods are capable of generalizing to the discrimination of voice and non-voice in the context of musical mixtures. "
[21:41:36 CET] <thebombzen> "used to estimate 'ideal' binary masks for carefully controlled cocktail party speech separation problems"
[21:41:56 CET] <kuroro> ah ic
[21:41:57 CET] <fahadash> I added -map_metadata -1 after the -i FILE and I still get it rotated 180 instead of 90
[21:42:20 CET] <thebombzen> that's because the output is being played without the rotation
[21:42:25 CET] <thebombzen> and the input is being played with it.
[21:42:36 CET] <phillipk> I'm curious what workflow you all use? I'm using node with fluent-ffmpeg and it seems fine--I think a plain old .bat file might be enough however.
[21:42:38 CET] <thebombzen> try ffmpeg -i input -map_metadata -1 -c copy output and working with that
[21:42:59 CET] <thebombzen> that'll copy the video without transcoding but discard the metadata
[21:43:04 CET] <fahadash> I have a video which is turned on its side counterclockwise 90 degrees. I need to tilt it back. How do I do it?
[21:43:06 CET] <thebombzen> then work with that as your source.
[21:43:16 CET] <thebombzen> fahadash: we are literally explaining this to you right now.
[21:43:43 CET] <thebombzen> why do you just jump in like you just joined the channel and haven't been speaking with us for the last 8 minutes
[21:44:46 CET] <fahadash> 15:42 <thebombzen> try ffmpeg -i input -map_metadata -1 -c copy output and working with that , ended in less than 10 ms and did nothing to the video. Input and output are same
[21:46:06 CET] <fahadash> thebombzen: I was trying to restate my question in simple terms
[21:47:37 CET] <thebombzen> it didn't do nothing
[21:47:41 CET] <thebombzen> it discarded the metadata
[21:47:59 CET] <thebombzen> it should be fast if you're not transcoding
[21:48:06 CET] <fahadash> ok
[21:48:10 CET] <llogan> -noautorotate
[21:48:34 CET] <llogan> although i can't remember if autorotate is even used when filtering
[21:50:11 CET] <fahadash> I did -map_metadata -1 -c copy, then used the output with another command with -vf "transpose=1" and I still end up with 180 deg
[21:51:46 CET] <thebombzen> can you post the exact command and output of the second command
[21:51:54 CET] <fahadash> http://pastebin.com/raw/kNxLkvWZ
[21:52:04 CET] <fahadash> this has both
[21:53:07 CET] <fahadash> I still see the following in the 2nd command
[21:53:11 CET] <fahadash> Side data:
[21:53:11 CET] <fahadash> displaymatrix: rotation of -90.00 degrees
[21:54:30 CET] <thebombzen> interesting
[21:54:39 CET] <thebombzen> that's apparently because -map_metadata -1 isn't discarding that
[21:56:02 CET] <thebombzen> I don't know why
[21:56:40 CET] <fahadash> Ok. Thank you
[21:57:38 CET] <furq> well yeah just don't use transpose
[21:57:50 CET] <furq> evidently it's discarding the side data when you reencode
[21:57:52 CET] <fahadash> How do I tilt it then?
[21:58:01 CET] <furq> reencoding will reset the rotation to 0
[21:58:09 CET] <furq> and since the original is -90, that'll effectively rotate by 90
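One way to do what furq describes without touching the pixels, assuming an mp4 input and an ffmpeg build that exposes the display matrix as a "rotate" stream tag (filenames are placeholders):

    ffmpeg -i input.mp4 -c copy -metadata:s:v:0 rotate=0 output.mp4

After that, players should stop auto-rotating the video, and any transpose you still apply behaves as expected.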
[22:03:08 CET] <fahadash> furq: You are right, I applied the overlay to the two inputs and they were not tilted on their sides
[22:14:49 CET] <fahadash> in the filter graph, what is W the width of? I am assuming the widest of the input videos
[22:36:05 CET] <fahadash> I am trying to resize two inputs to 350x500 and put them side by side; since the output frame can only fit one source, I am using pad=700:500:0:0 at the end of the filter chain, but all I see is one source and a blank area on the other side. Why?
[22:36:12 CET] <fahadash> ffmpeg -y -i c:\temp\del\bowling\IMG_4548.MOV -i c:\temp\del\bowling\IMG_4549.MOV -filter_complex "[0:v] scale=w=350:500 [tim]; [1:v] scale=w=350:h=500 [scott]; [scott][tim] overlay=W:0, pad=700:500:0:0" c:\temp\del\bowling\out\final.avi
[22:36:49 CET] <thebombzen> fahadash: -vf hstack
[22:37:02 CET] <furq> yeah don't use pad and overlay for that
[22:37:18 CET] <thebombzen> you could but it's unnecessarily slow and complicated
[22:37:34 CET] <furq> i would probably just do hstack,scale=700:500
[22:39:11 CET] <fahadash> doesn't work: ffmpeg -y -i c:\temp\del\bowling\IMG_4548.MOV -i c:\temp\del\bowling\IMG_4549.MOV -filter_complex "[0:v] scale=w=350:500 [tim]; [1:v] scale=w=350:h=500 [scott]; hstack=[tim]:[scott]" c:\temp\del\bowling\out\final.avi
[22:39:23 CET] <fahadash> [AVFilterGraph @ 000000000075a7a0] Unable to parse graph description substring: ":[scott]"
[22:41:25 CET] <fahadash> found the problem, works perfect
[22:44:23 CET] <fahadash> Why can't we provide more than two inputs to hstack?
[22:44:29 CET] <furq> you can
[22:44:32 CET] <furq> hstack=inputs=3
[22:44:57 CET] <fahadash> got it. thanks
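For reference, the working form of that graph looks roughly like this (filenames shortened):

    ffmpeg -y -i left.mov -i right.mov \
      -filter_complex "[0:v]scale=350:500[l];[1:v]scale=350:500[r];[l][r]hstack=inputs=2[v]" \
      -map "[v]" out.avi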
[23:01:01 CET] <fahadash> how do we join two conditions within drawtext? If I want to add text between 3 and 6 seconds into the timeline
[23:26:29 CET] <llogan> use the "enable" option. see "man ffmpeg-filters" for examples.
[23:26:38 CET] <llogan> search for "between"
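A minimal example of the "between" form llogan points at (the text and font path are placeholders; a fontfile may or may not be required depending on how ffmpeg was built):

    ffmpeg -i in.mp4 -vf "drawtext=fontfile=/path/to/font.ttf:text='hello':x=10:y=10:enable='between(t,3,6)'" out.mp4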
[00:00:00 CET] --- Tue Dec 20 2016