[Ffmpeg-devel-irc] ffmpeg.log.20181003
burek
burek021 at gmail.com
Thu Oct 4 03:05:02 EEST 2018
[02:16:23 CEST] <axisys> how do I remove from time 51:40 to 52:35 from an mp4 file?
[02:16:31 CEST] <axisys> I like to keep the rest
[02:30:16 CEST] <tdr> you probably want to clip those two entire seconds or at least a couple frames, not those exact points
[02:39:07 CEST] <axisys> tdr: not sure how to do that
[02:40:20 CEST] <axisys> it's a 56 min long video.. but there is a portion where I by mistake shows my private key info.. I need to take that small portion out before sharing
[02:40:31 CEST] <axisys> showed*
[03:24:46 CEST] <relaxed> axisys: ffmpeg -i input -c copy -t 00:51:40 start.mp4
[03:25:20 CEST] <relaxed> ffmpeg -ss 00:52:35 -i input -c copy end.mp4
[03:26:53 CEST] <relaxed> ffmpeg -f concat -safe 0 -i <(for i in start.mp4 end.mp4; do printf "file "%s"\n" "$PWD"/"$i"; done) -c copy -movflags +faststart done.mp4
[03:33:20 CEST] <axisys> relaxed: awesome.. saving it to my cheat sheet.. what do the -movflags and +faststart do ?
[03:34:18 CEST] <relaxed> moves the index to the start of the container for faster loading
[03:35:13 CEST] <axisys> relaxed: thank you!
[03:35:24 CEST] <relaxed> you're welcome
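relaxed's three commands above can be collected into one sketch (filenames are placeholders; the concat list is written to a plain file instead of process substitution, with the quoting the concat demuxer expects):

```shell
# Cut everything before 51:40 and after 52:35, then rejoin losslessly.
ffmpeg -i input.mp4 -c copy -t 00:51:40 start.mp4
ffmpeg -ss 00:52:35 -i input.mp4 -c copy end.mp4
# concat demuxer list; single quotes protect paths containing spaces
printf "file '%s'\n" "$PWD/start.mp4" "$PWD/end.mp4" > list.txt
ffmpeg -f concat -safe 0 -i list.txt -c copy -movflags +faststart done.mp4
```

As tdr noted earlier, stream-copied cuts land on keyframes, so the cut points can shift by up to a GOP; re-encoding is needed for frame accuracy.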
[03:36:12 CEST] <axisys> relaxed: so I used vlc to get those times .. I suppose that works fine.. any recom in there?
[03:38:52 CEST] <relaxed> Well, mpv is the best video player. Watch the output to make sure your personal info is omitted
[03:39:26 CEST] <axisys> relaxed: I checked with vlc and it has been.. i will check out mpv
[03:39:31 CEST] <axisys> relaxed: thank you again!
[03:42:50 CEST] Action: relaxed levels up
[05:21:37 CEST] <Marble68> ffmpeg pastebin is down - so I'll ask. I've got frames being captured with timestamp - works great. I've also gotten frames to capture with %03d to get incremental frame numbers. Is there a way to combine these so that I can get frame number AND time stamp? It just overwrites the frame timestamp because its highest precision is 1 second. So I'm after something like -strftime 1 "img/%Y-%m-%d_%H-%M-%S_test_%01d.jpg"
[05:23:47 CEST] <Marble68> that way if 10 frames are captured a second - I'll get <timestamp>_test_<frame # this second>
[05:25:18 CEST] <Marble68> I'm thinking I'm going to have to modify strftime source to do something like this - maybe a rolling integer that resets when the second changes
[05:25:35 CEST] <Marble68> TIA for any advice
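No answer came in this log; strftime expansion in the image2 muxer really is limited to whole seconds. One hedged workaround (assuming a build that has the image2 `frame_pts` option) is to number files by presentation timestamp instead, and map numbers back to wall-clock time afterwards from the capture start time and frame rate:

```shell
# Hypothetical sketch: name each JPEG after its PTS, which stays
# unique even at 10 frames per second.
ffmpeg -i input -vsync vfr -frame_pts 1 img/test_%08d.jpg
```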
[06:31:15 CEST] <the_gamer> hi there, i got some pictures i want to make a video of. no problem but is there a way to tell ffmpeg to take the audio from another .mp4?
[06:50:52 CEST] <furq> the_gamer: -i foo%d.jpg -i foo.mp4 -map 0:v -map 1:a
[06:51:07 CEST] <the_gamer> oh great, thank you :)
[06:51:11 CEST] <furq> you shouldn't even need -map in this case because the first input has no audio
[06:51:16 CEST] <furq> but it's good to be explicit
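A fuller version of furq's command might look like this sketch (names are placeholders; the -framerate value and codec settings are assumptions):

```shell
# Video from the image sequence (input 0), audio copied from the mp4
# (input 1); -shortest stops when the shorter of the two runs out.
ffmpeg -framerate 25 -i foo%d.jpg -i foo.mp4 \
  -map 0:v -map 1:a -c:v libx264 -pix_fmt yuv420p -c:a copy -shortest out.mp4
```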
[07:03:56 CEST] <mbnt> Hi, I am sort of a newb to ffmpeg. I want to convert h264 footage to cineform. My camera codec is h264 and I want to edit in Resolve on Linux and it does not recognize my codec, but Cineform, it does. I have a folder full of H264 footage that I want to edit. What would be the command to convert the folder contents to cineform?
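This went unanswered in the log. ffmpeg builds newer than the one discussed here include a native CineForm encoder (`cfhd`); with such a build, a batch conversion might be sketched like this (the file pattern, quality value, and audio codec are assumptions):

```shell
# Convert every .MP4 in the folder to CineForm in a MOV container.
for f in *.MP4; do
  ffmpeg -i "$f" -c:v cfhd -quality film2 -c:a pcm_s16le "${f%.*}_cineform.mov"
done
```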
[07:20:16 CEST] <lindylex> What I want to do. I want to overlay the video num1.mov on the image cover.png. The video has a black background I would like to be transparent so I can see the cover.png below.
[07:20:28 CEST] <lindylex> This is what I have tried : https://pastebin.com/BG1SPwtt
[07:49:29 CEST] <killown> how can I add an image to the end of the video?
[07:49:39 CEST] <killown> I mean, that will last 15 seconds
[07:53:26 CEST] <killown> ffmpeg -loop 1 -t 10 -i preview.png -vf "crop=w=W:h=ih:x='(iw-W)*t/10':y=0" -r 25 -pix_fmt yuv420p out.mp4
[07:53:37 CEST] <killown> Undefined constant or missing '(' in 'W'
[07:53:37 CEST] <killown> Error when evaluating the expression 'W'
[07:56:13 CEST] <killown> ffmpeg -r 1/5 -i preview.png -c:v libx264 -vf fps=25 -pix_fmt yuv420p out.mp4
[07:56:37 CEST] <killown> http://wpbin.io/g10oj7
[07:56:44 CEST] <killown> is this even possible to use ffmpeg to do that?
[08:03:25 CEST] <lindylex> Yes it is possible to do this.
[08:06:23 CEST] <killown> I did
[08:06:31 CEST] <killown> I don't know how to merge two videos
[08:09:17 CEST] <killown> should be simple to join two videos?
[08:12:09 CEST] <killown> lindylex, can you help me join videos?
[08:12:47 CEST] <poutine> are you sure that error is generated from that last command line you put?
[08:12:56 CEST] <poutine> can you post the full log?
[08:13:03 CEST] <lindylex> The image is it the same size as the video you are joining it to?
[08:13:32 CEST] <poutine> https://stackoverflow.com/questions/20847674/ffmpeg-libx264-height-not-divisible-by-2 <- you looked at this as well?
[08:13:48 CEST] <killown> lindylex, not the same
[08:13:52 CEST] <furq> killown: crop=w=iw
[08:14:02 CEST] <killown> furq, sorry already fixed this issue
[08:14:37 CEST] <furq> well yeah the width and height need to be divisible by 2
[08:14:51 CEST] <killown> now I created 15 second image
[08:15:02 CEST] <killown> I want to join video.mp4 with picture.mp4
[08:15:20 CEST] <furq> https://trac.ffmpeg.org/wiki/Concatenate#demuxer
[08:15:42 CEST] <killown> this is not working
[08:15:50 CEST] <killown> the 15 seconds turns into 1 second at the end
[08:16:06 CEST] <killown> [mp4 @ 0x55fccd4a70c0] Non-monotonous DTS in output stream 0:0; previous: 359196, current: 359130; changing to 359197. This may result in incorrect timestamps in the output file.
[08:16:36 CEST] <furq> do the framerates match
[08:16:54 CEST] <killown> what?
[08:17:01 CEST] <furq> both files need to have the same framerate
[08:17:03 CEST] <killown> don't know what that means
[08:17:06 CEST] <killown> ok
[08:17:14 CEST] <killown> how to do that?
[08:17:25 CEST] <lindylex> furq : they both need to have audio. The image needs to have blank audio when created I think for the concat to work.
[08:18:20 CEST] <furq> killown: -framerate 30 -i preview.png
[08:18:22 CEST] <furq> and get rid of -r 25
[08:18:42 CEST] <furq> obviously replace 30 with whatever your other input is
[08:20:22 CEST] <killown> I am using this ffmpeg -loop 1 -i preview.png -c:v libx264 -t 15 picture.mp4
[08:23:39 CEST] <killown> how do I replace this -i mylist.txt with -i file1.mp4 file2.mp4
[10:28:30 CEST] <lindylex> killown : Convert each file like this. ffmpeg -i input1.mp4 -c copy -bsf:v h264_mp4toannexb -f mpegts intermediate1.ts
[10:28:54 CEST] <lindylex> killown : And This ffmpeg -i input2.mp4 -c copy -bsf:v h264_mp4toannexb -f mpegts intermediate2.ts
[10:29:05 CEST] <lindylex> Then combine them together. ffmpeg -i "concat:intermediate1.ts|intermediate2.ts" -c copy -bsf:a aac_adtstoasc output.mp4
[10:29:19 CEST] <lindylex> This is here also https://pastebin.com/7GhTTYGv
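Putting the pieces together, and answering killown's earlier mylist.txt question: the concat demuxer takes its inputs from a list file, not from multiple -i flags, so the list is generated first (filenames are placeholders):

```shell
# Build the list the concat demuxer expects, then join without re-encoding.
printf "file '%s'\n" video.mp4 picture.mp4 > mylist.txt
ffmpeg -f concat -safe 0 -i mylist.txt -c copy joined.mp4
```

As furq noted, this only works cleanly when both parts share the same codec, resolution, and frame rate.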
[10:42:13 CEST] <the_gamer> i am having a problem. command: "ffmpeg -i rendered/%04d.png -i P1010197.MP4 -map 0:v -map 1:a -q:a 1 -q:v 1 -c:a copy rendered.mp4". this is using only the first picture of the rendered/*.pngs. why? it should use all of them
[10:47:23 CEST] <the_gamer> they are all in one row, no number/picture is missing. why are they not used?
[10:53:13 CEST] <the_gamer> nobody? anybody?
[10:59:39 CEST] <durandal_1707> the_gamer: there should be no gaps
[10:59:51 CEST] <the_gamer> there aren't
[11:01:06 CEST] <durandal_1707> the_gamer: pastebin full ffmpeg output
[11:01:49 CEST] <the_gamer> i have to apologize
[11:02:10 CEST] <the_gamer> pictures are from blender which i thought does not make gaps but blender screwed up and made gaps
[11:02:12 CEST] <the_gamer> thank you
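For sequences that do contain gaps, a hedged alternative (on builds where the image2 demuxer supports glob patterns, i.e. not Windows) is to match whatever files exist instead of relying on consecutive numbering:

```shell
# Glob matching takes all existing PNGs in sorted order, gaps and all.
ffmpeg -framerate 25 -pattern_type glob -i 'rendered/*.png' \
  -i P1010197.MP4 -map 0:v -map 1:a -c:v libx264 -c:a copy rendered.mp4
```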
[11:24:59 CEST] <zerodefect> I'm using a COTS multiplexer to multiplex an H.264 stream. I'm encoding the stream using x264 via FFmpeg using the C-API. The multiplexer logs errors when the stream does not adhere to the T-STD model; it believes the video stream is not quite CBR. Does anyone have any tips/suggestions. Admittedly, not too sure where to start (out of my depth?).
[11:27:57 CEST] <JEEB> well the first question is whether you need to have VBV/HRD or actual "dumb" CBR which pads the stream to make it "really CBR"
[11:29:14 CEST] <JEEB> for the first part you need to set maxrate/bufsize (at the minimum), and enable nal-hrd information output from x264
[11:29:52 CEST] <JEEB> see the x264 documentation which key=value pairs you have to give through x264-params (maxrate|bufsize are in the global API for libavcodec)
[11:30:20 CEST] <JEEB> and then if you need "dumb" CBR then a) I'm very sorry b) set the x264 HRD mode to CBR
[11:32:41 CEST] <zerodefect> For your first question, it's the former.
[11:33:53 CEST] <zerodefect> Second question, I set "nal-hrd" option on encoder and set 'bit_rate' and 'bit_rate_tolerance' on AVCodecContext encoding properties. I'll look into bufsize.
[11:34:45 CEST] <zerodefect> You've given me something to go on - thank you.
[11:35:18 CEST] <JEEB> you need specifically maxrate and bufsize both
[11:35:25 CEST] <JEEB> as it's maxrate over bufsize
[11:35:32 CEST] <JEEB> otherwise VBV/HRD will not be done
[11:36:31 CEST] <zerodefect> Ok. Cool.
[11:59:36 CEST] <King_DuckZ> hello, could someone comment about this code please? https://alarmpi.no-ip.org/kamokan/cm?cc
[12:01:06 CEST] <Mavrik> It's very green.
[12:02:55 CEST] <King_DuckZ> Mavrik: replace the ?cc part with ?colourless if that bothers you :) or remove it entirely
[12:15:25 CEST] <zerodefect> @JEEB if for example I'm wanting to encode a 10Mbit/s h.264 stream, what are sensible maxrate/bufsize values? Is that question a bit open-ended?
[12:32:26 CEST] <furq> zerodefect: maxrate should be the same as bitrate for cbr
[12:33:19 CEST] <furq> bufsize should generally be 1-2x your maxrate afaik
[12:33:39 CEST] <zerodefect> Ah ok. Bufsize was the one throwing me. I'll try that out
[12:33:51 CEST] <furq> idk if there are any special considerations there for cbr
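Pulling JEEB's and furq's advice together, a sketch of a VBV/HRD-constrained 10 Mbit/s encode (the input/output names and MPEG-TS muxing are assumptions based on the multiplexer use case):

```shell
# maxrate equals the target bitrate, bufsize ~1x maxrate; nal-hrd=cbr makes
# x264 emit HRD information and pad to constant bitrate for the T-STD model.
ffmpeg -i input.mxf -c:v libx264 \
  -b:v 10M -maxrate 10M -bufsize 10M \
  -x264-params nal-hrd=cbr \
  -f mpegts out.ts
```

In the C API the equivalents are AVCodecContext's rc_max_rate and rc_buffer_size, plus "nal-hrd" through the encoder's private options.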
[13:39:25 CEST] <GuiToris> hey, this is a rather subjective question. Isn't a 555mb libx264 10-minute video rather big?
[13:40:05 CEST] <GuiToris> it's almost the size of a CD
[13:41:09 CEST] <GuiToris> I thought it would be around 100mb
[13:43:53 CEST] <GuiToris> I forgot to mention it's a 16:9 1080p video
[13:45:36 CEST] <durandal_1707> and fps?
[13:46:36 CEST] <GuiToris> 25
[13:48:21 CEST] <GuiToris> durandal_1707, https://ptpb.pw/TLHl
[13:48:35 CEST] <GuiToris> it's quality 25
[13:49:14 CEST] <furq> if crf 25 is coming out that big then the source is just hard to compress
[13:49:36 CEST] <furq> i'm guessing this is video game footage or something like that
[13:50:08 CEST] <durandal_1707> or fractals
[13:50:19 CEST] <furq> or 10 minutes of the hbo logo
[13:50:23 CEST] <GuiToris> no, it was recorded with my camcorder
[13:50:39 CEST] <furq> i'm guessing you don't have a gimbal then
[13:51:05 CEST] <furq> either that or it's low light and super grainy
[13:51:14 CEST] <furq> or noisy, rather
[13:51:42 CEST] <GuiToris> it's sometimes noisy right
[13:53:20 CEST] <furq> you can try denoising or maybe using vidstab if it's shaky
[13:54:17 CEST] <furq> but yeah that bitrate isn't extraordinarily high
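A rough sanity check on the numbers: 555 MB over 10 minutes works out to about 7.4 Mbit/s, which is indeed unremarkable for noisy, shaky 1080p25 at crf 25:

```shell
# Back-of-envelope bitrate from file size and duration.
size_mb=555
duration_s=600
kbps=$(( size_mb * 8 * 1000 / duration_s ))
echo "${kbps} kbit/s"   # 7400 kbit/s
```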
[13:54:23 CEST] <GuiToris> https://ptpb.pw/aPSJ
[13:54:27 CEST] <GuiToris> here's a screenshot
[13:54:32 CEST] <GuiToris> it's mostly like this
[13:54:37 CEST] <furq> just be glad it's not a video capture of quake speedrunning
[13:54:55 CEST] <furq> if you ever want to murder x264 then that's a good way to do it
[13:55:06 CEST] <GuiToris> I've already used vidstab, it was shaky
[13:56:02 CEST] <GuiToris> it was originally an interlaced video, do you think (since vidstab needs two passes) I should deinterlace the first pass too?
[13:56:38 CEST] <furq> that would make sense, yeah
[13:57:14 CEST] <GuiToris> I haven't used the -tune option, should I use it?
[13:57:48 CEST] <GuiToris> none of them seems appropriate here
[14:00:03 CEST] <GuiToris> is it maybe 'film'?
[14:01:58 CEST] <Harzilein> <furq> or 10 minutes of the hbo logo
[14:02:00 CEST] <Harzilein> :D
[14:06:18 CEST] <th3_v0ice> The av_packet_unref(&packet) will only free the packet if there are no more references to it, correct? I assume this operation is not thread safe?
[14:15:58 CEST] <atomnuker> yes, yes
[14:27:57 CEST] <th3_v0ice> Ok, thanks
[14:29:35 CEST] <th3_v0ice> One more question though, can I send same packet to two different muxers?
[14:33:35 CEST] <atomnuker> yes, refcounting will take care of it
[14:44:34 CEST] <th3_v0ice> Cool
[15:57:21 CEST] <GuiToris> if I have separate video and audio files, is there any better way to combine them? ffmpeg -i video.mp4 -i audio.ogg -map 0:v -map 1:a -c copy happilytogether.mkv ?
[15:58:51 CEST] <relaxed> GuiToris: nope
[15:59:00 CEST] <GuiToris> thanks relaxed
[15:59:36 CEST] <Blacker47> GuiToris, you can use mkvmerge if .mkv is what you want.
[16:00:18 CEST] <GuiToris> yes, that's the desired container format, I'll look up, thank you for your suggestion
[16:01:20 CEST] <Blacker47> but it is unclear what you want to go "better".
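The mkvmerge equivalent Blacker47 alludes to is a one-liner (track selection is automatic when each input carries a single stream):

```shell
# Mux existing video and audio into Matroska without re-encoding.
mkvmerge -o happilytogether.mkv video.mp4 audio.ogg
```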
[16:53:26 CEST] <ciga> hi
[16:55:06 CEST] <ciga> I'm trying to get ffmpeg to capture my screen and encode it using the gpu. I'm on radeon with mesa using h264_vaapi. ffmpeg drops frames for some reason. anyone know how to fix this?
[16:55:59 CEST] <ciga> I'm on ffmpeg 4.0.2-6 with Ubuntu
[17:14:17 CEST] <relaxed> ciga: pastebin.com your command and output
[17:16:18 CEST] <ciga> https://pastebin.com/L4H4yzdD
[17:21:51 CEST] <relaxed> try adding -framerate 60 before the input
[17:22:38 CEST] <ciga> it is the same
[17:25:07 CEST] <relaxed> 30?
[17:25:29 CEST] <ciga> still the same
[17:26:26 CEST] <ciga> not sure if this is related: https://lists.ffmpeg.org/pipermail/ffmpeg-user/2018-February/038989.html
[17:27:51 CEST] <relaxed> try using -f x11grab instead
[17:28:28 CEST] <ciga> thats what I use
[17:29:19 CEST] <relaxed> oh, I meant without vaapi
[17:38:17 CEST] <ciga> libx264 and libx265 drops frames on 4.0.2, but works on 3.4.4 that comes with Ubuntu
[17:38:41 CEST] <Mavrik> um.
[17:38:54 CEST] <Mavrik> Neither libx264 or libx265 encode on GPU
[17:39:32 CEST] <ciga> relaxed asked me to try it without vaapi
[17:41:13 CEST] <Mavrik> ah, nevermind
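For reference, a typical x11grab-plus-VAAPI capture command of the kind being debugged here looks like this sketch (display, resolution, and render node are assumptions; ciga's actual command was only in the pastebin):

```shell
# Grab the X11 screen, upload frames to the GPU, encode with h264_vaapi.
ffmpeg -f x11grab -framerate 30 -video_size 1920x1080 -i :0.0 \
  -vaapi_device /dev/dri/renderD128 \
  -vf 'format=nv12,hwupload' -c:v h264_vaapi out.mp4
```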
[17:44:52 CEST] <Mista_D> Trying to extract list of "key-frames only" with ffprobe, and get a lot of "ATSC A53 Part 4 Closed Captions" any way to skip them please?
[17:50:22 CEST] <^Neo> hello friends, can someone recommend a function to programmatically fill an individual field of an AVFrame buffer?
[17:50:39 CEST] <relaxed> Mista_D: pastebin.com the command and sample output of the problem
[17:50:56 CEST] <ciga> so, it seems h264_vaapi works just fine with 3.4.4
[17:51:30 CEST] <JEEB> ^Neo: unfortunately most things expect both fields in one AVFrame
[17:51:36 CEST] <ciga> hevc_vaapi says 'Encoding entrypoint not found (17 / 6)'. Is this something I can fix?
[17:51:51 CEST] <JEEB> ciga: you'd have to check vainfo or whatever the vaapi information app was
[17:52:07 CEST] <JEEB> if your device supports whatever that is in HEVC
[17:52:11 CEST] <JEEB> or HEVC at all
[17:52:14 CEST] <JEEB> (for encoding)
[17:52:17 CEST] <ciga> VAProfileHEVCMain : VAEntrypointVLD
[17:52:24 CEST] <JEEB> that might be decoding
[17:52:44 CEST] <ciga> i dont have an entry for VAEntrypointEncSlice
[17:53:32 CEST] <ciga> so, i guess i need a newer Mesa ...
[17:53:33 CEST] <^Neo> JEEB, well, there's av_image_fill_arrays but I'm curious if there's an easy way to just generate like a green field and a red field for testing.
[17:53:34 CEST] <JEEB> ^Neo: so you'd have to lace it through with 2*stride
[17:53:53 CEST] <^Neo> got it
[17:53:57 CEST] <JEEB> there's a helper function to get at least a black AVFrame
[17:54:01 CEST] <JEEB> but not sure if there's one for fields
[17:54:12 CEST] <^Neo> ah ok, name of the black AVFrame function?
[17:54:14 CEST] <JEEB> if you can't find anything relevant in doxy/code then most likely there's not
[17:54:24 CEST] <JEEB> don't remember :D it was relatively recently added by wm4
[17:54:39 CEST] <^Neo> oh cool! ok, I'll go digging
[17:55:12 CEST] <JEEB> or I guess you could use avfilter for the interleaving?
[17:55:33 CEST] <JEEB> since you can generate a red and blue or whatever frames with 1/2 height
[17:55:38 CEST] <JEEB> and then call the interleave filter on those?
[17:56:01 CEST] <JEEB> although that might be less simple than just filling the fields depending on your needs :D
[17:57:48 CEST] <^Neo> a little, heh. Thanks though!
[17:58:21 CEST] <Mista_D> relaxed: ./ffprobe -i source1.ts -select_streams v -show_frames -of csv -show_entries frame=pkt_pts_time,pict_type
[18:00:11 CEST] <JEEB> unfortunately the caption packets are within the video stream
[18:00:18 CEST] <JEEB> so unless ther's an option to specifically not show the caption packets
[18:00:29 CEST] <JEEB> then you just need to filter better :P
[18:01:51 CEST] <RedSoxFan07> Is FFMPEG's Bob Weaver Deinterlacing Filter the same as Handbrake's Decomb filter with the Bob option?
[18:02:33 CEST] <furq> handbrake just uses yadif in send_frame mode
[18:02:48 CEST] <furq> i think the decomb filter just runs idet beforehand
[18:03:01 CEST] <furq> decomb generally gives bad results in my experience though
[18:03:23 CEST] <RedSoxFan07> furq: Oh, okay. What gives the best results?
[18:03:43 CEST] <furq> check to see if it's interlaced and then either deinterlace or don't
[18:04:10 CEST] <furq> idet will give false negatives on low-motion interlaced frames
[18:04:25 CEST] <furq> so you'll end up with intermittent artifacts
[18:04:29 CEST] <relaxed> Mista_D: you only want: frame,11,...
[18:05:05 CEST] <furq> if you actually have a clip which is half interlaced and half progressive then i would normally break out vapoursynth and start counting frame numbers
[18:05:18 CEST] <furq> but you could presumably do it with yadif and enable
[18:14:41 CEST] <RedSoxFan07> I've got a lot of videos to convert. I'm going for a solution that doesn't take a ton of time and effort.
[18:14:50 CEST] <RedSoxFan07> I don't want to be counting frames or anything like that.
[18:15:18 CEST] <RedSoxFan07> I don't mind checking to see whether or not the video is interlaced and acting on that information, but I don't want to take forever.
[18:19:27 CEST] <furq> using the log output from idet to detect is usually fine
[18:19:54 CEST] <furq> using it to flag frames as interlaced and then conditionally deinterlacing frame-by-frame is more or less what handbrake does
[18:20:03 CEST] <furq> which is just a bad idea
[18:20:36 CEST] <furq> visually detecting is the best way but obviously you can't automate that
[18:21:26 CEST] <furq> this is assuming you don't have any telecine content which you will need to check for manually afaik
[18:32:49 CEST] <iive> furq, there are interlace detect filters that do just that: visually detect
[18:33:10 CEST] <furq> yeah that's what idet does
[18:33:35 CEST] <furq> but it's not perfect
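The probe-then-decide workflow furq describes might be sketched like this (the frame count and filter choice are assumptions; bwdif is used here, yadif works the same way):

```shell
# Step 1: probe. idet logs counts of TFF/BFF/progressive frames at the end.
ffmpeg -i input.mp4 -vf idet -frames:v 500 -an -f null - 2>&1 | grep -i idet
# Step 2: if the counts say interlaced, deinterlace the whole clip;
# send_field doubles the frame rate, send_frame keeps it.
ffmpeg -i input.mp4 -vf bwdif=mode=send_field -c:v libx264 -crf 18 -c:a copy out.mp4
```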
[18:35:53 CEST] <Mista_D> http://pastebin.com/hPX9Wmmi side_data junk
[18:45:42 CEST] <relaxed> Mista_D: command | awk '/^frame/{sub("side.*","");print}'
[18:51:32 CEST] <Mista_D> relaxed: command | sed 's/side_data, ATSC A53 Part 4 Closed Captions//g'
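A hedged variant that avoids most of the filtering by decoding only keyframes in the first place (ffprobe accepts -skip_frame the same way ffmpeg does):

```shell
# Only keyframes reach the frame reader, and awk keeps just time + type,
# dropping the side-data lines about A53 closed captions.
ffprobe -v error -select_streams v -skip_frame nokey \
  -show_entries frame=pkt_pts_time,pict_type -of csv source1.ts \
  | awk -F, '/^frame/ {print $2 "," $3}'
```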
[18:54:54 CEST] <iive> furq, it doesn't have to be perfect, just good enough :D
[20:36:22 CEST] <russoisraeli> hello folks. I need to test an RTMP (Kaltura) stream. If I use "ffplay" or "vlc" with the "rtsp://" protocol prefix, it works. "rtmp://" does not - returns a "missing argument token" error. Any idea why?
[20:36:45 CEST] <JEEB> probably some authentication parameters missing?
[20:36:59 CEST] <JEEB> you should know what parameters are required to connect to the RTMP stream
[20:38:56 CEST] <russoisraeli> JEEB - right.... but it's just a URL difference.... should rtmp:// and rtsp:// be interchangeable? or is there something else going on? sorry if the question is silly
[20:38:58 CEST] <BtbN> rtsp and rtmp are two very different things.
[20:39:41 CEST] <russoisraeli> it looks like rtsp:// does something additional
[20:39:50 CEST] <BtbN> It's an entire other protocol...
[20:40:23 CEST] <russoisraeli> weird... I was given rtmp:// URL's
[20:40:31 CEST] <JEEB> russoisraeli: just go read the kaltura documentation and figure out the proper URLs and parameters required to access the RTMP end point
[20:40:41 CEST] <JEEB> or ask whomever configured the kaltura you're testing
[20:40:45 CEST] <JEEB> we're not kaltura support
[20:41:42 CEST] <russoisraeli> JEEB - yeah, I got the rtmp:// URL's from Kaltura.. I understand.. not asking for support. Just trying to figure out why both vlc and ffplay play their URLs as rtsp://, but not the given rtmp://
[20:41:57 CEST] <BtbN> Probably because it's an rtsp url.
[20:42:15 CEST] <russoisraeli> probably
[20:42:34 CEST] <JEEB> either the URL is incorrect, or it lacks parameters that are usually utilized to keep random people from connecting to the RTMP end point
[20:53:33 CEST] <lindylex> How do I change the background color of the video? ffplay -f lavfi 'amovie=a.mp3, asplit [a][out1]; [a] showcqt=s=900x900:count=4:bar_h=800:text=0:cscheme=.74|0|0|.74|0|0 [out0]'
[21:31:35 CEST] <lindylex> I am trying to change the background color of this video? ffmpeg -i a.mp3 -filter_complex \
[21:31:35 CEST] <lindylex> "[0:a]showcqt=s=900x900:count=4:bar_h=800:text=0:cscheme=.74|0|0|.74|0|0,format=yuv420p[v]" \
[21:31:35 CEST] <lindylex> -map "[v]" -map 0:a -y o2.mp4; mplayer -loop 0 o2.mp4;
[21:31:39 CEST] <lindylex> How do I do this?
[21:33:18 CEST] <JEEB> did you look at the docs for the schowcqt filter?
[21:33:40 CEST] <JEEB> I'll even give you a hint http://ffmpeg.org/ffmpeg-filters.html
[21:34:03 CEST] <JEEB> and of course, if there is no option for the background and you don't see another way to do it - then it's not currently possible
[21:53:54 CEST] <lindylex> JEEB I did read it and there is no option.
[21:54:25 CEST] <JEEB> well then that's it
[22:12:33 CEST] <leif> Can anyone here tell me the difference between passing audio through FFmpeg's highpass filter, then immediately its lowpass filter, compared to just using its bandpass filter?
[22:14:06 CEST] <durandal_1707> leif: using just bandpass filter is 2x faster
[22:15:05 CEST] <leif> durandal_1707: Ah, okay, thanks.
[22:16:05 CEST] <durandal_1707> and you can not get exactly same output...
[22:17:05 CEST] <leif> That makes sense. So its like actually using a bandpass filter circuit rather than connecting a low+high pass circuit together.
[22:17:11 CEST] <leif> Which would make sense.
[22:17:14 CEST] <leif> If so, thanks. :)
[22:18:02 CEST] <durandal_1707> biquads are just "special" IIR filters
[22:18:18 CEST] <leif> Ah, okay, thanks.
[22:18:32 CEST] <durandal_1707> there is math behind all of it
[22:19:29 CEST] <leif> Yup. I remember from one of my signal processing classes, I just wanted to see if ffmpeg did anything differently. :)
[22:19:42 CEST] <durandal_1707> combining several biquads in cascade one can do all sort of strange filtering..
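The difference can be heard directly by building the band both ways (the cutoffs and width are arbitrary example values):

```shell
# Two cascaded biquads...
ffmpeg -i in.wav -af 'highpass=f=300,lowpass=f=3000' cascade.wav
# ...versus one bandpass biquad covering roughly the same band.
ffmpeg -i in.wav -af 'bandpass=f=1000:width_type=h:w=2700' single.wav
```

The cascade gives steeper skirts (two second-order sections instead of one), which is why the outputs are similar but, as durandal_1707 says, not identical.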
[23:10:56 CEST] <leif> Okay, here's an odd one, is there any reasn that the lowpass filter might be converting its s16 input into s16p output?
[23:18:39 CEST] <none2give> hey everyone. i just have some questions about bundling an ffmpeg executable with my program
[23:19:49 CEST] <none2give> i've built the executable myself but bundling the source adds about an extra 100 MB to my program which is undesirable. can i just point to the git repository for the specific version of the code i built from?
[23:20:10 CEST] <none2give> the last thing i want to do is end up in the hall of shame lol
[23:20:29 CEST] <JEEB> test out how well it compresses with something like 7z (lzma)
[23:20:36 CEST] <JEEB> that might make it bearable
[23:20:55 CEST] <JEEB> if not, then just have a github/gitlab fork which tags the exact revisions you use
[23:22:59 CEST] <none2give> so i can just fork the source and point to that?
[23:23:11 CEST] <none2give> my concern is entirely with licensing
[23:24:43 CEST] <JEEB> in both LGPL and GPL you are to provide the exact source code of the FFmpeg (and possible dependencies) you used for each binary. so the safest if the archive is not too big is to bundle. otherwise having a link to your own thing with the things tagged exactly should be good enough.
[23:25:02 CEST] <JEEB> for GPL that's not limited to FFmpeg or its dependencies
[23:25:20 CEST] <none2give> right, i only compiled FFmpeg without any extra codecs or anything
[23:25:30 CEST] <none2give> so i think everything is all clear to distribute under LGPL
[23:26:57 CEST] <JEEB> the configure step requires you to specifically note if you want version 3 of (L)GPL, or if you want LGPL->GPL
[23:27:05 CEST] <JEEB> with --enable-gpl or --enable-version3
[23:27:19 CEST] <JEEB> v3 mostly has things related to DRM
[23:27:34 CEST] <JEEB> (because it was created back in the day as a reaction to the tivo boxes)
[23:28:27 CEST] <none2give> i didn't use either of those flags
[23:29:04 CEST] <JEEB> then FFmpeg itself at least is LGPL.
[23:30:14 CEST] <none2give> i also compiled under MinGW32 but i hid all binaries that ffmpeg might be looking to include like bz, iconv etc
[23:30:52 CEST] <JEEB> I recommend --disable-autodetect for some of those purposes
[23:31:04 CEST] <none2give> ah hell i didn't even know that existed
[23:31:08 CEST] <none2give> i just went thru and renamed all of them hahaha
[23:31:29 CEST] <JEEB> well, better be sure I guess :P I would have used clear sysroots :D
[23:32:19 CEST] <none2give> i'm really new to all of this
[23:32:44 CEST] <none2give> i develop primarily on windows and primarily in visual basic and php so makefiles under a linux subsystem is like an entirely other universe
[23:34:07 CEST] <none2give> basically the purpose for using FFmpeg in my context is: using PHP to generate an error message which is written onto an image, then FFmpeg encodes a TS segment out of the image and all that gets sent over to a Roku via HLS
[23:34:25 CEST] <JEEB> funky
[23:34:43 CEST] <none2give> i'm basically writing a server side application to serve up streams via a playlist and spit out error messages if something goes wrong
[23:35:08 CEST] <none2give> eliminating the need to develop an app for the TV
[23:35:32 CEST] <none2give> and also allowing you to have predefined channels and swap out your streams without touching the TV if one dies or something
[23:35:52 CEST] <none2give> it's fairly nifty but encoding a TS stream is vital for the user experience so that's why i'm in this licensing debacle
[23:36:02 CEST] <none2give> i'm also bundling apache, php and sqlite so i need to figure that mess out lol
[23:36:15 CEST] <JEEB> but wait, if you're making HLS aren't you needing some H.264 encoder?
[23:36:23 CEST] <JEEB> or do you have the actual clips pre-encoded and you're just muxing?
[23:36:55 CEST] <none2give> you know, i was using h.264 originally but it turns out if i throw a bunch of mpeg2 files into a vod playlist that i'm generating, it plays just fine
[23:37:02 CEST] <JEEB> :D
[23:37:08 CEST] <JEEB> that said FFmpeg has support for openh264
[23:37:12 CEST] <none2give> since it's all generated dynamically i need quick quick load times
[23:37:14 CEST] <JEEB> which has a non-GPL license
[23:37:27 CEST] <none2give> mp2 is spitting out smaller files and takes less time to encode
[23:37:57 CEST] <none2give> i'm actually cheating and only generating one segment and serving it up for the length of the runtime
[23:38:02 CEST] <JEEB> sure, mpeg-2 video is most probably simpler, although not sure if it's faster than some really simplistic H.264 encoder (or with fast settings)
[23:38:29 CEST] <none2give> the files i'm generating right now are like 60 kb range
[23:38:39 CEST] <none2give> 960x540, 4 seconds long, 1fps
[23:38:50 CEST] <none2give> 1k bps
[23:38:54 CEST] <JEEB> yea, I mean a message isn't going to be requiring much bit rate at that frame rate
[23:39:09 CEST] <none2give> now my biggest question is how do i bring the bitrate even lower than that lol
[23:39:30 CEST] <none2give> ffmpeg is telling me i probably didn't mean to set it that low but i definitely did
[23:39:47 CEST] <BtbN> If you want low bitrate an ancient codec is probably not the way to go
[23:39:48 CEST] <JEEB> not sure how low you can get with MPEG-2 Video
[23:40:02 CEST] <none2give> okay, i see
[23:40:29 CEST] <none2give> what's the best trade off between quick encoding, small file size, and super low bit rate?
[23:40:40 CEST] <JEEB> with HLS you don't have too many alternatives :P
[23:40:54 CEST] <BtbN> Probably hevc with a hardware encoder.
[23:40:55 CEST] <none2give> so probably h.264 or h.265 or one of the mpegs
[23:41:13 CEST] <BtbN> super low bitrate isn't going to happen with them though
[23:41:19 CEST] <JEEB> also if you need to squeeze something really really low
[23:41:30 CEST] <none2give> i think my biggest priorities then are quick encoding time and small file size
[23:41:31 CEST] <JEEB> then doing the error messages realtime doesn't sound like a good idea
[23:41:57 CEST] <BtbN> small filesize and quick encoding are pretty much direct opposites
[23:42:04 CEST] <JEEB> and just make a single-GOP segment with pretty much all skips
[23:42:04 CEST] <none2give> it's just easier too because i can throw a PIN number on screen or something if i need my user to verify their display to their user account
[23:42:14 CEST] <BtbN> The smaller you squeeze it the more CPU time it takes to make it happen
[23:42:40 CEST] <none2give> that's half the reason i'm generating stuff dynamically, is that there's dynamic content involved
[23:43:25 CEST] <none2give> the other reason is that there's so many damn error codes for specific scenarios that i don't want to make them all manually, i can just add it to my switch statement if i need to create one for a specific case
[23:43:50 CEST] <none2give> BtbN that does make sense
[23:44:18 CEST] <none2give> if they're pretty much inversely proportional then i'm getting only compromise because the wait times between encoding and serving the file will equal out roughly?
[23:44:50 CEST] <BtbN> You'll have to experiment to find out how much quality you are willing to give up
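The error-slate segment under discussion might be generated like this sketch (filenames, bitrate, and duration follow the numbers mentioned above; the exact command was never shown):

```shell
# One 4-second, 1 fps still-image segment in MPEG-TS for HLS.
ffmpeg -loop 1 -framerate 1 -t 4 -i error.png \
  -c:v libx264 -tune stillimage -preset veryfast \
  -b:v 50k -maxrate 50k -bufsize 100k -pix_fmt yuv420p \
  -f mpegts segment.ts
```

-tune stillimage gets close to the mostly-skip stream JEEB suggests while staying inside HLS-friendly H.264.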
[23:44:51 CEST] <orev> i need to cut a file precisely on a specific frame. i tried to use the "copy" codec but i don't seem to have keyframes at the right location, so i need to do a re-encode
[23:45:16 CEST] <orev> do i need to re-encode audio as well to get the precision? audio is in aac
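orev's question got no answer in this log. A sketch of the usual approach: re-encode the video for frame accuracy while stream-copying the AAC audio, whose short packets cut independently of video keyframes (timestamps are placeholders):

```shell
# Frame-accurate cut: video re-encoded, audio copied.
ffmpeg -ss 00:01:23.400 -to 00:02:10.000 -i input.mp4 \
  -c:v libx264 -crf 18 -c:a copy precise_cut.mp4
```

Audio copy is accurate to one AAC frame (about 21 ms at 48 kHz); re-encode the audio too if that is not precise enough.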
[00:00:00 CEST] --- Thu Oct 4 2018