[Ffmpeg-devel-irc] ffmpeg.log.20190313

burek burek021 at gmail.com
Thu Mar 14 03:05:02 EET 2019


[00:00:20 CET] <cryptopsy> i know that but im not sure why you're telling me this?
[00:00:37 CET] <kepstin> that's probably the reason why you're getting the strange buffering issue
[00:00:52 CET] <cryptopsy> with grep the output comes separated by \n and i would actually like to change it to \r
[00:01:57 CET] <kepstin> at this point, I'd switch to a programming language other than bash, this type of text handling is really hard to get right in bash.
[00:02:36 CET] <cryptopsy> how would that help? the hard part is separating each iteration of ffmpeg's internal loop
[00:03:03 CET] <kepstin> if the only thing you need is to display the progress output to the user, why not just run ffmpeg with no output redirection so the user can see the progress output?
[00:03:06 CET] <cryptopsy> if i was going to use a prog lang and write my own loop that calculates time intervals it would be just as hard in any lang as in bash
[00:03:36 CET] <cryptopsy> only the frame numbers are important, separated by \r
[00:04:14 CET] <tdr> you can just pipe output to sed to change it
[00:04:26 CET] <cryptopsy> tdr: i mentioned why that didnt work
[00:04:35 CET] <cryptopsy> seems to be a side effect of multiple piping (to grep to sed)
[00:04:39 CET] <kepstin> or tr can switch the newline/carriage return thing
[00:04:52 CET] <kepstin> every command you add to the pipe adds another buffer of course, so :/
[00:05:28 CET] <cryptopsy> there might be a way to run multiple greps in a single grep statement
[00:05:47 CET] <cryptopsy> multiple -e
[00:05:51 CET] <cryptopsy> attempting now ...
[00:05:56 CET] <kepstin> (doing this in a programming language where you can read from the ffmpeg output directly, then work with string processing locally seems like a much better option...)
[00:06:29 CET] <cryptopsy> a man's gotta stick to his guns ..
[00:06:32 CET] <TheAMM> ffmpeg = subprocess.Popen(cmd, universal_newlines=True, stderr=subprocess.PIPE)
[00:06:47 CET] <TheAMM> for line in ffmpeg.stderr:
[00:06:47 CET] <TheAMM>     print(line.strip())
[00:06:52 CET] <kepstin> but yeah, i still don't have any idea why exactly you need this info, so I can't provide any feedback other than to say "ffmpeg doesn't provide the data you're asking for in a processing-friendly way", and "doing this in bash is hard"
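The pipeline kepstin and tdr are describing can be sketched in shell: ffmpeg's per-frame stats go to stderr separated by \r, so translating \r to \n first lets grep see one update per line, and --line-buffered keeps grep from holding output back. The ffmpeg invocation is shown only as a comment; the executable part simulates ffmpeg's stderr with printf so the text processing itself can be seen working.

```shell
# The real pipeline would be (ffmpeg invocation illustrative, not run here):
#   ffmpeg -i in.mkv out.mkv 2>&1 | tr '\r' '\n' | grep --line-buffered -oE 'frame= *[0-9]+'
# Simulate ffmpeg's \r-separated progress output with printf:
printf 'frame=  10 fps=25 q=28.0 size=1kB\rframe=  20 fps=25 q=28.0 size=2kB\r' \
  | tr '\r' '\n' \
  | grep -oE 'frame= *[0-9]+' \
  | tr -dc '0-9\n'
# prints the bare frame numbers, one per line: 10, then 20
```

Every extra stage in the pipe still adds a buffer, as kepstin notes, which is why --line-buffered (and stdbuf for other tools) matters on live output.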
[00:07:04 CET] <cryptopsy> i'm only fluent in C, i'm not prepared to do this in C
[00:07:12 CET] <cryptopsy> oh, and bash :D
[00:07:30 CET] <kepstin> and yeah, for this type of text processing, i'd use python too :)
[00:07:42 CET] <kepstin> but first, i'd just not do it at all if i didn't have to
[00:07:45 CET] <TheAMM> (I haven't tested if subprocess gives you a legit textio to allow iterating lines like that but I expect it to)
[00:07:49 CET] <kepstin> and i still don't know why you want to do it
[00:09:55 CET] <kepstin> (15 years ago, I would have used perl instead, but i've just about forgotten how to program in perl nowadays)
[00:10:22 CET] <TheAMM> (and it was for the better)
[00:55:15 CET] <ossifrage> My arm build is not working: "[NULL @ 0x23240] Requested output format 'mp4' is not a suitable output format"
[00:56:06 CET] <ossifrage> it seems like all the muxers were built
[00:58:02 CET] <pink_mist> are you perhaps trying to put things into mp4 that don't belong in mp4?
[00:58:21 CET] <ossifrage> pink_mist, so it works with my x86_64/fedora build
[00:58:47 CET] <ossifrage> just not my arm32/buildroot build
[00:59:59 CET] <ossifrage> It is failing really early in avformat_alloc_output_context2()
[01:01:10 CET] <ossifrage> I'm assuming this is a build problem, but the trial and error loop is kinda slow
[01:06:04 CET] <ossifrage> It doesn't help that the host has ffmpeg-4.0.3 and the br is only 3.4.5
[01:29:53 CET] <ossifrage> So the libavformat seems to have ff_mp4_muxer, any ideas what else would be needed for avformat_alloc_output_context2(&mux->fmtCtx, NULL, "mp4", NULL) to be happy?
[01:34:24 CET] <ossifrage> Adding av_log_set_level(AV_LOG_TRACE) didn't provide anything extra
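One quick sanity check before comparing library symbols (a sketch; it assumes the cross-built ffmpeg CLI binary is available on the target) is whether the binary itself lists the muxer. The grep is demonstrated here on a printf stand-in for `ffmpeg -muxers` output:

```shell
# On the target you would run (not executed here):
#   ./ffmpeg -hide_banner -muxers | grep -w mp4
# Each registered muxer prints as "  E name  description"; simulated output:
printf '  E mov             QuickTime / MOV\n  E mp4             MP4 (MPEG-4 Part 14)\n' \
  | grep -cw mp4
# a count of 0 would mean the muxer was never registered in this build
```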
[01:47:09 CET] <relaxed> ossifrage: maybe my arm builds will work? https://www.johnvansickle.com/ffmpeg/
[01:56:05 CET] <ossifrage> relaxed, I'm in the process of building the same version as I have on my fedora devel box
[01:56:19 CET] <JEEB> ossifrage: clearly something's missing as it says the output format is not there :/
[01:56:55 CET] <ossifrage> JEEB, yeah, I can see the symbols and the mime types in the library
[01:57:57 CET] <JEEB> that message comes from libavformat/mux.c right within avformat_alloc_output_context2. if you don't have a format set, then it checks the string with av_guess_format(string, NULL, NULL)
[01:58:01 CET] <ossifrage> Should I build 4.0.3 or go for 4.1.1? (fedora29 has 4.0.3, the arm box was 3.4.5)
[01:58:01 CET] <JEEB> and that seems to return nullptr
[01:58:45 CET] <ossifrage> The "NULL @" did seem odd, I'm not sure what it is supposed to be?
[01:58:49 CET] <JEEB> I would just pick a version from master that is green on FATE for the systems and architectures you require, but if you go for release the newer is better
[01:59:08 CET] <JEEB> the NULL in the message is just that it doesn't yet have a format picked
[01:59:15 CET] <JEEB> it's the av_log first parameter
[01:59:27 CET] <JEEB> it pushes the just created AVFormatContext there
[01:59:30 CET] <pink_mist> what's FATE?
[01:59:34 CET] <JEEB> fate.ffmpeg.org
[01:59:36 CET] <JEEB> automated testing
[01:59:38 CET] <ossifrage> Oh, I never called av_log()
[01:59:46 CET] <JEEB> no, the framework does
[01:59:49 CET] <ossifrage> err, never mind
[02:00:12 CET] <JEEB> anyways, you can try emulating the stuff with av_guess_format :P
[02:00:16 CET] <pink_mist> well, can't seem to connect to fate.ffmpeg.org
[02:00:30 CET] <pink_mist> no wait, it was just very very slow
[02:00:40 CET] <JEEB> yea, it seemed lately to be rather slow for one reason or another
[02:00:49 CET] <JEEB> also older versions of FFmpeg require you to register_all
[02:00:52 CET] <JEEB> while newer don't
[02:01:03 CET] <JEEB> so check that you're doing that if you need to support older versions
[02:01:23 CET] <JEEB> you will get a deprecation warning when building against newer code, but the functions are there
[02:01:40 CET] <JEEB> av_register_all() is the exact function it seems
[02:02:35 CET] <JEEB> that registers all formats, protocols etc (in newer versions it's static because the dynamic registration didn't work anyways due to requiring internal structures)
[02:02:46 CET] <JEEB> so if we make a new dynamic registration thing, it will be better designed :P
[02:05:28 CET] <ossifrage> JEEB, using av_guess_format() also failed, but I didn't get the 'Requested output format..." message
[02:05:57 CET] <JEEB> it doesn't of course, I just noted that you can use that as a simple test case :P
[02:06:06 CET] <JEEB> since that's what's internally failing in the function you were calling
[02:06:19 CET] <JEEB> which sounds like you didn't call av_register_all in the older FFmpeg version
[02:06:24 CET] <JEEB> please make sure you're doing that :)
[02:06:39 CET] <ossifrage> the joys of older versions, bah
[02:07:19 CET] <ossifrage> JEEB, winner winner!
[02:08:21 CET] <JEEB> and you can call that in the newer versions still, it will just warn during compilation
[02:13:48 CET] <ossifrage> Application provided duration: 6413133197 / timestamp: 6413333310 is out of range for mov/mp4 format
[02:13:56 CET] <ossifrage> I didn't provide a duration, hmmm...
[02:14:46 CET] <JEEB> check time base, dts and pts given in AVPackets
[02:25:41 CET] <ossifrage> Well it is doing something, but it only calls my avio handler twice (36 bytes and 689 bytes) and then nothing for the frames.
[02:26:17 CET] <ossifrage> I really need to sync up the versions so I can compare intel to arms
[02:28:18 CET] <ossifrage> I accidentally fed it some h.265 and it didn't complain so something isn't right
[02:29:42 CET] <JEEB> I don't think the muxer expects such cases and as long as the data looks correct enough it will stick it in
[02:30:36 CET] <ossifrage> This version doesn't have AV_PKT_FLAG_KEY or AV_PKT_FLAG_DISPOSABLE which I was setting in the x86_64 version
[10:10:33 CET] <sazawal> Hello. I want to convert a video clip into png images and get the associated timestamp. I tried this command, "ffmpeg -i input.mkv -vf "showinfo" -vsync 0 frame%04d.png -hide_banner >& output.txt". But sometimes the number of frames generated are one more than the number of timestamps I get. How do I solve this?
[10:44:58 CET] <rnmhdn> I need to convert 150h of video that was exported from Adobe Premiere
[10:45:34 CET] <rnmhdn> I need to write them into DVD and USB stick and distribute them.
[10:46:21 CET] <rnmhdn> I did a lot of research about how to do it but I can't get confident that I'm doing it well enough
[10:46:41 CET] <rnmhdn> I also need a dash version of them.
[10:47:26 CET] <rnmhdn> for example I see that there is a technique of downsampling the colors and assigning good bitrate to luminance
[10:48:06 CET] <rnmhdn> basically I want to learn the common practices in encoding videos for distribution to users.
[10:48:17 CET] <rnmhdn> my current videos are h264
[10:49:11 CET] <rnmhdn> for example this is one of the commands I used: ffmpeg  -i N1.1.mp4 -codec:v libx264 -x264opts "keyint=24:min-keyint=24:no-scenecut" -profile:v baseline -level 4.0 -vf "scale=-2:1080" N1.1_1080.mp4
[10:50:19 CET] <sazawal> rnmhdn, I normally use Handbrake to downsize videos. It is basically ffmpeg but with a gui which makes it easier to use.
[10:52:00 CET] <rnmhdn> for example this is ffmpeg -i for one of my videos:
[10:55:30 CET] <rnmhdn> https://paste.ubuntu.com/p/v7GZFNYXxw/
[10:59:17 CET] <rnmhdn> https://paste.ubuntu.com/p/45t5KSgQ6h/
[11:13:21 CET] <another> any specific reason why you use baseline profile?
[11:40:57 CET] <another> hmm.. what is the difference between all those isobmff muxers (mp4, ipod, mov, psp, ...)?
[11:54:28 CET] <JEEB> another: various sub-types of mov or isobmff added at some point of time
[11:55:08 CET] <JEEB> some of them are probably no longer as needed (f.ex. the PSP one probably is unneeded now since I've always used normal MP4 files after some 3.xx firmware)
[12:06:52 CET] <another> iC
[12:25:46 CET] <sazawal> Hello. I want to convert a video clip into png images and get the associated timestamp. I tried this command, "ffmpeg -i input.mkv -vf "showinfo" -vsync 0 frame%04d.png -hide_banner >& output.txt". But sometimes the number of frames generated are one more than the number of timestamps I get. How do I solve this?
[13:05:15 CET] <BtbN> sazawal, pretty sure that just gives the frames numbers, strictly counting up?
[13:05:54 CET] <BtbN> The img2 muxer has an option called frame_pts, have you tried that?
[13:11:26 CET] <sazawal> BtbN, Now I looked carefully at the output.txt. I actually missed looking at frame0001.png, which is at pts_time=0. Okay, so I have the same number of frames and timestamps.
[13:12:55 CET] <BtbN> https://www.ffmpeg.org/ffmpeg-formats.html#Options-6
[13:13:03 CET] <BtbN> frame_pts reads like it's exactly what you want
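A sketch of what BtbN is pointing at (command line not executed here; filenames are placeholders): with the image2 muxer's frame_pts option, the output filename pattern is expanded with each frame's PTS rather than a running counter, so the timestamp is recoverable from the name alone.

```shell
ffmpeg -i input.mkv -vsync 0 -frame_pts 1 'frame%d.png'
# with -frame_pts 1, %d expands to the frame's PTS instead of 1,2,3,...
```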
[13:13:44 CET] <sazawal> BtbN, I have another question. I have split a video file into different clips at all the keyframes. Now for the first clip, the first frame is captured at pts_time=0.021. But for the rest of the clips the first frame is captured at pts_time=0. Why is this happening?
[13:14:08 CET] <BtbN> How did you split it? Segment muxer?
[13:15:22 CET] <sazawal> BtbN, I have used "ffmpeg -i input.mkv -acodec copy -f segment -vcodec copy -reset_timestamps 1 -map 0 clip%d.mkv"
[13:16:01 CET] <BtbN> reset_timestamps: "Reset timestamps at the beginning of each segment, so that each segment will start with near-zero timestamps."
[13:16:04 CET] <BtbN> That'd be why
[13:17:17 CET] <sazawal> BtbN, I see. But why does the first clip start at 0.021? I mean, what is happening to the frames before this time?
[13:17:28 CET] <BtbN> There might not be any
[13:17:40 CET] <BtbN> The first timestamp does not have to be 0
[13:18:00 CET] <sazawal> oh I see.
[13:19:05 CET] <BtbN> though with how the logic is implemented, it does seem a bit strange how it can start at anything but 0
[13:20:00 CET] <BtbN> What it does is pts += initial_offset - cur_entry.start_pts
[13:20:33 CET] <sazawal> Another question. If I want the timestamp of a frame in clip-3 for example. Then I would do pts_time (of this frame)+pts_time (of the last frame of clip1)+pts_time (of the last frame of clip-2). Is this correct?
[13:21:05 CET] <BtbN> I don't think so, no
[13:21:16 CET] <BtbN> That would give you a crazy high number potentially
[13:21:29 CET] <sazawal> BtbN, I see. So, what would be a way to find the actual timestamp?
[13:21:49 CET] <BtbN> What do you need them for? They are not exactly carrying much information other than how to play the frames relative to each other.
[13:22:47 CET] <sazawal> BtbN, I am writing a script to generate subtitles.
[13:23:28 CET] <sazawal> After I process a frame in clip-n, for example, and extract the subtitles, I would want the timing of this frame in the original video.
[13:23:30 CET] <BtbN> With how the segment muxer does the reset timestamp thing, I don't think the original timestamp is fully recoverable
[13:24:03 CET] <sazawal> BtbN, What if I don't use reset-timestamp flag?
[13:24:16 CET] <BtbN> Then you will have the original timestamps in each segment
[13:24:51 CET] <BtbN> If your software can work with that, that's probably the easiest way
[13:25:04 CET] <sazawal> BtbN, Then it is easier, right? I don't need the separate clip timestamps anyway.
[13:25:18 CET] <sazawal> Yes I think so.
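For reference, the same split without the reset (an untested sketch reusing the filenames from the command above), so each clip keeps the source's original timestamps:

```shell
ffmpeg -i input.mkv -map 0 -acodec copy -vcodec copy -f segment clip%d.mkv
# identical to the earlier command, minus -reset_timestamps 1
```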
[13:36:50 CET] <sazawal> BtbN, Sorry I did not understand what you said about frame_pts. I don't see frame_pts in the output.txt file.
[13:37:01 CET] <BtbN> It's an image2 muxer option
[13:37:17 CET] <BtbN> see the link I pasted
[13:41:07 CET] <sazawal> Okay, so this is just for naming the image files.
[14:01:29 CET] <jksamir> Hi! In the extract_mvs example, I was wondering if I could get the reference frames numbers from which the motion vectors were calculated (instead of the +/- 1 direction)
[15:41:20 CET] <_Vi> How do I read output of `silencedetect` filter specified in FFmpeg command line without "size=  111830kB time=00:09:56.42 bitrate=1536.0kbits/s speed= 314x" statistics interfering?
[15:41:47 CET] <_Vi> Probably found myself: "-nostats"
[16:23:39 CET] <CyberShadow> _Vi: Looks like the filter sets metadata, which can be written to a file by itself using https://ffmpeg.org/ffmpeg-filters.html#metadata_002c-ametadata (mode=print and file=yourfile.txt)
[16:24:01 CET] <CyberShadow> Oh, there's even an example with silencedetect there.
[16:30:16 CET] <faLUCE> hello. After executing:  ffplay -f v4l2 -input_format mjpeg -video_size 640x480 -framerate 25 -i /dev/video0    I can see the video, but I have a lot of these msgs: "[mjpeg @ 0x7efda4003b80] unable to decode APP fields: Invalid data found when processing input".  Is this a ffmpeg bug?
[16:47:19 CET] <USian_nogoodnick> i have a script with a few lines like this: https://pastebin.com/pthJTu2Q and they recently quit recording usable video. i suspect a recent ffmpeg update. does anyone know what changed or what i can do to fix this issue?
[17:03:07 CET] <brimestone> I'm trying to do this. ffmpeg -i 40fps_speedUp.mp4 -f lavfi -i "sine=frequency=40" -filter_complex  "[0:a][1:a]amerge=inputs=2[a]"  -map 0:v -map "[a]" -codec copy 40fps_speedUp_40hz.mp4 but I'm getting "Filtering and streamcopy cannot be used together" - how would I make this work?
[17:12:45 CET] <BungeeTheCookie> hello
[17:12:49 CET] <BungeeTheCookie> I'm having a problem with ffprobe
[17:13:54 CET] <BungeeTheCookie> I run this command
[17:13:57 CET] <BungeeTheCookie>  /Users/me/Desktop/audioextract/dist/ffmpeg/ffprobe -show_streams file:/Users/me/Desktop/test.webm
[17:14:38 CET] <BungeeTheCookie> and I get this error
[17:14:39 CET] <BungeeTheCookie> https://hastebin.com/hiwejeleyu.js
[17:14:40 CET] <BungeeTheCookie> help
[17:16:31 CET] <kepstin> BungeeTheCookie: where did you get this ffmpeg binary from? looks like it's missing some dependencies
[17:19:59 CET] <BungeeTheCookie> okay you're right I redownloaded it and now it works :)
[17:22:55 CET] <BungeeTheCookie> also kepstin where can I find ffmpeg binaries without mp4 / GPL enabled
[17:24:36 CET] <kepstin> I don't know of any lgpl builds for mac os x offhand. you might have to build it yourself.
[17:25:11 CET] <kepstin> (also, i assume you're referring to libx264 for the h264 encoder? mp4 container and h264 decoding are available in an LGPL build)
[18:15:59 CET] <_Vi> CyberShadow, Thanks for the suggestion.
[18:18:36 CET] <rnmhdn> another: no, I don't really know what I'm doing:))
[18:21:25 CET] <kepstin> brimestone: do you want to copy the video? if so, change "-codec copy" to "-codec:v copy", that way it's *only* copying the video and can encode the audio.
[18:22:12 CET] <brimestone> thanks let me check it.
[18:57:51 CET] <BLZbubba> hi guys what is the easiest way to make a video from a single image file, e.g.: https://www.youtube.com/watch?v=3ZsNRKHWqWk
[18:59:19 CET] <brimestone> kepstin: It finished without any errors, but I can't hear the 40Hz tone either left or right audio
[19:00:58 CET] <another> rnmhdn: i suggest the wiki articles https://trac.ffmpeg.org/wiki/Encode/H.264 https://trac.ffmpeg.org/wiki/Encode/HighQualityAudio
[19:02:20 CET] <kepstin> brimestone: was your input stereo? if so, amerge would have put the sine into the 3rd audio channel, which might not work for you.
[19:02:29 CET] <rnmhdn> I want to write videos into dvd
[19:02:48 CET] <rnmhdn> I want to have the least size with relatively good quality:D
[19:02:50 CET] <kepstin> brimestone: you might want to use the 'amix' filter instead, if your goal is to have both the original audio and the sine wave mixed together
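A possible rework of brimestone's command along kepstin's suggestion (an untested sketch): amix sums the two inputs into one stream instead of stacking channels the way amerge does, and duration=shortest stops the mix when the video's own audio ends, since the lavfi sine source is infinite.

```shell
ffmpeg -i 40fps_speedUp.mp4 -f lavfi -i "sine=frequency=40" \
  -filter_complex "[0:a][1:a]amix=inputs=2:duration=shortest[a]" \
  -map 0:v -map "[a]" -c:v copy 40fps_speedUp_40hz.mp4
```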
[19:02:57 CET] <rnmhdn> I don't need some 4k or something
[19:03:10 CET] <kepstin> rnmhdn: do you want something playable in a hardware dvd player?
[19:03:18 CET] <rnmhdn> something that looks not bad is fine, the quality I get in 1080 movies is fine.
[19:03:24 CET] <rnmhdn> kepstin: no.
[19:03:41 CET] <rnmhdn> I also want to stream dash
[19:03:49 CET] <rnmhdn> actually my main problem is the dash streaming
[19:04:12 CET] <kepstin> rnmhdn: right then, to fit a video onto a dvd, just encode it with libx264 in 2-pass mode with a bitrate calculated to give the desired target size.
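The bitrate arithmetic behind that advice can be sketched in shell. The numbers are an assumed example (a single-layer DVD of roughly 4.3 GB holding 3 hours of video, with 128 kbit/s reserved for audio), and the two-pass ffmpeg invocations in the comments are illustrative only:

```shell
# target video bitrate = (total size in kbit / duration in s) - audio bitrate
size_kbit=$((4300 * 8 * 1000))   # ~4.3 GB expressed in kbit (decimal units)
duration_s=10800                 # 3 hours
audio_kbit_s=128
video_kbit_s=$(( size_kbit / duration_s - audio_kbit_s ))
echo "target video bitrate: ${video_kbit_s}k"
# then, roughly (filenames hypothetical):
#   ffmpeg -i in.mp4 -c:v libx264 -preset slow -b:v ${video_kbit_s}k -pass 1 -an -f null /dev/null
#   ffmpeg -i in.mp4 -c:v libx264 -preset slow -b:v ${video_kbit_s}k -pass 2 -c:a aac -b:a 128k out.mp4
```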
[19:05:40 CET] <rnmhdn> for example I have a 3h movie which is 1GB
[19:06:01 CET] <rnmhdn> can I achieve the same ratio of quality and size on a dash video?
[19:07:47 CET] <rnmhdn> this is ffmpeg -i for that file: https://termbin.com/5hgx
[19:08:54 CET] <_Vi> BLZbubba, Example making single-frame 10-second video: ffmpeg -safe 0 -f concat -r 0.1 -i <(echo file pic.png; echo file pic.png) -pix_fmt yuv420p -c libx264 -y output.mkv
[19:09:11 CET] <BLZbubba> _Vi: perfect thanks!
[19:09:48 CET] <_Vi> BLZbubba, Note that the lack of intermediate frames sometimes breaks other tools and players. FPS=0.1 is extreme, typical FPS is 15 to 60.
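A more conventional sketch of the same thing (untested; the filenames and the 10-second duration are placeholders) uses the image demuxer's -loop option at a normal frame rate, which avoids the player problems _Vi mentions:

```shell
ffmpeg -loop 1 -i pic.png -t 10 -r 15 -pix_fmt yuv420p -c:v libx264 output.mkv
# -loop 1 repeats the single input image; -t bounds the output duration
```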
[19:10:10 CET] <kepstin> rnmhdn: depends what exactly your goal is with the dash streaming. in some cases for streaming - in order to be able to seek, to minimize bursts of high required bitrate, and to allow quality switching, you will want a shorter keyframe interval or even fixed keyframes
[19:10:26 CET] <kepstin> rnmhdn: and setting shorter keyframe interval does reduce the quality per bitrate ratio.
[19:11:21 CET] <kepstin> rnmhdn: if bandwidth/buffering isn't a concern and you don't need quality switching, you could probably get away with just copying the encoded video data from your original file, and so it's the exact same quality/size
[19:12:08 CET] <rnmhdn> bandwidth is not a concern
[19:12:16 CET] <rnmhdn> the only problem is the size it takes on my server
[19:12:49 CET] <rnmhdn> for example this is ffmpeg -i for one of my videos:
[19:12:56 CET] <rnmhdn> https://paste.ubuntu.com/p/v7GZFNYXxw/
[19:13:02 CET] <rnmhdn> https://paste.ubuntu.com/p/45t5KSgQ6h/
[19:13:57 CET] <kepstin> alright, so you've got some 10-15mbit/s 1080p videos...
[19:15:05 CET] <kepstin> no way to tell the encoder settings those were made with; it's possible that you could re-encode them with slow x264 settings to make them slightly smaller without visibly reduced quality... but you'd have to try and see.
[19:19:11 CET] <rnmhdn> no
[19:19:13 CET] <rnmhdn> lol
[19:19:16 CET] <rnmhdn> not at all
[19:19:19 CET] <another> that termbin link has mono audio. urg..
[19:19:25 CET] <rnmhdn> my videos are super large
[19:19:30 CET] <another> how can you live with that ;)
[19:19:58 CET] <rnmhdn> eg
[19:20:04 CET] <rnmhdn> this one I send you is 3.4GB
[19:20:17 CET] <rnmhdn> and it's only 40MB
[19:21:29 CET] <kepstin> rnmhdn: so, what's your goal here?
[19:22:15 CET] <kepstin> do you have a particular target size for the videos? or just want them to be "as small as possible" at a target quality?
[19:22:21 CET] <rnmhdn> I want to achieve a ratio like movies that I watch
[19:22:29 CET] <rnmhdn> good quality 3GB 3h
[19:22:38 CET] <rnmhdn> as small
[19:22:54 CET] <rnmhdn> and when I do research I see a lot of ideas and things but I don't know if I'm using them or not
[19:23:12 CET] <kepstin> well, there's two options: you can either pick a final size, and tell the encoder to make the quality as good as possible for that size, or you can pick a final quality, but then the size will be whatever the encoder gives you.
[19:23:14 CET] <rnmhdn> eg down sampling colors and assigning high bitrate for luminance
[19:23:23 CET] <kepstin> rnmhdn: probably read through https://trac.ffmpeg.org/wiki/Encode/H.264 to start with
[19:24:12 CET] <rnmhdn> I will
[19:24:17 CET] <kepstin> rnmhdn: none of your videos are hdr, so that doesn't matter. your videos are already yuv420p, so don't bother changing any subsampling or anything
[19:24:34 CET] <rnmhdn> but If you know some good options that I could use in adobe primier pro, ffmpeg, handbrake
[19:24:40 CET] <rnmhdn> I would really appreciate it
[19:24:51 CET] <rnmhdn> https://termbin.com/jcrs
[19:25:07 CET] <rnmhdn> this is the info for my 3.4GB 36minute video
[19:25:08 CET] <kepstin> rnmhdn: so in the end, the recommendation is basically "encode your videos with libx264 using the slowest setting for -preset that you can stand, and with a -crf setting that gives a quality you're ok with"
[19:25:10 CET] <kepstin> that's it.
[19:25:33 CET] <rnmhdn> can you elaborate on slow preset?
[19:25:39 CET] <rnmhdn> what does it mean?
[19:25:41 CET] <kepstin> rnmhdn: read the wiki doc i linked
[19:26:10 CET] <rnmhdn> and then I simply convert that to dash with mp4box or somesuch?
[19:26:14 CET] <rnmhdn> the dash part doesn't matter?
[19:26:44 CET] <rnmhdn> because when I converted some of these videos to dash and played them on my local browser they were not so good
[19:26:50 CET] <kepstin> depends on the requirements of your dash player and network, which you haven't specified
[19:27:02 CET] <rnmhdn> I'm using dashjs
[19:27:13 CET] <rnmhdn> I'm open to anything opensource for the dash part
[19:27:18 CET] <kepstin> note that if you're not doing quality switching or anything, there's no point in dash - just encode to an mp4 file with "-movflags faststart"
[19:27:25 CET] <rnmhdn> and my bandwidth is very very good
[19:27:43 CET] <rnmhdn> but the space I have for storing the videos is very expensive
[19:27:57 CET] <rnmhdn> so I want them to be as small as possible
[19:28:30 CET] <rnmhdn> I don't need quality switching
[19:28:54 CET] <kepstin> well, bandwidth and space are the same thing. so if space is expensive, you need to use less bandwidth.
[19:28:56 CET] <kepstin> The other part of the equation is time - how much time can you spend encoding videos and how fast is your cpu.
[19:28:57 CET] <rnmhdn> but I still need dash because of other things like inserting something or removing something from the video
[19:29:31 CET] <rnmhdn> kepstin: I mean my harddisk on my server is limited
[19:29:37 CET] <rnmhdn> but the bandwidth of the server is very high
[19:30:02 CET] <rnmhdn> as for the cpu part. I don't actually know. right now I have about 150GB of video.
[19:30:02 CET] <kepstin> file size = bitrate * length
[19:30:11 CET] <rnmhdn> I need to get it done in 3-4 days
[19:30:14 CET] <kepstin> so for higher bitrate (= higher bandwidth), you get higher size
[19:30:16 CET] <rnmhdn> I could rent rigs if needed
[19:30:48 CET] <kepstin> anyways, start playing around with x264 options to find something that fits your need. there's no "one size fits all" thing, because every use case is different
[19:31:25 CET] <kepstin> x264 is super easy to use, the only options you need to set is -preset (to adjust how much time to spend encoding video) and either -crf (to adjust target quality) or bitrate options (if you want to target a specific file size)
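Putting kepstin's two knobs together, a typical constant-quality invocation looks something like this (an untested sketch with placeholder filenames; raise -crf for smaller files, lower it for higher quality, and slow the -preset for better compression at the same quality):

```shell
ffmpeg -i in.mp4 -c:v libx264 -preset slow -crf 21 -c:a copy out.mp4
```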
[19:58:07 CET] <ober> are there any ffmpeg related tools that can do video stabilization?
[19:59:36 CET] <another> https://ffmpeg.org/ffmpeg-all.html#vidstabdetect-1
[19:59:53 CET] <ober> ty
[21:23:50 CET] Action: kylemson slaps arps around a bit with a large trout
[21:25:25 CET] <kylemson> Hi All, I'm trying to watermark a video and this works fine "ffmpeg -y -i /tmp/videofile.mp4 -i /tmp/pngfile.png -filter_complex "overlay=x=(main_w-overlay_w)/2:y=(main_h-overlay_h)/2" -strict -2 /tmp/videofile2.mp4", but i need to tone down the opacity of the watermark, how can i adjust this command to do this?
[21:34:03 CET] <rnmhdn> kepstin: thank you for your advice :) I will look into it tomorrow :) sorry I was so busy I had to go.
[21:36:57 CET] <kepstin> kylemson: use imagemagick or gimp to reduce the opacity of the png first? :)
[21:38:25 CET] <arps> kepstin: the image already has a transparent background and consists of some white text...is it possible to reduce the white text opacity still?
[21:38:37 CET] <arps> kepstin: i was under the impression that there's just one transparency layer in a png
[21:39:02 CET] <kepstin> png images have an alpha channel, each pixel can have arbitrary amount of transparency set
[21:39:44 CET] <kepstin> there probably is a way to adjust this in ffmpeg, but i don't know which filter you'd need for it
[21:40:12 CET] <arps> well the png is always the same one anyway so it would be fine to do it your way... just need to figure out *how* :D
[21:40:32 CET] <arps> kepstin: I'm working with kylemson on this one so that's why i'm answering for him, he's a bit shy
[21:41:19 CET] <kepstin> it looks like you could use the colorlevels filter in ffmpeg to do this, by setting the "aomax" parameter to limit the opacity of the output.
[21:42:16 CET] <kepstin> so something like -filter_complex '[1:v]colorlevels=aomax=0.5[img];[0:v][img]overlay=<your options here>'
[21:42:22 CET] <kepstin> adjust the 0.5 to taste
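Assembled into kylemson's full command from earlier (untested; same paths and overlay placement as the original), that suggestion would look something like:

```shell
ffmpeg -y -i /tmp/videofile.mp4 -i /tmp/pngfile.png -filter_complex \
  "[1:v]colorlevels=aomax=0.5[wm];[0:v][wm]overlay=x=(main_w-overlay_w)/2:y=(main_h-overlay_h)/2" \
  -strict -2 /tmp/videofile2.mp4
```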
[21:42:35 CET] <arps> kepstin: we're trying imagemagick this second but will try that if it's a no go, much appreciated
[21:44:27 CET] <kepstin> you might also consider using the 'blend' filter rather than 'overlay' depending on the effect you're going for, blend supports some stuff like image editor layer modes.
[21:50:03 CET] <arps> kepstin: I think we have a winner with the first suggestion, thank you for your help with this
[00:00:00 CET] --- Thu Mar 14 2019


More information about the Ffmpeg-devel-irc mailing list