[Ffmpeg-devel-irc] ffmpeg.log.20160902

burek burek021 at gmail.com
Sat Sep 3 03:05:01 EEST 2016


[00:17:20 CEST] <hwk> hi, i'm streaming from vlc through a pipe and encoding aac adts with ffmpeg, with the output being piped out
[00:17:46 CEST] <hwk> but if i pause the stream pipe source (vlc), ffmpeg exits, sadly :(
[00:19:02 CEST] <hwk> right now the current setup is at 20% cpu usage (vlc + ffmpeg, 10% on 4 cores) on an arm cpu (mako device)
[00:20:11 CEST] <hwk> i tried a setup with vlc outputting to pulseaudio and using ffmpeg to encode from the audio card, which worked, but the cpu usage was like 70% for ffmpeg alone :(
[00:21:10 CEST] <hwk> btw the internal aac encoder eats way more cpu (double or more) than the "best non-free available one" -- on my arm cpu that makes a difference :)
[00:23:12 CEST] <hwk> a third solution i could try is enabling a pulseaudio null sink (that way no sound card is required), streaming to it with vlc and encoding from it with ffmpeg
[00:23:20 CEST] <hwk> would this reduce the cpu usage?
[00:24:30 CEST] <hwk> another option would be to use vlc to stream via pipe to ffmpeg and configure ffmpeg with a null audio source concatenated with the stdin from the vlc pipe -- would this keep ffmpeg alive and keep the cpu usage down?
[00:24:52 CEST] <hwk> https://trac.ffmpeg.org/wiki/Null  ->  anullsrc
[00:25:04 CEST] <hwk> any guidance?
[00:26:56 CEST] <hwk> anyhow i think using pulseaudio as an intermediary won't reduce the cpu usage, because another component (pulseaudio) is involved in the process -- which eats 10% cpu when in use -- and both vlc and ffmpeg need to do some "transcoding" back and forth
[00:27:16 CEST] <hwk> from the alsa format
[00:27:17 CEST] <JEEB> I think the internal aac encoder now has a faster mode of encoding
[00:27:25 CEST] <JEEB> if you need to minimize its CPU usage
[00:27:48 CEST] <hwk> didn't manage to configure it for cbr :|
[00:27:51 CEST] <JEEB> + you can always try to enable compiler optimizations, some of the stuff is by default disabled for some compilers
[00:28:06 CEST] <JEEB> but anyways, if you're using fdk-aac, that's a well-known and well-regarded encoder
[00:28:10 CEST] <hwk> i compiled ffmpeg on a mobile phone in a chroot, that's as good as it gets :)
[00:28:14 CEST] <JEEB> its only issue is that you can't distribute it :P
[00:28:34 CEST] <JEEB> (that's the meaning of the "nonfree" flag)
[00:29:05 CEST] <JEEB> due to the fraunhofer license being incompatible with FFmpeg's
[00:29:27 CEST] <hwk> can someone help with concatenating the pulse null sink and stdin?
[00:29:41 CEST] <JEEB> haven't used that stuff so nope
[00:29:55 CEST] <hwk> just to know if it's possible to pull off something like this
[00:30:16 CEST] <hwk> all this to have a proper aac stream to feed icecast
[00:30:32 CEST] <hwk> mp3 is easy
[00:31:42 CEST] <hwk> this would save me from enabling transcoding in vlc (just to make sure i have a valid stream output -- i noticed that vlc will try to stream all sorts of files :| )
[00:36:51 CEST] <hwk> can someone help me with an example where only audio is used, starting from this example http://pastebin.com/ekbrF9qN ?
[00:37:26 CEST] <hwk> or do both -map "[v]" and -map "[a]" need to be used no matter the context?
[00:46:59 CEST] <hwk> does this seem right? ffmpeg -i anullsrc=r=44100:cl=stereo -i - -t 1 -filter_complex "[0:a][1:a]concat=n=2:a=1[a]" -map "[a]" -ar 44100 -c:a libfdk_aac -b:a 128k -f adts - ?
[00:50:51 CEST] <hwk> anullsrc=r=44100:cl=stereo: No such file or directory :|
[00:51:53 CEST] <hwk> missing lavfi
[00:54:11 CEST] <hwk> Stream specifier ':a' in filtergraph description [0:a][1:a]concat=n=2:a=1[a] matches no streams.
[01:00:43 CEST] <relaxed> hwk: concat=n=2:v=0:a=1
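(For reference, hwk's attempt with relaxed's fix folded in would read as below -- an untested sketch; note anullsrc never ends, so concat would play silence forever and never reach the piped input, which is why amerge turns out to be the right tool just after:

    ffmpeg -f lavfi -i anullsrc=r=44100:cl=stereo -i - \
        -filter_complex "[0:a][1:a]concat=n=2:v=0:a=1[a]" \
        -map "[a]" -ar 44100 -c:a libfdk_aac -b:a 128k -f adts -
)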
[01:07:34 CEST] <hwk> 10x, amerge is what i needed in my case, this seems to work to an extent... /home/deluged/ffmpeg/bin/ffmpeg -v debug -err_detect ignore_err -f lavfi -i anullsrc=r=44100:cl=stereo -i - -filter_complex amerge -ar 44100 -c:a libfdk_aac -b:a 128k -f adts -
[01:16:12 CEST] <CFS-MP3> hwk you're passing anullsrc etc after a -i
[01:19:59 CEST] <kyleogrg> it looks like i can't input avisynth scripts into ffmpeg 64-bit.  is this right?
[01:20:26 CEST] <furq> not with a 32-bit avisynth
[01:23:34 CEST] <kyleogrg> okay.. i'll look into that
[01:24:55 CEST] <furq> i don't know if there are 64-bit binaries of avisynth-mt
[01:25:08 CEST] <furq> assuming this is for qtgmc
[01:44:10 CEST] <kyleogrg> furq: yes, it's for qtgmc
[01:44:55 CEST] <furq> the qtgmc wiki page claims there is only one supported version and it's a 32-bit binary hosted on dropbox
[01:45:14 CEST] <furq> which in my experience is a big fan of crashing after eight hours
[01:45:55 CEST] <kyleogrg> hmm, so i must use 32-bit ffmpeg.  is it actually slower?
[01:46:02 CEST] <furq> probably
[01:46:03 CEST] <kyleogrg> let's say to encode an uncompressed avi
[01:46:20 CEST] <furq> it's not going to be noticeably slower if you're running qtgmc at the same time
[01:46:38 CEST] <kyleogrg> makes sense
[01:46:44 CEST] <furq> i doubt it's noticeably slower anyway, but i've never benchmarked it
[01:46:51 CEST] <furq> apparently some encoders are better optimised on amd64
[03:25:13 CEST] <deweydb> http://termbin.com/pouw
[03:25:29 CEST] <deweydb> i'm trying to take a bunch of pictures and make a slideshow, by specifying the time each picture is displayed
[03:25:43 CEST] <deweydb> i'm following the instructions from here: https://en.wikibooks.org/wiki/FFMPEG_An_Intermediate_Guide/image_sequence
[03:25:57 CEST] <deweydb> but i get a bunch of errors in the output: 100 buffers queued in output stream 0:0, something may be wrong.
[03:26:04 CEST] <deweydb> and it takes a VERY long time.
[03:28:08 CEST] <deweydb> and what it produces is not correct. for example the above command, if i understand correctly, should display the first image for 8 seconds, then the second image for the next 53 seconds, and the last image for 48 seconds. but instead it creates a video lasting about 9 seconds: it displays the first image for 8 seconds, and the last two images flash in the last few milliseconds of the video
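(The usual recipe for per-image durations is the concat demuxer -- see the FFmpeg wiki's slideshow page -- rather than per-input -t flags; a minimal sketch with hypothetical filenames, where the last file is listed twice because the final duration directive is otherwise ignored:

    $ cat list.txt
    file 'img1.jpg'
    duration 8
    file 'img2.jpg'
    duration 53
    file 'img3.jpg'
    duration 48
    file 'img3.jpg'

    $ ffmpeg -f concat -i list.txt -vsync vfr -pix_fmt yuv420p slideshow.mp4
)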
[03:44:37 CEST] <madprops> hi, how can i convert from webm to webm using vp9 and the crf or constant bitrate i want?
[03:45:12 CEST] <madprops> or actually if it's possible
[03:45:15 CEST] <madprops> webm to mp4
[03:45:20 CEST] <madprops> and mp4 to mp4 as well
[03:45:46 CEST] <madprops> ive been using this to convert mp4 to mp4
[03:45:47 CEST] <madprops> ffmpeg -i 960.mp4 -c:v libx264 -crf 26 jellothere.mp4
[03:46:23 CEST] <deweydb> i'm pretty sure ffmpeg is smart about inputs
[03:46:33 CEST] <deweydb> you can likely drop other inputs into the above command
[03:46:36 CEST] <deweydb> and still get mp4
[03:46:43 CEST] <madprops> i see
[03:46:47 CEST] <deweydb> for example i just did this today to go from wmv to mp4
[03:46:56 CEST] <madprops> and is that a good command to get a good filesize/quality ratio?
[03:47:09 CEST] <deweydb> that's a personal preference thing. it depends on the application
[03:48:08 CEST] <deweydb> i think you can save some bytes and retain decent quality by going up to x265 instead of x264, but again, it depends on the application; not all players are gonna support x265
[03:48:23 CEST] <madprops> is it too new?
[03:48:38 CEST] <deweydb> basically, yes.
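(deweydb's x265 route as a sketch, assuming a hypothetical input.mp4 -- crf 28 is libx265's default and roughly matches x264's crf 23; audio is copied through:

    ffmpeg -i input.mp4 -c:v libx265 -crf 28 -c:a copy output.mp4
)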
[04:41:38 CEST] <hwk> the pulseaudio null sink and the fdk codec reduced the cpu usage
[04:41:55 CEST] <hwk> seems the best solution for now
[07:30:53 CEST] <Spring> so I just realized something
[07:31:28 CEST] <madprops> anyone knows if duration data is extracted correctly even on corrupted videos?
[07:31:30 CEST] <Spring> MSI Afterburner requires setting a framerate for the captures, regardless of what the in-game FPS is
[07:32:26 CEST] <Spring> I have this set to 60 FPS. I've noticed that PotPlayer will display the actual original FPS in parentheses when playing back the captured video
[07:33:28 CEST] <Spring> however upon re-encoding to reduce the filesize, both FPS values become 60 FPS. Is there a way to tell ffmpeg what the in-game FPS is so it can encode to it?
[07:34:51 CEST] <Spring> ex, original: https://a.uguu.se/cJycBgokwyir.png
[07:35:35 CEST] <Spring> re-encoded: https://a.uguu.se/DffYuNMZrKMR.png
[07:38:02 CEST] <Spring> the '22.8' value in the original screenshot above goes up to '30' when playing, I had it paused.
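(If the extra frames really are duplicates of the ~30 fps game output, one hedged approach is to drop them before re-encoding -- a sketch, assuming a hypothetical capture.mp4; mpdecimate drops near-duplicate frames and -vsync vfr keeps the resulting timestamps:

    ffmpeg -i capture.mp4 -vf mpdecimate -vsync vfr -c:v libx264 -crf 18 out.mp4
)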
[07:55:45 CEST] <redletterchannel> I was just reading up on video containers and video/audio formats. Since we have containers that can support overlapping formats, can we use ffmpeg to output only the raw audio/video streams without the container and then later wrap them in a container? Can wrapping in a container later be done quickly enough to, say, wrap dynamically?
[07:57:53 CEST] <furq> yes to the first question
[07:57:58 CEST] <furq> i'm not really sure what the second question is
[07:58:32 CEST] <Spring> redletterchannel, you could also just copy the audio/video stream to a new container and leave it as whatever container it was originally
[08:00:21 CEST] <furq> yeah
[08:00:40 CEST] <furq> it probably won't be significantly faster to avoid demuxing it, and it's much less annoying
[08:00:57 CEST] <furq> e.g. with raw h264 you need to specify the video size, framerate etc when muxing it
[08:01:17 CEST] <redletterchannel> furq: thanks, the second question is about video delivery. i was wondering if it is possible to deliver video dynamically on different containers while only having to encode into a single format.
[08:01:31 CEST] <furq> sure
[08:02:33 CEST] <furq> it's probably easier to not use any raw streams though
[08:03:06 CEST] <furq> `ffmpeg -i foo.mkv -c copy foo.mp4` will remux from mkv to mp4 without reencoding
[08:03:11 CEST] <redletterchannel> furq: I guess what you mean is similar to what spring said: leave it in a container format and just pull out the encoded video/audio ?
[08:03:15 CEST] <furq> right
[08:03:27 CEST] <furq> you can use raw streams but it's probably not worth the extra hassle
[08:03:56 CEST] <furq> remuxing is generally only limited by disk speed
[08:05:26 CEST] <redletterchannel> that makes sense, since the container is only a container I can always access the raw streams.
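(The raw-stream route furq describes, as an untested sketch with a hypothetical in.mkv carrying h264 -- note the framerate has to be restated when muxing the raw stream back into a container:

    ffmpeg -i in.mkv -c:v copy -an -f h264 video.h264
    ffmpeg -framerate 25 -i video.h264 -c copy out.mp4
)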
[08:17:23 CEST] <Spring> another way of phrasing what I'm wondering would be: if a video is in reality 30FPS but is captured at 60FPS, would this make a difference to quality or filesize when re-encoded to 60FPS?
[08:18:48 CEST] <Spring> or would it be the same as encoding 30FPS, given that there aren't actually any more unique frames
[08:19:18 CEST] <Spring> visually, that is
[08:31:20 CEST] <squ> furq: what does remuxing mean?
[08:32:46 CEST] <Threads> it means going from one container to another container without spending hours encoding the video/audio
[08:35:46 CEST] <squ> I see
[08:52:23 CEST] <sitara> Hi
[08:53:06 CEST] <sitara> How can i get the count of particular macroblocks?
[09:05:00 CEST] Last message repeated 1 time(s).
[09:05:58 CEST] <waits_> Hello
[09:07:57 CEST] <sitara> How can i get the count of particular macroblocks?
[09:15:34 CEST] <c_14> sitara: ffmpeg -debug mb_type -i video -f null /dev/null . You'll have to parse the output though
[09:22:07 CEST] <sitara> How can i get the count of particular macroblocks?
[09:22:31 CEST] <c_14> sitara: ffmpeg -debug mb_type -i video -f null /dev/null . You'll have to parse the output though
[09:23:03 CEST] <sitara> Are there any parsing tools available?
[09:24:22 CEST] <c_14> not in ffmpeg
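(A crude counting sketch along c_14's lines, assuming an h264 input and that the capital 'I' (intra) macroblock symbol is the one wanted -- untested, and the exact log shape varies by codec, so treat it as a starting point only:

    ffmpeg -debug mb_type -i video.mp4 -f null - 2>&1 \
        | grep '^\[h264' | grep -v 'type:' \
        | tr -cd 'I' | wc -c
)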
[09:29:09 CEST] <sitara> Will this command give mb type information based on the encoder of the video
[09:29:31 CEST] <sitara> Or is it based on the mpeg encoder?
[09:33:06 CEST] <c_14> Whatever the decoder extracts
[09:34:58 CEST] <sitara> Ok. I shall check
[09:36:45 CEST] <sitara> How can I extract the prediction residual from an encoded video?
[09:38:15 CEST] <sitara> Meaning the prediction residual kept by the encoder in a video for each of the P and B frames
[09:41:53 CEST] <c_14> That I don't know.
[10:24:07 CEST] <waits_> How do I generate thumbnails from a video using ffmpeg 0.6.5? On the latest version I'm using basically the one from the docs: -i <video> -vf fps=1/60 img%03d.jpg but the older version doesn't recognize these params.
[10:24:53 CEST] <JEEB> yup, that is why you don't use whatever comes with the distribution and instead build your own binaries :P
[10:25:03 CEST] <DHE> what distro uses something THAT old...
[10:25:16 CEST] <JEEB> rhel 6 I think
[10:25:18 CEST] <waits_> centos 6.2
[10:25:19 CEST] <waits_> yep
[10:32:27 CEST] <waits_> so is it possible?
[11:43:51 CEST] <waits_> why is ffmpeg complaining about -v error not being a number, when the man pages say it's a number or a string?
[11:44:22 CEST] <c_14> It works here.
[11:44:32 CEST] <waits_> ffmpeg 0.6.5
[11:44:34 CEST] <waits_> :|
[11:44:40 CEST] <c_14> that would explain it
[11:44:57 CEST] <waits_> so why does the man page say it works with a string?
[11:45:39 CEST] <c_14> because the manpage is for a newer version?
[11:45:49 CEST] <waits_> really?
[11:46:15 CEST] <waits_> ok, i was thinking it's tied to the ffmpeg version
[11:47:21 CEST] <iive> they usually come in the same package...
[11:47:25 CEST] <waits_> it seems installing ffmpeg on centos 6.x is a PITA
[11:47:37 CEST] <waits_> the newest version I mean.
[11:47:59 CEST] <c_14> get a static build or build from sources?
[11:48:31 CEST] <iive> remove the old versions first and check if there are any leftovers
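(The static-build route in full, assuming the commonly used johnvansickle builds at https://johnvansickle.com/ffmpeg/ -- the exact filename changes between releases:

    wget https://johnvansickle.com/ffmpeg/releases/ffmpeg-release-amd64-static.tar.xz
    tar xf ffmpeg-release-amd64-static.tar.xz
    ./ffmpeg-*-amd64-static/ffmpeg -version
)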
[12:39:51 CEST] <k_sze> If I have an unsigned char pointer to some RGBA data, how do I copy that data into AVFrame->data?
[12:40:07 CEST] <k_sze> Do I have to use av_image_fill_arrays?
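(A minimal C sketch of the copy, assuming tightly packed RGBA of known width/height. av_image_fill_arrays() only points the frame at your existing buffer; to give the frame its own pixels, allocate it and copy per row, since linesize may be padded wider than width * 4. rgba_to_frame is a hypothetical helper name:

    #include <string.h>
    #include <libavutil/frame.h>
    #include <libavutil/pixfmt.h>

    static AVFrame *rgba_to_frame(const unsigned char *rgba, int w, int h)
    {
        AVFrame *frame = av_frame_alloc();
        if (!frame)
            return NULL;
        frame->format = AV_PIX_FMT_RGBA;
        frame->width  = w;
        frame->height = h;
        if (av_frame_get_buffer(frame, 0) < 0) { /* allocates data[]/linesize[] */
            av_frame_free(&frame);
            return NULL;
        }
        /* copy row by row: frame->linesize[0] may exceed w * 4 */
        for (int y = 0; y < h; y++)
            memcpy(frame->data[0] + y * frame->linesize[0],
                   rgba + y * w * 4, w * 4);
        return frame;
    }
)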
[12:46:28 CEST] <waits_> I'm getting an error using ffmpeg 3.x
[12:46:30 CEST] <waits_> [stdout.write] (default task-8) [NULL @ 0x4790600] Unable to find a suitable output format for ''
[12:47:13 CEST] <waits_> command is: /etc/ffmpeg/ffmpeg -y -v error -i /tmp/020916103831.372-xkhVW.mkv -f mjpeg -ss 30 -vframes 1 -an /tmp/thumb.jpg
[12:50:04 CEST] <waits_> relaxed, it's the error from an ffmpeg wrapper for java
[12:50:07 CEST] <waits_> and the only error
[12:50:18 CEST] <relaxed> oh
[12:51:09 CEST] <waits_> I don't quite get it, the command is in the logs and it should work
[13:14:53 CEST] <furq> waits_: it looks like it's passing an empty string as the last argument
[13:28:28 CEST] <hargut> Hello.
[13:29:24 CEST] <hargut> I'm just wondering if there is a way in ffserver to launch the ffmpeg instance when the stream is requested. Does any of you know?
[13:30:34 CEST] <hargut> As I need live re-encoding for my setup, I'd prefer to have the ffmpeg instances run only while being watched.
[14:23:54 CEST] <waits_> furq, sorry, but isn't the last argument -an and then the output jpg?
[14:25:06 CEST] <furq> the last argument of the command you pasted is
[14:25:15 CEST] <furq> but if it's a java wrapper then i assume it's just not printing the empty string
[14:25:48 CEST] <furq> if you append "" to that ffmpeg command on the cli then you'll get the same error
[14:26:14 CEST] <waits_> ok, I see in the logs that indeed there's a space after the img path
[14:27:22 CEST] <waits_> i tested on the cli adding a space at the end but it works
[14:27:44 CEST] <furq> that's not the same thing
[14:28:05 CEST] <furq> i mean a pair of double quotes
[14:28:39 CEST] <waits_> hm
[14:29:04 CEST] <furq> http://vpaste.net/t5o9T
[14:29:28 CEST] <waits_> i see
[14:30:00 CEST] <furq> system(3) will do the same thing if there's a trailing empty string
[14:30:03 CEST] <furq> i assume that's what java is using
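(To reproduce furq's point on the cli -- the trailing empty argument gets parsed as a second, nameless output:

    $ ffmpeg -y -i in.mkv -f mjpeg -ss 30 -vframes 1 -an /tmp/thumb.jpg ""
    ... Unable to find a suitable output format for ''
)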
[14:30:26 CEST] <waits_> i'm using https://github.com/bramp/ffmpeg-cli-wrapper do you happen to know it?
[14:30:35 CEST] <waits_> not sure how to fix it
[15:02:44 CEST] <quup> I'm reading from a webcam (av_read_frame) as fast as I can, which turns out to be pretty slow; is there some way to always get the most recently captured frame? I tried av_seek_frame but AVFormatContext->duration isn't set, so I have no idea where the end is
[15:03:35 CEST] <quup> i.e. I want to drop frames all the time, always getting the newest one
[15:23:59 CEST] <waits_> I figured out the problem furq
[16:04:35 CEST] <waits_> thanks a lot furq
[16:32:14 CEST] <quup> nonex86: by opening up /dev/video0
[16:33:23 CEST] <xeche> anybody know how to set encoder private option flags?
[16:33:35 CEST] <xeche> i.e. options with no parameterization
[16:34:31 CEST] <jkqxz> quup:  You need to keep calling DQBUF until it gives you EAGAIN.  There isn't code to do it in ffmpeg already, but adding it would be straightforward - see mmap_read_frame() in libavdevice/v4l2.c.
[16:35:14 CEST] <quup> jkqxz: ok, thanks
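(An alternative sketch at the libavformat level, assuming the device was opened through the v4l2 avdevice: set AVFMT_FLAG_NONBLOCK and drain until EAGAIN, keeping only the newest packet -- untested:

    #include <libavformat/avformat.h>

    /* fmt_ctx is a hypothetical, already-opened v4l2 capture context */
    fmt_ctx->flags |= AVFMT_FLAG_NONBLOCK;

    AVPacket pkt, newest;
    int have_newest = 0;
    for (;;) {
        int ret = av_read_frame(fmt_ctx, &pkt);
        if (ret == AVERROR(EAGAIN))
            break;                  /* queue drained: newest holds the latest frame */
        if (ret < 0)
            break;                  /* real error or EOF */
        if (have_newest)
            av_packet_unref(&newest);
        newest = pkt;               /* take ownership of the most recent packet */
        have_newest = 1;
    }
)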
[16:49:04 CEST] <xeche> no one knows how to work with encoder flags via priv_data ?
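(For the record, private encoder options -- including bare flag-style ones -- are normally set through the AVOption API before avcodec_open2(); a sketch, where enc_ctx is the encoder's AVCodecContext and "some_flag" stands in for a hypothetical private option name:

    #include <libavutil/opt.h>

    /* search the codec's private options as well as the generic ones */
    av_opt_set(enc_ctx, "some_flag", "1", AV_OPT_SEARCH_CHILDREN);
    /* or target priv_data directly */
    av_opt_set(enc_ctx->priv_data, "some_flag", "1", 0);
)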
[17:32:28 CEST] <gromp> hello all. I have a question about conversion. might it be possible to extract the video stream from a WebM and insert it into an MP4? _losslessly_, that is. I was able to use gMKVExtractGUI to extract the video stream from a WebM, however the outputted .ivf file does _not_ appear to be compatible with Yamb/MP4Box, and so I've run into a glass ceiling
[17:35:36 CEST] <kepstin> gromp: there's currently no mapping for putting vp8/vp9 video inside an mp4 file, as far as I know
[17:35:43 CEST] <kepstin> gromp: so the answer is no, it's not possible
[17:38:11 CEST] <kepstin> (even if you could do that, you'd probably want to transcode to h264 anyways, since very few applications would be able to play it)
[17:45:50 CEST] <furq> it looks like chrome can play vp9 in mp4
[17:46:01 CEST] <furq> i'm not entirely sure why, but netflix are apparently pushing for it
[17:46:14 CEST] <furq> i don't know of any tools which can mux it though
[17:46:48 CEST] <furq> netflix are obviously very serious about it because they have a github repo which contains a spec in docx format
[17:47:16 CEST] <JEEB> docx is how specs get written. just look at jct-vc and such...
[17:47:35 CEST] <furq> it used to be a pdf but they obviously decided that was too usable
[17:50:53 CEST] <JEEB> pdfs are good for export but not editing
[17:51:17 CEST] <JEEB> usually specs are provided as doc(x) until ready
[17:51:44 CEST] <JEEB> and then when ready, exported as pdf put onto official sites
[17:55:00 CEST] <Spring> yooo, thanks to whoever added the option to import Photoshop curves via a file for the curves filter
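(That option in use, as a sketch with hypothetical filenames -- psfile loads a Photoshop .acv curves preset:

    ffmpeg -i in.mp4 -vf curves=psfile=curves.acv out.mp4
)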
[18:16:33 CEST] <gromp> kepstin: furq: thank you so very much for your help, guys. much appreciated
[20:40:39 CEST] <ianbytchek> greetings all. any reason why avfilter_graph_request_oldest never returns anything but EAGAIN?
[20:47:16 CEST] <durandal_1707> ianbytchek: how do you build graph?
[20:53:06 CEST] <ianbytchek> durandal_1707: in every possible way. started with the official examples. tried avfilter_graph_parse, avfilter_graph_parse2, avfilter_graph_parse_ptr. managed to get it working without avfilter_graph_request_oldest like in the example, but only with simple graphs, for example only using colorchannelmixer.
[20:54:17 CEST] <ianbytchek> durandal_1707: when i try it with split & palettegen & paletteuse it just gets stuck. but the same filtergraph works with the ffmpeg command. i've gotten super desperate with this.
[20:55:06 CEST] <ianbytchek> durandal_1707: avfilter_graph_config passes. I assume the problem is elsewhere.
[21:01:25 CEST] <durandal_1707> ianbytchek: why do you need that function?
[21:05:05 CEST] <ianbytchek> durandal_1707: i'm using the ffmpeg libraries in an app and need to use the palettegen and paletteuse filters. the examples show basic usage; it doesn't work in a more complex case. so i'm trying to get my head around how ffmpeg does it.
[21:05:37 CEST] <durandal_1707> ianbytchek: i dont think you need that function
[21:05:49 CEST] <ianbytchek> durandal_1707: what do you suggest?
[21:06:17 CEST] <DHE> I've only tested simple linear graphs, but it's been pretty consistent. put frames in, get frames out.
[21:06:51 CEST] <durandal_1707> ianbytchek: i cant guess things, i need more info
[21:06:57 CEST] <DHE> usually EAGAIN means you need to probe the other endpoint - if you get EAGAIN on submitting to input, grab an output.
[21:07:32 CEST] <durandal_1707> if you have more inputs, you need to check all of them
[21:07:40 CEST] <durandal_1707> same for outputs
[21:08:51 CEST] <ianbytchek> durandal_1707: if i comment out avfilter_graph_request_oldest and go straight to av_buffersink_get_frame_flags it just gets stuck with the same EAGAIN code.
[21:09:12 CEST] <ianbytchek> durandal_1707: ok: split [x][y];[x] palettegen [z];[y][z] paletteuse
[21:09:51 CEST] <ianbytchek> durandal_1707: works with the ffmpeg tool, doesn't in my code. i'm still a little confused by graphs, but as far as i understand this is single input and single output, no?
[21:10:36 CEST] <ianbytchek> durandal_1707: "if you have more inputs, you need to check all of them" -- i had this suspicion, but didn't find how ffmpeg does it. do you think this is what's happening?
[21:12:07 CEST] <ianbytchek> DHE: do you mean EAGAIN with av_buffersrc_add_frame_flags?
[21:13:01 CEST] <DHE> just in general. EAGAIN means try again later, or try again when something's changed.
[21:14:36 CEST] <ianbytchek> DHE: that's clear. if I do that the loop never finishes. hence the confusion.
[21:16:51 CEST] <durandal_1707> ianbytchek: in this case palettegen needs the whole video to give a single frame
[21:18:31 CEST] <ianbytchek> durandal_1707: but it works with a series of png files with: ffmpeg -i "%02d.png" -lavfi "fps=15,scale=500:-1:flags=lanczos [x];[x] split [y][z];[y] palettegen [a];[z][a] paletteuse" -y out.gif
[21:18:53 CEST] <durandal_1707> ianbytchek: it means it needs to buffer all video frames, eating memory; better to do that in 2 passes
[21:19:27 CEST] <durandal_1707> ianbytchek: perhaps because you haven't reached the limit of the internal fifo
[21:20:23 CEST] <durandal_1707> ianbytchek: you need to feed a NULL frame iirc to signal eof
[21:20:38 CEST] <durandal_1707> then it should work
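(durandal_1707's point as a minimal C sketch: the NULL frame marks EOF on the buffer source, after which palettegen can emit its palette and the sink drains:

    /* push the frame, then signal EOF */
    av_buffersrc_add_frame_flags(buffersrc_ctx, frame, AV_BUFFERSRC_FLAG_KEEP_REF);
    av_buffersrc_add_frame_flags(buffersrc_ctx, NULL, 0);  /* EOF */

    AVFrame *out = av_frame_alloc();
    while (av_buffersink_get_frame(buffersink_ctx, out) >= 0) {
        /* consume the filtered frame */
        av_frame_unref(out);
    }
)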
[21:22:52 CEST] <ianbytchek> durandal_1707: could you say a little more about that? the way i see it: i take a single frame, generate a palette, then use it on the same frame. i don't wanna do it for the whole sequence, just for that single case.
[21:23:59 CEST] <durandal_1707> ianbytchek: do you use av_buffersrc_add_frame_flags?
[21:24:26 CEST] <durandal_1707> for single frame it should be fine
[21:24:49 CEST] <ianbytchek> durandal_1707: yep. with AV_BUFFERSRC_FLAG_KEEP_REF | AV_BUFFERSRC_FLAG_PUSH. right before avfilter_graph_request_oldest and av_buffersink_get_frame_flags.
[21:25:26 CEST] <ianbytchek> durandal_1707: tried all flag combos. didn't help. missing something?
[21:25:35 CEST] <durandal_1707> ianbytchek: do you call it with NULL as the arg after you give it the single frame?
[21:26:23 CEST] <ianbytchek> durandal_1707: no. will try now.
[21:33:06 CEST] <ianbytchek> durandal_1707: fails with code 22 on feeding the second iteration / frame. is this expected?
[21:34:14 CEST] <durandal_1707> ianbytchek: yes, you need to recreate the graph after you finish working with the single frame
[21:35:08 CEST] <durandal_1707> but i guess there is a better way -- let palettegen output a palette for each frame
[21:39:22 CEST] <ianbytchek> durandal_1707: yep. graph recreating would be counterproductive. but is this really what ffmpeg itself does? sends all frames into the graph and then gets them one by one?
[21:42:06 CEST] <durandal_1707> ianbytchek: palettegen is meant to create a palette for the whole video, not just a single frame
[21:43:21 CEST] <ianbytchek> durandal_1707: in this case, would using two graphs be a sensible approach? one for palettegen, one for paletteuse?
[21:44:26 CEST] <kyleogrg> is mjpeg in ffmpeg not the best jpeg encoder?
[21:44:46 CEST] <c_14> probably not (not that I've ever tested it)
[21:45:01 CEST] <durandal_1707> kyleogrg: it just uses bad defaults for quality
[21:46:35 CEST] <durandal_1707> ianbytchek: no, as palettegen outputs a single frame, and paletteuse needs a single frame
[21:46:41 CEST] <kyleogrg> durandal_1707: i've tried qp of maybe 0-8 and i get good results, but i'm wondering if the file size can be smaller.  i can shrink them a little with jpegtran.
[21:47:08 CEST] <kyleogrg> c_14: would imagemagick be the go-to program for this?
[21:47:30 CEST] <durandal_1707> kyleogrg: how much is little?
[21:48:12 CEST] <kyleogrg> durandal_1707: eh, i haven't measured it closely yet.  maybe 80-90% of the original size?
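(For reference, mjpeg quality is usually steered with -q:v -- lower is better, 2-5 is a common range; a sketch with a hypothetical frame.png:

    ffmpeg -i frame.png -q:v 2 out.jpg
)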
[21:48:30 CEST] <ianbytchek> durandal_1707: cool. gonna try figure it out. you helped incredibly. sincere thanks.
[21:56:01 CEST] <durandal_1707> ianbytchek: i may modify those filters to support your use case, so no graph reinitialization is necessary
[21:56:53 CEST] <ianbytchek> durandal_1707: that would be saving my ass big time.
[21:57:52 CEST] <ianbytchek> durandal_1707: is this something quick? i'm using a custom-configured build, so i would try it right away.
[21:58:56 CEST] <durandal_1707> ianbytchek: i need some time to implement it
[22:00:01 CEST] <ianbytchek> durandal_1707: if there's anything i can help with, please let me know
[22:44:31 CEST] <durandal_1707> ianbytchek: check this https://github.com/richardpl/FFmpeg/tree/palette
[22:45:15 CEST] <ianbytchek> durandal_1707: you're on fire. looking into it now.
[22:45:20 CEST] <durandal_1707> ianbytchek: "split[a][b],[a]palettegen=stats_mode=2[a],[b][a]paletteuse=new=1"
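(With that branch, a full invocation might look like this -- a sketch with a hypothetical in.mp4; as noted further down, the option names may still change:

    ffmpeg -i in.mp4 -lavfi "split[a][b],[a]palettegen=stats_mode=2[a],[b][a]paletteuse=new=1" out.gif
)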
[22:47:15 CEST] <scottchiefbaker> I'm using "-vf scale=-1:480" to scale my video to whatever / 480 but I'm getting an error about the width not being divisible by zero
[22:47:30 CEST] <scottchiefbaker> errr... two. Is there a way to make ffmpeg round up/down if it's not?
[22:48:58 CEST] <kepstin> yep, do "-vf scale=-2:480"
[22:49:21 CEST] <scottchiefbaker> kepstin: Ahh... that's too easy!
[22:50:15 CEST] <kepstin> it's even documented :) https://www.ffmpeg.org/ffmpeg-filters.html#scale-1
[22:51:05 CEST] <scottchiefbaker> kepstin: Was that a nice way of saying RTFM :)
[22:51:48 CEST] <kepstin> ... perhaps. But there is an annoying amount of ffmpeg magic with either no or hard-to-find docs.
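(The fixed command in full, as a sketch with a hypothetical in.mp4 -- scale=-2:480 rounds the computed width to an even number, which yuv420p output requires:

    ffmpeg -i in.mp4 -vf scale=-2:480 -c:a copy out.mp4
)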
[22:59:12 CEST] <ianbytchek> durandal_1707: i think i'm missing something: [Parsed_paletteuse_2 @ 0x7fd3a9dc1480] Option 'new' not found / Error initializing filter 'paletteuse' with args 'new=1'
[23:01:05 CEST] <durandal_1707> ianbytchek: you are not using my branch somehow
[23:01:23 CEST] <ianbytchek> durandal_1707: yes. something went wrong.
[23:01:41 CEST] <ianbytchek> durandal_1707: balls... i used master.
[23:02:40 CEST] <ianbytchek> durandal_1707: funny. when trying to download an archive of a branch, github gives the link for master...
[23:03:35 CEST] <durandal_1707> ianbytchek: learn git :D
[23:03:42 CEST] <durandal_1707> its easy
[23:03:58 CEST] <ianbytchek> durandal_1707: yep... :D
[23:04:03 CEST] <durandal_1707> perhaps it needs time to generate...
[23:23:10 CEST] <ianbytchek> durandal_1707: i feel like all the little children on christmas at once.
[23:23:59 CEST] <ianbytchek> durandal_1707: works smoothly. gonna play around with it now.
[23:24:21 CEST] <ianbytchek> durandal_1707: when do you think this gets merged into master?
[23:24:49 CEST] <durandal_1707> ianbytchek: after review, option names may change
[23:33:24 CEST] <zevarito> it is possible to add Side data to a video stream ?
[23:38:14 CEST] <durandal_1707> zevarito: what kind of "Side data"?
[23:39:29 CEST] <zevarito> durandal_1707: it seems like some iOS devices, when creating portrait videos, add displaymatrix: rotation of -90.00 degrees under the Side data section, which ffprobe shows right after the Metadata section
[23:39:53 CEST] <zevarito> I am looking at how to add that info to a stream that doesn't have it
[23:40:55 CEST] <llogan> ffmpeg -i input -c copy -metadata:s:v:0 rotate="-90" output
[23:43:09 CEST] <zevarito> llogan: that's for regular metadata; the kind of video I am talking about has both: rotate: 90 and Side data: displaymatrix: rotation of -90.00 degrees
[23:46:05 CEST] <llogan> apparently it doesn't like the negative value. use 270 instead.
[23:47:39 CEST] <zevarito> I tried 270 for Rotate, but what I need to try is adding the Side Data
[23:47:52 CEST] <zevarito> we transcode videos with ffmpeg, then post-process with another service
[23:48:07 CEST] <llogan> works for me
[23:48:09 CEST] <zevarito> that service is the one having troubles, but not for the videos that come with Side Data
[23:49:06 CEST] <ianbytchek> durandal_1707: are you on mac by any chance?
[23:50:28 CEST] <durandal_1707> ianbytchek: nope, why?
[23:52:28 CEST] <ianbytchek> durandal_1707: don't worry then. man, thanks again for the help. i'd probably be in a hospital by tomorrow with my brain turned into one big tumour.
[23:56:56 CEST] <durandal_1707> ianbytchek: huh?
[23:58:07 CEST] <ianbytchek> durandal_1707: saying my brain would have melted if not for your help :)
[23:59:00 CEST] <durandal_1707> ah, ok
[00:00:00 CEST] --- Sat Sep  3 2016

