[Ffmpeg-devel-irc] ffmpeg.log.20130917
burek
burek021 at gmail.com
Wed Sep 18 02:05:01 CEST 2013
[00:02] <Guest4966> durandal_1707: That's the problem, I can't really provide any more information other than the code. The error I get is that ffmpeg cannot read a reference frame and that there might be possible corruption
[00:02] <Guest4966> And the error seems to happen on a windows machine that I am working with.
[00:02] <Guest4966> the linux version is perfect.
[02:14] <chrisballinger> woah how come the online docs for avformat.h don't have a ton of the stuff in the actual avformat.h?
[02:19] <klaxa> lazyness maybe?
[02:21] <chrisballinger> they look like they have markup for some sort of parser
[03:16] <jangle_> this must be asked a million times, but I am trying to take rgb24 and convert it to yuv420 for use with x264. My code is failing on the sws_scale call with a bad access, inside ff_yuv2plane1_8_avx.loop_a
[03:16] <jangle_> which suggests to me that I haven't given it the input data structures in the right way
[03:19] <jangle_> I'm starting with what I believe is called non-planar data, that is, rgb pixel values every 24 bits, but it looks like the sws_scale call is generalized to assume that input and output data is planar, and that (I imagine) if the input (or output) is supposed to be non-planar, that it should be stored/can be found in the first plane of the relevant data structures
[03:20] <jangle_> since x264 seems to provide for a data structure that fits this purpose (the img struct inside of a pic_t, properly initialized), I've used it as my dst and dstStride locations
[03:20] <jangle_> from what it sounds like, is my understanding of these correct?
[03:31] <llogan> jangle_: try libav-user mailing list if you don't get an answer here
[03:31] <jangle_> llogan: thanks
[03:46] <vl4kn0> Is there any specific order the video/audio packets arrive from the stream? Is it guaranteed that an audio packet will arrive before a video packet in terms of synchronization?
[09:16] <Keshl> Is there a way to introduce an error into a video stream (a recoverable one, like the ones where you see some subtly different shades of colors in part of the video that gets fixed at the next keyframe) using ffmpeg?
[09:31] <Theo__> Hi! I'm trying to find a way to add AES 128 bit encryption when using "avformat_write_header". Anyone know how this might be done? I have the question up here as well: http://stackoverflow.com/questions/18834320/ffmpeg-libavformat-read-and-write-header-with-aes-encryption
[09:49] <smj> how do I make a 4-channel audio file out of 2 stereo files? with '-map_channel 0.0.0 -map_channel 0.0.1 -map_channel 1.0.0 -map_channel 1.0.1' I get only a stereo file with the channels of the first input file
[09:50] <Mavrik> hmm, you'll probably have to mix them
[09:51] <Mavrik> smj, "join" audio filter
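The join filter Mavrik points to merges the channels of several inputs into one multi-channel stream, which is what -map_channel alone won't do. A minimal sketch (front.wav and rear.wav are hypothetical stereo inputs):

```shell
# Merge two stereo files into one 4-channel (quad) output:
# [0:a] and [1:a] are the audio streams of the first and second input,
# and join lays their channels out as a single quad stream.
ffmpeg -i front.wav -i rear.wav \
  -filter_complex "[0:a][1:a]join=inputs=2:channel_layout=quad" \
  quad.wav
```

Unlike amix, join keeps every source channel as a separate channel in the output instead of mixing them down.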
[10:40] <Ottre> just compiled the latest version from git
[10:40] <Ottre> what happened to the yadif filter?
[10:41] <saste> Ottre, why?
[10:41] <relaxed> Ottre: ffmpeg -filters 2>&1| grep yad
[10:42] <saste> Ottre, did you forget --enable-gpl?
[10:51] <Ottre> yes looks like i forgot --enable-gpl
[10:55] <JEEB> There was talk of LGPL'ing yadif, and most people agreed
[10:55] <JEEB> except this one guy who wanted a couple of thousand USD for it
[10:55] <JEEB> (he had written some code for it)
[10:58] <Mavrik> is there an alternative for it with a better license?
[10:58] <JEEB> not really
[11:01] <Ottre> afaik nothing beats yadif in terms of quality
[11:01] <durandal_1707> nonsense
[11:01] <Ottre> maybe some filters are more efficient
[11:01] <JEEB> no, quality-wise there are better things
[11:01] <JEEB> but they're not exactly things that consist of a single filter available for libavfilter
[11:01] <JEEB> like QTGMC for avisynth
[11:02] <JEEB> that is an avisynth script that bases itself on NNEDI3 and motion compensation
[11:03] <Ottre> not really the same thing is it?
[11:03] <Ottre> i've heard filters with motion compensation are for scrolling text
[11:03] <Kuukunen> depends on what you want to do
[11:03] <Ottre> not general video
[11:04] <Kuukunen> do you want deinterlacing or not? :P
[11:04] <Kuukunen> JEEB: who wanted thousands? :P
[11:04] <JEEB> Kuukunen, guess :P
[13:54] <cusco> hello folks
[13:55] <cusco> any hints on how to get audio length from a gsm encoded file, without converting it first?
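ffprobe can usually report a duration without decoding the whole file; a sketch, assuming a build with GSM support and a hypothetical file audio.gsm:

```shell
# Print only the container-level duration in seconds.
ffprobe -v error -show_entries format=duration \
  -of default=noprint_wrappers=1:nokey=1 audio.gsm
```

For headerless raw GSM there may be no stored duration at all; in that case it can be derived from the file size, since full-rate GSM uses 33-byte frames of 20 ms each (duration ≈ filesize / 33 * 0.02 seconds).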
[15:18] <xzise> Hello, can I somehow change the framerate by speeding up the video? I have 23.98 fps and want to change it to 25 fps without dropping or duplicating frames. According to https://trac.ffmpeg.org/wiki/How%20to%20speed%20up%20/%20slow%20down%20a%20video setpts should do the trick, but when I check the framerate afterwards it is always twice the original
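The wiki recipe xzise mentions needs both a setpts factor of old/new rate and a forced output rate, otherwise ffmpeg resamples frames back and the measured framerate comes out wrong. A sketch with hypothetical filenames (23.98 fps is really 24000/1001 ≈ 23.976):

```shell
# Speed the video up from 23.976 to 25 fps (factor ~1.0427) without
# dropping or duplicating frames; atempo speeds the audio up to match.
ffmpeg -i in.mkv \
  -vf "setpts=PTS*23.976/25" -r 25 \
  -af "atempo=1.042708" \
  out.mkv
```

The trailing -r 25 pins the output rate; without it the muxer may keep the input rate and duplicate frames to fill the retimed stream.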
[17:23] <fling> I have a lot of videos from a security camera.
[17:24] <fling> How to split them into pieces and keep only the ones with movement?
[17:33] <cusco> er.. usually there is software that only records when there is movement
[17:33] <cusco> like zoneminder
[17:39] <saste> fling, select filter + scene for checking changes
[17:39] <saste> as for splitting there is no simple solution
[17:39] <fling> hmm hmmmm
[17:40] <fling> saste: what if I will just drop all equal frames?
[17:40] <fling> saste: both with sound
[17:41] <saste> frames in video camera output are never perfectly equal
[17:41] <saste> anyway you can set the threshold, check select filter docs and the scene variable
[17:41] <fling> right… I know how to do so with imagemagick
[17:42] <saste> as for synching with sound, that's not easy, i.e. you can't do it with a simple ffmpeg command
[17:42] <fling> convert to images, drop, convert from images back to video
[17:42] <fling> but I want to keep sound
[17:46] <saste> fling, you don't need image magick for detecting motion
[17:47] <fling> saste: but it is simple to drop frames with it.
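saste's select + scene suggestion can be sketched as a single command (cam.mp4 and the 0.1 threshold are illustrative; tune the threshold against the select filter docs):

```shell
# Keep only frames whose scene-change score exceeds 0.1, then
# regenerate timestamps so the survivors play back contiguously.
# Audio is dropped (-an) because, as noted above, it cannot be kept
# in sync with the selected frames by a simple command.
ffmpeg -i cam.mp4 \
  -vf "select='gt(scene,0.1)',setpts=N/FRAME_RATE/TB" \
  -an motion.mp4
```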
[18:08] <KalD> Does anyone have a good example of using a video device by index on win32? The documentation makes a reference to this - but there is no example
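A sketch based on the dshow input device documentation (the device name here is hypothetical); dshow selects devices primarily by name, with -video_device_number choosing by index among devices that share a name:

```shell
# First enumerate the available capture devices:
ffmpeg -f dshow -list_devices true -i dummy

# Then open one; -video_device_number picks the Nth device
# (starting at 0) among identically named ones.
ffmpeg -f dshow -video_device_number 0 -i video="USB Video Device" out.mp4
```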
[20:36] <pyBlob> is there an easy way to read an image sequence like img03.png, img06.png, img09.png, ...
[20:37] <durandal_1707> yes with image2 demuxer see ffmpeg -h demuxer=image2
[20:42] <pyBlob> yes ... but it doesn't state how to make a 3*n-sequence
[20:42] <durandal_1707> you can use glob pattern type
[20:43] <pyBlob> ... don't know glob, I'll try it :/
[20:48] <pyBlob> you know, where I can find some basic information about that glob pattern stuff?
[20:49] <durandal_1707> in the documentation, usual location
[20:53] <pyBlob> sorry ... but it doesn't say anything about glob patterns
[20:54] <intracube> pyBlob: ffmpeg -f image2 -pattern_type glob -i '*.png' -c:v libx264 -crf 16 output.mp4
[20:54] <intracube> will convert all .png in a directory
[20:55] <pyBlob> ok, but there is no easy way to read only every 3rd image
[20:56] <intracube> oh, so you've got a directory with a complete sequential list of files but you want to skip every X image(s)
[20:56] <pyBlob> yes
[20:58] <durandal_1707> there is no such feature, but it could be added if you report it
[21:04] <intracube> pyBlob: you might be able to do it in a two-step process by changing the framerate
[21:04] <intracube> ffmpeg -r 25 -f image2 -pattern_type glob -i '*.png' -c:v libx264 -crf 16 -r 12 test.mp4
[21:04] <intracube> then:
[21:04] <intracube> ffmpeg -r 50 -i test.mp4 -c:v libx264 -crf 16 -r 25 test2.mp4
[21:05] <intracube> which will drop every other frame (2x speedup)
[21:05] <intracube> but to preserve quality you should use a lossless codec as the intermediate
[21:06] <intracube> (note the use of -r options)
[21:07] <intracube> that example might have some dup frames as 12 isn't exactly half 25
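An alternative to the two-step -r trick above: the select filter can keep every third frame in one pass (the img%02d.png pattern and 25 fps output are illustrative):

```shell
# Keep frames 0, 3, 6, ... of the sequence and regenerate timestamps;
# -pix_fmt yuv420p keeps the result playable in most players.
ffmpeg -i 'img%02d.png' \
  -vf "select='not(mod(n,3))',setpts=N/FRAME_RATE/TB" \
  -r 25 -pix_fmt yuv420p out.mp4
```

This avoids the duplicate frames the 25→12 fps rounding would introduce.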
[21:08] <pyBlob> globbing not supported -.-
[21:09] <intracube> that's no longer necessary so remove: -pattern_type glob -i '*.png'
[21:09] <saste> pyBlob, windows?
[21:09] <intracube> replace with: -i img%04d.png
[21:09] <pyBlob> %02d works though
[21:09] <intracube> but this assumes consistent naming of files
[21:10] <intracube> I don't think you can mix img04.png with img142.png
[21:10] <intracube> should really left pad the frame numbers to be consistent, like img0001.png rather than img01.png
[21:17] <intracube> pyBlob: oh, and if you happen to use my examples... you should add -pix_fmt yuv420p in there
[21:17] <intracube> otherwise the file won't be compatible with a lot of players...
[21:17] <kms_> hello, i try -filter:a "volume=volume=5.5" but the volume is the same as the source, why?
[21:17] <pyBlob> works ... but is sort of hackish ;)
[21:18] <intracube> pyBlob: mplayer has the decimate filter that can be used to drop frames
[21:20] <pyBlob> I think for now I'll just select the files using another program and pipe the single png-images into ffmpeg
[21:22] <intracube> kms_: -filter:a 'volume=0.5*1'
[21:22] <intracube> will halve the volume of the input
[21:22] <intracube> quarter: volume=0.25*1
[21:23] <intracube> twice as loud as input: volume=2*1
[21:23] <intracube> etc
[21:23] <kms_> what does *1 mean?
[21:25] <intracube> kms_: it's a multiplier
[21:27] <kms_> ok, why do I see this example in the documentation: volume=volume=0.5?
[21:29] <durandal_1707> kms_: 0.5 should work too
[21:29] <durandal_1707> and *1 is redundant
[21:29] <intracube> kms_: I don't know what you're looking at, but this is what's in the linux man pages: http://pastebin.com/6GQqwfm5
[21:29] <mark4o> kms_: that should work also; you probably put the option in the wrong place or are using a weird version, that is why you were asked for your command and console output
[21:31] <kms_> cat SUNP0001.AVI SUNP0002.AVI | ffmpeg -i - -b:v 512k -ac 1 -aq 0 -r 25 -filter:a "volume=5*1" -vf "fade=in:0:50" -y 123.webm
[21:39] <kms_> http://pastebin.com/dZ39ARkq
[21:44] <mark4o> kms: ffmpeg version 0.10.8-7:0.10.8-1~raring1 <= have you tried a current version?
[21:49] <kms_> ok, i will try
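Summarizing the thread above: per the filter docs, a plain factor is enough, and *1 is redundant. A sketch with hypothetical filenames (assuming a current ffmpeg build, since the 0.10.x version in the paste may behave differently):

```shell
# Amplify the audio roughly 5.5x; video is passed through untouched.
ffmpeg -i input.avi -filter:a "volume=5.5" -c:v copy output.mkv
```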
[23:04] <KalD> Does anyone have a good example of using a video device by index on win32? The documentation makes a ref to this - but there is no example
[00:00] --- Wed Sep 18 2013