[Ffmpeg-devel-irc] ffmpeg.log.20120214

burek burek021 at gmail.com
Wed Feb 15 02:05:02 CET 2012


[00:18] <drno_> Hiya folks
[00:20] <drno_> Question that I'm hoping someone may have experience with... I've got a video stream that I'm encoding from raw images (tv tuner), and writing to a flash media server (red5).  I've verified that the stream is good, that the encoding is working (h.264 stored in FLV container), and that all is getting to RED5 (viewed the stream with a flash viewer).
[00:20] <drno_> So, up to that point, I know it's functional.
[00:20] <drno_> I want to use ffmpeg to capture that stream and save it to a file.
[00:20] <drno_> So, I've run the following command: ffmpeg -i "rtmp://localhost/oflaDemo/streamTest live=1" -an -vcodec copy -y test.flv -loglevel verbose
[00:21] <drno_> It displays the metadata output, then hangs.  I see "Pings", but nothing gets written.  If I hit CTRL-C twice, it exits, giving messages stating that no data was written.
[00:22] <drno_> Any ideas on what I could try?
[00:25] <Prudhvi> Hi, is there an RPM repo for the latest ffmpeg?
[00:25] <Hyperi> What's wrong with source?
[00:25] <Hyperi> You get the best out of it
[00:25] <pasteeater> Prudhvi: i don't know of any, but you can follow this: https://ffmpeg.org/trac/ffmpeg/wiki/CentosCompilationGuide
[00:25] <Hyperi> + some of the libraries need to be enabled manually
[00:25] <Prudhvi> pasteeater: perfect :) Thanks
[00:26] <pasteeater> "you're the best..around. nobody is going to bring you down"
[00:27] <pasteeater> drno_: are you using ffmpeg to feed the video to red5?
[00:28] <drno_> pasteeater: yes indeed.
[00:28] <pasteeater> why not make two outputs? one to red5 and one to a file
[00:29] <drno_> My understanding is that two outputs mean two encodes.
[00:29] <drno_> one my box can handle -- two might make it a tad unhappy.
[00:30] <drno_> Or is there some way to mux the output without re-encoding?
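One way to avoid a second encode, sketched here with a placeholder V4L capture device and placeholder encoder settings: encode once, write the FLV to stdout, keep a copy of the encoded stream with tee, and stream-copy the same data on to red5:

    ffmpeg -f video4linux2 -i /dev/video0 -an -vcodec libx264 -preset veryfast -f flv - \
      | tee local_copy.flv \
      | ffmpeg -i - -vcodec copy -f flv rtmp://localhost/oflaDemo/streamTest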
[01:10] <MrKeuner> hello, how can I rip a bunch of vob files out of a dvd into a single mpeg2 file without any conversion?
[01:12] <sacarasc> If you're lucky cat *.vob > blah.mpeg
[01:14] <MrKeuner> sacarasc, how will I know I am lucky?
[01:14] <sacarasc> If it works.
[01:16] <MrKeuner> sacarasc, I need to know that the file is healthy; an application being able to play the file does not necessarily prove this
[01:16] <MrKeuner> thanks for trying to help
[01:17] <sacarasc> If you use ffmpeg to do it, you're basically doing the same thing anyway.
[01:20] <MrKeuner> I understand that. I just need to be sure that the output file is healthy. I would trust ffmpeg more than I trust cat for this operation
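Since a DVD's VOB files are MPEG-2 program streams, both approaches avoid any re-encoding; a minimal sketch with placeholder file names:

    cat VTS_01_1.VOB VTS_01_2.VOB > movie.mpg
    ffmpeg -i "concat:VTS_01_1.VOB|VTS_01_2.VOB" -vcodec copy -acodec copy -f vob movie.mpg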
[01:20] <Zeranoe> Does anyone know where I can get a source archive of libnut? I see it is made by MPlayer/FFmpeg so it should be around here somewhere?
[01:29] <pasteeater> Zeranoe: git://git.ffmpeg.org/nut
[01:29] <pasteeater> 'git clone git://git.ffmpeg.org/nut' to be specific
[01:31] <Zeranoe> pasteeater: Thanks a lot
[02:20] <^sandro^Zzz> hello everyone
[02:21] <^sandro^Zzz> anyone know how to fix this error
[02:21] <^sandro^Zzz> [libx264 @ 0x1781bc0] Input picture width (704) is greater than stride (0)
[02:21] <^sandro^Zzz> Video encoding failed
[02:39] <pasteeater> ^sandro^Zzz: is there a reason you're not using the standard x264 encoding presets?
[02:52] <^sandro^Zzz> what do you mean
[02:52] <^sandro^Zzz> -preset low
[02:52] <^sandro^Zzz> or whatever ?
[02:52] <^sandro^Zzz> doesn't change anything i have tried
[02:58] <pasteeater> ^sandro^Zzz: as in ffmpeg -y -i udp://@:5000 -an -c:v libx264 -preset medium -crf 24 -map p:"167" output.ts
[03:22] <^sandro^Zzz> sorry pasteeater
[03:22] <^sandro^Zzz> Press [q] to stop, [?] for help
[03:22] <^sandro^Zzz> [libx264 @ 0x14d8ea0] Input picture width (704) is greater than stride (0)
[03:22] <^sandro^Zzz> Video encoding failed
[03:22] <^sandro^Zzz> Segmentation fault
[03:22] <^sandro^Zzz> same error.. using version 120 of x264 now
[03:22] <^sandro^Zzz> updating ffmpeg too
[03:26] <^sandro^Zzz> hey, anyone know how to fix this error when compiling ffmpeg
[03:26] <^sandro^Zzz> collect2: ld returned 1 exit status
[03:26] <^sandro^Zzz> make: *** [libavutil/libavutil.so.51] Error 1
[03:26] <^sandro^Zzz> oops..
[03:27] <^sandro^Zzz> collect2: ld returned 1 exit status
[03:27] <^sandro^Zzz> make: *** [libavutil/libavutil.so.51] Error 1
[03:27] <^sandro^Zzz> damn.. won't let me paste.
[03:27] <^sandro^Zzz> http://pastebin.com/sWsJQCne
[03:27] <^sandro^Zzz> that is easier :P
[03:29] <VooDooNOFX_> ^sandro^Zzz, you're missing some libs on this system I think
[03:34] <^sandro^Zzz> which ones is the trick
[03:34] <^sandro^Zzz> :(
[03:40] <pasteeater> ^sandro^Zzz: what system/os/distro/whatever
[03:40] <pasteeater> and show your ./configure
[03:49] <^sandro^Zzz> ubuntu
[03:50] <^sandro^Zzz> ./configure --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libfaac --disable-mmx --enable-shared --enable-libxvid --enable-gpl --enable-nonfree
[03:52] <pasteeater> why disable-mmx?
[03:52] <pasteeater> what ubuntu version?
[03:54] <^sandro^Zzz> 11.04
[03:55] <pasteeater> http://ubuntuforums.org/showthread.php?t=786095
[04:16] <cosmint> This works: ffmpeg -i input.avi -vcodec libx264 -an -x264opts crf=8 output.flv, but when I change flv to f4v it complains "Unable to find a suitable output format"
[04:20] <cosmint> -f mp4 seems to fix the problem, not that i know exactly why
[04:39] <relaxed> cosmint: ffmpeg doesn't know about the f4v format.
[04:40] <relaxed> if you don't force a format with -f $format, ffmpeg will use the extension to set it.
[04:40] <cosmint> i see, this wiki page made me think f4v would work, http://en.wikibooks.org/wiki/FFMPEG_An_Intermediate_Guide/h.264
[04:40] <cosmint> so i made an mp4 or m4v and flash plays it
[04:41] <cosmint> good enough i guess, now to find the difference between f4v and m4v
[04:41] <relaxed> that is old, outdated and wrong
[04:42] <cosmint> ya
[04:43] <relaxed> how do f4v and m4v differ from flv and mp4? silly apple naming convention?
[04:43] <relaxed> I would assume "-f flv output.f4v" would do it
[04:50] <cosmint> I think I've figured it out. f4v is its own format, distinct from flv, but very close to mp4, or maybe just adobe's implementation of mp4
[04:51] <cosmint> I lose some features I don't think I need by using mp4 instead of f4v, cue points or something.
[04:51] <cosmint> So I'm happy for now.
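Given that F4V is essentially Adobe's restricted flavour of MP4, a hedged variant of cosmint's command that keeps the .f4v extension is simply to force the mp4 muxer with -f (the crf value is just the one used above):

    ffmpeg -i input.avi -vcodec libx264 -an -x264opts crf=8 -f mp4 output.f4v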
[04:54] <funkster_> looking to create a photo slideshow with some fading between the photos - anyone point me in the right direction on how to do this?
[04:56] <relaxed> funkster_: Shouldn't be hard - ffmpeg has a fade filter
[04:56] <funkster_> yeah? hmm ok lemme try and find that
[04:56] <relaxed> loop each one a certain number of times and use a similar fade for each image
[04:58] <funkster_> happen to have an example somewhere? the man pages just confuse me more than i am already.
[04:58] <relaxed> not really, I'm just thinking out loud
[04:59] <relaxed> did you find fade in the man page?
[05:01] <funkster_> the main ffmpeg site, i found fade filter, yeah.
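A rough sketch of the approach relaxed is describing, assuming an ffmpeg whose image input supports -loop 1 and the frame-based fade syntax fade=type:start_frame:nb_frames; the file names, the 5-second duration and the 25-frame fades are placeholders:

    ffmpeg -loop 1 -i photo1.jpg -vf "fade=in:0:25,fade=out:100:25" -t 5 -r 25 -vcodec libx264 clip1.mp4
    ffmpeg -loop 1 -i photo2.jpg -vf "fade=in:0:25,fade=out:100:25" -t 5 -r 25 -vcodec libx264 clip2.mp4

The per-photo clips can then be joined, e.g. by remuxing them to MPEG-TS and using the concat: protocol with stream copy.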
[05:17] <cosmint> i found out why flashsv2 is experimental :p
[05:19] <cosmint> thanks relaxed
[06:42] <lahwran> how does one loop an input file into an output file? I have a file I would like to loop so it repeats, say, 10 times in a single stream
[06:45] <relaxed> what kind of file?
[07:04] <relaxed> lahwran: look at concat in the man page
[07:06] <lahwran> relaxed: thanks
[07:06] <lahwran> and, mp4
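Since MP4 files cannot be joined by simple byte concatenation, a hedged sketch for repeating one file is to remux it to MPEG-TS once, repeat it with the concat: protocol (list the part as many times as needed), and remux back; the bitstream filters assume an H.264/AAC input, and the file names are placeholders:

    ffmpeg -i input.mp4 -vcodec copy -acodec copy -vbsf h264_mp4toannexb part.ts
    ffmpeg -i "concat:part.ts|part.ts|part.ts" -vcodec copy -acodec copy -absf aac_adtstoasc looped.mp4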
[07:50] <trupheenix> hi. i want to know how i can work out where to cut my video based on keyframes. i want to do this for streaming.
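One hedged pointer here: with stream copy, ffmpeg can only begin a cut at a keyframe anyway, so a copy cut effectively snaps to keyframe boundaries; the times and file names below are placeholders:

    ffmpeg -ss 00:01:00 -i input.mp4 -t 60 -vcodec copy -acodec copy chunk.mp4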
[10:52] <Xcellence> Hi there.. is there any way atm to decode a vc-1 interlaced stream with ffmpeg? On google i just found some mentions of w32codecs but i dont know if ffmpeg supports this or how i can use them.. :/
[11:37] <Xcellence> nobody knows? :(
[11:47] <zap0> nobody knows is a statement.  adding a question mark doesn't make it a question.
[11:50] <Xcellence> ;)
[12:59] <stonie> can ffmpeg do audio streams as flac?
[14:02] <bgmarete> Hello. How can I make MPEG-2 encoding more scalable? That is, make it use more threads + do more fps? At the moment, the maximum scalability I can achieve is 3 threads.
[14:03] <bgmarete> That is, I am using only 3 cores, no matter my argument to the `-threads' option.
[14:05] <sacarasc> Are you putting -threads in the output options?
[14:19] <bgmarete> sacarasc: Yes.
[14:20] <bgmarete> sacarasc: `ffmpeg -i input.mkv -vf yadif -map 0 -threads 4 -vcodec mpeg2video -r 25 -g 12 -bf 3 -s 720x576 -aspect 4:3 -pix_fmt:v yuv420p -mbd rd -cmp 2 -subcmp 2 -b:v 6000k -maxrate 6000k -minrate 6000k -bufsize 6000k -acodec mp2 -ac 2 -ar 48000 -b:a 256k -f vob -y out.mpg'
[14:58] <SvenL-> is there something special to do to be able to use -pix_fmt with ffplay ?
[14:58] <SvenL-> tells me it's deprecated, -pixel_format option is "not found"
[15:20] <Xcellence> Hi there.. is there any way atm to decode a vc-1 interlaced stream with ffmpeg? On google i just found some mentions of w32codecs but i dont know if ffmpeg supports this or how i can use them.. :/
[15:42] <MrKeuner> hello, how can I rip a bunch of vob files out of a dvd into a single mpeg2 file without any conversion?
[16:06] <relaxed> MrKeuner: you can concat a bunch of vobs without any conversion using ffmpeg
[16:07] <MrKeuner> how do I do it
[16:08] <relaxed> ffmpeg can't "rip" the dvd though
[16:08] <MrKeuner> by rip I meant getting the mpeg2 file out of the dvd
[16:09] <MrKeuner> this is a dvd created from a video8 cassette
[16:09] <relaxed> this would be easier with mplayer:  mplayer -dumpstream -dumpfile output.vob dvd://
[16:10] <MrKeuner> relaxed, absolutely no conversion?
[16:10] <relaxed> zero
[16:10] <MrKeuner> and vob to mpeg2?
[16:11] <MrKeuner> file dvd/file.vob returns mpeg2
[16:11] <MrKeuner> but I don't know if simply changing the extension is ok
[16:11] <relaxed> .vob = MPEG-2 PS format
[16:12] <relaxed> it's a container that holds mpeg2 video and audio streams
[16:14] <relaxed> if it will give you the warm fuzzies you can use `output.mpg`
[16:32] <MrKeuner> if I encode some.dv to mpeg2 versus an mpeg2 in a vob, what's the difference?
[16:40] <relaxed> you want to go from dv -> dvd ?
[16:40] <MrKeuner> relaxed, no trying to understand what container format vob is for
[16:40] <relaxed> DVDs
[16:40] <MrKeuner> mpeg2 already has sound in it, no?
[16:43] <relaxed> mpeg2 is a video codec; mpeg ts and mpeg ps are containers that hold mpeg2 video
[16:45] <MrKeuner> relaxed, what is a container for?
[16:46] <relaxed> http://en.wikipedia.org/wiki/Digital_container_format
[16:52] <MrKeuner> relaxed, so a vob file is like a Java class inheriting from the mpg format, so that changing the file extension from vob to mpg does not break any mpg players
[16:52] <kriegerod> is there some avfilter to transform variable-frame-rate video to fixed-frame-rate? If not, then a) what would be the advised way to do that? b) is it a good idea to implement such a filter?
[16:56] <relaxed> kriegerod: I believe ffmpeg is incapable of outputting vfr, so cfr is all you get.
[16:57] <kriegerod> relaxed, sorry i forgot to mention it, i work with API, not utility
[16:58] <relaxed> still, if ffmpeg can do it you can too.
[17:00] <kriegerod> sure, i just asked in the hope of saving some research effort
[17:00] <relaxed> maybe ask in #libav too, they're pretty good about answering such questions.
[17:02] <aClam> He did minutes before
[17:02] <relaxed> oh, you're right :)
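For the command-line case, the usual way to force a constant frame rate is an output -r (newer builds also have a dedicated fps filter); the rate here is only an example:

    ffmpeg -i vfr_input.mkv -r 25 -vcodec libx264 -acodec copy cfr_output.mkv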
[17:23] <renderTom> Hello everybody
[17:23] <renderTom> I was wondering, is it possible in ffmpeg to use some other scaling method? "-s" seems to produce "soft" scaling
[17:28] <microchip_> renderTom: -sws_flags spline (for example)
[17:31] <renderTom> oh, thanks, will look at that
[17:43] <renderTom> I am having no luck finding understandable info about scaling :)
[17:44] <renderTom> what I am after is pretty simple - "-s" scaling seems to give pretty soft output, so I wonder if there's an option to make it better or to use some other scaling method?
[17:47] <aClam> You didn't try sws_flags ?
[17:49] <renderTom> well, not yet, as I am not sure how to use it :)
[17:49] <microchip_> renderTom: 'ffmpeg -h' and look for -sws_flags
[17:49] <renderTom> oh yeahhhh
[17:49] <renderTom> silly me :)
[17:49] <microchip_> renderTom: to use it, specify the scaler you want. ie, -sws_flags lanczos or -sws_flags spline
[17:50] <renderTom> yeap, got it
[17:50] <renderTom> what method could you recommend?
[17:50] <microchip_> I always use spline, but lanczos is pretty good too
[17:50] <renderTom> mhm, I see
[17:51] <microchip_> i recall that -s defaults to linear, that's why you get the softness
[17:51] <renderTom> will it take longer to transcode using this scaling method?
[17:51] <microchip_> yes, a bit
[17:51] <microchip_> spline is more demanding than lanczos, though
[17:51] <renderTom> but you'd recommend spline to try at first. Thanks will give it a go
[17:52] <microchip_> i recommend both spline and lanczos :)
[17:53] <renderTom> :)
[17:53] <microchip_> use either one of them
[17:53] <renderTom> will try spline at first
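Putting the pieces together, a sketch of renderTom's conversion with a sharper scaler (spline shown; lanczos works the same way, and the paths are placeholders):

    ffmpeg -i in.mov -sws_flags spline -s 1024x576 -aspect 16:9 -an -vcodec prores -profile 2 out.mov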
[17:53] <renderTom> by the way, I am on iMac i7 with 4 cores (+ 4 virtual),
[17:53] <renderTom> and was wondering, how to distribute or use all cores during compression?
[17:53] <renderTom> is it even possible?
[17:54] <microchip_> -threads 4
[17:54] <renderTom> was trying -thred 4 with no luck
[17:54] <microchip_> hmm
[17:54] <microchip_> don't know, i'm not very much into ffmpeg
[17:54] <renderTom> at least I didn't see any change in activity monitor
[17:54] <aClam> -threads 0
[17:55] <aClam> I used that to auto detect, but it will depend on what compression library you are using
[17:55] <renderTom> and also, is it important where to put this -thread 0 command?
[17:55] <aClam> if you are using libx264 it will work
[17:55] <renderTom> I am going from h264 to prorez
[17:55] <renderTom> ffmpeg -i /_in.mov -vcodec prores -profile 2 -aspect 16:9 -s 1024x576 -an /_out.mov
[17:55] <aClam> Then it will depend on the prores encoder whether it supports multithreaded encoding
[17:55] <renderTom> so does it matter where to put -threads and -sws_flags command?
[17:56] <renderTom> aClam, I see, that makes sense
[17:56] <aClam> I haven't used those flags since the command line syntax reshuffle so I wouldn't know exactly
[18:00] <renderTom> ok, I'll put -treads and -sws_flags at the end of the command
[18:00] <relaxed> anywhere after the input should be fine
[18:01] <renderTom> yep, so it seems h264 to prores has no effect with -threads 0
[18:02] <renderTom> as I see in Activity Monitor (mac) that process jumps from one core to another
[18:02] <relaxed> use -vf scale=1024:576,setdar=16:9
[18:02] <renderTom> and not all 4 of them are working simultaneously
[18:03] <relaxed> 0 only works with libx264. try -threads 4
[18:03] <renderTom> -vf? Is it different from -sws_flags?
[18:04] <aClam> It's not "-treads" by the way
[18:05] <relaxed> using a recent version of ffmpeg, your command should be: ffmpeg -i input -an -threads 4 -sws_flags spline -vf scale=1024:576,setdar=16:9 -vcodec prores -profile 2 output.mov
[18:05] <aClam> -vf is a more verbose way of saying -s
[18:06] <aClam> For example: "-vf scale=854:480"
[18:06] <renderTom> sorry, I don't know what verbose means
[18:06] <renderTom> oh, ok
[18:06] <aClam> instead of "-s 854x480"
[18:10] <renderTom> with that syntax <relaxed> just provided I don't see any change :)
[18:11] <renderTom> anyway, I am just testing, so I have time even if only one core is working
[18:11] <renderTom> thanks for your input guys
[18:12] <relaxed> renderTom: prores may not be multithreaded. The syntax i used is the recommended way now.
[18:15] <renderTom> I see
[18:15] <renderTom> thank you
[21:13] <JDuke128> hello, i can read a file with this function (avformat_open_input), but how can i read from a network stream?
[21:36] <JDuke128> hey
[21:36] <JDuke128> i can read a file with this function (avformat_open_input), but how can i read from a network stream?
[21:46] <Zilly> Can FFmpeg encode rgb24 to Theora or only yuv?
[21:52] <pasteeater> Zilly: ffmpeg will probably autoselect yuv420p for the output
[21:53] <pasteeater> if your input is rgb
[21:53] <Zilly> pasteeater, Yeah.  I meant that I'm trying to encode a set of frames to a video.  The images are in rgb24 and, when encoded with Theora, look awful.  However, when encoded with x264, it looks perfect.
[21:55] <teratorn> Zilly: bad quality, or corrupted?
[21:56] <teratorn> JDuke128: use a correct URL and it will be opened
[21:56] <Zilly> teratorn, it appears that it may be corrupted.  It's not playing back properly.  The part that does is pretty bad, though.
[21:57] <pasteeater> Zilly: use a pastebin site to show your ffmpeg command and the complete console output
[21:57] <Mista_D> Any way to have FFmpeg store the mp4 atoms at the beginning of the file?
[21:57] <Zilly> pasteeater, I don't have the code with me at the moment.  Do you know of anything that I could look for that would be suspicious?  It complains that the images are in rgb24 instead of YUV420p.
[21:58] <tweek__> AtomicParsley can do that
[21:58] <teratorn> Zilly: so.. don't do that? put them in yuv420p first
[21:58] <JEEB> Mista_D, the qt-faststart thingy in the tools or whatever folder
[21:58] <JEEB> you have to build it separately but it's there
[21:59] <Zilly> teratorn, my question was if there is anyway to have Theora encode rgb24 images instead of yuv420p.
[21:59] <JEEB> Zilly, probably not unless the encoder has it as a feature :P
[21:59] <JEEB> I'd be surprised if Theora supports  RGB tbqh
[21:59] <teratorn> Zilly: sws_scale()
[21:59] <Mista_D> JEEB: thanks.
[22:00] <Mista_D> I'll stick to mp4box for container fixes.
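For completeness, the qt-faststart tool JEEB mentions lives in the tools/ directory of the ffmpeg source tree and is built separately; a sketch of its use, assuming an already configured source tree and placeholder file names:

    make tools/qt-faststart
    ./tools/qt-faststart slow_start.mp4 fast_start.mp4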
[22:03] <Zilly> teratorn, thanks.  It seems like FFmpeg does that by default, though, right?  I'd probably be better off saving the frames in yuv420p.  Is there any format that does this by default?  JPEG is YCbCr, right?
[22:05] <vadim_> hi 2 all
[22:07] <tweek__> usually I find it's easier to just rely on mp4box exclusively for remuxing
[22:09] <JDuke128> teratorn , i want to decode from char* custom binary data instead of file
[22:15] <teratorn> Zilly: Y`CbCr, yea
[22:16] <Zilly> teratorn, won't that format cause the same problem that rgb24 causes?
[22:17] <teratorn> you said it complains that the images are not in yuv420p
[22:18] <teratorn> Zilly: so just try giving it frames in yuv420p and see what happens
[22:18] <teratorn> you can do the pixel format conversion yourself with sws_scale() before encoding the frames
[22:18] <Zilly> teratorn, Yeah.  That's what I want to do.  I'm under the impression that Y`CbCr and yuv420p are not the same.
[22:18] <teratorn> I don't know if it's already doing that for you or not
[22:18] <teratorn> well, you have rgb24 already so just start with that
[22:19] <cbreak> YCbCr is a color standard
[22:19] <cbreak> it doesn't specify color subsampling
[22:19] <cbreak> I use YCbCr with a 422 subsampling for example
[22:19] <Zilly> teratorn, ok.  I'll do that.
[22:20] <Tjoppen> cbreak: don't forget color siting
[22:21] <Zilly> cbreak, I see.  I don't really know how to convert a JPEG to different subsamples.  I'm just using PIL to do some really light editing on frames.
[22:23] <teratorn> Zilly: erm, well, sws_scale() does pixel format conversions including re-subsampling I believe
[22:24] <teratorn> Zilly: so load (decompress) your jpeg frame, and run it through sws_scale() specifying the correct source and destination pixel formats
[22:25] <Zilly> teratorn, ok.  That's what I'm assuming I'll have to do.  I was just hoping that I could continue using PIL.
[22:25] <teratorn> JPEG can be saved in various subsampling formats, so you'll have to get that from the header I guess
[22:27] <teratorn> Zilly: you're already using ffmpeg to encode a video right?
[22:27] <Zilly> teratorn, Yeah.
[22:27] <teratorn> Zilly: from static jpeg files?
[22:27] <JDuke128> what's the fastest way to render avcodec_decode_video2 output to the framebuffer? any extra lib?
[22:27] <Zilly> teratorn, Well, PNGs at the moment--they were working better to encode with x264.
[22:27] <Zilly> teratorn, but, it's no difference to use JPEG.
[22:27] <teratorn> Zilly: and you're doing this from the command-line (not in C code) ?
[22:28] <Zilly> teratorn, more or less.  I'm just running a command from Python's commands lib.
[22:28] <teratorn> Zilly: hehe. don't use the 'commands' module. use subprocess pl0x
[22:29] <Zilly> teratorn, Err... WAS using commands--now using subprocess >.<
[22:29] <teratorn> Zilly: and uh, I think that /should/ just work and do the correct pixel format conversion for you
[22:29] <teratorn> Zilly: so sounds bug-report/mailing-list worthy
[22:31] <Zilly> teratorn, wait.  Should that work with PNG rgb24?  Or are you saying if I have this problem with JPEG?
[22:31] <teratorn> I'm saying it should work with any set of valid image files that ffmpeg knows how to load
[22:32] <Zilly> teratorn, what if I gave it a pretty weird set where some have an alpha channel and some don't?  Would that affect it?
[22:32] <teratorn> uhhh
[22:32] <teratorn> quite possibly
[22:33] <Zilly> teratorn, that was my next thing to try. I just figured that I'd have to give it yuv420p since it complained about it.
[22:33] <teratorn> I would tend to think the alpha channel would be ignored
[22:33] <Zilly> teratorn, thanks!
[22:33] <teratorn> but hard to say
[22:34] <teratorn> Zilly: please do follow this up, it sounds off
[22:37] <Zilly> teratorn, ok.  I'm actually at work >.<  This is a side project.  I'll check back into IRC later tonight / tomorrow with results.
[22:38] <teratorn> very good
[22:39] <Zilly> teratorn, thanks!
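If the mixed alpha channels turn out to be the culprit, a hedged command-line workaround is to force the pixel format conversion before libtheora sees the frames; the file pattern and quality value are placeholders:

    ffmpeg -i frame%04d.png -pix_fmt yuv420p -vcodec libtheora -qscale 7 out.ogv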
[22:56] <__vincent> does ffmpeg know how to write an mp4 header?
[22:57] <sacarasc> Yes.
[22:59] <jensverwiebe> hi folks
[22:59] <jensverwiebe> i'm the blender OSX maintainer and ran into a problem with ffv1, both on the 0.10 release and trunk
[23:00] <jensverwiebe> ./ffmpeg -i /Volumes/SystemHD/Users/jensverwiebe/Downloads/Hausbrennt8d0001_0250a.mov -pix_fmt rgb32 -vcodec ffv1 -an output.mov is scrambled due to pix_fmt rgb32
[23:00] <jensverwiebe> any clue?
[23:02] <jensverwiebe> made a crosscheck with 0.87 and all is ok in that version, configured and compiled the same way
[23:40] <jensverwiebe> hmm, no one ?
[23:51] <relaxed> jensverwiebe: can you stick a small sample of your input somewhere?
[23:51] <jensverwiebe> relaxed: sure, moment
[23:52] <relaxed> sorry, I have to run right now. go ahead and paste the url and I'll look at it when I return.
[23:53] <jensverwiebe> relaxed: http://www.jensverwiebe.de/temp/ffmpeg_input_rgb32.mov
[23:53] <jensverwiebe> relaxed: btw, tested all of rgb24, rgb444 etc now; only rgb32 seems to be broken with ffv1, but not with huffyuv for example
[23:54] <jensverwiebe> always same input movie used
[23:54] <jensverwiebe> relaxed: + afaik works on Linux
[00:00] --- Wed Feb 15 2012


More information about the Ffmpeg-devel-irc mailing list