[Ffmpeg-devel-irc] ffmpeg.log.20130510

burek burek021 at gmail.com
Sat May 11 02:05:01 CEST 2013


[00:27] <Dettorer> hi there, I have a problem with ffmpeg trying to stream to a justintv service (twitch.tv)
[00:27] <Dettorer> all frames are dropped after a few seconds of streaming
[00:38] <Dettorer> strangely, on my laptop (same network, same system, same kernel, just a slightly better hardware), it works fine
[00:39] <Dettorer> (oh yes I'm on archlinux)
[00:40] <Dettorer> I use these options http://paste.awesom.eu/0qs
[01:07] <schtinky> i'm back. using the following to grab still frames from a digital TV tuner device on a raspberry pi:
[01:08] <schtinky> "gnutv | ffmpeg -skip_frame nokey -i inputfile.mpg -an -r 1 -vsync vfr outputfiles%9d.tiff"
[01:09] <schtinky> ... and that command works except the raspberry pi is a hair too slow
[01:09] <schtinky> can do about 27 images in 30 seconds
[01:09] <schtinky> generates dvr overflow errors due to being unable to keep up
[01:09] <schtinky> I know tiff is faster than jpg, png and pam
[01:10] <schtinky> is there anything else I can try, either format-wise or via the ffmpeg parameters to squeeze it out just a little faster?
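One untested idea for squeezing out a little more speed: if the image encoder is the bottleneck, a raw output format may help, since BMP frames are written uncompressed and skip TIFF/PNG compression work entirely (at the cost of more disk I/O). This sketch only changes the output extension of schtinky's command and writes it to a helper script for review:

```shell
# Hypothetical variant of the quoted command: identical except the output
# format is BMP (stored raw, so no compression CPU is spent per frame).
cat > grab-stills.sh <<'EOF'
#!/bin/sh
gnutv | ffmpeg -skip_frame nokey -i inputfile.mpg -an -r 1 -vsync vfr outputfiles%9d.bmp
EOF
cat grab-stills.sh
```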
[01:10] <Mavrik> not really
[01:11] <schtinky> i've tried overclocking the pi, but it effs the SD card
[01:11] <schtinky> looks like I'm going to have to upgrade to some sort of atom-based machine
[01:11] <schtinky> yes I said "upgrade to atom"
[01:13] <Mavrik> yeah well
[01:13] <Mavrik> RPi does have an exceptionally crappy CPU
[01:14] <Mavrik> with obsolete architecture
[01:14] <Mavrik> no NEON support, nothing
[01:14] <Mavrik> I'm not sure who keeps persuading people it's capable of doing anything useful related to video above 320x240 >(
[01:14] <Mavrik> :)
[01:23] <schtinky> I have to buy potentially hundreds of machines that can grab still frames from OTA TV
[01:23] <schtinky> if I can do it with $100 machines instead of $500 machines, that makes a big difference
[01:24] <schtinky> oh man, don't look now. Just got 29/30 frames to work
[01:24] <schtinky> overclocked the pi, turned off xorg
[01:24] <schtinky> SOOoooo close
[01:24] <schtinky> still probably very far away from production-worthy stability, though
[01:25] <schtinky> one hiccup in the digital tuner would probably wreck the whole thing
[01:26] <schtinky> and I still have to run some sort of program to upload the resulting images
[01:26] <schtinky> which will take its own chunk of cpu
[01:26] <schtinky> I think I'm beaten on this
[01:39] <Zarx> if anyone could help me, I have been having a really hard time getting things to work right when I build ffmpeg using patched versions of x264. It doesn't want to accept the new commandline options that are added by the patches
[01:54] <llogan> Dettorer: you should show an actual ffmpeg command and the complete console output.
[01:56] <llogan> Dettorer: ...minus your stream key of course
[02:00] <Zarx> i gotta go pick someone up, ill paste it in just a bit
[02:15] <Mavrik> schtinky, how about getting one of those boards with an actual 1GHz+ multicore Cortex-A15 with NEON?
[02:24] <highgod> Hi, I want to ask a question: we want to use ffmpeg to upload a stream to ffserver, and use another ffmpeg to receive the stream from ffserver and save it as a file. Can ffmpeg do that, and what is the command line? thanks
[02:26] <gajbooks> I'm using sort of a yucky frontend for FFMPEG called FFsplit to record my screen to H.264 and I'm noticing that any text that is truly black is weirdly multicolored. Is this fixable by a flag, or is that just the way H.264 works?
[02:32] <Mavrik> gajbooks, that would look like a colorspace conversion wart
[02:32] <Mavrik> gajbooks, are you using x264 and encoding to yuv420?
[02:33] <gajbooks> I don't know what I'm technically encoding from, but I'm encoding to x264
[02:34] <gajbooks> Oh, it is grabbing from yuv420p
[02:34] <gajbooks> Excuse me, to.
[02:34] <gajbooks> Mavrik: Yes.
[02:35] <Mavrik> gajbooks, yeah, going from RGB to YUV420 can cause color artifacts :)
[02:35] <gajbooks> Is there any way to fix it?
[02:35] <Mavrik> gajbooks, usually setting colormatrix flag on x264 fixes it
[02:35] <Mavrik> try either bt709 or bt470bg
[02:35] <Mavrik> I always forget which is used by ffmpeg
[02:37] <Mavrik> gajbooks, so to recap, use x264 parameters and pass in "colormatrix=bt709" :)
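Assembled into a full command line, Mavrik's suggestion might look like the sketch below. The input/output names are placeholders, and whether bt709 or bt470bg is correct depends on the source, as noted above; the command is written to a script for review rather than executed:

```shell
# Hypothetical command: pass colormatrix through to x264 via -x264opts,
# tagging the stream so players apply the intended YUV->RGB matrix.
cat > tag-matrix.sh <<'EOF'
#!/bin/sh
ffmpeg -i capture.avi -c:v libx264 -x264opts colormatrix=bt709 out.mp4
EOF
cat tag-matrix.sh
```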
[02:38] <gajbooks> It doesn't like that argument.
[02:38] <Mavrik> are you passing it to x264 not ffmpeg?
[02:39] <gajbooks> How would one do that through a sucky front end?
[02:39] <Mavrik> no idea
[02:39] <Mavrik> by not using sucky frontend usually ;)
[02:39] <Mavrik> or consulting docs :P
[02:43] <gajbooks> This sucky front end seems to be one of the things with the right DirectShow filters to record screens for free.
[02:49] <llogan> ffmpeg can record your screen too
[02:49] <llogan> https://ffmpeg.org/trac/ffmpeg/wiki/How%20to%20grab%20the%20desktop%20%28screen%29%20with%20FFmpeg
[02:49] <llogan> ugly ass url. why no camelcase?
[02:50] <llogan> wiki/Screencast would have been nice
[02:51] <gajbooks> Now that I found the right thing, it's giving me an "error splitting the argument"
[02:51] <llogan> highgod: did you refer to the tee muxer? http://ffmpeg.org/ffmpeg-formats.html#tee
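For what highgod described (one ffmpeg pushing into ffserver, another pulling the published stream and saving it), the two ends might look like this sketch. The port, feed name, and stream name depend entirely on ffserver.conf and are hypothetical here:

```shell
cat > relay.sh <<'EOF'
#!/bin/sh
# sender: push into an ffserver feed (feed1.ffm must be declared in ffserver.conf)
ffmpeg -i input.avi http://localhost:8090/feed1.ffm &
# receiver: pull the published stream and save it without re-encoding
ffmpeg -i http://localhost:8090/stream.flv -c copy saved.flv
EOF
cat relay.sh
```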
[02:58] <highgod> record screen ffmpeg -f dshow -i video="screen-capture-recorder" -vcodec libx264 -x264opts bframes=0:ref=1 -preset fast out.m2ts
[02:58] <highgod> it works, I work on it too
[03:03] <ubitux> http://pastie.org/7824987  am i doing something stupid, or is it the compiler?
[03:03] <ubitux> oups bad chan
[03:58] <Hans_Henrik> if i want to COPY the audio from src to destination, how can i do that?  does this seem right? -acodec copy
[04:04] <Hans_Henrik> seems so yes
[09:47] <ankr> How to convert MP4 => OGV on a mac?
[09:47] <ankr> I tried "ffmpeg -i input.mp4 -acodec libvorbis -vcodec libtheora -f ogv output.ogv" but got the error "[NULL @ 0x7fe2a901ee00] Requested output format 'ogv' is not a suitable output format"
[09:48] <ankr> I also tried "ffmpeg -i input.mp4 -acodec vorbis -vcodec libtheora -f ogg output.ogv" but got the error "Unknown encoder 'libtheora'"
[09:54] <relaxed> ankr: -f ogg
[09:54] <relaxed> oh
[09:54] <JEEB> then you don't have libtheora built into your ffmpeg build
[09:54] <relaxed> ankr: http://goo.gl/DPrRY
[09:55] <RSDRSDRSD> I want to encode user submitted videos, I am using the following command: http://pastebin.com/1vYhFL48 but the videos seem rather shocky, what can go wrong?
[09:56] <retard> the videos seem shocky? your users might be trolls i guess?
[09:56] <RSDRSDRSD> quality of a still image is rather good, but when it is playing it doesn play smooth
[09:57] <RSDRSDRSD> sorry for my wrong language
[09:57] <RSDRSDRSD> it doesn play smooth ;-)
[09:57] <relaxed> are you sure their input is interlaced?
[09:58] <relaxed> because using yadif on non-interlaced input would do that
[09:59] <RSDRSDRSD> no not sure about that
[09:59] <RSDRSDRSD> how can I detect that?
[10:01] <RSDRSDRSD> if a video is interlaced, does it give problems when I don't make it non-interlaced?
[10:03] <JEEB> you probably meant something else when writing that
[10:03] <JEEB> you will have combing and the clip loses the "I'm interlaced yo" flag if you re-encode interlaced content as-is as progressive
[10:03] <ankr> relaxed: can you point me in a direction on how to get the libtheora?
[10:04] <relaxed> ankr: install libtheora-dev and recomplile ffmpeg with --enable-libtheora, or use my static binary
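The rebuild relaxed described might look roughly like the following sketch. The configure line assumes the libtheora and libvorbis headers are already installed, and package manager, prefix, and extra configure flags vary by platform:

```shell
cat > rebuild-theora.sh <<'EOF'
#!/bin/sh
# inside the ffmpeg source tree
./configure --enable-libtheora --enable-libvorbis
make && make install
# then ankr's conversion; the .ogv extension selects the ogg muxer automatically
ffmpeg -i input.mp4 -acodec libvorbis -vcodec libtheora output.ogv
EOF
cat rebuild-theora.sh
```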
[10:05] <RSDRSDRSD> hmm, I don't understand that
[10:06] <RSDRSDRSD> I have user submitted video content
[10:06] <RSDRSDRSD> I want to encode it to webm/mp4
[10:06] <JEEB> also it all depends on what you mean "doesn't play smooth"
[10:07] <RSDRSDRSD> Well on scene changes or fast scenes it just looks like it stands still and goes further
[10:09] <relaxed> RSDRSDRSD: Install mediainfo and test with --> mediainfo --Inform='Video;%ScanType%' user_video.mpg
[10:09] <relaxed> If it returns "Interlaced", use yadif
[10:11] <RSDRSDRSD> ok thanks, will try that
[10:11] <relaxed> there's also ffmpeg's idet filter
[10:11] <relaxed> I'm not sure which is better.
[10:12] <relaxed> Seems like there should be an option for yadif to test for interlaced input and have the option to pass through if it's not.
[10:15] <RSDRSDRSD> doesn't ffprobe have some info on interlaced content?
[10:16] <relaxed> RSDRSDRSD: try yadif=deint=interlaced
[10:17] <relaxed> that might be your silver bullet ^^
[10:18] <RSDRSDRSD> shouldn't it be tdeint ?
[10:19] <relaxed> "interlaced, only deinterlace frames marked as interlaced"
[10:22] <RSDRSDRSD> oh, it wasn't a typo?
[10:22] <relaxed> man ffmpeg-filters| less +/^'   'yadif
[10:26] <RSDRSDRSD> if I read at http://sonnati.wordpress.com/2012/10/19/ffmpeg-the-swiss-army-knife-of-internet-streaming-part-vi/
[10:26] <RSDRSDRSD> point 3
[10:30] <RSDRSDRSD> what if I do yadif=0:-1:1 (send frame, autodetect parity, only frames which are interlaced)
[11:38] <RSDRSDRSD> deint=interlaced is also working well, but I don't know what really happens when using that one; I can't find the option in the docs
[11:50] <saste> RSDRSDRSD: please file a ticket (about the missing docs)
[11:55] <relaxed> it's in my man page, I quoted it above.
[11:59] <RSDRSDRSD> ah, now I see, but shouldn't I send the mode and parity also?
[11:59] <praveenmarkandu> hi is there a way to generate WEBVTT
[11:59] <praveenmarkandu> closed questions
[11:59] <praveenmarkandu> *closed captions
[12:02] <RSDRSDRSD> ah okay, yadif=0:-1:1 is the same as yadif=deint=interlaced, since mode and parity default to 0 and -1 respectively
[12:02] <RSDRSDRSD> thanks
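The detect-then-deinterlace flow discussed above can be sketched as two passes; frame count and filenames are illustrative, and the commands are written to a script for review:

```shell
cat > deint.sh <<'EOF'
#!/bin/sh
# pass 1: let the idet filter sample the input and print interlacing statistics
ffmpeg -i user_video.mpg -vf idet -frames:v 200 -an -f null -
# pass 2: deinterlace only frames flagged as interlaced, pass the rest through
ffmpeg -i user_video.mpg -vf yadif=deint=interlaced -c:v libx264 out.mp4
EOF
cat deint.sh
```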
[12:04] <ubitux> praveenmarkandu: no, only demuxing so far
[12:04] <ubitux> patch welcome
[12:12] <praveenmarkandu> @ubitux: any way to insert soft subs into a hls stream?
[12:18] <ubitux> praveenmarkandu: possibly
[12:18] <ubitux> with mov_text maybe
[12:30] <durandal_1707> RSDRSDRSD, ubitux: deint has been removed
[12:32] <ubitux> durandal_1707: huh??
[12:32] <ubitux> why?
[12:32] <ubitux> oh, merge...
[12:33] <ubitux> ?
[12:34] <durandal_1707> oh. deint is what?
[12:34] <ubitux> it looks present
[12:34] <ubitux> just undocumented
[12:34] <durandal_1707> where it is?
[12:35] <ubitux> yadif filter option
[12:35] <ubitux> to make it deinterlace only interlaced frames
[12:35] <durandal_1707> ah, I thought the user was talking about a filter, so I replaced it with something else ....
[12:42] <relaxed> why were the other deinterlacing filters ported? I thought yadif was by far the best.
[13:05] <Gabriel_Blake> Could someone help me out? I'm trying to convert .dv (from a minidv camcorder) to .mkv (x264/AAC). After demuxing the original stream, audio and video are out of sync, but in the original file they're OK. Can ffmpeg correct the demuxing?
[13:17] <luc4> Hi! I have an h264 raw file with variable framerate. Can I wrap this into an mp4 with ffmpeg?
[13:18] <klaxa> <relaxed> why were the other deinterlacing filters ported? I thought yadif was by far the best.
[13:19] <klaxa> i thought so too until yesterday
[13:19] <klaxa> yadif actually introduced some stuttering (frames shown slightly too long) which mplayers pulldown filter did not
[13:19] <klaxa> *mplayer's
[13:20] <klaxa> *pullup
[13:20] <klaxa> god what's wrong with me today
[13:23] <durandal_1707> yadif is an inverse interlacer and pullup is an inverse teleciner
[13:23] <klaxa> hmm... right
[13:25] <klaxa> nvm me...
[14:10] <RSDRSDRSD> what is best to use for deinterlacing now?
[14:11] <JEEB> I think current ffmpeg has two good filters for deint
[14:11] <JEEB> yadif (for general deinterlacing, and handles true interlacing)
[14:11] <JEEB> and vivtc if ubitux got that in
[14:12] <JEEB> latter is very specific for a certain use case, so it isn't useful for automation
[14:12] <luc4> Hi! I have an h264 raw file with variable framerate. Can I wrap this into an mp4 with ffmpeg?
[14:12] <JEEB> how are the timestamps for it stored (if at all)?
[14:13] <JEEB> if it's raw H.264 I don't think it has any timing information whatsoever
[14:13] <ubitux> JEEB: fieldmatch is in yes
[14:13] <luc4> JEEB: oh... I created it using Android classes; I gave the encoder the timestamps.
[14:13] <luc4> JEEB: but it is possible it is ignored...
[14:13] <JEEB> luc4, and if it's a raw H.264 stream that doesn't keep the information at all ^^;
[14:14] <JEEB> it probably isn't ignored during the encoding
[14:14] <ubitux> JEEB: http://ffmpeg.org/ffmpeg-filters.html#fieldmatch
[14:14] <JEEB> but the raw Annex B just doesn't contain timestamps
[14:14] <JEEB> so they got used during encoding and then thrown out
[14:14] <JEEB> ubitux, nice, what about decimation?
[14:14] <luc4> oh... so from that file there is no way I can get a "correct" mp4 right?
[14:15] <JEEB> yes, unless you bring in the timestamps from somewhere else
[14:15] <ubitux> JEEB: http://ffmpeg.org/ffmpeg-filters.html#decimate
[14:15] <JEEB> \o/
[14:16] <JEEB> also as far as I know ffmpeg's annex B frame types etc. -> PTS calculation is still borked, so I recommend using L-SMASH for annex B -> mp4
[14:17] <luc4> JEEB: ok, so I'll have to change this entirely. Thanks!
[14:18] <JEEB> I would recommend you mux your encoded stream into some container (mkv, mov/mp4) when capturing if you want to keep the timestamps
[14:18] <luc4> JEEB: that seems not to be supported with the hardware encoder unfortunately...
[14:19] <luc4> so I thought I could mux after the entire video was stored in raw format. I thought that nal units did carry timestamps... but I guess I was wrong...
[14:19] <JEEB> luc4, you take the NAL units and use something to mux it?
[14:19] <JEEB> and the timestamps should have to be passed somehow
[14:20] <JEEB> also there might be a way of putting timestamps into the NAL units, but I'm pretty sure most parsers will just ignore it, even if it would exist :s
[14:21] <luc4> JEEB: I encode raw frames from the camera (vfr) to h264 with the hardware encoder, and then I thought I could use ffmpeg to create an mp4 or a mov to correctly play in the android player. The timestamps are passed to the hardware encoder for each frame.
[14:22] <JEEB> you could use libavformat yes
[14:22] <luc4> JEEB: but as the hardware encoder API is generic, timestamps could be ignored I suppose.
[14:22] <JEEB> does the encoder give you timestamps from the output side? in the API?
[14:22] <luc4> JEEB: the problem is that I see the result has an incorrect frame rate after muxed via ffmpeg. So either my h264 does not have the timestamps, or I'm muxing wrong using ffmpeg.
[14:23] <luc4> no, I give the timestamps to the encoder, but it returns only a buffer
[14:24] <JEEB> so you get no PTS from the encoder in addition to the buffer?
[14:24] <JEEB> that sucks :s
[14:25] <luc4> JEEB: it seems not... http://developer.android.com/reference/android/media/MediaCodec.html
[14:25] <luc4> or maybe yes...
[14:26] <luc4> JEEB: dequeueOutputBuffer also fills a BufferInfo, that might be what you are talking about...
[14:26] <luc4> in case those were the timestamps, how could I provide to ffmpeg to get a correct mp4?
[14:27] <JEEB> yeah, the presentationTimeUs sounds like PTS
[14:27] <JEEB> not sure what the offset is :s
[14:28] <luc4> but can I provide those to ffmpeg somehow then?
[14:28] <JEEB> the library, yes
[14:29] <luc4> and from the command line? I could test with that first, might be simpler.
[14:30] <JEEB> you could output the timestamps onto a "ogm timecode file" that is a text file
[14:30] <JEEB> *v2 timecode file
[14:31] <JEEB> and then mux that Annex B file with L-SMASH's muxer first
[14:31] <JEEB> and then use the timelineeditor binary in L-SMASH to modify the timestamps
[14:31] <JEEB> according to the v2 timecode file
[14:33] <luc4> JEEB: ok, I'll try to study this and see if I can follow your instructions :-) thanks!
[14:34] <JEEB> example of the v2 timecode format btw https://trac.bunkus.org/browser/examples/example-timecodes-v2.txt
[14:34] <JEEB> these do actually ignore b-frames and such so you might be able to just grab the timestamp information from the input side
[14:35] <JEEB> (as in, these are the timestamp values for the input/after decoding clip)
[14:35] <JEEB> or put more simply, "in presentation order"
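Turning a list of presentationTimeUs values (microseconds, presentation order) into a v2 timecode file takes only a few lines of shell; the sample values below are hypothetical stand-ins for what MediaCodec's BufferInfo would report:

```shell
# presentation timestamps in microseconds, e.g. from MediaCodec's BufferInfo
pts_us="0 33366 66733 100100"

out=timecodes-v2.txt
echo "# timecode format v2" > "$out"       # mandatory v2 header line
for t in $pts_us; do
    # v2 files list one timestamp per frame, in milliseconds
    awk "BEGIN { printf \"%.3f\n\", $t / 1000 }" >> "$out"
done
cat "$out"
```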
[14:38] <luc4> But then I should provide that v2 file to ffmpeg together with my h264?
[14:39] <JEEB> yes, but I'm not sure if ffmpeg reads that, which is why I mentioned the tools from the L-SMASH toolset
[14:39] <JEEB> http://code.google.com/p/l-smash/
[14:41] <luc4> JEEB: ah ok, I see. But in case I wanted to do it in code with libavformat, should this be possible directly with ffmpeg?
[14:41] <JEEB> well yes, you would be directly using libavformat then, and giving the encoder's decoding order timestamps to the muxer
[14:41] <JEEB> ffmpeg the tool wouldn't be called
[14:42] <JEEB> (of course if you end up using mp4 as the output, L-SMASH's muxer is another alternative, but libavformat can generally output to various formats so it can be closer to your needs, possibly)
[14:43] <luc4> I'll have to study then :-) thanks!
[14:49] <luc4> JEEB: I read this: Note that FFmpeg's mp4 muxer does not support vfr output. Do you know if this is correct?
[14:50] <JEEB> no, it is not
[14:50] <JEEB> it works on timestamps and thus VFR works just fine
[14:50] <JEEB> the only problem I know of the command line application's muxing capabilities is the PTS generation from a frame rate and the frame types
[14:51] <luc4> JEEB: I was thinking of trying to use libavformat directly :-)
[14:52] <JEEB> I wish thee luck, you should first get current ffmpeg built for whatever architecture you're using android under
[14:52] <JEEB> and then you shall open the documentation
[14:53] <JEEB> and read where available :)
[14:55] <luc4> JEEB: I am a little familiar with this: http://ffmpeg.org/doxygen/trunk/doc_2examples_2muxing_8c-example.html. I suppose that is what I should refer to. I cross-compiled ffmpeg for some architectures in the past and it worked perfectly, maybe there is something done already for android...
[14:56] <luc4> JEEB: this seems nice! http://sourceforge.net/projects/ffmpeg4android/
[15:47] <MortenB> Hi there. I'm converting Canon Ixus digital camera videos with 11024 Hz mono pcm_u8 sound into mp4 files using libfdk_aac for the sound side. If I don't upsample I get "Unable to encode frame: Encoding error". If I do (using -ar 44100) I get a weird hiss, that is noticeably worse sound quality, irrespective of VBR setting. How do I go about this?
[15:52] <Peace-> hey guys, I did this but I got a video that is much too fast ...  /home/shared/ffmpeg-04-23-13/./ffmpeg -s   $(xrandr | awk '/, current /{print $8}')x$(xrandr  | awk '/, current /{gsub(/\,/,"");print $10}') -f x11grab -r 25  -i :0.0  -vcodec  h264  -y  $HOME/output.avi
[15:53] <Peace-> i have changed codec but no way
[15:53] <Peace-> the video is always much too fast
[16:37] <luc4> Hi! I'm trying to compile a cpp source file but it keeps complaining about an undefined reference to `avcodec_get_name', and a couple of others... shouldn't these be included in the static libs?
[16:38] <JEEB> luc4, this? http://ffmpeg.org/faq.html#I_0027m-using-FFmpeg-from-within-my-C_002b_002b-application-but-the-linker-complains-about-missing-symbols-which-seem-to-be-available_002e
[16:39] <luc4> JEEB: done that already yes
[16:40] <luc4> JEEB: I'm trying to compile muxing.c to be precise.
[16:40] <JEEB> that'd be C, not C++...
[16:40] <JEEB> and yes. those should be in the libraries
[16:40] <luc4> JEEB: yes, but I put those sources in a c++ file
[16:41] <luc4> I had to redefine a symbol which was not compiling
[16:41] <luc4> but still for 6 functions I get undefined reference
[16:42] <JEEB> also you should check the api changelog documentation regarding those undefined ones, make sure the example (if that was it) was up-to-date
[16:42] <JEEB> and do check the exports of that library to make sure what is being exported
[16:44] <luc4> JEEB: those are in the header... by exported you mean __attribute__((__visibility__("default")))?
[16:45] <luc4> JEEB: for examples, I don't understand why avcodec_get_name should not be visible...
[16:46] <JEEB> no, I mean check with tools what symbols the libraries export
[16:46] <JEEB> the libraries you built
[16:47] <luc4> JEEB: seems to be there yes: avcodec_get_name
[16:48] <JEEB> ok, then make sure you are linking to that library
[16:48] <luc4> I used pkgconfig line: -pthread -L/usr/local/ffmpeg-1.2/lib -lavformat -lavcodec -ldl -lva -lXfixes -lXext -lX11 -ljack -lasound -lSDL -lx264 -lvorbisenc -lvorbis -ltheoraenc -ltheoradec -logg -lspeex -lrtmp -lgnutls -lopencore-amrnb -lz -lavutil -lm -lswscale
[16:49] <JEEB> if the library is there, and gets linked to, and you have included all the headers correctly as C stuff, I don't see a reason for it to fail :P
[16:49] <JEEB> so make yourself a checklist and see where you could have failed
[16:49] <luc4> maybe... the linker is taking the libs from my system?...
[16:50] <luc4> I mean those from the system path instead of those I told it to.
[17:00] <luc4> JEEB: ok, that was it. For some reason it was ignoring the -L and was taking the system libs. Thanks.
[17:02] <JEEB> luc4, I think the -L spots are added to the end of the search path, no?
[17:03] <luc4> JEEB: I thought it could override the system paths...
[17:04] <JEEB> that would happen if the -L flags set directories before the search path
[17:04] <JEEB> I /think/ they set them /after/
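Two quick ways to check which copy of the libraries a link step is really using (the library path is taken from luc4's pkg-config line; the trace flag is GNU ld's, and the source filename is a placeholder). Written to a script for review:

```shell
cat > check-link.sh <<'EOF'
#!/bin/sh
# confirm the symbol is exported by the intended build
nm -D /usr/local/ffmpeg-1.2/lib/libavcodec.so | grep avcodec_get_name
# make the linker print every library file it opens, in order
g++ muxing.cpp $(pkg-config --cflags --libs libavformat libavcodec libavutil) -Wl,--trace -o muxing
EOF
cat check-link.sh
```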
[17:18] <xlinkz0> how do i use -report?
[17:18] <xlinkz0> how do i set the filename?
[17:19] <xlinkz0> tried -report:file=name
[17:24] <trose> xlinkz0: i'm not an ffmpeg dev but i found this
[17:24] <trose> http://stackoverflow.com/questions/11241878/ffmpeg-report-generation
[17:24] <trose> hopefully it helps?
[17:25] <xlinkz0> thanks
[18:05] <ov3rmind> hello guys, I'm trying to understand a little about streaming audio & video; thanks in advance for any comments :)
[18:25] <ov3rmind> http://iworks.srv.br/~leo/relar%C3%B3rio_Stream-server.html I can't connect the webcam on debian 7, any help is greatly welcome
[18:55] <ov3rmind> can anyone help?
[19:54] <ov3rmind> how can I see the formats supported by a codec in ffmpeg?
[19:54] <ov3rmind> thanks for any advice !
[19:56] <ubitux> ./ffmpeg -h encoder=ffv1
[19:56] <xlink> how do i put a watermark on a video?
[19:56] <xlink> i tried ffmpeg -y -i video.mp4 -vf "movie=photo.png[watermark];[in][watermark]overlay=10:10 [out]" out.mp4
[19:56] <ubitux> 'doesn't work?
[19:56] <xlink> but i get a lot of [png @ 0435fbe0] Missing png signature
[19:57] <xlink> ubitux, http://codepad.org/0l6Glxf5
[19:58] <ubitux> xlink: can you share that png9$?
[19:58] <ubitux> -9$
[19:59] <smj> do you know a music player that tracks the contents of a directory? so that I don't have to add the files manually to the playlist when playing on shuffle
[19:59] <xlink> ubitux, http://www.iconspedia.com/icon/stopwatch-icon-20052.html
[20:00] <ubitux> xlink: which one?
[20:00] <xlink> the png one
[20:00] <xlink> hover over the big picture
[20:00] <xlink> ubitux, full download http://www.iconspedia.com/dload.php?up_id=110917
[20:02] <ubitux> mmh
[20:02] <ubitux> looks like that png is not correctly supported
[20:03] <durandal_1707> that palette looks to have alpha
[20:04] <ubitux> xlink: can you open a ticket attaching that sample?
[20:04] <xlink> idk how..
[20:04] <xlink> so the command is good?
[20:04] <ubitux> yes, it's a bug
[20:04] <xlink> thanks i just needed to find a command
[20:05] <ubitux> it seems converting the png to gif works as a workaround
[20:05] <ubitux> ffmpeg -i /tmp/Stopwatch-256.png test.gif
[20:05] <ubitux> then use test.gif in your filtergraph
[20:12] <ubitux> xlink: https://ffmpeg.org/trac/ffmpeg/ticket/2556
[20:12] <xlink> thanks
[23:21] <luc4> Hi! I'm trying to mux an h264 stream coming from a variable frame rate source. When I open my mp4, some portions are slower and others are faster. I suppose this means the timestamps were not taken into consideration when muxing, right?
[23:31] <relaxed> luc4: Try muxing with MP4Box. If that works then file a bug report against ffmpeg and provide your sample.
[23:32] <luc4> But can I provide timestamps along with the h264 stream?
[23:32] <luc4> I mean, can I do that with this MP4Box?
[23:34] <Zarx> if I want to load a video file (that may contain an arbitrary number of audio streams) and then also load a separate audio file, then copy ONLY the separate audio file along with the video into my output, what would that commandline look like?
[23:35] <Zarx> basically, ignore all audio streams from first input
[23:37] <relaxed> Zarx: ffmpeg -i video.mkv -i audio.flac -map 0:v -map 1:a ...
[23:39] <Zarx> ok, let me try this...
[23:45] <Zarx> seems like its doing something different than before. Thanks, ill play with this for a bit.
[23:56] <luc4> When muxing an h264 stream with vfr using libavformat, is it correct to create an AVPacket like this http://paste.kde.org/741548/ before calling av_interleaved_write_frame? Or is something missing? Because at the end of my muxing, vlc also seems unable to determine the entire length of my video.
[00:00] --- Sat May 11 2013

