[Ffmpeg-devel-irc] ffmpeg.log.20120507
burek
burek021 at gmail.com
Tue May 8 02:05:01 CEST 2012
[00:00] <aphid> dalaa is xiph's next gen codec
[00:00] <NuxRo> interesting, homework for tomorrow this dalaa thing :)
[00:00] <NuxRo> night guys
[00:00] Action: NuxRo &
[00:01] <aphid> shame they didn't call it gnortwot or snuppflog
[00:01] <aphid> oh, it's daala
[01:39] <Harzilein> hi
[01:41] <Harzilein> can the ffmpeg command line utility be used to convert gif frames with variable delay (split with gifsicle -U --explode) to matroska with variable fps?
[01:54] <Harzilein> btw, i found the codec i was looking for a while ago (the one with the simpsons examples), it was nuppelvideo
[01:55] <Harzilein> hmm... or at least i think it is
[02:01] <Harzilein> yeah, the part about simpsons is in the readme :)
[02:02] <Harzilein> only thing i misremembered was that i thought it was lossless but it is in fact lossy and the animation part was about how that is more efficiently compressed than live action movies
[02:19] <Harzilein> maybe the ffmetadata demuxer can solve my above problem?
[02:49] <AntumDeluge> I've got a video that is 657x435. I want to get it to 640x360 (16:9) without stretching the video. I've used the video filter "scale" to resize it to 544x360. Now I need to add a black border on the left and right, each 48 pixels wide, to get it to 640. But I'm not sure how to use the "pad" filter to do it.
[02:58] <AntumDeluge> Figured it out: "-vf scale=-1:360,pad=640:360:48:0:black"
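The arithmetic behind that filter chain can be sanity-checked with a short Python sketch (variable names are illustrative, not from the log; the 544 comes from scaling 657x435 down to a height of 360 and rounding the width to an even value):

```python
# Check the geometry behind: -vf scale=-1:360,pad=640:360:48:0:black
src_w, src_h = 657, 435          # original frame size
target_w, target_h = 640, 360    # desired 16:9 canvas

# scale=-1:360 keeps the aspect ratio; round the width, then force it even
scaled_w = round(src_w * target_h / src_h)
scaled_w -= scaled_w % 2         # yuv420p needs even dimensions

# pad x-offset that centres the scaled video horizontally
x_offset = (target_w - scaled_w) // 2

print(scaled_w, x_offset)        # 544 48
```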
[03:16] <AntumDeluge> How can I create a blank (black) video? Is there some kind of null video input I can use or will I have to something like encode from a black bitmap image?
[04:10] <AntumDeluge> How can I make a six second video from a single png image? I've tried "ffmpeg -f image2 -loop -r 30 -i black.png -f alsa -i hw:0,0 -vcodec huffyuv -acodec pcm_s16le -t 6 black.avi", but the resulting video only has one frame.
[09:56] <roxlu> hi, when I see "EV libx264" and I try to create a video from image stills I get a "Unknown decoder libx264". Why?
[09:57] <juanmabc> keyword: decoder
[09:57] <roxlu> ah.. arg.. (googled for a command which creates a video from image stills.. probably that one is incorrect)
[09:58] <juanmabc> unfortunately, yep
[12:58] <juanbobo> i am having trouble getting yadif to work
[13:23] <burek> roxlu, something like this? http://ffmpeg.gusari.org/viewtopic.php?f=25&t=39
[13:27] <roxlu> burek: ah thanks, I already got it
[13:28] <burek> Harzilein, it might be the best to first extract gif frames into images and then use those images to create output video
[13:28] <burek> ok :)
[13:28] <burek> Harzilein, you can also use the same link for that http://ffmpeg.gusari.org/viewtopic.php?f=25&t=39
[13:31] <Harzilein> burek: i said i _did_ split the images, the problem is getting the timestamps in
[13:34] <Harzilein> burek: at the very least i'd need a way to read the list of files and respective offsets from a file
[13:35] <burek> hmh..
[13:35] <burek> I see..
[13:35] <burek> well, maybe it would be better to find some gif2avi tool
[13:35] <burek> and then convert that avi to anything you want, with ffmpeg
[13:39] <Harzilein> burek: that was not the point of my question. i'm pretty sure it would be easy(ish) to code it myself, i just wonder if it's possible to do with the ffmpeg utility and if it isn't if that should not be a feature
[13:41] <burek> ok
[13:41] <Harzilein> it's easy enough with fixed fps, but there is lots of gifs with variable delay
[14:22] <jayece09> HI
[14:23] <jayece09> I want to encode video streams to H264 format and transmit over internet to client side and decode there
[14:23] <jayece09> I am using ffmpeg for it
[14:26] <jayece09> I want to make an UI app in python
[14:27] <jayece09> it basically transmits the video after encoding and client decodes it
[14:27] <jayece09> I want to do the process in python in H264 FORMAT
[15:15] <burek> jayece09, ok, it can be done
[15:15] <burek> usually with ffmpeg -i input.avi -vcodec libx264 -f mpegts udp://remote_server:port
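Since jayece09 wants to drive this from a Python UI, a minimal sketch of wrapping that command with subprocess ("remote_server" and the port are placeholders, as in the log):

```python
import subprocess

# Sketch: encode input.avi to H.264 and stream it as MPEG-TS over UDP.
cmd = [
    "ffmpeg", "-i", "input.avi",
    "-vcodec", "libx264",
    "-f", "mpegts",
    "udp://remote_server:1234",
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually launch ffmpeg
```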
[15:50] <cell> Hi. I am having some problems using ffmpeg to create a video from an image sequence that has been edited by ImageMagick. I can successfully create the video from the original image sequence, but not from any output from ImageMagick, even with no specified command. The output video from the edited sequence is either grayscale or a featureless gray gradient. Has anyone seen anything like this before?
[16:04] <burek> cell,
[16:04] <burek> can you please use pastebin.com, to show your command line and its output?
[16:15] <cell> Hi burek. I have uploaded the original conversion and the broken conversion, as well as information output from ImageMagick. http://pastebin.com/1BeyPXJJ ffmpeg's main complaint is "[mjpeg @ 01BEEE40] dqt: 16bit precision". Thanks
[16:17] <burek> well, those auto-generated images are somehow damaged.. did you try any image viewer
[16:18] <burek> to see if they are actually ok?
[16:18] <cell> They look fine in Window's default image viewer and GIMP
[16:19] <burek> ok, let me check
[16:19] <burek> just a sec
[16:21] <cell> Thanks for your help. I don't think the original image matters that much. I was originally using something I took myself, but it also does it for an image grabbed off google.
[16:21] <burek> first one says: Interlace: JPEG
[16:21] <burek> the 2nd says Interlace: none
[16:21] <cell> I tried adding that, and the error changed a little, Will make a log of that as well if that helps. the quality is also missing
[16:22] <burek> also 1st one has Profile-APP12: 85 bytes
[16:22] <burek> the 2nd doesn't
[16:22] <cell> I haven't tried changing that as I have no idea what that is?
[16:22] <burek> well, the problem is that generated images are somehow "wrong" for ffmpeg
[16:22] <burek> or it doesn't handle them correctly
[16:30] <cell> Setting the interlace and quality to be the same seems to cause more issues for ffmpeg. Same as before, the images look fine. http://pastebin.com/w8GdBcMA
[16:39] <burek> just a sec
[16:40] <burek> hm, first of all
[16:40] <burek> you are doing 2 compressions
[16:40] <burek> first jpg, then x264
[16:40] <burek> which should be avoided
[16:41] <burek> you might use lossless png output with image magick
[16:41] <burek> or just output bmp files
[16:41] <burek> to preserve the quality
[16:42] <burek> you can also read this http://forum.videolan.org/viewtopic.php?f=2&t=60983&p=181576
[16:43] <burek> about how to produce uncompressed output with imagemagick
[16:43] <burek> which can be used as a raw input for ffmpeg
[16:43] <burek> so that you don't compress things twice
[16:43] <cell> Ok will read
[16:43] <burek> and possibly avoid creating files/images
[16:43] <burek> but rather use pipes to "stream" your images directly to ffmpeg
[16:43] <burek> to save hdd space
[16:45] <cell> Thats a good point, but I am working with stop motion animation so am planning on editing individual frames. Uncompressed output is probably a good idea
[16:45] <burek> well, then go for bmp
[16:45] <burek> as they are mostly supported
[16:45] <burek> if hdd space is not an issue
[16:45] <burek> if it is, go for lossless png
[16:45] <superlinux-hp> I have an OGV video but the sound has distortions. how can I get rid of the noises?
[16:45] <burek> because it compresses data, but not losing the quality
[16:46] <burek> superdump, using some audio editor like audacity for example
[16:46] <burek> not superdump, superlinux-hp :)
[16:47] <superlinux-hp> burek, so what you're saying is that i must extract the sound aside and then edit it?
[16:49] <cell> I don't think the issue occurs for pngs so I guess I will stick to that. Wanted to work with jpgs as thats what I started out with, didn't think of the quality issues with re-converting to jpgs, I will leave it there, thank you very much for your help
[16:50] <burek> superlinux-hp, if the source is distorted, how else do you intend to fix it if not like that?
[16:50] <burek> cell :beer: :)
[16:50] <superlinux-hp> burek, ok thanks
[16:50] <cell> :) Thanks, bye
[16:53] <superlinux-hp> burek, how can i extract ONLY the video from the ogv?
[16:54] <ubitux> ffmpeg -i in.ogv -map 0:0 -c copy out.ogv ?
[16:54] <burek> that's if the video is the first stream
[16:54] <ubitux> should be the case ;)
[16:55] <ubitux> maybe -map 0:v or sth like this
[16:55] <burek> well, ok :) but ffmpeg -i input.ogv -an -vcodec copy output.ogv should give you what you want no matter what's inside
[16:57] <ubitux> it doesn't drop data streams
[16:57] <ubitux> also if there is multiple video streams, it won't do as expected
[16:58] <burek> I agree
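Putting both suggestions together, a stream selection that keeps only video streams regardless of their order, and drops audio and data streams, could look like this (a sketch; `-map 0:v` is the type-based mapping ubitux mentions):

```python
# Keep every video stream from input 0, drop audio/data, no re-encode.
cmd = [
    "ffmpeg", "-i", "in.ogv",
    "-map", "0:v",        # select streams by type, not by index
    "-c", "copy",         # stream copy, no transcoding
    "out.ogv",
]
print(" ".join(cmd))
```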
[17:02] <superlinux-hp> burek, ubitux thanks
[17:22] <NuxRo> burek: if i wanted to transcode mp4 (or anything) to ogv without much quality loss, what should i do? last night you people said not to use -sameq
[17:24] <ubitux> maybe you can avoid the reencode
[17:24] <ubitux> with -c copy
[17:27] <NuxRo> ubitux: noted, thanks
[17:27] <burek> NuxRo, as ubitux said, you might need just to remuc
[17:27] <burek> remux*
[17:28] <doorp> I was wondering if there's a method for extracting the size in bytes, of each frame within a H.264 stream?
[17:28] <burek> doorp, what do you need that for?
[17:28] <doorp> burek: I would like to compare the bitrate graph of 2 exports
[17:29] <burek> do you need a coding solution or cmd line?
[17:30] <ubitux> ffprobe -show_packets will show you the size of each packet/frame
[17:30] <ubitux> (hint: -print_format option might help you extract this information)
[17:30] <doorp> burek: I method for extracting the values would be fine, I'll manage from that point
[17:30] <doorp> ubitux: Thanks, very appreciated
[17:30] <doorp> A method*
[17:33] <doorp> ubitux: Thanks once again, exactly what I needed :)
[17:33] <ubitux> great :)
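For the bitrate-graph use case, the ffprobe output can be reduced to one size per line and summed or plotted; a sketch assuming a reasonably recent ffprobe with `-show_entries` and `-of` (the sample output below is made up):

```python
# Sum per-packet sizes as printed by a hypothetical invocation like:
#   ffprobe -select_streams v -show_entries packet=size -of csv=p=0 in.mp4
# which emits one packet size (in bytes) per line.
sample = "1024\n2048\n512\n"
sizes = [int(line) for line in sample.splitlines() if line.strip()]
print(sum(sizes))  # total bytes across the parsed packets
```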
[17:37] <NuxRo> ubitux: i think the content is h264, just copying it over into ogg won't work
[17:37] <NuxRo> any more advice of how to limit quality loss? :)
[17:39] <ubitux> -qscale i guess
[17:42] <NuxRo> thanks
[17:43] <ubitux> keep in mind the resulting file won't be that good anyway
[17:43] <ubitux> you're transcoding, and from a good codec to likely a "bad" one
[17:44] <NuxRo> ubitux: yes, I'm aware quality loss is not avoidable
[17:44] <NuxRo> but want to lose as little as possible
[17:48] <NuxRo> ubitux: qscale seems to be working for what i need, cheers
[17:48] <NuxRo> you guys were talking last night about a new codec or smth developed by Xiph, daraa or smth like that? care to refresh my memory please?
[17:48] <ubitux> 00:00:12 < aphid> dalaa is xiph's next gen codec
[17:48] <burek> http://ffmpeg.gusari.org/irclogs/
[17:49] <NuxRo> :D
[17:49] <ubitux> this is what my lastlog raises
[17:49] <NuxRo> ahh.. nice feature, never used it :)
[17:49] <NuxRo> thanks
[17:49] <NuxRo> (lastlog that is)
[18:00] <superlinux-hp> burek, i have fixed the sound to an acceptable level. now I need to mix the video-only ogv with the wave file I have to get the video with audio again. how should I do it?
[18:09] <burek> ffmpeg -i in.ogv -an -vcodec copy -i temp.wav -acodec mp3 out.ogg
[18:09] <burek> something like that
[18:09] <burek> let me see the google
[18:09] <burek> ffmpeg -i in.ogv -an -vcodec copy -i temp.wav -acodec vorbis -ab 128k out.ogv
[18:09] <burek> try that
[18:10] <superlinux-hp> burek, i'll see
[18:12] <superlinux-hp> burek, no. it did not work
[18:12] <burek> can you please use pastebin.com, to show your command line and its output?
[18:13] <superlinux-hp> ok just moment
[18:15] <superlinux-hp> burek, http://pastebin.com/2pWNaQdY
[18:15] <burek> oh
[18:15] <burek> remove -an
[18:15] <superlinux-hp> ok
[18:18] <superlinux-hp> burek, no sound yet
[18:18] <burek> pastebin?
[18:20] <superlinux-hp> burek, http://pastebin.com/YRuZS7Vj
[18:22] <burek> superdump, everything is ok now
[18:22] <burek> superlinux-hp*
[18:23] <superlinux-hp> burek, but i hear no sound
[18:23] <burek> probably because u used ogv (v for video)
[18:23] <burek> use ogg
[18:24] <burek> out.ogg
[18:24] <burek> just rename it
[18:25] <superlinux-hp> ok
[18:25] <burek> although, I'm reading on wikipedia, ogv should work too
[18:25] <burek> different types of content such as .oga for audio only files, .ogv for video with or without sound (including Theora), and .ogx for multiplexed Ogg.[5]
[18:26] <superlinux-hp> burek, it's not merging still audio and video
[18:26] <burek> Output #0, ogg, to 'out2.ogv':
[18:26] <burek> Metadata:
[18:26] <burek> encoder : Lavf52.111.0
[18:26] <burek> Stream #0.0: Video: libtheora, yuv420p, 1354x768 [PAR 1:1 DAR 677:384], q=2-31, 17 tbn, 17 tbc
[18:26] <burek> Stream #0.1: Audio: libvorbis, 44100 Hz, 2 channels, s16, 128 kb/s
[18:26] <burek> try typing ffmpeg -i out2.ogv
[18:26] <burek> to see if the streams are inside
[18:27] <superlinux-hp> ok
[18:28] <superlinux-hp> burek,
[18:28] <superlinux-hp> Input #0, ogg, from 'out2.ogv':
[18:28] <superlinux-hp> Duration: 00:13:29.37, start: 0.000000, bitrate: 440 kb/s
[18:28] <superlinux-hp> Stream #0.0: Video: theora, yuv420p, 1354x768 [PAR 1:1 DAR 677:384], 17 fps, 17 tbr, 17 tbn, 17 tbc
[18:28] <superlinux-hp> Stream #0.1: Audio: vorbis, 44100 Hz, stereo, s16
[18:28] <superlinux-hp> Metadata:
[18:28] <superlinux-hp> ENCODER : Lavf52.111.0
[18:28] <superlinux-hp> At least one output file must be specified
[18:28] <superlinux-hp> there is a sound but i don't hear it! hehe
[18:28] <burek> turn on your speakers? :D
[18:28] <superlinux-hp> i am pretty sure the volume is raised
[18:29] <superlinux-hp> they r on. i can hear my wav file
[18:40] <zandzpide> What is the most reliable way to read FPS from the ffmpeg CLI?
[18:40] <zandzpide> Stream #0:0: Video: vp6f (VP6F / 0x46365056), yuv420p, 640x368, SAR 1:1 DAR 40:23, 23.81 fps, 23.81 tbr, 1k tbn, 23.81 tbc (default)
[18:40] <zandzpide> Lets say i get that line, and "FPS" is not there.
[18:40] <burek> ffprobe?
[18:46] <superlinux-hp> burek, i did it through using oggz-tools. I converted the audio from wav to ogv using ffmpeg. then i merged them using : oggz-merge video.ogv audio.ogv -o mixed_output.ogv
[18:47] <zandzpide> burek: ffprobe does not show fps on some of the files, just like ffmpeg
[18:47] <burek> zandzpide, usually for variable fps it doesn't show
[18:48] <zandzpide> how on earth do i get total amount of frames then :P cant use php-ffmpeg
[18:48] <burek> superlinux-hp, great :) only, I don't really understand why it wouldn't work with ffmpeg only..
[18:48] <zandzpide> length * fps == frames. and with no fps. yey
[18:48] <superlinux-hp> me too burek
[18:48] <burek> zandzpide, count them all, from beginning to the end :)
[18:49] <zandzpide> any fast way to do this? ^^
[18:49] <burek> with constant fps yes, with var no
[18:53] <zandzpide> Allright, just going to skip progressbar on the ones with variable fps then =)
[18:53] <zandzpide> Thanks burek=)
[18:53] <burek> :beer: :)
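The estimate zandzpide mentions only holds for constant frame rate; a sketch of the arithmetic (the duration value is hypothetical, the fps comes from the stream line above):

```python
# For constant-fps input: total frames = duration * fps (rounded).
# With variable fps this is wrong; there you have to actually count
# frames (e.g. by decoding the whole file), as burek says.
duration = 120.5   # seconds, hypothetical clip length
fps = 23.81        # from the "Stream #0:0" line in the log
total_frames = round(duration * fps)
print(total_frames)
```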
[18:54] <superlinux-hp> i am running my laptop of i7 intel CPU like an oven.. i am on 94C degrees
[18:54] <superlinux-hp> that's cos of ffmpeg
[18:55] <burek> install a water-cooling :)
[18:55] <zandzpide> tell it to use less cores :P
[18:55] <superlinux-hp> zandzpide, i'd prefer it use them all with 99 threads
[18:55] <zandzpide> who am i kidding :P ALL THE CORES!
[18:56] <superlinux-hp> but my laptop is slim.
[18:56] <superlinux-hp> i wish that it had a thicker casing
[18:57] <burek> find the laptop holder with coolers
[18:57] <burek> they are cheap
[18:57] <superlinux-hp> maybe this way it's pussing the heat faster
[18:57] <zandzpide> should buy air-in-a-can and blow it into the cpu fan. make sure there's not packed with dust in there.
[18:57] <superlinux-hp> *pushing .. bad me!
[18:57] <zandzpide> turn the laptop off, ofc
[18:58] <burek> http://www.google.com/search?q=laptop+pad+cooler&tbm=shop
[18:58] <superlinux-hp> zandzpide, it's still new. I had it as a christmass gift this year
[18:59] <superlinux-hp> and an i7 must reach high temps
[18:59] <zandzpide> ahh. go with burek's suggestion then
[18:59] <zandzpide> must reach high temps? it will reach them but it doesn't have to
[18:59] <zandzpide> ^^
[19:00] <superlinux-hp> on normal temp it's 52C
[19:00] <burek> I really think your radiator (below the cooler) is not mounted correctly.. cpu should never reach 94 C
[19:00] <zandzpide> well its a laptop.
[19:00] <superlinux-hp> i was once in the bus beside the window, I let the wind blow in , i reached 48C
[19:01] <superlinux-hp> i think i really need a side fan.. maybe like an A/C
[19:03] <superlinux-hp> this means people like google have rooms with water chillers as A/C for their servers!
[19:05] <zandzpide> they have massive watercooled systems for the serverparks
[19:05] <zandzpide> depends on what type of serverpark ofc :P container ones are not watercooled :P thats just so modular
[19:15] <zandzpide> is there a dedicated mencoder channel?
[19:16] <JEEB> mencoder is part of mplayer so #mplayer
[19:16] <zandzpide> Thanks
[19:56] <AntumDeluge> Can FFmpeg hardcode Advanced Substation Alpha (.ass) subtitles into a video stream?
[19:57] <NuxRo> AntumDeluge: http://ffmpeg.gusari.org/viewtopic.php?f=25&t=34
[19:58] <AntumDeluge> NuxRo: Wow, that was easy! Thanks!
[20:00] <NuxRo> thank burek
[20:00] <NuxRo> you're welcome
[20:43] <bahar> for real-time video transcoding in software, how important is processor cache? does anyone know?
[20:43] <bahar> wondering if a 6 core w/ 10mb cache is as good as a 6 core w/ 15mb cache or if clock speed is more important than cache.
[21:07] <iframe> how can i join two *.flv files into a single *.flv file?
[21:29] <burek> iframe, use a video editor, such as virtual dub or similar
[21:30] <burek> bahar, you can ask in #ffmpeg-devel where developers are
[21:31] <iframe> burek, is it possible via ffmpeg? don't have virtual dub
[21:31] <burek> iframe, well yes, but it's not optimal to do so using ffmpeg
[21:31] <burek> you'd have to extract both videos to rawvideo files
[21:31] <burek> join them and then encode that big one
[21:32] <burek> or you can try concat filter
[21:32] <burek> just a sec to find a link
[21:32] <iframe> burek, ...mmm, i see,
[21:32] <burek> try this
[21:32] <burek> http://ffmpeg.org/ffmpeg.html#concat
[21:32] <burek> it might work
[21:32] <iframe> burek, thanks...
[21:33] <burek> :beer: :)
[21:34] <burek> so, try ffmpeg -i concat:1.flv\|2.flv -vcodec copy 3.flv
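That command uses the concat protocol, which byte-concatenates the inputs; it is reliable for containers like MPEG-TS but may not produce a valid file for flv, hence "it might work". A sketch of building the input string without shell escaping:

```python
# Build the concat-protocol input; passing it as one argv element
# avoids having to escape the "|" for the shell.
files = ["1.flv", "2.flv"]
concat_input = "concat:" + "|".join(files)
cmd = ["ffmpeg", "-i", concat_input, "-vcodec", "copy", "3.flv"]
print(concat_input)  # concat:1.flv|2.flv
```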
[22:18] <jhurlima_> i'm trying to use the http://ffmpeg.org/libavfilter.html#thumbnail filter to extract 10 thumbnails from a video, but i'm unsure how to put the command line together for that based on the documentation examples
[22:32] <burek> did you try the given example: ffmpeg -i in.avi -vf thumbnail,scale=300:200 -frames:v 1 out.png
[22:33] <burek> just change -vframes 1 to -vframes 10
[22:33] <burek> or -frames:v
[22:34] <burek> jhurlima_, or you can use this: http://ffmpeg.gusari.org/viewtopic.php?f=25&t=35
[22:34] <jhurlima_> ah ok, thanks
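One detail the example glosses over: writing more than one image requires a numbered output pattern, otherwise ffmpeg tries to overwrite a single file. A sketch of the 10-thumbnail variant (the `%02d` pattern and the 300:200 size are illustrative):

```python
# Pick 10 representative frames via the thumbnail filter and write
# them as numbered PNGs (out01.png, out02.png, ...).
cmd = [
    "ffmpeg", "-i", "in.avi",
    "-vf", "thumbnail,scale=300:200",
    "-frames:v", "10",
    "out%02d.png",
]
print(" ".join(cmd))
```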
[22:57] <RyuGuns> How do I get sound from my desktop and my microphone at le same time when recording with FFMPEG?
[22:59] <burek> windows/linux?
[23:01] <burek> RyuGuns, ping
[23:10] <RyuGuns> Oh..
[23:10] <RyuGuns> Sorry.
[23:10] <RyuGuns> Linux.
[23:10] <RyuGuns> burek?
[23:22] <burek> RyuGuns, http://ffmpeg.gusari.org/viewtopic.php?f=25&t=594&p=627
[00:00] --- Tue May 8 2012