[Ffmpeg-devel-irc] ffmpeg.log.20150519

burek burek021 at gmail.com
Wed May 20 02:05:01 CEST 2015


[00:00:08 CEST] <rcombs> how can I get a human-readable "4:2:0" or similar string from an AVPixFmtDescriptor?
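One way to build such a string is from the descriptor's chroma shift fields. A minimal C sketch, under the assumption that a hand-rolled log2_chroma_w/log2_chroma_h to J:a:b table is acceptable (the descriptor itself only exposes the shift values, not a ready-made name):

    /* Sketch: derive a "4:2:0"-style subsampling name from an AVPixFmtDescriptor. */
    #include <stdio.h>
    #include <libavutil/pixdesc.h>

    static const char *subsampling_name(const AVPixFmtDescriptor *desc)
    {
        /* Gray and RGB formats have no meaningful J:a:b name. */
        if (!desc || desc->nb_components < 3 || (desc->flags & AV_PIX_FMT_FLAG_RGB))
            return "n/a";
        if (desc->log2_chroma_w == 0 && desc->log2_chroma_h == 0) return "4:4:4";
        if (desc->log2_chroma_w == 0 && desc->log2_chroma_h == 1) return "4:4:0";
        if (desc->log2_chroma_w == 1 && desc->log2_chroma_h == 0) return "4:2:2";
        if (desc->log2_chroma_w == 1 && desc->log2_chroma_h == 1) return "4:2:0";
        if (desc->log2_chroma_w == 2 && desc->log2_chroma_h == 0) return "4:1:1";
        if (desc->log2_chroma_w == 2 && desc->log2_chroma_h == 2) return "4:1:0";
        return "unknown";
    }

    int main(void)
    {
        const AVPixFmtDescriptor *d = av_pix_fmt_desc_get(AV_PIX_FMT_YUV420P);
        printf("%s -> %s\n", d->name, subsampling_name(d)); /* yuv420p -> 4:2:0 */
        return 0;
    }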
[03:08:53 CEST] <Rasi> hi
[03:08:57 CEST] <Rasi> [x11grab @ 0x7fc5843c2560] Capture area 1920x1080 at position 2560.0 outside the screen size 4480x1440
[03:09:00 CEST] <Rasi> i am getting this since latest update
[03:09:10 CEST] <Rasi> seems i now have to subtract one pixel for it to work
[03:09:41 CEST] <Rasi> ffmpeg -f x11grab -s 1920x1080 -i :0+2559,0 < this will work
[03:09:53 CEST] <Rasi> is this a bug or done on purpose?
[03:21:47 CEST] <BtbN> Rasi, 1920+2560 is exactly 4480, and a screen with the width 4480 only goes from 0 to 4479
[03:22:12 CEST] <Rasi> but it worked for years now
[03:22:20 CEST] <BtbN> So either an off-by-one bug in the X11 capture code was fixed, or it does bounds checking now.
[03:22:29 CEST] <Rasi> kk
[03:22:53 CEST] <c_14> iirc it does bounds checking now
[05:22:24 CEST] <chankit> How do I extract a frame from a yuv420 video file?
[05:23:00 CEST] <c_14> ffmpeg -ss timestamp -i video -frames:v 1 out.png
[05:23:31 CEST] <chankit> I tried ./ffmpeg -i iceage_720x480_491.yuv -r 1 -s 720x480 -vcodec yuv420p -pix_fmts yuv420p -f image2 images.png but while there wasn't any error msg, i didn't see the output image
[05:23:36 CEST] <chankit> c_14: will try that now...
[05:26:00 CEST] <chankit> c_14: I tried ./ffmpeg -ss 12 -i iceage_720x480_491.yuv -frames:v 1 out.png but now I got Output #0, image2, to 'out.png':
[05:26:00 CEST] <chankit> Output file #0 does not contain any stream
[05:26:44 CEST] <ponyofdeath> hi, I have some mp4 libx264 / aac that I am having issues skipping forward. what is the best way to troubleshoot that or re-encode them?
[05:27:30 CEST] <c_14> chankit: can you pastebin the output of `ffprobe filename'
[05:27:46 CEST] <c_14> ponyofdeath: issues skipping forward?
[05:28:48 CEST] <ponyofdeath> c_14: basically when trying to go skip forward in time of the video, it just hangs
[05:29:04 CEST] <chankit> c_14: http://pastebin.com/cVyb9b4X
[05:31:31 CEST] <c_14> ponyofdeath: try remuxing
[05:31:47 CEST] <ponyofdeath> c_14: ok let me google that
[05:32:05 CEST] <c_14> chankit: add -s 720x480 as an input option
[05:32:27 CEST] <c_14> ponyofdeath: ffmpeg -i input -map 0 -c copy out.mkv
[05:32:48 CEST] <ponyofdeath> c_14: ty, can i do mp4 instead of mkv
[05:32:56 CEST] <ponyofdeath> the input files are all mp4
[05:33:04 CEST] <c_14> sure
[05:33:14 CEST] <ponyofdeath> c_14: thanks! gonna give that a go
[05:34:50 CEST] <chankit> c_14: now I got "Picture size 0x0 is invalid". Should I add the -s argument to the output as well?
[05:35:06 CEST] <chankit> btw that's my command: ./ffmpeg -ss 12 -i iceage_720x480_491.yuv -s 720x480 -frames:v 1 out.png
[05:35:12 CEST] <c_14> chankit: what command?
[05:35:27 CEST] <chankit> that I put before I got the Picture size error
[05:35:30 CEST] <c_14> put -s before -i
[05:36:43 CEST] <chankit> c_14: that worked. Thanks BTW. so the out.png should be the first frame, right?
[05:36:56 CEST] <ponyofdeath> c_14: same problem
[05:37:05 CEST] <ponyofdeath> seems like it's at certain parts of the video
[05:37:08 CEST] <c_14> chankit: with that command it should be the frame at second 12 in the video
[05:37:28 CEST] <c_14> ponyofdeath: try with -map 0:v instead of -map 0
[05:37:37 CEST] <c_14> just to make sure it's the video track
[05:37:43 CEST] <maqr> how can i check how many priming frames are in the AAC stream of an mp4?
[05:38:06 CEST] <ponyofdeath> c_14: what does that do? not move the audio?
[05:38:14 CEST] <c_14> ponyofdeath: ye
[05:38:27 CEST] <chankit> c_14: so assuming the video is played at a rate of 12 fps, which frame does ffmpeg take?
[05:38:28 CEST] <ponyofdeath> c_14: roger, encoding
[05:39:19 CEST] <c_14> chankit: the 144th if my arithmetic isn't failing me
[05:40:05 CEST] <ponyofdeath> c_14: yup its the video
[05:41:00 CEST] <chankit> the first frame of the second then?
[05:41:24 CEST] <chankit> c_14: should be 145th right if that's the case?
[05:41:51 CEST] <c_14> chankit: the frame that would be displayed at that moment in time.
[05:42:17 CEST] <c_14> And it's the 144th 0-indexed.
[05:42:25 CEST] <c_14> ponyofdeath: you'll probably have to reencode then
[05:42:37 CEST] <chankit> c_14: got that. Thanks
[05:43:05 CEST] <ponyofdeath> c_14: what is a good way to do that? use the defaults?
[05:43:27 CEST] <ponyofdeath> ffmpeg -i input.mp4 -c:v libx264 -crf 23 -c:a libfaac -q:a 100 output.mp4
[05:43:32 CEST] <ponyofdeath> something like that?
[05:43:47 CEST] <c_14> -c:a copy, and get rid of the -q:a
[05:43:55 CEST] <c_14> don't need to reencode the audio if it isn't a problem
[05:44:02 CEST] <ponyofdeath> c_14: ok ty
[05:44:07 CEST] <ponyofdeath> is the crf ok
[05:44:42 CEST] <c_14> You'll have to look at the video. If it's already been encoded at a moderate bitrate, you might want to lower the crf.
[07:42:04 CEST] <chankit> anyone know what software i can use to view raw yuv image?
[07:42:09 CEST] <chankit> *yuv frame
[09:34:12 CEST] <lee__> hi, everyone
[09:34:57 CEST] <lee__> does anyone know how to debug ffmpeg in eclipse?
[09:37:53 CEST] <JEEBsv> lee__: the usual way, if it can use gdb in there: --enable-debug, and make sure the binary has debug symbols available (gdb output will tell)
[09:38:09 CEST] <JEEBsv> there's no project files or anything so if eclipse is dumb regarding that then I don't know if there's much to do :P
[10:24:07 CEST] <lee__> @JEEBsv i see it
[10:38:06 CEST] <HebusLeTroll> hello folks! In ffprobe's video stream infos, what's the difference between height/width and coded_height/coded_width?
[10:41:05 CEST] <lee__> @JEEBsv thanks, i just debugged ffmpeg with gdb, it's useful
[11:30:42 CEST] <lee__> who knows fate in ffmpeg?
[11:30:58 CEST] <lee__> how to use ffmpeg fate?
[11:33:20 CEST] <lee__> can anyone help me?
[11:33:46 CEST] <c_14> lee__: https://ffmpeg.org/fate.html
[12:18:03 CEST] <jeffshanab> I need to do screen capture and wanted to know the difference between gdigrab, dshow and vfw on windows. Front buffer vs back buffer and performance. I assume gdigrab is fast but front buffer only???
[13:26:53 CEST] <ahop> Hi!
[13:27:11 CEST] <ahop> What's the simplest (command line) tool to cut a video into 2 parts (without re-encoding!)
[13:27:21 CEST] <ahop> my video is a .MTS
[13:28:07 CEST] <ahop> Video: MPEG4 Video (H264) 1280x720 50.00fps [Video]
[13:28:07 CEST] <ahop> Audio: Dolby AC3 48000Hz stereo [Audio]
[13:36:08 CEST] <ahop> NoNet do you know how to NOT re-encode when using ffmpeg?
[13:41:29 CEST] <sh4rm4^bnc> what's a good format to stream a webcam using ffserver in a LAN ? (low cpu usage, high quality)
[13:41:50 CEST] <sh4rm4^bnc> it's sufficient to compress the input stream to 1/3 of the size
[13:42:26 CEST] <sh4rm4^bnc> it seems libvpx/webm uses a ton of cpu while delivering decent quality
[14:50:34 CEST] <grublet> does libvorbis have a minimum bitrate? i seem unable to set anything below 64K
[15:34:04 CEST] <stk944> Hello, I have a video, an AVI file, where seeking (either input or output) doesn't work and just produces a black screen in ffplay or when using ffmpeg; its codec is rawvideo. I can post the ffprobe output if you'd like
[15:34:29 CEST] <stk944> pix_fmt is pal8
[20:13:18 CEST] <ac_slater> hey all. I'm doing some mpegts muxing. For my AVStreams, I do `stream->codec->time_base.den = 30` (.num = 1) to set the framerate hint. At runtime it says this is deprecated. But when I do what it says and use `stream->time_base.den = 30;` ... the muxed file is garbage
[20:14:26 CEST] <ac_slater> oh ... apparently fixed after 2.5
[20:14:31 CEST] <ac_slater> shame, I'm locked to 2.5
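For reference, the pattern that works with the newer API is to treat AVStream.time_base as a hint that the muxer may overwrite, and to rescale packet timestamps into whatever time base the muxer actually picked. A sketch, not a verified fix for 2.5; the context, encoder and packet names are assumptions:

    #include <libavformat/avformat.h>

    /* Hint the frame rate on a new stream; avformat_write_header() may
     * overwrite time_base (mpegts normally switches streams to 1/90000). */
    static AVStream *add_video_stream(AVFormatContext *oc, int fps)
    {
        AVStream *st = avformat_new_stream(oc, NULL);
        st->time_base      = (AVRational){1, fps};
        st->avg_frame_rate = (AVRational){fps, 1};
        return st;
    }

    /* Rescale an encoded packet from the encoder's time base into the
     * stream's (post-write_header) time base, then hand it to the muxer. */
    static int write_encoded_packet(AVFormatContext *oc, AVStream *st,
                                    AVRational enc_time_base, AVPacket *pkt)
    {
        av_packet_rescale_ts(pkt, enc_time_base, st->time_base);
        pkt->stream_index = st->index;
        return av_interleaved_write_frame(oc, pkt);
    }

Writing packets that are still stamped in the encoder's 1/30 time base after the muxer has moved the stream to 1/90000 is one likely way to end up with a garbled mpegts file.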
[21:48:17 CEST] <Sonny_Jim> Does anybody have a precompiled version of ffmpeg for raspbian?
[23:03:27 CEST] <keb> Hey everyone. I'm new and trying to write a python script to quickly compress my mp3s so that I don't have to use dbPowerAmp anymore. I am wondering if there is a way to use ffmpeg to utilize all cores of a CPU to encode multiple mp3s at once?
[23:04:13 CEST] <Hello71> ffmpeg &
[23:04:54 CEST] <keb> ?
[23:05:14 CEST] <c_14> keb: either spawn multiple ffmpeg processes or add multiple outputs to a single ffmpeg process
[23:06:31 CEST] <keb> if it is a single ffmpeg process, wouldn't it only run on a single core then?
[23:06:49 CEST] <keb> oops
[23:06:59 CEST] <keb> c_14: if it is a single ffmpeg process, wouldn't it only run on a single core then?
[23:08:30 CEST] <c_14> nah, even if lame is single-threaded, it'll use one thread per output
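A sketch of the first option (one ffmpeg process per input file, all running in parallel), written in C for illustration; the same structure maps directly onto Python's subprocess module. The libmp3lame settings and the output naming are assumptions:

    #include <stdio.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(int argc, char **argv)
    {
        /* Spawn one ffmpeg per file named on the command line... */
        for (int i = 1; i < argc; i++) {
            pid_t pid = fork();
            if (pid == 0) {                       /* child: exec ffmpeg */
                char out[4096];
                snprintf(out, sizeof(out), "%s.out.mp3", argv[i]);
                execlp("ffmpeg", "ffmpeg", "-y", "-i", argv[i],
                       "-c:a", "libmp3lame", "-q:a", "2", out, (char *)NULL);
                perror("execlp");                 /* only reached on failure */
                _exit(1);
            } else if (pid < 0) {
                perror("fork");
                return 1;
            }
        }
        while (wait(NULL) > 0)                    /* ...then reap them all */
            ;
        return 0;
    }

In practice you would cap the number of children in flight at roughly the core count rather than launching everything at once.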
[23:08:38 CEST] <Sonny_Jim> I'm trying to reverse engineer a video format used on a pinball machine's playfield LCD and I've noticed something curious. It appears that it uses a 320x240x2 frame size, but it has an extra 8 bytes on each frame. Any idea why this might be?
[23:10:11 CEST] <keb> c_14: interesting. thank you!
[23:11:02 CEST] <iive> Sonny_Jim: i assume you work only with the video data, without having a working decoder.
[23:12:29 CEST] <iive> it could be a lot of things... see if there are any repeating patterns. e.g. it could be timestamps, durations, even just sync bytes
[23:13:53 CEST] <iive> if they contain flags needed for decoding of the frame data, you'd know when you get frames you cannot decode.
[23:14:01 CEST] <Sonny_Jim> Well, it looks to be raw RGB data
[23:14:08 CEST] <Sonny_Jim> ie No compression
[23:14:36 CEST] <Sonny_Jim> I suppose I could dump out all the trailing 8 bytes to see if there is a pattern
[23:14:54 CEST] <Sonny_Jim> http://pinball.servebeer.com/BLANK_FRAME.spv
[23:14:58 CEST] <Sonny_Jim> That's a blank frame ^
[23:15:14 CEST] <Sonny_Jim> http://pinball.servebeer.com/WWELOGO_FPS20.spv
[23:15:22 CEST] <Sonny_Jim> That is a short 3 second video
[23:15:34 CEST] <Sonny_Jim> https://github.com/SonnyJim/spvtool/blob/master/spvtool.c
[23:15:48 CEST] <Sonny_Jim> That has most of what I think is correct so far with regard to the header
[23:16:24 CEST] <Sonny_Jim> I'm expecting it to be a very simple format as the CPU in the machine isn't exactly a monster
[23:21:12 CEST] <iive> i'm not going to look at that myself. can you put the hex of frame headers in some pastebin site?
[23:21:33 CEST] <iive> 8 bytes per line.
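A throwaway C sketch of the kind of dump being asked for, assuming each frame is 320*240*2 bytes of pixels followed by the 8 extra bytes, and that HEADER_SIZE (a hypothetical placeholder) matches whatever spvtool.c has worked out for the file header:

    #include <stdio.h>

    #define HEADER_SIZE  32                  /* hypothetical; adjust to match spvtool.c */
    #define FRAME_PIXELS (320 * 240 * 2)     /* assumed raw pixel payload per frame */
    #define TRAILER      8                   /* the unexplained extra bytes */

    int main(int argc, char **argv)
    {
        if (argc != 2) { fprintf(stderr, "usage: %s file.spv\n", argv[0]); return 1; }
        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 1; }

        unsigned char trailer[TRAILER];
        fseek(f, HEADER_SIZE, SEEK_SET);
        for (long frame = 0; ; frame++) {
            if (fseek(f, FRAME_PIXELS, SEEK_CUR) != 0)
                break;
            if (fread(trailer, 1, TRAILER, f) != TRAILER)
                break;                       /* ran out of data */
            printf("frame %4ld:", frame);
            for (int i = 0; i < TRAILER; i++)
                printf(" %02x", trailer[i]);
            printf("\n");
        }
        fclose(f);
        return 0;
    }

If the 8 bytes turn out to sit in front of the pixel data instead of after it, swapping the fseek and fread calls inside the loop is enough.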
[23:22:50 CEST] <ac_slater> hey all, I can't really figure out how to use frame/packet/stream/codec PTS and DTS to do rate limiting. I noticed ffmpeg.c does this via rate_emu, but I've been playing with it for days and can't really figure it out. Any pointers?
[23:23:36 CEST] <ac_slater> I'd like to not rely on timing queues or anything, just single-threaded stuff
[23:23:43 CEST] <Sonny_Jim> iive: Here's the output of hexdump
[23:23:45 CEST] <Sonny_Jim> http://pastebin.com/dq6GqnQd
[23:24:07 CEST] <Sonny_Jim> That's for a blank frame
[23:27:21 CEST] <ac_slater> I guess I want to fill a queue at N fps (which I do since my source only feeds at N fps), but I also want to pop the queue at the same rate. I guess I can deduce some sleep time, but ffmpeg.c and ffplay.c don't have a fps-dependent sleep. I'm sure people do this
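What ffmpeg.c's rate_emu (-re) handling boils down to is comparing each packet's timestamp, rescaled to microseconds, against the wall-clock time since the start and sleeping the difference. A single-threaded sketch of that idea; throttle_to_realtime() is a made-up helper, while av_gettime(), av_usleep() and av_rescale_q() are the actual libavutil calls:

    #include <stdint.h>
    #include <libavutil/avutil.h>
    #include <libavutil/mathematics.h>
    #include <libavutil/time.h>

    static int64_t start_us;                 /* wall clock at the first packet */

    /* Block until wall-clock time has caught up with this packet's pts. */
    static void throttle_to_realtime(int64_t pts, AVRational time_base)
    {
        int64_t pts_us, elapsed_us;

        if (!start_us)
            start_us = av_gettime();
        pts_us     = av_rescale_q(pts, time_base, AV_TIME_BASE_Q);
        elapsed_us = av_gettime() - start_us;
        if (pts_us > elapsed_us)
            av_usleep(pts_us - elapsed_us);  /* early: wait it out */
        /* late: fall through and let the muxer have the packet immediately */
    }

Calling it once per packet, right before handing the packet to the muxer, with the packet's pts and the stream's time base gives the N-fps pacing without any extra threads or queues.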
[00:00:00 CEST] --- Wed May 20 2015


More information about the Ffmpeg-devel-irc mailing list