[Ffmpeg-devel-irc] ffmpeg.log.20181009
burek
burek021 at gmail.com
Wed Oct 10 03:05:03 EEST 2018
[00:11:42 CEST] <tytan> Hello everyone, I am trying to two-pass encode a file on Windows, but there is no /dev/null on Windows. How do I do it there?
[00:12:59 CEST] <JEEB> NUL
[00:13:10 CEST] <tytan> just "NUL" ?
[00:13:11 CEST] <JEEB> a quick search would have made you find that I think?
[00:14:17 CEST] <tytan> in the guide it says
[00:14:18 CEST] <tytan> Note: Windows users should use NUL instead of /dev/null and ^ instead of \.
[00:14:31 CEST] <tytan> I wasn't sure about the syntax
[00:24:50 CEST] <tytan> it fails at the && though
[00:32:04 CEST] <poutine> tytan, when asking for help with syntax, posting the syntax used is always helpful (and in a pastebin)
[00:35:23 CEST] <tytan> http://ffmpeg.pastebin.com is dead
[00:36:57 CEST] <poutine> I guess there's no alternatives then
[00:37:06 CEST] <tytan> +
[00:38:51 CEST] <ariyasu> are you running your command from a batch file?
[00:39:13 CEST] <ariyasu> if not, use & instead of &&
[00:39:41 CEST] <ariyasu> and yes, posting your full command and output to pastebin would be very helpful if you need help
[00:40:16 CEST] <tytan> https://pastebin.com/pcMn61t2
[00:40:35 CEST] <tytan> just powershell
[00:40:57 CEST] <ariyasu> what's the output
[00:42:13 CEST] <ariyasu> I don't believe you need " && ^" at all
[00:42:14 CEST] <poutine> I don't think you need the ^ in powershell
[00:42:20 CEST] <ariyasu> just end the line with NUL
[00:44:10 CEST] <tytan> let me try that first
[00:48:02 CEST] <tytan> ok so doing it in cmd works. I guess I just won't use PowerShell then.
[00:48:21 CEST] <ariyasu> I personally hate PowerShell and do everything in cmd
[00:48:25 CEST] <ariyasu> but it should work in both
[00:48:27 CEST] <tytan> I just used PowerShell because that's what you get when you shift-right-click in a directory on Windows 10 1809
[00:48:44 CEST] <ariyasu> you can change that with a registry tweak if you want
[00:49:25 CEST] <tytan> nah, I will just use cmd as I don't care. I'm more of a GNU/Linux guy anyway, but I'm on a Threadripper 2990WX atm and I want to play around with it ^^
[00:52:37 CEST] <tytan> 32 cores and 64 threads. pretty fast for video encoding. the future looks good
[00:53:27 CEST] <tytan> anyway, have a nice day everyone
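
For reference, a two-pass command of the kind the guide describes looks roughly like this in cmd.exe (the file names and codec settings here are placeholders, not tytan's actual ones):

    ffmpeg -y -i input.mkv -c:v libx264 -b:v 2600k -pass 1 -an -f null NUL && ^
    ffmpeg -i input.mkv -c:v libx264 -b:v 2600k -pass 2 -c:a aac output.mp4

In cmd, ^ continues the line and && runs the second pass only if the first one succeeded; PowerShell (before 7.0) does not accept &&, which matches what tytan ran into.
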
[01:52:33 CEST] <rkantos_> Btw... Any idea if FFmpeg is supposed to be able to encode say 2x2 videos?
[01:52:46 CEST] <rkantos_> I need recording capabilities for 2x1920 footage :)
[01:53:06 CEST] <rkantos_> It actually seems to function with greyscale.. just not with color
[02:09:25 CEST] <Hello71> some codecs may not support it
[02:09:41 CEST] <Hello71> and some pixel formats
[03:00:15 CEST] <rkantos_> Hello71: meh... leadtools MJPEG handles it :D
[03:05:24 CEST] <rkantos_> actually 1920x2 footage is what I really mean, so 2px height
[12:19:23 CEST] <TheGrumpyScot> Having problems building ffmpeg with --enable-vorbis on ubuntu 18.04 (server) -- stated error is "vorbis not found using pkg-config"; (./configure command and partial config.log at https://node86.com/pastebin/ww26b) - what am I missing ?
[12:25:38 CEST] <spaam> TheGrumpyScot: did you install the dev package of libvorbis?
[12:26:11 CEST] <TheGrumpyScot> spaam: Yes, installed `libvorbis-dev`
[12:32:05 CEST] <zerodefect> You could try running 'ldconfig' and rebuilding in case the .so db wasn't updated
[12:35:55 CEST] <TheGrumpyScot> Same results I'm afraid - and I've checked that the ldconfig cache is in fact listing libvorbis.so
[12:38:29 CEST] <TheGrumpyScot> ah.. it's a pkg-config error; my bad! Sorry to waste your time folks. (Corrected with apt install pkg-config) :D
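
For anyone hitting the same configure error on Ubuntu 18.04, the fix amounts to something like the following (a sketch; the relevant configure flag for libvorbis is --enable-libvorbis, to be combined with whatever other options were already in use):

    sudo apt install pkg-config libvorbis-dev
    ./configure --enable-libvorbis
    make -j"$(nproc)"
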
[17:12:38 CEST] <King_DuckZ> hello, I found one more thing (surprise!) I don't understand in the code I'm trying to fix
[17:14:05 CEST] <King_DuckZ> there's a function that gets asked to return frame n, so n gets converted to a PTS via the function I posted yesterday and then the code starts calling avcodec_receive_frame() until it gets a frame with the desired PTS
[17:14:22 CEST] <King_DuckZ> I have no idea if that's optimal or not, but that's how the previous programmer wrote it
[17:15:27 CEST] <King_DuckZ> what I added is that if the returned PTS is greater than the desired one, then choose current or previous frame depending on which one is closest to the wanted PTS
[17:17:11 CEST] <King_DuckZ> if it's the old one then I call av_seek_frame(prev_frame->pts), because the current frame is being discarded and should be re-read next time; however, from there on the code seems to always resume from the first frame found through avcodec_receive_frame()
[17:46:08 CEST] <Essadon> Hello, I would like to know the syntax for combining a slideshow of multiple GIF pictures and the sound of a MP4 file. I have timestamps for when a certain picture should be viewed, and I want to merge the pictures and the sound together into a single file.
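
Nobody picked this up in the channel, but one common approach is the concat demuxer: list each picture with a duration derived from the timestamps, then mux that video together with the MP4's audio track. A sketch only (file names and durations are made up, and depending on the GIFs it may be simpler to convert them to PNG first):

    # slideshow.txt -- the last entry is repeated so its duration is honoured
    file 'pic1.gif'
    duration 4.0
    file 'pic2.gif'
    duration 2.5
    file 'pic2.gif'

    ffmpeg -f concat -safe 0 -i slideshow.txt -i audio.mp4 -map 0:v -map 1:a -c:v libx264 -pix_fmt yuv420p -shortest output.mp4
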
[17:48:47 CEST] <King_DuckZ> am I correct if I say that avcodec_receive_frame() advances the read position of the stream?
[18:06:16 CEST] <JEEB> King_DuckZ: you might be confusing that one with the lavf read function which is called frame while it returns an AVPacket
[18:25:16 CEST] Action: King_DuckZ looks around
[18:25:35 CEST] <King_DuckZ> JEEB: you might be confusing me with someone who understands ffmpeg :p
[18:26:11 CEST] <King_DuckZ> I'm trying to figure out what's going on in the code flow... been doing that for the past 2 weeks
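
To untangle the two calls: av_read_frame() is what actually advances the read position (it returns demuxed AVPackets), while avcodec_send_packet()/avcodec_receive_frame() only feed and drain the decoder. A minimal sketch of the seek-then-decode-forward pattern being described (error handling and cleanup omitted; fmt_ctx, dec_ctx, stream_index and target_pts are placeholders):

    // jump to the keyframe at or before the wanted PTS, then decode forward
    av_seek_frame(fmt_ctx, stream_index, target_pts, AVSEEK_FLAG_BACKWARD);
    avcodec_flush_buffers(dec_ctx);          // drop frames buffered from before the seek

    AVPacket *pkt = av_packet_alloc();
    AVFrame *frame = av_frame_alloc();
    int found = 0;
    while (!found && av_read_frame(fmt_ctx, pkt) >= 0) {        // advances the demuxer
        if (pkt->stream_index == stream_index) {
            avcodec_send_packet(dec_ctx, pkt);
            while (avcodec_receive_frame(dec_ctx, frame) >= 0)  // drains decoded frames only
                if (frame->pts >= target_pts) { found = 1; break; }
        }
        av_packet_unref(pkt);
    }

Forgetting the avcodec_flush_buffers() call after a seek is a common cause of "it keeps resuming from the old position" behaviour, though without seeing the code that is only a guess.
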
[19:11:58 CEST] <King_DuckZ> this is fucked up https://alarmpi.no-ip.org/kamokan/cu?colourless
[19:12:27 CEST] <King_DuckZ> frame number goes up, pts loops around
[20:04:48 CEST] <sonicrocketman> hello all! Does anyone have experience with the x11grab device? I'm getting an error "Cannot get the image data event_error:" and I just get a 12KB, 1-second, all-black video.
[20:04:57 CEST] <sonicrocketman> I'm kinda lost on how to debug this, really.
[21:07:17 CEST] <retal> Hello guys, I am publishing a live RTMP stream like:
[21:07:17 CEST] <retal> ffmpeg -i SOURCE -preset fast ... -f flv rtmp://10.110.0.5:1935/test_rtmp/stream
[21:07:17 CEST] <retal> Now I need to publish with a password, how can I do that?
[21:09:52 CEST] <relaxed> retal: man ffmpeg-all|less +/^' 'rtmp
[21:10:37 CEST] <retal> relaxed, I did, but in the manual I see rtmp://username:password@myserver/
[21:10:51 CEST] <retal> I don't have a username, only a password
[21:12:09 CEST] <relaxed> maybe, rtmp://password@myserver/ ?
[21:12:30 CEST] <retal> relaxed, tnx i will try
[21:12:30 CEST] <sonicrocketman> maybe rtmp://:password@myserver/
[21:12:40 CEST] <sonicrocketman> notice colon before pass
[21:12:43 CEST] <retal> sonicrocketman, tnx
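
Putting sonicrocketman's suggestion into the original command gives something like the following (whether the empty-username form is accepted depends entirely on the RTMP server's auth setup):

    ffmpeg -i SOURCE -preset fast ... -f flv "rtmp://:password@10.110.0.5:1935/test_rtmp/stream"
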
[21:55:52 CEST] <trashPanda_> Hello, does anyone have any experience using programmatic filtergraphs? I'm feeding frames into the graph and receiving frames from it, however the PTS values of frames coming from the buffersink just increase by 1 each. Am I expected to convert this into my appropriate output timebase?
[21:57:32 CEST] <JEEB> I think both the sink and the buffer src had time bases
[21:57:57 CEST] <trashPanda_> I know the buffer src does, I didn't see any parameters for the sink in the documentation on the web
[21:58:06 CEST] <JEEB> and then you appropriately have to set the time base for the encoder AVCodecContext, and the muxer AVStream
[21:58:27 CEST] <trashPanda_> do you think it works (has the same names) as the buffer src?
[21:59:12 CEST] <JEEB> well I'm just looking at the example I did ages ago of doing something simple, and that one seems to set a time_base for the buffersrc
[21:59:29 CEST] <JEEB> ok, "buffer" I mean as far as the filter's name goes
[21:59:35 CEST] <trashPanda_> Yes, I did that. The output still has values increasing by 1
[21:59:53 CEST] <JEEB> trashPanda_: av_buffersink_get_time_base
[21:59:59 CEST] <JEEB> there's a thing called like this it seems
[22:00:21 CEST] <JEEB> https://www.ffmpeg.org/doxygen/trunk/group__lavfi__buffersink__accessors.html
[22:02:33 CEST] <trashPanda_> I'll see what I get from it, thanks
[22:02:52 CEST] <JEEB> np
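
A rough sketch of using that accessor (assuming buffersink_ctx is the buffersink's AVFilterContext and enc_ctx the encoder's AVCodecContext; error handling omitted):

    AVRational filt_tb = av_buffersink_get_time_base(buffersink_ctx);

    AVFrame *filt_frame = av_frame_alloc();
    while (av_buffersink_get_frame(buffersink_ctx, filt_frame) >= 0) {
        // frames leave the graph in the sink's time base; rescale for the encoder
        filt_frame->pts = av_rescale_q(filt_frame->pts, filt_tb, enc_ctx->time_base);
        avcodec_send_frame(enc_ctx, filt_frame);
        av_frame_unref(filt_frame);
    }
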
[22:03:44 CEST] <trashPanda_> how should "timebase" be used officially?
[22:04:26 CEST] <trashPanda_> I see it used in different ways depending on the context
[22:04:33 CEST] <JEEB> set it to the minimal value viable or pass it through as-is?
[22:04:56 CEST] <JEEB> AVStreams have them, AVCodecContexts have them, avfilter has them
[22:05:21 CEST] <trashPanda_> Yes, the AVStream seems to report a PTS conversion rate of 1/90000, i.e. a 90 kHz time base
[22:05:30 CEST] <JEEB> yup, that's MPEG-TS
[22:05:44 CEST] <trashPanda_> CodecContexts seem to report something closer to a framerate, 1001/60000 or 1001/30000 etc.
[22:05:48 CEST] <JEEB> yes
[22:06:01 CEST] <JEEB> generally they parse the value that most likely'ishly is the frame rate
[22:06:05 CEST] <trashPanda_> This framerate filter is reporting 1/30 (aka fps)
[22:06:12 CEST] <JEEB> yup
[22:06:30 CEST] <JEEB> that's the time base that the filter will use in the output then
[22:06:41 CEST] <JEEB> and since it's the fps filter it most likely will output the frames at +1
[22:06:42 CEST] <JEEB> each
[22:07:44 CEST] <trashPanda_> Yes, it is doing that. Ok, so do AVStreams typically report a PTS conversion rate, or is that just special to MPEG-2 TS? Should I expect other formats to output something similar to the AVCodecContext frame rates?
[22:08:01 CEST] <trashPanda_> Just curious for knowledge sake
[22:08:17 CEST] <JEEB> in containers like MPEG-TS where the time base is hard-coded you will get exactly that
[22:08:22 CEST] <JEEB> in other containers you can get the actual rate
[22:08:44 CEST] <JEEB> tl;dr don't expect stuff, other than DTS/PTS over the AVStream's time base
[22:08:57 CEST] <JEEB> you may get exact numbers, or you might get inexact
[22:09:26 CEST] <trashPanda_> Ok, thank you
[22:09:41 CEST] <JEEB> np
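
To make that concrete: a PTS is just a count of time-base ticks, so with MPEG-TS's 1/90000 time base a PTS of 450000 means 450000 × 1/90000 = 5.0 s, and with a codec time base of 1001/30000 a PTS of 150 means 150 × 1001/30000 ≈ 5.005 s; same instant, different units.
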
[22:10:22 CEST] <JEEB> the biggest problem generally is that AVPackets and AVFrames always have PTS/DTS from something, and you have to keep in mind from where that is
[22:10:41 CEST] <JEEB> demuxer AVPacket? AVStream it came from
[22:10:54 CEST] <JEEB> decoded AVFrame? AVCodecContext it came from
[22:11:07 CEST] <JEEB> filtered AVFrame? the filter graph
[22:11:24 CEST] <JEEB> encoded AVPacket? the AVCodecContext
[22:11:27 CEST] <JEEB> and so on
[22:11:40 CEST] <JEEB> and you have to convert between those if they differ
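
A minimal sketch of those conversions (the variable names are placeholders for whichever stream, codec or filter-graph context the packet or frame actually came from):

    // demuxed packet: input AVStream time base -> decoder time base
    av_packet_rescale_ts(pkt, in_stream->time_base, dec_ctx->time_base);

    // decoded or filtered frame: its source time base -> encoder time base
    frame->pts = av_rescale_q(frame->pts, src_time_base, enc_ctx->time_base);

    // encoded packet: encoder time base -> output AVStream time base
    av_packet_rescale_ts(out_pkt, enc_ctx->time_base, out_stream->time_base);
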
[23:10:26 CEST] <Aardwolf> Hi, I'd like to encode a single frame with a certain exact target filesize. I've tried combinations of the -r and -b:v parameters, but I can't get it to output the size I want. Is this possible? E.g. -r 1 and -b:v 100k should output a 100k file? (it doesn't). Thanks!
[23:11:14 CEST] <JEEB> that 100% depends on the encoder in use
[23:11:19 CEST] <JEEB> some suck at hitting specific rates
[23:11:40 CEST] <Aardwolf> I see, that explains why it differs wildly with different encoders
[23:11:58 CEST] <JEEB> also the bit rate is in bits per second, and kilo/mega are bases of 1000
[23:12:01 CEST] <JEEB> not 1024
[23:12:34 CEST] <Aardwolf> If there is a single frame, how many seconds is that? I hoped setting -r 1 would make it count as 1 second
[23:13:48 CEST] <JEEB> it depends on the time base and the duration of the picture. with one picture it's left rather vague
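
For the arithmetic's sake: -b:v 100k means 100 000 bits per second, so even if -r 1 made the single frame count as exactly one second, the target would be 100 000 bits ≈ 12.2 KiB rather than 100 KB, and whether the encoder's rate control actually lands on that number is a separate question.
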
[23:16:06 CEST] <trashPanda_> Hey JEEB, do you know of any reason I would keep getting frames out of the filter pipeline, despite not feeding it any more frames?
[23:16:22 CEST] <JEEB> depends on the filter chain I guess?
[23:16:48 CEST] <trashPanda_> I'm using a string to construct it, mind if I paste it?
[23:17:23 CEST] <JEEB> as long as it's not too long. if it's long just pastebin or so it and link here
[23:17:30 CEST] <trashPanda_> "buffer=width=%d:height=%d:pix_fmt=%s:time_base=%d/%d:sar=1, fps=fps=%i, %sbuffersink"
[23:18:13 CEST] <trashPanda_> I fill the variables in with the source width, height, input AVStream timebase num and den, target fps and the last %s is an empty string right now
[23:18:42 CEST] <trashPanda_> so basically, "buffer, fps, buffersink" not a long chain
[23:19:18 CEST] <JEEB> yea, no idea of the internals of the fps filter unfortunately
[23:20:21 CEST] <trashPanda_> so you think it would be that filter that keeps producing frames?
[23:21:58 CEST] <trashPanda_> that makes sense, my timebases must be off and it's trying to hit a wrong timebase number. thanks
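
That diagnosis fits how fps behaves: it duplicates or drops frames to hit the requested rate, judging the gaps by the input PTS interpreted in the buffer source's time_base, so a mismatched time base can make it emit long runs of duplicate frames. The invariant, as a sketch (names are placeholders):

    // in_tb must be the same time_base that was passed to the "buffer" filter,
    // e.g. the input AVStream's 1/90000
    AVRational in_tb = input_stream->time_base;

    // every frame pushed into the graph must carry its pts in that time base;
    // if the pts is in other units, fps sees huge gaps and fills them with duplicates
    frame->pts = av_rescale_q(frame->pts, frame_src_tb, in_tb);
    av_buffersrc_add_frame(buffersrc_ctx, frame);
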
[23:46:24 CEST] <trashPanda_> Does anyone know where in the code the -re command line argument (limiting read speed to the input src rate) is implemented?
[23:50:30 CEST] <JEEB> trashPanda_: in ffmpeg.c
[23:50:34 CEST] <JEEB> as in the example API client
[23:50:44 CEST] <JEEB> it just sleeps according to some heuristic I think
[23:50:57 CEST] <JEEB> so if your timestamps go wee-wee, it's going to 100% break on you
[23:58:09 CEST] <trashPanda_> is it in a function call defined elsewhere? I'm not seeing anything that sleeps in the actual ffmpeg.c file, sorry if I missed it
[23:58:48 CEST] <trashPanda_> nothing that would sleep like that anyway
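
If memory serves, -re (rate_emu) lives in ffmpeg.c's input-reading path, and rather than literally sleeping it keeps comparing each stream's DTS, rescaled to wall-clock units, against the elapsed real time and defers the read until the clock catches up, which is probably why no obvious sleep call jumps out. A rough sketch of the idea, not the literal ffmpeg.c code:

    #include <libavutil/avutil.h>
    #include <libavutil/time.h>

    // Rate emulation in one function: only hand out the next packet once
    // wall-clock time has caught up with the stream's DTS.  dts is assumed to be
    // in AV_TIME_BASE (microsecond) units; start is the av_gettime_relative()
    // value captured when reading began.
    static int input_is_due(int64_t dts, int64_t start)
    {
        int64_t due = av_rescale(dts, 1000000, AV_TIME_BASE); // DTS -> microseconds
        int64_t now = av_gettime_relative() - start;          // elapsed real time
        return due <= now;   // if not, the caller simply retries a bit later
    }
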
[00:00:00 CEST] --- Wed Oct 10 2018