[Ffmpeg-devel-irc] ffmpeg.log.20170214
burek
burek021 at gmail.com
Wed Feb 15 03:05:01 EET 2017
[00:01:58 CET] <jLuca> Hello guys, I need your help: I wanted to try streaming using directshow but there are no video devices available/installed, any guesses?
[00:02:07 CET] <jLuca> Here is the pastebin: http://pastebin.com/ef2dpJxY
[00:02:14 CET] <jLuca> Windows 10 x64 Pro N
[00:22:01 CET] <llogan> jLuca: IIRC, and I'm not a Windows user, you may have to install something first...
[00:22:05 CET] <llogan> https://github.com/rdp/screen-capture-recorder-to-video-windows-free
[00:22:11 CET] <llogan> but i may be wrong
[00:28:29 CET] <jLuca> I don't know what directshow is/how it works, I thought it had a native interface to capture the desktop screen
[00:28:37 CET] <jLuca> I will try to download that program
[00:29:06 CET] <jLuca> thanks llogan
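[For reference: ffmpeg on Windows can also capture the desktop without any extra DirectShow device, via its gdigrab input. A minimal sketch; the output file name is arbitrary:
    ffmpeg -f gdigrab -framerate 30 -i desktop -c:v libx264 -preset ultrafast capture.mkv ]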
[00:47:51 CET] <xtina> Hey guys. I'm currently streaming via ffmpeg with this command: http://pastebin.com/1zvEz3vJ
[00:48:26 CET] <xtina> my stream is great for about 1 minute. then, the audio and video bitrate start dropping. video goes from 250kbps to 150 and audio goes from 24kbps to 0!
[00:48:42 CET] <xtina> this causes my Youtube stream to enter a state of permanent buffering
[00:48:51 CET] <xtina> with no new frames ever being shown
[00:49:22 CET] <xtina> i receive the following warnings from Youtube: http://pastebin.com/qmu8n2YF
[00:49:30 CET] <xtina> notably, 'The audio stream's current bitrate (0) is lower than the recommended bitrate. We recommend that you use an audio stream bitrate of 128 Kbps.'
[00:49:44 CET] <xtina> why would my audio bitrate drop to 0 after 1 minute of streaming?
[00:51:25 CET] <atomnuker> silence in the stream probably
[00:52:42 CET] <xtina> atomnuker: this bitrate drop happens no matter what noise there is
[00:53:01 CET] <xtina> i have music playing so it isn't silent when the bitrate drops to 0
[00:53:30 CET] <xtina> anyway, silence shouldn't cause the Youtube stream to error out, otherwise silent streams would all fail?
[00:54:04 CET] <xtina> is it possible the 0 audio bitrate is not what's causing the Youtube stream to permanently buffer?
[00:55:50 CET] <xtina> i'm also getting a keyframe frequency error (i'm at 6s, need to be 4s or less). can i fix this? i'm not re-encoding with my ffmpeg command, only copying
[01:05:23 CET] <xtina> any ideas why my audio bitrate drops to 0? is this just silence or is it the reason my stream starts perma-buffering?
[01:13:52 CET] <atomnuker> I think it's silence, though it won't drop it to exactly 0 bytes
[01:14:05 CET] <atomnuker> it'll be close to like 4 kbps for aac silence
[01:16:32 CET] <explodes> Hello! -preset slow and -preset ultrafast load the presets with those names, which are key-value sets of options. Does the "speed" in the name indicate, generally, how long the conversion takes? What is the purpose of these "speed" presets?
[01:18:30 CET] <arog> hey, i want to write a c++ tool that can push frames to ffmpeg, which will save them to an mp4
[01:18:35 CET] <arog> how should i get started with that
[01:20:15 CET] <xtina> atomnuker: thanks, i'll try to test something louder then
[01:20:26 CET] <blb> arog: read the api docs
[01:22:32 CET] <JEEB> arog: basically you need to create AVFrames, then do format conversions if required and finally feed the result to an encoder which will produce AVPackets, which will in turn go to the mp4 muxer
[01:22:46 CET] <JEEB> arog: see the API and examples directory under docs in the source tree
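[A rough sketch of the flow JEEB describes, using the newer send/receive API; error handling and all the setup (codec open, stream creation, header writing) are omitted, and doc/examples/muxing.c in the source tree has the complete version:
    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>

    static void encode_and_mux(AVFormatContext *oc, AVStream *st,
                               AVCodecContext *enc, AVFrame *frame)
    {
        AVPacket pkt;
        av_init_packet(&pkt);
        pkt.data = NULL;
        pkt.size = 0;

        avcodec_send_frame(enc, frame);               /* frame == NULL flushes the encoder */
        while (avcodec_receive_packet(enc, &pkt) == 0) {
            /* rescale timestamps from the codec time base to the stream time base */
            av_packet_rescale_ts(&pkt, enc->time_base, st->time_base);
            pkt.stream_index = st->index;
            av_interleaved_write_frame(oc, &pkt);     /* hands the packet to the mp4 muxer */
            av_packet_unref(&pkt);
        }
    } ]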
[01:23:04 CET] <arog> thanks
[01:23:19 CET] <blb> JEEB is much more helpful
[01:23:41 CET] <arog> if cuda is available will it do the encoding on the graphics card
[01:23:43 CET] <arog> or does it do it on the CPU
[01:24:10 CET] <JEEB> depending on the driver and such you might be able to use the hardware encoding component on your GPU, but do note that its purpose is not high compression
[01:24:18 CET] <JEEB> what you mostly gain is speed and low latency
[01:24:32 CET] <arog> im not interested in doing heavy compression
[01:24:40 CET] <arog> is crf18 considered heavy compression?
[01:26:17 CET] <JEEB> arog: it's a rate control value for CRF, orthogonal to how good the encoder is at compression
[01:26:41 CET] <arog> hmm
[01:26:44 CET] <arog> ill check out the examples :)
[01:26:46 CET] <JEEB> it's like comparing an encoder doing preset ultrafast and preset veryslow with the same bit rate
[01:27:06 CET] <JEEB> the latter will get you much more bang for the same amount of bits
[01:27:44 CET] <JEEB> what I'm saying is that while the latest GPU ASICs are better than ultrafast (I think), they are meant for the part of the market that doesn't need that much compression capability
[01:28:29 CET] <JEEB> they're pretty much hard-wired for low latency (which by itself limits compression, since you can't use compression features that add latency) and some amount of speed
[01:28:37 CET] <arog> oh i see
[01:28:39 CET] <arog> okay
[01:29:18 CET] <JEEB> meanwhile libx264 will give you everything from ultrafast to placebo, and then low latency is a configuration (although I must say then the rest of FFmpeg's framework has to be also tweaked for it, which I have no idea about)
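[For reference, the low-latency libx264 configuration JEEB mentions usually starts from the zerolatency tune on the ffmpeg command line; a sketch with placeholder input and output:
    ffmpeg -i input -c:v libx264 -preset veryfast -tune zerolatency -f mpegts udp://127.0.0.1:1234 ]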
[02:02:48 CET] <xtina> hey guys. i'm getting the 'overrun!' messages in my ffmpeg stream. what does that mean?
[02:03:12 CET] <xtina> when this happens, i don't send frames for a sec, then continue sending. but my stream starts permanently buffering - it never continues.
[02:03:19 CET] <xtina> what does that mean?
[02:47:56 CET] <explodes> None of these videos I'm converting play in VLC..
[02:48:10 CET] <explodes> Audio works ok, but the video is a still picture, frame 0
[02:53:01 CET] <DHE> in terms of the API, is there an easy way to detect when the source video has changed? eg new or removed audio/video streams? or metadata changes?
[02:53:35 CET] <DHE> s/source video/streams in source container/
[02:54:55 CET] <faLUCE> DHE: for that I would use a callback in avio_alloc_context()
[02:56:31 CET] <faLUCE> when you read packets, you can obtain the opaque ptr of the producer in the callback
[02:56:41 CET] <DHE> uhh. what?
[02:57:57 CET] <faLUCE> DHE: I use that callback in order to customize packets written to the muxer
[02:58:40 CET] <DHE> no. I'm receiving data from a demuxer. the contents of the stream may change though, such as the number of audio streams or the languages involved. I need to detect when this has happened
[02:58:59 CET] <DHE> and while I suppose I can just constantly watch all the fields like a hawk, I'd like to know if there's a better way
[03:03:09 CET] <faLUCE> DHE: the demuxer is not a libav one?
[03:04:15 CET] <DHE> it is
[03:04:35 CET] <faLUCE> then, you can use a read callback, like avioreading shows
[03:04:47 CET] <DHE> shows what?
[03:04:53 CET] <faLUCE> I'm talking about avioreading.c
[03:05:14 CET] <DHE> I have a working application. it's fine as long as the video stream doesn't change from its specs on startup
[03:05:19 CET] <faLUCE> it shows how to capture packets from the muxer
[03:05:41 CET] <faLUCE> the callback can also have an opaque pointer
[03:05:52 CET] <DHE> eg: the stream changes from (1 video, 1 audio) to (1 video, 2 audio, 1 subtitle) or something like that
[03:06:15 CET] <DHE> at which point it rolls over and dies
[03:06:43 CET] <faLUCE> DHE: the concept is: you link an aviocontext to a muxer, then set a callback into the aviocontext which reads packets from the muxer
[03:06:58 CET] <faLUCE> and you also pass the stream's info to the callback
[03:07:37 CET] <faLUCE> so, as soon as you receive a packet, the callback will also give you the stream's info
[03:08:23 CET] <DHE> aviocontext is for the raw file/disk/network/whatever IO.
[03:08:33 CET] <faLUCE> yes
[03:08:34 CET] <DHE> that layer has nothing to do with the stream interpretations
[03:10:51 CET] <faLUCE> DHE: if you link a muxer to an aviocontext, then you also link a stream to it, because the muxer is linked to the stream
[03:15:03 CET] <faLUCE> avio_alloc_context(muxerAVIOContextBuffer, someCallback, ....); muxerContext->pb = muxerAVIOContext; avformat_new_stream(muxerContext...);
[03:15:12 CET] <DHE> I said demuxer
[03:15:23 CET] <faLUCE> DHE: muxer can be demuxer too
[03:19:22 CET] <DHE> umm.. no this means nothing in terms of the AVStreams changing
[03:20:33 CET] <faLUCE> why not? if the stream changes, the pointer to it that's linked to the muxer changes too
[03:20:52 CET] <DHE> no. the source of my data (network socket in this case) doesn't change
[03:22:13 CET] <faLUCE> DHE: then you could use multiple stream containers and change the link when the active stream changes
[03:23:19 CET] <DHE> you don't understand
[03:25:12 CET] <faLUCE> DHE: I'm pretty sure you can solve that with the callback I said before
[03:25:34 CET] <faLUCE> DHE: but I don't want to insist
[03:26:30 CET] <DHE> no. you're suggesting that using a custom file/stream IO handler will somehow assist in identifying when a second audio stream becomes available within it
[03:26:55 CET] <faLUCE> yes.
[03:27:58 CET] <faLUCE> because the handler is linked to the demuxer through muxerContext->pb
[03:28:09 CET] <faLUCE> DemuxerContext->pb
[03:29:20 CET] <faLUCE> so the chain is streams-->demuxer-->pb(AVIOContext)<--- IOHandler
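[A minimal sketch of the custom-AVIO read callback faLUCE is describing (whether it actually helps with detecting new streams is another matter); MySource, my_read and the fd field are made-up names for illustration:
    #include <errno.h>
    #include <unistd.h>
    #include <libavformat/avformat.h>
    #include <libavutil/error.h>
    #include <libavutil/mem.h>

    struct MySource { int fd; };                       /* whatever your I/O handler wraps */

    static int my_read(void *opaque, uint8_t *buf, int buf_size)
    {
        struct MySource *src = opaque;
        int n = read(src->fd, buf, buf_size);          /* plain POSIX read */
        return n > 0 ? n : (n == 0 ? AVERROR_EOF : AVERROR(errno));
    }

    static AVFormatContext *open_with_custom_io(struct MySource *src)
    {
        unsigned char *iobuf = av_malloc(4096);
        AVIOContext *pb = avio_alloc_context(iobuf, 4096, 0 /* read-only */,
                                             src, my_read, NULL, NULL);
        AVFormatContext *ic = avformat_alloc_context();
        ic->pb = pb;                                   /* the demuxer now pulls data through my_read */
        avformat_open_input(&ic, NULL, NULL, NULL);
        avformat_find_stream_info(ic, NULL);
        return ic;                                     /* then read packets with av_read_frame() as usual */
    } ]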
[03:30:31 CET] <DHE> I give up
[03:31:30 CET] <explodes> Is there a SOLID resource for how to format videos for streaming on mobile devices? I've just been collecting bits and pieces from here and there and putting them together
[03:32:12 CET] <faLUCE> explodes: what do you mean with SOLID resource? documentation?
[03:32:41 CET] <DHE> different devices have different capabilities. apple has docs for their various iDevices, just choose what minimum you want to support and meet it
[03:36:52 CET] <xtina> hey guys. I'm trying to understand why my video stream would enter a state of permanent buffering, even as I continue to send video and audio to it
[03:37:33 CET] <xtina> before I can debug I want to understand why a stream would ever permanently buffer
[03:37:38 CET] <xtina> even as it's receiving data?
[03:37:49 CET] <xtina> at the same audio and video bitrate as before
[04:04:36 CET] <dysfigured> i'm surprised there aren't better completions for zsh..
[04:28:29 CET] <explodes> faLUCE: Yea, some good examples or case studies
[04:28:53 CET] <explodes> faLUCE: there is plenty of documentation, i feel like i have hours of reading ahead of me
[04:29:08 CET] <explodes> i would like to see what people have already done, and works well for them
[04:32:16 CET] <explodes> All I could come up with so far is : http://ix.io/1Tk5
[04:33:20 CET] <explodes> But, then again, VLC can't even play the output
[04:53:12 CET] <llamapixel> Start with a spreadsheet that gives you the most common formats you want to target with the devices and expand on that.
[04:53:25 CET] <llamapixel> https://en.wikipedia.org/wiki/List_of_common_resolutions
[05:10:28 CET] <vans163> im trying to debug some latency streaming issues, i seem to be getting some choppiness / delay
[05:10:49 CET] <vans163> I'm noticing the first 3 frames arrive within 5ms of each other, then every other frame is 14-20 ms apart (60fps)
[05:11:01 CET] <vans163> could this in anyway set the stream back 3 frames? im thinking no
[05:11:32 CET] <vans163> the stream should correct itself within the first few seconds and catchup right?
[05:23:38 CET] <aster__> What is the purpose of AVPacketSideData in AVCodecContext? thanks
[05:29:35 CET] <vans163> anyone know what the param would be to prevent such large frames?
[05:31:24 CET] <vans163> the param is perhaps GOP frequency, but I'm seeing patterns when recording 60fps like: a 200kb frame, then ~300-byte frames for the rest of the GOP, then another 200kb frame
[05:31:45 CET] <vans163> then the decoder is choking every 7th-8th frame
[05:31:54 CET] <vans163> 0-1 ms decoder times, then 7th-8th frame 120ms
[05:32:06 CET] <vans163> is there some kind of param to smooth the stream out?
[05:32:26 CET] <vans163> perhaps increase the decode time from 0-1ms to 3-8ms, but not have those 120ms chunks
[05:32:47 CET] <vans163> anyone know what is arye?
[06:00:37 CET] <vans163> after some more investigation it seems it's related to something blocking somewhere while rendering the picture the decoder produces, and the decoder stalls if the picture in the pipeline is not removed
[06:01:14 CET] <vans163> i disabled rendering the decoded picture and everything is running at 0-1ms per frame decode call
[06:13:25 CET] <vans163> yea so rendering is taking way too long
[06:13:50 CET] <vans163> could this have to do with how the stream is encoded? perhaps it's an incompatible pixel format, so there's heavy conversion?
[06:15:27 CET] <Diag> um
[06:15:30 CET] <Diag> encoding takes forever
[06:20:15 CET] <vans163> Diag: encoding is fine. i need to rule out the local network then
[06:20:25 CET] <Diag> oh
[06:20:40 CET] <vans163> im seeing the client sometimes gets 3frames in the same millisecond
[06:22:38 CET] <vans163> once i started dropping frames that are too slow to paint, the buildup disappeared
[06:22:54 CET] <vans163> so no more 5+second delay. but its still choppy
[06:28:04 CET] <vans163> currently using managed code for the server that relays the stream. going to code up a c server
[06:28:05 CET] <vans163> tomorrow
[06:28:21 CET] <vans163> it seems like it could be a shitty nic / managed code
[06:29:15 CET] <vans163> the stream is arriving in stutters sometimes, which leads to 2-3 frames being in the pipeline within 1-3 ms of each other, which leads to 3 pictures popping out like triplets and 2 need to be dropped
[06:29:22 CET] <vans163> then we get the stutter
[08:22:09 CET] <xtina> hey guys, i'm streaming with ffmpeg, and after 90 seconds, my stream freezes in a state of permanent buffering
[08:22:15 CET] <xtina> This happens even though my bitrate/fps remain consistent
[08:22:30 CET] <xtina> what does this mean?
[08:22:31 CET] <xtina> trying to understand why a stream would get stuck in permanent buffering
[08:38:05 CET] <thebombzen> xtina: the stream freezes server-side or client-side?
[08:51:15 CET] <xtina> thebombzen: the stream enters a state of permanent buffering
[08:51:34 CET] <xtina> as i watch it from the Youtube live dashboard
[08:51:51 CET] <xtina> it appears to be server-side
[08:51:59 CET] <xtina> my client has no issues with buffering youtube streams in general
[09:06:52 CET] <Chen__> hello guys , who can help me?
[09:06:57 CET] <Chen__> Good morning.
[09:07:44 CET] <xtina> my FFMPEG stdout is showing me that I'm getting a consistent 200kbps bitrate, even after my stream is permanently buffering
[09:08:04 CET] <xtina> is the FFMPEG's reported bitrate reliable? does this mean i'm definitely still sending data to the stream?
[09:08:17 CET] <xtina> and what does that bitrate represent? audio + video bitrate?
[09:17:39 CET] <thebombzen> xtina: what do you mean "permanently buffering"
[09:17:48 CET] <thebombzen> does ffmpeg stop sending data?
[09:18:06 CET] <xtina> thebombzen: FFMPEG's stdout reports a steady 200kbps bitrate and 15FPS
[09:18:15 CET] <xtina> this means it is still sending data, right?
[09:18:24 CET] <thebombzen> stderr, and yes, ffmpeg is
[09:18:30 CET] <xtina> on my desktop I'm watching my Pi's livestream from Youtube
[09:18:35 CET] <thebombzen> also
[09:18:38 CET] <xtina> after the 90s mark, the stream buffers and never stops buffering
[09:18:52 CET] <xtina> (the circular buffering indicator.. and no new frames of video)
[09:18:53 CET] <thebombzen> keep in mind that if you're on home internet, your ISP might be screwing with your connection
[09:19:05 CET] <xtina> it isn't an issue with my computer.. i don't experience buffering with other streams
[09:19:20 CET] <thebombzen> well you're also ULing and DLing from the same computer
[09:19:26 CET] <thebombzen> so keep that in mind
[09:19:27 CET] <xtina> i watch video streams a lot from this computer, i never see buffering
[09:19:34 CET] <xtina> hmmm
[09:19:42 CET] <xtina> i am UL and DL from the same network
[09:19:45 CET] <xtina> not same computer
[09:19:49 CET] <thebombzen> okay
[09:19:51 CET] <thebombzen> sure
[09:19:52 CET] <xtina> if that's what you mean?
[09:20:02 CET] <thebombzen> yea
[09:20:10 CET] <xtina> so
[09:20:11 CET] <thebombzen> monitor your UL and DL speed
[09:20:15 CET] <thebombzen> do you stop ULing data?
[09:20:31 CET] <xtina> i've been battling a lot to ensure that my video and audio pipes don't get 'stuck' and lose frames
[09:20:41 CET] <xtina> i've expanded the max pipe buffer size on my Pi
[09:20:43 CET] <thebombzen> try watching it on your phone via your cellular data and see what happens
[09:20:49 CET] <thebombzen> instead of on your local network
[09:20:54 CET] <xtina> i'm trying to add a pipe viewer buffer to increase the size even more
[09:20:59 CET] <xtina> but i'm not sure if that's the issue at all
[09:21:03 CET] <thebombzen> sure but diagnose the problem before you try to fix it
[09:21:07 CET] <xtina> since ffmpeg tells me the bitrate/fps are constant and healthy
[09:21:12 CET] <xtina> does this mean there is no issue from the Pi side?
[09:21:17 CET] <thebombzen> perhaps
[09:21:22 CET] <thebombzen> it means there's no issue with ffmpeg itself
[09:21:28 CET] <thebombzen> I think this sounds like a network layering issue
[09:21:34 CET] <thebombzen> and you should diagnose the problem first.
[09:21:49 CET] <xtina> network layering, i.e. the issue is my network is used both for UL/DL
[09:21:51 CET] <xtina> right?
[09:21:55 CET] <thebombzen> perhaps
[09:22:02 CET] <thebombzen> maybe you should test to see if it's actually that
[09:22:03 CET] <xtina> i don't think i stop ULing only b/c ffmpeg tells me the stream's OK?
[09:22:05 CET] <thebombzen> which is what i've said like four times
[09:22:07 CET] <xtina> OK
[09:22:09 CET] <xtina> sorry
[09:22:15 CET] <xtina> i'll watch the stream from a different network
[09:22:25 CET] <xtina> thanks for the help, i appreciate it
[09:22:33 CET] <thebombzen> xtina: if ffmpeg says the stream is okay it just means that it's successfully dumped the encoded data somewhere
[09:22:40 CET] <thebombzen> that doesn't mean it actually arrived at youtube's servers
[09:23:06 CET] <xtina> hmm
[09:23:16 CET] <xtina> i see
[10:47:31 CET] <xtina_> thebombzen: i just tried watching the stream from a different network than the one I used to upload it from the Pi
[10:47:47 CET] <xtina_> even on a different network, I can see that the stream starts buffering after 90s and never stops
[10:48:27 CET] <xtina_> i looked again at ffmpeg's stderr and the reported bitrate is a healthy 250kbps
[10:48:48 CET] <xtina_> do you have any idea how i can debug this?
[10:51:13 CET] <xtina_> but i think i found a big clue
[10:51:17 CET] <xtina_> i just looked in the report log
[10:51:27 CET] <xtina_> and at the time when the youtube stream perma-freezes, i start getting this message:
[10:51:28 CET] <xtina_> 'Delay between the first packet and last packet in the muxing queue is 10100000 > 10000000: forcing output'
[10:51:35 CET] <xtina_> and i get it twice a second until the end of the stream
[10:51:37 CET] <xtina_> so this must be related!
[10:51:46 CET] <xtina_> but what does this mean?
[10:53:17 CET] <xtina_> while it reports a healthy 250kbps bitrate, it's spitting this message constantly while my stream is stuck permanently buffering
[11:25:17 CET] <pihpah> What's the point of converting video with aspect ratio 4:3 to, let's say, 16:9 using padding?
[11:25:24 CET] <pihpah> Is there any?
[11:25:50 CET] <BtbN> stupid players doing stupid things with it otherwise I guess
[11:26:01 CET] <BtbN> or strict resolution requirement by some hw stuff
[11:29:41 CET] <pihpah> So the whole reason is that some players can stretch it out without taking into account its actual aspect ratio?
[11:40:23 CET] <xtina_> guys, anyone know what this error is?
[11:40:28 CET] <xtina_> Delay between the first packet and last packet in the muxing queue is 10100000 > 10000000: forcing output'
[12:13:37 CET] <DHE> xtina_: you're using av_interleaved_write_frame() which buffers packets internally in order to flush them out in proper DTS order. but the separation got so large it's given up
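[For reference, the threshold DHE describes is the muxer option max_interleave_delta (default 10000000, i.e. 10 seconds in AV_TIME_BASE units), and it can be changed from the ffmpeg command line; a sketch with placeholder input and stream key:
    ffmpeg -re -i input.mp4 -c copy -max_interleave_delta 0 -f flv rtmp://a.rtmp.youtube.com/live2/STREAM_KEY ]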
[12:14:06 CET] <xtina_> DHE: you're right. i investigated and saw mention of max_interleave_delta
[12:14:12 CET] <xtina_> i've now set it to 0 so that it 'never gives up' (i think)
[12:14:19 CET] <xtina_> this seems to prevent this error from occurring. however -
[12:14:48 CET] <xtina_> at some point, for some reason, Youtube Live thinks the stream has ended
[12:14:57 CET] <xtina_> even though ffmpeg says it is still continuing to send streams
[12:15:18 CET] <DHE> you feeding the data properly? are frames (audio or video) being dropped and not being handled properly?
[12:15:43 CET] <DHE> I'm thinking you're losing data and audio/video sync is drifting out to the point the process gives out
[12:15:44 CET] <xtina_> perhaps Youtube thinks the stream is over because now ffmpeg is waiting so long to get a packet from A/V that Youtube stops waiting?
[12:16:07 CET] <xtina_> it's very likely that i'm losing data
[12:16:12 CET] <xtina_> i'm writing both audio and video to named pipes
[12:16:19 CET] <xtina_> and sending those in to ffmpeg
[12:16:21 CET] <xtina_> i'm on a pi zero
[12:16:25 CET] <xtina_> i've witnessed audio/video desync
[12:17:08 CET] <xtina_> i have heard of a few ideas for preventing frames from being dropped (insert a pipe buffer, use a faster SD card, use the XFS filesystem)
[12:17:21 CET] <xtina_> i have already increased my system's pipe max buffer size to 1MB
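[For the curious: on Linux the per-pipe buffer is normally grown with fcntl(F_SETPIPE_SZ), capped by the fs.pipe-max-size sysctl. A tiny sketch; the FIFO path is made up:
    #define _GNU_SOURCE                /* for F_SETPIPE_SZ */
    #include <fcntl.h>
    #include <stdio.h>

    int main(void)
    {
        int fd = open("/tmp/video_pipe", O_RDONLY | O_NONBLOCK);  /* read end won't block without a writer */
        if (fd < 0)
            return 1;
        int sz = fcntl(fd, F_SETPIPE_SZ, 1024 * 1024);            /* returns the actual new size */
        printf("pipe buffer is now %d bytes\n", sz);
        return 0;
    } ]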
[12:17:47 CET] <xtina_> hmmm
[12:18:24 CET] <xtina_> i can tolerate a stream with occasional audio/video desync and/or buffering
[12:18:37 CET] <xtina_> however, i want the system to recover and resync the A/V eventually
[12:18:39 CET] <xtina_> even if it takes 1-2s
[12:18:47 CET] <xtina_> is this syncing 'recovery' possible?
[12:19:23 CET] <xtina_> i need to ensure no frames ever get dropped from either pipe, right?
[12:54:35 CET] <DHE> or something that can produce a timestamped stream. pipes of raw video/audio have no timestamps and must be 100% complete
[12:55:45 CET] <xtina_> DHE: got it
[12:56:02 CET] <xtina_> the reason i'm using raw audio/video is that i'm trying to minimize processing
[12:56:06 CET] <xtina_> since i only have a Pi Zero
[12:56:32 CET] <xtina_> do you have any recommendations about which of the following would most improve my issues with dropped frames?
[12:57:00 CET] <xtina_> getting a faster SD card (i have a cheap no-name), writing/reading from XFS filesystem, inserting a pipe buffer tool into my audio/video pipes?
[13:03:11 CET] <furq> what are you doing which is hitting the filesystem
[13:03:41 CET] <furq> oh nvm, the fifo
[13:03:48 CET] <furq> you should really have those on a tmpfs
[13:04:03 CET] <furq> /tmp is probably already mounted as a tmpfs, but otherwise you can create one
[13:04:32 CET] <furq> but the actual fifo contents are in memory regardless, so that's probably not the issue
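[A quick way to double-check that the FIFOs really live in memory (paths are illustrative):
    df -T /tmp                              # Type column should say tmpfs
    mkfifo /tmp/video_pipe /tmp/audio_pipe ]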
[13:16:18 CET] <xtina_> furq: i am putting the pipes in /tmp :)
[13:16:25 CET] <xtina_> here's my full command:
[13:16:30 CET] <xtina_> http://pastebin.com/Fj2Tk1Ak
[13:16:55 CET] <furq> well yeah the filesystem and sd card will make no difference in that case
[13:17:02 CET] <furq> every bit of that should be in memory
[13:17:31 CET] <xtina_> so the only thing that could make a difference is if i could add a bigger buffer into the pipes? i'm maxed out at 1MB right now
[13:17:42 CET] <furq> maybe?
[13:17:53 CET] <xtina_> but
[13:18:00 CET] <xtina_> i thought the point of a better SD card was that
[13:18:01 CET] <furq> i've never had cause to route stuff through fifos
[13:18:03 CET] <xtina_> it would read/write faster?
[13:18:11 CET] <furq> yeah but no part of that is reading from the card
[13:18:16 CET] <xtina_> why does the SD card not make a difference if i'm writing to /tmp?
[13:18:19 CET] <xtina_> oh
[13:18:27 CET] <furq> tmp is a tmpfs, it's in memory
[13:18:31 CET] <xtina_> oh right
[13:18:41 CET] <furq> once ffmpeg has been loaded off the card (assuming it wasn't already in the cache) then nothing touches the card
[13:18:48 CET] <xtina_> right
[13:19:15 CET] <xtina_> hmmm
[13:19:49 CET] <furq> idk if the pi zero has an sd activity indicator but you should see that flashing if i'm wrong
[13:20:19 CET] <xtina_> it only has one LED .. for power i think
[13:21:23 CET] <xtina_> is there any point trying a really fast SD card with an XFS filesystem vs. fully writing/reading from memory as I'm doing now?
[13:21:53 CET] <xtina_> because
[13:22:16 CET] <xtina_> i read this RPi forum thread: https://www.raspberrypi.org/forums/viewtopic.php?t=43738&p=348942
[13:22:29 CET] <xtina_> in which a lot of people encountering my kind of audio/video issue switched to XFS
[13:22:37 CET] <xtina_> and it completely eliminated the problem for them
[13:22:52 CET] <xtina_> (i know they probably don't have Zeros)
[13:25:26 CET] <xtina_> is it possible that XFS would be better than in-memory?
[13:25:53 CET] <furq> no
[13:26:24 CET] <xtina_> so putting it in /tmp is the absolute best i can do
[14:15:55 CET] <DHE> a named pipe doesn't consume disk IO in any significant way even if you're moving gigabytes through it. at most mtime is being updated
[14:30:52 CET] <migimunz> Hi everyone. I want to use ffmpeg and volumedetect to get the mean and max volumes from an audio file, but I'd like it to return something machine-parsable, like json
[14:31:10 CET] <migimunz> ffprobe does have this option, but I don't understand if I can make it do volumedetect
[14:31:17 CET] <migimunz> the docs do list the filters, but I'm not sure how to make it work
[14:31:54 CET] <DHE> the filters do their own thing independent of the output of ffprobe
[14:32:57 CET] <DHE> you'll have to just collect the logs and parse them as-is, unless you want to modify the filter itself
[14:33:09 CET] <migimunz> DHE: so the filters will only print to stdout, they don't actually contribute to the output
[14:33:12 CET] <migimunz> I understand, thank you
[14:34:04 CET] <DHE> technically they print to the log at level INFO. choosing a -loglevel that's too stringent would hide the output
[14:35:20 CET] <DHE> also they go to stderr so that stdout can be redirected for pipe-based outputs
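[Putting DHE's two points together, a typical invocation scrapes stderr, e.g. (the file name is a placeholder):
    ffmpeg -hide_banner -i input.wav -af volumedetect -f null - 2>&1 | grep -E 'mean_volume|max_volume' ]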
[14:38:29 CET] <durandal_1707> migimunz: you can use astats filter output
[14:39:27 CET] <migimunz> durandal_1707, thanks, reading the docs now
[16:21:43 CET] <vans163> anyone know the difference between h264 streams where there is 1 large 200kb frame and many small 300-byte ones (while the image is pretty still/idle), vs 3000-5000 byte frames with 50-100kb for the large ones?
[16:21:55 CET] <vans163> When I set the VBV buffer I get the latter, otherwise it's the former
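[For anyone following along, with libx264 that VBV constraint is usually expressed like this (the numbers are only an example); a small bufsize forces the encoder to spread bits out instead of emitting huge keyframes:
    ffmpeg -i input -c:v libx264 -b:v 3M -maxrate 3M -bufsize 1M output.mp4 ]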
[16:57:26 CET] <spirytusrektus> Hi, has anyone ever used framerate filter? I am trying to use it on 29.97 fps interlaced media (together with yadif) but result is blurry
[16:58:52 CET] <spirytusrektus> my filter description is "yadif=0:0:0,format=pix_fmts=yuv420p,framerate=25:15:240:7"
[17:00:05 CET] <spirytusrektus> is there any way to prevent blurriness? I did try different parameters for framerate filter but the result was not satisfactory
[17:05:47 CET] <furq> spirytusrektus: try the minterpolate filter
[17:06:00 CET] <furq> you probably also want to use yadif in mode 1 so you've got more frames to work with
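[If it helps, the combination furq suggests would look roughly like this (25 fps picked to match the earlier request):
    -vf "yadif=mode=1,minterpolate=fps=25:mi_mode=mci,format=yuv420p" ]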
[17:19:04 CET] <spirytusrektus> furq: thanks for tips
[18:25:14 CET] <nohop> hey guys. We have to use ffmpeg to encode a video (only) stream. It has to be cross-platform (linux and windows) and use NVenc
[18:25:41 CET] <nohop> Is it a common way of doing things to actually invoke the ffmpeg binary and push raw data up its stdin, and make it encode that way?
[18:26:12 CET] <nohop> or would that be considered ugly (i feel like it is, but i have a feeling this is done often? am i wrong?)
[18:26:33 CET] <nohop> should i, instead, just use libavcodec?
[18:48:04 CET] <vans163> can frames be dropped out the encoder or should pictures be?
[18:48:20 CET] <vans163> **decoder sorry
[18:48:40 CET] <Diag> yes
[18:49:08 CET] <vans163> Diag: would it lead to anything adverse? like I see some frames are 9kb others are 100-200kb. what if a 100-200kb frame gets dropped?
[18:49:22 CET] <vans163> or would the stream recover naturally from anything?
[18:49:25 CET] <Diag> well it depends
[18:49:37 CET] <Diag> typically
[18:49:52 CET] <Diag> if you jump right into an mpeg stream (depending on the decoder)
[18:49:56 CET] <Diag> it could either
[18:50:01 CET] <Diag> use the last keyframe
[18:50:03 CET] <vans163> h264 stream* i should be specific
[18:50:07 CET] <Diag> or
[18:50:14 CET] <Diag> you could get bits and pieces of the image
[18:50:20 CET] <Diag> and the rest would be a solid color
[18:50:28 CET] <Diag> but i mean
[18:50:35 CET] <Diag> im sure youve seen in extremely compressed shit
[18:50:39 CET] <Diag> (this is just for an example)
[18:50:43 CET] <Diag> when a scene changes
[18:50:49 CET] <Diag> and it uses the image from the last keyframe
[18:50:58 CET] <Diag> with the movement data from the next frame
[18:51:06 CET] <Diag> you know what i mean?
[18:51:51 CET] <vans163> hum.. I guess if I implement the dropping I will better understand. I have a problem where I am doing low latency decoding and sometimes the client is unreliable and gets 4-6 frames of a 60fps stream at the same time. I'm not sure how to handle this right now
[18:52:10 CET] <vans163> If I decode them all then drop the pictures, i get stutter
[18:52:16 CET] <Diag> idunno
[18:53:31 CET] <vans163> hum.. would it be possible to SwapBuffers off the mainthread?
[18:54:18 CET] <Diag> idunno im not an ffmpeg person, you need someone smart like uhh
[18:54:21 CET] <Diag> furq or kerio
[18:56:00 CET] <vans163> Diag: gotcha, im really curious how to get to the bottom of this
[18:58:25 CET] <kerio> how am i smart
[19:57:59 CET] <lanc> Hi all, I'm currently working with libavfilter and am having some issues understanding an error code I'm receiving. I've created an abuffer_ctx and a channelsplit_ctx, then set up the filters. After that I link one of the channels using avfilter_link(abuffer_ctx, 0, channelsplit_ctx, 2), but get error code -22. How can I find out more information about this error -22?
[20:10:01 CET] <lanc> Hmm, I've tracked down that it's an invalid argument error
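[For future reference, libavutil can translate those codes for you; -22 is AVERROR(EINVAL), i.e. "Invalid argument". A tiny sketch:
    #include <stdio.h>
    #include <libavutil/error.h>

    int main(void)
    {
        char buf[AV_ERROR_MAX_STRING_SIZE];
        av_strerror(-22, buf, sizeof(buf));
        printf("%s\n", buf);               /* prints the human-readable description */
        return 0;
    } ]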
[20:37:36 CET] <arog> lanc you here?
[21:13:36 CET] <arog> are there any easy to follow samples to build my own application and use nvenc to make an h264/mp4 file?
[21:21:01 CET] <Diag> yes
[21:32:49 CET] <jarkko> how do i enable 10-bit x265 at compile time?
[21:36:08 CET] <EGreg> hi all
[21:36:13 CET] <EGreg> what is the command to draw on the video
[21:36:30 CET] <EGreg> for example if I choose "blue color" from a palette, and draw something with a brush, and I captured the positions, etc.
[21:36:51 CET] <EGreg> what commands do I execute to animate this "drawing" on the video at certain times in the video, and then clear it at a later time
[21:37:06 CET] <Diag> are you sure you have the right software?
[21:37:31 CET] <EGreg> I can use -vf drawbox
[21:37:40 CET] <EGreg> Yeah, I thought ffmpeg can work similarly to let's say imagemagick
[21:37:42 CET] <cousin_luigi> Greetings.
[21:37:46 CET] <Diag> oh idunno
[21:37:49 CET] <Diag> sup
[21:37:49 CET] <EGreg> drawbox=10:20:200:60:red@0.5
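[To make that box appear and disappear at chosen times, drawbox supports timeline editing through its enable option; a sketch with placeholder file names and times:
    ffmpeg -i in.mp4 -vf "drawbox=x=10:y=20:w=200:h=60:color=red@0.5:t=fill:enable='between(t,5,12)'" -c:a copy out.mp4 ]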
[21:37:57 CET] <EGreg> ok so let someone else answer who knwos :)
[21:37:59 CET] <Diag> im learning so much shit about ffffffmpeg
[21:38:15 CET] <Diag> EGreg: yeah theres a couple people in here who are pretty damn smart
[21:38:16 CET] Action: cousin_luigi needs to concatenate a few hundreds clips and zoom in them too.
[21:38:24 CET] <cousin_luigi> I could use a nudge in the right direction.
[21:38:35 CET] <Diag> cousin_luigi: https://trac.ffmpeg.org/wiki/Concatenate
[21:39:05 CET] <Diag> also
[21:39:06 CET] <Diag> iirc
[21:39:19 CET] <Diag> to zoom you have to scale and crop..... (i think, it might be different by now)
[21:39:35 CET] <cousin_luigi> Yes, that was my doubt.
[21:39:38 CET] <Diag> yeah looks like theres a better way to do it now
[21:39:44 CET] <cousin_luigi> perhaps I'll leave that to the player if it's too messy
[21:39:57 CET] <Diag> heh
[21:40:03 CET] <Diag> there is a better way to do it
[21:40:07 CET] <Diag> im just unaware
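[A rough sketch of both steps, using the concat demuxer from the wiki page above plus the zoompan filter for the zoom (the list file, output size and zoom rate are all placeholders):
    printf "file '%s'\n" clip*.mp4 > list.txt
    ffmpeg -f concat -safe 0 -i list.txt -vf "zoompan=z='min(zoom+0.001,1.5)':d=1:s=1280x720" -c:a copy out.mp4 ]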
[21:40:38 CET] <cousin_luigi> Bah, nevermind. I'm tired of the movie too:)
[21:40:39 CET] <cousin_luigi> bbl
[21:44:33 CET] <arog> Diag: which sample shows how to use nvenc ?
[21:53:25 CET] <arog> https://ffmpeg.org/doxygen/trunk/api-example_8c-source.html i found this -- would the code in video_encode be identical regardless of the encoder I use?
[22:17:07 CET] <kepstin> arog: I'm not sure about nvenc, but some of the other hw encoders (e.g. vaapi) require an explicit step to upload the video into video memory before encoding.
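[For comparison, the explicit upload kepstin mentions looks like this on the vaapi side (device path and file names are placeholders):
    ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mp4 -vf 'format=nv12,hwupload' -c:v h264_vaapi output.mp4 ]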
[22:25:18 CET] <arog> kepstin okay will check it out
[22:25:26 CET] <arog> gah im running into all these build issues because my app is c++
[22:26:23 CET] <JEEB> when you include FFmpeg headers, add extern "C"
[22:26:45 CET] <JEEB> pretty sure it's mentioned in the FAQ
[22:26:47 CET] <JEEB> in the docs
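[The wrapper JEEB is referring to looks like this (header list trimmed to the common ones):
    // FFmpeg's public headers are C, so a C++ translation unit needs C linkage:
    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    #include <libavutil/avutil.h>
    } ]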
[22:33:33 CET] <nohop> We need to encode a real-time video stream in our application. Is it 'bad design' to invoke ffmpeg and pipe raw rgb data into its stdin (and make ffmpeg output to a file)? Should I use libavcodec instead?
[22:35:05 CET] <kepstin> nohop: piping raw data into ffmpeg generally works ok, but you might run into some problems if you're relying on it working realtime, due to small pipe buffers (or, well, big frames) and blocking synchronous writes
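[A concrete sketch of the pipe approach (the producer name, resolution, pixel format and rate are placeholders for whatever the application emits):
    your_app | ffmpeg -f rawvideo -pixel_format rgb24 -video_size 1920x1080 -framerate 30 -i - -c:v libx264 -preset fast output.mp4 ]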
[22:35:34 CET] <bray90820> OFF TOPIC: Does anyone know of a legal way to make a copy of a non encrypted DVD preferably on OSX but windows will work as well
[22:36:06 CET] <kepstin> non-encrypted? You should just be able to copy the files off it with any old file manager...
[22:36:44 CET] <nohop> kepstin: oh, so in our case, with our _extremely_ large frames (48MB per frame) this is not going to be efficient/reliable/recommended? :)
[22:37:30 CET] <kepstin> nohop: it'll work, as long as you push off the actual writes to a separate IO thread so they don't block the rest of your app
[22:37:55 CET] <nohop> OOh, yeah. Of course that's the plan :)
[22:38:17 CET] <nohop> another, probably dumb question, and I'm sure is kind of the wrong place here... I'm going to try it anyways
[22:38:18 CET] <kepstin> it would be faster to use libav directly tho, because then you could write your data directly into avframes and encode them without copying
[22:38:55 CET] <nohop> are these kinds of pipes efficient in windows ? I know it'll be fine under linux, but i have little experience using them in windows
[22:39:12 CET] <nohop> kepstin: yeah... I'm looking into that. I'm having a bit of trouble getting that to work in windows as well :)
[22:39:15 CET] <kepstin> I have no experience with this sort of ipc in windows :/
[22:39:33 CET] <bray90820> kepstin: would it play on a normal DVD player or would it need to be one that specifically plays burnable dvd's
[22:39:35 CET] <nohop> ok
[22:40:22 CET] <kepstin> bray90820: most software dvd players can play from a directory of unencrypted dvd files directly
[22:40:25 CET] <arog> JEEB yeah i fixed it thanks :D
[22:41:07 CET] <arog> okay I want to make sure I understand this correctly. so a while back I did sudo apt-get install ffmpeg, but that doesn't install the nvenc encoder which I want to use. So I downloaded the ffmpeg source, built it manually, and configured it to use nvenc
[22:41:20 CET] <kepstin> bray90820: if you want to write them back to another dvd-r, you'll need a tool that writes the format correctly, just writing as a data disk won't always work
[22:41:22 CET] <arog> now in my code if i do avcodec_find_encoder_by_name("nvenc") it returns NULL
[22:41:29 CET] <arog> if i do ffmpeg -encoders I see nvenc in that list
[22:41:53 CET] <kepstin> arog: you've probably linked your code to the system libav* libraries rather than your newly built ones
[22:42:12 CET] <arog> kepstin: the newly built ones are going to be in ~ffmpeg/ right?
[22:42:17 CET] <arog> i follwoed this example
[22:42:28 CET] <arog> https://trac.ffmpeg.org/wiki/CompilationGuide/Ubuntu
[22:42:33 CET] <bray90820> kepstin: I did wanna write them to another DVD
[22:42:35 CET] <arog> ~/ffmpeg_sources/
[22:45:19 CET] <arog> okay thanks kepstin i think i know what to do :)
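[One common fix is to point pkg-config at the locally built prefix before compiling; a sketch assuming the ~/ffmpeg_build prefix used by that guide (target names are placeholders):
    export PKG_CONFIG_PATH="$HOME/ffmpeg_build/lib/pkgconfig"
    g++ myapp.cpp $(pkg-config --cflags --libs libavformat libavcodec libavutil) -o myapp ]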
[22:46:06 CET] <kepstin> bray90820: I know there's a few applications that can write dvd-video from a dvd structure on disk, maybe google search "burn video_ts" or something like that
[22:46:26 CET] <kepstin> on linux, I'd just say use dd or ddrescue to make an iso, then burn that :/
[22:46:51 CET] <bray90820> kepstin: Yeah that's not a bad idea
[22:47:06 CET] <bray90820> I know of one I used to use called cloneDVD but I don't wanna pay $40 for it
[22:47:21 CET] <bray90820> I used to use the free trial then reinstall windows
[22:47:52 CET] <kepstin> man, I haven't burned physical disks in many, many years...
[22:48:26 CET] <bray90820> Same here but my friends mother still uses DVD's
[22:49:03 CET] <furq> bray90820: imgburn works on windows
[22:49:51 CET] <bray90820> furq: Would you happen to know how well that works with wine?
[22:50:09 CET] <furq> i don't but if you're not on windows then don't bother
[22:50:34 CET] <kepstin> if you're on linux, the dd then burn iso method is fast and trivial. If you're on mac...? :/
[22:50:44 CET] <bray90820> Like I said I could use windows but OSX is more convenient for me
[22:50:58 CET] <furq> are there actually any special considerations when it comes to burning a video dvd
[22:51:26 CET] <furq> on the few occasions i've done this i just stuck the VIDEO_TS dir in the root and it worked
[22:51:41 CET] <furq> idk if that was down to imgburn being smart though
[22:51:48 CET] <kepstin> just need to make sure it's a specific UDF version, and playback is smoother on older players if the files are laid out on disk in a particular order
[22:52:10 CET] <furq> don't most dvd writing apps default to iso9660+udf
[22:52:30 CET] <kepstin> most newer players are less picky about the UDF version and have faster drives/more ram so the file order doesn't matter as much
[22:52:59 CET] <kepstin> but yeah, many burning apps have dvd-video presets or modes that handle this for you
[22:53:35 CET] <furq> https://discussions.apple.com/thread/3769331?start=0&tstart=0
[22:54:13 CET] <kepstin> e.g. if you're using growisofs on linux, you just use the '-dvd-compat' option and give it the files, and it'll do the right thing
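[Roughly, the growisofs invocation kepstin describes (device and directory are placeholders; the directory should contain the VIDEO_TS tree):
    growisofs -dvd-compat -Z /dev/dvd -dvd-video /path/to/dvd_root ]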
[22:55:45 CET] <kepstin> if you can use dd or something on mac os x on the existing unencrypted dvd, then you can burn that with any tool (since the structure's already in the iso)
[22:56:07 CET] <furq> assuming it's a commercial unencrypted dvd and not a copy
[22:57:07 CET] <bray90820> furq: Sure lets go with that
[22:57:46 CET] <kepstin> (multi-layer dvds are fun, apparently commercial dvd authoring software lets you pick where to split files over layers since older players have noticeable pauses when switching)
[22:58:05 CET] <bray90820> I actually have a plugin for windows that bypasses encryption
[22:58:13 CET] <furq> yeah i can imagine that's trickier
[22:58:35 CET] <furq> i don't think i ever actually managed to burn a dual layer dvd without it breaking
[22:58:41 CET] <furq> idk if i got a bad batch or what
[22:58:54 CET] <kepstin> they were so expensive I never bothered :/
[22:59:12 CET] <furq> yeah i got a pack of 10 on offer and the first five or six all aborted halfway through
[22:59:16 CET] <furq> i guess that's why they were on offer
[22:59:53 CET] <furq> it was some reseller whose discs normally id'd as taiyo yuden discs, but the DLs were not of the same high quality
[23:00:21 CET] <furq> christ i've not thought about taiyo yuden in the best part of a decade
[23:00:47 CET] <furq> i'm sure business is booming for those guys
[23:01:51 CET] <kepstin> furq: I think their main business is capacitors and other random electronic bits, they're probably doing fine :/
[23:02:20 CET] <furq> so it is
[23:02:35 CET] <furq> good for them
[23:03:29 CET] <furq> In Japan, Korea and Greece, Taiyo Yuden was distributing its own brand "That's".
[23:03:40 CET] <furq> ok
[23:16:32 CET] <arog> libavcodec.a(cscd.o): undefined reference to symbol 'uncompress' //lib/x86_64-linux-gnu/libz.so.1: error adding symbols: DSO missing from command line
[23:16:35 CET] <arog> have you seen that before
[23:19:55 CET] <arog> when building my own libavcodec is it possible to replace the one in /usr ?
[23:19:57 CET] <arog> if so how do i do that
[23:20:17 CET] <arog> or is that not advisable
[23:21:30 CET] <furq> ./configure --prefix=/usr
[23:21:43 CET] <furq> you should probably keep it in /usr/local though
[23:22:33 CET] <furq> anything on your system which depends on your current libavcodec is pretty likely to fuck up if you replace it
[23:24:32 CET] <arog> good idea
[23:24:34 CET] <arog> thanks
[23:33:05 CET] <arog> /usr/bin/ld: /usr/local/lib/libavcodec.a(cscd.o): undefined reference to symbol 'uncompress'
[23:33:06 CET] <arog> /lib/x86_64-linux-gnu/libz.so.1: error adding symbols: DSO missing from command line
[23:33:08 CET] <arog> y
[23:33:11 CET] <arog> damn still not working
[23:33:13 CET] <arog> any idea?
[23:34:06 CET] <kepstin> you're linking to static libraries, so you have to manually include all the dependencies on the link command line as well
[23:34:21 CET] <kepstin> it might be easier if you rebuild your ffmpeg with shared libraries?
[23:34:33 CET] <arog> hmm
[23:34:34 CET] <arog> oh
[23:34:41 CET] <arog> good point I am only linking to libavcodec
[23:34:45 CET] <arog> i will just link to all ffmpeg_libraries
[23:34:56 CET] <arog> http://stackoverflow.com/questions/24989432/linking-error-dso-missing-from-command-line
[23:35:00 CET] <arog> found this right as I answered you :)
[23:35:14 CET] <kepstin> in this case, you need to also link to libz (-lz) since the libavcodec static library uses it.
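[Concretely, that means either listing the extra libraries by hand or letting pkg-config expand libavcodec's static dependencies; a sketch with placeholder object/target names (depending on configure options, more libs such as -lx264 or -llzma may also be needed, which is why the pkg-config form is easier):
    g++ myapp.o -L/usr/local/lib -lavformat -lavcodec -lswresample -lavutil -lz -lm -lpthread -o myapp
    g++ myapp.o $(pkg-config --static --libs libavformat libavcodec libavutil) -o myapp ]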
[23:59:50 CET] <vlad__> hello! I am editing a C++ application that has simple png dumping capability. I want to instead write the frames to a pipe and stream in real time with ffmpeg
[00:00:00 CET] --- Wed Feb 15 2017