[Ffmpeg-devel-irc] ffmpeg.log.20190107
burek
burek021 at gmail.com
Tue Jan 8 03:05:02 EET 2019
[00:07:51 CET] <DHE> because you've lost information from the original image. reversing it is naturally imperfect
[00:09:02 CET] <ring0> BtbN, here I found some ffmpeg reference for using it in combination with a fpga on page 11: https://www.xilinx.com/publications/events/developer-forum/2018-frankfurt/state-of-fpga-acceleration.pdf
[00:11:06 CET] <DHE> sure, but you need the programming for those FPGAs. I'm guessing it's not open or free...
[00:11:49 CET] <DHE> also FPGA- or ASIC-based encoding has historically been of somewhat poor quality compared to what a CPU can do... my Ryzen 7 (first gen) can encode 1080p at 60fps realtime more efficiently than my nvidia card
[00:12:55 CET] <ring0> you could do the implementation on your own with verilog/vhdl or use the non-free ip cores, sure
[00:13:32 CET] <ring0> looking at history, that's totally correct, DHE
[00:15:04 CET] <ring0> I also saw that ffmpeg has already touched on this approach, looking at https://trac.ffmpeg.org/ticket/7214
[00:20:50 CET] <iive> DHE, depends how you define efficiently. I know somebody using an nvidia pro card to encode more than 16 HD streams in realtime.
[00:23:07 CET] <iive> so the nvidia encoder is a beast. does it produce the same quality for that bitrate as x264 would? probably not, but it definitely doesn't need 16 separate computers.
[00:24:02 CET] <iive> consumer/gaming cards use the same encoder, the drivers just limit them to 1 or 2 concurrent encodes.
[00:34:16 CET] <DHE> iive: it has the capacity. I have a somewhat old quadro (the kind that doesn't get the 2-streams-max limit removed) and estimated it could do 300fps at 1080p
[00:34:27 CET] <DHE> even in its "best quality" mode
[00:34:43 CET] <DHE> but x264's medium preset (the default) at otherwise identical bitrate settings looks better than nvenc does
[00:35:10 CET] <DHE> I've heard that the rtx2000 series has a new-generation encoder that does better, but I haven't seen it myself
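A rough way to reproduce DHE's comparison at home (a sketch; input.mp4 and the 4M bitrate are placeholder assumptions, and h264_nvenc requires an ffmpeg build with nvenc support):

  ffmpeg -i input.mp4 -an -c:v libx264 -preset medium -b:v 4M out_x264.mp4
  ffmpeg -i input.mp4 -an -c:v h264_nvenc -preset slow -b:v 4M out_nvenc.mp4

Holding the bitrate constant and then comparing the two outputs, by eye or with ffmpeg's psnr/ssim filters, is the usual way to make this kind of quality claim concrete.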
[00:52:56 CET] <iive> DHE, we totally agree.
[01:25:02 CET] <Neoflash> Hello everyone. Man, this feels weird. I haven't done IRC since 1998. I didn't know it was still around. Who needs Slack or Discord, amirite!
[01:31:36 CET] <Neoflash> So I guess I'll start with an easy question: how the f@?% do I grok video? I am working on a small dev project that involves taking a live video stream coming from a remote camera on a UDP port and transmitting it to web browser clients with as little latency as possible. In trying to get this working I have stumbled on ffmpeg, but I realize that I don't know enough about digital video to make use of it. Where do I start?
[01:32:17 CET] <klaxa> i guess wikipedia is good at giving general outlines
[01:32:58 CET] <klaxa> https://en.wikipedia.org/wiki/Digital_container_format would probably be a good start?
[01:33:38 CET] <klaxa> there are probably a lot of other good starting points
[01:37:19 CET] <Neoflash> I already read that entry and a bunch of others on digital video, but I guess I still don't understand how it all fits together. For starters, what the heck am I getting on that UDP port?
[01:39:04 CET] <klaxa> hard to know from just knowing it's udp
[01:39:08 CET] <klaxa> probably rtp?
[01:39:28 CET] <klaxa> https://en.wikipedia.org/wiki/Real-time_Transport_Protocol
[01:39:45 CET] <klaxa> or rather maybe
[01:40:56 CET] <Neoflash> I have also read that entry. How do I figure out if it's coming in thru RTP or just straight UDP messages?
[01:42:07 CET] <klaxa> can you play it back? if not, you can use wireshark to figure out what kind of packets you are receiving
[01:42:29 CET] <iive> ffplay might be able to figure that out.
[01:42:47 CET] <iive> UDP may also be mpeg-ts, but that is not very likely.
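Before breaking out wireshark, ffprobe will often identify the stream on its own (a sketch; the address is a placeholder for the camera's actual IP and port):

  ffprobe -hide_banner udp://<camera-ip>:<port>

If autodetection succeeds it prints the container and codec it found; if it hangs or errors out, packet-level inspection is the next step.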
[01:43:59 CET] <Neoflash> I tried ffplay and it didn't quite work for me, but I was able to use ffmpeg to save the stream to an mp4 file (I just followed blog posts, docs and tutorials without really understanding what any of it meant). The file works, I was able to play it.
[01:46:18 CET] <Neoflash> I didn't know about Wireshark... seems like a good place to start.
[01:51:48 CET] <friendofafriend> Neoflash: You'll need to find a server that will accept ffmpeg as a source and handle your web clients. I've used Icecast for that.
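For reference, pushing an ffmpeg encode to an Icecast mountpoint looks roughly like this (a sketch; the host, password, mountpoint and WebM codec choice are all assumptions):

  ffmpeg -re -i input.mp4 -c:v libvpx -c:a libvorbis \
    -content_type video/webm -f webm icecast://source:hackme@localhost:8000/stream.webm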
[01:52:18 CET] <Neoflash> I guess one of the things I don't quite understand is how a video "file" can be written and read live. How can a partial file be read? Wouldn't it normally be considered corrupt, because it is missing important information?
[01:53:05 CET] <Neoflash> friendofafriend I am writing my own Node.js server.
[02:04:50 CET] <Neoflash> Ok, I've installed Wireshark but I need to restart my computer. Thanks for the help so far, I'll be back later.
[02:23:13 CET] <Neoflash> Hey I'm back with a fresh install of Wireshark and ready to test those UDP packets. I'll try to figure it out on my own and report back.
[02:41:52 CET] <Neoflash> Ok, so there's a bunch of stuff coming in, but as far as I can see, the packets on the port I'm interested in show up as plain UDP and, apart from a few exceptions here and there, always have a payload of 1460 bytes.
[02:44:21 CET] <Neoflash> Which would explain why I'm able to just feed them to ffmpeg without any other kind of preprocessing, I guess. So where do I go from there? How do I figure out what exactly each packet's payload represents?
[03:00:49 CET] <DHE> it depends on the container. RT[M]P isn't an area I know well, but there are "file formats" that are streaming-friendly, or RTP may act as both "container" and network transport at the same time so you don't have to worry about it...
[03:03:11 CET] <meiamsome> If you successfully fed it through ffmpeg, wouldn't it have logged what format it detected for the input stream?
[03:04:36 CET] <Neoflash> I did not notice anything like that, let me run it through again and see if it displays the format of the input stream.
[03:11:08 CET] <Neoflash> Ok, I did a simple ffmpeg -i udp://192.168.10.1:11111 output.mp4 and it spewed out a bunch of stuff, of which I understand pretty much nothing. What am I looking for?
[03:12:10 CET] <Neoflash> On the bright side the output file works.
[03:16:20 CET] <Neoflash> Here is a pastebin of whatever was logged: https://pastebin.com/RFasnY33
[03:37:39 CET] <meiamsome> Lines 101 - 103 seem to imply it's a raw h264 stream, as far as I know how to read that
[03:40:15 CET] <meiamsome> If it is just h264, you could perhaps stream this directly to web browsers with WebRTC
[03:46:14 CET] <meiamsome> otherwise, you could also generate HLS or MPEG DASH fairly easily which are formats for video streams over HTTP
[03:47:37 CET] <meiamsome> WebRTC is better but harder imo, hls could be as simple as `ffmpeg -i udp://192.168.10.1:11111 output/master.m3u8` and then serving the output/ folder with a webserver and clients could use hls.js to play it out
[03:54:18 CET] <meiamsome> So, as you say you are using node, you can probably run `npx http-server --cors` to serve up that folder, and then go here: https://video-dev.github.io/hls.js/stable/demo/ you could enter http://localhost:8080/master.m3u8 and theoretically it'd play, but ymmv
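Fleshing that out a little (a sketch; the one-second segments are an assumption that trades robustness for latency):

  ffmpeg -i udp://192.168.10.1:11111 -c copy -f hls \
    -hls_time 1 -hls_list_size 5 -hls_flags delete_segments \
    output/master.m3u8

-c copy avoids re-encoding the h264, and delete_segments keeps the output/ folder from growing without bound.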
[10:42:23 CET] <rocktop> is there any static, full ffmpeg build available without compiling?
[10:46:41 CET] <poutine> http://lmgtfy.com/?s=k&q=ffmpeg+static+builds
[10:47:43 CET] <rocktop> poutine: yes I know that one, but I am looking for a full static version
[10:58:28 CET] <rocktop> I have this issue Unrecognized option 'preset'
[10:58:39 CET] <rocktop> any idea ?
[11:01:08 CET] <Mavrik> We don't have a crystal ball. Provide more info.
[11:07:17 CET] <rocktop> Mavrik: https://bpaste.net/show/fb600f0c5bc7
[11:07:39 CET] <Mavrik> Your ffmpeg version is beyond ancient.
[11:08:01 CET] <Mavrik> We're at version 4.1 right now and you're at 0.6
[11:09:35 CET] <rocktop> oh yes sorry
[11:09:54 CET] <rocktop> its ok now
[11:09:59 CET] <rocktop> Thank you very much
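For anyone hitting the same "Unrecognized option" error: check the build first, since an option that postdates your ffmpeg produces exactly this message (releases as old as 0.6 used -vpre preset files rather than -preset, if memory serves):

  ffmpeg -version
  ffmpeg -h encoder=libx264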
[12:42:49 CET] <oc1et> Hi all
[12:43:25 CET] <oc1et> is it possible to provide the image of a waveform (not a spectrogram) and get the audio/wave file back from it?
[12:48:29 CET] <durandal_1707> oc1et: that is not possible, unless you have infinitely large images, CPU and RAM
[12:52:02 CET] <oc1et> thanks durandal_1707
[12:59:20 CET] <durandal_1707> or one could use vectors to describe the waveform... with a bitmap it's not possible in general
[13:03:26 CET] <bruce-> your images will be finite-sized, but it does seem a bit impractical to have (48000*seconds) x 65536 pixels for 48kHz/16-bit sound
[13:04:39 CET] <bruce-> (mono)
[14:59:00 CET] <Neoflash> Hey guys, so I have (what seems to be) a raw h264 stream coming in from a networked camera over plain UDP (see https://pastebin.com/RFasnY33). I am planning to use WebRTC Data Channels to stream the live video to web browser clients, where I will use the MediaSource and SourceBuffer APIs to feed the video to a video HTML element. I'm doing all of this from a Node.js server. From what I can understand, I can't just pipe the raw H.264 to the media source; I need to containerize it first, and this is where ffmpeg comes in. I'm looking for guidance on the best way to use ffmpeg to transform the raw feed and pipe it to the web browser in a way that allows for extremely low latency (I need the client to see what the camera sees in real time). I don't want the browser to have to wait and buffer; it needs to play the stream right away. Are there things I need to pay special attention to?
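One commonly suggested route for the MediaSource path (a sketch, untested against this camera; the address comes from the pastebin above, and it assumes the stream really is decodable as raw h264): remux into fragmented MP4, which is what SourceBuffer expects, and pipe it to stdout for the Node.js process to relay:

  ffmpeg -i udp://192.168.10.1:11111 -c copy -f mp4 \
    -movflags frag_keyframe+empty_moov+default_base_moof pipe:1

-c copy skips re-encoding, usually the single biggest latency saving; what remains comes mostly from fragment duration and client-side buffering.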
[16:50:23 CET] <kepstin> nah, an image of a waveform should be enough - as long as you have at least one pixel per sample horizontally, and 64K pixels vertically (for 16-bit audio) :)
[16:50:39 CET] <kepstin> it will be huge of course, but not infinite.
[16:50:55 CET] <kepstin> er, as bruce- said :/
[19:17:48 CET] <Neoflash> Anyone familiar with the Media Source Extensions Web API ?
[00:00:00 CET] --- Tue Jan 8 2019