[Ffmpeg-devel-irc] ffmpeg.log.20180302
burek
burek021 at gmail.com
Sat Mar 3 03:05:01 EET 2018
[00:36:16 CET] <Guest59203> hi all, a question regarding sending a stream with ffmpeg to ffserver. I've configured the ffserver feed and stream correctly and it is started. now I want to send a stream locally with ffmpeg to that ffserver, but ffmpeg keeps telling me it has no suitable output format: "Unable to find a suitable output format for". what's up here? Any hint what I missed? it's a package installed from http://ppa.launchpad.net/jonathonf/ffmpeg-3/ubuntu and it's ffmpeg version 3.3.4-2
[00:38:07 CET] <JEEB> pretty much no-one knows ffserver here
[00:38:32 CET] <JEEB> it was used by a very limited amount of people, not maintained and used internal APIs and had other issues so it was removed
[00:42:16 CET] <Guest59203> ah, ok, sounds crappy though
[00:42:36 CET] <JEEB> for most use cases there are alternatives
[00:42:51 CET] <JEEB> ffmpeg.c can be utilized as a feeder for a wide range of media server ingests
[00:43:03 CET] <Guest59203> any suggestion on converting and serving an rtsp stream?
[00:43:04 CET] <JEEB> and for stuff like UDP multicast you don't even need a media server
[00:43:26 CET] <Guest59203> I plan to provide a mjpeg
[00:43:26 CET] <JEEB> so what do you want your /output/ to be?
[00:44:00 CET] <Guest59203> rtsp to mjpeg conversion
[00:44:54 CET] <JEEB> ok, that is rather unusual
[00:45:05 CET] <JEEB> as in, trying to serve mjpeg over PROTOCOL
[00:45:12 CET] <JEEB> whatever the protocol is
[00:45:17 CET] <Guest59203> http
[00:45:31 CET] <shtomik> Hi ;) Guys, how do I use avdevice_list_devices(ifmt_ctx_tmp, &list); ? How do I initialize the context and the list to call this function? Do I need avformat_open_input?
[00:46:00 CET] <JEEB> Guest59203: is this for some weird plastic box?
[00:46:06 CET] <JEEB> or what is the use case?
[00:47:15 CET] <Guest59203> @JEEB: yes it is. the rtsp implementation in motion seems to be buggy, and older ffmpeg (<3) seems to have rtsp problems too (didn't react to the option answer in the protocol)
[00:47:57 CET] <JEEB> I do not fully understand what parts of my questions you're replying to
[00:48:43 CET] <Guest59203> so I'm trying to create a gateway between the rtsp stream coming from the IP camera and a readable network stream for the motion server
[00:49:30 CET] <Guest59203> @JEEB still wasn't ready in typing
[00:53:15 CET] <Guest59203> any idea on how to feed live555MediaServer from ffmpeg ?
[00:54:57 CET] <JEEB> 1) Motion seems to already support rtsp https://rawgit.com/Motion-Project/motion/master/motion_config.html#netcam_url
[00:55:20 CET] <JEEB> 2) live555 doesn't say anything about ingest which isn't just reading files http://www.live555.com/mediaServer/
[00:55:29 CET] <Guest59203> indeed but it is a bad implementation
[00:56:10 CET] <JEEB> it also seems to support rtmp, which you can serve with nginx-rtmp
[00:56:29 CET] <JEEB> ffmpeg.c -> nginx-rtmp <- whatever
[00:56:42 CET] <Guest59203> and this implementation is not working with that rtsp stream. It looks like motion has the same problems as older ffmpeg versions
[00:58:30 CET] <JEEB> also if your plan is to just watch the rtsp stream over HTTP or something, nginx-rtmp or even normal nginx will let you push stuff onto it (nginx-rtmp will generate HLS/DASH from the RTMP ingest, and if you just use the HLS output you can have it just HTTP POST the output files onto a web server)
[01:01:11 CET] <Guest59203> ok, thanks, I'll check that
[01:02:13 CET] <Guest59203> JEEB, many thanks for your help. Wish you a nice evening. (my netflix soap finishes in 3 minutes, so I'm off to bed now)
[01:04:33 CET] <Guest59203> ahh, looks like VLC has implemented all you need ...
[01:07:31 CET] <furq> on which note
[01:07:33 CET] <furq> https://github.com/arut/nginx-ts-module
[01:07:37 CET] <furq> did anyone see this yet
[01:08:40 CET] <furq> it seemed way more interesting until i read the open issues and it says it only remuxes, it doesn't serve the original ts stream
[01:09:15 CET] <furq> so i guess the only use for it is if you want to do live webm over dash, which seems unlikely
[01:58:25 CET] <shtomik> "Function not implemented" - What does it mean? I'm using the libavdevice API to get info about devices, and av_log(NULL, AV_LOG_ERROR, "%s\n", av_err2str(ret)); prints this error
[01:59:52 CET] <shtomik> Guys, does somebody know why I get this error?
[02:00:04 CET] <schnozzle> I can only guess
[02:00:17 CET] <shtomik> Please ...
[02:01:37 CET] <schnozzle> it seems the function u r using "to get info about devices" is not implemented yet
[02:02:38 CET] <shtomik> I looked at the files that I compiled (avdevice.h and avdevice.c) and those functions are implemented there o_O
[02:03:13 CET] <schnozzle> do they fail ?
[02:03:24 CET] <shtomik> +
[02:03:46 CET] <shtomik> ret = avdevice_list_input_sources(ifmt_tmp, NULL, NULL, &list);
[02:04:20 CET] <schnozzle> odd, prob a bug?
[02:04:48 CET] <schnozzle> its implemented but fails and says it is not...
[02:05:05 CET] <shtomik> maybe, I don't know...
[02:05:19 CET] <shtomik> it always returns code "-78"
[02:06:36 CET] <schnozzle> "Returns available device names and their parameters. These are convenient wrappers for avdevice_list_devices(). Device context is allocated and deallocated internally."
[02:06:46 CET] <shtomik> yea
[02:07:05 CET] <schnozzle> negative = error, any positive = count
[02:07:14 CET] <shtomik> yea
[02:07:16 CET] <schnozzle> then u feed "ret" to av_log
[02:07:26 CET] <schnozzle> so ret is negative?
[02:07:41 CET] <shtomik> + always "-78"
[02:12:26 CET] <shtomik> what am I doing wrong?
[02:14:10 CET] <schnozzle> thats odd
[02:14:48 CET] <schnozzle> avdevice.c line 204, according to docu this method returns an int*
[02:14:59 CET] <schnozzle> but its stored in an int
[15:04:12 CET] <shtomik> Guys, who can help me with libavdevice? Whenever I call int avdevice_list_input_sources(struct AVInputFormat *device, const char *device_name, AVDictionary *device_options, AVDeviceInfoList **device_list); I get a -78 return code. Is it a bug?
[15:05:02 CET] <BtbN> yes. In your code.
[15:05:33 CET] <shtomik> Thanks for your reply. Why do you think so?
[15:06:21 CET] <BtbN> Because the library works fine in ffmpeg.c, and it gives you a sensible error code. So look up what it means, and fix your issue
[15:07:04 CET] <shtomik> My code: AVInputFormat *ifm = av_find_input_format("avfoundation"); AVDeviceInfoList *list; ret = avdevice_list_input_sources(ifmt, NULL, NULL, &list);
[15:07:25 CET] <shtomik> It means "Function not implemented"
[15:07:39 CET] <shtomik> av_log(NULL, AV_LOG_ERROR, "%s\n", av_err2str(ret));
[15:07:46 CET] <BtbN> did find_input_format return something sensible?
[15:08:20 CET] <shtomik> AVInputFormat *av_find_input_format(const char *short_name);
[15:09:52 CET] <jkqxz> The error code is accurate. The device list feature isn't implemented for avfoundation.
[15:11:30 CET] <shtomik> Okay, thanks! Can I get the device list for avfoundation?
[15:13:20 CET] <shtomik> Maybe another way? Except options(list_devices, true)
[15:15:51 CET] <jkqxz> That will list them to the log.
[15:17:59 CET] <shtomik> Thanks, I know that, but I need it to scan devices, not to print them to stdout
[15:18:53 CET] <shtomik> is there any other way?
[15:19:37 CET] <jkqxz> Implement it yourself? The code making that output for the log is in libavdevice/avfoundation.m.
[15:20:37 CET] <shtomik> So sorry for this stupid question, thanks!!! okay, ill do it ;)
[15:22:59 CET] <BtbN> I guess it just assumes you know the device name?
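For context on the -78 above: libavutil's AVERROR(e) is simply -(e) on POSIX platforms, and on Darwin/macOS ENOSYS is errno 78, so -78 is AVERROR(ENOSYS), "Function not implemented". A minimal self-contained sketch (AVERROR_SKETCH and is_not_implemented are hypothetical stand-ins so no ffmpeg headers are required):

```c
#include <errno.h>

/* libavutil's AVERROR(e) is just -(e) on POSIX platforms; reproduced here
 * as a hypothetical AVERROR_SKETCH so this needs no ffmpeg headers. */
#define AVERROR_SKETCH(e) (-(e))

/* On Darwin/macOS, ENOSYS is errno 78, so a return of -78 from
 * avdevice_list_input_sources() is AVERROR(ENOSYS): the device-listing
 * callback simply isn't implemented for that input device. */
static int is_not_implemented(int averror_code)
{
    return averror_code == AVERROR_SKETCH(ENOSYS);
}
```

The same decoding works for any negative return from the ffmpeg libraries: negate it and look it up as an errno value for the platform the code ran on.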
[15:24:28 CET] <King_DuckZ> guys, the sample code I'm following is seriously bad, aside from the confusing things I asked about yesterday there's the deprecated function warnings that I'm getting today
[15:24:46 CET] <shtomik> libavdevice for avfoundation works with indexes, but the indexes in the OS and in libavdevice are different
[15:26:37 CET] <BtbN> King_DuckZ, is the sample code part of ffmpeg?
[15:26:45 CET] <BtbN> if so, you can report that as a bug
[15:29:54 CET] <King_DuckZ> BtbN: yes, it's the muxing.c example
[15:31:13 CET] <King_DuckZ> it's confusing, and after following it, refactoring, and having something that compiles, I still don't feel like I get what's going on - i.e. it doesn't explain things
[15:32:48 CET] <BtbN> The best source to find out what stuff does is the ffmpeg source code
[15:32:50 CET] <King_DuckZ> this one instead seems to be much much more explicit dranger.com/ffmpeg/tutorial01.html but it doesn't cover writing, which is the part I need
[15:33:22 CET] <BtbN> ffmpeg.c isn't necessarily the best example though. There are a lot of bad practices in there that are hard to fix
[15:39:57 CET] <King_DuckZ> 'go read ffmpeg.c' doesn't sound like a tutorial to me, especially after someone else told me 'go read muxing.c' and after doing it I'm still at the starting point
[15:43:14 CET] <shtomik> @King_DuckZ hi, whats the problem?
[15:44:16 CET] <King_DuckZ> 18:41 < King_Duck> I'm looking at the muxing.c example and at the line with add_stream(&video_st, oc, &video_codec, fmt->video_codec); I suppose video_st and video_codec are some sort of return value, right?
[15:44:24 CET] <King_DuckZ> 18:42 < King_Duck> if so, is there a reason why video_codec is not a member of OutputStream?
[15:44:34 CET] <King_DuckZ> 19:07 < King_Duck> I don't understand that code, why open_video gets an AVFormatContext if it doesn't need one?
[15:44:48 CET] <King_DuckZ> 19:25 < King_Duck> why close_stream() also takes the AVFormatContext and doesn't use it? and why close_stream() closes everything but the stream itself? doesn't st need to be cleaned up too?
[15:45:03 CET] <King_DuckZ> shtomik: ^^
[15:46:06 CET] <King_DuckZ> and from today: "avcodec_encode_video2 is deprecated [-Wdeprecated-declarations]"
[15:48:30 CET] <shtomik> @King_DuckZ yea, a new API was released: send/receive frame/packet (see example: https://github.com/FFmpeg/FFmpeg/blob/master/doc/examples/filtering_audio.c)
[15:50:25 CET] <shtomik> King_DuckZ: concerning the rest, you need more time to learn the libav libs
[15:52:22 CET] <shtomik> King_DuckZ: Read header files for information about API or functions
[15:52:35 CET] <King_DuckZ> shtomik: so should I just sit and wait until I somehow learn it or is there some resource that I could read while I wait? to get some idea of what the concepts are, and bonus points if it's not obsolete yet
[15:55:48 CET] <shtomik> King_DuckZ: I'm certainly not an adviser in these matters. I have 2 outstanding questions and unfortunately no one gives me an answer. So we need to look at the github repo and the ffmpeg site + googling ;)
[16:42:15 CET] <King_DuckZ> encode_video example is *completely* different, why is there a FILE* thing floating around now? why is there no call to av[util]_register_all()?
[16:46:03 CET] <King_DuckZ> where's avformat_write_header() now? I thought that thing was necessary? and why is muxing missing the av_packet_free call? is it leaking resources? or is video_encoding doing it unnecessarily?
[17:11:00 CET] <devinheitmueller> Hello all. This might seem like a silly question, but does ffmpeg have a generic construct for storing a FIFO of AVPackets? The Decklink libavdevice has its own implementation, which seems like something you would expect to find in common code. There's av_fifo, but that only really works with bytes or flat structures that don't need deallocation (i.e. if the circular buffer wraps around).
[17:12:54 CET] <BtbN> you make sure it never wraps around too early
[17:13:25 CET] <devinheitmueller> BtbN: that kind of defeats the purpose of a circular buffer though, no?
[17:13:36 CET] <BtbN> not really, no?
[17:13:37 CET] <devinheitmueller> Or was that just sarcasm?
[17:14:16 CET] <BtbN> you make sure you allocate it big enough so it never bites its tail
[17:14:40 CET] <devinheitmueller> So what you're describing is a list, not a circular buffer. :-)
[17:15:24 CET] <BtbN> except that it's way faster
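BtbN's sizing argument can be made concrete: store AVPacket pointers rather than flat bytes, and refuse pushes when the ring is full, so nothing is ever memcpy'd or overwritten on wrap. A self-contained illustration; PtrFifo is a hypothetical helper, not an ffmpeg API:

```c
#include <stdlib.h>

/* A fixed-capacity ring of pointers (e.g. AVPacket*). Pointers are stored,
 * never copied, so wrap-around needs no re-allocation of the payloads. */
typedef struct PtrFifo {
    void **slots;
    size_t cap, head, len;
} PtrFifo;

static int ptr_fifo_init(PtrFifo *f, size_t cap)
{
    f->slots = malloc(cap * sizeof(*f->slots));
    if (!f->slots)
        return -1;
    f->cap = cap;
    f->head = f->len = 0;
    return 0;
}

static int ptr_fifo_push(PtrFifo *f, void *p)
{
    if (f->len == f->cap)
        return -1;                 /* full: caller decides to drop or grow */
    f->slots[(f->head + f->len) % f->cap] = p;
    f->len++;
    return 0;
}

static void *ptr_fifo_pop(PtrFifo *f)
{
    if (f->len == 0)
        return NULL;
    void *p = f->slots[f->head];
    f->head = (f->head + 1) % f->cap;
    f->len--;
    return p;
}

static void ptr_fifo_free(PtrFifo *f)
{
    free(f->slots);
    f->slots = NULL;
}
```

Rejecting the push when full, instead of overwriting, is exactly what keeps the buffer from "biting its tail"; the caller can then drop the packet, block, or resize.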
[17:26:31 CET] <stk944> Hello, I'm trying to run ffprobe on an sdp file generated by ffmpeg. here is the sdp file: https://pastebin.com/uzeEUW1F, and this is my ffprobe version: "ffprobe version git-2017-01-22-f1214ad". I don't get any output; when I add -print_format json -show_streams I get { }
[17:26:48 CET] <stk944> Wondering if this is supported?
[19:35:09 CET] <MarkedOne> Hello good ppl. I need highest possible compression for pngs.
[19:35:15 CET] <MarkedOne> How to do that?
[19:44:47 CET] <ChocolateArmpits> MarkedOne, consider using optipng
[19:45:08 CET] <ChocolateArmpits> wouldn't look at ffmpeg for highest compression
[19:51:40 CET] <MarkedOne> Thank you :)
[19:56:13 CET] <ChocolateArmpits> there's also pngcrush if you want another option at png compression
[19:57:05 CET] <ChocolateArmpits> though optipng works great for me as it is; there are multiple compression levels, depending on how much time you can spend
[20:00:09 CET] <Crunch> anyone have any idea how to install ffmpeg-git alongside regular packaged ffmpeg?
[20:04:54 CET] <relaxed> Crunch: https://www.johnvansickle.com/ffmpeg/ look at the faq
[20:06:37 CET] <Crunch> thanks
[23:04:13 CET] <RavenWorks> Is there a way to get ffmpeg to give a list of all the containers that a given codec is supported in?
[23:04:24 CET] <RavenWorks> (Or a table in the docs somewhere that would say the same thing?)
[23:04:41 CET] <RavenWorks> the closest I've found is this table https://trac.ffmpeg.org/wiki/Encode/HighQualityAudio#Containerformats but it's audio only, and it even says it isn't complete
[23:12:49 CET] <BtbN> I guess you'll need to iterate all containers and query their supported formats
[23:13:09 CET] <RavenWorks> that's a start; how do you query the supported formats?
[23:14:27 CET] <DHE> there isn't really a direct way to do that. you can try starting it with a codec and see if it pukes. but even then some will fake it (eg: mpegts)
[23:14:36 CET] <RavenWorks> was afraid of that
[23:14:43 CET] <DHE> each container has a "preferred" codec that's registered with it...
[23:14:56 CET] <BtbN> But there is
[23:14:56 CET] <RavenWorks> where can I look those up?
[23:15:10 CET] <BtbN> containers have the supported codecs in the codec_tag field of their AVOutputFormat
[23:15:55 CET] <BtbN> http://git.videolan.org/?p=ffmpeg.git;a=blob;f=libavformat/avformat.h;h=a2fe7c6bb2cafe8f9c16ab541d3d70079f26ba2a;hb=HEAD#l519
[23:16:49 CET] <DHE> interesting... that will require API access though
[23:20:52 CET] <RavenWorks> yeah I don't know how to make sense of that unfortunately
[23:21:29 CET] <RavenWorks> so basically, if I have a video stream and an audio stream, and I want them to play together in a single file (using -c copy), I just have to try all the container formats I can think of until it stops giving me an error?
[23:22:01 CET] <DHE> for "common" codecs mkv will be highly reliable...
[23:26:24 CET] <RavenWorks> hmm, ok trying mkv is giving me other errors about timestamps not being set, so maybe that's an unrelated issue at that point
[23:26:28 CET] <RavenWorks> I'll try that for a bit, thanks
[00:00:00 CET] --- Sat Mar 3 2018