[Ffmpeg-devel-irc] ffmpeg.log.20151119

burek burek021 at gmail.com
Fri Nov 20 02:05:01 CET 2015


[00:09:17 CET] <Flerb> So, I have a particular IP camera with an MJPEG stream, accessible from a cgi script on the camera. When I put the URL for this into VLC, with the username and password at the beginning, it works fine. With ffmpeg, it says it's returned a 401 unauthorised
[00:11:04 CET] <JEEB> tried putting the username and password into the url :P
[00:11:06 CET] <JEEB> ?
[00:13:37 CET] <Flerb> JEEB: that's what I'm doing
[00:13:56 CET] <Flerb> as in the exact url copied verbatim into vlc works
[00:14:27 CET] <JEEB> ok, I misread your first line :)
[00:15:02 CET] <Flerb> I guess I could wireshark it
[00:15:40 CET] <Flerb> think i'm unlikely to get any useful debug info
[00:15:50 CET] <JEEB> seems like it supports the username:password thing at least for rtsp, so I would be surprised if it wouldn't handle HTTP
[00:17:13 CET] <Flerb> i wonder if there are any publicly available cams
[00:17:24 CET] <Flerb> that use http auth
[00:18:04 CET] <JEEB> -auth_type basic seems to be in libavformat/http.c
[00:18:21 CET] <JEEB> but not sure if it's for GET or when it's listening
[00:18:42 CET] Action: JEEB goes play the git blame game
[00:19:21 CET] <Flerb> wonder if I could pass it as a parameter
[00:20:08 CET] <JEEB> ok, seems like the "auth_type" option came before listening in libavformat's http thing
[00:20:21 CET] <JEEB> so feel free to try "-auth_type basic" before -i
[00:22:29 CET] <JEEB> https://github.com/FFmpeg/FFmpeg/commit/eb8b05a3824a9fa85e20d603595ac8a3b83505d4 seems like it is supposed to autodetect the auth type but *shrug*
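For reference, the shape of the command JEEB is suggesting looks roughly like the following; the camera address, credentials and output name are placeholders:

    ffmpeg -auth_type basic -i "http://user:password@192.168.1.10/video.cgi" -c copy out.mkv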
[00:23:19 CET] <Flerb> well i've set it now
[00:23:21 CET] <Flerb> and that works
[00:23:29 CET] <Flerb> so now I have more problems
[00:23:47 CET] <Flerb> with ffserver that is
[00:24:32 CET] <Flerb> "Connection to tcp://127.0.0.1:8090 failed: Connection refused"
[00:24:33 CET] <Flerb> what
[00:25:28 CET] <Flerb> go home ffmpeg
[00:25:30 CET] <Flerb> you're drunk
[00:26:43 CET] <JEEB> you can try pointing ffmpeg to another http server you might have in your possession to see what kind of request it's sending
[00:57:31 CET] <marwagamal> hi117, is anyone available ?
[00:58:07 CET] <marwagamal> i need help to cross compile ffmpeg
[00:58:28 CET] <marwagamal> whenever i enable opus it returns ERROR: opus not found using pkg-config
[01:00:36 CET] <c_14> Did you install libopus so that the cross-compiler can find it?
[01:00:39 CET] <c_14> pastebin your config.log
[01:00:59 CET] <c_14> That is, upload it to a pastebin service.
[01:03:00 CET] <marwagamal> http://pastebin.com/0EJ87ZXg
[01:06:23 CET] <c_14> set your PKG_CONFIG_PATH to include the directory containing the opus.pc file
[01:07:15 CET] <marwagamal> where can i find opus.pc file ?
[01:07:44 CET] <c_14> OPUS_PREFIX/lib/pkgconfig/ usually
[01:07:52 CET] <c_14> OPUS_PREFIX being the prefix you installed opus into
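A minimal sketch of what c_14 is describing, assuming opus was installed into /path/to/opus-prefix (adjust to the real prefix):

    export PKG_CONFIG_PATH=/path/to/opus-prefix/lib/pkgconfig:$PKG_CONFIG_PATH
    pkg-config --exists --print-errors opus   # prints nothing if opus.pc is found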
[01:08:49 CET] <marwagamal> i installed libopus using the ubuntu package manager
[01:09:07 CET] <marwagamal>  pkg-config --cflags opus >> -I/usr/include/opus
[01:09:41 CET] <c_14> probably /usr/include/pkgconfig or /usr/include/opus/pkgconfig
[01:11:13 CET] <marwagamal> none of the paths exist
[01:11:28 CET] <c_14> try `locate opus.pc'
[01:11:39 CET] <marwagamal> /usr/include/opus  includes opus.h opus_types.h
[01:11:54 CET] <marwagamal> not found
[01:20:25 CET] <c_14> Then you're either going to have to write it yourself, or compile libopus yourself so you get it
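If you do end up writing the .pc file by hand, a minimal opus.pc looks roughly like this; the prefix and Version are assumptions and must match the actual install:

    prefix=/usr
    exec_prefix=${prefix}
    libdir=${exec_prefix}/lib
    includedir=${prefix}/include

    Name: opus
    Description: Opus IETF audio codec
    Version: 1.1
    Libs: -L${libdir} -lopus
    Cflags: -I${includedir}/opus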
[01:43:41 CET] <marwagamal> c_14,  the same error after compilation http://pastebin.com/UTdS39w9
[01:44:33 CET] <c_14> set PKG_CONFIG_PATH to include /home/marwagamal/build-ffmpeg/lib/pkgconfig
[01:44:45 CET] <marwagamal> i did
[01:45:18 CET] <c_14> hmm
[01:45:41 CET] <c_14> Your --extra-cflags are off
[01:46:16 CET] <c_14> Get rid of them, the second one looks empty and the first one isn't valid
[01:47:02 CET] <c_14> Also, what does PKG_CONFIG_PATH='/home/marwagamal/build-ffmpeg/lib/pkgconfig' pkg-config --exists --print-errors opus return?
[01:49:30 CET] <marwagamal> not found
[01:50:20 CET] <marwagamal> doesn't print anything
[01:51:22 CET] <c_14> Oh
[01:51:23 CET] <c_14> wait
[01:51:25 CET] <c_14> I see the problem
[01:52:39 CET] <c_14> Do you have a mipsel-linux-android-pkg-config binary?
[01:53:21 CET] <marwagamal> how to check ?
[01:53:34 CET] <c_14> Just try executing that as a command
[01:53:45 CET] <marwagamal>  command not found
[01:53:58 CET] <c_14> it might be in your sysroot
[01:54:09 CET] <c_14> But anyway, that's your problem.
[01:54:29 CET] <c_14> configure can't find the pkg-config binary for your cross-compiler
[01:54:50 CET] <c_14> You can try overriding it with --pkg-config=
[01:55:13 CET] <marwagamal> but if i remove opus it succeeds
[01:55:43 CET] <c_14> Because opus is the only package you have enabled that uses pkg-config detection
[01:55:51 CET] <marwagamal> i'm using ubuntu 15.10
[01:56:00 CET] <marwagamal> aha i see
[01:56:07 CET] <c_14> If you know what cflags/ldflags you need to build with pkg-config support, you can do --pkg-config=true and pass them in --extra-cflags and --extra-ldflags
[01:56:33 CET] <c_14> s/pkg-config support/opus support/
[01:57:33 CET] <marwagamal> how do i know those flags ?
[01:58:53 CET] <c_14> They're listed in the .pc file. But it's usually just -I${includedir}/opus for cflags and -L${libdir} -lopus for ldflags replacing includedir and libdir with the paths to the include/lib directory opus was installed into
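A sketch of the workaround c_14 describes, assuming opus was installed into /home/marwagamal/build-ffmpeg; the cross-compile options shown are illustrative and must match the actual toolchain:

    ./configure --enable-cross-compile --cross-prefix=mipsel-linux-android- \
        --target-os=android --arch=mips --enable-libopus --pkg-config=true \
        --extra-cflags="-I/home/marwagamal/build-ffmpeg/include/opus" \
        --extra-ldflags="-L/home/marwagamal/build-ffmpeg/lib -lopus"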
[03:24:26 CET] -:#ffmpeg- [freenode-info] channel flooding and no channel staff around to help? Please check with freenode support: http://freenode.net/faq.shtml#gettinghelp
[10:42:21 CET] <uglyandstupid> hello
[10:43:11 CET] <uglyandstupid> i'm looking for a sample C/C++ example showing how to extract audio/video from mpegts, if any of you have an idea, thank you in advance
[10:44:00 CET] <uglyandstupid> sorry disconnected
[10:44:34 CET] <uglyandstupid> i was saying that i'm looking for Code samples (C/C++) showing how to extract audio and video (into separate files for example) of an mpegts file.
[10:44:42 CET] <uglyandstupid> if any have an idea, thanks
[11:03:43 CET] <saste> uglyandstupid, doc/examples/demuxing_decoding.c
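Besides that example, a stripped-down sketch of just the demuxing part (no decoding) could look like the following; error handling is abbreviated, "input.ts" and the output names are placeholders, and dumping raw packet payloads is only directly usable for elementary streams that need no extra headers (e.g. H.264/MPEG-2 video and ADTS AAC from MPEG-TS):

    #include <stdio.h>
    #include <libavformat/avformat.h>

    int main(void)
    {
        AVFormatContext *fmt = NULL;
        AVPacket pkt;
        int vidx, aidx;
        FILE *vf, *af;

        av_register_all();  /* no longer required in current FFmpeg */
        if (avformat_open_input(&fmt, "input.ts", NULL, NULL) < 0 ||
            avformat_find_stream_info(fmt, NULL) < 0)
            return 1;

        /* pick the "best" video and audio streams */
        vidx = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
        aidx = av_find_best_stream(fmt, AVMEDIA_TYPE_AUDIO, -1, -1, NULL, 0);
        vf = fopen("video.raw", "wb");
        af = fopen("audio.raw", "wb");

        av_init_packet(&pkt);
        while (av_read_frame(fmt, &pkt) >= 0) {
            if (pkt.stream_index == vidx && vf)
                fwrite(pkt.data, 1, pkt.size, vf);
            else if (pkt.stream_index == aidx && af)
                fwrite(pkt.data, 1, pkt.size, af);
            av_free_packet(&pkt);   /* av_packet_unref() in newer FFmpeg */
        }

        if (vf) fclose(vf);
        if (af) fclose(af);
        avformat_close_input(&fmt);
        return 0;
    }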
[11:43:20 CET] <termos> i'm getting a too low AVPacket.duration for audio of FLTP format, but for S16 format the .duration is correct. Is there something I'm missing here?
[11:43:35 CET] <termos> I'm scaling the duration by the time_base
[11:44:45 CET] <fritsch> you count the bytes wrong?
[11:44:50 CET] <fritsch> and are factor 2 off? :-)
[11:47:49 CET] <termos> hmm that could be! because FLTP is 32bit and not 16 right?
[12:21:47 CET] <AndrewMock> Does TrueHD have embedded downmix coefficients?
[12:22:11 CET] <AndrewMock> I know that Master Audio does. (Assuming of course they use the feature).
[13:20:55 CET] <termos> fritsch: after some testing it seems to be off by a factor of 20... very strange
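For what it's worth, a packet's duration should be derived from the sample count and sample rate rather than from the byte count, so the sample format (2-byte S16 vs 4-byte planar FLTP) doesn't enter into it. A minimal sketch, with illustrative names:

    #include <stdint.h>
    #include <libavutil/mathematics.h>
    #include <libavutil/rational.h>

    /* duration = nb_samples / sample_rate, expressed in time_base units */
    static int64_t audio_duration_in_tb(int nb_samples, int sample_rate,
                                        AVRational time_base)
    {
        return av_rescale_q(nb_samples, (AVRational){ 1, sample_rate }, time_base);
    }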
[13:58:38 CET] <grumper> hello
[13:59:11 CET] <grumper> is there a filter equivalent to -adelay for video (using a predefined (massive) max buffer size) ?
[13:59:23 CET] <grumper> I know all about -itsoffset and -ss
[13:59:40 CET] <grumper> the reason I am looking for a filter is so that I can dynamically adjust a desync "live"
[13:59:46 CET] <grumper> via the zmq plugin
[14:19:08 CET] <termos> grumper: i could not find anything like that so I implemented my own "delay queue" that queues up AVPackets until I get my desired delay
[14:20:30 CET] <grumper> termos: you mean externally to ffmpeg?
[14:21:11 CET] <termos> ah I use the C library in my program so I did the programming for it
[14:21:18 CET] <termos> around the C library
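A rough sketch of the kind of delay queue termos describes: buffer demuxed AVPackets in a ring and only release the oldest one once the buffered span reaches the wanted delay. All names and the fixed queue size are assumptions, and pts is assumed to be monotonic within the stream:

    #include <libavcodec/avcodec.h>

    #define MAX_QUEUED 4096                /* assumed upper bound */

    typedef struct DelayQueue {
        AVPacket *pkts[MAX_QUEUED];        /* ring buffer of packet pointers */
        int head, count;
        int64_t delay;                     /* wanted delay, in stream time_base units */
    } DelayQueue;

    /* Store a demuxed packet; ownership passes to the queue. */
    static int delay_queue_put(DelayQueue *q, AVPacket *pkt)
    {
        if (q->count >= MAX_QUEUED)
            return -1;                     /* full: caller must drain first */
        q->pkts[(q->head + q->count++) % MAX_QUEUED] = pkt;
        return 0;
    }

    /* Return the oldest packet once the buffered span covers the delay, else NULL. */
    static AVPacket *delay_queue_get(DelayQueue *q)
    {
        AVPacket *oldest, *newest;
        if (q->count == 0)
            return NULL;
        oldest = q->pkts[q->head];
        newest = q->pkts[(q->head + q->count - 1) % MAX_QUEUED];
        if (newest->pts - oldest->pts < q->delay)
            return NULL;                   /* not enough buffered yet */
        q->head = (q->head + 1) % MAX_QUEUED;
        q->count--;
        return oldest;
    }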
[14:22:41 CET] <exebook> does libav have the librtmp source embedded or does it rely on librtmp installed in the system?
[15:18:51 CET] <natureboy> Hi.
[15:19:22 CET] <natureboy> I've managed to recover MXF files from a CF card with a damaged filesystem structure.
[15:19:32 CET] <natureboy> There are a few that aren't playable.
[15:20:22 CET] <natureboy> Forgive me if this is the wrong channel to ask, are they repairable?
[15:21:56 CET] <natureboy> Since they were all taken in the same session, I know the codec and resolution. If possible, I would at least like to treat them as raw MPEG 2 encoded files and convert them.
[16:49:22 CET] <Prelude_Zzzzz> hi everyone
[16:49:28 CET] <Prelude_Zzzzz> can i please get some help ... http://pastebin.com/puZrja5S
[16:50:15 CET] <Prelude_Zzzzz> can someone tell me what i am doing wrong. Basically the idea is.. i want to grab the input and send it simultaneously to 2 other processes to get encoded .. i want to copy both the audio & video to get processed by each ffmpeg process simultaneously
[16:50:23 CET] <Prelude_Zzzzz> can anyone help or point me in the right direction ?
[16:51:24 CET] <Prelude_Zzzzz> http://pastebin.com/qZPyV39U >> update
[16:54:36 CET] <c_14> Prelude_Zzzzz: what's the problem?
[17:09:03 CET] <DHE> you know ffmpeg can output to two outputs at once. though you may need to ensure they're fast enough if you're doing realtime/live sources
[17:19:55 CET] <Prelude_Zzzzz> yes i know.. but for some reason ffmpeg while using nvenc can only do 150% max cpu
[17:20:08 CET] <Prelude_Zzzzz> so i have to split it up into different processes to take advantage of the cards :(
[17:20:24 CET] <Prelude_Zzzzz> i tried for days using same process but ffmpeg can't  handle it because the CPU isn't fast enough
[17:20:32 CET] <Prelude_Zzzzz> and for some reason threads doesn't really work with nvenc
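For completeness, the two-outputs-from-one-process shape DHE mentions looks roughly like this; bitrates and file names are placeholders, and the encoder name is "nvenc" in 2015-era builds (renamed h264_nvenc in later versions):

    ffmpeg -i input \
        -map 0 -c:v nvenc -b:v 6M -c:a copy out_high.ts \
        -map 0 -c:v nvenc -b:v 2M -c:a copy out_low.ts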
[17:32:58 CET] <dantti> does ffmpeg use x264 from vlc to encode stuff in h264?
[17:44:31 CET] <Prelude_Zzzzz> hey, anyone know a good way to deinterlace content but use little cpu power ?
[18:01:59 CET] <jasom> Prelude_Zzzzz: if you know it was a 24fps film->ntsc transfer then you can use ivtc which is very cpu efficient
[18:04:02 CET] <Prelude_Zzzzz> no its live content
[18:04:10 CET] <Prelude_Zzzzz> some commercials are coming in differently i guess
[18:04:53 CET] <jasom> Prelude_Zzzzz: well you could discard all even fields and halve your vertical resolution; that's also quite efficient, but at a loss of quality
[18:14:22 CET] <jasom> Prelude_Zzzzz: It appears that kerndeint is faster than yadif, but has more artifacts
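In filter terms the options discussed here look roughly like the following; quality drops along with the CPU cost, and the input/output names are placeholders:

    ffmpeg -i input -vf yadif out.ts       # the usual reference deinterlacer
    ffmpeg -i input -vf kerndeint out.ts   # reportedly faster, with more artifacts
    ffmpeg -i input -vf field=top out.ts   # cheapest: keep one field at half height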
[18:55:15 CET] <DHE> dantti: ffmpeg allows nvenc (nVidia hardware encoding) and x264 to be used for h264 encoding
[18:55:34 CET] <DHE> nvenc is pretty fast but quality is inferior compared to x264 and there are hardware limits
[18:56:47 CET] <dantti> DHE: right, when using x264 it uses an out of process approach? since x264 is gpl and ffmpeg is lgpl
[18:58:25 CET] <dantti> I'm trying to encode frames on android with Qt and x264 atm but not getting much luck :P then there is also the problem I dunno how easy it would be to implement mpegts to use with hls
[19:07:37 CET] <DHE> you can compile ffmpeg as GPL with --enable-gpl and libx264 is used natively
[19:08:24 CET] <DHE> also ffmpeg supports HLS natively. I'm using it myself. Though I ride the Git version because there are all sorts of fixes which I don't know are in a release version
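As a rough sketch of both points (option values are placeholders; older builds may need -strict -2 for the native AAC encoder):

    ./configure --enable-gpl --enable-libx264
    ffmpeg -i input -c:v libx264 -preset veryfast -c:a aac \
        -f hls -hls_time 4 -hls_list_size 0 out.m3u8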
[19:18:46 CET] <dantti> DHE: yup, that's why I'm wondering whether to use ffmpeg, however when I start encoding the camera's frames the app gets slow till it crashes... and I'm not even saving to disk/network :P
[19:20:12 CET] <DHE> you'd probably need to use preset=ultrafast, but I always assumed there was a hardware video encoder on the phones (even if it's crap compared to real encoders)
[19:20:29 CET] <dantti> I'm using
[19:21:49 CET] <dantti> I'm basically using https://github.com/Teaonly/android-eye/blob/master/jni/h264encoder.cpp and feeding the probed frames from QCamera
[19:21:57 CET] <dantti> tho I should lower the resolution :P
[19:22:11 CET] <dantti> I thought I did that but I guess it didn't work...
[19:23:51 CET] <dantti> then yes, since I can save an mp4 file on hi-res and it is fast I guess there is some hw encoder, but streaming from file is probably a bad approach
[20:51:46 CET] <llogan> "Option 'flgas' not found". damn this hangover, can't type today.
[20:51:53 CET] <llogan> First World Problems Man
[21:35:46 CET] <ChocolateArmpits> I have an MMSH stream that gets reset every hour where for up to a second it doesn't send any packets. Naturally this leads to playback stopping. Is there any way to tell ffmpeg to "wait" for packets ?
[22:09:34 CET] <Zeranoe> Is AAC at 128k or Opus at 139k going to be better quality?
[22:10:42 CET] <c_14> Which encoders?
[22:11:46 CET] <furq> probably opus but i doubt there's any audible difference with a good aac encoder
[22:12:32 CET] <c_14> At those bitrates, the difference is minimal. Since you're giving opus more bits that one's probably going to be better (assuming libopus and fdk_aac)
[22:20:55 CET] <Zeranoe> At the same bitrate, something like 128k, is fdk aac better than opus?
[22:24:28 CET] <JEEB> I'd say at such rates you aren't deciding your format on quality but rather on which is simpler for you to support on whatever you're planning to play that stuff with
[22:24:39 CET] <furq> http://listening-test.coresv.net/nonblocked_means_all_hd2.png
[22:24:43 CET] <furq> that's the closest comparison i can find
[22:24:48 CET] <JEEB> because both are good enough at such a rate that you just don't give a flying duff
[22:24:59 CET] <furq> afaik fdk is in the same ballpark as appleaac
[22:25:28 CET] <furq> but yeah it's unlikely to be noticeably different
[22:25:34 CET] <Zeranoe> Which is better at lower bitrates
[22:25:48 CET] <fritsch> nice graph
[22:26:10 CET] <Zeranoe> Opus provides this one, https://www.opus-codec.org/comparison/quality.png but it might be biased
[22:26:24 CET] <JEEB> well what kind of rates are we talking about :P
[22:27:16 CET] <JEEB> opus is darn good for various uses, that's the general opinion I have of it
[22:27:26 CET] <JEEB> + it's simpler to distribute as far as binaries go
[22:27:38 CET] <furq> i don't think it's really worth worrying about
[22:27:38 CET] <JEEB> so as long as your playback side is OK with Opus, WhyNot
[22:27:42 CET] <furq> i still use lame v2 for everything
[22:27:58 CET] <furq> on the rare occasion that i actually want lossier audio
[22:29:16 CET] <furq> right now opus only seems particularly useful for super low bitrates or for streaming, and it's let down on the latter front by libvpx's woeful performance
[22:30:10 CET] <Zeranoe> Changing to video, what is the closest quality to HEVC that is free/opensource (not run by MPEG LA)?
[22:30:21 CET] <JEEB> nobody says you have to use libvpx for anything, EBU has standardized Opus in MPEG-TS after all
[22:30:22 CET] <furq> vp9
[22:30:48 CET] <fritsch> did someone say motion jpeg?
[22:31:00 CET] <JEEB> ugh
[22:31:09 CET] <furq> JEEB: webm is the only compelling reason to use it over aac, though
[22:31:14 CET] <furq> unless you're really into open source codecs
[22:31:14 CET] <JEEB> the misnomers. if you want to say "supposedly royalty-free" just say so
[22:31:42 CET] <JEEB> furq: not really, there's some places that control their playback side on mobile or whatever well enough that they could just switch to it
[22:32:30 CET] <JEEB> but yes, how much control you have over the playback side decides whether or not using opus is viable
[22:32:35 CET] <JEEB> as I've noted
[22:32:57 CET] <furq> i'm sure it's viable, it just doesn't seem compelling
[22:32:58 CET] <JEEB> rather than deciding on the quality side of things, since for most use cases AAC and Opus are pretty darn similar in performance with good implementations
[23:15:16 CET] <Prelude_Zzzzz> hey, can someone help me with this error ?
[23:15:17 CET] <Prelude_Zzzzz> http://pastebin.com/haPBgEwR
[23:15:30 CET] <waressearcher2> Prelude_Zzzzz: hallo
[23:15:38 CET] <Prelude_Zzzzz> any way that i can drop frames or do whatever without the system going crazy ... is it the CPU's .. i am not even at 50% usage
[23:16:44 CET] <llogan> Prelude_Zzzzz: that info is not helpful without your actual command and the complete console output
[00:00:00 CET] --- Fri Nov 20 2015

