[Ffmpeg-devel-irc] ffmpeg.log.20171018
burek
burek021 at gmail.com
Thu Oct 19 03:05:01 EEST 2017
[00:14:07 CEST] <kepstin> squarecircle: what player, what video codec, what pixel format?
[00:14:44 CEST] <kepstin> squarecircle: if this is sampling from your final mp4 file, that's gone through an rgb-yuv conversion, lossy compression, then yuv-rgb conversion in the player, so it's really no surprise...
[00:30:39 CEST] <squarecircle> alexpigment: kepstin: http://paste.ubuntu.com/25761768/
[00:33:02 CEST] <alexpigment> squarecircle: the original question still stands, I believe
[00:33:12 CEST] <alexpigment> kepstin appears to have the same thought as me
[00:33:21 CEST] <squarecircle> alexpigment: about how I'm getting the RGB values?
[00:33:25 CEST] <alexpigment> yeah
[00:33:27 CEST] <alexpigment> the resultant ones
[00:33:33 CEST] <alexpigment> are you playing it in a player and then getting it from that?
[00:33:46 CEST] <alexpigment> if so, there are *a lot* of factors that can cause the color to be off
[00:34:01 CEST] <alexpigment> i can think of at least 3 that are very common
[00:34:06 CEST] <squarecircle> I pipe the raw decoded data in RGB888 Format into a C program
[00:34:52 CEST] <alexpigment> gotcha
[00:35:00 CEST] <alexpigment> does the C program access the graphics driver at all?
[00:35:08 CEST] <alexpigment> in order to read the decoded data?
[00:35:16 CEST] <squarecircle> no
[00:35:29 CEST] <squarecircle> what do you mean by graphics driver?
[00:35:57 CEST] <alexpigment> well, most drivers are set to use either limited or full range of colors
[00:36:04 CEST] <alexpigment> 0-255 or 16-235
[00:36:17 CEST] <squarecircle> the c program itself just reads from stdin which is piped from ffmpeg
[00:36:19 CEST] <alexpigment> most of the time when someone says the colors in a video look "off", it's because their driver needs to be changed to display videos as full range
[00:36:22 CEST] <alexpigment> k
[00:36:23 CEST] <squarecircle> no graphics driver
[00:36:31 CEST] <alexpigment> I'm not sure then. I just figured I'd ask
[00:36:36 CEST] <squarecircle> no image or graphics stuff
[00:36:44 CEST] <squarecircle> just plain mathematics
[00:36:53 CEST] <alexpigment> kepstin might have some info about the rgb-yuv conversion that could explain it
[00:37:09 CEST] <squarecircle> yeah
[00:37:32 CEST] <squarecircle> I guess there will always be a small offset from the "true" colours
[00:37:41 CEST] <squarecircle> if I want to display it as RGB
[00:38:00 CEST] <squarecircle> I could also write the C code to read yuv too
[00:38:06 CEST] <squarecircle> and save the data as yuv
[00:38:34 CEST] <kepstin> well, if you want to preserve colours exactly, then use an rgb codec rather than yuv (e.g. specify 'libx264rgb' when making the video) and consider using a lossless mode
[00:39:25 CEST] <squarecircle> and later, if I need it in RGB, I can convert it
[00:39:32 CEST] <kepstin> you'll never get exact colours out of an rgb->yuv->rgb conversion in 8bit because the ranges are just different, and if you use a lossy codec there's no reason it has to preserve exact values, it just has to look close enough.
[00:39:40 CEST] <squarecircle> kepstin: so I need accuracy for testing purposes
[00:40:23 CEST] <squarecircle> I want to create some cases where I can "prove" that my algorithm and the whole program works
[00:40:27 CEST] <kepstin> squarecircle: if you need accuracy, use "-c:v libx264rgb -qp 0" when encoding the video. It'll be way larger, but it'll preserve the values exactly.
[00:40:39 CEST] <squarecircle> kepstin: thank you very much
[00:41:10 CEST] <squarecircle> what does -qp 0 do?
[00:41:24 CEST] <kepstin> sets the libx264 encoder to lossless mode instead of lossy mode.
[00:41:33 CEST] <squarecircle> great!
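A minimal sketch of what kepstin suggests, assuming the test material starts out as numbered PNG frames and that the analysis program is called ./analyse (both names are made up here):

    # build a test clip from known RGB frames, staying in RGB the whole time
    ffmpeg -framerate 25 -i frame_%03d.png -c:v libx264rgb -qp 0 test_lossless.mp4
    # decode it back to raw RGB888 and pipe it into the analysis program
    ffmpeg -i test_lossless.mp4 -f rawvideo -pix_fmt rgb24 pipe:1 | ./analyse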
[00:42:11 CEST] <squarecircle> do you have an estimate for me of how much the conversion will be off?
[00:42:22 CEST] <squarecircle> like 5%, 10%, 15%?
[00:42:47 CEST] <kepstin> do you mean - how big of an effect the rgb to yuv to rgb conversion would have?
[00:42:52 CEST] <squarecircle> so that's "important" later, because I'd like to have a clue about the error margin
[00:42:56 CEST] <squarecircle> kepstin: no
[00:43:28 CEST] <squarecircle> kepstin: I have a lot of "raw" data, mp4, mkv, avi and so on files in dozens of awkward codecs
[00:43:35 CEST] <squarecircle> kepstin: I convert them all to RGB
[00:43:43 CEST] <squarecircle> for comparison reasons
[00:43:52 CEST] <squarecircle> but can I even do this?
[00:44:12 CEST] <kepstin> squarecircle: unless you have the original source that the video was encoded from, how could you possibly know how far it is off?
[00:44:25 CEST] <squarecircle> good point
[00:44:46 CEST] <squarecircle> so RGB is just fine :]
[00:45:10 CEST] <squarecircle> is there a "finer" colour model?
[00:45:20 CEST] <squarecircle> that has a better "resolution"?
[00:45:27 CEST] <squarecircle> than RGB888?
[00:46:09 CEST] <kepstin> the main thing to do is avoid conversion whenever possible. If you have two random videos, they might both be in yuv420p for example - in that case, you want to compare them in that colour space rather than converting to a different one.
[00:46:40 CEST] <squarecircle> kepstin: so keeping the colour space is the best idea
[00:46:55 CEST] <kepstin> as far as a higher-precision format, you could use e.g. a 16bit per component rgb or something.
[00:47:04 CEST] <kepstin> probably won't really provide much benefit
[00:47:08 CEST] <squarecircle> ok
[00:47:14 CEST] <squarecircle> thanks a lot
[00:48:30 CEST] <kepstin> but if you need to compare across a really wide range of videos, rgb isn't that bad - at least if you use ffmpeg to convert all the videos, it'll do the conversion in the same way on all of them.
[00:50:44 CEST] <squarecircle> kepstin: most of it should be yuv420p as it is x264 encoded
[00:50:48 CEST] <squarecircle> in mp4 containers
[00:51:16 CEST] <squarecircle> the probability that it is not yuv420p is quite low, isn't it?
[00:54:53 CEST] <squarecircle> kepstin: ?
[00:55:32 CEST] <kepstin> yeah, in that case it should all be yuv420p or something isomorphic.
[00:57:00 CEST] <squarecircle> is there a way to output the raw data in the respective colour format?
[00:58:00 CEST] <squarecircle> "-codec:v", "rawvideo", should print it as raw pixel values, correct?
[00:58:36 CEST] <squarecircle> "-pix_fmt", "rgb24", should be replaced with "original", or?
[00:59:04 CEST] <squarecircle> and what "-f", "rawvideo", does, I have no clue
[00:59:06 CEST] <squarecircle> :/
[01:17:20 CEST] <squarecircle> oh yeah
[01:19:27 CEST] <squarecircle> if I take "+" as a -pix_fmt value, it takes the source's colour space
[01:19:33 CEST] <squarecircle> and there's no conversion
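A sketch of that, with placeholder file names: a single "+" keeps the decoder's pixel format and disables automatic conversion, so the raw dump stays in the source format.

    # dump decoded frames without any pixel-format conversion
    ffmpeg -i input.mp4 -map 0:v:0 -f rawvideo -c:v rawvideo -pix_fmt + frames.raw
    # check which pixel format that actually was
    ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt -of default=nw=1 input.mp4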
[01:23:49 CEST] <squarecircle> alexpigment: https://git.computerwerk.hg.tu-darmstadt.de/goto/chromatography
[01:24:01 CEST] <squarecircle> alexpigment: thats the full program
[01:29:39 CEST] <Cracki> what's the purpose of that
[01:32:38 CEST] <Cracki> it's certainly not that: https://en.wikipedia.org/wiki/Chromatography
[01:36:18 CEST] <squarecircle> Cracki: no the name is only borrowed
[01:36:53 CEST] <squarecircle> Cracki: it creates a hash based on the average colours of a video
[01:42:02 CEST] <squarecircle> I'm off
[01:42:12 CEST] <squarecircle> many thanks again!
[02:54:15 CEST] <ulf`> Hi
[02:55:25 CEST] <ulf`> I have a USB 2.0 webcam as a source for the following raw pixel format: [video4linux2,v4l2 @ 0x960d1e0] Raw : yuyv422 : YUV 4:2:2 (YUYV) : 640x480 352x288 320x240 176x144 160x120
[02:55:48 CEST] <ulf`> Any idea how I can use ffserver to produce an RTP feed ZoneMinder will understand in its monitor configuration?
[02:57:23 CEST] <ulf`> Or I should say: How I can produce an RTP feed any media player will understand. So far no luck.
[05:39:15 CEST] <mozzarella> guys
[07:31:04 CEST] <blap> you need korbáčiky mozzarella
[07:45:41 CEST] <mozzarella> blap: who's that?
[07:46:17 CEST] <blap> it's a form of mozzarella invented in slovakia - pulled into strings and sometimes smoked
[07:49:22 CEST] <Xogium> great, now I feel like eating pizza at 2 am XD
[07:52:00 CEST] Action: blap decides to make pizza at 7:51 AM. so THERE
[07:52:05 CEST] <mozzarella> blap: how can I put a limit on ffmpeg's cpu usage
[07:52:50 CEST] <blap> cpulimit
[07:55:07 CEST] <mozzarella> blap: how do I use it?
[07:56:55 CEST] <blap> i don't know
[07:57:01 CEST] <blap> cpulimit ffmpeg bla bla
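A rough sketch of cpulimit usage for an already running ffmpeg, assuming the commonly packaged cpulimit with -l (percent) and -p (pid) options:

    # cap the newest ffmpeg process at ~50% of one core; cpulimit throttles by
    # repeatedly sending SIGSTOP/SIGCONT, so the shell may show the process as suspended
    cpulimit -l 50 -p "$(pgrep -n ffmpeg)"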
[08:00:34 CEST] Action: blap puts a frozen pizza in the oven and strikes a Power Rangers pose
[08:28:10 CEST] <squarecircle> blap: is this stuff also turning into rubber, like the polish smoked cheese?
[08:29:20 CEST] <blap> it is rubbery
[08:29:31 CEST] <squarecircle> Oscypek
[08:29:53 CEST] <squarecircle> :D
[08:30:36 CEST] <mozzarella> how can I tell ffmpeg how many threads to use
[08:31:11 CEST] <squarecircle> If I choose rawvideo as output and yuv420p as pix_fmt, do I get a constant 12 bits/pixel?
[08:31:32 CEST] <furq> mozzarella: you can't set a global limit, but you can set -threads for the decoder and encoder
[08:31:48 CEST] <furq> but some encoders will use more threads, plus any filters you use will use additional threads
[08:31:51 CEST] <furq> squarecircle: yes
[08:34:36 CEST] <squarecircle> furq: thanks
[08:34:37 CEST] <mozzarella> furq: how
[08:34:50 CEST] <mozzarella> when I use -threads 1 it still uses more than one thread
[08:35:31 CEST] <furq> set -threads 1 before -i as well
[08:35:51 CEST] <furq> that will still use more than two threads with x264 though unless you turn lookahead off
[08:35:53 CEST] <mozzarella> I set it twice?
[08:35:56 CEST] <furq> yes
[08:36:00 CEST] <furq> once for the decoder, once for the encoder
[08:36:12 CEST] <furq> like i said it'll still use more than two threads
[08:36:45 CEST] <mozzarella> ffmpeg -threads 1 -i sleep.mp4 sleep_compressed.mp4 -threads 1
[08:36:47 CEST] <mozzarella> like that?
[08:37:08 CEST] <furq> output options go before the output file
[08:37:29 CEST] <mozzarella> this is so confusing
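Putting furq's corrections together, a sketch of the command mozzarella is after (file names taken from the chat):

    # -threads before -i applies to the decoder, -threads before the output file to the encoder
    ffmpeg -threads 1 -i sleep.mp4 -threads 1 sleep_compressed.mp4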
[08:38:42 CEST] <squarecircle> [ACK]
[08:38:43 CEST] <squarecircle> :)
[08:40:29 CEST] <mozzarella> alright
[08:40:33 CEST] <mozzarella> cpulimit seems to work
[08:40:35 CEST] <mozzarella> I like it
[08:40:38 CEST] <squarecircle> mozzarella: I think it's great to remember it like this: ffmpeg [input options] -i input [output options] -o output
[08:40:46 CEST] <furq> except without -o
[08:40:57 CEST] <squarecircle> furq: ok, why no "-o"?
[08:41:07 CEST] <squarecircle> that has confused me for several years now
[08:41:14 CEST] <furq> because you can have multiple output files
[08:41:26 CEST] <squarecircle> ok
[08:41:38 CEST] <furq> it's probably easier to think of it as ffmpeg (input options) -i input (output1 options) output1 [(output2 options) output2]...
[08:42:28 CEST] <furq> but yeah, -threads just signals to the decoder/encoder how many threads to use
[08:42:35 CEST] <squarecircle> ok
[08:42:46 CEST] <furq> ffmpeg is just marshalling those together so it can't really put a global limit on how many threads are used
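A sketch of the multi-output shape furq describes, with made-up output names; each output gets its own options placed directly before it:

    ffmpeg -threads 1 -i sleep.mp4 \
        -c:v libx264 -threads 1 sleep_small.mp4 \
        -vn -c:a libmp3lame sleep_audio.mp3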
[08:43:19 CEST] <squarecircle> so Wikipedia tells me: YUV420p 6 bytes per 4 pixels, reordered
[08:43:22 CEST] <furq> with that said usually the vast majority of your cpu usage comes from the encoder
[08:43:49 CEST] <squarecircle> what happens with a 3x3 video?
[08:43:58 CEST] <furq> you can't have a 3x3 video with yuv420p
[08:44:02 CEST] <furq> both dimensions have to be mod2
[08:44:05 CEST] <squarecircle> ok
[08:44:33 CEST] <squarecircle> so it's safe to read 6 bytes, get the values for four pixels, and go on?
[08:44:43 CEST] <furq> right
[08:44:56 CEST] <squarecircle> ok, thank you
[08:45:59 CEST] <furq> actually, having read the question again, no
[08:46:07 CEST] <furq> that would work for a packed pixel format, but yuv420p is planar
[08:46:27 CEST] <furq> so it depends what the decoder actually gives you
[08:46:42 CEST] <squarecircle> ?
[08:47:16 CEST] <furq> if the decoder outputs yuv420p then you get all the Y values for the entire frame, then all U, then all V
[08:47:38 CEST] <squarecircle> oh
[08:47:48 CEST] <squarecircle> you sure?
[08:48:52 CEST] <squarecircle> ok, just read: Y′UV420p is a planar format, meaning that the Y′, U, and V values are grouped together instead of interspersed.
[08:48:57 CEST] <furq> right
[08:49:06 CEST] <furq> any pixel format in ffmpeg with p on the end is planar
[08:49:14 CEST] <squarecircle> ok
[08:49:21 CEST] <furq> i don't know if there is a packed 4:2:0 format
[08:49:32 CEST] <furq> there are some you might encounter for 4:2:2
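A worked example of the planar layout for an assumed 640x480 yuv420p frame, plus one way to grab exactly one decoded frame (file names are placeholders):

    # per frame: Y = 640*480 = 307200 bytes, U = V = 320*240 = 76800 bytes each,
    # total 640*480*3/2 = 460800 bytes (12 bits/pixel), stored as all Y, then all U, then all V
    ffmpeg -i input.mp4 -f rawvideo -pix_fmt yuv420p pipe:1 | head -c 460800 > frame0.yuv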
[08:49:51 CEST] <squarecircle> later I'll write a script that scans all my raw video data for the pixel formats
[08:50:15 CEST] <squarecircle> I guess most stuff is yuv420p
[08:50:20 CEST] <furq> probably
[08:51:05 CEST] <furq> if you're just reading it from a file then ignore that stuff i said about the decoder
[08:51:51 CEST] <squarecircle> ?
[08:52:06 CEST] <furq> that would only matter if you were reading the data in flight
[08:52:29 CEST] <squarecircle> no, I'm using ffmpeg to convert the encoded video to RAW
[08:52:37 CEST] <squarecircle> furq: thats in flight
[08:52:56 CEST] <squarecircle> because otherwise I would need to write a container parser and a decoder too
[08:53:25 CEST] <furq> you're reading it after it gets written to a file though, right
[08:53:33 CEST] <squarecircle> no
[08:54:12 CEST] <squarecircle> furq: http://paste.ubuntu.com/25764179/
[08:54:16 CEST] <squarecircle> thats my command
[08:54:25 CEST] <furq> yeah that's still more or less the same thing
[08:54:38 CEST] <furq> once it's left ffmpeg it'll be the pix_fmt you requested
[08:56:17 CEST] <squarecircle> furq: I don't want to convert any pix_fmt, because of information loss
[08:57:02 CEST] <squarecircle> furq: so if the source will be a yuv420p file, then I want to have raw yuv420p output
[09:06:03 CEST] <squarecircle> furq: ?
[09:06:15 CEST] <furq> yeah that's all fine. like i said, ignore what i said
[09:06:27 CEST] <squarecircle> what should I ignore?
[09:08:28 CEST] <furq> 07:46:27 ( furq) so it depends what the decoder actually gives you
[09:08:29 CEST] <furq> just that
[09:14:23 CEST] <squarecircle> ok
[09:14:24 CEST] <squarecircle> :D
[09:14:53 CEST] <squarecircle> thank you
[09:15:16 CEST] <squarecircle> I'll have to work now, but I'll get back to the project later ;)
[09:57:08 CEST] <mozzarella> guys
[09:57:28 CEST] <mozzarella> https://i.imgur.com/pAz9gND.png
[09:57:31 CEST] <mozzarella> what is this?
[09:58:03 CEST] <mozzarella> why does it say suspended
[09:58:15 CEST] <rabbe> how do I achieve smooth motion with decent quality using rtsp on a local network? gopro -> magewell capturecard -> ffmpeg -> wifi -> vlc on iphone. are there any default settings that work well?
[09:59:57 CEST] <rabbe> oh, and i'm also using this: https://github.com/revmischa/rtsp-server
[10:00:15 CEST] <furq> mozzarella: i'm guessing that's cpulimit output
[10:00:34 CEST] <JEEB> RTSP can be either TCP or UDP, UDP is likely to work less well over various networks
[10:00:37 CEST] <JEEB> esp. wireless
[10:00:50 CEST] <rabbe> ah, okay
[10:01:02 CEST] <rabbe> is rtsp over tcp better than rtmp?
[10:01:07 CEST] <mozzarella> furq: no, I ran it separately
[10:01:14 CEST] <mozzarella> and gave it the pid of the process
[10:01:15 CEST] <JEEB> I'd say yes because RTSP at least is standardized :P
[10:01:57 CEST] <mozzarella> it's like it has suspended itself but it's still printing to the terminal
[10:04:49 CEST] <rabbe> so basically use -rtsp_transport tcp in ffmpeg?
[10:05:53 CEST] <Nacht> Hmmm. That new XVC codec sounds nice
[10:06:02 CEST] <rabbe> whoa, that made the rtsp server console window go apeshit
[10:08:33 CEST] <rabbe> maybe tcp is not supported by that rtsp server
[10:08:43 CEST] <stevenliu> HSV or HSL, which one is used by vf_hue.c? JEEB
[10:08:51 CEST] <JEEB> E_NO_IDEA
[10:22:20 CEST] <rabbe> any good free rtsp-over-tcp server suggestions?
[11:13:59 CEST] <rabbe> setting an h264 profile.. does that affect any other ffmpeg parameters, which I then won't need to set anymore?
[11:14:27 CEST] <JEEB> generally no, other components handle any sort of signaling automagically.
[11:15:42 CEST] <rabbe> ok
[11:27:01 CEST] <rabbe> does VLC suck on IOS? :)
[11:27:15 CEST] <rabbe> (iphone)
[11:27:20 CEST] <JEEB> no idea :P
[12:00:02 CEST] <flok420> any ideas why this happens: https://i.sigio.nl/961d941ea603e55a6649ef986ebf2fde.jpg I mean the colorful garbage in the lower half. this is an rtsp stream. it also happens with vlc but not with the original windows software. it is supposed to be a regular h.264 stream.
[12:00:29 CEST] <flok420> another example: https://i.sigio.nl/c6fa01a6acba6682160f48b9ec91cd37.png
[12:44:08 CEST] <libertas> Hi, from the manual, I try: ffmpeg -i output.mp4 -map 0 -c:v libx264 -c:a copy output_f.mp4 but returns an error saying Unrecognized option 'c:v' and Failed to set value 'libx264' for option 'c:v'
[12:44:57 CEST] <libertas> ffmpeg version 0.8.7, Copyright (c) 2000-2013 the Libav developers
[12:44:58 CEST] <libertas> built on Oct 10 2017 15:21:15 with gcc 6.3.0
[12:45:25 CEST] <libertas> libav seems to have been deprecated but this is a recent package from my distro, Void Linux
[12:45:52 CEST] <libertas> how can I convert the video?
[12:50:23 CEST] <JEEB> libertas: most likely the avconv binary would do the job for you, but man that is an old version of either :P
[12:50:37 CEST] <JEEB> so I'd just recommend building a newer FFmpeg
[12:52:11 CEST] <Fyr> guys, I get:
[12:52:11 CEST] <Fyr> >>Application provided invalid, non monotonically increasing dts to muxer in stream 4
[12:52:12 CEST] <Fyr> how do I make FFMPEG convert?
[12:53:51 CEST] <JEEB> I remember ffmpeg.c trying to fix the weirdness in MPEG-TS and stuff so you'd need some context about that :P also depending on the output format that is not an error but just a warning
[12:54:19 CEST] <Fyr> JEEB, FFMPEG wrote:
[12:54:21 CEST] <Fyr> av_interleaved_write_frame(): Invalid argument
[12:54:30 CEST] <Fyr> Conversion failed!
[12:54:38 CEST] <JEEB> ok, so in your case it was a failure
[12:54:41 CEST] <libertas> thanks JEEB
[12:55:04 CEST] <JEEB> anyways, I can't say anything about anything generically so if you're not providing info then you might as well go back and herp a derp how FFmpeg is such awful software
[12:55:59 CEST] <JEEB> and that you should have bought this big harmonics box instead
[12:57:00 CEST] <Fyr> JEEB, https://pastebin.com/rBbfhUSe
[12:57:31 CEST] <JEEB> thank you
[12:58:06 CEST] <Fyr> the command was:
[12:58:06 CEST] <Fyr> ffmpeg -i bob.mkv -map 0 -c:v libx264 -crf 24 -preset:v veryslow -profile:v high -level:v 4.1 -c:a aac -b:a 256k -ac 2 -c:s copy -threads 4 bob-conv.mkv
[13:01:50 CEST] <JEEB> I recommend using -v verbose -debug_ts with this thing, but I have a hunch that the input file is borked.
[13:02:11 CEST] <JEEB> that should basically give you the timestamps demuxed and then pushed into the muxer
[13:02:30 CEST] <Fyr> JEEB, does -report include sufficient verbosity?
[13:02:36 CEST] <JEEB> no idea
[13:02:44 CEST] <JEEB> I usually just use 2> stuff.log
[13:02:55 CEST] <JEEB> which redirects stderr to a file
[13:03:22 CEST] <Nacht> Oh man, that -debug_ts option. I love it
[13:03:51 CEST] <JEEB> it will spam you but at least you will know at which point timestamps went derp
[13:04:08 CEST] <JEEB> Fyr: also you can limit to just the subtitle track that fails if you wish
[13:04:30 CEST] <JEEB> since the other tracks are largely unrelated
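A sketch of the reduced run JEEB is asking for, using the file names from Fyr's command; only the failing subtitle stream is remuxed and the timestamp log goes to a file:

    ffmpeg -v verbose -debug_ts -i bob.mkv -map 0:4 -c copy bob-subs.mkv 2> stuff.log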
[13:05:36 CEST] <Xogium> erm... So I set up srtp last evening and left the stream running in my music player for a while before connecting to something else for the night. This morning I try to play the sdp file I got from that other box last evening and I get this, despite the fact the stream never stopped: hmac mismatch
[13:06:59 CEST] <Xogium> still the same size as yesterday, same time for last modified, nothing seems to have altered the sdp file
[13:07:51 CEST] <Xogium> if I re-grab the same file from the box it works
[13:08:07 CEST] <JEEB> no idea about SRTP
[13:08:22 CEST] <JEEB> some sort of authentication fialure
[13:08:45 CEST] <Xogium> yep, it compares the hmac of what I got to its own and says it's not good. The file never changed
[13:09:10 CEST] <JEEB> well, if there's some time-based message there
[13:09:13 CEST] <JEEB> or something else
[13:09:15 CEST] <JEEB> I have no idea
[13:09:19 CEST] Action: JEEB never has used SRTP
[13:09:53 CEST] <Xogium> heh.. Maybe others know what I do wrong
[13:12:07 CEST] <Fyr> JEEB, https://gist.github.com/anonymous/b3c56d69c8467eecde6022768c6e7c91
[13:12:12 CEST] <Fyr> full report log
[13:13:17 CEST] <Fyr> oh, it's not full.
[13:13:24 CEST] <Xogium> nvm, my player now refuses to take that file, but ffmpeg didn't modify it either
[13:13:51 CEST] <Xogium> so there must be some time for validity or whatever..
[13:14:22 CEST] <Xogium> if someone who knows about srtp and how it's handled in ffmpeg could explain that would be nice :D
[13:19:29 CEST] <Fyr> JEEB, the complete output:
[13:19:29 CEST] <Fyr> https://www.dropbox.com/s/mw78fki7y5mi5gs/ffmpeg-20171018-180259.log?dl=0
[13:22:37 CEST] <JEEB> Fyr: becomes shorter if you just map that subtitle stream and only copy it
[13:23:06 CEST] <Fyr> I wrote -c:s copy.
[13:23:20 CEST] <JEEB> yea but you are handling all the streams :P
[13:23:26 CEST] <JEEB> we're only interested in that one subtitle stream
[13:23:42 CEST] <Fyr> JEEB, what command do you recommend?
[13:23:58 CEST] <JEEB> -map 0:4 -c:s copy?
[13:24:07 CEST] <JEEB> *-c copy since you don't map anything else
[13:24:41 CEST] <Fyr> why does the first command not work?
[13:24:58 CEST] <JEEB> it does, but I won't download 10 megs of logs :P
[13:25:45 CEST] <Inge-> I would like to convert still images to video on an rPi. Trying to get ffmpeg to work with GPU hw acceleration on the Pi, running ffmpeg 3.2.5-1. It contains h264_omx, but complains: libOMX_Core.so not found. Trying to compile from source, I get a different complaint: ERROR: OMX_Core.h header not found. Tips?
[13:49:54 CEST] <berz3rk> can you please help me? I try to use a simple filter with ffmpeg, but when I do it makes the quality of the video worse, halfs the bitrate :(
[13:50:05 CEST] <berz3rk> I just want to convert from right to left sbs to left to right sbs
[13:50:39 CEST] <JEEB> many video encoders have the libavcodec default of 200kbps or so
[13:50:47 CEST] <berz3rk> I used this command : ffmpeg -i 'input.mp4' -map 0:0 -map 0:1 -c:a aac -b:a 320k -vf stereo3d=sbsr:sbsl 'output.mp4'
[13:51:11 CEST] <JEEB> check the stderr for which encoder got used
[13:51:19 CEST] <berz3rk> my output file has half the video bitrate..
[13:51:39 CEST] <JEEB> the input bit rate really doesn't matter so let's ignore that.
[13:51:46 CEST] <berz3rk> ffmpeg -i 'input.mp4' -map 0:0 -map 0:1 -c:a aac -b:a 320k -vf stereo3d=sbsr:sbsl 'output.mp4'
[13:51:48 CEST] <berz3rk> oups
[13:51:50 CEST] <berz3rk> Lavf57.56.101
[13:51:54 CEST] <berz3rk> is this the encoder?
[13:51:55 CEST] <JEEB> no
[13:52:02 CEST] <JEEB> just post your full stderr in a pastebin.com or similar
[13:52:04 CEST] <JEEB> and link here
[13:53:11 CEST] <berz3rk> https://pastebin.com/tBdn3yAK
[13:53:57 CEST] <JEEB> thank you
[13:54:03 CEST] <JEEB> ok, so x264 is used
[13:54:14 CEST] <JEEB> in that case the default crf is 23, you can control it with -crf XX
[13:54:26 CEST] <berz3rk> what is the highest value for me here
[13:55:02 CEST] <berz3rk> ah 0
[13:55:02 CEST] <berz3rk> ok
[13:55:04 CEST] <JEEB> no
[13:55:07 CEST] <JEEB> wait a moment there
[13:55:16 CEST] <JEEB> let me finish what I was writing
[13:55:32 CEST] <JEEB> use -vframes 2500 after -i INPUT and -ss SECONDS before -i INPUT to find a spot where you have content and encode about 2500 frames
[13:55:38 CEST] <JEEB> start with say crf 21
[13:55:48 CEST] <JEEB> then if that looks good, remove those two and encode the full movie
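A sketch of that test encode, assuming a spot roughly 10 minutes in; the seek point and CRF value are just the examples mentioned above:

    ffmpeg -ss 600 -i input.mp4 -vframes 2500 \
        -vf stereo3d=sbsr:sbsl -c:v libx264 -crf 21 \
        -c:a aac -b:a 320k test.mp4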
[13:56:05 CEST] <Inge-> In my case, it configured fine if I omitted omx and only included omx-rpi
[13:56:14 CEST] <Inge-> now seeing if it will compile
[13:56:30 CEST] <JEEB> CRF zero with 8bit is lossless, and that you generally don't want if you want any sort of hardware decoding
[13:56:31 CEST] <berz3rk> it's a 3d movie so it's hard to check if it looks good ;D
[13:56:49 CEST] <JEEB> well you were able to tell if it looked bad
[14:07:52 CEST] <Xogium> well dang, I guess I'll just have to restart streaming
[14:11:57 CEST] <Xogium> been streaming for 13 hours, so I guess the only possibility is a validity time for the hmac but then again I suppose ffmpeg would have written new info into the sdp_file which it didn't
[14:18:58 CEST] <paveldimow> Hi, I need a little help to find out am I blind or stupid
[14:19:09 CEST] <paveldimow> ffmpeg -i Downloads/sample.mp4 -force_key_frames "expr:gte(t,n_forced*1)" -filter_complex " [0:v] setpts=PTS-STARTPTS [main]; [1:v] movie="Downloads/sample.mp4", scale=iw/3:ih/3, setpts=PTS-STARTPTS [cohost]; [main][cohost] overlay=shortest=1; [0:a][1:a]amix " -c:a aac -c:v libx264 -strict -2 -f mpegts udp://230.0.0.1:5000
[14:19:27 CEST] <paveldimow> [AVFilterGraph @ 0x56306bffbfe0] Too many inputs specified for the "movie" filter.
[14:20:36 CEST] <paveldimow> any help highly appreciated
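One way around that error, as a sketch: the movie source filter takes no inputs, so the "[1:v]" label in front of it (plus the nested double quotes) breaks the graph. Feeding the file a second time with -i and dropping the movie filter avoids both problems:

    ffmpeg -i Downloads/sample.mp4 -i Downloads/sample.mp4 -force_key_frames "expr:gte(t,n_forced*1)" -filter_complex "[0:v]setpts=PTS-STARTPTS[main];[1:v]scale=iw/3:ih/3,setpts=PTS-STARTPTS[cohost];[main][cohost]overlay=shortest=1[v];[0:a][1:a]amix[a]" -map "[v]" -map "[a]" -c:a aac -c:v libx264 -strict -2 -f mpegts udp://230.0.0.1:5000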
[14:32:10 CEST] <Xogium> okaaaay... Just plain weird
[14:32:24 CEST] <Xogium> if I restart the stream, it works
[14:32:45 CEST] <Xogium> I don't even have to get the new file, the file from last evening works magically
[14:33:26 CEST] <Xogium> soooo question is why tell me there's an issue with the hmac after some time if it doesn't change ?
[14:35:15 CEST] <blap> who knows something about X and glx
[14:37:09 CEST] <blap> answer or the duck gets it!
[14:40:38 CEST] <luc4> Hello! I'm debugging a weird situation in my player and I notice that after av_seek_frame I'm getting AV_NOPTS_VALUE as cur_dts. Is this normal?
[14:47:44 CEST] <rabbe> do i need a signaling/stun/turn server to broadcast one-way over LAN?
[14:52:21 CEST] <BtbN> what?
[14:52:29 CEST] <BtbN> broadcast one-way?
[14:52:44 CEST] <rabbe> one cam - many clients
[14:53:01 CEST] <BtbN> I'm not aware of any protocol using actual broadcast
[14:53:16 CEST] <BtbN> it's always just a simple connection or UDP communication
[14:53:25 CEST] <BtbN> There are some that use multicast, but getting that working is annoying
[14:53:47 CEST] <rabbe> well, call it whatever. sending video to many clients
[14:54:02 CEST] <BtbN> Just setup any streaming server, and connect your clients there.
[14:54:12 CEST] <BtbN> The data will go over the wire multiple times though.
[14:55:13 CEST] <rabbe> will i need a stun/turn server?
[14:55:29 CEST] <rabbe> i'm talking webRTC, sorry..
[14:56:41 CEST] <BtbN> You have a NAT inside of your LAN? oO
[14:57:03 CEST] <BtbN> And WebRTC for local streaming is highly unusual and hard to do
[14:57:07 CEST] <Xogium> double natting is baaaaad
[14:57:13 CEST] <BtbN> It's practically only implemented in Browsers.
[14:57:56 CEST] <rabbe> don't know, don't think so.. but the terms come up when you try to learn webrtc
[14:58:17 CEST] <rabbe> just think of a wireless router connecting a bunch of computers
[14:59:07 CEST] <Xogium> if you have a router behind a router then you are indeed doing double nat (unless you turned off nat into that second router)
[14:59:14 CEST] <rabbe> "highly unusual and hard to do". do you have a better way to stream to an iphone realtime?
[14:59:57 CEST] <rabbe> "realtime" = as fast as possible
[15:00:38 CEST] <JEEB> fast or low latency?
[15:00:45 CEST] <JEEB> those are two distinct things
[15:01:16 CEST] <BtbN> I think the iPhone only supports HLS, which has a rather big latency hit
[15:01:17 CEST] <rabbe> i want to see the things as soon as they are captured by the camera
[15:01:26 CEST] <BtbN> That's hardly possible
[15:01:49 CEST] <JEEB> well if you set up your capture + x264 to very low latency and find a way to push UDP packets over your wifi link fast enough...
[15:01:53 CEST] <rabbe> yep.. and VLC seems to suck on iphone.. tried rtmp and rtsp
[15:02:05 CEST] <aleek> sup! I need to write a Linux app that generates a video stream (may be raw, I can compress it later) with an analog speed indicator. The speed will be taken from real-time input. You guys have any idea how to do it? I have no idea what to write into google search to read about it :/
[15:02:09 CEST] <BtbN> You're not going to get an iPhone to play UDP stuff sent to it though
[15:02:18 CEST] <JEEB> well you'd have to make your own app of course
[15:02:31 CEST] <JEEB> I think that's a prerequisite for something like that :P
[15:02:31 CEST] <rabbe> no custom apps
[15:02:35 CEST] <BtbN> Pretty sure Apps also aren't allowed to do that.
[15:02:39 CEST] <BtbN> TCP only
[15:02:39 CEST] <JEEB> well vlc is a goddamn custom app
[15:02:50 CEST] <rabbe> either a player that exists for all mobile
[15:02:53 CEST] <rabbe> or web
[15:03:08 CEST] <BtbN> You'll have to accept a big latency hit then
[15:03:17 CEST] <rabbe> ;~~(
[15:03:28 CEST] <rabbe> but webrtc should be kinda realtime?
[15:03:31 CEST] <JEEB> basically if you limit your scope and put effort and/or money into it, you can do it
[15:03:37 CEST] <JEEB> stop talking real-time, talk in fucking latency
[15:03:45 CEST] <rabbe> :)
[15:03:46 CEST] <JEEB> real-time is just "the encoder keeps up in frame rate"
[15:04:21 CEST] <JEEB> BtbN: anyways vlc on iphone seemed to be using UDP for RTSP... loldunno, though
[15:04:24 CEST] <rabbe> low latency then, mark my words
[15:04:52 CEST] <BtbN> JEEB, I wouldn't be surprised if it uses tcp for rtsp, or the iPhone's native RTSP implementation, if it has one
[15:05:08 CEST] <JEEB> right, it probably has one. it wasn't TCP because this guy's RTSP server derp'd at TCP
[15:05:14 CEST] <JEEB> it seems
[15:05:21 CEST] <JEEB> if I parsed his lines correctly
[15:05:21 CEST] <rabbe> yep
[15:06:00 CEST] <JEEB> anyways, it's definitely possible to get stuff going, but the scope has to be limited and there most likely are no widely known pre-made things
[15:06:17 CEST] <JEEB> since usually such strict requirements require effort/time :P
[15:06:17 CEST] <BtbN> You cannot just stream to a browser with WebRTC
[15:06:33 CEST] <JEEB> yes, you need to do the base information sharing first
[15:06:41 CEST] <JEEB> handshakes and all that
[15:06:47 CEST] <rabbe> that web-experiment.com guy seems to have unidirectional webrtc video going
[15:06:50 CEST] <BtbN> In theory it's possible, but you'll have to write the entire "server", which just acts as another browser with a video source, yourself
[15:07:06 CEST] <BtbN> There is nothing readily available for that.
[15:07:25 CEST] <rabbe> www.webrtc-experiment.com
[15:07:49 CEST] <BtbN> And what part of it being called an experiment makes you think it's a good idea to use in production?
[15:08:13 CEST] <JEEB> anyways, tl;dr it's possible but you'll have to pour time and effort into it
[15:08:17 CEST] <rabbe> all the talk about webrtc being the way forward for video online
[15:08:24 CEST] <BtbN> Between Browsers
[15:08:27 CEST] <rabbe> ios 11 supports it now etc.
[15:08:32 CEST] <BtbN> WebRTC is a gigantic over-engineered highly complex something
[15:08:42 CEST] <BtbN> you can't just stream to it without jumping through 20 hoops
[15:08:45 CEST] <JEEB> I think at least one game streaming company uses it
[15:08:53 CEST] <JEEB> I am not sure if I ever got it working, though
[15:08:55 CEST] <rabbe> thats my initial thought also.. but people keep recommending it
[15:09:01 CEST] <BtbN> mixer uses it as its ingest protocol. But their own custom written variation of it
[15:09:03 CEST] <JEEB> no, it just hits your requirements
[15:09:08 CEST] <BtbN> It's called ftl protocol
[15:09:15 CEST] <JEEB> BtbN: it's pretty much WebRTC
[15:09:21 CEST] <JEEB> (which in turn is pretty much RTP)
[15:09:24 CEST] <JEEB> and it's not just ingest
[15:09:35 CEST] <JEEB> it also uses WebRTC on the client side if you try switching to it
[15:09:52 CEST] <JEEB> rabbe: basically you've been noting the requirements you have, and everything people have done is give you the currently available *alternatives*
[15:10:03 CEST] <JEEB> you will still have to work out how to do it if you really want to succeed in it
[15:10:09 CEST] <JEEB> and/or find another way of doing it
[15:10:14 CEST] <rabbe> how live is live streaming on youtube etc.? from a guy doing something in front of the camera to the viewers being able to see it? can't be that live
[15:10:29 CEST] <BtbN> ~30 seconds delay
[15:10:29 CEST] <JEEB> easily tens of seconds
[15:10:37 CEST] <Xogium> that's latency, obviously
[15:10:51 CEST] <JEEB> yes, latency due to our HTTP-based ways of streaming etc
[15:10:55 CEST] <furq> youtube live is still hls thanks to our friends at apple
[15:11:09 CEST] <BtbN> Well, it's HLS because there is nothing else
[15:11:16 CEST] <BtbN> YouTube uses HLS with mp4 segments
[15:11:40 CEST] <JEEB> also I think it's DASH. although I think if you pick the webm dash format it will have extra latency because libvpx can't keep up :P
[15:12:00 CEST] <furq> last i checked it was hls for live and dash for everything else
[15:12:08 CEST] <furq> i assume they would use dash for live but for iOS
[15:12:21 CEST] <JEEB> I checked a random live stream on browser some time ago and it was VP9/DASH
[15:12:23 CEST] <furq> but i'm not going to claim to understand google's motivation for doing things
[15:12:30 CEST] <JEEB> so they definitely have multiple things around
[15:12:36 CEST] <furq> oh really
[15:12:39 CEST] <JEEB> most likely HLS for iDevices
[15:12:41 CEST] <furq> yeah
[15:13:11 CEST] <furq> that must have changed fairly recently
[15:13:17 CEST] <furq> either that or firefox is super weird
[15:13:25 CEST] <JEEB> not sure when I first started getting it.
[15:13:50 CEST] <furq> that makes sense because they obviously would prefer not to use hls at all
[15:13:56 CEST] <JEEB> anyways, latency wise something over UDP < something over TCP < something over HTTP
[15:14:10 CEST] <JEEB> (greater being greater latency)
[15:14:13 CEST] <Xogium> udp is a bit better for latency
[15:14:30 CEST] <Xogium> certainly better than http
[15:14:54 CEST] <JEEB> well with HTTP you have the problem that none of the browsers are able to keep reading a URL without buffering all of it
[15:15:04 CEST] <JEEB> which kills the most obvious way of streaming with an unending HTTP thing
[15:15:25 CEST] <JEEB> (as soon as you throw browsers into the mix things get really nasty)
[15:15:46 CEST] <BtbN> You can just dump a fragmented mp4 stream into a Websocket and feed it to MSE
[15:15:52 CEST] <JEEB> yea
[15:15:54 CEST] <BtbN> The problem is, it wants seperate streams for audio and video
[15:16:07 CEST] <JEEB> sure, but that really isn't *that* much of a problem
[15:16:17 CEST] <BtbN> It has no notion of timestamps
[15:16:25 CEST] <JEEB> fragment start DTS
[15:16:27 CEST] <JEEB> is a thing
[15:16:39 CEST] <JEEB> so when you get a fragment you get its DTS
[15:16:40 CEST] <BtbN> Yeah, but it just doesn't care, you have to make sure you feed it in sync yourself
[15:16:55 CEST] <JEEB> sure, now that is specific to those things being awful
[15:17:08 CEST] <furq> i'm so glad i've never had a need to do this with low latency
[15:17:09 CEST] <JEEB> you made it sound as if there are no timestamps in the fragments :P
[15:17:11 CEST] <Xogium> well I got quite a low latency for my audio stream over srtp tbh, not even half a second
[15:18:02 CEST] <Xogium> but then again it's from a machine in the lan to another
[15:18:20 CEST] <Xogium> still, I don't think I could decrease latency even more
[15:20:30 CEST] <JEEB> half a second is quite a bit of latency and with audio only you could get it pretty low
[15:20:46 CEST] <JEEB> encoder delay + one sample?
[15:20:50 CEST] <Xogium> how should I do ?
[15:20:55 CEST] <Xogium> always curious :D
[15:21:07 CEST] <JEEB> don't ask me for how because you really need to invest time and effort into these things :P
[15:21:15 CEST] <JEEB> and nobody has paid me to do such things yet
[15:21:20 CEST] <Xogium> encoded in opus, 20 kbps and frame duration is 20
[15:21:21 CEST] <JEEB> I've just seen some successful people
[15:21:54 CEST] <Xogium> I think it's a bit below half a second
[15:22:02 CEST] <Xogium> maybe 250 ms
[15:22:25 CEST] <JEEB> also the client always has to be optimized as well in the end
[15:22:35 CEST] <JEEB> so that it buffers the absolute minimum
[15:22:45 CEST] <JEEB> but yea, it's not something you can do out of the box with a lot of stuff :P
[15:22:46 CEST] <Xogium> yep
[15:22:53 CEST] <Xogium> in mpv you can try doing --no-cache
[15:23:03 CEST] <JEEB> mpv is not a low latency player
[15:23:22 CEST] <Xogium> I wonder what is, then..
[15:23:23 CEST] <JEEB> anyways, all I know is that someone had made a medical thing doing UDP streaming with x264
[15:23:31 CEST] <Xogium> wow
[15:23:36 CEST] <JEEB> so low latency *is* possible
[15:23:43 CEST] <JEEB> but you really need to put effort into it
[15:23:46 CEST] <BtbN> No readily made player is designed for low latency
[15:23:52 CEST] <BtbN> they all favor smooth playback
[15:23:56 CEST] <JEEB> yea
[15:24:36 CEST] <JEEB> because when you start going into really low latency it starts fighting with "smooth experience"
[15:24:38 CEST] <BtbN> With that low latency, every slight nudge on the network cable might throw you off and drop tons of frames
[15:25:28 CEST] <Xogium> there, yet again. Not even an hour after starting the stream I have to restart it again because of hmac mismatch
[15:25:51 CEST] <JEEB> then your server might be limiting the access token's lifetime or something
[15:25:56 CEST] <JEEB> not much I can say without knowing jack
[15:26:05 CEST] <Xogium> I have no server. It's just ffmpeg
[15:26:40 CEST] <Xogium> not to mention if I restart the stream the old file from yesterday will work just like it should
[15:27:01 CEST] <Xogium> it never changed
[15:28:53 CEST] <JEEB> there's no srtpenc.c in libavformat?
[15:28:58 CEST] <JEEB> http://git.videolan.org/?p=ffmpeg.git;a=tree;f=libavformat;h=03f37e2b4749f5f1d21a9a528c0a10eef8ff2cf1;hb=HEAD
[15:29:05 CEST] <Xogium> I don't see any option for time validity of the key
[15:29:06 CEST] <JEEB> so what your server is is something different?
[15:29:30 CEST] <Xogium> it's just pure ffmpeg, no vlc
[15:30:06 CEST] <JEEB> well you are still accessing the stream from somewhere, no?
[15:30:18 CEST] <JEEB> or do you say that ffmpeg.c is your server?
[15:30:52 CEST] <JEEB> there's an rtpenc.c in libavformat, yes
[15:31:17 CEST] <BtbN> srtp is an url protocol in lavf
[15:31:36 CEST] <Xogium> like ffmpeg -f alsa -ac 1 -i jw:0,0 -c libopus -ac 1 -b:a 20k -f rtp -srtp_out_suite suite -srtp_out_params $(echo "some text"|base64) srtp://ip:port
[15:32:09 CEST] <Xogium> and then I grab the sdp info it prints and put them in a file for my player
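Roughly the command shape being described, with -sdp_file added so the SDP ends up in a file instead of being copied off the console; the ALSA device, address, and key generation here are placeholders:

    # 30 random bytes = the 16-byte key + 14-byte salt the AES_CM_128 suites expect
    ffmpeg -f alsa -ac 1 -i hw:1,0 -c:a libopus -b:a 20k \
        -f rtp -srtp_out_suite SRTP_AES128_CM_HMAC_SHA1_80 \
        -srtp_out_params "$(head -c 30 /dev/urandom | base64)" \
        -sdp_file stream.sdp "srtp://192.168.1.20:5004"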
[15:34:42 CEST] <Xogium> still working after 5 minutes.. Trying to see after how much time it will tell me hmac mismatch
[15:35:20 CEST] <JEEB> so it fails with something easily testable like mpv?
[15:35:36 CEST] <Xogium> yes
[15:35:37 CEST] <JEEB> if so, please make a ticket on trac and see if you can get cehoyos to help you out
[15:35:51 CEST] <JEEB> if you provide exact command lines it should be possible
[15:35:57 CEST] <Xogium> sure
[15:36:14 CEST] <JEEB> in the worst case you will get told that you are getting something really simple wrong, but at least you learn that.
[15:36:28 CEST] <BtbN> The srtp RFC specifies a key lifetime
[15:36:35 CEST] <JEEB> right, I kind of expected that
[15:36:36 CEST] <BtbN> the ffmpeg srtp implementation specifies that it does not implement that.
[15:36:41 CEST] <JEEB> :D
[15:37:19 CEST] <BtbN> http://git.videolan.org/?p=ffmpeg.git;a=blob;f=libavformat/srtp.c;h=f8b686c5aaa71caf3b83388c8f7d41ca38a6e9af;hb=HEAD#l91
[15:38:00 CEST] <Xogium> soooo ?
[15:38:07 CEST] <Xogium> :/
[15:38:22 CEST] <BtbN> patches that implement it are welcome I guess
[15:38:31 CEST] <Xogium> still working after 10 min
[15:38:56 CEST] <Xogium> this is bad..
[15:39:20 CEST] <Xogium> yet again something I'd want to do but since I can't understand the most basic things in programming... :p
[15:40:19 CEST] <JEEB> BtbN: so it does support it on the server side but not on client side or?
[15:40:58 CEST] <JEEB> since otherwise I'd think things would work if both sides would not implement it
[15:42:00 CEST] <Xogium> the message is fairly misleading though, don't you think ? Hmac mismatch... Sounds more like the key is suddenly invalid while it never changed
[15:43:22 CEST] <rabbe> anyone tried UV4L?
[15:44:09 CEST] <Xogium> or maybe it's just cause I'm not a native english speaker
[15:53:08 CEST] <Xogium> key is invalid after 20 mins
[15:53:50 CEST] <JEEB> well that means that something updated it :P
[15:54:04 CEST] <Xogium> and yet, nothing did
[15:54:18 CEST] <JEEB> how do you know?
[15:54:39 CEST] <Xogium> nothing changed in the ffmpeg sdp output
[15:54:44 CEST] <JEEB> anyways, whip up that issue ticket on trac with a simple way to replicate, and see what comes out of it :P
[15:54:52 CEST] <JEEB> that doesn't IMHO mean much
[15:55:23 CEST] <Xogium> if I restart stream with exact same command line it will work for 20 minutes again
[15:56:05 CEST] <Xogium> if I keep connection open via mpv or whatever above 20 min it will leave me connected, but new connection will be rejected after 20 minutes (i.e: closing mpv after 20 mins and trying to play it again)
[15:57:59 CEST] <JEEB> it is great that you test it as much :)
[15:58:14 CEST] <JEEB> thus I recommend you make that ticket with all that information so that someone can easily replicate it
[15:58:18 CEST] <Xogium> if the key had changed I'd expect ffmpeg to at least write new sdp info so I can send it to the client, right? Because right now the only way is to restart the stream every 20 mins
[15:58:40 CEST] <Xogium> yeah, where should I open that bug precisely ?
[15:59:20 CEST] <JEEB> trac.ffmpeg.org
[15:59:40 CEST] <JEEB> Xogium: I wouldn't *expect* FFmpeg to update that. expecting things out of software is generally a bad idea
[15:59:49 CEST] <Xogium> I'm surely not a programmer, but I like testing stuff :p
[16:00:06 CEST] <JEEB> you have two things: ffmpeg.c (your server), and mpv (your client) - both of which in theory should be using the libavformat library from FFmpeg
[16:00:14 CEST] <Xogium> well, this means that srtp is a bad idea
[16:00:36 CEST] <Xogium> mpv, vlc, audacious, they all use it yes
[16:00:48 CEST] <Xogium> they all throw the same error
[16:01:58 CEST] <Xogium> I'd rather avoid having to restart the ffmpeg stream every 20 mins
[16:02:12 CEST] <JEEB> VLC might actually use something else
[16:02:32 CEST] <JEEB> also opening that ticket can lead you to noticing something you've poked wrong or whatever
[16:02:43 CEST] <JEEB> anyways, I just don't know enough of SRTP
[16:02:49 CEST] <JEEB> and it's not exactly hot on the interwebs
[16:03:09 CEST] <Xogium> used the official doc for the srtp protocol so unless it's not up to date I did exactly as described
[16:03:25 CEST] <Xogium> for ffmpeg that is
[16:03:33 CEST] <Xogium> but will still open it
[16:04:21 CEST] <ulf`> There IS traffic here
[16:04:26 CEST] <ulf`> :)
[16:04:42 CEST] <ulf`> I have a USB 2.0 webcam as a source for the following raw pixel format: [video4linux2,v4l2 @ 0x960d1e0] Raw : yuyv422 : YUV 4:2:2 (YUYV) : 640x480 352x288 320x240 176x144 160x120 is my webcam
[16:04:49 CEST] <ulf`> Any idea how I can use ffserver to produce an RTP feed ZoneMinder will understand in its monitor configuration?
[16:04:53 CEST] <ulf`> Or I should say: How I can produce an RTP feed any media player will understand. So far no luck.
[16:06:06 CEST] <JEEB> first of all, drop all hope regarding ffserver, that's not the tool you are wishing for
[16:06:25 CEST] <JEEB> rtp should be creatable by ffmpeg.c I think?
[16:06:42 CEST] <JEEB> no idea what video format or other parameters that ZoneMinder thing would require :P
[16:07:09 CEST] <Xogium> JEEB: yep just like srtp
[16:07:20 CEST] <Xogium> which is just the secure version of rtp
[16:08:51 CEST] <ulf`> JEEB: so Zoneminder would display a feed that can be opened in a browser as well or so they told me
[16:09:08 CEST] <ulf`> here's my ffserver.conf https://pastebin.com/RdKHFN8r
[16:09:19 CEST] <JEEB> I have no idea what that is, and don't expect ffserver help
[16:09:30 CEST] <ulf`> JEEB: well ffserver is incredibly hard to use
[16:09:37 CEST] <ulf`> JEEB: the issue is the camera is remote hooked up to an Edison
[16:10:04 CEST] <JEEB> ffserver is also really ugly in API usage so a lot of developers want to get rid of it.
[16:10:12 CEST] <ulf`> hehehehe
[16:10:15 CEST] <JEEB> in your case it sounds like ffmpeg.c without ffserver would do the job
[16:10:38 CEST] <ulf`> JEEB: OK.....
[16:10:47 CEST] <ulf`> JEEB: how do I expose a port without ffserver?
[16:10:59 CEST] <c_14> ulf`: ffmpeg -f rtp rtp://<destination_ip>:<destination_port>
[16:11:08 CEST] <c_14> (input goes before that of course)
[16:11:24 CEST] <ulf`> c_14: so I do need to send it to a server that does passthrough
[16:11:45 CEST] <c_14> you send to your client ip
[16:11:49 CEST] <c_14> the thing that wants to watch the rtp
[16:12:01 CEST] <JEEB> well Xogium here seemingly was able to create an (S)RTP stream that the client was capable of connecting to :P
[16:12:04 CEST] <JEEB> with ffmpeg.c
[16:12:07 CEST] <ulf`> c_14: there is nothing listening on that end
[16:12:16 CEST] <c_14> well
[16:12:19 CEST] <c_14> someone wants to watch the video
[16:12:23 CEST] <c_14> otherwise you could just use -f null
[16:13:02 CEST] <c_14> rtp isn't really client/server model, the server just blasts out data and whoever it sends the data to can pick it up
[16:13:04 CEST] <c_14> or ignore it
[16:13:25 CEST] <c_14> (at least with ffmpeg)
[16:13:36 CEST] <c_14> I believe there's some rtcp stuff so you can add yourself to the multicast but I can't remember
[16:13:48 CEST] <ulf`> Yeah so the client would be Zoneminder
[16:13:55 CEST] <ulf`> Which wants an RTP URL to use
[16:14:02 CEST] <ulf`> But it has no open port for it
[16:14:07 CEST] <c_14> you usually feed it the sdp file
[16:14:16 CEST] <c_14> which ffmpeg will give you
[16:14:16 CEST] <ulf`> And of course the documentation for how this is done right is terrible
[16:14:22 CEST] <c_14> by default on stdout (or with -sdp_file)
[16:14:30 CEST] <ulf`> c_14: ah!
[16:14:50 CEST] <ulf`> c_14: maybe I try that. Can I set a file limit so that the file doesn't grow beyond a certain size?
[16:14:59 CEST] <c_14> and if you want you can serve the sdp file with some sort of server (say http or whatever)
[16:14:59 CEST] <ulf`> c_14: or is that not necessary in case of sdp?
[16:15:10 CEST] <c_14> the sdp file won't be larger than a few hundred bytes
[16:15:17 CEST] <ulf`> cool!
[16:15:22 CEST] <ulf`> I will give that a try when I get to work
[16:15:25 CEST] <ulf`> Thanks
[16:15:35 CEST] <Xogium> c_14: do you know about srtp and how it's done in ffmpeg ? :D just wondering if you may have the same issue as I do or if you know what I could have done wrong
[16:15:58 CEST] <Xogium> since it's you that helped me for the most part yesterday
[16:16:09 CEST] <c_14> Xogium: I don't know, I wanted to check what you said earlier but haven't had the time yet
[16:16:16 CEST] <Xogium> ahah
[16:16:19 CEST] <JEEB> basically HMAC errors after 20min or so
[16:16:20 CEST] <c_14> I usually just use regular rtp since I control the wires
[16:16:39 CEST] <JEEB> and yea, most people use RTP
[16:16:47 CEST] <Xogium> well good news is, it takes only about 20 mins of streaming to verify, afaik.. it's the case here so probably the same everywhere
[16:16:57 CEST] <JEEB> quite likely :P
[16:16:58 CEST] <c_14> I can probably check later tonight
[16:18:12 CEST] <Xogium> if you keep your player open for like 30 mins just to be sure, close it and then restart it and give it the same sdp file it will complain about hmac mismatch. Only way is to restart ffmpeg on the other end and suddenly your sdp file will work for another 20 mins
[16:20:17 CEST] <Xogium> so good side is it's fairly easy to test
[16:20:29 CEST] <Xogium> btw latest ffmpeg, 3.3.4
[16:21:55 CEST] <JEEB> even the releases rolled beyond that now (which most people around FFmpeg don't actually use). but yea, most likely SRTP hasn't changed a dime since :P
[16:22:04 CEST] <JEEB> since SRTP is not exactly something with active developers
[16:23:01 CEST] <relaxed> Xogium: latest is 3.4
[16:23:13 CEST] <Xogium> aww well dang
[16:23:15 CEST] <Xogium> :D
[16:23:39 CEST] <Xogium> well at least 3.3.4 is fairly recent :p
[16:23:41 CEST] <JEEB> but as I noted, most likely no change in SRTP :P
[16:23:49 CEST] <Xogium> probably none
[16:26:57 CEST] <Xogium> well I'd use rtp but it's definitely not secure especially if I want to check in when I'm not in the lan
[17:00:18 CEST] <Fyr> is there a way to send subtitles to pipe and read those subtitles from it?
[17:01:13 CEST] <JEEB> I'd say "why not"?
[17:01:21 CEST] <JEEB> something just has to know how to read them
[17:01:40 CEST] <Fyr> I'm trying, it writes:
[17:01:41 CEST] <Fyr> [ssa @ 0x54799a0] Only SUBTITLE_ASS type supported.
[17:01:53 CEST] <Fyr> and:
[17:01:53 CEST] <Fyr> [matroska,webm @ 0x50c6020] EBML header parsing failed
[17:01:57 CEST] <JEEB> that's because you're not telling it to copy the style
[17:02:02 CEST] <JEEB> uhh, not style but the stream
[17:02:18 CEST] <JEEB> it's trying to convert to text-based stuff and erroring out since the subtitle type is not text
[17:02:31 CEST] <JEEB> subtitles still aren't in AVFrames which is a problem in that sense :P
[17:02:38 CEST] <Fyr> JEEB, ok, now I'm stuck with the same error:
[17:02:38 CEST] <Fyr> av_interleaved_write_frame(): Invalid argument
[17:03:01 CEST] <Fyr> the same error appears even when I'm sending the subtitles through a pipe.
[17:03:21 CEST] <JEEB> yes, because the input timestamps either are borked to begin with, or get borked by ffmpeg.c
[17:03:31 CEST] <JEEB> which is why I asked you to post a limited log of ffmpeg.c with that track only
[17:03:39 CEST] <JEEB> with debug_ts
[17:03:56 CEST] <Fyr> well, I pasted the complete log.
[17:04:22 CEST] <JEEB> you pasted a link to the 10MiB+ complete log with all streams on dropbox which wouldn't let me easily read it
[17:04:37 CEST] <JEEB> and I think you don't get it with curl, either, although that would be :effort: . and yes, I am lazy
[17:06:54 CEST] <JEEB> Fyr: but so far it basically looks like a broken stream in a matroska file. not sure how the flying fuck someone muxed it, of course. but it doesn't let you mux it again after you read it without fixing the timestamps
[17:07:28 CEST] <JEEB> you might have more luck with something that might fix the timestamp while you're not looking like mkvtoolnix's mkvmerge or something (if just a remux helps)
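A sketch of the remux JEEB mentions, assuming mkvtoolnix is installed; the input file name comes from Fyr's command:

    # mkvmerge rewrites the timestamps while remuxing, which may be enough to fix the track
    mkvmerge -o bob-fixed.mkv bob.mkv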
[17:25:27 CEST] <ulf`> c_14: doesn't look like I can define /dev/video0 as an input and an sdp file as an output. How would I do this? :)
[17:26:04 CEST] <JEEB> the sdp file thing is a specific parameter for the muxer
[17:26:09 CEST] <JEEB> well, "muxer"
[17:26:23 CEST] <JEEB> the output is of course something along the lines of "rtp://STUFF"
[17:26:33 CEST] <JEEB> https://www.ffmpeg.org/ffmpeg-all.html
[17:26:38 CEST] <JEEB> I recommend ctrl+F'ing that one :P
[17:27:43 CEST] <JEEB> oh, it's a global option... "fun"
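Putting the pieces together for the webcam case, as a sketch; the destination address is a placeholder and -sdp_file is the global option JEEB points at:

    ffmpeg -f v4l2 -input_format yuyv422 -video_size 640x480 -i /dev/video0 \
        -c:v libx264 -preset veryfast -tune zerolatency -pix_fmt yuv420p \
        -sdp_file stream.sdp -f rtp "rtp://192.168.1.50:5004"
    # hand stream.sdp to the receiver (ZoneMinder, ffplay, VLC, ...)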
[17:42:22 CEST] <ulf`> JEEB: jesus christ this thing is a beast
[17:42:38 CEST] <ulf`> Whoever wrote ffmpeg obviously hates humanity
[17:43:31 CEST] <JEEB> ffmpeg.c tries to offer access to as many features as possible that are in the libraries
[17:45:38 CEST] <zerodefect> I would like to use the ffmpeg C-API to generate an m3u8 playlist. I was successful yesterday using the cmd line, but now looking at the API - I'm not sure what parameters I should pass to 'avformat_alloc_output_context2()' Do I need to only give it a path to an m3u8 file?
[17:48:04 CEST] <JEEB> looking at the examples under doc/examples, yes :P
[17:48:48 CEST] <JEEB> also there's the documentation on muxing @ https://ffmpeg.org/doxygen/trunk/group__lavf__encoding.html
[17:50:09 CEST] <zerodefect> @JEEB, I'm guessing you're referring to a src file? Which one?
[17:50:23 CEST] <Xogium> c_14: btw the -srtp_out_suite I use is SRTP_AES128_CM_HMAC_SHA1_80
[17:51:16 CEST] <Xogium> with whatever base64 encoded text you like
[17:51:28 CEST] <Xogium> for -srtp_out_params
[17:51:58 CEST] <JEEB> zerodefect: I mean there's multiple :P
[17:56:03 CEST] <Xogium> at least rtp is really nicely keeping latency low instead of increasing like it did when I was using tcp. still, I missed something like 300 packets because of a small network hiccup and latency didn't increase
[18:37:29 CEST] <niko1990> Hello everyone :)
[18:54:00 CEST] <niko1990> I have a question: I'm trying to record my screen with the following command. If I run the bash script manually, everything works fine, but when I let a cron job run the bash script, it shows the following error... https://pastebin.com/wCcHaZS8
[18:55:31 CEST] <JEEB> first of all it seems like you're calling a mismatched ffmpeg.c compared to your libraries
[18:55:32 CEST] <jkqxz> "WARNING: library configuration mismatch" indicates that you have some libraries installed which don't match the binary.
[18:55:50 CEST] <JEEB> and second of all, you need to quote the 00:... stuff
[18:56:07 CEST] <JEEB> because right now it's misunderstanding 00 as the protocol :P
[18:56:34 CEST] <JEEB> although wait no. also why is -t before input?
[18:56:47 CEST] <JEEB> oh, the input is :0
[18:57:01 CEST] <niko1990> JEEB: yes :)
[18:57:19 CEST] <JEEB> can you move the -t after the input?
[18:57:38 CEST] <niko1990> yes, I'm going to give it a try =)
[18:57:59 CEST] <JEEB> also you are hiding a lot of the logs... I'd recommend -v verbose
[18:58:32 CEST] <jkqxz> Does your cron job have permission to connect to the display :0?
[19:01:22 CEST] <niko1990> with the -t ... after the -i :0 it shows the error: Requested output format 'x11grab' is not a suitable output format :0: Invalid argument
[19:02:15 CEST] <jkqxz> That's probably your library mismatch, or maybe cron is actually running a different binary entirely?
[19:02:19 CEST] <niko1990> jkqxz: hmm... I'm not sure... how can I find that out? Or how do i give permission to do that? Btw. the cron job is running as root
[19:03:24 CEST] <jkqxz> Do you get the same configuration mismatch message when you run it yourself?
[19:03:33 CEST] <jkqxz> (And version information and whatnot.)
[19:08:08 CEST] <niko1990> Hmm... yes I get the same mismatch... This is the output...
[19:08:11 CEST] <niko1990> https://pastebin.com/np1g1RJs
[19:18:35 CEST] <jkqxz> "xhost +" disables all access control in X and allows anyone to connect.
[19:18:45 CEST] <jkqxz> But you should probably fix the configuration thing first.
[19:20:22 CEST] <niko1990> jkqxz: hmm... I just found out, when I'm running the command with sudo in front, I can run it manually and get the same error. So It must be, because root is running it...
[20:17:41 CEST] <niko1990> jkqxz: I managed to fix my config problem :) But I'm still having problems with the xhost + command... How do I use it the right way? I tried putting it into my bash script before the ffmpeg command, like this: xhost +root. It gives me the error "xhost: unable to open display ""
[20:18:33 CEST] <jkqxz> You need to run it, as you, inside the X session you are going to capture.
[20:19:06 CEST] <jkqxz> Then later the command run from cron will be allowed to connect to it.
[20:21:40 CEST] <niko1990> jkqxz: Oh... But that would mean that I can't do it automatically with only a cron job... Is there another way of dealing with this problem?
[20:25:03 CEST] <jkqxz> Run the cron job as your own user? Normally the access control of X prevents other users from connecting.
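A sketch of the two pieces jkqxz describes, with made-up geometry and duration; the xhost line runs once inside the X session being captured, the rest goes in the script cron runs:

    # inside your own X session (e.g. an autostart script): let root's cron job connect to :0
    xhost +SI:localuser:root

    # in the script cron runs:
    export DISPLAY=:0
    ffmpeg -v verbose -f x11grab -framerate 25 -video_size 1920x1080 -i :0 \
        -t 00:10:00 -c:v libx264 -preset veryfast "/tmp/capture-$(date +%s).mkv"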
[20:42:00 CEST] <markusl> hi
[20:43:19 CEST] <markusl> i'm using ffmpeg to generate several live streams for our university radio (mp3, ogg/vorbis, opus/webm, hls, ...) and it's working great.
[20:44:27 CEST] <markusl> now I'm revamping our metadata system. currently we're using icecast's http endpoint to update song info for the playing programme in the mp3 and ogg streams. icecast doesn't support this for webm.
[20:45:08 CEST] <markusl> is there any possible way to change metadata in a running stream (alsa input, http or chunk output)
[20:45:12 CEST] <markusl> ?
[20:53:12 CEST] <alexpigment> markusl: does opus need to be inside webm?
[20:53:19 CEST] <alexpigment> could you not just put opus in an ogg container?
[20:55:12 CEST] <markusl> hmm, I guess it doesn't need to be. right now I simply do progressive streaming to a <audio> element with opus/webm
[20:55:52 CEST] <markusl> i'm planning to implement HLS and/or DASH (opus/webm?) for browsers, so icecast won't be involved anymore
[20:56:15 CEST] <markusl> but, in browsers I don't need to care about metadata inside the stream
[20:56:49 CEST] <alexpigment> well, I know that opus is supported in several containers - mkv, mp4, mpegts, etc - so if webm is the only one giving you problems, it might be worth just using another container
[20:56:50 CEST] <markusl> so yeah, offering ogg/opus instead of webm/opus using icecast should work :)
[20:57:36 CEST] <Xogium> markusl: icecast doesn't support showing opus metadata on status page
[20:59:47 CEST] <markusl> would be nice to have the metadata in players that support it (like VLC, kodi), but otoh kodi doesn't show the ogg and mp3 metadata, maybe i have to research in which containers kodi can show/update metadata at all
[21:00:16 CEST] <Xogium> metadata should be sent to players
[21:00:32 CEST] <Xogium> but the status page just can't display those
[21:02:53 CEST] <markusl> Xogium: ok. but i have no way to update those with ffmpeg? can I maybe generate a (simple) stream w/ only metadata in a 3rd party component that I can merge with the encoded soundcard input?
[21:03:22 CEST] <Xogium> not sure tbh, used icecast only with liquidsoap
[21:04:05 CEST] <c_14> Xogium: ffmpeg running for 31 minutes with srtp, restarted client still works
[21:05:47 CEST] <markusl> afaik VLC can do this, I could try liquidsoap but I'm *very* happy with the ffmpeg encoding pipeline (just a bunch of separate ffmpeg processes fed through the same dsnoop alsa device) I've been running for the last ~year
[21:16:18 CEST] <c_14> Xogium: nvmd, set something incorrectly. trying again
[21:19:57 CEST] <Xogium> c_14: no worries :)
[22:11:20 CEST] <c_14> Xogium: yeah, the srtp implementation is broken
[22:11:31 CEST] <c_14> would require a fix that may or may not be possible in ffmpeg
[22:11:34 CEST] <c_14> will have to check
[22:11:47 CEST] <Xogium> :'(
[22:11:59 CEST] <Xogium> at least it's confirmed.. :D
[22:12:35 CEST] <Xogium> very bad, I was planning to regularly connect from outside
[22:19:22 CEST] <c_14> Or rather, the ffmpeg implementation is just defective and not complete
[22:19:34 CEST] <Xogium> that's what I guessed
[22:19:35 CEST] <c_14> Since it doesn't actually handle the offer/answer part
[22:20:14 CEST] <Xogium> I guess there' no other way to secure rtp ?
[22:20:18 CEST] <Xogium> there's
[22:36:52 CEST] <c_14> VPN
[22:37:00 CEST] <c_14> ?
[22:37:23 CEST] <Xogium> at home that is ?
[22:37:49 CEST] <c_14> Yeah
[22:38:38 CEST] <Xogium> hmm, never set that up but I'll look into it, being on a home-made router
[00:00:00 CEST] --- Thu Oct 19 2017