[Ffmpeg-devel-irc] ffmpeg.log.20170107
burek
burek021 at gmail.com
Sun Jan 8 03:05:01 EET 2017
[00:06:06 CET] <adayzdone> furq How would I redirect the details of the error to a log file?
[00:07:01 CET] <furq> ffmpeg 2>error.log
[00:07:06 CET] <furq> or ffmpeg -report error.log
[00:07:15 CET] <furq> -report gives you the debug log, though, which is massive
[00:07:32 CET] <furq> so probably ffmpeg -v error 2>error.log
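[Editor's note: a minimal sketch of the redirection forms furq describes, with a stand-in command in place of ffmpeg — ffmpeg writes its log to stderr, so a plain `>` would miss it:]

```shell
# Stand-in for an ffmpeg run: data on stdout, diagnostics on stderr.
sh -c 'echo "frame data"; echo "error: bad input" >&2' >out.dat 2>error.log
# error.log now holds only the stderr diagnostics; out.dat holds stdout.
```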
[00:09:17 CET] <adayzdone> furq I am trying to do something along the lines of
[00:09:20 CET] <adayzdone> if usr/local/bin/ffmpeg -y -i 'input.MP4' -s hd480 -c:v libx264 -crf 23 -c:a aac -strict -2 'output.MP4' > 'log.txt'; then echo Success >'logpass.txt'; else echo Failure >'logfail.txt'; fi;
[00:15:26 CET] <furq> replace > log.txt with 2> log.txt
[00:16:22 CET] <kerio> or &>
[00:19:31 CET] <adayzdone> Perfect!
[00:19:33 CET] <adayzdone> Thank you
[00:19:44 CET] <adayzdone> both
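[Editor's note: putting furq's `2>` fix into adayzdone's conditional, with a hypothetical failing command standing in for the ffmpeg invocation:]

```shell
# 2> captures the command's stderr log; the if tests its exit status.
if sh -c 'echo "conversion failed" >&2; exit 1' 2>log.txt; then
    echo Success >logpass.txt
else
    echo Failure >logfail.txt
fi
```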
[00:25:40 CET] <faLUCE> furq: from what I see, ffserver only calls and organizes some features of ffmpeg. Then, how can I create (with ffmpeg) an http server which listens for a feed from ffmpeg and then accepts connections from clients?
[00:26:17 CET] <JEEB> there are such solutions already out there :P
[00:26:31 CET] <JEEB> both paid and free. you just search for generic streaming servers
[00:26:40 CET] <JEEB> what they take in depends on the server
[00:26:46 CET] <JEEB> some take in rtmp, some take smooth streaming
[00:27:48 CET] <faLUCE> JEEB: I know that there are generic streaming servers, but I wonder if they can be created with ffmpeg
[00:28:16 CET] <faLUCE> instead of using two different programs
[00:28:40 CET] <JEEB> generally it doesn't make any sense to try and serve your clients with ffmpeg or so, rather having ffmpeg just doing the transcoding
[00:29:03 CET] <klaxa> faLUCE: i used the ffmpeg libs to write a server https://github.com/klaxa/mkvserver_mk2
[00:29:07 CET] <klaxa> it still leaks memory though
[00:29:11 CET] <JEEB> ^and yes, there are POCs like that
[00:29:26 CET] <JEEB> and nothing stops you from making a good one with them
[00:29:36 CET] <JEEB> them = FFmpeg's libraries
[00:29:39 CET] <klaxa> yeah make it better please :P
[00:30:03 CET] <faLUCE> JEEB: the problem is that VLC is unreliable for transcoding.... but it's good for streaming. Then, should I use ffmpeg for transcoding and pipe the transcoded output to vlc ?
[00:30:06 CET] <JEEB> well there are already some solutions in the space of streaming, unfortunately
[00:30:08 CET] <klaxa> i've been working on improvements already though, nowhere near ready to be made public though (segfaults, etc)
[00:30:32 CET] <JEEB> so currently if I were to contribute anything it'd be mostly academical
[00:30:56 CET] <JEEB> also for matroska you need a JS parser for any sort of HTML5 or so playback
[00:31:06 CET] <JEEB> which together with mobile is the two major client types
[00:31:09 CET] <Mavrik> :)
[00:31:16 CET] <JEEB> *are
[00:31:27 CET] <Mavrik> So much trouble just to avoid installing another small daemon.
[00:33:11 CET] <furq> klaxa: what does that actually output
[00:33:30 CET] <klaxa> remuxed matroska stream
[00:33:37 CET] <furq> remuxed to what
[00:33:40 CET] <klaxa> to matroska
[00:34:02 CET] <furq> oh nvm i thought i read "browser" on there
[00:34:55 CET] <klaxa> i'm not a fan of video in the browser :)
[00:35:54 CET] <furq> i can contribute significant improvements
[00:35:57 CET] <furq> ...to your makefile
[00:36:33 CET] <faLUCE> I could use this other solution: ffmpeg -i 0.flv -f asf - | vlc - <---- pipe to vlc. But I need to pipe two instances of vlc with one instance of ffmpeg, which does two different encoding of the same input.... what is the syntax?
[00:37:29 CET] <klaxa> furq: i don't doubt it, not really too good with all that build environment stuff
[00:38:40 CET] <klaxa> "works on my machine"
[00:38:45 CET] <furq> i'm terrible at C but i make up for it by being good at make
[00:38:53 CET] <furq> make up for it
[00:38:59 CET] <furq> i'm terrible at other things too
[01:00:55 CET] <faLUCE> I'm really getting crazy: ffmpeg -f video4linux2 -input_format mjpeg -video_size 640x480 -i /dev/video0 -vcodec copy -f asf - | vlc <----- how can I add another pipe to another vlc in this syntax ?
[01:01:10 CET] <c_14> tee
[01:01:20 CET] <c_14> (if you want the exact same stuff twice)
[01:01:42 CET] <faLUCE> c_14: I need to make two different encodings, from the same input
[01:01:49 CET] <c_14> then just add another output
[01:01:49 CET] <faLUCE> and pipe two different vlc instances
[01:01:56 CET] <c_14> ffmpeg -i foo out.bar out.baz
[01:02:02 CET] <c_14> the options for each go before the respective output
[01:02:14 CET] <c_14> and use say pipe:3 or something or a fifo even
[01:03:38 CET] <faLUCE> c_14: I know the concept, but I don't understand the syntax :-(
[01:04:13 CET] <c_14> ffmpeg -f v4l2 -blah -i /dev/video0 -c:v copy -f asf pipe:1 -c:v wheee -foo_bar pipe:3
[01:04:14 CET] <c_14> say
[01:04:23 CET] <c_14> eeh, the second output needs a -f blah
[01:05:08 CET] <faLUCE> c_14: and then call vlc pipe:1 and vlc pipe:2 ?
[01:05:51 CET] <c_14> I'm actually not sure how shell syntax works for extra pipes, might want to use fifos for the other one
[01:06:09 CET] <klaxa> i personally would use fifos for both to avoid confusion
[01:06:11 CET] <c_14> so mkfifo foo; ffmpeg -i foo -f asf pipe:1 -f bar foo | vlc -; vlc foo
[01:06:26 CET] <c_14> (semicolon won't work, you'll need to fork)
[01:07:15 CET] <faLUCE> c_14: too many foo and bars, I'm getting confused
[01:08:24 CET] <c_14> https://pb.c-14.de/t/kng.10KF4D
[01:08:27 CET] <c_14> ^something like that
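[Editor's note: the fifo-plus-fork idea c_14 describes can be sketched with placeholder commands — printf/cat standing in for ffmpeg and vlc; the `&` is the fork:]

```shell
mkfifo stream1                  # named pipe for the second output
printf 'payload\n' >stream1 &   # writer forked into the background
cat stream1 >received.txt       # reader; opening the fifo unblocks the writer
wait                            # collect the background writer
rm stream1
```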
[01:12:03 CET] <faLUCE> let's try
[01:16:30 CET] <faLUCE> there's something wrong: mkfifo output1; mkfifo output2; ffmpeg -f video4linux2 -input_format mjpeg -video_size 640x480 -i /dev/video0 -vcodec copy -f asf output1 -c:v libx264 -f matroska output2 <----- It asks me: "File 'output1' already exists. Overwrite ? [y/N] " and then "File 'output2' already exists. Overwrite ? [y/N] "
[01:17:15 CET] <faLUCE> and the command hangs when I type yes
[01:18:54 CET] <c_14> add -y to the ffmpeg command
[01:19:16 CET] <klaxa> sounds like it tries to write files not streams though :x
[01:19:37 CET] <faLUCE> klaxa: exactly
[01:19:39 CET] <faLUCE> !
[01:19:45 CET] <c_14> It should work (at least I think I've had it work with fifos before...)
[01:20:14 CET] <faLUCE> c_14: doesn't work with -y either
[01:20:36 CET] <faLUCE> as soon as vlc tries to open the fifo, the command exits
[01:21:31 CET] <c_14> If you have bash, this should work ffmpeg -f lavfi -i testsrc -c:v ffv1 -f matroska >(vlc -) -f matroska >(vlc -)
[01:21:59 CET] <c_14> (I actually tested that, so it better)
[01:22:18 CET] <faLUCE> c_14: what is lavfi ?
[01:22:29 CET] <c_14> It's just a pseudo-input
[01:22:38 CET] <c_14> Replace it with your v4l2 settings
[01:25:52 CET] <faLUCE> c_14: wonderful, many many thanks. but what a nightmare.... nothing of that is documented (I searched a lot)
[01:26:35 CET] <c_14> the bash process substitution should be (in the bash manpage)
[01:27:14 CET] <faLUCE> c_14: thanks again
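[Editor's note: a minimal demonstration of the bash process substitution used in c_14's command. Each `>(cmd)` expands to a pathname connected to cmd's stdin; it is a bash-only feature, so the sketch invokes bash explicitly:]

```shell
bash -c '
  # tee writes its input to each >(...) consumer, mirroring
  #   ffmpeg ... -f matroska >(vlc -) -f matroska >(vlc -)
  echo "sample" | tee >(tr a-z A-Z >upper.txt) >(wc -c >bytes.txt) >/dev/null
  sleep 1   # the substituted processes finish asynchronously
'
```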
[01:42:22 CET] <faLUCE> now there is this bad consequence, c_14: when I execute that command, the ffv1 encoding takes a few seconds to start, and I see delayed video on the -vcodec copy output too
[01:42:45 CET] <faLUCE> if I pipe only the -vcodec copy video I don't see this delay
[01:43:19 CET] <faLUCE> should I use named fifos?
[01:43:22 CET] <faLUCE> and fork?
[01:53:54 CET] <c_14> won't be faster
[01:54:00 CET] <Phi_> how do I pass a CRF parameter to h264_qsv?
[01:54:01 CET] <c_14> ffmpeg writes to the individual muxers in sequence
[01:54:10 CET] <Phi_> -crf and -q are both ignored
[01:55:30 CET] <Phi_> or rather, -q is silently ignored while -crf complains it's not used for anything
[01:55:51 CET] <Phi_> Codec AVOption crf (Select the quality for constant quality mode) specified for output file #0 (.\fun.mp4) has not been used for any stream.
[01:56:38 CET] <Phi_> Stream #0:0: Video: h264 (h264_qsv) ([33][0][0][0] / 0x0021), nv12, 704x480, q=2-31, 1000 kb/s, 10 fps, 10240 tbn, 10 tbc
[01:56:53 CET] <jkqxz> -q sets CQP mode. -crf isn't an option to qsv, so indeed it is complained about.
[01:57:00 CET] <Phi_> q=2-31 stays regardless, seems to be using constant bit rate
[01:57:24 CET] <jkqxz> The RC options are just nasty because there isn't really a nice mapping from the lavc parameters.
[01:57:51 CET] <Phi_> alright, so how would I go about setting the quality via a number param?
[01:58:24 CET] <Phi_> I don't really mind if it's VBR or whatever, as long as there's a number that'll translate to quality
[01:58:31 CET] <Phi_> ideally not something as fiddly as bitrate
[01:59:16 CET] <jkqxz> You should probably just read the function at <http://git.videolan.org/?p=ffmpeg.git;a=blob;f=libavcodec/qsvenc.c#l263> and then work out how to get the libmfx mode you want from that.
[02:01:12 CET] <Phi_> roger that
[02:03:12 CET] <jkqxz> Maybe you want ICQ, if it's there? So try setting only -global_quality.
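[Editor's note: for reference, the two rate-control spellings mentioned here would look roughly like this; the quality values are illustrative, and ICQ availability depends on the libmfx version and hardware:]

```
# CQP mode via -q:
ffmpeg -i input.mp4 -c:v h264_qsv -q 25 out_cqp.mp4
# ICQ mode (if supported) via -global_quality alone:
ffmpeg -i input.mp4 -c:v h264_qsv -global_quality 25 out_icq.mp4
```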
[02:05:55 CET] <faLUCE> c_14: then, is there a way to share the same v4l input from two ffmpegs instances ?
[02:07:38 CET] <faLUCE> I think this can be a better solution: I redirect to a pipe the faster encoding as a raw copy
[02:07:51 CET] <c_14> faLUCE: ffmpeg -f v4l2 -i blah -c:v copy -f asf | tee >(ffmpeg -f asf -i pipe:0 -c:v libx264 -f matroska >(vlc -)) | vlc -
[02:08:01 CET] <c_14> I'm not sure you can nest process substitution though...
[02:15:17 CET] <faLUCE> c_14: there should be an error in the command....
[02:15:18 CET] <faLUCE> ffmpeg -f video4linux2 -y -input_format mjpeg -video_size 640x480 -i /dev/video0 -vcodec copy -f asf | tee >(ffmpeg -f asf -i pipe:0 -c:v libx264 -f matroska >(vlc -)) | vlc -
[02:15:56 CET] <faLUCE> I also created "mkfifo 0"
[02:16:07 CET] <c_14> you're missing the pipe:1 before the first |
[02:16:19 CET] <c_14> so it can't find an output file
[02:16:32 CET] <faLUCE> c_14: I copied your command
[02:16:40 CET] <c_14> Well, I messed up
[02:18:23 CET] <faLUCE> mkfifo 0; mkfifo 1; ffmpeg -f video4linux2 -y -input_format mjpeg -video_size 640x480 -i /dev/video0 -vcodedc copy -f asf -i pipe:0 | tee >(ffmpeg -f asf -i pipe:1 -c:v libx264 -f matroska >(vlc -)) | vlc -
[02:18:54 CET] <faLUCE> doesn't work either
[02:19:39 CET] <c_14> ffmpeg -f v4l2 -input_format mjpeg -video_size 640x480 -i /dev/video0 -vcodec copy -f asf pipe:1 | tee >(ffmpeg -y -f asf -i pipe:0 -c:v libx264 -f matroska >(vlc -)) | vlc -
[02:19:41 CET] <c_14> Like that
[02:20:48 CET] <Phi_> is there a way to check if ICQ is available?
[02:21:44 CET] <faLUCE> doesn't work either c_14 ... the first output starts normally and delays when the second output is open
[02:22:02 CET] <faLUCE> c_14: I think that I need a fork
[02:22:58 CET] <c_14> opens pretty instantaneously here
[02:23:15 CET] <c_14> pipe and fork are basically the same thing except that pipes also connect stdout to stdin
[02:25:45 CET] <faLUCE> c_14: I think that the right solution is to create two pipes for the "copy" codec. Then feed another instance of ffmpeg with one of them, and making it transcode.
[02:27:10 CET] <faLUCE> c_14:
[02:27:11 CET] <faLUCE> ffmpeg (copy) |______ pipe1____>encode
[02:27:18 CET] <faLUCE> |______ pipe2
[02:29:27 CET] <c_14> that's precisely what that command does
[02:29:34 CET] <c_14> tee duplicates input to every output
[02:29:52 CET] <c_14> Which in this case is the ffmpeg process and the vlc process
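[Editor's note: the shape of the command under discussion, with text tools standing in for ffmpeg and vlc — tee duplicates the stream, one copy is "transcoded" by a substituted process while the other continues down the pipe:]

```shell
bash -c '
  printf "frame1\nframe2\n" \
    | tee >(sed "s/frame/encoded/" >branch.txt) \
    | cat >direct.txt
  sleep 1   # let the substituted process flush
'
```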
[02:30:30 CET] <F00D> Anyone know of a way to alter SEI data from an h264 stream without removing all seekable frames. The current build that allows SEI removal http://forum.doom9.org/showthread.php?t=152419 is causing an mkv mux error on the altered file saying that it doesn't have seek/recovery points
[02:34:19 CET] <F00D> What I'm trying to achieve is removal of x264 metadata without having to redo multi day encodes with the flag that should keep the data from being written to the stream.
[02:34:57 CET] <faLUCE> c_14: what if I nest the pipes in this way ? ffmpeg -f video4linux2 -y -input_format mjpeg -video_size 640x480 -i /dev/video0 -vcodec copy -f asf >(vlc -) -vcodec copy -f asf >(ffmpeg -c:v libx264 -f matroska >(vlc -) - )
[02:35:46 CET] <c_14> you can try that, but I doubt it'll fix the issue
[02:36:08 CET] <faLUCE> c_14: there's an error in the syntax
[02:37:13 CET] <c_14> eh, yeah, move the >(vlc -) inside the >(ffmpeg ..) after the -
[02:38:07 CET] <faLUCE> c_14: tried that:
[02:38:08 CET] <faLUCE> ffmpeg -f video4linux2 -y -input_format mjpeg -video_size 640x480 -i /dev/video0 -vcodec copy -f asf >(vlc -) -vcodec copy -f asf >(ffmpeg - -c:v libx264 -f matroska >(vlc -) )
[02:38:23 CET] <faLUCE> but obtained error as well
[02:38:30 CET] <c_14> that just made it worse, sec
[02:38:44 CET] <c_14> >(ffmpeg -i - -c:v libx264 -f matroska >(vlc -))
[02:42:27 CET] <panda81> Hi, a movie I made using ffmpeg can't play in QuickTime even though its codec is h264?
[02:42:45 CET] <c_14> probably pixel format
[02:42:52 CET] <c_14> or bit depth
[02:43:30 CET] <panda81> The command I used is ffmpeg -framerate 24 -i sequence_texture%07d.png output.mp4 -vf format=yuv420p -c:v libx264
[02:44:09 CET] <c_14> can you pastebin an fprobe of the output?
[02:44:36 CET] <c_14> *ffprobe
[02:45:57 CET] <panda81> sure https://justpaste.it/125mq
[02:46:15 CET] <c_14> >yuv444p
[02:46:17 CET] <c_14> yup
[02:46:36 CET] <c_14> the format filter got overridden apparently
[02:46:41 CET] <c_14> try -pix_fmt yuv420p
[02:51:05 CET] <faLUCE> c_14: I'm near the solution: ffmpeg -f video4linux2 -y -input_format mjpeg -video_size 640x480 -i /dev/video0 -vcodec copy -f asf >(vlc -) -vcodec copy -f asf >(ffmpeg -i - -c:v ffv1 -f matroska out.avi) <---this works
[02:51:33 CET] <faLUCE> c_14: this DOESn't : ffmpeg -f video4linux2 -y -input_format mjpeg -video_size 640x480 -i /dev/video0 -vcodec copy -f asf >(vlc -) -vcodec copy -f asf >(ffmpeg -i - -c:v ffv1 -f matroska >(vlc -I dummy -))
[02:51:40 CET] <panda81> c_14: tried and ffprobe still lists yuv444p. What's overwriting it?
[02:52:08 CET] <c_14> faLUCE: error message?
[02:52:19 CET] <c_14> panda81: can you pastebin the command and output you're using for the conversion?
[02:52:20 CET] <faLUCE> [00007f1c04001128] core stream error: cannot pre fill buffer
[02:52:45 CET] <faLUCE> (from vlc)
[02:52:49 CET] <c_14> faLUCE: that seems like some sort of vlc'ish error. Not sure what to do about it
[02:52:55 CET] <faLUCE> c_14: yes
[02:53:11 CET] <faLUCE> c_14: probably, vlc must be launched after a while
[02:53:34 CET] <faLUCE> so I should use named pipes, but I tried lot of syntaxes
[02:54:54 CET] <faLUCE> c_14: mkfifo foopipe; ffmpeg -f video4linux2 -y -input_format mjpeg -video_size 640x480 -i /dev/video0 -vcodec copy -f asf >(vlc -) -vcodec copy -f asf >(ffmpeg -i - -c:v ffv1 -f matroska pipe:foopipe) <---- it writes all the stream bytes on the screen
[02:55:38 CET] <panda81> c_14: figured out. All pixel format options should precede the name of the output
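[Editor's note: so the working form of panda81's command keeps every output option ahead of the output name — options placed after it no longer apply to that output:]

```
ffmpeg -framerate 24 -i sequence_texture%07d.png \
       -vf format=yuv420p -c:v libx264 output.mp4
```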
[02:56:07 CET] <c_14> faLUCE: pipe: only accepts numbers corresponding to file descriptors
[02:58:08 CET] <faLUCE> c_14: then you used 0 as stdout and 1 as stdin ?
[02:58:08 CET] <faLUCE> but I can use a named pipe
[02:58:18 CET] <c_14> other way around, 1 is out 0 is in
[03:00:48 CET] <faLUCE> c_14: then, mkfifo foopipe; ffmpeg -f video4linux2 -y -input_format mjpeg -video_size 640x480 -i /dev/video0 -vcodec copy -f asf >(vlc -) -vcodec copy -f asf >(ffmpeg -i - -c:v ffv1 -f matroska pipe:0 foopipe) <--- is this the right syntax?
[03:05:08 CET] <c_14> you could try >(ffmpeg -i - -c:v ffv1 -f matroska pipe:1 > foopipe)
[03:08:05 CET] <faLUCE> :-(
[03:11:12 CET] <faLUCE> c_14: made progress: I missed -y in the second ffmpeg instance
[03:26:21 CET] <faLUCE> let's reboot
[03:33:22 CET] <faLUCE> made another progress: mkfifo foopipe.avi; ffmpeg -f video4linux2 -y -input_format mjpeg -video_size 640x480 -i /dev/video0 -vcodec copy -f asf >(vlc -) -vcodec copy -f asf >(ffmpeg -i - -y -c:v ffv1 -f matroska pipe:0 foopipe.avi ) <---- it starts the video, but after 3 seconds it hangs. If I don't use the named pipe, and use "out.avi" instead, it works
[03:33:24 CET] <faLUCE> c_14:
[03:33:51 CET] <faLUCE> so the problem is in the pipe
[03:34:12 CET] <c_14> If there's nothing reading from the pipe it will stall when the buffer is full
[03:34:32 CET] <faLUCE> c_14: exactly
[03:34:53 CET] <faLUCE> then I should attach vlc to the pipe before I execute the command
[03:35:00 CET] <c_14> probably
[03:59:03 CET] <faLUCE> c_14: you know how I solved? ffmpeg -f video4linux2 -y -input_format mjpeg -video_size 640x480 -i /dev/video0 -vcodec copy -f asf >(vlc -) -vcodec libx264 -tune zerolatency -r 30 -b 100k -f mpegts udp://localhost:1234
[03:59:18 CET] <faLUCE> one pipe and one udp streamer ;-)
[04:01:04 CET] <faLUCE> pipe for raw stream, udp for encoded stream
[06:48:19 CET] <fling> Is there a proper order for video streams in a container? Should they all go prior audio streams?
[06:54:48 CET] <fling> Which codec to use for a seekable low fps stream?
[07:00:03 CET] <furq> anything
[07:00:06 CET] <furq> just set a low gop length
[07:00:21 CET] <furq> -g 10 or whatever
[07:02:16 CET] <fling> furq: fps is 1/5
[07:04:15 CET] <furq> good job i said "or whatever" then
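[Editor's note: concretely, for a 1/5 fps stream (codec, input, and numbers illustrative): at 0.2 fps a GOP of 12 frames means a keyframe every minute, and a smaller -g gives finer seeking at some cost in compression:]

```
ffmpeg -framerate 1/5 -i in%04d.png -c:v libx264 -g 12 out.mkv
```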
[07:16:17 CET] <fling> From the compatibility perspective is it a good idea to put all the video streams first and then audio streams or not?
[07:17:43 CET] <furq> i guess
[09:21:53 CET] <thebombzen> fling: it shouldn't matter but most players will play the first video and the first audio stream they find by default
[09:22:12 CET] <thebombzen> some bad players might cough up a file where the audio stream is first
[09:22:39 CET] <thebombzen> but anything reasonable won't care. but yes it's more common to put video first then audio then subtitle
[09:39:17 CET] <fling> ok
[09:49:57 CET] <nate> Is ffmpeg fully MP4 friendly out of the box? I believe in the past one had to install some x264 extras, that still the case?
[09:50:44 CET] <c_14> ffmpeg out of the box will decode pretty much anything you throw at it including H.264 in mp4
[09:51:13 CET] <c_14> If you want to _encode_ H.264 video you will, however, need ffmpeg linked against libx264 (which pretty much any build you get your hands on will)
[09:51:18 CET] <F00D> If only it supported bt2020ncl
[09:51:18 CET] <nate> Nice, cool. Was just making sure, unfortunately on an older ubuntu that doesn't have a packaged ffmpeg
[09:51:30 CET] <nate> Neg no encoding, I'm pretty much just looking at it for reading to generate some thumbnails
[09:52:14 CET] <nate> apparently need yasm first though, whoops
[09:52:22 CET] <c_14> If you're (for whatever reason) stuck on an older Ubuntu/debian you can try a static build
[09:52:27 CET] <c_14> http://johnvansickle.com/ffmpeg/
[09:52:33 CET] <c_14> should work out of the box
[09:52:35 CET] <c_14> no compiling
[09:52:56 CET] <nate> I'm just building from source at the moment, it's just been a while since I had, thought I remembered having to do some libx264 stuff to even read MP4's properly in the past
[09:56:23 CET] <c_14> nah, shouldn't need that
[10:16:52 CET] <cq1> nate: mp4 is a container
[10:19:31 CET] <nate> cq1: Yeah but the majority of MP4's tend to be H.264 based in my experience :P
[10:19:39 CET] <nate> especially online <video> oriented ones
[10:19:56 CET] <kerio> with faststart!
[10:20:25 CET] <kerio> or possibly even fragmented isobmff
[10:21:31 CET] <nate> Woosh, so many warnings in compile, lol. No fatals though so, good I think
[11:19:52 CET] <wouter> how can I tell ffmpeg to add an empty audio track to a newly-created video?
[11:21:31 CET] <wouter> I'm currently running "ffmpeg -loop 1 -i foo.png -c:v libx264 -pix_fmt yuv420p -r 25 -frames:v 125 output.ts"
[11:21:43 CET] <wouter> which generates a five-second static image as a preroll to a video
[11:22:13 CET] <wouter> but since there's no audio, when I concatenate that with the actual content, the first five seconds of the video have audio but show the preroll rather than the actual content...
[11:22:23 CET] <wouter> which is not what I want
[12:14:27 CET] <squ> you can add it later
[12:14:42 CET] <squ> with -c:v copy
[12:15:19 CET] <squ> generate big file with other software, add it later with -shortest flag :)
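[Editor's note: an alternative sketch that adds the silent track in the same command as the preroll, using the anullsrc lavfi source; sample rate and channel layout are illustrative:]

```
ffmpeg -loop 1 -i foo.png -f lavfi -i anullsrc=r=48000:cl=stereo \
       -c:v libx264 -pix_fmt yuv420p -r 25 -frames:v 125 \
       -c:a aac -shortest output.ts
```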
[12:24:56 CET] <faLUCE> hello: ffmpeg -f video4linux2 -y -input_format mjpeg -video_size 640x480 -i /dev/video0 -vcodec libx264 -r 30 -b 100k -tune zerolatency -x264opts keyint=25 -preset ultrafast -f asf >(vlc -) <---- the encoded output piped to vlc is delayed by 2 seconds. How can I remove this delay? Should I change some buffer size?
[15:40:55 CET] <William_> Hi. Are there any FFmpeg developers or employees here?
[15:41:28 CET] <BtbN> employees oO
[15:41:37 CET] <William_> Does it look like the following product may violate the FFmpeg license? http://sine.ni.com/nips/cds/view/p/lang/en/nid/213420
[15:42:06 CET] <William_> According to the program VI Package Manager, "m4LabVIEW library adds selected functionality from FFmpeg project to LabVIEW."
[15:42:22 CET] <William_> At the bottom: "Pricing Developer license 949 USD Runtime license 19 USD"
[15:43:24 CET] <BtbN> Using an lgpl build of ffmpeg is perfectly fine for basically anything, if they don't link statically.
[15:44:29 CET] <DHE> as I understand it, you could always invoke the command-line version of ffmpeg and feed it from pipes. that would basically dance around a lot of licensing issues...
[15:45:02 CET] <DHE> since it means you're not directly including ffmpeg, it would be upgradable (with some limits) etc...
[15:45:27 CET] <DHE> this is why lawyers get a lot of money for handling this sort of thing
[15:45:30 CET] <BtbN> Still can't too easily redistribute a GPL build of it in that case iirc
[15:46:03 CET] <DHE> ... true...
[15:49:19 CET] <William_> I do see within the file which is hosted by National Instruments at http://ftp.ni.com/evaluation/labview/lvtn/vipm/packages/kv_lib_m4labview/kv_lib_m4labview-2.0.0.23.vip that it has a precompiled copy of FFmpeg .exe and .dll package.
[15:49:51 CET] <William_> Due to the nature of LabVIEW I suspect it is probably using the .dll files, so the FFmpeg project would want to check for it being compiled in GPL mode.
[15:50:31 CET] <William_> If you don't have LabView or VI Package Manager, the .vip file will open in 7zip, as it is just a renamed .zip file with metadata.
[15:52:08 CET] <DHE> seems like a lot of work to check that the license is being followed without any reason to assume wrong-doing. there's an expectation that people follow the rules to begin with
[16:00:09 CET] <freebird> hello
[16:07:36 CET] <William_> It appears that plugin uses a DLL file called FFMPEG4LabView.dll in order to communicate with the LabVIEW environment.
[16:09:39 CET] <freebird> hello anyone is familiar with ffmpeg with decklink cards?
[16:10:26 CET] <William_> It is 2.45MB in size and there do not appear to be any LoadLibrary or static .dll references to any of the FFmpeg DLLs, or to either of the FFmpeg .exes; the DLL may be obfuscated to discourage reverse-engineering.
[16:11:18 CET] <JEEB> freebird: I haven't had the possibility to test the interface yet but there's stuff for the blackmagic stuff in there if you look at the ffmpeg-all.html page in documentation
[16:18:19 CET] <freebird> thanks jeeb.. i have compiled ffmpeg to use decklink cards and that's quite fine... my problem is that i could not send video to the board... unable to find a suitable format
[17:40:47 CET] <MSG|Maverick> anyone have some examples of how to encoder hdr video for uploading to youtube?
[17:40:52 CET] <MSG|Maverick> *encode
[17:42:33 CET] <JEEB> https://support.google.com/youtube/answer/7126552?hl=en :P
[17:42:55 CET] <MSG|Maverick> I mean an example ffmpeg command line
[17:43:29 CET] <JEEB> well that depends on your input colorspace
[17:43:37 CET] <JEEB> color primaries and transport function
[17:44:06 CET] <MSG|Maverick> hm, too bad I have almost no idea what those would be
[17:44:22 CET] <JEEB> ...
[17:44:32 CET] <JEEB> well, if you have a HDR thing waiting as your input
[17:44:36 CET] <JEEB> you should know what the fuck it contains
[17:46:06 CET] <MSG|Maverick> what if I wanted to upconvert a standard sdr image/video to hdr?
[17:46:36 CET] <JEEB> use the zscale filter that utilizes the zimg library
[17:46:45 CET] <JEEB> but why the flying fuck would you want to do that...
[17:47:01 CET] <JEEB> but you should still know your input colorspace
[17:47:10 CET] <JEEB> color primaries and transfer function
[17:47:20 CET] <JEEB> because you can't do X=>Y without knowing X
[17:50:01 CET] <MSG|Maverick> I'm converting tga images (24 or 32 bit) into a video
[17:51:09 CET] <JEEB> that doesn't change the fact that you would have to know what the fuck those contain
[17:51:20 CET] <JEEB> is it sRGB? is it adobe RGB? is it something else?
[17:51:51 CET] <JEEB> but yeah, as long as you know the input colorpsace
[17:51:55 CET] <JEEB> *colorspace
[17:51:59 CET] <JEEB> you can convert it to BT.2020+SMPTE ST.2084
[17:52:18 CET] <JEEB> and then you just have to make sure you have that written in the output video stream written by libx264
[17:54:07 CET] <MSG|Maverick> yeah, I haven't really looked at the colorspace, but for shits'n'giggles let's assume it is sRGB
[17:54:36 CET] <MSG|Maverick> given that, can you give me an idea of what the command line would look like using the zscale filter you mentioned?
[17:54:50 CET] <JEEB> https://www.ffmpeg.org/ffmpeg-all.html#zscale
[17:55:34 CET] <JEEB> actually not sure if zscale supports anything else than sRGB input
[17:56:28 CET] <JEEB> but then for output you want to set primaries to "2020", transfer to "smpte2084" and matrix to "2020_ncl"
[17:56:38 CET] <JEEB> I think range by default for YCbCr is limited
[17:56:59 CET] <JEEB> and then you add a ",format=yuv420p10" or something after the zscale
[17:57:08 CET] <JEEB> that sets the colorspace that zscale will output to
[17:57:16 CET] <MSG|Maverick> gotcha
[17:57:31 CET] <JEEB> then you will have to run that command initially with -v debug and make sure that swscale does no conversions there
[17:57:42 CET] <JEEB> also you will have to be using 10bit libx264
[17:58:01 CET] <JEEB> because the wider range means that you can't just stick it into 8bit well enough
[17:58:16 CET] <MSG|Maverick> yeah, I already compiled the 10bit version, so I'm good there
[17:58:46 CET] <JEEB> the main thing is to make sure that there's no swscale anywhere in your encoding chain
[17:58:55 CET] <MSG|Maverick> ok, I'll keep an eye out for that
[17:59:07 CET] <MSG|Maverick> the youtube page recommends setting some metadata; will zscale take care of any of that?
[17:59:09 CET] <JEEB> at most there might be an RGB->planar RGB conversion
[17:59:14 CET] <furq> wait does this mean youtube supports yuv420p10le
[17:59:19 CET] <JEEB> furq: as input yes
[17:59:22 CET] <MSG|Maverick> yeah
[17:59:29 CET] <JEEB> HDR requires 10bit encoding
[17:59:32 CET] <JEEB> at least
[17:59:41 CET] <furq> surely it needs to keep it 10-bit for hdr though
[17:59:52 CET] <JEEB> for HDR yes, they encode 8bit for SDR transcodes
[17:59:53 CET] <furq> although i'm guessing it doesn't unless you have the metadata set
[17:59:58 CET] <JEEB> yea
[18:00:03 CET] <furq> fun
[18:00:13 CET] <JEEB> MSG|Maverick: unfortunately IIRC the libx264 wrapper doesn't set the metadata
[18:00:19 CET] <JEEB> so you will have to use -x264params
[18:00:23 CET] <JEEB> to set the values
[18:00:25 CET] <MSG|Maverick> I was afraid you'd say that
[18:02:21 CET] <JEEB> -x264-params thankfully takes whatever libx264 (the actual library) takes in
[18:02:31 CET] <JEEB> unlike -x264-opts which has its own key-value list IIRC
[18:02:51 CET] <JEEB> as in, the first one passes things straight to "x264_param_parse"
[18:04:09 CET] <faLUCE> I'm still fighting with this issue: ffmpeg -f video4linux2 -y -video_size 640x480 -i /dev/video0 -vcodec mpeg4 -f asf >(./ffplay -) <---- why is the piped encoded video delayed by 3 seconds? Is there a way to avoid that?
[18:04:12 CET] Action: MSG|Maverick is looking up what metadata keys need passing
[18:05:35 CET] <JEEB> -x264-params "colorprim=bt2020,transfer=smpte2084,colormatrix=bt2020nc"
[18:05:38 CET] <JEEB> I think these three
[18:05:52 CET] <JEEB> see how they match what you set in zscale
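[Editor's note: putting JEEB's pieces together as one untested sketch; it assumes an sRGB input, an ffmpeg built with libzimg, and 10-bit libx264, and note that -x264-params takes colon-separated key=value pairs:]

```
ffmpeg -framerate 24 -i sequence%05d.tga \
       -vf "zscale=primaries=2020:transfer=smpte2084:matrix=2020_ncl,format=yuv420p10" \
       -c:v libx264 \
       -x264-params "colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc" \
       out.mkv
```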
[18:06:01 CET] <MSG|Maverick> thanks, yup yup
[18:06:27 CET] <MSG|Maverick> ok, well I'll give it a go and report back
[18:06:33 CET] <MSG|Maverick> thanks so much JEEB
[18:06:41 CET] <JEEB> just make sure that ffprobe then gives those three values back after encoding
[18:06:52 CET] <MSG|Maverick> ok
[18:07:06 CET] <JEEB> that would then mean that libx264 wrote the metadata
[18:07:57 CET] <JEEB> there's also some additional metadata fields which I'm not sure if libx264 supports
[18:08:06 CET] <JEEB> which says how bright max your content is
[18:08:20 CET] <JEEB> that helps players not take into account the maximum that smpte st.2084 permits
[18:08:27 CET] <JEEB> and rather the maximum in your content
[18:08:43 CET] <JEEB> because SMPTE ST.2084 goes to like 10k nits
[18:08:45 CET] <JEEB> which is fucking insane
[18:11:57 CET] <MSG|Maverick> I guess I'll have to play around with that after I get it working
[19:04:41 CET] <Mavrik> hmm, how does HDR even look like when compressed to HEVC?
[19:04:51 CET] <Mavrik> 10-bits per channel? Or just different color curve?
[19:17:23 CET] <faLUCE> does --fflags nobuffer work with x264 encoding? I tried it with mpeg4 and it is ok, but had not success with x264
[19:23:33 CET] <DHE> faLUCE: you probably want '-tune:v zerolatency'
[19:32:31 CET] <faLUCE> DHE: fmpeg -f video4linux2 -y -video_size 640x480 -fflags nobuffer -i /dev/video0 -vcodec libx264 -r 30 -b 100k -tune:v zerolatency -preset ultrafast -x264opts keyint=5 -f asf >(ffplay -) <----- same issue
[19:32:38 CET] <faLUCE> DHE: i'm getting crazy with this
[19:33:43 CET] <faLUCE> i'm using ffmpeg 2.8.7
[19:34:29 CET] <DHE> well, that is old. but do keep in mind that there is buffering by the player
[19:35:13 CET] <faLUCE> DHE: I disabled all buffers in the player
[19:35:23 CET] <faLUCE> I tried with vlc too
[19:36:59 CET] <faLUCE> I wonder if is there a way to pipe the x264 command directly from /dev/video0, avoiding to pass through ffmpeg
[19:40:27 CET] <faLUCE> the problem is that there are too many versions of ffmpeg, too many versions of vlc, too many versions of x264... In addition, they change params often, they mess up the rules etc.
[19:40:55 CET] <faLUCE> and it's becoming really impossible to have a good encoder+streamer for h264
[19:41:16 CET] <kerio> ...use the latest?
[19:41:26 CET] <faLUCE> kerio: latest of what ?
[19:41:32 CET] <kerio> ffmpeg+libx264?
[19:42:14 CET] <faLUCE> kerio: the parameter handling of ffmpeg is a nightmare.... you have to try too many options and cross your fingers
[19:43:15 CET] <faLUCE> in addition, the documentation is not up to date
[19:44:01 CET] <faLUCE> so, the only solution is to separate stuff. Now I'm figuring how to pipe x264, without ffmpeg, from v4l device
[19:51:58 CET] <DHE> so the documentation for "-force_key_frames" doesn't indicate it, but in the source code it seems to accept "source" as a value to duplicate the keyframes from the source material. is this intended?
[19:52:05 CET] <DHE> This is the behaviour I'm trying to get from ffmpeg, but it's not working properly.
[20:08:19 CET] <furq> faLUCE: why are you using asf for h264
[20:08:33 CET] <furq> i'm surprised that even works
[20:23:54 CET] <faLUCE> furq: what should I use instead? mpegts ?
[20:25:21 CET] <faLUCE> furq: I tried -f h264 and -f mpegts and have the same issue
[21:00:29 CET] <botto> Hi, I have two different H264 video files. One can be streamed to ATV3 without transcoding, the second one is always automatically transcoded. The only difference I can detect is the codec tag (AVC1 vs H264). How can I transform an H264 version into an AVC1 version best?
[21:17:43 CET] <JEEB> boo
[21:18:14 CET] <JEEB> botto: that's most likely not the reason. container or actual avc features usually are
[21:18:49 CET] <JEEB> use ffprobe on both and post their results on pastebin of your liking. then link that here
[21:23:54 CET] <botto> jeeb: right, give me a couple of seconds
[21:31:23 CET] <botto> JEEB: here are the two links: http://pastebin.com/hwikZVTF and http://pastebin.com/7634fFv8
[21:32:32 CET] <botto> the first one is the original file that is always automatically transcoded when transferred to ATV3 and the second one works perfectly - this one has been created by me via ffmpeg
[21:36:26 CET] <botto> here is a third one that works fine with ATV3: http://pastebin.com/gdCU11g2 and one that does not work fine with ATV3 and has to be transcoded: http://pastebin.com/KhAfqssf
[23:47:57 CET] <phillipk> is there a way to make sure I get the LAST frame of the input when my "-ss" value is higher than the input's duration? I have this (which works as long as I don't set -ss too high):
[23:48:21 CET] <phillipk> ffmpeg -i my.flv -vframes 1 -ss 99 output.png
[23:48:51 CET] <phillipk> it works great as long as "99" isn't higher than the my.flv's duration
[23:56:24 CET] <phillipk> maybe I need to use ffprobe
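[Editor's note: two ways to guarantee the last frame without knowing the duration in advance — -sseof seeks relative to end-of-file, or ffprobe can report the duration for an exact -ss; untested sketch:]

```
# seek to 1 second before EOF and grab one frame
ffmpeg -sseof -1 -i my.flv -frames:v 1 output.png
# or query the duration first
ffprobe -v error -show_entries format=duration -of csv=p=0 my.flv
```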
[00:00:00 CET] --- Sun Jan 8 2017