[Ffmpeg-devel-irc] ffmpeg.log.20170729

burek burek021 at gmail.com
Sun Jul 30 03:05:01 EEST 2017


[00:27:23 CEST] <LABcrab> Hi. I'm looking for help with ffmpeg2theora (.VOB to .OGV). What is the best setting? Should I simply use 640x480? Here is the command I currently use:
[00:27:33 CEST] <LABcrab> ./ffmpeg2theora --deinterlace -x 640 -y 480 /Volumes/DiscName/VIDEO_TS/VTS_01_1.VOB -o ~/output.ogv -a 7
[00:28:52 CEST] <furq> wtf how is that still maintained
[00:29:12 CEST] <LABcrab> It works for me. I'll gladly use ffmpeg2vp8 if there is such a thing.
[00:29:32 CEST] <furq> why not just use ffmpeg
[00:29:42 CEST] <furq> ffmpeg -i foo.vob out.ogv
[00:29:46 CEST] <furq> will convert to vp8/vorbis
[00:29:51 CEST] <TD-Linux> it's no longer maintained
[00:29:57 CEST] <furq> it was updated in 2016
[00:30:06 CEST] <furq> i'm not sure why
[00:30:25 CEST] <TD-Linux> I take that back, it's mildly maintained
[00:30:35 CEST] <furq> i'm amazed it was updated that recently
[00:30:40 CEST] <furq> i remember using that like 10+ years ago
[00:30:40 CEST] <LABcrab> furq, are there any reasons for me to use VP8/VP9/etc. over Theora?
[00:30:45 CEST] <furq> they're much better
[00:30:52 CEST] <furq> theora was never good and it's especially not good now
[00:31:12 CEST] <furq> you should probably also use opus over vorbis
[00:31:35 CEST] <LABcrab> Is Theora more lightweight, or is it as performance-intense as VP8?
[00:31:42 CEST] <furq> i have no idea
[00:32:08 CEST] <furq> the vp8 encoder in ffmpeg is reasonably quick iirc
[00:32:17 CEST] <furq> i take it you specifically want license-free codecs
[00:32:31 CEST] <LABcrab> This is my player: https://BarlowGirl.ca/wp-content/BigShinyPlanet/BigShinyPlanet.html
[00:32:41 CEST] <LABcrab> I would prefer license-free.
[00:33:24 CEST] <LABcrab> The video is uploaded at 720x480, but I'm thinking of reuploading it at 640x480, and with deinterlacing if absent.
[00:33:31 CEST] <furq> vp8/webm is more widely supported than theora/ogv now anyway
[00:33:49 CEST] <furq> edge and android chrome support webm but not ogv
[00:34:02 CEST] <furq> safari supports neither but then safari doesn't support anything
[00:34:14 CEST] <furq> unless they get mpeg-la backhanders for it
[00:34:19 CEST] <LABcrab> Android doesn't support Theora at all?
[00:34:27 CEST] <furq> http://caniuse.com/#search=theora
[00:34:29 CEST] <furq> not according to this
[00:34:57 CEST] <LABcrab> Also, should I use 720x480 (original) or 640x480 (force 4:3)?
[00:35:10 CEST] <furq> use 720x480 and set the aspect ratio
[00:35:20 CEST] <furq> -aspect 4:3 if you're not cropping
[00:36:17 CEST] <TD-Linux> LABcrab, theora is less resource intensive to decode
[00:36:20 CEST] <TD-Linux> I recommend VP9 though
[00:36:21 CEST] <furq> or uh
[00:36:24 CEST] <furq> -aspect 15:11
[00:36:26 CEST] <furq> ITU baby
[00:36:51 CEST] <LABcrab> TD-Linux Thanks. :-) Do you have a page somewhere that explains why?
[00:37:07 CEST] <TD-Linux> no but I could make one
[00:37:09 CEST] <furq> probably just that it's a much simpler codec
[00:37:31 CEST] <TD-Linux> theora is VP3++
[00:37:42 CEST] <LABcrab> VP3 with goodies?
[00:37:53 CEST] <TD-Linux> yes, a much better encoder and extra bitstream features
[00:37:54 CEST] <furq> VP3 was made public domain, theora is based heavily off it
[00:38:09 CEST] <TD-Linux> the original VP3 encoder was nearly unusably slow
[00:38:11 CEST] <furq> then eventually google bought on2
[00:39:09 CEST] <furq> but yeah vp3 is like 17 years old
[00:39:19 CEST] <LABcrab> Thanks so much for the aspect ratio trick. I typed 1.33 as this is a DVD. What does the computer do? Does it render the 720 width as 640?
[00:39:28 CEST] <furq> something like that
[00:39:38 CEST] <furq> like i said, it should be 15:11 if you're not cropping
[00:39:48 CEST] <furq> 704*480 is 4:3
[00:39:54 CEST] <furq> it's not a huge deal though
[00:40:04 CEST] <TD-Linux> you'll find VP9 encoding pretty slow, you can put higher numbers into cpu-used if it's too slow
[00:40:12 CEST] <TD-Linux> if 6 is still too slow then switch to VP8
[00:40:25 CEST] <furq> you'll want to set a bitrate as well
[00:40:25 CEST] <furq> or crf
[00:40:26 CEST] <LABcrab> I'm not dealing with VP9 yet. With TenFourFox, for example, VP9 doesn't play nice.
[00:40:39 CEST] <LABcrab> The default is quality 6.
[00:40:50 CEST] <TD-Linux> ah yeah actually if you're targeting tenfourfox you'll probably want to compare both vp8 and theora
[00:41:39 CEST] <furq> ffox 45 should have a vp9 decoder
[00:41:45 CEST] <furq> i guess maybe it was removed for ppc though
[00:42:12 CEST] <LABcrab> It is present, but either the PPC CPU is too slow or the code isn't fully optimized.
[00:42:16 CEST] <TD-Linux> more importantly it may not have altivec assembly
[00:42:24 CEST] <TD-Linux> libtheora is more likely to have altivec, let me check
[00:43:01 CEST] <furq> https://code.google.com/archive/p/tenfourfox/issues/28
[00:43:04 CEST] <furq> looks like it's being worked on
[00:43:11 CEST] <furq> or was worked on, rather
[00:43:17 CEST] <TD-Linux> yeah no there's no altivec, you'll just have to pick the one that is fastest
[00:43:21 CEST] <furq> ^
[00:43:37 CEST] <furq> that's from 2011 though so maybe it predates vp9
[00:43:48 CEST] <furq> vp8 is still ok though
[00:43:55 CEST] <TD-Linux> if VP8 is too slow, theora is a reasonable choice. there are certainly worse choices.
[00:44:24 CEST] <furq> i can't really think of any that work in web browsers
[00:44:46 CEST] <TD-Linux> yeah ok
[00:44:51 CEST] <furq> maybe sorenson spark in flv
[00:44:56 CEST] <furq> that's cheating though
[00:45:00 CEST] <LABcrab> On the files that I already have, should I copy them via ffmpeg, but add aspect 1.33?
[00:45:02 CEST] <TD-Linux> and also worse than theora
[00:45:15 CEST] <furq> LABcrab: yeah that works
[00:45:22 CEST] <furq> -c copy -aspect 15:11
[00:45:36 CEST] <furq> -aspect sets the container's dar flag, not the stream's
[00:45:49 CEST] <LABcrab> I'm thinking ffmpeg2theora doesn't have that setting, so I'd have to use ffmpeg after.
[00:46:04 CEST] <TD-Linux> you can use ffmpeg directly instead of ffmpeg2theora
[00:46:04 CEST] <furq> you should really just use ffmpeg for the whole thing
[00:46:20 CEST] <TD-Linux> ffmpeg -i file.mp4 file.ogv
[00:46:25 CEST] <LABcrab> What exactly is the difference?
[00:46:35 CEST] <furq> ffmpeg2theora predates ffmpeg supporting theora
[00:46:40 CEST] <furq> circa ~10 years ago
[00:46:46 CEST] <LABcrab> (15:11 works either way; my mistake.)
[00:46:47 CEST] <furq> but it's supported it for a long time
[00:46:47 CEST] <TD-Linux> also the CLI options are different
[00:47:26 CEST] <LABcrab> Isn't 720x480 1.5:1 though?
[00:47:50 CEST] <furq> it is if the sar is 1:1
[00:47:51 CEST] <furq> but it isn't
[00:48:35 CEST] <furq> ffmpeg -i foo.vob -c:v libvpx -b:v 0 -crf 30 -c:a libopus -b:a 128k out.webm
[00:48:37 CEST] <furq> something like that
[00:48:41 CEST] <furq> er
[00:48:45 CEST] <furq> ffmpeg -i foo.vob -c:v libvpx -b:v 0 -crf 30 -c:a libopus -b:a 128k -aspect 15:11 out.webm
[00:49:20 CEST] <furq> someone who works on libvpx can probably tell me if -b:v 0 works for vp8
[00:49:25 CEST] <furq> if only there were someone like that here...
[00:49:33 CEST] <LABcrab> Is there an article about 15:11?
[00:49:48 CEST] <TD-Linux> it does
[00:51:44 CEST] <redrabbit> is aac a fine choice for avg 120k stereo TS
[00:52:32 CEST] <redrabbit> too bad opus doesn't work with .ts
[00:53:21 CEST] <JEEB> huh
[00:53:26 CEST] <JEEB> it is defined so it might be the muxer
[00:53:39 CEST] <redrabbit> do webm video codecs compete with x265
[00:53:42 CEST] <redrabbit> maybe
[00:53:47 CEST] <JEEB> except my browser history has a patch for opus in mpegts
[00:53:53 CEST] <TD-Linux> actually I take back what I said about VP8, it uses -b:v as the max bitrate so you need to set it really high instead
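Combining furq's earlier command with TD-Linux's correction, a VP8 constrained-quality invocation might look like the sketch below. The command is only assembled and printed, not executed, and the `1G` cap is a hypothetical "effectively unlimited" value chosen for illustration, not something from the log.

```shell
# Sketch only: build the command as a string rather than running ffmpeg.
# libvpx-vp8 treats -b:v as a bitrate ceiling (it has no -b:v 0 pure-CRF
# mode like VP9), so a deliberately huge cap approximates CRF-only mode.
ffmpeg_cmd="ffmpeg -i foo.vob -c:v libvpx -crf 30 -b:v 1G -c:a libopus -b:a 128k -aspect 15:11 out.webm"
echo "$ffmpeg_cmd"
```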
[00:54:01 CEST] <TD-Linux> redrabbit, VP9 does
[00:54:50 CEST] <JEEB> redrabbit: so actually FFmpeg's muxer does support it :P http://git.videolan.org/?p=ffmpeg.git;a=commit;h=01509cdf9287b975eced1fd609a8201fbd1438e3
[00:54:57 CEST] <JEEB> since late 2015
[00:55:06 CEST] <furq> LABcrab: 704*480 is exactly 4:3 with a sample aspect ratio of 10:11
[00:55:18 CEST] <furq> 720*480 is 15:11
[00:56:21 CEST] <LABcrab> Okay. What is 4:3, then? I guess that old TVs use 720x480? What happens when a video like that is played on a 640x480 PC monitor?
[00:56:39 CEST] <furq> NTSC DVDs are always 720*480
[00:56:58 CEST] <furq> SD NTSC broadcast is 704*480 iirc
[00:58:27 CEST] <furq> there's some long and tortuous history behind this that someone in here could probably get into
[00:58:55 CEST] <LABcrab> Yeah, there is a gremlin out there who made videos 720x480, maybe.
[00:58:56 CEST] <furq> i just ended up writing a script that gives me the correct dar and decided not to care about why it was correct
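furq's SAR arithmetic can be verified with integer cross-multiplication: DAR = (width × sar_num) : (height × sar_den), using the NTSC DV/DVD sample aspect ratio of 10:11. A small shell check (the helper name is ours, not from the log):

```shell
# DAR = (width * sar_num) : (height * sar_den); SAR is fixed at 10:11 here.
# Cross-multiply so no floating point is needed.
check_dar() {
  w=$1; h=$2; dar_n=$3; dar_d=$4
  if [ $(( w * 10 * dar_d )) -eq $(( h * 11 * dar_n )) ]; then
    echo "${w}x${h} @ SAR 10:11 -> DAR ${dar_n}:${dar_d}"
  else
    echo "${w}x${h} @ SAR 10:11 -> NOT ${dar_n}:${dar_d}"
  fi
}
check_dar 704 480 4 3    # 704*10*3 = 21120, 480*11*4 = 21120
check_dar 720 480 15 11  # 720*10*11 = 79200, 480*11*15 = 79200
```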
[00:59:40 CEST] <voip_> hello guys, i need to compile/configure ffmpeg with h264_qsv support on centos7. where can i find a step-by-step guide?
[01:00:21 CEST] <TD-Linux> furq, even worse 704x486
[01:00:35 CEST] <TD-Linux> (NTSC is analog, so we're going with SDI resolution here)
[01:02:57 CEST] <LABcrab> I'll send my link again.
[01:03:37 CEST] <LABcrab> https://BarlowGirl.ca/wp-content/BigShinyPlanet/BigShinyPlanet.html
[01:03:51 CEST] <LABcrab> Should I change my fixed width and height on the player so that it reflects 15:11?
[01:06:15 CEST] <furq> should be 654*480 i guess
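furq's 654 comes straight from the DAR: display width = height × 15 / 11, truncated to an integer. A quick check:

```shell
# Display width for a 480-tall player box at DAR 15:11.
h=480
w=$(( h * 15 / 11 ))   # integer truncation: 7200 / 11 = 654 (remainder 6)
echo "player box: ${w}x${h}"
```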
[01:15:18 CEST] <redrabbit> ill try to use opus again
[01:41:19 CEST] <LABcrab> What is the link to the website listing which browsers support which codecs?
[01:42:08 CEST] <voip_>  guys, i need to compile/configure ffmpeg with h264_qsv support on centos7. where can i find a step-by-step guide?
[01:44:34 CEST] <furq> LABcrab: http://caniuse.com/
[01:44:41 CEST] <LABcrab> Thanks. :-)
[01:54:39 CEST] <LABcrab> The video file displays at ~640x480, so to furq and the Linux user, I'm really grateful for your help. The settings you gave are correct. :-)
[02:12:13 CEST] <redrabbit> strange, i got the dll from https://opus-codec.org/downloads/ in the same folder as ffmpeg and -c:a libopus -b:a 128k -ac 2 -async 1 outputs a video-only stream
[02:12:40 CEST] <redrabbit> it's with the dvbviewer iptv media server, i don't know where i can check the logs
[02:18:36 CEST] <redrabbit> -c:a aac works fine
[02:30:35 CEST] <redrabbit> do you guys recommend -cutoff ? i use 15000 atm
[10:47:45 CEST] <dorvan> hi all
[10:49:31 CEST] <dorvan> i'm trying to make an html live stream from a network camera stream acquired with ffmpeg; how can i make a socket buffer so the video is really "live"?
[12:10:04 CEST] <q3cpma> Hello, I have a few questions. Is the native AAC encoder (as of 3.3.2) really as good as fdk? Is there a way to convert FLAC replay gain metadata to AAC gain during conversion?
[12:48:51 CEST] <j605> I am trying to play an hls stream using mpv, I ran into this bug:https://trac.ffmpeg.org/ticket/4275
[12:49:20 CEST] <j605> logs are taken by using mpv: https://0x0.st/kZ_.txt
[12:49:31 CEST] <j605> problem starts at: [ 187.935][e][ffmpeg/audio] aac: channel element 1.0 is not allocated
[12:49:55 CEST] <durandal_1707> and how that is relevant?
[12:53:27 CEST] <j605> I was able to reproduce it with: youtube-dl --hls-prefer-native -o - http://www.sonyliv.com/details/live/5519365900001/LIVE---Sri-Lanka-vs-India---1st-Test---Day-4---29th-July-2017 | ffplay -
[12:58:34 CEST] <j605> logs with ffplay: https://0x0.st/kZL.log
[13:05:33 CEST] <j605> durandal_1707: is this relevant?
[13:49:04 CEST] <j605> compiling without libfdk-aac fixes the issue
[14:33:13 CEST] <dorvan> hi all, i'm trying to make an html live stream from a network camera stream acquired with ffmpeg; how can i make a socket buffer so the video is really "live", without a server?
[15:36:15 CEST] <DHE> dorvan: you need a server of some kind to serve it to multiple users, unless maybe you only have 1 user. even if it's just running HTTP (like apache or IIS)
[15:37:32 CEST] <dorvan> DHE: as the output of the ffmpeg conversion of the source stream?
[15:46:12 CEST] <DHE> dorvan: are you referring to HLS when you say Html live streaming?
[15:46:19 CEST] <DHE> (it actually stands for http live streaming)
[15:47:33 CEST] <dorvan> DHE: yes, I know, HLS, or MPEG-DASH, or simple the play of h264 datastream read by a socket...
[15:48:36 CEST] <DHE> HLS and MPEG-DASH allow you to stream using only an ordinary HTTP server, but the latency involved tends to be high. I'd assume 10 seconds best case
[15:48:44 CEST] <dorvan> netcam (rtsp/rtmp) -input-> ffmpeg acquisition -output-> (something) -> HLS
[15:49:24 CEST] <DHE> I'm running that myself
[15:49:53 CEST] <dorvan> DHE: i need to permit viewing of the netcam's live stream channel in a web html5 application frontend, with as little I/O as possible.
[15:50:49 CEST] <dorvan> a "gw" for the source stream to be viewed from an html5 page would also be good.
[15:50:57 CEST] <DHE> if /var/www/html is your HTTP serving directory (default on some distros) you can run: ffmpeg -i rtsp://....../source -c copy -f hls -hls_list_size 5 -hls_time 4 -hls_flags +delete_segments /var/www/html/mystream.m3u8   # assumes H264 camera source and AAC or AC3 audio
[15:51:15 CEST] <DHE> you can replace -c copy with whatever codec options you want to adjust bitrates or whatever
[15:51:24 CEST] <DHE> HLS.js can give you a pure javascript video player
[15:51:54 CEST] <dorvan> thanks for the cmdline, i need that!!!
[15:52:24 CEST] <dorvan> HLS has problems with the angularjs we use for the gui, we'll have to review..
[15:52:51 CEST] <dorvan> so i can point to the file path of the m3u8 file?
[15:53:07 CEST] <DHE> basically
[15:53:25 CEST] <DHE> you can test with VLC using Open Network Source and put in the URL to the m3u8 file
[15:54:05 CEST] <dorvan> great! So HLS can play it without a backend server?
[15:54:32 CEST] <JEEB> the thing you are writing the playlists and segments to is a server
[15:54:45 CEST] <JEEB> and you are using a httpd to serve those files
[15:54:48 CEST] <DHE> you need an HTTP server. I'm assuming that's rather easy to obtain
[15:55:00 CEST] <dorvan> DHE: obviously :-D
[15:55:13 CEST] <JEEB> DHE: also technically the minimal latency for HLS is 3xsegment length
[15:55:32 CEST] <dorvan> DHE: but you don't need a repeater like nginx + an rtsp module as a backend for live
[15:55:35 CEST] <DHE> JEEB: yes, but people like concrete numbers. :)
[15:55:40 CEST] <JEEB> so if you do 0.5 second segments then you can get it down to 1.5s f.ex.
[15:56:04 CEST] <JEEB> of course that is on top of your encoder delay etc
[15:56:11 CEST] <DHE> JEEB: maybe, but that assumes no encoder latency and such a tiny segment length is kinda nuts
[15:56:20 CEST] <JEEB> I don't disagree :D
[15:56:45 CEST] <JEEB> DASH actually lets you make segments without a RAP in the beginning so you could separate things into multiple segments if you require short segments
[15:56:50 CEST] <dorvan> DHE: we have to tune it... :-)
[15:57:08 CEST] <JEEB> but generally I don't remember what the MPEG DASH playback buffering spec was
[15:57:31 CEST] <dorvan> thanks guys
[15:58:00 CEST] <dorvan> JEEB: it's on the client side, in the view
[15:58:14 CEST] <JEEB> I'm pretty sure we're talking about different things :P
[15:59:25 CEST] <JEEB> there should be some specification on how MPEG-DASH players should handle live streams and buffering. just like the HLS spec notes "player has to buffer the last three segments". that controls the minimum latency you can do by poking at the HLS side of things (as opposed to the encoding part before that)
[16:00:08 CEST] <dorvan> ah ok...
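JEEB's rule of thumb (the player buffers the last three segments) makes the latency floor easy to compute; encoder delay and network jitter come on top. A small table for a few hls_time values:

```shell
# Minimum playlist-side HLS latency = 3 * hls_time, per the spec's
# three-segment buffering requirement; encoder delay is extra.
for seg in 4 2 1; do
  echo "hls_time=${seg}s -> floor $(( 3 * seg ))s"
done
```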
[16:02:03 CEST] <dorvan> how can i tune the format to be "mobile" compatible with as little I/O as possible? the source and converted video are already 30fps 4MP h264+aac in an mp4 container, but i have problems viewing it in "android" device browsers (all of them)
[16:04:47 CEST] <dorvan> about recordings, not live
[16:06:27 CEST] <DHE> if you got live streaming working, HLS can be used in the same way for a pre-created video. set hls_list_size to 0
[16:07:22 CEST] <dorvan> DHE: but does it have to pre-load the segments?
[16:08:18 CEST] <DHE> on the server side you run ffmpeg on the original video to convert it to HLS. the client will stream the segments as needed
[16:09:19 CEST] <dorvan> DHE: so is it possible to present all recordings as one navigable (with timeline) video with HLS?
[16:09:35 CEST] <DHE> HLS supports a simple seek bar, yes
[16:10:06 CEST] <dorvan> DHE: ah ok, i have to convert videos...
[16:10:15 CEST] <dorvan> so no time metadata?
[16:10:19 CEST] <DHE> of course, a pre-recorded .mp4 video will give better efficiency if you can make use of it.  just encode it with output option -movflags faststart
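DHE's -movflags suggestion as a concrete remux sketch; the file names are placeholders and the command is only printed here, not run:

```shell
# Remux without re-encoding so the moov atom ends up at the front of
# the file, letting browsers start playback before the download ends.
remux_cmd="ffmpeg -i recording_in.mp4 -c copy -movflags +faststart recording_out.mp4"
echo "$remux_cmd"
```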
[16:12:48 CEST] <dorvan> example: our system records only on motion event triggers; a trigger launches an ffmpeg stream acquisition for recording, copying the source stream format and forwarding it as fast as possible while maintaining the frame rate.
[16:14:47 CEST] <dorvan> at the moment we record a 1 minute video for any motion event and put the event in the timeline... is it useful to use hls segments instead of mp4 and manage the recording as a sliced stream?
[16:15:05 CEST] <DHE> that's not something I've done and I don't know...
[16:15:40 CEST] <DHE> the spec does allow for some timecode information in the playlist but I'm not aware of anyone using that feature and the ffmpeg encoder does not make use of it
[16:16:16 CEST] <dorvan> ok, i think managing single mp4s is more lightweight, and more "standard" regarding timecodes
[17:35:08 CEST] <dorvan> DHE: are you there?
[17:35:20 CEST] <dorvan> JEEB: are you there?
[17:35:52 CEST] <dorvan> DHE: I've made the first try with your suggested options
[17:36:19 CEST] <dorvan> but the lag is heavy.... 15-20 secs
[17:36:42 CEST] <DHE> I did warn that 10 seconds was the minimum expected latency
[17:37:09 CEST] <JEEB> well not exactly but that's what you often end up with
[17:37:16 CEST] <DHE> reducing the hls_time to 2 or 3 might shave off some seconds
[17:37:19 CEST] <JEEB> as I said, three segments' worth
[17:37:29 CEST] <DHE> I did specify a segment size of 4 seconds
[17:37:36 CEST] <JEEB> ok
[17:37:51 CEST] <DHE> and that's not knowing the GOP size of the camera involved
[17:37:55 CEST] <JEEB> yup
[17:41:55 CEST] <dorvan> GOP?
[17:42:07 CEST] <DHE> group of pictures. the technical name for what is often called the keyframe interval
[17:42:28 CEST] <dorvan> ah ok...
[17:51:16 CEST] <dorvan> is -movflags faststart for mp4 or hls?
[17:52:38 CEST] <dorvan> ffmpeg -re -loglevel warning -i "rtmp://src" -preset ultrafast -tune zerolatency -c copy -f hls -hls_list_size 5 -hls_time 2 -hls_flags delete_segments /Data/mystream.m3u8
[17:53:02 CEST] <dorvan> DHE: do i also have to put -movflags faststart?
[17:53:51 CEST] <dorvan> JEEB: what do you think, is -re wrong in this case? rtsp instead of rtmp says it wants pthreads support and not threads support.
[17:53:52 CEST] <DHE> that's if you output to .mp4. it makes mp4 streamable
[17:54:12 CEST] <DHE> -re is for realtime reading of a file. but if the source is already going to be a realtime bottleneck, don't bother.
[17:54:32 CEST] <DHE> use -re if you're taking a pre-recorded file and want to make it into a live stream virtual source
[18:01:04 CEST] <dorvan> i've seen -re reported as "best practice" for acquisition, to force the stream copy without conversion while maintaining the framerate... is that wrong? i mean for recording (recordings are triggered by motion events)
[18:01:42 CEST] <klaxa> -re means reading in realtime
[18:02:51 CEST] <DHE> it means ffmpeg will pause running for a moment if it's receiving data faster than real-time. which is very possible reading a file from disk. not so much when receiving a live feed from a camera
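So, per DHE, -re belongs on pre-recorded inputs being replayed as a live source; a camera feed already arrives in real time. A hypothetical example (the URL and file name are placeholders; the command is only printed, not run):

```shell
# -re throttles reading of a *file* to real time, turning it into a
# virtual live source; a live camera feed does not need it.
stream_cmd="ffmpeg -re -i prerecorded.mp4 -c copy -f flv rtmp://example.com/live/stream"
echo "$stream_cmd"
```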
[18:17:28 CEST] <dorvan> DHE: i have to manage all streams as close to realtime as possible and reduce lag, and force ffmpeg to not convert or otherwise touch the input stream.
[18:18:28 CEST] <dorvan> i have to understand, considering i have 2 processes for a single stream (recording and hls), what the right option set is
[18:20:00 CEST] <dorvan> DHE: on https://trac.ffmpeg.org/wiki, -re is related to input streams and input devices... not only files
[18:38:33 CEST] <DHE> dorvan: if your source already produces realtime feeds, it's not necessary
[18:46:00 CEST] <dorvan> DHE: the netcam isn't "realtime", it's a small arm architecture at 50% cpu and memory load; a polling request can help make it more realtime.
[21:08:35 CEST] <lmao_> Trying to convert a video from QuickTime H.264 to DNxHR, but can't seem to do it.
[21:12:07 CEST] <ChocolateArmpits> lmao_, can you provide the command line and resulting log ?
[21:35:06 CEST] <timofonic> Hello
[21:37:11 CEST] <timofonic> I just have a naive question, I'm a total n00b: What does ffmpeg's framework provide? Does it support shader-based filters to be used over OpenGL/Vulkan/Direct3D?
[21:40:50 CEST] <JEEB> FFmpeg's libraries provide a framework for you to read,demultiplex,decode,filter,encode,multiplex and write multimedia things
[21:41:16 CEST] <JEEB> some HW decoders do have the capability to output into D3D surfaces but in most of the cases you are dealing with YCbCr images in RAM
[21:41:26 CEST] <JEEB> you can then upload that to your GPU and handle it however you want
[21:41:45 CEST] <JEEB> presentation of media in general isn't what FFmpeg's focus is, there are actual *player* projects that handle that
[21:42:18 CEST] <JEEB> (many players use FFmpeg's libraries for what happens before presentation)
[21:43:16 CEST] <timofonic> JEEB: I say it because many players like MPV use shaders to filter video, so the filtering work is done only by the GPU. I just wondered if there's something similar in the FFmpeg framework, providing "multiplatform shaders" that later get converted to the native OS ones (OpenGL, Vulkan, Direct3D, Metal...)
[21:43:40 CEST] <JEEB> no
[21:43:42 CEST] <timofonic> JEEB: I see
[21:44:34 CEST] <timofonic> JEEB: Is it because of project focus or because no developer provided code for it?
[21:44:45 CEST] <JEEB> mpv utilizes FFmpeg for a lot of stuff but presentation related stuff that uses opengl is 100% mpv's own stuff
[21:45:15 CEST] <JEEB> timofonic: probably closer to latter although I'm not sure how much you'd gain for filtering on GPU in non-presentation cases
[21:45:17 CEST] <timofonic> I think mpv is also able to use Direct3D stuff
[21:45:32 CEST] <JEEB> ANGLE lets you use opengl es with d3d only
[21:45:44 CEST] <timofonic> JEEB: Well, there's shaders, OpenGL and CUDA
[21:45:51 CEST] <timofonic> OpenCL CL
[21:46:01 CEST] <JEEB> yea but how useful is that is something someone would have to check
[21:46:11 CEST] <timofonic> JEEB: I see
[21:46:16 CEST] <JEEB> I know GPU processing can be very good when you don't have to take your results back at least
[21:46:33 CEST] <JEEB> but in most cases for non-presentation for FFmpeg the use case would require getting those resulting textures back to RAM
[21:46:45 CEST] <JEEB> not sure if that would be as optimal as with upload+presentation
[21:46:52 CEST] <JEEB> which is what players do
[21:47:39 CEST] <JEEB> tl;dr GPU filtering can be useful but nobody has taken enough time and effort to see if the upload and download of stuff doesn't take too much time
[21:48:21 CEST] <timofonic> JEEB: I see. Another thing I imagine is whether it would be possible to compile FFmpeg codecs to OpenCL/CUDA, making them run entirely on the GPU and not depend on the proprietary in-GPU implementations of a few codecs (mpeg, H264, H265 in newer ones...)
[21:48:53 CEST] <timofonic> JEEB: And maybe the GPU could be used to decode audio. After all, most people these days use HDMI
[21:49:23 CEST] <timofonic> And there has been an ATi Radeon API for that in the past, I think
[21:49:37 CEST] <timofonic> JEEB: I see
[21:55:29 CEST] <JEEB> timofonic: for decoding and encoding of already available formats give up - that's not what GPUs are good for
[21:55:35 CEST] <JEEB> for actual image filtering they are good
[21:56:07 CEST] <JEEB> pure GPGPU decoders and encoders are only good when the format has from the ground up been made for a very large amount of threads
[21:56:21 CEST] <JEEB> ATi back in the day showed off some nice graphs with a made-up format they made
[21:56:37 CEST] <JEEB> but of course the more threadability on that level you add, generally the less compression it will give
[22:02:16 CEST] <timofonic> JEEB: Oh, I see
[22:04:43 CEST] <JEEB> timofonic: there's a reason why the GPU manufacturers pretty much stopped feeding people the GPGPU meme with regards to video compression
[22:04:53 CEST] <JEEB> (because it just didn't work well)
[22:05:05 CEST] <JEEB> having ASICs on the board optimized for the job works much better
[22:06:29 CEST] <JEEB> the only case which was a semi-successful proof of concept was some sort of lookahead for x264 that MultiCoreWare did. the result was that the data they calculated was based on smaller frame sizes (they downscaled) and it was not necessarily faster at all compared to doing the full thing on the CPU
[22:07:19 CEST] <JEEB> and that was not actual coding, they could only manage it because the lookahead didn't have to be synchronized with the actual coding
[22:07:36 CEST] <JEEB> so it could be run in a completely separate set of threads without synchronization
[22:07:59 CEST] <DHE> sometimes it would give a +20% performance boost, sometimes it would give a -20% performance penalty...
[22:09:02 CEST] <JEEB> so basically if someone wants to do stuff with the GPU, one should focus on what GPUs are known to actually excel at
[22:18:41 CEST] <luminarys> is it normal that an avfiltergraph with basic resampling screws up pts's in the output frames?
[22:19:20 CEST] <luminarys> I'm correcting the output with av_frame_get_best_effort_timestamp, but even that gives very slightly incorrect values (it provides some duplicate pts's)
[22:20:42 CEST] <timofonic> JEEB: Nice to know
[22:20:53 CEST] <timofonic> Thanks for your masterclass :D
[23:06:43 CEST] <idklol> I'm trying to convert h.264 to dnxhd, can anyone give me the correct command? 720p 60fps.
[23:10:24 CEST] <ChocolateArmpits> idklol, use this http://www.deb-indus.org/tuto/ffmpeg-howto.htm#Encoding_VC-3
[23:15:28 CEST] <idklol> ChocolateArmpits, I get an error, "Invalid chars 'b' at the end of expression '220Mb'"
[23:16:25 CEST] <JEEB> what's the b doing there? ffmpeg.c parameters take just prefixes
[23:16:35 CEST] <JEEB> so mega-something would just be M
[23:21:45 CEST] <idklol> I get another error, "Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height".
[23:22:02 CEST] <JEEB> the actual error is usually higher up in the log
[23:22:05 CEST] <JEEB> that is the final error
[23:22:21 CEST] <JEEB> as in, whatever catches the error from the deeper place
[23:22:36 CEST] <idklol> Video parameters incompatible with DNxHD. Valid DNxHD profiles:
[23:22:41 CEST] <idklol> and lists all profiles
[23:24:43 CEST] <durandal_1707> idklol: what pixel format is your source?
[23:25:03 CEST] <durandal_1707> also what profile you need?
[23:25:42 CEST] <idklol> Pixel format? I need 720p 60FPS. I'm trying to change my codec because DaVinci Resolve doesn't accept H.264.
[23:26:41 CEST] <idklol> durandal_1707 you there?
[23:26:58 CEST] <durandal_1707> idklol: pastebin the whole ffmpeg output if you don't know a bunch
[23:27:17 CEST] <idklol> One sec
[23:29:13 CEST] <dystopia_> davinci resolve is horrible
[23:29:23 CEST] <dystopia_> i got that bundled with my capture card
[23:29:32 CEST] <idklol> https://pastebin.com/iz0z0J2k durandal_1707
[23:29:38 CEST] <idklol> what else would I use?
[23:29:40 CEST] <idklol> :p
[23:31:26 CEST] <durandal_1707> you trimmed the most important part
[23:31:53 CEST] <idklol> oh
[23:32:48 CEST] <idklol> ffmpeg -i video.mp4 -vcodec dnxhd -b:v 220M output.mov
[23:32:48 CEST] <durandal_1707> anyway, dnxhd supports only some params if you pick a limited profile
[23:33:42 CEST] <durandal_1707> and what about resolution of input video?
[23:34:01 CEST] <idklol> it's 1280x720
[23:35:07 CEST] <ThugAim> wooowee
[23:36:41 CEST] <durandal_1707> idklol: you could use one of dnxhr profile
[23:37:02 CEST] <idklol> alright, what's the codec name?
[23:37:23 CEST] <durandal_1707> -profile:v dnxhr_hq
[23:37:37 CEST] <durandal_1707> i think
[23:37:49 CEST] <durandal_1707> just add that to your command
[23:38:09 CEST] <idklol> ok
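Putting durandal_1707's advice together with the earlier command, a DNxHR HQ attempt could look like the sketch below. The `-pix_fmt` is our assumption (DNxHR HQ is 8-bit 4:2:2), and the command is only printed here, not run:

```shell
# DNxHR profiles lift classic DNxHD's fixed resolution/bitrate table,
# so 1280x720p60 from an H.264 source becomes possible.
dnx_cmd="ffmpeg -i video.mp4 -c:v dnxhd -profile:v dnxhr_hq -pix_fmt yuv422p output.mov"
echo "$dnx_cmd"
```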
[23:46:53 CEST] <durandal_1707> you may need to update dnxhd codec component if davinci cant load file
[00:00:00 CEST] --- Sun Jul 30 2017


More information about the Ffmpeg-devel-irc mailing list