[Ffmpeg-devel-irc] ffmpeg.log.20180416
burek
burek021 at gmail.com
Tue Apr 17 03:05:02 EEST 2018
[01:46:16 CEST] <shtomik> Hi to all again, does somebody know if I can reset the AVFormatContext timestamps (pts, dts)? If I want to reuse the output context with another filename (URL)?
[04:46:53 CEST] <shtomik> Good morning guys, is anybody here?
[04:47:24 CEST] <shtomik> How can I reset the AVFormatContext timestamps (the container dts, cur_dts)
[04:47:38 CEST] <shtomik> Without reinitializing the AVStreams?
[06:28:31 CEST] <gd_> hi
[06:29:03 CEST] <Johnjay> hello
[06:30:04 CEST] <gd_> i have a question, i am doing a project that uses ffprobe and i want to send my employer a proposal showing sample ffprobe output and the audio information it extracts
[06:30:48 CEST] <gd_> do you have detailed documentation for each of the output parameters when using ffprobe ?
[06:31:34 CEST] <gd_> i want to be able to describe each of the output parameters in my proposal
[06:31:40 CEST] <Johnjay> hmm there's a tips page https://trac.ffmpeg.org/wiki/FFprobeTips
[06:32:23 CEST] <Johnjay> but this is all there is: https://ffmpeg.org/ffprobe-all.html#Synopsis
[06:32:54 CEST] <Johnjay> there's descriptions for most of the ones that i can see on the page
[06:33:10 CEST] <gd_> yes i already read that page, but I'm looking for a description of each of those params, like what "pix_fmt", "bits_per_raw_sample" etc. are
[06:33:29 CEST] <Johnjay> furq would know more
[06:33:38 CEST] <Johnjay> but i think if it's not on there you have to read the source
[06:39:42 CEST] <gd_> @Johnjay, most of the descriptions are for the input parameters
[06:43:21 CEST] <gd_> I'm thinking that the devs may have a documentation file for the output parameters that I can't find on the site
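(The per-field documentation gd_ is after mostly lives in the ffprobe XML schema, doc/ffprobe.xsd in the source tree; for a proposal, a targeted query is often enough. A minimal sketch, with input.mp4 as a placeholder:

    ffprobe -v error -show_entries stream=codec_name,sample_rate,channels,bits_per_raw_sample -of json input.mp4

This prints only the named stream fields as JSON, so each one can be described individually.)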
[09:12:17 CEST] <debianuser> Hello. When ffmpeg/libx264 finishes encoding it prints QP stats like: frame I:19 Avg QP:23.87 size: 11879; frame P:635 Avg QP:28.72 size: 3345; frame B:2346 Avg QP:30.26 size: 678. How can I get those QP stats for an already existing .mp4 file, e.g. the one downloaded from youtube? Any ideas welcome!
[10:01:07 CEST] <cowai> On latest master (not on stable 3.4), if the rtmp input goes down, the ffmpeg process does not shut down; it suddenly reaches 100% cpu and never stops. In the output I see this: "RTMP_ReadPacket, failed to read RTMP packet header"
[10:23:11 CEST] <durandal_1707> cowai: report it on bug tracker
[10:23:36 CEST] <cowai> durandal_1707: I am in the process of tracking which commit it started on.
[10:23:48 CEST] <cowai> When I have found the last working commit, I will.
[11:02:31 CEST] <mustafam> Hello everybody
[11:02:35 CEST] <mustafam> I am trying to transcode 10-bit HDR video to 8-bit SDR, I used '-vf colorspace=all=bt709:range=tv:format=yuv420p', but I am getting this error:
[11:02:48 CEST] <mustafam> Unsupported input transfer characteristics 16 (smpte2084)
[11:03:43 CEST] <mustafam> Any ideas how to solve this??
[11:11:02 CEST] <durandal_1707> mustafam: it is unsupported, write patch?
[11:13:27 CEST] <roxlu> furq: I went for a variable framerate approach (to sync multiple recordings of live video streams) and so far things are looking really good.
[11:22:38 CEST] <mustafam> durandal_1707: I don't know how, I am asking here to see if anybody has faced this problem before and how it can be fixed.
[12:35:08 CEST] <Enverex> If ffmpeg's trying to build with "-framework AppKit" does that mean it thinks this is a MacOS machine?
[12:42:46 CEST] <ritsuka> I guess so
[13:07:48 CEST] <Enverex> Weird. Had to explicitly state that it was Linux else it wouldn't build due to trying to use AppKit.
[13:34:18 CEST] <mustafam> Is there a recommended way to convert bt2020 to bt709?
[13:34:57 CEST] <mustafam> colormatrix,colorspace, or zscale?
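(One commonly cited recipe for this goes through zscale and the tonemap filter rather than the colorspace filter, sidestepping the smpte2084 error above. A sketch, assuming a build with libzimg for zscale and a recent enough tonemap filter; file names are placeholders:

    ffmpeg -i hdr_input.mkv -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p" -c:v libx264 sdr_output.mkv

The idea is to linearize the PQ transfer first, tonemap in linear light, then convert and tag as bt709.)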
[14:24:23 CEST] <boobie> do we know when 4.0 gets a release?
[14:25:07 CEST] <alone-y> hello, anyone know how to drawtext LOCALTIME on windows, but only the time?
[14:25:20 CEST] <alone-y> or two lines - date and time..
[14:25:39 CEST] <alone-y> text=%%{localtime}
[14:25:45 CEST] <alone-y> it's working - but with date ;)
[14:26:19 CEST] <Chaz6> Hi there, I have a legal question. If a company develops a "plugin" based on ffmpeg, and sells a service based upon that "plugin", is it necessary to provide the source code or not?
[14:26:36 CEST] <Chaz6> I am referring to this http://www.prweb.com/releases/2018/04/prweb15395597.htm
[14:36:58 CEST] <dv_> IIRC yes, if the plugin links to ffmpeg libraries
[14:37:21 CEST] <dv_> if it however communicates with ffmpeg through inter-process communication, then no, because that ain't linking
[14:37:50 CEST] <dv_> but, I'm no lawyer
[14:44:49 CEST] <Mavrik> Chaz6: that's probably legal
[14:45:18 CEST] <Mavrik> Oh.
[14:45:19 CEST] <Mavrik> Hm.
[14:45:24 CEST] <Mavrik> Or not, good question :)
[14:45:36 CEST] <Mavrik> I thought the plugin invokes FFmpeg, not the other way around
[15:01:04 CEST] <DHE> it depends. if you invoke ffmpeg as a distinct process, that's one thing. if it links to libavcodec and the like, it's something else
[15:01:42 CEST] <DHE> so, ffmpeg is either licensed under the LGPL or the GPL depending on build options. LGPL is more forgiving, especially if you used dynamic linking.
[15:17:48 CEST] <alone-y> hello, anyone know why '%{localtime:%X}' or '%{localtime\:%X}' or '%{localtime\\:%X}'
[15:17:53 CEST] <alone-y> are not working on Windows?
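(For reference, the usual stumbling block is that cmd and the filter parser both consume characters. A sketch for use inside a .bat file, which is an assumption here; at an interactive prompt use a single % instead of %%, and on Windows a fontfile= option may also be needed. %X expands to the locale's time-only representation:

    ffmpeg -i input.mp4 -vf "drawtext=text='%%{localtime\:%%X}'" output.mp4

The \: keeps the filter parser from reading the colon after localtime as an option separator.)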
[15:30:16 CEST] <shtomik> Hello guys, how can I reset the AVFormatContext timestamps (the container dts, cur_dts)? Only with a new AVFormatContext?
[15:30:57 CEST] <shtomik> I want to record one file, then stop, and then continue recording, but to a new file...
[15:34:51 CEST] <Mavrik> If you're creating a new file, you're creating a new container (format)
[15:34:58 CEST] <Mavrik> So just create a new AVFormatContext and close the other
[15:35:02 CEST] <Mavrik> Or you'll have a bad time.
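(A minimal C sketch of that advice, with error handling omitted; ofmt_ctx, new_filename and the stream-copy setup are assumed from the surrounding code:

    /* finish and release the old container */
    av_write_trailer(ofmt_ctx);
    avio_closep(&ofmt_ctx->pb);
    avformat_free_context(ofmt_ctx);

    /* start a fresh one for the next file */
    avformat_alloc_output_context2(&ofmt_ctx, NULL, NULL, new_filename);
    /* re-add streams (avformat_new_stream + avcodec_parameters_copy), then: */
    avio_open(&ofmt_ctx->pb, new_filename, AVIO_FLAG_WRITE);
    avformat_write_header(ofmt_ctx, NULL);)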
[15:36:38 CEST] <shtomik> @Mavrik Thanks! ;)
[15:37:32 CEST] <alone-y> Mavrik, maybe you know how to cut localtime down on windows?
[16:27:49 CEST] <debianuser> Hm... I compressed one of my recordings to 1.mp4 (x264+aac), then converted it to 2.mkv (-c copy), then just for fun converted it back to 3.mp4 (-c copy)... and got a file slightly bigger than 1.mp4. Then I converted that to 4.mkv (-c copy) and got a file with same size as 2.mkv, but different content. And converting 4.mkv to 5.mkv (-c copy) I've got a file bigger than 4.mkv. Is that expected?
[16:28:24 CEST] <c_14> muxing isn't bitexact by default
[16:28:39 CEST] <c_14> try setting -flags +bitexact and/or -fflags +bitexact
[16:45:29 CEST] <debianuser> still different: 1.mp4 (546423253 bytes), 2-bitexact.mkv (544975829), 3-bitexact.mp4 (546423260), 4-bitexact.mkv (544975829 bytes, but ~4760 bytes all over the file are different from 2-bitexact.mkv), 5-bitexact.mkv (544975832 bytes, 3 bytes bigger than 4-bitexact.mkv).
[16:47:26 CEST] <debianuser> As "+bitexact" flag exists, I guess it's expected that it could be different. But why? What part of the muxer/demuxer is randomized?
[16:47:36 CEST] <c_14> timestamps
[16:47:39 CEST] <c_14> usually
[16:47:48 CEST] <c_14> also encoder version and stuff like that
[16:53:31 CEST] <debianuser> Yeah, I thought that something like creation date/time could be different, but that'd be ~4 bytes of difference. Not 4760 bytes in different places all over the file. Retesting with johnvansickle's latest static git build...
[17:02:11 CEST] <c_14> might also be some form of metadata conversion stuff
[17:02:12 CEST] <debianuser> Git version is "worse" - instead of 4760 bytes I get 13162 bytes different between 2.mkv and 4.mkv (544975805 both). Also `./ffmpeg -flags +bitexact -fflags +bitexact -i 4.mkv -c copy -flags +bitexact -fflags +bitexact 5.mkv` produces 5.mkv (544975808 bytes, +3 bytes compared to 4.mkv).
[17:02:37 CEST] <Mavrik> What does the diff say?
[17:02:59 CEST] <c_14> yeah, maybe do a hex dump and diff it
[17:04:13 CEST] <debianuser> Mavrik, c_14: this is the diff: https://pastebin.com/sUkNSTLS
[17:06:34 CEST] <debianuser> I'm basically trying to understand why it's happening. I.e. could it be that `-c copy` is not exactly "copy", and some information is actually lost during the conversion?
[17:21:30 CEST] <spicypixel> does the concat filter work with a filelist or just the muxer?
[17:22:57 CEST] <spicypixel> documentation implies you need a fixed integer to tell it how many source files will be worked on, mhmmm
[17:25:31 CEST] <DHE> yes, because it needs to know what it's getting as input. you can have 1 video and 1 audio, only video, 1 video and 2 audio, etc etc
[17:27:25 CEST] <shtomik> Guys, tell me, please, why, when I'm creating a new AVFormatContext to record a new file, do I get old timestamps in the new AVFormatContext? And I have timing issues ;(
[17:28:05 CEST] <Mavrik> Hmm, are you feeding it frames with old timestamps?
[17:28:22 CEST] <Mavrik> The frame timestamps come from AVFrame afterall
[17:28:26 CEST] <Mavrik> And then may get adjusted
[17:30:41 CEST] <shtomik> @Mavrik Thanks again ;) One moment ;)
[17:34:23 CEST] <shtomik> @Mavrik [mp4 @ 0x10d88b800] Application provided invalid, non monotonically increasing dts to muxer in stream 1: 5571236 >= 0; it's after I create a new AVFormatContext
[17:34:51 CEST] <shtomik> @Mavrik and avformat_free_context(ofmt_ctx);
[17:35:12 CEST] <shtomik> @Mavrik AVPacket has pts/dts = 0
[17:36:27 CEST] <Mavrik> Well setup a breakpoint on that error and see what's going on
[17:36:34 CEST] <Mavrik> Are you sharing a struct or something?
[17:38:58 CEST] <shtomik> @Mavrik yea, I tried to debug it... one AVFormatContext * for 2 threads, maybe the pointer is cached?
[17:39:38 CEST] <shtomik> @Mavrik but I freed the resources... before creating a new context
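(If the packets reaching the muxer still carry timestamps from the old recording, the usual fix is to rebase them so the new file starts at zero. A sketch; first_pts is a per-output variable initialized to AV_NOPTS_VALUE, and the streams are assumed from the surrounding code:

    if (first_pts == AV_NOPTS_VALUE)
        first_pts = pkt->pts;
    pkt->pts -= first_pts;
    pkt->dts -= first_pts;
    av_packet_rescale_ts(pkt, in_stream->time_base, out_stream->time_base);
    av_interleaved_write_frame(ofmt_ctx, pkt);)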
[17:45:00 CEST] <spicypixel> when using a filelist in ffmpeg concat, I understand a line is "file 'pathtofile'"
[17:45:06 CEST] <spicypixel> where do you put duration?
[17:45:11 CEST] <spicypixel> under each filepath?
[17:45:59 CEST] <spicypixel> is there an example filelist with all the various flags used?
[17:46:12 CEST] <spicypixel> https://ffmpeg.org/ffmpeg-formats.html#concat
[17:46:57 CEST] <furq> you don't normally need to set duration
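(For reference, a sketch of a concat demuxer list file with per-file durations; paths and times are placeholders, and each duration line applies to the file entry just above it:

    ffconcat version 1.0
    file 'clip1.mp4'
    duration 8.0
    file 'clip2.mp4'
    duration 8.0

fed to ffmpeg with something like:

    ffmpeg -f concat -safe 0 -i list.txt -c copy out.mp4)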
[17:55:22 CEST] <debianuser> I've just moved those bitexact 2.mkv, 3.mp4, 4.mkv and 5.mkv to another dir and regenerated them again. And newly generated files are exactly the same (2.mkv==prev/2.mkv, 3.mp4==prev/3.mp4, etc). So it's not because of timestamps, and it's not random. ffmpeg actually changes something during `-c copy`. But what?
[17:56:13 CEST] <spicypixel> furq: yeah in this case I need to, but I managed to get it working with the duration under each file
[17:56:30 CEST] <spicypixel> unifi spits out some fragments of the mp4 with a total timestamp, so 24 hours
[17:56:33 CEST] <spicypixel> for an 8 second clip
[17:56:39 CEST] <spicypixel> hilarity ensues.
[17:57:09 CEST] <spicypixel> this is why my concat + segmentation was failing, since I picked 1 hour segments, it read an 8 second source file as 24 hours
[17:57:15 CEST] <spicypixel> good times
[18:19:49 CEST] <shtomik> @Mavrik Are you here? I know where the trouble is...
[18:20:18 CEST] <shtomik> @Mavrik I need to renew AVCodecContext too ;(
[18:20:22 CEST] <shtomik> @Mavrik thanks!
[20:55:52 CEST] <MASM> Hello, i need some help with decoding g726 to hls m3u8
[20:59:34 CEST] <debianuser> When ffmpeg/libx264 finishes encoding it prints QP stats like: frame I:19 Avg QP:23.87 size: 11879; frame P:635 Avg QP:28.72 size: 3345; frame B:2346 Avg QP:30.26 size: 678. How can I get those QP stats for an already existing .mp4 file, e.g. the one downloaded from youtube? Any ideas welcome!
[21:01:46 CEST] <MASM> @debianuser
[21:02:53 CEST] <MASM> i understand that there is a command that shows a description of the input source: "ffmpeg -i filename.flv"
[21:04:22 CEST] <debianuser> MASM: Yeah, it just doesn't show Avg QP stats. :(
[21:07:08 CEST] <MASM> debianuser: look at this: "ffprobe video.flv"
[21:09:43 CEST] <MASM> debianuser: https://trac.ffmpeg.org/wiki/FFprobeTips3
[21:11:53 CEST] <debianuser> MASM: Yes, I have hopes for ffprobe too, I just don't know how to extract QP information from it. :(
[21:13:33 CEST] <debianuser> MASM: By the way, about your "g726 to hls m3u8" question. Does `ffplay` play your file? If it does, it means ffmpeg can decode it, so you'd just need to encode and segment it with `-f segment`. Example: http://hlsbook.net/segmenting-video-with-ffmpeg/
[21:15:06 CEST] <debianuser> ... or with hls muxer ( example: http://hlsbook.net/segmenting-video-with-ffmpeg-part-2/ )
[21:15:19 CEST] Action: debianuser now wonders what's the difference between those two...
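(Putting those pieces together for the g726 question, a sketch; the input file name is a placeholder, and the AAC re-encode is an assumption since HLS players generally expect AAC or MP3 rather than G.726. A raw G.726 bitstream may additionally need input options describing its sample rate and code size:

    ffmpeg -i input.g726 -c:a aac -b:a 64k -f hls -hls_time 10 -hls_list_size 0 out.m3u8)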
[21:17:33 CEST] <DHE> debianuser: you can't. those are encoder stats and are not saved in the video
[21:20:49 CEST] <debianuser> DHE: Is it not a part of the encoded stream? I thought maybe each h.264 frame has its individual QP that is used during decoding... (I don't know the h264 decoding format, that's why I'm asking)
[21:21:32 CEST] <DHE> it does, but I'm pretty sure it's not saved into the stream...
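(The QP deltas are in fact coded per macroblock in the H.264 bitstream, so the decoder sees them; whether they get surfaced is another matter. One hedged avenue is libavcodec's debug flags, though what, if anything, gets printed varies by decoder and version:

    ffmpeg -debug qp -i input.mp4 -f null - 2> qp.log

Averaging the logged values per frame type, as x264 does at encode time, would be left to a post-processing script.)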
[21:29:42 CEST] <Renari> Hey guys I'm trying to use a batch file on windows to convert wav files to flac. The batch file is fairly simple but it can be seen here: https://gist.github.com/Renari/c551775d344c4e767e3259a7d08fb86b
[21:30:27 CEST] <Renari> The issue, I believe, is that the filenames have two-byte Asian characters, and the ffmpeg conversion fails with a bunch of garbage being output as the filename.
[21:30:56 CEST] <Renari> e.g. https://i.imgur.com/xe539S3.png
[21:31:13 CEST] <Renari> Does ffmpeg not support unicode characters?
[21:31:20 CEST] <BtbN> blame windows cmd
[21:32:19 CEST] <cowai> I have spent all day compiling different commit trees to find which commit caused ffmpeg to crash if the rtmp input goes down, and now I have it nailed down: https://git.ffmpeg.org/gitweb/ffmpeg.git/commitdiff/858db4b01fa2b55ee55056c033054ca54ac9b0fd
[21:32:19 CEST] <cowai> I am using nginx-rtmp as the rtmp input for ffmpeg, and when the user stops sending a feed to nginx-rtmp, I expect ffmpeg to shut down gracefully; instead, after this commit it crashes and the cpu goes to 100% indefinitely.
[21:32:49 CEST] <Renari> Hm, I'm not sure that's the case since if I print the characters instead of passing them to ffmpeg they print out in the console correctly.
[21:33:13 CEST] <cowai> Is this a bug, or does this commit actually force the input to adhere to some standard that nginx-rtmp does not follow?
[21:33:16 CEST] <BtbN> windows cmd for anything but ascii is insanity
[21:36:57 CEST] <ChocolateArmpits> Renari, you need to change the text code page at the very beginning of your script https://ss64.com/nt/chcp.html
[21:37:26 CEST] <ChocolateArmpits> Honestly I'd just use powershell, anything beyond plain latin is a pain in the ass
[21:37:30 CEST] <ChocolateArmpits> in cmd
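(A minimal sketch of that fix applied to Renari's batch file, assuming the .bat itself is saved as UTF-8; 65001 is the UTF-8 code page:

    @echo off
    chcp 65001 > nul
    for %%f in (*.wav) do ffmpeg -i "%%f" "%%~nf.flac")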
[22:01:50 CEST] <Renari> ChocolateArmpits, thanks got it working
[22:18:33 CEST] <brimestone> hey guys, I'm trying to do an HLS segment and add a function for each segment.
[22:19:53 CEST] <BtbN> you are trying to what?
[22:20:44 CEST] <brimestone> Say my input is a webcam, and I want to record 24/7.. I want to create 30min segments (HLS) and store each segment in a specific directory
[22:25:23 CEST] <Anonrate> The configure doesn't recognise --nvcc or --windres.
[22:27:49 CEST] <BtbN> it's probably too old then
[22:28:36 CEST] <Anonrate> Nope, checked it on the latest from git.
[22:29:30 CEST] <Anonrate> Just running ./configure --nvcc=nvcc will replicate one error, and --windres=x86_64-w64-mingw32-windres the other
[22:32:50 CEST] <BtbN> its entry is gone
[22:32:57 CEST] <BtbN> but nvcc is the default anyway, so no need to set it
[22:33:11 CEST] <Anonrate> Then it should be taken out of the list. :)
[22:33:24 CEST] <BtbN> no
[22:33:29 CEST] <BtbN> it should just be added to it
[22:33:34 CEST] <Anonrate> That too. Lol.
[22:34:07 CEST] <Anonrate> I'm just going and checking every option to find some bugs. I'm bored. Lol.
[22:34:38 CEST] <ChocolateArmpits> brimestone, only time information and segment number can be expanded for hls output. You'd have to have an additional script running that would put files where they need to be
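(A sketch of what the hls muxer can do on its own; the v4l2 device and paths are placeholders. -strftime 1 expands time fields in the segment name, and -strftime_mkdir 1, where available, creates the dated directories:

    ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -f hls \
        -hls_time 1800 -hls_list_size 0 -strftime 1 -strftime_mkdir 1 \
        -hls_segment_filename '/archive/%Y-%m-%d/cam_%H-%M-%S.ts' /archive/live.m3u8)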
[22:34:58 CEST] <Anonrate> Did I report that in the correct channel though? Or should I have posted that in #ffmpeg-devel?
[22:39:07 CEST] <BtbN> it's fixed
[22:39:42 CEST] <Anonrate> libs such as ladspa, alsa, pulse etc.. Can those only be compiled to work on Linux? Or can they be compiled to work on Windows as well?
[22:40:00 CEST] <BtbN> I don't see why ladspa wouldn't work on Windows
[22:40:09 CEST] <BtbN> but alsa is definitely Linux-Only
[22:40:18 CEST] <BtbN> or maybe bsd and stuff also has it? Who knows.
[22:40:38 CEST] <klaxa> i think pulse is also available for windows? i might be wrong though
[22:40:45 CEST] <BtbN> PA for Windows is pretty dead
[22:40:55 CEST] <Anonrate> I've got a bad case of SNS..
[22:41:19 CEST] <klaxa> >It has also been ported to and tested on Solaris, FreeBSD, NetBSD, MacOS X, Windows 2000 and Windows XP.
[22:41:21 CEST] <klaxa> right...
[22:41:54 CEST] <Anonrate> jni.. I've looked but I can't seem to find where I would get that.
[22:45:56 CEST] <another> hey, i'm trying to concat a video from segments produced by -f segment
[22:46:17 CEST] <Anonrate> I think options that are platform dependent, such as alsa, should say so in parentheses. I know for myself it would save a lot of time when trying to compile a fully decked out ffmpeg.
[22:46:40 CEST] <Anonrate> So for alsa append something like (Linux) or (Linux Only)
[22:47:53 CEST] <another> unfortunately there are glitches at the cut marks
[22:48:41 CEST] <debianuser> Anonrate: It could be you're building for BSD... Or maybe you have a win32 wrapper library that redirects alsa calls to native win32...
[22:48:45 CEST] <another> anyone got some hints?
[22:49:20 CEST] <debianuser> another: Does `cat *.ts > result.ts` work any better?
[22:50:04 CEST] <Anonrate> I didn't think of that debianuser.
[22:50:57 CEST] <Anonrate> I also noticed that when setting arch=x86_64, the config.mak lists ARCH=x86.. I don't think it's much of a big deal, but it may bother people who are new to compiling.
[22:53:59 CEST] <another> debianuser: i'm using mkv as intermediate containers
[22:55:54 CEST] <another> i'll try TS
[23:00:32 CEST] <another> huh.. what do you know
[23:00:47 CEST] <another> TS works pretty flawlessly
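(For the record, the same join without an intermediate cat can use the concat protocol, which byte-concatenates its inputs; fine for MPEG-TS, not for mkv or mp4 sources:

    ffmpeg -i "concat:seg000.ts|seg001.ts|seg002.ts" -c copy result.mp4)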
[23:02:42 CEST] <cryptopsy> how can i record to .avi the mpv --vo=caca somevideo ?
[23:03:40 CEST] <durandal_1707> you can not, just use a normal screen recorder
[23:03:59 CEST] <cryptopsy> which recorder?
[23:04:12 CEST] <durandal_1707> there are bunch of them
[23:04:20 CEST] <cryptopsy> give me some
[23:04:53 CEST] <durandal_1707> ffmpeg -f xcbgrab
[23:05:49 CEST] <durandal_1707> or x11grab
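(e.g. a sketch, where the display, size and framerate are placeholders:

    ffmpeg -f x11grab -framerate 25 -video_size 1280x720 -i :0.0 recording.avi)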
[23:07:01 CEST] <pgorley> hi, i'm saving an rtp live stream to an mkv file (following the remuxing.c example), how do i manage the timestamps of the AVPackets? copying them doesn't seem to work
[23:07:22 CEST] <pgorley> do i need to keep track of my own timestamps and have them increment for each packet?
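(The remuxing.c pattern is to rescale the incoming timestamps rather than invent new ones; for a live RTP source you may additionally subtract the first observed pts, since the stream clock starts at an arbitrary offset. A sketch, with in_stream/out_stream assumed from the surrounding code:

    pkt->pts = av_rescale_q_rnd(pkt->pts, in_stream->time_base, out_stream->time_base,
                                AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX);
    pkt->dts = av_rescale_q_rnd(pkt->dts, in_stream->time_base, out_stream->time_base,
                                AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX);
    pkt->duration = av_rescale_q(pkt->duration, in_stream->time_base, out_stream->time_base);
    pkt->pos = -1;
    av_interleaved_write_frame(ofmt_ctx, pkt);)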
[23:18:42 CEST] <Skullclown> Hey everyone, just spent over 6 hours trying to build ffmpeg for android in a dozen different ways and with a dozen different projects. No luck :( Anyone around to help? Just need an ffmpeg executable for android with the overlap filter enabled, that's all I was adding.
[23:20:33 CEST] <pgorley> Skullclown: i usually use this to build ffmpeg on android: https://pastebin.com/64bP6s6p
[23:20:58 CEST] <pgorley> ofc, change the parameters to suit your needs
[23:21:30 CEST] <Skullclown> pgorley: thanks I'll give it a try
[23:22:49 CEST] <Skullclown> pgorley: NDK 15c or 16, or does it not matter? (had some issues with this w/ one of the projects)
[23:23:10 CEST] <pgorley> hmm, don't remember against which ndk i built my toolchains
[23:23:20 CEST] <pgorley> it shouldn't matter
[23:23:21 CEST] <JEEB> NDK R16b WorksForMe with a standalone toolchain
[23:23:31 CEST] <JEEB> but then again quite a few versions before it did as well
[23:23:55 CEST] <JEEB> if you want to use clang you'll need gas-preprocessor, though (although master with R17 beta doesn't really need that any more)
[23:24:22 CEST] <JEEB> heck, the FATE system even has android toolchains I think?
[23:24:39 CEST] <JEEB> wow
[23:24:46 CEST] <JEEB> I did not think I'd notice > NDKr8
[23:24:49 CEST] <JEEB> that is /old/
[23:24:51 CEST] <Skullclown> pgorley: the clang in there would make me think 16 ?
[23:25:31 CEST] <JEEB> but yea, looks like it has built fine for quite a while :) http://fate.ffmpeg.org/report.cgi?time=20180416183914&slot=armv7a-android-gcc-4.4-shared
[23:25:39 CEST] <JEEB> (and yes, that one is ancient)
[23:26:15 CEST] <cryptopsy> libcaca is so awesome
[23:27:38 CEST] <pgorley> Skullclown: the first line for me is usually: $ANDROID_NDK/build/tools/make_standalone_toolchain.py --arch=arm --api=18 --install-dir /tmp/my-android-toolchain
[23:27:48 CEST] <pgorley> meaning i build for arm on api 18
[23:27:52 CEST] <JEEB> yup
[23:27:58 CEST] <JEEB> and thus you have a proper toolchain going
[23:28:17 CEST] <pgorley> but i recall it working with api 21 or so
[23:28:17 CEST] <brimestone> hey guys, is there a way I can use s3 as a target output?
[23:28:29 CEST] <pgorley> haven't done any work with android lately
[23:28:54 CEST] <JEEB> ./build/tools/make_standalone_toolchain.py --install-dir /home/jeeb/ownapps/ndk-toolchain-r15 --arch arm --api 21
[23:28:59 CEST] <JEEB> yes, this works with 15 and 16
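(For completeness, the shape of a configure invocation against such a standalone toolchain; a sketch where the paths are placeholders. If the "overlap" filter meant above is the overlay filter, note it is built by default and normally only needs to not be disabled:

    ./configure --target-os=android --arch=arm --enable-cross-compile \
        --cross-prefix=/tmp/my-android-toolchain/bin/arm-linux-androideabi- \
        --prefix=/tmp/ffmpeg-android --enable-static --disable-shared)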
[23:29:01 CEST] <spicypixel> brimestone: mount the s3 remote on your local box with rclone and write to it like a normal fs
[23:29:52 CEST] <brimestone> how about without mounting? I know FFmpeg can use a URL as the input.. just looking for the same thing on the output side
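(Without mounting, one hedged option is ffmpeg's HTTP output pointed at a presigned S3 URL; a sketch where the URL is a placeholder, and the fragmented-MP4 movflags are there because HTTP output is not seekable:

    ffmpeg -i input.mp4 -c copy -movflags frag_keyframe+empty_moov -f mp4 \
        -method PUT "https://bucket.s3.amazonaws.com/out.mp4?<presigned-query>")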
[23:33:24 CEST] <wfbarksdale> I have a decoding -> encoding pipeline constructed using the send/receive packet/frame API (working in C/C++, not command line) and I would like to do some drawing on individual frames before re-encoding. For the predictive `AVFrame`s that I receive from `avcodec_receive_frame`, will they already be applied on top of the prior I frame, or will I need to do that manually before drawing on the frame and re-encoding?
[23:34:31 CEST] <wfbarksdale> Also, is there a commonly used tool that can take a raw buffer, plus the pixel/buffer layout and parameters, and display the image? That would be a way I could discover this myself
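(On the first question: frames returned by avcodec_receive_frame are fully reconstructed pictures; motion compensation against the reference frames has already been applied, so P-frames need no manual compositing before drawing on them. On the second: ffplay can display a raw buffer if told the layout, where the size and pixel format below are placeholders for whatever the buffer actually is:

    ffplay -f rawvideo -pixel_format yuv420p -video_size 1280x720 frame.raw)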
[23:39:53 CEST] <Anonrate> For the DeckLinkAPI.h, I only managed to get DeckLinkAPI.idl from the BlackMagic site.. There are Linux headers included in the SDK, but they won't work. Is there something I'm supposed to do with this "idl" file?
[23:50:25 CEST] <wfbarksdale> phrased differently, when i decode a packet to a frame using send/receive packet/frame, how can i tell if the AVFrame i have has already been composited with previous frames (if it is a p frame)?
[23:57:54 CEST] <ChocolateArmpits> Anonrate, did you download the full api zip ?
[23:58:21 CEST] <Anonrate> Yes, it has include files for linux, mac, win and samples as well.
[23:58:48 CEST] <ChocolateArmpits> man I only compiled with Windows support, can't comment on Linux
[23:58:59 CEST] <Anonrate> I only want Windows support
[23:59:09 CEST] <ChocolateArmpits> oh then you're in luck!
[23:59:15 CEST] <Anonrate> I see that there is a tool called widl
[23:59:22 CEST] <ChocolateArmpits> that won't work
[23:59:27 CEST] <Anonrate> Figures..
[23:59:31 CEST] <ChocolateArmpits> didn't for me at least, you need visual studio installed
[23:59:52 CEST] <Anonrate> This will make the headers I need, ya?
[00:00:00 CEST] --- Tue Apr 17 2018