[Ffmpeg-devel-irc] ffmpeg.log.20160826

burek burek021 at gmail.com
Sat Aug 27 03:05:01 EEST 2016


[00:41:48 CEST] <DHE> durandal_1707: I opened a ticket for it, but since it seems to be based on old APIs I'm guessing it won't see much attention
[00:55:03 CEST] <donics> when i use -r, it captures the video, but when i use -framerate it doesn't
[00:55:41 CEST] <donics> maybe -r is just speeding it up
[01:35:32 CEST] <Mista_D> looking to keep -g 60 static no matter what and include scene change detection i frames as well without interfering with a static gop interval... any advice? libx264...
[01:35:44 CEST] <DHE> -x264opts no-scenecut
[01:37:41 CEST] <Mista_D> want to use a static 60 frame interval + scene cut. but normally it'd go 0, 60, 65, 125... ; while I'm looking for 0, 60, 65, 120...
[01:39:14 CEST] <DHE> I'm going to disagree with that. H264 still allows for I frames. There's a special type of I frame called an IDR (instantaneous decoder refresh: the decoder discards all prior references) which are the true keyframes in H264
[01:39:39 CEST] <DHE> so choosing to force IDR when an I frame is sufficient is usually not useful except if you need a true keyframe (eg: for seeking)
[01:39:49 CEST] <DHE> and I'm guessing that's not what you need
[01:40:28 CEST] <Mista_D> -force_key_frames "expr:gte(t,n_forced*2)" ?
[01:41:54 CEST] <Mista_D> would that force I frames? or IDRs?
[01:42:21 CEST] <DHE> IDR
[01:51:35 CEST] <Mista_D> DHE: thank you
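For reference, a minimal sketch of the approach DHE describes above, assuming libx264 defaults otherwise: -force_key_frames pins an IDR every 60 frames while x264's scene-cut detection stays enabled, so extra I frames can still land on cuts without disturbing the fixed cadence (file names are placeholders):

    ffmpeg -i input.mp4 -c:v libx264 \
           -force_key_frames "expr:gte(n,n_forced*60)" \
           -c:a copy output.mp4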
[02:31:06 CEST] <coyotenq> hi ppl, i'm fighting with the nvenc encoder and the 2 parallel streams limit
[02:32:49 CEST] <coyotenq> does anyone have a solution for this? the hardware i'm trying to use (Quadro K620) can for sure do more than that
[03:08:12 CEST] <`D`> Is it possible for the ffmpeg in debian to use shared library libvpx3 (1.5.0) from backports?
[03:11:11 CEST] <c_14> probably not. Depends on whether the libvpx3 from backports is abi compatible with the libvpx3 you already have
[03:11:27 CEST] <c_14> You'd either also have to pull ffmpeg from backports or build it yourself
[03:12:42 CEST] <`D`> ffmpeg is from backports, but it's not picking it up.
[03:14:26 CEST] <c_14> check the configure line if it has --enable-libvpx
[03:17:22 CEST] <furq> https://packages.debian.org/jessie-backports/ffmpeg
[03:17:28 CEST] <furq> it doesn't look like it depends on libvpx
[03:17:38 CEST] <furq> oh nvm libavcodec does
[03:18:10 CEST] <furq> it depends on libvpx1 which is dumb and you should probably report it to debian
[03:28:02 CEST] <`D`> it works in debian testing so I doubt they will care. I guess libvpx3 is just a pointless package in backports.
[03:29:37 CEST] <furq> backports packaging seems pretty haphazard
[03:30:02 CEST] <furq> i've noticed a few things in there which don't make any sense
[03:30:03 CEST] <`D`> I am using deb-multimedia backports, which I guess is just as bad.
[03:30:44 CEST] <furq> probably worse
[03:31:01 CEST] <`D`> well it supports more stuff, but this problem seems the same.
[03:31:46 CEST] <furq> it should be an easy fix if you don't mind compiling the package yourself
[03:32:16 CEST] <`D`> yes, I've done it before, but I'll weigh that and upgrading to testing.
[03:32:38 CEST] <furq> i just run testing everywhere
[03:32:46 CEST] <furq> never had any issues with it
[03:38:28 CEST] <ytan> Hi there
[03:39:47 CEST] <ytan> I have encountered some issues cross-compiling ffmpeg on an arm device.
[03:40:39 CEST] <ytan> I test at ubuntu:~/Dev/ffmpeg_src/ffmpeg-arm$ ./configure --prefix=/home/test/Dev/build-arm/ffmpeg-armhf/ --enable-cross-compile --cross-prefix=${CROSS_COMPILE}- --arch=armhf --target-os=linux --pkg-config-flags="--static" --enable-shared --enable-libvpx --enable-libvorbis ERROR: libvorbis not found
[03:41:00 CEST] <ytan> When I run configure, I get the following error:   ERROR: libvorbis not found
[03:43:01 CEST] <ytan> I don't know why that is
[03:43:09 CEST] <furq> do you have it installed for arm
[03:43:11 CEST] <ytan> $ arm-openwrt-linux-gnueabi-pkg-config --modversion vorbis
[03:43:23 CEST] <ytan> when I check with pkg-config
[03:43:28 CEST] <ytan> it tells me 1.3.5
[03:43:35 CEST] <ytan> that means it's there?
[03:44:16 CEST] <ytan> $ arm-openwrt-linux-gnueabi-pkg-config --libs vorbis
[03:44:27 CEST] <ytan> returns "-L/home/reccsi/Dev/build-arm/home/reccsi/Dev/build-arm/libvorbis/lib -lvorbis  "
[03:44:30 CEST] <furq> add --pkg-config="arm-openwrt-linux-gnueabi-pkg-config" to configure
[03:45:09 CEST] <furq> failing that, pastebin the output of config.log
[03:46:46 CEST] <ytan> no good
[03:46:53 CEST] <ytan> let me show you the config.log
[03:47:43 CEST] <furq> i take it you need to use that specific toolchain for openwrt instead of just using the generic ubuntu armhf toolchain
[03:49:41 CEST] <ytan> the http://pastebin.com/QipLqHYX
[03:51:08 CEST] <ytan> Yes it is. The toolchain is provided by the vendor.
[03:52:42 CEST] <furq> what does pkg-config --cflags vorbis return
[03:53:57 CEST] <ytan> -I/home/reccsi/Dev/build-arm/home/reccsi/Dev/build-arm/libvorbis/include -I/home/reccsi/Dev/build-arm/home/reccsi/Dev/build-arm/libogg/include
[03:54:03 CEST] <relaxed> force its hand with, --extra-cflags="-I/path/to/prefix/include" --extra-ldflags="-L/path/to/prefix/lib"
[03:55:26 CEST] <ytan> let me try
[03:57:21 CEST] <ytan> same error
[03:57:40 CEST] <relaxed> is vorbis/vorbisenc.h in libvorbis/include ?
[03:57:54 CEST] <`D`> well I finally managed to hunt down all the dev lib packages to build ffmpeg. had to leave x265 out since the version available is not compatible with ffmpeg git.
[03:59:08 CEST] <relaxed> `D`: check out x265 master
[03:59:10 CEST] <ytan> $ ls /home/reccsi/Dev/build-arm/libvorbis/include/vorbis codec.h  vorbisenc.h  vorbisfile.h
[03:59:24 CEST] <`D`> relaxed, nah I don't need it.
[03:59:26 CEST] <ytan> yea, the file is present
[04:00:58 CEST] <relaxed> ytan: pastebin the ./configure you just tried
[04:02:03 CEST] <ytan> 1 sec
[04:02:27 CEST] <ytan> http://pastebin.com/cHiUXhEE
[04:04:30 CEST] <relaxed> hmm, omit --pkg-config-flags=--static
[04:05:30 CEST] <relaxed> oh wait, we're getting somewhere
[04:05:57 CEST] <relaxed> now it can't find libogg - do the same for its include/lib dir
[04:06:06 CEST] <ytan> yea, I saw that
[04:06:27 CEST] <ytan> I think forcing with --extra-cflags helped
[04:06:42 CEST] <ytan> now I'm getting this:
[04:06:46 CEST] <ytan> ERROR: libvpx decoder version must be >=0.9.1
[04:07:29 CEST] <relaxed> you'll have to compile a more recent libvpx
[04:08:09 CEST] <ytan> holy--
[04:08:24 CEST] <ytan> i did the same thing, using --extra-cflags
[04:08:33 CEST] <ytan> now it configured successfully.
[04:08:45 CEST] <ytan> Still I wonder. What gives?
[04:09:16 CEST] <ytan> I wonder if my pkg-config is set up incorrectly.
[04:10:10 CEST] <relaxed> correct, it returned "-L/home/reccsi/Dev/build-arm/home/reccsi/Dev/build-arm/libvorbis/lib -lvorbis" which is wrong
[04:11:53 CEST] <ytan> oh...
[04:11:57 CEST] <ytan> I see it now
[04:12:16 CEST] <ytan> wow, what a blunder.
[04:13:44 CEST] <ytan> I couldn't see why it's causing that problem though.
[04:14:33 CEST] <furq> fwiw if you can use the generic toolchain you'll have a much easier time
[04:14:55 CEST] <ytan> I would if I could. :)
[04:15:05 CEST] <ytan> http://pastebin.com/sGman0u1 <== here is my vorbis.pc
[04:16:47 CEST] <ytan> and this is my arm-openwrt-linux-gnueabi-pkg-config script ==> http://pastebin.com/PsRjw7XM
[04:18:55 CEST] <relaxed> edit SYSROOT=/home/reccsi/Dev/build-arm  to  SYSROOT=""
[04:20:12 CEST] <ytan> I've simplified to this:
[04:20:14 CEST] <ytan> #!/bin/sh
[04:20:24 CEST] <ytan> SYSROOT=
[04:20:31 CEST] <ytan> export PKG_CONFIG_PATH=/home/reccsi/Dev/build-arm/pkgconfig
[04:20:37 CEST] <ytan> exec pkg-config "$@"
[04:25:07 CEST] <ytan> thank you relaxed and furq
[04:25:28 CEST] <ytan> please accept my kowtow to both of you
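For reference, roughly the configure invocation this thread converges on: the cross pkg-config wrapper passed via --pkg-config (furq's suggestion) plus the --extra-cflags/--extra-ldflags overrides (relaxed's suggestion). The install prefix is a placeholder, the libvorbis/libogg include paths are the ones from the pastes above, and the libogg lib directory is assumed to mirror the libvorbis layout:

    PKG_CONFIG_PATH=/home/reccsi/Dev/build-arm/pkgconfig \
    ./configure --prefix=/path/to/ffmpeg-armhf \
        --enable-cross-compile --target-os=linux --arch=armhf \
        --cross-prefix=arm-openwrt-linux-gnueabi- \
        --pkg-config=arm-openwrt-linux-gnueabi-pkg-config \
        --extra-cflags="-I/home/reccsi/Dev/build-arm/libvorbis/include -I/home/reccsi/Dev/build-arm/libogg/include" \
        --extra-ldflags="-L/home/reccsi/Dev/build-arm/libvorbis/lib -L/home/reccsi/Dev/build-arm/libogg/lib" \
        --enable-shared --enable-libvpx --enable-libvorbis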
[05:02:35 CEST] <Mista_D> Any way to set "-force_key_frames" to insert every 75 frames instead of seconds please?
[05:04:07 CEST] <Mista_D> it's like 2.502502502... seconds, so that doesn't work out exactly right.
[05:09:24 CEST] <`D`> libvpx is at 1.6.0?? is this stable enough to use?
[05:10:47 CEST] <furq> looks like it
[05:13:07 CEST] <`D`> it's odd that html5 browsers do not support vp9 profile 1 yuv422
[05:13:25 CEST] <`D`> but chrome does yuv444
[06:27:07 CEST] <c_14> Mista_D: force_key_frames expr:eq(mod(n,75),0)
[06:28:00 CEST] <Mista_D> c_14: Thank you!  I used -force_key_frames "expr:gte(n,n_forced*75)" --- seem to work too.
[06:28:38 CEST] <c_14> should do the same thing, ye
[07:00:33 CEST] <Mista_D> Can extra key frames be added please? "force_key_frames expr:eq(mod(n,75),0)" plus ones at 50sec and 60sec as well?
[07:05:27 CEST] <c_14> sure
[07:06:03 CEST] <c_14> force_key_frames expr:eq(mod(n,75),0)+eq(t,50)+eq(t,60)
[07:06:47 CEST] <c_14> using '+' in an expression is the equivalent of a logical OR and '*' is the equivalent of a logical AND
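c_14's expression quoted into a complete command, for reference; the input/output names are placeholders and the three OR'd terms are exactly the ones discussed above (every 75th frame, plus t=50 and t=60 seconds):

    ffmpeg -i input.mp4 -c:v libx264 \
           -force_key_frames "expr:eq(mod(n,75),0)+eq(t,50)+eq(t,60)" \
           output.mp4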
[07:11:20 CEST] <furq> i don't suppose there's a really fast way to count the number of i-frames in a file with ffprobe that i've missed
[07:13:08 CEST] <furq> as in, faster than -skip_frame nokey -count_frames
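The -skip_frame nokey -count_frames approach furq mentions, spelled out as a full ffprobe command for reference (the file name is a placeholder); it still has to walk the whole stream, which is why it isn't fast:

    ffprobe -v error -select_streams v:0 -skip_frame nokey -count_frames \
            -show_entries stream=nb_read_frames -of default=nokey=1:noprint_wrappers=1 input.mp4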
[10:15:21 CEST] <rossome> good ol' mr robot https://i.sli.mg/JkfWEM.jpg
[10:26:08 CEST] <Threads> rossome is it me or is that a bad command
[10:26:42 CEST] <rossome> with map?
[10:28:01 CEST] <Threads> yep
[10:28:07 CEST] <Threads> also output
[10:29:16 CEST] <rossome> yeah overwriting the input essentially
[10:29:46 CEST] <rossome> well after it runs she is prompted to overwrite the file
[10:31:04 CEST] <radia> Is there a command that automatically picks resolution, bitrate, video codec and audio codec to keep the webm file under 10 MB?
[10:33:27 CEST] <rossome> radia: got this from a quick google
[10:33:29 CEST] <rossome> http://stackoverflow.com/questions/29082422/ffmpeg-video-compression-specifc-file-size
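The arithmetic behind answers like the one linked, as a rough sketch: 10 MB is about 80 Mbit, so for, say, a 60 second clip the total budget is roughly 80000/60 ≈ 1333 kbit/s; with ~64 kbit/s of audio that leaves about 1250 kbit/s for video (the duration and rates here are assumed examples, and some headroom for container overhead is wise). A two-pass encode then targets that rate:

    ffmpeg -y -i input.mp4 -c:v libvpx -b:v 1250k -pass 1 -an -f null /dev/null
    ffmpeg -i input.mp4 -c:v libvpx -b:v 1250k -pass 2 -c:a libvorbis -b:a 64k output.webm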
[10:36:20 CEST] <_jason_> hello there
[10:36:48 CEST] <_jason_> how can i configure ffmpeg with --enable-opengl option?
[10:38:30 CEST] <_jason_> anyone?
[10:39:28 CEST] <whald> _jason_, i think "./configure --enable-opengl" might work
[10:40:21 CEST] <ozette> _jason_: ./configure --help | grep opengl
[10:40:48 CEST] <radia> thanks.
[12:30:06 CEST] <Milad264> Hi
[14:03:14 CEST] <JohnPreston72> Hi everyone. I am facing a few issues with FFMPeg which maybe you guys can help me with
[14:03:54 CEST] <JohnPreston72> I am running on an AWS G2 instance, which uses a GRID K520. I am running Amazon Linux with the NVIDIA drivers and CUDA provided by the AWS repos.
[14:04:10 CEST] <JohnPreston72> I have compiled FFMPEG from the release 3.1.3
[14:04:26 CEST] <JohnPreston72> compile passed just fine, but when I use ffmpeg it fails, as follows
[14:05:14 CEST] <JohnPreston72> https://www.irccloud.com/pastebin/AXeiu4PG/
[14:05:44 CEST] <JohnPreston72> I am quite a n00b with ffmpeg generally speaking, so if there is anything you guys can see as wrong, help would be very welcomed
[14:14:50 CEST] <ozette> i'm going to build an older version of ffmpeg and pray mp4 > m3u8 works
[14:15:52 CEST] <ozette> i don't see a quick way to trace the SIGILL problem with 3.1.2
[14:20:34 CEST] <DHE> rebuild with --disable-asm and see if it solves your problem. if so, you can report it as a bug. if not, your compiler flags need fixing
[14:20:42 CEST] <DHE> this applies to both ffmpeg and x264
[14:30:15 CEST] <ozette> will try that
[14:33:37 CEST] <DHE> could have sworn I suggested that before...
[14:34:33 CEST] <_jason_> hello
[14:34:47 CEST] <_jason_> im having issue when compiling ffmpeg
[14:35:28 CEST] <_jason_> I ran: $ configure --prefix=ffmpeg/ --disable-network --disable-debug --disable-yasm -- and it fails with: "gcc is unable to create an executable file. If gcc is a cross-compiler, use the --enable-cross-compile option. Only do this if you know what cross compiling means. C compiler test failed. If you think configure made a mistake, make sure you are using the latest version from Git. If the latest version fails, report the problem to the ffmpeg-user at ffmpeg.org"
[14:35:57 CEST] <_jason_> Include the log file "config.log" produced by configure as this will help solving the problem.
[14:36:02 CEST] <DHE> pastebin the contents of config.log
[14:36:07 CEST] <DHE> as indicated
[14:36:56 CEST] <_jason_> i don't have any config.log file
[14:37:12 CEST] <_jason_> i only have two files, config and Changelog
[14:38:19 CEST] <JohnPreston72> gentle bump (if you've seen it)
[14:39:16 CEST] <_jason_> found
[14:39:31 CEST] <_jason_> should i paste the whole content of config.log here?
[14:39:49 CEST] <DHE> no, pastebin of your choosing. I like fpaste
[14:40:05 CEST] <_jason_> i dont get you
[14:40:33 CEST] <JohnPreston72> _jason_: instead of copy-pasting the whole config.log here, fpaste it
[14:40:43 CEST] <JohnPreston72> https://paste.fedoraproject.org/
[14:41:27 CEST] <JohnPreston72> as asked earlier, anyone has any idea where this could come from  ?
[14:41:28 CEST] <JohnPreston72> https://www.irccloud.com/pastebin/AXeiu4PG/
[14:41:34 CEST] <JohnPreston72> I have been searching all night
[14:41:46 CEST] <_jason_> https://paste.fedoraproject.org/414309/47221527/
[14:45:08 CEST] <DHE> do pthreads exist on mingw?
[14:46:16 CEST] <DHE> actually it's odd because they were not requested in the compile commandline
[14:50:13 CEST] <JEEB> there are packages that provide pthreads on mingw
[14:50:21 CEST] <JEEB> but by default not included
[14:50:36 CEST] <JEEB> both x264 and FFmpeg have their own wrappers around windows threads
[14:50:51 CEST] <JEEB> so having pthreads is not required
[14:51:46 CEST] <AznGuy> Hi?
[14:52:13 CEST] <AznGuy> Does anyone have experience with recording an rtsp stream?
[14:52:31 CEST] <nonex86> +
[14:54:31 CEST] <_jason_> whats pthread?
[14:54:54 CEST] <_jason_> how can i get through this issue guys :(
[14:55:28 CEST] <AznGuy> Currently working on a story. When the recording is stopped by hand the file seems correct. But when the stream is disconnected due to a network failure, can that cause the file to be corrupted or end up missing metadata (e.g. the duration)?
[14:56:13 CEST] <AznGuy> (using ffmpeg v2.2.2)
[14:57:25 CEST] <nonex86> AznGuy: are you talking about the cli? or do you use the ffmpeg muxer api directly?
[14:57:58 CEST] <AznGuy> ffmpeg api directly
[14:58:16 CEST] <AznGuy> ffmpeg ${RTSP_TRANSPORT_STRING} -y -i ${INPUT_STREAM} ${QUALITY_STRING} -an -f flv ${OUTPUT_FILE}
[14:58:24 CEST] <AznGuy> for instance
[14:59:05 CEST] <jkqxz> _jason_:  Your compiler doesn't work - it tried to compile the simplest possible program and that didn't work, so configure gave up immediately.  (Somehow it is trying to link with pthreads but can't, but that failure has nothing to do with ffmpeg.)
[14:59:27 CEST] <ozette> well, --disable-asm still results in SIGILL when trying to transcode mp4 > m3u8
[14:59:40 CEST] <nonex86> AznGuy, you talk about cli :)
[14:59:59 CEST] <AznGuy> ohw..  :(
[15:00:20 CEST] <AznGuy> i guess i'm far from being an expert
[15:01:50 CEST] <AznGuy> Point is: I'm maintaining some piece of video software and I'm not so sure what can cause this bug I described earlier..
[15:02:25 CEST] <_jason_> jkqxz: What should i do now?
[15:05:05 CEST] <_jason_> anyone?
[15:05:44 CEST] <_jason_> i'm having this issue when compiling ffmpeg. https://paste.fedoraproject.org/414309/47221527/
[15:05:45 CEST] <jkqxz> _jason_:  Get a working compiler.
[15:06:15 CEST] <_jason_> jkqxz: you mean mingw?
[15:07:06 CEST] <AznGuy> _jason_: I think you need to understand compilers first.. how things link ...
[15:07:30 CEST] <nonex86> _jason_ are you building your code for windows?
[15:07:33 CEST] <_jason_> yes
[15:07:48 CEST] <nonex86> well, then why dont you use msvc instead?
[15:08:18 CEST] <_jason_> was trying to follow this video : https://www.youtube.com/watch?v=3yhkX0uaQGk
[15:08:41 CEST] <nonex86> you have a problem with building or at configure steps?
[15:09:43 CEST] <_jason_> configure steps
[15:10:17 CEST] <nonex86> yeah, i already checked the config.log you provided
[15:10:31 CEST] <nonex86> wait a second, ill check a configure script
[15:10:39 CEST] <_jason_> my goal is to enabl opengl
[15:10:44 CEST] <_jason_> enable*
[15:10:54 CEST] <_jason_> ok
[15:11:16 CEST] <nonex86> the last config.log messages are related to pthreads, aren't they?
[15:11:17 CEST] <_jason_> by default ffmpeg configure does not configure opengl.
[15:11:27 CEST] <nonex86> so why dont you just disable pthreads
[15:11:36 CEST] <_jason_> umm
[15:11:40 CEST] <_jason_> --disable-pthreads?
[15:12:05 CEST] <nonex86> yes
[15:12:12 CEST] <_jason_> let me try
[15:12:23 CEST] <nonex86> as you're building for windows, w32threads should be enough
[15:12:36 CEST] <nonex86> also, i cant understand why you get this error
[15:12:41 CEST] <handbrake-learne> hello
[15:12:43 CEST] <_jason_> why?
[15:12:46 CEST] <nonex86> i've built ffmpeg on windows several times
[15:12:52 CEST] <_jason_> i see
[15:12:58 CEST] <nonex86> and everything was fine
[15:13:05 CEST] <nonex86> you dont need pthreads on windows at all
[15:13:06 CEST] <_jason_> whats your ffmpeg version
[15:13:16 CEST] <handbrake-learne> im using : ffmpeg -i input.mkv -vcodec copy -acodec copy output.mp4  to converts files... is there a ffmpeg gui interface i can use/
[15:13:16 CEST] <_jason_> yeah getting same error
[15:13:17 CEST] <nonex86> last version i built was 3.0.0
[15:13:28 CEST] <_jason_> i'm using 3.1.3, the latest version
[15:13:54 CEST] <_jason_> whats your mingw version
[15:13:57 CEST] <nonex86> one more thing on your error
[15:14:17 CEST] <nonex86> configure creates a test program
[15:14:25 CEST] <nonex86> and tries to compile it
[15:14:30 CEST] <nonex86> in your case with gcc
[15:14:42 CEST] <nonex86> again, why dont you give a try to msvc?
[15:14:52 CEST] <_jason_> ok let me try msvc
[15:14:53 CEST] <nonex86> or just add pthreads support
[15:14:57 CEST] <nonex86> to your mingw
[15:15:06 CEST] <nonex86> for configure i used msys2
[15:15:11 CEST] <_jason_> ok i'll get back to you when checking boths these things
[15:15:30 CEST] <_jason_> i'm using msys also
[15:15:35 CEST] <_jason_> which comes with mingw
[15:16:00 CEST] <nonex86> for gcc my guess is you just need to install some pthread devel packages
[15:16:20 CEST] <nonex86> into your msys
[15:17:00 CEST] <JEEB> why would you want to use pthreads with mingw-w64 (32bit or 64bit), though?
[15:17:10 CEST] <JEEB> neither FFmpeg nor most of the sane libraries require it
[15:17:58 CEST] <_jason_> JEEB: can you please tell me how to resolve this issue : https://paste.fedoraproject.org/414309/47221527/
[15:19:04 CEST] <JEEB> > --disable-yasm
[15:19:09 CEST] <JEEB> you have no idea wtf you're doing
[15:19:40 CEST] <JEEB> also why do you need opengl?
[15:19:48 CEST] <JEEB> just out of interest
[15:20:47 CEST] <kdehl> So um, is it reliable to assume that the first NAL unit of a H.264 stream is the SPS and hence contains the width and height of the video stream?
[15:21:04 CEST] <JEEB> no
[15:21:07 CEST] <kdehl> Okay.
[15:21:19 CEST] <JEEB> there can be multiple parameter sets of different types or SEIs or whatever
[15:21:35 CEST] <JEEB> you can only assume those kind of things when you control the full chain yourself and make sure you never get anything else than that first
[15:22:16 CEST] <JEEB> kdehl: I'd say your system is borked
[15:22:22 CEST] <nonex86> kdehl, depends
[15:22:24 CEST] <JEEB> also you should not use ye olde msys or mingw
[15:22:29 CEST] <_jason_> JEEB: My friend, i want to enable opengl so that i can capture the opengl texture from screen-capture-recorder. I can't capture full screen mode in games.
[15:22:39 CEST] <JEEB> uhh
[15:22:45 CEST] <JEEB> that's not what the opengl thing does
[15:22:51 CEST] <nonex86> kdehl: only if you know the source of the stream can you make assumptions about it
[15:23:01 CEST] <JEEB> also sorry kdehl I mis-autotab'd :P
[15:23:19 CEST] <JEEB> _jason_: your toolchain is mis-installed + you shouldn't use ye olde msys or mingw
[15:23:29 CEST] <_jason_> JEEB: that's not what the opengl thing does. Can you explain?
[15:23:42 CEST] <_jason_> ok fine
[15:24:05 CEST] <JEEB> the "opengl" feature in FFmpeg is a very crappy output "device" which IMHO should have not been merged
[15:24:13 CEST] <JEEB> it doesn't help you capture. at all
[15:24:47 CEST] <_jason_> what it does then lol
[15:24:54 CEST] <BtbN> ffmpeg cannot capture hardware-accelerated windows, let alone exclusive fullscreen games.
[15:25:01 CEST] <BtbN> Use OBS.
[15:25:33 CEST] <JEEB> _jason_: I just explained, it's an "output device" because someone cried enough and didn't want to maintain their thing separately. even though it doesn't really fit the library
[15:25:55 CEST] <JEEB> _jason_: installing the 32bit or 64bit mingw-w64 toolchain from https://msys2.github.io/ probably is the least insane thing (also don't forget yasm because you definitely want those hand-written assembly optimizations :P)
[15:26:15 CEST] <_jason_> <BtbN> : Yes i will try it later its in my todo list.
[15:26:31 CEST] <BtbN> Trying thing that won't work first seems not too efficient.
[15:26:55 CEST] <nonex86> _jason_ and Rabeet are the same guy...
[15:26:56 CEST] <kdehl> nonex86: Right.
[15:27:00 CEST] <kdehl> JEEB: I understand.
[15:27:21 CEST] <nonex86> a few days ago he already asked about capturing the opengl/directx surfaces afair
[15:27:22 CEST] <kdehl> I just realized that I can just wait until I have a full frame available.
[15:27:50 CEST] <kdehl> Then I most certainly know about the width and height of the stream.
[15:27:59 CEST] <nonex86> kdehl, usually you don't have to, for example, if the source is a file container
[15:28:12 CEST] <_jason_> So its of no use enabling opengl in ffmpeg? it will not solve my full screen issue right?
[15:28:27 CEST] <nonex86> kdehl, or, if the live source is rtsp source with sps/pps in sdp :)
[15:29:03 CEST] <nonex86> sometimes you can easily get frame dimensions before decoding first frame, sometimes not :)
[15:29:15 CEST] <kdehl> Right.
[15:29:58 CEST] <nonex86> actually, you can always get the frame dimensions before decoding the 1st frame in h264 :) just find the sps/pps :)
[15:35:04 CEST] <kdehl> Yeah. I'm pondering what to do... I'm using OpenH264 (as you know), but it doesn't seem to have a function to find the video resolution.
[15:35:26 CEST] <kdehl> Wonder if there's any point writing a parser myself for that.
[15:35:42 CEST] <nonex86> depends :)
[15:35:49 CEST] <kdehl> It's a fun exercise.
[15:35:52 CEST] <kdehl> I guess.
[15:35:53 CEST] <nonex86> what do you want and what you need :)
[15:36:03 CEST] <nonex86> its an easy task
[15:36:14 CEST] <nonex86> to write golomb/bit stream parser
[15:36:23 CEST] <kdehl> Yeah. The other option is that I can just start decoding until I have a frame available, then get the size out of that.
[15:36:28 CEST] <nonex86> sure :)
[15:36:41 CEST] <nonex86> thats why i like this word - "depends" :)
[15:36:50 CEST] <nonex86> you almost always have a choice :)
[15:36:56 CEST] <kdehl> But the point is that I want to allocate a buffer for the decoding, and so I kinda need to know the resolution before I start decoding!
[15:37:23 CEST] <kdehl> Uh. Wonder how the test util does it. Haven't even thought of checking that.
[15:37:44 CEST] <kdehl> Right. It just writes to a file.
[15:37:48 CEST] <kdehl> Cheaters.
[15:38:16 CEST] <nonex86> are you sure openh264 doesn't give you any way to get the frame information after decoding the sps/pps nalu?
[15:38:38 CEST] <kdehl> No.
[15:38:47 CEST] <nonex86> and you need to supply it
[15:38:51 CEST] <nonex86> preallocated buffer?
[15:38:55 CEST] <nonex86> to get a frame?
[15:39:37 CEST] <kdehl> No, it allocates separate Y, U and V buffers for me.
[15:39:56 CEST] <nonex86> ok, then no problem exists
[15:40:07 CEST] <nonex86> you got your frame data
[15:41:02 CEST] <kdehl> Yeah, I realized this has more to do with how I designed the decoding on a higher level.
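On the ffmpeg side of the SPS discussion above, ffprobe can report the dimensions parsed from the parameter sets without decoding a whole picture; a sketch with a placeholder file name (it should work on a raw .h264 elementary stream as well as on containers):

    ffprobe -v error -select_streams v:0 -show_entries stream=width,height -of csv=p=0 input.h264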
[16:21:27 CEST] <Spring> if I have say 130 images that I'm encoding to a GIF, and I set -framerate 15 and -r 15 is that using only the 130 images or adding inter-frames?
[16:36:54 CEST] <ChocolateArmpits> Spring: that shouldn't do anything to the images
[16:37:19 CEST] <Spring> ChocolateArmpits, by that you mean it will only use those images?
[16:38:47 CEST] <ChocolateArmpits> Spring: Most likely. As far as I'm aware, because 15fps isn't possible with the gif timebase it should get rounded to 14.2fps or 16.6fps depending on how this is implemented
[16:40:29 CEST] <ChocolateArmpits> gif timebase works in increments of 0.01 second and that's the smallest single frame duration possible, 1 second is longest
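The rounding ChocolateArmpits describes, worked through: a GIF frame delay is an integer count of 1/100 s ticks, so 15 fps would need 100/15 ≈ 6.67 ticks per frame; rounding up to 7 ticks gives 100/7 ≈ 14.3 fps, and rounding down to 6 ticks gives 100/6 ≈ 16.7 fps.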
[16:43:28 CEST] <Spring> does ffmpeg use any advanced GIF frame blending techniques to optimize filesizes?
[16:43:50 CEST] <Spring> such as only encoding the moving section of a frame
[16:45:00 CEST] <ChocolateArmpits> Have no idea, options are limited to looping and delay between loops
[16:45:28 CEST] <c_14> Spring: I'm pretty sure it doesn't.
[16:45:43 CEST] <Spring> hmm, know of any other encoders I could use?
[16:46:58 CEST] <iive> c_14: why do you think so?
[16:48:22 CEST] <c_14> Oh, nvmd I think it might.
[16:48:40 CEST] <iive> there is nothing advanced in gif, afair it has only a 1 bit alpha channel, and each frame just writes over the previous.
[16:48:41 CEST] <c_14> iive: I said that because libavcodec encoders by default don't have reference to the last frame so they can't do comparisons like that.
[16:48:51 CEST] <c_14> But the gif encoder actually stores the last frame in its private data
[16:51:12 CEST] <c_14> Though the comparison is pretty basic, it only checks if the outer columns/rows are the same as the last frame and if so crops those rows/columns
[16:51:32 CEST] <t4nk010> I am getting "*** glibc detected *** ffmpeg corrupted double-linked list"
[16:51:42 CEST] <t4nk010> if anyone knows how to debug this
[16:51:53 CEST] <t4nk010> I have tried gdb and no hints in that
[16:53:44 CEST] <iive> there is actually variable honor_transparancy that seems to be controlled by flags
[16:54:08 CEST] <iive> yep
[16:54:50 CEST] <iive> there is option "gifflags" that takes arguments "offsetting" and "transdiff"
[16:57:57 CEST] <iive> and if i guess right, they should be enabled by default.
[17:01:02 CEST] <Spring> optimizations like the first one mentioned here, http://blog.bahraniapps.com/gifcam/
[17:01:44 CEST] <Spring> the Pooh bear example. Gifcam has terrible quality output but I was wondering if ffmpeg can do that same thing
[17:02:08 CEST] <Vollmer> With ffmpeg 3.1.2 attempting to encode a ppm stream when I specify a framerate for the input stream (from stdin) I get - http://pastebin.ca/3705623
[17:02:22 CEST] <Vollmer> (pastebin also includes ffmpeg command used)
[17:03:50 CEST] <iive> Spring: yes, ffmpeg supports that, it uses one of the palette colors for transparency
[17:04:11 CEST] <iive> and ffmpeg has a filter to create an optimized palette.
[17:06:39 CEST] <Vollmer> ooh for context the ppm is coming from gource using -o -
[17:13:14 CEST] <Spring> man, gifflags is buried in the results
[17:14:25 CEST] <Spring> doesn't seem to help with filesize however :/
[17:17:02 CEST] <c_14> Spring: probably because it's already on by default, you can try disabling it to see if it's doing anything for you. Also, as I mentioned before the algorithm it uses is pretty basic
[17:17:46 CEST] <Spring> I will say that palette generation wise ffmpeg is very good
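The palette filters iive mentions, written out as the usual two-step for reference; the file names, fps and scale values are placeholder choices, and the gifflags optimizations discussed above apply on top of this:

    ffmpeg -i input.mp4 -vf "fps=15,scale=480:-1:flags=lanczos,palettegen" palette.png
    ffmpeg -i input.mp4 -i palette.png -lavfi "fps=15,scale=480:-1:flags=lanczos [x]; [x][1:v] paletteuse" output.gif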
[17:37:28 CEST] <ozette> how do you guys say ffmpeg? f-f-m-peg? fam-peg? ff-m-p-e-g ?
[17:38:47 CEST] <ozette> or don't you say it at all.. write it down and show it to someone interested?
[17:39:36 CEST] <drv> the first one is what I would guess is most common
[17:41:00 CEST] <ozette> hmm
[17:42:17 CEST] <ozette> i think it's tiresome to say f-f-m-peg all the time in conversation
[17:46:01 CEST] <transhuman_> hi, with ffmpeg I am using the following syntax: ffmpeg -f x11grab -s 640x480 -i :0.0+10,20 -vf format=pix_fmts=yuv420p -f v4l2 /dev/video2 & and I am getting the following: Invalid MIT-MAGIC-COOKIE-1 key [x11grab @ 0x25401c0] Could not open X display. :0.0+10,20: Input/output error
[17:47:09 CEST] <transhuman_> do i need to enable xhost + and X11Forwarding yes in /etc/ssh/ssh_config, or does this have nothing to do with it since it's not ssh?
[17:47:38 CEST] <c_14> transhuman_: there's something wrong with your Xauthority
[17:47:53 CEST] <c_14> Is this running on a local machine?
[17:47:58 CEST] <transhuman_> yes
[17:48:08 CEST] <transhuman_> but want to do it on remote machines too
[17:48:09 CEST] <c_14> Does the user running the command own the X session?
[17:48:35 CEST] <transhuman_> using sudo as the user so yes I would say so
[17:49:01 CEST] <c_14> try something like xhost +local:username
[17:49:09 CEST] <transhuman_> ok let me try thanks
[17:50:14 CEST] <transhuman_> same thing c_14
[17:50:25 CEST] <transhuman_> should I log out and try again
[17:50:44 CEST] <c_14> You said you're running this through sudo? sudo might be messing up your xauth stuff
[17:51:04 CEST] <transhuman_> ah but i need sudo in order to direct it to /dev/video0 correct?
[17:51:17 CEST] <c_14> In most cases no
[17:51:25 CEST] <transhuman_> oh ok let me try then
[17:51:36 CEST] <c_14> If you have standard unix permissions you just have to be in the "video" group
[17:51:55 CEST] <c_14> If you have polkit/logind it may or may not do magic
[17:52:55 CEST] <transhuman_> ok i added the user to the video group; i'll have to log out, as far as I am aware, to get it to take effect, correct?
[17:53:01 CEST] <c_14> yes
[17:53:15 CEST] <transhuman_> ok be back if that doesnt work ...if it does thanks in advance
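A sketch of the same capture run as the X session owner without sudo, once the user is in the video group; the display, offset, size and loopback device are the values from the command above:

    ffmpeg -f x11grab -s 640x480 -i :0.0+10,20 -vf format=pix_fmts=yuv420p -f v4l2 /dev/video2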
[18:51:07 CEST] <Mista_D> Can extra key frames be added to a preset interval please? "-force_key_frames expr:eq(mod(n,75),0)" + @50sec ??
[18:57:50 CEST] <izacarias> Hi!
[18:58:17 CEST] <c_14> force_key_frames expr:eq(mod(n,75),0)+eq(t,50)+eq(t,60) <- Mista_D
[19:01:06 CEST] <izacarias> I'm interested in extracting information about QoE from ffmpeg... like initialization time, number of video stops, sum of video downtimes...
[19:02:00 CEST] <izacarias> Does anyone know if there is an easy way to do it?
[19:24:12 CEST] <DHE> that doesn't sound like something ffmpeg does directly. it doesn't typically serve users directly, it just feeds into a server like nginx-rtmp, which would be better suited to that
[19:27:13 CEST] <izacarias> :-/ maybe I could modify the logging mechanism to write log entries when certain events occur (buffer underrun etc..)
[19:47:44 CEST] <Sashmo> Can anyone suggest how to find slates or frozen images in files using FFmpeg?  I can do it for black screens just fine, but what about non-black or frozen frames?
[19:58:21 CEST] <durandal_1707> Sashmo: with tblend
[19:59:09 CEST] <Sashmo> ok let me check it thanks durandal_1707
[20:00:31 CEST] <durandal_1707> mode difference
[20:00:51 CEST] <durandal_1707> and then use blackdetect
[20:01:11 CEST] <Sashmo> yeah reading that now..... I guess I can make some logic... if X frames = 0 difference
[20:01:24 CEST] <Sashmo> how would you use blackdetect with that?
[20:01:27 CEST] <durandal_1707> you may adjust colors with lutyuv
[20:02:07 CEST] <Sashmo> cool durandal_1707 thanks I will experiment
[20:02:09 CEST] <durandal_1707> you try with tblend to get black color for stills
[20:02:33 CEST] <durandal_1707> and need high percent
[20:02:43 CEST] <durandal_1707> very high
[20:03:15 CEST] <durandal_1707> alternatively use lut2 filter
[20:04:24 CEST] <Sashmo> alright I will experiment to see what I can do
[20:04:33 CEST] <Sashmo> looks like wayyyy over my head though
[20:05:24 CEST] <Sashmo> I guess I could use scene detection too
[20:05:39 CEST] <Sashmo> but it will just give me the times of the detection
[20:05:54 CEST] <durandal_1707> Sashmo: c0 mode is difference and c1, c2 mode is difference128
[20:07:21 CEST] <Sashmo> what is difference128
[20:08:35 CEST] <durandal_17> Sashmo:  ffmpeg.exe -i VIDEO -vf tblend=c0_mode=difference:c1_mode=difference128:c2_mode=difference128,blackdetect -f null -
[20:10:14 CEST] <Sashmo> I get it
[20:10:23 CEST] <Sashmo> ok ill try that
[20:10:28 CEST] <Sashmo> thanks!
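durandal_1707's tblend + blackdetect idea with explicit blackdetect thresholds, as a sketch; the duration and threshold values are assumed starting points to tune, not known-good settings:

    ffmpeg -i input.mp4 -vf "tblend=c0_mode=difference:c1_mode=difference128:c2_mode=difference128,blackdetect=d=2:pic_th=0.98:pix_th=0.10" -f null -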
[21:14:39 CEST] <ultrav1olet> How can I embed a wav file into an mkv video?
[21:15:21 CEST] <ultrav1olet> ffmpeg -i audio.wav -c:a copy out.mkv produces: [mp4 @ 0x82582e0] Tag [1][0][0][0]/0x00000001 incompatible with output codec id '65536' ([0][0][0][0])
[21:15:31 CEST] <ultrav1olet> and "Could not write header for output file #0 (incorrect codec parameters ?): Invalid data found when processing input"
[21:15:46 CEST] <Threads> ffmpeg -i audio.wav -acodec copy output.mka
[21:16:03 CEST] <Threads> mkv = video
[21:16:07 CEST] <Threads> mka = audio
[21:16:55 CEST] <ultrav1olet> OK, here's the full command line: ffmpeg -loop 1 -i image.png -i audio.wav -c:a copy -c:v libx264 -vf fps=25 -pix_fmt yuv420p out.mp4
[21:17:17 CEST] <ultrav1olet> if I remove "-c:a copy" it works
[21:17:23 CEST] <ultrav1olet> but I want PCM/wav audio
[21:18:10 CEST] <Threads> ohh you want it encoded with the video
[21:18:59 CEST] <ultrav1olet> out.avi worked
[21:19:10 CEST] <ultrav1olet> looks like Matroska isn't the best container
[21:19:12 CEST] <ultrav1olet> lol
[21:19:20 CEST] <ultrav1olet> I thought it's the most versatile
[21:22:55 CEST] <furq> but you're not using matroska
[21:23:57 CEST] <fritsch> ultrav1olet: check the matroska spec ...
[21:24:13 CEST] <furq> do you mean the part where it says "the extension isn't mp4"
[21:24:17 CEST] <ultrav1olet> furq: what am I using then?
[21:24:23 CEST] <furq> you're using mp4
[21:24:23 CEST] <ultrav1olet> omg
[21:24:28 CEST] <fritsch> :-)
[21:24:29 CEST] <ultrav1olet> sorry
[21:24:47 CEST] <ultrav1olet> haven't slept enough today
[21:24:53 CEST] <ultrav1olet> thanks )))))
[21:26:12 CEST] <ultrav1olet> "-loop 1" loops forever even after my wav file has been fully read. How can I force ffmpeg to output a file that matches the length of my wav file?
[21:26:19 CEST] <furq> -shortest
[21:26:27 CEST] <ultrav1olet> thanks!
[21:26:52 CEST] <ultrav1olet> furq: where does it go?
[21:26:58 CEST] <ultrav1olet> before all "-i"'s?
[21:27:05 CEST] <ultrav1olet> or after them?
[21:27:09 CEST] <furq> it's an output option
[21:29:15 CEST] <ultrav1olet> thanks again
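For the record, Matroska does accept PCM audio, so the original idea works when the output really is .mkv; a sketch assuming the WAV holds plain PCM (file names are placeholders):

    ffmpeg -loop 1 -i image.png -i audio.wav -c:v libx264 -vf fps=25 -pix_fmt yuv420p -c:a copy -shortest output.mkv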
[21:34:08 CEST] <ultrav1olet> I'm encoding a soundtrack with a poster for youtube - I wonder if it'll accept keyint=3000 for my video :-D
[21:34:37 CEST] <furq> -vf fps=6
[21:34:51 CEST] <furq> you could use 1 but youtube won't go any lower than 6 anyway
[21:35:01 CEST] <ultrav1olet> furq: you're sure about 6? ;-)
[21:35:25 CEST] <furq> i used to get bad desyncs with 1, idk if that was fixed
[21:35:36 CEST] <furq> but the lowest fps youtube will encode to is 6
[21:35:43 CEST] <furq> so you might as well do the same
[21:35:50 CEST] <ultrav1olet> I set fps 10 just to be sure
[21:36:26 CEST] <furq> "desyncs" meaning the total file length was a few seconds too long, since obviously you can't desync a static image
[21:36:36 CEST] <ultrav1olet> lol
[21:37:12 CEST] <ultrav1olet> the resulting video bitrate almost matches the source audio bitrate ;-)
[21:37:21 CEST] <ultrav1olet> keyint for static videos rules
[21:37:46 CEST] <ultrav1olet> seeking will be broken completely though
[21:37:55 CEST] <furq> youtube will reencode it anyway
[21:39:01 CEST] <ultrav1olet> I know
[21:45:27 CEST] <ultrav1olet> something is very wrong, darn
[21:45:53 CEST] <ultrav1olet> tCfLNEnWsHA around 12 seconds from the beginning
[21:47:21 CEST] <ultrav1olet> only at fullHD
[21:47:31 CEST] <ultrav1olet> Is it just my PC?
[21:48:00 CEST] <furq> if it only happens at one quality setting then it's probably not worth worrying about
[21:48:19 CEST] <ultrav1olet> furq: only with Google Chrome - weird
[21:48:29 CEST] <ultrav1olet> Firefox plays this video just fine
[21:48:40 CEST] <furq> cache refresh
[21:49:12 CEST] <ultrav1olet> wow, Firefox and Chrome show totally different colors
[21:49:18 CEST] <ultrav1olet> Can anyone confirm?
[21:49:46 CEST] <ultrav1olet> Firefox shows true colors
[21:51:28 CEST] <JEEB> generally if you want "reference colors", use mpv to test and make sure you're using the opengl renderer
[21:51:52 CEST] <JEEB> that in 99% of all cases barring your drivers failing at life should give you the closest to "reference look" you can get in many apps
[21:51:55 CEST] <ultrav1olet> JEEB: under mpv/mplayer everything is fine
[21:52:07 CEST] <JEEB> mplayer is older and different
[21:52:17 CEST] <JEEB> latest mpv is the thing that you can *generally* trust
[21:52:22 CEST] <ultrav1olet> Only Google Chrome botches everything
[21:52:38 CEST] <JEEB> just saying that make sure you have a semi-sane reference thing instead of a browser
[21:52:44 CEST] <ultrav1olet> I wonder if there's a bugzilla for youtube
[21:52:55 CEST] <ultrav1olet> I have a source PNG file :)
[21:53:01 CEST] <furq> echo "it doesn't work" > /dev/null
[21:53:24 CEST] <ultrav1olet> http://imgur.com/a/0e3Kd
[21:53:30 CEST] <JEEB> ultrav1olet: do note that ffmpeg can mangle it in various ways with swscale. if it doesn't, great
[21:53:39 CEST] <ultrav1olet> Now look at what youtube shows under Google Chrome
[21:53:41 CEST] <JEEB> which is why I would pretty much always check with mpv your YCbCr video's output
[21:53:43 CEST] <ultrav1olet> day and night, lol
[22:36:20 CEST] <Mista_D> c_14: Thanks for the forced key frames advice.
[23:54:28 CEST] <wallbroken> guys
[23:54:56 CEST] <wallbroken> i need to extract a piece of video
[23:54:58 CEST] <wallbroken> is it possible?
[23:57:46 CEST] <iive> wallbroken: usually. check the -ss -t -to and -c copy
[23:58:49 CEST] <wallbroken> iive, it must be the range 24:23 - 25:11
[23:58:55 CEST] <wallbroken> how to set?
[23:59:06 CEST] <wallbroken> the format is mp4
[23:59:48 CEST] <iive>  -ss 24:23 -to 25:11 -c copy
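iive's flags assembled into one command for the 24:23-25:11 range (file names are placeholders); note that with -c copy the cut can only begin on a keyframe, so the edges may be slightly approximate:

    ffmpeg -i input.mp4 -ss 24:23 -to 25:11 -c copy cut.mp4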
[00:00:00 CEST] --- Sat Aug 27 2016


More information about the Ffmpeg-devel-irc mailing list