[Ffmpeg-devel-irc] ffmpeg.log.20160213

burek burek021 at gmail.com
Sun Feb 14 02:05:01 CET 2016


[00:00:08 CET] <J_Darnley> Let ffmpeg detect your cpu features.
[00:00:14 CET] <J_Darnley> It knows better than you.
[00:00:34 CET] <J_Darnley> For example it knows intels have mmx, sse, avx, not the arm crap.
[00:01:18 CET] <J_Darnley> Back to your first question: if it was x86 then I would immediately say "64 bit is faster"
[00:17:56 CET] <mattf000> thanks.. but back to the 2nd one quick... it's not 100% clear to me that ffmpeg knows it is sending this video to an older android device
[00:18:48 CET] <mattf000> of course it always notes the cpu flags it leverages when i run the program ... i'm just not sure if that is more about helping transcode faster ... but would it negatively impact the decoding time on the droid?
[00:18:53 CET] <J_Darnley> So what?
[00:19:02 CET] <mattf000> i only ask because i'm trying to trim latency right now... i'm at about 180
[00:19:04 CET] <mattf000> ms
[00:19:48 CET] <J_Darnley> I don't understand your thought process at all.
[00:20:49 CET] <J_Darnley> The hardware that matters is the one that is running ffmpeg.
[00:21:20 CET] <J_Darnley> The decoder could be a set-top box, bluray player, mobile phone, desktop computer.
[00:21:32 CET] <J_Darnley> ffmpeg doesn't know and doesn't care
[00:21:33 CET] <mattf000> i guess i just have more confidence in the quad core intel to do its job
[00:21:48 CET] <mattf000> i'm more worried about the armv7a android being used by a surgeon
[00:22:19 CET] <mattf000> obviously any latency is an issue in this application
[00:22:54 CET] <mattf000> the server will be in the room with him but it's going over 2.4GHz 802.11n
[00:23:27 CET] <J_Darnley> I am still lost.
[00:23:28 CET] <mattf000> it wasn't hard getting to sub 200ms but getting it lower now is way harder than i imagined
[00:23:40 CET] <J_Darnley> Are you running ffmpeg on the mobile device or the server?
[00:23:58 CET] <t4nk533> Hi. I can't find anywhere how to extract one image per FIELD from an interlaced video stream. Normally I get a single image per frame, with the two fields merged (at least when no deinterlacing is used)
[00:24:09 CET] <mattf000> server .. i just didn't know if there were ways to make the decoding easier on the client
[00:24:23 CET] <J_Darnley> Are you using libx264?
[00:24:31 CET] <mattf000> on the server but not the client
[00:24:43 CET] <J_Darnley> (of course.  it is not a decoder)
[00:24:43 CET] <mattf000> client has hw decoding tho via flash
[00:25:16 CET] <J_Darnley> I will point you to the fastdecode and zerolatency tunings
[00:26:00 CET] <mattf000> yea.. got that
[00:26:34 CET] <mattf000> -tune zerolatency,fastdecode
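A minimal sketch of combining both tunings, assuming libx264 output pushed over RTMP as in this discussion; the input, preset, GOP length, bitrate and URL are illustrative placeholders, not values recommended in the channel:

    ffmpeg -i INPUT \
        -c:v libx264 -preset ultrafast -tune zerolatency,fastdecode \
        -g 30 -b:v 2M -f flv rtmp://server/live/stream

x264 accepts several non-psy tunings separated by commas, so zerolatency and fastdecode can be applied together.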
[00:26:36 CET] <furq> t4nk533: -vf separatefields
[00:26:58 CET] <furq> https://ffmpeg.org/ffmpeg-filters.html#separatefields
[00:27:39 CET] <mattf000> this would be better if i were using udp ... is there a reliable way for libx264 to tell a client that it needs to throw out frames that are falling behind?
[00:27:39 CET] <t4nk533> Thanks furq. Will that give me images of half height?
[00:27:44 CET] <furq> yes
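A sketch of that approach, assuming an interlaced input and PNG output; the file names are placeholders:

    ffmpeg -i interlaced.ts -vf separatefields fields_%05d.png

Each output image is one field, so it comes out at half the frame height (e.g. 1920x540 from 1080i material).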
[00:28:43 CET] <furq> mattf000: using udp over wifi sounds like a bad idea
[00:29:01 CET] <furq> especially if surgery is involved
[00:29:11 CET] <J_Darnley> yes libx264 has features for that
[00:29:23 CET] <J_Darnley> I don't think they are exposed through libavcodec though
[00:29:25 CET] <mattf000> do you recommend a transport over RTMP? (which is what i'm using)
[00:29:58 CET] <furq> i recommend it for streaming videos over the internet, which is about where my knowledge ends
[00:30:32 CET] <derekprestegard> hey guys - any advice on converting openexr files into an 4:2:0 8bpc video?
[00:30:52 CET] <J_Darnley> ffmpeg -i INPUT OUTPUT?
[00:30:53 CET] <derekprestegard> simple -i file.exr -pix_fmt yuv420p out.mp4 works, but the image is extremely dark
[00:31:05 CET] <mattf000> i'm basically going from a capture card to smart glasses over wifi
[00:31:31 CET] <J_Darnley> derekprestegard: perhaps you need to correct gamma or something
[00:31:32 CET] <mattf000> the glasses are locked down on 4.0.4 tho
[00:31:34 CET] <derekprestegard> the openexr format is huge - either 16 or 32 bpc, so I probably need to somehow tell ffmpeg how to do the conversion
[00:31:39 CET] <derekprestegard> J_Darnley: yeah thats what I was thinking
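One hedged way to attack the gamma problem, assuming the darkness comes from EXR's linear-light samples being converted straight to yuv420p: ffmpeg's EXR decoder exposes an apply_trc input option that bakes a transfer curve in before the pixel format conversion. File names and the choice of curve below are placeholders:

    ffmpeg -apply_trc iec61966_2_1 -i file.exr -pix_fmt yuv420p out.mp4

iec61966_2_1 is the sRGB curve; gamma22 or bt709 may be closer depending on how the EXRs were mastered. If the option is missing in a given build, a rough gamma lift with the eq filter (e.g. -vf eq=gamma=2.2) is a cruder fallback.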
[00:32:02 CET] <t4nk533> Why are the fields merged by default though? As far as I know, an interlaced video stream is basically encoded in the same way as a progressive one, the fields are handled as frames with half height?
[00:34:56 CET] <t4nk533> Or not? furq?
[00:35:43 CET] <Guiri> I'm trying to troubleshoot a libfdk-aac error message: http://fpaste.org/322056/20118145/.  I did a build on Ubuntu 14.04 following these instructions: https://trac.ffmpeg.org/wiki/CompilationGuide/Ubuntu
[00:36:14 CET] <J_Darnley> What's so hard to understand about "No such file or directory"?
[00:36:41 CET] <Guiri> Because I compiled the libfdk_aac library in ~/ffmpeg_sources and there were no errors on make or make install
[00:36:58 CET] <J_Darnley> Does your system know to find libraries in that directory?
[00:37:29 CET] <J_Darnley> try LD_LIBRARY_PATH or stop building shared libraries
[00:38:04 CET] <Guiri> `./configure --prefix="$HOME/ffmpeg_build" --disable-shared` should've disabled the shared library build
[00:38:45 CET] <J_Darnley> Then I don't know how you managed to link with a shared aac library.
[00:39:49 CET] <J_Darnley> as I said "try LD_LIBRARY_PATH"
[00:50:37 CET] <c_14> --disable-shared doesn't disable linking against shared libraries
[00:50:51 CET] <c_14> It just disables building of libav* shared libraries
[00:51:15 CET] <c_14> So configure probably found a shared libfdk_aac somewhere and linked against that
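Two hedged ways out of that situation, assuming libfdk-aac was installed under $HOME/ffmpeg_build as in the wiki guide: either point the runtime loader at that prefix, or rebuild fdk-aac static-only and then redo ffmpeg's configure and make so it links the static library instead of a shared copy.

    # Option 1: tell the loader where the shared libfdk-aac lives
    export LD_LIBRARY_PATH="$HOME/ffmpeg_build/lib:$LD_LIBRARY_PATH"
    ./ffmpeg -version

    # Option 2: build fdk-aac without shared libraries, then rebuild ffmpeg
    cd ~/ffmpeg_sources/fdk-aac
    ./configure --prefix="$HOME/ffmpeg_build" --disable-shared
    make && make install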
[02:56:13 CET] <Guiri> ls
[04:39:37 CET] <VelusUniverseSys> what does this mean  notification: Speex header too small
[04:39:49 CET] <VelusUniverseSys> and WARNING: Can't write keyframe-seek-index into non-seekable output stream! Writing Skeleton3 track
[04:46:25 CET] <J_Darnley> The second sounds like you're writing to a pipe or perhaps an http stream
[04:47:03 CET] <J_Darnley> The first sounds like a broken file.
[04:58:39 CET] <VelusUniverseSys> ok it shouldnt be a broken file lol but ok, i will check with another file
[06:21:00 CET] <Melchior> Unexpected decoder output format Planar 420P 10-bit little-endian
[06:21:25 CET] <Melchior> What in the world is that error from/for (?)
[06:21:37 CET] <Melchior> I know its related to 10bit colour space
[06:21:55 CET] <Melchior> Yet.... I'm using newish smplayer / mplayer
[06:25:31 CET] <Melchior> Gstreamer plays these fine; so its a mplayer ffmpeg type issue
[06:25:38 CET] <Melchior> ^^ what gives?!
[07:28:08 CET] <C0nundrum> When recording an hls stream, if there is a low framerate in the video could it be because of the connection ?
[16:41:13 CET] <k_sze> Is there some sensible parameter to remix TrueHD to ac3?
[16:44:58 CET] <JEEB> truehd tracks don't contain an ac3 track
[16:45:03 CET] <JEEB> you'd have to re-encode
[16:45:26 CET] <JEEB> with DTS-HD MA it's all an extension on top of base DTS so you can "extract" the base DTS thing in theory
[16:45:28 CET] <k_sze> JEEB: of course
[16:45:39 CET] <k_sze> I know I have to re-encode
[16:46:19 CET] <k_sze> So what I really mean is whether there is some sensible ffmpeg options I should use to re-encode and remix TrueHD to Dolby Digital 5.1.
[16:46:44 CET] <k_sze> I mean options that some of you may have tried and produced reasonably good results.
[16:49:03 CET] <rcombs> -acodec ac3
[16:54:38 CET] <k_sze> I mean bitrate/crf/q and mixing ratios.
[16:55:08 CET] <k_sze> (Is it even possible to specify mixing ratios from TrueHD to Dolby 5.1?)
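A hedged starting point for that re-encode (the numbers are assumptions, not something tested in the channel): ffmpeg's ac3 encoder is bitrate-driven rather than CRF-based, 640k is the AC-3 ceiling and a common choice for 5.1, and -ac 6 keeps a 5.1 layout. Custom mixing ratios are not an encoder option; they would need the pan audio filter. File names are placeholders:

    ffmpeg -i input.mkv -map 0 -c copy -c:a ac3 -b:a 640k -ac 6 output.mkv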
[17:38:34 CET] <someguy234> is it possible to change filter parameters over time with ffmpeg?
[17:39:11 CET] <someguy234> http://ffmpeg.org/ffmpeg-filters.html#Timeline-editing this only allows for conditional enabling/disabling of given filter
[17:40:12 CET] <someguy234> what I mean is to for example use the "rotate" filter and have the rotation increase by a couple of degrees every frame indefinitely
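That particular example does not need timeline editing at all: many filter parameters, the rotate angle included, accept per-frame expressions in terms of the timestamp t. A sketch that spins the frame by 30 degrees per second for the whole clip (file names are placeholders):

    ffmpeg -i input.mp4 -vf "rotate=a=30*(PI/180)*t" output.mp4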
[17:57:52 CET] <xeons> I'm trying to combine several dozen videos (h.264/mov) with concat and noticing an increasing lag in audio sync. All the videos are the same (from the same camera) and re-coding fixes the image but any kind of -c copy is a problem
[17:58:25 CET] <xeons> *fixes the video sync issue, but copying the codec results in this discrepancy
[17:59:25 CET] <xeons> I've tried -auto_convert 1, -fflags +genpts, -async 1, -copyts, and several other flags as hours of googling have detailed
[17:59:49 CET] <xeons> ffmpeg -f concat -fflags +genpts -i ./segments.txt -copyts -c copy test8.mov
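One hedged variation worth trying when only the audio drifts, assuming the drift comes from the copied audio packets rather than the video: keep -c copy for the video but re-encode the audio so its timestamps are regenerated. The codec and bitrate here are placeholders:

    ffmpeg -f concat -i ./segments.txt -c:v copy -c:a aac -b:a 192k test9.mov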
[18:17:36 CET] <rocktop> Hello is it possible to add logo to while video and add video intro and background in same command line ?
[18:18:08 CET] <rocktop> s/while/whole
[18:21:58 CET] <rocktop> any idea?
[18:32:31 CET] <groupers> Hi, I have a video production device that will stream to a RTMP server, is there some way to get ffmpeg to listen for incoming connections as if it were adobe media server
[18:33:14 CET] <Mavrik> ffmpeg isn't a streaming server, it's a video processing tool
[18:33:34 CET] <Mavrik> So you'll need to find another streaming server (a lot of them can invoke ffmpeg if needed).
[18:33:37 CET] <groupers> the other options are windows media pull and Windows media push. if I use the pull option it gives me an http address that will play in windows media player or vlc but not ffmpeg, could I play in vlc and stream to ffmpeg?
[18:33:47 CET] <groupers> the end goal is to transcoder to
[18:34:25 CET] <groupers> transcode to a format that can be played in a flash applet or embedded player on a webpage
[18:34:37 CET] <groupers> and send to a multicast address
[18:36:09 CET] <furq> groupers: https://github.com/arut/nginx-rtmp-module
[18:36:48 CET] <furq> specifically https://github.com/arut/nginx-rtmp-module/wiki/Directives#exec_push
[18:38:05 CET] <furq> also if your device streams to rtmp then it'll already in a format which can be played by flash
[18:38:09 CET] <furq> +be
[18:39:43 CET] <rocktop> any idea?
[18:41:38 CET] <groupers> furq great. so the nginx module will listen for incoming rtmp connections just like flash media server
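A minimal sketch of the sort of configuration being described, assuming nginx is built with the rtmp module; the application name, stream-key macro and ffmpeg arguments are placeholders to adapt:

    rtmp {
        server {
            listen 1935;            # the production device pushes RTMP here, as it would to FMS
            application live {
                live on;
                # re-encode whatever gets published and send it on, e.g. to a multicast address
                exec_push ffmpeg -i rtmp://localhost/live/$name
                    -c:v libx264 -preset veryfast -c:a aac
                    -f mpegts udp://239.0.0.1:1234;
            }
        }
    }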
[19:19:13 CET] <conkis> is  there a way to set ffmpeg  INPUT codec explicitly, without probing/analyzing ?
[19:22:15 CET] <c_14> ffmpeg -c:a codec -i blah should work
[19:22:26 CET] <c_14> Well, -c:[stream identifier]
[19:24:11 CET] <conkis> will tray, thank you c_14
[19:24:16 CET] <conkis> try*
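A quick sketch of that, with a placeholder codec and file names; forcing -c before -i only overrides the decoder choice, it does not remove demuxer probing entirely (the probesize and analyzeduration input options control that part):

    ffmpeg -c:v h264 -i input.ts -c copy output.mkv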
[19:32:40 CET] <rocktop> is it possible to merge different video size for example 1280x720 with 180x320 ?
[19:37:28 CET] <rocktop> any idea?
[19:39:00 CET] <jkqxz> rocktop:  Probably.  You'll need to more precisely define what you mean by "merge", though.
[19:41:04 CET] <rocktop> jkqxz: I have a video in 1280x720 format and another one in 180x320 format; I would like to merge them into one video
[19:43:20 CET] <jkqxz> You want them to play one after the other, from the same file?
[19:45:51 CET] <jkqxz> Putting different resolution videos in the same container is unfortunately a can with many worms in.  It is easiest if you are happy to up/downscale one of the videos to match the other.
[23:53:39 CET] <rocktop> Hello, how can I merge 2 videos with different formats, the first one 1280x720, the second 180x320?
[23:56:01 CET] <J_Darnley> What do you mean by "merge"?
[23:56:15 CET] <J_Darnley> Several very different operations might be "merge"
[23:56:34 CET] <J_Darnley> I seem to recall you were asked this before.
[23:57:17 CET] <J_Darnley> Should the second video play after the first?
[23:57:37 CET] <J_Darnley> Should they play at the same time but be stacked next to each other?
[23:57:49 CET] <J_Darnley> Should there be two video streams?
[23:58:30 CET] <rocktop> J_Darnley, the first one is the intro and the second plays after the first one
[23:58:52 CET] <J_Darnley> Then you should look at "concat"
[23:59:17 CET] <rocktop> J_Darnley: I already tried but I failed
[23:59:24 CET] <rocktop> can you help please
[23:59:30 CET] <J_Darnley> What exactly failed?
[23:59:43 CET] <furq> they're different resolutions
[23:59:49 CET] <rocktop> J_Darnley: I can't build the right syntax
[23:59:52 CET] <furq> you need to scale one before you can concat them
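A hedged sketch of the whole thing with the concat filter, assuming both clips carry audio and the target is the 1280x720 of the main video; file names, the scale/pad choices and output codecs are placeholders:

    ffmpeg -i intro.mp4 -i main.mp4 -filter_complex \
      "[0:v]scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2,setsar=1[v0];[1:v]setsar=1[v1];[v0][0:a][v1][1:a]concat=n=2:v=1:a=1[v][a]" \
      -map "[v]" -map "[a]" -c:v libx264 -c:a aac output.mp4

The 180x320 intro is portrait, so it is scaled to fit inside 720p and padded with black bars; frame rates and audio sample rates also need to match (fps and aresample filters can be added if they differ).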
[00:00:00 CET] --- Sun Feb 14 2016

