[Ffmpeg-devel-irc] ffmpeg.log.20190109
burek
burek021 at gmail.com
Thu Jan 10 03:05:03 EET 2019
[04:39:11 CET] <FLX90> Hello, I'm trying to remux a video stream from an mkv container to an mp4 container (ffmpeg -r 24000/1001 -i test.mkv -map 0:0 -c copy test.mp4). The video stream is CFR 24000/1001, but ffmpeg thinks it is VFR. I tried the solution from https://trac.ffmpeg.org/ticket/4768 and forced the framerate with -r, but that doesn't work. When I try to mux the raw h264 stream (ffmpeg -r 24000/1001 -i video.264 -c copy test.mp4) I get
[04:39:11 CET] <FLX90> this message many times: "pts has no value time=00:02:38.19 bitrate=1312.4kbits/s speed= 154x". Can you help me?
[04:40:24 CET] <FLX90> Tried with latest build of course.
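A note for later readers: a raw Annex-B h264 stream carries no timestamps of its own, which is where the "pts has no value" warnings come from. One commonly suggested variant is to let ffmpeg generate the missing timestamps; a hedged sketch that may or may not fix this particular file:

    ffmpeg -fflags +genpts -r 24000/1001 -i video.264 -c copy test.mp4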
[06:36:31 CET] <tombert> howdy everyone; I am trying to generate an HLS stream using the m3u8 playlist thingy; I want the urls in the file to be formatted with http://blahblahblah.com/vids/thevideo
[06:36:39 CET] <tombert> how do I go about specifying the format?
[06:42:08 CET] <tombert> nvm, I figured it out; -segment_list_entry_prefix is what I needed
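For anyone searching later, a sketch of how -segment_list_entry_prefix is typically combined with the segment muxer (the URL and file names are placeholders); the hls muxer's -hls_base_url option serves the same purpose:

    ffmpeg -i input.mp4 -c copy -f segment -segment_time 10 \
        -segment_list playlist.m3u8 -segment_list_type m3u8 \
        -segment_list_entry_prefix http://blahblahblah.com/vids/ \
        out%03d.ts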
[12:50:05 CET] <beta_alf> Hey, I am trying to integrate ffmpeg with an OpenGL framework, and I would like to get the decoded frames from the HW decoder (NVDEC) directly into the OpenGL context, analogous to this: https://stackoverflow.com/questions/49862610/opengl-to-ffmpeg-encode. Is such a thing possible?
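This is possible in principle: attach a CUDA hardware device to the decoder so frames stay on the GPU as AV_PIX_FMT_CUDA, then move them into a GL texture with the CUDA-GL interop (cuGraphicsGLRegisterImage and a device-to-array copy), which lives outside the ffmpeg libraries. A hedged C sketch of the ffmpeg side only; a get_format callback that selects AV_PIX_FMT_CUDA is also required and omitted here:

    #include <libavcodec/avcodec.h>
    #include <libavutil/hwcontext.h>

    static AVBufferRef *hw_dev;

    /* Attach a CUDA device so NVDEC output stays in GPU memory. */
    int attach_cuda_device(AVCodecContext *dec_ctx)
    {
        int ret = av_hwdevice_ctx_create(&hw_dev, AV_HWDEVICE_TYPE_CUDA,
                                         NULL, NULL, 0);
        if (ret < 0)
            return ret;
        dec_ctx->hw_device_ctx = av_buffer_ref(hw_dev);
        return dec_ctx->hw_device_ctx ? 0 : AVERROR(ENOMEM);
    }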
[13:21:35 CET] <victorqueiroz> is there any kind of advantage of using directly opus codec instead of using ffmpeg? (performance for example)
[13:25:41 CET] <Elirips> Hello. I'm using ffmpeg to extract frames using a cmd-line like 'ffmpeg -i rtsp://.. -r 5 -an -s 100x100 -q:v 2 -updatefirst 1 -y c:\myfile.jpg'
[13:26:02 CET] <Elirips> this works fine, but since some version of ffmpeg, ffmpeg first writes to a c:\myfile.jpg.tmp and then seems to copy that file to c:\myfile.jpg
[13:26:11 CET] <Elirips> is there any way to prevent it from creating the tmp file?
[13:51:54 CET] <jcelerier> hello :)
[13:52:15 CET] <jcelerier> when linking with ffmpeg 4.1 on linux, I'm having the same linker errors that are mentioned here: https://www.mail-archive.com/ffmpeg-devel@ffmpeg.org/msg73506.html
[13:52:27 CET] <jcelerier> does anyone have an idea of where it could come from ? thanks !
[14:00:45 CET] <Elirips> ah, with a more recent ffmpeg, no .tmp file is generated when using 'update'
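For reference, the old -updatefirst flag corresponds to the image2 muxer's -update option on recent builds; a hedged sketch of the equivalent command (the RTSP URL is a placeholder):

    ffmpeg -i rtsp://camera/stream -r 5 -an -s 100x100 -q:v 2 -update 1 -y c:\myfile.jpg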
[14:35:11 CET] <DHE> jcelerier: sounds like you have the libraries out of order when linking. the gcc linker is single-pass only (unless you use the messy parameters to do multi-pass)
[14:35:34 CET] <jcelerier> DHE: yes, it was that indeed ! sorry for the noise
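For anyone who lands on the same linker errors: with a single-pass linker, each library has to appear after the objects and libraries that reference it. A hedged example (the exact library set depends on the build):

    gcc main.o -o player \
        -lavfilter -lavformat -lavcodec -lswresample -lswscale -lavutil \
        -lm -lz -lpthread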
[15:28:24 CET] <BtbN> Elirips, it will create a tmp file, and then rename it into place, so there are never half-written jpg files
[15:32:54 CET] <Elirips> BtbN: not with the most recent static Windows binary I downloaded
[15:34:05 CET] <Elirips> BtbN: the background is that in my case it's not writing to a file but to a named pipe, and with the hard-coded (?) .tmp file I need to open two pipes, which is a bit of overhead
[15:35:01 CET] <kepstin> if you're outputting to a pipe, you should probably be using -f image2pipe
[15:36:02 CET] <BtbN> The img2 muxer should auto detect writing to a pipe
[15:36:22 CET] <kepstin> it might not if it's a named pipe (on the filesystem)?
[15:36:34 CET] <BtbN> http://git.videolan.org/?p=ffmpeg.git;a=blob;f=libavformat/img2enc.c;h=a09cc8ec501e5a9a8426404890af9438ed467ef0;hb=HEAD#l59 the logic looks pretty simple
[15:41:21 CET] <Elirips> kepstin: uh, image2pipe
[15:41:26 CET] <Elirips> never seen that
[15:42:16 CET] <Elirips> hm, I'm just outputting to a file named \\.\pipe\my-pipe
[15:42:23 CET] <Elirips> \\.\pipe\my-pipe.jpg
[15:42:33 CET] <BtbN> oh god, Windows stuff
[15:42:45 CET] <BtbN> no idea how _that_ is recognized
[15:42:48 CET] <Elirips> I don't know if that AVFormatContext knows about that
[15:42:54 CET] <Elirips> :)
[15:43:45 CET] <kepstin> anyways, you can force the image muxer to use pipe mode by selecting -f image2pipe, and then it should just write the images out in sequence to the selected output filename whatever it is.
[15:44:01 CET] <Elirips> this is great, thanks for the help!
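A sketch of the pipe-mode form being discussed, with the output path as a placeholder (on Unix it would typically be pipe:1 or a FIFO); the codec is given explicitly because there is no file extension for image2pipe to guess from:

    ffmpeg -i rtsp://camera/stream -r 5 -an -s 100x100 -q:v 2 \
        -f image2pipe -c:v mjpeg -y \\.\pipe\my-pipe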
[15:50:16 CET] <Elirips> kepstin: this is indeed working, but how can I (on the other end of the pipe) then recognize the end of a "file"?
[15:50:31 CET] <Elirips> If I do not use -f image2pipe, I get an EOF for every file extracted from the stream
[15:50:55 CET] <Elirips> with the -f image2pipe I get an "infinite" stream of bytes and miss the end of a single image
[15:51:35 CET] <kepstin> huh. i forgot about that kinda weird behaviour of named pipes.
[15:51:51 CET] <kepstin> (in the case of jpeg, it's possible to parse the stream to see the boundaries, but yeah)
[15:52:48 CET] <Elirips> hehe, yes, I would like to avoid having to parse the data I read
[15:53:09 CET] <Elirips> I basically just need to copy every complete image to some other place
[15:53:14 CET] <Elirips> (in memory)
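If parsing ever becomes unavoidable, JPEG boundaries are cheap to find: every image starts with the SOI marker 0xFF 0xD8 and ends with the EOI marker 0xFF 0xD9. A minimal, hedged C sketch of finding the end of the first complete image in a buffer (it ignores the corner case of an EOI inside an embedded thumbnail):

    #include <stddef.h>
    #include <stdint.h>

    /* Bytes up to and including the first EOI marker, or 0 if the
     * buffer does not yet contain a complete JPEG image. */
    size_t jpeg_frame_end(const uint8_t *buf, size_t len)
    {
        for (size_t i = 0; i + 1 < len; i++)
            if (buf[i] == 0xFF && buf[i + 1] == 0xD9)
                return i + 2;
        return 0;
    }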
[15:53:44 CET] <Elirips> (and yes, it would be a lot smarter to just use the ffmpeg libraries, instead of spawning external ffmpeg processes that send their data via named pipes)
[15:54:01 CET] <kepstin> well, libavcodec+libavformat, but yeah - that's what I'd recommend
[15:54:04 CET] <Elirips> (but I really, really failed at using ffmpeg to connect to a lot of different streams)
[15:54:22 CET] <Elirips> failed using libffmpeg, while with ffmpeg.exe it "just works"
[15:55:24 CET] <Elirips> the problem is, I have tons of different cameras connected using rtsp, and with ffmpeg it "just works".
[15:55:47 CET] <Elirips> while, if I used the libraries, I think I would have to do a lot of things myself (connect via rtsp, find the correct stream, and so on)
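For what it's worth, the "connect and find the right stream" part is fairly small with the libraries. A hedged sketch (the URL is a placeholder, error handling abbreviated):

    #include <libavformat/avformat.h>

    int open_rtsp(AVFormatContext **fmt, int *video_idx, const char *url)
    {
        int ret;
        avformat_network_init();
        if ((ret = avformat_open_input(fmt, url, NULL, NULL)) < 0)
            return ret;
        if ((ret = avformat_find_stream_info(*fmt, NULL)) < 0)
            return ret;
        /* pick the "best" video stream, as ffmpeg.exe does by default */
        *video_idx = av_find_best_stream(*fmt, AVMEDIA_TYPE_VIDEO,
                                         -1, -1, NULL, 0);
        return *video_idx < 0 ? *video_idx : 0;
    }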
[16:42:45 CET] <GuiToris> hey, should I manually specify coder, context, slices and error correction when I use ffv1 version 3?
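For context, those are real ffv1 encoder options with widely cited archival settings. A hedged example of spelling them all out explicitly (file names are placeholders, and the defaults may already suit you):

    ffmpeg -i input.mov -c:v ffv1 -level 3 -coder 1 -context 1 \
        -g 1 -slices 24 -slicecrc 1 output.mkv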
[16:52:05 CET] <zerodefect> I have an MPEG-2 video wrapped in MXF. The spatial resolution of the clip is 720x480. Using the C API, is it possible to decode it into a 720x486 AVFrame without doing a copy? I thought about pre-allocating the AVFrame buffers, but I've noticed that 'avcodec_receive_frame' calls av_frame_unref() on the input AVFrame.
[16:54:00 CET] <BtbN> You'll probably have to implement get_buffer yourself and then use the buffer you want there
[16:54:39 CET] <zerodefect> Ah ok. I'll take a look.
[16:55:19 CET] <DHE> Being able to specify my own buffers has been something I've wanted as well. Mainly because I was looking at doing some hugetlb things.
[16:57:23 CET] <DHE> I didn't see an easy way to do it that didn't involve some pretty invasive code changes
[16:57:29 CET] <zerodefect> :/
[17:04:57 CET] <kepstin> note that 720x480 mpeg video is usually NTSC lines 23-262 and 286-525, while 486-line video is typically NTSC lines 21-263 and 283-525, so you'd want the video to be offset down from the top of the frame, too :/
[17:08:54 CET] <zerodefect> Yes, correct. Those were my intentions.
[17:18:02 CET] <zerodefect> I'm only going to be able to use BtbN's suggestion if AV_CODEC_CAP_DR1 is set?
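An untested sketch of what such a get_buffer2 callback might look like, assuming yuv420p and ignoring the extra padding and alignment that avcodec_align_dimensions2() would demand in production code (the 4-line offset is a placeholder; pick it to match your line-mapping convention):

    #include <libavcodec/avcodec.h>
    #include <libavutil/imgutils.h>

    static int get_buffer_486(AVCodecContext *ctx, AVFrame *frame, int flags)
    {
        const int full_h = 486, offset = 4; /* placeholder offset */
        enum AVPixelFormat fmt = frame->format;

        if (ctx->codec_type != AVMEDIA_TYPE_VIDEO ||
            fmt != AV_PIX_FMT_YUV420P || frame->height != 480)
            return avcodec_default_get_buffer2(ctx, frame, flags);

        /* one refcounted allocation sized for the full 486-line raster */
        frame->buf[0] = av_buffer_alloc(
            av_image_get_buffer_size(fmt, frame->width, full_h, 32));
        if (!frame->buf[0])
            return AVERROR(ENOMEM);

        uint8_t *data[4];
        int linesize[4];
        av_image_fill_arrays(data, linesize, frame->buf[0]->data,
                             fmt, frame->width, full_h, 32);

        /* point the decoder `offset` lines into each plane; chroma is
         * vertically subsampled by 2 for yuv420p */
        frame->data[0] = data[0] + offset * linesize[0];
        frame->data[1] = data[1] + offset / 2 * linesize[1];
        frame->data[2] = data[2] + offset / 2 * linesize[2];
        for (int i = 0; i < 3; i++)
            frame->linesize[i] = linesize[i];
        return 0;
    }

It would be installed with dec_ctx->get_buffer2 = get_buffer_486; before avcodec_open2(), after checking codec->capabilities & AV_CODEC_CAP_DR1, as discussed above.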
[19:24:21 CET] <victorqueiroz> How can I compile FFmpeg using CMake?
[19:25:37 CET] <pink_mist> by rewriting all the build machinery yourself to use CMake instead of autotools
[19:27:08 CET] <victorqueiroz> Is it possible to integrate FFmpeg in a CMake project?
[19:30:55 CET] <DHE> you could just treat it as an external library that should already be installed into /usr/lib or whatever....
[19:34:04 CET] Action: Mavrik has a nice hunch he's trying to link it into an Android project :P
[19:35:23 CET] <BtbN> Android rarely uses cmake
[19:35:41 CET] <Mavrik> Uh.
[19:35:54 CET] <Mavrik> That's a strange statement.
[19:35:57 CET] <victorqueiroz> Mavrik: that's right!
[19:36:12 CET] <BtbN> Last time I looked at android it was like 99% gradle
[19:36:24 CET] <victorqueiroz> I'm heavily using NDK
[19:36:42 CET] <victorqueiroz> It's hard to think in raw Makefile when you have CMake *-* <3
[19:37:04 CET] <Mavrik> BtbN, yes, and the Android Gradle plugin supports CMake as pretty much the only standard way of compiling C++ code.
[19:37:18 CET] <Mavrik> BtbN, with the other option being the legacy ndk-build thing.
[19:37:34 CET] <BtbN> that's horrible
[19:37:44 CET] <Mavrik> BtbN, I fail to see how.
[19:37:54 CET] <victorqueiroz> I'm just not sure how can I link ffmpeg in the project
[19:37:59 CET] <Mavrik> victorqueiroz, the easiest way for me was to just build the ffmpeg libs three times (once per target ABI) using the configure stuff
[19:38:22 CET] <Mavrik> and then configure -L / -l in CMake script for the android .so
[19:38:35 CET] <victorqueiroz> Mavrik: using the Android toolchains?
[19:38:54 CET] <Mavrik> Yeah, setting compiler, linker and base path to NDK compilers
[19:39:09 CET] <Mavrik> It was awhile ago so I used GCC... no idea if ffmpeg compiles with clang these days.
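FFmpeg does build with clang these days. A heavily hedged sketch of a cross-compile for one ABI with a recent NDK (the paths, API level, and exact flags are placeholders, and the whole thing is repeated per ABI):

    TOOLCHAIN=$NDK/toolchains/llvm/prebuilt/linux-x86_64
    ./configure --target-os=android --arch=aarch64 --enable-cross-compile \
        --cc=$TOOLCHAIN/bin/aarch64-linux-android24-clang \
        --cross-prefix=$TOOLCHAIN/bin/llvm- \
        --prefix=$PWD/out/arm64-v8a
    make -j"$(nproc)" install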
[19:39:15 CET] <BtbN> FFmpeg, or really any lib, on android is really no fun. They don't support sonames or anything iirc
[19:39:33 CET] <Mavrik> Static linking is the way to go :)
[19:39:46 CET] <Mavrik> But I agree, the lack of CMake support makes it no fun ;)
[19:40:00 CET] <BtbN> Even if ffmpeg used cmake to build, it wouldn't really help you much
[19:40:42 CET] <Mavrik> It would save all the fsckery around dealing with different arches manually.
[19:40:54 CET] <Mavrik> Since Gradle + NDK toolchain configures that rather well.
[19:41:04 CET] <BtbN> You'd still need to set it up as an external project, because you cannot just include an external CMake project in a parent tree
[19:41:50 CET] <BtbN> It should be possible to add ffmpeg as ExternalProject in cmake without too many issues, except that it will be annoying on Windows
[19:42:54 CET] <Mavrik> Yeah, windows tends to be the headache.
[19:44:43 CET] <BtbN> You need a shell and make in PATH
[20:03:13 CET] <Aerroon> I'd like to generate proxies for editing with Premiere
[20:03:55 CET] <Aerroon> 1080p videos. What kind of encoder should I look into? I'd like to emphasize playback speed while still being 1080p so that text remains readable (but bitrate can be low)
[20:06:51 CET] <BtbN> proxies?
[20:07:00 CET] <BtbN> For video editing, you usually want lossless source material
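For the proxy use case specifically, editors usually want a cheap-to-decode, intra-only codec rather than a lossless one. A hedged example using DNxHR LB at 1080p (file names are placeholders):

    ffmpeg -i input.mp4 -vf scale=1920:1080,format=yuv422p \
        -c:v dnxhd -profile:v dnxhr_lb -c:a pcm_s16le proxy.mov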
[22:10:10 CET] <flacs> am i right in my understanding that it's not possible to extend ffmpeg dynamically, e.g. write a plugin that ffmpeg dlopen()'s and includes in its codec list?
[22:12:04 CET] <furq> yes
[22:13:13 CET] <durandal_1707> you can use ladspa lv2 frei0r plugins
[22:13:23 CET] <flacs> for a video codec?
[22:13:32 CET] <durandal_1707> not for that
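For completeness, the hooks durandal_1707 mentions are filter-side only; a frei0r video filter, for example, is loaded like this (the example filter and parameter come from the ffmpeg filter documentation):

    ffmpeg -i input.mp4 -vf frei0r=glow:0.5 output.mp4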
[22:31:10 CET] <GuiToris> hey, should I manually specify coder, context, slices and error correction when I use ffv1 version 3?
[00:00:00 CET] --- Thu Jan 10 2019