[Ffmpeg-devel-irc] ffmpeg.log.20180724
burek
burek021 at gmail.com
Wed Jul 25 03:05:02 EEST 2018
[00:00:19 CEST] <Cracki> just benchmarking a 1-thread encode and nothing else on the cpu isn't representative. cpus have caches.
[00:00:35 CEST] <Cracki> people who use "virtualization" tend to forget that
[00:00:54 CEST] <Cracki> don't partition into smaller than full sockets. it's pointless.
[00:02:04 CEST] <furq> fwiw x264 definitely gets diminishing returns with more threads
[00:02:24 CEST] <poutine> What do you mean by that? I planned on using ECS vCPUs and calculating my thread count based on that. I'm not fully understanding the 32 concurrent single-thread test, or how you could simply transcode an mp4 to h264/aac mp4 w/ scaling or something using 32 concurrent encodes
[00:02:25 CEST] <furq> if you're processing a lot of videos then running as many as you can concurrently with one thread each is theoretically fastest
[00:02:54 CEST] <furq> plus obviously any aac encoder you'd use would be singlethreaded anyway
[00:05:24 CEST] <poutine> I did not even consider that furq, and you guys are quickly showing I did not prepare well for this conversation. Could you just tip me off about what I'm missing with these 32 concurrent 1-threaded encodes? How would I pull that off for a standard operation like a single input to a single output?
[00:06:27 CEST] <poutine> I had planned that each individual container was launched in AWS Fargate for a single job (1 input -> 1 output), and I was going to tweak the VCPUs/memory to adjust between cost efficiency and speed
[00:06:44 CEST] <poutine> but just planned on using -threads to attempt to utilize those VCPUs
[00:08:30 CEST] <Cracki> 32 was an arbitrary number
[00:08:36 CEST] <Cracki> use 1234 if you feel like it
[00:08:40 CEST] <poutine> sure, understood
[00:08:44 CEST] <furq> like he said, it's something you need to benchmark
[00:08:51 CEST] <Cracki> there's definitely not a processor around with that many cores
[00:09:04 CEST] <Cracki> you want to make money, you have to know these things
[00:09:04 CEST] <poutine> I have 16 cores in my desktop, I think there are, aren't there?
[00:09:41 CEST] <Cracki> theoretically, running a different transcode on each core might be slower than running a single transcode across all cores
[00:09:46 CEST] <Cracki> cpu caches, remember?
[00:10:08 CEST] <furq> i wouldn't say that's theoretically
[00:10:34 CEST] <Cracki> it will definitely decrease the usefulness of across-core caches
[00:10:44 CEST] <furq> if you only consider x264 itself then one encode per thread is fastest (if you have enough videos to process concurrently)
[00:10:45 CEST] <Cracki> it's something one must benchmark
[00:10:48 CEST] <furq> but obviously in real life it's not that simple
[00:11:07 CEST] <furq> not only do you have hardware details but you also have the decode thread and the aac encoding thread running at the same time
[00:11:16 CEST] <Cracki> ah right, that too :>
[00:11:23 CEST] <furq> and any filtering you might want to do
[00:11:39 CEST] <Cracki> srsly considering audio transcoding is small fish
[00:11:40 CEST] <furq> and io
[00:11:56 CEST] <furq> and yeah audio transcoding is going to take a tiny fraction of that
[00:12:07 CEST] <Cracki> depending on use case, consider hw encoders
[00:12:14 CEST] <furq> but i assume it's still going to cause unpredictable behaviour with cpu caches
[00:12:19 CEST] <Cracki> indeed
[00:12:43 CEST] <furq> so yeah in summary you just need to test various setups
[00:13:42 CEST] <furq> that was a pretty longwinded way of saying that
[00:14:36 CEST] <Cracki> well... you have to be aware of some laws of nature before you can even imagine them not moving in circles around earth
[00:15:06 CEST] <Cracki> or you gotta be good at observing and writing down numbers
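The batch approach furq describes above — one encoder thread per job, many jobs in parallel — can be sketched in shell. This is a hypothetical harness, not anyone's actual setup: the `in/`/`out/` directories and the job count are placeholders to benchmark, per Cracki's advice.

```shell
#!/usr/bin/env bash
# Run up to $JOBS concurrent single-threaded encodes, one input file per job.
JOBS=8   # placeholder; benchmark against your core count and cache behaviour

encode_one() {
  # -threads 1 pins x264 to a single thread; the aac encoder is
  # single-threaded regardless, so it adds little extra contention.
  ffmpeg -y -i "$1" -c:v libx264 -threads 1 -c:a aac "out/$(basename "$1")"
}
export -f encode_one

mkdir -p in out
# xargs -P fans the file list out across $JOBS concurrent ffmpeg processes.
find in -name '*.mp4' -print0 | xargs -0 -P "$JOBS" -I{} bash -c 'encode_one "$@"' _ {}
```

As discussed, whether this beats one multi-threaded encode depends on caches, decode/filter threads, and io — measure both.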
[00:21:53 CEST] <Mavrik> Are there any HW encoders out there that aren't arse quality-wise?
[00:22:11 CEST] <Cracki> heh
[00:22:24 CEST] <Cracki> depends on your taste of arse
[00:22:45 CEST] <Cracki> throw some more bits at them, they might give you good quality
[00:22:53 CEST] <Cracki> they also improve by generation
[00:23:36 CEST] <Cracki> you can guess why broadcast encoders work with 5-10 times the bit rate you'd use for a lovingly cpu-encoded video
[00:23:37 CEST] <Mavrik> Well usually you transcode to save those bits :P
[00:24:25 CEST] <Cracki> hw can help you with the motion estimation
[00:24:35 CEST] <Cracki> that's dumb work fit for a gpu
[00:25:41 CEST] <DHE> yes, but most hardware transcoders are all-in-one... and x264's lookahead offloading has been less than stellar
[00:25:56 CEST] <Cracki> aye
[01:58:46 CEST] <TekuConcept> I just realized, correct me if I'm wrong, but I don't even need an AVFrame to write raw data to a stream. I just need to set the data manually in an AVPacket.
[02:05:38 CEST] <DHE> sure
[02:05:51 CEST] <DHE> if the format is something you can write like that
[02:28:13 CEST] <TekuConcept> DHE, out of curiosity, when you say format, are you referring to the data format or an av format? I need to create an RTSP carrier with an H264 stream, an audio stream, and a json stream.
[02:31:19 CEST] <Cracki> ooh what are you using that for?
[02:31:44 CEST] <Cracki> text streams must have been done before. teletext, subtitles, ...
[02:34:09 CEST] <TekuConcept> I have a signal processor on a small ARM computer. This device uses OpenCV to connect to an RTSP stream from an IP camera. I also have a mobile device which connects to the same RTSP stream and also has a streaming websocket connection to the ARM device.
[02:34:10 CEST] <DHE> or raw streams like rgb or yuv...
[02:34:30 CEST] <DHE> that's the obvious, easy solution to building an AVPacket by hand
[02:35:04 CEST] <TekuConcept> rgb? would you mind enlightening me?
[02:35:11 CEST] <DHE> red, green, blue
[02:35:16 CEST] <DHE> raw frame pixels
[02:35:18 CEST] <TekuConcept> yes I know that
[02:36:18 CEST] <TekuConcept> I guess what I'm getting at is, what do you mean rgb with respect to json/text data? (I'm using H264 for all video data)
[02:37:11 CEST] <DHE> just that building an AVPacket yourself is usually a rare thing, usually only when the format is something really simple
[02:38:21 CEST] <TekuConcept> You are suggesting I create a raw rgb packet and put my json data in the allocated buffers, yes?
[02:39:02 CEST] <DHE> well, no...
[02:39:16 CEST] <DHE> never heard of a json stream before, but whatever works for you I guess...
[02:40:25 CEST] <Cracki> you seem to want to make your own 'bitstream'
[02:40:35 CEST] <Cracki> the encoding just happens to be json
[02:40:46 CEST] <Cracki> and the container/packetizing doesn't need to know that
[02:41:01 CEST] <TekuConcept> Yes, precisely
[02:41:34 CEST] <DHE> well it usually needs some kind of header information to know how to mark the stream... fourcc, tag, id, or whatever it uses...
[02:41:52 CEST] <Cracki> that of course
[02:41:55 CEST] <TekuConcept> Oh I didn't know that, but that makes sense.
[02:42:02 CEST] <Cracki> JSON could be a fourcc
[02:42:30 CEST] <Cracki> then the basic unit is a json document (object)?
[02:42:49 CEST] <TekuConcept> Yes; overall, my goal is to combine my current streaming json websocket into my streaming rtsp carrier.
[02:43:31 CEST] <TekuConcept> JSON objects are sent over websocket at a rate of about 15 fps. (as fast as the device can decode and process the H264 video)
[02:43:50 CEST] <Cracki> ffmpeg isn't supposed to try to decode (or encode) this data, just mux/demux it, eh?
[02:43:58 CEST] <TekuConcept> yes
[02:44:33 CEST] <Cracki> I have no clue what an AVPacket would contain... but it sounds to me like you want to fill one with opaque data and mux it
[02:44:43 CEST] <TekuConcept> data and size :)
[02:44:53 CEST] <Cracki> and a stream tag or something ;)
[02:44:56 CEST] <Cracki> I hope
[02:45:02 CEST] <Cracki> tag/id
[02:45:12 CEST] <TekuConcept> ah, let me see
[02:45:24 CEST] <Cracki> not sure how ffmpeg does the association of packets
[02:45:49 CEST] <Cracki> I'd also wonder when pts/dts come into play
[02:46:36 CEST] <Cracki> ah! https://www.ffmpeg.org/doxygen/3.2/structAVPacket.html
[02:46:38 CEST] <Cracki> all there
[02:46:59 CEST] <Cracki> there's buf and data... wonder why both
[02:47:54 CEST] <atomnuker> buf's for avbuffer refcounting
[02:48:13 CEST] <Cracki> ah so buf points to the buf structure and data _into_ it?
[02:48:27 CEST] <Cracki> or however that's organized
[02:48:42 CEST] <Cracki> nvm, i can click everything :P
[02:48:55 CEST] <Cracki> .buf->data == .data?
[02:49:05 CEST] <TekuConcept> data is a buffer
[02:49:13 CEST] <Cracki> heck that's a lot of boxes to unpack
[02:49:14 CEST] <TekuConcept> https://pastebin.com/Ja5Y2R9h
[02:49:33 CEST] <Cracki> avpacket -> avbufferref -> avbuffer -> finally an uint8*
[02:50:11 CEST] <Cracki> bufferref has an AVBuffer *buffer AND an u8* data
[02:50:27 CEST] <TekuConcept> I'm curious if the presentation time stamp would be necessary for a live stream. I figure it would if recorded to a file.
[02:50:32 CEST] <Cracki> sounds like data pointers keep pointing forward while the boxes are for housekeeping
[02:51:18 CEST] <Cracki> the only thing I can imagine looking at DTS (or pts) would be the de/muxer
[02:51:23 CEST] <Cracki> for packet ordering
[02:51:41 CEST] <Cracki> so better set sensible values
[02:52:10 CEST] <TekuConcept> Maybe then I would just increment the value every time I av::interleaved_write_frame()
[02:52:28 CEST] <Cracki> or copy it from the corresponding video packet?
[02:52:49 CEST] <TekuConcept> Ah, that didn't even cross my mind!
[02:52:51 CEST] <Cracki> dts/pts are not per-stream but per-program
[02:53:05 CEST] <Cracki> used for syncing streams
[02:53:20 CEST] <Cracki> (I might be mistaken...)
[02:53:48 CEST] <TekuConcept> per-program - you mean AVFormatContext?
[02:53:52 CEST] <Cracki> not sure
[02:53:59 CEST] <Cracki> I know there are tons of timebases everywhere
[02:54:31 CEST] <Cracki> the most contact i've had with ffmpeg libs is through PyAV and its documentation is very sparse
[02:54:38 CEST] <Cracki> not suited to learning about ffmpeg libs
[02:55:33 CEST] <TekuConcept> (I have a git-clone of the C source code)
[02:57:12 CEST] <TekuConcept> I think pts/dts is reserved for demuxing.
[02:57:48 CEST] <Cracki> the muxer uses something to decide muxing order
[02:58:03 CEST] <Cracki> for files at least that's the case
[04:06:59 CEST] <IsSnooAnAnimal> Hey, I'm having some issues with pixelated motion when encoding with h264_nvenc. Example: https://i.imgur.com/97tS3Jh.png . The command I am using is here: https://pastebin.com/10TjgJnD
[04:11:15 CEST] <IsSnooAnAnimal> Would this be CRF value, preset, or other?
[04:14:37 CEST] <kepstin> IsSnooAnAnimal: nvenc doesn't use the same options as x264. In particular, it doesn't use the "-crf" option.
[04:14:39 CEST] <pzich> that's called interlacing, you might want to try deinterlacing
[04:15:02 CEST] <kepstin> oh, yeah, i should have looked at the image too
[04:15:04 CEST] <kepstin> :)
[04:15:07 CEST] <pzich> ;)
[04:15:19 CEST] <pzich> I did the same thing before I realized what he meant by "pixelated motion"
[04:19:47 CEST] <IsSnooAnAnimal> kepstin: wow, thanks!! what should I use instead of CRF? I'm encoding for a 1080p Vimeo upload, and the source is lossless 40Mbits/s. I have as much time as I need to encode
[04:21:21 CEST] <furq> -cq is the closest equivalent
[04:21:34 CEST] <furq> although obviously the values don't map to x264's crf values
[04:21:40 CEST] <furq> so you'll need to trial and error that
[04:22:07 CEST] <furq> -preset slow also doesn't work with nvenc, you probably want -preset hq
[04:22:29 CEST] <furq> oh wait nvm slow is 2pass isn't it
[04:22:35 CEST] <IsSnooAnAnimal> I believe it is
[04:23:17 CEST] <IsSnooAnAnimal> Speaking of which, I've seen sources say that it's better for quality, and others say it's not. Would you mind clarifying?
[04:23:30 CEST] <furq> but yeah to fix the actual problem you want -c:v h264_cuvid -deint bob -i foo -c:v h264_nvenc ...
[04:23:41 CEST] <furq> or -deint adaptive, idk which is better
[04:24:15 CEST] <IsSnooAnAnimal> h264_cuvid vs h264_nvenc?
[04:24:21 CEST] <IsSnooAnAnimal> or are they the same thing
[04:24:49 CEST] <furq> cuvid is the decoder
[04:25:06 CEST] <IsSnooAnAnimal> ahhh, thanks
[04:25:19 CEST] <furq> and -deint is specific to that decoder, it does the deinterlacing on the gpu
[04:25:27 CEST] <furq> apparently it gives good results and it saves a bunch of memory copying
[04:25:28 CEST] <IsSnooAnAnimal> also, if this makes a difference, the deinterlacing is not apparent in the source file
[04:25:40 CEST] <furq> is your player automatically deinterlacing it
[04:25:49 CEST] <furq> a lot of players will do that if the source is flagged as tff
[04:26:19 CEST] <furq> which it should be if you have a 1080i bluray
[04:26:27 CEST] <kepstin> IsSnooAnAnimal: if you have as much time as needed to encode, use x264 instead of nvenc
[04:26:50 CEST] <furq> if it's going to vimeo anyway and you have plenty of bandwidth then it doesn't matter too much
[04:26:53 CEST] <furq> just throw more rate at it
[04:28:42 CEST] <IsSnooAnAnimal> Ok, that was a bit of an overstatement, I'm encoding 1-2h interviews
[04:30:46 CEST] <IsSnooAnAnimal> It would be nice for around 15 of them to be done in 10 hours
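A hedged sketch pulling together what furq suggests: decode and deinterlace on the GPU with h264_cuvid, then re-encode with h264_nvenc using -cq in place of x264's -crf. File names and the -cq value are assumptions; the value doesn't map to crf, so trial-and-error it as noted above.

```shell
IN=interview.mkv          # placeholder input (1080i source)
OUT=interview_vimeo.mp4   # placeholder output
# guarded so the sketch is a no-op on machines without an nvenc-enabled ffmpeg
command -v ffmpeg >/dev/null && \
  ffmpeg -y -c:v h264_cuvid -deint adaptive -i "$IN" \
         -c:v h264_nvenc -preset hq -cq 21 \
         -c:a aac -b:a 192k "$OUT" || true
```

Per kepstin's point, with hours of encode budget libx264 would still give better quality per bit; nvenc buys speed.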
[05:14:52 CEST] <codebam> how can I stream to a file?
[05:15:27 CEST] <codebam> like to create a video stream, but put it in a folder instead and host it with nginx
[05:15:38 CEST] <codebam> can I do that?
[05:17:32 CEST] <pzich> you'll have to be more specific. is the video stream you're talking about something you want to be able to access live, recorded from an earlier stream, or it's just file to file?
[05:18:48 CEST] <codebam> I want to take a pre-recorded mkv h264 file and make it stream so that it's synced to the right time for people streaming
[05:19:23 CEST] <codebam> like I'm pretty sure I can do that with http streaming, but I'd rather mount it so I don't have to turn off my webserver
[05:19:39 CEST] <codebam> or I guess I could proxy pass
[05:21:08 CEST] <pzich> proxy pass is probably your best bet, and it's pretty powerful
[05:21:24 CEST] <codebam> okay, I'll try that. thanks!
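One way to do what codebam describes without a proxy: have ffmpeg segment the file into HLS inside the web root and let nginx serve the playlist and segments as static files; -re keeps the playlist advancing in real time so viewers land at roughly the same point. Paths are assumptions, and -c:v copy assumes the mkv's h264 can be passed through untouched.

```shell
SRC=movie.mkv                  # placeholder: the pre-recorded mkv/h264 file
WEBROOT=/var/www/html/stream   # placeholder: a directory nginx already serves
command -v ffmpeg >/dev/null && mkdir -p "$WEBROOT" && \
  ffmpeg -re -i "$SRC" -c:v copy -c:a aac \
         -f hls -hls_time 6 -hls_list_size 10 -hls_flags delete_segments \
         "$WEBROOT/live.m3u8" || true
```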
[12:51:50 CEST] <jonmall> I've found the stream with the issue and generated a debug log. Error can be seen in lines 179-182. https://pastebin.com/xBGWgCLc . Any ideas on how I ignore or fix the issue? TIA!
[14:21:41 CEST] <Hello71> seems more like you are exceeding the local file size limit
[14:21:50 CEST] <Hello71> oh, decoding
[14:21:52 CEST] <Hello71> never mind
[14:25:41 CEST] <flydev> Using tutorial from https://trac.ffmpeg.org/wiki/CompilationGuide/Ubuntu and can't get rid of ERROR: x265 not found using pkg-config
[14:25:59 CEST] <flydev> Can't compile ffmpeg, Ubuntu 18 LTS
[14:32:22 CEST] <flydev> Anyone? Something is really broken in Ubuntu 18, it compiles fine on 16 using same tutorial
[14:33:20 CEST] <friendofafriend> flydev: You installed libx265-dev?
[14:33:56 CEST] <flydev> Yes, both via apt-get and compiled
[14:40:22 CEST] <friendofafriend> It would probably be best to remove the package if you've compiled from source.
[14:42:00 CEST] <flydev> apt-get remove --purge libx265-dev -y ? and compile again?
[14:43:01 CEST] <friendofafriend> Just remove, but this time compile with PATH="$HOME/bin:$PATH" cmake -G "Unix Makefiles" -DCMAKE_INSTALL_PREFIX="$HOME/ffmpeg_build" -DENABLE_SHARED:bool=off ../../source
[14:43:09 CEST] <friendofafriend> And then make && make install
[14:45:19 CEST] <flydev> like so: https://pastebin.com/wM8EJReb
[14:47:07 CEST] <friendofafriend> Sure, and then find the location of your x265.pc file.
[14:47:13 CEST] <flydev> Still not finding x265, shall I get ffbuild/config.log ?
[14:48:00 CEST] <friendofafriend> You'll want to copy the x265.pc file to your PKG_CONFIG_PATH.
[14:48:56 CEST] <friendofafriend> Or export PKG_CONFIG_PATH=/<path_to_x265>/x265.pc:$PKG_CONFIG_PATH
[14:50:16 CEST] <flydev> https://pastebin.com/7j7aq82j it's in the path but still not finding it
[14:53:35 CEST] <friendofafriend> You want it to find the copy of x265.pc from your ./ffmpeg_sources/x265/build/linux/ directory.
[14:54:21 CEST] <friendofafriend> So you can add that directory before everything else in PKG_CONFIG_PATH
[14:54:50 CEST] <barhom> How do I use "nvresize" to scale using the GPU instead of CPU?
[14:55:03 CEST] <barhom> Do I really save that much CPU on the scaling?
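barhom's question goes unanswered above; for what it's worth, "nvresize" was a filter from NVIDIA's out-of-tree patches, and mainline ffmpeg does GPU scaling with scale_npp on CUDA frames instead — a hedged sketch, with file names as placeholders:

```shell
IN=in.mp4   # placeholder input
command -v ffmpeg >/dev/null && \
  ffmpeg -hwaccel cuvid -c:v h264_cuvid -i "$IN" \
         -vf scale_npp=1280:720 -c:v h264_nvenc out.mp4 || true
```

How much CPU it saves depends on keeping frames on the card: here decode, scale, and encode all stay GPU-side, so no per-frame copies back to system memory.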
[14:57:29 CEST] <friendofafriend> Like with a "export PKG_CONFIG_PATH=/<whateveryourabsolutepathis>/ffmpeg_sources/x265/build/linux/:$PKG_CONFIG_PATH"
[14:57:54 CEST] <flydev> friendofafriend so can you give me example how can I do that, here is the ffmpeg compile lines: https://pastebin.com/SyU4ERv3
[15:00:03 CEST] <friendofafriend> Sure, I'd "mv $HOME/ffmpeg_build/lib/pkgconfig/x265.pc $HOME/ffmpeg_build/lib/pkgconfig/x265.pc.backup ; cp $HOME/ffmpeg_sources/x265/build/linux/x265.pc $HOME/ffmpeg_build/lib/pkgconfig/"
[15:03:23 CEST] <flydev> Did not help, same thing
[15:04:41 CEST] <flydev> file seem to be copied and backup file made, i'm really confused now
[15:08:59 CEST] <flydev> Shall I get the ffbuild/config.log log ?
[15:10:25 CEST] <friendofafriend> Yes, please.
[15:11:34 CEST] <flydev> https://pastebin.com/cQSNMi5J
[15:17:31 CEST] <friendofafriend> flydev: When you cat x265.pc, do you find the proper location for the library?
[15:19:42 CEST] <flydev> https://pastebin.com/y2a44cTz
[15:20:50 CEST] <friendofafriend> Did you do a make install after you compiled x265 from source?
[15:21:21 CEST] <flydev> Yes, also here is ls of paths in file:
[15:21:22 CEST] <flydev> https://pastebin.com/ZFfANb0s
[15:22:20 CEST] <flydev> Here is x265 install: https://pastebin.com/Bu1Mr0cg
[15:22:38 CEST] <friendofafriend> Your x265.pc is in /root/ffmpeg_build/lib/ and not /root/ffmpeg_build/lib/pkgconfig/ ?
[15:25:03 CEST] <flydev> Yes, have a look: https://pastebin.com/frBPdP5V
[15:26:16 CEST] <friendofafriend> My x265.pc looks like this. https://pastebin.com/Va78n7Vj
[15:29:04 CEST] <flydev> So yours is not static build but directly compiled instead?
[15:31:22 CEST] <friendofafriend> I think it was compiled out-of-tree and installed.
[15:32:13 CEST] <flydev> OK, what do you suggest for my case?
[15:33:19 CEST] <friendofafriend> I'd remove the package, build x265, and install it.
[15:44:10 CEST] <flydev> the package is removed, x265 rebuild, still no luck
[15:49:01 CEST] <friendofafriend> Does /usr/local/lib/pkgconfig/x265.pc exist?
[15:54:22 CEST] <flydev> No, it doesn't
[15:55:38 CEST] <friendofafriend> Making what could be a rash assumption about your arch, does /usr/lib/x86_64-linux-gnu/pkgconfig/x265.pc exist?
[16:04:52 CEST] <flydev> No, this one doesn't as well
[16:06:10 CEST] <friendofafriend> x265.pc should exist someplace after install, so maybe a sudo find / | grep -i x265.pc
[16:11:51 CEST] <DHE> /usr/local/lib/pkg-config ?
[16:13:49 CEST] <flydev> cp /root/ffmpeg_sources/x265/build/linux/x265.pc /usr/lib/x86_64-linux-gnu/pkgconfig/x265.pc , trying to compile ffmpeg again...
[16:15:55 CEST] <flydev> nope, still complaining...
[16:23:50 CEST] <friendofafriend> x265 is installed now, is there no x265.pc in /usr/ someplace?
[17:09:37 CEST] <th3_v0ice> When I open the encoder context, should the extradata be initialized to something?
[17:12:54 CEST] <flydev> friendofafriend i put one in /usr/lib etc, but to no avail....
[17:13:24 CEST] <flydev> the whole ubuntu 18 is a bloody mess, beyond imagination; all the old/legacy stuff is broken, _all_ of it!
[17:17:01 CEST] <friendofafriend> When you did a find on the filesystem, the only x265.pc file you found was in your home directory, nothing in /usr/ someplace?
[17:17:24 CEST] <friendofafriend> If it installed, flydev, it should have thrown that .pc file in /usr/ someplace.
[17:18:45 CEST] <flydev> it threw it in /usr/lib on clean install, still the same; the whole script in the tutorial must be modified to accommodate this, otherwise _no one_ will be able to install it via this script
[17:34:57 CEST] <podman> I've got an mp3 with cover art that's showing up as a mjpeg stream. when i try to convert it to an mp4, i get `Could not find tag for codec h264 in stream #0, codec not currently supported in container`. Any idea how to get this to work?
[17:36:17 CEST] <th3_v0ice> flydev: Try to export the path to the x265.pc file, something like this "export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig"
[17:40:05 CEST] <flydev> Tried, this breaks everything telling me aom is missing, etc.
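For reference, the fix being attempted boils down to two lines: put the directory that actually contains x265.pc on PKG_CONFIG_PATH (prepending, so it wins), then verify pkg-config can see it before re-running ffmpeg's configure. The path below matches the wiki's layout and is an assumption:

```shell
# prepend rather than overwrite, so aom/fdk-aac/etc. stay discoverable
export PKG_CONFIG_PATH="$HOME/ffmpeg_build/lib/pkgconfig:$PKG_CONFIG_PATH"
# sanity check: prints the x265 version iff the .pc file is now found
pkg-config --modversion x265 2>/dev/null || echo "x265.pc still not on PKG_CONFIG_PATH"
```

Overwriting PKG_CONFIG_PATH outright (as in the plain export suggested above) is exactly what makes configure then complain that aom is missing.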
[18:48:02 CEST] <kubast2> what's the best lossless video format? I tried out h264 ,ffv1 ,vp9 ,hevc(3 different profiles) with some time consuming options enabled(outside of ffv1 when I let the defaults go)
[18:49:09 CEST] <kubast2> 1. FFV1 was the fastest one, on par with hevc. 2. H264 was the most space-efficient one and right behind it was vp9. 3. HEVC on 2 different profiles: one produced a fairly defective video that was 2MB smaller than the h264 one, and the other was the size of FFV1.
[18:49:23 CEST] <kubast2> that was on small resolution though
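kubast2's comparison is easy to reproduce on your own material; a throwaway harness, using the usual lossless switch for each encoder (the clip name is a placeholder):

```shell
SRC=clip.mkv   # placeholder source clip
for enc in "libx264 -qp 0" "libx265 -x265-params lossless=1" \
           "libvpx-vp9 -lossless 1" "ffv1"; do
  name="${enc%% *}"   # encoder name = first word of the option string
  command -v ffmpeg >/dev/null && \
    ffmpeg -y -i "$SRC" -c:v $enc -an "test_$name.mkv" || true  # $enc unquoted on purpose
done
ls -l test_*.mkv 2>/dev/null || true   # compare the resulting sizes
```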
[18:50:33 CEST] <john3voltas[m]> Hey there. Wonder if someone could give me a hand with a problem. I have a couple of audio files supposedly g729a (mediainfo) that I can't play/convert with ffmpeg.
[18:52:50 CEST] <john3voltas[m]> this is what I get.
[18:52:52 CEST] <john3voltas[m]> https://pastebin.com/Vmbw3nTb
[18:52:58 CEST] <DHE> kubast2: there's also a ffvyuv, a variant of huffyuv
[18:53:15 CEST] <john3voltas[m]> My PC is running windows 7
[18:53:43 CEST] <durandal_1707> john3voltas[m]: that is a prehistoric ffmpeg
[18:54:45 CEST] <john3voltas[m]> well...I guess I can download one of the latest builds and try. Give me a couple of minutes. brb
[18:55:13 CEST] <durandal_1707> anyway, if that is g729a its not supported
[18:56:02 CEST] <john3voltas[m]> durandal_1707: the way I understand it, ffmpeg supports reading of g729
[18:56:17 CEST] <john3voltas[m]> it doesn't support encoding to g729
[18:57:43 CEST] <durandal_1707> g729a != g729
[18:59:16 CEST] <john3voltas[m]> hmmm, I see....
[18:59:34 CEST] <john3voltas[m]> durandal_1707: this is what I get from MediaInfo: https://pastebin.com/05durJde
[18:59:49 CEST] <john3voltas[m]> apparently it is g729a
[19:00:30 CEST] <john3voltas[m]> I would assume g729a if only because this comes from an Avaya product and Avaya doesn't work with g729, only g729a
[19:01:28 CEST] <john3voltas[m]> if ffmpeg, the swiss army knife of the converters doesn't support g729a, what could i use instead...?
[19:04:12 CEST] <durandal_1707> perhaps this http://www.linphone.org/technical-corner/bcg729/overview
[19:08:39 CEST] <john3voltas[m]> durandal_1707: cool. Is it me or there is no binary for that? only source code?
[19:10:41 CEST] <durandal_1707> only source
[19:11:11 CEST] <john3voltas[m]> argh 😞
[19:12:59 CEST] <durandal_1707> or this http://asterisk.hosting.lv/#bin ?
[19:14:05 CEST] <durandal_1707> could you report this on trac and upload sample somewhere?
[19:14:38 CEST] <john3voltas[m]> sure. will do
[19:15:56 CEST] <john3voltas[m]> i mean, not right away because these tracks are property of my customers. will have to encode a couple using their pbx system and post them on trac in a couple of days' time.
[19:16:05 CEST] <john3voltas[m]> thanks for the headsup
[19:16:13 CEST] <Hello71> DHE: isn't it called ffvbuff
[19:16:16 CEST] <Hello71> huff
[19:16:18 CEST] <Hello71> fuck
[19:16:30 CEST] <DHE> yes, you're right...
[19:47:24 CEST] <servolk> hi, I'm trying to build ffmpeg on Linux using the instructions at https://trac.ffmpeg.org/wiki/CompilationGuide/Ubuntu and ran into two small issues
[19:48:01 CEST] <servolk> 1. there is a small mistake in libx265 build steps:
[19:48:11 CEST] <servolk> ... if cd x265 2> /dev/null; then hg pull && hg update; else hg clone https://bitbucket.org/multicoreware/x265; fi && \
[19:48:11 CEST] <servolk> cd x265/build/linux
[19:49:20 CEST] <servolk> I believe we need to add "&& cd .." after "hg pull && hg update". Otherwise, if the x265 dir already exists, the script will end up with x265 as the current dir after hg update, and then the subsequent "cd x265/build/linux" fails
[19:49:29 CEST] <servolk> can somebody here edit the wiki?
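servolk's correction, spelled out as a function (same logic as the wiki snippet, plus the missing "cd .."):

```shell
fetch_x265() {
  if cd x265 2>/dev/null; then
    hg pull && hg update && cd ..   # the missing step: return to the parent dir
  else
    hg clone https://bitbucket.org/multicoreware/x265
  fi && cd x265/build/linux
}
```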
[20:37:33 CEST] <NovemberGreeting> Hello people... Fresh member here
[20:39:17 CEST] <NovemberGreeting> Is there any way to use ffmpeg to take input from a webcam, encode the stream with x265, then stream it over http to a client who uses an hevc-supported player?
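The question above goes unanswered in the log; an untested sketch of one answer: grab a V4L2 webcam, encode with libx265, and serve it over HTTP with ffmpeg's built-in listener, muxed as MPEG-TS (which can carry HEVC). Device, port, and encoder settings are all assumptions to adapt:

```shell
DEV=/dev/video0   # placeholder capture device
command -v ffmpeg >/dev/null && \
  ffmpeg -f v4l2 -i "$DEV" \
         -c:v libx265 -preset ultrafast -tune zerolatency \
         -f mpegts -listen 1 http://0.0.0.0:8090/live.ts || true
```

An HEVC-capable player would then open http://host:8090/live.ts; for more than one simultaneous client, a real streaming server in front is the usual answer.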
[20:47:12 CEST] <poldon> Hi. I have fixed some issues with the doc/examples/transcoding.c example. Should I send a patch somewhere?
[20:50:59 CEST] <durandal_1707> poldon: yes to ffmpeg-devel mailing list
[20:57:15 CEST] <th3_v0ice> Is there a way to specify the bistream to be annexb and not avcc. The problem is that extradata is not populated and stream cannot be decoded. Using the API.
[21:01:18 CEST] <poldon> Yeah, I just read that also.
[21:03:14 CEST] <poldon> Another question... I've tried using that example to use the movie and overlay filters. If I use ffmpeg from the command line, it works, but if I add it to transcoding.c, the overlay'ed video displays at the wrong frame rate (about 1/2 speed).
[21:06:06 CEST] <durandal_1707> poldon: needs more info to help you, transcoding.c is very basic demo
[21:14:31 CEST] <Yukkuri> hi, are there any solutions to stream rtmp continuously and re-stream some video data only when it's received?
[21:14:43 CEST] <Yukkuri> i reckon there was ffserver, but it is gone
[21:15:28 CEST] <Yukkuri> i'd like something similar: indefinitely streaming an empty black screen and also capable of receiving some video/audio data and then re-streaming it
[21:15:36 CEST] <Yukkuri> but when input is gone -- back to black screen again
[21:15:49 CEST] <Yukkuri> without terminating rtmp connection
[21:16:15 CEST] <Mavrik> That's done by a streaming server
[21:16:34 CEST] <Mavrik> The pro ones like Wowza can do that easily, no idea if there's any free that can do what you want
[21:18:19 CEST] <Yukkuri> okay, maybe some localhost solutions? something like OBS capable of pausing and changing the playback or GUI VLC with RTMP support would also do
[21:22:47 CEST] <poldon> OK, I'll put some code together so you can see my issue.
[21:30:58 CEST] <john3voltas[m]> durandal_1707: interesting. regarding the g729a issue, i was able to successfully convert using this "C:\ffmpeg>ffmpeg.exe -acodec g729 -i C:\ffmpeg\test1.wav -acodec pcm_s16le -f wav C:\ffmpeg\converted-test1.wav"
[21:31:19 CEST] <john3voltas[m]> no errors and it did reencode the track
[21:32:33 CEST] <john3voltas[m]> so, either g729 is very very similar to g729a and in some cases it works, or there's something that i am not following properly on the command line.
[21:35:16 CEST] <durandal_1707> john3voltas[m]: same file? with latest ffmpeg?
[21:48:11 CEST] <john3voltas[m]> durandal_1707: yep, same 4 files
[21:48:12 CEST] <john3voltas[m]> same ffmpeg
[22:06:12 CEST] <sangy> Hi, I'm trying to make a video in which a series of images start stacking on top of each other (like a photograph stack effect) using ffmpeg and the fade-in effect. I'm using this script (https://ptpb.pw/hcQr/bash), but once an image is faded in the previous one disappears...
[22:08:03 CEST] <pzich> you need every previous image stacked up at the end?
[22:10:31 CEST] <tdr> looks like you're using crossfade, that's what crossfade does: fade the next image in
[22:13:46 CEST] <sangy> pzich: yeah I want it to be at the "bottom" of the new one that's faded in
[22:13:53 CEST] <sangy> bottom as in "behind"
[22:14:16 CEST] <sangy> tdr: I was, although I don't think I'm using crossfade anymore https://ptpb.pw/hcQr/bash#L-52
[22:23:48 CEST] <pzich> sangy: I think your best bet is going to be to use something like imagemagick to create the stacked versions for you, then crossfade between those.
[22:28:28 CEST] <sangy> pzich: yeah, that's what I was afraid of. Thanks!
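For the archive: once the cumulative composites exist (per pzich's imagemagick suggestion), one fade step can be done purely in ffmpeg by fading the next composite in with alpha and overlaying it on the previous one. The stacked/NNN.png names and the timings are assumptions, and for N images you'd chain N-1 such overlays or script pairwise steps:

```shell
STEP=4   # seconds each stacked composite is shown
FADE=1   # fade-in duration in seconds
command -v ffmpeg >/dev/null && \
  ffmpeg -y -loop 1 -t "$STEP" -i stacked/001.png -loop 1 -t "$STEP" -i stacked/002.png \
    -filter_complex "[1:v]format=yuva420p,fade=in:st=$((STEP-FADE)):d=$FADE:alpha=1[f];[0:v][f]overlay,format=yuv420p" \
    -t "$STEP" stack.mp4 || true
```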
[00:00:00 CEST] --- Wed Jul 25 2018