[Ffmpeg-devel-irc] ffmpeg.log.20191214

burek burek at teamnet.rs
Sun Dec 15 03:05:02 EET 2019


[03:04:48 CET] <hiihiii> I'm stuck using ffmpeg 3.2.x to encode x264 videos for posting to social media (mainly Twitter), because my ffmpeg 4.x.x encoded vids always get rejected
[03:06:19 CET] <hiihiii> everything from the same exact command
[12:08:26 CET] <void09> anyone know if the ffmpeg scenedetect filter is better than pyscenedetect ?
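(For reference, ffmpeg's scene-change detection is typically exposed through the select filter's scene score rather than a standalone tool; a minimal sketch, where the input name and the 0.4 threshold are arbitrary placeholders:

  ffmpeg -i input.mp4 -vf "select='gt(scene,0.4)',showinfo" -f null -

Each selected frame shows up as a showinfo line in the log, which is one way to compare results against pyscenedetect.)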
[14:27:23 CET] <Filarius> kepstin: I got advice to directly place the stdout of the first ffmpeg as the stdin of the second ffmpeg, and finally got it working; it's just about a 4-5 frame "lag"
[14:27:53 CET] <Filarius> ffmpeg -loglevel quiet -f rawvideo -pix_fmt gray -s:v '+str(w)+'x'+str(h)+' -re -r 60 -i - -c:v libx264 -preset fast -crf 28 -g 100 -tune zerolatency -f h264 -
[14:28:08 CET] <Filarius> ffmpeg -loglevel quiet -probesize 32 -f h264 -re -r 60 -i - -c:v rawvideo -pix_fmt gray -f image2pipe -
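(Chained with a shell pipe, the two commands above would look roughly like this; a sketch only, where WIDTHxHEIGHT stands in for the size the Python code fills in, and -re is dropped since there is no need to throttle a pipe, as discussed below:

  ffmpeg -loglevel quiet -f rawvideo -pix_fmt gray -s:v WIDTHxHEIGHT -r 60 -i - \
         -c:v libx264 -preset fast -crf 28 -g 100 -tune zerolatency -f h264 - \
  | ffmpeg -loglevel quiet -probesize 32 -f h264 -i - -c:v rawvideo -pix_fmt gray -f image2pipe -

In Filarius's setup the same wiring is done from Python, by handing the first subprocess's stdout to the second subprocess as its stdin.)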
[14:31:19 CET] <kepstin> Filarius: i'm pretty sure that's what i told you to do :/
[14:52:37 CET] <Filarius> kepstin, maybe I was not a careful listener :)
[14:53:55 CET] <Filarius> I missed the part "take stdout buffer from one and place as stdin for second"
[14:55:25 CET] <Filarius> whatever, there were several other things I realized just today that I did wrong
[14:58:27 CET] <furq> Filarius: idk if it'll help but on the right hand side you should drop -re and have -pixel_format gray as an input option
[14:58:44 CET] <furq> instead of -pix_fmt gray as an output option
[15:00:16 CET] <furq> actually after reading it again i have no idea which of those commands is which
[15:02:48 CET] <Filarius> I had a "magic" problem with making the command line, so I was just trying to make something that works; now I get what I did wrong
[15:03:50 CET] <Filarius> it's 2 commands for 2 ffmpegs running at the same time, which one of them are you speaking about? both?
[15:04:33 CET] <Filarius> python -> gray frame -> ffmpeg encoding -> ffmpeg decoding -> gray frame -> python
[15:06:23 CET] <furq> what part of this means lag is an issue
[15:09:36 CET] <Filarius> (if I got what you mean) I need to have a very short queue of frames being "in the ffmpeg box"; now it's about 7-8 frames (4-5 was a mistake, I was writing 2 times per iteration when I said it)
[15:11:21 CET] <Filarius> I mean, I said there was a 4-5 frame "lag", but then I counted frames: I had accidentally done "write 1 frame to ffmpeg" 2 times, but counted it only once
[15:17:25 CET] <Filarius> I just removed both -re and -r, I do not need a frame rate here (it does not affect how x264 does the encoding, right?)
[15:45:51 CET] <DHE> -re makes ffmpeg throttle its file read speed to simulate "realtime" encoding. -r overrides the framerate on the input file
[15:46:37 CET] <DHE> setting the framerate on the input makes ffmpeg treat that as the file input framerate. setting it on the output will drop/dup frames to make the output file meet your requested framerate but keeping the video as close to the input as possible
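(A concrete illustration of the difference, with placeholder filenames:

  ffmpeg -r 30 -i in.mp4 out.mp4    # -r before -i: treat the input as 30 fps
  ffmpeg -i in.mp4 -r 30 out.mp4    # -r after -i: dup/drop frames so the output is 30 fps
)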
[15:49:25 CET] <cehoyos> furq: You cannot force the pix_fmt for h264
[16:13:25 CET] <f00lest> I'm trying to open video and save all frames as png files
[16:14:02 CET] <f00lest> can I use this example? https://github.com/FFmpeg/FFmpeg/blob/master/doc/examples/decode_video.c
[16:14:27 CET] <cehoyos> I think you need transcoding.c
[16:15:37 CET] <f00lest> cehoyos will transcoding.c do all of it?
[16:15:46 CET] <cehoyos> all of it?
[16:16:08 CET] <f00lest> I mean extracting frames from video, then convert those frames to png images
[16:16:46 CET] <cehoyos> I believe I already understood that.
[16:17:05 CET] <f00lest> okay, so transcoding.c does that
[16:17:14 CET] <cehoyos> I am not sure it does
[16:17:25 CET] <f00lest> owh
[16:17:31 CET] <cehoyos> But it will show you how to encode video (you want that) and this is not shown in decode_video.c
[16:17:38 CET] <cehoyos> (At least I would be surprised if it is)
[16:17:39 CET] <DHE> should be able to use image2 as the output "format" and it will deal with the file IO as long as you give it png encoded packets
[16:18:06 CET] <cehoyos> Or you simply write each frame into a file...
[16:18:29 CET] <DHE> depends how much extra work you want to do since a multi-file muxer already exists
[16:19:00 CET] <f00lest> DHE what is this image2
[16:19:09 CET] <f00lest> is it an example filename?
[16:19:35 CET] <DHE> f00lest: ffmpeg includes a muxer/demuxer called image2 which can simulate a "video" which is actually a bunch of images in a directory, both reading and writing
[16:20:00 CET] <DHE> I used it to turn jpegs taken once every 10 minutes over a time of months into a single time lapse video
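(The command-line version of that time-lapse use of image2 would be something along these lines, where the filename pattern and frame rate are assumptions:

  ffmpeg -framerate 24 -i snapshot-%05d.jpg -c:v libx264 -pix_fmt yuv420p timelapse.mp4

Here -framerate is an input option of the image2 demuxer, telling it how fast to play back the numbered stills.)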
[16:20:45 CET] <f00lest> DHE so cool
[16:21:21 CET] <f00lest> is there some code I need to insert into the transcoding.c code so that I can use image2
[16:24:48 CET] <DHE> line 121, avformat_alloc_output_context2, has an option for the oformat (output format). you need to set this to the image2 output format which av_guess_format("image2",NULL,NULL) should provide
[16:25:47 CET] <DHE> the filename then would conform to image2's file pattern rules. so you likely want something like: "image%d.png"  or generally anything suitable for sprintf(frame_filename, your_string_here, framenumber);
[16:26:17 CET] <f00lest> DHE okay thanks
[16:26:24 CET] <f00lest> I'll try that right now
[16:26:30 CET] <f00lest> cehoyos thank you
[16:28:25 CET] <DHE> and obviously video codec of png (this example just transcodes to the same codec as the input)
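(Putting those suggestions together, the change to transcoding.c might look roughly like the sketch below; it is an outline using the FFmpeg 4.x signatures, not a drop-in patch, and error handling is trimmed:

  #include <libavformat/avformat.h>
  #include <libavcodec/avcodec.h>

  /* Open an image2 output whose "filename" is a printf-style pattern;
   * the muxer then writes one file per video packet it receives. */
  static int open_png_sequence_output(AVFormatContext **ofmt_ctx)
  {
      AVOutputFormat *image2 = av_guess_format("image2", NULL, NULL);
      if (!image2)
          return AVERROR_MUXER_NOT_FOUND;

      /* "image%d.png" follows image2's pattern rules: frame 1 -> image1.png, ... */
      return avformat_alloc_output_context2(ofmt_ctx, image2, NULL, "image%d.png");
  }

Where transcoding.c picks the encoder from the input stream's codec id, avcodec_find_encoder(AV_CODEC_ID_PNG) would be used instead. Note that the PNG encoder does not take yuv420p input, so the example's filter graph also needs a pixel-format conversion, e.g. format=rgb24, in front of the encoder.)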
[16:32:35 CET] <f00lest> is that also in the options of `avformat_alloc_output_context2` ?
[16:37:22 CET] <f00lest> maybe I can read a guide or tutorial that can teach me these basics about ffmpeg?
[16:42:24 CET] <f00lest> owh so the transcoding example just converts high quality to a lower quality video
[16:42:47 CET] <furq> well that's what transcoding is
[16:43:29 CET] <f00lest> owh wait, I think I am mistaken, output size is actually > input file size
[16:45:38 CET] <f00lest> I think it just packs frames into a new video
[16:47:19 CET] <f00lest> sorry about that
[17:39:58 CET] <asterismo_l> hi
[17:40:00 CET] <asterismo_l> is there any repository of libx264 and ffmpeg builds with the sse3 and sse4 instruction sets disabled? I tried to compile libx264 with the --disable-asm parameter but gif-to-mp4 conversion fails again, complaining about sse3 and sse4 cpu instructions
[17:46:19 CET] <cehoyos> asterismo_l: How did you compile x264, what command line did you test and how does the complete, uncut console output look like?
[17:46:56 CET] <asterismo_l> I'm digging; actually it may have turned out ok
[17:47:09 CET] <DHE> these things are supposed to be auto-detected. how did such instructions get used if your CPU doesn't support them?
[17:47:26 CET] <cehoyos> It's a virtualized (broken) cpu
[17:47:50 CET] <DHE> that's pretty broken
[17:47:56 CET] <asterismo_l> its a VPS server
[17:48:00 CET] <cehoyos> And it is not that they are being used although they are not supported: they are supported, but the cpu does not report them correctly (iiuc)
[17:57:49 CET] <asterismo_l> the AES-NI instructions as well as sse3 and sse4 are disabled somehow
[18:04:26 CET] <DHE> according to gcc, the minimal athlon64 profile only had sse2, so I guess it's a legal CPU configuration..
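(If the goal is binaries that never touch SSE3/SSE4 at all, the usual route is the configure switches rather than a prebuilt repository; roughly, untested and with flag names to be checked against the versions being built:

  # x264
  ./configure --disable-asm
  # ffmpeg
  ./configure --disable-sse3 --disable-ssse3 --disable-sse4 --disable-sse42

With --disable-asm, x264 should not emit any hand-written SIMD, so an SSE3/SSE4 crash after that points at something else in the chain, e.g. the ffmpeg build or another library.)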
[22:06:36 CET] <Guest84> hey, I am trying to upload a video to youtube, but the ffmpeg used by streamable and youtube fails with exit code 1
[22:06:54 CET] <Guest84>  .\ffmpeg.exe -i .\lmao2.png -i .\SoundRecord-2019-12-14-21-32-52.wav -c:v libx264 -crf 18 -preset slow -pix_fmt yuv420p -movflags +faststart -vf scale=w=1920:h=1080:force_original_aspect_ratio=decrease -c:a aac -b:a 256k test.mp4 -framerate 60 < this is a cmdline I use
[22:07:21 CET] <Guest84> the video plays back fine
[22:07:39 CET] <Guest84> not sure what the deal is
[22:07:54 CET] <Guest84> at first I am like "ah right .mkv doesn't work at all on youtube"
[22:08:13 CET] <Guest84> then I am like "ah right it can't be lossless"
[22:08:20 CET] <Guest84> then "ah right it can't be yuv444p"
[22:09:49 CET] <Guest84> I will just make a lossless mkv version and convert with HandBrake
[22:09:57 CET] <Guest84> it uses ffmpeg anyway but who knows
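(One hedged observation about the command quoted above: the trailing -framerate 60 comes after the output file name, so ffmpeg treats it as a stray trailing option rather than applying it to test.mp4, and a single PNG input normally has to be looped against the audio. A possible rework, untested, with the filenames kept from the original:

  .\ffmpeg.exe -loop 1 -framerate 60 -i .\lmao2.png -i .\SoundRecord-2019-12-14-21-32-52.wav -c:v libx264 -crf 18 -preset slow -pix_fmt yuv420p -vf scale=w=1920:h=1080:force_original_aspect_ratio=decrease -c:a aac -b:a 256k -movflags +faststart -shortest test.mp4

Here -loop 1 and -framerate are input options of the image2 demuxer, and -shortest stops the output when the audio ends.)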
[22:51:26 CET] <grosso> I'm trying to use android mediacodec h.264 decoder. It turns out that this codec has the AV_CODEC_CAP_DELAY flag, so...
[22:51:42 CET] <bodqhrohro_> I watch analog TV with weak signal and wonder if some codec supports temporarily discarding U/V information in order to fit into bitrate constraints, instead of other quality regressions.
[22:52:43 CET] <grosso> after calling avcodec_decode_video2 and receiving a decoded frame, I keep calling avcodec_decode_video2 with an empty packet in order to flush the codec
[22:53:59 CET] <grosso> so the codec first eats about 10 av_packets and then it spits out about 10 decoded frames, and then...
[22:54:38 CET] <grosso> when I call avcodec_decode_video2 with the next encoded video packet, it hangs and never returns
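(For what it's worth, a decoder that has been drained to end-of-stream with empty packets generally has to be reset with avcodec_flush_buffers() before it will accept new packets, and with the send/receive API there is no need for a mid-stream drain at all: codecs with AV_CODEC_CAP_DELAY simply return EAGAIN until they have buffered enough input. A rough sketch of that loop, not mediacodec-specific, where on_frame is a placeholder callback and error handling is trimmed:

  #include <libavcodec/avcodec.h>

  /* Feed one packet to the decoder and hand every frame it produces to a
   * callback.  Pass pkt == NULL only once, at real end of stream, to drain. */
  static int decode_packet(AVCodecContext *dec, const AVPacket *pkt,
                           AVFrame *frame, void (*on_frame)(const AVFrame *))
  {
      int ret = avcodec_send_packet(dec, pkt);
      if (ret < 0)
          return ret;

      for (;;) {
          ret = avcodec_receive_frame(dec, frame);
          if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
              return 0;      /* decoder needs more input, or is fully drained */
          if (ret < 0)
              return ret;
          on_frame(frame);   /* frame is reused; receive_frame unrefs it first */
      }
  }

If the avcodec_decode_video2 drain loop is kept instead, the decoder generally needs avcodec_flush_buffers() after the drain and before the next real packet, which may be what the mediacodec wrapper is waiting for.)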
[00:00:00 CET] --- Sun Dec 15 2019

