[Ffmpeg-devel-irc] ffmpeg.log.20180925

burek burek021 at gmail.com
Wed Sep 26 03:05:01 EEST 2018


[07:25:44 CEST] <acresearch> hello people, i have a video that i want to add to keynotes, but i think apple stopped allowing any video format except .gif    is it possible to convert my video from .mp4 to .gif?
[07:27:43 CEST] <Matador> fun freakin times
[07:28:36 CEST] <acresearch> i tried ffmpeg -i video.mp4 -o video.gif  but that did not work, any help please?
[07:29:12 CEST] <acresearch> my video is 150MB would it be possible to convert to .gif? i always imagine .gif as low quality videos, is that true?
[07:49:00 CEST] <acresearch> i tried .avi but that does not work either. how do i find out what format works with keynotes apple? it seems no format does
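For the record, acresearch's command fails because ffmpeg has no -o flag; the output name is simply the last argument. A hedged sketch of an mp4-to-gif conversion (filenames from the chat, fps/width values are assumptions to tame file size; the two-pass palette filters work around GIF's 256-color limit):

```shell
# Assumed names: video.mp4 -> video.gif. Tune fps/width to taste.
PRE="fps=12,scale=480:-1:flags=lanczos"
FC="${PRE},split[a][b];[a]palettegen[p];[b][p]paletteuse"
# Only run when ffmpeg and the input are actually present.
if command -v ffmpeg >/dev/null && [ -f video.mp4 ]; then
  ffmpeg -y -i video.mp4 -filter_complex "$FC" video.gif
fi
```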
[08:31:33 CEST] <LigH> Hi.
[08:33:29 CEST] <LigH> If I export single frames from a video as images, I can add the running number of the image to the file name via sprintf placeholder (e.g. %04d); is it also possible to use the number of the frame in the video instead of an increase +1?
[08:34:32 CEST] <LigH> Like, the first frame being exported is number 0, so it becomes "0000", but the next frame is from video frame 25, so it shall be 0025 instead of 0001.
[08:46:28 CEST] <JEEB> 19
[09:09:09 CEST] <LigH> o?
[09:40:34 CEST] <LigH> When exporting images from video frames, does the number placeholder only support +1 increasing numbers?
[10:05:39 CEST] <Nacht> LigH: As opposed to?
[10:06:42 CEST] <LigH> There is a question in our German video forum if there is a way to use the original frame number instead, when frames are exported to images via a filter which selects I-frames only.
[10:07:02 CEST] <durandal_1707> no
[10:08:38 CEST] <LigH> Well, thank you for this definitive answer...
[10:08:44 CEST] <LigH> Bye. :)
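For what it's worth, newer FFmpeg builds (3.3+) do offer something close to what LigH asked: the image2 muxer's -frame_pts option numbers output files by frame PTS rather than a +1 counter. A hedged sketch, not tested against LigH's build (input name assumed):

```shell
# setpts=N stamps each frame with its source frame index (in timebase
# units) before the I-frame-only select; -frame_pts 1 then feeds that
# value to %04d instead of a sequential counter.
VF="setpts=N,select='eq(pict_type,I)'"
if command -v ffmpeg >/dev/null && [ -f input.mp4 ]; then
  ffmpeg -i input.mp4 -vf "$VF" -vsync 0 -frame_pts 1 frame_%04d.png
fi
```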
[11:49:30 CEST] <kwizart> hello, if anyone can provide us some help to know what's wrong with our ffmpeg compilation failing on arm (only with the fedora 29 target, not with f28 or f30)
[11:49:33 CEST] <kwizart> https://bugzilla.redhat.com/show_bug.cgi?id=1632636
[11:54:20 CEST] <BtbN> Are you trying to run a library? oO
[12:04:28 CEST] <kwizart> BtbN, apparently this is what ldd does
[12:42:52 CEST] <kwizart> the ldd test is failing because of function or data relocation issue
[14:27:57 CEST] <barhom> I am transcoding HLS INPUT to HLS output, then I am reading the transcoded hls files with ffmpeg -re -i /tmp/hls/transcoded_input.m3u8 and outputting to UDP
[14:28:12 CEST] <barhom> I am trying to make a UDP output that is "streamed" and not "bursty"
[14:28:48 CEST] <barhom> Sometimes the output is still bursty and I have no idea how to fix it. Do you know what I'm trying to achieve? Any tips?
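For reference, a minimal sketch of the kind of paced UDP output barhom describes (the multicast address and port are hypothetical): -re throttles reading to realtime, so packets go out at the stream's native rate instead of as fast as the muxer can produce them, and pkt_size=1316 keeps each UDP datagram at seven 188-byte TS packets.

```shell
# Hypothetical destination; -re paces the HLS read to realtime.
if command -v ffmpeg >/dev/null && [ -f /tmp/hls/transcoded_input.m3u8 ]; then
  ffmpeg -re -i /tmp/hls/transcoded_input.m3u8 \
    -c copy -f mpegts "udp://239.0.0.1:1234?pkt_size=1316"
fi
```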
[16:00:56 CEST] <teratorn> anyone familiar with image mosaic with ffmpeg and/or opencv? e.g. stitching two or more images side-by-side to form a panorama? i'm looking around at various example code, including the Mosaic class from the Camera app in android and looking for options (?)
[16:56:44 CEST] <sine0> ok, so is there any easy way to work out an aspect ratio from a single dimension
[16:57:15 CEST] <sine0> so width is 960 pixels, I want it 4:3 and 16:9 so what would the height be for both of those
[16:57:25 CEST] <sine0> I know its 540 from habit for 16:9
[16:57:41 CEST] <kepstin> assuming square pixels (i.e. sar = 1/1)?
[16:57:47 CEST] <sine0> yea
[16:57:57 CEST] <Mavrik> 960 / (16/9) = 540
[16:58:03 CEST] <Mavrik> It is a ratio after all ;)
[16:58:07 CEST] <furq> in what context do you need to know this
[16:58:26 CEST] <sine0> im making images of different web sizes
[16:59:37 CEST] <sine0> Mavrik: Ill try and do that in bc
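Mavrik's formula works out in plain shell arithmetic (bc gives the same result): height = width / (ratio_w / ratio_h), i.e. width * ratio_h / ratio_w.

```shell
width=960
echo "16:9 -> $(( width * 9 / 16 ))"   # 540
echo "4:3  -> $(( width * 3 / 4 ))"    # 720
```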
[17:00:44 CEST] <furq> if you just want a one-size-fits-all thing to pass to -vf scale then -vf scale="if(gt(a\,16/9)\,960\,-2):if(gt(a\,16/9)\,-2\,540)"
[17:01:22 CEST] <Mavrik> although, scale="960:-2" would work better I guess? :)
[17:02:06 CEST] <furq> well yeah sure if you actually read the question then that's a better answer
[17:02:26 CEST] <Mavrik> Uhh.
[17:02:30 CEST] <Mavrik> Hm.
[17:02:31 CEST] <furq> i meant your answer
[17:02:42 CEST] <kepstin> doesn't the scale filter preserve sar tho? so both of those will give unexpected results with non-square-pixel input video.
[17:02:46 CEST] <Mavrik> I honestly don't really know what the endgoal is :D
[17:03:28 CEST] <furq> i'm used to people asking for "how do i scale to 576p" in which case my answer is right (assuming square pixels)
[17:03:36 CEST] <furq> but if the width is always 960 then yeah just 960:-2
[17:05:58 CEST] <sine0> omg i have set this chan on fyre
[17:29:53 CEST] <BigNick> I was wondering if anyone had a parameter list for the target preset ntsc-dvd? Or is there a preset file which I can review these parameters?
[18:06:41 CEST] <BigNick> *update, found the settings in the source
[18:18:11 CEST] <c_14> BigNick: fftools/ffmpeg_opt.c lines 2795-2815
[19:47:57 CEST] <^Neo> Hello friends, does anyone here have experience with DeckLink cards and would be willing to answer some questions about leveraging FFmpeg and DeckLink?
[20:12:54 CEST] <ChocolateArmpits> ^Neo, go ahead
[21:02:12 CEST] <gustavbarnacle> Looking for some help with a concat issue.  I have three videos that, individually, are playable but when joined with concat are unplayable and the resulting duration is not right.  Expected an output with a duration of 00:06:20, instead received an output of 124:05:43.  I think this has to do with vbr of the a and b videos.  https://pastebin.com/SF4ykBcB
[21:06:50 CEST] <ChocolateArmpits> gustavbarnacle, what's with the last video's framerate?
[21:07:33 CEST] <ChocolateArmpits> maybe it's a timestamp issue
[21:08:16 CEST] <gustavbarnacle> ~2x2.mp4 has fps of 30.10
[21:08:37 CEST] <ChocolateArmpits> gustavbarnacle, well that implies the source is unlike the first two videos
[21:09:26 CEST] <gustavbarnacle> a.mp4 and b.mp4 have 30fps.  Is the difference between 30 and 30.10 causing an issue?
[21:10:14 CEST] <ChocolateArmpits> I guess the file would simply end up being flagged as vfr, but I think the timestamps may be off
[21:11:13 CEST] <ChocolateArmpits> gustavbarnacle, could you give the ffprobe output of the output file?
[21:13:04 CEST] <gustavbarnacle> https://pastebin.com/D6X6p38F.  Fps is definitely the issue - output is 0.03fps.
[21:13:24 CEST] <ChocolateArmpits> well then from that you get the insane duration
[21:14:21 CEST] <ChocolateArmpits> gustavbarnacle, did you consider transcoding the video?
[21:14:43 CEST] <gustavbarnacle> Right, I'm learning here.  a.mp4 and b.mp4 are created from ffmpeg by looping a single jpg for 20 secs.  ~2x2.mp4 is created by using hstack/vstack on 4 other videos.
[21:15:34 CEST] <gustavbarnacle> Transcoding is much slower than concat, right?
[21:15:48 CEST] <ChocolateArmpits> gustavbarnacle, well as an alternative to -c copy
[21:15:55 CEST] <ChocolateArmpits> you'd still be concatenating anyways
[21:19:44 CEST] <gustavbarnacle> Can you point me in the right direction - i can't just drop the -c copy from the command, can I?
[21:20:40 CEST] <ChocolateArmpits> gustavbarnacle, but you can; ffmpeg will then transcode using default options that fit the output file
[21:21:04 CEST] <ChocolateArmpits> in the case of mp4, h264 for video and aac for audio
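A hedged sketch of the re-encoding concat being discussed (filenames from the chat; the concat demuxer syntax is standard, but -r 30 is an assumption to paper over the 30 vs 30.10 fps mismatch):

```shell
# One "file '...'" line per input, in playback order.
cat > list.txt <<'EOF'
file 'a.mp4'
file 'b.mp4'
file '~2x2.mp4'
EOF
# Dropping -c copy forces a re-encode; h264/aac are the mp4 defaults
# ChocolateArmpits mentions, and -r 30 normalizes the frame rate.
if command -v ffmpeg >/dev/null && [ -f a.mp4 ]; then
  ffmpeg -f concat -safe 0 -i list.txt -r 30 -c:v libx264 -c:a aac output.mp4
fi
```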
[21:36:17 CEST] <gustavbarnacle> Running the command without -c copy hangs and throws "More than 1000 frames duplicated"
[21:36:45 CEST] <ChocolateArmpits> ok then something is really wrong with the third file
[21:38:23 CEST] <gustavbarnacle> I was assuming that the 1000 duplicated frames is because the 'a' and 'b' videos were each created from a single jpg and are just repetitive frames.
[21:40:25 CEST] <gustavbarnacle> The third video plays ok.  Here is the output from when it was created: https://pastebin.com/TkAh9rmk
[21:42:32 CEST] <ChocolateArmpits> gustavbarnacle, are a and b actually single-frame videos, or are they populated with frames? frame duplication can also happen when there's a missing sequence of timestamps
[21:44:31 CEST] <gustavbarnacle> How can I tell?  Command used to create a.mp4 is:  ffmpeg.exe -framerate 1/20 -i "C:\dev\a.jpg" -y -c:v libx264 -r 30 -pix_fmt yuv420p "C:\dev\a.mp4"
[21:58:58 CEST] <relaxed> gustavbarnacle: you want a single frame to be looped 30 fps in your output video?
[22:03:43 CEST] <jorb> can ffmpeg do midi output encoding?
[22:06:00 CEST] <kepstin> jorb: midi isn't really an "encoding", that doesn't make sense.
[22:06:36 CEST] <kepstin> (generating a midi file for a piece of music is something that you'll want to hire a musician to do, since it basically involves transcribing the music and doing an arrangement of it)
[22:08:26 CEST] <jorb> kepstin: ok
[22:09:02 CEST] <jorb> there seem to be a ton of midi to an encoded format type programs
[22:09:31 CEST] <jorb> i didn't realize it was not as simple to go from a mp3 or wave to midi..
[22:09:44 CEST] <kepstin> yeah, if you have a midi file, you can "play" it in a midi synthesizer to get a wave file back and encode that to anything
[22:09:49 CEST] <jorb> i mean it wouldn't be applicable for most actual audio too i suppose
[22:10:16 CEST] <kepstin> midi is basically like a piece of sheet music
[22:10:28 CEST] <kepstin> it's instructions on how to play a song, not the audio for a song
[22:11:14 CEST] <jorb> what i have is a wave file of morse code, so midi makes perfect sense, and would be so easy to parse!
[22:11:40 CEST] <jorb> i tried this program morse2ascii, and it produces the output, but it doesn't seem to detect spaces, i'm just trying to figure out how to parse it
[22:13:06 CEST] <kepstin> detecting morse code (with minimal background noise) in an audio file shouldn't be very hard, I wish you luck. I imagine there's a lot of related software for that in the amateur radio community.
[22:13:40 CEST] <kepstin> (and my impression was that morse didn't really have spaces? you might end up needing to use a dictionary or something to figure out where spaces should be)
[22:14:34 CEST] <gustavbarnacle> relaxed: sort of.  I want a single frame looped 20 sec (30fps), a different single frame looped 20 sec (30fps), then a four panel video made from hstack/vstack.
[22:14:49 CEST] <jorb> the other suggestion i've gotten so far is to load it into audacity and just look at the audio spectrograph
[22:16:25 CEST] <jorb> kepstin: yeah that was my first thought, but it seems the characters are different lengths, not a simple byte encoding heh
[22:17:25 CEST] <kepstin> jorb: if you take the signal and run it through an fft (like the spectrogram does), it's pretty easy to detect symbol start and stop times programmatically. after reading up a bit, people say "there are tiny gaps between letters and slightly longer gaps between words"
[22:17:49 CEST] <kepstin> of course, with hand-keyed morse those gaps may or may not be super consistent :)
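A toy version of the on/off detection kepstin describes, with hard-coded per-window energies standing in for FFT output: threshold each window, then collapse the stream into (state, run-length) pairs. Short on-runs are dots, long on-runs dashes, and gap lengths separate letters from words.

```shell
# One energy value per analysis window; real values would come from an
# FFT bin at the tone frequency or a rectified, smoothed signal.
printf '%s\n' 9 9 9 0 0 9 9 9 9 9 9 9 9 9 0 0 0 0 0 0 |
awk '{ s = ($1 > 5) ? 1 : 0 }
     NR == 1 { prev = s; run = 1; next }
     s == prev { run++; next }
     { print prev, run; prev = s; run = 1 }
     END { print prev, run }'
# prints: "1 3" (dot), "0 2" (intra-letter gap), "1 9" (dash), "0 6" (word gap)
```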
[22:18:58 CEST] <relaxed> gustavbarnacle: try,  ffmpeg.exe -loop 1 -i "C:\dev\a.jpg" -y -c:v libx264 -r 30 -t 20 -pix_fmt yuv420p "C:\dev\a.mp4"
[22:19:29 CEST] <gustavbarnacle> relaxed:  thanks, I'll try.
[22:20:03 CEST] <durandal_1707> use loop filter
[22:25:25 CEST] <gustavbarnacle> relaxed:  the ffmpeg.exe -loop 1 command creates an identical video to the  ffmpeg.exe -framerate 1/20 -i "C:\dev\a.jpg" -y -c:v libx264 -r 30 -pix_fmt yuv420p "C:\dev\a.mp4" command.  Same problem exists when the final video is concatenated.
[22:26:52 CEST] <relaxed> you can probably do this with one command using -filter_complex and the needed filters
[22:28:33 CEST] <gustavbarnacle> relaxed:  that's I think where I'm heading next.  I am already encoding the 2x2.mp4 video with:  ffmpeg.exe -i "C:\dev\~upper_left.mp4" -i "C:\dev\~upper_right.mp4" -i "C:\dev\~lower_left.mp4" -i "C:\dev\~lower_right.mp4" -filter_complex "[0:v][1:v]hstack=shortest=1[t];[2:v][3:v]hstack=shortest=1[b];[t][b]vstack=shortest=1[v]" -map "[v]" -shortest  -y "C:\dev\~2x2.mp4"
[22:29:02 CEST] <gustavbarnacle> Can I create one filterchain to create and add the two 20-second videos from still images first?
[22:30:26 CEST] <relaxed> you may want to start with the images, otherwise you're encoding the same thing twice
[22:30:49 CEST] <relaxed> durandal_1707 said use the loop filter
[22:31:02 CEST] <ChocolateArmpits> gustavbarnacle, there's a concat filter to join several streams together on the filter graph
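A hedged one-command sketch of ChocolateArmpits' suggestion: generate the two 20-second title cards from the stills with -loop and join them to the 2x2 video with the concat filter, so nothing is encoded twice. The 30 fps value and the assumption that a.jpg/b.jpg match the ~2x2.mp4 resolution are untested; mismatched sizes would need an extra scale per input.

```shell
# All three inputs must share resolution and frame rate for concat.
FC="[0:v][1:v][2:v]concat=n=3:v=1:a=0,format=yuv420p[v]"
if command -v ffmpeg >/dev/null && [ -f a.jpg ]; then
  ffmpeg \
    -loop 1 -t 20 -framerate 30 -i a.jpg \
    -loop 1 -t 20 -framerate 30 -i b.jpg \
    -i "./~2x2.mp4" \
    -filter_complex "$FC" -map "[v]" -c:v libx264 out.mp4
fi
```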
[22:38:48 CEST] <GuiToris> ChocolateArmpits, hey, by the time I got to check that profile, you'd already left. I just wanted to say thank you, -pix_fmt yuv420p -vprofile main solved the problem. It can be played back now
[22:38:59 CEST] <ChocolateArmpits> GuiToris, cool
[00:00:00 CEST] --- Wed Sep 26 2018


More information about the Ffmpeg-devel-irc mailing list