[Ffmpeg-devel-irc] ffmpeg.log.20190123
burek
burek021 at gmail.com
Thu Jan 24 03:05:01 EET 2019
[00:18:37 CET] <sn00ker> hi all
[00:18:59 CET] <sn00ker> how can i pipe the audio from ffmpeg in screen one to ffmpeg in screen two?
[00:33:10 CET] <another> named pipe?
[00:34:21 CET] <sn00ker> but how?
[00:55:51 CET] <another> mkfifo
[01:03:12 CET] <kevinnn> Hi! Does anyone have a link to a basic example of how to record desktop audio on a windows machine in c++
[01:21:23 CET] <sn00ker> another, hey, i have it
[01:21:35 CET] <sn00ker> but the muxing is wrong.. i only get audio
[01:22:45 CET] <sn00ker> ffmpeg -f video4linux2 -s 640x360 -i /dev/video0 -r 30 -i \\audiopipe -c:a copy -f flv rtmp://XXXXXX:1935/live/test
[01:46:59 CET] <another> -map 0 -map 1
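A minimal sketch of the mkfifo approach suggested above, with placeholder device names, paths, and URL (the ALSA source, the NUT container, and the libx264 video encoder are assumptions, not part of the original commands); the key fix for the audio-only output is the explicit -map pair:

    # create the named pipe once
    mkfifo /tmp/audiopipe

    # screen one: encode the captured audio and write it into the FIFO
    # (NUT is just a convenient container for piping)
    ffmpeg -f alsa -i default -c:a aac -f nut /tmp/audiopipe

    # screen two: read the webcam and the FIFO, and map both streams
    ffmpeg -f video4linux2 -video_size 640x360 -i /dev/video0 \
           -f nut -i /tmp/audiopipe \
           -map 0:v -map 1:a -c:v libx264 -c:a copy \
           -f flv rtmp://example.com/live/test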
[02:56:51 CET] <hendry> I'm trying to convert https://media.dev.unee-t.com/2019-01-22/B06C6A3A-DB9E-4A4F-876A-C2ADFA568016.MOV into an MP4, but the conversion fails https://media.dev.unee-t.com/2019-01-23/B06C6A3A-DB9E-4A4F-876A-C2ADFA568016.mp4.log
[02:56:56 CET] <hendry> Should I file a bug report?
[02:57:32 CET] <hendry> the media is from an iOS screencast
[03:12:07 CET] <iive> hendry, not really sure, but it looks like vaapi hw acceleration might be causing problems
[03:12:19 CET] <iive> first see if you can play the input file with `ffplay`
[03:19:02 CET] <hendry> iive: ffplay / mpv works on the source file fine
[03:19:37 CET] <iive> then try recoding without using hw acceleration. the input seems to be h264, so it could be supported.
[03:19:50 CET] <iive> i really cannot see what is wrong from the log.
[03:20:16 CET] <iive> too many packets
[03:21:09 CET] <iive> hum, 0:0 is audio, so it says that the audio stream is not consumed.
[03:21:35 CET] <iive> just for the test, try recoding without audio (-an) see if that works.
[03:22:36 CET] <iive> the audio is already aac.
[03:23:28 CET] <furq> too many packets buffered when stream copying normally means the issue is with the other stream not filling its buffer
[03:23:31 CET] <iive> hendry, why not just stream copy the content, no decoding or encoding, just -c copy.
[03:23:58 CET] <iive> furq, i don't see copy.
[03:24:06 CET] <furq> Reading option '-acodec' ... matched as option 'acodec' (force audio codec ('copy' to copy stream)) with argument 'aac'.
[03:24:12 CET] <furq> oh nvm i misread that
[03:24:33 CET] <iive> ;)
[03:24:36 CET] <iive> gtg
[03:24:52 CET] <furq> -an is worth trying but it's still probably the video
[04:03:44 CET] <hendry> furq: there are two audio streams in iOS screencasts, one capturing app audio IIUC, and another capturing narration through the microphone (which I've enabled)
[04:03:54 CET] <hendry> but audio shouldn't make it choke, surely
[04:06:30 CET] <hendry> https://s.natalian.org/2019-01-23/without-hw-accel-B06C6A3A-DB9E-4A4F-876A-C2ADFA568016.mp4.log
[04:06:40 CET] <hendry> Too many packets buffered for output stream 0:0
[04:07:18 CET] <hendry> https://trac.ffmpeg.org/ticket/6375
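For reference, a hedged sketch of the tests suggested above plus the workaround most often mentioned for this error (filenames are placeholders; whether -max_muxing_queue_size fixes this particular file is an assumption):

    # raise the per-stream muxing queue, the usual workaround for
    # "Too many packets buffered for output stream"
    ffmpeg -i input.MOV -c:v libx264 -c:a copy -max_muxing_queue_size 1024 output.mp4

    # the two isolation tests suggested above: plain stream copy, and no audio
    ffmpeg -i input.MOV -map 0 -c copy copy_test.mp4
    ffmpeg -i input.MOV -an -c:v libx264 noaudio_test.mp4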
[04:41:15 CET] <hendry> damn trac is slow for me
[08:32:56 CET] <kensanata> I have a jingle.mp3 that is 8s long and a podcast episode.wav and I'd like the two to overlap for 2s. I'm currently using -filter_complex [0:a][1:a]concat=n=2:v=0:a=1 to concat the two, so no overlap. I think I need delay and amerge to get the episode to start playing sooner but can't get it to work. Any ideas?
[08:35:08 CET] <friendofafriend> kensanata: Do you want them overlaid or to crossfade?
[08:36:05 CET] <friendofafriend> kensanata: You'll find lots of discussion about various options here. https://stackoverflow.com/questions/14498539/how-to-overlay-two-audio-files-using-ffmpeg
[08:36:12 CET] <kensanata> friendofafriend: overlaid is fine because the jingle fades out and the episode starts with a greeting, so when I tried crossfade it didn't sound good (fading in the greeting).
[08:37:48 CET] <furq> kensanata: adelay and amerge won't work because amerge stops as soon as the shortest input ends
[08:38:00 CET] <furq> which is annoying because most (all?) video filters let you configure that behaviour
[08:38:55 CET] <friendofafriend> What do you think about this? ffmpeg -y -i ./jingle.mp3 -i ./episode.wav -filter_complex "[0:0][1:0] amix=inputs=2:duration=longest" -c:a libmp3lame ./output.mp3
[08:39:15 CET] <kensanata> The stackoverflow link has amix using duration=longest which sounds interesting, but none of the examples seem to help me delay my episode by 6s, so effectively the jingle and the episode start playing at the same time?
[08:39:20 CET] <kensanata> Ah! Exactly.
[08:39:25 CET] <furq> not sure if amix works
[08:39:29 CET] <furq> if it does then yeah that'll be perfect
[08:39:35 CET] <furq> oh
[08:39:42 CET] <furq> yeah it will work. that was easy
[08:40:06 CET] <furq> that's cool that amix lets you configure that and amerge doesn't, and also that they have different defaults
[08:40:11 CET] <furq> very simple
[08:41:11 CET] <kensanata> I still need a way to delay the onset of the episode?
[08:41:31 CET] <friendofafriend> I think you'd use [1:a]adelay=6000[a1]
[08:42:03 CET] <furq> -i jingle.mp3 -i episode.wav -filter_complex "[1:a]adelay=6000|6000[tmp];[0:a][tmp]amix" out.wav
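A combined sketch of the two command lines above (placeholder filenames): delay the episode by 6 s on both channels, then mix, keeping the longer input's duration:

    ffmpeg -i jingle.mp3 -i episode.wav \
           -filter_complex "[1:a]adelay=6000|6000[ep];[0:a][ep]amix=inputs=2:duration=longest" \
           -c:a libmp3lame output.mp3

One hedged caveat: amix scales its inputs down while both are active, so a volume filter after the mix may be wanted to restore the original loudness.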
[08:42:17 CET] <kensanata> Thanks!
[08:42:26 CET] <furq> also i forgot what amerge actually does so i retract my complaint from earlier
[08:42:31 CET] <furq> sorry ffmpeg i love you really
[08:42:44 CET] <friendofafriend> ffmpeg is just the bee's knees.
[08:43:51 CET] <kensanata> Given that bees have 6 legs and a befuddling number of knee joints...
[09:10:43 CET] <hans> why isn't my ffmpeg command working?
[09:11:34 CET] <friendofafriend> You could post the command you're using and the output to a paste site like http://paste.debian.net .
[09:11:41 CET] <hans> "ffmpeg -i test.mkv -ss 00:00:3 -vframes 1 -c:v png -" gives the error "[NULL @ 0000000002c00040] Unable to find a suitable output format for 'pipe:'" - how can i tell ffmpeg that the output format is png? *
[09:12:12 CET] <hans> (the first message was a joke, the 2nd isn't)
[09:13:08 CET] <friendofafriend> I think you'd want a "-f png" before the pipe.
[09:14:03 CET] <hans> "ffmpeg -i test.mkv -ss 00:00:3 -vframes 1 -c:v png -f png -" doesn't work either, ends with "[NULL @ 0000000002b60040] Requested output format 'png' is not a suitable output format \n pipe:: Invalid argument"
[09:15:45 CET] <friendofafriend> There is a format called "png_pipe" that might be more suitable.
[09:16:57 CET] <hans> "-c:v png_pipe -f png_pipe -" says "Requested output format 'png_pipe' is not a suitable output format \n pipe:: Invalid argument",
[09:17:22 CET] <friendofafriend> I see someone trying to do a similar thing here. https://superuser.com/questions/1047660/ffmpeg-pipe-images-extracted-from-video
[09:18:47 CET] <hans> oh great that worked, thanks; ffmpeg -i test.mkv -ss 00:00:3 -vframes 1 -c:v png -f image2pipe -
[09:19:07 CET] <hans> ("-f image2pipe" was suggested in the comments of that thread, and it worked!)
[09:19:12 CET] <friendofafriend> Hey, awesome! What a handy command, I'll make a note of it.
[09:19:45 CET] <friendofafriend> And you're always welcome. Best of luck!
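A hedged usage note on the command that worked: the single PNG arrives on stdout, so it can be redirected to a file or piped into another tool; putting -ss before -i (input seeking) is generally faster for grabbing one frame. The second line assumes ImageMagick's convert is installed:

    ffmpeg -ss 3 -i test.mkv -frames:v 1 -c:v png -f image2pipe - > frame.png
    ffmpeg -ss 3 -i test.mkv -frames:v 1 -c:v png -f image2pipe - | convert - -resize 50% thumb.png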
[13:02:30 CET] <Fyr> guys, what is "GPAC ISO Hint Handler"?
[13:03:08 CET] <Fyr> ffprobe says:
[13:03:08 CET] <Fyr> Stream #0:2(und): Data: none (rtp / 0x20707472), 167 kb/s (default)
[13:03:08 CET] <Fyr> Stream #0:3(und): Data: none (rtp / 0x20707472), 9 kb/s (default)
[13:03:19 CET] <Fyr> I've never seen something like this.
[13:04:27 CET] <furq> it's an rtp hint track
[13:04:41 CET] <furq> it's some mp4 peculiarity iirc
[13:05:24 CET] <Fyr> ffprobe says that it's unsupported codec.
[13:05:50 CET] <furq> are you planning on streaming the file using quicktime streaming server
[13:05:58 CET] <furq> because otherwise you can probably just discard it
[13:06:05 CET] <Fyr> ok, thanks
[13:06:18 CET] <furq> -ignore_unknown if ffmpeg is bailing out on it
[13:06:30 CET] <furq> or -copy_unknown if you really want to
[13:07:00 CET] <Fyr> thanks
[13:07:32 CET] <Fyr> furq, does GPAC add them by default?
[13:07:50 CET] <furq> no
[13:08:29 CET] <furq> https://gpac.wp.imt.fr/mp4box/#cont_deli
[13:10:41 CET] <Fyr> thanks
[13:11:26 CET] <Fyr> the two files have identical video stream, but one of them contains this RTP stuff.
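A minimal sketch of the remux furq describes, with placeholder filenames: keep everything ffmpeg can identify and drop the GPAC RTP hint tracks instead of erroring out:

    ffmpeg -i input.mp4 -map 0 -ignore_unknown -c copy output.mp4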
[13:27:12 CET] <sn00ker> hi all
[13:51:25 CET] <PhantomOfNyx> Morning, lovely people in here. I was wondering if anyone could give me a quick course in how sliced threads work (Mixer is strongly pushing real-time streaming and recommends Tune: Zerolatency, which I happen to know enables sliced threads to some degree)
[13:52:06 CET] <furq> slice threading splits each frame into multiple slices and encodes them separately
[13:52:27 CET] <furq> it's both slower and less efficient than frame threading
[13:52:44 CET] <furq> you shouldn't use it or zerolatency unless you're actually aiming for sub-second latency
[13:52:51 CET] <PhantomOfNyx> but it decodes faster then?
[13:53:01 CET] <PhantomOfNyx> like there has to be some point to them pushing it
[13:53:07 CET] <furq> it makes no difference to decode speed
[13:53:20 CET] <furq> but frame threading buffers multiple frames in the encoder
[13:53:40 CET] <PhantomOfNyx> Ahhhh that's why they are doing UDP and trying to get rid of any buffer
[13:54:03 CET] <furq> right
[13:54:52 CET] <furq> there's also -tune fastdecode and you can use them together
[13:54:57 CET] <furq> -tune fastdecode,zerolatency
[13:55:25 CET] <PhantomOfNyx> So, weird question: I'm sitting on 32 threads clocked at roughly 4.1 GHz (is there anything I can do to optimize the stream for a high thread count?)
[13:55:31 CET] <PhantomOfNyx> wait you can mix tunes O_O
[13:55:41 CET] <furq> you can mix tunes that don't change psy settings
[13:55:47 CET] <furq> which is pretty much just those two
[13:55:48 CET] <PhantomOfNyx> furq I remember you from last time I stopped by you're seriously a goldmine of information
[13:56:48 CET] <Mavrik> I've had nothing but strange video artifacts and segfaults when I was trying to use slice threading on FFmpeg ~3.x
[13:56:54 CET] <PhantomOfNyx> So this might be a dumb question, but let's say, just for argument's sake, I'm trying to use sliced threads instead of frame threading
[13:57:04 CET] <furq> there isn't really anything you can do to optimise for slice threading other than increasing the output dimensions
[13:57:12 CET] <furq> (that's not a recommendation)
[13:57:25 CET] <PhantomOfNyx> are the sliced threads based off the total number of threads set?
[13:57:55 CET] <PhantomOfNyx> like you would set threads=24 and then Sliced-threads=14
[13:57:59 CET] <furq> i believe it uses -threads if you enable slicing but idk
[13:58:09 CET] <PhantomOfNyx> or would that result in total 34 threads used
[13:58:18 CET] <furq> threads is specifically for frame threading
[13:58:25 CET] <furq> so if you set slice-threads it probably just gets ignored
[13:58:58 CET] <furq> also slices have minimum dimensions so there's a maximum number of threads you can use for given input dimensions
[13:59:05 CET] <PhantomOfNyx> ah so essentially I would want to replace threads=24 with sliced-threads=24 ?
[13:59:12 CET] <furq> yeah
[14:00:05 CET] <PhantomOfNyx> Ah thanks a bunch, do I have to set slices as an additional option or ? :)
[14:00:12 CET] <furq> zerolatency will set it for you
[14:00:53 CET] <furq> actually never mind sliced-threads is a bool
[14:01:03 CET] <furq> so -threads will work for both
[14:01:14 CET] <PhantomOfNyx> ._. sliced-threads:1 it is then
[14:02:13 CET] <furq> The maximum number of sliced threads is MIN( (height+15)/16 / 4, 128 )
[14:02:16 CET] <furq> so i guess that's 17 for 1080p
[14:02:27 CET] <PhantomOfNyx> I read something about how the non-deterministic parameter could help your encoding if you're running a lot of threads
[14:02:57 CET] <PhantomOfNyx> wait what would it be for 900p then ?
[14:03:11 CET] <furq> 915/16/4
[14:04:02 CET] <furq> also i think non-deterministic would only affect frame threading
[14:04:07 CET] <furq> there's no harm in enabling it though
[14:04:37 CET] <PhantomOfNyx> Now I'm a bit confused as you just mentioned sliced-threads is a bool
[14:04:52 CET] <PhantomOfNyx> would it do that calculation and etc by itself and enable it
[14:04:57 CET] <PhantomOfNyx> or do I have to set it somewhere?
[14:05:07 CET] <furq> it'll probably just clamp it
[14:05:30 CET] <furq> iirc non-deterministic does something with lookahead, and zerolatency disables lookahead
[14:05:40 CET] <furq> so i don't know that it'd do anything
[14:05:46 CET] <furq> maybe it does other stuff though
[14:06:08 CET] <PhantomOfNyx> that was exactly why I was trying to avoid the zerolatency tune and just add what Mixer wants without it ;)
[14:06:22 CET] <furq> i take it you are actually aiming for sub-second latency
[14:06:26 CET] <PhantomOfNyx> Hence why I'm trying to figure how sliced threads and stuff functions
[14:06:34 CET] <furq> otherwise all of this stuff is just harmful to video quality
[14:06:38 CET] <PhantomOfNyx> yeah they are on mixer
[14:06:41 CET] <furq> fair enough
[14:06:49 CET] <PhantomOfNyx> <_< they are very obsessed with sub second
[14:07:06 CET] <PhantomOfNyx> and their servers' decoding capabilities are... no offense, terrible
[14:07:20 CET] <furq> well zerolatency and fastdecode will also coincidentally ensure it's using baseline profile
[14:07:21 CET] <PhantomOfNyx> if you throw too high a resolution or too heavy an encode at it, it poops out on their end
[14:07:31 CET] <furq> so maybe they're just trying to save money on h264 licensing
[14:07:55 CET] <PhantomOfNyx> they are using webrtc, I am still very puzzled as to why they didn't go vp9
[14:08:04 CET] <furq> yeah webrtc needs baseline
[14:08:12 CET] <furq> you might want to try just using -profile baseline
[14:08:47 CET] <furq> to clarify, webrtc needs baseline because everyone uses openh264 for it, which is currently baseline only
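A hedged sketch pulling the low-latency pieces above together; the input, bitrates, thread count, and the RTMP URL/container are placeholders, and whether this matches what Mixer's ingest actually wants is an assumption:

    # zerolatency + fastdecode tunes, baseline profile; zerolatency enables
    # sliced threads, and -threads then sets how many slices/threads are used
    ffmpeg -i input -c:v libx264 -preset veryfast \
           -tune zerolatency,fastdecode -profile:v baseline \
           -threads 16 -b:v 6000k -maxrate 6000k -bufsize 6000k \
           -c:a aac -f flv rtmp://example.com/live/streamkey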
[14:08:53 CET] <PhantomOfNyx> 2 sec for once I'm actually taking notes
[14:09:31 CET] <PhantomOfNyx> I swear they recommend setting 'high' as the profile on their site
[14:09:46 CET] <furq> nice
[14:09:47 CET] <PhantomOfNyx> but then again Twitch's bitrate recommendation of 6k for 1080p60 is also a bit iffy
[14:10:02 CET] <furq> yeah this stuff is always ropey
[14:10:10 CET] <furq> youtube's recommendations used to be completely insane
[14:10:16 CET] <PhantomOfNyx> I have no clue what monster you need to pull that off; I know I am murdering my machine
[14:10:19 CET] <furq> although i guess someone who works there hangs out in here because they fixed it now
[14:10:26 CET] <PhantomOfNyx> 8.5k for 1080p60 is actually realistic, though
[14:10:46 CET] <PhantomOfNyx> ...given you have a very high-end dedicated render machine
[14:10:57 CET] <PhantomOfNyx> 6k though ... that's just insanity
[14:11:08 CET] <furq> depends on the game and the encoder
[14:11:24 CET] <furq> a twitchy fps will need way more bitrate
[14:11:43 CET] <PhantomOfNyx> true, but I meant for games like Overwatch or MMORPGs
[14:11:51 CET] <PhantomOfNyx> hearthstone should probably be fine xD
[14:11:56 CET] <sn00ker> Hello. As some here have gathered, I am currently building a streaming server. I had the problem that the stream always disconnected when the input files were changed.
[14:11:56 CET] <sn00ker> I have now found a way to keep the stream from being cancelled.
[14:11:56 CET] <sn00ker> # run.sh
[14:11:56 CET] <sn00ker> https://nopaste.linux-dev.org/?1191752
[14:11:56 CET] <sn00ker> with a second script, I send the whole thing to an rtmp server
[14:11:58 CET] <sn00ker> # send_stream.sh
[14:12:00 CET] <sn00ker> https://nopaste.linux-dev.org/?1191753
[14:12:02 CET] <sn00ker> The audio part is nasty, but I'll get to that in a moment.
[14:12:04 CET] <sn00ker> If the input is now swapped (run.sh), you can see in the send_stream.sh window how the frame rate slowly drops from fps=30 to fps=0; only at 0 does the script terminate. However, since this takes 10-20 seconds, you can also stop the stream by hand and start streaming from run.sh or something else without the stream breaking off.
[14:12:09 CET] <sn00ker> perfect, I thought to myself.
[14:12:11 CET] <sn00ker> the stream has no audio. Unfortunately, I have no experience with pipes. Can someone help me get the audio output from the ffmpeg in run.sh to the audio input of the ffmpeg in send_stream.sh?
[14:12:14 CET] <sn00ker> depending on what starts the initial video in run.sh, the video is jerky. Can someone help me with this as well?
[14:14:36 CET] <PhantomOfNyx> Also, the last point is that I don't think any of the major streaming services uses anything other than x264 (H.264). Even when Twitch rolls out VP9, it is only going to be for internal use, to save bandwidth by re-encoding all the H.264 streams to VP9 at a lower bitrate
[14:15:42 CET] <PhantomOfNyx> So there I think you can only expect a slightly better-quality transcode of your stream, but that's about it; they don't plan on adding VP9 ingests
[14:16:23 CET] <sn00ker> So the stream is created directly on a server and passed on locally
[14:18:22 CET] <sn00ker> if I instead send run.sh directly to an RTMP server, then nothing is jerky, but then the stream breaks when switching inputs
[14:24:24 CET] <sn00ker> and how can I pipe the audio?
[14:40:25 CET] <PhantomOfNyx> furq, is there anything further I can do to increase multithreaded performance beyond the non-deterministic parameter?
[15:43:10 CET] <Phantomofnyx> I deeply apologize, my PC rebooted so I lost any potential answers. But I was wondering if there are any suggestions for how I can increase multithreaded performance in x264 <3
[15:44:04 CET] <pink_mist> there was no answer to your last question
[15:44:14 CET] <pink_mist> so far
[15:44:28 CET] <Phantomofnyx> Ah, I'll keep waiting then ;)
[15:44:52 CET] <DHE> you're still using sliced threads? doesn't regular (aka frame) threading for x264 work better anyway?
[15:45:06 CET] <DHE> or maybe that's mainly a quality thing
[15:50:28 CET] <Phantomofnyx> Yeah I got rid of the sliced threads as I didn't see any performance difference with it
[15:51:30 CET] <Phantomofnyx> but I'm still hiccuping a tiny bit. I could of course just lower all my settings, but I'm so close to a profile I'm happy with. I noticed the non-deterministic thingy I mentioned above (which I had totally forgotten how to spell) helped quite a bit
[15:51:49 CET] <Phantomofnyx> and I was wondering if there was more like that which could help high thread count performance
[15:51:51 CET] <DHE> using "veryfast" preset or such?
[15:52:16 CET] <Phantomofnyx> I'm sitting with a Ryzen Threadripper 2950X on a dedicated render rig clocked at approx 4 GHz
[15:52:26 CET] <Phantomofnyx> So 16 cores, 32 threads
[15:52:41 CET] <DHE> yeah I know the specs...
[15:53:12 CET] <Phantomofnyx> yeah I know everyone keeps saying use 1 thread for ultimate performance, but I can't even run veryfast if I do that
[15:53:13 CET] <Phantomofnyx> xD
[15:53:38 CET] <DHE> while I only have a ryzen7 1700 (non-X) to compare with, I wonder if you'll have better luck if you try to treat it as a NUMA system and run x264 on one half of it with only 16 threads enabled
[15:53:55 CET] <Phantomofnyx> already tried
[15:54:11 CET] <Phantomofnyx> 16threads SMT performs better than 16threads without SMT
[15:54:16 CET] <Phantomofnyx> I don't exactly know why
[15:54:38 CET] <DHE> memory locality. threadripper isn't really quad channel. it's 2 dual-channel ryzen chips glued together.
[15:54:54 CET] <DHE> mind you it's pretty good glue, but still
[15:55:25 CET] <Phantomofnyx> Yeah, I set up a Linux environment with NUMA enabled, but it only shattered any hope of decent performance with x264
[15:56:05 CET] <Phantomofnyx> so I came to conclude that x264 does not like numa ._.
[16:01:12 CET] <Phantomofnyx> Whelp, so overclocking the RAM to be a bit speedier seemed to do the trick ._.
[16:02:00 CET] <DHE> x264 loves numa. but you have to set it up properly
[16:02:49 CET] <DHE> numactl --cpunodebind=0 -p 0 ffmpeg ... # syntax may vary by version of numactl
[16:03:38 CET] <Phantomofnyx> wait x264 got NUMA support O_O
[16:05:15 CET] <DHE> nope. you launch it with a numa profile set by the OS
[16:06:36 CET] <Phantomofnyx> Let's say that OS is Windows; could you guide me on how to set that up? Because just switching to NUMA completely murdered the performance last time I tried
[16:06:54 CET] <DHE> can't help you on windows, sorry
[16:07:07 CET] <Phantomofnyx> but you would on linux ?
[16:07:18 CET] <DHE> see the numactl command above
[16:07:35 CET] <DHE> you can also use numactl --hardware to see what linux thinks of your hardware layout, make sure it looks sane
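A slightly fuller sketch of the numactl approach (option spellings can differ between numactl versions, and the encode line itself is only a placeholder):

    # inspect the node layout first and make sure it looks sane
    numactl --hardware

    # bind the CPUs and prefer memory allocations on node 0, then run the
    # encode with a matching thread count
    numactl --cpunodebind=0 --preferred=0 \
        ffmpeg -i input.mkv -c:v libx264 -preset veryfast -threads 16 output.mkv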
[16:08:36 CET] <Phantomofnyx> Give me a moment, I need to boot into Linux. I had no clue there was any performance to gain from it, and, well, stuff like the lack of capture card support just made me go back to Windows
[16:10:06 CET] <Phantomofnyx> derp, back to Windows it is. Did you know you need Windows on your machine to activate NUMA for the 2950X?
[16:10:41 CET] <Phantomofnyx> there is no BIOS option for it, so you need the Ryzen Master tool thingie and do it from there
[16:10:46 CET] <Phantomofnyx> which .... requires windows
[16:17:30 CET] <Phantomofnyx> Quick question, does the speed at which the CPU processes the encoding affect the quality ?
[16:18:04 CET] <Phantomofnyx> because I just lost around 50% quality from clocking my RAM at a higher speed along with a bigger OC for the CPU (same profile, same settings)
[16:42:01 CET] <zerodefect> Using the C-API is it possible to get to the private user data or user data bits in the mpeg2 video? I have a clip which I believe has 708 embedded captions.
[17:05:29 CET] <DHE> zerodefect: captions are exported into the Side Data of the AVFrames
[17:07:45 CET] <zerodefect> Thanks. So it must be 'AV_FRAME_DATA_A53_CC' ? Not familiar with term/standard A53. Research to do!
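A hedged shell-side sanity check before reaching for the C API: ffprobe's per-frame output should list the caption side data if the decoder exports it (the exact side-data name printed, and hence the grep pattern, is an assumption):

    ffprobe -select_streams v:0 -show_frames input.mpg 2>/dev/null | grep -i a53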
[17:30:35 CET] <sn00ker> hi all
[17:30:41 CET] <sn00ker> i have this script...
[17:30:42 CET] <sn00ker> https://nopaste.linux-dev.org/?1191757
[17:31:02 CET] <sn00ker> I see it re-encoding in the terminal, but I don't get an output file
[18:43:24 CET] <iive> sn00ker, the script seems to run ffmpeg twice, first time it outputs to /dev/null, so no output is produced.
[18:44:00 CET] <iive> the second one should produce output. however if there is some error with it...
[18:44:38 CET] <iive> hum...
[18:45:28 CET] <iive> the first ffmpeg does not call libx264 explicitly, like the second one does. ffmpeg used to default to mpeg4 (Xvid-like),
[18:45:54 CET] <iive> if you use different codecs for the first and second pass... that would explain why it fails.
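A minimal consistent two-pass sketch of what iive describes, with placeholder filenames and bitrate; the important part is that both passes use the same encoder:

    # pass 1: analysis only, no real output needed
    ffmpeg -y -i input.mkv -c:v libx264 -b:v 2500k -pass 1 -an -f null /dev/null
    # pass 2: same codec and bitrate, now with audio and the real output file
    ffmpeg -i input.mkv -c:v libx264 -b:v 2500k -pass 2 -c:a aac output.mp4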
[19:23:13 CET] <hans> have you tried turning it off and then on again?
[19:31:04 CET] <sn00ker> yes
[19:31:42 CET] <sn00ker> iive, I copied it from another script that runs fine on another machine
[19:39:38 CET] <sn00ker> on the 1st pass I get this output
[19:39:39 CET] <sn00ker> 00007520 9a 22 9b 7e a1 19 7a e9 69 83 52 10 41 0e c5 99 .".~..z.i.R.A...
[19:39:39 CET] <sn00ker> 00007530 1c df f9 46 ef ca 12 36 a7 0a 13 18 58 fd f3 99 ...F...6....X...
[19:41:58 CET] <MichaelJoel> trying to convert an MOV to MP4. I must need to change some flag. If I open it in ffplay it is OK. If I open it in Windows Media Player I just get audio and a black screen.
[19:41:59 CET] <MichaelJoel> ?
[20:22:09 CET] <MichaelJoel> how would one join two video files into one?
[20:23:01 CET] <BtbN> https://trac.ffmpeg.org/wiki/Concatenate
[20:23:10 CET] <BtbN> If you just want to concat them, that is
[20:38:08 CET] <MichaelJoel> yes thanks
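For the common case on that wiki page (both files share codecs and parameters), a hedged sketch of the concat demuxer approach with placeholder filenames:

    # list the parts, then stream-copy them into one file
    printf "file '%s'\n" part1.mp4 part2.mp4 > list.txt
    ffmpeg -f concat -safe 0 -i list.txt -c copy joined.mp4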
[20:49:25 CET] <sn00ker> when i run ffmpeg in a script
[20:49:48 CET] <sn00ker> how can I do: if [ ffmpeg had an error ] then do this, else do that, fi ?
[20:50:02 CET] <JEEB> exit codes?
[20:50:41 CET] <sn00ker> JEEB, you mean $? ?
[20:50:58 CET] <JEEB> or you have to get it in a more funky way in case of pipes being utilized, but yes
[20:51:10 CET] <JEEB> for ffmpeg.c that's the only way to get "errors"
[20:51:22 CET] <JEEB> any more detailed error detection requires API usage
[20:51:49 CET] <sn00ker> JEEB, I need to know, after ffmpeg runs in the script, whether there was an error or not
[20:52:41 CET] <JEEB> you clearly know of exit codes
[20:52:53 CET] <JEEB> thus you already have answered the question you have asked
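A minimal shell sketch of what JEEB means, assuming bash (PIPESTATUS and pipefail are bash-isms; filenames are placeholders):

    # simplest form: branch directly on ffmpeg's exit status
    if ffmpeg -i input.mkv -c copy output.mkv; then
        echo "encode ok"
    else
        rc=$?
        echo "ffmpeg failed with exit code $rc" >&2
    fi

    # if ffmpeg sits inside a pipeline, enable pipefail (or check PIPESTATUS)
    # so its own status is seen rather than the last command's
    set -o pipefail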
[21:49:16 CET] <Kadigan> Here's a question... can you build ffmpeg w/ support for both 8bit and 10bit x264?
[21:49:25 CET] <relaxed> yes
[21:49:30 CET] <Kadigan> Is it hard to do?
[21:49:34 CET] <relaxed> no
[21:50:03 CET] <Kadigan> I'm using a "helper script" because I failed a number of times... and the helper script works, but trying to encode yuv420p gives me the error that ffmpeg was not built w/ 8-bit support...
[21:50:50 CET] <Kadigan> I can see that if I enable "high bitdepth", it configures as "--bit-depth=10"
[21:51:15 CET] <DHE> actually the latest (git) version of x264 does support that, and the git version of ffmpeg does support it. this is over a year old now so I suspect there's some release version that just supports it as well...
[21:51:35 CET] <relaxed> Kadigan: --bit-depth=all
[21:51:39 CET] <Kadigan> Thanks.
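A hedged sketch of such a build, with placeholder prefixes and paths; --bit-depth=all is the x264 configure switch mentioned above, and the ffmpeg configure just needs to find that x264 (nonfree options like libfdk-aac can be added as desired, at the cost of redistributability):

    # x264 with both 8-bit and 10-bit support in one library
    git clone https://code.videolan.org/videolan/x264.git
    cd x264
    ./configure --prefix="$HOME/ffbuild" --enable-static --bit-depth=all
    make -j"$(nproc)" && make install

    # ffmpeg linked against it
    cd ../ffmpeg
    PKG_CONFIG_PATH="$HOME/ffbuild/lib/pkgconfig" ./configure \
        --prefix="$HOME/ffbuild" --enable-gpl --enable-libx264
    make -j"$(nproc)"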
[21:52:09 CET] <relaxed> my builds have it: https://www.johnvansickle.com/ffmpeg/
[21:53:03 CET] <Kadigan> Do you also build nonfree? Because IIRC if you do, you can't distribute...
[21:53:13 CET] <relaxed> I do not
[21:53:30 CET] <Kadigan> Then I guess thanks, but no thanks. I use libfdk-aac. ^^
[21:54:01 CET] <Kadigan> Thank you for offering, though.
[21:54:46 CET] <Kadigan> And most importantly, thank you for telling me how to enable both. Much appreciated.
[21:59:07 CET] <relaxed> you're welcome
[23:34:53 CET] <Zexaron> Hello
[23:35:01 CET] <Zexaron> so -pix_fmt yuv420p isn't valid anymore ?
[23:35:10 CET] <Zexaron> I get a deprecation warning about this
[23:37:57 CET] <BtbN> You probably mean yuvj420p
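For context, a hedged example of the kind of option that triggers that warning and the commonly suggested replacement; filenames are placeholders, and whether this matches Zexaron's exact command is an assumption:

    # the warning concerns the deprecated full-range "J" pixel formats; the
    # usual replacement is the plain format plus an explicit color range
    ffmpeg -i input -pix_fmt yuv420p -color_range pc -c:v libx264 output.mp4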
[00:00:00 CET] --- Thu Jan 24 2019