[Ffmpeg-devel-irc] ffmpeg.log.20170420

burek burek021 at gmail.com
Fri Apr 21 03:05:01 EEST 2017


[04:07:09 CEST] <Butterfly_> Hi guys. I would like to cut out multiple parts of a video. Is there a one liner command that can help me achieve that?
[04:22:00 CEST] Last message repeated 1 time(s).
[04:22:55 CEST] <dystopia_> what type of video?
[04:23:20 CEST] <dystopia_> it can be done with ffmpeg but it's easier to cut with an app designed for that like videoredo or tsdoctor
[04:25:38 CEST] <dystopia_> here's a script i wrote to do it with ffmpeg https://paste.ofcode.org/4NnJcSA2igtDjKzpGTsDu9
[04:25:44 CEST] <dystopia_> but yeah it's a hassle
[04:26:00 CEST] <Sparkyish> Hey All, question about live DASH encoding. I am able to encode multiple video Representations (qualities/resolutions), but I need to encode multiple video AdaptationSets. I cannot find information on how to encode videoInput1 and videoInput2 into separate AdaptationSets in a single DASH manifest. Can anybody point me in the right direction please?
[04:27:41 CEST] <petecouture> Lol I guess Butterfly_ had what he needed and didn't have to thank you dystopia_
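
For reference, cutting several ranges in one ffmpeg invocation can be done with output-side -ss/-to per output file; a rough sketch with made-up timestamps and filenames (with -c copy the cut points land on keyframes, so re-encode if frame accuracy matters):

    ffmpeg -i input.mp4 \
        -ss 00:00:10 -to 00:00:30 -c copy part1.mp4 \
        -ss 00:05:00 -to 00:05:45 -c copy part2.mp4
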
[08:16:41 CEST] <zap0> which pix_fmt does the highest res in grayscale ?
[10:32:01 CEST] <superwar> I have a live h264 stream over UDP (without a transport stream), and I'm trying to mux it into an mp4 file, the problem is the incoming AVPacket's dts and pts are always AV_NOPTS_VALUE, so I guess av_interleaved_write_frame can't do its magic, what should I do?
[10:44:43 CEST] <kerio> superwar: send mpegts over UDP :D
[10:45:52 CEST] <Mandevil> Hello kerio
[10:46:03 CEST] <kerio> waddup
[10:46:42 CEST] <superware> kerio: unfortunately I have no control over the stream source, there must be a simple workaround, maybe tell the output format to somehow generate its own timing as the packets are being written etc, any ideas?
[10:47:03 CEST] <Mandevil> superware: The UDP contains elementary stream?
[10:48:06 CEST] <kerio> superware: -fflags +genpts
[10:48:16 CEST] <kerio> or something
[10:50:33 CEST] <superware> I'm using libav directly...
[10:55:01 CEST] <superware> Mandevil: no, h264 over UDP
[10:55:27 CEST] <Mandevil> Yeah, but what is the format? h.264 elementary stream?
[10:56:16 CEST] <superware> how can I check that?
[10:56:39 CEST] <kerio> ffprobe, i guess?
[10:56:51 CEST] <Mandevil> Yeah, that might work.
[10:57:13 CEST] <kerio> elementary stream is kinda like raw video but for h264
[10:57:36 CEST] <kerio> or well, most codecs
[10:58:23 CEST] <superware> "Input #0, h264, from 'udp://192.168.19.121:40002': Duration: N/A, bitrate: N/A Stream #0:0: Video: h264 (High), yuv420p(tv, bt470bg, bottom first), 704x576, 25 fps, 25 tbr, 1200k tbn, 50 tbc"
[10:58:25 CEST] <JEEB> in case of H.264 that would be the bit stream in a format called "Annex B" which is defined by the H.264 specification itself
[10:58:35 CEST] <kerio> yep, that would be elementary
[10:58:43 CEST] <kerio> the "h264" format
[10:58:44 CEST] <JEEB> which is basically NAL units dumped as-is with prefixes
[10:58:51 CEST] <kerio> JEEB: no timestamps, right
[10:58:54 CEST] <JEEB> nope
[10:59:02 CEST] <Mandevil> Annex B defines some signalling... like frame format etc?
[10:59:04 CEST] <kerio> only a framerate indication that may or may not be accurate
[10:59:08 CEST] <JEEB> in theory you could have timing SEI but not sure if anything uses that
[10:59:34 CEST] <JEEB> in general if you want timestamps you need proper container
[10:59:41 CEST] <superware> as this is a live stream, packet arrival is actually the presentation timing, sorta
[11:00:11 CEST] <Mandevil> But how does the decoder know frame boundaries (in the bitstream)?
[11:00:12 CEST] <kerio> superware: that only works if there are no B-frames tho
[11:00:21 CEST] <kerio> doesn't it
[11:02:23 CEST] <Mandevil> kerio: Decoder needs to store N reference frames at any rate.
[11:02:40 CEST] <Mandevil> Where N depends on profile/level I guess.
[11:02:55 CEST] <kerio> something something intra frame refresh
[11:03:33 CEST] <superware> kerio: why not?
[11:07:09 CEST] <Mandevil> Is there any preliminary version of the AOM AV1 codec out?
[11:16:00 CEST] <superware> what can I do if I still need to mux that h264 stream into an mp4?
[11:16:25 CEST] <kerio> what happens if you just ffmpeg -i udp://@:whatever -c:v copy foo.mp4?
[11:19:58 CEST] <superware> kerio: the generated mp4 file plays well! the output shows "Timestamps are unset in a packet for stream 0. This is deprecated and will stop working in the future. Fix your code to set the timestamps properly"
[11:20:30 CEST] <kerio> hm, try `-fflags +genpts` before -i
[11:20:41 CEST] <superware> I'm using the library directly, not ffmpeg.exe, and I have no idea what it does internally
[11:20:56 CEST] <kerio> yeah but you can just read the source to figure that out
[11:21:12 CEST] <superware> technically :)
[11:21:13 CEST] <kerio> the ffmpeg binary uses the ffmpeg library, as you might imagine
[11:22:25 CEST] <superware> -fflags +genpts seems the same, works well
[11:23:06 CEST] <superware> of course :) but I know it won't be simple to trace
[11:23:52 CEST] <superware> "copy" doesn't do any encoding right?
[11:26:24 CEST] <Mandevil> superware: Yes.
[11:27:12 CEST] <superware> can someone please direct me to the right place in code where this is implemented? :|
[11:28:00 CEST] <kerio> https://github.com/FFmpeg/FFmpeg/search?utf8=&q=genpts
[11:38:03 CEST] <superware> kerio: I went over the results, couldn't find the one related to pts generation (it's not ffplay, or ffserver) :|
[11:38:43 CEST] <kerio> AVFMT_FLAG_GENPTS?
[11:38:55 CEST] <kerio> in AVInputFormatContext->flags
[11:39:09 CEST] <superware> oh maybe this? https://github.com/FFmpeg/FFmpeg/blob/b8f26779d615dfb466e90627323b1a4e40639f76/libavformat/utils.c#L1717
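
For the libav-direct case, -fflags +genpts corresponds to a single demuxer flag set before opening the input; a minimal sketch using the URL from above (helper name invented, error handling trimmed):

    #include <libavformat/avformat.h>

    /* Hypothetical helper: open the raw H.264/UDP input with pts generation enabled. */
    static AVFormatContext *open_with_genpts(const char *url)
    {
        AVFormatContext *ic = avformat_alloc_context();
        if (!ic)
            return NULL;
        ic->flags |= AVFMT_FLAG_GENPTS;   /* same effect as -fflags +genpts */
        if (avformat_open_input(&ic, url, NULL, NULL) < 0)
            return NULL;                  /* open_input frees a user-supplied ic on failure */
        avformat_find_stream_info(ic, NULL);
        return ic;
    }

Usage would be open_with_genpts("udp://192.168.19.121:40002") in place of a plain avformat_open_input call.
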
[15:21:14 CEST] <dreadkopp> hey guys. someone can help me with the correct syntax for exporting one frame every second to image ?  right now i am trying 'ffmpeg -filter:v "select=not(mod(n\,24))" -i /path/to/video.mp4 /path/to/output-$04d.jpg '
[15:21:38 CEST] <dreadkopp> however the filter option cannot be applied this way
[15:25:16 CEST] <Mandevil> $04d? Shouldn't that be %04d?
[15:25:35 CEST] <Mandevil> (not that this would be the problem).
[15:26:03 CEST] <dreadkopp> right.. typo here
[15:27:22 CEST] <Mandevil> dreadkopp: Why do you have (n\,24)
[15:27:30 CEST] <Mandevil> I mean, why is the , escaped?
[15:27:38 CEST] <dreadkopp> right now my command is: 'ffmpeg -i /path/to/source.mp4 -vf "select=not(mod(n\,24))" -q:v 2 /path/to/output-%04d.jpg'
[15:28:07 CEST] <dreadkopp> Mandevil: i was looking at this : https://superuser.com/a/391749
[15:28:31 CEST] <dreadkopp> ffmpeg -h wasn't that helpful .. :P
[15:29:20 CEST] <Mandevil> Hm.
[15:29:28 CEST] <Mandevil> Let's see the documentation.
[15:30:05 CEST] <Mandevil> I'm pretty sure the backslash shouldn't be there.
[15:30:08 CEST] <Mandevil> Try to remove it
[15:30:15 CEST] <gp> I am using ffmpeg to create an hls stream.  If I point ffmpeg at the playlist via http, it can understand and encode to another format.  But my web player is choking after the first segment.  Is there any way to validate the stream with ffmpeg sort of like the apple media stream validator but on linux? Or any other tools that could be recommended for testing compatibility?
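
One rough way to sanity-check a playlist with ffmpeg itself is to decode everything into the null muxer and watch stderr (the URL is hypothetical; this only proves ffmpeg can fetch and decode each segment, it is not a full Apple-style compliance check):

    ffmpeg -v error -i http://example.com/live/index.m3u8 -f null -
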
[15:31:06 CEST] <furq> dreadkopp: -i foo.mp4 -r 1 out%04d.jpg
[15:31:48 CEST] <dreadkopp> furq: that was waay too easy! thanks man!
[15:32:08 CEST] <Mandevil> furq: That converts every frame of the video into a picture....
[15:32:11 CEST] <Mandevil> furq: Not every 24th.
[15:32:25 CEST] <furq> you missed a bit
[15:33:14 CEST] <dreadkopp> looks like a frame each second extracted to me.
[15:33:44 CEST] <dreadkopp> every 24th was my idea of 'every second'. didn't know there was a direct option as well :)
[15:34:03 CEST] <Mandevil> Hm, interesting.
[15:34:08 CEST] <Mandevil> -r N sets framerate.
[15:34:34 CEST] <Mandevil> "As an output option, duplicate or drop input frames to achieve constant output frame rate fps."
[15:34:44 CEST] <Mandevil> That explains it :)
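
In other words, once the option sits on the output side (after -i), either form extracts roughly one frame per second; paths are placeholders:

    ffmpeg -i /path/to/video.mp4 -r 1 /path/to/output-%04d.jpg
    ffmpeg -i /path/to/video.mp4 -vf fps=1 /path/to/output-%04d.jpg
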
[16:38:56 CEST] <nwoki> hi guys, i'm piping an image to ffmpeg via the following command:
[16:38:56 CEST] <nwoki> ffmpeg -y -f image2pipe -vcodec mjpeg -video_size 512x424 -r 15 -i pipe:0 -an -vcodec libx264 -tune zerolatency -pix_fmt yuv420p -f flv rtmp://192.168.1.6:1935/live/webcam
[16:38:56 CEST] <nwoki> and I get a long init delay from ffplay. Any ideas why?
[17:23:28 CEST] <Mista_D> Is that supposed to take longer: "-ss $t -i $file" VS. "-ss $t -f concat -i $file.concat" ??? Wondering why seeking is much slower with concat...
[17:33:18 CEST] <pihpah> What's wrong with this command:  ffmpeg -i {} -map 0:s:0 {}.srt ? It extracts only the subtitles of stream 0:s:0; if I try to specify another subtitle stream I am getting: Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height
[17:34:00 CEST] <pihpah> find . -name "*.mkv" -type f -exec ffmpeg -i {} -map 0:s:1 {}.srt \;
[18:04:00 CEST] <kerio> pihpah: are you sure there's a second subtitle?
[18:16:40 CEST] <pihpah> kerio: yeah
[18:28:54 CEST] <furq> what codec is the second subtitle stream
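
A quick way to answer that with ffprobe (filename hypothetical); image-based codecs such as hdmv_pgs_subtitle or dvd_subtitle cannot be converted to SRT, which would explain the encoder error:

    ffprobe -v error -select_streams s -show_entries stream=index,codec_name -of csv=p=0 file.mkv
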
[18:35:52 CEST] <kerio> is ac3 bad
[18:39:47 CEST] <kerio> oh man, mpegts over udp wifi is BRUTAL
[18:44:04 CEST] <bencoh> haha
[18:44:34 CEST] <kerio> can i transmit every packet twice or something
[18:53:19 CEST] <furq> if only there were a transport protocol that ensured ordered delivery of each packet
[18:53:24 CEST] <furq> some kind of transmission control protocol
[18:53:26 CEST] <furq> but there isn't
[18:58:00 CEST] <kerio> maybe one day we'll have the technology
[19:02:18 CEST] <bencoh> kerio: actually there are simple programs that do just what you're looking for and are used in real production systems
[19:02:42 CEST] <bencoh> see the helper tools shipped with multicat (iirc)
[19:05:41 CEST] <bencoh> kerio: aggregartp / reordertp
[19:34:32 CEST] <TAFB> not having much like with ffmpeg HLS streaming. Is there another program I should try other than ffmpeg?
[19:34:35 CEST] <TAFB> luck
[19:38:11 CEST] <kerio> TAFB: what kind of luck?
[19:39:37 CEST] <TAFB> ffmpeg stops, sits there waiting for the next HLS segment forever, usually right after the first segment, sometimes after a random amount.
[19:41:36 CEST] <klaxa> TAFB: what are you even trying to do? stream hls? record hls?
[19:41:49 CEST] <TAFB> sometimes when I start ffmpeg I get " HTTP error 416 Requested Range Not Satisfiable, Failed to reconnect at 515. Stream ends prematurely at 515, should be 1844674407370955161"
[19:42:31 CEST] <TAFB> i'm using 60 second segments
[19:42:46 CEST] <klaxa> sounds like a connection timed out, ffmpeg tried to reconnect and the webserver doesn't support partial content
[19:43:53 CEST] <TAFB> ffmpeg doesn't try very hard to reconnect I guess, one timeout and it sits there and twiddles its thumbs
[19:44:27 CEST] <TAFB> is there another program similar to ffmpeg you could recommend?
[20:12:26 CEST] <DHE> with 60 second segments, ffmpeg will average 30 seconds doing nothing
[20:13:00 CEST] <DHE> which is actually rather high for most HLS videos. 10 seconds would be considered "normal"
[20:13:04 CEST] <TAFB> it's been sitting waiting for the next segment for 20 minutes or so now
[20:13:16 CEST] <TAFB> i'll try 10 seconds
[20:14:58 CEST] <TAFB> my current command line: https://pastebin.com/raw/CzXmkU7B
[20:15:22 CEST] <TAFB> that's the "encoding" side
[20:15:28 CEST] <TAFB> no issues with that
[20:16:16 CEST] <TAFB> changed hls time to 10
[20:16:31 CEST] <TAFB> will let it "encode" for a bit then try streaming again
[20:18:55 CEST] <TAFB> nope, no good, just sits there at [hls,applehttp @ 0x4583140] HLS request for url 'http://mydomain.com/Live/mystream/index0.ts', offset 0, playlist 0
[20:22:39 CEST] <BtbN> "5000_b3a7d5fa1e3394d4-p - 5193K.flv.part"
[20:22:46 CEST] <BtbN> are you sure that file is intact?
[20:34:54 CEST] <TAFB> kinda, it's downloading it as I'm streaming it
[20:35:08 CEST] <BtbN> Yeah, that's why it stops at some point
[20:35:21 CEST] <BtbN> it reached the point where the file ended when it was opened
[20:35:31 CEST] <BtbN> Can't read further than that without re-opening
[20:36:22 CEST] <BtbN> must have been quite shocking
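
For the earlier HTTP 416 / reconnect complaint: the http protocol does expose retry options; a sketch with illustrative values (the playlist path is a guess based on the segment URL above, and whether the options reach the segment requests through the hls demuxer depends on the ffmpeg version):

    ffmpeg -reconnect 1 -reconnect_streamed 1 -reconnect_delay_max 10 \
        -i http://mydomain.com/Live/mystream/index.m3u8 -c copy out.ts
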
[20:37:56 CEST] <DHE> if I avcodec_close() a context, is it safe to re-open it again otherwise unmodified as a new encoding session with the same parameters?
[20:45:42 CEST] <DHE> just found the doc clause that says no..
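
Since a closed context cannot be reused, the usual pattern is to free it and build a fresh one with the same settings; a rough sketch assuming the AVCodec, saved AVCodecParameters and time base are still around (the helper name and variables are made up):

    #include <libavcodec/avcodec.h>

    /* Hypothetical helper: tear down a finished encoding session and open a
     * new one with the same codec and parameters. */
    static int reopen_encoder(AVCodecContext **pctx, const AVCodec *codec,
                              const AVCodecParameters *par, AVRational time_base)
    {
        avcodec_free_context(pctx);                 /* frees the old, closed context */
        *pctx = avcodec_alloc_context3(codec);      /* fresh context for a new session */
        if (!*pctx)
            return AVERROR(ENOMEM);
        avcodec_parameters_to_context(*pctx, par);  /* restore the previous parameters */
        (*pctx)->time_base = time_base;             /* not carried by AVCodecParameters */
        return avcodec_open2(*pctx, codec, NULL);
    }
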
[21:39:14 CEST] <yasar> Hi,
[21:39:28 CEST] <yasar> I am looking for a simple library to read/transform/write wav files
[21:39:37 CEST] <yasar> can I use ffmpeg libraries for that?
[21:39:49 CEST] <thebombzen> you can, but you should check out sox
[21:39:52 CEST] <kerio> use ffmpeg to turn your wav into pcm_s16le
[21:40:02 CEST] <kerio> then it's just bytes
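
The round trip kerio describes looks roughly like this (sample rate and channel count are examples and must match the actual file):

    ffmpeg -i in.wav -f s16le -c:a pcm_s16le raw.pcm
    ffmpeg -f s16le -ar 44100 -ac 2 -i raw.pcm out.wav
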
[21:40:21 CEST] <thebombzen> ffmpeg's audio filtering is mediocre. sox does a better job at wav and audio manipulation
[21:40:29 CEST] <yasar> kerio, do you have a link to some sort of documentation/tutorial?
[21:40:46 CEST] <thebombzen> you should probably use sox anyway
[21:41:03 CEST] <thebombzen> http://sox.sourceforge.net/libsox.html
[21:42:32 CEST] <yasar> sox have a library that I can include in my program, or is it command line utility only?
[21:43:10 CEST] <yasar> Because I need to make custom computation on byte level
[21:44:30 CEST] <yasar> Oh, nevermind, I just saw it has a library too
[22:04:03 CEST] <durandal_1707> sox sucks
[22:04:34 CEST] <durandal_1707> it supports only uint32 sample format for filtering
[22:05:06 CEST] <durandal_1707> so i do not understand your reasoning that ffmpeg is mediocre
[22:05:28 CEST] <durandal_1707> when it is better in almost every aspect
[22:05:47 CEST] <durandal_1707> besides sox development is stalled
[22:11:19 CEST] <slalom> what sample format are you using?
[22:27:13 CEST] <thebombzen> durandal_1707: I've had problems with libavfilter's audio filters being subpar
[22:27:20 CEST] <thebombzen> at least in the past. I haven't used them recently
[22:27:59 CEST] <thebombzen> then again, my impression of ffmpeg's audio support is from a long time ago. pre-swresample so idk
[22:28:05 CEST] <durandal_1707> slalom: double
[22:29:03 CEST] <durandal_1707> thebombzen: then keep silent if the last time you touched af was when it didn't even exist
[22:29:46 CEST] <thebombzen> af existed before swresample?
[22:29:57 CEST] <thebombzen> and I have used atempo recently and got poor results
[22:46:54 CEST] <yasar> I will mostly deal with uint16
[23:03:14 CEST] <IntruderSRB> hi, can anyone here help me understand why an interlaced DRM-protected stream would work in Firefox but throw MEDIA_ERR_DECODE in Chrome (PC)?
[23:03:26 CEST] <IntruderSRB> same stream does play in Chrome on Tizen 3.0 for example etc...
[23:03:29 CEST] <IntruderSRB> it's a bit weird :/
[23:04:28 CEST] <IntruderSRB> my application is the muxer so I can confirm that h264 units get muxed into fragmented mp4 the same way for both non-DRM and DRM streams
[23:04:36 CEST] <IntruderSRB> (with the addition of several boxes for the DRM-protected one ofc.)
[23:04:46 CEST] <BtbN> Don't they all use totally different DRM blobs?
[23:05:24 CEST] <IntruderSRB> well you just pass h264 through cenc encryptor and add cenc related mp4 boxes into mp4
[23:05:43 CEST] <IntruderSRB> offsets and timestamps are pretty much the same
[23:07:11 CEST] <durandal_1707> thebombzen: what exactly was bad with atempo?
[23:07:24 CEST] <thebombzen> it sounded awful
[23:07:56 CEST] <thebombzen> I got better results with audacity and with librubberband (which does have a libavfilter addon, but still)
[23:08:14 CEST] <mbparsa> guys i need some help to convert YUYV422 to YUV420
[23:08:26 CEST] <furq> i've had perfectly good results with atempo
[23:09:18 CEST] <mbparsa> 1- sws_ctx = sws_getContext(ctx->width, ctx->height, AV_PIX_FMT_YUYV422, ctx->width, ctx->height, AV_PIX_FMT_YUV420P, SWS_BICUBIC, NULL, NULL, NULL);
[23:09:43 CEST] <mbparsa> 2- const uint8_t *pImageBuffer = (uint8_t *)ptrGrabResult->GetBuffer();
[23:10:00 CEST] <mbparsa> *pImageBuffer is a Pylon buffer
[23:10:14 CEST] <mbparsa> 3- int size = avpicture_fill((AVPicture*)src_picture, pImageBuffer, AV_PIX_FMT_YUYV422, ctx->width, ctx->height);
[23:10:24 CEST] <mbparsa> 4-                 sws_scale(sws_ctx, src_picture->data, src_picture->linesize, 0, ctx->height, dst_picture->data, dst_picture->linesize);
[23:11:09 CEST] <mbparsa> it gives me a memory access violation error
[23:14:49 CEST] <mbparsa> any one can help me to figure out how to convert a GigE camera stream (YUYV422) to YUV420P?
[23:17:30 CEST] <mbparsa> any one can help me to figure out how to convert a GigE camera stream (YUYV422) to YUV420P?   here is part of the code  https://pastebin.com/D7fENAWE
[23:18:24 CEST] <thebombzen> mbparsa: do you mean with libswscale or with the command line tools?
[23:18:52 CEST] <mbparsa> thebombzen with libswscale
[23:18:59 CEST] <thebombzen> oh. that I don't know
[23:20:08 CEST] <JEEB> been quite a while since I did swscale
[23:20:30 CEST] <mbparsa> thebombzen: this is my first time using ffmpeg, so ffmpeg is at least capable of doing this conversion , correct ?
[23:20:35 CEST] <JEEB> yes
[23:20:51 CEST] <thebombzen> Yea definitely
[23:20:52 CEST] <JEEB> if the input and output colorspaces are something supported by avfilter or swscale, yes
[23:21:01 CEST] <JEEB> https://github.com/jeeb/matroska_thumbnails/blob/master/src/matroska_thumbnailer.cpp#L321
[23:21:07 CEST] <JEEB> this is what I did for X=>YCbCr
[23:21:10 CEST] <JEEB> back in 2013
[23:21:10 CEST] <durandal_1707> mbparsa: have you seen examples in repo?
[23:21:11 CEST] <mbparsa> so I have my code posted here https://pastebin.com/D7fENAWE
[23:21:15 CEST] <JEEB> X=>RGB I mean :P
[23:21:30 CEST] <JEEB> and yes, there's examples in the repo
[23:23:35 CEST] <mbparsa> can you explain how to move from a packed format to a planar format, what steps should I take?
[23:24:23 CEST] <mbparsa> also where can I find the list of acceptable pixel formats for each encoder (e.g. H.264)?
[23:25:08 CEST] <JEEB> the command line tool can expose them as a list per-encoder (there's multiple for H.264 f.ex.), and then of course there's the source code
[23:25:22 CEST] <JEEB> it's a list in a struct
[23:25:29 CEST] <BtbN> I don't think h264, or most codecs at that, have an inherent pixel format
[23:25:36 CEST] <BtbN> it's just about what the specific encoder accepts
[23:25:54 CEST] <JEEB> well, H.264 is just generally planar
[23:26:06 CEST] <JEEB> so for example RGB in it is GBR and planar
[23:26:44 CEST] <BtbN> but it doesn't matter if you feed it nv12, yv12, or any other variant
[23:26:59 CEST] <BtbN> it only matters that it's yuv420, 444, or whatever you want to use
[23:27:47 CEST] <JEEB> well not exactly. the encoder needs to know how that data of yours looks and support that. nv12 for example is a special case due to how x264 switched to it internally
[23:27:52 CEST] <mbparsa> so i have a YUVU422 what is the best format that i should convert to ?
[23:28:23 CEST] <mbparsa> i meant YuYV422
[23:28:29 CEST] <BtbN> JEEB, yes, the encoder does.
[23:28:37 CEST] <BtbN> But the codec does not care or carry that kinda information
[23:28:49 CEST] <Mavrik> Isn't that a meaningless debate?
[23:29:00 CEST] <JEEB> well the codec just has the standard and as I said H.264 is specified to be planar
[23:29:07 CEST] <Mavrik> Anything that's not YUV420P is not going to be widely playable.
[23:29:12 CEST] <JEEB> so you don't get NV12 in H.264 itself
[23:29:34 CEST] <BtbN> kinda weird that most h264 encoders prefer it
[23:29:44 CEST] <mbparsa> ok I should convert whatever I have to YUV420P? correct ?
[23:29:58 CEST] <JEEB> it's just a convenient format for SIMD in some cases as well as GPUs seem to like it
[23:31:50 CEST] <faLUCE> hello, I have three files Y, U, V ... each file is the corresponding plane of a 640x480 YUV420 image. Do you know any utility for displaying each plane (= file) separately ?
[23:31:51 CEST] <JEEB> also now that I re-think I think I might have talked bullshit as I don't remember at all how the coding in H.264 went again. although I remember there being a coding mode that just enabled you to push the raw bits there so I guess it might as well be in planar order one after another slice per slice.
[23:31:51 CEST] <mbparsa> guys any one of you can help me with the code ?
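
A self-contained sketch of the conversion mbparsa is attempting, assuming the Pylon buffer is tightly packed YUYV422 (2 bytes per pixel, so the source linesize is 2*width) and allocating the destination frame explicitly, since an unallocated or mis-sized destination picture is a common cause of that kind of access violation; the helper name is invented and error handling is minimal:

    #include <libavutil/frame.h>
    #include <libswscale/swscale.h>

    /* Hypothetical helper: convert one packed YUYV422 buffer into a newly
     * allocated YUV420P frame.  The caller owns the returned frame. */
    static AVFrame *yuyv422_to_yuv420p(const uint8_t *src_buf, int width, int height)
    {
        struct SwsContext *sws = sws_getContext(width, height, AV_PIX_FMT_YUYV422,
                                                width, height, AV_PIX_FMT_YUV420P,
                                                SWS_BICUBIC, NULL, NULL, NULL);
        if (!sws)
            return NULL;

        /* The packed source is a single plane, 2 bytes per pixel. */
        const uint8_t *src_data[4] = { src_buf, NULL, NULL, NULL };
        int src_linesize[4]        = { 2 * width, 0, 0, 0 };

        /* Destination frame with its own properly aligned buffers. */
        AVFrame *dst = av_frame_alloc();
        if (!dst) {
            sws_freeContext(sws);
            return NULL;
        }
        dst->format = AV_PIX_FMT_YUV420P;
        dst->width  = width;
        dst->height = height;
        if (av_frame_get_buffer(dst, 32) < 0) {
            av_frame_free(&dst);
            sws_freeContext(sws);
            return NULL;
        }

        sws_scale(sws, (const uint8_t * const *)src_data, src_linesize,
                  0, height, dst->data, dst->linesize);
        sws_freeContext(sws);
        return dst;
    }
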
[23:34:31 CEST] <thebombzen> faLUCE: if you have a plane and you want to view it as a grayscale image, you can do: ffmpeg -f rawvideo -video_size 640x480 -pixel_format gray8 -i y_plane.raw y_plane.png
[23:34:37 CEST] <thebombzen> and then view it as a grayscale png
[23:34:57 CEST] <mbparsa>  guys any one of you can help me with the code (ffmpeg API)?
[23:35:02 CEST] <Afshaal> aha
[23:35:14 CEST] <faLUCE> thebombzen:  thanks
[23:35:16 CEST] <thebombzen> the U and V planes are going to have half-resolution in each direction so you have to take that into account
[23:35:40 CEST] <Afshaal> Can ffmpeg be used to recover video and audio from files that weren't finalized at encode time?  For instance a stream recording that crashes before it can be finished?
[23:35:42 CEST] <JEEB> mbparsa: if you know how your buffers are set something similar to my code but of course with a different output colorspace should work :P
[23:35:53 CEST] <faLUCE> thebombzen: do you mean that I'll see dots in the U and V files?
[23:35:59 CEST] <JEEB> Afshaal: if it's a format that requires an index but it was never written then no
[23:36:17 CEST] <JEEB> Afshaal: you can't really do that easily
[23:36:26 CEST] <thebombzen> faLUCE: no I mean with yuv420p, the u and v planes are half resolution in each direction
[23:36:36 CEST] <Afshaal> What do you mean by index?  Is MP4 (x264 and AAC) one of those formats?
[23:36:37 CEST] <thebombzen> so the Y plane is 640x480 and the other two are 320x240
[23:36:46 CEST] <JEEB> Afshaal: yes, unless fragments are used
[23:36:48 CEST] <thebombzen> that's what the 4:2:0 subsampling is
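
Concretely, for a 640x480 yuv420p image the same rawvideo trick applies to the chroma planes at half resolution (filenames are placeholders):

    ffmpeg -f rawvideo -video_size 320x240 -pixel_format gray8 -i u_plane.raw u_plane.png
    ffmpeg -f rawvideo -video_size 320x240 -pixel_format gray8 -i v_plane.raw v_plane.png
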
[23:37:01 CEST] <JEEB> but most likely if things have failed you have no movie fragments in use :P
[23:37:28 CEST] <Afshaal> I bet this wouldn't be a problem if I had been dumping video to .TS
[23:37:35 CEST] <Afshaal> -.-
[23:37:51 CEST] <JEEB> well MPEG-TS is for broadcast. not for seeking but yes, each packet is 188 bytes and that's it
[23:37:54 CEST] <thebombzen> yea don't record to mp4 in OBS
[23:37:59 CEST] <JEEB> they can be parsed A=>B
[23:38:06 CEST] <thebombzen> it's generally a better idea to record to .ts, and then remux to mp4 later on
[23:38:13 CEST] <Afshaal> I was recording to MP4 with some digital camera streaming software
[23:38:22 CEST] <JEEB> or mp4 with movie fragments but almost nothing does that :P
[23:38:24 CEST] <thebombzen> oh. yea but either way don't record to mp4 if you can record to .ts
[23:38:36 CEST] <Afshaal> mmhmm, lesson learned :(
[23:38:49 CEST] <durandal_1707> you can re-add the index with other software
[23:38:54 CEST] <Afshaal> Still gotta recover these files if I can though
[23:39:12 CEST] <JEEB> durandal_1707: only if you have similar files and those have the exact same parameters
[23:39:48 CEST] <JEEB> I think someone had a project going for it, but no idea how well that works :P
[23:40:00 CEST] <thebombzen> lsmash might be able to do it
[23:40:13 CEST] <JEEB> l-smash is just a ISOBMFF thing
[23:40:16 CEST] <JEEB> it doesn't do anything more
[23:40:16 CEST] <thebombzen> speaking of lsmash, jeeb why is there a muxer and a remuxer command
[23:40:26 CEST] <thebombzen> and "muxer" takes muxed files
[23:40:32 CEST] <Afshaal> I do have some video files that were finished by the streaming software without it crashing
[23:40:33 CEST] <JEEB> uhh, no?
[23:40:42 CEST] <JEEB> at least it shouldn't
[23:41:04 CEST] <JEEB> remuxer is the one that takes non-"raw" input
[23:41:05 CEST] <faLUCE> thebombzen: in fact the command you wrote doesn't work for the u and v planes, which are half of the size of the y plane
[23:41:18 CEST] <thebombzen> faLUCE: in fact I told you it wouldn't work
[23:41:21 CEST] <thebombzen> and gave you the reason why
[23:41:24 CEST] <JEEB> also they're separate apps because nobody had the resources yet to make them into a single app :P
[23:41:25 CEST] <thebombzen> and told you what to do instead
[23:41:51 CEST] <faLUCE> thebombzen: I tried with 320x240 for them
[23:42:56 CEST] <Afshaal> [mov,mp4,m4a,3gp,3g2,mj2 @ 0xe5f1a0] moov atom not found
[23:42:59 CEST] <Afshaal> mmhmm
[23:43:15 CEST] <Afshaal> guess I can't just re-contain this
[23:43:20 CEST] <Afshaal> re-containerize
[23:43:47 CEST] <JEEB> that thing contains the decoder init data among other things if I recall correctly :P
[23:43:48 CEST] <thebombzen> faLUCE: then what do you mean by "it didn't work"
[23:44:52 CEST] <Afshaal> is there a way to like...  grab the format specifications from a working file with ffprobe and then force ffmpeg to read a broken file with that same info?
[23:45:05 CEST] <faLUCE> thebombzen: http://paste.ubuntu.com/24423003/
[23:46:02 CEST] <JEEB> not with that tool you said, and no
[23:46:36 CEST] <JEEB> you can in theory make it decode'able if you can stick the right bits together and the video and audio streams are the same in both working and non-working files
[23:46:53 CEST] <JEEB> but it's not a normal use case in any way or form for the ffmpeg toolset
[23:46:56 CEST] <thebombzen> also faLUCE are you sure you started with yuv420p?
[23:48:11 CEST] <faLUCE> thebombzen: http://paste.ubuntu.com/24423016/
[23:48:30 CEST] <Afshaal> ._.
[23:48:56 CEST] <faLUCE> thebombzen: not sure if the u and v planes are corrupt or valid
[23:49:01 CEST] <Afshaal> it looks like someone made a handy tool for just this sort of problem here https://github.com/ponchio/untrunc
[23:49:15 CEST] <Afshaal> but it's not in the Ubuntu repo so I gotta compile it.  How annoying
[23:49:55 CEST] <JEEB> yes, that was the project that was mentioned before here
[23:50:04 CEST] <JEEB> unfortunately I have no idea how well it works
[23:50:36 CEST] <Afshaal> I'll give it a go and let ya know
[23:50:49 CEST] <thebombzen> faLUCE: those u and v planes would be half the resolution if it's 4:2:0
[23:50:53 CEST] <thebombzen> it's possible someone else upscaled them
[23:51:12 CEST] <thebombzen> try using full resolution and see what happens
[23:52:29 CEST] <djk> I'm doing an x11grab stream to facebook and it works fine when I'm at the local machine and have to restart, but I would like to be able to restart it from a remote ssh cli. Is that possible?
[23:52:53 CEST] <faLUCE> thebombzen: I'll check that. but the input files are probably corrupt. I have to check that too
[23:53:15 CEST] <thebombzen> that error message just means the input was bigger than a 320x240 plane
[23:54:12 CEST] <thebombzen> djk: trying to x11grab from an ssh session might give you xauth troubles
[23:55:14 CEST] <thebombzen> it should be possible if you get rid of those issues
[23:55:45 CEST] <djk> was thinking not likely, but hey, you can do anything in unix ;-) guess I would have to do a vnc session, bummer
[23:55:52 CEST] <BtbN> Just set DISPLAY accordingly, and you are fine if you are the right user.
[23:56:00 CEST] <thebombzen> BtbN: it's not that simple with xauth
[23:56:16 CEST] <BtbN> I have been doing that forever, never had issues.
[23:56:21 CEST] <thebombzen> good for you
[23:56:32 CEST] <thebombzen> djk: theoretically, you should be able to do it. you provide the display number to ffmpeg when you grab so there shouldn't be any problems
[23:56:39 CEST] <BtbN> xauth is an issue if you want to access the display from a foreign user
[23:56:41 CEST] <thebombzen> you could run into xauth problems, but you could also just not
[23:56:53 CEST] <thebombzen> BtbN: yes, but if you use a display manager your x server is owned by root
[23:57:14 CEST] <thebombzen> which makes your user login a foreign user
[23:57:15 CEST] <BtbN> Every sane display manager adjusts the xauth to the user on login
[23:57:21 CEST] <Afshaal> if I can actually manage to compile it, that is...
[23:57:32 CEST] <thebombzen> idk, I've had issues with lightdm
[23:57:37 CEST] <djk> nohup ffmpeg -thread_queue_size 2048 -probesize 64M -framerate 30 -r 30 -f x11grab -s 1280x720  -i :0.0+1280,10 -f lavfi -i anullsrc -c:v libx264 -pix_fmt yuv420p -g 30 -c:a aac -ar 44100 -b:a 128k -preset veryfast -maxrate 4000k -bufsize 960k -r 30 -f flv "rtmp://rtmp-api.facebook.com:80/rtmp/$KEY" &
[23:57:42 CEST] <thebombzen> I'm not sure if it's because I have it set to autologin as pam
[23:57:52 CEST] <thebombzen> djk: don't use -r for x11grab
[23:57:57 CEST] <BtbN> If the logged in user couldn't communicate with the X server, it wouldn't be able to start new applications
[23:58:09 CEST] <thebombzen> especially since your display is probably 60 Hz
[23:58:31 CEST] <thebombzen> BtbN: I know what you're saying, but it doesn't change the fact that I've had those issues appear
[23:58:42 CEST] <BtbN> Something was weird on your system then.
[23:58:46 CEST] <thebombzen> probably
[23:58:54 CEST] <BtbN> Can always just turn xauth off if you run into issues
[23:59:03 CEST] <thebombzen> you can? I didn't know that :O
[23:59:08 CEST] <djk> it works and throws errors in the log. yes, I could use -v error, but I want to see that the stream is still doing things, and that shuts it off
[23:59:25 CEST] <BtbN> xhost +
[23:59:30 CEST] <BtbN> turns off basically all access control
[23:59:32 CEST] <thebombzen> anyway djk, you want to use -video_size and -framerate
[23:59:51 CEST] <BtbN> which includes all clients on your LAN
[23:59:54 CEST] <BtbN> so be careful with it
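
A sketch of the remote-restart setup being discussed, assuming you ssh in as the same user that owns the X session (the DISPLAY/XAUTHORITY values are examples; -video_size and -framerate replace the -s/-r usage above, and the rtmp target is the one from djk's command):

    export DISPLAY=:0
    export XAUTHORITY=$HOME/.Xauthority   # only if xauth complains
    ffmpeg -f x11grab -framerate 30 -video_size 1280x720 -i :0.0+1280,10 \
        -f lavfi -i anullsrc -c:v libx264 -pix_fmt yuv420p -c:a aac \
        -f flv "rtmp://rtmp-api.facebook.com:80/rtmp/$KEY"
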
[00:00:00 CEST] --- Fri Apr 21 2017

