[Ffmpeg-devel-irc] ffmpeg.log.20170616

burek burek021 at gmail.com
Sat Jun 17 03:05:01 EEST 2017


[00:03:07 CEST] <FishPencil> I think nlmeans might be "better"
[00:08:37 CEST] <FishPencil> Wow nlmeans is slow
[00:33:41 CEST] <ac_slater> hey guys. I'm muxing a video stream with a data stream in an mpegts container. Is it odd if my data stream has a PTS that simply starts at 1 and monotonically increases for every packet?
[00:34:03 CEST] <ac_slater> or should the data PTS be relatable to the nearest video PTS? ...
[00:35:19 CEST] <ac_slater> (I guess PTS and DTS)
[01:02:32 CEST] <ac_slater> no one? Damn
[01:02:44 CEST] <ac_slater> I've been stuck on this for days. Maybe I should make a mailing list post
[01:36:30 CEST] <alexpigment> hey guys, is there a way to get the source code from a zeranoe nightly?
[01:37:28 CEST] <alexpigment> ffmpeg-20170601-bd1179e-win32-shared.zip is what i'm looking at
[01:37:46 CEST] <alexpigment> i'd like to basically re-build this and leave out unnecessary libraries
[01:43:48 CEST] <c_14> git clone https://git.ffmpeg.org/ffmpeg.git; cd ffmpeg; git checkout bd1179e
[01:44:05 CEST] <JEEB> if you have unlimited bandwidth raw will just be quicker. but you usually don't have that.
[01:44:17 CEST] <JEEB> welp, was way up there
[01:46:01 CEST] <alexpigment> c_14 thanks
[01:46:58 CEST] <ac_slater> guys, is there something special I have to do to mux video sources with B-frames? ie - h264 ?
[01:47:24 CEST] <JEEB> with avformat and proper pts, no
[01:47:36 CEST] <JEEB> I mean, in avformat
[01:47:57 CEST] <ac_slater> If I just read all packets from a file, for example, then mux them into an mpegts container with av_interleaved_write_frame, I get out of order PTS errors
[01:48:29 CEST] <JEEB> I know that some demuxers sometimes give bad timestamps
[01:48:45 CEST] <ac_slater> ah right, I guess I'm just reading raw H264
[01:48:47 CEST] <JEEB> for example the MPEG-TS demuxer I've had sometimes give me issues
[01:48:58 CEST] <JEEB> ac_slater: read it with avformat?
[01:49:03 CEST] <ac_slater> yea
[01:49:05 CEST] <JEEB> I would guess that's one of the things that should work :P
[01:49:20 CEST] <ac_slater> I'll post a small example + file, seriously like 30 lines of code
[01:49:22 CEST] <JEEB> although in that case you get no initial frame rate in most cases
[01:50:07 CEST] <JEEB> but yea, sleep for me :P
[01:50:11 CEST] <JEEB> almost 3 in the morning already
[01:50:19 CEST] <ac_slater> JEEB: nooooo you're my only hope ;)
[01:50:20 CEST] <ac_slater> night bud
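
A minimal sketch of the remuxing flow discussed above (read packets with avformat, write them into MPEG-TS), assuming the input already carries valid timestamps; the helper name and the stripped-down error handling are for illustration only. The detail that most often produces out-of-order PTS/DTS complaints is forgetting to rescale each packet from the input stream's time base to the output stream's time base before av_interleaved_write_frame():

    #include <libavformat/avformat.h>

    /* Remux the best video stream of in_name into an MPEG-TS file.
     * Error checking is omitted to keep the sketch short. */
    int remux_to_ts(const char *in_name, const char *out_name)
    {
        AVFormatContext *ic = NULL, *oc = NULL;
        AVPacket pkt;

        av_register_all();                                  /* still needed on the 3.x API */
        avformat_open_input(&ic, in_name, NULL, NULL);
        avformat_find_stream_info(ic, NULL);
        int vid = av_find_best_stream(ic, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);

        avformat_alloc_output_context2(&oc, NULL, "mpegts", out_name);
        AVStream *ost = avformat_new_stream(oc, NULL);
        avcodec_parameters_copy(ost->codecpar, ic->streams[vid]->codecpar);
        ost->time_base = ic->streams[vid]->time_base;       /* only a hint; the muxer may change it */

        avio_open(&oc->pb, out_name, AVIO_FLAG_WRITE);
        avformat_write_header(oc, NULL);

        while (av_read_frame(ic, &pkt) >= 0) {
            if (pkt.stream_index != vid) { av_packet_unref(&pkt); continue; }
            pkt.stream_index = 0;
            /* convert pts/dts/duration into the output stream's time base */
            av_packet_rescale_ts(&pkt, ic->streams[vid]->time_base, ost->time_base);
            pkt.pos = -1;
            av_interleaved_write_frame(oc, &pkt);
        }

        av_write_trailer(oc);
        avformat_close_input(&ic);
        avio_closep(&oc->pb);
        avformat_free_context(oc);
        return 0;
    }

If the input is a raw .h264 elementary stream rather than a container, the demuxer has no real timestamps to hand out in the first place, which is a separate problem from the rescaling above.
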
[02:36:20 CEST] <thebombzen> for the builds for windows, is there any particular reason to pick shared over static or vice versa? it seems to me that shared is smaller in filesize but static could cause fewer issues? what would be the reason to pick one or the other?
[03:10:25 CEST] <relaxed> thebombzen: shared would be for third party applications that use ffmpeg's libs, static would be easier if you only want ffmpeg, ffprobe, etc
[03:10:32 CEST] <thebombzen> alright thanks
[03:41:11 CEST] <hendry> Anyone know if EXT-X-I-FRAME-STREAM-INF is actually needed for HLS streaming? IIUC it's some sort of key frame index, but I've seen videos play back without it, so I am not sure what its value is.
[04:01:09 CEST] <waterworks> I have a problem with an x264/mp4 file having wrong framerate and duration when creating it with the C API. FPS should be 60, but is usually 61.02. I have a minimal example that reproduces it: https://pastebin.com/U3hhqYLR
[04:17:37 CEST] <DHE> hendry: from a reading of the spec, it might make trickplay (aka fast-forward) work better but otherwise seems useless
[04:18:36 CEST] <hendry> DHE: "trickplay"? didn't know about that term. Thanks =)
[04:19:00 CEST] <hendry> waterworks: i would try to reproduce it in the shell with the ffmpeg binary to be sure
[04:20:47 CEST] <ac_slater> waterworks: good choice using c++14
[04:21:19 CEST] <DHE> need an ffmpeg filter that merges several frames together into a single frame in such a way that it looks like a tape/analog VCR in fast-forward mode. (also reduces framerate, obviously)
[04:21:58 CEST] <ac_slater> DHE: have fun writing that
[04:22:47 CEST] <DHE> haha... that is so far outside my area of expertise...
[04:23:07 CEST] <waterworks> hendry, ac_slater: ffprobe -show_streams reports "codec_time_base=59/7200" and "avg_frame_rate=3600/59" which is the problem, it's ignoring that I'm setting them.
[04:23:44 CEST] <waterworks> This is not a problem if my container is avi, however. In that case if I dump every packet with show_packets, pts seems to be unused?
[04:24:43 CEST] <ac_slater> waterworks: hmm nothing looks too out of the ordinary
[04:24:51 CEST] <waterworks> The correct values for both should be 1/120 and 60/1.
[04:25:35 CEST] <ac_slater> what containers/output formats are choosing/overriding your timebases?
[04:26:03 CEST] <ac_slater> cause it looks like this example code just encodes h264 then writes it
[04:26:10 CEST] <ac_slater> and it's a raw format with no timing
[04:26:35 CEST] <waterworks> No, it muxes a stream into an mp4 container.
[04:26:52 CEST] <ac_slater> oh right via path
[04:26:55 CEST] <ac_slater> gotcha
[04:27:15 CEST] <ac_slater> waterworks: you're using ffmpeg 3.3?
[04:27:31 CEST] <waterworks> ac_slater: yes, I have the latest from the website
[04:28:48 CEST] <ac_slater> waterworks: I would try 3.2 honestly. Lots of mp4 changes
[04:29:04 CEST] <ac_slater> Also, you might want to ask ffmpeg-devel or post on the mailing list
[04:29:09 CEST] <ac_slater> cause this might be a bug
[04:29:13 CEST] <waterworks> ac_slater: I upgraded to 3.3 from 3.2 before this, same problem
[04:29:26 CEST] <ac_slater> shit. What if you write to test.ts
[04:29:29 CEST] <ac_slater> ie - mpegts
[04:32:47 CEST] <waterworks> ac_slater: Only have avi and mp4 containers enabled in this configuration, give me a few min to change linker settings
[04:34:58 CEST] <ac_slater> sorry mate. I think it might shed some light on it. But you might not be doing everything that the output context needs in terms of timebase
[04:35:07 CEST] <ac_slater> you're setting it on the codec, but not the container
[04:35:23 CEST] <ac_slater> I know how mpegts works, so maybe it'll shed some light
[04:35:50 CEST] <waterworks> yes mpegts does it right
[04:36:55 CEST] <waterworks> the format context has no timing fields
[04:38:20 CEST] <ac_slater> hmm
[04:38:39 CEST] <waterworks> mkv works as well
[04:39:21 CEST] <waterworks> and mov doesn't, so that family then
[04:39:32 CEST] <ac_slater> yea it's an mp4 thing
[04:39:52 CEST] <ac_slater> I've run into that container type causing issues with timestamps as well
[04:39:55 CEST] <ac_slater> there is something you have to do
[04:42:10 CEST] <waterworks> I have no clue what it'd be, I've been staring at it for days
[04:45:12 CEST] <ac_slater> waterworks: for AVStream::time_base, the doc says the muxer will overwrite the value
[04:45:18 CEST] <ac_slater> when encoding
[04:45:22 CEST] <ac_slater>  / muxing
[04:45:35 CEST] <ac_slater> (which may or may not be related to the user-provided one, depending on the format)
[04:45:49 CEST] <waterworks> ac_slater: yes, after write_header it gets correctly changed to 1/15360. The issue is the codec however
[04:46:06 CEST] <ac_slater> not the muxer?
[04:46:38 CEST] <waterworks> no clue, just what ffprobe tells me
[04:46:57 CEST] <waterworks> it's only in the mp4 family of containers that those settings don't get written
[04:47:20 CEST] <ac_slater> I think if you ask in #ffmpeg-devel, someone might know specifically about this
[04:49:01 CEST] <waterworks> is support even allowed there? from the site it seemed like it's project-developer-only talk
[04:49:24 CEST] <ac_slater> well, you're determining if you're going to file a bug or not
[04:50:09 CEST] <waterworks> cannot be a bug, I know OBS Studio can write flawless MP4 files
[04:50:26 CEST] <ac_slater> yea and the ffmpeg commandline tool can do it as well
[04:50:48 CEST] <ac_slater> you should maybe try to set the log level to trace in your application and stare at it
[04:51:49 CEST] <ac_slater> I see some mentions of setting vsync=2 in your muxer AVDictionary
[04:52:25 CEST] <ac_slater> the trace log will show you what defaults the muxer is applying (most times)
[04:53:27 CEST] <waterworks> do you want to see it? everything is normal
[04:53:44 CEST] <k_sze[work]> Whatever happened to the -level option of ffv1?
[04:54:03 CEST] <k_sze[work]> ffmpeg -help encoder=ffv1 doesn't list it.
[04:54:39 CEST] <ac_slater> might as well upload the trace log wherever mpeg4 is mentioned
[04:55:09 CEST] <waterworks> ac_slater: not mentioned, only libx264 output
[04:55:17 CEST] <ac_slater> hmm
[04:55:53 CEST] <ac_slater> you did av_log_set_level(AV_LOG_TRACE); ?
[04:56:03 CEST] <waterworks> ac_slater: yes
[04:57:04 CEST] <ac_slater> what happens if you take your raw h264 file and put it in a mp4 container via `ffmpeg -i raw.h264 -vcodec copy test.mp4` ?
[04:57:16 CEST] <ac_slater> might as well add `-v 99`  at the end to get a trace log
[04:59:18 CEST] <waterworks> well I don't have that
[05:01:22 CEST] <ac_slater> :( why
[05:02:22 CEST] <ac_slater> you said your app works if you spit out a mpegts or mkv. Just use `ffmpeg -i test.ts -vcodec copy raw.h264`
[05:02:49 CEST] <waterworks> Ah, I don't know much about the CLI
[05:03:34 CEST] <ac_slater> it's awesome and a good way to test before you write libav* stuff
[05:04:30 CEST] <ac_slater> I'm going offline waterworks, good luck man. Please post to the mailing list. They're quick over there
[05:04:37 CEST] <waterworks> ac_slater: ffmpeg provided an invalid file
[05:04:56 CEST] <ac_slater> waterworks: wait
[05:05:23 CEST] <ac_slater> doing the 2 commands I said (strip the GOOD h264 from your muxed file, then REMUX with the CLI) ... and that STILL failed?
[05:05:26 CEST] <waterworks> using your commands, it produced an mp4 file with 51.58 fps
[05:05:54 CEST] <ac_slater> try doing `ffmpeg -i raw.h264 -vcodec copy -vsync 2 test.mp4`
[05:06:28 CEST] <waterworks> still same and invalid unfortunately
[05:06:33 CEST] <ac_slater> awesome
[05:06:38 CEST] <waterworks> I get a message from the muxer this time around though
[05:07:31 CEST] <waterworks> When using ffmpeg and your commands -> https://pastebin.com/RUw0V4KW
[05:08:11 CEST] <ac_slater> right since you're feeding the muxer h264 without timestamps
[05:08:42 CEST] <ac_slater> `ffmpeg -i raw.h264 -framerate 60 -vcodec copy -vsync 2 test.mp4`
[05:09:04 CEST] <waterworks> still same
[05:09:46 CEST] <ac_slater> `ffmpeg -r 60 -i raw.h264 -vcodec copy -vsync 2 test.mp4`
[05:10:35 CEST] <waterworks> Better, but this time it's playing slower instead of faster like my code
[05:10:44 CEST] <waterworks> 59.78 fps
[05:10:51 CEST] <ac_slater> is that intended?
[05:10:59 CEST] <waterworks> what do you mean?
[05:11:01 CEST] <ac_slater> ie - what's your goal here
[05:11:04 CEST] <ac_slater> 60fs?
[05:11:06 CEST] <ac_slater> fps *
[05:11:06 CEST] <waterworks> yes
[05:11:16 CEST] <ac_slater> 59.87 isn't acceptable?
[05:11:19 CEST] <waterworks> completely normal video creation, no effects
[05:11:21 CEST] <waterworks> and no it is not
[05:11:34 CEST] <waterworks> the duration must be 1:1
[05:11:44 CEST] <ac_slater> remove the -vsync option just for kicks
[05:12:24 CEST] <waterworks> same output
[05:12:56 CEST] <ac_slater> thinking
[05:13:32 CEST] <waterworks> "codec_time_base=1713/204800" = 0.00836425781, where does it even get this value? it should be 1/120
[05:14:02 CEST] <waterworks> same with "avg_frame_rate=102400/1713" = 59.7781669586, it just sets it to whatever it seems like, same with my code
[05:14:46 CEST] <ac_slater> you can specify avg_frame_rate on the AVFormatContext if I remember correctly
[05:14:55 CEST] <waterworks> if I show_streams on a valid OBS Studio file, those are what I see, or if I create an mkv I see the right values as well
[05:15:27 CEST] <waterworks> nah ac_slater, only AVStream has the avg_frame_rate field
[05:15:41 CEST] <ac_slater> ah right
[05:15:48 CEST] <waterworks> I looked in the source code for the MOV muxing family and there's no such dictionary option there either
[05:18:24 CEST] <waterworks> it is strange indeed, I'm confident in saying I have not missed setting anything in the code
[05:18:52 CEST] <ac_slater> also check `ffmpeg -h muxer=mp4`
[05:19:02 CEST] <ac_slater> same information on the website somewhere
[05:19:36 CEST] <ac_slater> waterworks: your goal should be to get your raw h264 file muxed into an mp4 container via the command line tool first
[05:19:51 CEST] <ac_slater> since you'll have faster iteration time and can replicate it easily
[05:20:01 CEST] <ac_slater> (via libavformat/codec)
[05:20:08 CEST] <ac_slater> I g2g, sorry I wasn't more help
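
A hedged sketch of the timestamp handling that usually gets MP4/MOV output to the intended 60 fps (field names are from the 3.x API; oc, st, enc and the helper names are illustrative, not the code from the paste above). The point raised in the discussion is that the mp4 muxer replaces AVStream::time_base in avformat_write_header(), typically with something like 1/15360, and derives the frame rate from packet timestamps, so every packet has to be rescaled from the encoder time base into whatever the stream time base became:

    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    /* Called once, before avformat_write_header(); enc->time_base and
     * enc->framerate are assumed to have been set to {1, 60} and {60, 1}
     * before avcodec_open2(). */
    static AVStream *add_video_stream(AVFormatContext *oc, AVCodecContext *enc)
    {
        AVStream *st = avformat_new_stream(oc, NULL);
        st->time_base      = enc->time_base;    /* only a hint: the mp4 muxer overrides it */
        st->avg_frame_rate = enc->framerate;
        avcodec_parameters_from_context(st->codecpar, enc);
        return st;
    }

    /* Called for every packet coming out of the encoder. */
    static void mux_packet(AVFormatContext *oc, AVStream *st,
                           AVCodecContext *enc, AVPacket *pkt)
    {
        /* Writing packets whose pts are still in encoder ticks into the
         * muxer's 1/15360 track is exactly what produces files that play
         * at "almost but not quite" the intended rate. */
        av_packet_rescale_ts(pkt, enc->time_base, st->time_base);
        pkt->stream_index = st->index;
        av_interleaved_write_frame(oc, pkt);
    }

The mov/mp4 muxer also accepts a video_track_timescale option (passed in the AVDictionary given to avformat_write_header()) if a specific track timescale is wanted.
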
[11:09:10 CEST] <Nacht> Quick question. For AAC-LC, libfdk_aac or just regular aac as Codec ?
[11:09:10 CEST] <DogTheFrog> Hello everyone, i'd like to push a TS/UDP ,  TS/RTP , or  ES/RTP Stream from an Extron SMP 351 to my server running ffmpeg to redistribute the Stream to other users. Is this possible? It worked with pulling from the Extron (-i rtsp://......) but i don't wanna open a port for the smp, i wanna open the port on the server for ffmpeg and push the stream to ffmpeg. Sorry for my bad english, hope you understand it. How does this work? 
[11:09:29 CEST] <furq> Nacht: fdk is better, but not so much better that it's worth recompiling ffmpeg for
[11:09:39 CEST] <DogTheFrog> I am running CentOS 7.3
[11:09:39 CEST] <Nacht> cheers furq
[11:10:41 CEST] <furq> DogTheFrog: how are you distributing the stream
[11:12:33 CEST] <DogTheFrog> i wanna distribute the stream with ffserver
[11:13:52 CEST] <furq> you should just be able to connect to ffserver on the remote box directly
[11:14:07 CEST] <furq> with that said, ffserver isn't very good, so if it doesn't have to be rtsp then you should maybe use nginx-rtmp
[11:15:40 CEST] <DogTheFrog> ok, thanks for your help, i will try that:)
[11:26:50 CEST] <Nacht> Are there some more tools to get more insight into an AAC file ?
[11:27:18 CEST] <Nacht> I'm trying to create an HLS audio stream using AAC, but it just doesn't want to play in Chrome on Android, while the sample from Apple does
[11:28:13 CEST] <furq> does android chrome actually support that
[11:28:39 CEST] <furq> apple stuff is pretty anal about standards compliance for hls, so if it works there then there's probably nothing wrong with it
[11:29:30 CEST] <Nacht> Well the Apple stream actually plays, and it's true, HLS on android is a drag
[11:29:57 CEST] <Nacht> I just can't dig deep enough into their AAC files to see how they differ from the ones I make through FFmpeg
[11:36:12 CEST] <zack_s_> kepstin: I tried cutting my video exactly at the keyframes: ffmpeg -ss 4.566667 -i 1920x1080_30fps_2minuten.mp4 -vcodec copy -acodec copy -t 24.9 output.mp4
[11:36:24 CEST] <zack_s_> however it doesn't work, the video starts at second 0
[11:36:42 CEST] <zack_s_> and the duration is 30 seconds
[11:37:06 CEST] <zack_s_> how can I perform a perfect cut without any problems in a mp4 video, without any reencoding?
[11:54:01 CEST] <zack_s_> can anybody help?
[12:43:02 CEST] <waterworks> ac_slater: if you are here right now, I managed to fix it and I'm actually convinced it is an ffmpeg bug at this point
[12:48:09 CEST] <ArsenArsen> How do I calculate the buffer size needed for an AVFrame so I can allocate and align properly
[12:48:28 CEST] <ArsenArsen> Or not, never mind
[13:04:53 CEST] <chrysn> hi, i'm trying to set up files for dash streaming, and it left me utterly confused:
[13:05:59 CEST] <chrysn> on one hand, http://wiki.webmproject.org/adaptive-streaming/instructions-to-playback-adaptive-webm-using-dash and mozilla documentation indicated that it's nowadays perfectly viable to have just one file per stream, ie. no chunks.
[13:06:32 CEST] <chrysn> in practice, the dash reference player at http://dashif.org/reference/players/javascript/v2.5.0/samples/dash-if-reference-player/ lets me play my files only if the complete stream can be loaded in ca. 15 seconds
[13:06:41 CEST] <chrysn> and the examples they are providing are all chunked up too.
[13:08:02 CEST] <chrysn> trying to understand what ffmpeg is doing to create streamable files, i came across the -dash option, which ffmpeg obviously accepts but which i couldn't find in the documentation
[13:08:46 CEST] <chrysn> can you help me shed light on whether this complete workflow is supposed to yield dash-streamable files from an ffmpeg PoV?
[13:33:29 CEST] <zack_s_> are GOP and I frames the same?
[13:34:35 CEST] <furq> no
[13:34:59 CEST] <furq> a gop is the frames between two keyframes
[13:35:10 CEST] <furq> including the first keyframe
[13:35:52 CEST] <furq> "group of pictures" i.e. a self-contained group as far as the decoder is concerned
[13:36:14 CEST] <furq> none of the frames in the group reference or are referenced by any frames outside the group
[13:37:20 CEST] <JEEB> :D except with open gop where there can be pictures in decode order that cannot be decoded without previous references, but every picture that comes after RAP in presentation order would be decode'able ;)
[13:37:46 CEST] <furq> is open gop a thing worth knowing about any more
[13:37:57 CEST] <zack_s_> furq: I want to perform a cut without re-encoding, that's why I think it should work between two I frames, i.e. between two GOPs
[13:38:28 CEST] <JEEB> furq: it actually got better defined in HEVC from which I actually more or less grasped that explanation
[13:38:39 CEST] <JEEB> and no idea how much that is used
[13:38:41 CEST] <furq> oh
[13:38:46 CEST] <furq> i had no idea it was even defined in x264
[13:38:54 CEST] <furq> let alone x265
[13:38:56 CEST] <JEEB> in AVC it was SEI+intra
[13:38:59 CEST] <JEEB> which was shitty
[13:39:05 CEST] <JEEB> in HEVC they made proper NAL unit types
[13:39:05 CEST] <furq> well yeah i mean x264 has an -open-gop switch
[13:39:22 CEST] <furq> apparently it's required on bluray because why not
[13:39:24 CEST] <JEEB> basically that's why you can't always define a RAP with a single NAL unit
[13:39:29 CEST] <JEEB> in AVC
[13:39:45 CEST] <JEEB> because if you don't take that SEI into account you only have an intra picture
[13:39:56 CEST] <JEEB> thankfully with HEVC they noticed their mistake :P
[13:41:09 CEST] <furq> i'll probably just carry on not using it and not giving a shit about bluray
[14:05:10 CEST] <zack_s_> ffmpeg cuts differently depending on the parameter placement
[14:05:17 CEST] <zack_s_> ffmpeg -ss 4.566667 -i "ffmpeg-1920x1080_30fps_2min.mp4" -vcodec copy -acodec copy -t 10.0 output.mp4
[14:05:31 CEST] <zack_s_> is not the same as:
[14:05:32 CEST] <zack_s_> ffmpeg -i "1920x1080_30fps_2min.mp4" -vcodec copy -acodec copy -ss 4.566667 -t 10.0 output.mp4
[14:06:10 CEST] <zack_s_> for some unknown reason, neither cmd line cuts on the iframe at 4.566667
[14:06:12 CEST] <furq> ss seeks differently depending on whether it's an input or output option
[14:06:19 CEST] <zack_s_> the first is one second earlier
[14:06:24 CEST] <zack_s_> the other one second later
[14:07:19 CEST] <zack_s_> input or output options?
[14:07:27 CEST] <zack_s_> furq: what do you mean?
[14:07:31 CEST] <furq> before or after -i
[14:08:24 CEST] <furq> iirc as an input option it uses the container seek index, as an output option it decodes and uses the frame timestamps
[14:09:25 CEST] <watlon80> hey all, is there a good way to profile FFmpeg / filter complex performance (with a release binary)? I'd like to roughly pinpoint my filter processing time and optimize where possible.
[14:10:00 CEST] <furq> not that i know of
[14:10:06 CEST] <grublet> maybe some kind of verbose option?
[14:10:07 CEST] <zack_s_> furq: so what is the right choice?
[14:10:10 CEST] <furq> nothing better than just outputting to -f null
[14:10:15 CEST] <furq> zack_s_: it depends
[14:10:17 CEST] <watlon80> I tried Instruments on OS X, but lacking debug symbols the output is kind of useless
[14:10:22 CEST] <furq> this is a nontrivial problem, as you've probably noticed
[14:10:33 CEST] <zack_s_> furq: crazy...
[14:10:45 CEST] <zack_s_> I just want to cut on i-frames, without re-encoding
[14:10:48 CEST] <grublet> doesn't ffmpeg have debug symbols?
[14:10:55 CEST] <zack_s_> I thought it cannot be that hard
[14:11:09 CEST] <watlon80> grublet: the build I got (OS X) doesn't seem to have them
[14:11:19 CEST] <grublet> oh ok
[14:11:22 CEST] <watlon80> I only see system calls in my call tree
[14:11:38 CEST] <grublet> you could build ffmpeg yourself
[14:11:39 CEST] <furq> zack_s_: you might want to look at -noaccurate_seek and -seek_timestamp
[14:11:48 CEST] <grublet> assuming you know how/want to
[14:12:04 CEST] <watlon80> grublet: that's my fallback plan, but there might also be performance differences between debug and release builds
[14:12:22 CEST] <grublet> yeah i wouldn't know, never tried building with symbols
[14:12:27 CEST] <watlon80> it's just that I have this way complex filter graph and I want to optimize where possible
[14:13:09 CEST] <furq> you should get an ffmpeg_g binary (with debug symbols) when you build it yourself
[14:13:11 CEST] <furq> by default
[14:13:16 CEST] <Threads> zack_s_ you still trying to cut frame-accurate with ffmpeg?
[14:13:39 CEST] <zack_s_> Threads: yeah...
[14:13:47 CEST] <zack_s_> I have a video which has an iframe every second
[14:14:11 CEST] <zack_s_> closed GOP
[14:14:21 CEST] <zack_s_> so this has to work at iframe boundary
[14:14:22 CEST] <watlon80> furq: alright, I'll see if I can get it to build on my box
[14:15:34 CEST] <watlon80> BTW, minor thingie: channel topic mentions 3.3.1 while 3.3.2 is out (?)
[14:16:00 CEST] <furq> durandal_1707: ^
[14:16:58 CEST] <durandal_1707> pay me and i will change it
[14:17:22 CEST] <furq> idc my irssi window isn't wide enough to show that bit of the topic anyway
[14:17:49 CEST] <grublet> does anyone actually read channel topics?
[14:17:53 CEST] <grublet> i never do
[14:17:56 CEST] <grublet> well, rarely
[14:17:57 CEST] <watlon80> it's front center on the web client
[14:17:59 CEST] <furq> i'd rather that bit was replaced with something telling you to paste the command line and full output to pastebin
[14:18:05 CEST] <grublet> ^
[14:20:18 CEST] <zack_s_> furq: how does -noaccurate_seek help?
[14:21:18 CEST] <watlon80> zack_s_: that should skip only to keyframes (for speed)
[14:23:16 CEST] <furq> i didn't say it would help, i said it might
[14:27:35 CEST] <zack_s_> does anybody know how I can do the same with the API?
[14:27:47 CEST] <zack_s_> because I need to find the nearest keyframe
[14:28:09 CEST] <zack_s_> at a certain time
[15:51:56 CEST] <zack_s_> furq: do you know how I can do the same with the API? retrieving the keyframe near a position?
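
Roughly the API equivalent of what is being asked above: convert the target time into the stream's time base and do a backward seek, which lands the demuxer on the last keyframe at or before that timestamp. A minimal sketch, assuming an already-opened AVFormatContext and with error handling omitted:

    #include <libavformat/avformat.h>
    #include <libavutil/mathematics.h>

    /* Position `ic` so the next av_read_frame() on `stream_index` starts
     * at the nearest keyframe at or before `seconds`. */
    static int seek_to_keyframe(AVFormatContext *ic, int stream_index, double seconds)
    {
        AVStream *st = ic->streams[stream_index];
        int64_t ts = av_rescale_q((int64_t)(seconds * AV_TIME_BASE),
                                  AV_TIME_BASE_Q, st->time_base);
        return av_seek_frame(ic, stream_index, ts, AVSEEK_FLAG_BACKWARD);
    }

The pts of the first packet read after the seek is the actual cut point, which is why stream-copy cuts snap to a nearby keyframe rather than the exact requested time.
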
[16:38:49 CEST] <kerio> is there a grayscale pixel format where pixels are stored as double
[16:38:57 CEST] <durandal_1707> nope
[16:41:54 CEST] <shincodex> Error -135
[16:42:09 CEST] <shincodex> is generally RTSP saying we cant support some protocol within its format
[16:42:12 CEST] <shincodex> yar?
[16:42:44 CEST] <durandal_1707> yes
[16:42:56 CEST] <shincodex> so... so..
[16:43:06 CEST] <shincodex> If i force TCP and say you're doing it
[16:43:19 CEST] <shincodex> and camera is like mmmmm nope. i only do UDP over RTSP suck it
[16:43:25 CEST] <shincodex> then im sucking on error 135
[16:52:03 CEST] <victorqueiroz> Hi
[16:52:14 CEST] <victorqueiroz> Can I use ffmpeg in a C project?
[16:52:57 CEST] <DHE> yes, license permitting
[16:53:47 CEST] <shincodex> i wouldn't use ffmpeg.c though
[16:54:07 CEST] <shincodex> maybe the collection of libav* libs
[16:57:50 CEST] <victorqueiroz> shincodex: Why not?
[16:57:57 CEST] <victorqueiroz> I just need to encode pcm data to ogg file
[16:58:52 CEST] <shincodex> so you're pulling in all of ffmpeg just for audio?
[17:05:01 CEST] <kepstin> if you're just encoding pcm to ogg vorbis, the 'libvorbisfile' library should provide a much simpler api and a smaller dependency
[17:05:48 CEST] <kepstin> or, right, libvorbisfile is decoder only
[17:05:56 CEST] <kepstin> even so libvorbis directly isn't super hard to use
[17:06:10 CEST] <BotoX> Hi, I'm trying to stream an RTSP stream from an IP camera to youtube (RTMP)
[17:06:18 CEST] <DHE> indeed, ffmpeg (unless stripped way down) tends to be a 20 megabyte binary package
[17:06:19 CEST] <BotoX> video works fine but audio is choppy and spams errors: https://p.botox.bz/view/raw/1a3af996
[17:07:11 CEST] <BotoX> re-encoding to aac gives the same issue; with -c:a copy the audio is broken on youtube but there are no errors in ffmpeg
[17:07:20 CEST] <BotoX> so I am assuming something is wrong with the audio of the camera
[17:07:25 CEST] <BotoX> but it plays fine in mpv
[17:15:23 CEST] <verb5> hello everyone
[17:15:34 CEST] <verb5> i need help to create ffmpeg mosaic
[17:15:46 CEST] <verb5> 2rows 3x3
[17:17:37 CEST] <verb5> i have tried to follow this tutorial but this is what i get https://pastebin.com/raw/iHW8kniz
[17:18:49 CEST] <verb5> any help ?
[17:19:37 CEST] <furq> "ac-tex damaged" is a decoder error
[17:19:42 CEST] <furq> so presumably your input is broken
[17:20:11 CEST] <verb5> how could it be broken? it's a live stream from mumudvb
[17:20:23 CEST] <verb5> i can play the stream
[17:21:10 CEST] <victorqueiroz> for pcm to audio.ogg encoding, what library should I use?
[17:21:15 CEST] <DHE> no graphical glitches when you watch it?
[17:21:43 CEST] <verb5> do you find anything wrong in my command ? https://pastebin.com/raw/idjwxZJ2
[17:21:57 CEST] <verb5> no glitches at all
[17:22:50 CEST] <furq> that looks fine, but you should use hstack and vstack instead of overlay
[17:22:53 CEST] <furq> !filter hstack
[17:22:53 CEST] <nfobot> furq: http://ffmpeg.org/ffmpeg-filters.html#hstack
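
For a three-across, two-row grid, something along these lines would follow furq's suggestion (the input names and the 3x2 layout are illustrative, not verb5's exact setup):

    ffmpeg -i in0 -i in1 -i in2 -i in3 -i in4 -i in5 -filter_complex \
      "[0:v][1:v][2:v]hstack=inputs=3[top];[3:v][4:v][5:v]hstack=inputs=3[bottom];[top][bottom]vstack[mosaic]" \
      -map "[mosaic]" -c:v libx264 output.mp4

hstack needs all of its inputs to share the same height (and vstack the same width), so scale the inputs first if they differ.
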
[19:02:52 CEST] <Felishia> halp D:
[19:03:08 CEST] <Felishia> I has a mp4 file and a ogg file :<
[19:03:14 CEST] <Felishia> I want to merge them toguther
[19:03:34 CEST] <Felishia> replace the audio from the old videos with the new videos
[19:03:45 CEST] <Felishia> sorries I means new audio
[19:04:05 CEST] <iranen> same length?
[19:09:38 CEST] <Felishia> iranen, yes c:
[19:09:44 CEST] <Felishia> I just fixed the audio
[19:15:21 CEST] <Felishia> :<
[19:15:48 CEST] <relaxed> which file has the audio you want and which has the video?
[19:17:44 CEST] <relaxed> Felishia: ^^
[19:21:52 CEST] <Felishia> relaxed, one.mp4 and one-equalized.ogg
[19:22:50 CEST] <Felishia> ah I got it
[19:22:56 CEST] <Felishia> but the video only plays in vlc
[19:23:00 CEST] <Felishia> anyway
[19:31:27 CEST] <relaxed> try, ffmpeg -i video -i audio -map 0:v -map 1:a -c copy output.mkv
[20:12:20 CEST] <chrysn> regarding my questions of 13:05 CEST: they've been resolved by not messing up the sequence of the substreams (and updating the documentation); things appear to work fine with unchunked dash files generated out of ffmpeg, just the -dash option appears to be undocumented.
[20:18:42 CEST] <maziar> I have a lot of TS files but my m3u8 file is broken and it's empty, how do I fix it?
[20:37:11 CEST] <maziar> my M3U8 file is broken, how can i create a new M3U8 from my TS ?
[20:37:48 CEST] <DHE> umm... copy the streams into a new hls instance?
[20:46:13 CEST] <maziar> DHE you are talking to me ?
[20:47:07 CEST] <DHE> maziar: yeah
[20:47:46 CEST] <maziar> DHE, no, somebody just removed it
[21:20:58 CEST] <maziar> my M3U8 file is broken, how can i create a new M3U8 from my TS ?
[21:24:56 CEST] <llogan> A simple example could be "ffmpeg -i input.ts output.m3u8" but you didn't really provide much info or requirements, etc.
[21:27:22 CEST] <victorqueiroz> Hi. To encode pcm into audio.ogg. What ffmpeg library should I use? I'm developing a C project
[21:29:29 CEST] <llogan> libavformat, libavcodec
[21:30:47 CEST] <llogan> and possibly libavfilter and libavutil depending on what you want to do. see stuff in doc/examples
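
A minimal sketch of the libavformat/libavcodec route for PCM to audio.ogg, assuming an ffmpeg build with libvorbis enabled and input samples already in planar float at 44.1 kHz stereo; the encode loop itself is compressed into comments and error handling is omitted:

    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    #include <libavutil/channel_layout.h>

    int main(void)
    {
        av_register_all();                               /* needed on the 3.x API */

        AVFormatContext *fmt = NULL;
        avformat_alloc_output_context2(&fmt, NULL, NULL, "audio.ogg");

        AVCodec *codec = avcodec_find_encoder_by_name("libvorbis");
        AVCodecContext *enc = avcodec_alloc_context3(codec);
        enc->sample_fmt     = AV_SAMPLE_FMT_FLTP;        /* what libvorbis expects */
        enc->sample_rate    = 44100;
        enc->channel_layout = AV_CH_LAYOUT_STEREO;
        enc->channels       = 2;
        enc->time_base      = (AVRational){1, enc->sample_rate};
        if (fmt->oformat->flags & AVFMT_GLOBALHEADER)
            enc->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;   /* ogg wants the vorbis headers as extradata */
        avcodec_open2(enc, codec, NULL);

        AVStream *st = avformat_new_stream(fmt, NULL);
        st->time_base = enc->time_base;
        avcodec_parameters_from_context(st->codecpar, enc);

        avio_open(&fmt->pb, "audio.ogg", AVIO_FLAG_WRITE);
        avformat_write_header(fmt, NULL);

        /* Encode loop (elided): fill AVFrames with enc->frame_size samples
         * per channel of your PCM, push them with avcodec_send_frame(),
         * pull AVPackets with avcodec_receive_packet(), rescale their
         * timestamps with av_packet_rescale_ts(pkt, enc->time_base,
         * st->time_base) and write them with av_interleaved_write_frame().
         * Finish by flushing the encoder with avcodec_send_frame(enc, NULL). */

        av_write_trailer(fmt);
        avio_closep(&fmt->pb);
        avcodec_free_context(&enc);
        avformat_free_context(fmt);
        return 0;
    }

Without libvorbis in the build, the native vorbis encoder exists but is still marked experimental, which is why libvorbis is the usual choice.
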
[23:02:33 CEST] <JodaZ> maziar, M3U8 is a very simple format, in its most basic form it has #EXTM3U in the first line and the segment filenames.ts in order in the following lines
[23:03:13 CEST] <maziar> JodaZ I know that, but how can I specify the durations for 10000 TS files?!
[23:03:29 CEST] <JodaZ> i don't think you need to
[23:04:25 CEST] <DHE> traditionally each segment is of fixed length, in which case you can just assume the same number over and over again. failing that, algorithmically running ffprobe to measure it also works
[23:04:55 CEST] <DHE> or, concatenating all the .ts files into a single huge file (piping is fine) and having ffmpeg regenerate the whole m3u8 is also an option, resources permitting
[23:05:20 CEST] <DHE> this effectively regenerates the stream from scratch using the original stream (or what's left of it) as the source
[23:05:35 CEST] <JodaZ> setting an ext-x-targetduration might help some players with seeking
[23:05:58 CEST] <JodaZ> otherwise timing all files with ffprobe i guess?
[23:06:27 CEST] <JodaZ> concatenating all files and using copy codecs is basically as fast as copying them
[23:07:48 CEST] <maziar> DHE what do you mean? how should I do that?
[23:08:46 CEST] <DHE> maziar: assuming all the files are named in a sortable order, I'd do something like:  cat /path/to/*.ts | ffmpeg -f mpegts -i - -c copy -f hls [-hls_options_here] regenerated.m3u8
[23:09:22 CEST] <DHE> the notion that the inputs and outputs are in different directories may make sense here. I imagine a bit of prep work is required
[23:10:22 CEST] <maziar> DHE my ts file order is ok on M3U8 but it doesn't work
[23:11:14 CEST] <maziar> DHE https://pastebin.com/QvPVgzzh
[23:12:19 CEST] <maziar> DHE but in the original file there are 2500 lines
[23:22:53 CEST] <JodaZ> maziar, do the few ts files that are in there work as it is?
[23:24:10 CEST] <maziar> JodaZ it begins with 1080_17141000.ts and ends with 1080_1714999, which means there are 2234 files
[23:24:29 CEST] <maziar> I pasted a few of them to show you the order and names of the files
[23:24:38 CEST] <DHE> maziar: if you're doing this by hand, add #EXTINF:10.00,    (including the comma) just above each .ts file. this of course assumes each file is 10 seconds long exactly
[23:25:18 CEST] <JodaZ> maziar, actually there should be 3999 files then
[23:25:54 CEST] <maziar> DHE there are 2234 lines, how should I add that many #EXTINF:10.00, lines?
[23:26:04 CEST] <JodaZ> see if you can play the .ts files as they are
[23:26:08 CEST] <JodaZ> like by themselves
[23:26:28 CEST] <maziar> is there any way to create a M3U8 from this TS's ?
[23:26:57 CEST] <JodaZ> your m3u8 should play
[23:27:05 CEST] <JodaZ> check if the ts files play by themselves
[23:27:22 CEST] <DHE> I adequately described how to use it already
[23:39:26 CEST] <maziar> DHE thank you, I'm checking it
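
For reference, a hand-rebuilt playlist of the kind described above would look roughly like this (the segment names are placeholders for the real 1080_*.ts files, the #EXTINF values assume fixed 10-second segments as DHE notes, and #EXT-X-ENDLIST marks the playlist as finished rather than live):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXTINF:10.00,
    segment_0000.ts
    #EXTINF:10.00,
    segment_0001.ts
    ...
    #EXT-X-ENDLIST

If the segments are not all the same length, each duration can be measured with ffprobe as suggested earlier.
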
[00:00:00 CEST] --- Sat Jun 17 2017

