[Ffmpeg-devel-irc] ffmpeg.log.20160427

burek burek021 at gmail.com
Thu Apr 28 02:05:01 CEST 2016


[00:54:12 CEST] <vade> Hi all - quick question about encoding - do all frames need to arrive in PTS order? Are there any mechanisms to re-order AVFrames that have been decoded into presentation-time order, or is that something one has to roll on their own?
[05:45:22 CEST] <ajsharp> Trying to overlay one video onto another. Running into an issue where if the main video is shorter than the secondary (overlaid) video, the overlay video stops playing and both videos remain frozen, but the audio track continues playing to the end. Does anyone know how to keep the overlay video playing until it's finished?
[05:46:22 CEST] <ajsharp> specifying the following options: overlay=shortest=0:eof_action=pass:x=main_w-overlay_w-10:y=10
[05:48:05 CEST] <ajsharp> running ffmpeg 2.7-tessus on OS X 10.11.4
[05:52:40 CEST] <Guest14493> Has ffmpeg dropped windows xp support?
[07:50:14 CEST] <shouya_> hi
[07:51:13 CEST] <shouya_> I'm new to ffmpeg, can I ask a (possibly silly) question about hardware acceleration? I searched the web and found no clue.
[07:51:44 CEST] <JEEB> just ask and stick around
[07:51:46 CEST] <shouya_> my goal is to enable hardware acceleration for encoding h264 videos on linux with VAAPI
[07:52:49 CEST] <shouya_> I'm on gentoo, and have the latest version of ffmpeg (3.0.1) installed, here's the `configure` output: http://lpaste.net/401919298643165184
[07:53:30 CEST] <shouya_> this shows that h264_vaapi hwaccel is enabled
[07:53:51 CEST] <shouya_> but when I type in `ffmpeg -hwaccels` it prints an empty list
[07:55:47 CEST] <shouya_> I don't have X on this machine, could that possibly be the reason?
[09:37:18 CEST] <DHE> shouya_: vaapi is only for decoding right now. I think the only hardware h264 accelerator available is nvenc, nVidia's GPU feature
[09:38:16 CEST] <shouya_> DHE: thanks. how can I tell if vaapi is used for decoding?
[09:38:45 CEST] <DHE> oh, no there's a quicksync encoder...
[09:39:16 CEST] <shouya_> I checked that one, it seemed to be proprietary software
[09:39:47 CEST] <DHE> the point is, it should say which decoder is used in the output
[09:42:54 CEST] <shouya_> fflogger: sorry, here it is: http://lpaste.net/9122915165572956160
[09:44:40 CEST] <shouya_> DHE: I expected that, but it seems the word 'vaapi' never appears in the output (see the paste above)
[09:46:52 CEST] <DHE> Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
[09:46:57 CEST] <DHE> `dat metadata though
[09:48:36 CEST] <shouya_> DHE: native means decoded using vaapi here?
[09:49:12 CEST] <DHE> pretty sure it's just the standard software decoder
[09:49:16 CEST] <c_14> Don't you have to add -hwaccel vaapi before the input?
[09:50:03 CEST] <c_14> hmm, vaapi isn't listed in the manpage
[09:50:04 CEST] <shouya_> Unrecognized hwaccel: vaapi.
[09:50:04 CEST] <shouya_> Supported hwaccels:
[09:50:36 CEST] <shouya_> nothing else is printed after that
[09:52:28 CEST] <DHE> my only hwaccel is vdpau. I think it's a rendering accelerator, not an encoding accelerator
[09:53:14 CEST] <DHE> wait, you're specifying libx264 as your codec directly
[09:53:20 CEST] <DHE> ffmpeg -codecs | grep h264
[09:53:44 CEST] <DHE> ffmpeg -encoders | grep h264  # even better
[09:54:13 CEST] <shouya_>  DEV.LS h264                 H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10 (encoders: libx264 libx264rgb )
[09:54:31 CEST] <c_14> I just checked the backlog, if you want to encode h264 with vaapi you need a patch that hasn't been merged to master yet.
[09:54:31 CEST] <shouya_> seems libx264's the only one available
[09:56:01 CEST] <c_14> https://ffmpeg.org/pipermail/ffmpeg-devel/2016-January/188356.html
[09:56:05 CEST] <c_14> The ones from this thread
[09:56:39 CEST] <c_14> Since it hasn't been merged to master, there might be some issues remaining
[09:58:15 CEST] <shouya_> I'll check this one out
[09:58:46 CEST] <shouya_> thanks all. before that I want to see if vdpau works
[09:59:06 CEST] <c_14> vdpau is for decoding/presenting and requires a running X server
[09:59:54 CEST] <shouya_> c_14: just figured it out...
[10:00:27 CEST] <shouya_> on this page https://trac.ffmpeg.org/wiki/HWAccelIntro, it says that vaapi currently only supports AVHWAccel
[10:00:42 CEST] <shouya_> what does AVHWAccel mean?
[11:27:14 CEST] <kagami_> Hi. How do I save HLS WebVTT streams with ffmpeg? "ffmpeg -i http://... out.vtt" shows a lot of requests but doesn't save anything.
[11:28:42 CEST] <relaxed> kagami_: try ffmpeg -i input -f webvtt out.vtt
[11:29:28 CEST] <kagami_> relaxed: still the same. Here is output: http://pastebin.com/FCriEUuL Output file is empty.
[11:29:53 CEST] <kagami_> It's like it doesn't start downloading like with normal hls video stream.
[11:32:12 CEST] <relaxed> maybe, ffmpeg -i input -map 0:s -c:s copy -f webvtt out.vtt
[11:32:43 CEST] <relaxed> pastebin.com the command and output if that doesn't work
[11:35:18 CEST] <kagami_> relaxed: the same http://pastebin.com/aaLYRaNY It seems like it tries to download the entire WebVTT stream and prints the input info banner only when I press Ctrl+C.
[11:57:47 CEST] <kagami_> Should I create an issue on the bugtracker? Though this stream URL won't be available after a few hours; I'd need to somehow host my own WebVTT HLS server...
[12:38:25 CEST] <kagami_> Hm, seems like it just doesn't write intermediate results, but saves the entire subtitle file at the end of the broadcast instead.
[14:50:53 CEST] <gerion> hi, can you help me? I want to overlay two videos, but the second video should begin later than the first (and is shorter). In other words: play the first video for x secs, then overlay the two until the second ends, then only play the first
[14:51:25 CEST] <gerion> searching for some kind of delay filter
[14:52:25 CEST] <furq> gerion: add -itsoffset x before the input you want to delay
[14:53:20 CEST] <gerion> ah nice, thx
[15:21:51 CEST] <spirou> can the "global headers" be added to a video stream without having to recompress it completely?
[15:34:41 CEST] <sybariten> oh hai
[15:36:12 CEST] <sybariten> i have MTS files that cause problems for my Sony Vegas editing suite. What would you reckon is the most lightweight transformation i can do, in order to try some other format? Is it for instance possible to make a new mts file where the audio is mp3 instead of ac3?
[15:50:00 CEST] <c_14> sure
[15:50:10 CEST] <c_14> It depends on what Sony Vegas wants to eat though
[15:50:40 CEST] <c_14> to just convert the audio to mp3 use ffmpeg -i input -c copy -map 0 -c:a libmp3lame -b:a 320k out.mts
[16:23:11 CEST] <rsevero> Hi. I'm trying to create a libx264/opus file but ffmpeg apparently only allows it to be an MKV file. Tried MP4, OGG and OGV but all of them fail with the same message: "Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument" You can see the whole output here: http://pastebin.com/mYNnQMit To be exact I want to create a h264/opus file that is natively supported by Chrome. Ideas?
[16:25:55 CEST] <rsevero> AFAICT Chrome natively supports the following containers: Ogg, WebM, WAV and MP4.
[16:27:27 CEST] <furq> none of those formats support both h.264 and opus
[16:27:44 CEST] <furq> you either need h.264 and aac/mp3 in mp4, or vp8/vp9 and opus in webm
[16:28:10 CEST] <furq> the former is more widely-supported
[16:28:12 CEST] <rsevero> Humm... Ok. thanks.
[16:28:20 CEST] <furq> and also means you don't have to use libvpx
[16:30:40 CEST] <kepstin> note that youtube does support h.264 and opus - but it does that by having separate audio and video streams, using DASH.
[16:54:57 CEST] <blaataap> does anyone know... I am trying to cut a simple video, no transitions. I have used PiTiVi (using gstreamer) and AviDemux (2.6 or something), and the input video is h.264. As output I try x264. The input framerate is 30, output is 30. The frames are getting recoded.
[16:55:21 CEST] <blaataap> or at least, a transition that took 10 steps in the original now takes 11 steps in the output, as the 2nd frame is getting duplicated.
[16:55:40 CEST] <blaataap> the frames are really the same but there is now an extra frame in there.
[16:56:23 CEST] <blaataap> the same happened when I outputted to mpeg4-2
[16:57:30 CEST] <blaataap> and, now I have tried x265 or something and the result is again the same.
[16:58:45 CEST] <vade> what do you want to do ?
[16:59:43 CEST] <JyZyXEL> [gif @ 0x123c0c0] ERROR: gif only handles the rgb24 pixel format. Use -pix_fmt rgb24.
[17:00:00 CEST] Last message repeated 1 time(s).
[17:00:07 CEST] <blaataap> apparently I cannot just delete frames as I will need a new key frame at the beginning so it needs reencoding.
[17:00:10 CEST] <JyZyXEL> i don't think "-pix_fmlt rgb24" is even valid?
[17:00:24 CEST] <blaataap> and the only thing I'm doing is cutting out a clip (deleting the beginning)
[17:00:32 CEST] <JyZyXEL> -pix_fmt[:stream_specifier] format (input/output,per-stream)
[17:00:36 CEST] <blaataap> and I want the video to stay the same, frame by frame.
[17:01:07 CEST] <blaataap> basically I guess I want to "copy" while getting a new keyframe in the beginning.
[17:01:26 CEST] <blaataap> i don't care about transcoding but now an extra frame is constantly getting inserted.
[17:01:38 CEST] <blaataap> I just want a perfect copy in that sense.
[17:01:48 CEST] <JyZyXEL> Option pixel_format not found.
[17:02:56 CEST] <blaataap> all it needs to do is transcode/recode with a new keyframe as starting point, nothing else.
[17:03:22 CEST] <blaataap> and I'm surprised and flabbergasted that no tool can do this....
[17:03:32 CEST] <blaataap> they probably all use the same libraries?
[17:04:32 CEST] <vade> blaataap: well you can't move samples from one container to another and make up new keyframes without re-encoding
[17:05:00 CEST] <vade> if you want a perfect copy you just want to change container formats; there are ways to do that.
[17:05:12 CEST] <vade> if you want a transition this requires blending and making new frames, and re-encoding them
[17:05:25 CEST] <blaataap> sure that's what I'm saying, but how come the length of the scene is changed?
[17:05:37 CEST] <blaataap> basically it is now 1/30 th of a second longer
[17:06:22 CEST] <blaataap> something that took 10 frames now takes 11, this is not right.
[17:06:43 CEST] <vade> are you using -vcodec copy -acodec copy ?
[17:07:07 CEST] <blaataap> I don't think it can do that because that would not create a new keyframe.
[17:07:18 CEST] <vade> you cant make new keyframes
[17:07:26 CEST] <vade> you either re-encode the video totally
[17:07:33 CEST] <vade> or you copy samples to a new container as is
[17:07:34 CEST] <blaataap> that's what I'm saying man.
[17:07:36 CEST] <blaataap> don't repeat me.
[17:07:42 CEST] <vade> so you are re-encoding.
[17:07:45 CEST] <blaataap> yes
[17:08:16 CEST] <blaataap> but the re-encoding changes the length of the scene for every output codec I try in both of these two programs.
[17:08:53 CEST] <blaataap> every time the 2nd frame is duplicated.
[17:08:53 CEST] <vade> whats your command? or are you doing this with like, libavformat / libavcodec in your own software?
[17:09:03 CEST] <blaataap> through a GUI
[17:09:15 CEST] <blaataap> probably yes.
[17:09:31 CEST] <vade> well, try it with a pure ffmpeg command and see if you can re-create it
[17:09:39 CEST] <blaataap> I mean I could probably use a direct command by inputting the starting key frame
[17:10:53 CEST] <blaataap> can I select a starting frame?
[17:11:18 CEST] <blaataap> oh -ss
[17:12:08 CEST] <furq> -vf select="gt(n\,1234)"
[17:12:12 CEST] <vade> any libavformat devs who could help me understand how to properly set timestamps for AVFrames being sent to avcodec_encode_video2() and then passed on to av_write_frame()?
[17:12:14 CEST] <furq> if you want to use the frame number
[17:12:22 CEST] <vade> I'm getting "Encoder did not produce proper pts, making some up."
[17:12:26 CEST] <vade> "Timestamps are unset in a packet for stream 0. This is deprecated and will stop working in the future. Fix your code to set the timestamps properly"
[17:13:07 CEST] <vade> I am currently successfully (I think!) decoding via avcodec_decode_video2, and passing the resulting decompressed AVFrame to avcodec_encode_video2
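That first warning usually means AVFrames are reaching the encoder with pts == AV_NOPTS_VALUE. A minimal sketch of the common decode-side fix, for the API vintage used in this log (decode_and_stamp is an illustrative name, not an FFmpeg API):

    #include <libavcodec/avcodec.h>
    #include <libavutil/frame.h>

    /* Decode one packet and give the resulting frame a usable PTS before it
     * is handed to the encoder. Illustrative; no error handling. */
    static int decode_and_stamp(AVCodecContext *dec, AVPacket *pkt, AVFrame *frame)
    {
        int got_frame = 0;
        int ret = avcodec_decode_video2(dec, frame, &got_frame, pkt);
        if (ret >= 0 && got_frame) {
            /* Decoders often leave frame->pts unset; the best-effort value
             * falls back to the packet DTS when the true PTS is unknown. */
            frame->pts = av_frame_get_best_effort_timestamp(frame);
        }
        return ret < 0 ? ret : got_frame;
    }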
[17:17:44 CEST] <blaataap> how do I specify a codec for x264?
[17:18:17 CEST] <furq> what?
[17:19:31 CEST] <blaataap> ok working
[17:20:38 CEST] <blaataap> vade: it's the same with a pure ffmpeg command.
[17:20:45 CEST] <blaataap> thanks for teaching me the command though.
[17:20:56 CEST] <blaataap> this might be easier in a while. Anyway.
[17:22:33 CEST] <vade> interesting - not sure I can help
[17:23:43 CEST] <vade> maybe email the list? or post on a forum - might get more eyes on it over-all
[17:23:51 CEST] <vade> (or stack overflow?)
[17:24:03 CEST] <blaataap> hate on stack overflow ;-).
[17:24:13 CEST] <blaataap> I will email the list when I have time I guess. thanks.
[17:25:17 CEST] <jbreeden> Can FFmpeg copy an existing data track? `-c:d copy` doesn't seem to do anything
[17:27:47 CEST] <furq> jbreeden: you can do it with -map using the stream number
[17:27:53 CEST] <furq> i don't think there is anything like -map 0:d
[17:55:46 CEST] <jbreeden> furq: thanks. I'm looking for something that finds it and passes it through. Bummer.
[17:59:10 CEST] <furq> oh apparently 0:d does exist
[17:59:16 CEST] <furq> maybe try that then
[17:59:29 CEST] <jbreeden> haha yeah, just discovered that literally as you were typing that
[18:00:02 CEST] <jbreeden> How do I use 0:d and still get the v/a tracks, if I map data, must I map everything?
[18:00:16 CEST] <furq> yeah -map overrides automatic stream selection
[18:00:22 CEST] <furq> so -map 0:v -map 0:a -map 0:d
[18:00:37 CEST] <furq> i'd have thought just -map 0 would work as well but maybe it'll ignore data streams
[18:02:18 CEST] <jbreeden> thanks for your help, this solves my problem
[18:02:41 CEST] <jbreeden> I didn't know about map 0:v|a|d
[18:27:32 CEST] <DHE> is there documentation on how to deal with PTS/DTS overflow? I'm using the API and after a day (which mathematically overflows the mpegts pts/dts values) my program breaks with "Application provided invalid, non monotonically increasing dts to muxer in stream 0"
[18:31:22 CEST] <vade> DHE: a day of encoding?
[18:32:23 CEST] <vade> can anyone explain what circumstances init an AVStream* with priv_data = NULL? I am trying to understand why I'm crashing with EXC_BAD_ACCESS within compute_muxer_pkt_fields
[18:32:23 CEST] <c_14> DHE: you might want to ask on the libav-user ml
[18:33:13 CEST] <DHE> vade: live streaming. I just leave it running
[18:33:32 CEST] <vade> you might need to segment your recordings.
[18:34:04 CEST] <DHE> no, ffmpeg the app doesn't have this problem
[18:34:06 CEST] <DHE> so something's up
[18:34:31 CEST] <vade> examine the resulting PTS/DTS and see if they are different
[18:34:36 CEST] <vade> (from your app?)
[18:40:39 CEST] <DHE> yeah, I'm going to start debugging. was hoping for immediate guidance... since it takes a day to happen, and a live source is a pretty bad source to test from
[18:40:52 CEST] <DHE> I'm going to generate 1.5 days of test pattern and burn that in instead
[18:41:54 CEST] <vade> Well, you can probably see some difference even in the difference between pts / dts ?
[18:42:03 CEST] <vade> (for short clips)
[18:42:07 CEST] <vade> sorry distracted @DHE
[18:43:28 CEST] <vade> @DHE maybe pts_wrap_behavior for AVStream is of interest?
[18:43:49 CEST] <pfelt> DHE: could you set up an ffmpeg process to stream data pulled from a file and just set the PTS, with the setpts filter, to something high?
[18:44:02 CEST] <pfelt> (and dump that stream output into a pipe)
[18:46:47 CEST] <DHE> pfelt: interesting idea, I'll give that a whirl
[18:46:54 CEST] <pfelt> that way you don't have to a) record a day's worth of data, or b) wait a day to test
[18:47:32 CEST] <pfelt> i'd think that something like (pseudocode here) pts=pts+(maxpts-someundeterminedvalue) just to get you close
[18:47:45 CEST] <pfelt> no idea if that'll work, but i suspect you'd be able to make it get you somewhere close
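For reference, MPEG-TS PTS/DTS are 33-bit values on a 90 kHz clock, so they wrap after roughly 26.5 hours - consistent with the one-day failure described above. A minimal sketch of unwrapping them manually so they stay monotonic (illustrative; one state struct per stream, with last initialized to AV_NOPTS_VALUE):

    #include <stdint.h>
    #include <libavutil/avutil.h>   /* AV_NOPTS_VALUE */

    typedef struct {
        int64_t offset;   /* accumulated wrap correction */
        int64_t last;     /* last raw (wrapped) timestamp seen */
    } TsUnwrap;

    static int64_t ts_unwrap(TsUnwrap *u, int64_t ts)
    {
        const int64_t wrap = 1LL << 33;   /* 33-bit PES timestamp range */
        if (ts == AV_NOPTS_VALUE)
            return ts;
        /* a large backwards jump means the 33-bit counter wrapped */
        if (u->last != AV_NOPTS_VALUE && u->last - ts > wrap / 2)
            u->offset += wrap;
        u->last = ts;
        return ts + u->offset;
    }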
[18:50:46 CEST] <matthias_> Hello, where can i find the ffmpeg equivalents of the arguments i use with rtmpdump? like -a, -s and multiple -C
[18:51:21 CEST] <c_14> Read the manpages?
[18:51:42 CEST] <c_14> If you tell me what the options do, I might be able to give you hints.
[18:54:19 CEST] <c_14> Maybe look at https://ffmpeg.org/ffmpeg-protocols.html#rtmp
[18:55:27 CEST] <c_14> Seems like you want rtmp_conn, rtmp_app and rtmp_swfurl
[18:55:46 CEST] <matthias_> c_14: -a is Name of target app on server , -s URL to player swf file, -C type:data     Arbitrary AMF data to be appended to the connect string
[18:56:05 CEST] <c_14> ye, those three options I just listed
[18:56:23 CEST] <matthias_> c_14: okay, i will try thanks so far
[19:08:36 CEST] <galex-713> Hi
[19:08:40 CEST] <galex-713> How can I quickly convert a 1080p movie, either to raw or to 720p, in order to play it on my slow-hardware, bad-res computer?
[19:09:08 CEST] <matthias_> c_14: is -rtmp_conn 'S:0 S:23100539 S: O:1' valid ?
[19:12:04 CEST] <c_14> The third S: is missing a value, isn't it? Also the last O:1 starts an object which is never ended. Not sure whether that's valid
[19:12:25 CEST] <trough> I'd like to overlay a VP9 video in SDL.  Would FFmpeg be overkill for this?
[19:12:39 CEST] <matthias_> c_14: it is working with rtmpdump -C S:0 -C S:23100539 -C S: -C O:1
[19:13:24 CEST] <c_14> try -rtmp_conn S:0 -rtmp_conn S:23100539 -rtmp_conn S: -rtmp_conn O:1
[19:13:36 CEST] <matthias_> c_14: ok
[19:13:48 CEST] <c_14> trough: you want to overlay a video on another video and then display that via sdl?
[19:14:38 CEST] <trough> more likely, overlay a video on an image, and display that via SDL
[19:15:14 CEST] <c_14> galex-713: ffmpeg -i 1080p -vf scale=-2:720 -map 0 -c copy -c:v libx264 -preset ultrafast out.mkv
[19:15:33 CEST] <trough> I wasn't sure whether I should use plain libvpx, ffmpeg, or gstreamer.
[19:16:20 CEST] <trough> I am not sure about gstreamer, because it is part of gnome, and I heard some strange things about the gnome build-chain.
[19:18:03 CEST] <c_14> trough: you can probably do that with just commandline ffmpeg (even if the outdev support is kind of a hack) by using ffmpeg -i image -i video -filter_complex [0:v][1:v]overlay[v] -map [v] -map 0:a -f sdl
[19:18:41 CEST] <c_14> Plain libvpx might be a bit difficult because you'd have to manage the overlaying yourself; I can't comment much about gstreamer. It should be relatively easy with ffmpeg, though it might take a bit to get the API usage right.
[19:19:45 CEST] <trough> I already know how to do the overlaying in SDL, and this is going to be part of a game, so I don't think I can use the command line.
[19:20:04 CEST] <trough> But I'm expecting to use the libavcodec/format API
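A minimal sketch of the demux/decode side of that plan - pull VP9 packets with libavformat, decode with libavcodec, and hand the planar YUV frames to SDL (illustrative, using the pre-3.1 stream->codec API current at the time; flushing and error handling omitted):

    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    static int decode_vp9_for_sdl(const char *path)
    {
        AVFormatContext *fmt = NULL;
        AVCodecContext *dec;
        AVCodec *codec;
        AVFrame *frame = av_frame_alloc();
        AVPacket pkt;
        int stream, got_frame;

        av_register_all();
        if (avformat_open_input(&fmt, path, NULL, NULL) < 0)
            return -1;
        avformat_find_stream_info(fmt, NULL);

        /* pick the video stream and a matching decoder (vp9/libvpx-vp9) */
        stream = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, &codec, 0);
        dec = fmt->streams[stream]->codec;
        avcodec_open2(dec, codec, NULL);

        while (av_read_frame(fmt, &pkt) >= 0) {
            if (pkt.stream_index == stream) {
                avcodec_decode_video2(dec, frame, &got_frame, &pkt);
                if (got_frame) {
                    /* VP9 normally decodes to yuv420p: frame->data[0..2]
                     * are the Y/U/V planes, ready for SDL_UpdateYUVTexture
                     * on an SDL_PIXELFORMAT_IYUV texture */
                }
            }
            av_packet_unref(&pkt);
        }
        avformat_close_input(&fmt);
        return 0;
    }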
[19:20:40 CEST] <matthias_> c_14: not working: https://bpaste.net/show/1a4d8ca79086
[19:23:39 CEST] <c_14> matthias_: maybe try this -rtmp_conn S:0 -rtmp_conn S:23100539 -rtmp_conn S: -rtmp_conn 'O:1 NS:srv:72737 NS:sid:5155079152 NS:sessionType:preview O:0' ?
[19:23:50 CEST] <c_14> Also try adding -loglevel debug to see if it spits out anything interesting
[19:24:23 CEST] <furq> trough: fwiw i don't think gstreamer actually depends on any gnome stuff
[19:24:28 CEST] <furq> just glib
[19:24:43 CEST] <spiderkeys> are there any good places to ask in depth libav questions besides the mailing list?
[19:25:09 CEST] <spiderkeys> didn't get any bites there
[19:25:09 CEST] <furq> here if you're lucky
[19:26:45 CEST] <trough> that's good to know, furq.  Thanks.
[19:26:48 CEST] <spiderkeys> well, if anyone has experience working with avformat and avcodec doing muxing, would appreciate having this looked at: http://ffmpeg.org/pipermail/libav-user/2016-April/009063.html
[19:30:54 CEST] <galex-713> c_14: thats great it works! \o/
[19:33:05 CEST] <matthias_> c_14: in the debug log: -rtmp_playpath '5155079152'  -rtmp_app 'reflect/5155079152' output: Proto = rtmp, path = /reflect/5155079152, app = reflect, fname = 5155079152 and it says trailing options found
[19:33:21 CEST] <matthias_> c_14: isn't app cut off?
[19:35:42 CEST] <c_14> try escaping the '/' with a '\' ?
[19:37:13 CEST] <matthias_> c_14: no difference
[19:42:04 CEST] <c_14> hmm, no idea. You might want to ask on the ffmpeg-user ml. Maybe someone else has had a similar issue.
[19:43:29 CEST] <matthias_> c_14: or i will pipe rtmpdump to ffmpeg
[19:43:42 CEST] <c_14> or that
[21:04:07 CEST] <vade> hi all. I'm trying to track down an issue where the resulting file from an encode has the wrong timestamps and is very short. I'm unsure if I am handling my main encode loop / PTS logic correctly. https://gist.github.com/vade/a48d2445262a3e94a51bfc12424c55ca
[21:04:08 CEST] <vade> should I be using av_rescale_q on the frames being decoded, or on the packets populated from the encoder prior to writing into the muxer?
[21:12:57 CEST] <Mavrik> vade, the rescale call looks wrong
[21:13:10 CEST] <Mavrik> encoders expect proper PTS though
[21:13:13 CEST] <Mavrik> (filter as well)
[21:13:26 CEST] <Mavrik> so your PTS going into an encoder needs to be in encoder's codec timebase
[21:13:34 CEST] <vade> hi Mavrik - thanks. Interesting.
[21:13:41 CEST] <Mavrik> and PTS/DTS then has to be in output format timebase when written to muxer
[21:15:15 CEST] <vade> here is a question - I'm used to AVFoundation (forgive me) - when I pull AVPackets off of the stream, they are all in DTS order - do I need to re-order them to PTS so my AVFrames from the decoder are in the right order for the encoder? In Core Media, you can do this via a CMBufferQueue and request PTS ordering or DTS ordering. I'm unfamiliar with the expectation in FFmpeg
[21:15:24 CEST] <vade> (thank you in advance btw)
[21:15:51 CEST] <Mavrik> Nope, DTS order is fine
[21:15:59 CEST] <vade> interesting. unexpected
[21:16:02 CEST] <Mavrik> Why?
[21:16:12 CEST] <vade> Just was something I noticed while debugging is all :)
[21:16:15 CEST] <Mavrik> The point of DTS is that it determines the order of decoding.
[21:16:32 CEST] <vade> sure - well, im writing a transcoder
[21:16:33 CEST] <Mavrik> So decoder needs them in DTS order to properly decode :P
[21:16:42 CEST] <Mavrik> And it can't decode them in any other order than DTS order :P
[21:16:48 CEST] <Mavrik> Due to how references work.
[21:16:53 CEST] <Mavrik> Anyway, be careful of timebases
[21:16:58 CEST] <vade> oh. interesting. I didnt know that last part
[21:17:40 CEST] <Mavrik> Input packets are in the input format timebase, decoded frames should be in the decoder timebase, encoders expect their frames in the encoder timebase, and muxers expect packets in the output stream timebase :)
[21:17:46 CEST] <Mavrik> (IIRC)
[21:18:02 CEST] <Mavrik> (If sample code uses any others, it's probably the right one, talking out of my head :) )
[21:18:07 CEST] <vade> in AVFoundation if you do pass-through, you can just pull sample buffers (AVPackets I guess) in decode order and they can be passed on in decode order - but if you want to *re-encode*, uncompressed frames need to come in PTS order only. I guess you do get them in PTS order on the way out from the decoder. I think I'm just confused and have been staring at this too long haha
[21:18:40 CEST] <vade> Interesting. I was unaware of the encoder vs muxer timebase nuance
[21:19:00 CEST] <vade> whats a good sample to look at regarding this nuance?
[21:33:01 CEST] <Mavrik> hmm, I think decoding_encoding.c or muxing examples should handle that properly?
[21:33:37 CEST] <Mavrik> vade, your main issue is that you're trying to change PTS timebase from OUTPUT codec timebase to OUTPUT stream timebase.
[21:33:46 CEST] <Mavrik> When you should be changing them from INPUT codec timebase to OUTPUT codec timebase
[21:34:07 CEST] <Mavrik> And then after encode from OUTPUT codec timebase to OUTPUT stream timebase (if they're different, commonly they're not)
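A sketch of that two-step chain in one helper (ist/ost/enc and encode_and_write are illustrative names; error handling mostly omitted):

    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    /* Encode one decoded frame and mux it, converting timebases at both
     * hand-off points. */
    static int encode_and_write(AVFormatContext *ofmt, AVStream *ist,
                                AVStream *ost, AVCodecContext *enc,
                                AVFrame *frame)
    {
        AVPacket pkt;
        int got_packet, ret;

        av_init_packet(&pkt);
        pkt.data = NULL;
        pkt.size = 0;

        /* step 1: frame PTS from the input (stream) timebase into the
         * encoder's timebase */
        frame->pts = av_rescale_q(frame->pts, ist->time_base, enc->time_base);

        ret = avcodec_encode_video2(enc, &pkt, frame, &got_packet);
        if (ret < 0 || !got_packet)
            return ret;

        /* step 2: packet PTS/DTS from the encoder timebase into the output
         * stream timebase (the muxer may have changed it at write_header) */
        av_packet_rescale_ts(&pkt, enc->time_base, ost->time_base);
        pkt.stream_index = ost->index;
        return av_interleaved_write_frame(ofmt, &pkt);
    }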
[21:36:42 CEST] <spiderkeys> Mavrik: you sound well versed in handling this muxing issue. I've tried to follow all of the examples, tutorials, mailing lists, etc and put together a muxing application that takes a raw h264 elementary stream and muxes each packet into an mp4 container, but none of the timing info seems correct. could I trouble you to take a quick look at my code and see if I'm doing something wrong with regards to how I'm doing stream identific
[21:50:40 CEST] <pandb> I adapted one of the examples on ffmpeg.org to make a program that outputs a video from a continuously updating source of image data
[21:52:06 CEST] <pandb> if I output to a file it plays fine, but I'm trying to broadcast live to rtmp servers like twitch.tv or youtube's live stream feature, and in either case nothing displays
[21:53:08 CEST] <pandb> can someone who knows about streaming look at my video-related code here: http://pastebin.ca/3584711
[21:53:45 CEST] <pandb> codec and stream settings are set in add_stream()
[21:54:31 CEST] <pandb> and write_video_frame()/get_video_frame() are the functions for creating the new frame and writing it
[21:58:46 CEST] <pandb> also, wireshark is showing that my data is being sent to whatever rtmp server i specify
[21:59:43 CEST] <pandb> (and the debug messages that I receive by calling av_log_set_level(AV_LOG_DEBUG) indicate a successful handshake and data being sent)
[22:02:10 CEST] <Mavrik> spiderkeys, you can try, but I can't guarantee I'll have time :)
[22:06:13 CEST] <spiderkeys> Mavrik: fair enough :P It isn't too much code, just the update function in this class: https://github.com/OpenROV-Dev/geomuxpp/blob/inproc_rewrite/src/CMuxer.cpp
[22:06:17 CEST] <spiderkeys> starts at lines ~131
[22:06:46 CEST] <vade> Mavrik: thanks. Im getting closer
[22:06:51 CEST] <spiderkeys> a callback somewhere else is filling the inputBuffer with raw h264 frames (complete nal units)
[22:07:28 CEST] <spiderkeys> so i probe for the format, then find the stream info, then create an output stream with stream and codec info copied from the input stream
[22:08:11 CEST] <Mavrik> spiderkeys, you should set time_base on the output stream
[22:08:11 CEST] <spiderkeys> but the timing info is always weird: see my mailing list post here for the output of my program when it detects the stream info: http://ffmpeg.org/pipermail/libav-user/2016-April/009063.html
[22:08:40 CEST] <Mavrik> spiderkeys, and then av_rescale_q the packet pts and dts (if they're not AV_NOPTS) from the input stream time_base to output stream time_base
[22:08:44 CEST] <spiderkeys> how do I know what value to set it to? is that a field I find in the codec or the input format?
[22:08:57 CEST] <Mavrik> You can just copy it from input.
[22:09:01 CEST] <Mavrik> But there are conventions.
[22:09:15 CEST] <Mavrik> In TS it's usually 1/90000, in others it's 1/fps
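A minimal sketch of that remux-path handling (remux_packet is an illustrative name; assumes ost->time_base was copied from ist->time_base before avformat_write_header, which may still adjust it):

    #include <libavformat/avformat.h>

    /* Stream-copy one packet, converting timestamps between stream
     * timebases and skipping values that are unset. */
    static int remux_packet(AVFormatContext *ofmt, AVStream *ist,
                            AVStream *ost, AVPacket *pkt)
    {
        if (pkt->pts != AV_NOPTS_VALUE)
            pkt->pts = av_rescale_q(pkt->pts, ist->time_base, ost->time_base);
        if (pkt->dts != AV_NOPTS_VALUE)
            pkt->dts = av_rescale_q(pkt->dts, ist->time_base, ost->time_base);
        pkt->duration = av_rescale_q(pkt->duration, ist->time_base,
                                     ost->time_base);
        pkt->stream_index = ost->index;
        return av_interleaved_write_frame(ofmt, pkt);
    }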
[22:11:21 CEST] <spiderkeys> unfortunately, for the input stream it gets AV_NOPTS as the value for the start and duration timestamps: [h264 @ 0x7fcd400008c0] stream: start_time: -9223372036854.775 duration: -9223372036854.775 bitrate=0 kb/s
[22:11:57 CEST] <Mavrik> ah
[22:12:01 CEST] <Mavrik> That would explain your issue.
[22:12:14 CEST] <Mavrik> Which makes sense if they're raw NALs
[22:12:18 CEST] <Mavrik> Do you know your FPS?
[22:12:29 CEST] <spiderkeys> It is variable, but it should generally be at 30fps
[22:12:52 CEST] <spiderkeys> my SPS packet does contain this timing info: timing_info_present_flag :1  num_units_in_tick :6006  time_scale :180000
[22:13:05 CEST] <Mavrik> Hmm.
[22:13:47 CEST] <spiderkeys> and doing an avdump produces this guess for the stream: http://pastebin.com/Gkk8082z
[22:13:59 CEST] <spiderkeys> the tbc looks correct but the other values are whack
[22:14:22 CEST] <Mavrik> Do you get pts and dts set on your avpackets?
[22:14:27 CEST] <Mavrik> When reading them?
[22:15:33 CEST] <spiderkeys> let me print them real quick. i think they were always invalid, but let me double check
[22:17:04 CEST] <spiderkeys> Writing muxed packet. Bytes: 32440. PTS: -9223372036854775808 DTS: -9223372036854775808
[22:17:08 CEST] <spiderkeys> yea, av_nopts
[22:17:34 CEST] <Mavrik> That's a bit of an issue for VFR video :/
[22:17:41 CEST] <Mavrik> Do you know which muxer does it use?
[22:18:37 CEST] <spiderkeys> should be mp4. I'm pulling in each h264 NAL unit and wrapping it in a fragmented mp4 container with movflags "empty_moov+default_base_moof+frag_keyframe"
[22:18:51 CEST] <Mavrik> spiderkeys, on the input, not output
[22:19:01 CEST] <Mavrik> (To determine why muxer doesn't set timestamps on packets)
[22:19:38 CEST] <spiderkeys> oh, no idea. The camera we have comes straight from an OEM that has some onboard encoding process, sending complete frames over usb
[22:20:03 CEST] <spiderkeys> I could ask them at some point what is going on with timing info. is there something in particular I should ask about that would need to be included?
[22:20:18 CEST] <Mavrik> hmm
[22:20:29 CEST] <Mavrik> Nah, I think it's ffmpeg's "fake" demuxer that's causing you issues
[22:21:14 CEST] <Mavrik> spiderkeys, I'm also wondering why you're initializing a decoder which you then don't use
[22:21:41 CEST] <Mavrik> spiderkeys, check which demuxer you got in p_InputFormatContext.
[22:22:55 CEST] <spiderkeys> alright, I'll do that real quick. I was under the impression that I needed to get all of the codec information in that manner in order to create an output stream. I'm not interested in actually decoding anything, just want to wrap it in an mp4 container
[22:26:30 CEST] <Mavrik> Yeah, you may have to call the decoder to generate timestamps tho
[22:26:47 CEST] <Mavrik> I think "raw video" fake decoder can handle timestamping.
[22:26:54 CEST] <Mavrik> Buuut would have to check the code :)
[22:27:22 CEST] <Mavrik> Since timestamps have to be taken from the NALs and applied to the AVFrame/AVPacket
[22:33:02 CEST] <spiderkeys> Mavrik: strange, not sure how to access the AVCodecTag structure at m_pInputFormatContext->iformat->codec_tag. Getting error: invalid use of incomplete type struct AVCodecTag
[22:33:11 CEST] <spiderkeys> even though I've definitely included avformat.h
[22:34:21 CEST] <Mavrik> spiderkeys, just print name or long_name ? :)
[22:36:21 CEST] <spiderkeys> Ah ok: name: raw H.264 video codec id: 0
[22:37:33 CEST] <spiderkeys> Mavrik: do you know what the name of the field that would contain a timestamp in the NAL is?
[22:38:11 CEST] <Mavrik> There is no such field.
[22:38:21 CEST] <Mavrik> Since those fields are generic for all formats.
[22:38:28 CEST] <Mavrik> They won't have codec specific stuff.
[22:39:33 CEST] <spiderkeys> Oh I think I see what you're saying now. The raw frames don't have that info, so it would be the decoder that produces them using the SPS timing info, but since I'm not using the decoder, they never get generated
[22:40:41 CEST] <Mavrik> I don't see raw H.264 demuxer setting pts/dts anywhere :/
[22:40:57 CEST] <Mavrik> yeah, pretty much
[22:41:13 CEST] <Mavrik> That wouldn't be a problem if you'd have a constant FPS video
[22:41:19 CEST] <Mavrik> Then you could just generate PTS
[22:41:30 CEST] <Mavrik> Hrmf, even though with B-frames that wouldn't work :/
[22:41:50 CEST] <spiderkeys> I could potentially lock the framerate (since we don't really want it variable) and then do that
[22:42:09 CEST] <Mavrik> Yeah, but B-frames will still mess you up.
[22:42:19 CEST] <spiderkeys> I think we only have I and P frames, but couldn't be sure B frames wouldnt show up some point in the future
[22:42:21 CEST] <Mavrik> I guess you could manually parse the pts/dts ?
[22:42:33 CEST] <Mavrik> Maybe someone else has a better solution
[22:44:48 CEST] <spiderkeys> If I went that route for now, I would basically just create a 64-bit int, start at 0, and add the duration as calculated from the framerate to generate the PTS/DTS for the AVPacket before writing it?
[22:45:58 CEST] <vade> I've been having similar but slightly different issues, and I've seen solutions / sample code that use that approach
[22:46:07 CEST] <madgino> Can I use ffmpeg to convert from HLS to to mpeg-ts?
[22:46:11 CEST] <vade> and it uses av_rescale_q based on the incremented packet or frame count
[22:46:12 CEST] <Mavrik> spiderkeys, yes, the duration you add has to be in stream timebase
[22:46:20 CEST] <Mavrik> spiderkeys, but VFR and B-frames can mess you up
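A sketch of that counter-based approach under the assumptions stated above - constant frame rate and no B-frames, so PTS == DTS is safe (stamp_cfr_packet is an illustrative name; frame_index starts at 0 and persists across calls):

    #include <libavformat/avformat.h>

    /* Synthesize timestamps for raw H.264 packets that arrive with
     * AV_NOPTS_VALUE, assuming a fixed frame rate and no B-frames. */
    static void stamp_cfr_packet(AVPacket *pkt, AVStream *ost,
                                 int64_t *frame_index, int fps)
    {
        /* frame N starts at N/fps seconds; express that in the output
         * stream's timebase */
        pkt->pts = av_rescale_q(*frame_index, (AVRational){1, fps},
                                ost->time_base);
        pkt->dts = pkt->pts;   /* only valid because there are no B-frames */
        pkt->duration = av_rescale_q(1, (AVRational){1, fps}, ost->time_base);
        (*frame_index)++;
    }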
[22:46:56 CEST] <spiderkeys> Since we are live-streaming this into the browser and pushing the frames into the MSE decoder as fast as we can, it can typically play the data with no problem and doesn't really care about the timestamps, but when we save the video to an .mp4 file, most media players have trouble figuring out how to play it at all, or if they can, the playback rate is usually wrong
[22:47:14 CEST] <kepstin> madgino: sure, but you probably don't have to - HLS is just a bunch of mpeg-ts files and you can simply concatenate them in most cases.
[22:48:36 CEST] <spiderkeys> Mavrik: I really appreciate your help, learning the libav internals can be a bit of a chore, and I understand a bit better what is going on now.
[22:49:12 CEST] <Mavrik> Yeah, I'm afraid reading the source is pretty much a must for those things :/
[22:50:21 CEST] <spiderkeys> Yea, I've had to do that on a few occasions to figure out how to get the muxer to write out frames without buffering them. I still find it odd that none of the flags created that behaviour and I had to call: ret = av_write_frame(m_pOutputFormatContext, &packet); ret |= av_write_frame(m_pOutputFormatContext, NULL );
[22:50:30 CEST] <spiderkeys> that second null write to force the flush
[22:50:35 CEST] <spiderkeys> took me forever to find that
[22:58:37 CEST] <kbarry> I'm struggling to get more information on google. I believe i'm just short a word or so. What is the window that appears (the visualization) when using ffplay?
[23:15:46 CEST] <pandb> can someone knowledgeable about streaming to rtmp look at my code and tell me if there's anything wrong with my codec settings, or what I do to the frame or the packets I send to the muxer? http://pastebin.ca/3584711
[23:16:50 CEST] <pandb> the files it outputs play fine, but nothing shows up on twitch or youtube live stream if I specify an rtmp server
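One thing worth double-checking in that situation: RTMP carries FLV, so the output must use the flv muxer no matter what the URL looks like, and the rtmp:// protocol has to be opened explicitly. A minimal sketch (the URL is a placeholder; minimal checks only):

    #include <libavformat/avformat.h>

    static AVFormatContext *open_rtmp_output(const char *url)
    {
        AVFormatContext *ofmt = NULL;

        avformat_network_init();
        /* force the FLV muxer regardless of the rtmp:// URL */
        if (avformat_alloc_output_context2(&ofmt, NULL, "flv", url) < 0)
            return NULL;
        /* flv is not AVFMT_NOFILE, so open the protocol by hand */
        if (!(ofmt->oformat->flags & AVFMT_NOFILE) &&
            avio_open(&ofmt->pb, url, AVIO_FLAG_WRITE) < 0)
            return NULL;
        /* add streams here, then avformat_write_header(ofmt, NULL) */
        return ofmt;
    }

    /* usage: AVFormatContext *c = open_rtmp_output("rtmp://live.twitch.tv/app/STREAM_KEY"); */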
[23:25:55 CEST] <petecouture> Maybe someone's run into this before. I have a live stream I'm processing to HLS. I added another output that generates JPG frame images once a second. I'm able to get constant frames generated using frame_%d.jpg where %d is the frame number. However I'm looking to just have the file overwritten every second, and it's not working when I just output a single file. I'm getting an error: ffmpeg exited with code 1: Conversion failed!
[23:26:46 CEST] <petecouture> Also is there a way to generate an image once every 5 seconds? I'm using -r 1 and don't think I can go any lower
[23:27:16 CEST] <furq> -r 1/5
[23:27:32 CEST] <petecouture> furq: I figured just didn't try it. Thanks!
[23:27:48 CEST] <petecouture> can you put the fraction in like that or use a decimal?
[23:27:52 CEST] <furq> either
[23:29:28 CEST] <petecouture> Thanks again. Any advice on the filename issue?
[23:30:03 CEST] <petecouture> Could it be a read/write conflict? I'm spawning ffmpeg from within Node so there's not much in the way of error messages coming back.
[23:31:12 CEST] <furq> use -update 1
[23:31:27 CEST] <furq> otherwise the filename has to be a pattern
[23:31:38 CEST] <furq> or the input has to be one frame
[23:32:32 CEST] <petecouture> Cheers again!
[23:37:01 CEST] <petecouture> Is 'update' a new option? I can't find it in the documentation
[23:40:36 CEST] <furq> https://www.ffmpeg.org/ffmpeg-formats.html#Options-5
[23:46:30 CEST] <petecouture> thnx!
[00:00:00 CEST] --- Thu Apr 28 2016


More information about the Ffmpeg-devel-irc mailing list