[Ffmpeg-devel-irc] ffmpeg.log.20130419

burek burek021 at gmail.com
Sat Apr 20 02:05:01 CEST 2013


[00:23] <jacobs1> durandal_1707: is piping to the rawvideo demuxer the same as using image2pipe? I have a process generating realtime rgba images of the same size and I am trying to stream these images as a video stream. I tried with rawvideo but got bad results, so I thought maybe it's the wrong way. Can you take a look at this post and see if I am doing something wrong: http://ffmpeg-users.933282.n4.nabble.com/multicast-streaming-real-time-raw-rgb-images-td4658526.html
[01:55] <Karen> Hiii
[08:28] <bogdanp> what's a good way to handle variable frame rate from a rtmp stream? av_frame_get_pkt_duration returns 0 for video packets.
[08:30] <bogdanp> nvm, I was actually doing something wrong :D.
[09:19] <praveenmarkandu> hi, can I know what version of HLS support is available in FFmpeg?
[11:32] <brontosaurusrex> can i assume that if max_volume = 0.0 (out of volumedetect) the file is surely clipped?
[11:32] <brontosaurusrex> and can i assume the same for lossy files, like mp3?
[11:34] <brontosaurusrex> example: ffmpeg -i some.mp3 -vn -af volumedetect -f null -
[11:37] <brontosaurusrex> and more, what would i use to echo the amount of clipped samples?
[11:37] <brontosaurusrex> (working on an info script)
[11:51] <brontosaurusrex> cough
[11:54] <durandal_1707> brontosaurusrex: probably, see: ffmpeg -f lavfi -i aevalsrc=99999 -vn -af volumedetect -f null -
[11:55] <brontosaurusrex> what is with the decoder roundup errors and lossy files?
[11:55] <durandal_1707> but max just gives the maximum volume found; the histograms tell you more...
[11:56] <brontosaurusrex> durandal_1707, right
[11:56] <brontosaurusrex> durandal_1707, i'd like something similar to : slight clipping, 5 samples or severe clipping, more than 5 seconds
[11:56] <brontosaurusrex> on output basically
[11:57] <brontosaurusrex> like i said , an info script
[12:00] <durandal_1707> you can't get clipping that way, you can only get how much is left when clipping happens...
[12:01] <durandal_1707> so if you have a max volume of 0, there would be at least 1 sample where clipping happens if you increase the gain (by using any filter)
[12:03] <durandal_1707> and the histograms from volumedetect already provide that info (separated into several dB bins)
[12:04] <brontosaurusrex> i see, example?
[12:05] <durandal_1707> example of what?
[12:05] <brontosaurusrex> histogram_0db: 78139392 < this number is # of samples?
[12:06] <durandal_1707> that just lists the samples in some range ...
[12:06] <durandal_1707> you could also use ebur128 filter
[12:06] <brontosaurusrex> but i don't see any clipping info in ebur128?
[12:07] <brontosaurusrex> hmm, let me retest that
[12:09] <brontosaurusrex> basically if i get a file that is clipped before the r128 analysis, that should tell the provider: do it again, no need for r128
[12:09] <brontosaurusrex> right?
[12:12] <brontosaurusrex> what i have so far, if anyone is interested: http://paste.debian.net/plain/250444
[12:12] <durandal_1707> hmm, you are probably looking for: Flat factor is a measure of the flatness (i.e. consecutive samples with the same value) of the signal at its peak levels (i.e. either Min level, or Max level).
[12:12] <durandal_1707> from sox's stats effect
[12:13] <durandal_1707> i'm gonna port it to ffmpeg/lavfi soon
[12:13] <brontosaurusrex> so what exactly is histogram_0db ?
[12:14] <durandal_1707> probably the number of samples found between 0 and -1 dB
[12:15] <brontosaurusrex> mkay
[12:21] <brontosaurusrex> i think i'm just gonna echo that number as well if 0 is detected in the first place
[12:22] <durandal_1707> that may not mean that clipping actually happened, nor how severe the clipping is....
[12:23] <durandal_1707> but if the 0db histogram has more samples than any other, then it's probably clipped (just guessing....)
[12:24] <durandal_1707> you could do a little research with lightly, moderately and severely clipped files and compare their volumedetect reports
[13:24] <flowolf> hi
[13:25] <flowolf> I'm exploring various solutions for live video streaming
[13:25] <flowolf> but I can't find information on the scalability of ffserver
[13:26] <flowolf> I don't even understand whether it is meant for home/local streaming or for production environments on a CDN
[13:27] <flowolf> can you help?
[13:27] <Magicking> flowolf: What kind of stream ?
[13:27] <JEEB> ffserver is used in production in some places, but it's really a mess regarding getting information about it :D
[13:28] <JEEB> depending on the type of streaming you could use ffmpeg itself, which is often much better documented and so forth
[13:28] <JEEB> (rtmp(e) pushing to a rtmp(e) streaming server, pushing http post requests to a http streaming service and so forth)
[13:34] <JEEB> I've not done large-scale streaming or anything, but just saying that when I wanted to use a single piece of software to do something simple like mpeg-ts via http (served by the same thing), I just found VLC (and its command line version) the best alternative. That said, as long as ffmpeg itself can be used for your use case instead of ffserver, that *is* simpler to use than VLC.
[13:41] <flowolf> I need a way to stream to every device and browser
[13:41] <flowolf> and I need to load balance it on multiple servers
[13:42] <flowolf> as there isn't a single streaming format supported on every device
[13:42] <flowolf> I will probably need multiple formats
[13:43] <flowolf> http live streaming (hls) for apple devices, webm/mp4 for desktop browsers and rtmp for flash fallback
[13:44] <JEEB> for rtmp(e) you need to use one of the servers, and then you can just feed stuff to them via ffmpeg itself
[13:47] <flowolf> JEEB, define "one of the servers" please :)
[13:49] <JEEB> flowolf, I think there are multiple rtmp(e) streaming servers
[13:49] <JEEB> so one of those, I have no idea at all :P
[13:50] <flowolf> isn't ffserver one of them?
[13:51] <JEEB> no
[13:52] <JEEB> ffserver only does stuff like http serving IIRC
[13:58] <bogdanp> why would av_frame_get_pkt_duration return the same duration for every video frame even though the frame rate is variable? I'm reading in a rtmp stream and this is messing up my timing.
[15:49] <phantomis> Hi guys, I'm having some trouble playing an HLS stream which needs to save cookies to retrieve the next playlist
[15:49] <phantomis> and I don't know how to do that :(
[15:49] <phantomis> ./ffplay http://radioalacarta.cooperativa.cl/playlist/playlist.m3u8
[15:58] <phantomis> (sorry if this message was sent twice, my client died for a moment)
[16:03] <hendry> what's a good mono -acodec for making a dump to .mkv ?
[16:04] <hendry> my ffmpeg invocation is bombing out with pcm_s16le on a mono input, because IIUC it expects 2 channels!
[16:05] <hendry> i get a "cannot set channel count to 2 (Invalid argument)"
[16:11] <hendry> ah, I wanted -ac 1
[16:30] <Magicking> flowolf: You could just use HDS and HLS
[16:31] <Magicking> Since it's file-based streaming, you can easily adjust your architecture using only HTTP caching servers
[16:32] <Magicking> If you exclude old Android devices, you can cover iPhone / Android (without bandwidth scaling) / Mac (HLS and HDS) / Windows / Linux
[16:34] <flowolf> Magicking, what would you use to stream content using HDS?
[16:34] <Magicking> HDS uses segments like HLS
[16:34] <flowolf> I have seen that ffmpeg can produce HLS segments and playlists
[16:34] <Magicking> The stream works almost the same as HLS
[16:35] <flowolf> is it the same for HDS?
[16:35] <Magicking> I don't know
[16:43] <Oele> flowolf: does it need to be free/open source software?
[16:44] <Oele> flowolf: you could take a look at Wowza. it's proprietary but afaik it does everything you mentioned
[16:46] <Magicking> Oh yeah, I wasn't sure I was allowed to talk about other projects
[16:46] <Magicking> But I used Wowza to do that, they are cheap and well documented
[16:46] <xlinkz0> is it possible to read_frames from the end of the file?
[16:47] <Magicking> flowolf: I used Wowza to do both hls and hds
[16:48] <Magicking> It's cheap and it does everything you want; you can also easily extend it with your own plugins
[16:49] <Magicking> What you want is an HTTP origin server, I think, and a bunch of reverse proxies like nginx or varnish in front of it
[17:15] <flowolf> Magicking, I want to avoid proprietary software
[17:19] <flowolf> rtmp and hls streams can be produced with open source software
[17:20] <flowolf> nginx-rtmp supports both of them, crtmpd supports rtmp, and ffmpeg can be used to segment for hls, which can then be served by any httpd
[17:22] <flowolf> I can't find open source software for hds, and I don't get why hds is better than rtmp (other than being served over a standard http channel)
[17:26] <Oele> i think that's the main advantage. it means that you can use http caching etc.
[17:30] <Oele> but it seems you have already found a working solution for flash (rtmp), apple (hls) and desktop browsers (plain webm/mp4 over http) :)
[17:31] <Oele> so why bother with hds?
[17:35] <xlinkz0> I use an rtsp stream as input and copy it into a local file; is there any way to tell when ffmpeg started writing?
[17:36] <xlinkz0> like the exact millisecond that the first frame was written at
[17:37] <flowolf> Oele, the "desktop browsers (plain webm/mp4 over http)" is missing actually
[17:38] <flowolf> and I would like to know what can be done with ffmpeg/ffserver
[17:43] <Oele> webm/mp4 over http is just a matter of producing a standard webm or mp4 file and putting it on a generic web server
[17:43] <Oele> you can probably produce those files with ffmpeg
[17:43] <JodaZ> Oele, and paying the licensing cost
[17:43] <Oele> oh wait, you're talking about *live* streaming
[17:44] <flowolf> yep, live :)
[17:44] <Oele> i don't know if that's even possible?
[17:44] <flowolf> webm supports live streaming
[17:45] <flowolf> and it can be done with ffserver
[17:45] <flowolf> https://www.virag.si/2012/11/streaming-live-webm-video-with-ffmpeg/
[17:45] <JodaZ> does anyone know how to transcode individual HLS segments ? to like allow a user to start a video and then skip right to the end without it having to transcode all the stuff in between
[17:51] <Magicking> flowolf: If you use rtmp, you'll use proprietary protocol
[17:51] <Oele> it's an open specification. and there are open source implementations
[17:52] <Oele> flowolf: i have never used ffserver, i'm afraid i can't help you with that. :( i do use icecast for live audio streaming. the latest version supports Webm too. Maybe you could take a look at that..
[17:52] <Mavrik> flowolf, note, webm live stream can't be played in all browsers
[17:52] <Mavrik> :)
[17:55] <Oele> yep, if you want to support all browsers you'll need two codecs at least
[17:56] <Oele> and i haven't seen any free software that can stream live mpeg4 video yet?
[17:57] <flowolf> *sigh*
[17:57] <Mavrik> Oele, VLC :)
[17:58] <flowolf> I need this to scale to thousands of clients; vlc streaming is built for local/home streaming
[17:59] <Magicking> "Oele | it's an open specification. and there are open source implementations" <- yeah right! And when you have an Adobe product in front of you, suddenly nothing works
[18:00] <JEEB> flowolf, actually vlc's streaming capabilities aren't much worse off than what you'd get from ffmpeg in many cases :P
[18:00] <Oele> never tried it Magicking , but i believe you :)
[18:00] <JEEB> <Oele> and i haven't seen any free software that can stream live mpeg4 video yet? <- what, also what do you mean with 'mpeg4' here?
[18:00] <JEEB> because MPEG-4 contains an awful lot of stuff
[18:01] <JEEB> starting from containers to video and audio formats
[18:01] <Oele> yes
[18:02] <Oele> i guess H264 in an MP4 container. but i'm not sure
[18:02] <JEEB> ok... now name even one streaming "standard" that does that
[18:02] <JEEB> other than reading a static already encoded file and giving it out via http
[18:02] <JEEB> which really isn't streaming at all
[18:04] <Oele> yeah, you're right, it's probably not even possible
[18:04] <Oele> the thing is, the HTML5 specs seems to have been written with media "files" in mind
[18:04] <JEEB> No, you /can/ actually stream MP4 while you're encoding it if you use the movie fragments feature in the container
[18:04] <JEEB> but no-one uses that >_>
[18:05] <Oele> i know that it is possible to stream MP3, ogg vorbis and in some cases AAC+ (ADTS) audio to browsers
[18:05] <JEEB> those are all pretty much containerless files
[18:05] <Oele> but it's not in any spec, it's just a matter of trial and error
[18:05] <JEEB> just raw streams, so yes
[18:05] <Oele> exactly.
[18:05] <JEEB> also yes, if you limit yourself to "HTML5 specified" there really aren't /any/ proper streaming solutions so I'm not sure why you were specifying "free software" there in your comment
[18:06] <Oele> webm will probably work, but i don't know of any "streamable" format for "mpeg4 video"
[18:06] <JEEB> as in, there afaik are no official ways of doing streaming that are "defined" in "HTML5"
[18:06] <JEEB> ok, now you're once again going for the "mpeg4" word
[18:06] <JEEB> stop it for eff's sake
[18:06] <Oele> i'm sorry
[18:06] <JEEB> if you're talking about these things, BE SPECIFIC
[18:07] <JEEB> also since no formats have been specified for "HTML5" stuff
[18:07] <JEEB> you might as well use H.264 in matroska or something :P
[18:07] <Oele> i'm talking about the browsers, like IE, apple, etc. that decided not to support the open codecs like OGG, vorbis, etc.
[18:07] <JEEB> also I think they were trying to do something rtsp'ish
[18:07] <JEEB> uhh
[18:08] <JEEB> now I found Yet Another Thing To Pick On You With
[18:08] <JEEB> you say "open codecs"
[18:08] <Oele> i know they support h264 in mp4 container, maybe other stuff, that's why i was trying not to be too specific because i don't have clue what exactly they support
[18:08] <JEEB> now do tell me how on earth f.ex. H.264 isn't an "open codec"
[18:08] <JEEB> also ogg is not a codec but a container, (ogg )vorbis is a video format, yes
[18:09] <Oele> nope, vorbis is an audio codec ;)
[18:09] <JEEB> argh yes, that was miswriting, but no -- it's not a codec, it's an audio format
[18:09] <JEEB> libvorbis is the coder/decoder
[18:09] <JEEB> :P
[18:10] <Oele> pff :P
[18:10] <Oele> i know, you are right JEEB :)
[18:10] <flowolf> JEEB, do you stream stuff directly with ffmpeg?
[18:10] <JEEB> flowolf, there are some things you can stream straight with ffmpeg, but those are limited, in that way ffmpeg itself is much worse off than f.ex. VLC :P
[18:11] <JEEB> ffserver is pretty much black magic for most people, too
[18:11] <flowolf> because if you do and you have thousands of clients and you say that vlc can do the same I have to explore it
[18:11] <JEEB> and undocumented
[18:11] <JEEB> I'm pretty sure VLC is used to deliver content for thousands as well, but I use neither ffmpeg or VLC in such environments
[18:11] <Oele> anyway, the point i was trying to make is that as it currently stands, there are browsers that have chosen the ogg/vorbis/webm path for audio and video and browsers that have gone in the mpeg4/h264/mp4/whatever direction
[18:11] <JEEB> I'm one of those people writing encoders or decoders :P
[18:12] <JEEB> Oele, yes, but that doesn't make your claim that H.264 f.ex. is not an 'open format' any more correct
[18:12] <JEEB> actually H.264 is more open than VP8 for eff's sake
[18:12] <Oele> the first group could *probably* be served by webm via ffserver or icecast, even though these browsers do not *specify* themselves in their documentation that it is even possible to do live streaming
[18:13] <Oele> for the second group, i have *no idea* how to do live streaming without flash
[18:13] <Oele> or maybe hls for apple
[18:13] <JEEB> the iDevices have their own way, HLS, yes
[18:14] <flowolf> http://www.longtailvideo.com/html5/
[18:14] <JEEB> I have no idea about other stuff because there is not a single specified way of streaming things in "HTML5"
[18:14] <flowolf> Oele, it is just hell :D
[18:14] <Oele> i *think* that also works on safari on the mac, but i don't know. safari on windows hasn't been updated for ages
[18:14] <Oele> but IE ?
[18:14] <JEEB> if you really want to stream video, flash is still your nr1 thing
[18:14] <JEEB> unfortunately
[18:15] <JEEB> for mobiles there is HLS and rtsp
[18:15] <JEEB> so that is pretty much cleared up
[18:17] <flowolf> android has some limited support for hls
[18:17] <JEEB> yes, ever since 4.x or so, but I didn't note it because rtsp was supported all the time
[18:18] <JEEB> or well, "quite longer"
[18:18] <Oele> microsoft has its own hls/hds variant called smooth streaming.. maybe that works on IE
[18:19] <JEEB> yes, but that once again isn't IMHO usable "as-is" anyways
[18:19] <JEEB> it's done by using silverlight
[18:19] <JEEB> doesn't really make sense to use it if you can just grab a bigger amount of the user base by just using flash :P
[18:19] <Oele> meh :(
[18:19] <Oele> yeha
[18:19] <Oele> yeah
[18:21] <JEEB> also while Google had all the capability to create VP9 in a more open way, unfortunately they have chosen not to do that. :<
[18:21] <JEEB> you just have access to the source code, but the specification is completely done behind closed doors AFAIK
[18:21] <Oele> :(
[18:22] <Sashmo_> is there a way to use an external timestamp when encoding with ffmpeg?  I know I can ignore pts, but can I use another?
[18:22] <Oele> HLS for apple, RTSP for other mobiles and some Flash format for desktop is probably the way to go for flowolf
[18:22] <JEEB> that's one of the main reasons why I can't help laughing with a bad taste in my mouth when they call their stuff "open web standards" :<
[18:23] <JEEB> so far the only format that is being developed more openly than the ISO/IEC MPEG ones is "Daala"
[18:23] <JEEB> which is still years from fruition
[18:23] <JEEB> and only a code name
[18:23] <JEEB> s/code/working/
[18:25] <flowolf> Oele, what is the point of RTSP?
[18:26] <JEEB> it's a live streaming protocol that works for most androids?
[18:26] <Oele> RTSP is/was the media streaming protocol that 3GPP recommends
[18:26] <Oele> so it works on android
[18:26] <JEEB> and many other devices methinks
[18:26] <flowolf> oh, right
[18:26] <Oele> and most other phones.. including those old nokia/sony ericsson etc devices with proprietary OS'es
[18:28] <Oele> only apple decided to completely ignore it :-)
[18:29] <flowolf> this makes the setup harder
[18:29] <flowolf> nginx-rtmp can serve both rtmp and hls
[18:29] <flowolf> but I have no idea on how to stream rtsp too
[18:29] <Oele> rtsp is a nightmare too btw, it uses rtp over udp by default, which is blocked by many firewalls and some mobile operators that use NAT
[18:30] <Oele> VLC can do it :-)
[18:30] <flowolf> hurray
[18:30] <flowolf> I have to test vlc on big load
[18:30] <Oele> on its own or with apple's darwin rtsp server
[18:31] <Oele> (seems apple can't decide whether to support rtsp or not :P)
[18:31] <JEEB> also ffserver/ffmpeg combo should be able to deal with rtsp
[18:31] <JEEB> (good luck and have fun if you need to poke ffserver with it, though)
[18:32] <JEEB> I actually did rtsp with VLC the other day
[18:32] <JEEB> and it worked relatively OK
[18:32] <JEEB> naturally this was just local testing
[18:33] <Oele> you could also take a look at Helix for rtsp. never used it myself
[18:35] <flowolf> I see people using vlc streaming only in small environments
[18:35] <Oele> or, since the newer android supports HLS, only use HLS and flash and ignore older phones :)
[18:35] <JEEB> flowolf, I think there are a couple of IPTV channels at least that use it for a rather large amount of users
[18:36] <Oele> as a server JEEB ? or just to produce the rtp stream?
[18:36] <flowolf> Oele, http://www.longtailvideo.com/blog/31646/the-pain-of-live-streaming-on-android/ << it does support HLS but it is buggy
[18:36] <JEEB> yes, for android and friends it's just easier to use rtsp
[18:37] <Oele> RTSP was also broken in a few android releases. and many devices never get updated :(
[18:37] <JEEB> Oele, no idea -- but I just know that it is used to create the a/v streams as well as most probably the "source" protocol-wise packaging. Not sure if it's the thing on the most outernmost layer.
[18:38] <flowolf> I want to lie down and cry a lot
[18:39] <Oele> i have used vlc to produce a radio stream and darwin streaming server to relay it to hundreds of listeners. the amount of listeners didn't really matter but every now and then DSS would just stop relaying the stream for some strange reason
[18:40] <Oele> eventually i got so frustrated that i wrote my own rtsp server, which just relays aac streams from icecast. that one is working fine. unfortunately it doesn't support video ;)
[19:52] <burek> Oele, you should perhaps publish your source code to the dss developers, so they could see where they went wrong and fix it :)
[21:52] <brontosaurusrex> how would i write a test script to determine if the installed ffmpeg is able to perform all the tasks ahead?
[21:53] <brontosaurusrex> in this case : -af volumedetect and -vn -filter_complex ebur128
[22:09] <eric__> question about the mp4 format. I am using qt-faststart to get information about the containers. What is the difference between an ftyp of 24 bytes and 32 bytes?
[22:09] <eric__> http://pastebin.com/7zv9buzQ
[22:15] <eric__> any suggestions?
[22:32] <larrikin> any flvstreamer/rtmpdump ppl here ?
[23:10] <brontosaurusrex> 44100 samples would be 1 second in 44.1 kHz file, right?
[23:14] <larrikin> yes? I'm thinking it's probably two interleaved streams of 22050 that I'll arbitrarily refer to as 'left' and 'right' ...
[23:17] <brontosaurusrex> larrikin, can't be
[23:18] <brontosaurusrex> that would be 11khz, no?
[23:19] <dericed> is there a way to force the output of the file format detection score?
[23:21] <retard> larrikin: are you leafy?
[23:24] <larrikin> retard: no?
[23:25] <retard> oh
[00:00] --- Sat Apr 20 2013


More information about the Ffmpeg-devel-irc mailing list