[Ffmpeg-devel-irc] ffmpeg.log.20120219

burek burek021 at gmail.com
Mon Feb 20 02:05:02 CET 2012


[00:13] <hotwings> using debian testing here..  how do i build ffmpeg-dev from git?  i dont see any option for it in ./configure
[00:16] <burek> what exactly do you need it for
[00:20] <hotwings> building the VDR softhddevice plugin
[00:26] <matteowiz> hi all, anyone can help me with a cross compilation problem?
[00:36] <burek> hotwings, well, I'm not sure, but, if you have all the headers and you've built your ffmpeg then you have your -dev
[00:36] <burek> matteowiz,
[00:36] <burek> can you please use pastebin.com, to show your command line and its output?
[00:37] <matteowiz> the problem is related to x264
[00:37] <matteowiz> if i build it with --disable-asm
[00:37] <matteowiz> no problem.
[00:37] <matteowiz> but with asm enabled
[00:37] <matteowiz> i got:
[00:37] <matteowiz> i486-openwrt-linux-uclibc-gcc -D_ISOC99_SOURCE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -D_POSIX_C_SOURCE=200112 -D_XOPEN_SOURCE=600 -DPIC -O2 -pipe -march=i486 -fno-caller-saves -fhonour-copts -Wno-error=unused-but-set-variable -I/usr/src/trunk/bin/x86/OpenWrt-SDK-x86-for-Linux-i686-gcc-4.6.2_uClibc-0.9.33/staging_dir/target-i386_uClibc-0.9.33/usr/include -I/usr/src/trunk/bin/x86/OpenWrt-S
[00:37] <matteowiz> DK-x86-for-Linux-i686-gcc-4.6.2_uClibc-0.9.33/staging_dir/target-i386_uClibc-0.9.33/include -I/usr/src/trunk/bin/x86/OpenWrt-SDK-x86-for-Linux-i686-gcc-4.6.2_uClibc-0.9.33/staging_dir/toolchain-i386_gcc-4.6.2_uClibc-0.9.33/usr/include -I/usr/src/trunk/bin/x86/OpenWrt-SDK-x86-for-Linux-i686-gcc-4.6.2_uClibc-0.9.33/staging_dir/toolchain-i386_gcc-4.6.2_uClibc-0.9.33/include -fpic -fno-strict-aliasing
[00:37] <matteowiz> -std=c99 -fomit-frame-pointer -fPIC -pthread -E -o /tmp/ffconf.TfiqPYxq.o /tmp/ffconf.EiXMx4Ai.c
[00:37] <matteowiz> In file included from /tmp/ffconf.EiXMx4Ai.c:1:0:
[00:37] <matteowiz> check_func x264_encoder_encode -lx264
[00:37] <matteowiz> check_ld cc -lx264
[00:37] <matteowiz> check_cc
[00:37] <matteowiz> BEGIN /tmp/ffconf.EiXMx4Ai.c
[00:37] <matteowiz>     1   extern int x264_encoder_encode();
[00:37] <matteowiz>     2   int main(void){ x264_encoder_encode(); }
[00:37] <matteowiz> END /tmp/ffconf.EiXMx4Ai.c
[00:37] <matteowiz> i486-openwrt-linux-uclibc-gcc -D_ISOC99_SOURCE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -D_POSIX_C_SOURCE=200112 -D_XOPEN_SOURCE=600 -DPIC -O2 -pipe -march=i486 -fno-caller-saves -fhonour-copts -Wno-error=unused-but-set-variable -I/usr/src/trunk/bin/x86/OpenWrt-SDK-x86-for-Linux-i686-gcc-4.6.2_uClibc-0.9.33/staging_dir/target-i386_uClibc-0.9.33/usr/include -I/usr/src/trunk/bin/x86/OpenWrt-S
[00:37] <matteowiz> DK-x86-for-Linux-i686-gcc-4.6.2_uClibc-0.9.33/staging_dir/target-i386_uClibc-0.9.33/include -I/usr/src/trunk/bin/x86/OpenWrt-SDK-x86-for-Linux-i686-gcc-4.6.2_uClibc-0.9.33/staging_dir/toolchain-i386_gcc-4.6.2_uClibc-0.9.33/usr/include -I/usr/src/trunk/bin/x86/OpenWrt-SDK-x86-for-Linux-i686-gcc-4.6.2_uClibc-0.9.33/staging_dir/toolchain-i386_gcc-4.6.2_uClibc-0.9.33/include -fpic -fno-strict-aliasing
[00:37] <matteowiz> -std=c99 -fomit-frame-pointer -fPIC -pthread -c -o /tmp/ffconf.TfiqPYxq.o /tmp/ffconf.EiXMx4Ai.c
[00:37] <matteowiz> i486-openwrt-linux-uclibc-gcc -L/usr/src/trunk/bin/x86/OpenWrt-SDK-x86-for-Linux-i686-gcc-4.6.2_uClibc-0.9.33/staging_dir/target-i386_uClibc-0.9.33/usr/lib -L/usr/src/trunk/bin/x86/OpenWrt-SDK-x86-for-Linux-i686-gcc-4.6.2_uClibc-0.9.33/staging_dir/target-i386_uClibc-0.9.33/lib -L/usr/src/trunk/bin/x86/OpenWrt-SDK-x86-for-Linux-i686-gcc-4.6.2_uClibc-0.9.33/staging_dir/toolchain-i386_gcc-4.6.2_
[00:37] <matteowiz> uClibc-0.9.33/usr/lib -L/usr/src/trunk/bin/x86/OpenWrt-SDK-x86-for-Linux-i686-gcc-4.6.2_uClibc-0.9.33/staging_dir/toolchain-i386_gcc-4.6.2_uClibc-0.9.33/lib -Wl,--as-needed -o /tmp/ffconf.lAgnQI9K /tmp/ffconf.TfiqPYxq.o -lx264 -lmp3lame -lfaac -lm -pthread -lz
[00:37] <cbreak> oh god
[00:37] <cbreak> stupid spammers attack us!
[00:38] <matteowiz> sorry for long lines
[00:38] <cbreak> go use a pastebin
[00:39] <matteowiz> http://pastebin.com/QjB90cih
[00:42] <cbreak> and what's the problem?
[00:42] <cbreak> you can't link ffmpeg?
[00:42] <matteowiz> after that ffmpeg configure says libx264.so not found
[00:43] <matteowiz> because of several undefined reference to functions _sse|_mmx|_mmx2 etc
[00:43] <matteowiz> as u see in the pastebin
[00:43] <hotwings> burek, ok thanks
[00:44] <cbreak> matteowiz: check if the x264 headers you use match the library you try to link to
[00:44] <matteowiz> cbreak: yes cause i zeroed my toolchain several times
[00:45] <cbreak> and you don't have any other headers in the include path? (like /usr/include or so)
[00:45] <matteowiz> if i build in the same toolchain x264 with --disable-asm everything works fine
[00:45] <burek> matteowiz, did you build yasm
[00:45] <cbreak> hmm.
[00:45] <matteowiz> yasm installed and uptodate
[00:45] <cbreak> you build for some weird platform?
[00:45] <matteowiz> i build for openwrt-i486
[00:46] <cbreak> hmm...
[00:46] <cbreak> so check why the symbols are not in the library
[00:46] <matteowiz> the main problem is that with libx264 with disable-asm performance is horrible
[00:47] <matteowiz> creak i run nm libx264.so.120
[00:47] <matteowiz> and the symbols are there
[00:47] <burek> maybe they are not exactly "there"
[00:48] <burek> if the linkage type is not the same
[00:48] <burek> meaning int @@main is not the same as int __main
[00:48] <matteowiz> for example:
[00:48] <burek> although they both show as "main"
[00:48] <matteowiz> 0001e360 t x264_pixel_ads1
[00:48] <matteowiz>          U x264_pixel_ads1_avx
[00:48] <matteowiz>          U x264_pixel_ads1_mmx2
[00:48] <matteowiz>          U x264_pixel_ads1_sse2
[00:48] <matteowiz>          U x264_pixel_ads1_ssse3
[00:49] <cbreak> that means they are undefined.
[00:49] <cbreak> not in there
[00:49] <matteowiz> ah
[00:49] <matteowiz> great to know
[00:49] <cbreak> T would mean that they are in the text segment
[00:49] <matteowiz> why are U then?
[00:49] <cbreak> (code)
[00:49] <cbreak> it's an external dependency
[00:49] <cbreak> it didn't link to code that contained it
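cbreak's reading of the nm output can be checked mechanically; a small sketch (the sample lines are copied from the paste above) showing how nm's symbol-type column distinguishes defined from undefined symbols:

```shell
# nm prints one symbol per line; the one-letter type column says where the
# symbol lives: "T"/"t" = defined in the text (code) section, "U" = undefined,
# i.e. this object expects something else to provide it at link time.
# The sample below is copied from the paste above.
nm_output='0001e360 t x264_pixel_ads1
         U x264_pixel_ads1_mmx2
         U x264_pixel_ads1_sse2'

# Count the symbols this object still expects from elsewhere:
undefined=$(printf '%s\n' "$nm_output" | grep -c '^ *U ')
echo "undefined symbols: $undefined"
```

Against the real library this would be e.g. `nm libx264.so.120 | grep -c '^ *U '` (library name taken from the log).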
[00:50] <matteowiz> i use no stripping btw
[00:50] <cbreak> did you manage to create an x264 executable?
[00:50] <matteowiz> yes
[00:50] <cbreak> so ... that executable must have those symbols defined
[00:51] <cbreak> wonder from where.
[00:53] <matteowiz> mmm if i try to run x264 seg fault
[00:53] <matteowiz> :)
[00:55] <matteowiz> so is yasm messing up things here?
[00:56] <matteowiz> here is gdb core processing:
[00:56] <matteowiz> This GDB was configured as "i486-openwrt-linux".
[00:56] <matteowiz> (no debugging symbols found)
[00:56] <matteowiz> Core was generated by `x264'.
[00:56] <matteowiz> Program terminated with signal 11, Segmentation fault.
[00:56] <matteowiz> [New process 1720]
[00:56] <matteowiz> #0  0xb76b8857 in ?? ()
[00:57] <matteowiz> and here the run program:
[00:57] <matteowiz> This GDB was configured as "i486-openwrt-linux"...
[00:57] <matteowiz> (gdb) r
[00:57] <matteowiz> Starting program: /usr/bin/x264
[00:57] <matteowiz> Program received signal SIGSEGV, Segmentation fault.
[00:57] <matteowiz> 0xb77df857 in parse_enum ()
[00:57] <cbreak> stop spamming
[00:58] <cbreak> and check your x264 executable whether it has the symbols defined
[00:58] <matteowiz> sorry but i don't think it's spam
[01:01] <matteowiz> nm shows they are undefined even in the executable
[01:02] <cbreak> more than two lines in a single second is spam :)
[01:02] <cbreak> seems you never linked them
[01:03] <matteowiz> oh sorry for spam then :)
[01:03] <matteowiz> mmm
[01:04] <matteowiz> due to the complexity of openwrt buildroot
[01:04] <matteowiz> i modified the Makefile in x264
[01:05] <matteowiz> cause a trailing space was strangely missing, causing AR to miss the 'rc' flags
[01:05] <matteowiz> i think i better have to recheck LD flags
[01:07] <cbreak> maybe the makefile generation aborted prematurely?
[01:08] <matteowiz> no, it was saying something like: i486-openwrt-linux-uclibc-arlibx264.a  ... i486-openwrt-linux-uclibc-arlibx264.a: Command not found
[01:11] <matteowiz> and then : i486-openwrt-linux-uclibc-ld x264  x264.o input/input.o .... i486-openwrt-linux-uclibc-ld: cannot find x264: No such file or directory
[01:11] <matteowiz> ... bingo...
[01:11] <matteowiz> ld x264 with no flags prior to the x264?
[07:38] <^qop^> hi there, can I write tags to wav?
[08:54] <magn3ts> Can I ask a lot of possibly annoying questions? I hope they're not annoying because I've spent like two weeks trying to figure these things out on my own and I'm stumped.
[08:54] <magn3ts> How does one best transcode video and stream it "live"? I would like to be able to transmit duration information since it is known to the encoder, even though the transcoded video will be streamed live as soon as it has begun.
[08:55] <magn3ts> Is it possible to get any muxer to properly encode duration length at the beginning of the file when in streaming mode?
[12:36] <NonFish_> how come osd frame numbers are always skewed.. if i divide frame number by framerate of source, the time that results is not the same as the time mpchc shows
[12:40] <NonFish_> or, how are the frame numbers properly converted to a timecode in simple math?
[12:45] <JEEB> it's only simple in case of stuff like avisynth where there is no such thing as variable frame rate
[12:47] <NonFish_> the source file is xvid+mp3
[12:48] <NonFish_> nothing vfr
[12:48] <NonFish_> vbr mp3
[12:50] <JEEB> let's say you have 24000/1001fps content
[12:50] <JEEB> (framenum * 1001)/24000 , I would guess
[12:51] <NonFish_> ehh 1001 hm
[12:54] <NonFish_> is (framenum * 1001)/23976 right for 23.976fps
[12:55] <JEEB> 23.976 is 24000/1001
[12:55] <NonFish_> oh. lol sorry.
[13:18] <NonFish_> thanks tho it is still coming out wrong
[13:18] Action: NonFish_ sleeps
[13:22] <JEEB> well, what is "wrong"?
[13:22] <JEEB> it gives you seconds and something that's usable as milliseconds :P
[13:22] <JEEB> and the results in python look very much like what I get with ffmpegsource
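JEEB's formula, worked as a small sketch for constant-frame-rate material (the frame number is arbitrary); for VFR sources the container's per-frame timestamps are needed instead:

```shell
# Timecode for frame n of 24000/1001 fps (a.k.a. 23.976) content:
#   seconds = n * 1001 / 24000, i.e. milliseconds = n * 1001 / 24
frame=86448
ms=$(( frame * 1001 / 24 ))
ts=$(printf '%02d:%02d:%02d.%03d' \
    $(( ms / 3600000 )) $(( ms % 3600000 / 60000 )) \
    $(( ms % 60000 / 1000 )) $(( ms % 1000 )))
echo "$ts"    # -> 01:00:05.602
```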
[13:49] <Diogo> hi, one question please, first this is possible use ffmpeg for real time encoding, i need to build a file upload in php (send file via POST http) and the server when is getting the video start to encode..any tutorial on the web? like transloadit.com..
[13:49] <Diogo>  i can see a solution http://transloadit.com/demos/video-encode/encode-a-video-in-realtime
[13:49] <Diogo>  but i don't know how they do this..
[13:49] <Diogo> thanks for your help
[13:52] <Tjoppen> you somehow get the upload as a pipe, then send that to ffmpeg
[13:53] <Tjoppen> a CGI script could do it I think
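Tjoppen's pipe idea could be sketched as a CGI script; everything here (paths, codecs) is hypothetical, it assumes an ffmpeg built with libx264/libmp3lame, and a real upload form sends multipart/form-data that would need unwrapping before reaching ffmpeg:

```shell
#!/bin/sh
# Hypothetical CGI sketch: the HTTP POST body arrives on stdin, so it can be
# fed straight into ffmpeg, which starts encoding while the upload is still
# in progress. "pipe:0" tells ffmpeg to read its input from stdin; FLV is
# chosen because it can be written sequentially, without seeking back.
printf 'Content-Type: text/plain\r\n\r\nencoding started\n'
ffmpeg -i pipe:0 -vcodec libx264 -acodec libmp3lame \
       -f flv "/var/www/streams/upload-$$.flv"
```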
[14:59] <Medi> Hi, I want to set several http headers, I found out I can do it with "-headers" option in command line, my question is how to set several headers, Should I use several -headers option, or I should use , / | .... to split values ?
[15:09] <Medi> I want to set several http headers, I found out I can do it with "-headers" option in command line, my question is how to set several headers, Should I use several -headers option, or I should use , / | .... to split values ?
[15:09] <Medi> I will be grateful for you help
[15:09] <Medi> please
[15:17] <Medi> is there anyone who can help me ?
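For what it's worth, ffmpeg's HTTP `-headers` option takes a single value with the individual headers separated (and terminated) by CRLF, rather than several `-headers` options; the header names and URL below are made up for illustration:

```shell
# Build one CRLF-separated string holding all custom headers. Note that
# command substitution strips trailing newlines, which is why ffmpeg warns
# about a missing trailing CRLF if the value doesn't end with \r\n.
hdrs=$(printf 'X-Token: abc123\r\nReferer: http://example.com/\r\n')
# The real invocation would then look something like:
#   ffmpeg -headers "$hdrs" -i http://example.com/stream.ts out.mp4
printf '%s\n' "$hdrs"
```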
[18:51] <MindSpark> hi, I am going crazy over this. Can someone please tell me how to convert rtmp to rtsp using ffserver?
[18:55] <maujhsn> Can someone show me how to incorporate this command:  -timestamp  now|([(YYYY-MM-DD|YYYYMMDD)[T|t| ]]((HH[:MM[:SS[.m...]]])|(HH[MM[SS[.m...]]]))[Z|z])
[18:55] <maujhsn> Looking for a practical example!
[19:11] <maujhsn> Can someone show me how to incorporate this command:  -timestamp  now|([(YYYY-MM-DD|YYYYMMDD)[T|t| ]]((HH[:MM[:SS[.m...]]])|(HH[MM[SS[.m...]]]))[Z|z])
[19:16] <sacarasc> -timestamp now
[19:16] <sacarasc> There you go. :p
[19:19] <maujhsn> sacarasc "ffmpeg -i input.mpg -timestamp now"?
[19:19] <sacarasc> You'd need an output file, too.
[19:19] <sacarasc> But that's what it seems to do.
[19:22] <maujhsn> sacarasc I will give it a shot.
[19:26] <q|o|p> with all honesty... how well does the faac aac encoder compare to the nero encoder? I'd rather convert all my files to oga but I've got a free ipod so...
[19:28] <q|o|p> btw, ffmpeg -i "$file" -acodec libvorbis -aq 9 #should output similar quality to oggenc -q 9 right?
[19:28] <maujhsn> sacarasc When I run the the command: ffmpeg -i cab_A.mpg -timestamp now cab_B.mpg" I see no difference in the output. I also see this yellow tag: "timestamp is deprecated, set the 'creation_time' metadata tag instead."
[19:43] <quebre> hello
[19:43] <quebre> how can i effectively convert the mp4 avc720p to avi ?
[19:43] <q|o|p> olleh
[19:43] <quebre> -vcodec libx264 -preset slow -crf 18 -acodec libfaac -ac 2 -ab 128k
[19:44] <quebre> i use this
[19:44] <quebre> ;_)
[19:44] <Mavrik> quebre, you'll have to find out what "avi" do you want
[19:44] <q|o|p> I wouldnt
[19:44] <quebre> xvid
[19:44] <quebre> i want the same or similar DVDRip-XviD
[19:44] <quebre> can i get some syntax for ffmpeg
[19:44] <q|o|p> he just want to change the container
[19:44] <quebre> maybe some of you have it
[19:44] <quebre> but what i use now
[19:44] <quebre> is bad quality
[19:44] <quebre> and im pretty sure i did it in the lame way
[19:44] <quebre> ;/
[19:44] <Mavrik> you're not even creating xvid with that.
[19:45] <quebre> yeah, i noticed
[19:45] <Mavrik> crf 18 should be awesome quality actually
[19:45] <quebre> thats why im here ;p
[19:45] Action: q|o|p would use mkv instead
[19:45] <quebre> i can't
[19:46] <Mavrik> first of all you need to make sure your ffmpeg is compiled with libxvid
[19:46] <Mavrik> (ffmpeg -codecs)
[19:46] <quebre> i am converting it so i can run it on old laptop IBM Thinkpad T20
[19:46] <quebre> thats why i wnt to make it run smoothly
[19:46] <quebre> on older computers
[19:46] <quebre> so i thought decreasing quality would improve performance..
[19:46] <quebre> but..
[19:47] <q|o|p> http://ffmpeg.org/faq.html#How-do-I-encode-Xvid-or-DivX-video-with-ffmpeg_003f
[19:47] <quebre> you guys know better so im open for advices
[19:47] <q|o|p> read the two contiguous ones
[19:47] <quebre> i need the one-liner ;f
[19:47] <quebre> coz im very tired...
[19:47] <q|o|p> it is one line
[19:47] <quebre> but not link ;/
[19:47] <quebre> ffmpeg [options]
[19:47] <quebre> like this, if you would be so kind..
[19:47] <q|o|p> there is lazy... and there is  fking lazy
[19:47] <quebre> no, im just tired
[19:47] <quebre> pretty please ;/
[19:48] <q|o|p> click it! it is one line
[19:48] <quebre> i need it to finish the work
[19:48] <Mavrik> first hit on google "ffmpeg encode xvid" gives you the full command line
[19:48] <Mavrik> stop wasting our time.
[19:48] <quebre> did the pharse earlier
[19:48] <quebre> on google
[19:48] <quebre> but it jumped out with nonsense
[19:48] <quebre> like:
[19:49] <q|o|p> whatever... so... how well does faac compare to neroaccEnc?
[19:49] <q|o|p> I cant find any good comparisons beyond perceived
[19:49] <hi117> faac's docs do say its not as good as some
[19:51] <q|o|p> quebre: if you were to at least check the FAQ before anything, you could have read "ffmpeg -i input.mp4 -same_quant output.avi" was your answer, with less effort than bringing it up in irc
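Spelling out the two recipes quebre was pointed at, as a hedged sketch (the -qscale value is a guess; lower means better quality, roughly 2..5 is the usable range):

```shell
# Keep roughly the source quality while changing the container, as quoted
# from the FAQ above:
#   ffmpeg -i input.mp4 -same_quant output.avi
# Explicit Xvid + MP3 re-encode, which plain old DVDRip-style players decode:
ffmpeg -i input.mp4 -vcodec libxvid -qscale 4 -acodec libmp3lame -ab 128k output.avi
```

This assumes an ffmpeg compiled with libxvid and libmp3lame (check with `ffmpeg -codecs`, as Mavrik said).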
[19:51] <q|o|p> hi117: yeah... but how good is not as good lol
[19:52] <hi117> the standard ogg encoder my distro comes with did better than faac at same bitrate IMO
[19:52] <q|o|p> ogg?
[19:52] <hi117> vorbis
[19:52] <q|o|p> I am talking of aac
[19:53] <hi117> i know but from what i hear aac>vorbis in terms of quality at same bitrate
[19:53] <q|o|p> I thought fmpeg -i "$file" -acodec libvorbis -aq 9 #should output similar quality to oggenc -q 9
[19:53] <q|o|p> hi117: the study I saw said that was true at low level bitrates
[19:53] <q|o|p> and had ogg as the winner in higher bitrates
[19:54] <hi117> heh that might be why since i was doing a 300kb/s+ encode
[19:54] <hi117> it was a -q 10
[19:54] <q|o|p> -q 10 is 500 k
[19:54] <Mavrik> q|o|p, neroaacenc should be noticably better
[19:55] <hi117> the input was crap though so it only needed 300
[19:56] <q|o|p> Mavrik: then there is a problem, to convert using neroaacenc I'd have to convert to wav before and I'd lose the tags in the process >.>
[19:56] <hi117> convert to flac? doesnt that keep tags?
[19:56] <Mavrik> q|o|p, yep
[19:56] <Mavrik> q|o|p, that's why we just use libfaac anyway
[19:56] <q|o|p> yeah but neroaacenc only accepts wav as input
[19:56] <hi117> ah
[19:57] <q|o|p> meh, I'll use aac
[19:57] <q|o|p> err ogg
[19:57] <Mavrik> quality gain isn't worth d*cking around with wav conversion and handing off to neroaacenc
[19:57] <q|o|p> :)
[19:57] <q|o|p> yeah
[19:57] <cbreak> FLAC ftw.
[19:58] <q|o|p> yeah except I only have 8 gbs of space, and 800 gbs of mostly flacs dont fit in :p
[19:58] <Mavrik> 3TB drives are like 180€ :P
[19:58] <cbreak> 3tb hard disk: 200$
[19:59] <q|o|p> yeah, do you know any 3TB cellphone?
[20:00] <cbreak> it's the cloud!
[20:00] <cbreak> or so I was told
[20:01] <q|o|p> oh yeah, not in the US, no googleMusic and wireless internet is stupidly expensive here
[20:01] <Mavrik> just throw 128k mp3s on the phone ffs
[20:01] <q|o|p> never!!!!!!!!!!!!!!!!!!!
[20:01] <cbreak> 128k aac would be better
[20:02] <Mavrik> running around city with traffic noise you'll never hear the difference between 128k and 320k anything
[20:02] <cbreak> although, with the DAs in a mobile phone
[20:02] <cbreak> you'd probably not hear the difference
[20:02] <q|o|p> 256 aac min
[20:02] <Mavrik> so you're just causing yourself problems with complications
[20:02] <q|o|p> well I am not a gifted listener but 128 is crappy enough for me to notice it :)
[20:03] <Mavrik> 160 VBR then
[20:03] <q|o|p> 192
[20:03] <q|o|p> :)
[20:03] <Mavrik> wasting space on a device when it's gonna be destroyed by ambient noise is just silly :P
[20:04] <q|o|p> I have noise blocking headphones!
[20:04] <cbreak> just change your music style to industrial techno
[20:04] <cbreak> then the ambient noise would be an added benefit
[20:05] <Mavrik> q|o|p, so now you even have headphones which add additional noise
[20:05] <q|o|p> http://www.bjs.com/webapp/wcs/stores/servlet/ProductDisplay?catalogId=10201&storeId=10201&partNumber=P_138743126&sc_cid=DF&ci_src=14110944&ci_sku=138743126
[20:05] <Mavrik> cbreak, ^^
[20:05] <q|o|p> lol
[20:06] <q|o|p> lol gtg :)
[20:21] <maujhsn> Trying to learn how to use the avfilter in ffmpeg! This page gives one practical corny example "http://ffmpeg.org/libavfilter.html" does anybody know of other url's that are more user friendly?
[20:25] <mapreduce> In http://dranger.com/ffmpeg/tutorial08.html I see static struct SwsContext *img_convert_ctx;  I believe swscale isn't thread-safe, what's the minimum I could do to make this thread safe?
[20:25] <mapreduce> I tried using gcc's __thread, but got whole application freezes that I couldn't debug, and logging didn't show that I was creating lots of threads.
[20:26] <mapreduce> Less than 10.
[20:38] <cbreak> if thread safety were easy, everyone would do it.
[20:39] <cbreak> in general, if you want something to be thread safe, you have to ensure that either each thread acts in completely different data
[20:39] <cbreak> or that access is serialized, and correct
[20:40] <Mavrik> mapreduce, keeping separate SwsContexts with their own datasets SHOULD be enough
[20:40] <Mavrik> (for each thread)
[20:41] <cbreak> unless there's global state
[20:41] <cbreak> (like non-const function statics, or globals, or static members if this were c++ which it isn't :)
[20:42] <Mavrik> yeah, but IIRC there isn't
[20:42] <Mavrik> (but it was awhile ago since I used swscale)
[20:42] <kriegerod> offtop: does somebody know an existing technical approach to do live, realtime online video translation of airsoft game?
[20:43] <kriegerod> (with points of views of players)
[20:43] <cbreak> airsoft?
[20:43] <cbreak> translation?
[20:43] <kriegerod> sorry?
[20:44] <Mavrik> kriegerod, it's expensive and hard
[20:44] <Mavrik> and you need video professionals
[20:50] <NonFish_> is airsoft a videogame or a realworld offline game with airguns?
[20:54] <NonFish_> well, if a videogame, and this is totally a proprietary, windows only method, kriegerod.. but you could download xsplit and stream the screen to a www.justin.tv account.
[20:57] <kriegerod> NonFish_, nope, it's real-world game
[21:00] <NonFish_> what did you mean by translation?
[21:01] <cbreak> translating languages probably
[21:01] <cbreak> dubbing?
[21:20] <kriegerod> broadcasting
[21:20] <kriegerod> "translation" word in russian means sth close to "broadcasting", thus i mis-used this word
[21:21] <NonFish_> oh good. realtime language translation is hard.
[21:22] <vadim> hi guys ))
[21:23] <vadim> i may be in role of "real time translator" for a while )))
[21:23] <vadim> english-russian-english
[21:29] <NonFish_> broadcasting is easier. xsplit will let you use any attached camera as a video source. maybe you can find a wireless webcam and mount it to player helmets, kriegerod.
[21:30] <cbreak> vlc does broadcasting
[21:30] <cbreak> out of the box
[21:33] <NonFish_> yes though, never quite understood how to access the output over the web.. but maybe that was many versions ago
[21:35] <kriegerod> NonFish_: most used option for accessing over the web is publishing to Wowza, as i know
[21:35] <kriegerod> wowza gives rtmp, rtsp, apple hls
[21:36] <cbreak> vlc gives that and more
[21:36] <kriegerod> vlc can _serve_ rtmp?
[21:37] <cbreak> yes.
[21:37] <kriegerod> wow, since when?
[21:37] <cbreak> and http live streaming
[21:37] <cbreak> and a bunch of other stuff
[21:37] <cbreak> no idea.
[21:37] <cbreak> http live streaming since 1.12 or so
[21:37] <cbreak> I use that at work
[21:37] <JEEB> I've done http streaming with mpeg-ts as the container since 0.8.6 or so
[21:37] <JEEB> lol
[21:38] <cbreak> what kind of http streaming?
[21:38] <cbreak> http live streaming is a specific protocol
[21:38] <JEEB> mpeg-ts pushed into a http port
[21:38] <cbreak> that's not it.
[21:38] <JEEB> yes, you're talking about HLS
[21:38] <JEEB> :3
[21:38] <JEEB> so yes, we are talking of different things
[21:39] <JEEB> although both in the end boil down to a very similar concept of mpeg-ts over http
[21:39] <JEEB> IIRC
[21:39] <cbreak> http live streaming is a pull protocol
[21:39] <cbreak> (of course it is, it's based on http...)
[21:39] <kriegerod> cbreak, sorry i'm just so surprised. Please tell me again, you launch VLC and it receives connections from clients, and gives them media by HLS? Just as far as i know HLS even client-side support is not available in vlc 1.1 at all
[21:39] <cbreak> it uses chunked .ts files
[21:40] <cbreak> and apparently the splitting was new in 1.12
[21:40] <JEEB> yeah
[21:40] <NonFish_> one thing is though, justin.tv or wowza type services save you bandwidth with many viewers.
[21:40] <cbreak> kriegerod: no, I write the output into some directory, and apache (or an other web server) serves it
[21:40] <mapreduce> Mavrik: I hoped the same, but the mechanism I tried to use to keep separate SwsContexts, gcc's __thread, caused freezes.
[21:40] <kriegerod> cbreak, so vlc just segments it, right?
[21:40] <cbreak> in the mode of operation I use it yes
[21:41] <cbreak> but if you do rtp/udp/udpbroadcast or a bunch of others it actually does stream
[21:41] <kriegerod> cbreak, returning to your previous reply, are you confident VLC is usable as rtmp server?
[21:41] <JEEB> this all reminds me of the hacks I did to stream stuff from my receiver in Japan to my place
[21:41] <JEEB> lol
[21:41] <cbreak> kriegerod: I know it can do it
[21:42] <cbreak> I don't know if it's worth anything or only supports a single digit number of clients
[21:42] <JEEB> UDP from the receiver to VLC on localhost, then VLC streaming out mpeg-ts into http
[21:42] <JEEB> lol
[21:44] <cbreak> http://news.slashdot.org/story/12/02/19/0127204/vlc-20-twoflower-released-for-windows-mac
[21:44] <cbreak> kriegerod: do you have a VLC available to you?
[21:45] <NonFish_> does streaming with vlc require viewers to have vlc too?
[21:45] <cbreak> depends
[21:45] <JEEB> there is no single way of streaming in VLC
[21:45] <cbreak> I Stream to iPads without VLC
[21:45] <JEEB> there are multiple
[21:45] <cbreak> (with http live streaming)
[21:46] <cbreak> the whole point of using httpls is to be able to stream to them in my case :)
[21:46] <kriegerod> cbreak: sorry?
[21:47] <cbreak> kriegerod: can you start VLC now?
[21:48] <cbreak> if you can, you can try out the streaming :)
[22:33] <mape> I'm dumping images from a video and was wondering if there is any way of defining what %d in img-%d.jpg starts on. I have multiple videos of one timestamp  and want to pick up where the previous left off
[22:36] <mape> sorry, multiple videos but one "moment", so they are cut up but I can't merge them as new ones keep coming in
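For mape's question: the image2 muxer in newer ffmpeg builds has a `-start_number` option (an assumption for the build in question; if it is missing, renaming the files afterwards is the fallback), so each clip's dump can continue where the previous one stopped. Filenames and numbers below are made up:

```shell
# Dump frames from the second clip, numbering from 1500 so the sequence
# continues after the 1499 images produced from the first clip.
ffmpeg -i part2.mp4 -f image2 -start_number 1500 img-%d.jpg
```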
[00:00] --- Mon Feb 20 2012