[Ffmpeg-devel-irc] ffmpeg.log.20130311

burek burek021 at gmail.com
Tue Mar 12 02:05:01 CET 2013


[00:23] <elxa> hi
[00:24] <elxa> is there a way to check if anybody is working on mvc h.264 http://en.wikipedia.org/wiki/Multiview_Video_Coding ?
[00:34] <burek> maybe check our git online
[00:56] <stunts513> can anyone help me with a compilation error
[00:56] <TheSashmo> does anyone know if there is an application that can do HLS or RTMP stream checking? for example video pixelation and black screen, low/high audio, etc?
[00:58] <TheSashmo> does anyone know how to force h.264 to be almost CBR?  I know it can't be 100%, but I'm not able to get it closer than a 400kb variance
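For near-constant-bitrate H.264, the usual approach is a target bitrate with matching VBV settings and, in a container that can carry filler data such as MPEG-TS, x264's nal-hrd=cbr. A minimal sketch, with the 2000k figures purely illustrative:
    # illustrative rates; keeping maxrate and bufsize equal to the target tightens the VBV
    ffmpeg -i input.ts -c:v libx264 -b:v 2000k -maxrate 2000k -bufsize 2000k \
           -x264opts nal-hrd=cbr -c:a copy -f mpegts output.ts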
[02:01] <p4plus2> Would anybody have a guess why "ffmpeg -an -f x11grab -r 30 -s 1920x1080 -i :0.0 -vcodec libx264 -crf 0 -preset ultrafast -threads 8 ~/recording/pre_video_1.mp4" would only result in 11FPS when recording?
[02:01] <p4plus2> I used to use this fine and could take my framerates as high as 60 perfectly fine.  I used to be able to even get away with the medium preset at 30FPS easily.
[02:01] <p4plus2> ffmpeg also only ever caps out at about 15% CPU for the process and about 19% for the total system.  So the CPU isn't the bottleneck.
[02:01] <p4plus2> Additionally, I tried recording to a tmpfs in RAM to test if suddenly the HD was my bottleneck -- again not the case.
[02:02] <p4plus2> Any guesses why suddenly ffmpeg is so slow would be great.
[02:15] <p4plus2> okay as an update if I remove "-r 30" my framerates start at 80 then stabilize at 30 after a minute or so
[02:16] <p4plus2> dup=xxxx is going up quite a lot though, but no dropped frames.
[02:17] <p4plus2> Now I get the desired framerate but the video feels laggy.
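A workaround often suggested for captures that feel laggy (not necessarily the cause here) is to split the job in two: keep the live pass as cheap as possible and defer the heavy encode. Lossless x264 in MP4 also ends up as High 4:4:4, which many players handle poorly; a sketch with illustrative file names:
    # pass 1: cheap lossless capture (frame size/rate given as input options for x11grab)
    ffmpeg -f x11grab -r 30 -s 1920x1080 -i :0.0 -an -c:v libx264 -preset ultrafast -qp 0 capture.mkv
    # pass 2: offline re-encode to a widely playable 4:2:0 file
    ffmpeg -i capture.mkv -c:v libx264 -preset medium -crf 18 -pix_fmt yuv420p recording.mp4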
[02:41] <stunts513> can someone help me with a compilation error?
[02:42] <klaxa> maybe
[02:42] <klaxa> put logs on pastebin or something similar and paste the link in here
[02:42] <stunts513> ok
[02:43] <stunts513> http://pastebin.com/LPSGN6Mf
[02:44] <stunts513> for the record im cross compiling to ios 4.2
[02:45] <stunts513> which is just arm architecture
[02:46] <klaxa> hmm... no idea actually... given that there is an error with an already defined variable, maybe someone just didn't use pre-processor flags correctly :V
[02:46] <stunts513> i will admit right now i'm brand new to cross compiling
[02:47] <stunts513> i used a script someone else made for cross compiling from Mac OS X and modified the directories for my toolchain with the gas preprocessor, but i don't know much about this
[02:47] <klaxa> i've only done it for mxplayer on android and that was rather simple, everything was provided
[02:49] <stunts513> so is this an error with how i'm compiling it or something in the actual code?
[02:50] <klaxa> i would think it's with the coding
[02:50] <stunts513> ah great...
[02:50] <klaxa> and tbh i'd just look for that line and remove the definition of the variable that has been defined twice
[02:52] <stunts513> yea i tried doing that, sort of. i'm not a coder, but i can sometimes debug things a little bit. when i went to the line referenced i didn't see anything that looked like what i would expect a definition to look like, but further down from that line i saw the function that it said was being defined again
[02:52] <stunts513> considering that i know people have compiled this for ios before, i might just try getting an older version of the source code and building it to see what happens
[02:53] <stunts513> im just compiling it for 3 of its libraries so i can compile libdlna and ushare
[02:53] <klaxa> are you... sure? i can't read arm assembler at all
[02:53] <klaxa> or rather i don't really know what is going on because i never learned it
[02:54] <klaxa> ah wait this looks like something...
[02:55] <klaxa> did you pull the latest git source?
[02:55] <stunts513> yes
[02:56] <stunts513> well, actually, unless it was updated last night; i started doing this yesterday
[02:57] <stunts513> tried to compile it natively on my ipod but it keeps crashing, i think from low ram, and rebooting when it gets to the allcodecs.c file i think, so i gave up on that and started cross compiling
[02:58] <klaxa> okay i don't even, maybe wait for some expert on arm assembler or write to the mailing list :S
[02:58] <stunts513> yea... i might have to, never used a mailing list before
[03:01] <stunts513> ok, well, thanks for the help, i'm going to try an older release
[03:51] <ubitux> buu: ok got the issue about the build
[03:51] <ubitux> temporarily you can --enable-gpl
[03:56] <ubitux> you can git pull for a fix
[09:00] <dmonjo> can i use ffmpeg on a stream file?
[09:21] <ItsMeLenny> is there a way to split a video into 5 second increments?
[09:23] <zap0> yes.
[09:23] <ItsMeLenny> lol, i probably shouldve asked what command would do that for me
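One way to do this is the segment muxer; a minimal sketch assuming an input.mp4 (with -c copy the cut points snap to keyframes, so pieces are only approximately 5 seconds; re-encoding instead gives exact cuts):
    # split into ~5 second pieces without re-encoding
    ffmpeg -i input.mp4 -c copy -f segment -segment_time 5 -reset_timestamps 1 part_%03d.mp4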
[09:24] <dmonjo> can i use ffmpeg to convert a file stream generated in icecast into a different format?
[09:25] <ItsMeLenny> icecast can do ogg and mp3, what other format did you want
[09:25] <dmonjo> mp4
[09:25] <ItsMeLenny> you won't be able to restream it
[09:25] <ItsMeLenny> you can't convert it to mp4 then push it back to icecast
[09:25] <dmonjo> ItsMeLenny:  i want to send only 1 source to icecast as ogg and convert it on the server to mp4
[09:25] <dmonjo> instead of sending 2 source streams in to icecast
[09:26] <ItsMeLenny> icecast can't broadcast mp4, as far as i'm aware
[09:26] <dmonjo> ItsMeLenny: it will not be repushed to icecast, ogg convert to mp4 mp4 goes into webserver
[09:27] <ItsMeLenny> how are you broadcasting it?
[09:28] <dmonjo> i am using gstreamer to send source
[09:28] <dmonjo> to icecast
[09:28] <dmonjo> ogg stored in icecast on port 8000
[09:28] <dmonjo> i want to use ffmpeg to connect to that stream and convert it to an mp4
[09:28] <ItsMeLenny> but icecast is what broadcasts
[09:28] <ItsMeLenny> so you just want to capture the stream as mp4?
[09:28] <dmonjo> ItsMeLenny: i am broadcasting an ogg
[09:28] <dmonjo> yes
[09:29] <dmonjo> i don't want to send a second source stream
[09:29] <dmonjo> i only want to send that ogg source stream and have most of the work done on the server
[09:29] <ItsMeLenny> ah
[09:29] <ItsMeLenny> my first thought is vlc can capture
[09:29] <dmonjo> ?
[09:31] <dmonjo> i want to do something like wget <ogg stream> | ffmpeg -i pipe:0 -c:v libx264 .........
[09:31] <dmonjo> is that possible?
[09:38] <ItsMeLenny> dmonjo, im not sure
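ffmpeg can open an HTTP URL directly, so the wget pipe is not needed; a minimal sketch with a hypothetical mount name, writing a plain MP4 (which only becomes playable once ffmpeg is stopped cleanly, since the index is written at the end):
    # press q to stop; -strict experimental enables the native AAC encoder on older builds
    ffmpeg -i http://localhost:8000/stream.ogv -c:v libx264 -preset veryfast \
           -c:a aac -strict experimental -b:a 128k capture.mp4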
[11:14] <sovereign313> OK!  So, I've downloaded the git repository of ffmpeg (with all of its dependencies) [git-2013-03-10-fb14e37], and I've also tried changing all of the JPGs to the same WxH with imagemagick, to generate an mp4 from a sequence of images.  It generates the mp4 with no issues, in a reasonable amount of time given the number of JPEGs and the wait duration between each frame.
[11:14] <sovereign313> However, in vlc and xbmc (the only two players I've tried) it still only shows the first image for the entire duration of the video.
[11:14] <sovereign313> here is a pastebin of the command and output: http://pastebin.com/xM8Ei06n
[11:14] <sovereign313> any ideas why it's showing only the first image?
[11:18] <Mavrik> hmm
[11:18] <Mavrik> does any player play the video correctly?
[11:18] <Mavrik> like ffplay?
[11:20] <sovereign313> lemme try
[11:24] <sovereign313> unfortunately, it does not.
[11:25] <Mavrik> hmm
[11:26] <Mavrik> first of all, "yuvj420" is a rather strange format for h.264 video, pass "-pix_fmt yuv420" next time
[11:26] <Mavrik> but that wouldn't cause the problem you're having
[11:26] <Mavrik> I think the catch is in how your filenames are parsed
[11:26] <sovereign313> http://pastebin.com/Lnhvqc0F
[11:26] <sovereign313> ok
[11:26] <sovereign313> I'll rename them quick
[11:27] <sovereign313> image000x?
[11:30] <sovereign313> hmmm, I get a no such pixel format yuv420
[11:30] <Mavrik> it's yuv420p, sorry :)
[11:30] <Mavrik> sovereign313, also your images have to be in sequence
[11:30] <Mavrik> so
[11:31] <Mavrik> Big_6898469_0001.jpg, Big_6898469_0002.jpg, ...
[11:31] <sovereign313> k
[11:31] <sovereign313> they are already
[11:31] <sovereign313> from 0001.jpg to 0291.jpg
[11:31] <sovereign313> (Big_etc)
[11:33] <sovereign313> it's converting again with yuv420p
[12:06] <sovereign313> I got it!  My dumb @$$ didn't realize that mogrify somehow made all of my images the same image (no doubt, something I fubar'd)
[12:06] <sovereign313> Thanks again for all your help Mavrik.
[12:07] <Mavrik> does it work now? :)
[12:07] <sovereign313> switching to the git revision, and using -pix_fmt, did the trick
[12:07] <sovereign313> it sure does.
[12:08] <sovereign313> the second issue was mogrify had made all my images the same (the first one)
[12:08] <Mavrik> ok, keep your output video in yuv420p if you want player compatibility :)
[12:08] <sovereign313> :(
[12:08] <sovereign313> absolutely!  It's a wedding video I'm making for my wife (slideshow of wedding images) to make a Roku channel for her
[12:09] <sovereign313> next will be adding music :)
[12:09] <sovereign313> you guys rock
[12:09] <sovereign313> thanks again
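For reference, the numbered sequence discussed above can be driven like this; a minimal sketch using the pattern name from the chat and illustrative rates (here each image is held for about 5 seconds):
    # -r before -i sets how long each source image is shown (1/5 fps = 5 s per image);
    # -r 25 on the output and yuv420p keep the result playable in common players
    ffmpeg -r 1/5 -i Big_6898469_%04d.jpg -c:v libx264 -r 25 -pix_fmt yuv420p slideshow.mp4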
[13:30] <dmonjo> how can i convert a stream from an ogg (http source) to an mp4 output stream?
[13:31] <dmonjo> not files**
[13:31] <dmonjo> streams!
[14:28] <dmonjo> gst-launch-0.10 filesrc location='input.wmv' ! decodebin ! x264enc ! rtph264pay ! rtpmux ! udpsink
[14:28] <dmonjo> will send the file to a rtsp server?
[14:28] <dmonjo> where do i specify the address of the streamer?
[14:29] <dmonjo> can i use tcpserversrc and tcp clientsink
[14:29] <dmonjo> ?
[14:31] <held> hi, i'm trying to limit cpu usage by reducing the number threads used by x264: "ffmpeg -threads 1 -i input -c:v libx264 -x264opts threads=1 ..." yet libx264 reports "threads=6" and consumes 100% cpu. any hints?
[14:32] <zap0> what does 100% CPU mean?     and why is that number a concern?
[14:32] <zap0> what are CPU's for but to use them?
[14:32] <dmonjo> guys urgent:
[14:32] <dmonjo> can i use FFserver to take an HTTP source as its input? for example an icecast stream
[14:33] <dmonjo> i wouldn't say input file
[14:33] <dmonjo> it's an input stream
[14:33] <dmonjo> is that possible?
[14:34] <held> zap0: it means it uses all my cores, and it's a concern because i have other services running on the server that i don't want affected.
[14:35] <held> the encoding itself is not time critical, so i'd rather have ffmpeg/x264 only use 1 core
[14:37] <zap0> why are you trying to micromanage cores?   OS's do that for you.
[14:37] <zap0> your concern and solution are unrelated.
[14:38] <held> whatever my concern may be, the solution is not working :)
[14:39] <zap0> why do you care that your CPU meter (which runs way slower than the things it is attempting to monitor) says 100%?
[14:39] <held> i couldn't find any docs saying that "-x264opts threads=x" should not work, yet it seems it doesn't.
[14:40] <held> because i know what other workloads will happen on the server, and those have priority over encoding.
[14:42] <zap0> ok.  so what has that got to do with the number of threads?
[14:43] <zap0> you are trying to micromanage 4 or 6 threads, as if that's a concern.  how many threads are currently running in your system right now?  are you not concerned about those?
[14:43] <held> from the docs: "Enables parallel encoding by using more than 1 thread to increase speed on multi-core systems."
[14:44] <held> i do want to disable parallel encoding so that only 1 core can be used.
[14:45] <zap0> ffmpeg does not have core control.  only a thread count.
[14:45] <held> that's what i'm trying to pass to x264, whose docs say the above...
[14:45] <zap0> my system says i already have 100 threads running.  i bet yours does too... are you not equally concerned about that?
[14:46] <zap0> what is 6 more, when you already have 100 running?
[14:46] <held> no, because 99% of those are idling.
[14:48] <held> so let's ignore threads and the failure to set them through x264opts for now; what can i do to have encoding use only 1/4 of my resources, i.e. 1 core?
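Two things commonly tried here, sketched with placeholder file names: give -threads 1 as an output option (options placed before -i only apply to the input side), or pin the whole process to a single core with taskset on Linux, which caps usage no matter how many threads x264 spawns:
    # encoder thread count given as an output option
    ffmpeg -i input.mp4 -c:v libx264 -threads 1 output.mp4
    # or pin the whole ffmpeg process to CPU 0
    taskset -c 0 ffmpeg -i input.mp4 -c:v libx264 output.mp4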
[14:51] <dmonjo> what is wrong with this command ? ffmpeg -i event1.ogv -c:v libx264 -crf 24 -preset slow -c:a libfdk_aac -afterburner 1 -s 320x240 -ar 44100 -b:a 128k  -f ffm http://127.0.0.1:8090
[14:51] <dmonjo> http://127.0.0.1:8090: Invalid data found when processing input
[14:51] <dmonjo> ffserver is running on port 8090
[14:51] <dmonjo> :/
[15:05] <dmonjo> i need assistance, been stuck for hours on this....
[15:05] <dmonjo> ffmpeg -i http://localhost:8000/event1.ogv -crf 24 -preset slow -c:a libfdk_aac -afterburner 1 -s 320x240 -ar 44100 -b:a 128k  -f ffm http://127.0.0.1:8090/feed1.ff
[15:05] <dmonjo> what is wrong ?!
[15:07] <zap0> what makes you think it ought to work?
[15:08] <dmonjo> zap0 can you enlighten me? need someone to put me on track at least whats wrong with it ?
[15:09] <dmonjo> zap0:  read this http://ffmpeg.org/ffserver.html#Synopsis
[15:12] <held> dmonjo: "It can also stream from files, though that is currently broken." - aren't you trying just that?
[15:12] <dmonjo> held no
[15:12] <dmonjo> i am trying to stream from an icecast HTTP source stream
[15:13] <dmonjo> ffmpeg -i http://localhost:8000/event1.ogv
[15:15] <held> what happens if you just pass it through?
[15:16] <dmonjo> held: http://pastebin.com/DyYZDiwp
[15:16] <dmonjo> zap0: what is wrong with it?
[15:16] <held> did you try "ffmpeg -i http://localhost:8000/event1.ogv -f ffm http://127.0.0.1:8090/feed1.ffm"?
[15:17] <zap0> what is right with it?
[15:18] <dmonjo> held: no good
[15:18] <dmonjo> http://pastebin.com/VDfcSnJf
[15:20] <held> so the ogv doesn't have audio?
[15:21] <dmonjo> it does
[15:21] <dmonjo> the vorbis works fine
[15:21] <dmonjo> but yea, now it doesn't have audio
[15:22] <dmonjo> because i am not streaming, so it is falling back to a non-audio stream
[15:22] <dmonjo> only video
[15:22] <dmonjo> held: it works fine when i stream to it
[15:22] <dmonjo> so when there is audio/video
[15:22] <held> your feed format requires audio then, thus if you don't have audio, it fails.
[15:25] <dmonjo> held: ffmpeg -i http://localhost:8000/event1.ogv -f ffm http://127.0.0.1:8090/feed1.ffm works fine, but the picture is ugly and out of sync with a lot of pixelation around it, so i am trying to add the options
[15:25] <dmonjo> Option afterburner not found.
[15:25] <dmonjo> it is exiting on that one
[15:57] <Mordae> Hi
[15:59] <dmonjo> zap0: so you have any comments?
[16:00] <zap0> no.
[16:01] <Mordae> I need to get x11grab output to ffserver as Main profile at level 4.0 H.264. Is that possible?
[16:02] <Mordae> So far, I have only managed high444, which is something our boxes can't play.
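x11grab hands the encoder RGB-style frames, which is why libx264 falls back to High 4:4:4; forcing 4:2:0 plus an explicit profile and level should give a Main stream (note Main is incompatible with lossless settings). A rough sketch written as a plain file encode, since with ffserver the encoding parameters normally live in the <Stream> section instead; size and bitrate are illustrative:
    # force 4:2:0 and Main profile at level 4.0
    ffmpeg -f x11grab -r 25 -s 1280x720 -i :0.0 -c:v libx264 -pix_fmt yuv420p \
           -profile:v main -level 4.0 -b:v 1500k out.mp4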
[16:24] <dmonjo> anything wrong with this now?
[16:24] <dmonjo> ffmpeg -i http://127.0.0.1:8000/event1.ogv -c:v libx264  -c:a libfdk_aac  -s 320x240 -ar 44100 -b:a 128k -f ffm http://127.0.0.1:8090/feed1.ffm
[16:33] <held> dmonjo: are you sure that scaling works like that? i'm using "-vf scale=320:240" - see http://ffmpeg.org/trac/ffmpeg/wiki/FilteringGuide
[16:33] <dmonjo> ok progress
[16:33] <dmonjo> :)
[16:33] <dmonjo> the video works fine
[16:33] <dmonjo> the audio is disabled
[16:40] <dmonjo> held: for  -c:a libfdk_aac  what audiocodec should i use in ffserver.conf ?
[16:41] <held> aac probably? idk
[16:44] <dmonjo> libfaac?
[16:45] <dmonjo> held: i mean in that ffserver.conf
[16:45] <dmonjo> doesn't look like it has the option aac to set
[16:46] <held> "AudioCodec aac" maybe? again, i don't know ffserver, just guessing from the sample config.
[16:47] <held> but it seems you're trying to encode the audio twice anyway
[16:47] <dmonjo> where?
[16:47] <held> once in your command line you pasted, and once in the ff config?
[16:48] <dmonjo> it is sending aac in the command
[16:48] <dmonjo> and in the ffserver it outputs
[16:48] <dmonjo> so i think i should convert it to the same ? :/
[16:48] <held> ffserver does the transcode, that's why you specify the codecs in its stream configs
[16:50] <held> so again, just from the sample, it seems like you just feed it with anything that ffmpeg understands, and then it will transcode it to either flash, mpeg, or any of the formats defined in the ff stream config
[16:50] <dmonjo> held: it is not working if i dont config the audio in the ffserver, it only works when i set audiodisabled=true
[16:50] <held> you need to config it in the ffserver, but not in your ffmpeg commandline
[16:51] <dmonjo> hmmm
[16:51] <dmonjo> let me check
[16:51] <held> ffserver = output format
[16:51] <dmonjo> yep
[16:51] <dmonjo> -c:a libfdk_aac removed
[16:52] <dmonjo> --enable-gpl --enable-libass --enable-libfaac --enable-libfdk-aac --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libspeex --enable-librtmp --enable-libtheora --enable-libvorbis --enable-libvpx --enable-x11grab --enable-libx264 --enable-nonfree --enable-version3
[16:52] <dmonjo> it doesnt look like aac is recognized in the config file
[16:52] <dmonjo> Codecs do not match for stream 0
[16:52] <held> it only supports a few output formats
[16:52] <held> see http://ffmpeg.org/sample.html
[16:53] <held> mpeg4 does not seem to be listed there
[16:53] <dmonjo> yea
[16:53] <dmonjo> mp2?
[16:53] <held> you can try using avi
[16:53] <dmonjo> avi container with h264 you mean?
[16:53] <held> # avi        : AVI format (MPEG-4 video, MPEG audio sound)
[16:55] <held> on the other hand, check the #<Stream live.h264> example
[16:55] <held> it's probably what you want.
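A rough sketch of what such a configuration might look like, loosely based on the stock sample ffserver.conf; all values are illustrative, and the audio directives are left out (NoAudio) until an encoder the local build supports is chosen:
    # global option, elsewhere in ffserver.conf
    RTSPPort 5554
    <Feed feed1.ffm>
    File /tmp/feed1.ffm
    FileMaxSize 200M
    ACL allow 127.0.0.1
    </Feed>
    <Stream live.h264>
    Feed feed1.ffm
    Format rtp
    VideoCodec libx264
    VideoSize 320x240
    VideoFrameRate 25
    VideoBitRate 400
    AVOptionVideo flags +global_header
    NoAudio
    </Stream>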
[16:59] <dmonjo> that section outputs an h264 rtsp stream? ;/
[16:59] <held> yes? from your cli it seems you want h264/aac
[17:00] <dmonjo> yes i am converting ogv to h264
[17:00] <dmonjo> but format is #Format rtp
[17:02] <held> rtsp is using rtp
[17:02] <dmonjo> but my source is not rtsp
[17:02] <dmonjo> it is an http
[17:02] <held> again, it's the OUTPUT format, not the input format
[17:02] <dmonjo> ok
[17:03] <dmonjo> so i would go to the output file as rtsp://server:8090/live.h264
[17:03] <dmonjo> instead of http
[17:05] <held> no
[17:05] <held> you define the RTSPPort
[17:06] <held> usually 5554 or something
[17:06] <held> then you connect to rtp://localhost:5554/...
[17:06] <dmonjo> ok
[17:06] <held> the ... is what is written here: <Stream ...>
[17:08] <dmonjo> RTSPPort should be manually added then to that <Stream>?
[17:08] <held> to the config.
[17:08] <held> http://stackoverflow.com/questions/12962358/stream-from-mp4-file-over-rtsp-with-ffserver
[17:09] <dmonjo> i hope its a working config that one :)
[17:10] <held> there are others. google is your friend: "ffserver rtsp"
[17:10] <dmonjo> will do
[17:12] <dmonjo> are you sure -c:a libfdk_aac has to be omitted when feeding from ffmpeg to ffserver? :/
[17:14] <dmonjo> /etc/ffserver.conf:319: AVPreset error: defaultFile for preset 'baseline' not found
[17:14] <dmonjo> weird
[17:16] <dmonjo> and getting this error from ffserver Error while opening encoder for output stream #0:3 - maybe incorrect parameters such as bit_rate, rate, width or height
[17:19] <dmonjo> held: http://pastebin.com/2fpX8r5n
[17:26] <held> donno. you seem to input 4 streams now
[17:27] <held> but then again, you did remove the command line...
[17:30] <dmonjo> held:  gst-launch-0.10 v4l2src ! videoscale ! video/x-raw-yuv,width=320,height=240,framerate=5/1 ! queue ! ffmpegcolorspace ! videorate ! theoraenc bitrate=128 ! queue ! oggmux name=mux pulsesrc ! audio/x-raw-int,rate=44100,channels=1,depth=16 ! queue ! audioconvert ! audiorate tolerance=40000000 ! vorbisenc ! queue ! mux. mux. ! queue ! shout2send
[17:30] <dmonjo> this is my gstreamer command
[17:30] <held> no the one for ffmpeg
[17:30] <dmonjo> VideoFrameRate 10
[17:30] <dmonjo> i think this is wrong
[17:31] <dmonjo> it is 5/1 in gstreamer; what does that correspond to in ffserver?
[17:31] <dmonjo> 5?
[17:34] <dmonjo> held: this is my configuration
[17:34] <dmonjo> http://pastebin.com/qNzUvieC
[17:34] <dmonjo> i think there is something mismatched with the gstreamer command
[17:49] <dmonjo> ffmpeg -i http://1.2.3.4:8000/event1.ogv -c:v libx264  -c:a libfdk_aac  -s 320x240 -ab 44100 -b:a 128k -f ffm http://127.0.0.1:8090/feed1.ffm
[17:49] <dmonjo> held: tried removing the aac
[17:49] <dmonjo> same problem....
[17:49] <held> did you try just using ffmpeg -i http://1.2.3.4:8000/event1.ogv -f ffm http://127.0.0.1:8090/feed1.ffm
[17:50] <dmonjo> Error while opening encoder for output stream #0:1 - maybe incorrect parameters such as bit_rate, rate, width or height
[17:51] <held> it doesn't make sense to transcode the stream twice. ffm already does that.
[17:51] <held> so paste your ffm stream config.
[17:51] <dmonjo> i am running this: ffmpeg -i http://12.3.4:8000/event1.ogv -f ffm http://127.0.0.1:8090/feed1.ffm
[17:51] <dmonjo> i will paste the config
[17:52] <held> it has a problem encoding, not decoding, so i guess the error is within your stream config
[17:54] <dmonjo> http://pastebin.com/hgaprTCa
[17:54] <dmonjo> this is my stream configuration
[17:56] <dmonjo> held:  you mean a problem encoding in ffserver?
[17:56] <held> try a different format to see if it's a general problem or specific to h264
[17:57] <dmonjo> held: i think there is a mismatch between the gst-launch command creating the ogv and the ffserver encoding
[17:57] <dmonjo> gst-launch-0.10 v4l2src ! videoscale ! video/x-raw-yuv,width=320,height=240,framerate=5/1 ! queue ! ffmpegcolorspace ! videorate ! theoraenc bitrate=128 ! queue ! oggmux name=mux pulsesrc ! audio/x-raw-int,rate=44100,channels=1,depth=16 ! queue ! audioconvert ! audiorate tolerance=40000000 ! vorbisenc ! queue ! mux. mux. ! queue ! shout2send
[17:57] <held> try setting the ff framerate to 5fps as well?
[17:59] <dmonjo> when i start ffserver i get this directly even before launching the stream on  "ffserver &"
[17:59] <dmonjo> Codec width, height and framerate do not match for stream 1
[18:04] <dmonjo> is ffserver broken?
[18:05] <Mavrik> yep
[18:05] <dmonjo> Mavrik: what is an alternative?
[18:06] <dmonjo> i've been fighting with this for 2 days
[18:09] <dmonjo> anyone?
[18:09] <dmonjo> ...
[18:21] <dmonjo> rtsp://1.2.3.4:8090/live.264 ?
[18:21] <dmonjo> this is how we access it?
[18:35] <dmonjo> anyone?
[18:35] <dmonjo> hello?
[18:36] <held> wrong port.
[18:38] <dmonjo> held ok 8544
[18:38] <dmonjo> how can i run that?
[18:38] <dmonjo> from my webserver?
[18:38] <held> ?
[18:38] <dmonjo>  <source src="rtp://1.2.3.4:8544/live.264"/>
[18:38] <dmonjo> ?
[18:39] <held> <a href="rtp://1.2.3.4:8544/live.264">click</a>
[18:41] <dmonjo> doesnt open in a browser
[18:41] <dmonjo> chrome
[18:41] <dmonjo> trying in vlc
[18:42] <dmonjo> VLC is unable to open the MRL
[18:43] <dmonjo> port 8544 is open
[18:43] <dmonjo> the stream is streaming now
[18:43] <dmonjo> but i want to know how i can see it
[18:45] <held> is 8544 the port that you defined in your ff server config?
[18:45] <dmonjo> ad_ffmpeg: initial decode failed
[18:45] <dmonjo> yes
[18:46] <dmonjo> Could not open audio decoder ffmpeg.
[18:46] <dmonjo> this is reported by mplayer
[18:49] <held> url?
[18:50] <dmonjo> held my intention is for my webserver to read this rtsp
[18:50] <dmonjo> i dont know if apache/wordpress on top of it can read rtsp
[18:51] <held> erhm and what is your webserver supposed to do with it?
[18:52] <dmonjo> show it to the users
[18:52] <dmonjo> man i wanted a mp4 out of this ffserver
[18:52] <dmonjo> i dunno if RTSP will solve it
[18:53] <dmonjo> i wanted a mp4 stream out of ffserver
[18:53] <held> what you want is a plugin/html5-video tag that uses this url
[18:54] <held> the "webserver" cannot "show" it.
[18:54] <held> it's not a video player :)
[18:54] <dmonjo> i have videojs as a player on it
[18:55] <dmonjo> it is working fine with http stream off
[18:55] <held> so stick that url as a source to the html5 <video> tag
[18:55] <dmonjo> i wanted to create an mp4 http stream
[18:55] <dmonjo> and point to it
[18:55] <dmonjo> duno if videojs will work with rtsp
[18:55] <dmonjo> this is what i wanted to do instead of the href
[18:56] <held> http://stackoverflow.com/questions/1735933/streaming-via-rtsp-or-rtp-in-html5
[18:56] <dmonjo> <source src="rtp://...:8544/live.h264"/>
[18:57] <held> but you're aware that mp4 as rtsp stream can only be used by a few browsers at best? http://www.longtailvideo.com/html5/hls/
[18:59] <dmonjo> held i want it to support iphones
[18:59] <dmonjo> and ie
[19:00] <held> good luck with IE
[19:00] <dmonjo> now ios supports this RTSP right
[19:02] <dmonjo> this rtsp is not working
[19:02] <dmonjo> ....
[19:02] <dmonjo> showing a black screen on the browser
[19:02] <dmonjo> black screen instead of the video
[19:02] <dmonjo> <source src="rtp://...:8544/live.h264"/>
[19:04] <held> i don't think ios/webkit supports rtsp just yet.
[19:04] <held> check here: https://developer.apple.com/library/safari/#documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/AudioandVideoTagBasics/AudioandVideoTagBasics.html
[19:05] <held> "Safari on the desktop plays the RTSP stream, while Safari on iOS plays the HTTP Live stream."
[19:05] <dmonjo> held i think it is best if we try to transform that ogv stream to mp4
[19:05] <dmonjo> *sighs*
[19:05] <dmonjo> i dunno why we did it on h264 rtsp...
[19:05] <Mavrik> basically if you want to live stream video you need to make sure:
[19:05] <dmonjo> it is not supported anywhere :
[19:05] <dmonjo> :D
[19:05] <dmonjo> lol
[19:06] <Mavrik> 1.) RTMP for flash 2.) HLS for mobile devices
[19:06] <Mavrik> 3.) webm for chrome/firefox
[19:07] <dmonjo> Mavrik: i managed to convert a file.ogv to file.mp4 using ffmpeg, and the mp4 plays very well in most browsers
[19:07] <dmonjo> now my challenge is converting an ogv stream to an mp4 stream
[19:07] <held> stream != file
[19:07] <dmonjo> yes exactly
[19:07] <dmonjo> i dont want to work on files, it already worked
[19:07] <held> and mp4 support won't give you 100% browser support, unfortunately.
[19:07] <Mavrik> mp4 files aren't streamable really.
[19:07] <dmonjo> i want to work on converting streams
[19:07] <held> i'm converting our videos to mp4 + webm
[19:08] <Mavrik> you need a streaming server with segmenter
[19:08] <dmonjo> Mavrik: i understood that ffmpeg can convert stream.ogv to stream.mp4
[19:08] <dmonjo> you saying it cant be done?
[19:09] <Mavrik> I really have no idea what you're actually trying to achieve
[19:09] <dmonjo> its very easy man
[19:09] <Mavrik> yes?
[19:09] <dmonjo> stream.ogv from icecast to stream.mp4
[19:09] <dmonjo> both streams, ogv and mp4, will go to a webserver
[19:09] <dmonjo> thats it..
[19:10] <Mavrik> dmonjo, is that a live stream?
[19:10] <dmonjo> yes
[19:10] <Mavrik> or file on the disk?
[19:10] <dmonjo> no a livee stream
[19:10] <dmonjo> read from a webcam
[19:10] <dmonjo> goes like this:
[19:10] <dmonjo> webcam -> icecast -> ogv
[19:10] <Mavrik> ok
[19:10] <dmonjo> now i want ogv -> ffserver -> mp4 stream
[19:10] <Mavrik> do you need icecast or can you replace that too?
[19:10] <dmonjo> it is doing its work for ogg
[19:10] <dmonjo> what do you suggest
[19:11] <Mavrik> hmm
[19:11] <dmonjo> i've been playing with ffserver
[19:11] <Mavrik> the problem is... the easiest software for you costs about 1k$ per month :D
[19:11] <dmonjo> lol
[19:11] <Mavrik> anyway
[19:11] <dmonjo> ogv to mp4, damn, 1k per month?!
[19:12] <dmonjo> i am sure ffserver can do that
[19:12] <held> you better show something good on that webcam ;)
[19:12] <Mavrik> you need a streaming server which will mux mp4 to proper formats
[19:12] <Mavrik> and you should probably consider using RTMP instead of "just" html5
[19:12] <Mavrik> since mp4 isn't really live streamable
[19:12] <Mavrik> (videojs can handle that afaik)
[19:12] <dmonjo> i have videojs running
[19:13] <Mavrik> look at red5 streaming server
[19:13] <dmonjo> Mavrik: what held and i did now is convert the ogv to an h264 RTSP stream
[19:13] <dmonjo> does videojs support that?
[19:13] <Mavrik> not really
[19:13] <Mavrik> RTSP isn't really supported... anywhere :)
[19:14] <dmonjo> *handshakes* held
[19:14] <Mavrik> basically, your chain would better look like
[19:14] <Mavrik> webcam -> ffmpeg (2 outputs, ogv + h.264 in mpegts) -> red5 + icecast
[19:15] <Mavrik> so you use a ffmpeg process to grab from webcam and encode to H.264 and theora
[19:15] <dmonjo> you mean stream from webcam into ffmpeg 2 sources?
[19:15] <Mavrik> and then send theora to icecast
[19:15] <Mavrik> and send H.264 to red5
[19:15] <dmonjo> webcam can only send 1 source
[19:15] <dmonjo> slow link
[19:15] <Mavrik> yeah, but ffmpeg can encode to several outputs
[19:15] <dmonjo> this is why conversion needs to be done on the server side
[19:15] <Mavrik> you'll need enough CPU
[19:15] <Mavrik> but it's doable
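A sketch of that single-process layout, grabbing a V4L2 webcam and ALSA audio and encoding twice; device names, bitrates, the RTMP URL and the Ogg destination are placeholders, and pushing the Ogg leg straight into icecast may need a newer ffmpeg with the icecast protocol, or a separate feeder tool:
    # one grab, two encodes: Theora/Vorbis in Ogg, plus H.264/MP3 in FLV for an RTMP server such as red5
    ffmpeg -f v4l2 -i /dev/video0 -f alsa -i default \
      -map 0:v -map 1:a -c:v libtheora -b:v 400k -c:a libvorbis -f ogg /tmp/live.ogv \
      -map 0:v -map 1:a -c:v libx264 -preset veryfast -b:v 500k -c:a libmp3lame -b:a 128k \
      -f flv rtmp://localhost/live/webcam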
[19:16] <dmonjo> ffmpeg would be on the computer running the webcam?
[19:16] <dmonjo> send from webcam to the internet ogv and h264?
[19:16] <held> ffmpeg would replace icecast
[19:16] <dmonjo> hmmm
[19:16] <Mavrik> dmonjo, well, if the webcam isn't on the server machine
[19:17] <Mavrik> you'd have to figure out how to get stream from webcam to the server first :)
[19:17] <Mavrik> transcode there
[19:17] <Mavrik> send transcoded streams to streaming servers
[19:17] <dmonjo> ffmpeg can replace icecast? can it act like a streaming server?
[19:17] <Mavrik> no
[19:17] <Mavrik> you're not listening
[19:18] <Mavrik> you still need streaming server to which clients connect
[19:18] <dmonjo> ok
[19:18] <Mavrik> you just need separate ones for theora and h.264
[19:19] <dmonjo> webcam -> internet -> hits server1: ffmpeg receives it on port X => transcodes and outputs 2 STREAMS: an OGV stream to go to icecast and another H264 STREAM (not files) to go to red5?
[19:20] <Mavrik> yeah, that would work
[19:20] <Mavrik> do you have enough bandwidth for that?
[19:21] <LithosLaptop> I would think that would be a lot of bandwidth
[19:21] <dmonjo> why do i need bandwidth? i would be sending from the webcam to server1, and server1 will have ffmpeg, icecast and red5 installed
[19:22] <Mavrik> well, you do have to get video from webcam to server :)
[19:22] <dmonjo> Mavrik: 300KB/sec
[19:22] <dmonjo> for upload
[19:22] <dmonjo> works good
[19:23] <dmonjo> Macey:
[19:24] <dmonjo> Mavrik: can it also be done this way? webcam -> Icecast (OGV stream) -> ffmpeg -> convert OGV stream to MPG ?
[19:24] <dmonjo> without the red5?
[19:25] <dmonjo> Mavrik: can ffmpeg do STREAM:OGV to h.264-in-mpegts STREAM ?
[19:25] <Mavrik> ffmpeg isn't a streaming server.
[19:25] <Mavrik> red5 is
[19:25] <Mavrik> you're mixing up types of software :)
[19:25] <Mavrik> and yes, you can do that
[19:25] <dmonjo> i just want to convert the streams
[19:25] <dmonjo> if i am able to convert them i dont need  red5 right
[19:26] <Mavrik> you always need some kind of streaming server.
[19:26] <Mavrik> you can't just send video to random clients :P
[19:26] <Mavrik> you need a target for them to connect to
[19:27] <dmonjo> Macey:
[19:27] <dmonjo> Mavrik:
[19:27] <dmonjo> once  i have a ogv stream and another converted mpg stream
[19:27] <dmonjo> i can send them to a webserver
[19:27] <dmonjo> where clients connect
[19:27] <dmonjo> right
[19:27] <held> the streaming server takes care of creating the appropriate stream headers when a new client connects; that's why you need a streaming server in between and can't just generate 1 stream for all.
[19:28] <Mavrik> dmonjo, webservers don't really know how to properly handle video
[19:28] <Mavrik> streaming servers are "webservers" for video
[19:30] <dmonjo> i am using an icecast streaming server that accepts the stream and generates an ogv, so if i convert that ogv to an mpg it would look like a stream generated from icecast with an mpg encoding schema
[19:30] <dmonjo> no? :/
[19:30] <dmonjo> i see what you mean
[19:30] <dmonjo> the connections are initiated from the webserver to the stream server
[19:30] <dmonjo> for ogv it is fine since icecast is handling it
[19:31] <dmonjo> but for the converted mpg it doesn't have any reference
[19:31] <dmonjo> this is why red5 would help make it better
[19:32] <LithosLaptop> seems like VLC can stream MPEG4-Video/MP3 through HTTP: http://www.videolan.org/streaming-features.html
[19:33] <dmonjo> yea so? :/
[19:33] <dmonjo> you would need to convert to ogg :)
[19:33] <LithosLaptop> can't that work for HTML5?
[19:33] <dmonjo> yea it works
[19:33] <Mavrik> LithosLaptop, it can connect TO http server
[19:34] <Mavrik> it cannot act AS a http server
[19:34] <Mavrik> that's a huge difference.
[19:34] <LithosLaptop> oh
[19:34] <LithosLaptop> so how is my stream working at home?
[19:34] <dmonjo> Mavrik: let me please sum up the stuff to start restructuring my project
[19:34] <LithosLaptop> I don't have a HTTP server
[19:34] <held> vlc can also stream, i don't know if it's able to stream to multiple clients at the same time.
[19:34] <Mavrik> LithosLaptop, no idea, don't know your topology ;)
[19:35] <dmonjo> webcam stream (which protocol?) -> ffmpeg 1) convert to theora/vorbis 2) convert to mpeg; each of these streams will feed icecast and red5 respectively?
[19:35] <LithosLaptop> TV Tuner -> VLC -> HTTP stream
[19:35] <Mavrik> I really don't get why all the fuss when there's widely used, tested, software doing exactly what is required
[19:35] <LithosLaptop> and then I connect to it with my laptop in the other room
[19:36] <dmonjo> Mavrik: should i print that sentence and stick it on my desk? :D
[19:36] <Mavrik> ah, I see, it seems VLC implemented HTTP server in later versions
[19:36] <LithosLaptop> :)
[19:38] <dmonjo> Mavrik: how can i  send the stream from webcam to ffmpeg? step1 ?
[19:38] <dmonjo> so far i am sending directly from webcam to icecast on the port of icecast
[19:38] <Mavrik> no idea, use VLC or ffmpeg?
[19:38] <Mavrik> well, you can do that as well
[19:38] <Mavrik> if you can send to icecast
[19:38] <Mavrik> you can send to ffmpeg or vlc :)
[19:38] <dmonjo> can i make ffmpeg listen ?
[19:38] <dmonjo> wow
[19:38] <dmonjo> how can i do that?
[19:39] <dmonjo> it is like a ffmpeg server
[19:39] <Mavrik> how are you doing it now? :)
[19:40] <dmonjo> Mavrik: using gstreamer to send video to icecast port 8000
[19:41] <Mavrik> what protocol?
[19:41] <dmonjo>  gst-launch-0.10 v4l2src ! videoscale ! video/x-raw-yuv,width=320,height=240 ! queue ! ffmpegcolorspace ! videorate ! theoraenc bitrate=128 ! queue ! oggmux name=mux pulsesrc ! audio/x-raw-int,rate=44100,channels=1,depth=16 ! queue ! audioconvert ! audiorate tolerance=40000000 ! vorbisenc ! queue ! mux. mux. ! queue ! shout2send
[19:41] <dmonjo> shout2send into icecast
[19:41] <dmonjo> what you mean is use ffserver? (ffmpeg server ? )
[19:41] <dmonjo> to receive data from the webcam?
[19:45] <dmonjo> Mavrik: can you input an mpeg STREAM to RED5?
[20:27] <Jordan_> ffmpeg crashes after encoding: http://pastebin.com/08PtcgU0
[20:29] <Jordan_> Also, I want to make sure I'm solid on this: using ffmpeg from the CLI and running it as a separate program is ok in a proprietary app.
[20:29] <Jordan_> As I understand it, it is ok to use GPL software as long as you run it in a separate process and distribute its source.
[20:33] <LithosLaptop> I had weird crashing issues with ffmpeg before when using 64bit builds with certain codec combinations
[20:33] <LithosLaptop> 32bit builds worked fine
[20:33] <Jordan_> so switch to 32 bit builds?
[20:33] <LithosLaptop> dunno
[20:33] <LithosLaptop> check if 32bit does the same for you
[20:34] <Jordan_> ok i'll try
[20:34] <Jordan_> LithosLaptop, do my cli options make sense?
[20:34] <Jordan_> it actually works fine if i just copy the audio stream
[20:35] <Jordan_> when I switched to -an it started crashing at the end
[20:35] <Jordan_> LithosLaptop, actually they are 32-bit builds
[20:36] <LithosLaptop> ah ok
[20:36] <LithosLaptop> looks ok to me, but I am no ffmpeg expert :)
[20:38] <Jordan_> the output also looks ok plays fine
[20:42] <Jordan_> i would have thought ffmpeg would be very stable
[20:43] <Jordan_> if i do -c:a copy no crash
[20:43] <Jordan_> if i try to remove it, will crash
[20:48] <Jordan_> why would i want to use 64 bit anyway?
[20:48] <dmonjo> difference between avconv and ffmpeg?
[20:48] <dmonjo> which one to use?
[20:54] <Jordan_> well i guess the 3-10-13 build works ok
[20:55] <durandal11707> dmonjo: this is ffmpeg channel and there is no avconv here
[20:56] <Jordan_> cool i think it was a problem with the zeranoe windows builds
[20:56] <Jordan_> high five
[20:59] <Jordan_> will ffmpeg do the same thing ffprobe does?
[21:03] <Jordan_> ffmpeg.exe and ffprobe.exe are both about 20mb each, i get the feeling most of the code base is exactly the same
[22:03] <johto> hello fine gentlemen. what's the difference between -x264opts and -x264-params?
[22:12] <brontosaurusrex> johto, hold on
[22:16] <brontosaurusrex> hrm, looks like the same thing to me
[22:16] <brontosaurusrex> based on "man ffmpeg-codecs"
[22:17] <brontosaurusrex> but the man is still incredibly weird, so ...
[22:17] <johto> incredibly confusing :-(
[22:19] <brontosaurusrex> best way is try&explode
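Both options take the same colon-separated key=value list and hand it to libx264; the ffmpeg documentation describes them as essentially equivalent, with -x264-params added later largely for compatibility, so which one is available depends on the build. An illustrative pair:
    # two equivalent ways to force a keyframe every 50 frames
    ffmpeg -i in.mp4 -c:v libx264 -x264opts keyint=50:min-keyint=50 out1.mp4
    ffmpeg -i in.mp4 -c:v libx264 -x264-params keyint=50:min-keyint=50 out2.mp4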
[22:44] <Orbixx> Trying to grab a few seconds of a video with -ss and -t operators, but the output is always a minimum of 29 seconds of video.
[22:44] <Orbixx> However, the difference between the 29 seconds of output and the specified cut duration (say -ss 00:00:00 and -t 5, i.e. 5 seconds),
[22:44] <Orbixx> so 24 seconds of it,
[22:45] <Orbixx> is just still video of the first frame at the point specified with -ss
[22:45] <Orbixx> until it hits the 24 second mark, and then the frames begin to flow
[22:45] <Orbixx> Is this normal behaviour?
[22:46] <Orbixx> Can ffmpeg only do >=30s jobs?
[22:47] <Orbixx> Doesn't look like it since the ffmpeg output indicates nothing of the sort
[22:56] <Sashmo_> I forgot, but where are the default h264 profiles located when using ffmpeg and ubuntu?
[22:56] <Sashmo_> actually the preset configs
[22:56] <Sashmo_> is what I am really looking for
[22:56] <Mavrik> in x264 source in new versions
[23:09] <Scotteh> Anyone any idea what's up with ffmpeg and python? If I have a function to start an ffmpeg stream using the webcam it works fine, adding audio works fine, and if I use multiprocessing to start the function with video it's fine, but with video and audio I get ALSA buffer xrun errors?
[23:20] <mark4o> Orbixx: are your -ss and -t output options (between input file name and output file name)?
[23:21] <dmonjo> hi guys
[23:21] <dmonjo> ffmpeg -i http://localhost:4444:/test.ogv -vcodec libx264 -acodec libvo_aacenc -b:v 128k -flags -global_header -map 0:0 -map 0:1 -f segment -segment_time 4 -segment_list_size 0 -segment_list testlist.m3u8 -segment_format mpegts stream%05d.ts
[23:22] <dmonjo> is not working for me
[23:22] <dmonjo> trying to convert from ogv to m3u8 mpeg-ts
[23:22] <dmonjo> what am i missing?
[23:22] <Orbixx> mark4o: Aha! thanks, that sorted it
[23:23] <dmonjo> get this error: Stream map '0:1' matches no streams.
[23:27] <mark4o> Orbixx: https://ffmpeg.org/trac/ffmpeg/wiki/Seeking%20with%20FFmpeg
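As the wiki page explains, placement decides the behaviour: before -i, -ss seeks quickly by keyframe; as output options, -ss/-t decode up to the requested point and are frame-accurate. A sketch with illustrative times and file names:
    # fast cut: keyframe seek, then stream copy (cut points may shift to the nearest keyframe)
    ffmpeg -ss 00:01:00 -i input.mp4 -t 5 -c copy clip_fast.mp4
    # accurate cut: -ss/-t as output options, re-encoding the exact range
    ffmpeg -i input.mp4 -ss 00:01:00 -t 5 -c:v libx264 -c:a aac -strict experimental clip_exact.mp4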
[23:38] <dmonjo> -acodec libvo_aacenc is not working for me
[23:38] <dmonjo> what else can i use with libx264?
[23:43] <dmonjo> i am always getting this error when transcoding: Broken file, keyframe not correctly marked.
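The "Stream map '0:1' matches no streams" error simply means input 0 has no second stream (no audio was found), and the extra colon after the port in the pasted URL also looks like a typo. A sketch of the same command with the audio map made optional and the native AAC encoder in place of the missing libvo_aacenc; other options are taken from the chat:
    # '?' makes the audio mapping optional; -strict experimental enables the native aac encoder on older builds
    ffmpeg -i http://localhost:4444/test.ogv -map 0:v -map 0:a? \
      -c:v libx264 -b:v 128k -c:a aac -strict experimental -b:a 128k \
      -flags -global_header -f segment -segment_time 4 -segment_list_size 0 \
      -segment_list testlist.m3u8 -segment_format mpegts stream%05d.ts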
[00:00] --- Tue Mar 12 2013

