[Ffmpeg-devel-irc] ffmpeg.log.20130904

burek burek021 at gmail.com
Thu Sep 5 02:05:01 CEST 2013


[00:00] <Plorkyeran_> oh, ffmpeg's does a bit more than libav's
[00:01] <Plorkyeran_> so maybe not completely trivial
[00:03] <Mavrik> looking at the master it is trivial
[00:04] <Mavrik> not really in accordance to standard encoder behaviour
[00:07] <pyBlob> Mavrik: ok, using the simple udp-setup (with port), I can display it using ffplay on the same machine
[00:08] <pyBlob> but when I try to connect from another machine, it doesn't receive anything (it also doesn't send anything to the "server")
[00:08] <Mavrik> you don't connect, UDP is being sent to a target
[00:08] <Mavrik> UDP is connectionless protocol, remember.
[00:08] <pyBlob> so I should enter the target-ip?
[00:08] <Mavrik> on the target machine you open a player and start listening to the proper port
[00:09] <Mavrik> mhm
[00:09] <Mavrik> also, firewalls tend to break
[00:10] <pyBlob> specifying the ip of the other machine works ... now there arrive MAANY packets =)
[00:11] <pyBlob> and ffplay can reconstruct the stream, thanks for the help
[00:13] <Mavrik> np :)
[00:16] <well0ne> Hi, I have an HLS-related question (libav?!) ... I want to customize the HTTP request and add some headers. I see that avio_open2 works with options, but how exactly do I use this? Can anybody help?
[00:21] <Mavrik> well0ne, http protocol handler accepts header as option
[00:21] <Mavrik> I'm not sure what your usecase is though.
[00:23] <Mavrik> anyway, good night.
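
For anyone finding this later, a minimal sketch of what Mavrik describes: the http protocol handler takes a "headers" entry in the options dictionary passed to avio_open2(). The URL and header value below are placeholders, and the trailing CRLF matters; whether the hls demuxer forwards such options to every segment request depends on the version.

    #include <libavformat/avio.h>
    #include <libavutil/dict.h>

    int open_with_custom_header(AVIOContext **pb, const char *url)
    {
        AVDictionary *opts = NULL;
        int ret;
        /* custom HTTP header; note the trailing \r\n */
        av_dict_set(&opts, "headers", "X-Custom-Header: value\r\n", 0);
        ret = avio_open2(pb, url, AVIO_FLAG_READ, NULL, &opts);
        av_dict_free(&opts);
        return ret;
    }
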
[00:39] <pyBlob> ffplay currently uses a buffer of >4s ... is there a way to make it smaller?
[00:41] <CentRookie> hi all
[00:41] <CentRookie> is there a way to specify the audio stream per language id, like eng, ger, instead of number?
[00:41] <CentRookie> i know there is -map 0:a and 0:v but sometimes there are multiple audio streams and i want to select the english stream by default
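
A hedged answer for CentRookie: newer ffmpeg builds accept a metadata stream specifier in -map, so if the input tags its streams with a language, something like the following may work (it matches any stream type tagged "eng", so it is usually combined with an explicit video map):

    ffmpeg -i input.mkv -map 0:v -map 0:m:language:eng -c copy output.mkv
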
[00:52] <pyBlob> setting buffer_size in server reduces latency quite much =)
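
For reference, a minimal sketch of the UDP setup described above (addresses, port and buffer size are placeholders; -fflags nobuffer is one way to trim the player's start-up buffering):

    sender:    ffmpeg -i input -c copy -f mpegts "udp://192.168.2.241:2020?buffer_size=65536"
    receiver:  ffplay -fflags nobuffer udp://0.0.0.0:2020
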
[01:11] <pyBlob> wow ... how long can it take to compile ffmpeg, this crappy computer is compiling for +2h now oO
[01:14] <tsjiller> depends on how crappy your computer is i guess
[01:16] <bencoh> :)
[01:17] <pyBlob> :)
[01:19] <well0ne> Hi, i figured out how to set a custom User-Agent/Cookie or any header in ffmpeg, with the "-headers" option. But i'm using windows and \n is not recognized as a newline
[01:19] <well0ne> but ffmpeg says i need to send the newline, otherwise a timeout will happen because the header is not sent
[01:19] <well0ne> what should i do :S
[01:19] <pyBlob> well0ne: try \r\n
[01:20] <well0ne> windows shell gives a fck
[01:20] <well0ne> dont work
[01:20] <pyBlob> :/
[01:20] <well0ne> really frustrating
[01:20] <well0ne> do i really have to recompile it just for this little error? i dont even have the toolchains ready :/
[01:22] <sacarasc> What happens if you try: -headers "I like cheese
[01:22] <sacarasc> Do you like cheese?"
[01:24] <well0ne> how to go in the 2nd line exactly?
[01:24] <well0ne> if i press error the command is beeing executed
[01:24] <well0ne> enter*
[01:25] Action: sacarasc shrugs.
[01:27] <well0ne> does anybody here have a prepared toolchain for ffmpeg's latest version (w32/w64)?
[01:33] <sacarasc> well0ne: Try doing it in PowerShell?
[01:33] <CentRookie> is there a map parameter for subs like 0:s ?
[01:33] <well0ne> powershell?
[01:33] <sacarasc> That seems to support going to multiple lines in "".
[01:33] <well0ne> whats that
[01:33] <vilalian> well0ne: http://ffmpeg.zeranoe.com/
[01:33] <well0ne> vilalian?
[01:34] <sacarasc> well0ne: It's Windows' more powerful command line interface thingy...
[01:34] <well0ne> ill try lol
[01:34] <well0ne> i never know
[01:34] <vilalian> well0ne: You were asking about w32/w64 setup or builds? They're on that site.
[01:34] <well0ne> no i havent
[01:35] <well0ne> sacarasc
[01:35] <well0ne> the same behaviour
[01:35] <well0ne> [http @ 0000000002658820] No trailing CRLF found in HTTP header.
[01:36] <well0ne> -headers "HEAD\r\n"
[01:38] <well0ne> this maybe works flawless under  *NIX systems .... but
[01:38] <well0ne> total failure on windows.....
[01:38] <pyBlob> have you tried \\\r\\\n ?
[01:38] <well0ne> yupp
[01:38] <well0ne> i tried everything
[01:39] <tsjiller> have you tried running it from bash instead of cmd?
[01:39] <tsjiller> or some other shell compiled for windows
[01:39] <well0ne> no
[01:39] <well0ne> what would you prefer
[01:41] <tsjiller> I'm not convinced thats your problem, but try win-bash ( http://sourceforge.net/projects/win-bash/files/shell-complete/latest/ )
[01:41] <well0ne> i did
[01:41] <well0ne> dont work
[01:41] <well0ne> same
[01:42] <well0ne> [http @ 00000000003e8680] No trailing CRLF found in HTTP header. [http @ 00000000003e8680] HTTP error 408 Request Time-out
[01:42] <well0ne> -headers "X-Forwarded-For: 80.74.130.38\r\n\r\n"
[01:42] <well0ne> -.-
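
A hedged aside on the Windows escaping problem: cmd.exe has no escape sequence that produces a literal CR/LF inside an argument, but PowerShell expands backtick escapes inside double quotes, so something along these lines may work there (the header value is just the one from the log, and <url> is a placeholder):

    ffmpeg -headers "X-Forwarded-For: 80.74.130.38`r`n" -i <url> ...
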
[01:43] <pyBlob> have you had a look at the packets using wireshark or something the like?
[01:44] <well0ne> yes
[01:44] <well0ne> its a pure shell thing
[01:44] <well0ne> under linux you can pass newlines as argument for  commands
[01:45] <well0ne> under windows you cannot
[01:45] <well0ne> thats simple....
[01:45] <well0ne> so if anybody here has a toolchain and could just make a fixed version for me, please help me out :S
[01:45] <pyBlob> http://superuser.com/questions/150116/how-to-put-a-newline-in-dos-command-line-as-part-of-the-option
[01:46] <well0ne> there is no answer
[01:46] <well0ne> you just cant
[01:46] <well0ne> dont wanna need ruby/python for that...
[01:47] <well0ne> thanks ...
[01:47] <sacarasc> The first answer didn't work?
[01:47] <well0ne> nope
[01:47] <well0ne> because i need to pass my argument in quotes
[01:48] <well0ne> "HEADER: BLABLA^
[01:48] <well0ne> so windows doesnt ask for the next line, just executes it
[01:50] <pyBlob> http://stackoverflow.com/questions/7041069/dos-working-with-multi-line-strings
[01:50] <well0ne> thanks for trying to help me
[01:51] <well0ne> but i already tried everything
[01:51] <well0ne> and this is bash
[01:51] <well0ne> and a echo
[01:51] <well0ne> not an command argument
[01:51] <pyBlob> batch!=bash
[01:52] <pyBlob> why don't you want to use batch-files?
[01:52] <well0ne> it dont work man
[01:52] <pyBlob> :)
[01:53] <well0ne> no, seriously now
[01:53] <well0ne> that doesn't work ....
[01:53] <well0ne> i'm not stupid....
[01:54] <well0ne> the example you posted uses an echo. in my case it's not an echo but the process itself that gets started, and there the newline isn't honoured
[01:54] <well0ne> the toolchain has been building for 4 hours now, and there's no end in sight
[01:56] <well0ne> I'll pay 10 € to anyone how does 1-2 line patching in latest code && compile it for me w32/w64
[01:56] <well0ne> who*
[01:57] <pyBlob> ... I have no further ideas anymore ._.
[01:57] <well0ne> thanks anyway :)
[01:57] <well0ne> i'll have to wait until my toolchain is ready :/
[01:58] <well0ne> hope the script from zeranoe is working
[01:58] <pyBlob> yep ... I'm also waiting for my build to finish xD
[02:30] <well0ne> I'll pay 10 € to anyone who does a 1-2 line patch in the latest code && compiles it for me (w32/w64)
[03:05] <BallsDeep> Does the "ffmpeg" utility always decode a lossy stream into a raw PCM (32-bit floating point) stream before it sends the data to a lossy encoder? How can I tell? Is there some utility that I can use to view what a "ffmpeg" lossy decoder is doing in real-time?
[03:17] <BallsDeep> I use the following command to transcode DTS to AC-3: ffmpeg -i <filename.mkv> -threads 8 -map 0:1 -ac 6 -f ac3 -ab 640k <filename.mka>
[03:18] <BallsDeep> When this command is executed, what is the DTS decoder decoding the stream to, before the encoder starts to encode the stream to AC-3?
[03:27] <klaxa> well... raw would make sense
[03:27] <klaxa> aka pcm
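
klaxa is right: a lossy-to-lossy transcode goes through decoded PCM, and the DCA (DTS) decoder outputs floating-point samples that ffmpeg converts as needed before feeding the AC-3 encoder. The decoder's output sample format can be checked with ffprobe, which reports sample_fmt (typically flt/fltp for DTS), e.g.:

    ffprobe -select_streams a -show_streams input.mkv
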
[03:51] <well0ne> toolchain broken, build was unsuccessful :( :(  :(
[04:22] <well0ne> lol
[04:23] <well0ne> i'm on debian wheezy (7.1) the toolchain from zeranoe
[04:23] <well0ne> says i need  makeinfo
[04:23] <well0ne> its not available at all ?!
[04:23] <well0ne> textinfo == not exists
[04:23] <well0ne> makeinfo === not exists
[04:23] <well0ne> where do i get this O.o
[04:23] <well0ne> wtf
[04:25] <well0ne> HELP
[04:50] <klaxa> what are you trying to do?
[04:50] <klaxa> compile ffmpeg?
[04:51] <klaxa> well0ne: ^
[04:51] <well0ne> i tried to cross-compile ffmpeg for windows on a debian system; strangely one package wasn't available, so i fetched it manually...
[04:52] <klaxa> why not compile ffmpeg for windows... on windows?
[04:52] <klaxa> also, even though i do understand german, keep it english for other people's sake
[04:52] <well0ne> cuz there was already a prepared script for the complete toolchain by zeranoe
[04:52] <well0ne> and i had a debian server available
[04:54] <klaxa> can you pastebin whatever errors you get?
[04:59] <well0ne> huh, i solved the problem
[04:59] <well0ne> but now while creating the buildroot something happened
[05:00] <well0ne> ill try again, if it crashes again ill pastebin
[05:01] <klaxa> i will go to bed now though, because 5 am and work tomorrow
[05:01] <klaxa> way too late already
[05:05] <well0ne> :D
[05:05] <well0ne> aight, gn8
[05:51] <wangoceantao> hi
[06:30] <well0ne> I'll pay 10 € to anyone who does a 1-2 line patch in the latest code && compiles it for me (w32/w64)
[06:57] <wangoceantao> We are developing a video player under ios platform.
[06:57] <wangoceantao> We use four threads.
[06:57] <wangoceantao> Thread 1 is used for analyzing images;
[06:57] <wangoceantao> thread 2 is used for decoding videos;
[06:57] <wangoceantao> thread 3 is used for decoding audios;
[06:57] <wangoceantao> thread 4 is used for outputting the audios and videos.
[06:57] <wangoceantao> Here is the situation of the problem. The decoded data generated by Thread 2 is put in a cache. Then when it plays, it plays too fast to watch. I adjusted NSTimer, but it does not solve the problem. However, if I add 'sleep' in the thread, the problem is solved. But I cannot add 'sleep' in my application since I need the video and audio to stay synchronized during playback. So, does anyone know what the problem actually is, please? And how do I solve it?
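
The usual cause of "plays too fast" is presenting frames as fast as they are decoded instead of pacing them by their timestamps. This is not wangoceantao's code, just a minimal sketch of PTS-based pacing against ffmpeg's utility clock (a real player would slave video to the audio clock; frame->pts is assumed to be set, otherwise use pkt_pts / the best-effort timestamp):

    #include <libavutil/avutil.h>
    #include <libavutil/frame.h>
    #include <libavutil/mathematics.h>
    #include <libavutil/time.h>

    static int64_t start_wallclock;                 /* wall clock at first frame, in us */
    static int64_t first_pts_us = AV_NOPTS_VALUE;

    /* wait until 'frame' is due, then hand it to the renderer */
    void present_frame(AVFrame *frame, AVRational stream_time_base)
    {
        int64_t pts_us = av_rescale_q(frame->pts, stream_time_base, AV_TIME_BASE_Q);
        if (first_pts_us == AV_NOPTS_VALUE) {
            first_pts_us    = pts_us;
            start_wallclock = av_gettime();
        }
        int64_t due = start_wallclock + (pts_us - first_pts_us);
        int64_t now = av_gettime();
        if (due > now)
            av_usleep(due - now);    /* sleep only until the frame is due */
        /* ... display the frame here ... */
    }
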
[07:12] <moonmayor> I'm trying to convert a movie to one image of tiled screenshots with timestamps.
[07:13] <moonmayor> I can use drawtext to draw text all over one movie and make a new movie file, but I can't get it to draw on frames individually
[07:13] <moonmayor> here are some commands I've been trying
[07:13] <moonmayor> http://pastebin.com/HYgBPGrG
[07:14] <moonmayor> I'm trying to find out how to fix:
[07:14] <moonmayor> Unable to find a suitable output format for 'drawtext=fontfile=/Users/moonmayor/Desktop/FreeSerif.ttf: text='text to write': fontsize=20: fontcolor=red: x=10: y=50'
[07:49] <relaxed> moonmayor: quote the entire filter chain
[07:51] <moonmayor> trying now.
[07:55] <moonmayor> ok, that made all the tiles, but the text was only on the first frame.
[07:56] <moonmayor> I'm moving the drawtext earlier in the filter chain thinking that it will draw it on each frame before tiling
[07:56] <moonmayor> and, yeah! that seems to work well.
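
For the record, the working idea is to place drawtext before tile in the filter chain, roughly like this (font path, sampling rate and grid size are placeholders; %{pts} needs a build with drawtext text expansion, otherwise a fixed string behaves the same way):

    ffmpeg -i input.mp4 -vf "fps=1/60,drawtext=fontfile=/path/to/FreeSerif.ttf:text=%{pts}:fontsize=20:fontcolor=red:x=10:y=50,tile=5x5" -frames:v 1 tiles.png
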
[08:22] <dawe> hello, can somebody help me with ffmpeg
[08:24] <moonmayor> dawe, what are you trying to do?
[08:39] <Guest33941> moonmayor: I'm trying to convert an flv video to mp4
[08:40] <Guest33941> But I have problem that video/audio is out of sync
[08:40] <Guest33941> and sometime quality of video is poor
[08:40] <Guest33941> when I convert from flv to flv, then video is almost ok, but when I change .flv to .mp4 then I have problem
[08:42] <Mavrik> do you understand the difference between a container and a stream
[08:42] <Mavrik> ?
[08:45] <Guest33941> yes a little
[08:45] <Guest33941> flv is container and mp4 too
[08:45] <Mavrik> mhm, and which video format are you using? are you even telling ffmpeg what quality do you want?
[08:45] <Guest33941> and streams inside have different codecs
[08:46] <Guest33941> I described my problem here http://stackoverflow.com/questions/18585607/red5-1-0-2-recorded-flv-convert-to-mobile-html5-format-with-ffmpeg-av-out-of-syn
[08:46] <Guest33941> There are all info I hope, if u need more info I will tell you
[08:47] <Guest33941> I recorded video via RED5 server and I want to convert it to some more friendly format
[08:48] <Mavrik> yeah, first, stop converting to mpeg4
[08:48] <Mavrik> and using libvo_aacenc... this will make your life easier
[08:48] <Mavrik> use libx264 to encode into H.264
[08:48] <Mavrik> and make sure you PASS QUALITY SETTINGS to ffmpeg
[08:48] <Guest33941> Im not sure if I know how to use it, Im beginner with ffmpeg
[08:48] <Mavrik> because now ffmpeg just defaults to very low settings since you never tell it what you want your video quality to be.
[08:48] <Guest33941> can u help me with command please
[08:49] <Guest33941> I want quality for web
[08:49] <Mavrik> https://www.virag.si/2012/01/web-video-encoding-tutorial-with-ffmpeg-0-9/
[08:49] <Guest33941> 30sec video about max 5mb
[08:50] <sacarasc> Guest33941: Try ffmpeg -i input.flv -c:v libx264 -preset:v veryslow -crf 22 -c:a copy output.mp4
[08:54] <Guest33941> sacarasc I tried and : track 1: could not find tag, codec not currently supported in container
[08:54] <chrisjunkie> Guys, doing development on ZoneMinder the CCTV software package and having some issues with avformat. What we're hoping to do in the future is save video to a video container rather than a bunch of JPEGs and I'm starting the work on that feature. I'm trying to take a raw H264 frame (as opened by ffmpeg), and then just place it inside a MKV container. The output context is allocated, and the stream created. The codec context is copied
[08:54] <chrisjunkie> from the codec context of the incoming stream to the codec context of the output stream and then the header is written. Packets are being saved into the container fine, but the issue is the time base is all wrong
[08:54] <Mavrik> sacarasc, you can't just throw speex into an mp4 -_-
[08:54] <chrisjunkie> I can see that the time_base is 1,90000 on the stream, but on the codec itself its 1,30
[08:54] <sacarasc> I didn't see what he had as an input file. :D
[08:55] <Mavrik> chrisjunkie, that tends to not be a problem
[08:55] <chrisjunkie> Where should I be copying this time_base to? The output stream or the output codec?
[08:55] <Mavrik> are you having sync issues?
[08:55] <chrisjunkie> Yeah, i.e video is 58mins long but is actually only say 30 secs
[08:55] <Mavrik> that's probably not due to timebase
[08:55] <Guest33941> Stream #0:0: Video: flv1, yuv420p, 640x360, 625 kb/s, 1k tbr, 1k tbn, 1k tbc ; Stream #0:1: Audio: speex, 16000 Hz, mono, s16, 16 kb/s
[08:56] <Mavrik> but if you're just copying raw frames you should certainly also copy all codec metadata
[08:56] <Guest33941> but Iam able to record to Nellymoser codec if it is better
[08:56] <chrisjunkie> Mavrik: sorry, I meant the clip is 30sec but it is shown to me as being 60 mins or similar
[08:56] <chrisjunkie> Mavrik: raw frames? I'm copying the packets
[08:56] <chrisjunkie> Should I be doing the frames instead? I don't want to decode the h264 obviously
[08:57] <Mavrik> chrisjunkie, sorry, miswrote, packets :)
[08:58] <chrisjunkie> Mavrik: ah cool. So I'm on the right track then! Super hard to paste example code but if I do a paste would you be willing to take a look at it?
[08:58] <chrisjunkie> (due to the huge source code base)
[08:58] <Mavrik> that's usually not all that constructive
[08:58] <Mavrik> chrisjunkie, check out ffmpeg.c
[08:59] <Mavrik> and what it does when you pass "-codec copy"
[08:59] <chrisjunkie> Oh ok. Will do thanks
[08:59] <Mavrik> sadly checking out ffmpeg.c is usually still the best documentation for that stuff -_-
[09:00] <chrisjunkie> Haha yeah, there are few examples :-(
[09:00] <chrisjunkie> Duration: 02:19:02.28, start: 0.000000, bitrate: 6 kb/s
[09:00] <chrisjunkie> That is hugely incorrect :P
[09:01] <chrisjunkie> I'll dig into ffmpeg.c after dinner and report back
[09:01] <Mavrik> yeah that's usually timebase/duration fail :)
[09:01] <Mavrik> don't remember the exact nuances tho :\
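
A rough sketch of the "-codec copy" remux path Mavrik points at, using the 2013-era API. Error handling and the open/close boilerplate are left out, and real code should leave AV_NOPTS_VALUE timestamps untouched; the key part for the duration problem is rescaling each packet from the input stream's time base to the output stream's after avformat_write_header() has fixed the latter.

    #include <libavformat/avformat.h>
    #include <libavutil/mathematics.h>

    /* remux packets of one input stream into an already-allocated output context */
    static int remux_stream(AVFormatContext *in_ctx, AVStream *in_st,
                            AVFormatContext *out_ctx)
    {
        AVStream *out_st = avformat_new_stream(out_ctx, NULL);
        if (!out_st)
            return AVERROR(ENOMEM);
        avcodec_copy_context(out_st->codec, in_st->codec);   /* 2013-era API */
        out_st->codec->codec_tag = 0;                        /* let the muxer pick a tag */

        int ret = avformat_write_header(out_ctx, NULL);
        if (ret < 0)
            return ret;

        AVPacket pkt;
        while (av_read_frame(in_ctx, &pkt) >= 0) {
            if (pkt.stream_index != in_st->index) {
                av_free_packet(&pkt);
                continue;
            }
            /* rescale timestamps from the demuxer's time base to the muxer's */
            pkt.pts      = av_rescale_q(pkt.pts,      in_st->time_base, out_st->time_base);
            pkt.dts      = av_rescale_q(pkt.dts,      in_st->time_base, out_st->time_base);
            pkt.duration = av_rescale_q(pkt.duration, in_st->time_base, out_st->time_base);
            pkt.stream_index = out_st->index;
            av_interleaved_write_frame(out_ctx, &pkt);
            av_free_packet(&pkt);
        }
        return av_write_trailer(out_ctx);
    }
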
[09:23] <Kurvivor> hello
[09:23] <Guest33941> please can somebody help me with conversion command, Im started with this: ffmpeg -i test.flv -c:v libx264 -c:a libvo_aacenc test.mp4
[09:23] <Kurvivor> in ffmpeg api, what is the equivalent of passing -c:v <codecname> for ffmpeg tool?
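
A hedged answer for Kurvivor: the library-level counterpart of -c:v <codecname> is looking the encoder up by name and opening a context with it, roughly:

    #include <libavcodec/avcodec.h>

    /* sketch: select an encoder by name, as -c:v does on the command line */
    AVCodecContext *open_encoder_by_name(const char *name)
    {
        avcodec_register_all();                               /* needed on 2013-era libavcodec */
        AVCodec *codec = avcodec_find_encoder_by_name(name);  /* e.g. "libx264" */
        if (!codec)
            return NULL;
        AVCodecContext *enc = avcodec_alloc_context3(codec);
        /* set width, height, pix_fmt, time_base ... on enc, then call avcodec_open2(enc, codec, NULL) */
        return enc;
    }
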
[10:06] <Guest33941> please can somebody help me with ffmpeg
[10:07] <sacarasc> Guest33941: Try ffmpeg -i input.flv -c:v libx264 -preset:v veryslow -crf 22 -c:a libvo_aacenc -b:a 128k output.mp4
[10:09] <Guest33941> I will try
[10:09] <Guest33941> after that video is slow and lagy
[10:10] <Guest33941> I created this command
[10:10] <Guest33941> ffmpeg -i test.flv -c:v libx264 -profile:v high -preset slow -b:v 300k -maxrate 300k -bufsize 600k -r 30 -c:a libvo_aacenc -b:a 128k test.mp4
[10:10] <Guest33941> it works, but sound is out of sync :(
[10:12] <Guest33941> I added -r 30 to your command for the fps. Without it the video was laggy and bsplayer said the video has 1000 fps
[10:12] <Guest33941> ffmpeg -i test.flv -r 30 -c:v libx264 -preset:v veryslow -crf 22 -c:a libvo_aacenc -b:a 128k output.mp4
[10:12] <Guest33941> but still is video out of sync
[10:12] <Guest33941> audio
[10:13] <sacarasc> Is it constant out of sync? Or does it get out of sync over time?
[10:13] <Guest33941> it looks constant, from beginning
[10:13] <Guest33941> audio is before video
[10:14] <Guest33941> when I play flv file in flash videoplayer, then sound and video is okay
[10:14] <sacarasc> -itoffset can change the sync of audio and video. I've never used it, so you might have to play with ti.
[10:14] <Guest33941> I can try, but in future I will need some automatic process, its for web, for many videos
[10:15] <spaam> Guest33941: are you stealing videos from the web ?
[10:15] <sacarasc> Guest33941: Is it happening for ALL videos, or just this one?
[10:15] <Guest33941> no, I am creating a webcam video recorder
[10:15] <Guest33941> all recorded video
[10:16] <Guest33941> when I change output file to "flv", then in bsplayer video and audio is synced correctly
[10:16] <Guest33941> when set mp4, then not
[10:16] <Guest33941> maybe I should use some other, more user-friendly format - I need everybody to be able to play it
[10:18] <Kurvivor> how can i get framerate using ffmpeg api?
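
And for the frame rate question: after avformat_open_input() and avformat_find_stream_info(), the stream carries a rational frame rate estimate, e.g.:

    #include <libavformat/avformat.h>
    #include <libavutil/rational.h>

    /* sketch: frame rate of a stream after avformat_find_stream_info() */
    double stream_fps(AVFormatContext *fmt_ctx, int stream_index)
    {
        AVRational fr = fmt_ctx->streams[stream_index]->avg_frame_rate;
        return (fr.num && fr.den) ? av_q2d(fr) : 0.0;
    }
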
[10:27] <BoR0> anyone has any idea how to solve this issue when building shared on mingw32? make: *** No rule to make target `libavdevice/avdevice.dll', needed by `all-yes'
[10:27] <BoR0> it doesn't occur when I build static
[10:31] <Guest33941> sacarasc: so itsoffset helped me, but I'm not sure if it will work globally for longer/shorter videos.. so I will try, thank you for the help. But if somebody has another solution that doesn't need manual tuning, please tell me
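
For reference, -itsoffset applies to the input that follows it, so the usual pattern is to open the file twice and take the delayed audio from the second copy (the 0.3 s offset below is only a placeholder to be tuned):

    ffmpeg -i test.flv -itsoffset 0.3 -i test.flv -map 0:v -map 1:a -c:v libx264 -crf 22 -c:a libvo_aacenc -b:a 128k test.mp4
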
[13:30] <pyBlob> I'm streaming a live-video from my webcam using (which works fine):
[13:30] <pyBlob> raspivid -t 0 -w 640 -h 480 -fps 25 -b 500000 -vf -o - | ffmpeg -i - -vcodec copy -f mpegts tcp://192.167.2.241:2020?listen
[13:30] <pyBlob> but using
[13:30] <pyBlob> ffplay tcp://192.168.2.241:2020
[13:32] <pyBlob> ffplay only starts to display something, when it has buffered ~20s video, so I have to wait for the output-window to appear and then have to manually seek to the end of the buffer
[13:33] <pyBlob> > now it's running nearly realtime, is there anything to tell ffplay, that it should just display the last frames/smaller buffer/...?
[13:59] <pyBlob> actually the same happens, when using "ffmpeg -i tcp://192.168.2.241:2020 ..."
[14:00] <pyBlob> and that's even more annoying, because I can't seek there
[14:10] <Mavrik> pyBlob, why are you using tcp at all?
[14:10] <pyBlob> it currently is the most stable ...
[14:11] <Mavrik> just how bad is your line you can't use the standard UDP transfer?
[14:11] <Mavrik> which also has all parameters you need (TCP isn't really widely supported or used)
[14:11] <pyBlob> how would you go about streaming the h264-video from my raspberry pi to another computer (for further analysis/viewing)
[14:14] <Mavrik> I'd send a MPEG2-TS packaged stream over UDP protocol to another machine which listens on a port
[14:14] <Mavrik> that's the industry standard way of broadcasting
[14:16] <pyBlob> so the command would be "ffmpeg -i - -vcodec copy -f mpegts udp://TARGETIP:PORT"
[14:18] <Mavrik> pyBlob, depending on how H.264 is structured, you'll need -bsf h264_mp4toannexb filter as well
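
Putting the pieces together, a sketch of the sender/receiver pair being discussed (the receiver's IP and the port are placeholders). The h264_mp4toannexb filter is only needed when the H.264 arrives in MP4/AVCC form (e.g. read out of an .mp4 or .mkv); raspivid already emits Annex B, so with the raw pipe it can usually be left out:

    raspivid -t 0 -w 640 -h 480 -fps 25 -b 500000 -vf -o - | ffmpeg -i - -c:v copy -f mpegts udp://RECEIVER_IP:2020
    ffplay -fflags nobuffer udp://0.0.0.0:2020
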
[14:28] <soul-d> on that note, any good source on file format layouts?
[14:29] <soul-d> Q: http://i.imgur.com/lWVFlsP.png - could use some reading on file format markers etc.
[14:30] <pyBlob> Mavrik: here's some output:
[14:30] <pyBlob> http://pastebin.com/kGu1uUw3 and http://pastebin.com/p6nX7xJ8
[14:39] <pyBlob> Mavrik: ... any idea, what I'm doing wrong there?
[14:40] <Mavrik> pyBlob, either you're losing alot of packets on the network
[14:40] <Mavrik> or your encoder doesn't inject PPS/SPS packets and you need the bitstream filter
[14:41] <Mavrik> soul-d, define file format layouts? different container types?
[14:42] <soul-d> http://pastebin.com/YEv5gwg3   was   my test from yesterday
[14:43] <pyBlob> Mavrik: now I'm getting http://pastebin.com/nfNXPTLc
[14:44] <Mavrik> pyBlob, ah... is it possible to tell the RPi encoder to insert SPS/PPS data?
[14:44] <soul-d> but mkv seemed much too complex for simple debugging. basically i need to learn to extract frames, e.g. find the start of a frame and extract it to a buffer, so i was searching for markers but haven't had much luck except finding out about header markers (hex view)
[14:45] <pyBlob> let's see ...
[14:45] <Mavrik> soul-d, MJPEG would probably by far the simplest
[14:45] <Mavrik> soul-d, it expects frames in JPEG though
[14:46] <soul-d> did read a bit on jpeg, but that's with all the encoding tables, huffman etc.? the screen is pure rgb though, so if there is an encoding in that i'd like to experiment with it, but i need raw 8-bits-per-color data
[14:50] <soul-d> so basically for starters just copy-paste sequential 1-byte data as <R><G><B>, where R is 0,0 top left. but since the screen itself already works with frames and fields, i thought maybe a single video frame would be "easy" to extract
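
If the goal is just to get one frame out as raw packed 8-bit-per-channel RGB without parsing container markers by hand, ffmpeg can do the demux/decode/convert step, e.g. (input name is a placeholder):

    ffmpeg -i input.mkv -frames:v 1 -f rawvideo -pix_fmt rgb24 frame.rgb
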
[15:05] <well0ne> I'll pay 10 € to anyone who does a 1-2 line patch in the latest code && compiles it for me (w32/w64)
[15:18] <viric> there is a web page for those offerings, right?
[15:19] <viric> bugfix-on-demand, with money offer
[15:21] <well0ne> zeranoe, where yre youuu
[15:21] <JEEB> http://wiki.videolan.org/Bounties <- this is the only bounty list I know of
[15:24] <pyBlob> Mavrik: I can specify a parameter -g for "GoP" .. but that breaks ffmpeg: "pipe:: Invalid data found when processing input"
[15:36] <well0ne> toolchain broke again :(
[15:36] <well0ne> no success
[16:35] <well0ne> i'll try cygwin now
[17:18] <well0ne> cygwin looks way better ....
[17:31] <jangle_> well0ne: what are you trying to patch
[17:40] <well0ne> crlf in http.c
[17:40] <well0ne> you cannot pass a crlf in the windows shell but it's required to use a custom http header
[17:55] <well0ne> problem solved
[17:55] <well0ne> but now i have to add the cygwin1.dll, is it possible to make it full-static?
[17:57] <teratorn> somehow i doubt if linking cygwin1 library statically would be any fun at all
[17:57] <well0ne> hm kay
[17:57] <teratorn> does it even build that way?
[17:57] <well0ne> yeah
[17:57] <teratorn> there is a cygwin1.lib ?
[17:57] <well0ne> working perfectly now
[17:57] <well0ne> added crlf
[17:58] <teratorn> oh you're just babbling.. :)
[17:59] <well0ne> wtf?!
[18:00] <well0ne> no i was speaking about a dll
[18:00] <well0ne> which is needed by the ffmpeg executable after building it with cygwin
[18:01] <jamesba> When using ffmpeg h264 decoder library with thread_type = FF_THREAD_SLICE and nThreads = 16, and incoming data bitrates in the 400Mb/s+ range I find that the decoder often experiences double frees and segfaults. It seems that there are frequent calls into av_buffer_unref with buffers that have already been unreffed or never properly assigned as (*buf)->buffer == NULL. The issues seem to begin with the decoder outputting  "decode_slice_header error".
[18:02] <jamesba> looking in the code the point where "decode_slice_header error" is printed doesn't seem to do any bailing out or cleanup of the context
[18:03] <jangle_> well0ne: just distribute the cygwin.dll with your application?
[18:03] <well0ne> hahahaha you're so funny
[18:03] <well0ne> i was asking if i can build it without needing the dll
[18:03] <jamesba> from my debugging it seems as though there may be a problem with h->cur_pic after that error
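
For context, a sketch of the decoder configuration jamesba describes (2013-era API; this is just the setup being reported, not a fix for the crash):

    #include <libavcodec/avcodec.h>

    static AVCodecContext *open_h264_slice_threaded(int nthreads)
    {
        AVCodec *dec = avcodec_find_decoder(AV_CODEC_ID_H264);
        AVCodecContext *ctx = avcodec_alloc_context3(dec);
        ctx->thread_type  = FF_THREAD_SLICE;   /* slice threading, not frame threading */
        ctx->thread_count = nthreads;          /* e.g. 16 */
        if (avcodec_open2(ctx, dec, NULL) < 0)
            return NULL;
        return ctx;
    }
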
[18:04] <jangle_> well0ne: whats funny about saving yourself the work?
[18:04] <well0ne> that was sarcastic
[18:05] <jangle_> ...
[18:05] <well0ne> anyway thanks ^^ problem is solved
[18:05] <jangle_> how did you solfe it
[18:05] <jangle_> solve*
[18:05] <well0ne> yeah ^^
[18:06] <jangle_> fantastic
[18:06] <schauer97> hi
[18:08] <schauer97> I want to make a livestream with ffmpeg. The problem is that i can't switch between input sources (Desktop, Webcam etc). Does someone have an idea?
[18:08] <schauer97> It must be live. I want stream it to my own server or to ffmpeg
[18:08] <schauer97> *twitch
[18:09] <schauer97> no one?
[18:09] <well0ne> why does it have to be ffmpeg?
[18:09] <schauer97> it can be something else too
[18:10] <well0ne> there are many desktop streaming programs
[18:10] <well0ne> but you can use ffmpeg to capture the webcam/desktop as well
[18:10] <well0ne> just read the docs
[18:11] <schauer97> I was not able to find something that works for me
[18:11] <well0ne> huh
[18:11] <well0ne> well sry
[18:11] <well0ne> i have to leave
[18:11] <well0ne> have an appointment
[18:11] <schauer97> Webcamstudio is not a solution
[18:11] <well0ne> have a nice day
[18:11] <schauer97> It is programmed in java --> it is very slow
[18:12] <schauer97> I want something like wirecast
[18:12] <schauer97> ciau
[18:19] <schauer97> Is someone  here
[18:20] <schauer97> ?
[18:20] <jamesba> yes, but I can't really help you
[18:20] <schauer97> okey :)
[18:20] <schauer97> thank and bye
[20:03] <hallagulla> i have some flac format songs but my mp3 player won't play them.. can i convert them to mp3/wma (these are the supported formats on my player) without losing anything / with the smallest loss?
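
A hedged example for hallagulla, assuming an ffmpeg build with libmp3lame: the conversion is lossy by definition, but -q:a 0 (or -b:a 320k) keeps the loss as small as MP3 allows:

    ffmpeg -i input.flac -c:a libmp3lame -q:a 0 output.mp3
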
[20:29] <Renich> hello
[20:29] <Renich> I need help with this: http://ur1.ca/fdmhg
[20:30] <Renich> it works, but the audio seems to be out of sync and the video ends before the audio does
[20:30] <Renich> how can I optimize that?
[21:47] <relaxed> relaxed: you can use -shortest to end after the shortest stream ends.
[21:47] <relaxed> -qscale 9 does nothing there
[21:48] <relaxed> oh, I messaged myself
[21:48] <relaxed> and he's gone
[21:49] <relaxed> for the record I'm still on my first cup of coffee
[21:50] <klaxa> noted
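
Renich's actual command isn't visible here, but for the record -shortest is an output option that stops writing when the shortest mapped stream ends, e.g. (filenames and codecs are placeholders):

    ffmpeg -i video.mp4 -i audio.wav -c:v copy -c:a libvo_aacenc -shortest output.mp4
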
[23:24] <chrisjunkie> Guys, getting a segfault when trying to access the last_dts value in the info struct of an AVStream
[23:25] <chrisjunkie> i.e printf("%lld", (long long) input_stream->info->last_dts)
[23:25] <chrisjunkie> Is this field not always set?
[23:26] <chrisjunkie> In fact, I can't read anything in the info field
[23:53] <chrisjunkie> In an InputStream there is a dts value, is there a value which refers to the same timestamp in an AVStream?
[23:56] <chrisjunkie> i.e cur_dts or reference_dts?
[23:56] <Mavrik> InputStream? wat?
[23:56] <Mavrik> dts is a property of a packet
[23:57] <Mavrik> everything else is internal bookkeeping you shouldn't rely on.
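
Mavrik's point in sketch form: AVStream->info and fields like cur_dts/reference_dts are libavformat's internal bookkeeping and may be unallocated; the supported way to look at decode timestamps is to read them off the packets themselves:

    #include <stdio.h>
    #include <inttypes.h>
    #include <libavformat/avformat.h>

    /* print the dts of every packet in an already-opened input (sketch) */
    void dump_dts(AVFormatContext *fmt_ctx)
    {
        AVPacket pkt;
        while (av_read_frame(fmt_ctx, &pkt) >= 0) {
            if (pkt.dts != AV_NOPTS_VALUE)
                printf("stream %d dts %"PRId64"\n", pkt.stream_index, pkt.dts);
            av_free_packet(&pkt);   /* 2013-era API */
        }
    }
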
[00:00] --- Thu Sep  5 2013

