[Ffmpeg-devel-irc] ffmpeg.log.20151118
burek
burek021 at gmail.com
Thu Nov 19 02:05:01 CET 2015
[00:25:40 CET] <kittyfoots> Hi, I'm trying to convert a frame of YUV4:2:0 data to a JPEG, except that the YUV data in question has a row stride greater than the width of the frame, and not the same between the luma and chroma planes. E.g. the image is 640x480 with a luma stride of 768 and chroma stride of 512. Is there a way of specifying the stride on the command line?
[00:39:50 CET] <reincore> Greetings everyone
[00:46:49 CET] <waressearcher2> reincore: hello
[00:47:22 CET] <reincore> I have a question about libav and ffmpeg. Sorry if the answer is obvious, I am kind of a newbie.
[00:47:44 CET] <waressearcher2> kittyfoots: how's it going, how's everything?
[00:47:59 CET] <kittyfoots> wut
[00:48:23 CET] <reincore> I just installed ffmpeg successfully on my ubuntu 14.10, but it seems I'm still missing "libavcodec". Does anyone have any ideas?
[00:48:43 CET] <c_14> Wasn't 15.04 the first with ffmpeg?
[00:49:11 CET] <reincore> I am not sure about 15.x, I had to manually add the ppa and download ffmpeg myself
[00:49:24 CET] <c_14> Which ppa?
[00:49:49 CET] <reincore> kirillshkrogalev/ffmpeg-next
[00:51:32 CET] <c_14> I don't know that repo. It's possible it's a static build.
[00:51:49 CET] <reincore> oh...
[00:52:09 CET] <reincore> what would you suggest I do? I need several packages from ffmpeg for another project that uses them
[00:52:27 CET] <c_14> Either compile ffmpeg yourself or update to 15.04+
[00:53:15 CET] <furq> you can also try installing just the ffmpeg packages from the 15.04 repos
[00:53:25 CET] <furq> but things have a tendency of going wrong when you mix repos too much
[00:53:44 CET] <reincore> in that case I guess I will try to compile ffmpeg myself...
[00:54:09 CET] <furq> i take it you have some compelling reason not to upgrade
[00:54:40 CET] <reincore> should I git clone the whole project? when I visit ffmpeg.org site, it forwards me here where I get lost: https://launchpad.net/~mc3man/+archive/ubuntu/trusty-media
[00:55:10 CET] <c_14> download the sources preferably https://ffmpeg.org/releases/ffmpeg-snapshot.tar.bz2
[00:55:23 CET] <reincore> @furq: oh yes... unfortunately I do for the time being :)
[00:55:42 CET] <c_14> Though depending on what you're trying to get to link against it, the 2.8.2 snapshot might be better
[00:57:25 CET] <reincore> you mean depending on which project I am using that links to ffmpeg libraries?
[00:57:35 CET] <c_14> yes
[00:57:42 CET] <c_14> Some of them haven't updated their api usage yet.
[00:57:48 CET] <reincore> it's essentia that I try to get running
[00:57:55 CET] <reincore> in case you are familiar with that..?
[00:58:21 CET] <c_14> No clue, try the 2.8.2 snapshot. You'll probably be safer with that.
[01:00:48 CET] <reincore> got it, will give it a shot. thanks :)
[01:02:20 CET] <reincore> being the newbie that I am: where can i find the right documentation for the 2.8.2 compilation?
[01:03:05 CET] <c_14> https://trac.ffmpeg.org/wiki/CompilationGuide/Generic or https://trac.ffmpeg.org/wiki/CompilationGuide/Ubuntu
[01:03:12 CET] <c_14> The second includes a lot of packages you probably won't need.
[01:03:29 CET] <c_14> It also does a static build which you don't want.
[01:03:44 CET] <reincore> thank you again!
[01:03:58 CET] <c_14> It's basically just ./configure --enable-shared --enable-<other things you need>
[01:04:02 CET] <c_14> then make and make install
[01:04:11 CET] <c_14> That'll install everything into /usr/local (unless you change the prefix)
[01:06:12 CET] <reincore> "./configure yasm/nasm not found or too old. Use --disable-yasm for a crippled build."
[01:06:36 CET] <c_14> install yasm
[01:06:53 CET] <reincore> :)
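For reference, the shared-library build c_14 describes boils down to roughly the following (a minimal sketch; the yasm package name and the trailing ldconfig step are assumptions for a stock Ubuntu box, and the source directory name depends on which tarball was unpacked):

    sudo apt-get install yasm        # assembler required by ./configure
    cd ffmpeg-2.8.2                  # or wherever the sources were extracted
    ./configure --enable-shared      # plus --enable-<whatever else is needed>
    make
    sudo make install                # installs into /usr/local by default
    sudo ldconfig                    # so the new libs in /usr/local/lib are found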
[01:46:11 CET] <Zeranoe> Is fontconfig needed for text and logo overlay?
[02:34:41 CET] <waressearcher2> Zeranoe: hallo
[02:38:15 CET] <Zeranoe> waressearcher2: You mean "Hello"?
[02:38:37 CET] <waressearcher2> Zeranoe: nein
[03:01:41 CET] <jakj> Is there actual API documentation anywhere? I have successfully compiled FFmpeg and related libraries, and I want to create a C/C++ library that uses FFmpeg to encode real-time-generated video and audio to a streaming site using RTMP.
[03:02:27 CET] <jakj> So basically, given a stream of video frames and audio samples, and an RTMP URI, I need to create a stream and upload it using FFmpeg from C/C++.
[03:04:33 CET] <llogan> jakj: see doc/examples and http://ffmpeg.org/doxygen/trunk/index.html
[03:05:26 CET] <jakj> Ah good. That really needs to be a more prominent link somewhere on the Documentation page of the main FFmpeg site.
[03:08:14 CET] <llogan> it is there. under "API Documentation"
[05:04:04 CET] <NetworkingPro> hello everyone
[05:05:08 CET] <NetworkingPro> I'm trying to re-broadcast a video stream with this: ffmpeg -rtsp_transport tcp -i rtsp://admin:password@10.31.90.115/Streaming/channels/101/ -vcodec copy -acodec copy -f rtp rtp://0.0.0.0:60000
[05:05:25 CET] <NetworkingPro> it seems to get the stream, but it isn't binding and rebroadcasting it.
[05:05:31 CET] <NetworkingPro> Anyone have any ideas what I'm missing?
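This one went unanswered in the channel. For context, ffmpeg's rtp:// output sends packets to the address you give it rather than listening on it, and the RTP muxer carries only a single stream per output, so a receiver also needs an SDP description. A hedged sketch of one way to restate the command (the multicast destination 239.0.0.1:60000 is a placeholder, and -an drops the audio since it would need its own RTP output):

    ffmpeg -rtsp_transport tcp -i rtsp://admin:password@10.31.90.115/Streaming/channels/101/ \
           -c:v copy -an -sdp_file stream.sdp -f rtp rtp://239.0.0.1:60000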
[05:07:48 CET] <waressearcher2> NetworkingPro: hello
[05:07:58 CET] <NetworkingPro> hey waressearcher2
[05:08:02 CET] <NetworkingPro> how are ya?
[05:08:21 CET] <waressearcher2> NetworkingPro: it's going alright
[05:08:24 CET] <waressearcher2> and yourself?
[05:40:07 CET] <NetworkingPro> Anyone around tonight?
[05:54:18 CET] <NetworkingPro> superdump: you around?
[07:13:54 CET] <satinder> Hi guys, I have two external devices: a camera at /dev/video0 and an external audio device at hw:TW68SoundCard,0,1
[07:14:25 CET] <satinder> when I play /dev/video0 directly, it plays fine
[07:15:00 CET] <satinder> audio also plays fine directly from hw:TW68SoundCard,0,1
[07:15:53 CET] <satinder> but when I make a compressed file, I mean capture video in h.264 and audio in mp3, that file plays back way too fast
[07:15:58 CET] <satinder> :(
[07:16:09 CET] <satinder> please, can anyone help me
[07:16:18 CET] <satinder> my command is the following:
[07:17:25 CET] <satinder> ffmpeg -i /dev/video0 -f alsa -ac 1 -i hw:TW68SoundCard,0,1 -vcodec libx264 -acodec mp2 -f mpegts video_audio.ts
[07:17:50 CET] <satinder> please, anyone, help
[07:19:45 CET] <relaxed> satinder: https://trac.ffmpeg.org/wiki/Capture/Webcam
[07:21:05 CET] <satinder> relaxed : ok thanx
[07:40:02 CET] <satinder> hi, that is not working
[07:40:26 CET] <satinder> my problem is not video capturing
[07:41:02 CET] <satinder> when I capture video alone in libx264 format, that works perfectly
[07:41:38 CET] <satinder> but when I capture both raw video and audio into an encoded file
[07:41:47 CET] <satinder> and then play that file
[07:41:58 CET] <satinder> it plays back very fast
[07:42:27 CET] <satinder> so if you have any idea or solution, please share
[07:42:31 CET] <satinder> :)
[08:26:44 CET] <satinder> Hi, can anyone please help?
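This also went unanswered here. One common cause of "plays back too fast" with v4l2 capture is that the capture frame rate was never declared, so a sketch along the lines of the Capture/Webcam wiki page relaxed linked (the 25 fps rate and 640x480 size are assumptions about the camera):

    ffmpeg -f v4l2 -framerate 25 -video_size 640x480 -i /dev/video0 \
           -f alsa -ac 1 -i hw:TW68SoundCard,0,1 \
           -c:v libx264 -c:a mp2 video_audio.ts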
[10:36:29 CET] <Fyr> guys, could you recommend FLAC encoding options to get the highest possible compression?
[10:37:29 CET] <Fyr> there are many options such as "-lpc_coeff_precision", "-lpc_passes", "-max_partition_order", "-prediction_order_method". I don't know what to set.
[10:37:49 CET] <Fyr> any change increases output size. =)
[10:38:05 CET] <furq> flac -8
[10:38:07 CET] <furq> hth
[10:40:29 CET] <Fyr> I usually set 9 =)
[10:41:12 CET] <furq> the numerical presets set the appropriate compression options for you
[10:41:20 CET] <Fyr> documentation shows its best brevity.
[10:41:22 CET] <furq> -8 already uses the most ridiculous ones
[10:41:51 CET] <furq> flaccl has >8 but apparently you might get a noncompliant result, so i've never tried it
[10:42:03 CET] <furq> also flaccl is much faster for compression levels >5
[10:42:32 CET] <Fyr> also flaccl can't do anything but 16-bit/44100 Hz.
[10:43:49 CET] <furq> i don't see how that's a negative
[10:44:02 CET] <Fyr> nor do I.
[10:44:21 CET] <Fyr> I'm just converting 24-bit/192 kHz right now.
[10:44:38 CET] <Fyr> flaccl's good at compression, and for when you don't want to tie up the CPU converting large files.
[10:44:55 CET] <Fyr> however, it's only for 16-bit/44.1.
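For the record, ffmpeg's native FLAC encoder exposes the same kind of preset as the flac tool's -0 through -8 switches; a minimal sketch:

    ffmpeg -i input.wav -c:a flac -compression_level 12 output.flac

-compression_level ranges from 0 to 12 (default 5) and picks the lpc/partition/prediction settings for you, much as furq describes for flac -8.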
[12:57:17 CET] <satinder> hi, that is not working
[12:57:38 CET] <satinder> (re-pastes the 07:13-07:42 question: capturing /dev/video0 plus hw:TW68SoundCard,0,1 to an encoded file, which then plays back too fast)
[12:57:50 CET] <satinder> ??
[13:22:28 CET] <AlexRussia> huh
[13:22:49 CET] <AlexRussia> Seems like falling back with libavx is the easiest way to prevent problems for now
[13:23:16 CET] <AlexRussia> Because apps like vlc can't run/compile with current master
[13:23:43 CET] <AlexRussia> but thank you anyway, the project is amazingly helpful to me :))
[13:24:16 CET] <AlexRussia> libvpx*
[14:42:26 CET] <beastwick987> Is it possible to output to a named pipe on Windows? The documentation clearly shows usage for Unix, but it is unclear about Windows. I have seen posts on the mailing lists that indicate yes.
[14:43:05 CET] <fritsch> can't you use shared memory instead?
[14:43:18 CET] <fritsch> (if you control both programs)
[14:43:47 CET] <beastwick987> If it is possible, how many processes on Windows can read from a named pipe at once? Is this memory intensive? I'd like to pipe video and audio sources to a pipe and then have one process for streaming and another for local recording. I want it this way so I can stop one without killing the other.
[14:44:04 CET] <beastwick987> How can I use shared memory?
[14:44:23 CET] <fritsch> an easy-to-use library is Apache's cross-platform APR
[14:44:33 CET] <fritsch> but might be too much for your usecase
[14:45:00 CET] <fritsch> https://trac.ffmpeg.org/ticket/986 <- did you see this?
[14:54:01 CET] <c_14> beastwick987: windows does not have named pipes
[14:55:10 CET] <furq> sure it does
[14:55:30 CET] <furq> you can't create them on the command line though
[14:55:34 CET] <c_14> afaik not in a unix fifo compatible way
[15:05:09 CET] <beastwick987> yes I saw that post
[15:05:15 CET] <beastwick987> sorry got distracted :D
[15:05:43 CET] <beastwick987> can I make a named pipe with PowerShell and then use it the same way?
[15:06:20 CET] <furq> try it and see
[15:06:28 CET] <furq> i'm not sure you'll be able to have multiple readers reading the same data though
[15:06:52 CET] <beastwick987> bummer, lastly, do you know if multiple processes can read from one pipe in linux?
[15:07:19 CET] <furq> that's what i'm basing my assumption that named pipes can't do it on
[15:07:26 CET] <furq> windows named pipes, rather
[15:07:30 CET] <fritsch> and that's why I said: shared memory
[15:07:37 CET] <furq> you can't do that with a fifo
[15:07:40 CET] <beastwick987> oh, unix pipes can't be shared :(
[15:08:08 CET] <furq> you can have multiple readers but they'll read whatever's at the top of the queue and then it's gone
[15:08:35 CET] <beastwick987> well, i'm assuming my PC could keep up with having multiple readers.
[15:08:39 CET] <furq> you could maybe hack something together with tee on *nix
[15:08:49 CET] <beastwick987> true
[15:10:33 CET] <sopparus> hello, is there any interest in AMD VCE encoding? is it possible that it will end up in ffmpeg in a year or two?
[15:10:39 CET] <furq> you could maybe run a local rtmp server which rebroadcasts and records
[15:10:40 CET] <sopparus> or should I just head for nvidia
[15:10:46 CET] <furq> nginx-rtmp can do that and you can control the recording separately
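On the ffmpeg side itself there is also the tee muxer, which duplicates one encode to several outputs, e.g. a local recording plus an RTMP push (a sketch with placeholder names; note the explicit -map, which tee requires, and that stopping ffmpeg stops both outputs, unlike the separate-process setups discussed above):

    ffmpeg -i input -map 0 -c:v libx264 -c:a aac \
           -f tee "[f=mp4]local.mp4|[f=flv]rtmp://example.com/live/streamkey"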
[15:30:27 CET] <beastwick987> I can create pipes in PowerShell, I just don't know how to pass a reference to a pipe object to FFMPEG via command line...
[15:30:54 CET] <beastwick987> I guess I would need its address, not sure if I can get that from the object.
[15:31:36 CET] <c_14> beastwick987: you could try writing to stdout and then using powershell to redirect from stdout to the object
[15:32:49 CET] <furq> beastwick987: \\.\pipe\mypipename
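For illustration, if another process has already created the pipe (e.g. a .NET/PowerShell NamedPipeServerStream called mypipename), ffmpeg can, as far as I'm aware, be pointed at that path like a regular output file; a hedged sketch, not verified here:

    ffmpeg -i input.mp4 -c copy -f mpegts \\.\pipe\mypipename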
[17:16:49 CET] <beastwick987> hey chocolatearmpits
[17:16:58 CET] <ChocolateArmpits> hi
[17:17:02 CET] <beastwick987> I was told to talk to you in the powershell irc chat
[17:17:17 CET] <beastwick987> I am trying to create windows pipes for use with ffmpeg, did you succeed?
[17:17:23 CET] <ChocolateArmpits> about what?
[17:18:47 CET] <ChocolateArmpits> If it's PS-related then let's probably carry this over there
[17:58:47 CET] <beastwick987> Since I can't get pipes working on Windows, is it possible to write to a file that gets overwritten every X amount of time?
[17:59:19 CET] <beastwick987> Like once it reaches a certain size or has run for a specific time, discard what has been recorded and start over?
[17:59:29 CET] <beastwick987> actually that might be a bad idea, nevermind.
[18:08:52 CET] <ChocolateArmpits> beastwick987: Well, you could split it into chunks where, let's say, 10 would exist at a time, and just delete the last one. Using the concat demuxer you can make a list with the filenames and go from there
[18:09:16 CET] <ChocolateArmpits> I'll look into named pipes, but can't guarantee anything currently
[18:10:03 CET] <beastwick987> chocolatearmpits no problem, I mean I could just use Linux, but then I'd have to test having multiple processes read from a single pipe.
[18:10:23 CET] <beastwick987> how fast that is, if it would even work, etc.
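One way to get that rolling-chunk behaviour from ffmpeg alone is the segment muxer with -segment_wrap, which cycles over a fixed set of files; a sketch assuming 10-second chunks and 10 files kept at a time, which the concat demuxer list mentioned above can then stitch back together:

    ffmpeg -i input -c copy -f segment -segment_time 10 -segment_wrap 10 chunk%03d.ts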
[19:27:14 CET] <Flerb> How might I convert an MJPEG stream to H.264, on-the-fly?
[19:29:39 CET] <llogan> ffmpeg -i input -c:v libx264 output
[19:30:48 CET] <Flerb> llogan: thanks
[19:32:31 CET] <Flerb> llogan: and the output would be in some webserver's folder such that when you access it from a browser, you see the live stream
[19:32:35 CET] <Flerb> would that work?
[19:34:20 CET] <DHE> save it to .mp4 and it might
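A plain .mp4 generally isn't playable until the write finishes (its index is written at the end), so for viewing while encoding a segmented format such as HLS is the usual choice; a sketch, where the MJPEG source URL and webroot path are placeholders:

    ffmpeg -i http://camera.example/stream.mjpeg -c:v libx264 \
           -f hls -hls_time 4 -hls_list_size 6 -hls_flags delete_segments /var/www/html/live.m3u8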
[20:32:50 CET] <menacer> Hi all, I have just managed to get a bat file to add my watermark to a video
[20:33:10 CET] <menacer> I also need to add my intro to the start of the video
[20:35:29 CET] <menacer> F:\ffmpeg\bin\ffmpeg.exe -i video.mp4 -i logo.png -filter_complex overlay=main_w-overlay_w-10:main_h-overlay_h-10 wvideo.mp4
[20:35:35 CET] <menacer> This is what I have so far
[20:36:00 CET] <menacer> Now as I understand it, ffmpeg can only run one sequence at a time
[20:38:33 CET] <menacer> ffmpeg -i "concat:intro.mp4|wvideo.mpg|complete.mpg" -q:v 0 -y concat_tmp.mpg
[20:38:50 CET] <menacer> But is there any way I can set this up to run together?
[20:39:28 CET] <menacer> I want it so that any file that lands in my designated folder will have the intro added and the watermark and then sent to my choice of foler once complete
[20:39:38 CET] <menacer> folder*
[20:48:09 CET] <llogan> menacer: use the concaft filter
[20:48:25 CET] <menacer> I shall look this up now
[20:48:55 CET] <llogan> sorry, "concat"
[20:49:21 CET] <llogan> also see https://trac.ffmpeg.org/wiki/Concatenate
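Both steps can also be done in one go with a single filtergraph: overlay the logo on the main video, then use the concat filter to put the intro in front. A sketch that assumes intro.mp4 and video.mp4 share the same resolution (which the concat filter requires) and that both have an audio track:

    ffmpeg -i intro.mp4 -i video.mp4 -i logo.png -filter_complex "[1:v][2:v]overlay=main_w-overlay_w-10:main_h-overlay_h-10[wm];[0:v][0:a][wm][1:a]concat=n=2:v=1:a=1[v][a]" -map "[v]" -map "[a]" out.mp4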
[20:50:22 CET] <menacer> Thanks very much
[20:53:50 CET] <menacer> Nope i'm getting lost here
[20:53:58 CET] <menacer> I'm very new to this
[21:01:43 CET] <shincode> what are the optimal MIPS flags to use with ffmpeg?
[21:04:01 CET] <voip_> hello guys
[21:07:15 CET] <waressearcher2> voip_: hello
[21:09:01 CET] <shincode> I'm getting a "-mhard-float with -msoft-float" warning from libc!!!! is it ok to just ignore this :O
[21:13:01 CET] <Mavrik> uh.
[21:13:10 CET] <Mavrik> I wouldn't agree :P
[21:15:50 CET] <shincode> correction, it's lib/crti.o
[21:15:55 CET] <shincode> im doing it
[21:16:20 CET] <shincode> i only have 400 mhz to work with
[21:30:47 CET] <menacer> anyone willing to code this for a small fee?
[21:31:27 CET] <fritsch> not for a small fee
[21:31:30 CET] <fritsch> lol
[21:31:36 CET] <menacer> roflmao
[21:31:39 CET] <menacer> >_<
[21:31:44 CET] <fritsch> "Can you do something for me - I will pay you little money!"
[21:31:49 CET] <fritsch> try again ...
[21:32:00 CET] <menacer> Well I don't think it's a massive task for someone with the knowledge
[21:32:21 CET] <fritsch> How much time do you need to get that knowledge?
[21:32:30 CET] <fritsch> and how much would an hour of this process cost?
[21:35:16 CET] <menacer> Ok... how much would someone charge to code this: auto-watermark any file that lands in a designated folder; this then gets sent to a second designated folder; from there the code automatically adds the intro to the file and then sends it to another designated folder
[21:37:52 CET] <fritsch> if you look at www.ffmpeg.org
[21:38:02 CET] <fritsch> you will find that there are some devs available to be hired
[21:38:32 CET] <fritsch> you should also discuss with them the way you "think" this task should work
[21:38:35 CET] <fritsch> :-)
[21:38:47 CET] <fritsch> cause this folder to folder to folder approach is most likely not what you want
[22:00:27 CET] <shincode> 500k
[22:06:52 CET] <menacer> Ok thanks fritsch, I have someone looking at it now but if no luck I shall take a look.
[22:07:18 CET] <menacer> Just thought going from one folder to another would stop any conflict
[22:23:00 CET] <fritsch> menacer: not really - file system access is not conflict free
[22:23:10 CET] <fritsch> and think of the time when one copies but is not "fully there yet" :-)
[22:23:11 CET] <fritsch> and so on
[22:23:31 CET] <fritsch> you need an expert :p
[22:33:27 CET] <daveande> I have a general video question that isn't specifically related to ffmpeg. I have a cordova mobile app that allows users to record and upload video. I'd like to give users the ability to edit the recorded videos. Edit ability should include cropping, selecting start and end time (trimming video). Any suggestions on how I should tackle that within a cordova app?
[22:33:53 CET] <daveande> If this isn't the right place for the question, please just let me know. Thank you!
[22:36:27 CET] <shincode> this place is the default place for all answers
[22:41:05 CET] <menacer> I don't think so shincode, I haven't found anything here other than links to things I have already seen
[22:45:15 CET] <mh_2007_1> Could anybody tell me if current ffmpeg supports rtmp streaming to Akamai CDN?
[23:05:22 CET] <c_14> Try it?
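For what it's worth, ffmpeg has long been able to push FLV over rtmp:// to standard RTMP ingest points; whether a particular Akamai entry point works also depends on its authentication setup. A generic sketch, where the entry point URL, application and stream name are placeholders:

    ffmpeg -re -i input.mp4 -c:v libx264 -c:a aac -f flv "rtmp://entrypoint.example.com/live/streamname"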
[00:00:00 CET] --- Thu Nov 19 2015