[Ffmpeg-devel-irc] ffmpeg.log.20131018
burek
burek021 at gmail.com
Sat Oct 19 02:05:01 CEST 2013
[01:57] <_8680_> Concatenating audio (.wav) files with `ffmpeg -f concat -i input-file-list -codec copy output-file` spews hundreds of warnings for me: <https://dpaste.de/CmV6>. Is ignoring these warnings safe? If not, what am I doing incorrectly?
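The concat demuxer expects a plain-text list like the one sketched below (file names are placeholders); if the warnings stem from mismatched stream parameters between the WAVs, dropping -codec copy and re-encoding the audio may silence them:

    # input-file-list
    file 'part1.wav'
    file 'part2.wav'

    ffmpeg -f concat -i input-file-list -c:a pcm_s16le output.wav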
[03:22] <SarBear> hi
[03:23] <SarBear> can ffmpeg be used to extract meta data from an mpeg-2 file? If not, do you know of any tools that can?
[05:23] <kriskropd> so, I tried to get ffmpeg to concat videos again, this time I used some mp4 videos from youtube and noticed I was having the same problem as before: when I concat, only the first video is placed into the output video file
[05:37] <kriskropd> oh, I got it :D for some reason, it won't work if I use the shell, but if I use a text file and list each file as "file '00.flv'" the ffmpeg concat works - this worked with the videos I was trying to concat earlier, if anyone was here earlier for my conundrum
[05:50] <kriskropd> ick, now I need to figure out why it's breaking between each video (I'm noticing some videos are 800x600 while others are 480x360) - what a pain
[05:51] <relaxed> kriskropd: use MP4Box to concat mp4s, MP4Box -cat 1.mp4 -cat 2.mp4 -new combined.mp4
[06:39] <kriskropd> relaxed: thanks for the suggestion - appears to have the same problem though
[06:44] <relaxed> kriskropd: well, if they have different resolutions it won't work
[06:46] <kriskropd> relaxed: yeah I gotta learn how to re-encode or transcode in my script it seems - anyway, I'm tired of using my free time on this today, I'm off to bed
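One way around mixed resolutions is to re-encode through the concat filter and scale every input to a common size first. A rough sketch, assuming two inputs, an 800x600 target and placeholder file names (older builds may need -strict experimental for the AAC encoder):

    ffmpeg -i 00.mp4 -i 01.mp4 \
      -filter_complex "[0:v]scale=800:600,setsar=1[v0];[1:v]scale=800:600,setsar=1[v1];[v0][0:a][v1][1:a]concat=n=2:v=1:a=1[v][a]" \
      -map "[v]" -map "[a]" -c:v libx264 -c:a aac -strict experimental combined.mp4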
[11:12] <guest35324165> hi
[11:14] <pjetr> is there a way to keep the quality of an input video and just transcode it to another filetype?
[11:15] <pjetr> I'm on a mac and just did: `ffmpeg -i in.mov out.avi` this encoded my 908M mov to a 1M avi, and the quality was horrible
[11:16] <spaam> you want to change the container ?
[11:16] <spaam> or change video codec?
[11:16] <spaam> or what?
[11:16] <spaam> maybe both?
[11:16] <spaam> i think the default value is 200kbps
[11:17] <spaam> and your .mov file is much more than that :)
[11:17] <pjetr> maybe both, I'm not completely certain.
[11:18] <pjetr> I exported the .mov from flash; when I want to convert a mov from flash to HTML video, I get nothing. (transcoded on an Ubuntu server)
[11:19] <pjetr> but if I convert it first using "Adobe Media Encoder" it converts without a problem. But it's kinda difficult to automate that as a process
[11:19] <spaam> html5-video need to be h264/aac in .mp4 :)
[11:20] <pjetr> and ogg and webm
[11:20] <spaam> yes :)
[11:21] <pjetr> but now I'm just tooling around to see if I can convert my source file to something that the server can encode
[11:21] <pjetr> So I wanted to convert it to a lossless AVI
[11:27] <pjetr> and it's safe to say that I don't really know what I'm doing, or what most of the options mean
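For reference, the terrible quality comes from ffmpeg's old default video bitrate; naming a codec and a quality target avoids it. A hedged sketch of an HTML5-friendly H.264/AAC MP4 plus a WebM alongside it (the CRF values are just reasonable starting points, and older builds may need -strict experimental for the AAC encoder):

    ffmpeg -i in.mov -c:v libx264 -crf 18 -preset slow -c:a aac -strict experimental -b:a 192k out.mp4
    ffmpeg -i in.mov -c:v libvpx -crf 10 -b:v 1M -c:a libvorbis out.webm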
[14:12] <luc4> Hi! Is there any guide on how to stream using http using the APIs? Not using ffserver. I had a quick look at ffserver and it seems pretty complex, maybe there is some tutorial or something else that can simplify the process?
[14:15] <Mavrik> well
[14:15] <Mavrik> ffserver is about as simple as it gets
[14:15] <Mavrik> what is your usecase exactly?
[14:15] <Mavrik> and which formats would you like to stream?
[14:16] <luc4> Mavrik: I'm not an expert in video streaming, I would like to stream mjpeg and h264 at the moment.
[14:17] <Mavrik> in what container to what?
[14:17] <Mavrik> what audio if any?
[14:17] <luc4> Mavrik: no guide or similar to get an idea before digging into ffserver code?
[14:17] <luc4> no audio, if possible, no container
[14:18] <Mavrik> there's no generic "stream everything" guide
[14:18] <luc4> simple mjpeg frames and h264 bytes
[14:18] <Mavrik> you MIGHT find one if you know what you want to achieve
[14:18] <Mavrik> you can't really stream raw H.264 without a container around
[14:18] <Mavrik> nothing's gonna play that
[14:18] <Mavrik> so again, what are you streaming to?
[14:19] <luc4> myself... another application
[14:19] <luc4> But I see your point, just didn't know. And do you also know for mjpeg?
[14:19] <luc4> Do I need a container for that as well?
[14:19] <Mavrik> mjpeg is just a set of JPEG images
[14:20] <luc4> exactly
[14:20] <Mavrik> mjpeg already is a container definition ;)
[14:20] <luc4> oh what tha... you're right :-)
[14:20] <luc4> sorry
[14:20] <Mavrik> yep, mjpeg is just a sequence of jpegs
[14:21] <Mavrik> there's no official standard for it
[14:21] <luc4> Of course, I'm just stupid :-) assuming I want to send the jpegs I encode through http, can I do that with ffmpeg?
[14:22] <Mavrik> luc4, yep
[14:23] <Mavrik> luc4, of course you'll have to decide which side will be the server
[14:23] <Mavrik> ffserver implements full HTTP server to serve data
[14:23] <Mavrik> if you want to *SEND* data
[14:23] <Mavrik> your target app will have to implement it
[14:24] <again_ffm> Mavrik, do you know if ffmpeg can repeat an input video?
[14:24] <Mavrik> there was a command but it was changed
[14:24] <Mavrik> i'm 100% sure it says so in the documentation if repeat is supported for a stream ;)
[14:25] <again_ffm> Mavrik, if I use loop_input, then it just repeats an image (img2video)
[14:25] <luc4> Mavrik: ah yes, the code gave me that idea... so I have no other option than doing the entire work myself? Any tutorial/guide/whatever that you can suggest? Just want to know this before deciding which way to go. And thanks for your help ;-)
[14:26] <Mavrik> luc4, well, I don't know how you imagined doing HTTP data transfer without having a HTTP server somewhere :P
[14:26] <Mavrik> luc4, are you sure you wouldn't rather buy a Wowza license and use that?
[14:26] <Mavrik> and then use ffmpeg to stream to Wowza?
[14:27] <Mavrik> or one of the other streaming servers?
[14:27] <luc4> Mavrik: I was not imagining a solution without an http server, but possibly that ffmpeg was implementing a minimal one itself or maybe some other lib "wrapping" this... just wanted to ask before doing much useless work :-)
[14:28] <Mavrik> luc4, mhm
[14:29] <Mavrik> luc4, well... as I said, I suggest you find a good streaming server instead of rolling your own
[14:29] <Mavrik> if you can't use ffmpeg/ffserver combo as they are now
[14:29] <Mavrik> luc4, and then use your lib to create a MPEG2-TS video or something like that and UDP stream it to the streaming server
[14:29] <Mavrik> if your target is streaming to multiple clients
[14:29] <Mavrik> if you just want to transport video from one point to another
[14:30] <again_ffm> Mavrik, and last question: can ffmpeg join a repeated image and a video? ffmpeg -i 1.mp4 -loop 1 -i Untitled.png -t 10 -filter_complex "[0] [1] concat=n=2:v=1:a=1 [v] [a]" -map "[v]" -map "[a]" 3.mp4 - it says: Stream specifier '' in filtergraph description [0] [1] concat=n=2:v=1:a=1 [v] [a] matches no streams.
[14:30] <Mavrik> forget about HTTP and just use UDP
[14:31] <Mavrik> again_ffm, that looks like a syntax error in your filter spec
[14:33] <luc4> Mavrik: maybe I could use a pipe or something like this to provide data to ffserver?
[14:34] <again_ffm> Mavrik, maybe, but ffmpeg -i 1.mp4 -i 1.mp4 -filter_complex "concat=n=2:v=1:a=1" 3.mp4 works fine
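The earlier failure is most likely because the looped PNG contributes no audio stream, while concat=n=2:v=1:a=1 expects one audio stream per segment. A hedged workaround is to trim the looped image to a fixed duration and synthesize matching silence with aevalsrc (the duration is illustrative, the silence's sample rate and layout should match 1.mp4's audio, and if the image's resolution differs from the video's a scale step is needed as well):

    ffmpeg -i 1.mp4 -loop 1 -i Untitled.png \
      -filter_complex "[1:v]trim=duration=10[img];aevalsrc=0:d=10:s=44100:c=stereo[sil];[0:v][0:a][img][sil]concat=n=2:v=1:a=1[v][a]" \
      -map "[v]" -map "[a]" 3.mp4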
[14:34] <luc4> Mavrik: this is only for a local network, internet access is not supposed to be needed.
[14:35] <Mavrik> luc4, look, you still haven't described your use case
[14:35] <Mavrik> WHO IS THE TARGET?
[14:35] <Mavrik> a single app? a browser? 15 people?
[14:36] <Mavrik> are you transporting stream over internet or over a cable from one machine to another?
[14:36] <Mavrik> how fast is your transport? do you need error correction?
[14:36] <Mavrik> there's tens of ways of doing that
[14:36] <Mavrik> and unless you know what you want to do we can't really help you :)
[14:39] <luc4> Mavrik: the stream coming from a camera will be transferred over a wifi link (Internet access is not necessary) to some clients (the number is not known but limited to 1-5 devices). The client is already working. I need to implement the server. In my application I already use ffmpeg for other things, so it would be nice to use it again for this. Target platform is Android. http is preferable, to be able to see it in a browser, but other ways, if simpler, might be good as well.
[14:40] <Mavrik> ah, cool
[14:40] <Mavrik> :)
[14:40] <Mavrik> luc4, if you want live streaming you'll have to use HLS, not pure HTTP for Androids
[14:41] <luc4> "The client is already working": I mean I aready implemented it and wroks correctly. It is also using ffmpeg and it was really satisfying so re-using ffmpeg libs might be a good idea...
[14:41] <luc4> Mavrik: yes, I was looking at HLS indeed.
[14:41] <Mavrik> I think ffserver doesn't support HLS
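If ffserver is ruled out, newer ffmpeg builds include an hls muxer, so the ffmpeg tool itself can cut a live input into HLS segments and any plain web server (nginx, Apache) can serve them to the browsers; a rough sketch with the camera input and paths as placeholders (older builds may need -strict experimental for the AAC encoder):

    ffmpeg -i <camera-input> -c:v libx264 -preset veryfast -c:a aac -strict experimental \
      -f hls -hls_time 4 -hls_list_size 6 /var/www/live/stream.m3u8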
[14:43] <luc4> Mavrik: what is not clear to me... maybe you know it... is HLS the only way to transfer a video stream using http?
[14:43] <luc4> Mavrik: I'm pretty ignorant, sorry.
[14:43] <Mavrik> luc4, HLS is pretty much the only way to transfer _LIVE_ (e.g. not stored in a prebuilt file) video to be watched in browser on mobile devices
[14:44] <Mavrik> also, HTTP is really shit for video transport
[14:44] <Mavrik> so unless you have a requirement of using a browser
[14:44] <Mavrik> there's tons of better transports for video than HTTP
[14:44] <luc4> Mavrik: thanks for the confirmation. Then ffserver is out of the question :-) can I still use ffmpeg? Maybe rtp?
[14:44] <Mavrik> if you DO want to use a browser, HLS is practically your only choice for mobile
[14:45] <Mavrik> and flash for desktop
[14:45] <luc4> preferable...
[14:46] <luc4> rtp is the other choice right?
[14:46] <Mavrik> um
[14:46] <Mavrik> not really
[14:46] <Mavrik> or, rather I'll say: RTP is one of the worst things you can do to yourself
[14:46] <Mavrik> besides jumping into a pool of razors
[14:46] <Mavrik> UDP/MPEG2-TS, RTMP, RTSP are all preferable to dealing with RTP
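For a point-to-point hop on a LAN, the UDP/MPEG2-TS route is roughly this on the sending side (the address, port and encoder settings are placeholders):

    ffmpeg -i <camera-input> -c:v libx264 -preset ultrafast -tune zerolatency -an \
      -f mpegts udp://192.168.1.10:1234

and the receiving side can be as simple as:

    ffplay udp://0.0.0.0:1234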
[14:47] <luc4> I see... and I was also reading this: http://sirlagz.net/2012/08/04/how-to-stream-a-webcam-from-the-raspberry-pi/
[14:49] <luc4> This seems pretty interesting, but confusing... that is not hls... so what is it?
[14:50] <Mavrik> Format mjpeg
[14:51] <luc4> I've been doing some open source with the rasp myself, always using ffmpeg, so I could test on my own pi... but if it is not hls, what is it?
[14:52] <Mavrik> it's just sending jpegs over an open socket
[14:52] <luc4> Not http?
[14:53] <Mavrik> test it
[14:53] <Mavrik> does that even work in a browser?
[14:55] <luc4> Mavrik: I'll try now :-) the guide says it should work on a browser... but I'll test it :-)
[14:58] <Mavrik> luc4, yeah, maybe it even works
[14:58] <Mavrik> probably browser keeps HTTP connection open and just redraws the frame
[14:58] <luc4> Mavrik: I'm going to tell you :-)
[15:01] <luc4> Mavrik: nope... [http @ 0x8ac3f20] HTTP error 404 Not Found. Maybe my version of ffmpeg is too old...
[15:01] <luc4> Mavrik: I'm testing on Ubuntu right now anyway...
[15:13] <luc4> Mavrik: maybe something recently implemented?
[15:13] <luc4> Mavrik: http://ffmpeg.gusari.org/viewtopic.php?f=12&t=914
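For reference, guides of that sort usually run ffserver with a feed and a stream defined in ffserver.conf and push the webcam into the feed URL with a line like the one below (device, host, port, size and feed name are whatever the config declares); a 404 from ffserver usually means the requested feed or stream name does not match the config:

    ffmpeg -f video4linux2 -i /dev/video0 -s 640x480 -r 15 http://localhost:8090/feed1.ffm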
[15:14] <pyBlob> I'm running this http://pastebin.com/aYVBMAFn command to capture from my webcam to a file and at the same time displaying it using ffplay
[15:15] <pyBlob> that works ... the only problem is that the video in the file plays back too fast and looks crippled when viewing it
[15:15] <pyBlob> any ideas?
[15:20] <pyBlob> oh ... I think I know the problem
[15:25] <pyBlob> the FILE<&1 didn't work as expected
[15:29] <sacarasc> Try using tee, pyBlob.
[15:29] <pyBlob> I had problems with tee ... so I wanted to try something new, which failed too xD
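One way to record and preview simultaneously, along the lines of the tee suggestion, is to write a pipe-friendly container to stdout and split it with the shell's tee (device, codec and file name are placeholders); newer builds also have a built-in tee muxer (-f tee) that avoids the shell pipe:

    ffmpeg -f video4linux2 -i /dev/video0 -c:v libx264 -preset ultrafast -f nut pipe:1 \
      | tee capture.nut | ffplay -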
[19:24] <smjd> how do I mute audio in ffplay?
[19:33] <olspookishmagus> smjd: if you're not referring to when it's already playing, you can use the -an switch
[19:33] <smjd> I managed to do it with pavucontrol
[23:45] <plm> HI all
[23:46] <plm> people, where can I find sample videos? I would like 320x240 samples, any format, .ogg .mp4
[23:57] <saste> plm, why don't you create them yourself?
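ffmpeg can synthesize such clips itself with the lavfi testsrc source, so a 320x240 sample is one command away (duration and output name are arbitrary):

    ffmpeg -f lavfi -i testsrc=size=320x240:rate=25 -t 10 -pix_fmt yuv420p sample.mp4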
[00:00] --- Sat Oct 19 2013