[Ffmpeg-devel-irc] ffmpeg.log.20140405

burek burek021 at gmail.com
Sun Apr 6 02:05:01 CEST 2014


[03:32] <t4nk697> what's that?
[03:33] <t4nk697> has ffmpeg been developed in Java?
[03:34] <sacarasc> No.
[11:19] <superware> can I use the same packet from av_read_frame for two outputs? say avcodec_decode_video2 for display, and av_write_frame for file dumping?
[11:32] <superware> is the right architecture, when I have one input and two outputs, to reuse the same AVPacket (from av_read_frame)?
[11:46] <Mavrik> check documentation about ownership transfer when calling methods with that packet
[11:46] <Mavrik> if the called function takes ownership, you have to copy it
[11:54] <superware> Mavrik: well, I've checked avcodec_decode_video2 and av_write_frame documentation but couldn't find a reference to that..
[11:55] <superware> the avpkt parameter is defined as [in]
[11:56] <Mavrik> hmm
[11:56] <Mavrik> documentation fail then
[11:56] <Mavrik> it probably doesn't take ownership, but you SHOULD check that
[11:56] <superware> how? :|
[11:56] <Mavrik> you have full source
[11:57] <Mavrik> check the method you're calling and the decoder
[11:57] <Mavrik> it's not like there's a lot of code.
[11:57] <superware> might be too complex
[11:57] <Mavrik> dude
[11:57] <Mavrik> it takes less time than you've sat here waiting for an answer ;)
[11:57] <superware> heh
[11:58] <superware> I looked at the code; it's not that simple to be sure nothing is being changed
[12:00] <superware> http://www.ffmpeg.org/doxygen/trunk/libavcodec_2utils_8c_source.html
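A minimal sketch of that one-input/two-output loop, assuming the 2014-era API discussed above (avcodec_decode_video2, av_write_frame, av_free_packet); the contexts are assumed to be opened elsewhere, and error handling and timestamp rescaling are omitted:

    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    /* in_fmt_ctx / out_fmt_ctx are already opened, dec_ctx is the opened
     * decoder, frame is an allocated AVFrame */
    static void decode_and_dump(AVFormatContext *in_fmt_ctx,
                                AVFormatContext *out_fmt_ctx,
                                AVCodecContext *dec_ctx,
                                AVFrame *frame, int video_stream_index)
    {
        AVPacket pkt;
        int got_frame;

        while (av_read_frame(in_fmt_ctx, &pkt) >= 0) {
            if (pkt.stream_index == video_stream_index) {
                /* avcodec_decode_video2() only reads the packet, it does not
                 * take ownership, so the same packet can be reused afterwards */
                avcodec_decode_video2(dec_ctx, frame, &got_frame, &pkt);

                /* av_write_frame() also leaves ownership with the caller;
                 * av_interleaved_write_frame() is the variant that consumes it */
                av_write_frame(out_fmt_ctx, &pkt);
            }
            av_free_packet(&pkt); /* the caller still owns the packet */
        }
    }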
[15:05] <nell> I'm at a hackathon right now, we need to know what to embed within HTML5 and what the config looks like for that -- a server config for an HTML5-capable format
[15:05] <nell> we've been tackling this problem for many hours already and we're drained >_<
[15:13] <Mavrik> nell, what's a "html5 capable format"
[15:13] <Mavrik> what are you trying to do exactly?
[15:14] <nell> We're trying to stream webcams to the browser
[15:31] <nell> anyone?
[15:31] <nell> I'll pay someone doge or bitcoin
[15:32] <nell> for their help in solving this
[15:32] <klaxa> maybe you can use HLS
[15:34] <klaxa> i don't know to what extent webm supports live streaming, but that might be worth looking into as well
[15:35] <klaxa> mp4 is somewhat live-streamable too
[15:35] <klaxa> ogv might be too
[15:36] <klaxa> i think those are all your options if you are going for html5 compatibility
[15:36] <klaxa> nell
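One way to try klaxa's HLS suggestion, assuming a v4l2 webcam at /dev/video0 and a web-served output directory (both hypothetical):

    ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -preset veryfast -tune zerolatency \
           -f hls -hls_time 4 -hls_list_size 5 /var/www/html/live/out.m3u8

The resulting out.m3u8 can then be referenced from an HTML5 <video> tag; Safari and some mobile browsers play HLS natively, while other browsers need a JavaScript HLS player.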
[15:37] <BtbN> What was used before av_frame_free was introduced? Trying to add compatibility for older ffmpeg versions to an existing application.
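For BtbN, one possible compatibility shim, assuming the usual predecessors (avcodec_free_frame, and plain av_freep before that existed); the version thresholds below are approximate and should be checked against doc/APIchanges:

    #include <libavcodec/avcodec.h>

    static void free_frame_compat(AVFrame **frame)
    {
    #if LIBAVCODEC_VERSION_INT >= AV_VERSION_INT(55, 28, 1)
        av_frame_free(frame);        /* current API, AVFrame lives in libavutil */
    #elif LIBAVCODEC_VERSION_INT >= AV_VERSION_INT(54, 28, 0)
        avcodec_free_frame(frame);   /* short-lived predecessor */
    #else
        av_freep(frame);             /* oldest releases: just free the struct */
    #endif
    }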
[15:38] <nell> thanks klaxa, I'll explore those
[17:26] <Moonlightning> How do I add chapter markers to an existing video file?
[17:27] <JEEB> you generally can't without a remux; for matroska you can create and mux chapters easily with, say, mkvmerge(gui) [part of mkvtoolnix]
[17:27] <JEEB> for mp4 you'll have to use either GPAC or L-SMASH's tools for it, but chapters aren't an official thing in mp4 so YMMV in which things will support those
[17:28] <Moonlightning> Wait, really?
[17:28] <Moonlightning> I swear I had an .mp4 video with chapters in it once
[17:29] <JEEB> yes, but both ways of putting those chapters there are not in the specification
[17:29] <JEEB> and one of them is now actually banned in the specs
[17:29] <JEEB> (the MOV way)
[17:29] <JEEB> the two ways are basically "the Nero way" and "the MOV way"
[17:30] <JEEB> aka something that Nero made up, and something that was used in MOV (which is what the MPEG-4 container mostly bases upon)
[17:40] <Moonlightning> wait, does ffmpeg not do chapter marking at all?
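To Moonlightning's last question: ffmpeg can write chapters during a remux by feeding it an FFMETADATA file and mapping it, for example (timestamps and titles below are made up):

    ;FFMETADATA1
    [CHAPTER]
    TIMEBASE=1/1000
    START=0
    END=60000
    title=Intro

    [CHAPTER]
    TIMEBASE=1/1000
    START=60000
    END=300000
    title=Main part

    ffmpeg -i in.mkv -i chapters.txt -map_metadata 1 -map_chapters 1 -codec copy out.mkv

For mp4 output the caveats JEEB mentions still apply, since the muxer has to fall back to one of those unofficial chapter schemes.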
[17:59] <alusion> how do I download the ogv codec?
[18:10] <Guest11536> I'm posting this again... I really need your help. I don't think it's difficult for you to figure out...
[18:10] <Guest11536> Hey guys,
[18:10] <Guest11536> I'm building an android application, and I need to use some ffmpeg commands.
[18:10] <Guest11536> I've got a source video and I need to edit it:
[18:10] <Guest11536> 1) Cut only the last 7 seconds.
[18:10] <Guest11536> 2) Add a watermark(logo) on the top left side of the video.(00:00-00:07)
[18:10] <Guest11536> 3) Add a video inside the source video on the bottom right side of the video.(00:00-00:07)
[18:10] <Guest11536> 4) Add another video to the edited video from steps 1-3 (00:07-00:10)
[18:10] <Guest11536> Can I do it in one ffmpeg command line?
[18:10] <Guest11536> What do I need to consider when I'm editing a video for a cellphone (quality,sizing..)?
[18:10] <Guest11536> Thanks a lot!
[18:26] <Bombo> i'm trying to recode a 'Video: mpeg4 (Simple Profile), yuv420p' to theora with '-i foo.mp4 -c:v libtheora -b:v 1000k -c:a libvorbis -b:a 128k' but i get '[mp4 @ 00000000026dd9e0] track 0: could not find tag, codec not currently supported in container'
[18:28] <Bombo> oh
[18:28] <Bombo> theora in mp4.
[18:28] <Bombo> i'm dumb.
[18:28] <Bombo> sorry ;)
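Presumably the fix is just an Ogg container for the same options, something like:

    ffmpeg -i foo.mp4 -c:v libtheora -b:v 1000k -c:a libvorbis -b:a 128k foo.ogv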
[18:30] <Bombo> wow 19fps
[18:33] <Bombo> are there different profiles like for x264 superfast? is encoding theora generally slow?
[18:52] <thebombzen> Bombo: x264 is particularly good at letting you configure how you want it to encode
[18:52] <thebombzen> most other encoders like libtheora only allow you to set the bitrate because the algorithm isn't advanced enough
[18:53] <thebombzen> to determine speed/quality at the same bitrate
[18:53] <thebombzen> Guest11536: you can do all that in several command lines
[18:54] <thebombzen> Guest11536: but you might need to probe it first to find out the duration so you know how far to skip
[19:34] <name_space> howdy guys. So I've configured ffserver for 4 feeds. When we execute ffmpeg -f mpeg /dev/video0 http://0.0.0.0:8090/feed1.ffm it runs until it crashes and I can't get a feed in the browser.
[19:35] <name_space> is there a format I can just embed into the browser? Am I configuring things right?
[19:38] <Guest11536> thebombzen: I'm new to FFmpeg. Would you write me the command so I could implement it?
[19:39] <name_space> ?
[19:54] <Bombo> thebombzen: hmmkay
[19:55] <Bombo> Guest11536: for watermarking i found http://stackoverflow.com/questions/10918907/how-to-add-transparent-watermark-in-center-of-a-video-with-ffmpeg
[19:56] <Bombo> Guest11536: not sure if that works with videos
[19:57] <alusion> hey
[19:57] <alusion> i need information on stream format & ffserver config file
[19:58] <Bombo> Guest11536: when you do 'ffmpeg -i inputfile' you get the duration; take that and calculate where the last seconds start, then use -ss to skip to that timestamp and -t for the duration (or use ffprobe)
[20:06] <Bombo> Guest11536: also: you need a nick ;)
[20:13] <xenphor> hi, I'm using the looping command found in the concatenating section of the help file, except I want to alter it slightly. Instead of playing each file once and then looping, I would like it to finish a complete loop of one file and then move on to the next one. This is the command I was using: http://pastebin.com/nTmAw9Hz
[20:19] <xenphor> actually sorry, that's the exact command from the help; I had altered it a little: http://pastebin.com/XMEEnKpH
[20:48] <llogan> xenphor: i don't understand. can you explain in more detail?
[20:48] <llogan> alusion: did you see https://trac.ffmpeg.org/wiki/Streaming%20media%20with%20ffserver
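For alusion and name_space, a minimal ffserver.conf in the spirit of that wiki page might look like the sketch below (port, sizes and codec choices are placeholders), fed with something like 'ffmpeg -f v4l2 -i /dev/video0 http://localhost:8090/feed1.ffm' rather than the '-f mpeg' invocation quoted earlier:

    HTTPPort 8090
    HTTPBindAddress 0.0.0.0
    MaxHTTPConnections 200
    MaxClients 100
    MaxBandwidth 10000

    <Feed feed1.ffm>
      File /tmp/feed1.ffm
      FileMaxSize 50M
    </Feed>

    <Stream cam1.webm>
      Feed feed1.ffm
      Format webm
      VideoCodec libvpx
      VideoSize 640x480
      VideoFrameRate 25
      VideoBitRate 512
      NoAudio
    </Stream>

The stream would then be reachable at http://<host>:8090/cam1.webm and can be pointed at from an HTML5 <video> tag.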
[20:51] <xenphor> ok maybe this will help: http://pastebin.com/czpezVZ8
[20:53] <llogan> xenphor: how many files are you working with?
[20:53] <xenphor> well, it will probably eventually be around a couple hundred or more
[20:54] <xenphor> they're all short 5 second clips
[20:54] <xenphor> so i want them to loop a couple times each
[20:54] <xenphor> before merging them together
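If that reading is right, one way to get "each clip looped a few times, in order" with the concat demuxer is simply to repeat each entry in the list file (filenames are hypothetical):

    # list.txt -- lines starting with '#' are ignored by the concat demuxer
    file 'clip01.mp4'
    file 'clip01.mp4'
    file 'clip01.mp4'
    file 'clip02.mp4'
    file 'clip02.mp4'
    file 'clip02.mp4'

    ffmpeg -f concat -i list.txt -c copy out.mp4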
[21:29] <Guest11536> Bombo: thanks a lot, I managed to run the commands one after another, but how can I combine them together? Is it possible at all?
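A single-command sketch for Guest11536's four steps, assuming hypothetical inputs main.mp4, logo.png, pip.mp4 and tail.mp4, with matching resolutions and pixel formats for the concat filter and audio ignored for brevity:

    # start of the "last 7 seconds", computed as Bombo described
    DUR=$(ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 main.mp4)
    START=$(echo "$DUR - 7" | bc)

    ffmpeg -ss "$START" -i main.mp4 -i logo.png -i pip.mp4 -i tail.mp4 \
      -filter_complex "[0:v][1:v]overlay=10:10[wm];[wm][2:v]overlay=main_w-overlay_w-10:main_h-overlay_h-10[part1];[part1][3:v]concat=n=2:v=1:a=0[v]" \
      -map "[v]" -c:v libx264 out.mp4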
[22:50] <Dumpy> Hi
[00:00] --- Sun Apr  6 2014

