[Ffmpeg-devel-irc] ffmpeg.log.20130617

burek burek021 at gmail.com
Tue Jun 18 02:05:02 CEST 2013


[00:35] <ZenGuy311> i tried using avimux crop preview filter tool but the dimensions didn't match up when i entered it into ffmpeg
[00:35] <ZenGuy311> how do i preview a video i want to crop to get the crop dimensions?
[00:47] <evdw87> ZenGuy311: I dunno exactly what you mean, but I guess you just need to make your dimensions a multiple of 8
[00:48] <ZenGuy311> i'll try to explain better
[00:50] <ZenGuy311> i recorded my desktop, the whole desktop by accident.. i want to crop the video to the position i originally wanted to record.. i did manage to crop it using avimux but the resulting output video was stuttering during playback...
[00:51] <ZenGuy311> i tried to crop with ffmpeg using the same dimensions i used in avimux but that didn't work, it cropped the wrong section of the video
[00:51] <evdw87> ZenGuy311: did you use compression in avimux to export?
[00:51] <ZenGuy311> i think so
[00:52] <ZenGuy311> yes
[00:52] <ZenGuy311> how would a compression setting affect the video position?
[00:53] <evdw87> ZenGuy311: not, but you were talking about stuttering
[00:53] <evdw87> this could occur with raw avi files
[00:54] <ZenGuy311> it was an avi file that was originally an ogv file
[00:54] <evdw87> hrrmm, I know less about ogv
[00:55] <evdw87> I'm also not that handy with ffmpeg to be honest
[00:55] <evdw87> Others could help you out with ffmpeg
[00:55] <evdw87> personally I would use virtualdub
[00:56] <ZenGuy311> virtualdub is on linux?
[00:57] <ZenGuy311> but how are people cropping video with just dimensions alone? i don't know what i am cropping
[00:57] <evdw87> ZenGuy311: oh damn, I thought avimux was windows
[00:58] <ZenGuy311> it's cross platform
[00:58] <ZenGuy311> i think
[01:05] <evdw87> ZenGuy311: I might know what your error is, I could be wrong..
[01:05] <ZenGuy311> i'll try anything
[01:05] <ZenGuy311> let me get on my desktop and paste my command
[01:06] <evdw87> sure
[01:08] <zenguy_pc> ffmpeg -i videoa.avi -vb 400k -f flv -vf crop=484:364 -r 15 ~/output-video.flv
[01:09] <zenguy_pc> the cropbottom, croptop, etc. switches were removed but it looked like they would have worked.
[01:10] <zenguy_pc> i don't really understand the width:height dimensions since in my mind that could be any position in a video.. does it work like a graph?
[01:12] <evdw87> ZenGuy311: looking at the manual, as you defined it, it would crop from the center with the given dimensions
[01:12] <ZenGuy311> is there a graph overlay tool i can look at to calculate the dimensions?
[01:13] <evdw87> ZenGuy311: grab a screen and use something similar to photoshop
[01:14] <ZenGuy311> ok i have gimp
[01:15] <evdw87> If you know the x and y start value and the output resolution, you're set
[01:21] <evdw87> first example will help you out: http://www.ffmpeg.org/ffmpeg-filters.html#Examples-27
[01:22] <ZenGuy311> simple IVTC of a top field first...
[01:23] <ZenGuy311> loaded it in opera and it showed the very top portion of the page
[01:26] <evdw87> ZenGuy311: go to section 9.13.1 Examples
[01:27] <ZenGuy311> ah, i see
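For the record, the crop filter also accepts x:y offsets for the top-left corner of the region; without them it crops from the center, which is why zenguy_pc's command grabbed the wrong part of the frame. A sketch of the same command with explicit offsets (the 100/50 offsets and the filenames are placeholders):

```shell
# Crop a 484x364 region whose top-left corner sits at x=100, y=50.
# crop=w:h alone (no x:y) takes the region from the center of the frame.
ffmpeg -i videoa.avi -vf "crop=484:364:100:50" -b:v 400k -r 15 output.flv
```

GIMP's rectangle-select tool reports exactly these four numbers: the selection's width, height, and its x/y position.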
[06:08] <SirCmpwn> is it possible to seek the subtitles independently of the video?
[06:26] <zap0> you don't seek video;  you seek data blocks.
[06:28] <SirCmpwn> which is supremely annoying and makes no sense to someone unfamiliar with the technical details of video encoding
[06:29] <SirCmpwn> my real question is still this, by the way, if anyone wants to take another crack at it: http://superuser.com/questions/608192/
[06:30] <zap0> you have to learn what video is before you can do anything with its sub components.  this is true of every field.
[06:32] <SirCmpwn> I just want a script that pulls a frame every minute, I wouldn't think that'd require intimate knowledge of video encoding
[06:33] <zap0> just because you know a bit of terminology doesn't mean you are making a coherent sentence.   frames only exist because of video.
[06:33] <zap0> video is data constructed into pixels over time.
[06:33] <SirCmpwn> assume I'm using the layman's definition of these terms
[06:34] <SirCmpwn> by "frame" I mean something I'd see when I pause the video
[06:35] <zap0> you keep saying "the video"  is there only 1?
[06:36] <SirCmpwn> no
[06:36] <SirCmpwn> I'm trying to write a script that I can give a video file to and then get one "frame" every minute from
[06:36] <SirCmpwn> starting from a certain time (such as 00:00:01 to get frames 00:00:01, 00:01:01, 00:02:01, etc)
[06:37] <SirCmpwn> oh, with subtitles rendered into the "frames"
[06:40] <SirCmpwn> I've started trying to run ffmpeg several times and pipe them together, I may be getting somewhere
[06:40] <SirCmpwn> takes a hell of a lot longer to run, though
[06:44] <SirCmpwn> is there an expensive, uncompressed encoding I can tell ffmpeg to use to reduce time?
[06:46] <sacarasc> SirCmpwn: Try looking at -vf framestep.
[06:46] <SirCmpwn> is that a shorthand for select=...?
[06:47] <SirCmpwn> because I was using select earlier and ended up getting a hundred identical screenshots per frame
[06:47] <SirCmpwn> also, seeking through subtitles appears to not really work at all
[07:04] <SirCmpwn> [rawvideo @ 0x1a888e0] Could not find codec parameters for stream 0 (Video: rawvideo, yuv420p, -4 kb/s): unspecified size
[07:04] <SirCmpwn> is there a way to tell ffmpeg to include the size when using rawvideo?
[07:05] <SirCmpwn> or is there another encoding that I can use that's quick and lossless
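To answer the rawvideo error: raw video has no container header, so the demuxer has to be told the geometry on the command line; alternatively, a lossless codec such as ffv1 or huffyuv carries its own parameters and avoids the problem. A sketch (sizes and filenames are placeholders):

```shell
# Reading raw video: the demuxer must be told pixel format, size, and rate,
# since none of that is stored in the stream itself.
ffmpeg -f rawvideo -pixel_format yuv420p -video_size 1280x720 -framerate 25 \
  -i input.yuv out.mkv

# Or skip rawvideo entirely and use a fast lossless codec as the intermediate:
ffmpeg -i input.mkv -c:v ffv1 intermediate.mkv
```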
[07:21] <SirCmpwn> alright, giving up again for a while. Hopefully some answers on superuser come in.
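One way to do what SirCmpwn describes in a single pass is the fps filter chained after the subtitles filter (a sketch, not the answer he eventually got: input.mkv and subs.srt are placeholder names, and burning subtitles requires an ffmpeg built with libass):

```shell
# Emit one frame every 60 seconds starting at 00:00:01, with the
# subtitles rendered into each frame. fps=1/60 means 1 frame per minute.
ffmpeg -ss 00:00:01 -i input.mkv -vf "subtitles=subs.srt,fps=1/60" frame_%04d.png
```

One caveat: -ss before -i resets timestamps, which can throw off subtitle alignment; putting -ss after -i is slower (it decodes up to the seek point) but keeps the original timing.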
[11:08] <alin|mobile> l
[13:25] <khali> is there an equivalent of option -t but where the limit is specified in number of frames?
[13:28] <pyBlob> -vframes or -frames:v
[13:34] <khali> pyBlob: excellent, thank you!
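In other words, where -t limits the output by duration, -frames:v limits it by frame count (filenames here are placeholders):

```shell
# Stop after encoding 100 video frames rather than after N seconds:
ffmpeg -i input.mp4 -frames:v 100 output.mp4
```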
[14:10] <markcl> "ffmpeg -y -i " . $path . $_REQUEST['fn'] . "_Video.flv -i " . $path . $_REQUEST['fn'] . "_Audio.flv " . $path . $_REQUEST['fn'] . "_Merged.flv"
[14:10] <markcl> the quality of the merged file i get from using the code above is very bad
[14:10] <markcl> compared to the quality of the video
[14:11] <markcl> recorded originally
[14:11] <markcl> is there a setting i need to use to be able to merge the audio and video files more properly?
[14:11] <khali> markcl: I think you want -vcodec copy -acodec copy
[14:12] <markcl> khali: thanks. ill try what you said. hopefully it will merge the two files more properly. =)
[14:14] <ubitux> (or just "-c copy")
[14:15] <khali> ubitux: ah, I didn't know this trick, thanks :)
[14:16] <vulture> is there a way to get avformat_open_input() to not fail if it doesnt think I have enough input data yet? seems to read about 4MB, then cant read anymore (because it's streaming), fails, then refuses to try again? or am I doing it wrong?
[14:36] <markcl> im already using ffmpeg -c -i $video -i $audio merged/$merged. but it seems the quality is still bad =(
[14:36] <markcl> what can i do to improve the quality to be the same as that of the original video?
[14:40] <relaxed> markcl: Your command is all wrong. ffmpeg -i video -i audio -c copy output.ext
[14:45] <markcl> relaxed: following your suggestion my code is now: ffmpeg -i $video -i $audio -c copy merged/$merged
[14:45] <markcl> it tells me
[14:45] <markcl> Unrecognized option 'c'
[14:45] <markcl> Failed to set value 'copy' for option 'c'
[14:48] <relaxed> markcl: use "-vcodec copy -acodec copy" instead.
[14:48] <relaxed> ^^ which will work with older versions of ffmpeg.
[14:52] <ubitux> sounds like a very old ffmpeg though..
[15:00] <markcl> ubuntu actually told me ffmpeg is deprecated
[15:01] <markcl> now im trying out the command in avcodec
[15:01] <markcl> avconv
[15:02] <JEEBsv> ffmpeg is deprecated as a command in libav, not ffmpeg (project-wise). Basically if you're using libav, you use the 'avconv' command, and if you're using the ffmpeg project, you use the 'ffmpeg' command
[15:04] <ubitux> markcl ^
[15:04] <markcl> > ffmpeg
[15:04] <markcl> ffmpeg version 0.8.6-6:0.8.6-0ubuntu0.12.10.1, Copyright (c) 2000-2013 the Libav developers
[15:04] <markcl>   built on Apr  2 2013 17:02:16 with gcc 4.7.2
[15:04] <markcl> *** THIS PROGRAM IS DEPRECATED ***
[15:04] <markcl> This program is only provided for compatibility and will be removed in a future release. Please use avconv instead.
[15:04] <ubitux> markcl: read the link.
[15:04] <ubitux> it's propaganda
[15:06] <vulture> haha
[15:06] <vulture> pretty jackass thing to do there
[15:07] <markcl> i knew it!
[15:07] <markcl> those ubuntu devs are evil!
[15:07] <JEEBsv> lol
[15:07] <markcl> =)
[15:07] <JEEBsv> doesn't have much to do with ubuntu itself to be honest
[15:08] <JEEBsv> just that it uses libav, and that message means that the "ffmpeg" binary within libav wasn't updated beyond a certain point
[15:08] <JEEBsv> the 0.8 release IIRC was the last one @ libav that still had the ffmpeg binary
[15:08] <JEEBsv> after that it was removed
[15:10] <vulture> does libav have a more sane avformat_open_input() ? :(
[15:11] <JEEBsv> well, since most of the new API changes come from libav generally (and then they generally get merged for compatibility at ffmpeg), I would say they generally are alike
[15:11] <JEEBsv> but stuff can be different depending on the version :P
[15:41] <markcl> ok, i got the thing working thanks to you guys!
[15:41] <markcl> but the problem now is that my input audio is 1 second longer than the input video
[15:42] <markcl> is there a way to adjust it so the audio gets merged from the 1-second mark onwards?
[15:43] <markcl> because the audio in my merged file is 1 second delayed from the video
[15:43] <vulture> while keeping sync? good luck with that :D
[15:44] <illusion_> hi guys, is there any tutorial in how to use ffmpeg to stream mp4 video to ffserver and watch it with jwplayer ?
[15:44] <vulture> I've never managed to get ffmpeg to keep sync, but maybe I'm doing it wrong (tm)
[15:44] <markcl> its just always 1 second delayed
[15:44] <markcl> because the speakers are opened first before the webcam
[15:45] <vulture> well the timestamps when each are captured should be saved/encoded in order to keep them synchronized... it cant always be 1 second
[16:13] <newbie|4> vulture: yes
[16:14] <newbie|4> but what command should i use
[16:14] <newbie|4> to encode them so that they are synchronized by x seconds
[16:14] <newbie|4> depending on the timestamp
[16:14] <newbie|4> i can handle the timestamp code
[16:14] <newbie|4> i just need the ffmpeg command
[16:14] <newbie|4> to do that
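For a fixed 1-second lead, one command-line approach (a sketch, not tested against these particular files; names are placeholders) is to seek one second into the audio input before muxing:

```shell
# Skip the first second of the audio so it lines up with the video,
# then stream-copy both streams into the output.
ffmpeg -i video.flv -ss 1 -i audio.flv -map 0:v -map 1:a -c copy merged.flv
```

Note that -ss combined with -c copy cuts on packet boundaries, so the alignment is approximate; -itsoffset on one input is another way to shift a stream relative to the other.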
[16:14] <vulture> idk I do that part programmatically
[16:15] <newbie|4> programmatically?
[16:15] <vulture> not from command line :P
[16:15] <newbie|4> oh, how do you do that?
[16:15] <vulture> you write a program haha
[16:15] <newbie|4> and what language you use?
[16:15] <vulture> c/c++
[16:15] <newbie|4> cool
[16:15] <newbie|4> i was a c++ dev a long time ago
[16:15] <vulture> I use directshow to capture from the camera
[16:15] <vulture> it provides the timestamps of when the sample arrived
[16:16] <vulture> and I duplicate frames to ensure an average framerate of 29.97
[16:16] <vulture> (if the source is a variable frame rate, like a webcam)
[16:16] <vulture> but yeah I always have issues with the commandline version
[16:16] <vulture> most notably with synchronizing audio with video
[16:17] <vulture> I have no idea how to do it, or if it even can do it
[16:22] <newbie|4> well i use flash to upload the video to the server
[16:22] <newbie|4> cant really use directshow for web
[16:23] <vulture> ahh
[16:23] <vulture> I know little about flash
[16:23] <vulture> (despite using it this very minute, lol)
[16:24] <newbie|4> me either
[16:25] <newbie|4> but im the only guy in the company competent (or stupid) enough to get the video/audio recording app the boss requested done.
[16:25] <newbie|4> the other guys are all designers
[16:25] <newbie|4> im the only developer in my current company
[16:25] <newbie|4> should have told them to outsource the stuff. lol.
[16:26] <newbie|4> never even touched ffmpeg my entire life
[16:26] <newbie|4> anyway, at least i was able to produce what is required for now thanks to the people on this irc room
[16:26] <newbie|4> the sync thing hopefully wont matter too much
[16:28] <vulture> nobody here ever helps me :(
[16:32] <newbie|4> ask for it i guess
[16:33] <newbie|4> i had this friend who asked for a dollar to complete strangers
[16:33] <newbie|4> and sometimes he gets some
[16:33] <newbie|4> he says its for emergency
[16:33] <newbie|4> but actually he's just trying to see if it works
[16:33] <vulture> yeah I ask occasionally when I see people reply to other people, but I never get that reply myself :P
[16:54] <xlinkz0> can i make an mp4 file that is playable while ffmpeg transcodes into it?
[16:54] <xlinkz0> i mean writes video into it
[16:54] <xlinkz0> or is there any container that supports this?
[16:57] <vulture> mp4 supports that
[17:16] <xlinkz0> vulture: if it does, ffmpeg doesn't do it by default
[17:16] <xlinkz0> i can't open the file in vlc, before ffmpeg exits
[17:17] <xlinkz0> The format of 'file:///C:/Users/xLnk/Desktop/effects/cam.mp4' cannot be detected
[17:17] <vulture> really?
[17:17] <vulture> are you flushing the output
[17:18] <vulture> and/or maybe you need to enable some streaming format variant?
[17:19] <vulture> also maybe vlc just cant handle partial files
[17:22] <xlinkz0> vulture: it seems it indeed does not flush the output, any way i can tell ffmpeg to do that?
[17:22] <sacarasc> To get MP4 to do it, you might have to use one of the -movopts or whatever it is. I can never remember...
[17:23] <xlinkz0> i did use -movflags faststart but don't know if the mov atom is being applied before or after the file is 'finished'
[17:23] <vulture> are you handling the file output stream yourself?
[17:23] <sacarasc> Or you can just use matroska? :D
[17:23] <xlinkz0> vulture: don't think so
[17:23] <xlinkz0> i just said -i bla bla cam.mp4
[17:24] <vulture> oh, using commandline? no idea :P
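For the record, MP4 normally writes its moov index only when the file is finalized, which is why a half-written file is unreadable; -movflags faststart doesn't help here, since it relocates the index only after encoding finishes. Fragmented MP4 sidesteps this. A sketch (input name is a placeholder):

```shell
# empty_moov writes a stub index up front, and frag_keyframe closes a
# self-contained fragment at every keyframe, so players can open the
# file while ffmpeg is still writing it.
ffmpeg -i input.mkv -c copy -movflags frag_keyframe+empty_moov cam.mp4
```

Matroska, as sacarasc suggests, is playable while being written without any special flags.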
[17:35] <marauder> Hey I am trying to compile the latest version of ffmpeg as my current one is outdated. I'm following the compilation guide for ubuntu that i've used in the past, but can't get the system to read opus
[17:35] <marauder> the guide seems to have been updated and is quite different than the last time i used it
[17:36] <marauder> When running ./configure i get "ERROR: opus not found" and the config.log ends with: check_pkg_config opus opus_multistream.h opus_multistream_decoder_create
[17:36] <marauder> ERROR: opus not found
[17:41] <xlinkz0> is it possible to stream rtmp with ffserver?
[17:42] <sacarasc> marauder: Do you have opus and opus-dev installed?
[17:43] <marauder> i downloaded the latest opus and installed it according to the guide
[17:43] <marauder> i tried installing opus-dev from source as well, but it gives me an error saying opus is not installed
[17:44] <marauder> during make i see this output is that normal: make[4]: Nothing to be done for `install-exec-am'.
[17:44] <marauder> make[4]: Nothing to be done for `install-data-am'.
[17:49] <obeattie> Hi - quick question; is it possible to use ffmpeg to demux a video input stream (image2) and combine an audio stream with the resulting video in one command, or does that require two passes?
[18:32] <trose> obeattie: that is definitely possible
[18:33] <trose> you need to define the video and audio inputs separately
[18:34] <trose> obeattie: http://ffmpeg.org/trac/ffmpeg/wiki/Create%20a%20video%20slideshow%20from%20images check out the last example
[19:24] <Sirisian|Work> I'm having an issue following the tutorial here: https://ffmpeg.org/trac/ffmpeg/wiki/UbuntuCompilationGuide I'm on ubuntu server 12.04 LTS. I run the yasm lines with no errors. Then for the x264 section, the ./configure line says "Found no assembler. Minimum version is yasm-1.2.0"
[19:25] <JEEB> 12.04 has an old yasm
[19:25] <JEEB> and x264 needs the current release
[19:25] <Sirisian|Work> I built it though from scratch using those commands
[19:25] <Sirisian|Work> oh you mean I have 2 installed?
[19:25] <JEEB> the hell I know :D
[19:25] <JEEB> I haven't looked at the page and I don't know
[19:26] <JEEB> you should run `which yasm`
[19:26] <JEEB> to see which binary is getting used
[19:26] <Sirisian|Work> returns nothing
[19:26] <JEEB> :D
[19:26] <JEEB> do you get anything if you just do `yasm --version` ?
[19:27] <Sirisian|Work> The program 'yasm' is currently not installed.  You can install it by typing:
[19:27] <Sirisian|Work> apt-get install yasm
[19:27] <Sirisian|Work> I'm guessing whoever wrote that tutorial never tested it.
[19:27] <JEEB> I have no idea how the hell you have "compiled/built" it
[19:27] <JEEB> but it seems like it most surely isn't under your current PATH
[19:27] <Sirisian|Work> that tutorial I linked under yasm gives step by step instructions
[19:27] <JEEB> compiling yasm isn't exactly hard :D
[19:28] <JEEB> uhh
[19:28] <JEEB> I have a feeling that you just haven't updated your PATH yet
[19:29] <Sirisian|Work> explain
[19:29] <JEEB> ahaha
[19:29] <JEEB> the tutorial surely is doing it right in the way that it recommends you install some dependencies into your home folder
[19:29] <JEEB> but
[19:29] <JEEB> it doesn't add the stuff into your PATH
[19:31] <JEEB> try doing `export PATH=${HOME}/bin:${PATH}` in a terminal
[19:31] <JEEB> and then try doing `yasm --version`
[19:32] <JEEB> (that basically adds ${HOME}/bin , which the tutorial seems to use as the default output binary directory, to the top of your PATH)
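To make that survive new shells, the export can also go into the shell's profile (this assumes bash and the guide's ${HOME}/bin install directory):

```shell
# Prepend the private bin directory for the current shell...
export PATH="${HOME}/bin:${PATH}"
# ...and persist it for future login shells.
echo 'export PATH="${HOME}/bin:${PATH}"' >> "${HOME}/.profile"
# Verify the freshly built binary is now found first:
which yasm
```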
[19:33] <Sirisian|Work> yeah why does it install it into /root/bin/yasm. That seems a bit silly.
[19:33] <JEEB> because ${HOME} is supposed to be your home folder... also why the hell are you doing the compilation itself as root :D
[19:34] <Sirisian|Work> I always use root. This is linux :\
[19:34] <JEEB> yes, I would definitely run all those compilation scripts of all those projects all the way as root, sure
[19:34] <JEEB> yeah,
[19:34] <JEEB> no
[19:34] <relaxed> root is only for things that require root.
[19:35] <relaxed> which this does not
[19:35] <Sirisian|Work> is yasm in the /root/bin/yasm standalone? Could I move it without breaking anything?
[19:35] <JEEB> also the fail of that tutorial seems to just be that it completely forgets that you have to, you know, actually add that damn binary directory into your PATH or stuff isn't going to be found
[19:35] <JEEB> Sirisian|Work, it should be
[19:35] <JEEB> that said, I really like the "build in your home folder" way personally
[19:36] <relaxed> The bin dir should be in the folder with lib and include, I think.
[19:36] <JEEB> yes, if --prefix is used and --bindir is not separately set
[19:37] <relaxed> Some distros have $home/bin in the users PATH. So maybe it was short sighted.
[19:37] <JEEB> it definitely was because it's for Ubuntu
[19:37] <JEEB> and Ubuntu most definitely doesn't have it there
[19:37] <JEEB> heck
[19:37] <Sirisian|Work> this sounds complicated. I'll just add the path. PATH=$PATH:${HOME}/bin ?
[19:37] <JEEB> it has `. ~/.profile` there
[19:38] <JEEB> Sirisian|Work, yes -- I do prefer writing the variables with ${SOMETHING} tho
[19:38] <JEEB> instead of $SOMETHING
[19:38] <JEEB> (I've had bad encounters when $SOMETHING doesn't get "opened up")
[19:38] <relaxed> Debian's .profile has an if/then to add $home/bin if it's there.
[19:38] <relaxed> But it shouldn't be assumed.
[19:39] <JEEB> well, could be that him running it as root has something to do with it :P
[19:39] <JEEB> this reminds me that if I ever end up administrating any *nix project like this, I should definitely make people pay for running its setup scripts as root
[19:39] <JEEB> :P
[19:39] <JEEB> "yes, trusting random scripts from the internet is oh such a great idea"
[19:40] <relaxed> "someone set us up the bomb."
[19:40] <Sirisian|Work> I'm using an open source OS. It's all scripts from random people on the internet :\ I get your point though.
[19:40] <xlinkz0> does anyone know where i could get support for crtmpserver?
[20:00] <Schakal_No1> Hi, how can I tell ffserver which audio track to use (i got avi- and mkv-files containing multiple audio tracks)
[20:28] <Sirisian|Work> JEEB, any idea why when I run the last step of configuring ffmpeg it would say "ERROR: opus not found"? The last line in the config.log is "check_pkg_config opus opus_multistream.h opus_multistream_decoder_create" so I'm assuming it's another path issue. The only opus file is in "/root/ffmpeg_build/include/opus" I'm using the ./configure from the previous link which has '--extra-cflags="-I$HOME/ffmpeg_build/include"' so it should see it?
[20:30] <JEEB> if it uses pkg-config, you should add PKG_CONFIG_PATH=/your/prefix/lib/pkgconfig before the dot-slash-configure and the options
[20:31] <JEEB>  /your/prefix naturally being the --prefix you set when compiling whatever component was being searched for
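Concretely, with the guide's $HOME/ffmpeg_build prefix, that would look like this (a sketch; the option list is abbreviated to the ones being debugged here):

```shell
# Let configure's pkg-config checks find the privately installed libopus:
PKG_CONFIG_PATH="$HOME/ffmpeg_build/lib/pkgconfig" ./configure \
  --prefix="$HOME/ffmpeg_build" \
  --extra-cflags="-I$HOME/ffmpeg_build/include" \
  --extra-ldflags="-L$HOME/ffmpeg_build/lib" \
  --enable-libopus
```

The -I/-L flags alone aren't enough for libraries probed via check_pkg_config; pkg-config only searches the paths in PKG_CONFIG_PATH plus its built-in system directories.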
[20:36] <klemax> how to correct if ffmpeg is working properly?
[20:36] <Sirisian|Work> how to check?
[20:37] <klemax> yeah
[20:40] <Sirisian|Work> JEEB, hmm no change. It can't seem to notice the opus file.
[20:41] <Sirisian|Work> klemax, the documentation has "ffmpeg 2>&1 | head -n1" to see if you've installed it correctly.
[20:42] <klemax> thanks Sirisian|Work
[20:42] <klemax> let me check it
[20:47] <Sirisian|Work> JEEB, interesting. If I remove --enable-libopus it works. Is opus important?
[20:48] <Sirisian|Work> oh audio format
[20:53] <Sirisian|Work> yeah I don't need that one anyway. Thanks.
[21:01] <obeattie> trose: thanks very much, i'll take a look at that
[22:01] <tlhiv_laptop> is there any way to change the key that stops ffmpeg from encoding ... it's normally "q"
[22:07] <vulture> the math prof?
[22:09] <vulture> if you're able to recompile it yourself, you could get rid of it via https://lists.ffmpeg.org/pipermail/ffmpeg-cvslog/2011-May/037290.html , or change it in  static int decode_interrupt_cb(void) -- return q_pressed || (q_pressed = read_key() == 'q');
[22:10] <vulture> looks like there's multiple 'q' 's hardcoded, just search for 'q' in the code I guess =/
[22:11] <tlhiv_laptop> i am the math prof :-)
[22:12] <tlhiv_laptop> actually there may be a better solution to what i need ... i'm actually recording videos of me at my desk describing how to work problems that I have typed solutions for in LaTeX
[22:14] <xlinkz0> ffmpeg.c line 2638
[22:14] <tlhiv_laptop> i have a video of my explanation, but i would like to split this video into a sequence of videos where each video in the sequence describes only one step of the solution
[22:15] <tlhiv_laptop> i have identified the SPACE key as pausing the video with FFPLAY, but i would like to output the time in the video at the moment that i paused into a file ... is this possible?
[22:16] <xlinkz0> do you know C?
[22:16] <tlhiv_laptop> a little
[22:16] <vulture> I dont know that ffplay does it, though many other media players will show you the time
[22:17] <tlhiv_laptop> ffplay shows the time and stops when i pause it ... but i would like to save that time into a file at the moment i pause it
[22:17] <vulture> oh does it
[22:17] <tlhiv_laptop> yep
[22:17] <vulture> maybe you can pipe the output to a file then
[22:18] <tlhiv_laptop> well ... the time is always playing
[22:18] <tlhiv_laptop> i only want it to be in a file when i pause it
[22:18] <vulture> ahh
[22:19] <vulture> I dont see any way to distinguish that, would probably have to modify the source to do it
[22:20] <xlinkz0> tlhiv_laptop: add this http://codepad.org/OdCTj0Do between lines 3156 and 3157 in ffplay.c
[22:20] <xlinkz0> recompile
[22:20] <tlhiv_laptop> well let me see what version i have
[22:21] <xlinkz0> or just search for SDLK_SPACE and add those lines right after that
[22:58] <marauder> hey so i just recompiled ffmpeg about an hour or two ago and it seems to have reverted back to its old
[22:58] <marauder> Last login: Mon Jun 17 14:20:59 2013 from taz.superb.net
[22:58] <marauder> admin at akakios:~$ ffmpeg 2>&1 | head -n1
[22:58] <marauder> ffmpeg version git-2013-06-17-de12b45 Copyright (c) 2000-2013 the FFmpeg developers
[22:58] <marauder> Last login: Sat Apr  6 06:21:57 2013
[22:58] <marauder> acrane1 at akakios:~$ ffmpeg 2>&1 | head -n1
[22:58] <marauder> ffmpeg version git-2012-10-29-c1804dc Copyright (c) 2000-2012 the FFmpeg developers
[23:01] <vulture> marauder: is that newest, then oldest?
[23:01] <vulture> seems correct to me, no?
[23:02] <marauder> i recompiled it earlier today and when i was done I ran ffmpeg 2>&1 | head -n1 and got a date of today
[23:02] <marauder> i just reran that command a few minutes ago and now it is posting a date of 2012
[23:02] <vulture> it also says lastlogin sat apr 6
[23:02] <vulture> and is a different account
[23:02] <marauder> right, i changed back to admin and it still displays the same info
[23:03] <marauder> logged in through another machine, that i dont normally use
[23:03] <vulture> find / | grep ffmpeg ? :P
[23:04] <marauder> that could take a while
[23:05] <marauder> i compiled it using the same guide I used before from ffmpeg.org ubuntu compilation guide.
[23:05] <vulture> I doubt ffmpeg randomly decided to uninstall/revert so it's probably some issue with your system
[23:07] <marauder> yea that wouldn't make sense, I'm just thinking it might be reading from the wrong location
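Since the two accounts clearly resolve different binaries, a quicker check than scanning the whole filesystem is to ask each shell what it finds on its PATH (a sketch, assuming bash on both accounts):

```shell
# List every ffmpeg the shell can see, in PATH lookup order:
type -a ffmpeg
# Bash caches command locations; clear the cache after installing a new build:
hash -r
# Compare this between the two accounts -- a differing PATH explains
# why each one runs a different ffmpeg:
echo "$PATH"
```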
[00:00] --- Tue Jun 18 2013

