[MPlayer-dev-eng] [PATCH] fix for -srate bug
D Richard Felker III
dalias at aerifal.cx
Thu Oct 21 04:07:45 CEST 2004
On Thu, Oct 21, 2004 at 03:14:45AM +0200, Michael Niedermayer wrote:
> Hi
>
> On Thursday 21 October 2004 01:14, Ed Wildgoose wrote:
> > >next try :)
> >
> > ....
> >
> > > libavcodec 22bit coeff, 2.5x longer filter, linear interpolation
> > > between polyphase entries
> > > Worst case Signal-to-Noise Ratio : 108.65 dB.
> > > Worst case conversion rate : 327680 samples/sec.
> > > Measured -3dB rolloff point : 97.02 %
> > >
> > > libavcodec 22bit coeff, 16x longer filter, linear interpolation
> > > between polyphase entries
> > > Worst case Signal-to-Noise Ratio : 104.81 dB.
> > > Worst case conversion rate : 43116 samples/sec.
> > > Measured -3dB rolloff point : 99.51 %
> > >
> > >the difference for the default filter is probably caused by a 10l bugfix
> > >(padding samples weren't memset(0))
> > >
> > >wc -l libavcodec/resample2.c
> > >247 libavcodec/resample2.c
> > >
> > >cat libsamplerate-0.1.2/src/*.{c,h} | wc -l
> > >26989
> > >
> > >[...]
> >
> > Wow, I am extremely hooked! Can you send me (privately?) your latest
> > diff for this test please? (and to libavcodec)
>
> it should be the same as the one i already posted, except:
> ----
> if(v > INT16_MAX || v < -INT16_MAX)
> printf("i thought its supposed to be 1.0 .. -1.0\n");
> }
> + for(; i<input_len + PADDING; i++){
> + data16[i]= 0;
> + }
> }else{
> ----
>
> and lavc is as it is in cvs, just with the #defines at the top of resample2.c
> changed for each test
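(for context: the zero-fill in the quoted hunk is the whole fix. the polyphase
filter reads a few samples past the end of the real input, and those padding
samples were previously left uninitialized. a minimal standalone sketch of the
idea, with hypothetical names (PADDING, data16, input_len) loosely modelled on
the quoted fragment, not the actual resample2.c code:

#include <stdint.h>

#define PADDING 32  /* assumed filter tail length, not the real value from resample2.c */

/* convert double input to int16 and zero the PADDING samples the polyphase
 * filter will read past the end of the real input; without the second loop
 * the filter reads uninitialized memory, which is what the hunk fixes */
static void prepare_input(int16_t *data16, const double *in, int input_len)
{
    int i;
    for (i = 0; i < input_len; i++) {
        double v = in[i] * INT16_MAX;
        if (v > INT16_MAX || v < -INT16_MAX)
            v = v > 0 ? INT16_MAX : -INT16_MAX;   /* clip instead of just warning */
        data16[i] = (int16_t)v;
    }
    for (; i < input_len + PADDING; i++)
        data16[i] = 0;   /* the added zero-fill */
}
)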
is there any way to make it runtime-configurable? imo something like
this should be runtime-configurable. if it's compile-time, you get into
a problem about which mode distros etc. will ship, and whether users
can rely on it to be fast or high quality...
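(a sketch of what runtime configuration could look like; purely hypothetical,
none of these names are in cvs. the point is just that the #defines at the top
of resample2.c become init-time parameters, so one binary can offer both the
fast and the high-quality mode:

#include <stdint.h>
#include <stdlib.h>

/* hypothetical: the values currently hardcoded as #defines become
 * parameters of the init call */
typedef struct ResampleConfig {
    int    filter_length;  /* taps per phase, e.g. the 2.5x vs 16x cases tested above */
    int    phase_count;    /* number of polyphase entries */
    int    linear;         /* 1 = linear interpolation between polyphase entries */
    double cutoff;         /* lowpass cutoff relative to the lower nyquist */
} ResampleConfig;

typedef struct ResampleContext {
    ResampleConfig cfg;
    int in_rate, out_rate;
    int16_t *filter_bank;  /* would be built from cfg at init time */
} ResampleContext;

static ResampleContext *resample_init(int out_rate, int in_rate,
                                      const ResampleConfig *cfg)
{
    ResampleContext *c = calloc(1, sizeof(*c));
    if (!c)
        return NULL;
    c->cfg      = *cfg;
    c->in_rate  = in_rate;
    c->out_rate = out_rate;
    /* ... build the polyphase filter bank from cfg->filter_length,
     *     cfg->phase_count and cfg->cutoff here ... */
    return c;
}

a suboption of the relevant audio filter could then pick the speed/quality
tradeoff per user instead of per build.)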
rich