[MPlayer-users]  Re: vsync and nvidia (was: [BUG] -vf tfields=4 jumpiness -- OSD flickers madly)
    Etienne SANDRE 
    etienne.sandre at polytechnique.org
       
    Tue Nov  4 16:47:38 CET 2003
    
    
  
> I have exactly the same problem. I think the problem is that there is not a
> 1:1 scanline conversion between the framebuffer and the TV out. I believe
> the TV encoder scales the video -> which destroys the interlacing. There
> is a tool for winblows called TV Tool which is supposed to be able to
> disable this "feature". Nvtv for linux may be able to do the same thing, but
> it's pretty limited as to which nvidia cards / TV encoders it will work
> with.
With some TV-encoder chips on nvidia cards you can fix this by setting a 
"768x576" resolution. The framebuffer is then encoded directly, without line 
suppression or scaling. You may have to set a custom modeline to use this 
resolution. It only works with some chips, like Conexant or BT ones (but 
buying an nvidia card is a bit like playing Russian roulette as to which 
TV chip you will find on it).
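For reference, such a modeline would go in the Monitor section of XF86Config. The timing numbers below are only an illustrative guess (29.5 MHz dot clock, 625 total lines as in PAL), not tested values; safe timings depend on your card and TV:

```
Section "Monitor"
    # Hypothetical 768x576 @ 50 Hz PAL modeline -- an illustrative guess:
    #         name         clock  hdisp hss hse htotal  vdisp vss vse vtotal
    Modeline "768x576PAL"  29.50  768  789 858  944     576  581 586  625
EndSection
```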
Since I bought my video card some time ago, when I was not concerned about 
TV-out quality, I recently found out I have a bad chip. There is a dirty 
solution: plug your TV into the VGA (yes!) connector, wiring the ground, 
R, G and B signals directly to the corresponding pins of the TV's SCART 
connector. The sync signals are a bit more complicated, since you need a few 
logic gates or transistors to convert them for the TV input; there are a lot 
of docs on the net on this subject. Then you have to create a correct 
768x576 50 Hz (PAL) modeline and it should work (but I take no 
responsibility for any damage to the TV or video card...)
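To double-check that a hand-made modeline actually lands on PAL's 50 Hz, divide the pixel clock by the product of the total (including blanking) horizontal clocks and total lines. A minimal sketch, using hypothetical timing numbers rather than values from this mail:

```python
# Sanity-check a candidate 768x576 PAL modeline's refresh rate.
# These timing numbers are assumptions for illustration only.
pixel_clock_hz = 29_500_000  # 29.5 MHz dot clock
htotal = 944                 # total clocks per scanline, incl. blanking
vtotal = 625                 # total lines per frame (PAL standard)

refresh_hz = pixel_clock_hz / (htotal * vtotal)
print(f"{refresh_hz:.2f} Hz")  # prints "50.00 Hz"
```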
-------------------------
Etienne SANDRE
    
    