[MPlayer-users] Bug? Anomalous CPU usage when playing HDTV clips.

John Stebbins stebbins at jetheaddev.com
Thu Mar 18 01:13:00 CET 2004


First, let me say that I have verified that I am using xv for output,
and that X shows minimal CPU utilization with more ordinary video
clips.
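
(For the record, this is roughly how I check the output driver; clip.avi
stands for any test file here, and the grep patterns are from memory:)

$ xvinfo | grep Adaptor                        # confirms the server exposes an Xv adaptor
$ mplayer -vo xv -v clip.avi 2>&1 | grep 'VO:' # should report [xv]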

Also, this could well be an X server problem, but I wanted to report it
here because the behavior is so strange that I suspect the problem is
some odd interaction between mplayer and X.

The system is a laptop: Pentium-M 1.4 GHz with a Radeon Mobility M9.
Note that I have run the same experiments on 2 other configurations with
very similar results:
1. P4 2.6 GHz with a Radeon 9200SE
2. P4 2.6 GHz with an integrated Intel chipset (865G)

I have some clips I captured with a pcHDTV card.  The capture
resolution is 720p (1280x720 @ 60 fps).  When playing these clips, CPU
utilization by the X server jumps to 55-60%!  mplayer's CPU utilization
hums along at about 35-40%.
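
(I am measuring with top in batch mode, roughly like this; the exact
pattern depends on how the X binary shows up in your process list:)

$ top -b -d 1 -n 30 | grep -E ' (X|mplayer)$'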

Since I knew that more ordinary streams did not exhibit this problem, I
decided to narrow the variables by re-encoding the stream at various
resolutions and frame rates.  I eventually created a stream with which
I can demonstrate a 10x increase in X server CPU utilization by changing
the playback frame rate by about 30%.
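
(For reference, the re-encodes were done along these lines; capture.ts
stands for the original pcHDTV capture, and the exact lavc options may
have differed, but the resolution and fps are what matter here:)

$ mencoder capture.ts -nosound -ovc lavc -lavcopts vcodec=mpeg4 \
    -vf scale=1024:576 -ofps 60 -o test.avi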

For example, test.avi is a re-encoding to 1024x576 @ 60 fps:

$ mplayer -nosound test.avi > /dev/null

Causes the X server to use 20% of the CPU (playing at the file's
encoded 60 fps)

$ mplayer -fps 40 test.avi > /dev/null

Causes the X server to use 2% of the CPU

$ mplayer -fps 30 test.avi > /dev/null

Causes the X server to use 1.25% of the CPU
Note that X server CPU usage runs from 0.5% to 1% when mplayer is NOT
running.
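
(If anyone wants to reproduce this, the whole sequence boils down to a
loop like the one below; I keep -nosound throughout so audio doesn't
enter into it, and watch the X server's CPU usage in top while each run
plays:)

$ for fps in 30 40 60; do
>   echo "== fps=$fps =="
>   mplayer -nosound -fps $fps test.avi > /dev/null
> done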

Anyone out there have any theories?
Want me to run other tests?
Want access to the test clip (it's 20 MB; it might pass through e-mail)?

John




