[MPlayer-dev-eng] Re: [matroska-devel] Re: Common Opensource codec API

D Richard Felker III dalias at aerifal.cx
Mon Dec 29 18:44:18 CET 2003


On Sun, Dec 28, 2003 at 09:58:40PM +0100, Enrico Weigelt wrote:
> <snip>
> > This is the oldest and most idiotic analogy used by "unification"
> > advocates. Please drop it. It was wrong the first time someone said it
> > and it's wrong now. The wheel is very simple and universal and there
> > is essentially only one design. Software is much more complicated. A
> > better analogy would be "reinventing cancer treatment"....maybe since
> > we already have some treatments with horrible side effects, it's not
> > worthwhile for someone else to research a different approach????
> Audio IO is also quite a simple thing. Video IO too.

Neither is at all simple. For audio, you need to keep track of buffer
delay and depth so that you can keep the buffers as full as possible
while also knowing which sample is currently being played. This is
very different from games, which are realtime and force you to use
tiny buffers to minimize audio delay.
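
The bookkeeping itself isn't exotic; with ALSA it boils down to something
like this (a sketch, not MPlayer's ao_alsa code -- the counter and helper
name are mine, and you'd bump frames_written after every successful write):

/* Sketch: find the frame that is actually audible right now.
 * Assumes an already-opened ALSA PCM handle; illustration only. */
#include <alsa/asoundlib.h>

static snd_pcm_uframes_t frames_written; /* bump after each snd_pcm_writei() */

snd_pcm_sframes_t current_playback_frame(snd_pcm_t *pcm)
{
    snd_pcm_sframes_t delay; /* frames still queued between us and the DAC */
    if (snd_pcm_delay(pcm, &delay) < 0 || delay < 0)
        delay = 0;
    /* everything we have written minus what hasn't been played yet */
    return (snd_pcm_sframes_t)frames_written - delay;
}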

For video, you need to manage allocating and freeing multiple buffers
(lots if you want to decode ahead, and at least 3-4 if you want to
play files with B-frames!), controlling and monitoring the time when
the visible buffer is swapped, knowing which buffers are safe to write
to and which aren't, ... If you want crappy vo_x11 (which tears) or
vo_xv (which is slow and tears with some bad drivers), it's easy. We
want speed and perfect output.
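
To make the point concrete, here's the kind of interface that implies.
This is a hypothetical sketch, not MPlayer's actual libvo API; every name
in it is invented:

/* Hypothetical direct-rendering vo interface, for illustration only. */
#include <stdint.h>

struct vo_buffer {
    unsigned char *planes[3];
    int stride[3];
    int locked;   /* still referenced by the decoder (e.g. as a B-frame ref) */
};

struct vo_driver {
    /* hand the decoder a buffer it may safely write into, or NULL */
    struct vo_buffer *(*get_buffer)(void *priv);
    /* queue a finished buffer to become visible at the given time (usec) */
    int (*show_buffer)(void *priv, struct vo_buffer *buf, int64_t pts);
    /* block until the swap has actually happened, so we know which
     * buffers are writable again */
    int (*wait_swap)(void *priv);
};

A crappy vo only needs the middle call; fast, tear-free output needs all of it.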

> Audio+Video Codecs being implemented as blackboxes should also be 
> relatively simple.

It's not as simple as it sounds, but anyway, it's _ALREADY_ _DONE_.
For the 2093595494380943854th time, it's called libavcodec!!
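
And it really is a black box: the decode path looks more or less like the
sketch below, in the spirit of lavc's apiexample.c. I'm writing the calls
from memory, so treat the exact names and signatures as approximate:

/* Approximate sketch of decoding one frame through libavcodec,
 * modeled on its apiexample.c -- details from memory, may be off. */
#include <inttypes.h>
#include <avcodec.h>

int decode_one_frame(uint8_t *buf, int buf_size)
{
    AVCodec *codec;
    AVCodecContext *ctx;
    AVFrame *pic;
    int got_picture = 0;

    avcodec_init();            /* old-style global init */
    avcodec_register_all();

    codec = avcodec_find_decoder(CODEC_ID_MPEG4);
    ctx = avcodec_alloc_context();
    pic = avcodec_alloc_frame();
    if (!codec || avcodec_open(ctx, codec) < 0)
        return -1;

    /* compressed bits in, (maybe) one picture out -- that's the whole API */
    avcodec_decode_video(ctx, pic, &got_picture, buf, buf_size);

    avcodec_close(ctx);
    return got_picture;
}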

> Why is such redundant work necessary?!

If that's the way you see it, too bad.

> <snip>
> > One second you're talking about common _codec_ api, the next second
> > about something else, and I have no idea what it is. If you're talking
> yeah, I'm talking about multiple multimedia APIs for several jobs
> which somehow belong together:
> 
> * audio IO
> * video IO

These classifications are nonsense. Input and output have _nothing_ to
do with one another. The requirements are totally different...

> * streaming input
> * streaming output

What do you mean? Have you even _read_ MPlayer's stream code? It's
already there, and it will be entirely modular in G2 (maybe it already
is). Any other project that wants to use it is free to.
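
If you want to see how little there is to "unify" here, the whole layer
amounts to something like the struct below. This is an invented
illustration, not the literal stream_t from stream.h; the field names are
mine:

/* Hypothetical modular stream layer, illustration only.  Each input
 * (file, http, dvd, ...) just fills in these callbacks. */
#include <stdint.h>

typedef struct stream {
    void *priv;              /* per-protocol state */
    int (*fill_buffer)(struct stream *s, unsigned char *buf, int max_len);
    int (*seek)(struct stream *s, int64_t pos);
    void (*close)(struct stream *s);
    int64_t pos;             /* current offset */
    int64_t size;            /* total size if known, 0 otherwise */
    int flags;               /* e.g. whether seeking is possible */
} stream_t;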

> * audio codecs
> * video codecs

libavcodec!!!

> <snip>
> > about a common demuxing/decoding/processing/filtering/output layer,
> > then the answer is the world's biggest NO. We (Ivan and myself) can't
> > even agree on a single api for MPlayer G2. Finding something that all
> Why not? Where's the problem?

Why don't you read the fucking mailing list archive, _learn_ something
for yourself, and then make intelligent comments, rather than just
asking me to explain everything to you?

> > the groups agree on (including the evil Matroska gang) is all the more
> > impossible. There are just too many strikingly different approaches.
> Who cares about matroska?
> As long as this thing isn't actually working and widely used, we
> don't need it. If someday someone wants it, he can code a mux for it ...

Ok, who cares about other players either? We'll just make our nice
libs (as already planned) for MPlayer G2, and if anyone else wants to,
they can code stuff for xine, etc. to use them.

> > As long as individual project authors write the code well, porting
> > from one api to another is _not_ difficult. But using a common api
> > _is_ _very_ difficult and is not going to happen.
> Perhaps we're not at the point for a common video codec api yet, but many
> other things like audio-io can be unified today. Other parts may follow
> later. 

It's already been attempted and failed multiple times. See SDL,
ClanLib, libaudiofile or libsndfile or whatever, ... They're all made
with stupid assumptions of trying to accommodate newbie coders, rather
than doing stuff properly.

> <snip>
> > > But that's not the point here; instead we first have to talk about library
> > > interfaces, since protocols should be constructed for environmental
> > > needs and wrapped into a lib which provides the protocol's functionality
> > > on a higher abstraction level (i.e. this is what xlib does for X11)
> > 
> > And this makes it slower, less efficient, and more bloated. No thanks.
> Why?!
> What else does libavcodec do?

Mainly it encodes and decodes audio and video. It also contains a
horribly slow software scaler (not used by MPlayer, which has a much
better GPL'd one, but there for ffmpeg which needs LGPL code) and an
optional (GPL) postprocessing implementation.

Anyway, if you don't even know what libavcodec is, you probably belong
on mplayer-users, not mplayer-dev-eng...

Rich



