February 24th, 2006


Xgl vs AIGLX - the missing bit (and a rant)

I think the Xgl and AIGLX approaches are both quite interesting, and I believe the roadmap for getting the current X server to turn into something like Xgl is quite practical.

The method being thrown about is to replace XAA and EXA with an Xgl acceleration method, which would allow drivers to use a standard OpenGL acceleration architecture, accelerated via AIGLX.

You might still want some of EXA's pixmap management capabilities, but the acceleration layers should all be in OpenGL.
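To make that concrete, here is a rough sketch of what one of those hooks might look like. None of these names are real Xgl or EXA API; the point is just that the classic 2D driver operations map pretty naturally onto plain GL calls:

    /* Hypothetical sketch only -- not actual Xgl/EXA API. The idea:
     * a pixmap's backing store lives in a GL texture, and the old 2D
     * acceleration hooks become thin wrappers around OpenGL. */
    #include <GL/gl.h>

    typedef struct {
        GLuint texture;        /* pixmap contents live here */
        int width, height;
    } GlPixmap;

    /* Roughly what EXA's Solid hook does, expressed as GL. Assumes
     * dst is already bound as the current render target. */
    static void
    gl_solid_fill(GlPixmap *dst, int x, int y, int w, int h,
                  float r, float g, float b, float a)
    {
        glEnable(GL_SCISSOR_TEST);
        /* GL's origin is bottom-left, X's is top-left */
        glScissor(x, dst->height - (y + h), w, h);
        glClearColor(r, g, b, a);
        glClear(GL_COLOR_BUFFER_BIT);
        glDisable(GL_SCISSOR_TEST);
    }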

I believe you end up with Xegl without having to split the tree into a lot of separate pieces. I do, however, believe the tree needs to be split: the X server needs to become a smaller piece, and the current X drivers need to move to an architecture where modesetting and memory management can be split from the rendering and acceleration code.
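For the sake of argument, the split could look something like this (hypothetical interfaces, nothing in the tree today): modesetting and memory management behind one small table of operations, rendering behind another, so the two halves can live and evolve separately:

    /* Hypothetical sketch of the split -- none of this is real X
     * driver API. Modesetting and memory management on one side... */
    typedef struct {
        int   (*set_mode)(int crtc, int width, int height, int refresh);
        void *(*alloc_vram)(unsigned long size, unsigned long align);
        void  (*free_vram)(void *mem);
    } ModeOps;     /* small, could even move into the kernel */

    /* ...rendering and acceleration on the other, which could then be
     * implemented entirely on top of OpenGL as sketched above. */
    typedef struct {
        void (*solid)(void *dst, int x, int y, int w, int h,
                      unsigned int color);
        void (*copy)(void *src, void *dst, int sx, int sy,
                     int dx, int dy, int w, int h);
    } RenderOps;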

Of course, while I love both ideas, I believe both vendors (I know the engineers know it, but management might not realise how bad it is...) are missing one point:

None of the open source DRI based drivers are stable. Not one. I can probably write an OpenGL app of some sort, or play a game for long enough, on any single one of them and have my machine hard hang. These crashes aren't easy to debug; they take serious work. The last one I tracked down took 2-3 weeks to figure out, and the result was a hacky workaround, not a fix.

So basing our future either on drivers that get no engineering support from the manufacturers (i.e. the driver developer getting access to all the information about bugs in the chip that the Windows people get), or on closed-source, evil, locked-in drivers, is not good. Neither fglrx nor nvidia is exactly stable either; nvidia is probably the closest. Intel are probably the best, and TG are doing some interesting memory management work which will let a lot of things work a lot better.

However, I think the high-level people at the RHs and Novells need to stop trash-talking each other's engineering efforts and go engage the big customers (the Dells, Lenovos, HPs, etc.) to put pressure on the graphics card vendors to do something about this (and maybe try not to fuck it up this time, you know who you are...).

Reverse engineering drivers, while fun, isn't going to produce stability. The chips seem to be designed around some quantum instability, and only the manufacturer knows how to get the inertial dampeners to work so it doesn't crash. We can hack up an inertial dampener of our own, but every so often we manage to lose the whole crew, and my desktop.

Thanks. This rant is complete.