airlied

reverse optimus implementation

So I took some time today to try and code up a thing I call reverse optimus.

Optimus laptops come in a lot of flavours, but one annoying one is where the LVDS/eDP panel is only connected to the Intel GPU and the external outputs are only connected to the nvidia GPU.

Under Windows, when no monitors are plugged in, the Intel GPU renders the compositor and the nvidia GPU is only used for offloads; when a monitor is plugged in, the nvidia generally takes over compositor rendering and just gives the Intel GPU a pixmap to put on the LVDS/eDP screen.

Now under Linux the first case mostly works OOTB on F18 with intel/nouveau, but switching compositors on the fly is going to take a lot more work, particularly from compositor writers, and I haven't seen much jumping up and down on the client side to lead the way.

So I hacked up a thing I call reverse optimus. It kinda sucks, but it might be a decent stopgap.

The Intel GPU still renders the compositor, but it can now use the nvidia GPU to output slaved pixmaps. This is totally the opposite of how the technology was meant to be used, and it introduces another copy. The intel driver copies from its tiled rendering to a shared linear pixmap (just like with USB GPUs); however, since we don't want nouveau scanning out of system RAM, the nouveau driver then copies the rendering from the shared pixmap into an nvidia VRAM object. So we get a double copy, and we chew lots of power, but hey, you can see stuff. The slave output code also sucks for synchronisation so far, so you will get tearing and other crappiness as well.
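In driver terms the flow looks roughly like the sketch below. This is an illustration only: the function and type names here are hypothetical placeholders, not the actual intel/nouveau driver entry points.

    /* Rough sketch of the reverse-optimus double copy.
     * All names are hypothetical, not real driver entry points. */

    typedef struct pixmap pixmap_t;  /* stand-in for the X server's PixmapPtr */

    static void copy_tiled_to_linear(pixmap_t *src, pixmap_t *dst)
    {
        /* intel: blit the compositor's tiled rendering into the
         * shared linear pixmap in system RAM (same path as the
         * USB GPU slave-output case) */
    }

    static void copy_linear_to_vram(pixmap_t *src, pixmap_t *dst)
    {
        /* nouveau: copy the shared pixmap into an nvidia VRAM
         * object, since scanning out of system RAM is undesirable */
    }

    /* run whenever damage accumulates on the slaved output */
    static void update_slave_output(pixmap_t *tiled, pixmap_t *shared,
                                    pixmap_t *vram)
    {
        copy_tiled_to_linear(tiled, shared);  /* copy #1 */
        copy_linear_to_vram(shared, vram);    /* copy #2: the extra one
                                               * reverse optimus adds */
    }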

There is also a secondary problem with the output configuration. Some laptops (the Lenovo I have, at least) connect the DDC lines to the Intel GPU for outputs which are only connected to the nvidia GPU, so when I enable the nvidia as a slave, I get some cases of monitors being reported twice. Fixing that probably requires parsing the ACPI tables properly, like Windows does. However, I suppose having two outputs is better than none :-)

So I've gotten this working today with two intel/nvidia laptops, and I'm contemplating how to upstream it. So far I've just done some hackery to nouveau; that, along with some fixes in intel driver master and a patch to the X server (or the Fedora koji 1.13.1-2 server), makes it just work:

http://cgit.freedesktop.org/~airlied/xf86-video-nouveau/log/?h=rev-optimus
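With a server carrying those bits, wiring the slaved output up from a running session should look roughly like this. The provider names are whatever --listproviders reports on your machine, so treat "nouveau" and "Intel" here as examples:

    xrandr --listproviders
    xrandr --setprovideroutputsource nouveau Intel
    xrandr --auto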

I really dislike this solution, but it seems it might be the best stopgap until I can sort out the compositor-side issues (GL being the main problem).

Update: I've pushed reverse-prime branches to my X server and -ati repos.

Comments

Hello Dave.

I have a Sony VAIO VPCZ23A4R laptop which, I think, can be used as a testbed for almost all possible graphics-offloading situations. It has two GPUs: one Intel and one Radeon (the Radeon is in the docking station). Both GPUs have outputs attached. The Intel GPU can drive the internal eDP panel, as well as the VGA and HDMI outputs on the main body of the laptop. The Radeon GPU has VGA and HDMI outputs located on the docking station.

Currently, the only way to use the outputs on the docking station simultaneously with the built-in panel is to write a custom xorg.conf that either defines two separate screens or one big Xinerama screen.
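For reference, that kind of xorg.conf looks roughly like the sketch below; the BusID values are made-up examples and need to match lspci output on the actual machine:

    Section "Device"
        Identifier "IntelGPU"
        Driver     "intel"
        BusID      "PCI:0:2:0"    # example value, check lspci
    EndSection

    Section "Device"
        Identifier "RadeonGPU"
        Driver     "radeon"
        BusID      "PCI:1:0:0"    # example value, check lspci
    EndSection

    Section "Screen"
        Identifier "IntelScreen"
        Device     "IntelGPU"
    EndSection

    Section "Screen"
        Identifier "RadeonScreen"
        Device     "RadeonGPU"
    EndSection

    Section "ServerLayout"
        Identifier "DualHead"
        Screen 0 "IntelScreen"
        Screen 1 "RadeonScreen" RightOf "IntelScreen"
        # merges everything into one big screen; note that 3D/DRI is
        # typically disabled under classic Xinerama. Drop this option
        # to get two separate screens instead.
        Option "Xinerama" "on"
    EndSection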

Could you please tell me what's missing in X in order to have one big Radeon-accelerated, tear-free screen made up of outputs physically connected to two different GPUs? E.g. I want to use the HDMI on the docking station, the HDMI on the main body of the laptop and the internal panel at the same time, and to be able to drag windows between them.

Also, what's the situation in Wayland WRT such hybrid graphics setups?

And, I am willing to test radeon patches equivalent to what you have done for nvidia :)

The problem is that to have the radeon do the main rendering, we need to switch X to using it with the Intel as a slave, and I haven't gotten switching working yet.

Wayland has had no work done in this area so far.

I've posted some ATI bits as well, but you'll only get intel rendering to radeon outputs.

This would be a great help. I was disappointed to find that my ThinkPad T410s doesn't push anything to the external display when using the Intel graphics. Unfortunately, when using nouveau to drive a large 30" display, things get hot pretty quickly. Any solution that would work for a mere mortal (who doesn't build his own X server) would be much appreciated.

Well, as I said, it'll still suck: it has to keep the nvidia GPU powered up, so it will still burn power and generate heat.

Until there is better dynamic GPU power management, it'll probably always be toasty.

(Anonymous)

Does this solution at least work on FreeBSD, or is it limited to Linux?

I wouldn't be holding my breath. Everything I do is limited to Linux.