Notebooks with dual GPUs have been shipping for a while now, but switching between the fancy discrete GPU and the low-power integrated one hasn’t exactly been painless. Today, NVIDIA introduced a technology called Optimus, which makes the switching process automatic and transparent. In Windows, that is.
Despite many laptops already shipping with two graphics processors – a low-power integrated one and a powerful discrete one – software support for switching between the two has been a bit problematic. On Windows Vista and Windows 7, you need to manually switch between the two via the power settings or the graphics tray icon. Upon switching, the screen goes blank for a few moments, et voilà.
On Mac OS X, the situation is a million times worse. Even though the hardware is fully capable of switching between the two GPUs “live”, Apple has never implemented support for it in software, requiring you to log out and back in whenever you want to switch to the discrete GPU, and again when switching back. Then again, Mac OS X still doesn’t support SLI either, so little surprise there.
I’m not entirely sure what the situation is like on Linux, but judging from these recent Phoronix stories, there’s only experimental support for this. That experimental support still requires you to kill and restart the X server to complete the switch, so it’s all rather crude. However, with the pace of Xorg and Linux development, the situation could already be different today. On top of that, some other project out there may already have yielded better results – feel free to enlighten me.
So, Windows currently has the easiest switching method, but it’s still not ideal. What you really want is for the software and hardware to know when it’s time to power on the discrete GPU, so that users don’t have to worry about that sort of thing. When you load up a 3D-intensive program (a game, probably), the discrete GPU should automatically take over, without noticeable delay or flickering; when you close the game and go back to browsing, the discrete GPU should power down. Preferably, Hybrid SLI should be used so that you can use both GPUs at the same time for optimal performance – when needed.
NVIDIA’s Optimus technology delivers just that kind of functionality. It detects when an intensive 3D application is started, and powers on the discrete GPU accordingly – without user intervention, without screen flicker, without logging in or out. The only weak spot here is that the system relies on application profiles that are silently downloaded from NVIDIA; in other words, if you happen to run an application or game without a profile, you might still need to power on the discrete GPU manually.
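To make the profile mechanism concrete, here is a minimal sketch of what such a profile-driven routing policy could look like. It is purely illustrative and assumes a made-up profile table and function names; it is not NVIDIA’s actual driver logic or API.

```python
# Minimal sketch of a profile-driven GPU routing policy, loosely modelled on
# the Optimus behaviour described above. All names here (DISCRETE, INTEGRATED,
# the PROFILES table, launch_application) are hypothetical illustrations.

DISCRETE, INTEGRATED = "discrete", "integrated"

# Application profiles, as would be silently downloaded from the vendor.
PROFILES = {
    "game.exe": DISCRETE,
    "video_encoder.exe": DISCRETE,
    "browser.exe": INTEGRATED,
}

def pick_gpu(executable: str) -> str:
    """Route a launching application to a GPU based on its profile.

    Unknown applications fall back to the integrated GPU, which matches the
    caveat above: without a profile, you may have to force the discrete GPU
    by hand.
    """
    return PROFILES.get(executable, INTEGRATED)

def launch_application(executable: str) -> None:
    gpu = pick_gpu(executable)
    if gpu == DISCRETE:
        print(f"{executable}: powering on the discrete GPU")
    else:
        print(f"{executable}: staying on the integrated GPU")

if __name__ == "__main__":
    for app in ("game.exe", "browser.exe", "unknown_app.exe"):
        launch_application(app)
```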
The system basically works by having the discrete GPU copy its framebuffer contents into that of the integrated GPU as soon as the former is powered on. This eliminates the blank/flickering screen during the switch. In previous iterations of dual-GPU laptops, both GPUs were connected directly to the display through multiplexers, which required the system to switch between two separate display pipelines, causing flicker.
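As a rough mental model of that hand-off, here is a toy sketch (assumed names and sizes throughout, not real driver code): the panel keeps scanning out the integrated GPU’s framebuffer at all times, and the discrete GPU, once powered on, simply copies its rendered frames into that buffer.

```python
# Toy model of the copy-based hand-off described above: the integrated GPU's
# framebuffer always drives the panel, and the discrete GPU (when active)
# copies its rendered frames into it, so the display pipeline never switches.
# Sizes, colours and function names are all illustrative.

WIDTH, HEIGHT = 4, 2                 # a tiny "screen" for illustration
BLACK, BLUE = 0x000000, 0x0000FF

# The framebuffer that is actually scanned out to the panel (owned by the IGP).
igp_framebuffer = [BLACK] * (WIDTH * HEIGHT)

def discrete_render_frame(colour: int) -> list[int]:
    """Pretend the discrete GPU rendered a frame into its own video memory."""
    return [colour] * (WIDTH * HEIGHT)

def copy_to_igp(frame: list[int]) -> None:
    """Copy the discrete GPU's frame into the IGP framebuffer (e.g. over PCIe).

    Because the panel keeps reading the same framebuffer, there is no blank
    screen or flicker; only the source of the pixels changes.
    """
    igp_framebuffer[:] = frame

# While the discrete GPU is active, every frame it renders is copied over.
for _ in range(3):
    copy_to_igp(discrete_render_frame(BLUE))

print(igp_framebuffer[:WIDTH])       # the panel now shows the discrete GPU's output
```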
The press release and the Optimus web page don’t make clear whether it also supports Hybrid SLI (using both GPUs at once), but I’m assuming NVIDIA thought of that too, even though it isn’t mentioned explicitly.
“Consumers no longer have to choose whether they want great graphics performance or sustained battery life,” said Rene Haas, general manager of notebook products at NVIDIA. “NVIDIA Optimus gives them both – great performance, great battery life and it simply works.”
A small number of new ASUS laptops (four, to be exact) support the new technology. Hopefully, the Linux NVIDIA driver will support it sooner rather than later, so we can enjoy Optimus on Linux as well. On the Mac OS X side of things, it all depends on Apple. Cupertino still doesn’t support SLI, and has so far refused to make switching as (relatively) seamless as it is on Windows (the hardware supports it), so I wouldn’t be surprised if it takes a while for Apple to catch up.
You make it sound like Apple writes their own Nvidia drivers; they don’t, it’s all Nvidia.
Support is still needed higher up in the graphics stack to make this work.
I can’t find the link atm, but NVIDIA once said that all the support is there, and it’s only Apple that can now bring it to Mac OS X.
However, with the pace of Xorg and Linux development, the situation could already be different today.
No, it is not. Be honest. Xorg is still struggling with multidisplay support. Heck, it STILL struggles with a single goddamn display after 20+ years of development. The last two times (~1 yr each) I tried using a Linux-based system full-time, the plenty-a-day Xorg crashes reconvinced me that my time was worth too much to spend tinkering with the steaming pile that is Xorg. Still the same, every time I try. And that’s on a system that’s supposed to work well with Linux. But not even the mighty open-source ecosystem has managed to fix this. Time is only spent on crying about how hw manufacturers create shitty (AND/OR closed-source) drivers.
Linux, yes, is developing at a fast pace. But not Xorg.
I’m really trying to figure out how the heck you have problems with Xorg on a single display, or heck, even on two (if you consider proprietary drivers).
Any kind of “special” hardware?
Yeah, I haven’t had X crash on me since ’97. Is it because I’m lucky, or because I only use open source drivers?
I’m only using the nvidia proprietary ones on two PCs, one of which is my daily work machine.
Any kind of “special” hardware?
I wouldn’t consider an Intel integrated graphics chip that special.
It is also rather ironic that I specifically bought a new computer with an Intel graphics chip rather than a discrete nVidia one, just so that it would work great with Linux. I even suppressed my inner urge to play games on my computer, just for that. And as it turns out, I’m now on Windows with a sub-par graphics chip anyway.
Kinda explains my poisonous attitude though…
Xorg itself probably isn’t what’s unstable in this case; Intel’s driver is.
Intel certainly has a good reputation for releasing open source drivers for their chips, but some recent cards have apparently been accompanied by poorly written drivers
“Time is only spent on crying how hw manufacturers create shitty (AND/OR closed-source) drivers.”
Guess why?
Xorg could create a thousand killer features and write millions of lines of perfectly bug-free code that does everything anyone could ever dream of, but it still wouldn’t mean shit to most end users as long as they rely on shitty, closed source drivers that want to do things their own way and don’t keep up with the development pace.
Yet you put all the blame on Xorg…
Are you really that surprised that people are frustrated?
He has an Intel card, so it’s shitty open-source drivers for shitty open hardware.
Don’t forget the crappy open source drivers, and driver-level features that should be core X features.
X is a minefield of crappy politics, even without closed source drivers, which would be very easy to deal with (stop supporting old crap, and let nVidia and everyone else catch up, if they want).
Also, isn’t it funny how a Windows+nVidia news item becomes about X and FOSS?
Urgh, please, people, read up on Xorg and what is happening.
There is a lot happening. There is crying about closed source slowing things down, but there is still lots of action, the biggest part of which is drivers being moved out of X into the kernel. That’s what the big fuss with KMS, DRM and Gallium3D is about. NVidia isn’t joining in because it doesn’t fit with their cross-platform driver development, as it would make the Linux driver completely different from the drivers for other platforms. But this matters less and less, because Nouveau is really picking up speed now and is increasingly becoming a viable alternative. The Nouveau drivers are keeping up with Xorg development. Once the drivers are out of X, the code base shrinks massively and X won’t need to run as root anymore. This all makes it easier to write X reimplementations or alternatives, like Wayland.
Xorg is not lacking multidisplay support. I’ve been using it at home for years. It’s all normally done with XRandR now, bar the NVidia closed drivers, and as I said, that will soon cease to matter, as they won’t be needed.
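For reference, a basic multi-monitor layout with XRandR usually boils down to a single command. Here’s a minimal sketch that just shells out to the standard xrandr tool; the output names (LVDS-1, HDMI-1) are examples, so run xrandr with no arguments to see the ones on your machine.

```python
# Minimal sketch of setting up a dual-monitor layout via XRandR, as mentioned
# above. The output names (LVDS-1, HDMI-1) are examples and differ per
# machine; run `xrandr` with no arguments to list yours.
import subprocess

# Enable the external display at its preferred mode, placed to the right of
# the laptop panel.
subprocess.run(
    ["xrandr", "--output", "HDMI-1", "--auto", "--right-of", "LVDS-1"],
    check=True,
)
```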
From what you have described, I don’t see why Apple is a million times worse! Without knowing whether Apple has implemented it or not, and what it would take for them to do so, your complaint is baseless.
It really is a step Nvidia should have taken: make the switch easily controlled from the hardware side, and then let all the other companies or groups implement it on the OS side.
I’ve always thought using a dual-GPU setup was kinda kludgey.
Wouldn’t it be better to enable/disable separate cores on the GPU individually based on usage, rather than have two cores, one of which is completely on and the other just idling?
AMD and Intel CPUs can do this.
It would be an absolute minefield to try and get this working with Xorg. Intel drivers are now in the kernel; nvidia drivers are not. As I understand it (I could be wrong), the nvidia driver loads the IGP drivers so it is able to use standard API calls to pass stuff around.
Then there is the fact that, as good as the nvidia driver is on Linux in terms of stability and performance, in terms of infrastructure (KMS, RandR 1.2, responsive PowerMizer) it is awful.
Sigh, I guess it is time for me to move to OS X, as it will get this eventually, possibly with the next MacBook Pro refresh. I like having a massive amount of battery time, but sometimes I really need power. Until now Linux has been fairly comparable; it won’t be anymore (at least not for a while).
Watch the Nouveau driver development. It already addresses some of your issues with the closed Nvidia drivers, which is part of why there is such pressure to replace them.
xf86-video-nouveau
http://nouveau.freedesktop.org/wiki/
http://nouveau.freedesktop.org/wiki/FeatureMatrix
xf86-video-ati
http://www.x.org/wiki/radeon
http://www.x.org/wiki/RadeonFeature
and xf86-video-intel
http://intellinuxgraphics.org/
All three of the main open source video drivers for Linux support RandR 1.2 and KMS:
http://www.phoronix.com/scan.php?page=article&item=927
http://intellinuxgraphics.org/dualhead.html
Linux desktops have configuration settings for multiple-monitor set-ups:
http://ourlan.homelinux.net/qdig/?Qwd=./KDE4_desktop&Qif=Resize_and…
My, but there is a lot of FUD being spread about these days concerning Linux and its open source drivers.
I know this is taken from the PCWorld article (I can’t find any mention of it in the NVIDIA press releases), but it seems kinda silly: even though the fast GPU is “discrete”, wouldn’t it use main memory for its framebuffer? In which case there’s no need to copy anything.