“In late August we started asking our readers for any questions they had for NVIDIA about Linux and this graphics company’s support of open-source operating systems. Twelve pages worth of questions were accumulated and we finally have the answers to a majority of them. NVIDIA’s Andy Ritger, who leads the user-space side of the NVIDIA UNIX Graphics Driver team for workstation, desktop, and notebook GPUs, answered these questions. With that said, there are some great, in-depth technical answers and not the usual marketing speak found in many interviews.”
Even two years ago I would not have bought anything but nVidia graphics hardware, just because their drivers were the best available. Now, things look totally different. nVidia is quickly moving from 1st place to last place. Intel already rules the integrated graphics sector with their supreme drivers, and AMD is quickly closing the gap, supporting more and more stuff in their free driver every month or so. Even nouveau is getting usable (for 2D stuff, with 3D soon to follow) and it actually already works better than nVidia’s binary-only driver for what it can do.
And here we see nVidia not releasing code or docs because they are worried about their precious IP…
I can’t disagree with you more.
intel: Intel drivers are not high quality. Look at all the people bitching about Intel performance in Ubuntu 9.04. When that’s fixed, you’re still going to be left with hardware that can barely run a 3D app. For people who just want compiz that’s fine, but for anything else, Intel is not an option.
ati: I realize they’ve released a bunch of documentation. Their drivers still suck though (and I’m including both the open source drivers and the binary ones). Maybe, just maybe, if one of the developers working on the ATI open source drivers has your exact card you’ll get “okay” 2D performance.
Intel has been seriously working on their drivers, with a huge amount of refactoring of the underlying infrastructure.
ATI is also working on a new free driver.
Promises, promises. The same promises we’ve heard before so many times.
Intel’s graphics drivers are fairly well written. It’s the graphics chips themselves that suck.
Ehm. Uhm. Hmmm.
I am not sure how to take this comment about these “supreme drivers” that during the past few years have caused more trouble than any other open source driver, in any context, in any open source operating system.
The reasons why they rule the integrated graphics sector can be found in such things as cheap — and arguably low-quality — chips, volume sales, bundling with other Intel hardware, and so on, but certainly not in their drivers for Windows, and even less so in their drivers for X.Org.
ATI has apparently been quickly closing the gap for years now. I tried the Intel drivers (X3100) 3 months ago on Linux, and they were crappy.
And nouveau will never support the latest graphics cards out of the box.
Again, NVidia dodges the question of why it doesn’t release documentation:
“Unfortunately the vast majority of our documentation is created solely for internal distribution. While at some point it may be possible to release some of this information in pubic form it would be quite a monumental effort to go through the vast amounts of internal documents and repurpose them for external consumption.”
This is plain nonsense. Here, NVidia is just assuming that a majority of people are gullible and that the rest don’t matter. Lame!
If you want to know why it’s nonsense, consider that even if they really didn’t have any documentation fit for public release, they would still have the obvious option of letting a few employees write such documentation from scratch. We don’t need all the super top secret features; just give us as much as AMD/ATI is already giving, that’ll do…
Agreed.
A ridiculous claim that “there is so much documentation that we cannot summarize parts of it for external use”. Right, as if nVidia were the national library of Britain or something.
I would be thrilled, however, to see how their information looks in pubic form.
Releasing the code would cost too much money; for the return they expect, it is simply not worth it.
The majority of people simply want working drivers, as opposed to the few who keep bitching and demanding proprietary information, only to either alienate vendors/developers or realize how far over their heads a lot of the technical issues are, subsequently lose interest, and leave the project to languish and be forgotten by the inexorable advance of time…
Part of running a successful business consists of knowing how to prioritize things. It takes literally 5 minutes to get a Linux/BSD/Solaris system up and running with the NVIDIA drivers. To this day I have not seen a single open source zealot explain to me what the exact value proposition of having the proprietary code from NVIDIA would be, especially if you consider that NVIDIA literally defined 3D on Linux, and they are the main reason there are people working commercially on 3D modelling on Linux.
This used to be true, but Linux graphics are evolving very quickly now. You can even read in the interview that Nvidia can’t keep up with Linux because it just isn’t a priority. Linux people expect more now. They want tight integration, like kernel mode setting, that is only possible with open source drivers.
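To make that concrete: with an open, KMS-capable driver, even a small userspace program can enumerate the displays the kernel knows about through libdrm, whereas a binary blob keeps all of that inside its own stack. A rough sketch, not anything from the interview — the /dev/dri/card0 path and the exact build command are my assumptions about a typical setup:

    /* kms_list.c -- list connected outputs via the kernel's KMS interface.
     * Sketch only; build with:
     *   cc kms_list.c -o kms_list $(pkg-config --cflags --libs libdrm)   */
    #include <stdio.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    int main(void)
    {
        /* assumption: first card node; could be card1 etc. on multi-GPU boxes */
        int fd = open("/dev/dri/card0", O_RDWR);
        if (fd < 0) { perror("open /dev/dri/card0"); return 1; }

        drmModeRes *res = drmModeGetResources(fd);
        if (!res) {
            fprintf(stderr, "no KMS resources -- driver has no kernel mode setting\n");
            close(fd);
            return 1;
        }

        for (int i = 0; i < res->count_connectors; i++) {
            drmModeConnector *conn = drmModeGetConnector(fd, res->connectors[i]);
            if (!conn)
                continue;
            if (conn->connection == DRM_MODE_CONNECTED && conn->count_modes > 0)
                printf("connector %u: %dx%d @ %d Hz\n",
                       conn->connector_id,
                       conn->modes[0].hdisplay, conn->modes[0].vdisplay,
                       (int)conn->modes[0].vrefresh);
            drmModeFreeConnector(conn);
        }

        drmModeFreeResources(res);
        close(fd);
        return 0;
    }

None of this works against a driver that bypasses the kernel’s DRM/KMS path, which is exactly the kind of tight integration being talked about here.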
You are only allowed to reiterate the opinion that NVIDIA is a bastard for not wanting to open source their drivers.
Independent thought is unwelcome if it threatens FOSS ideology. Linux on the desktop has been a raging success without a stable ABI. Over a decade in development and currently at 1%.
Well, 0.64% to be exact:
http://gs.statcounter.com/#os-na-monthly-200809-200910
And to think that people in 1998 questioned if harassing hardware companies into open sourcing their drivers was a wise strategy.
I seriously think that nVidia is the number one thing holding desktop Linux back. If their drivers were open, it would be much easier to make radical changes to the X.org graphics stack. Imagine having open source drivers with KMS and Gallium3D on every major card! The X.org server could have a lot of old functionality stripped out, and desktop Linux would be faster and more robust in general.
A lot of people blame X. But we can’t change X without changing the drivers. And when we don’t have the source to those drivers, X will be stuck in the past.
Perhaps, then, it’s time to change without nVidia? Sometimes you’ve just got to grow a backbone and lay it out. You tell everyone what’s going to happen and then you do it. We shouldn’t be held in the past because of drivers from one vendor; look at Windows for an example of how much cruft that eventually brings.
Haven’t we already suffered enough, over the last year or more, from Xorg churn and chaos??? It’s time to *stabilize* Xorg. Clearly, we’ve had enough dream-chasing for now. Holders of grand visions for Xorg need to just chill, meditate, do some opiates, or go f–k themselves for a while.
Indeed, but nVidia or no nVidia, Xorg is always going to be in a constant state of flux. Actually, stabilizing and desktop Linux seem to be quite opposite concepts right now, and probably always will be. Stabilizing only happens if there’s some guiding entity behind a project, and that’s simply not the case with desktop Linux. Xorg has its own agenda, GNOME and KDE have their own, GTK has its own… and everything is going in different directions. I think a stable desktop Linux is a pipe dream. The only good thing about the chaos in Xorg now is that it may lead to stability and better performance later, but then again everyone always says that when major changes destabilize a product (*cough* Vista). If Xorg does stabilize because of these modifications, however, losing nVidia’s binary blob would be a small price to pay if performance is much better on all other video cards.
Why can’t good decisions regarding feature inclusion/exclusion, upgrade path, etc. keep the Xorg project stable, within reason?
Note that you have just changed the subject, completely.
Gnome and GTK+ have been evolving gradually and comfortably. As an admin of corporate desktops, my radar on that is pretty good. My criticisms and opinion of KDE4’s leadership are likely well enough known around here that I don’t need to repeat them.
The problem is not centralized leadership, or the lack thereof. It is cavalier and reckless decision-making at Xorg and KDE. The problem is leaders succumbing to the temptation to compete on the basis of marketing bullet points rather than cultivating stability.
I suppose those ultimately at fault are those cheering the ever-destabilized projects on, providing positive reinforcement for bad decisions, when they should be scolding the leadership. Large parts of “Desktop Linux” have done a very good job of maintaining order and stability atop a wave of gradual and steady evolution… and they have been criticized for not *innovating enough*. (Thanks a lot, Microsoft. This is largely a result of distortions you have propagated for your own marketing purposes.)
A stable Linux Desktop stack is not a pipe dream. It simply requires that certain projects get their heads out of the clouds and get with the program. If Xorg would just get its act together, Gnome users could already be enjoying one.
P.S. Aside from the large projects, pointless, destabilizing, regression-producing atrocities like PulseAudio also need to be dealt with. But remember that individual distros *do* have strong, centralized leadership and are in a position to include or exclude such things as appropriate… if only *they* can avoid succumbing to the bullet-points-over-stability temptation.
That’s certainly possible (look at Wayland), but have fun trying to convince Ubuntu to drop support for one of the most common GPU vendors.
It’s easy to say “let’s just drop support for nVidia”, but in real life, it isn’t so easy. It would end up hurting desktop Linux more than helping it, because it would make a large fraction of hardware unusable.
I would not blame nVidia for the current state of “desktop Linux”. Desktop PCs are used much more for office productivity applications (e.g. word processors, spreadsheets, etc.) than for gaming/3D-centric applications, and in many instances it is Microsoft’s Office suite (on the Microsoft Windows platform) that rules the “office” space. Once UNIX/UNIX-like systems have enough market share in the desktop arena, then I’d consider whether nVidia has any deleterious effects on Linux/etc.
Personally, I use OpenSolaris for my development (Sun Studio C++) and office (OpenOffice) needs, and I am quite happy with this. Part of my coding deals with my own C++-based multimedia engine (OpenGL 3D graphics, OpenAL audio, etc.), and the NVidia driver for (Open)Solaris has been fine for my development/testing/usage needs.
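For what it’s worth, the quickest way to confirm that the accelerated NVIDIA driver (and not a software renderer) is what a GLX application actually gets is to print the GL strings. This is only an illustrative sketch, not code from my engine; the tiny unmapped window and context setup are just the boilerplate GLX needs:

    /* glident.c -- print which OpenGL driver the X server hands us.
     * Sketch only; build with:  cc glident.c -o glident -lGL -lX11        */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <GL/gl.h>
    #include <GL/glx.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        int attrs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
        XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attrs);
        if (!vi) { fprintf(stderr, "no suitable GLX visual\n"); return 1; }

        /* A small unmapped window is enough to make a context current. */
        XSetWindowAttributes swa;
        swa.colormap = XCreateColormap(dpy, RootWindow(dpy, vi->screen),
                                       vi->visual, AllocNone);
        swa.border_pixel = 0;
        Window win = XCreateWindow(dpy, RootWindow(dpy, vi->screen),
                                   0, 0, 16, 16, 0, vi->depth, InputOutput,
                                   vi->visual, CWColormap | CWBorderPixel, &swa);

        GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
        glXMakeCurrent(dpy, win, ctx);

        printf("GL_VENDOR  : %s\n", (const char *)glGetString(GL_VENDOR));
        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("GL_VERSION : %s\n", (const char *)glGetString(GL_VERSION));

        glXMakeCurrent(dpy, None, NULL);
        glXDestroyContext(dpy, ctx);
        XDestroyWindow(dpy, win);
        XCloseDisplay(dpy);
        return 0;
    }

On a working NVIDIA setup the vendor string reports NVIDIA; if you see a Mesa software renderer instead, acceleration isn’t actually being used.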
Most Linux/(Open)Solaris/Unix/etc. users would realise that there is a “sheep” mentality concerning PC usage, from lack of time/inspiration/etc., where most people accept what is put in front of them (a Windows-based PC) and never consider whether a better (or less troublesome) computing experience is available.
However, after enduring the “pain” of using an insecure, virus-prone Windows-based PC, these people tend to be easy candidates for migration to a more properly designed UNIX-based system like OpenSolaris. I should know, as I have converted friends from Windows to OpenSolaris.
When there is a problem they will call you first.
Just have them use Vista in limited user mode and show them how to only download software from safe sites like Softpedia. I have my friends and relatives doing that, and not one of them has gotten a virus.
You really are smoking something if you think Linux is less troublesome than Windows. I used Linux exclusively for 4 years and recently gave up on that when I realized I was spending an ungodly amount of time installing new kernels to get new hardware to work, figuring out why old hardware broke, keeping self-compiled software working as libraries updated, etc. It was no longer worth my time.
Sure, you might say I was doing it wrong, but what part of “you couldn’t figure out how to do it right in four years” is supposed to convince me it’s less troublesome? If given that time I couldn’t figure out how to do it right, it’s troublesome.
That will last only slightly longer than you are there to hold their hand, guaranteed. Also from personal experience.
Bottom line: if you can figure out how to keep ALSA/PulseAudio, Xorg, the kernel, GCC, etc. working, you are savvy enough to keep yourself safe from malware on Windows, where your hardware (and software; yeah, really: all the proprietary Windows stuff AND the open source stuff) is a lot more likely to be supported. Or you could pay for OS X and have your hand held all the time.
I keep Linux around, and love what it is and stands for, but my time is better used elsewhere.
That change is in the pipeline right now; anyone running the latest in-development Xorg software stack to get the new radeon driver + KMS + 2D/3D hardware acceleration has seen it.
NV will be forced to change the current structure of their driver to work with it, but I imagine they’ll manage to keep up (I’m sure they have a new rev of their driver in development for the new Xorg codebase right now).
Maintaining that external binary blob, though, is what always complicates their ability to support Linux (that and never treating Linux as a first-class platform alongside Windows). AMD will no longer have this problem once their open driver takes over from fglrx; AMD’s hardware support will all be either in Xorg (Mesa + driver) or in the kernel itself (KMS).
Because a Unix with a stable ABI for binary drivers is just unthinkable.
Oh wait, there’s Solaris and OS X. Never mind.
But Linux kernel devs are never wrong. Giving the finger to hardware companies is a wise strategy when you have 1% of the market.
You must work on your typing, Mr Jerkov. You meant to type
http://www.slashdot.org
No. An unstable API, poor vendor support, and a lack of respect for IP among the FOSS community (driving away potential killer-app developers) are holding desktop Linux back.
If X.org was refactored in such a radical way, nVidia would release an updated driver.
Linux Hater wrote one of the best comments on the state of Linux graphics and nVidia.
http://linuxhaters.blogspot.com/2008/06/nitty-gritty-shit-on-open-s…
I’ve been in computer graphics for 20+ years now. All I can say is that no one in this business would consider anything but Nvidia for its graphics solutions. Intel is out of the question, and ATI has minimal impact with its FireGL/FirePro cards, while Radeons are simply not used either (GeForces, on the other hand, do their job pretty well).
I’ve found the interview very sensible and hopeful. All these discussions about open source are interesting only to a bunch of FOSS fanatics. No one in the real world cares about the nature of the source code in their drivers. What we care about is driver quality, and Nvidia is just so far ahead (even on Windows) that there’s simply no discussion.
On a side note, the idea that a graphics driver is holding the Linux desktop back is just ridiculous.