“On the Phoronix Forums we have been running a Q&A with the developers of the Nouveau project. For those out of the loop or new to Linux, the Nouveau project aims to provide an open-source 2D/3D graphics driver for NVIDIA hardware. After collecting a number of questions from our readers, KoalaBR and Marcheu have answered these questions. The questions range from whether there will be open-source SLI support to asking if NVIDIA has ever contacted the Nouveau developers.”
I hope this project achieves its goals soon. For mainstream users the nVidia driver is good enough, but nVidia is dropping support for older-but-capable hardware. I’ve got a GeForce 3 Ti200 card that is a few years old and runs the new WM and Xorg capabilities very smoothly, but nVidia keeps us (the [not so] old hardware users) in the “8000 series driver ghetto”. The first releases of the 9000 series supported that hardware, but DVI output didn’t work properly (a problem that had already appeared and been solved in the 8000 series), and the latest releases (the ones with support for the new features) drop support entirely.
So, once again, people who resist unnecessary hardware upgrades, or who don’t represent an important market share, are abandoned before their hardware becomes unusable. It’s the same story with bcm43xx hardware and other devices. This is *sad*.
So thanks a lot to the Nouveau guys, and to the people behind similar projects.
I get where you are coming from, but really, that is life. Old technology is superseded by new technology (in the case of computer graphics, very, very quickly). The burden in terms of time and effort required to support older hardware is just not worthwhile in most cases, and for Linux to make serious inroads into the desktop OS scene, it needs to support current 3D hardware through a simple driver installation, if not out of the box. So the priority for the Nouveau team should be to get the latest DX10/OpenGL 3.0 cards working before worrying about obsolete hardware. It is very hard to reverse engineer drivers for 3D hardware, especially when the bulk of the market for that hardware expects full support for all of the features of the latest cards, not stuff that is several years old.
So while I can empathise with your frustration, unless you are prepared to contribute resources to the project to enable them to work on out of date hardware, I think you will be stuck with the commercial nVidia driver for quite a while longer.
I don’t understand what you’re trying to do with this argument – it is worthwhile for users of older or non-mainstream hardware to have these capabilities, and it is precisely for these users that an open-source solution (which can service the long tail of the market) is necessary. Why should Nouveau focus only on the latest and greatest, when some people just want their cards to ‘work’? Your argument would make more sense if you were advocating that nVidia open source their own drivers – and I don’t think that ethical quandaries over ‘binary blobiness’ are what’s stopping Linux from making inroads on the desktop OS scene.
It seems like the most progress has been made on the 6xxx and 7xxx series cards, which aren’t exactly the latest and greatest – I think they are probably the most popular Linux graphics cards right now. You could make an argument that supporting the older cards first would be simpler and would then let the work be extended to newer cards, but I’m not sure whether that is really true. Maybe the architecture just changes too much between releases.
I think I know why Linux is not gaining ground. I read the GPL and it includes a big disclaimer. I think success requires you to put your ass on the line and stand behind your product with a guarantee. That’s what customers want – a product that’s guaranteed by somebody.
Have you read the NVIDIA driver license lately?
http://www.nvidia.com/content/license/location_0605.asp?url=-1
6.1 No Warranties. TO THE MAXIMUM EXTENT PERMITTED BY APPLICABLE LAW, THE SOFTWARE IS PROVIDED “AS IS” AND NVIDIA AND ITS SUPPLIERS DISCLAIM ALL WARRANTIES, EITHER EXPRESS OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE.
6.2 No Liability for Consequential Damages. TO THE MAXIMUM EXTENT PERMITTED BY APPLICABLE LAW, IN NO EVENT SHALL NVIDIA OR ITS SUPPLIERS BE LIABLE FOR ANY SPECIAL, INCIDENTAL, INDIRECT, OR CONSEQUENTIAL DAMAGES WHATSOEVER (INCLUDING, WITHOUT LIMITATION, DAMAGES FOR LOSS OF BUSINESS PROFITS, BUSINESS INTERRUPTION, LOSS OF BUSINESS INFORMATION, OR ANY OTHER PECUNIARY LOSS) ARISING OUT OF THE USE OF OR INABILITY TO USE THE SOFTWARE, EVEN IF NVIDIA HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
“That’s what customers want – a product that’s guaranteed by somebody.”
Because that’s exactly what all the commercial companies are doing. Have you actually ever read an EULA? It’s pretty much just a way for the company to avoid taking any responsibility whatsoever for their product. MS, Sun, Apple, IBM, etc. – all the same. If something goes wrong, it’s not their fault and not their responsibility.
All software comes with a disclaimer. Nobody wants to take responsibility for their software product – understandably, given the many unknown factors (endless hardware and software combinations and weird user behavior).
Well, yes you are right in the sense that the nVidia drivers work reasonably well, and are easily installed on most distros.
The thing is, the Nouveau team will always be several steps behind nVidia, simply because they are trying to reverse engineer something. This takes considerable time, and if the Nouveau team focus on cards that are old (eg, 2 years or more), by the time they have developed a fully functional driver for it, it will be well and truly ancient, and even people clinging to old hardware may have moved on.
If you are going to reverse engineer something, start with the 8800/8600 series, because by the time you have a fully functional driver for these cards, they will be starting to look a bit old.
So yes, it is worthwhile for users of older hardware to have these drivers, but the definition of older hardware will shift, and what is bleeding edge today will be older hardware by the time you get a decent driver written for it.
Unless of course nVidia or ATI/AMD open up their drivers, or Intel actually make graphics hardware worth using…
The parent comment shows a lack of understanding of how graphics cards, and most other kinds of technology, evolve through product generations. It assumes that reverse-engineering the 8 series is a distinct effort from reverse-engineering older NVIDIA cards. This is simply not the case.
For one thing, as you can see from the article, the “reverse-engineering” is not the most resource-intensive part of the process of producing an alternative graphics driver. The Nouveau team already knows most of what they need to know about the cards. The implementation is significantly more work than the discovery of the functionality.
But more importantly, the various generations of NVIDIA cards are related to one another much like the layers of a Russian matryoshka doll. It doesn’t make sense to create a driver for the 8 series without implementing the previous generations as functional subsets.
Similarly, it doesn’t make sense to work from the outside in. Basic 2D comes before textures, T&L, and the various iterations of programmable shader models. Besides the engineering sensibility, very few applications use the most recent shader models. The leading-edge applications often trail the hardware by 6-12 months, and most applications are multiple generations behind.
The same marketing ploys that inform the parent have consumers buying cards with features they might not use until after they jump on the upgrade treadmill yet again. By the time applications start using the features, the graphics cabal will have long since started beating the drums that tell consumers that these cards are “starting to look a bit old”.
Case in point: the latest midrange performance box recommended by Ars is under $1500 US including monitor, speakers, and input devices. This includes a $379 graphics card, equal to the CPU, motherboard, and half of the memory combined (or more than four times the cost of all of the memory).
If I didn’t know better, I’d say it seemed like there might be something fishy about this conspicuous splurge on one item within a system that is otherwise quite modest.
It simply doesn’t make sense to spend so much money on a graphics card for a Linux box. I can’t imagine what I’d do with anything beyond a 7600GT, just about the cheapest card with dual-DVI. A hardware MPEG decoder would be nice, but as the article explains, there’s currently no way to expose it to applications through X11 or OpenGL.
Plus, with quad-core CPUs reaching down into the sweet spot and many more cores on the horizon, the case for discrete accelerators in general is gradually eroding. Between Fusion, Larrabee, and SSE5, it’s becoming more and more likely that the CPU will win the stream processing tug of war in the mainstream space.
I for one welcome the impending niche-ification of specialized graphics processors (discrete and integrated alike). I was never a fan of the clunky, proprietary programming models anyway, and common form factors were never designed to properly address the thermal dilemma of 130W expansion cards.
But in the meantime, the free software graphics community should ignore the hype and focus on implementing the subset of functionality (on as many cards as possible) that is likely to be the most useful for our application developers. Even Carmack is fed up with the marketing machine.
Enough is enough. The parent comment is an example of the strained logic that has been widely indoctrinated in the enthusiast community. It doesn’t take a thermonuclear jet engine to enjoy a rich multimedia experience. Just some honesty and some efficient free software.
Wow! So *that’s* what all those cores might be good for on a desktop.
Good post, Butters.
I was actually lumping the reverse engineering process and the implementation of functions together. After all, it is not much good to reverse engineer something if you then do nothing with that knowledge.
I agree that to the extent that graphics cards have evolved so that older features are a subset of newer features, you are right. If you implemented a driver for an 8800GTX that fully utilised every feature of the card, it would be a relative piece of piss to adapt that driver to just about every nVidia card that preceded it, because for the most part, all you would have to do is disable the features that the earlier cards don’t support. This is roughly how nVidia do it with their unified drivers, AFAIK.
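To make the “unified driver” idea concrete, here is a minimal sketch in C of how per-generation capability flags might work. The flag names, chipset labels, and layout are invented purely for illustration and are not taken from any real nVidia or Nouveau source:

#include <stdio.h>
#include <stddef.h>

/* Hypothetical capability flags -- invented for illustration only. */
enum caps {
    CAP_2D_BLIT     = 1 << 0,  /* basic 2D acceleration */
    CAP_FIXED_TNL   = 1 << 1,  /* fixed-function transform & lighting */
    CAP_SHADER_SM3  = 1 << 2,  /* Shader Model 3.0 */
    CAP_UNIFIED_SM4 = 1 << 3   /* unified shaders (8 series and later) */
};

struct chipset {
    const char  *name;
    unsigned int caps;
};

/* Each newer generation carries the previous one's features as a subset. */
static const struct chipset chipsets[] = {
    { "GeForce 3 (NV20)", CAP_2D_BLIT | CAP_FIXED_TNL },
    { "GeForce 6 (NV40)", CAP_2D_BLIT | CAP_FIXED_TNL | CAP_SHADER_SM3 },
    { "GeForce 8 (G80)",  CAP_2D_BLIT | CAP_FIXED_TNL | CAP_SHADER_SM3
                          | CAP_UNIFIED_SM4 }
};

int main(void)
{
    /* One code path serves every card; features absent from a given
       generation are simply never enabled. */
    for (size_t i = 0; i < sizeof chipsets / sizeof chipsets[0]; i++)
        printf("%s: SM3 %savailable\n", chipsets[i].name,
               (chipsets[i].caps & CAP_SHADER_SM3) ? "" : "not ");
    return 0;
}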
But this only emphasises the need to tackle the most recent hardware you can, because in doing so you cover the widest possible range of cards in the market. Just starting with a Ti200 is not going to help that many people, because most of us have moved on to something a bit more powerful, and would expect to get the same functionality from the card with FOSS drivers as with the proprietary ones to consider them worthwhile.
I don’t think the GPU will be done away with for a long time, especially when it comes to games. You can only pack so many transistors onto one chip before you run into serious heat dissipation and quality control issues – CPUs are designed for serial processing, GPUs are designed for parallel processing, and the two functions are too different to combine into one chip adequately and affordably.
My preference would be for a drop-in GPU chip rather than an actual card. Graphics capability evolves very quickly, and it is a lot cheaper to upgrade just the graphics card for modern games, because most games are GPU limited, not CPU limited. You can keep going with the same CPU for years and still be able to play the latest games, but you need to upgrade graphics cards every 12 to 18 months to keep up. Or you can just wait patiently for games to get really cheap before you play them, as I do, and only upgrade every 4 years or so. Either way, I doubt it would be more cost effective to go with combined CPU/GPU chips – the more complex the chip, the more expensive it is.
No it doesn’t take a thermonuclear jet engine to enjoy a rich multimedia experience, but if you want to build a Linux based home theatre PC that can play Blu-Ray discs with hardware h.264 decoding (HDCP DRM notwithstanding), then those features of the latest nVidia cards (the 8600 series) need to be worked on. There are a reasonable number of 3d games that run well on Linux (either natively or through Wine) provided there is a decent graphics card supporting fairly recent features available. Of course, you can use the proprietary nVidia drivers for this, but if you are insisting on open source varieties, you are out of luck at the moment, and probably for some time to come.
The best way to get the widest possible number of nVidia cards working with Nouveau drivers is to start with the 8600 series. Bear in mind also, many of the older features, like 2D acceleration, are legacy functions that will eventually disappear from graphics cards, and are increasingly not used at all in modern software, so there is a limited need to work on them.
Sorry, but I just have to jump into the discussion… You’re thinking of the whole process in completely the wrong way… You DON’T start from the top and work your way to the bottom. You DO start from the bottom, from the easier and smaller things, and work your way towards the top. It would be pure idiocy to aim for the high-end stuff right from the start when you don’t even have fully working 2D yet. I don’t really understand your obsession with the 8×00 series of cards… As all the features of earlier models are also present on the new cards, you don’t lose anything whatsoever when they support the older cards too!
About h.264 decoding… Well, there is no way to use that hardware functionality under Linux at the moment anyway! Neither Xv nor XvMC supports it, not even with the proprietary drivers. And you don’t need an 8×00 card to enjoy hardware-assisted decoding. As far as I know, you could implement at least some parts of the decoding process in SM 3.0 shaders, so even older models would be usable (a rough sketch of one such stage is below). Just don’t expect such things to happen for a good while, because the Nouveau devs are working on a lot of more important things first.
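For what it’s worth, here is a rough sketch (plain C with an embedded GLSL string, assuming a GL 2.0 capable context) of one decode stage that maps naturally onto shader hardware: the YUV-to-RGB colour-space conversion done at the end of video playback. This is only a small piece of the pipeline, not h.264 bitstream decoding, and the function names are just illustrative:

#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>

/* Fragment shader: sample the three planes of a YUV frame and convert
   to RGB using the BT.601 coefficients. */
static const char *yuv_to_rgb_src =
    "uniform sampler2D tex_y, tex_u, tex_v;\n"
    "void main() {\n"
    "    float y = texture2D(tex_y, gl_TexCoord[0].st).r;\n"
    "    float u = texture2D(tex_u, gl_TexCoord[0].st).r - 0.5;\n"
    "    float v = texture2D(tex_v, gl_TexCoord[0].st).r - 0.5;\n"
    "    gl_FragColor = vec4(y + 1.402 * v,\n"
    "                        y - 0.344 * u - 0.714 * v,\n"
    "                        y + 1.772 * u, 1.0);\n"
    "}\n";

/* Build the program once; draw each decoded frame as a textured quad
   with this program bound, and the conversion runs on the GPU. */
GLuint build_yuv_to_rgb_program(void)
{
    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &yuv_to_rgb_src, NULL);
    glCompileShader(fs);                /* real code must check status */

    GLuint prog = glCreateProgram();
    glAttachShader(prog, fs);
    glLinkProgram(prog);
    return prog;
}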
The best way to get the widest possible number of nVidia cards working with Nouveau drivers is to start with the 8600 series. Bear in mind also, many of the older features, like 2D acceleration, are legacy functions that will eventually disappear from graphics cards, and are increasingly not used at all in modern software, so there is a limited need to work on them
No, no, no, no, no! You have to implement the basic stuff first, so you can’t just aim that high immediately! And since you have to implement the basic stuff anyway, why artificially lock out older cards? AFAIK, things like 2D graphics work the same on all the cards across the board…
2D acceleration functions are legacy and will disappear? I don’t quite understand. You do know that apps don’t directly utilize those features; it’s X that does. And there will always be a need for 2D acceleration. Apps ARE 2D. Even if they were presented as 3D objects in a wide and rich 3D environment, it would be pure madness to remove 2D acceleration from the cards, as apps still need to draw things like buttons, text, toolbars and all that…
I disagree. If you use a bottom-up approach you will always be way behind. Older features become deprecated. Newer and better methods replace them. Ergo, start with the newer and better methods to stay in the game.
Let’s remember who drives the market: MS is still the main player, and Vista is designed to use the 3D functions of the card, and only resorts to the 2D stuff if you have a lesser 3D card. Once Vista becomes the mainstream installed OS, hardware manufacturers will likely start dropping 2D acceleration on their newer cards, along with a bunch of other functions that have been superseded by better methods.
If you start with the oldest features first, you will end up putting a lot of work into features that are likely to be obsolete or deprecated by the time you have them working. If you start from the top down, you only have to work back as far as necessary to cover the majority of use cases. The newer functions are also harder to implement, and IMO it is better to do the hard stuff first before knocking over the easy stuff.
Locking out older cards is not artificial – most people have a mid-range card that is neither particularly new nor particularly old. You therefore need to work on bleeding-edge hardware so that when you finally get a driver out, it will be applicable to the widest installed user base possible; otherwise the project winds up being an exercise in catering to the minority with ancient hardware, and will be of limited interest to anyone with newer hardware.
If the nouveau drivers don’t support programmable shaders, stream processing, hardware T&L, bump mapping, parallax mapping etc., they will be of little use to most people with modern software – they will use nVidia’s drivers.
2D Buttons, windows etc can be drawn much faster by the 3D functions of the card – the 2D acceleration features on modern cards have not been updated in years, and will only be around for as long as 3D desktops are not the norm. As demonstrated by Vista or XGL/Compiz-Fusion, 2D apps can be rendered quite effectively without using the near-obsolete 2d functions on cards.
I am not obsessed with the 8×00 series of cards – they simply happen to be the most recent nVidia cards available. By the time Nouveau have a functional driver, these cards will be very commonplace, therefore, it makes the most sense to target the card that will be the most popular when you expect it to be finished, so that you can compete with nVidia’s driver.
Competing with nVidia’s driver is a desirable thing, because it will put pressure on nVidia to increase the quality of their proprietary drivers or to open up the specifications or source code. If you just piss around making drivers for obsolete functions in ancient graphics cards that hardly anyone uses any more, then nVidia will have no reason to improve their drivers beyond their current level (unless AMD or Intel come up with something special).
Directly supporting any cards other than the most recent cards available is a waste of time in my view. By all means feed back the drivers down through the range, but start with cards people actually want to use FFS!
OTOH, if you view the Nouveau project as something only for users of old hardware, then all would appear to be well. However, the proportion of people who both a) like Linux and b) refuse to upgrade their hardware more than once a decade represents a very small minority of computer users, so I’d prefer to see people working on projects that have some kind of relevance to this century at least.
First of all, Nouveau isn’t competing with nVidia’s drivers; they’re just aiming to offer a FOSS alternative. As such they don’t need to rush into things. Besides, for recent hardware there are the proprietary drivers, but support for older cards is getting worse. So there is a need for a good, updated driver for older hardware.
Let’s remember who drives the market: MS is still the main player, and Vista is designed to use the 3D functions of the card, and only resorts to the 2D stuff if you have a lesser 3D card. Once Vista becomes the mainstream installed OS, hardware manufacturers will likely start dropping 2D acceleration on their newer cards, along with a bunch of other functions that have been superseded by better methods.
It seems to me that you don’t quite understand how graphics even works… Even when Vista uses Aero, it still DOES draw 2D graphics. Websites you browse and all that – the contents of the windows must be drawn by either hardware or software, and of course it’s more efficient to do it in hardware. Ergo, no one is going to write a driver that doesn’t accelerate those functions. Let’s take an example… I use Beryl with the GeForce 4 in my laptop. It provides me with a nice 3D cube and all that fluff, and the windows are drawn using the 3D capabilities of the card. However, you also need something to fill the window with, otherwise it would just be blank. That’s where 2D comes into play: X draws a 2D image of my Opera window into a buffer, and Beryl then draws that image onto a 3D surface; that’s what I see on my screen. Simple enough?
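For the curious, the mechanism compositors like Beryl use for that buffer-to-surface step is the GLX_EXT_texture_from_pixmap extension. A rough, error-handling-free sketch in C, assuming a suitable GLXFBConfig has already been chosen and the extension is known to be present:

#include <GL/glx.h>
#include <GL/glxext.h>
#include <X11/extensions/Xcomposite.h>

void bind_window_as_texture(Display *dpy, Window win, GLXFBConfig fbc)
{
    /* Have the X server render the window into an offscreen pixmap. */
    XCompositeRedirectWindow(dpy, win, CompositeRedirectAutomatic);
    Pixmap pix = XCompositeNameWindowPixmap(dpy, win);

    const int attribs[] = {
        GLX_TEXTURE_TARGET_EXT, GLX_TEXTURE_2D_EXT,
        GLX_TEXTURE_FORMAT_EXT, GLX_TEXTURE_FORMAT_RGBA_EXT,
        None
    };
    GLXPixmap glxpix = glXCreatePixmap(dpy, fbc, pix, attribs);

    PFNGLXBINDTEXIMAGEEXTPROC bindTexImage =
        (PFNGLXBINDTEXIMAGEEXTPROC)
        glXGetProcAddress((const GLubyte *)"glXBindTexImageEXT");

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    bindTexImage(dpy, glxpix, GLX_FRONT_LEFT_EXT, NULL);
    /* The window's contents are now an ordinary GL texture that the
       compositor can paint onto any 3D surface (a cube side, say). */
}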
About the features then… Well, you have to implement the basic drawing primitives and such anyway to even have a usable driver… Then you can start working on the 3D acceleration stuff. But hmm, if you don’t implement the older 3D features, what happens when you run an app that uses those features? And what do you consider features that should be dropped? 2D texturing? Following your logic it should be dropped without blinking an eye, since it’s ANCIENT… However, any app whatsoever that wants to display more than wireframe graphics depends on it. When you draw a 3D cube you have to create one or more textures first, then map those textures to the cube’s sides, and so on and so forth (see the sketch below). So you see, all of that builds on the older and simpler stuff. Even the more recent features like SM 3.0 depend on the older features in one way or another. So you just can’t drop support for those.
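To illustrate how even “modern” 3D output leans on that ancient texturing path, here is a minimal fixed-function GL sketch of exactly those two steps – create a texture, then map it onto a cube face. A generated checkerboard stands in for real image data:

#include <GL/gl.h>

static GLuint make_face_texture(void)
{
    static GLubyte pixels[64 * 64 * 3];
    for (int i = 0; i < 64 * 64; i++) {              /* checkerboard */
        GLubyte c = (((i / 64) ^ (i % 64)) & 8) ? 255 : 0;
        pixels[i * 3] = pixels[i * 3 + 1] = pixels[i * 3 + 2] = c;
    }

    GLuint tex;
    glGenTextures(1, &tex);                          /* step 1: create */
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 64, 64, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);
    return tex;
}

static void draw_cube_face(GLuint tex)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex);               /* step 2: map it */
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex3f(-1, -1, -1);
    glTexCoord2f(1, 0); glVertex3f( 1, -1, -1);
    glTexCoord2f(1, 1); glVertex3f( 1,  1, -1);
    glTexCoord2f(0, 1); glVertex3f(-1,  1, -1);
    glEnd();
}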
If you just piss around making drivers for obsolete functions in ancient graphics cards that hardly anyone uses any more
Where do you get the idea that hardly anyone uses anything older than the 8×00 series cards? :O Do you have any idea, for example, how many laptops there are running Linux that have something other than an 8×00 card? And since they’re laptops, you have to deal with what you have or buy a whole new laptop! Not everyone is always buying the latest and greatest, not everyone uses their computer for gaming, and so on and so forth… You are just trying to push people to implement the features YOU want to see.
Vista Aero draws 2D graphics using the 3D hardware. Yes, some advanced functions are built on top of older functions, and are here to stay for some time. But other parts are not. GPUs are increasingly becoming generalized parallel stream processors, and a lot of those set functions in older GPUs are redundant as a result.
I don’t think you are really understanding my post very well. Of course 8×00 cards are very new, and are not installed on that many computers NOW. It will take the Nouveau team at least another year or two to develop a comprehensive, stable 3D driver for these cards, by which time, the now uncommon 8×00 cards will be very common indeed.
You’re STILL trying to argue? *sigh* Well, I tried my best.. But I’ve just got one question: what features should be dropped then? Name a few examples?
Just wanted to make a few more things clear, either to you or anyone else reading these comments:
If you create an app that draws a line with a pixel width of 6, starting from point (0,0,0) (x,y,z) and ending at, say, (30,50,0), is it 2D or is it 3D? Looking at the screen, you just can’t tell, and essentially there is no difference. It could be drawn by dedicated 2D hardware functionality, but it could just as well be drawn by the 3D pipeline. And that’s how cards nowadays do it: they just use the 3D pipeline to accelerate the 2D drawing functions. 2D graphics is nothing more than pixels drawn at the same virtual depth, whereas for 3D objects the Z value can vary.
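If it helps, here is that exact example as a tiny legacy-GL program (GLUT is used only for window setup): the “2D” line is submitted through the 3D pipeline with z held at 0 and an orthographic projection mapping units straight to pixels:

#include <GL/glut.h>

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);

    /* Map GL coordinates 1:1 onto window pixels. */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, 640, 480, 0, -1, 1);

    glLineWidth(6.0f);
    glBegin(GL_LINES);
    glVertex3f(0.0f, 0.0f, 0.0f);   /* the same endpoints a "2D" API */
    glVertex3f(30.0f, 50.0f, 0.0f); /* would take, with z fixed at 0  */
    glEnd();

    glFlush();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitWindowSize(640, 480);
    glutCreateWindow("2D via the 3D pipeline");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}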
I hope that makes it clear.
Excellent, you just explained why the old 2D acceleration feature of graphics cards is now redundant, and why the 3D functions are used to accelerate 2D graphics!
That’s life, and life has its sad things – so what’s the problem?
No, really, what I mean is that they drop support unnecessarily. There are tons of people with this kind of card – tons of *customers* – who don’t deserve to be treated that way. The effort needed to keep supporting cards like that is minimal, since there’s one unified driver to rule them all. I don’t know the technical details, and perhaps I’m wrong, but if you asked me for a reason for nVidia to drop support, I’d answer that nVidia wants to force their older customers to upgrade their hardware.
But that’s not a verified fact, and perhaps the effort required is bigger than I think.
I like this part in particular:
No holy wars, just sane reasoning. How refreshing. I wish them great success.
Distributions should pull together to get Nouveau to a really decent state. If each distribution devoted an engineer half-time to it, we could soon no longer need Nvidia.
Of course, it would be even better if Nvidia actually decided to support the Nouveau efforts, but I really do not see that happening.
The nVidia card in my workstation is working well enough. My laptop has an ATI card, which is all but useless. Eventually, I hope to replace this laptop with an all-round Intel laptop (graphics, wireless, etc.). My brother has an Intel laptop and it is a thing of beauty to have everything work out of the box with no effort.
If Intel were to release a stand-alone card for PCs with open source drivers, I would also replace my NVIDIA card with it.
I am currently working with a hardware company to offer Linux-based workstations, and nothing would please us more than not having to worry about NVidia’s proprietary drivers, “as good” as they are.
In the old days, Matrox G550 cards were great for Linux, as they supported 3D, Xinerama and more. Sadly, Matrox has stopped offering competitively priced products that perform halfway decently.
They also stopped supporting Open Source developers, so even if they did offer consumer hardware, it wouldn’t be worth purchasing it.
In an ideal world, all the vendors would say “we don’t need secrets” and would make all their specifications open and their drivers open source – available not only for tweaking, but also so that those on non-x86 platforms could get the same level of support as their x86 counterparts. Unfortunately, we don’t live in that world.
Matrox was an awesome graphics card company; I owned a G450 and a G550. I would have loved to see a laptop with a G550 embedded on a PCI 2.0 connector, especially given its great level of OpenGL support.
As for Intel – Ars Technica had some ‘insights’ into their future direction; let’s put it this way, it is looking awesome. I would have gone for an Intel card (I’m using nVidia), but bad performance in the past from their 810 series really put me off. Maybe next time I’ll buy a machine with one, along with the latest wireless/chipset, etc.
I tell you the key to your problems and you mark me down. Commercial products come with certain guarantees. I don’t know what they are, but you can get your money back if the product doesn’t work. You guys are mystified as to why Linux is not as big as Windows. I honestly can’t see why you want it to be. Linux is less dumbed-down than Windows, and the only way you’re going to compete… is to make another Windows. Then we will have a free Windows and a non-free Windows, and they won’t be any different.
It’s like Las Vegas. Everyone is proud of growth. Why do they care? What do citizens gain from growth except more traffic nightmares and a strain on water supplies?
You guys should forget about growth.
Why don’t you read an EULA or try to get MS (for example) to compensate you if a bug in Windows wiped out your music collection. Good luck with that.
“I don’t know” are the keywords here.
You mean just like how you’d get your money back if you bought a boxed Linux distro in a store and you took it back?
Who’s confused about this? Who’s even talking about this?