The news already got out yesterday, but now it’s official: AMD will open the specifications to its graphics chips. “AMD announced on Sept. 7 a major strategic change in open-source graphics processor support. The company announced it would provide open-source information and a development package supporting the ATI Radeon HD 2000 series and ATI Radeon X1000 series of graphics processing units on Linux desktops.” The new information is that AMD will partner with Novell’s SUSE team.
Not that I’m a huge fan of ATI, but it’s about time one of the major graphics card makers open-sourced their drivers. Well, to some extent anyway.
They are not opening up their drivers to any extent. They are just providing the information needed to create new, free ones.
If I were AMD/ATI (or any hardware manufacturer for that matter), this is how I’d do it. Why bother to write the drivers myself when there are hackers out there perfectly willing to do it for me … for free!! You can’t beat that with a stick.
I don’t think you can hire Novell employees for free, and the in-house dev team that AMD established to support Novell doesn’t work for free either. Nor do I think that AMD’s own team that writes the updated Catalyst drivers for Linux works for free.
The article says that AMD does all these things and yet you wrote that comment. You did read the article *before* commenting, right?
Did *you*?
Neither ATi nor Novell are making an open source *3D* driver. They are working together to open the specs, as the previous commenter said, and to build a reference implementation 2D driver. The plan is that the ‘community’ will develop the 3D one. The fact that the community in this case probably mostly equates to developers paid by some company is neither here nor there, ATi won’t be paying to build an OSS 3D driver.
Ooops! *blush*
>>Why bother to write the drivers myself when there are hackers out there perfectly willing to do it for me … for free!! You can’t beat that with a stick.<<
Maybe, but let’s not fall into the delusion that free software developers are always able to create the best software. It depends:
– on the level of interest
– on the quality/completeness of the spec and whether the NDA is bearable or not.
I remember the time when previous ATI cards (7900?) had ‘open’ specifications: the free driver took a long time to mature, and I’m not sure it was very good at 3D in the end.
That said, now that GUIs are built on top of OpenGL, the interest from free software developers should be high this time, even if I’m not sure there are many free software developers who care about games on Linux.
All I can say about that is: thank God! I mean, have you actually tried to use ATI’s Linux driver? As much grief as those drivers have caused, I imagine the code would probably blind any developer who looked upon it. I have a mental image involving the Ark of the Covenant and a bunch of Nazis for some reason…
Yea, but will this mean several drivers to choose from, and confusion about which to use, like can happen with kernels? At least with proprietary drivers there’s no confusion about which driver to use. I mean, yea, open is nice because your hardware will always have a driver available (even if an old one).
This is fantastic news; this partnership with Novell should really kickstart development of open drivers.
It now looks like the likely candidate to replace my Nvidia 7800, when I feel it is necessary, will be ATI… I never thought those words would come out of my mouth.
Does anyone know if this will also include the ATI TV tuner chips from the All In Wonder series?
I’m on the other side of your statement. I’m a few months away from a major system upgrade and until now had been set on the idea of nVidia/Hauppauge for my gaming 3D and TV tuner functions.
I’ve been an ATI customer since my first system build and my humble first-generation ATI All In Wonder, and it was the piss-poor drivers that convinced me to consider another brand of video card. Under Windows, the ATI-provided drivers and media software are just plain broken (driver updates break the media software, and cleanly installed media software freezes regularly). Between the “fully functional” ATI-provided Linux kernel mod and the community-developed “minimally functional” kernel mod, the community driver is far better and more stable (ATI’s mod won’t give me a frame rate remotely close to basic 2D use).
I’m still planning on nVidia 3D and a Hauppauge TV tuner for the new rig, since ATI just doesn’t have something like the 8800 board right now. If ATI manages to help the community develop a stable and fully functional driver with TV tuner support *and* gets something close to the 8800 on the market, then it’ll again be a fair fight for my business. More realistically, my hope is that this drives nVidia to release complete driver interface specs, since Hauppauge has a long history of working cleanly under Linux.
Hmm, Novell… Non-Disclosure Agreements… I hope I’m just being too nervous.
Makes me nervous too, but Novell and AMD have delivered before (the Opteron port of Linux), so there’s reason for optimism.
I’m sorry, Nvidia works on Solaris, FreeBSD and Linux – when ATI works on all three I’ll be happy to buy their hardware.
BTW, what’s all the fuss about 2D drivers for ATI? VESA is good enough for text, graphics and video.
>I’m sorry, Nvidia works on Solaris, FreeBSD and Linux – when ATI works on all three I’ll be happy to buy their hardware.
BTW, what’s all the fuss about 2D drivers for ATI? VESA is good enough for text, graphics and video.<
Have you ever seen how windows redraw with the VESA driver?
Also, this is the full specs, which will allow both 2D and 3D drivers to be developed for any OS that dedicates developers to it.
You obviously never used VESA drivers at a resolution higher than 800x600.
With closed drivers, or the poor but open nv driver, I guess it does.
With Open Source drivers, that’s clearly going to happen.
No it isn’t. Even slightly. Even for a “light” system like Syllable, VESA is rubbish. At a bare minimum you need hardware blits.
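To make that concrete, here’s a minimal sketch (the framebuffer layout and types are assumed, not taken from any real driver) of the pixel-pushing loop the CPU is stuck with when the hardware offers no blit, which is exactly the VESA situation:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical software blit: move a w x h rectangle inside a linear
 * framebuffer. Under plain VESA, every byte crosses the CPU and the
 * slow, typically uncached framebuffer bus. */
static void software_blit(uint8_t *fb, size_t pitch, int bpp,
                          int src_x, int src_y,
                          int dst_x, int dst_y, int w, int h)
{
    size_t row_bytes = (size_t)w * bpp;
    if (dst_y <= src_y) {                        /* copy rows top-down */
        for (int r = 0; r < h; r++)
            memmove(fb + (size_t)(dst_y + r) * pitch + (size_t)dst_x * bpp,
                    fb + (size_t)(src_y + r) * pitch + (size_t)src_x * bpp,
                    row_bytes);
    } else {                                     /* overlapping move: bottom-up */
        for (int r = h - 1; r >= 0; r--)
            memmove(fb + (size_t)(dst_y + r) * pitch + (size_t)dst_x * bpp,
                    fb + (size_t)(src_y + r) * pitch + (size_t)src_x * bpp,
                    row_bytes);
    }
}
```

With a hardware blit, that whole loop collapses into a few register writes and the GPU moves the rectangle itself, which is why even plain 2D feels sluggish on VESA at high resolutions.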
Open source drivers will make it on all the major *nix platforms. This is seriously good news for FreeBSD users who’ve always suffered from terrible ATI support.
I am going to continue to use the proprietary drivers because R300 card specs (my Radeon 9600XT) won’t be opened and the R300 FOSS drivers are slow, buggy, and incomplete.
A) I read the same headline last year…
B) they talked about open-sourcing only the 2D part
C) 5 years too late…
I used to like ATI… but they have a long way to go before I ever install one again on a Linux machine.
A) They announced they were planning something along these lines but didn’t go into details. This is concrete.
B) They’re opening the *specs* for both and also providing a reference 2D driver for people to build 3D support on top of based on the specs. The specs are the important bit and what the OSS community have been clamouring for from ATi and nVidia for years.
C) What the hell? They’ve opened the specs, and now you’re whingeing that they didn’t do it earlier? This was clearly AMD’s doing, now that they own ATi.
The difference is that AMD didn’t own ATI 5 years ago, otherwise you would have a point.
Why can’t AMD just publicly release their ATI info, release the ATI specs and documentation? Since OpenBSD is 100% against all binary and proprietary stuff, if AMD provides enough info that OpenBSD can write their own code based on ATI’s specifications and documentation, THEN I will call it truly open and will buy ATI instead of using on-board. I know the OpenBSD development team actually prefer specs and documentation over a company giving them driver source code, that’s why I say if they get enough info to write their own native driver to support ATI cards.
It seems that is exactly what they are going to do, PLUS work with Novell to write the free code. Surely the OpenBSD programmers can participate too, if they are not above collaborating with lesser beings.
Haha, I like OpenBSD as much as the next open source enthusiast, but seriously, when was the last time you needed an OpenBSD box with killer graphics performance?
OpenBSD’s intention to get full documentation and specifications from hardware makers is noble, but they need to deal with the reality that it is not that easy in the hardware manufacturers’ world, especially where there are third-party patented specifications to deal with.
I don’t think that the OpenBSD devs think it’s an easy task. They try anyway, because without trying you can’t achieve anything.
That’s exactly what AMD and ATi are doing here.
“Why can’t AMD just publicly release their ATI info, release the ATI specs and documentation? Since OpenBSD is 100% against all binary and proprietary stuff, if AMD provides enough info that OpenBSD can write their own code based on ATI’s specifications and documentation, THEN I will call it truly open and will buy ATI instead of using on-board. I know the OpenBSD development team actually prefer specs and documentation over a company giving them driver source code, that’s why I say if they get enough info to write their own native driver to support ATI cards.”
I like this comment. You are doing what many passive people are not: taking a stand and thinking for yourself. That is why I mainly use OpenBSD and dislike binary blobs. I want to see the code.
You’re making one judgment error that all open source advocates make – assuming that well-written documentation actually accompanies the drivers/hardware.
As sad as it sounds, good programming practice within many organisations takes a back seat to pushing the product out the door in the fastest possible time.
“As sad as it sounds, good programming practice within many organisations takes a back seat to pushing the product out the door in the fastest possible time.”
Says a lot about the professionalism of the IT industry, doesn’t it?
Nevertheless, there’s no reason we shouldn’t expect and demand professionalism just because some companies can’t measure up.
This won’t change anything overnight but it’s a good start. What happens next greatly depends on the quality of the drivers that come out of it. If they suck, that may be the end of open-source graphics drivers for a very long time. We’ll just have to hope that Intel start developing high-end graphics cards.
Actually, I expect it to be developed quickly and be suitable for composited desktops. The R500 is not that different from the R300 in its 3D core, and lots of the needed features are already implemented. Also, many common parts from DRI and Mesa can be reused, so basically 50% of the work is already done, and the rest isn’t that hard.
Doesn’t this portend a great opportunity for new, exciting 3D games on Linux? Y’all are all work and no play. I can’t believe nobody mentioned this yet! That alone will provide a boost to the desktop market.
“Doesn’t this portend a great opportunity for new, exciting 3D games on Linux? Y’all are all work and no play. I can’t believe nobody mentioned this yet! That alone will provide a boost to the desktop market.”
I whole-heartedly agree. One of the biggest selling points of Windows for people who are into it is the fact that games and games development are much easier on Windows, due to the common interfaces and unobstructed support of the hardware on that platform. Having the same on other OSes with OpenGL will now open the doors for more games to come out without much fuss, as far as the system itself supports it, not to mention the potential for true cross-platform development.
What works on one should work on the other with minimal changes or adaptations.
Actually, the biggest selling point is that Windows has such a huge install base. Which is why Macs are still so far behind in games, even though it’s a pretty coherent environment.
That, and the fact that it also costs money to have Linux developers on your team in addition.
It’s a big commitment for a company to decide to write a Linux version of their game, and apparently only a handful of them think it’s worth it.
Cash is basically the main driving force there.
I think you’re partially correct in your assessment of why the big 3D games aren’t likely to ever end up on Linux in meaningful numbers. Outside of the larger install base of Windows (where most gamers are by the numbers), people using “free” operating systems seem to demand that all their applications (including games) be “free” as well. With the extremely huge development costs of a lot of modern games, it just doesn’t rate very high on business sense for a game developer to release games onto a platform that has (right or wrong) the reputation of having users generally not willing to pay for their software. Of course, after they’ve milked the first wave of buyers (generally on Windows) and there are no more meaningful new sales on Windows, perhaps you might make a case that it’s good PR to do a Linux release, and maybe it is good PR, but it’d have to be an easily ported game to make it worth even that much.
However, there are some types of games (WoW, EQ) where I could see it being irrelevant what the platform is, or even whether people are willing to buy the binary itself, because they require a subscription to play the game over the network. In that case, it’s less a technical reason than the Windows platform being the biggest platform (though that’s still true), and it largely then devolves into testing with all the various mutations of Linux, which can get expensive to do proper installation and play-testing on.
“the reputation of having users generally not willing to pay for their software.”
Well, if we are to believe the BSA et al., piracy is so rampant on Windows that it shouldn’t really be profitable to make games for it either. I’d wager most games developed for Windows don’t break even, which raises the question of how in the hell all these companies make money.
Perhaps it’s more the perceived chance (or misguided hope?) that the larger install base will bring profits.
“largely then devolves into testing with all the various mutations of Linux, which can get expensive to do proper installation and play-testing on.”
You wouldn’t really have to test with all of them, just the big 5 or so, and if it works on one of the big 5 it’s pretty likely it works on all of them.
I gave up hope for Linux gaming – and not just because I have already been waiting for a decade.
If you check the latest game developments and listen to people like Carmack, there is a clear shift away even from the Windows PC. Game developers have a focus on consoles, and they will not optimise for the Windows PC in the future. That makes Linux and OS X even more of an afterthought in games development than they used to be. And before you see someone port to Linux, you’ll see a Mac port.
I am not saying it cannot be done; I am saying it is ever more unlikely you’ll find someone who’s willing to do it.
I think it’s a really good thing that game development is shifting from the Windows PC to the consoles. That means more OpenGL code is being written (Wii, PS3) and less DirectX. This will make it a lot easier to port these games to Linux, Solaris, FreeBSD or Mac OS X.
And if this move makes the PC obsolete as a gaming platform for both Windows and Linux/Mac users I am fine with that too. I will buy one of the OpenGL consoles and play my games there.
“I think it’s a really good thing that game development is shifting from the Windows PC to the consoles. That means more OpenGL code is being written (Wii, PS3) and less DirectX.”
Isn’t the Xbox 360 using DirectX?
I suppose the game developers use a wrapper so they can switch to OpenGL.
By this stage, it’s pretty much irrelevant anyway.
The consoles are all different enough from each other that you really need to write a different renderer for each one anyway. PCs have standardized hardware, and if you ignore all the legacy crap in both D3D and OpenGL, both provide almost the exact same interface. Compared to consoles, D3D and OpenGL can be considered to be identical.
Besides, a properly designed renderer should be 90% portable code, 10% platform-specific code.
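For what it’s worth, here’s a minimal sketch of that split (the names are illustrative, not any real engine’s API): the portable 90% talks to a tiny backend interface, and only the backend knows whether D3D, OpenGL or a console GPU sits underneath.

```c
#include <stddef.h>

struct Mesh;                                   /* opaque geometry handle */
struct Scene { struct Mesh **meshes; size_t mesh_count; };

/* The platform-specific ~10%: each target fills in one of these. */
typedef struct RenderBackend {
    void *ctx;                                 /* GL context, D3D device, console handle */
    void (*begin_frame)(void *ctx);
    void (*draw_mesh)(void *ctx, const struct Mesh *m);
    void (*end_frame)(void *ctx);
} RenderBackend;

/* The portable ~90%: traversal, culling, sorting -- identical everywhere. */
void render_scene(const struct Scene *s, const RenderBackend *rb)
{
    rb->begin_frame(rb->ctx);
    for (size_t i = 0; i < s->mesh_count; i++)
        rb->draw_mesh(rb->ctx, s->meshes[i]);
    rb->end_frame(rb->ctx);
}
```

A hypothetical backend_gl.c, backend_d3d.c and backend_console.c would each supply one RenderBackend, and nothing above that interface has to change.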
OGRE3d baby!
Actually, lately games are ported to Linux sooner than to the Mac. I’ve seen at least 3 commercial games ported to Linux with “Mac beta, ETA unknown”.
But you’re right on the consoles.
Let me start out by saying that I get my general clue about this sort of thing only from reading/browsing the headlines of a bunch of major pages; I am not looking deeply into Linux gaming. So I might be wrong, but I have a strong suspicion that the three games ported to Linux are pretty much all that are out there, for what it’s worth. Can you name three RELEVANT (really!) games ported to Linux that do not include any form of Quake or a direct derivative of it, or some version of UT? What remains?
So I like to think I stand uncorrected here. Linux games were always a tricky business, and the issue is not 3D drivers or Linux audio anymore; it really is just the fact that no one is willing to make the effort anymore. Ironically, Linux audio and 3D immaturity were said to be the hold-backs for the longest time, and now that they are not any longer, developers seem to be losing interest generally, because supporting extra platforms is never a smooth thing, no matter how “easy” it might have become over the past years. And a console can never replace the gameplay experience of a PC – I should know, I have a gaming PC and there are also two consoles in the house, and they just don’t cut it in comparison.
A nice shooter is Cold War. Review at http://www.linux-gamers.net/modules/smartsection/item.php?itemid=1
Download the demo from http://demofiles.linuxgamepublishing.com/coldwar/
Another game: http://www.americasarmy.com/
Depends on where you’re looking, and what your definition of “relevant” is.
In terms of big blockbuster games, there’s virtually nothing. There’s no technical reason for it that’s inherent to Linux itself – the required OS support has been there for a long time. I can think of three possible reasons.
1 – Middleware. If you’re using an engine that doesn’t support Linux, or third-party components (think physics, high-level sound APIs and so on) that don’t support Linux, then there can be no Linux port. Note that iD don’t use middleware of any kind, and the core Unreal engine doesn’t use any non-portable middleware.
2 – Copy protection. There are loads of companies providing copy protection for Windows. There are none for Linux. Publishers demand copy protection, and the likes of iD and Epic have enough clout to force the issue. Other developers do not.
Besides, how effective could copy protection be when the user can modify the entire operating system?
3 – Money. Publishers aren’t going to front the money for a Linux (or Mac) version unless they think they’re going to make a profit on it. They don’t think that. iD insist on supporting Linux. Epic insist on their game engines running everywhere.
None of that applies to smaller developers. They don’t tend to use commercial middleware, don’t tend to have such rigid copy protection requirements, and have a completely different economic model where supporting all possible systems is generally a benefit. In a system where you’re selling a game over a period of several years, and an extra couple of hundred sales would be a significant increase, it makes sense to support Linux. Especially if you can re-use the Mac code – it’s near suicide for an indie developer not to support Mac OS X.
I think the only way the Linux community could court more big commercial developers is to develop our own middleware, and set up a company to support it. Maybe even develop copy protection software too, although I doubt that could ever turn out well. Even then, most publishers only like dealing with other big businesses.
Or just continue to develop the Wine and Cedega DirectX translator.
Seriously, that is pretty much all I use Wine for these days. There is no such thing as “alternatives” to top commercial games so the solution is to make them work.
I’m surprised at the comments on the R300 drivers. They do lack some features, which slows down some games. I thought they were the same as those in Linux. The drivers in X themselves are at 6.6.3, which is 11 months old, although the drivers between 6.6.3 and the 6.7 series are buggy. I was disappointed there wasn’t a release of those drivers with X. The next generation of Linux-based GNU distributions would combine a new Linux kernel (with better wireless), a better desktop experience, Mesa 7.0.1 (stable OpenGL 2.1) and a new X. X seems to be the weak point, although hopefully, with its relatively new modular nature, improved releases of both the radeon and nouveau drivers will follow. I suggest a better distribution.
“Or just continue to develop the Wine and Cedega DirectX translator.
Seriously, that is pretty much all I use Wine for these days. There is no such thing as “alternatives” to top commercial games so the solution is to make them work.”
I have had poor experiences with modern commercial gaming, with all but those games that use OpenGL. The reality, though, is that the exciting commercial games have all moved off the PC, and personally I’d rather spend my money on my CPU than my GPU. The surprising thing is that I used to use my Xbox for my gaming, but I like open source gaming much more, with its communities and continual improvements. It’s reached a stage where I don’t even use my Xbox anymore and have games unplayed. It’s also an area full of little surprises; currently I’m occupied with OpenLieroX.
For me, I think people are better off getting a games machine and using that instead of a PC; but if people do want games on *NIX, I think Wine would be the best avenue for all concerned.
This goes beyond games, BTW: companies who don’t want to port to *NIX should at least work with Wine, point out flaws that will cause compatibility issues, and explain how to correct those flaws. Sure, it isn’t an ideal situation, but if Wine is 100% compatible with a large array of applications, it’ll be better than having no games or applications at all.
Good point!
Sometimes, however, it is pure incompetence on the part of the game developers. I remember this “reason” being used on Slashdot by the project lead of Neverwinter Nights (an audio library supposedly not available on Linux), to which a developer of said library replied that in fact it is.
This could be used as an excuse. An excuse, because there is no such thing as working copy protection as far as I know.
Actually, having talked to engineers on Sony’s copy protection team, the customer requirement is “one week”.
Even with a simultaneous launch of the Windows and Linux versions, the Linux sales would probably fall into the “later than first week” category anyway.
Additionally, most new games use some kind of server-based strategy, e.g. either being an online-only game or offering a major experience improvement when connected.
IMHO the main reason.
There is this myth that Linux users would be less willing to spend money, which is bogus, but unfortunately makes a Linux version look like a financial risk.
“There is this myth that Linux users would be less willing to spend money, which is bogus, but unfortunately makes a Linux version look like a financial risk.”
Yeah, just because we aren’t willing to shell out money for complete crap (like Winblows) when we can get BETTER for free, they conclude that we are unwilling to pay for software.
While in reality the opposite is most likely true.
While I don’t think that Linux users are inherently less likely to spend money, there is a different culture on Linux systems. Users are used to free software with absolutely no artificial restrictions. That’s not the case for most Windows or Mac users.
Of course, a game is not comparable to an office application, or an operating system. It’s more like a movie.
The bigger problem is one of user base, I suppose. Same reason there are so few Mac OS X games.
You have a point on the copy protection. It really does seem like an excuse. It’s rarely effective on Windows anyway, with games frequently being cracked and pirated before they’re officially released.
I hadn’t considered the online activation stuff. There’s no reason that wouldn’t work on Linux, especially given the time frame involved.
Even Microsoft doesn’t care about games for PCs anymore. MS removed hardware acceleration from DirectSound, and look at what games MS develops for the PC: Train Simulator and ports of old Xbox games (Halo 2, Viva Piñata).
Heck, Microsoft does as much game development for the Nintendo DS as it does for the PC (Diddy Kong Racing, Viva Piñata DS; there was even an almost-finished Halo DS before MS cancelled it).
The hot stuff comes out mostly for consoles these days: Halo 3, Halo Wars, Mass Effect,…
<irony>That damn Novell and their corporate deals… @#$%!</irony>
Thinking about proper INT 10h routines implemented in LinuxBIOS.
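For anyone wondering what that involves: VESA mode setting goes through the video BIOS via INT 10h (VBE function 4F02h), which is why LinuxBIOS needs working INT 10h routines, or an emulator for them, at all. A rough sketch; realmode_int() is a hypothetical stand-in for whatever real-mode trampoline or x86 emulator the firmware provides:

```c
#include <stdint.h>

/* Register file handed to a real-mode software interrupt. */
struct rm_regs { uint16_t ax, bx, cx, dx, es, di; };

/* Hypothetical helper: run INT <vector> with the given registers. */
extern void realmode_int(uint8_t vector, struct rm_regs *regs);

/* Set a VESA BIOS Extensions video mode. */
int vbe_set_mode(uint16_t mode)
{
    struct rm_regs r = {0};
    r.ax = 0x4F02;              /* VBE function 02h: set mode */
    r.bx = mode | 0x4000;       /* bit 14: request linear framebuffer */
    realmode_int(0x10, &r);
    return (r.ax == 0x004F) ? 0 : -1;  /* AX=004Fh means supported and successful */
}

/* e.g. vbe_set_mode(0x117) requests 1024x768 at 16 bpp. */
```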
Intel is (almost) already there. This will certainly put pressure on NVidia to open-up their drivers. Go Open source!
Comment I posted to a blog earlier:
Wait – providing incomplete specifications on a supposedly ‘open’ interface in order to try and ensure that all third-party implementations are inferior and cast reflected glory upon the first-party implementation…isn’t that what Microsoft’s been doing for decades? Why are we celebrating when ATI does it?
I suggest that anyone working on an open source driver for an ATI product trust anything ATI release exactly as far as they can throw it (if it’s tied to a very heavy rock). After all, ATI have specifically professed their intention that the drivers based on the information they release should be inferior to ATI’s own drivers. Given this, why would anyone sane trust their information?
Adamw,
Because if the information is indeed wrong, it will be found out and they will look like fools. If they say that register x of their video card does y and it does not, they are the ones who will look like fools.
In the long run, AMD will not want to keep paying to maintain two codebases that address the same hardware. Of course, the ATI guys who used to do Linux driver development may not be all that happy with this piece of news, but I am sure they too will adapt.
My hope is that over time ATI’s in-house team and the outside developers work on a single codebase that is free software, but that will take time.
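To put the “register x does y” point in concrete terms, a spec-derived driver works at roughly this level; the register names, offsets and bit meanings below are invented for illustration, since supplying the real ones is exactly what the released specs are for:

```c
#include <stdint.h>

/* Invented example registers -- the real offsets and semantics are
 * what AMD's documentation is supposed to provide. */
#define CRTC_H_TOTAL  0x0200    /* horizontal total, in pixels */
#define CRTC_ENABLE   0x0204    /* bit 0: display controller on */

static inline void mmio_write32(volatile uint8_t *mmio,
                                uint32_t reg, uint32_t val)
{
    *(volatile uint32_t *)(mmio + reg) = val;
}

/* A driver written from documentation is largely this: documented
 * values into documented registers. If the doc says CRTC_ENABLE bit 0
 * turns the display on and it doesn't, the lie is caught immediately,
 * which is the point about wrong specs making the vendor look foolish. */
void crtc_enable(volatile uint8_t *mmio, uint32_t htotal)
{
    mmio_write32(mmio, CRTC_H_TOTAL, htotal);
    mmio_write32(mmio, CRTC_ENABLE, 1u);
}
```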
I see where you’re coming from, but I think there is more to this than meets the eye.
I’ve got a feeling that this is an experiment: can AMD/ATI get the open source community to help them develop a driver that can, in time, replace their current development methods and enable them to cut considerable development costs?
To start with, they open the specs and get another company on board to help jump-start the process and drum up support, while releasing an updated closed-source driver to keep interest high.
If this little stunt works out and AMD/ATI get a great codebase, they can streamline their development and end up only needing to add optimizations or help existing developer groups optimize the code.
Just my €0.02
I am not as optimistic.
People around here sure have short memories – it seems no-one really remembers that ATI did this stuff as a matter of course up to, IIRC, the Radeon 8xxx generation (provided specs to the X driver authors). And they did it without great PR fanfare and so forth. Given this, why would they need to experiment? They already know perfectly well it can be done and it works.
Your message is also doing a considerable disservice to the existing open source projects for ATI drivers. “Jump start the process”? What needs jumpstarting? We already have a very mature driver up to the r400 series and a new driver for the r500 and r600 series (avivo). Nothing needs jumpstarting. The whole Novell angle stinks of PR.
This would stink a lot less if it had simply been handled from an engineering perspective: put the specs up on a site somewhere and dedicate a couple of engineers to helping the *existing* work on ATI drivers. As it stands it’s been done from a PR angle: it’s been trailed for months so they get the maximum possible positive response, the announcement is massively dressed up and tied in with a ‘partnership’ with Novell…eesh. Just give us the specs and stop yelling about it already.
This is what AMD / ATI does with their other hardware, to their credit. There’s a guy from AMD named Shane Huang who helps us (and probably other distributions) with support for AMD motherboard chipsets. He files bugs pointing to kernel patches, responds to emails, and generally makes himself helpful. There are no month long press charades and ticker tape parades and general crap. I just wish they’d take that approach to graphics hardware too.
ATI gave up on helping out with the driver before they were bought by AMD. Just because AMD wants to do things their way does not make it bad.
Who’s to say they’re not gonna help out/use the code and developers from the avivo project? They did state that they were setting aside some engineers for the purpose of helping out. And if they don’t want to use the code, did you ever stop to think that there may be a really good reason for that?
Anyway, what’s wrong with a clean-room implementation if they do want to create a portable codebase for the future? Who better to do it than the company that manufactures the chips?
Drumming up PR is not a bad idea when you consider the reputation ATI cards have on Linux. Anyway, they stated they are going to release the specs, most probably on a website, so I don’t see any issues there.
Furthermore, do you have inside information that led you to your caution? Or are you just bitter they chose Novell instead of Mandriva? Looks to me like you guys could do with a bit of PR yourselves, especially considering the standard response to every Mandriva announcement on this site. ;-P
Er, no. It’s not like we have engineers to spare to work on graphics card drivers anyway. We’d rather, y’know, work on the distribution.
“Who’s to say they’re not gonna help out/use the code and developers from the avivo project?”
Well, Novell isn’t involved in it (avivo), and I ain’t seen any Novell engineers showing up on the X.org mailing list to discuss the driver either. If Novell and ATI have been working on this already, they’ve been doing so behind closed doors, in the best ‘not-really-open’ spirit. If they haven’t, it’s a bunch of PR guff till proven otherwise. Doesn’t look good either way.
And no, absolutely no inside information of any kind. First I heard about this was on LWN, same as everyone else.
Even the head avivo developer wants to drop his project and either let Novell run with the ball or (and this is where open source really shines) help out.
You don’t like the deal? OK, whatever. Do you have any evidence that AMD/ATI and Novell are not going to work on what could become the 3D reference driver for ATI cards? Because I have yet to see any.
In fact, everything you have said so far has seemed to me, and judging from some other posts, to other people as well, like you haven’t actually read what is being said.
You say AMD/ATI should just release the specs? They stated that they would do exactly that.
You gripe about ATI dropping the ball? That was before they were bought by AMD.
You complain about both AMD/ATI and Novell ignoring the avivo driver? Even the head of the avivo project thinks it’s a good idea to work from scratch with the released specs.
You state that until AMD/ATI come up with the goods, you’re not gonna believe anything will come of this? You stated yourself that AMD are really good at dealing with questions concerning their chipsets.
So, what exactly is this mistrust based on? Because I really don’t see it.
“So, what exactly is this mistrust based on? Because I really don’t see it.”
ATI have repeatedly made promises and broken them. This covers only R500 and above cards, and it involves an NDA, after a previous NDA held up an R500 driver for over a year. It’s not a perfect, working solution.
I find it bizarre that you point to just one thing as effecting the change; the imminent release of both a reverse engineering tool and an open source driver seem as much catalysts as anything. That’s without the overwhelmingly negative press over their binary drivers.
It’s all looking very positive, but trust has to be earned.
Well, the thing that needs to be asked is whether the NDA is required because there are some nasty hacks which can only be described in code rather than in documentation.
Even so, I’d also be sceptical, and not just because of the situation you outlined: ATI promised to officially support a 2D driver and failed to provide specifications, and they promised to provide specifications for a previous generation of graphics hardware and failed to provide adequate hardware information.
If their current attempt to ‘provide specifications’ is anything to go by, I’m not holding out much hope for the drivers; simply dumping things onto the open source community and saying ‘best of luck’ isn’t how you get success. Netscape tried it with Netscape 5, only to find that the code was so ugly, so terribly written and documented, that they had to throw the whole lot out and start again.
I hope that all those who have bought ATI hardware in the last couple of days, hoping for better support, aren’t let down. Sure, I don’t like the idea of using binary drivers, and I’ll probably go for Intel hardware for the next laptop I purchase, but Nvidia does an OK job.
It’ll be interesting to see Intel’s next generation of hardware and how it stacks up against ATI and Nvidia; if it is even 80% of what Nvidia and ATI can do with their proprietary setups, I’d be a happy camper purchasing their gear.
“It’ll be interesting to see Intel’s next generation of hardware and how it stacks up against ATI and Nvidia; if it is even 80% of what Nvidia and ATI can do with their proprietary setups, I’d be a happy camper purchasing their gear.”
It’s about time someone brought up Intel!
Let’s not discount what they bring to the table, which is quite a bit: http://www.intellinuxgraphics.org/index.htm
We (the Novell guys) are just somewhat busy atm.
(Yes, I know, why am I not writing code right now? Because I haven’t had coffee yet.)
“‘Jump start the process’? What needs jumpstarting? We already have a very mature driver up to the r400 series and a new driver for the r500 and r600 series (avivo). Nothing needs jumpstarting.”
The drivers up to r400 won’t be affected by this at all. The r500-600 avivo driver is still very new, and if I recall correctly the main developer has completely dropped the project saying that it is better to restart with what ATI is giving out. So apparently he thought it needed to be jumpstarted, and I’m certain he knows a lot more about this than either of us do.
“People around here sure have short memories – it seems no-one really remembers that ATI did this stuff as a matter of course up to, IIRC, the Radeon 8xxx generation (provided specs to the X driver authors).”
Correct, although really the people at the top making the decisions are all different now. Not only have they been absorbed by AMD, a lot of their old executives have been quitting. The open source environment has also changed a lot in the past few years.
“As it stands it’s been done from a PR angle: it’s been trailed for months so they get the maximum possible positive response, the announcement is massively dressed up and tied in with a ‘partnership’ with Novell…eesh. Just give us the specs and stop yelling about it already.”
Clearly true. Still, it IS a big deal and people are excited to hear about it. Frankly, they wouldn’t be doing a very good job of managing their company if they didn’t try to promote it when possible. They’ve finally got something real to brag about rather than just PR fluff, and they’re going to milk it for all they can.
“There’s a guy from AMD named Shane Huang who helps us (and probably other distributions) with support for AMD motherboard chipsets. He files bugs pointing to kernel patches, responds to emails, and generally makes himself helpful. There are no month long press charades and ticker tape parades and general crap. I just wish they’d take that approach to graphics hardware too.”
As I understand it, that’s what they’ll be doing for this driver as well, but obviously with more fanfare.
I do think you have a point about some things. They aren’t fully releasing all the specs to the public (they can’t release some of it without being sued), and they have no intention of allowing their proprietary driver to be outperformed. However, I have no problem with releasing some specs only under NDA, as long as the people who are truly working on the driver are allowed access. I would hope that if people from *BSD or even something like Haiku were to ask, they would be granted access as well. Lots of hardware is handled that way. I don’t really care if the open source driver is only half as fast as ATI’s, either; I just want something that will be 100% reliable and fully up to date whenever a new X.org or kernel release comes out, and that can run a composited desktop and simple 3D apps like Google Earth with ease. The developers can always try to reverse engineer any little tricks AMD doesn’t share about the hardware in order to gain speed or features, which would leave them in exactly the same position they’re in now – except way ahead thanks to all the work AMD has shared.
You seem to be awfully upset about all the hype going on about this, and I’m not really clear why. Sure, they don’t do this when they release specs to a mb chipset, but then there are already tons of those available and serious high performance alternatives. Are you just upset about the attention AMD/Novell are getting? Or did someone there snub you or something?
“You seem to be awfully upset about all the hype going on about this, and I’m not really clear why. Sure, they don’t do this when they release specs to a mb chipset, but then there are already tons of those available and serious high performance alternatives. Are you just upset about the attention AMD/Novell are getting? Or did someone there snub you or something?”
Nothing like that.
Call it the result of two years of my life spent attempting to assist people with the *existing* ATI drivers. This is not an experience that breeds goodwill towards ATI, let’s put it that way.
It’s not mature if it performs like crap, is buggy, and is missing features. The same could be said about their proprietary driver for both Linux and Windows. So yes, the OSS community has done a good job considering the circumstances, but the drivers are far from mature. This will help actually get them to a point where they can be considered mature.
I think they are trying to move toward open source drivers because it will help increase performance for their future Fusion line of combined CPU/GPU parts, and also because, to get the most out of the stream processors, they will need to be reprogrammable for specific tasks, especially in workstation GPUs.
I think you are on the right track here. When the Fusion processors come out (putting the graphics and CPU into a single package), they (AMD) will definitely want to make sure that they are compatible with alternative OSes (especially Linux). The easiest way for that to happen is to ensure that there is a healthy developer community around their platform.
It’s still all about the $$ for them. I’m sure that a fair amount of AMD sales out there go into systems used as Linux servers (HPC systems?), and alienating that part of the market could hurt their market share more than it has already been eaten into by the Core 2.
I’m wondering whether Novell has a relationship with AMD such that if their developer resources produce any inadvertent “improvements” on the Catalyst team’s work, that code gets absorbed into Catalyst, or whether, due to the GPL2, these branches of code can’t cross-pollinate without legal ramifications forcing the Catalyst code to be opened up.
I personally can’t see how two teams working so closely won’t ultimately see some improved solutions coming from the FOSS team, and no one on the Catalyst team “borrowing” these improvements for their proprietary codebase.
You can’t get people to ethically mind their Ps and Qs as it is for the most trivial of matters, so what makes them think they will with this partnership?
According to this article (http://www.phoronix.com/scan.php?page=article&item=826&num=1), it appears that AMD/ATI will work with Novell under NDA to discuss specifications, and from there release an open library that contains a good subset of the features found on most cards, enabling basic 3D acceleration suitable for compositing window managers or light graphics work. ATI will continue to publish a monthly closed-source driver that will utilize the card to its fullest, giving better performance and all the newest bells and whistles.
If this pans out, I should think it wouldn’t be too far-fetched to see a reasonably functional general purpose ATI driver included by default in most distributions that will accomplish most desktop tasks, with the closed source driver available to gamers and CAD/graphic designers.
I’m not sure that’s what I get from the announcement, but if this is really true and the devs of the free drivers won’t be given all the documentation they need, well then AMD can go to hell as far as I’m concerned, and I will not buy their shit.
AW RIGHT go ATI / AMD!!!!
I think Novell is up to the task, honestly. I think they have the skill to write some top-notch video drivers. I’m just hoping they can do it in a very, very, very short time.
Deal. Finally we could get rid of the binary blobs.
When there are open drivers for both ATI and Intel cards, Nvidia will be the only vendor that relies on binary blobs for acceleration. They already have a semi-working driver to build on (nouveau) if they want to. Do you think they will follow?
When I first heard about this story, it was actually announced by Chris Blizzard who (AFAIK) works for Red Hat (http://www.0xdeadbeef.com/weblog/?p=302). Was there any exclusivity in the AMD-Novell agreement? According to Chris, AMD is being open to everyone.
My humble opinion is that AMD is positioning itself to increase its presence in the ever-growing GNU/Linux/OpenBSD/OpenSolaris market. While I am a consumer of all things FOSS, I still have to shell out cold cash for the hardware that runs it. AMD is doing what Intel has already done, but the difference is that the ATI graphics cards utilize high-end GPUs. ATI’s fglrx driver has always trailed the quality and performance of nVidia’s proprietary driver. This was the sole reason I switched to nVidia GPUs.
This, in my view, is what Dell is doing with Ubuntu. Sure, others are out there, but no major vendors were willing to take the plunge. Now HP is stepping up to bat, as is Lenovo, shipping Linux-based desktop systems.
Imagine having high-end 3D graphics support out of the box! I’m betting nVidia will be watching this closely. If this all pans out I’m hoping other hardware vendors will follow suit.
Exciting times….
I think way too many people are simply jumping up and down without putting things in perspective. Most likely people will want to use the closed driver over the open one anyway, as the closed driver will have more features and better performance. An open source driver will be of assistance to a few distros and custom Linux solutions that will not require the closed driver.
The fact is, neither ATI nor Nvidia is necessarily going to just open up the books. Maybe for the graphics cards of 10 years ago, when the driver was nothing but a driver. But with today’s advanced GPUs, an OS takes much more advantage of a high-performance driver than many understand.
Having an open source driver is not going to move games to Linux. A DirectX-compatible or similar API would help much more than anything, although at the moment there is absolutely no reason for a game company to spend the money porting to Linux when Windows already has DirectX.
Personally, the only thing an open source driver truly means is that a company like Novell can write one that strips out many of the fancier features to provide a stable platform. That may be very nice for servers and workstations, but it will do absolutely nothing for the desktop. If you are a desktop user who wants to take advantage of a costly ATI or Nvidia card, then the closed-source driver will be the only option; otherwise you have just wasted a lot of money on a card you cannot take advantage of.
Note that all the article clearly states is that specifications for 2D support will be released. All it says about other features is, “AMD will continue to work with the open-source community to enable 2D, 3D and video playback acceleration”. That could mean a lot of things, only the least likely of which involves releasing the specification for 3D features. Releasing information on 2D features is nothing much, really: it doesn’t endanger their business model, and it’s information that has often been reverse engineered in the past. Bear in mind that NVIDIA is an open source supporter as well: they released Cg as open source, and they endorse and contribute patches to open source nForce chipset drivers. Much as I’d like to see an open 3D specification (which is more accurately what AMD/ATI are discussing), I don’t think it’s going to happen.
Quake Wars is going to be out of beta status soon.
Latest news: the specifications will be available without any form of NDA, for the whole R300 to R600 range.
Wow, what a turnaround for AMD/ATI. From last to first place in a week!
NVidia, what is your answer to this?
source: http://www.phoronix.com/scan.php?page=news_item&px=NjA0Ng