Reuters reports that Nvidia shares climbed more than 8 percent on Wednesday amid speculation the graphics chipmaker could be acquired by Intel. “Investors have been speculating that Nvidia might be acquired since July, when AMD agreed to buy Nvidia rival ATI Technologies Inc. for USD 5.4 billion. Rumors of such a transaction resurfaced on Wednesday, spreading quickly across Wall Street trading desks, according to three options market participants.”
I will wager the reason AMD purchased ATI was that ATI was the least expensive option, but one that still gave them access to the patents and intellectual property they need to build high-quality chipsets and GPUs for their upcoming architecture.
I think under AMD's control we will see Linux driver support improve, as historically AMD has been supportive of open source (e.g. 64-bit x86).
I think under Intel's control we would see Nvidia Linux driver support eroded.
Read http://www.theinquirer.net/default.aspx?article=34873 for more info.
Some reporters are really thick:
“An Intel acquisition of Nvidia wouldn’t have that appeal because Intel already sells graphics chips”
Excuse me, but their current GPUs are crap; they're in desperate need of something competitive with AMD *NOW*, not in 2-3 years' time once their market share has disappeared!
Nvidia has the technology, products and, most importantly, the engineering talent NOW to deliver what Intel needs for the 'complete package' it'll be trying to pitch to OEMs.
NVIDIA’s stuff is overkill for the average corporate desktop’s needs.
An onboard 6150 or 6200, or a 7200/7300, would run Aero; it would also likely play HD content fairly well.
This seems to me a perfect fit into the Viiv platform.
Nah. Intel has the GMA3000, which is DX10-compliant.
Viiv is… how can I put this? Dedd.
Intel doesn't need NVIDIA any more than AMD needed ATI to compete with Intel's GPU business. It would be a potential benefit to NVIDIA to have its own fabs, and the long-term unification of GPU and CPU could make it attractive for Intel with this push for massive multicoring despite lagging software developers, but there really doesn't seem to be any immediate need for Intel to acquire NVIDIA. It would be a longer-term action.
And the last thing Intel has to worry about is their marketshare disappearing in 2-3 years.
Intel has good CPUs, but AMD has proven others can make good CPUs as well. Intel makes crappy GPUs, and ATI and Nvidia have proven good ones can be made. AMD bought ATI. Now what happens if Intel doesn't buy Nvidia? Remember all of those cores! Who needs a GPU when there are going to be 128 CPU cores in the future? Nvidia will build CPU/GPUs and chipsets, that's what! Is an Nvidia 128-core CPU/GPU in the future? Yes, if they want to stay in business.
AMD is now a CPU/GPU/chipset shop
Intel is a CPU/GPU/chipset shop
Nvidia will *soon* be a CPU/GPU/chipset shop
Just my $.02
Actually, it would be better if AMD could acquire Nvidia, to better fight Intel's domination.
AMD couldn’t afford (at least risk-wise) to buy nVidia – maybe merge with it.
– Gilboa
It would never be allowed to happen with the way things are.
Intel theoretically could put out better graphics chips, but their chips are designed to sell processors.
I can only see Nvidia going down the drain if Intel buys them, sad but true.
The entire i9xx graphics chip line is designed around one principle: put as few things as possible into the GPU so that it is still just competitive enough, and push as much load as possible onto the processor so that Intel can sell faster processors.
(Heck, the latest GPUs even push some pixel shading onto the processor, just so that they are Vista- and OS X-ready…)
nVidia is too expensive to let it go down the drain.
Also, Nvidia owns (or did own) part of Real3D, which is now owned by Intel and which provided Intel with some valuable patents in the 3D chip arena, patents Intel at one time wielded against ATI (source: Wikipedia; search for i740).
So it kind of fits.
If Intel did buy Nvidia what do you think the chances are that linux drivers would disappear? Wintel may have found another way to knock the competition.
If Intel did not want Linux on the desktop, they wouldn't do this: http://intellinuxgraphics.org/index.html
If Intel did buy Nvidia what do you think the chances are that linux drivers would disappear? Wintel may have found another way to knock the competition.
I don’t think that would happen. Intel’s present graphics chipsets are more open than Nvidia’s. This may actually be a good thing as we may see Nvidia’s driver open up more.
Intel is NOT in competition with Linux! MS is…
One thing seems inevitable to me:
Onboard graphics are the future.
Let me explain why.
I can hear everybody scream “Nooo! Onboard graphics suck!”. Yes, they suck _now_ but they’ll get better.
Computers shrink in size all the time, and I think graphics cards are somehow getting in the way.
Plus they remind me of slot CPUs.
And after all, a graphics card is just some sort of vector processor (which is why some guys use it for scientific computing).
It would make sense to me to even integrate it in the cpu.
When we eventually go 128 cores or whatever, it won’t hurt anybody to have, say, 8 cores doing the graphic stuff.
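A minimal CPU-side sketch (hypothetical, illustrative only) of the kind of element-wise "vector" arithmetic that lets a graphics card double as a scientific coprocessor: SAXPY computes y = a*x + y across whole arrays, with no per-element branching.

```python
def saxpy(a, x, y):
    # Apply the same multiply-add to every element pair; this is exactly
    # the branch-free, data-parallel pattern GPUs excel at.
    return [a * xi + yi for xi, yi in zip(x, y)]

result = saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0])
# result == [12.0, 24.0, 36.0]
```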
It just seems really old-fashioned to me to plug a thousand cards into a mobo to give it the most basic functionality.
It’s happening all over the place:
raid, ethernet, sound, all become onboard.
And where exactly is Nvidia left if they’re not bought?
Besides making excellent mobo chipsets, of course.
It’s perfectly possible to make onboard graphics chips that don’t suck, yes (Commodore did it for the Amiga, and they rocked in their day).
It's perfectly sensible to have a CPU with an ALU and an FPU on one board or even one chip. However, that doesn't mean the idea of onboard devices doesn't suck. It does.
The only way to “upgrade” them is to put another one in a slot or change the whole motherboard – and if (God forbid) the idea catches on, motherboard manufacturers will ditch expansion slots and we’ll only be left with one of those options. I fell into the same trap with a network port – the onboard ethernet started flaking out, and the only way to replace it was to add a pci ethernet card.
This has left me worrying about the state of my motherboard. Next time I buy a motherboard, I’ll be trying a lot harder to find one without onboard everything.
Fortunately, I predict that things like this are cyclical: Every so often, somebody decides that a motherboard with everything on it that can’t be expanded is a great idea (Mac 128k), and then after a while people come to their senses (MacII).
I fell into the same trap with a network port – the onboard ethernet started flaking out, and the only way to replace it was to add a pci ethernet card.
As opposed to what you would do if an ethernet PCI card failed… It’s the same thing, except you didn’t have a card taking up space to start with! Onboard components are a godsend. No way am I ever going back to separate cards for every little thing.
If you don’t want onboard ethernet, then you should go all the way and get a motherboard without onboard USB and onboard firewire. After all, those are just ports like ethernet, so they should go on separate cards right?
To all those pimping "no expansion slots = GOD":
When you want 10Gig Ethernet, a better sound card, a PhysX card, FireWire 9000 and 802.11NZSYS, you don't have any slots for them…
Onboard has its place… want a media center PC? Go onboard. Want a mega thumper gaming PC? Better have some slots.
(Future me….. “Thank god, I had that 5th and 6th PCI-Express 16X slot for my Hex-SLI setup… or I’d have to run Oblivion 7 at 1600×1200”)
Assuming you can find a board with enough PCI slots for everything, yes.
You’re right on some things, but wrong on others.
First, "onboard" devices *are* the future. In fact, the future is sealed, integrated devices like the Apple stuff. These things are much more easily made to "just work", because the producer can select hardware that works together (read: doesn't violate electrical bus standards) and for which working drivers exist. Also, by not allowing the customer to change anything, producers are able to build advanced, much more powerful computing architectures.
> And after all, a graphics card is just some sort of vector processor
This is only half correct. A big part of the GPU (the geometry engine) is a vector processor that is capable of scientific computing. Another big part (the rasterizing engine) may be called vector processor too, but “vector” has a different meaning here (this word is so overloaded in computing that it just shouldn’t be used anymore). It means that the rasterizer can emit several pixels at once, i.e. its vector elements are pixels, not floating point values, and its operations are depth / stencil / transparency functions.
And yet other parts of the GPU have nothing to do with vectors (VGA timers, specialized caches, …)
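A hypothetical sketch (function names invented for illustration) of the two senses of "vector" described above, one per engine:

```python
# Geometry engine: a "vector" is a short tuple of floats, and the
# operations are linear algebra (here, a uniform scale of 3D vertices).
def transform_vertices(vertices, scale):
    return [(x * scale, y * scale, z * scale) for (x, y, z) in vertices]

# Rasterizer: the "vector" is a batch of pixels, and the operation (a
# depth test here) is applied element-wise, keeping whichever depth is
# nearer, with no per-pixel control flow exposed to the caller.
def depth_test(fragment_depths, zbuffer):
    return [min(frag, stored) for frag, stored in zip(fragment_depths, zbuffer)]
```

Both loops look similar on a CPU, but the element types and operations differ completely, which is why the single word "vector" misleads.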
> When we eventually go 128 cores or whatever, it won’t
> hurt anybody to have, say, 8 cores doing the graphic stuff.
This is complete nonsense. When it comes to 3d graphics, a GPU with the same chip area as an 8-core CPU will blow the CPU to pieces. CPUs are designed to be highly efficient at handling control flow, which is mostly irrelevant in 3d graphics. It is actually a rule of thumb that any control flow be avoided in graphics to speed things up (e.g. OpenGL manuals always say “change the render state as seldom as possible”. This can be translated to “avoid control flow”).
GPUs are optimized for data flow, not control flow, and thus outpace even the 8-core CPU. If you want to replace the GPU with a general-purpose processor, it has to be a dataflow-oriented processor. Such things exist, but aren't well known (since they contradict the classic sequential processing model). There might still be a gap between dataflow processors and real GPUs (read: optimized specially for graphics), but I can't tell how bad this is. Maybe the real solution would be a generic dataflow processor plus a small graphics-optimized part.
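A toy illustration (all names hypothetical) of the "change the render state as seldom as possible" rule mentioned above: submitting draw calls sorted by render state halves the number of state switches in this example, which is the sense in which avoiding control flow speeds graphics up.

```python
def count_state_changes(draw_calls):
    # Count how many times the renderer would have to switch state
    # while walking the submitted (state, mesh) pairs in order.
    changes, current = 0, None
    for state, _mesh in draw_calls:
        if state != current:
            changes += 1
            current = state
    return changes

calls = [("metal", "m1"), ("glass", "g1"), ("metal", "m2"), ("glass", "g2")]
unsorted_switches = count_state_changes(calls)          # 4 switches
sorted_switches = count_state_changes(sorted(calls))    # 2 switches
```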
“This is complete nonsense. When it comes to 3d graphics, a GPU with the same chip area as an 8-core CPU will blow the CPU to pieces.”
Nah, wait. Using general purpose CPUs to render stuff was not really my point, though I should have expressed this more clearly:
I believe that there are not just going to be many cores of the same kind; the whole thing could be a bit more heterogeneous.
Some of the cores could be ordinary CPUs, some could be more like graphic processors, maybe even some of them could be like high performance DSPs or whatever.
The point I’m trying to make is:
If GPUs can be used for scientific computing, it should be possible to integrate them into the whole multicore setup as one unit that is used for graphics and vector processing
Yes, you may hate me now for saying vector again
To me, the parallel is obvious:
First CPUs had no floating point stuff integrated but a coprocessor instead. Then the FPU was made part of the CPU.
So graphic processors could be used in multicore setups as a couple of cores that the other, more general processors supply with data.
I just hope backwards compatibility won’t prevent it…
“Maybe the real solution would be a generic dataflow processor plus a small graphics-optimized part.”
Yes, that’s what I think.
> Nah, wait. Using general purpose CPUs to render stuff
> was not really my point, though I should have expressed
> this more clearly:
Ah, OK. Sorry for the harsh reply, then. Sadly, many people *do* think that throwing general-purpose (control-flow driven) CPUs at everything *is* a solution.
> Yes, you may hate me now for saying vector again
I dunno, maybe the rasterizing can be combined into general-purpose dataflow processors too (if they support the necessary operations). But I think a really fast rasterizer also has something to do with specialized memory access paths.
But I agree that a great part of the GPUs can probably be made more generalized to achieve a higher total speed. This processor might be somewhere in the right direction (I’m linking this one since it’s the only dataflow oriented processor I heard of, except of course raw gate arrays):
http://www.pactxpp.com/main/index.php?id=technology_XPP-III_summary
However, it appears that both companies are avoiding the questions when asked. They did not even deny it. That only makes it look like they are indeed onto something; whether it will materialise is another thing.
It would be a rather strange world if the acquisition were to occur. I wonder how long interoperability would last with ATI cards on Intel boxes, and likewise Nvidia on AMD. There could be a whole new kind of vendor lock-in.
All this means to us poor bastards is that if this merger takes place, slowly but surely Nvidia's Linux drivers will go down the toilet. We can't trust Intel to provide quality Linux drivers for GPUs, so what remains is ATI, which, alas, hasn't managed to make me want to buy an ATI card yet, since I want a good, working Linux driver.
Intel releases open source Linux Drivers, do they not?
So aren’t they ahead of ATI/NVidia in that respect?
Ancient Wintel FUD again. Flush it down the toilet where it belongs.
You actually can’t ‘trust’ any company to produce and maintain drivers – hence the need for OPEN specs. If we can’t have them, though, we might as well support the companies that throw us the biggest bone.
Do the Intel GPU chipsets have open specs?
No, I don't believe so… Just decent drivers, AFAIK.
I have an ATI card, so I wouldn't know… I've been a bit lax reading the news of late.
haha i can relate
I use some Intel hardware and do not have any complaints about support in Linux. Intel 2200BG wireless chipset works out-of-the box in every distro I tried, 855GME graphics have decent open-source drivers and can run AIGLX+Compiz nicely.
AMD should have bought NVidia, not ATI. Nvidia is the bigger vendor of AMD chipsets, and nForce is the better chipset on the AMD market. Nothing more to say. Even if you don't consider this, keep in mind that AMD has always targeted gamers a bit more than Intel, and you could say the same of NVidia. ATI is a lot more specialized in video manipulation (just think about their All-In-Wonder series). OK, I can already hear people raving about this… but hey, I've always seen it like this: Nvidia for gaming, ATI for video. Same with CPUs: Intel for desktop and brute-force integer processing, AMD for gaming and major floating-point operations.
Everything should be the inverse: AMD should merge with NVidia, and ATI should be bought by Intel.
NVIDIA was over-valued, thus AMD going with ATI.
Intel is an almost-convicted monopolist in the AMD court case, and I don't think it would be good to ruin a good GPU maker (NVidia) with this acquisition, which would be bad for overall market health. Only if Intel were to release all NVidia drivers as public domain would I change my mind.
… Won’t happen.
Forget about it.
nVidia already tried to open source their initial TNT2 drivers, but a couple of cease and desist letters from Microsoft and… wait for it… Intel [1,2] soon forced them to close the drivers.
Gilboa
[1] http://lists.suse.com/archive/suse-linux-e/2006-May/3373.html
[2] http://marc.theaimsgroup.com/?l=dri-devel&m=114981283225530&…
AMD couldn’t afford nVidia!
penstarsys.com at least believes AMD has made some very smart decisions.
The only way to “upgrade” them is to put another one in a slot or change the whole motherboard
Well, as I understand it, AMD will have a socket (or sockets) so you can upgrade it the same way you would upgrade a CPU.
Well, that’s better news, but it isn’t good news – look how often CPU-to-motherboard interfaces have changed; expansion card interfaces change a lot less frequently.
Link?
Can I ask why Linux users, with their 1% market share, think they are the center of the universe and every proprietary company should surrender to them?
Because open-sourcing drivers won’t just benefit Linux. Proprietary technology was bad for computers, it’s bad for peripherals, it’s bad for OSes and it’s bad for users and consumers.
Says who?
Says I and anyone who’s ever been burned by proprietary technology.
How many of you really need the source code of the drivers, as opposed to just wanting them to work out of the box?
Assuming you’re replying to me: *I* don’t need the source code, but Linux developers *do* need the source code in order to make sure the drivers work with the OS. Failing that they need the specs, but I think history shows that companies that don’t release the source code aren’t any more willing to release the specs, and companies that DO, are.
Why do Linux developers need the source? Apple and Microsoft don't need it and their market share is larger, so how does Linux stand here?
Apple's and Microsoft's businesses are built around proprietary technology. Linux is built around open technology. The market-share argument is irrelevant. Linux developers need the source so that when the device driver API changes, they can change the drivers to match. The alternative is shipping several different driver interfaces at the same time (IIRC, Windows now has three separate, incompatible network stacks for backwards compatibility). Linux developers don't like that approach.
Why not trust the device manufacturer to update the drivers when the API changes? Because they don’t like doing it, and quite frequently drop support and drivers for old devices, even for use with OSes like MacOS and Linux.
Commodore and Atari would have said the same thing about using compatible devices, OSes, and architectures. They learned a hard lesson. It’s about time Apple and Microsoft learnt it too.
So could the constant change of the Linux API, and its incompatibility with older versions of the same old API, be the cause? Wouldn't it be better not to deprecate compatibility as soon as six months?
Aren't the Linux developers responsible for making this a compatible environment and making it easier for manufacturers, just like Apple and MS do? Could the cause be that manufacturers don't like changing their drivers every six months to keep them compatible with a new Linux version and with every Linux distro that has changed the API too? And giving away the driver code to please a non-profitable market doesn't make sense to them?
So could the constant change of the Linux API, and its incompatibility with older versions of the same old API, be the cause? Wouldn't it be better not to deprecate compatibility as soon as six months?
MS developers do this, Linux developers don’t. I’ve already explained why.
Aren't the Linux developers responsible for making this a compatible environment and making it easier for manufacturers, just like Apple and MS do? Could the cause be that manufacturers don't like changing their drivers every six months to keep them compatible with a new Linux version and with every Linux distro that has changed the API too? And giving away the driver code to please a non-profitable market doesn't make sense to them?
The Linux developers prefer open source licences; the MS developers prefer closed ones. Frankly, I don't think Linux developers or users care whether manufacturers think opening code is "difficult" or whether Linux has a small market share. Linux developers and users will just go on sourcing components from companies who provide open source drivers. And Linux distributors don't change the API; all they do is provide patches. The only reason an employee of a Linux distributor would change a device API is because he is also assigned to the one-and-only Linux kernel team.
So maybe the lack of stability of the Linux API is a problem for manufacturers, and Linux developers are not interested in providing that environment; whose fault is it then?
As I’ve already said, it would not be a problem if the manufacturers would release the drivers as open source, as the Linux kernel developers would take care of making them compatible. If the manufacturers do or don’t want to become the maintainers of their open source drivers, that’s up to them.
But what would they gain by giving away their IP, IP that by the way cost them tons of money to develop, to a non-profitable market? Because in the end, they are there for the money. Can Linux developers guarantee that the market will be profitable enough to make open-sourcing the drivers worthwhile?
As I’ve already said, open drivers don’t only benefit Linux.
Atari and Commodore and IBM and DEC made bucketloads of money with their proprietary architectures in their day, too, and all but one of them have bitten the dust, many of their proprietary architectures buried with them, to the annoyance, inconvenience, and economic detriment of their clients. In fact, in 1987, by which time the IBM-*compatible*, manufacturer-independent PC was clearly dominating the business world, DEC was still dithering about whether a company could succeed in computing without controlling its own architecture. Microsoft and Dell don't control their own (hardware) architecture, and yet they are now even more successful than they were in '87. Or at least MS is; I'm not sure whether Dell was around then. Nevertheless, Dell is still hugely successful.
As I’ve already said, open drivers don’t only benefit Linux.
But they are the only ones asking for it; it looks like there are many more who don't care.
Who can guarantee that open-sourcing drivers would be better for them?
And who can guarantee that nobody would use that driver code for something they don't want, like the results of lots of years and money of investment?
And who can guarantee that nobody would use that driver code for something they don't want,
As far as I’m concerned, if I’ve bought someone’s hardware or software, it’s nobody’s business but mine what I do with it, as long as what I do with it isn’t illegal. The rest of your sentence doesn’t make any sense.
So could the constant change of the Linux API, and its incompatibility with older versions of the same old API, be the cause? Wouldn't it be better not to deprecate compatibility as soon as six months?
Aren't the Linux developers responsible for making this a compatible environment and making it easier for manufacturers, just like Apple and MS do? Could the cause be that manufacturers don't like changing their drivers every six months to keep them compatible with a new Linux version and with every Linux distro that has changed the API too? And giving away the driver code to please a non-profitable market doesn't make sense to them?
Hardware vendors do not have to release drivers for a platform if they don't want to, but they should release specs for their products. Linux developers will gladly code their own drivers, as would the BSDs and other operating system developers.
Commodore and Atari would have said the same thing about using compatible devices, OSes, and architectures. They learned a hard lesson. It’s about time Apple and Microsoft learnt it too.
Atari and Commodore died because they were obsolete. By your theory FreeDOS should be the dominant OS, and look, it's almost dead despite being open source.
Commodore and Atari would have said the same thing about using compatible devices, OSes, and architectures. They learned a hard lesson. It’s about time Apple and Microsoft learnt it too.
Atari and Commodore died because they were obsolete,
having become obsolete because they couldn’t compete with non-proprietary technologies…
in your theory FreeDos should be the dominating OS
Erm, no. In my theory, (a) FreeDOS is even more obscure than Linux, (b) it doesn't provide anywhere near the capabilities of that OS (or Windows), and (c) Windows is the dominant OS because it is the successor of MS-DOS as a programming platform, as Microsoft's flagship OS, and in having been installed on all PCs (an open hardware architecture) from 1990 onwards. That allowed it to wipe out all the other proprietary OSes for the PC (which were farther along in development than Linux when they were killed) except the PC Unixes, and Linux killed those.
having become obsolete because they couldn’t compete with non-proprietary technologies…
What other alternatives?
But if FreeDOS was open source, why wasn't anybody interested in giving it new life? So open-sourcing can't be taken as a sure success, right?
Edited 2006-10-05 16:37
The PC became the non-proprietary technology with which Atari and Commodore et al. couldn’t compete. And no, open source doesn’t guarantee success. But the idea that proprietary technology does, is nothing but proprietary technology companies’ favourite canard.
You know, I couldn't care less that Nvidia has closed-source drivers. I can understand that they have patent issues with 3D algorithms, etc. What I don't understand is why they don't make the hardware specs more open, so that one is able to talk to the card/device correctly.
The marketing droids will tell you that it's a quality control issue: they don't want everyone releasing drivers for a device, since that could lessen the perceived quality. Bull honky. (a) Not everyone is even able to write device drivers, and (b) even fewer want to. So you have the Linux kernel getting better, more stable drivers, and the same with FreeBSD.
I don’t need my open-source Nvidia 3D driver to play UT2009 at insane speeds, but I would like my xscreensaver flying toasters to not stutter and jump all the time when using a 2D only open-source driver.
The PC became the non-proprietary technology with which Atari and Commodore et al. couldn’t compete.
But that's a matter of standards and not open-sourcing, right?
As I’ve already said, companies who don’t want to open source their drivers don’t seem any more amenable to conforming to standards. In fact, arguably there are companies who release their driver code, but not their specs.
As for standards vs. open sourcing, ask anyone who’s been burned by a “standard” but obsolete Unix (e.g. someone who relied on an application that runs only on DG/UX, or someone who worked on IRIX for SGI) about the value of “standards”. Unix was a “standard” long before POSIX or Linux came around, but competing and incompatible closed source implementations almost killed it. In software, the only acceptable “standard” is an open source one.
But you havent answered my questions, can you give a direct answer to each one of them?
What questions haven’t I answered?
Any of them, could you please answer them?
I mean quote the question and put the direct answer. thx.
No, I’m not going to go through it all over again and quote all the questions and give all the answers. My answers are right underneath your questions; if you think there are specific questions I haven’t answered, you’re welcome to point me to them and ask me to answer them again.
Of course:
Is it fair that manufacturers release their code, which cost them tons of money to develop, to a non-profitable market?
Can Linux developers grant that they will be profillable anought to be the open sourse of drivers worthy?
Who can guarantee that open-sourcing drivers would be better for them?
And who can guarantee that nobody would use that driver code for something they don't want, like the results of lots of years and money of investment?
“””Is it fair that manufacturers release their code, which cost them tons of money to develop, to a non-profitable market?”””
Unless they have taken advantage of copylefted code and distributed it, it would not be fair to *force* them to release code.
However, it may be in their best interest to do so. Making such code ready for release as OSS involves a certain investment of work, but can also yield benefits in the form of improvements to the code which may flow back to them, some relief from the responsibilities of driver maintenance, and even in what you call an “unprofitable market”, it may be a way to sell some more cards at relatively little cost. It’s also good PR. Each company must evaluate these costs and benefits as they apply to their specific situation.
“””Can Linux developers grant that they will be profillable anought to be the open sourse of drivers worthy?”””
Sorry, but this question is unintelligible.
“””Who can guarantee that open-sourcing drivers would be better for them?
“””
No one. They need to systematically evaluate the pros and cons for themselves. The community is usually pretty open to answering specific questions they might have.
“””And who can guarantee that nobody would use that driver code for something they don't want, like the results of lots of years and money of investment?”””
That depends upon the license they choose. If they went with a permissive license like BSD, competitors could take their work, modify it and use it against them without giving back. Copyleft licenses like the GPL, level that playing field a bit. Competitors could use the code. But the company releasing the original is guaranteed to get the enhancements back if the competitor distributes binaries based on the released code.
There are, of course, other considerations. Perhaps the code is mixed with 3rd party licensed code which they have no authority to release. But that would be part of the systematic evaluation.
One thing I will say is that if AMD opens their code first, thereby putting pressure upon NVidia to do likewise, it is AMD which will get my future business, regardless of how NVidia reacts.
Is it fair that manufacturers release their code, which cost them tons of money to develop, to a non-profitable market?
That depends on your definition of "fair". It may not be "fair" that the open market robbed IBM of the profits it would have gained from being the sole supplier of PCs. Nevertheless, I have contended elsewhere that they did better out of it than they would have if they HAD been able to keep it closed. Also, Linux is not "a non-profitable market," and as I keep saying, Linux would not be the only beneficiary; even Microsoft users would benefit, because the opening of technology would drive prices down.
Can Linux developers grant that they will be profillable anought to be the open sourse of drivers worthy?
Nobody can “guarantee” that anyone will be profitable, whether they use proprietary technologies or not. Sony is profitable despite the fact that it keeps releasing proprietary technologies that eventually fail in the market, not because of it. Also, if Linux developers are “profitable”, the device manufacturers won’t see any of that profit. The only profit they will see will be from people who buy their products because they provide open source drivers. As I have noted several times already, this will include people who do not necessarily care about open source itself, because opening technologies to the free market subjects them to economies of scale and lowers prices, allowing those on lower incomes to buy them.
Who can guarantee that open-sourcing drivers would be better for them?
Open sourcing drivers would be better for them for the reasons I’ve explained.
And who can guarantee that nobody would use that driver code for something they don't want, like the results of lots of years and money of investment?
I’m sorry, but that still doesn’t make sense to me. However, I’ll assume that you mean “If companies can use that technology instead of developing their own, it saves them years and years and lots of money in investment in new technologies.” I can only see that as a good thing; to draw an analogy, I can’t imagine how many people would not be able to afford to learn to read and write if one had to pay a licence fee to someone as part of the price of a sheet of paper. Most of the Unix and Linux distributors and embedded device manufacturers would also say that being able to use Linux off-the-shelf on their devices is a Godsend.
The only acceptable “standard” is an open source one.
If that's true, why is the Linux market share so small?
Because people are apparently happy with Microsoft being the dominant vendor. That’s not a viable long-term strategy, however.
Because people are apparently happy with Microsoft being the dominant vendor
If the people are happy with it, as you say, could it be that Linux has failed to make people happy the way Windows does?
No. I don’t know anyone who uses Linux long term and isn’t happy with it, but I DO know people who are forced to use Windows and think it sucks. Some of them don’t even use other systems e.g. at home.
Linux has not conquered the desktop for four reasons:
(a) There aren't the same number of apps available as for Windows (which is mostly due to (c)).
(b) Not everyone releases drivers for Linux, so some people can’t use it with their hardware, and/or won’t buy Linux compatible hardware.
(c) Windows is preinstalled on the majority of people's computers. It's familiar. There are certain things about Linux that make it different from Windows but that are not going to change (e.g. no one is going to implement drive letters in Linux). That doesn't mean that Windows is better, or more intuitive; all it means is that people have gotten used to the way it works. On the contrary, if you walked into a parallel universe where Linux is installed on the majority of computers, you would see people moaning about the fact that Windows uses drive letters, some for technical reasons, some because it's different from what they're used to.
To people who buy a computer without any thought as to what’s on it, Windows is essentially as “free” (as in money) as Linux. The biggest incentive to switch to Linux is when Windows doesn’t work or Microsoft does something you don’t like.
(d) Linux isn’t advertised anything like as much as Windows is. The best thing anyone could do for Linux would probably be to buy out or employ the people responsible for Microsoft’s marketing.
Nevertheless, unless ReactOS makes a BIG impact (and frankly, I don’t see that happening), when Windows is dead and buried it’s likely that the only game in town will be Linux or some system that works like it, not like Windows. That’s thanks to Microsoft keeping Windows closed. I don’t claim that Windows will be dead and buried next week, only that because of its ties to Microsoft and Microsoft alone, it WILL happen someday.
Could it be that users and manufacturers see Linux as too complicated and don't see it becoming easy soon? Could Linux be failing there?
That’s one reason, but it’s not the only reason and it most certainly isn’t as bad as most people who say it make out. If Linux and Windows were both starting from square one today, Windows wouldn’t get a look in.
That was just an old made up description for the type of computers Windows ran on. It’s obviously outdated now with AMD and all. There is no big alliance between Intel and Microsoft. Intel supports Linux also, and there are Intel processors in Macintosh computers. This would not mean the end of Linux support for NVIDIA, in fact it may even improve if Intel made the drivers open source.
This would put an end to people buying AMD for Linux. This is a very smart move by Intel. And it's really all AMD's fault for buying ATI.
If Intel does buy Nvidia, I wonder if they’ll drop the FreeBSD driver. That’s one of the reasons I stick with Nvidia.
…just made a lot of money creating this baseless rumour.