While AMD had a bit of a setback with its Barcelona server processor, the company seems to have moved on. During a conference call today, AMD laid out its plans for the server space for the coming years, putting 6- and 12-core processors on the horizon. In the first part of the call, AMD assured listeners that the Barcelona debacle is now a thing of the past and that the chip is shipping in volume. The rest of the call centred on the new processors: the quad-core Shanghai in early 2009, the 6-core Istanbul in late 2009, and the 12-core Magny-Cours and 6-core Sao Paulo in 2010. They even made a nice diagram that sums it all up:
Ars explains the importance of the Shanghai processor design:
The pressure on Shanghai at this point is immense; all of the CPUs on AMD’s server/workstation roadmap for the next 2.5 years are Shanghai derivatives. AMD’s current financial situation and market share have left the company with no room for error; its next-generation CPU will need to shine from the get-go.
Nvidia is still going to make chipsets for AMD, even after AMD bought ATI? I thought they would stop, but maybe I'm a bit naive.
There’s no point in letting someone else take your business voluntarily.
If I were Nvidia, I'd be quite willing to keep making and selling chipsets for as long as AMD wanted them.
Intel makes graphics chips too. Come to think of it, Nvidia is the only major graphics chip producer that does not also make x86 processors…
Sounds like the start of a great rumor!
Yes.
As long as AMD wants to buy them, nVidia is going to sell them chipsets. It would be foolish to stop just because AMD has its own graphics division. In fact, it's almost a feather in nVidia's cap: they're selling chipsets to the owner of their biggest competitor.
AMD doesn’t buy Nvidia’s chipsets; system/mobo vendors buy them. So I’d say Nvidia should keep making chipsets as long as their customers want them. Letting your competitor decide what products you make generally isn’t a good idea.
Why the hell would NVIDIA stop? That would be excellent business sense. “Despite what our customers – the people who throw money at us – may want, we’re going to give AMD the finger and stop making chipsets for them. So THERE!”
Are you going to assume that NVIDIA will stop making chipsets for Intel chips because (gasp!) Intel makes their own chipsets and (double gasp!) even makes fairly competent graphics hardware?
Intel's graphics chips are lowest-common-denominator crap.
Because Intel can manufacture 45nm parts, it could be a formidable competitor to Nvidia, but up to this point it has made lousy graphics chips for the uninformed masses.
Maybe I'm confused, but I really doubt AMD buys any chipsets from Nvidia.
Motherboard makers buy them and build motherboards; AMD has little say in this (though maybe it holds patents that could be used to shut it down).
Nothing big and exciting – but considering they are trying to survive (as also mentioned in the AMD v Intel court coverage on Ars Technica), it will hopefully be enough to gather strength, i.e. money, again.
I must say I haven't read about any really new things technologically from Intel or AMD.
AMD's Fusion should be cool, but it now seems to be quite a few years away still.
Intel, with all its money, isn't bringing out anything drastically new in the next few years either.
All the really exciting things from either of the two are still a few years away.
How come everybody else in the chip industry is bringing out cool new things, but AMD and Intel aren't?
Is there no incentive for Intel to do new things in normal x86 computing?
Also, what about the chip partnership between AMD and IBM (and Toshiba?) – I expected more to come from that.
How come everybody else in the chip industry is bringing out cool new things, but AMD and Intel aren't?
Just wait till people start licensing the memristor technology from HP! I bet things will start looking pretty interesting then. Also, it should apply to quite a lot of things – not only processor technology but storage, memory, and various chipsets. It is probably a few years away, though.
Well, it blurs the lines completely between RAM and hard storage, and that really is pretty interesting.
Then again, it might be closer than we think. Depending on the election in the US, we might see a renewed effort by the US administration to encourage investment in bleeding-edge technologies such as the memristor – which could mean it comes onstream a lot earlier.
Even so, new things are coming out at record speed; something we think is two years away might, after a few months of development, be only six months away. Having lived through 16 years of technology improvement, I don't doubt the will of the marketplace and the drive of the capitalist to come up with a better widget.
Fusion is “cool”, but that’s the only benefit I can see from it. Instead of having a low-end GPU in the chipset, you’ll have a low-end GPU in the processor. Shrug.
Intel, with all its money, isn't bringing out anything drastically new in the next few years either.
I'm pretty excited about Nehalem. QuickPath and no more FSB should be nice. Of course, since that's basically how AMD has done it for years, one could easily say it isn't drastically new.
Perhaps the dynamic overclocking? I'm a little shaky on the details, but apparently if a running thread is chewing up CPU, one core can be overclocked while the other(s) are shut off to keep the whole chip within normal thermal limits.
Maybe someday Larrabee will be something.
São Paulo has a tilde!
edit: ã
I think your post explained why they didn’t bother with the tilde.