“Why did Apple just release new MacBook Airs, MacBook Pros, and a Retina MacBook Pro, but no new iMacs or Mac Pros? And why are the iMacs probably being updated this year while the Mac Pro update won’t happen for 12-18 months? As usual, I have some guesses.” Good points.
Thunderbolt may only be two lanes per direction per port, at 10 Gbps each, but DisplayPort 1.2 can run four unidirectional lanes per port, at 5.4 Gbps per lane.
So two DisplayPort 1.2 ports together can just about drive a Retina-class external display, without using Thunderbolt at all.
And, what does the MBPR have? Two MiniDP/Thunderbolt ports. Wouldn’t be surprised if they support DisplayPort 1.2.
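For the record, here's the back-of-the-envelope arithmetic behind that claim (a sketch only; the 5120×2880 "Retina" external panel is my own hypothetical, and I'm assuming 60 Hz, 24-bit colour, ignoring blanking, and counting DisplayPort payload after 8b/10b coding):

def gbps(width, height, hz=60, bpp=24):
    # raw pixel data rate in Gbit/s
    return width * height * hz * bpp / 1e9
needed      = gbps(5120, 2880)   # ~21.2 Gbps for the hypothetical Retina display
dp12_port   = 4 * 5.4 * 0.8      # 4 lanes x 5.4 Gbps, minus 8b/10b -> ~17.3 Gbps
thunderbolt = 2 * 10.0           # one Thunderbolt port: 2 channels x 10 Gbps
print(f"needed:               ~{needed:.1f} Gbps")
print(f"one DP 1.2 port:      ~{dp12_port:.1f} Gbps (not enough)")
print(f"two DP 1.2 ports:     ~{2 * dp12_port:.1f} Gbps (just enough)")
print(f"one Thunderbolt port:  {thunderbolt:.0f} Gbps raw (marginal at best)")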
The other thing is that there are display compression protocols. eDP 1.3 supports panel self-refresh, the beginning of such a protocol, and IBM was doing research on a standard for only sending updates over the video link (to allow cheaper video link hardware to drive their T221), which ended up being used as a low-power video link. I couldn't recall the name of it (edit: DPVL, which actually inspired DisplayPort: http://en.wikipedia.org/wiki/Digital_Packet_Video_Link ).
It's probably not so much that Intel is having problems with Thunderbolt on Xeon; it's more that Thunderbolt is a non-option without on-CPU graphics, or at least excessively complex, considering the other options already common on systems in the Xeon market versus the consumer market, e.g. Fibre Channel, 10 Gbit networking, etc.
Apple is probably waiting for Xeons with on-CPU graphics. That would eliminate the difficulty of routing both the PCIe lanes from the CPU or mainboard and the graphics output from an add-in board to the same port.
With on-CPU graphics it becomes much easier to add a Thunderbolt port, and add-in graphics boards can still output via the integrated chip.
Thunderbolt is basically a transceiver which allows PCI Express lanes to operate over longer distances.
I don’t see why you think it would depend on the onboard graphics.
Thunderbolt isn't just PCIe; the controller multiplexes PCIe and a DisplayPort video signal, then sends them over the cable. It is this video signal that drives displays, not the PCIe data.
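As a rough illustration of how the two kinds of traffic share one link (illustrative arithmetic only: 60 Hz, 24-bit colour, blanking and line-code overhead ignored, and a typical 2560×1440 external display assumed):

def pixel_rate_gbps(width, height, hz=60, bpp=24):
    # raw pixel data rate of the tunnelled DisplayPort stream, in Gbit/s
    return width * height * hz * bpp / 1e9
dp_stream = pixel_rate_gbps(2560, 1440)  # ~5.3 Gbps of video traffic
tb_budget = 2 * 10.0                     # one Thunderbolt port: 2 channels x 10 Gbps
print(f"DP stream for 2560x1440@60: ~{dp_stream:.1f} Gbps")
print(f"headroom left for PCIe:     ~{tb_budget - dp_stream:.1f} Gbps")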
If anything, they would implement Thunderbolt as a connector coming off the graphics board, or have the PCH generate the DisplayPort data from a PCIe board. No one in their right mind would buy a Mac Pro class machine to run graphics off a crappy Intel IGP.
Well, using an add-in board for Thunderbolt would require either sacrificing some PCIe lanes for Thunderbolt, or using a slot design that is out of spec. Considering Apple's ongoing commitment to open specs for their desktops (standard memory, ports, slots, etc.), the latter isn't likely, and the performance hit makes the former unlikely as well.
Alternatively, an internal ribbon cable could carry DisplayPort data to a pinout on the mainboard. However, that requires some messy, messy routing (Thunderbolt is already a complex beast to route on a mainboard on its own), and it's aesthetically ugly. Apple likes their Mac Pro internals to be aesthetically pleasing, too.
No, the likely scenario is something akin to nVidia's Optimus, an enhanced version of Apple's own switchable-graphics tech (which is already part of OS X), or maybe something along the lines of Bumblebee (a Linux implementation of the Optimus approach). Basically, the CPU's integrated graphics is used for output, as well as for rendering OS graphics, while the graphics card does the heavy lifting for more demanding tasks; instead of sending its rendered frames to the display, the card hands them to the integrated graphics, which then actually displays them.
Again, OS X already has this capability. It also has other benefits: when the user isn't doing anything that requires the graphics card, it can be put in a low-power state and only draw power when actually used.
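A deliberately simplified sketch of that muxless-switching idea (conceptual only; this is not Apple's or NVIDIA's actual driver code, just the shape of the scheme described above):

class IntegratedGPU:
    def present(self, frame):
        print(f"IGP scans out: {frame}")
class DiscreteGPU:
    def __init__(self):
        self.powered = False
    def render(self, scene):
        self.powered = True              # woken up only when needed
        return f"{scene} (rendered on dGPU)"
    def sleep(self):
        self.powered = False             # low-power state while idle
def compose_frame(scene, demanding, igp, dgpu):
    if demanding:
        frame = dgpu.render(scene)       # heavy lifting on the discrete card
    else:
        frame = f"{scene} (rendered on IGP)"
        dgpu.sleep()                     # discrete card draws no power
    igp.present(frame)                   # output always goes through the IGP
igp, dgpu = IntegratedGPU(), DiscreteGPU()
compose_frame("desktop UI", demanding=False, igp=igp, dgpu=dgpu)
compose_frame("3D game", demanding=True, igp=igp, dgpu=dgpu)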
Every time the effects of limiting GFX bus performance are tested over the years (within limits – like, “just” halving it), the differences turn out to be marginal at most…
It seems the push for higher speeds is more a “just in case” kind of thing. And/or a lingering legacy of AGP marketing claims (about outright usage of main memory as the primary place to store textures), which didn’t really materialise.
This is true. For graphics, cutting the bus down from 16 lanes to 8 doesn't have much of an effect (are devices capable of operating on a number of PCIe lanes that isn't a power of two? Say, 14 lanes?), but I'm not sure the same can be said about GPU computing. At least, I haven't seen any benchmarks testing the effect of reduced bandwidth on GPU compute loads.
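For a feel of why lane count could matter more for compute than for graphics, here's a toy model (illustrative only; I'm assuming PCIe 2.0 at roughly 0.5 GB/s per lane per direction, and a hypothetical job that copies some data over the bus and then crunches on the GPU):

def job_time(data_gb, compute_s, lanes, gb_s_per_lane=0.5):
    transfer_s = data_gb / (lanes * gb_s_per_lane)  # host <-> device copies
    return transfer_s + compute_s                   # plus time spent computing
for lanes in (16, 8):
    total = job_time(data_gb=4.0, compute_s=2.0, lanes=lanes)
    print(f"x{lanes}: {total:.2f} s total")
# A game streams comparatively little data per frame, so halving the lanes
# barely shows; a transfer-heavy compute job sees its PCIe share double.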
I doubt it would be much different; after all, GPGPU computations are similar in nature to graphics processing... that's why we nowadays try to do them on GPUs in the first place.
Well, if you're really curious, that's nothing a small slice of duct tape over the slot contacts can't answer ;p (but seriously, OTOH it could just as well be a driver thing at most)
Nice read, but he is wrong on the Retina part.
Those displays have been around for more than 10 years now, and the big question is not availability but price (and Apple wants it cheap).
Displays with 2880×1800 resolution in a 15-inch form factor? There's never been a laptop display like that before.
1920×1200 (and more recently 1920×1080) has been the highest resolution you could get on most laptops for 10 years or so, and it has actually become rarer over the past few years.
To get more than 1920×1200 you'd usually have to go to a 27-inch display (and most 27-inch displays are actually 1920×1200/1080).
The panel doesn't care whether it's put into a stand-alone device or a laptop.
But for the panel itself, take a look at the IBM T220 from 2001:
http://en.wikipedia.org/wiki/IBM_T220/T221
How exactly do you figure it became rarer?
You could get the 15″ ThinkPad R50p, an NEC Versa Pro NX VA20S/AE, or an NEC LaVie G Type C at 2048×1536 a decade ago:
http://arstechnica.com/civis/viewtopic.php?f=9&t=555206
http://forum.thinkpads.com/viewtopic.php?t=43774
So no, we are just now catching back up to where we were years ago with pixel-density tech.
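To put the pixel-density point in numbers (simple diagonal-PPI estimates, using the commonly quoted panel sizes):

from math import hypot
panels = [
    ("Retina MacBook Pro, 2880x1800 @ 15.4 in", 2880, 1800, 15.4),
    ("ThinkPad R50p, 2048x1536 @ 15.0 in", 2048, 1536, 15.0),
    ("IBM T221, 3840x2400 @ 22.2 in", 3840, 2400, 22.2),
]
for name, w, h, diag in panels:
    print(f"{name}: ~{hypot(w, h) / diag:.0f} ppi")
# ~220, ~171 and ~204 ppi respectively: the decade-old T221 really was in the
# same ballpark, which is the point about catching back up.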
As for "most" laptops on the market today, they are running the pitiful resolution of 1366×768, not by the buyer's choice, but because the companies got very cheap and only offer that resolution unless you are paying for a gaming-class notebook, even though the crappiest of today's IGPs can handle a much higher resolution for non-3D tasks. You'd think they'd at least go for 1280×800, since then 720p video wouldn't look distorted by being stretched across pixels.
"Too inexpensive"? Is that even proper English?!? What about "cheap enough" or, if you're swimming in money, "very cheap"?
But, nitpicking aside, credit where credit's due: not only can Marco Arment predict the future (at least when it comes to the next Macs), he must also know a definition of "inexpensive" I wasn't previously aware of. I mean:
MacBook Air 11″ starts at €1,049
MacBook Air 13″ starts at €1,249
MacBook Pro 13″ starts at €1,249
I won't argue whether they're worth every cent or not, and I agree that they probably can't be equipped with a Retina display at that price point, but... "too inexpensive"?!? C'mon!
Well, if they are waiting for tech enabling high-fps (100+) rendering on 24″+ retina displays with >1200 line resolution (needing to effectively render at least 2x that size iirc), that could be a long wait.
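For a rough sense of that load (my own illustrative assumptions: a hypothetical 24″ Retina-class panel giving a 1920×1200 desktop at a 2x backing scale, i.e. 3840×2400 rendered pixels per frame, at the 100+ fps mentioned above):

logical_w, logical_h, scale, fps = 1920, 1200, 2, 100
rendered = (logical_w * scale) * (logical_h * scale)  # 3840 x 2400 per frame
print(f"pixels per frame: {rendered / 1e6:.1f} Mpix")
print(f"fill demand:      {rendered * fps / 1e9:.2f} Gpix/s")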
Zero information provided ..