Another Apple-to-switch-Macs-to-ARM post. “Apple engineers have grown confident that the chip designs used for its mobile devices will one day be powerful enough to run its desktops and laptops, said three people with knowledge of the work, who asked to remain anonymous because the plans are confidential. Apple began using Intel chips for Macs in 2005.” No idea when Apple will make the switch, but they will do it. I’m thinking 5-10 year timeframe.
Considering that Apple was a co-inventor of ARM from the days of the NeXT computer, and that iOS is basically a stripped-down version of OS X, it is very likely that Apple has had this in the works for a very long time.
Edit: Mind fart. Removed comment.
http://en.wikipedia.org/wiki/ARM_architecture#Apple.2C_DEC.2C_Intel…
Apple was one of the first three investors in ARM the company, along with Acorn and VLSI Technology.
They did not, however, “invent” the “Acorn RISC Machine” processor. That was done by Acorn on its own several years earlier.
The real shame is that Acorn didn’t hang in there, could have been one of the biggest international companies by now. Their processor legacy is pretty much in every device I own that is not a desktop or laptop.
And possibly in some/most/all(?) of your desktops or laptops – things like HDD or optical drive controllers could also use ARM SoCs, I suppose; similarly monitors, or even some fancier mice and keyboards.
Why do Apple fanboys always re-write history?!
ARM already existed for several years before Apple invested money on them.
Perhaps he was just mistaken; that doesn’t make him an Apple fanboy or a history rewriter. The mistake doesn’t really influence his real statement about iOS.
This place is getting as bad as Slashdot with the rabid Apple fanboy animosity (actually it’s everywhere): anything questioning or not praising Apple is met with a vitriolic retort. And this is coming from a die-hard Mac user from ’95-’08.
Hey Apple fanboys, you won! Be happy. Apple is the most valuable company in the world, everyone has (wants) an iPhone, and the iPad has been selling like crazy! No need to be on the defensive about your platform any longer.
I really like this site (a site devoted to OSes) and am glad it exists. Thom does a good job with it too.
The shoe is on the other foot too: Apple gets bashed hard by Android fans. The truth is that anonymous forums are not the place to discuss any serious subject. There’s no accountability, so one can call anyone all kinds of names and no one has any way to undo that, save deleting wholesale chunks of the thread (as Thom has recently) or banning the users in question (which achieves little if they then create a new account). What OSAlert needs is a non-anonymous user sign-up process – then people have some kind of recourse for ensuring the users commenting are genuine and not out to personally attack people. Another option might be a system where people are unable to comment if they are deemed to have a low “karma” level. The problem with a simplified moderation system is that it only works when people play fair. People don’t play fair here.
This is the text in question:
“Considering that Apple was a co-inventor of ARM from the days of the NeXT computer, and that iOS is basically a stripped-down version of OS X, it is very likely that Apple has had this in the works for a very long time.”
There is nothing fanboyish about it. It doesn’t praise Apple or say nasty stuff about non-Apple companies/products/people. Making a statement about Apple that doesn’t say they must die doesn’t make someone a fanboy.
The text includes one slight inaccuracy, which probably wasn’t made with evil intent. The inaccurate bit doesn’t even add to or subtract from the core statement, which is a logical assumption: iOS is indeed a light version of OS X, and Apple probably runs more on ARM than we know.
People here have said some pretty nasty things about Apple and Steve Jobs, and made inaccurate statements much worse than this one. Well, they can do that, and if someone doesn’t like it he can reply to it.
I find it strange that that didn’t tick you off, but this slightly inaccurate, neutral post somehow is seen as an example of anonymous Apple fanboys being overly annoying.
Do you want to know why such a ‘lie’ pisses off people?
Because it may seem ‘small’ to you; however, the way psychology and the internet work is this.
Some random asshat says Apple co-created ARM on OSAlert. Then other people repeat this falsehood on other forums and other sites, because they saw this ‘truth’ on OSAlert. Then more people do so, and finally one little falsehood becomes another part of the Apple fanboy predilection for re-writing history once a big-name Apple blogger repeats it.
So sticking to real & verifiable facts while posting here is much more appreciated in the long run.
The person in question is called Nikato Muirhead, whom I had never heard of or noticed before. I’d be very surprised if you found a single forum where his inaccuracy, or lie as you call it, is quoted. Even more surprised if people on those forums accepted it as a fact and used it in pro-Apple propaganda.
Had it been Al Gore, Lady Gaga, or someone very influential it would be different.
The general public doesn’t know or care about ARM, Nikato Muirhead, or OSAlert; they don’t care if Apple did or didn’t have anything to do with it. They won’t flock to the Apple store because Nikato Muirhead said Apple co-invented ARM.
If you scour the comment sections of OSAlert you can probably find inaccuracies in every one of them. Even this statement is probably inaccurate.
I see what you did here…
what is the most popular 8-bit home computer?
“Fanboyish” isn’t really about evil intent …it might be seen more as an outlook on things which makes it easier to produce such inaccuracies, going in the direction of brand-praise (or even, as you put it in another post, history-rewriting).
In another post he does seem a bit over-enthusiastic: http://www.osnews.com/permalink?541139
(but it’s not like he’s the most annoying one in that respect, really)
I think the amount of effort one needs to put in to earn the status of fanboy has been lowered considerably. Even owning an Apple product will get you that label.
For some reason a number of people, certainly present on this site as well, want to kill any chance of a reasonable discussion by labeling someone a fanboy. If you don’t say anything negative about Apple, if you own an Apple product, or if you say something bad about Google, you get called a fanboy and deemed unreasonable.
In this case someone made a slight mistake, which he admitted without any struggle shortly afterwards. He gets accused of being a fanboy, of rewriting history, and even of posing the risk that his claim will go ’round all the forums, causing legions of other fanboys to repeat it.
Not so long ago on this site someone was upset at Apple for suing The Beatles, and the original iPod was laughed at for not having a shuffle function. Both claims are false, and even though such claims are much more easily repeated, spread around, and understood by the general public than this one, that never happened; nor did anyone but me point out the errors, nor did the people making these false claims retract them or even respond.
BTW, why don’t you add me on Facebook? You always tend to reply to me just before the deadline, and then I’m not allowed to respond!
Granted, such mildly annoying outlooks on things go all ways.
But you know, there’s probably something about fans of formerly-barely-surviving-and-a-bad-deal-for-a-decade-but-most-valuable-company-now Apple, those of them who live in places where Apple always had marginal market share and hardly any presence – they were and/or got really crazy defensive …or at least they stand out more. Which in turn makes it easier to point at them as “brainwashed” and maintain that perception.
I mean, I have an iPod (getting one here was fairly unusual; I got it largely for ~technical reasons, believe it or not: it was, back then, the most decent and least expensive option supporting Last.fm scrobbling), and I sort of recommended Apple products for quite a while – but for the last ~2 years I really don’t, because I don’t want to be associated with that sub-group of fans.
There’s this Atari-logo guy here, kovacm – looking at his name, he’s probably from the Czech Republic (so, one of the places with a historically marginal Apple presence); IIRC you suggested once that his posts might be largely the result of EN as a 2nd language – but EN is also non-native for you and me. With him it goes further than that… look at his response to my post about Project Star Trek in http://www.osnews.com/comments/26322 – plus, generally, a bit juvenile way of talking, with the LOL, !!!, and smiley overload.
And uhm, well, I don’t reply right now just before the deadline, for example :p (hm, and WRT Facebook – I guess I would have to start really using it; right now, I average maybe one visit per month – so much so that I see ads on random websites to join FB – and maybe I don’t want to change that)
Kovacm is cool, and his English isn’t very good but far from bad. I judge my own English to be mediocre, yet acceptable and readable for most people (I hope). If one’s English isn’t very good and people don’t like one’s opinion or person, they tend to be far more hostile. Kovacm just doesn’t have your level of English, like I’m sure a lot of people don’t, and even more don’t speak it at all.
I don’t mind deciphering what people write.
If you don’t like Apple/Steve you tend not to like people who either do like them or don’t feel the same way as you do.
Very often we’ve had these anti-Apple folks versus a few pro-Apple characters. Now I might be a bit biased, but the pro-Apple ones are less aggressive, don’t tend to swear, offer reasoned arguments, and are often right.
The anti-Apple crowd features a number of people who produce one-liners like “that’s it, I’m never going to buy Apple” or “Steve must rot in hell”. If we disregard them, what’s left is a number of people who are mostly against Apple because they prefer something else, which makes them sound a bit jealous of Apple’s success.
I can understand your feeling of not wanting to be associated with a certain group. Personally I had the same thing when I was a full-time Linux user. Back then it was really bad. They made claims about Windows/Microsoft that were far beyond wrong, and claims regarding Linux that were very optimistic at best.
We all know about the year of the Linux desktop, for example. Every tiny tweak or event would herald Linux desktop world domination. Every smart open source product was attributed to “Linux”, but when bugs or problems were mentioned, Linux turned into “just a kernel” and something else was to blame. This made it easy for me to switch to the Mac.
I think I was a Linux user from 1998 to 2005 and have been an Apple user from 2005 to now, so both 7 years. I think there’s a Biblical quote regarding this.
Before 1998 I was an Amiga user who started using Windows more and more.
I’d rate Commodore and Apple users to be the most pleasant people to be around.
I don’t think your posts (that I’ve read) ever needed deciphering; I hope mine are mostly like that, too.
And “cool” @kovacm …I don’t know, it goes further; it’s more than weak EN, it’s this consistent… attitude – like he hasn’t noticed yet that the platform wars* days are over, and he still lives in that mindset.
Or maybe drunkenness… (check out the lower regions of the discussion I linked previously; and again, his reply to my post that mentions Project Star Trek – how bad must your EN be to argue there was no possibility of a MacOS Classic x86, even when that’s the very thing I linked to?; also, throwing around “great past quotations” seemingly without trying to connect the dots and analyse how things turned out in the 3+ decades since: http://www.osnews.com/thread?522309 )
But valid points about the different kinds of fans and anti-fans. Though I’ll also add the possibility of hiding behind outward politeness while trying to drill in the same tired, inaccurate “facts” no matter how many times they are successfully refuted. Maybe it just doesn’t register – after all, that’s one of the ways human minds deal with cognitive dissonance.
Now that I think about it, there might be one reliable ~definition: FANBOYS ARE MONO-THEMATIC (for example: participating in discussions only about “their” brand or its biggest competitor, and virtually always praising the former and finding some faults in the latter)
But in the end, it’s all just irrelevant bickering on the internet which will be lost in time… (same for our conversations…)
*BTW the old days – remember, we often tend to see them through rose-tinted glasses.
Also, there were far fewer flamewars because… most people didn’t have internet access.
Apple was not a CO-INVENTOR of ARM!!
Anyway, it would be great if someone did a deep analysis of using ARM instead of Intel Ivy Bridge.
Looking at it from the outside, all the premises that it could be done are there:
1. Apple controls third-party developers through the App Store
2. Apple controls the development tools (language, framework, compiler…)
3. Apple is investing heavily in GPGPU & multiprocessing (OpenCL/GCD…)
My question is:
how much faster could professional applications* run on ARM + GPGPU chips (vs Intel chips) if you optimize everything (metal, compilers, frameworks)?
*Final Cut, Logic, Aperture, the Adobe Suite, Cubase, Cinema 4D…?
Is Apple in a position to do this:
– potentially leapfrog x86 in terms of performance by optimizing everything I already mentioned (metal, compilers, frameworks…)?
That sounds like making something that’s slower go faster than something that’s already faster.
Why not optimize for the faster option and make it even faster?
iOS devices can do impressive stuff speed-wise, but when it comes to certain desktop applications it’s hard to beat raw power.
Still, it would be interesting to see whether Apple comes out with an ARM Mac, and how they’d do it.
Actually, that’s how x86 has been operating for the last decade. Intel has had to employ a number of ‘cheats’ to keep up with Moore’s law, and as a result x86 CPUs have gotten very long in the tooth.
Few people run applications that need that kind of raw power, and for the few times it is required, a switch to multiple RISC cores over fewer CISC cores might pay dividends in the long run. Admittedly there’s still a software hurdle to overcome there (teaching developers to write good multi-threaded code isn’t a small task). But for DAWs, video editors, and image manipulation software, having dedicated RISC cores for filters and fx makes a lot of sense for low-latency work.
I genuinely think that if we want to sustain the exponential growth in processing power we’ve enjoyed thus far, we need to learn to parallel-process better rather than rely on clever processor tricks to mimic such effects (e.g. out-of-order execution). And to do that, it makes more sense to have more dedicated RISC cores: it’s easier to stack cores on one die, and the lower power draw means the CPUs run cooler (cooling top-end multi-core monsters is always a bit of a battle).
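To make the multi-threading point concrete, here is a minimal sketch (nobody’s actual product code) using Apple’s Grand Central Dispatch, mentioned elsewhere in this thread, to spread a toy gain filter over however many cores the machine has. The buffer size, chunk count, and gain value are invented for the example:

```c
#include <dispatch/dispatch.h>  /* Apple's Grand Central Dispatch */
#include <stdio.h>
#include <stdlib.h>

#define SAMPLES (1 << 20)   /* hypothetical 1M-sample audio buffer */
#define CHUNKS  64          /* work items for GCD to spread over cores */

int main(void) {
    float *buf = malloc(sizeof(float) * SAMPLES);
    for (size_t i = 0; i < SAMPLES; i++)
        buf[i] = (float)i / SAMPLES;          /* dummy signal */

    dispatch_queue_t q =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    /* dispatch_apply runs the block CHUNKS times, in parallel,
       on as many cores as the system makes available. */
    dispatch_apply(CHUNKS, q, ^(size_t chunk) {
        size_t per = SAMPLES / CHUNKS;
        for (size_t i = chunk * per; i < (chunk + 1) * per; i++)
            buf[i] *= 0.5f;                   /* the "fx": a simple gain */
    });

    printf("first sample after gain: %f\n", buf[0]);
    free(buf);
    return 0;
}
```

Build on a Mac with `clang gain.c` (blocks are enabled by default there). The nice thing about this style is that the chunking, not the thread management, is the only thing the developer has to get right.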
Yes, I can imagine!
I wonder how many threads God had to use to create the universe… Talk about multi-threaded. And I wonder whether He went with agile processes or good old-fashioned up-front design?
I echo this guy’s sentiment: ARM may not be as powerful a general-purpose CPU as x86/x64, but it can be tailored for certain kinds of tasks, after which it can beat x86/x64 hands down. Video and image manipulation are actually things that are already handled quite well on ARM by specific cores designed for those tasks; e.g. many video cores these days can handle both decoding and encoding of video in real time. It’s just a matter of adding support for effects in the core and updating the software to make use of it, and you’ll likely get better performance and better battery life than with x86/x64 solutions.
That is to say, x86/x64 is good for all-purpose tasks where general, overall, raw power is important, and ARM is much worse for that, but a whole lot better for more specific tasks.
About the topic itself: I could certainly see Apple going for ARM in the future; they have a lot to gain from such a shift in architecture. It would possibly start with only the MacBook Air going ARM, in an effort to see how the public reacts, to let developers start the transition on their end, and to prepare the public for a bigger push a few years after that.
Didn’t the Amiga try to offload various types of processing onto a variety of units in its desktop hardware? Or am I smoking something?
No, you are completely lucid, for the Amiga 500:
Motorola 68000 – Main processor
Fat Agnus – OCS / ECS display controller
Denise / Super Denise – OCS / ECS display encoder
Paula – audio and I/O controller
Gary – system address decoder
For the Amiga 600:
Similar to the 500, just with one difference.
Gayle – system address decoder and IDE controller
For the Amiga 1200:
Alice – AGA display controller
Lisa – AGA display encoder
Paula – audio and I/O controller
Gayle – system address decoder and IDE controller
This is also how game consoles work, with specialized sets of chips.
Thank you. And I used to have an Amiga tower… you would think I would remember. :/ It stinks getting old and losing your memory/sanity.
Yes. Including what can probably be seen as the first compositing window manager.
But it can be argued that this, and the overall tight coupling of hardware & software, was also the undoing of the Amiga – the 500 generation was great, but later improvements, while maintaining compatibility, proved hard & expensive. The 1200 generation, half a decade later, was not much better.
Essentially, the Amiga had console-like hardware and market dynamics (most people never upgraded past the 500, hence games always targeted mostly the 500), but without the matching business model (the Amiga being an open platform, Commodore was unable to extract money from dev houses the way Nintendo could; something like that also essentially killed Atari and led to the video game crash of 1983 …in which Commodore played a major part; it seems they didn’t quite realise what they did).
x86 has been RISC since the Pentium Pro.
The real speedup is coming from the massive parallelism of GPUs.
That is simply wrong. It never was (except for the RISC86 core), and most x86 instructions now translate into a single micro-op. There is also the purely CISC-y Atom.
It is simply true.
Since the Pentium Pro, Intel has taken all the benefits of the RISC philosophy! (But it still had, and even today has, the burden of translating x86 instructions into micro-ops – a burden that prevents it from competing with ARM.)
The RISC vs CISC war ended a long time ago… everything is RISC-like today.
Could you elaborate on this? Give some examples?
The only RISC that was ever part of x86 was so deep in the processor core and pipelines that no one programming the processor for general use ever noticed. It was only in intensive applications that any special kind of optimisation made much difference. The whole point of the x86/IA32 architecture was that it was meant to be backward compatible with the previous generations.
Again, a RISC core + a CISC userspace interface != a RISC processor.
Intel can do pretty much the same with Atoms? Actually, Intel might be the furthest along WRT such a many-chip solution (and the future software support), with the post-Larrabee http://en.wikipedia.org/wiki/Intel_MIC
Intel is far from alone. MIPS (I think it was) and IBM have both been doing this for years. In fact one such joint venture between IBM and Sony sits in many people’s homes: the PlayStation 3’s Cell processor.
So I wouldn’t say Intel is ahead of the game on this one – though they’re certainly not sitting idle either.
Your reply reminded me about one other curious many-cores project, which uses… ARM:
http://en.wikipedia.org/wiki/SpiNNaker
http://apt.cs.manchester.ac.uk/projects/SpiNNaker/
And I guess I also meant there that Intel seems the closest to really productising such an architecture, also with their usual top-notch compilers to be expected. OTOH, they have also had some failures along the way… (Larrabee; too bad it seems to have killed the Project Offset game/engine)
Yeah, I forgot about Cell …but maybe there were good reasons for my omission? ;p
– not that many cores
– not homogeneous
– sort of a failure WRT its goals (IIRC, the PS3 was supposed to have only a Cell inside – when the first performance results showed up, Sony contemplated using two Cells; eventually, they just used a relatively classic CPU+GPU architecture)
Imagine a computer with twenty or so ARM chips working in tandem. AMD has already developed such a system based upon ARM, using dozens of ARM CPUs. Megahertz will not be an issue. This will ultimately lead to a MacOS with full compatibility with iOS apps; effectively the two operating systems will be merged this way. As for compatibility with legacy Windows apps… Microsoft doesn’t seem to care anymore, so why should Apple? For Microsoft it’s do or die. For Apple it’s evolve and fly. I was mistaken in my assertion that Apple “invented” ARM. I can say that they saw the future coming, almost twenty years in advance… Wow.
I think they could run iOS apps on a Mac right now if they made an emulator, like they ran PPC software on Intel machines.
A number of ARM CPUs would make a powerful machine, but (and I’m no expert) I don’t think that would help much if the computer was doing one single heavy task. Multiple CPUs/cores can handle multiple processes, but I don’t think they can join up to speed up a single process.
But I wouldn’t mind being wrong this time.
Wouldn’t you need a touchscreen Mac for best effect? Haven’t tried the Magic Mouse or Magic Trackpad so maybe those would do in a pinch.
You would need a Mac with a touch screen for the best effect, but I don’t think that would be very practical. It would be very hard on your arms. Even on a laptop it wouldn’t be very comfortable.
I have a Magic Trackpad and I like it, but using it to control on-screen iOS apps would mean guessing where to press your finger. Better would be to use the mouse pointer, but then multi-touch wouldn’t work.
Making a fat binary that includes versions for the iPhone, iPad and the Mac would be a terrible waste of space.
Desktops and laptops are a different thing than mobile devices like phones and tablets. I like my iOS devices, but I would hate it if my Mac became a restricted device.
Applications can use threads to take advantage of multi-CPU/multi-core systems. This is not an issue specific to ARM and is very much an issue with x86-based systems as well.
Yes, but if you have a very difficult calculation, would a single 500 MHz CPU solve it more slowly than ten 500 MHz ones? The ten-CPU version can do the same calculation ten times in the time the single CPU does it once, but can it do a single instance faster?
While there is no guaranteed improvement, I think it is important to keep in mind that multi-core systems can improve performance or throughput even if all they are running is single-threaded processes.
That’s because they are almost certainly running multiple processes, and being able to dedicate a core to a certain process opens up optimization options, e.g. never having to flush caches, potentially having dedicated memory or I/O lines, etc.
Multithreading improves the utilization of multi-core CPUs, but multithreading is only one of two widely deployed parallel-processing strategies (the other obviously being multiple processes).
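To answer the question above concretely: a single calculation can go faster on multiple cores if it can be split into independent pieces. A minimal sketch in C with POSIX threads (the array size, thread count, and the summing task itself are invented purely for illustration):

```c
#include <pthread.h>
#include <stdio.h>

#define N        (1 << 22)   /* hypothetical "difficult calculation": sum 4M numbers */
#define NTHREADS 4

static double data[N];

struct slice { size_t begin, end; double partial; };

/* Each thread sums its own slice of the array; no locking is needed
   because the slices don't overlap. */
static void *sum_slice(void *arg) {
    struct slice *s = arg;
    s->partial = 0.0;
    for (size_t i = s->begin; i < s->end; i++)
        s->partial += data[i];
    return NULL;
}

int main(void) {
    for (size_t i = 0; i < N; i++)
        data[i] = 1.0;                        /* dummy input */

    pthread_t tid[NTHREADS];
    struct slice sl[NTHREADS];
    size_t per = N / NTHREADS;

    for (int t = 0; t < NTHREADS; t++) {      /* fork: one slice per core */
        sl[t].begin = t * per;
        sl[t].end   = (t == NTHREADS - 1) ? N : (t + 1) * per;
        pthread_create(&tid[t], NULL, sum_slice, &sl[t]);
    }

    double total = 0.0;
    for (int t = 0; t < NTHREADS; t++) {      /* join: combine partial sums */
        pthread_join(tid[t], NULL);
        total += sl[t].partial;
    }
    printf("sum = %.0f\n", total);            /* expect 4194304 */
    return 0;
}
```

Build with something like `cc -pthread sum.c`. How much this helps depends on how divisible the task is (Amdahl’s law); a calculation that is one long chain of dependent steps won’t get faster no matter how many cores you throw at it.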
More cores will allow each core to spend more time on a certain task, but they can’t work together to do one task faster (I think).
So having a whole lot of ARM chips in your Mac doesn’t make it faster at CPU intensive tasks than a Mac with 4 or 8 x86 based cores.
Then again, do they need to be? Are there really that many tasks that need enough CPU power for the difference to show?
When you move from a normal hard disk to a solid-state one, you notice what a bottleneck it was.
The CPU spends a lot of time being idle. So my guess would be that a computer with a solid-state disk, enough (and quicker) memory, and a faster system bus could work very well with ARM CPUs compared to an x86 machine.
Around here I have a PC with Athlon XP 1700+ (1.46 GHz, but actually set to 1.1 GHz for hardware conservation purposes), an 11 year old CPU.
And you know what? It’s still quite fine for the typical stuff (browsing, office, IM, basic media consumption), the idle thread takes ~90% of CPU time.
Though the machine has, yes, a bit more RAM than was typical in its heyday – 768 MiB. If it was dual CPU and with an SSD, I guess it would be very comfortable.
Speaking of dual – on my dual Pentium II 266 the largest “consumer” of CPU power is also the idle thread…
I once heard a story about a programmer who had asked how to kill the idle thread, because it was taking up 99% CPU time(…).
It seems I just uncovered, here, some programmers who still have some moderately related issues with task manager: http://www.osnews.com/thread?541619 ;>
I must admit, I don’t always understand it either.
We have a Windows 2003 server; it tends to be around 0-2% (CPU) busy. When it gets to 5%, people complain about performance. Neither network nor hard disk activity is anywhere near 100%.
Time for an upgrade! ;>
At least I’m fairly certain that “mem usage” shows only the physical RAM, more or less – it tends to greatly drop when an app gets minimised / flushed out to swap; “VM size” in the meantime stays constant, and that would be the memory really consumed …I think.
…but you have already had an iOS device emulator for Mac OS X… for 5 years now.
http://www.youtube.com/watch?v=VvOs6UpVJes
They (Apple) do not even call it an “emulator” but rather a “simulator” (since there is nothing to emulate, e.g. an ARM CPU).
So this is not an issue (or a purpose) – switch the Mac to ARM and then run iOS apps on the Mac.
You can do that today.
— but
take a look at today’s Mac OS X; you can max out the CPU at 100% if you do:
1) rendering
2) zip/unzip stuff
3) run Adobe Flash
4) some heavy PhotoShop-ing
5) Aperture
6) complex javascript/html pages
… and all this stuff could run even faster on the GPU! (e.g. the new Aperture relies heavily on the GPU!)
on the other hand, for:
1) chatting
2) skyping
3) typing
4) exceling
5) emailing
6) ~browsing
you do not need anything faster than today’s ARM!
All professional (content creation) applications could benefit more from the GPU than from the CPU in the future (as the underlying infrastructure gets more and more advanced: compilers, languages, frameworks…),
…and one very important thing for the future: it is easier to double GPU power than CPU power!
Not sure what you meant by “since there is nothing to emulate” followed by a winky face, but if your comment was meant to be tongue-in-cheek, sarcastic, or ironic somehow, note that there literally is nothing to emulate. Apps built in Xcode for the simulator are built as x86 targets, not for ARM.
OK, let’s elaborate on this:
if you need an emulator, that means you need to emulate an ARM CPU on Intel Macs.
If you have a “simulator”, that means you do not need to emulate an ARM CPU, because all the code (frameworks) is compiled for x86 Macs and you still have iOS apps & the OS itself running on an x86 CPU without emulating ARM.
EXACTLY as you said:
there is no need to emulate ARM.
Back to the original topic: there is no need to switch the Mac to ARM just to be able to run iOS apps!
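For the curious, the device/simulator split is visible right in the SDK headers. A tiny sketch built around the macros in Apple’s TargetConditionals.h (the macro names are from the SDK header as shipped around this time; the messages are made up):

```c
#include <stdio.h>
#include <TargetConditionals.h>   /* Apple SDK header defining the TARGET_* macros */

int main(void) {
#if TARGET_IPHONE_SIMULATOR
    /* Simulator builds: the iOS frameworks are compiled natively
       for x86, so nothing is emulated. */
    printf("iOS Simulator build: running natively on x86\n");
#elif TARGET_CPU_ARM
    /* Device builds: the same source compiled for ARM. */
    printf("iOS device build: running natively on ARM\n");
#else
    printf("Plain Mac OS X build\n");
#endif
    return 0;
}
```

Which is the whole point being made here: the “simulator” is just another native compile target, not an emulated machine.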
I don’t understand: even if Apple could get that much of a performance boost by using GPGPU on ARM, couldn’t they also get the very same perf boost with GPGPU on x86?
Unless I’m missing something, GPGPU does not sound like a major area of differentiation between x86 and ARM, since GPUs must work similarly on both architectures (there’s only so many ways to compute a matrix product in silicon).
Simple answer: if they want to switch to ARM, they could use the GPU to compensate for the speed difference between ARM and Intel x86 chips!
And even on desktop x86 you will need to utilize the GPU in the future if you want really impressive speed gains!
In the end the speed of the x86 part will be less and less relevant – “it should only be fast enough for Office-like jobs…” – and the real speed (where it is needed) will come from the GPU.
Even AMD bet the entire company on this premise! The CPU part could be anything (ARM/x86); it does not matter. The GPU is the one that counts. Look at the latest AMD GPUs – they go so far that you can pass pointers from the x86 side directly to the GPU; they add exception support, virtual memory and page tables, calls and conditional branches, and integer operations with 256 GPRs. It looks like a complete CPU and is still a massive SIMD monster.
They only need “fast enough” x86 execution, since the real speed will come from the SIMD GPU.
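To give a feel for what this GPGPU offload looks like in practice, here is a minimal OpenCL sketch in C (OpenCL being the Apple-initiated framework mentioned earlier in the thread). The kernel is a toy stand-in for a real image/audio filter, and error checking is omitted for brevity:

```c
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

/* Toy data-parallel "filter": every work-item scales one element. */
static const char *src =
    "__kernel void scale(__global float *buf, float gain) {\n"
    "    size_t i = get_global_id(0);\n"
    "    buf[i] *= gain;\n"
    "}\n";

int main(void) {
    float data[1024];
    for (int i = 0; i < 1024; i++) data[i] = 1.0f;

    cl_platform_id plat;  cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "scale", NULL);

    /* Copy the buffer to the device, run 1024 work-items, read it back. */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(data), data, NULL);
    float gain = 2.0f;
    clSetKernelArg(k, 0, sizeof(buf), &buf);
    clSetKernelArg(k, 1, sizeof(gain), &gain);

    size_t global = 1024;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

    printf("data[0] after GPU scale: %f\n", data[0]);  /* expect 2.0 */
    return 0;
}
```

Note that the host code is the same whether the CPU beside the GPU is x86 or ARM; the OpenCL runtime compiles the kernel for whatever compute device it finds, which is exactly why the GPU could carry the performance load in the scenario described above. On a Mac you would build with `cc demo.c -framework OpenCL`.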
I see. Thanks for the clarification!
as long as they don’t use this excuse to completely turn OS X into iOS, of course.
Why else would they do it? iPhone, iPad & iPad-with-a-lid, plus the odd box (perhaps with its own internal cloud of several PCs) with some humongous retina-screen monitor. How cool would that be… very, from what I gather.
Apple would be idiots to do this, but then again what do I know? I am not running the most valuable company in the world. Off topic: I am very much enjoying Windows RT on my Surface.
Why so? They’ve got a lot of experience with ARM now, it having been the basis of their best-selling products for quite a few years. If they think the next-generation chips are going to be competitive as desktop hardware, why would they be idiots?
Adobe Suite and MS Office, for starters… although nothing in Apple’s behavior says it wants to be a company that facilitates content creation any more, merely consumption.
Presumably that’s why Apple put so much work into developing Final Cut Pro X, Aperture, Logic Pro, the photo editing capabilities of iPhoto especially on the iPad, Garage Band, etc.
The status and importance of Microsoft Office is declining quickly and will continue to do so. Almost no one needs Office anymore unlike say ten years ago.
I’m no expert on the matter, but I’ve heard a lot of bitching about the dumbing-down of Final Cut Pro X. The rest of the apps you mentioned are, IMO, leftovers from a time when desktop Windows was the thing to beat. As for the editing capabilities of iPhoto… please…
I don’t need MS Office, maybe you don’t need MS Office, after all LibreOffice is pretty good and free, but in the actual office, people will use MS Office for the foreseeable future.
Microsoft’s office sales seem to point otherwise…
I see you stressed “needs”… true enough, but the corporate world still uses Office… Ours is a mixed world of Linux and Windows… but Office is the de facto standard for all presentations, documentation, etc.
There are already ARM versions of MS Office for Windows, so I don’t know where you get this idea that Adobe or MS couldn’t cross-compile their software if Apple decided to support ARM on their desktop OS.
Microsoft could support ARM Mac OS X, but I imagine it would take a couple of years. And to my knowledge, Adobe doesn’t have ARM versions of their applications. The amount of platform-specific optimization is probably mind-boggling. Doable? Yes. Immediate? Not by a long shot.
Adobe has a version of Photoshop on both iPad and Android tablets. Both versions do a lot (but not all) of what the entry-level Photoshop Elements does, so I can’t really agree with you here.
No it wouldn’t. Office shouldn’t have many – if any – processor-specific optimisations. Most of the real graft in MS Office will be maths and OS API calls, both of which should be CPU-independent.
Sure, porting Office will not be a 5-minute re-compile job, but two years is grossly overstating the time frame.
Plus – as I’ve already stated – Office already supports ARM, so there’s reusable code there, even if every Office platform is its own individual project (which seems a little unlikely).
Actually, they do. And they also had PowerPC versions before Apple moved to Intel (granted, out-of-date code, but it shows a history of supporting multiple platforms).
It makes very little sense to pile on CPU-specific optimisations by hand – not least because it makes human coding errors massively easier to make yet harder to spot and correct, but also because packages like Office simply don’t need them. Even in the case of Photoshop, most of the hardware optimisations should come in the form of GPU acceleration (GPU optimisations will reap better returns than CPU optimisations, due to GPUs having better support for floating point and – let’s be honest – being tailored for graphics already).
You make a big deal of this as if developers have never had to target other CPU architectures – and I guess if you’re exclusively a Windows desktop developer that would be true – but software developers at this sort of scale routinely target other platforms, so porting to new architectures shouldn’t be a problem so long as the underlying OS framework remains relatively static (and OS X has proven itself there already: PPC->i686->AMD64).
iMacs, laptops, mini? Maybe. Pro? hahahahahahaha
By the time they would consider such a move, ARMv8 cores will be commercially available. They could design an SoC with 16 CPUs, for example. Remember, they did purchase PA Semi, which designed high-core-count multi-core PowerPC CPUs. The hahahahaha is on you.
Keep in mind that Intel has 10-core Xeons available (running 20 threads), with raw power far surpassing anything ARM has ever done.
My guess is that they’ll wait until 64-bit ARM becomes available, so that they won’t have a repeat of the PPC->x86->amd64 transition and can just go from one 64-bit arch to another.
Plausible.
A changeover would likely build on the main strength of ARM, which is low power consumption. First would be the MacBook Air (i.e. grafting a real keyboard onto the iOS experience), then moving on to more and more powerful systems.
After three switches in CPU family (6800 to 68000 to PowerPC to X86), Apple must have the game plan for success in implementing the change pretty well rehearsed.
The Mac as released always had a 68K processor or later. The 6809 was only ever used in internal early models.
I just don’t see it happening.
When Apple did the PPC->x86 switch, it was because PPC performance had been severely behind x86 for years, with only a brief glimmer of hope to the contrary when Apple switched to IBM for the Power Mac line. There are no ARM designs even close to Intel at the higher end.
Meanwhile, x86 is catching up to ARM in terms of performance/watt in small forms, and doing so at a much quicker pace.
I think I would go so far as to say that it is more likely Apple would switch to x86 in the iPad and iPhones than switching to ARM on the desktop.
EDIT: I don’t think that last part is likely, either.
That would not make much sense. Apple has invested tons of money into building one of the world’s top silicon design teams. They have the opportunity, or at least they think they have, to out-compete rivals on raw power. The only third-party part left in the A6 is the GPU, and I don’t think that is going to be the case for long.
They are aiming for a complete, bottom-up Apple product that they can take in whatever direction they want. This is a very expensive dream, but it has been Apple’s dream since day 1. I am not so sure it is the most profitable road to world domination, but it is doable given the budget and the will to do it. They definitely have both, at least for now.
Except ARM keeps releasing new CPUs and pushing further ahead, and they’re ahead of Intel on performance/watt by a massive margin.
Apple switched the Apple TV away from x86 to ARM, and there is literally ZERO incentive to switch from ARM to x86 for the iPad or iPhone. You’re batshit crazy.
Intel is also releasing new processors, and are targeting the low-power areas aggressively.
And yes, ARM currently beats Atom at performance/watt, but note that there aren’t any existing ARM chips where such a comparison would make sense if you want to compare against an i5/i7. There aren’t any ARM chips that would be a suitable replacement for the i7 in my laptop. None.
Intel is way closer to having a chip that competes with ARM designs in phones than any ARM maker is to competing with Intel on the high end.
Apple’s switch from x86 to ARM probably had more to do with software than hardware, keeping development of iOS simpler.
Also, I didn’t say they would switch to x86 for phones and tablets, only that it was more likely, which I also said wasn’t going to happen.
Intel is still a very long way from beating ARM at the performance/watt game. Intel is visibly struggling to scale down its Atom line without sacrificing a significant amount of processing power and features. ARM, however, is now creeping into Intel’s territory:
http://www.computerworld.com/s/article/9231156/Early_ARM_64_bit_pro…
ARMv8 64-bit CPUs are now at Intel’s doorstep. It is only a matter of time before Apple starts churning out similar chips at PA Semi for the rest of their line.
Atom is much, much closer to ARM in the low-power segment than ARM is to Xeon in the high-performance segment.
Here’s an example:
http://www.anandtech.com/show/5770/lava-xolo-x900-review-the-first-…
CPU performance is at or near the top, while battery life and GPU performance is mid-range.
Keep in mind that the Saltwell core used in this phone is manufactured at 32nm. Intel still has to move Atom to 22nm, and will have a new Atom core at 22nm mid- to late next year. Intel is currently the only company selling 22nm devices.
This phone is a real product that exists, not the successor to a not-ready prototype that the article you linked to was about.
Well, the ARM servers are coming:
http://www.youtube.com/watch?v=njmQBqUuYqU
Even from AMD/SeaMicro:
http://www.eweek.com/servers/amd-arm-set-stage-for-competition-with…
So maybe even desktops/laptops?
A MacBook Air with 12 or 15 hours of battery life… I’d love it.
BTW, I think the Pro line must stay with Intel. ARM doesn’t have enough power to virtualize Windows, for example.
You do know that Windows RT runs on ARM?
And this is of what relevance to this article, considering that you can’t even purchase Windows RT aside from purchasing a device preloaded with it?
And how would you virtualize Windows on ARM in the first place, unless Microsoft came out with a consumer-purchasable ARM version? You couldn’t virtualize x86 Windows on ARM; you would have to outright emulate the x86 architecture. That would be extremely slow no matter how you slice it; anyone remember Virtual PC for PPC Macs?
With the larger amount of apps already available on iOS, what is the point of Windows RT?
Windows RT != Windows Pro
Well… http://www.raspberrypi.org/phpBB3/viewtopic.php?f=41&t=12727 … nothing is impossible.
Yes, of course Windows won’t run, under emulation or otherwise, on any ARM Mac.
I think it would, long term, be incredibly foolish for Apple to jump *wholesale* into the ARM computing world.
We came to know that the x86 and PPC versions of OS X were developed side by side for a long while before Apple switched to x86 machines,
and I’d find it very likely that they would do the same with x86 and ARM were they to make a switch again,
…only this time I would expect x86 machines to, at the very least, remain the powerful processors of choice in their top-end Mac Pro machines – presuming those still exist in a few years’ time. But I’d hold the same hope for the other machines within the desktop line-up, leaving ARM, if it happens, to perhaps the “non-pro” laptops and the tablets.
I don’t see multi-core ARM CPUs eclipsing multi-core low-voltage x86 CPUs (even in terms of performance per watt, let alone raw FLOPS and MIPS) for well over 5 years, maybe 10.
I would gladly welcome the change of architecture if it allowed me to develop on it and was not as locked down as Windows RT (I don’t think Windows RT allows developing directly on it).
As unfortunate as it sounds, Apple is the market driver, so having one viable ARM laptop would be the signal for other makers to get more serious about building ARM laptops and pushing toward an ARM ISA standard.
I have high hopes for Linux on ARM, but right now there is little effort toward it, as dev effort is divided among devices.
Apple has probably been ‘playing’ with this for several years. By playing, I mean a small team somewhere has been given a brief to investigate using ARM CPUs in the OS X product line.
They will have gone away and tried all sorts of permutations. They may even have got their colleagues at PA Semi to create some custom silicon for them.
When the time comes for the project to become mainstream, the results of this ‘playing’ will be fed into their product direction. The ‘playing’ will have covered all sorts of different ways/permutations etc. of how to do this.
They might even say, ‘sorry not possible yet because…’
This is what any company that is serious about making this sort of transition would do. Apple has done it before. They might even have several teams looking into it. You can be sure that if/when Apple makes this move they will have a really good idea about what will be produced and sold to consumers.
This is how new cars are produced. You see lots of concepts at the motor shows. Very few of them see the light of day, but bits of these concepts often appear in new cars a few years later.
Naturally this is all speculation, but if I were an Apple shareholder (I’m not) I would expect them to be doing this sort of thing.
Are we not forgetting smartbooks?
Only Apple could pull this off simply because of the iOS library of apps. iOS is Apple’s 25-year platform just as Mac OS was.
The wave of smartbooks that was promised in 2008 never launched, because of a perceived negative bias from consumers over the lack of Flash for ARM.
I imagine Apple would pre-announce a ‘smartbook’ device (basically iOS + keyboard) and allow developers to register to receive a prototype, so as to get serious app development for a serious machine ready before it hit the shelves.
That said, I can’t reconcile the mouse problem. If a smartbook has a keyboard, then where does the touch go? Reaching for the screen is bad for the arms, and a touchpad / Magic Trackpad detaches you from being able to touch the content. It’s the only reason I can think of why Apple has not released such a product yet.
Are we really to stick with Mac OS X for another ten years because Apple cannot reconcile their own competing platform that by nature abhors a keyboard?
Oh, I hope OS X sticks around and I think it’ll have to. Why? iOS is too limiting, and Apple wants at least a small foothold in business. One of two things has to happen: Either OS X has to stick around, or Apple has to relinquish a little of its control over iOS. I’m betting it’ll be the former, that way there’s still a definitive line (as far as Apple is concerned) between the various products.
I code a lot in Codea, an IDE and development system for iOS/iPad. It has an emphasis on creating graphical apps, such as games, uses Lua as its programming language, has full 2D/3D graphics access, and has a vibrant community. There’s also a route to the App Store via a Mac-based SDK. So I’m really in a good position to answer this question.
The IDE works with both the on-screen and a Bluetooth keyboard, and I go both routes. At a desk I use my Bluetooth keyboard (it’s a Targus model – pretty nice, though not on a par with the Apple version) and on the train I tend to use the on-screen one. I develop code. Others develop amazing stuff. I can do 100% of the things I need on the iPad… nothing requires an external app at all. The only thing you’d need the desktop for is the App Store, but nothing stops me from sharing the code with others – this doesn’t require the App Store. If there were an IDE that compiled native iOS apps on the iPad, I can’t say I’d ever touch a Mac again for iOS development.
On a side note – on my Nexus 7 I have AIDE, a full IDE and compiler for Android. It creates a fully installable APK and lets one run it on the device natively. Couple that with the Bluetooth keyboard and the experience is actually more pleasant than using a desktop machine and the super-slow Android emulator.
Was it really about Flash, or maybe about fears of confusion with Windows-running netbooks, about software in general? (Also a poor choice of OS for such a machine.)
Kinda like Linux-running netbooks mostly disappeared… (and they did support Flash)
So… the display ~mirrored (with lower fidelity) on a 2nd display… built into the keyboard half, largely also onto the keys themselves? (Yes, you heard it here first! …except that would be really “just” an Optimus keyboard done properly & for a laptop.)
When Apple switched from PowerPC to Intel x86 cores, they did it out of pragmatism. Since they cannot rely on first-party apps alone, they need the performance of third-party apps (which are seldom optimized down to the metal for all their target platforms), so they need at least comparable performance to the PCs sold at the time.
Doing so with their own CPU designs, with the PowerPC chips they were developing together with IBM, would have cost quite a lot of money, and the cost would have been spread over far fewer processors than Intel ships each quarter. Going with Intel was, long term, a very good move for Apple and a pragmatic one to boot.
If they can keep their iOS-based devices selling as well as they do now or better, the possibility of switching from x86 to ARM would IMHO be there only if they could do it with a scale-up strategy: invest in micro-architecture enhancements, use a few cores on iOS devices, and scale up to a very large number of cores on desktops and laptops. Still, they would have to solve the problem of keeping up with Intel on single-thread performance, and help developers achieve performance at least similar to Intel’s desktop CPU designs without jumping through hoops.
This is not so much a HW challenge as a very delicate balance of SW and HW efforts. It might be why Apple has been investing lots of resources in the LLVM project and in initiatives such as GCD, but much more needs to be done.
More: going with PPC was likely itself a mistake – the Apple CEO of the time thinks it was one of his biggest: http://www.macworld.co.uk/mac/news/?newsid=7045
(and Mac OS Classic was running on x86 at the time… http://en.wikipedia.org/wiki/Star_Trek_project )
It’d be quite logical to switch, because ARM is just better in many ways; it’s cleaner, simpler, needs less power, etc.
The problem [especially for big corps like Apple] is that ARM is constantly evolving. Software projects like Debian GNU/Linux can just write new code to work on a new ARM hardware specification; especially with open source / free software that is just possible, and there’s usually no ‘money’ obstacle there.
When it comes to a hardware+software vendor, things get much more complicated. Not only would they have to keep their OS constantly available for an ever-changing ARM spec, they would also need to update their hardware every now and then. That doesn’t make for a reliable product, and a reliable product is surely what Apple wants.
P.S. Thom, in a fast-changing world I don’t think we have any chance of predicting what’s going to happen in the next 5 years, not to mention 10. That is pure insanity. ARM could be replaced with another, even better HW platform. I’d be VERY careful about making any predictions about the future.
So you predict that most future predictions will turn out false? ;P
Apple has an in-house ARM design group
….—> more profit
….—> better control
….—> custom designs possible
……..—> lock-in (like iP[od|ad|hone] or consoles) could be integrated in HW
……..—> services like remote system lock via network could be integrated in HW
……..—> custom coprocessors could be integrated
I hope not. As we learned from the PPC->x86 transition, a change of arch is a pain in the rear. Suddenly all your programs need an emulator, or need to be “universal binaries”, and eventually they stop working altogether. Ask the people who had to abandon perfectly fine programs or skip upgrading to Lion.
Seven years after the Intel transition, the “healing” process is far from complete. Why start such a nightmare all over again? And probably lose many users in the process (certainly most of those who remember the previous transition).
By most measures, 7 years is long enough to migrate. It’s not like it was a secret. Also, Snow Leopard and Leopard didn’t just stop working, so if one needs a legacy app, just use the legacy OS; if you need a legacy OS, just install one. As an example, my late-2011 Mac Mini that shipped with Lion happily boots my “hacked”** version of Snow Leopard. SL isn’t officially supported, but it works.
** Added a few drivers and changed a couple of plists. It wasn’t hard and it runs like a champ.
A guy at Bloomberg has a friend who wants to make some money for Christmas presents by shorting some Intel stock. He takes an Apple guy and pins him down with loads of loaded questions.
The truth is it’s more likely that Apple forgoes the whole Mac line, or puts Itanium in iPods, than goes through the (utterly pointless) CPU-arch-jump hurdle again.
Intel is much closer to reaching ARM in performance per watt than ARM is to closing in on Intel’s raw single-threaded efficiency.