I thought OSAlert would be a good forum to talk about a matter that has been weighing on my mind, primarily because the site has been so focused on Apple’s patents and litigation of late. The news that HP, the largest PC manufacturer in the world, is spinning off or getting out of that business is what really prompted me to write this article.
Those of us using PCs (the generic word for a computer that is not a Macintosh) see Apple’s actions as going against all that we stand for with regard to personal computing.
Software patents in particular are regarded by a good many in these forums as the epitome of all that is wrong with this industry. For Apple to be suing companies (at least partially) on the basis of such patents certainly explains the light in which some tech news sites cast the company.
That there is still so much bile strewn about when discussing computing platforms, particularly when Apple is referenced, has much to do with passions and preferences, but more to do with computing ideals: “open” versus “integrated.”
Two Sides To The Story
There are two sides to this story, and I’m saying all this to show that I understand the position most often taken on these forums. However, I think it important to present the less often regarded position here, as I so rarely see it.
The PC might best be understood as a Robin Hood story. You might consider it a bizarre analogy, but if you think about it, just like Robin Hood, the PC took from the haves and gave to the have-nots. That the PC managed to grow so quickly is the result of many factors, but they all hinged on a single event that changed the course of history: Microsoft’s theft from Apple.
The Haves And The Have-Nots
Microsoft was given access to the Macintosh early on and made its own copy of Apple’s operating system. In addition to user interface concepts, Microsoft incorporated actual code from Apple into their software. When Microsoft sold their product, they created a business model for countless PC manufacturers to build upon, which in turn created greater purchasing power through the sheer number of companies buying the PC’s standardized components.
It’s true, the result of that action did populate the world with personal computers, created jobs in countless industries and helped perpetuate the growth of other fantastic technologies, all of which certainly explains the vehement defense of “open” architectures. What you don’t hear in that story is that Apple’s platform might otherwise have achieved all of these goals and noteworthy achievements, and allowed the company to see massive returns on its investment, had its intended business trajectory been allowed to run its course without Microsoft playing the role of Robin Hood.
The immediate response to that argument is that Apple would have kept prices high thus stifling potential growth. The “proof” presented to support that argument is that Apple had higher prices during the 90s when the transition away from Macs was taking place.
The retort to that argument is that the diminished scale Apple had after the theft made it impossible to offer equal pricing without selling exclusively to high-end markets.
One need only point to Apple’s ability to out-scale the competition from PC manufacturers on products like the iPod, MacBook Air and iPad, examples where Apple’s larger scale drove down prices and demonstrated that high prices aren’t part of Apple’s internal culture, as is often claimed in PC-centric forums.
Another Perspective
When you look at it from Apple’s perspective, the (near) entirety of the PC industry was theirs. They were the ones that would have been owners of 95% of the market, and not 5% as is the case now. It was a badly written contract and weak software patent protection that cost them their dominant position in PC computing’s history and allowed others to benefit from their innovations and first-mover advantage.
PC aficionados’ dislike of software patents, and thus of Apple’s leveraging of them, was formed originally as a result of Microsoft’s initial theft. Why wouldn’t they argue that point, though? They are amongst the many beneficiaries of the theft.
In the same way, I’m sure Robin Hood’s beneficiaries (the have-nots) would have argued just as vehemently against the haves going after their stolen money and implementing greater security at the same time. I’m sure they would have argued just the same: “but look at all the good we’ve done with this stolen money!”
Lead, Follow, Or Get Out Of The Way
Competition is great, but if you can’t compete without the benefit of a business model created by way of stolen goods, then perhaps you should sell your company to someone who thinks they can, or better yet, simply quit that business and compete in other markets. IBM understood that; so did Amiga, Compaq and Gateway, and now so does HP, as Apple becomes the leader (or at least a leader) in the post-PC era.
I had always wondered whether Apple’s computers would have grown as quickly as the PC market did if allowed to run their intended course without Microsoft playing the role of Robin Hood. The iPod and iPad are both perfect barometers for answering that question.
Can we really blame Apple for not wanting the past to repeat itself? Similarly, can anyone legitimately argue against them without casting themselves as a champion for Robin Hood’s methodology?
Apple are going to fight tooth and nail to avoid a repeat of the past!
I can’t help wondering now: if Microsoft hadn’t stolen from Apple, then we might still have our Amigas, Ataris, Archimedes machines and even OS/2 machines…
Microsoft’s own success was the result of the competition being so poorly run at the time – in many cases it was Microsoft who was in second place and only pulled ahead because of good luck combined with mistakes by the competition. We could have been running IRIX or Sun workstations, but it didn’t happen because both organisations were hell-bent on gouging the customer for every cent they had, and more in some cases. Apple as of 1997 were charging an entry price of $4500-5000 for their computers when a mid-range PC in NZ at that time was around $2000. The Amiga was basically killed by the PC side of Commodore, who saw no benefit in dedicating some resources to transforming Workbench into AmigaOS and marketing the benefits of their hardware (I’m sure a transition to MIPS or PPC at the same time Apple did theirs would have made things a lot better).
One thing that needs to be taken into account is that Apple is an entirely different beast than it was 15 years ago (heck, it’s an entirely different beast than it was 5 years ago). The differentiating factor between Apple and the rest of the industry is the software, hence the reason they guard the exclusivity of Mac OS X so strongly – it is, after all, at the heart of the Mac just as iOS is at the heart of the i-devices. The software makes the devices what they are, with the hardware merely being a means to an end rather than an end in itself (hence their interest in patents as well – at the end of the day, “hate the game, not the player”, because at the root of it are politicians who lack the backbone to say “no” to lobbyists). Whilst the rest of the industry has been slashing each other’s throats and shipping ‘more of the same’ PCs loaded with Windows, Apple has offered ‘something different’, which makes their products stand out.
I’ve said multiple times that the future of the PC vendor relies either on being, in effect, the hardware manufacturing operation of Microsoft (except owned by its own shareholders), or on PC vendors waking up and realising that the only way to protect and improve their margins is to develop their own operating system. The raw ingredients are out there in FreeBSD and other technologies, but it seems to me that the vast majority of OEMs don’t have the stomach for a long-term investment, hence they go back to the ‘race to the bottom’ that has seen HP and IBM leave (or contemplate leaving) the PC industry.
The reality is that Jobs was a crazy hippy and Gates was the greatest businessman in the world. That is why MS won.
In fact, Jobs probably would have bankrupted Apple if he hadn’t been forced to leave in 1985. The Lisa was a total fiasco, and the Mac was a commercial failure until people discovered desktop publishing.
If you’ve actually read anything about Steve Jobs, you’d know that getting fired was the best thing that could have happened to him (he and his closest friends have said so publicly). Let’s also take into account the fact that Tim Cook is a HUGE moderating factor who counterbalances Steve’s sometimes single-minded focus that blinds him to the larger picture. It is Tim Cook who negotiates and works behind the scenes, grinding down the suppliers, securing supplies through large capital injections into partners, etc.
All your final paragraph showed was Apple being ahead of its time rather than being an all-out flop.
Apple wasn’t “ahead of its time”. The Mac was an underpowered, expensive toy that sold very poorly. It was by pure dumb luck that Apple survived. The credit goes to the laser printer, Aldus and Quark for creating the DTP market. This was not something Apple had even anticipated.
I would also wager he is the driving force behind all of the litigation. He was the first one to mention that Apple was going to start to heavily litigate against those they thought were benefitting from Apple’s innovation. Given his role in the company and a general description of the man, I definitely think he has a lot to do with the current patent wars going on.
There’s only one word that matters and that’s ‘execution.’ Microsoft was really running well in the 1980s and 1990s. Maybe they didn’t come to market with the best product, but they were well positioned and marketed. They exploited network effects and synergies where they could, keeping other companies from building a viable alternative for third-party developers. They weren’t above using a little muscle to keep their licensees and third-party developers in line, either. Apple, by comparison, may have been putting out a better product, but as a company they weren’t executing at the same caliber. Every company makes mistakes, but Microsoft wasn’t making as many, or was at least making more manageable mistakes.
As a concrete example, Apple’s product lineup was a disaster. Before Jobs came back you couldn’t tell whether one machine was a better deal because it had a faster CPU, or another because it had slightly better graphics and bundled software. WordPerfect, another example, botched the Windows transition with some pretty shoddy products. I was actually part of a rollback from WordPerfect 6.0 back to 5.1 because it was so bug-ridden. (And on Windows 3.1 that was a time-consuming process done at the user’s PC.) Not that users who had spent time with the DOS version were that happy with 5.1 for Windows, either. Coming from DOS, the world belonged to Novell, Lotus 1-2-3 and WordPerfect. It was theirs to lose and they lost it.
Now we have a Microsoft that’s fumbling product launch after product launch and tearing itself up from the inside with infighting. Apple, by contrast, is executing well. Maybe not everything they do smells like roses, but they are staking out profitable turf. They are now among the top purchasers of semiconductors (if not the top). They are at a P/E of about 15, which means (relative to their earnings) they are not over-priced. (By comparison: IBM’s P/E is about 14, Oracle 15, SAP 23, Microsoft < 10, HP about 6, which might mean the market does not have great confidence in Microsoft or HP.) At the end of the day it’s mostly about how you run your business. If it were only about quality, the world would look very different than it does today.
And the interesting question is – would that be a good thing?
What would the industry look like if the Microsoft desktop monopoly hadn’t provided a de facto standard, and developers had had to produce apps for all the individual platforms instead of just one that would work on 99% of all desktop computers?
Arguably, MS’ desktop monopoly created further monopolies. It’s unlikely that MS or Adobe would have ported their products to the myriad platforms out there, leaving room for competition.
Hell, it might have even avoided the standards mess. Imagine if Office had to market itself as interoperable in order to compete.
The smartphone industry today?
While Adobe may not have ported to every platform, WordPerfect did – and provided excellent service for them all to boot. Their murder by Microsoft’s questionable business tactics* was a sad chapter in computing history.
Just saying.
(*I’m referring both to their use of undocumented APIs in Windows to prevent Office competitors from achieving feature parity plus the entire OS/2 / Windows bait-and-switch scam, for example.)
The thing Microsoft ‘copied without consent’ from Apple was Video for Windows, the video-decoding system, nothing critical to the OS.
I think the article has a bit of revisionist history going on. If not that, the author is viewing the case through Apple-colored glasses.
The big thing that originally made Macs so great, and Microsoft later copied, was the GUI, driven by mouse clicks.
But guess what. Apple didn’t invent the GUI. They stole it from Kodak, who had developed it in their labs.
But when Apple implemented it, they did so at the high end of pricing. Then IBM turned around and licensed PC technology to OEMs, which initially used DOS (from MS – actually, MS acquired DOS for cheap from someone else), and MS and third-party OEMs made “good enough”, cheap PCs. Then they implemented a GUI (Windows) on top of DOS (again, good enough for the times), did so cheaply, and made it accessible to the majority of consumers.
Yup, Microsoft has most definitely made “good enough” copies of other people’s products for most of its history.
But Apple has never innovated in a vacuum. They’ve always copied other people’s ideas, but took more of the premium, highly refined, higher priced route. Macs, iPods, iPhones – none were first to market. Heck, they didn’t even invent multi-touch. They just made it better/slicker/sexier/easier.
And look at this whole iPad tablet design patent blocking Samsung in the EU. What a joke. There’s an article running at ZDNet today showing the tablets in the original Star Trek (1960s, folks) looking strikingly similar to iPads.
Kudos to Apple for making highly refined, desirable products that are extremely well marketed, and people are willing to pay a premium to get.
But they have never been original innovators.
So they can stuff it when they act like they’re great innovators and deserve complete ownership of markets.
And I’m saying that as an iPhone and iPod owner.
Think you’re thinking about Xerox PARC…
“Can we really blame Apple for not wanting the past to repeat itself? Similarly, can anyone legitimately argue against them without casting themselves as a champion for Robin Hood’s methodology?”
We can certainly understand why Apple is doing what it is doing.
We can, however, blame them for the way they are conducting their business. It is still wrong.
The big assumption in the article is that only Apple is innovating and everyone else is stealing from them. If that really was the case, I would understand their actions.
A factor that may explain the level of aggression from Apple and Steve Jobs in particular is that they believe they “own” the phone market, that it is “their” market and Google is trying to steal “their” money (Steve Jobs said pretty much those exact words to Eric Schmidt in that famous lets-have-some-coffee meeting).
In Reality ™ however, no one owns any markets. And Apple, Google and everyone else copy a large number of ideas from each other. Just the fact that iOS 5 borrows heavily from Android shows that ideas move in both directions. Which sort of justifies what Samsung did: Apple copied a lot of ideas from them/Nokia/Ericsson/Motorola, and they felt they were allowed to copy some stuff back.
—
So to summarize: Apple is not the victim here. Stop pretending that Apple owns the market and everyone else are thieves.
There’s that word, innovate… already you can tell no one’s going to agree on anything past this point.
Apple owns the iPhone market. It is their market. You can make a phone that doesn’t look and work like an iPhone — WebOS, BlackBerry, WP7, and what Nokia’s been doing with touch are all genuinely distinct user interfaces, and BB, Droid, and Pre are all distinct form factors — but Samsung and Google aren’t doing any of that. They’re following the leader. You can argue that it’s legal, but you can’t argue that it’s original.
That’s what the courts are for – to make reality conform to the legal arrangements. Patents are an exchange with the public: the publication of methods in order to secure the market for a time. If you have a patent, you own a market. If someone doesn’t recognize your ownership rights, you take it to court.
An excellent essay on the topic:
http://www.paulgraham.com/softwarepatents.html
Idea, singular. One very useful, very visible feature made its way back from the imitator to the innovator, just like the keynote where Apple (Steve, probably, but I don’t recall) admitted that Windows had something great with fast user switching, so they were going to do it too.
If there was anything patentable about the notification tray, Google has every right to seek licensing fees and/or damages. But would they be so stupid, when almost everything else about Android is copied from the iPhone?
To summarize, everything is equal because you imagine it to be equal with no actual examples or other evidence. Everyone copies, and factors like how much they copy or whether what they were copying was patented at the time don’t matter, because… well, that’s the part I can’t be sure of, but I’m going to guess that you like what anarchy does to prices. On the other hand, I don’t like what it does to progress.
Quoted from what I linked above, an observation that continues to play out with every opinion I encounter: “One thing I feel pretty certain of is that if you’re against software patents, you’re against patents in general.”
Just holding a Samsung Nexus S, an iPhone 4 and a 1st-gen iPod Touch in front of me… and I can’t stop thinking that I can’t tell them apart… (massive sarcasm)
That’s a gross misunderstanding of patents and markets.
Please keep in mind that other software developers have different views. And a lot of them have no less impressive biographies than Paul’s…
Please present a list, so that we can disprove every single point. There are a lot of people here who can quote the originators…
“You might consider it a bizarre analogy but if you think about it, just like Robin Hood, the PC took from the haves and gave to the have-nots. That the PC managed to grow so quickly is the result of many factors but they all hinged on a single event that changed the course of history: Microsoft’s theft from Apple.”
One of those factors is greed and excessive profiteering.
This is a nice article that tries to perpetuate the myth that Apple created desktop computing while Microsoft just stole their ideas. So let’s reverse this Robin Hood metaphor a little.
The iPhone has been (and still is) the most popular smartphone in the world. There isn’t a single brand coming close to its market share. And for a time, the masses using the iPhone lacked a good notification system. Proper notifications were exclusive to Android users. Then Apple, in true Robin Hood fashion, took the idea of Android’s notifications and made it available to the masses using the iPhone.
A lot of companies had their firsts. Without the first mass-produced and relatively successful MP3 players we wouldn’t have the iPod. Without the iPod and the cash it brought to Apple, who knows what would have happened! Point is, Apple wasn’t first here. Oh, and when Win95 hit the shelves, it wiped the floor with Mac OS. Mac OS 7-9 sucked so badly, and was so primitive compared with MS’s offerings, that Apple kept losing market share at a steady pace until Jobs came back and built a system using some parts of NeXT and some parts of FreeBSD (including employing the project’s lead developer). Mac OS X’s success is partly due to the solid Unix foundation it was built upon.
Anyway, I guess our resident Apple fan will have a field day with this, but I don’t mind. Just let them continue basking in the warm rays of the RDF
You just said “without the iPod, we wouldn’t have the iPod.”
I just thought that was kind of funny.
Are you suggesting that Windows 95 was actually more stable than… well, anything at all, really, or that preemptive multitasking and dynamic memory were actually killer features on single-user systems with 100MHz CPUs and 8 megs of RAM? Perhaps both?
Sure enough, Microsoft got onboard with the future (“the future” = 1970’s Unix) years before Apple was able to, but Windows 95 ran away because it managed to commoditize all the user-facing features of the Mac and become the cheap computer that anyone could use (even if no one could get it to work all the time).
I guess your overall thesis was that Apple wasn’t constantly years ahead, and sure enough, there was more than a decade where they weren’t technically competitive. But setting aside the parts that are a copy of Unix, how is Windows 95 not a copy of Mac OS? That nifty Start button?
It’s useless reading beyond that point. You do know that the iPod wasn’t the first MP3 player, and it wasn’t the first mass-market MP3 player either.
I bet the rosy tint of RDF will start fading with time… be prepared for a grey world.
I hope (but I’m not holding my breath) that the publication of this article puts to rest the claim that only one side is presented on OSAlert. I have read so many times in various threads – “write an article on the opposing viewpoint and we will happily publish it.” Here is yet another example.
Editors are human and are allowed to have opinions and even agendas. That’s why they’re called editors and not encyclopedias or robots. And don’t believe anyone who tells you that there is such a thing as objective news reporting. Even the choice of words can place a subtle spin on the “reporting of facts.” But on the whole I believe that anyone without an agenda of their own would have to admit that OSAlert does a pretty good job of being fair.
People that claim otherwise… I don’t believe in telling people to shut up, so by all means continue complaining if you wish. That is your right. But recognize that you are only making yourself look foolish by doing so.
Nice article.
It is worth remembering that Steve Jobs was ejected from Apple very shortly after the Mac was launched. The people who ran Apple after Jobs left were, in retrospect, not up to the job; they didn’t understand how the new PC industry was developing and they made many poor decisions. Having said that, Apple still managed to do OK for a long time after Jobs left, but by the time Windows 95 came out the poor decisions at Apple seemed to be multiplying, the internal management chaos was escalating, and with Windows 95 it could plausibly be argued that the Mac had lost its claim to being special. As a result Apple nearly died.
It is also worth remembering that the Steve Jobs of 1985, when he was dumped out of Apple, was not the man he was in 1997. Jobs in 1985 would have made just as many mistakes, and may have made bigger ones, and if he had stayed at Apple, he might have killed it. He utterly failed to develop and sell his vision of where Apple should be going with the Mac, and his early huge success at such a young age meant he lacked the sort of leadership and management skills that come with maturity. What transformed Jobs was watching Apple, NeXT and then Pixar all go through near-death experiences. It was Jobs’ failures during his long absence from Apple, and Apple’s failure by proxy, that, I think, taught Jobs an immense amount. When he returned to Apple he knew where he wanted to take it, what sort of organisation he wanted to build and how to lead it; it was a 15-year strategy and it was almost pitch-perfect in its implementation.
One of the many lessons he learned was about brand identity, protecting design and intellectual property, and how to innovate. Given Apple’s terrible experiences back in the 1990s it is obvious that Apple will never let that sort of stuff happen again.
that’s the problem I have with Apple
everyone tells me how innovative Apple is, and what they invented
but all I see is a company that excels at copying other ideas, polishing them to a mirror shine and selling them
But there is a lot to be said for polish
I think Apple does “innovate”, but not as much as we (the Apple fanbois) would like to admit.
I think what Apple does well is take existing ideas (and truth be told, most ideas rest on the shoulders of others) and take them to a place few thought of or dared to go.
The PC industry in general tends to be conservative, Apple (and some others) dare to push the envelope, even if just a little at a time.
Love Jobs or hate him, he has changed the IT world, your Android phone would be a much different beast without Apple (see Android back in Nov 2007 (10 months after the iPhone was first revealed)) http://www.engadget.com/2007/11/12/a-visual-tour-of-androids-ui/
Maybe without Apple the IT world would be better, who knows, but it would certainly be different
Don’t let your fellow fanbois (you said it – not me) hear you saying this… they might lynch you with the cord of a one button mouse!
THIS is why many, many people believe that Apple, as a company and the individuals that represent it, are being EXTREMELY hypocritical as of late. And many of us really don’t like hypocrisy – it jars us. It is a double standard to the extreme… when Apple uses the ideas of others it is “adding polish” and “being bold.” When others do the same, it is “stealing” and “unfair.”
My open question to people that defend this is – “Do you really not see why this makes people angry?”
And before someone chimes in with legal technicalities in specific cases – save your breath. I’m referencing the larger picture and the perception. If anyone would care to address the disparity I’m all ears.
Well, if only those other companies ripped off Apple AND added polish of their own. Most copies (like tablets and phones) are crap, and the ones that aren’t crap are only good because they’re very good copies.
Sure they add some features an iPhone/iPad doesn’t have, but the overall user experience is less than on an iDevice.
Apple’s stuff is part of a general plan, while other companies just try to bring something to market that sells and when it doesn’t sell as well they abandon it.
WP7 and webOS are examples that it is possible to do your own thing and not copy Apple.
Oh please… webOS borrowed more ideas from iOS than Android ever did.
WP7 is the only mobile OS that actually does something radically different.
That’s it! I think Apple do a fantastic job of polishing their products and providing a better experience because of it. It’s not anything truly groundbreaking, but it is development and integration you simply don’t see anywhere else in the industry. I can see why Apple are suing left, right and centre, but I don’t think it’s right. If something is a blatant rip-off, then yes, but they’re not really. They’re similar, but the user experience is different. The competitors seem to be lacking some of the Apple polish, and that being the case, let them. Let people decide between the cheaper tablet with some extra features but a slightly more disjointed experience, or pay extra for the more robust experience and the shininess.
The analogy that springs to mind is pens (since I have one in my hand) – I have a rather nice Cross pen which is lovely to write with, looks the part and is reliable. However the pen in my hand right now is a 20 cent Bic biro which feels cheap, doesn’t write as well, but gets the job done just fine. There’s space in the market for both, and people will always buy both for various reasons, be it snobbery, appreciation of that extra polish, financial considerations or whatever.
Since you brought it up….
When I help older folks with their computers I constantly get asked, “Do I click the left button or the right?”. I never get asked that by my grandmother though ( Pssst… don’t tell anyone, but she’s got one of those sinister one button mice! )
Hey, I love my 5-button mouse, but some people just need one.
It’s also hypocritical for any Apple fanboy to call himself a technology enthusiast, when they’re really Apple enthusiasts first. Their mentality is fundamentally at-odds with true enthusiasm for technology, in the same way that fundamentalist religious apologists don’t make for very good scientists.
The best recent example was the local Apple fanboy brigade gloating over the discontinuation of WebOS. Real tech enthusiasts don’t gloat over the demise of an interesting technology, only fanboys do that.
I’m continually astonished by the depth and breadth of the anti-Apple trolling about things that didn’t happen, done by boogeymen that don’t exist. You can link to anything that has actually happened on the Internet, but you can’t link to garbage you dreamed up to fuel your own illusions. So which is it: links or no links?
Wow, that’s some persecution complex you have. Better check in your closet & under your bed, there might be anti-Apple trolls hiding there!
Yeah, maybe you should at least wait until after the links have fallen off the first page before you start trying to pretend it didn’t happen.
Should I read it aloud for you too, or do you think you can manage that on your own?
http://www.osnews.com/permalink?486304
And that’s not to mention the 15 or so other nearly-identical comments that Tony posted on that article alone, with the same thinly-veiled gloating.
There are components, and there are arrangements. You invent a component, and you innovate an arrangement. This is why Apple is called innovative: they’re known more for original arrangements than for original components. They’ve invented quite a few things, too: LisaOS introduced pull-down menus and drag-and-drop, while MacPaint originated the paint bucket and lasso select tools. A lot of original ideas come from QuickTime.
Most of the popular anti-Apple comments around here either ignorantly or intentionally conflate invention and innovation, and whether correctly naming it or not, they assign little or no value to innovation. That little value, of course, being monetary: that Apple shouldn’t be entitled to anything besides what the market voluntarily gives it (if even that) for having created the industry’s smartphone template. Of course there’s personal value in Apple’s multitouch inventions and presentational innovations; we all enjoy the benefits of these newfangled phones, whether Apple got anything back from us personally or not.
There is no hypocrisy in buying inventions to build innovations, and then reacting legally when your innovations are copied. Of course there’s Nokia tech in the iPhone: Nokia has been building industry standards for decades, and they have to license them in a fair and non-discriminatory way, no matter how much they want Apple’s technology back in return, because you can’t build a phone — any phone — without them. You can build a phone that doesn’t rip off the iPhone whole, however. Android partners, Samsung especially, simply choose not to.
Innovation is very valuable. It brings inventions to new fields. However, innovations do not deserve state protections on the same level as inventions.
Apple is suing HTC. How is any of the HTC phones an iPhone ripoff?
I think you may be confusing ‘innovate’ with ‘invent’ – they are two very different things. To give you an example: Apple’s retail stores were an innovation by Apple. There are plenty of existing retail stores, and retail stores operated by tech companies, but the Apple stores were innovative for Apple and also innovative in general in how they were operated.
The Apple retail stores are now the most successful retail stores (by value sold per square foot) in any retail sector.
What was not immediately apparent was how these retail stores would be so crucial in the new mobile device tech paradigm. It turns out that access to and control of sales channels is deeply important to the sales of phones and tablets. Android phones could compete against Apple for a range of reasons but one was that there were well established sales channels operated by the carriers already geared up to sell phones. Android tablets have been much less successful (and may never be successful) and one of the reasons is that the same carrier phone stores that helped so much with Android handsets are, it turns out, not very good at selling tablets (for all sorts of reasons that we can explore another time).
So the way to view innovation (particularly in the new and emerging post-pc world) is that it can involve lots of factors, only some of which might involve new hardware components or new software, and that it will often be multidimensional – bringing together disparate but critical factors like value-stacks, brand characteristics, retail operations and channels, customer support, etc etc.
Innovation is complex – that’s why success on the scale of Apple or Google (or in the past Microsoft and Sony) is so rare.
The problem with stretching the term “innovation” that far, is that then almost anything can be called innovative, making it a pretty meaningless word.
For instance, the first version of Windows wasn’t exactly like the original Mac OS. If you count the differences between the two systems (support for overlapping windows, for example) then you could say that Microsoft was innovative too. Maybe their product was worse than the one it was inspired by, but “innovation” and “quality” are two completely different concepts.
So I more or less agree with the OP. I think “perfectionist” would be a much more accurate description of Apple than “innovative”.
Apple borrowing from Android is not like Robin Hood stealing from the rich – more like a robber baron pick-pocketing a young entrepreneur.
I think the post describing developing your own operating system as the way to differentiate your product is correct. BTW, where would the PC industry be now without Microsoft/Windows? Would 3D graphics be anywhere near as capable as they currently are? The cheap consumer-grade 3D market was made possible when Microsoft killed SGI and Nvidia/ATi came into being. Would the same level of competition have been possible in an Apple-dominated market? Or would Apple have simply chosen a winner, and then that manufacturer would have kept prices high and performance low?
You’ll find that Apple benefited substantially from the work of Xerox PARC. In my college days, we had one of the Xerox machines on site, and the similarities were unmistakable.
And regarding software patents, many here may remember that the US patent office refused to grant patents on software – it was subject to copyright law instead. Thus there was a legal recourse for copying source code, but not for an independent clean-room effort that yielded the same results.
Considering the relationship between software and mathematics, I wonder how we’d all get along if we all had to pay royalties to Newton (or Leibniz) every time we used calculus. They’d probably still have their descendants in court now…
Citation? Keeping in mind architecture differences (16-bit x86 vs 32-bit 68k), I find this highly improbable.
How much of the industry is littered with make-believe stories and innuendo?
IMHO:
At the end of the day, the GUI that made the Mac unique was a Xerox invention. How many operating systems owe their very existence to them? I know Apple people claim they were compensated, but is this really so?
It’s also worth mentioning that at the same time the Mac and PC were originally battling it out, the Amiga’s operating system and user interface were light-years ahead of either of them. Never mind the hardware specs.
So what exactly is the argument presented in this article? If they had begun suing much earlier, then Apple (and therefore all of us) would be much better off? Or is it just a tale of bitterness and/or morality?
More like wishful thinking to be honest…
This smells of second-hand information among Apple fanboys. Microsoft was convicted of using Apple code. It was video-for-windows which contained Quicktime code. So there is a grain of truth, but the conclusion is still non sequitur.
I am not sure why this is included in the article, but I suspect the article is fan-fiction.
Considering the author, we’re lucky it’s this mild.
So why did you let it get posted? It’s your site; I’d think you would want to protect your image and reputation and not let people post stuff you think is subpar. Instead of posting snarky comments on the side, how about you just remove the article or something?
I’m assuming you are the editor of this site. The responsibility falls on your shoulders.
So tell us, Thom, whose clone is the author? I never bothered to count, but I think we have about 4-5 well-known Apple fanboys and their numerous clones. Is mrshasbeen back again? Has he ever left? These people seem so similar (the same message: Apple was FIRST with everything, and everybody is stealing their ideas).
And that’s the reason Win32’s calling convention is PASCAL: the original Mac toolbox was written in Pascal:
http://hintsforums.macworld.com/archive/index.php/t-97075.html
Except it’s not… Win32’s calling convention is stdcall and always has been.
Maybe you meant the 16-bit Windows API? Yes, that used the pascal calling convention, but so did OS/2 and virtually every other 16-bit API on x86 at the time, and it certainly was not because of the ridiculous notion that they were trying to be compatible with the Mac toolbox. Besides, Windows 1.0 was written in a mixture of assembly and C – Microsoft almost never used Pascal for development.
Hint: using the pascal calling convention has nothing at all to do with the Pascal language or compatibility with it – it was used because it was slightly faster than cdecl and, more importantly, it saved a few bytes of overhead relative to cdecl at every call point. The only thing you really lose is the ability to have variable or optional parameters, but for most well-designed APIs optional parameters are something you actively avoid.
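For anyone who wants to see this concretely, here is a minimal C sketch (a hypothetical example using the Microsoft-style __cdecl and __stdcall keywords; spellings vary by compiler, and the old 16-bit pascal convention behaves like stdcall for the purpose of stack cleanup):

    #include <stdio.h>

    /* cdecl: the CALLER pops the arguments after each call returns.
       Cleanup code is emitted at every call site, but variadic
       functions (like printf) remain possible, because only the
       caller knows how many arguments it pushed. */
    int __cdecl add_cdecl(int a, int b)
    {
        return a + b;
    }

    /* stdcall (pascal-like): the CALLEE pops its own arguments,
       e.g. with a "ret 8" on x86. The cleanup exists once, inside
       the function, rather than at every call site (the per-call
       byte saving mentioned above), but variadic functions are
       ruled out. */
    int __stdcall add_stdcall(int a, int b)
    {
        return a + b;
    }

    int main(void)
    {
        printf("%d %d\n", add_cdecl(2, 3), add_stdcall(2, 3));
        return 0;
    }

Both functions return the same value; the difference only shows up in the generated call and return sequences, which is exactly why the choice came down to size and speed rather than language compatibility.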
Exactly. The iFanboys are fond of trotting out that “smartphones before and after the iPhone” comparison – so I’ve decided to put together a similar comparison.
Apple’s UI before Xerox:
http://www.deater.net/weave/vmwprod/apple/wavy_apple.gif
Xerox’s GUI:
http://netdna.webdesignerdepot.com/uploads/2009/03/xerox-8010-star….
Apple’s GUI after Xerox:
http://lowendmac.com/hodges/07/art0227/lisa9.gif
Erm… I think Xerox was compensated by Apple. Here are some quotes/sources:
“In return for the right to buy US$1,000,000 of pre-IPO stock, Xerox granted Apple Computer three days access to the PARC facilities. After visiting PARC, they came away with new ideas that would complete the foundation for Apple Computer’s first GUI computer, the Apple Lisa.[6][7][8][9] ”
Source: http://en.wikipedia.org/wiki/History_of_Apple_Inc.
“Xerox could have owned the PC revolution, but instead it sat on the technology for years. Then, in exchange for the opportunity to invest in a hot new pre-IPO start-up called “Apple,” the Xerox PARC commandos were forced — under protest — to give Apple’s engineers a tour and a demonstration of their work. The result was the Apple Macintosh, which Microsoft later copied to create Windows.”
Source: http://www.fool.com/news/foth/2000/foth000918.htm
Microsoft already owned the PC market when the Macintosh was released. Has the author ever heard of DOS? Or looked at a timeline of the 1980s computer industry?
Windows was created in response to Visi On (http://toastytech.com/guis/vision.html) which was a clone of the Xerox environment that was demoed at Comdex several months before the Apple Lisa was released.
Please go back to computer history school.
Your entire post is innuendo, but let’s just answer your question.
Yes.
It is not a humble opinion, but a matter of public record that Xerox invented much of what became Lisa and the original Mac, and subsequently the entire popular concept of a “computer”. Apple did a lot of work in enhancing the UI and creating a consumer-grade machine powerful enough to run it, but a lot of the original R&D owes to Xerox.
Steve Jobs and Apple engineers were specifically invited to tour PARC and see their inventions. In exchange for early access to Apple stock at a reduced price, Apple got to make products based on PARC concepts. Xerox invested itself in a company capable of delivering products based on its inventions, in order to share in any resultant success.
Two crucial points. One, the arrangement was Xerox’s idea. Two, it was an offer exclusive to Apple.
http://www.fool.com/news/foth/2000/foth000918.htm
http://www.macworld.com/article/50115/2006/03/30timeline.html
Microsoft’s Windows is a copy of an adaptation. They got early access to Mac technology — including some of the GUI code — to help them make an office suite for the new computers. Meanwhile, they made a clone behind Apple’s back. Apple, apparently not considering it a threat, settled over Windows 1.0 with a license covering most of the GUI technologies Microsoft had cloned. When Microsoft continued to develop Windows into something much more Mac-like, Apple brought it to court, where it was determined that either their license covered what Microsoft was doing (the license was foolishly written to be perpetual), or what had started on the Mac and moved to Windows was not subject to copyright.
And here we are now, waiting to see if patents hold any more water than copyrights did.
Why is that worth mentioning? MacOS was already ahead of Windows at the time, so neither excellence nor originality won out, whether you take Amiga into consideration or not. Are you suggesting that if Apple had been more consistently and aggressively litigious, Amiga might not be around today? (Perish the thought.)
The argument, as I interpret it, is that Apple got shafted and knows enough to try not to get shafted again by whatever legal means it can muster. Having done some reading to respond to you, one interesting loophole I noticed is that UI elements weren’t found to be out of scope of copyright per se, just that none of the ones Microsoft was using unlicensed were covered, nor was the overall innovative arrangement (“look and feel”). Since Apple didn’t license the iOS interface to Google, they may even still have copyright ammunition despite the Look and Feel outcome.
Ouch … Must have touched a nerve.
I have to try and focus my reply, so excuse my snipping:
So the idea of a “WIMP” GUI is *exclusive* to Apple? Presumably through patent or copyright transfer? Your provided links don’t seem to support this idea. Where can I find more information on this? (Anyone?) A solid reference would be nice.
Do you support the idea that Microsoft copied code from Apple?
Yes, it’s worth mentioning, because this story is not really just about the PC and the Mac. There were numerous other players at the time, with dozens of different user interfaces; some, I’m sure, were influenced by the Mac, others by Xerox, and many had plenty of original ideas of their own (see the Amiga’s “screen” concept).
The fact that Windows was hardly even used at the time (certainly until v3) means that the Mac’s lack of “world domination” was nothing to do with any perceived wrongdoing by Microsoft, but rather with the more open and performant platform that the PC was.
IMHO: Apple make nice products for sure, but the fact is, without them we’d still have modern GUIs, we’d still have tablets and smartphones, and we’d still have portable music and video players. They are not a beautiful and unique snowflake.
Are Apple and its fans so immersed in the cult of victimhood that they fail to recognise that they didn’t get “shafted”? They were out-competed by products which performed better and cost less. The success they have now is because they have a good product that seems to fill a (much-hyped) niche. Who will they blame if (or when) this too comes to an end?
I meant the innuendo comment as literally as possible, that the original post didn’t contain solid facts or assertions, but rather insinuated an alternate version of events with the use of rhetorical questions. I take the “ouch” to mean you thought otherwise.
And I appreciate pointed, essay-style replies; the line-by-line Usenet style arguing got old for me years ago, so snip all you like.
My point in emphasizing those details was to dispel the various myths that Apple and Microsoft both copied Xerox in much the same manner. One fable has it that Bill Gates was also invited to PARC, which is false — that’s the part that was exclusive to Apple, in exchange for the stock purchase. Other versions say both Apple and Microsoft copied the Alto, which is also false: Xerox didn’t have an actual product before it got Apple involved. Circumstantial evidence doesn’t allow for that, either. Apple and Microsoft somehow made exactly the same enhancements to PARC’s desktop system after taking the same cursory glance at a finished product?
At the time, Apple hadn’t thought of using the patent system to protect its software innovations, and the court threw out their attempts at copyright claims, so exclusivity at least didn’t end up being a legal right. It’s the access arrangement that was exclusive to Apple: tech tours for stock. What happened after that is history.
It’s my understanding that Microsoft had access to some of Apple’s GUI code during the development of Office, and I believe they made use of it, but I don’t know whether it was ever proven. Years later they set up a deal with Canyon, then in possession of Apple’s Quicktime code, to develop Video for Windows, so I believe they (again) used Apple code to make unauthorized, competing products.
I don’t follow the reasoning here, or what you mean by “performant.” Windows was always the more “open” (a better word would be flexible) platform; it only took off once it was mature, i.e. once it approached Mac usability.
For the record, I’m not proposing that Apple would have had the entire PC market as it was if not for Microsoft’s disruption; my interest was only in setting the record straight on who innovated and/or stole what, and what a company in Apple’s position might be able to do to prevent defeat at the hands of imitators. Patent protection for software hadn’t been conceived of at the time; now filings are everywhere, and it’s coming time to actually test them in court.
That’s speculation, not fact, and I counterspeculate that they would look and work more like CDE, UMPC, Blackberry, and Creative Zen respectively. Apple repeatedly puts the most work into an elegant design that the rest of the industry treats as a free template, originally with the Mac but especially now with iPhone and iPad knockoffs running Android. That they’re trying to fight that is no vice, and if they win, it will be a positive precedent for originality, much needed after WebOS’s failure. Lazy copying makes the world stand still.
What do you call it when your work is used against you, leaving you with no legal recourse? I call it shafted.
Today’s Apple is both more prolific and more aggressively plagiarized. Time will tell whether the courts have anything to say about the latter, but the former is what gives them a future.
Ah see, but being shafted is all in the eye of the beholder. You don’t like people borrowing from Apple’s designs, and yet Apple has borrowed quite a bit from PARC, from BSD, from Mach, from KHTML, etc.
Now legally they’re not obligated to give things back to these people who did the real work for them, but why after all this borrowing by Apple, do they not have a greater appreciation of the fact that software design has historically been shared amongst many different entities (despite Gates and Jobs claims to the contrary)? Why are they so quick to resort to litigation for even the smallest thing (say like an image viewer on a tablet that looks like their own)? They seem to be operating on the pirate’s code:
“Take what you can. Give nothing back.” – Jack Sparrow
Ironic invocation of piracy.
I don’t know what it is about software that makes people so blasé about the difference between legal and illegal copying. I mean, when an end user breaks an EULA or pirates something they never could have afforded, then yeah, whatever; but there are serious financial and other consequences with this sort of thing on a corporate scale, like Psystar’s end run around Apple’s R&D budget, which the courts determined to be a counterfeiting operation.
Apple is in compliance with the BSD license, and they were specifically authorized to commercialize PARC’s technology, so what do they owe to anyone that they’re not giving? I suppose I’d concede that BSD also got “shafted” by Microsoft (allegedly?) lifting their networking code and Apple their entire userspace in order to make products that compete with the original over users, developers, and other mindshare, but it was always BSD’s developers’ choice to reserve so few rights.
Xerox could be said to have been shafted as well, having all but given away the future of computing, but again, their choice. And as for whether everyone or just Apple should have access to PARC’s ideas, I guess that should have been up to Xerox, depending on the letter of their arrangement, but I’d bet money they were against it: they tried to sue Apple over… I’m not really clear on that; I think they wanted to collect back whatever Apple collected from any and all infringers, but in any case it was all dismissed, and anarchy ensued; cheap, ubiquitous anarchy.
I specifically said they were not legally obligated to give anything back, and I won’t even be so silly as to say “morally”, but how about just for the sake of being reasonable? As in, “Hey, FreeBSD project, thanks for all the hard work, and thanks for Jordan Hubbard. Howzabout we help you guys out in return just because we do appreciate open source and don’t call you a virus like Microsoft?”
As for the copying thing, did you really think that all of these Samsung devices should be banned all across Europe just because the image viewer app looks like Apple’s? If they make a device and looks and works exactly like the iPad, then by all means, sue them, but over something small like this? Really?
This attitude of “we are not legally obligated to give anything to the FreeBSD, but Samsung has legally infringed our IP if ever so slightly” is part of what’s wrong with the software industry. Being overly concerned with using the law as a weapon, rather than (godforbid) trying to foster a more relaxed business climate that doesn’t make a mockery of the legal system. You speak of anarchy, but it’s Apple and Microsoft that have started us down this path unnecessarily.
I should really let this thread die, but you know, emergency: someone is wrong on the Internet, and in fact I think this is the post I most thoroughly disagree with.
Starting with a point of information: Microsoft (and really all commercial tech houses) call the GPL specifically viral, not all open source in general, and they have a point. Depending on various degrees of definition and interpretation, even something as minor as dynamic linking counts as making a derivative work, so you can’t redistribute any code you write that depends on anything covered under the GPL unless your entire project is under the GPL. Commercial devs also have to worry that if any of their programmers are dabbling in GPL, they may be accused of incorporating copyrighted GPL code, without having any plausible deniability. So yeah, there are perfectly valid reasons to avoid GPL code as “viral.”
And why not call it a moral obligation, if that’s what you believe? OSAlert comments rarely look at the world as it is so much as how the author thinks it should be, and morals are a fine concept for that kind of discussion.
Anyway, I googled “Apple contributions to BSD” (without quotes), and the first two links were very interesting.
http://lists.freebsd.org/pipermail/freebsd-chat/2004-July/002483.ht…
This one rightly points out that Darwin essentially “gives back” everything they took by publicizing Apple’s modifications to BSD’s code (which they are not legally compelled to do). The second one contends that Apple is responsible for rekindling progress in FreeBSD, though the author’s reasoning relies on a lot of undated, unsourced correlation and gut feeling.
http://www.trollaxor.com/2004/02/thank-apple-for-freebsd.html
The fact that both top results are from 2004 suggests that it’s not exactly a heated controversy these days.
Why not? It’s in violation of a patent that the courts have decided is valid. The alternative is to say “OK, that’s decided, now run along and go on being illegal.” You could argue that Samsung should have a window to correct the problem before any injunction takes effect, but I could counter that Samsung has already profited from a window of consequence-free sales of an infringing product, so this outcome is perfectly equitable.
I think you’re confusing anarchy with pandemonium, though the two often hang out. Anarchy is the absence of rule of law, and in tech, it’s what gives us mountains of cheap Chinese equipment, because IBM and then Apple and Xerox and now Apple again could not legally protect their inventions from cloning. If I read you correctly, you’re specifically promoting anarchy. That’s fine; just own it.
Patent trolls are more the pandemonium side of things, and the law will slowly adjust itself to quell it. Yes, Apple and Microsoft bear some responsibility for creating the legal environment opportunists are presently taking advantage of, but they did it to fend off a different kind of opportunist.
I asserted that Apple took ideas from Xerox. This is not innuendo by any stretch of the imagination. Aspects of the story you’ve told so far are still up for debate, see below.
I don’t think Windows looked or behaved anything like an Apple product at the time.
http://www.fortunecity.com/marina/reach/435/windows.htm
http://internet.ls-la.net/ms-evolution/windows-1.01/
http://www.vectronicsappleworld.com/macintosh/lisagui.html
http://nd.edu/~jvanderk/sysone/
I’ve used them both too…
Does a tech tour mean exclusive rights? This is what I mean by innuendo. You’ve implied repeatedly that this deal gives them certain privileges in the realm of ideas. You also state a court threw out their attempts to assert copyright – does this tell you anything?
So we’re speculating now? I thought you had the facts.
My initial assertion still rings true: Any code copied would have to be *really* high level due to architecture differences. I was a 68k assembly coder and I can assure you that any transition of code (tailored to the 68k) to x86 16-bit real mode would be a nightmare. It would be far easier starting from scratch.
There is a great blog by a fella named “Dad Hacker” and he talks about the same trouble Atari had when porting GEM from DEC to the then-new ST. The original GEM was x86 based.
But don’t let that stop you from believing it by any means…
The Mac was slow. The interface was slow. The display was a tiny black & white CRT. Their processors might have been comparable in clock speeds and MIPS, but PCs generally ran rings around Macs. The PC had crappy graphics, but the display was clear in text mode. That’s why you bought a PC, it was cheaper and it did the job. That’s coming from a former Amiga owner!
That’s the damn problem – they didn’t innovate in this circumstance, did they? They used existing ideas, like everyone else! I’m fighting the thought-process that this could be considered “innovation.”
The original article takes this to another level and plays the “what if” card, and then tries to justify Apple’s modern bad behaviour.
It is a fact. An absolute FACT – those technologies exist absolutely independently of Apple. That cross-pollination of ideas exists in the market no one denies. I’m absolutely sure Apple’s ideas have influenced others. And equally sure Apple have looked at other technology and said “sure, we’ll borrow that” (see this whole thread.)
Outrageous! When Apple copy, it’s innovation, when others do it then they are lazy. Do you work for Apple? Got Apple shares?
And it damned sure is a VICE when they are trying to manipulate court systems to get their own way at the expense of others (see the recent Samsung experience.)
I think we have entirely different world-views (at least on this matter.) I think Apple has innovated and I think they have lifted ideas. They have made nice products and they continue to make nice products. I think some people are jumping through mental hoops to try and justify Apple’s recent litigious behaviour, by viewing history through what I would call an “Apple distortion field.” Almost anything good about computer interfaces seems to be attributed to Apple, but I disagree with that. It would be nice if Apple could compete on merit alone and thereby recognise their own dependence upon “borrowed” ideas.
“Jobs was so struck by the power inherent to the PARC that he offered Xerox the opportunity to invest a million dollars in Apple computer if the company would agree to let him and his Lisa team study Alto. Xerox felt that it had nothing to lose. After all, they couldn’t sell it. They did not believe the world was ready for the advanced PARC technologies. Apple was about to go public and Xerox’s investment branch, Xerox Development Corporation, sensed an opportunity to turn a quick profit. Xerox invested $1 million in Apple by purchasing 100,000 shares at $10 each. Furthermore, Xerox signed an agreement with Apple to never purchase more than 5 percent of Apple’s outstanding shares. Within a year, these shares split into 800,000 worth $17.6 million when Apple went public.”
Source:
http://vectronicsappleworld.com/macintosh/creation.html
http://en.wikipedia.org/wiki/History_of_Apple_Inc.
“Apple listed 189 GUI elements; the court decided that 179 of these elements had been licensed to Microsoft in the Windows 1.0 agreement and most of the remaining 10 elements were not copyrightable – either they were unoriginal to Apple, or they were the only possible way of expressing a particular idea.”
Source:
http://en.wikipedia.org/wiki/Apple_Computer,_Inc._v._Microsoft_Corp…
Thank you for the informative reply, that answers my question.
I hope we’re done with this topic now!
Based on the history of other technologies, I can make an educated guess that we would not have stayed with “CDE, UMPC, Blackberry, and Creative Zen”. Otherwise it’s not the Creative Zen we would have been stuck with, it would be a portable wind-up vinyl player (since you imply that we would stay at the level of the first invention).
Apple isn’t the only one that can dream up stuff.
http://adaptivepath.com/ideas/aurora-concept-video-part-1
http://www.dontclick.it/
This is the most arrogant, poorly researched editorial I have ever read from you. You are blaming Microsoft for something that was inevitable because of the openness of the IBM architecture and the work that took place in the industry to reverse engineer IBM’s hardware, creating the open architecture upon which the PC industry was built. Of course, Microsoft’s early work with IBM and its strategic licensing deal and clauses allowed the company to license PC-DOS to IBM and MS-DOS to anyone else. A recent article on PCMag even detailed why IBM chose not to use the Motorola CPU, which lacked certain 16-bit features that were already being delivered by the Intel 8086.
Also remember that Bill Gates was also exposed to the Xerox Alto and Xerox Star, which both had early GUIs that both Microsoft and Apple copied ideas from. An employee who worked on Windows 1.0 said Bill even bought a Xerox Alto for employees to play with and get an idea of what he wanted to achieve. To say that Microsoft stole from Apple is very one-sided, and I guess you are suffering from the high of the Steve Jobs resignation, or trying in some way to suck up to get his attention through this four-paragraph drivel of yours.
The PC would have been popular no matter what, period. Let’s not forget that IBM would have been the bad guy in the 80s; there were other platforms that existed before and after the IBM PC, such as the Altair, Commodore and Apple. Some even existed before the Apple I. So it is important to note that Apple was in this for a long time; they were only one of many players in this business. Even some of Steve Jobs’ decisions gave the brand a bad rap, such as the lack of a fan in the Apple II (though I understand the reason behind this was Steve Jobs’ hearing loss). The GUI was inevitable across all architectures and platforms. If Microsoft had not succeeded, Linux probably would have instead, so Apple would still have had a 5% market share either way.
Also, remember that the Macintosh wasn’t that popular either; it didn’t have many applications at its debut, and it wasn’t until the LaserWriter, PostScript, Aldus PageMaker and QuarkXPress that it gained some attention. Even then, it still did not get the mass appeal and adoption the PC did. Windows didn’t get programs like Photoshop until version 3.0, or PowerPoint until 1990. So Apple basically decided to make things this way. John Sculley even admitted they were at fault and should have followed the rest of the industry and migrated to Intel x86, and there were plans for this – remember Star Trek?
Just to be safe: I did not write this.
That mistake tends to be made.
True… But you did let it through right? I’m not in any way implying you shouldn’t have – it is certainly an entertaining read. I just wish someone who didn’t sound like a total loon had taken a crack at defending Apple instead of this kind of whiny “Microsoft stole our lunch money” revisionism that really does Apple no justice – that whole affair was eons ago and has nothing to do with the Apple of today. Why whine over the Apple that might have been when you have the Apple of today to be proud of? I don’t get it at all…
Come on now, the author is someone who goes by the name UMAD – obvious troll is obvious right?
I stopped reading 1/3 of the way through…
Touché
I’m not that interested in Apple-or-Microsoft sorts of things, but I simply don’t trust an author who uses Greek words he obviously doesn’t understand:
Too tired to write much.
Read ‘Barbarians Led by Bill Gates’ by Marlin Eller, a very senior developer at Microsoft and you will see just how much Bill Gates in particular was obsessed with replicating the Macintosh. No Macintosh would have meant no Windows.
Apple mostly lost because it was very badly managed almost from the moment the Mac project was started. Given the ineptitude of Apple’s management (including Steve Jobs, who had yet to learn several very painful lessons), it’s a miracle it survived, and a testimony to just how adored the Mac OS was by millions of fans (including yours truly) who stuck with Apple through the ghastly, chaotic period from 1984 to 1997.
We are leaving the epoch of the desktop PC behind now. There will be desktop PCs for a long time to come, but their importance in the technology ecosystem is collapsing. In the future people will read the history of the PC period and laugh with disbelief at what people had to put up with: system crashes, rampant malware, the BSOD. We are finally leaving the medieval period of information technology behind, and what a relief it is.
Good grief, not this again.
The bold is my highlighting.
What in your opinion, will we use in the future for:
1/ The design of new Microprocessors
2/ Industrial Design applications (e.g. Toasters, Fridges, Furniture, etc.)
3/ Mechanical Design (Cars, Trains, etc.)
4/ Software Engineering (Not only for PCs, but tablets, smart phones, embedded devices, etc.)
5/ Graphics/Photo Editing
6/ 3D Rendering (Movies, Images, etc.)
7/ Sound Editing (Including composition of Music, etc.)
8/ The ability to administer, deploy and manage these applications on a network (in an office or at home)
…I could go on, but I guess I’ve made the point…
I can’t wait to see the very fancy tablet that will accomplish these tasks!
Or maybe we will just amuse ourselves with Angry Birds all day and forgo civilisation?
IMHO – The “PC” in one form or another is the MOST important part of technology for the foreseeable future!
Or I could. How about word processing which, it seems to be forgotten, was one of the primary selling points of PCs? As far as I can tell there’s been no improvement on this front in this so-called post-PC world.
PC’s are not dying or becoming irrelevant; they are finding their level, helpfully assisted by smartphones and tablets. Desks with computers on them were never the best place to watch videos, or read magazine articles, or socialise, because it’s not comfortable, it’s too formal and it tends to render one temporarily myopic.
They became used for these things; now they will be used less for these things and more for the things they were intended for—writing, compositing, analysis, Civilization games—things for which one wants an upright sitting position and the proper focus of staring at a big screen.
We will see a small decline of PC usage then a leveling out. PCs are currently overused, but will prove to be the best tools for most jobs, as opposed to most luxuries. Jobs like playing Civilization, for example.
Tablet docking stations? Word processing needs very little processing power, so there’s no full blown PC needed for the task.
Yes, I suppose so. But then you’re at a desk, emulating a desktop or laptop*. I suppose my point was not so much about the specifics of hardware, but about the situation in which it is used. For many activities the desk will remain ideal, and on it the smartphone and tablet has no inherent advantage**.
________
*Laptops when not used for those activities better done on a tablet, such as browsing the web on the sofa.
**Aside from power consumption, but there really is no reason other than screen size for most PCs to run at significantly higher power than tablets most of the time any more. It’s just poor design.
I would rather say PCs (in their time) were quite lucky to expand their market from professional information workers to casual consumers. The post-PC era, in my view, is about consumers fleeing the all-in-one PC for specialized devices that do the job better with less hassle:
– content consumption
– WWW
– simplified media production and sharing (web 2)
– gaming
Full-featured office productivity is one exception, but if you look at it in perspective, what consumer actually needs that after finishing education? For the last 10 years I (privately) haven’t created a document that Google Docs wouldn’t handle.
The PC value proposition for the consumer market is actually quite questionable once you have valid alternatives and are immune to vendor lock-in (cough * iCloud * cough).
Though I agree with your general gist, we could very well be using work stations for that, by which I mean more dedicated, more powerful, versions of the “home PC”. It may well be we will see a split between consumer devices (tablets) and more professional devices (work stations). I’m not saying we will, but we could.
No need. Most of those tasks will not be performed at home. And when talking about Microsoft, they get a large share of their profit from home use. So it may be an important shift.
I sure hope not. I am where I am right now, education/work-wise, specifically because there was not much functional difference between a “home PC” and a “workstation” when I was a teenager/student. If all I had available to play with were, for example, a game console and some web device (like a tablet) for emails and whatnot, I would never have tried my hand at programming, 2D/3D graphics and all the other awesome creative things you try out on a regular PC. If the market is split again between content-creation (expensive/professional) and content-consumption (cheap) devices, I think our industry (in VERY general terms) will lose many future creative people.
We ran Silicon Graphics Onyx and O2 workstations in our CAD department for years – they cost a small fortune.
The advent of 3D acceleration for the workstation-class PC changed the business forever. For the cost of every single SGI box, we could deploy six Windows NT systems.
I fear the scenario you describe above might bring us back to this sort of era – consumer grade devices being cheap and “professional” equipment costing an arm and a leg.
I would consider it to be a massive retrograde step!
(Please note: I loved the SGI’s to pieces, but they were simply not competitive towards the end…)
May you live in interesting times, eh?
I couldn’t agree more. And even if you could bring some of those applications over to a tablet, it would be a hell of a task and take several years, and then more years to get it right.
Just think about how long it takes many productivity applications to handle “small” things like being 64-bit native, being a Universal Binary, getting ported to the Mac, etc.
I suppose it depends on how you define important. Nobody is suggesting that the PC will go away; it’s just that its weight in terms of what drives technology forward, where the money is being made and its overall social and cultural impact is declining.
We have already reached the point where more than half the devices being used to access the internet are non-PC; soon the non-PC devices will represent far more than half.
If you look at your points one by one you get this:
1/ The design of new Microprocessors
2/ Industrial Design applications (e.g. Toasters, Fridges, Furniture, etc.)
3/ Mechanical Design (Cars, Trains, etc.)
4/ Software Engineering (Not only for PCs, but tablets, smart phones, embedded devices, etc.)
True, PCs will probably be best for these functions, but in relatively small numbers compared to the number of non-PC devices. How many PCs will be needed globally to do this sort of stuff? A few million, perhaps.
5/ Graphics/Photo Editing
6/ 3D Rendering (Movies, Images, etc.)
7/ Sound Editing (Including composition of Music, etc.)
These points I think are more mixed; here many people will find that non-PC devices are perfectly good enough for what they want to do with their music, photos or videos, particularly as the capabilities of such devices improve. Then there will be many pros who will use a mix of PC and non-PC, such as the pro photographers who are adopting the iPad as a complementary device in large numbers. The same goes for video and music.
8/ The ability to administer, deploy and manage these applications on a network (in an office or at home)
Here I think the collapse of the PC paradigm may be surprisingly rapid. In the home, come iOS 5 for example, a PC will no longer be required to administer anything; the same will go for many small and medium-sized firms, I think.
I think you may be surprised at how rapid the non-PC device revolution is going to be.
I have to make a big snip here, as this thread is already too big for me to manage. Thank you for keeping it civil though.
In my workplace, where we design, build and manufacture medical instruments, iOS has made exactly zero impact. We are around 400 people strong.
People do use their own iPads to browse the web at lunchtime, but it’s going to take a lot more than that to convince me that this is a new paradigm. Maybe I just lack the vision.
We’ll see I guess!
Yeah, someone is using “Pirates of Silicon Valley” as their main source of information.
IBM platform was not “open” until Compaq reverse engineered it.
Open – in a sense that you could develop and sell addons to those machines.
Hey, you’re making too much sense! Don’t forget, we are dealing with APPLE FAN BOYS!!! You know the type, right? Apple invented everything, the rest just copied – so apple has the right to do anything, no matter how ridiculous it is! I suggest you read about how religious fanaticism works – it’s hopeless to convince these people with reason or facts. They are believers. There have been hundreds if not thousands of attempts to reason with them, but all failed. They are still around, and they are relentless. I wouldn’t be surprised if the author was simply a clone of one of our regular Apple fan boys (just joined this month, right?).
This article should at least try to back its “facts”… I didn’t expect to see something so biased in OSAlert.
“Theft” is a really strong word… especially if you don’t back your claims (or at least try to learn a little bit of computer history… Xerox not mentioned even once?! OK, there’s nothing else I need to say after that…)
[I hope the editors at least put up one of those “this article is the view of the author and has nothing to do with OSNews”… bla bla bla]
Wow, cry me a river.
That’s because it is a bizarre analogy. Are you saying computers should be a privilege only for the rich? That’s what the analogy implies.
Microsoft and the PC was bigger than Apple even before Windows came along. Are you saying MS stole DOS from Apple?
Really, is that so? On what known facts do you base this allegation?
Except that the iPod came out, what, close to 30 years later, when manufacturing had become much cheaper?
Except that the Macbook and iPad are comparatively expensive.
That’s what happens when you screw up.
Jesus, just stop with the sob story already.
Really? How are products created decades later under completely different circumstances any indication of that?
The goal does not justify the means.
They are expensive. PERIOD. Just like a €100’000 Ferrari is expensive. Their value is appropriate for the price, but they are expensive nonetheless.
My first PC that I could use for programming cost $600. If Apple-style pricing had dominated the industry, I probably would have ended up as a builder or something like that (due to lack of experience in programming).
If Apple wins and destroys all the PC manufacturers today, Indians, Russians and Brazilians are screwed, as basically no one there can afford $600 (+$200) for a Mac Mini (+ display) or $1000 for an Apple laptop.
A nice attempt at being fair Thom, but totally wrong. The only reason why Apple couldn’t take advantage of cheap off-the-shelf components (as the PC cloners did) is because Apple deliberately made every single piece of the Mac proprietary. I know, I owned a Mac, circa 1998, and it was horrible. Not ONE SINGLE COMPONENT was standard. Not only the internals (hard disk, video card, power supply, memory chips), but absolutely everything was non-standard. That included the monitor, mouse, floppy disk drives, keyboard, modem, even the screws that held the case together. That’s right, if I lost a screw, I had to go to an Apple shop to get a replacement. And of course, Mac parts always cost double the price of their PC equivalents. Really, the only hardware component that needed to be different was the motherboard, since Apple was based on the PowerPC processor at that time. Later, they went to i386, so from that point on there was no justification whatsoever for a single proprietary part.
In fact, an aftermarket developed. You could buy third party adapters so that you could use at least a PC monitor, mouse and modem.
Compare this all to Microsoft, which didn’t even make hardware. They just sold the operating system, and later Office, which were lucrative cash cows. They let the PC clone market make the hardware. Apple could have followed this strategy if they had wanted to, marketing MacOS (and later OSX) and not requiring you to purchase Mac hardware.
Actually, Apple DID allow clones for a brief while. There were several legal clones (plus a number of illegal ones). Apple decided to end the legal cloning programs because they found that they made more money with high-priced hardware. From Wikipedia:
Apple’s clone program entailed the licensing of the Macintosh ROMs and system software to other manufacturers, each of which agreed to pay a flat fee for a license, and a royalty (initially $50) for each clone computer they sold. This generated quick revenues for Apple during a time of financial crisis. From early 1995 through mid-1997, it was possible to buy PowerPC-based clone computers running Mac OS, most notably from Power Computing. Other licensees were Motorola, Radius, APS Technologies, DayStar Digital, UMAX, MaxxBoxx, and Tatung. However, by 1996 Apple executives were worried that high-end clones were cannibalizing sales of their own high-end computers, where profit margins were highest.[11]
Jobs ends the official program
Soon after Steve Jobs returned to Apple, he backed out of recently renegotiated licensing deals with OS licensees that Apple executives complained were still financially unfavorable.[12] Because the clone makers’ licenses were valid only for Apple’s System 7 operating system, Apple’s release of Mac OS 8 left the clone manufacturers without the ability to ship a current Mac OS version and effectively ended the cloning program
source:
http://en.wikipedia.org/wiki/Macintosh_clone
Thom didn’t write this…
That is like blaming tigers for having stripes… Apple was a computer maker – ALL computer makers made proprietary hardware in the early days; that was kind of the whole business model of being a computer maker…
IBM used off-the-shelf components for the PC because they were cheap and they didn’t want to waste time, energy, and talent on designing something they never expected to be more than moderately successful. If they had had a clue about the Pandora’s box they were opening by doing that, they would never have done it. Sorry, but the whole “off-the-shelf compatible components” thing was a fluke – it only happened because no one saw it coming. No computer maker in their right mind in the early 1980s would have done something like that on purpose…
No, you can’t blame Apple for not going that route earlier – they didn’t see it coming either and by the time they did they were too vested in their business model to abandon it without a fight. Just be thankful that a few guys at IBM accidentally let the genie out of the bottle and couldn’t manage to put it back in…
The fact that Windows was a conceptual ripoff of Macintosh completely ignores the real reasons that the PC platform became dominant – Windows had virtually nothing to do with it.
Windows 3.0 did not ship until 1990… That is 6 years after Macintosh first shipped. In those 6 years the sales of previous versions of Windows (which were all considered universally useless) were nil. Yes, it was bundled with a lot of systems, but about 98% of users just completely ignored it and used DOS (I know, I was one of them). So Apple had at least a 6 year window of opportunity to sell users on GUIs…
So how did PCs become so popular when there was such a better option available? Two reasons:
1. Because PCs were cheap, and they kept getting cheaper! They were not cheap because of Windows, they were cheap because of clones. This “Robin Hood” effect you speak of exists, but credit should go to Compaq for reverse engineering the IBM BIOS and defending it in court. Microsoft was simply pulled along for the ride and had nothing at all to do with it.
2. Users were NOT ready for ONLY a GUI yet… No one seems to acknowledge this, but at the time it was absolutely true. Experienced computer users knew how to deal with a command line, and they saw a GUI as nothing more than window dressing for dumb users. GUIs were neat and all, but it simply took time for them to develop to the point that using one was actually BETTER – for a long time (and yes, even with the Macintosh) it wasn’t better, it was just different. The problem with Macs was that there was no fallback, it was all or nothing – you did everything in a GUI, or you did nothing. PCs had the advantage (for a VERY long time) of having DOS as a functional fallback.
There are certainly other credible reasons too: a multitude of component vendors that could make hardware without licensing costs, business users and the success of programs like WordPerfect and Lotus 1-2-3 for DOS, etc. All of these things would rank WAY above Windows as reasons PCs became successful; frankly, Windows did not matter AT ALL until at least 1992, when 3.1 shipped.
ps. To back up my claim about GUIs… I frankly attribute a large part of OS X’s success relative to the original Mac OS to the fact that it actually HAS a functional CLI – if it didn’t, there is a rather large portion of its user base that would simply consider it unusable. And that almost 30 years later…
This article, for all that it seems to have been scantly researched, has shown what OSAlert is best at—intelligent debate around an issue.* And your comment is the best thus far. I say flesh it out a bit, make the language a bit more neutral, and submit as a counter article. We could do with more historical articles, topicality be damned.
________
*Plus standard issue Internet unpleasantness from time to time of course.
I used computers extensively for over 10 years before I used Windows 3.1. I initially found the GUI really weird and quite unproductive compared with DOS.
Apple’s biggest problem is that it borrows all the ideas it has. Sure, it does a great job of polishing and selling them. But at the end of the day it takes them from the people who actually do the development. There isn’t anything in their products that didn’t exist in other devices first. Their problem is that the competition is getting better, while iOS has started taking cues from Android. And with every company working on Android while only Apple works on iOS, it’s a losing battle. They are trying to slow the slide with litigation, but that will only get them so far. At the end of the day, their refusal to license their OSes to other manufacturers will only hurt them.
You know, I see this kind of comment posted all the time. There is certainly truth to it, but it simply doesn’t mean anything. Apple is not and has never been successful because of ideas (regardless of what they or others say).
Everyone borrows ideas, singling out Apple (or Microsoft, or anyone else) is just a waste of typing. You want to ridicule them because of their recent litigation rampage be my guest, but this whole business of “borrowing ideas” is just stupid – ideas don’t matter in isolation. For every really good, unique idea there is almost always a string of failed products before someone finally hits a homerun.
Apple’s recent success boils down to two things – timing and design. It’s not the “idea” of the product that matters – the mouse was invented in the 60s but no one cared. It is one of the most important inventions in computing history, and it slept quietly for over 20 years because:
1. The timing was completely wrong – there was nothing compelling you could do with it because all the other stuff you would need to exploit it’s capabilities didn’t exist yet. It was like someone inventing fax machines when there was no telephone system yet – you had to plug two of them into each other with a cable. Neat, it can send copies across the room… Who cares?
2. The design was shit. The ball mouse (patented in 1972) was MUCH better than the original wheel design and allowed a small enough form factor to use it rather comfortably. Even then, in 1972 there was STILL no compelling use for the damn things…
Apple has a knack for getting the timing right… and more importantly – the design. Design is not how something looks, it is how it works and it includes all of its usability tradeoffs. You can say what you want about iPads – they are absolutely not a unique idea but the timing was right and the design was too.
That is all there is to it – timing and design. Ideas without both of those going for them are nothing more than wishful thinking.
“Everyone borrows ideas, singling out Apple (or Microsoft, or anyone else) is just a waste of typing. You want to ridicule them because of their recent litigation rampage be my guest, but this whole business of “borrowing ideas” is just stupid – ideas don’t matter in isolation. For every really good, unique idea there is almost always a string of failed products before someone finally hits a homerun. ”
This. Exactly this. This is the real issue with software patents: everyone is standing on everyone else’s shoulders _all the time_, but some of them are self-important enough to decide that their trivial step down the road everyone is travelling is Special and Awesome and Important and deserves protection.
If everyone who ever came up with some trivial step forward in software patented it, it would become utterly impossible to get anywhere, but that’s just the practical issue (and sadly, we’re getting quite close to that point); really it’s more the sheer asshattery of ‘hey, I had a neat idea, I’m gonna file a patent and sit on it!’ that gets me. Anyone working in software is building on far more significant ideas that came before and were put in the public domain; it’s just bad manners to claim your little innovation is somehow so significant it demands to be patented and monetized for evermore (rather than just in the context where you came up with it).
This argument is crap. Apple did “steal” from others as well in order to achieve the Macs. Where did, for example, the mouse come from? Steve Jobs’ idea? Get out of here!
Ah but you see, good artists copy, great artists steal.
Of course, the only “artist” in the world is Apple.
Everyone else are just, uh, copying and stealing arti….errr…thieves. No…wait….wtf?
Erm… Apple paid XEROX for using the PARC stuff:
“Jobs was so struck by the power inherent to the PARC that he offered Xerox the opportunity to invest a million dollars in Apple computer if the company would agree to let him and his Lisa team study Alto. Xerox felt that it had nothing to lose. After all, they couldn’t sell it. They did not believe the world was ready for the advanced PARC technologies. Apple was about to go public and Xerox’s investment branch, Xerox Development Corporation, sensed an opportunity to turn a quick profit. Xerox invested $1 million in Apple by purchasing 100,000 shares at $10 each. Furthermore, Xerox signed an agreement with Apple to never purchase more than 5 percent of Apple’s outstanding shares. Within a year, these shares split into 800,000 worth $17.6 million when Apple went public.”
Source:
http://vectronicsappleworld.com/macintosh/creation.html
http://en.wikipedia.org/wiki/History_of_Apple_Inc.
And Apple then messed up by granting that license to Microsoft… but those companies are basically OK with each other now, with a cross-licensing agreement in place:
“Apple listed 189 GUI elements; the court decided that 179 of these elements had been licensed to Microsoft in the Windows 1.0 agreement and most of the remaining 10 elements were not copyrightable – either they were unoriginal to Apple, or they were the only possible way of expressing a particular idea.”
Source:
http://en.wikipedia.org/wiki/Apple_Computer,_Inc._v._Microsoft_Corp…..
The difference in the smartphone arena is that Apple doesn’t seem to have licensed its interface stuff out to anyone this time around.
That’s a known fact, that MS swindled Apple and gained an advantage on the desktop. However, I don’t think a situation where someone has 95% control over the industry is healthy or productive. On the contrary, it leads to total stagnation.
What are you talking about? Only in Apple mythology is this a “known fact”; everywhere else it is a blatant lie.
There should really be a “Dislike” button for articles as well!
This one: Very biased. Very poor. Waste of time.
(I will not go into details. Other people already explained everything.)
What theft?
Both copied from Xerox.
Yes because 68K software runs perfectly on 8086…
Please provide a reference…
This article is written as if in a rambling drug infused haze of dreams, lies and delusional baloney.
These are the facts: Apple was outcompeted by Microsoft because of:
1. Apple trying to make everything proprietary, hardware, software, everything.
2. Apple making it deliberately difficult to write programs for the Mac, manipulating and fighting against the third-party developers who develop for the Mac.
3. An exclusionary ecosystem where only Apple is allowed to make money. Apple treats third-party developers and equipment makers as thieves who steal from Apple or somehow damage the Apple brand.
4. MS got lucky and hit the jackpot with IBM.
This article seems to allege that if Microsoft hadn’t existed (or if Windows had taken a bigger step away from the Mac/Xerox look & feel), competition for Apple wouldn’t have existed.
I don’t think so. Just look at the amount of GUIs that were developed in the 80s, after Xerox’s tech became popular. There was demand for a cheap GUI computer. Apple failed to acknowledge it soon enough. So someone else did it.
Microsoft has won, but it would have been someone else otherwise. And that’s fine, one company cannot do everything, otherwise we’re back in the USSR.
Yep. Even the C64 had an aftermarket GUI.
And they would have had the same 6 years to develop a GUI on top of DOS in any case… Windows didn’t come into play until 3.1 and 3.11.
FYI: There was a lot of cut-throat competition in the USSR. It’s just that everything was state-owned in the end. (See the competition between the Sukhoi, MiG, Tupolev, Ilyushin and Antonov aeronautical design bureaus.)
Nothing forbade Apple from porting their operating system to the open PC architecture, which was NOT made open by Microsoft, contrary to what the article tries to imply, but by IBM.
If their operating system had also been available on the open PC architecture at the time Microsoft released Windows 1.0, there is no doubt that, indeed, Apple would then have had the 95% market share.
But they rejected the idea of an open platform: the Not Invented Here syndrome. From the start, Apple has sold you an integrated product, where hardware and software can’t be mixed or tuned at will. That’s their choice.
The fact that a very inferior operating system on initially inferior hardware, BUT on an open and standardized PC platform, took over from the well-integrated Macintosh is very telling: high integration is not the ultimate selling point for everybody. In fact, overly tight integration doesn’t sell that well.
Otherwise, besides Apple being the top leader in the personal computer field instead of PCs, we would also NOT have the internet, but a tight, closed, centrally controlled and opaque network, like eWorld, CompuServe or MSN. But, against all predictions, it’s an open, community-built network architecture that succeeded.
Another forgotten point is that personal computing didn’t grow in people’s homes at the start. It grew in offices all over the world first, where depending on one SINGLE provider for both hardware and software doesn’t make the buying department happy.
The article skips entirely over the “standardized components” aspect. It also skips over the contributions of the major PC component manufacturers. The video card manufacturers introduced faster bus technology (VBUS, then VESA). Intel itself has improved the PC architecture year after year. A lot of today’s PC hardware architecture was (re)designed by Intel, and it is now far from the one IBM released into the (then) void.
There is a reason why Apple switched their in-house hardware platform to a PC-based one: they knew it was better (even if, up to a year before, they were trying to claim the contrary with pathetic Photoshop plugin benchmarks…).
If anyone had a whole business model for personal computing stolen from them, it’s IBM. But as they didn’t know then that their PC platform had actual value, they failed to see its potential.
When you put something in the dustbin, do you call the people who find it there, say “wow, that’s great” and take it away thieves?
Integration can be great.
But in an interconnected world, interoperability comes first. Which means being able to team up with others’ solutions.
Mac OS classic was written in Pascal for the upper layers and 68000 assembly at the bottom. Had they ported the bottom layer to x86 and produced a good graphics-card expansion for the PC, Apple’s MacOS would have ruined Microsoft.
They think a computing system can only be an integrated object.
The internet today, with cloud and web services, is the new computing system. And it’s the exact opposite of an integrated object. It’s an interoperable web of heterogeneous computing devices.
Aka, diversity.
Aka, life’s rule #1: nobody can control everything long.
Nothing forbade Apple from porting their operating system to the open PC architecture, which was NOT made open by Microsoft, contrary to what the article tries to imply, but by IBM.
Wrong. The BIOS was reverse engineered by Compaq. This allowed 100% IBM compatible clones.
If their operating system had also been available on the open PC architecture at the time Microsoft released Windows 1.0, there is no doubt that, indeed, Apple would then have had the 95% market share.
Very unlikely. The Mac was an unloved orphan with almost no useful software when it was first released.
The fact that a very inferior operating system on initially inferior hardware, BUT on an open and standardized PC platform, took over from the well-integrated Macintosh is very telling: high integration is not the ultimate selling point for everybody. In fact, overly tight integration doesn’t sell that well.
Ever used DOS? I think not.
DOS was lightning fast on very low powered hardware and remarkably stable. However it needed some skill to use.
Yes, the BIOS was not opened by IBM. But they did nothing when Compaq reverse engineered it. IBM let the PC platform be *cloned*. They let it become the open platform.
While Compaq technically enabled 100% IBM-compatible clones, IBM legally allowed it by doing nothing against it.
So was the Windows 1.0 software ecosystem at the start, too.
At that time, the simple fact of featuring a GUI API was a selling point.
But, true, it’s not certain that the Mac API would have brought more developers to this operating system than the Windows 1.0 API did.
Maybe just selling a very pro (for that time, that meant a 512×342 monochrome screen!) graphics card, supported only by MacOS, on a PC clone would alone have driven the radical switch too.
What drove Mac sales in the early years was the DTP software that MacOS made possible on a personal computer. What made this impossible on a PC at the time was the lack of a good graphics card and a good GUI, both technologies that Apple could have made available on the PC if they had wanted to. And that would have driven sales of their card and software the same way it did for the Macintosh.
But they didn’t want that.
They wanted a well-designed integrated computer.
They got it.
They also got a niche market with it: the niche of well-designed integrated computers, while the market for less well-designed, non-integrated computers made Microsoft and the PC clone makers rich.
It was a choice that did it, not fate, nor a trick by competitors.
Sadly, I must confess I’m old enough to have used DOS.
But the point is that in the DOS era, circa ’86, the Macintosh was offering both a better hardware platform and a better, graphical, operating system.
And still, it lost its market share to inferior hardware and inferior software.
Which can’t mean anything other than that people found something in the inferior PC hardware and software platform that they didn’t find in the Mac one.
Which means that an integrated product is not a winning solution for everything.
Sorry for the many English mistakes.
But the point is that in the DOS era, circa ’86, the Macintosh was offering both a better hardware platform and a better, graphical, operating system.
And still, it lost its market share to inferior hardware and inferior software.
Back in 1986 the Mac was little more than an expensive toy.
The vast majority of “proper” computers were purchased by businesses. Macs were actually vastly inferior to PCs for business use.
Most businesses used PCs for three things: CAD, spreadsheets and word processing. The Mac was basically useless for all of these purposes due to a tiny 9″ monochrome screen, limited software, low performance and almost no peripherals.
In 1986 most people who needed to use a computer already had expertise in using them and had no real need for a GUI.
Mac “defects” circa 1986:
– no high end graphics (so no CAD or games).
– no professional CAD software
– high price
– tiny 9″ monochrome screens
– obsolete processor (68000 8MHz)
– very limited RAM (4MB maximum)
– very expensive SCSI harddrives
– non standard networking
– limited software
– very limited choice of peripherals and printers.
Back in 1986 my brother was using an HP 386DX-16 16MB workstation PC (he’s a surveyor). No Mac available could possibly have replaced this machine.
In fact Apple didn’t even have a reasonably powerful Mac with a separate monitor until the $8K 1990 MacII. By that time an equally powerful Windows 3.0 PC cost <$2k and the war was well and truly over.
IBM didn’t have any recourse. The reverse-engineered BIOS was a clean-room design: one engineer in one room read the specifications of the IBM BIOS to an engineer in a second room who was writing the Compaq BIOS.
That is perfectly legal, and there is nothing IBM could have done after the fact. IBM wasn’t being magnanimous; their hands were tied. If there were software patents back then, IBM probably would have sued Compaq.
So true.
And that is the best proof that the *lack* of software patents allows far more competition between far more market players, to the benefit of consumers, while proof that software patent protection benefits consumers is still nowhere to be seen.
I agree with most of what you say except this. Home computing was quite popular in the 80’s. Remember names like Commodore, Atari, TI, Dragon, BBC, Oric, Microbee, Sinclair and many many others. In fact, in terms of competition and consumer choice it was probably the heyday of home computing.
The business success of IBM “only” helped the PC to become dominant.
You’re right, indeed.
The pioneering personal computers weren’t PCs, true.
But it’s when companies switched to personal computers instead of mainframe terminals (or typewriters, or, well, paper and pen) that the personal computer really grew, and they went with IBM’s PC first, to replace IBM terminals, and then PC clones, which in turn made people switch at home to the same kind of computers.
I was a Sinclair Spectrum owner, and then an Atari ST owner, at the time. These early pioneers didn’t survive. Apple is in fact the only survivor of that period.
Mainstream people, for their part, began with either a PC at the office or, for a few of them, a Mac.
Hmm…I dunno. I’m not sure there’s a cause-effect in play here. It could also be the natural evolution of more and more people becoming exposed to computers over time. Companies switched to PC’s long before the home computers started to fade. I don’t think the PC really started making inroads in the home until it got MCGA/VGA graphics and half-decent sound sometime in the early 90’s. The game market for PC’s pretty much skyrocketed around this time when PC’s suddenly didn’t suck for games anymore. Interesting that, again, it was games that fueled the acceptance in the home.
Well, IBM too. This fact makes me a bit sad.
MS was doing very well selling DOS.
The Mac was initially considered a commercial failure until QuarkXPress and laser printers were combined to create desktop publishing.
The Amiga would probably have wiped out the Mac if Commodore had been better managed.
Without diving too deeply into the history of the computer industry, all I want to say is this: Apple’s culture is incompatible with serving 95% of the market.
I’ll give you some examples, some very recent ones:
– The iPhone. Apple offers only one model/form factor. You want a physical keyboard? You want a replaceable battery? You need better battery life? You need a dual-SIM model? You need a phone that supports GSM+CDMA because you travel a lot to other countries? You are looking more for a sub-$500 phone? You want a phone that’s very sturdy? You want a bigger or smaller one?
– The iPad. You want Flash in an internet surfing device, really? You want hardware interfaces without having to buy and carry a gazillion adapters? You want stereo speakers? You want cameras (original iPad) in your slate? 10″ is too big and heavy for you, you are thinking more like 7″?
– Macs. You want 3G in your laptop? You want a fair amount of fast and common interfaces like USB 3 in your $2000 laptop? You want a replaceable battery? You want easy access to your hard drive and be able to keep it and your data when the computer needs service repair? You like matte screens more? You want a reasonably priced tower? You are living in an area without broadband internet connection and can’t download 300 MB system updates all the time because Apple didn’t offer delta updates for a very long time?
– Business customers. You want workstations with 5 years of warranty? You need at-desk repair with reasonable response times? You want something like a roadmap that you can count on? You want to be able to buy computers with an older OS because some of your applications are not yet compatible with the most recent version of OS X? You want crazy cool docking stations for your laptop fleet, and fingerprint readers?
I think you get the idea. Apple is just so not interested in so many people’s desires/needs. I’m not saying that’s a problem or that they should change. I’m saying that you people should accept reality and see that Apple is not even close to serving most people’s needs and is thus not suited to being the one supplier of everybody’s PCs and gadgets.
The only exception I can think of is the iPod that had a very dominant market share, though this is probably mostly true for the US. This was possible because they had quite a few models that were totally different from each other. Not true for iPhone and iPad.
The PC market is totally awesome right now, IMHO. There is really a product for almost every possible need and at every price point.
While we are at it, the author talked about Apple’s prices and IBM/HP getting rid of their PC businesses. My take: both still make/made money from selling PCs, just not enough for the shareholders. This says more about the f–ked up financial market than about the PC businesses in question. Apple just has margins that are a lot higher; how do you think they made those ~80 billion USD that they have in cash? Overpriced or not, Apple could sell every iPhone for $150 less tomorrow and still make money on every unit sold.
A final closing comment on software patents: Software and algorithms are Math and you can’t patent Math either. It’s that simple. Software patents are thus an insane thing.
Your article couldn’t be further from reality. In fact, you are merely perpetuating a dangerous Apple-minded fallacy: that they would have become masters of the world, had their destiny not been stolen by the “evil” Bill Gates.
Unfortunately, it doesn’t stand up to scrutiny, but one has to look at the 80s with a somewhat dispassionate eye, something quite rare these days.
What you seem to forget is that, at the beginning of this story, Microsoft is nothing, just a software developer with a good reputation for its BASIC. And that’s it. A big lie and a great deal of luck later made it also the provider of DOS for IBM, which at the time was considered just a worthless little “by-product”.
The Real Power of that time is IBM. And it shows. Just look at the figures: from the day IBM starts selling the first PC, it makes tremendous inroads into corporate computing, with units sold far outnumbering any other computer manufacturer’s, ***including Apple’s***. Far outnumbering, as in something in the 10x range.
And this is just with DOS 1.
And this is just with IBM, and just in the first years. No PC clone makers yet, which will tilt the balance even more.
The success is so immense that it becomes pretty clear in less than two years that the PC is going to become the de facto standard in corporate computing. The arrival of the PC in the home only comes years later, when prices have been pushed down enough by corporate volumes.
Then comes the US federal investigation into IBM for monopolistic practices (oh?), which forces it to let the clone market grow, eventually overtaking IBM itself. At this stage, the only common element between all these computers becomes Microsoft’s DOS, hence its “lucky” monopoly position, which it will then successfully defend as much as it can.
Years later, while the PC is already well entrenched and the de facto standard in corporate computing, Microsoft creates Windows 1 (which is a failure), then Windows 2 (which is also a failure), and then Windows 3 (which becomes a great success). That’s where the “robbing” fallacy comes into play: Microsoft robbed Apple of the GUI idea.
What?????
But you know what? Apple never invented the graphical user interface! How can that even be suggested? No way: Xerox was there first, years earlier, and by the time the Mac came out there were already countless other GUIs on the market, none reaching mass-market fame, but nonetheless innovative technical precedents from which Apple eagerly borrowed, without ever paying a cent.
Microsoft borrowed from all these previous inventors, shamelessly, but no more than Apple did itself. Apple did not “own” the GUI invention, nor even drag & drop, the folder, the menu, or whatever obvious component you can try to patent these days. All of these were already “invented” previously, most notably by Xerox.
Last irony: do you know why the Mac was successful in the corporate market (while never, ever reaching the number of units sold by the PC)?
Well, a major reason: thanks to Excel, an excellent and (for a time) exclusive product for the Mac, made by… Microsoft. Yes, at that time, you bought a Mac to run Excel. Obviously, the offer was less compelling once the exclusivity agreement reached its end. The Mac had to find other “exclusive” software to keep a niche for itself, which it did thanks to Adobe (oh, another enemy now…). You see? The Mac is not about taking over the world; the PC has already won. Apple is just trying to find a good niche for itself; it cannot expect more.
One last pin: don’t forget that Apple would quite simply have been dead without the considerable financial support of Microsoft in 1997.
So who’s the robber here ?
And more importantly who’s the liar ?
It always amazes me to witness Apple zealots constantly trying to rewrite history. And it stings all the more when I see that the tactic works, since non-Apple zealots just blindly repeat the same fallacies, without ever checking anything for themselves.
I think in these issues age clearly plays a role.
Many of the people arguing about Mac vs PC were not born in the 80s, so they have not lived through the computing evolution as many of us did.
When you were not there, you’re bound to rely on third parties, who are not always accurate.
I’ve lived in the 80s and in my world it wasn’t about PCs or Mac. We all had home computers.
PCs were computers companies used, not people at home, Macs were computers you’d hear about, but never see.
Only in the second half of the 80’s did PCs show up in people’s homes and even then we wondered why you’d have a PC if you could have an Amiga which was much better.
Both PCs and Macs were pretty boring compared to any home computer.
I guess it all depends how geek we were as children.
I had mostly contact with ZX Spectrum family, Sam Coupe and Amiga.
But there were a few Macs and PCs around as well that we could access on our high school.
The PCs were Amstrad PC 1512 that we used to play “Defender of the Crown”.
It was also not uncommon to see some Schneider Euro PC II on sale.
We had Phillips P2000 computers in school, later to be joined by 2 PCs.
Most people had Commodore 64s, the gurus had a Commodore 128. Later those people upgraded to Amiga’s.
A few had a ZX Spectrum, Atari ST, MSX and one even had an Acorn Electron (excellent computer).
The only people that had a PC at home were the ones who got it from their employer so they could do stuff at home (and bring the results to work on a 1.44 MB floppy).
But I was a kid, mostly interested in games and PCs sucked for that.
Exactly. PCs were boring things usually equipped with green or amber monochrome screens that were only used for spreadsheets and word processing.
The first time I saw a Mac in person was in the early 1990’s (’91 or ’92 probably). The 80’s was all about the home computer, primarily the C64 and the ZX Spectrum, and in the later half the Amiga (at least in Sweden). You also saw people with a TI-99, Atari ST and on a rare occasion something like a Dragon or Oric. Our school had, for God knows what reason, Microbee. You could play some kind of cave adventure game and an educational game called Kepler.
PC’s were huge, ugly and expensive office monsters with either a dull monochrome display or the brain-damaged CGA palette. Why you’d ever want one of those was a mystery to us.
Apple? Who? They make computers? Really? Lisa? The girl from school? Macintosh? Is he Scottish? Oh, a computer. Never heard of it.
My experience in Australia was virtually identical.
C64, TRS80 or Amiga at home. PC for word processing at work. A mainframe if you needed grunt.
I wish I could give you a +1, but I already commented…
This comment is far more informative than the article itself. In fact, it should probably be published on the front page (after some polishing).
Thom, are you reading this? What do you say?
Microsoft stole the following from Apple:
– An idea
– An expression
– A market potential
Think about it! (I hereby grant you permission.)
Well, TFA suggests that if Apple did not have to compete with Windows then they would have lowered their prices…however, even today in markets where Apple is the dominant (or considered the dominant) player, they still maintain the higher prices – one need only look at the iPad, iPod, and iPhone to see this.
Then again, they’ll probably make the same accusation regarding iPad/iPhone and Android.
Yes, Apple will eventually be marginalized back to the niche it has always had when it comes to the phone and tablet markets – Android will be the beneficiary. Not because Android is better than iOS, but because everyone else will be using it, since Apple won’t license out iOS.
Apple nearly went under when they tried to offer clones of the Mac as everyone else was able to undercut their pricing. But that’s not because Apple is a software company that happens to make hardware (like Microsoft), but because at the heart of it Apple is a hardware company that happens to make software – more like IBM, Dell, and HP in that respect.
The Apple strategy failed because it’s one company. One company that provides the OS, one company that provides the hardware. One company versus everybody else will fail eventually. Patenting basic concepts such as a gesture or the “general look” of an interface is ridiculous.
Let other kids play and let the better business model succeed and the consumer will win.
The entire basis of all these arguments is “Well, they should be innovating! Coming up with their own stuff!” Sometimes an invention is an idea critical to our human development. How do you improve the wheel? If there were a way to improve it, are we going to let ONE company dominate until something better comes along?
The worst part of this article is that you’re portraying this as “stealing”, yet you’re not even addressing whether this is something that can be stolen. You’re poisoning the well here.
Apple wasn’t the company that invented the GUI. Historical analysis of GUI research traces back to the first tentative steps in the area by the Stanford Research Institute and then by Xerox PARC. The author is giving undue credit to Apple.
Robin Hood stole something tangible. Software and ideas are intangible. They are non-scarce resources and should be treated as such by free people. Intellectual property laws ignore this fact; that is why Apple or anyone else who sues for IP infringement is wrong, and why your whole article misses the point. It was an interesting read anyway.
Unfortunately the truth is a little more complex. Money is really just an intangible resource too, made scarce by legal rules rather than physics. Intellectual property is no different. The finite tangible resources that money represents are the goods you buy with it. For IP it’s the effort that went into generating the ideas.
Intellectual property laws don’t ignore the fact ideas are non-scarce, they create a world in which they are. Just as counterfeiting laws do for money.
Certainly theft, copyright infringement and patent infringement are all very different things and shouldn’t be confused, but perhaps saying that suing for IP infringement is wrong is a bit strong?
Concerning the article it’s a little moot anyway, since as many others have pointed out, Apple wasn’t responsible for generating the crucial ideas behind the desktop, nor were Microsoft or Windows solely responsible for making the PC dominant.
That Microsoft stole code from Apple is a very old story. You make it sound as though MS stole everything from Apple and that’s how they won. Microsoft stole things from more than just the Apple/Mac machine; you forget that others, like IBM, were also victims. Gates wasn’t so much a computer nerd as a business nerd. He knew how to put stuff together and market it to the largest segment of people he could.
It’s all about the business model. It’s like the Catholic Church in the days of Martin Luther. They believed the Bible should only be in Latin, read and interpreted by the clergy. Martin Luther translated the Bible into a language everyone could read, and the printing press helped spread it everywhere. Now almost everyone can get a Bible in any form and language they want. Gates didn’t invent the PC, just as Martin Luther didn’t “invent” the Bible. He just took something from someone else and made it available to a larger group of people.
Apple wanted better control over their products, just like now. They believe that if you control both the hardware and the software, and only have your certified professionals do the work, the end product will be better. There is nothing wrong with that model, but like every model it has weaknesses: not enough competition from hardware vendors, and not enough pressure to push innovation beyond what Apple applies itself. Would we have advanced as far in the tech industry without that “Robin Hood” effect that Gates set in motion?
http://xeroxstar.tripod.com/
I think “antithesis” does not mean what you think it means.
And the whole article is uninformed hogwash anyway. It has been dissected above by Apple lovers and haters alike, so I won’t bother myself, other than to say that I agree Apple does know how to take an existing idea, polish it brilliantly, and then show it to the world as its own. Taking a crap on the shoulders of giants?
I think Umad meant to say “apotheosis”, which means “the perfect example”.
I think the prefix anti- should have given him a clue…
The IBM PC was the natural upgrade path from CP/M for both user and developer.
That made the MS-DOS PC a viable commercial product even before the cloning of the IBM PC BIOS –
a product built on off-the-shelf hardware and an OS that sold for a $40 retail list price.
The IBM PC and PC clones begin to move in on the still-infant home and home-office market.
With Sierra’s King’s Quest, the light begins to dawn:
your 16-bit office workhorse is a viable platform for PC gaming —
and there is a bonus:
The modular design of the PC makes upgrading video and sound easy and affordable.
The Microsoft operating system performs well on hardware that is mid-range at the time of release and entry-level a year or two later.
Walmart.com stocks 245 Windows 7 laptops, with 64-bit Home Premium models starting at $300.
Top of the line at Walmart.com is an i7 HP “Silver” laptop with a 17″ screen, 8 GB RAM, 1.5 TB HDD, Radeon HD 6850M Graphics, and Blu-Ray for $1600.
The Mac was – at least in the beginning – notoriously resource intensive. It was a stylish machine that found a significant niche market, but nothing more than that.
Win 3 and Win 95 were transitional operating systems – with, let us say, a more populist focus – that preserved MS-DOS compatibility while introducing a generation of users to a graphical user interface at a price they could afford.
Well, you seem to be making a lot of assumptions here. Let me preface by saying I’m neither pro-MS nor pro-Apple, I’m an open source guy myself. Also, if I’m misinterpreting what you meant, then let me know.
A few things that need to be cleared up right away.
First, I believe that when you use the rich/poor analogy, you’re referring to the quality of the software. If that’s the case, then Apple was definitely not Prince John (the rich) until fairly recently. They had to replace a lot of their userland with BSD components to get where they are today. That’s not to say the early MacOS was bad, but I think the Prince John analogy is pushing it.
Now, let me address two points:
Can we really blame Apple for not wanting the past to repeat itself?
No, given the current legal regime, you can’t blame any company for trying to use all legal means at its disposal to hurt the competition.
Similarly, can anyone legitimately argue against them without casting themselves as a champion for Robin Hood’s methodology?
Oh yes, most definitely. Throughout your entire reasoning, there are the implicit assumptions that intellectual property is legitimate in the first place, and that violations of intellectual property are equivalent to theft.
First, although you may disagree, there are many (myself included) who challenge the legitimacy of intellectual property in any form. Near as I can tell, our arguments are at least as well-supported as those of intellectual property supporters.
Second, even if we take as given the legitimacy of intellectual property, that is not sufficient to compare intellectual property violations to theft. Why are IP violations not handled through the more basic laws criminalizing theft? Because IP is non-rivalrous.
Why can Apple, MS, or any other company credibly threaten another company or individual with an IP suit? After all, if the target had really pulled off a great heist with lots of cash, they should be rich enough to afford a legal defense, right? But of course that’s clearly not true. Many of the targets of IP suits don’t have the money to defend themselves, because nothing was ever STOLEN in the first place.
PC aficionados’ dislike of software patents – and thus of Apple’s position in leveraging them – was originally formed as a result of Microsoft’s initial theft. Why wouldn’t they argue that point, though? They are amongst the many beneficiaries of the theft.
In the same way, I’m sure Robin Hood’s beneficiaries (the have-nots) would have argued just as vehemently against the haves going after their stolen money and tightening security at the same time. I’m sure they would have argued just the same: “But look at all the good we’ve done with this stolen money!”
I’m not sure what you’re trying to say here, but it seems a lot like a false dichotomy. It’s completely possible to be 100% against patents, regardless of whatever company is wielding them.
Quoting this again:
Similarly, can anyone legitimately argue against them without casting themselves as a champion for Robin Hood’s methodology?
If you’re arguing (and I’m not saying you are) that to be opposed to Apple’s (or any company’s) use of IP in the courtroom must then be construed as support of theft, then that is a complete mischaracterization.
In a world without IP, if I rip someone off, I’ve taken nothing from him. He can still do everything he could do the day before. Conversely, in the world we live in, if an IP holder brings suit against me (whether or not I’m ripping him off), he has most definitely taken something from me. Money, at a minimum. Maybe I’m also now legally prohibited from contributing to an open source project. If any entity here could be remotely compared to a thief, it is the IP holder.