“Back in 2005, we charted 30 years of personal computer market share to show graphically how the industry had developed, who succeeded and when, and how some iconic names eventually faded away completely. With the rise of whole new classes of ‘personal computers’ – tablets and smartphones – it’s worth updating all the numbers once more. And when we do so, we see something surprising: the adoption rates for our beloved mobile devices absolutely blow away the last few decades of desktop computer growth. People are adopting new technology faster than ever before.” BeOS not mentioned. Would not read again. 2/10.
The first true “personal computer” was the Altair
No, it was the “Micral”, try again
http://en.wikipedia.org/wiki/Micral
“The front panel console was optional, offering customers the option of designing their own console to match a particular application”
*has flashes of cheap Nokia candybar phones with replaceable front*
So, the case mod community also began with Micral?
Hm, and what are iPhone cases if not basically just that?
Try again.
The Heathkit EC-1 was probably the first “personal” computer when it appeared in 1959 or 1960, in the Heathkit consumer catalog: http://www.old-computers.com/museum/computer.asp?st=1&c=787
Actually, there were other, less sophisticated computers sold to the public earlier, in the 1950s.
Okay, granted, analog computers predate digital ones!
At first, I was pissed by the juxtaposition of the Altair, which was for geeks and specialists, and the iPad, which was for the masses.
This history also completely misses a very important type of electronic device: the programmable calculator! (HP, Texas, Sharp, Casio…). For example, the HP-65 is contemporaneous with the Altair.
Which couldn’t be used for general computation… I think it’s more about roughly Turing-complete machines, and preferably microcomputers (based on a microprocessor, which made them affordable).
It’s not only about affordability, but also about what one could do with the machine and how far it is from our concept of a “personal computer”. If we count such analogue machines (in essence, not significantly different from a mechanical thermostat), we might as well count the slide rule, which was used in much the same way (and why not earlier computation aids, or even tables and abacuses?). Or those: http://en.wikipedia.org/wiki/List_of_home_computers#Cardboard_and_d…
Micral seems like a decent candidate… (or maybe the likes of the Datapoint 2200, or http://en.wikipedia.org/wiki/MIR_(computer) – I don’t know / don’t care if the MIR is the “first” of its kind, I simply had it in my recent browsing history)
The article also fails to plot the ZX Spectrum – with over 5 million units sold across its series (not counting clones), it should be numerous enough…
Interesting – I didn’t know about the Micral. Thanks!
Nice read, but I disagree with the conclusion.
I believe the iDevices will eventually follow in the footsteps of their bigger brothers as far as market share is concerned.
The article is confusing (not to mention that some of the diagrams are really hard to read, like the second one, which has 7 lines but only 5 captions…). For the first decades they look at the market shares of different hardware platforms, but then they suddenly switch to looking at software instead when they get to smartphones and tablets. It would be odd to have BeOS in the first half, but they could have listed the BeBox (although I don’t know if it ever sold in any significant numbers).
The BeOS thing was a joke. BeOS is totally irrelevant in the grand scheme of things.
They sold only a few thousand BeBoxen, IIRC.
So this article goes from talking about personal computers to… smartphones and tablets. Smartphones (as even the article says) are basically PDAs with phone functionality. Sure, they run ‘applications’, but you still wouldn’t use your smartphone for the majority of generic computing tasks (writing an essay, for example).
Sure, you COULD – I have LibreOffice running on my Nokia N9 – but I still wouldn’t put it in the same class as my Amiga 4000 or my Atari Mega STe. Or even my generic 8-core PC.
Overall, a rather retarded article.
Things are changing. If the iPad taught us anything, it’s that the majority of people do not want a general-purpose computer. They want a computer that works like a TV: turn it on, watch a couple of channels, and turn it off. I really think PCs as we know them today (desktops and laptops) will become a niche market, used only where they are still needed, like at any creative company. In the homes of most people we should find tablets, smart TVs, or other media devices.
I disagree with this view. Tablets and smartphones are just a new consumer medium and will not replace desktops or laptops at large, simply because each device is different. People buy tablets in addition to their desktop/laptop.
Do you know anyone with a tablet who doesn’t own a desktop and/or a laptop? Because from where I’m standing, most tablet owners have a desktop, a laptop, a smartphone AND a tablet, plus a desktop and/or laptop at work.
If anything, I think tablets and smartphones eat a bit into each other’s markets, as well as spelling the end of netbooks. Real laptops with screens more than 600 pixels high will continue to do well.
The reason people still use desktops & laptops is mainly the limits of input & output technologies (e.g. screens & keyboards).
With big companies hiring smart people and betting the farm on new / alternative input & output technologies (the tip of the iceberg being things like Siri & Project Glass), is it so inconceivable that one day people won’t need the big glass rectangles and collection of plastic squares with letters on them that are the main reason they stick with desktops & laptops?
Once the input & output options advance enough (e.g. perfect voice control & picoprojectors or HUDs), people won’t need to bother with desktops & laptops 95% of the time. And once that critical mass is reached, solutions will probably be found for that other 5% of the time too.
I think you’re right. For example, the use case for Ubuntu for Android is interesting. You put your phone on a docking station and you *have* your computer with you. But you still need the screen and keyboard. Only the “processing box” changes.
Yeah, that’d be REALLY fun to program on. “Type open parenthesis ay eee ess capital pee asterisk comma space you eye en tee thirty-two underscore tee asterisk” etc. That would be a nightmare.
Or even trying to write. Have you ever sat there and listened to your own voice for hours on end? It’d drive any sane person nuts!
My sister-in-law was completely happy with her MacBook, until she bought an iPad. Now she almost never touches the laptop and even does work-related tasks on the iPad. I, myself, have been avoiding my laptop when possible just at the thought of having to turn it on and wait for it to boot and whatnot.
You couldn’t write an essay on that Altair either.
One thing about comparing early computers to smartphones and tablets is their relative costs.
If you adjust for inflation, the newcomers are very cheap as a percentage of a person’s income.
My 8K PET 2001 cost over $1200 when I got it – 1980?
My Amiga 1000 with memory expansion (1 MB total) was about $2000, I believe, in 1985.
My Dual-CPU BeOS machine (a PC-Clone) set me back $1200 when BeOS came out for Intel.
What will a smartphone or tablet set me back today? $400-700 in inflated dollars. Of course a lot more people buy them!
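To make that comparison concrete, here’s a minimal back-of-the-envelope sketch in Python. The CPI multipliers and the 1998 date for BeOS-on-Intel are my own rough assumptions, not figures from the article, so treat the output as ballpark only.

```python
# Rough sketch: convert the quoted purchase prices to ~2012 dollars.
# The CPI multipliers below are approximate assumptions, not exact figures.
cpi_to_2012 = {1980: 2.8, 1985: 2.1, 1998: 1.4}

purchases = [
    ("PET 2001 (8K)",            1200, 1980),
    ("Amiga 1000 + 1 MB",        2000, 1985),
    ("Dual-CPU BeOS PC clone",   1200, 1998),  # assuming the BeOS-for-Intel era, ~1998
]

for name, price, year in purchases:
    adjusted = price * cpi_to_2012[year]
    print(f"{name}: ${price} in {year} is roughly ${adjusted:,.0f} in 2012 dollars")

# A $400-700 smartphone or tablet today is already in 2012 dollars,
# so every machine above cost several times as much in real terms.
```

Even with these rough factors, the old machines land in the $1,700-$4,200 range in today’s money, which is exactly why so many more people can afford a phone or tablet now.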
OSNews being about operating systems, which are the core of any personal computer, it would be great if OSNews did a comprehensive historical review from the perspective of the different OSes that have played an important role in the history of personal computers.
I believe the future of desktop/laptop computing lies in display devices. Increasing power and miniaturisation will make them easy to integrate. Traditional flat panels, video walls, projection systems, and holographic displays, together with wireless technologies, will ensure the home PC survives (perhaps the size of a matchbox or smaller).
Personally, I’m bringing my household back to the days of centralised computing. I have a basement server that runs the lounge TV (DVB-T, media files, apps, etc.) and serves VNC/RDP to tablets and other devices with input (keyboard/mouse/touch). They are all terminals to me, all running the same OS and apps.
There probably should have been some mention of Acorn; they sold well in the UK and dominated in education for a while. They also spun off ARM, which has become globally significant.