“‘What Intel giveth, Microsoft taketh away’. Such has been the conventional wisdom surrounding the Windows/Intel duopoly since the early days of Windows 95. In practical terms, it means that performance advancements on the hardware side are quickly consumed by the ever-increasing complexity of the Windows/Office code base. Case in point: Microsoft Office 2007 which, when deployed on Windows Vista, consumes over 12x as much memory and nearly 3x as much processing power as the version that graced PCs just 7 short years ago (Office 2000). But despite years of real-world experience with both sides of the duopoly, few organizations have taken the time to directly quantify what my colleagues and I at Intel used to call ‘The Great Moore’s Law Compensator’. In fact, the hard numbers below represent what is perhaps the first ever attempt to accurately measure the evolution of the Windows/Office platform in terms of real-world hardware system requirements and resource consumption.”
Without being anti-Microsoft, I just wonder what the payoff is. Sure, it's nice to have pretty eye candy, but how many of us could get by with Win98 or Win2k? Win95 had an 8 meg memory requirement, for crying out loud. Now it's not uncommon to see machines with 8 gigs of memory. Yet there isn't much I couldn't do on Win2k that I can do on XP or Vista. So what's the point?
Some of us still *do* get by on Windows 95 (in my case Win95 OSR2). It works just fine, and the two boxes I have that have more robust requirements are running Win2k and are perfectly happy.
Most folks upgrade for three reasons:
(1) They bought a machine, and it came with an upgrade.
(2) They needed to upgrade to a newer OS in order to keep support costs reasonable (some folks charge considerably more to support older OSes).
(3) They had a legitimate requirement for a newer API or feature not found in older versions of the OS.
Win95, huh? Wow. I hope that little spam-bot isn't sitting on the network… I'm tired of getting loads of crap in my inbox from the machines of people who claim Win95 is "fine for what I need it to do".
The good part, I guess, is that no one bothers anymore to write worms that can attack win 9x
I wonder if there are still enough machines for there to be a self-sustaining malware ecosystem, assuming more recent malware is incompatible with 9x. I still hear of people using Win 98 every so often…
According to stats from my sites, Win 98 users are about 1%, Win 95 users are almost non-existent … so not very much in terms of market share.
Now it's not uncommon to see machines with 8 gigs of memory. Yet there isn't much I couldn't do on Win2k that I can do on XP or Vista. So what's the point?
Well, if you took something like Win2K and added the latest media support, IE 7, and a shiny new theme, then you would have something very useful, but probably not something Microsoft could sell, since it would amount to a service pack. You have to make it really big and new to sell it!
Yet there isn’t much I couldn’t do on Win2k that I can do on XP or Vista.
That's the balance getting "compensated", as the author puts it.
This closing summary is the first place that the author hints at a central issue–the profitable symbiotic relationship between Microsoft and Intel.
Here is my summary of his explanation: Intel produces more powerful processors, which not many people really need. Microsoft then produces slower software to use up the speed of the older processors, thus creating demand for new hardware and sufficient IT staff to set it all up.
This may sound sarcastic but it’s really not. That symbiosis has produced a lot of jobs and greatly benefited the world economy. And a pleasant side effect is that faster, inexpensive machines are available to those who really need the power (and can run a faster OS).
Excel and Word sit under 8 MB on my system, PowerPoint at just over 10 MB, and they start nearly instantly rather than the 5-10 seconds you waited for Office 2000.
I'm sure Office 2000 didn't use much less RAM than that, if any less at all.
The only difference is Vista, which they are comparing Windows 2000 to; they are basically two different platforms at this point.
They say 7 short years, but 7 years in the computer world is over a quarter of the time elapsed since the first IBM PC was launched back in 1981.
Excel and Word sit under 8 MB on my system
I just launched Word 2007 and in task manager it is showing WINWORD.EXE at about 65 MB. And if I just go through menus (ribbon), without doing anything, it jumps to ~75 MB. This is on Windows XP.
Not that I really care (Firefox is at 75MB now with 3 tabs opened), but you said — 8 MB?? Are you talking about Word 2007?
(Word 2003 is at 16MB on another Windows XP.)
I have word 2007 on Vista here, and WINWORD.EXE idles at 8,192k, a little over 8MB.
I don’t see how you’re coming up with such low figures. I’m tempted to call shenanigans.
On my Vista Home Basic, Word 2007 idles at 46,308.
On my Win XP, Word 2003 idles at 23,120.
Not sure how you’re saying Word 2007 is only using 8 MB, when I’m seeing that it’s requiring nearly 5.5 times that much memory.
Guys, please, measure your stuff right.
Please grab Process Explorer from Sysinternals (now a MS subsidiary) and look at whole memory consumption, that is, working set plus virtual memory size.
As memory is purged to the VM backing store, you'll naturally see artificially low working sets; then, as soon as you use the suite for real, not only does it consume more memory, but you also get a visible slowdown (which is why you see the working set jump as soon as you actually do things to your documents, instead of just looking at the memory patterns of the idling suite). The increase in memory usage has obvious causes; the slowdown comes from going to disk to reclaim purged memory pages.
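For anyone who wants to sanity-check their numbers beyond Task Manager's default column, here's a rough Python sketch using the third-party psutil package (my choice, not something mentioned above; Process Explorer shows the same data). WINWORD.EXE is just the example from this thread, and the exact meaning of the two figures varies a little by platform.

# Rough sketch: compare working-set vs. virtual/committed size for a process,
# using the third-party psutil library (pip install psutil).
import psutil

TARGET = "WINWORD.EXE"  # the example process discussed in this thread

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").upper() == TARGET:
        mem = proc.memory_info()
        # rss ~ working set (what Task Manager shows by default),
        # vms ~ virtual/committed size (closer to Process Explorer's
        # "Private Bytes" / "Virtual Size", depending on platform).
        print(f"{TARGET}: working set = {mem.rss / 2**20:.1f} MB, "
              f"virtual size = {mem.vms / 2**20:.1f} MB")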
I pretty much chalk Firefox's excessive memory usage up to its exceptionally bad memory management (and I mean it: look at Opera's memory usage; I really hope the FF guys will implement an efficient garbage collector, for their own good, before everything collapses into FF's own gravitational pull), but in Office's case it's hard not to call it bloat, plain and simple.
I’m really starting to wonder whether Microsoft is bleeding enormous amounts of talent off to miscellaneous startups or other larger (perhaps even rival) companies. Their code quality is getting *bad*, real bad, really quickly. Excuse me for the foul language, but SuperFetch and ReadyBoost MY ASS.
Well, I see nothing wrong with, say, SuperFetch, since it only caches (in other words, mirrors) data that already exists on the hard drive. And since it mirrors existing data, this means that in the event I need the memory it can be instantly flushed, as opposed to, say, a buffer, which is an intermediate storage point.
That doesn't change the fact, however, that Vista uses a lot more RAM than XP even when it's not SuperFetching. Now, whether or not you like this behaviour pretty much depends on how you use your computer. A lot of people out there (I'd wager the vast majority) have a lot more resources than their general computer use will ever utilize, and for them it's like "hey, why not let the operating system use it for whatever rather than it sitting unused". Then we have the other side, users who actually use pretty much all of their computer's resources on a regular basis, and they likely want the OS to use as few resources as possible in order to give as much as possible to their applications.
I fall into the latter category: I do a lot of 3D modeling/rendering, compression and encoding. I've never encountered a situation where I feel I have too much RAM; if I still have free RAM I can increase my render size, add another subdivision to my model, etc.
For my particular usage pattern, Vista is not an attractive upgrade path, as it will allow me to do less on my computer than I can currently do with XP. However, as stated above, I have no doubt that most users never come close to utilizing the full amount of resources their computer has, and for them Vista is likely just a better-looking Windows.
I see you and others attacking Office 2k7 for being bloated. Now, I don't have any personal experience with OO.o, but I've read many times on message boards that it is more bloated and slower than Office. And as for Windows being bloated, I do have experience on Windows and Mac, and in my own experience (others may differ) Mac Office, Mac Photoshop, Mac Adobe Acrobat/Reader, and Mac DrScheme are slower than their Windows counterparts (though Mac iTunes and QuickTime are faster than their Windows counterparts; I assume Apple has more expertise programming Mac apps).
So, if Windows OO.o is slower than Windows MSO, then that would demonstrate that Microsoft is not more "taketh away" regarding app performance than anyone else. And if Mac Photoshop, Office, Adobe Acrobat/Reader, and DrScheme are slower than their Windows counterparts, then that would show that Microsoft is not more "taketh away" than others in the OS space.
It is simply wrong to say that Microsoft is solely responsible for "taketh away". Every software developer is doing this; it's the nature of the biz.
Nobody can really defend OO.o. Even Linux folks grudgingly accept it as the primary office option for now, outside of Wine. That still doesn’t excuse new bloat in MS Office.
No, but it does show that Microsoft isn’t the only company whose programs’ requirements have bloated up over the years for some very negligible gain.
It is simply wrong to say that Microsoft is solely responsible for "taketh away". Every software developer is doing this; it's the nature of the biz.
Tell me about it. I can’t even think about installing the latest Adobe Premiere on my system since it recommends 1.5GB RAM!!!!
But by your own admission Mac iTunes runs quicker than Windows iTunes. So perhaps the performance difference between Windows and Mac with regards to Microsoft or Apple software is more down to the respective companies either:
a/ not being as skilled at writing apps for rival platforms as their rivals,
b/ not wanting to program their applications on rival platforms as efficiently in order to gain a market advantage,
c/ not wanting to invest the same level of resources in programming for rival platforms,
d/ or a bit of all 3.
Personally I think a more accurate visual benchmark would be start-up times, network communication speeds, and how quickly you can navigate a file system and copy/paste files. Basically just general OS functionality.
This much is true. But what does it prove? That MS’s 20+ years of expertise on Office suites gives them some (any!) edge over OO.o’s? Well, this is just plain expected.
Not surprising.
I’m assuming you tried the CS3 version here (which is kind of buggy BTW), on Intel, on Tiger 10.4.11. Given that, I can’t relate to your experience.
Now you should blame nobody but Adobe here. Since Mac OS X 10.0 was out Adobe Acrobat stuff has been excruciatingly slow on Macs, and given how snappy Preview.app is, and how every time a new feature is added to Preview.app (like annotations) a lot of people praise the Lord for weakening the dependence on Acrobat, I see I’m not the only one dissatisfied on this regard…
I can’t really comment on that, but perhaps you should email the PLT guys and tell them how, as an active customer on the Mac platform, you’d really appreciate some extra love, care and nurturing on the Mac port. You know? Just so that support for Macs doesn’t become just another bullet point.
Or…
Or…
Or maybe, just maybe, lots of slowdowns are down to the API level.
Just try to benchmark loops that call sequences of very standard Win32 API calls under Win95, Win98, Win2k, WinXP, XP SP1, XP SP2, Win2k3 and Vista. Except for Win2k3, I bet you'll see the little benchmark get slower on each following OS revision.
So yeah, The Great Moore’s Law Compensator is real, and it’s FTL.
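For what it's worth, a rough sketch of that benchmark could look like the Python/ctypes loop below. This is an assumption on my part (the suggestion above presumably meant native code, and here the interpreter overhead dominates), so treat it strictly as a relative comparison run on each OS version; the two API calls picked are arbitrary examples.

# Windows-only micro-benchmark sketch: time a tight loop of ordinary
# Win32 API calls and compare the result across OS versions.
import ctypes
import time

user32 = ctypes.windll.user32
kernel32 = ctypes.windll.kernel32

N = 1_000_000
start = time.perf_counter()
for _ in range(N):
    user32.GetSystemMetrics(0)        # SM_CXSCREEN: screen width in pixels
    kernel32.GetCurrentProcessId()    # trivial kernel32 call
elapsed = time.perf_counter() - start

print(f"{2 * N} API calls in {elapsed:.3f} s "
      f"({elapsed / (2 * N) * 1e9:.0f} ns per call)")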
To add to the anecdotal information, mine on Vista Business idles away at about 10M, goes up to about 14M in use and had a peak of 54M (which involved some images)
To add to the anecdotal information, mine on Vista Business idles away at about 10M, goes up to about 14M in use and had a peak of 54M (which involved some images)
Confirmed here (Vista Ultimate + Word 2007; after I installed updates from Microsoft Update – updates for both Vista and Office 2007).
Strange: on WinXP at work, just opening Word 2003 with a document consumes 39 MB of RAM (interestingly, Live Messenger is also using 39 MB).
On Vista, opening Word 2007 with a document open consumes 48 MB.
In Vista any process requires much more memory because it virtualizes video memory: it doubles the video memory for each process. It takes a lot of RAM and CPU to keep the virtual video memory synchronized with the real thing.
Yes, Word 2007. And if I start manipulating documents it jumps up to between 17MB-20MB
Yes, Word 2007. And if I start manipulating documents it jumps up to between 17MB-20MB
Actually, you may be right (not sure why people immediately started modding you down? I just gave you +1).
On my Vista, WINWORD.EXE starts at 17 MB. Strange, on XP it starts with ~ 65MB.
Perhaps, Vista is preloading some other stuff elsewhere in memory, while XP needs it through WINWORD.EXE. Don’t know.
If it requires 65MB on XP, then that’s the correct value. There are no magical memory fairies in Vista that make Office suddenly take 50 MB less for no reason. That memory is still being used, it’s just hidden in another process (not surprising, given that Vista uses 500MB of RAM on my machine just sitting there after boot).
Yes, Word 2007. And if I start manipulating documents it jumps up to between 17MB-20MB
And I have to say that I stand corrected.
I just installed all updates from Microsoft Update, and now when I fire up Word 2007 (on Vista) it is showing ~10MB.
Please guys, stop modding other people down, without checking the stuff in more details.
Yep, same here. Got a 2007 Word document I’m working on right now and it’s hovering around 26400K to a max of 27828K.
That's because part of Office now loads into memory when you start up the computer. Note that it's there when you run games too, hogging memory. I'd rather the apps took a few extra seconds to load.
Everyone always makes that excuse with MS products, but please point out to me where it is running, under what process?
I don't know about 2K7 (haven't really gotten a chance to play with it yet), but in Office 2K it's a hidden process called OSA9.exe, and in 2K3, IIRC, it is OSA11.exe. If you download a little freeware utility called "Startup Control Panel" you'll find the link. Or you can use MSConfig (does that still work in Vista?)
They say 7 short years, but 7 years in the computer world is over a quarter of the time elapsed since the first IBM PC was launched back in 1981.
Part of the point is that even though 7 years have passed, the perceived functionality or speed of software has severely lagged behind the increase in processing power during the same period. It may or may not be any better, but it's certainly not making very good use of the horsepower of today's processors.
Case in point:
http://hubpages.com/hub/_86_Mac_Plus_Vs_07_AMD_DualCore_You_Wont_Be…
It’s not an entirely fair comparison, but you’d still expect to see much better numbers from a recent computer.
But Vista and Office 2007 are doing 10x as much as Windows 2000 and Office 2000.
Pretty UIs and a lot of features we currently take for granted do not come cheaply.
I have Office 2000 and it starts instantly, because I have a modern computer.
I had a Pentium 166 with 192MB of RAM – ran Windows 2000 and Office 97 just fine.
For non-CPU bound tasks, it was faster and more responsive than my current Core 2 Duo Vista desktop with 4GB of RAM.
I know all that RAM and CPU is being used for something, but whatever that "something" is, its utility is lost on me when I am editing a Word document.
I’ve put together a PC using a PIII at 500MHz and 512 megs of RAM, with Windows XP, OpenOffice and Opera. Runs without a hitch. I’d have installed Ubuntu on it, except for the fact the owner needed some legacy software that won’t work in Wine.
Yeah, underpowered machines will work fine with Windows for a bit. They used to ship XP on machines with 128 MB to begin with, and after a fresh install they would run fine. But leave it a couple of months for "Windows Rot" to set in and you will see a different story…
Not to mention things like SP2, which really bogs down the system after it's installed due to the stuff they added and rewrote.
If even the lofty Core 2 Duo w/ 4 GB of RAM isn’t good enough, the entire software industry should hang its head in shame. It is just not reasonable to need that much. (Heck, I still think 1 GB is too much for most anything.)
Yes, apps (even the OS itself) can always be greatly optimized, but it’s a lost art. It’s just not the #1 priority to the developers (except uber geeks).
“If even the lofty Core 2 Duo w/ 4 GB of RAM isn’t good enough, the entire software industry should hang its head in shame. It is just not reasonable to need that much. (Heck, I still think 1 GB is too much for most anything.) ”
As another poster pointed out – that power is put to good use, just not all the time. For example, that P166 couldn’t do video encoding/decoding for crap. It could barely handle MP3s, and forget about 3D games. Stitching together the 100MB panoramic images I am fond of creating would have taken weeks, instead of dozens of minutes. Editing RAW images from my camera would have been painfully slow, if not impossible. And forget about anything like Adobe Lightroom.
My point was that for office apps, and things that aren’t CPU intensive, all that power appears to be going to waste. There are applications today that do take advantage of all this power – but depending on your needs, you might use them very infrequently, or not at all.
Sometimes I think it would be nice to make a PC with modern hardware, but install an old version of Linux or windows on it, with the applications of the era – think Windows 3.11, or possibly Windows NT 3.51. You could run the thing on 4GB of flash for persistent storage, and CPU cache would probably be all the memory you’d need – but what the hell, put 256MB of DDR in there just for kicks.
Such a computer would be blazingly fast for most office and productivity apps and would have more memory than you could ever think of using.
” For example, that P166 couldn’t do video encoding/decoding for crap. It could barely handle MP3s, and forget about 3D games.”
I beg to differ.
My P166 MMX laptop uses about 5-10% CPU playing back mp3s with Mplayer.
It can also run MDK:
http://en.wikipedia.org/wiki/MDK_(video_game)
at a decent framerate, while doing the 3d rendering entirely in software.
Software is not highly optimised nowadays because programmers no longer need to do so. This is a good thing as it makes code faster to write and more maintainable. It also encourages reusable code rather than tightly written special purpose code.
MULTICS is an example of this. The later *nix clones were simpler and more reliable as they no longer needed all the optimisation and memory sharing tricks required to run a multiuser OS in only 32kB of ram.
Yeah, and I played Wolfenstein on my 286. Certainly it was rather highly optimized code, but it was also doing a hell of a lot less, with much lower visual detail. You simply couldn’t play modern games, at an acceptable framerate on my old P166.
As for sound, there are certainly modern bitrates and codecs that would tax the P166, if not pegging the CPU, making it much more likely to skip when doing something else. And just forget about any modern video codec.
My point is that there are tasks that actually utilize much of the power available in modern systems – those tasks don’t happen to involve editing office documents.
Agreed. I also ran Battlezone (http://en.wikipedia.org/wiki/Battlezone_%28computer_game%29) on an AMD K5 at 100 MHz with 16 MB of RAM – it ran fine even in software rendering mode (my video card did not support 3D acceleration at that time).
I think a P166 would be plenty good enough for Quake. (But I barely ever bothered running it, seemed fine though.) And I guess .MP3 wasn’t the best example (runs easily), but you may (?) have a point. Just saying, you don’t need all these newer computers with tons of RAM just for multimedia and games. Heck, look at ye olde XBox 1 (PIII 733 Mhz) or even the (old but cool) Atari Lynx. Quite good without all the extra fluff!
It's the same with every OS out there. Think about running Ubuntu on a Pentium II/III. How come I was able to run Debian/Motif then and get the same work done, but Ubuntu sucks up so many resources on my Core2Duo?
Agreed: it is not unique to Windows.
But at least you have the option of stripping out most of the crud with Ubuntu. Use a window manager instead of a desktop manager. Disable daemons that you have no particular use for. And so forth.
I’ve tried doing the same with Windows, but inevitably end up with a crippled machine that doesn’t use much less in the way of resources anyway.
(Then again, the nest of dependencies with Ubuntu is making it darned near impossible to remove otherwise useless components without breaking something.)
“Agreed: it is not unique to Windows.”
Sure, I can agree here, too. But I’d like to mention the following relationship:
First, let me define the value overall performance = machine speed / software requirements. Here, "machine speed" refers not only to the CPU frequency, but to the amount of installed RAM and available hard disk space, too. Then, "software requirements" refers to the resources demanded by whatever you do with software.
In conclusion you can say – in most cases – that the quotient has stayed nearly the same since the PC saw the light of day. The machine gets better – software uses more resources – you get a better machine – software takes "advantage" and requires more resources – you get an even better machine … et cetera ad infinitum.
The overall performance, the very individual feeling you have when you're using your software (on top of your hardware), stays almost the same.
For example, Geoworks Ensemble 3.0 runs about as fast on a 386 as, maybe, "Office 2007" runs on "Vista", and people did the same things with that setup as they do today.
I think you get the idea.
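If it helps, here's a toy calculation of that quotient. Every number in it is invented, purely to illustrate that the ratio can stay flat while both sides grow.

# Toy illustration of "overall performance = machine speed / software
# requirements". All figures are made up; only the constant ratio matters.
eras = [
    ("386 + Geoworks",            1,   1),    # relative speed, relative requirements
    ("P166 + Win95/Office 97",   10,  10),
    ("Core 2 + Vista/Office 07", 100, 100),
]
for name, machine_speed, software_requirements in eras:
    print(f"{name:28s} overall performance = "
          f"{machine_speed / software_requirements:.1f}")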
“But at least you have the option of stripping out most of the crud with Ubuntu. Use a window manager instead of a desktop manager. Disable daemons that you have no particular use for. And so forth.”
This is good advice, but I think most users don't change such deep system settings because they simply don't know about them. Furthermore, they usually claim they need this and that (e. g. a full-featured desktop like KDE), so they won't even try a fast (and still comfortable) window manager like XFCE or WindowMaker.
But if you look at just the base system, the quotient introduced above increases (numerator++, denominator--, != 0). I'd like to mention FreeBSD as an example: if you run a well-configured FreeBSD system on machines of increasing power, the overall performance will increase, too. I tried the same installation (i. e. the same hard disk) on an Intel P1 / 150 MHz / 64 MB, an AMD K6-2 / 500 MHz / 128 MB and a Celeron / 2 GHz / 768 MB – you can feel the difference. Another tendency is the base OS getting faster with each release. While 4.x didn't run that fast on a particular machine, 5.x and 6.x ran faster, so I hope 7.x will too – on the same (!) hardware.
In the "Windows" world, this tendency is not present. Each new release of a MICROS~1 OS runs slower than the one before it – on the same hardware. So you usually need to compensate for the bigger hardware requirements with better hardware in order to have the same overall performance (requirements++ imply hardware++ here).
1) A lot of people exaggerate over what can actually be done on their computer, it reminds me of the mythical 486 with 8 megs of memory running a full blown GUI – then the claims that ‘Windows is teh bloated’.
2) Its cool to bash Windows, its what all the cool kids do; if you’re not bashing Windows, bashing Apple or bashing Microsoft – you’re apparently a ‘fanboy’ in a lot of people’s eyes on this forum.
3) Some people will cut Linux (and other free *NIX’s) slack by virtue that it is free – for me, even if it is free, it still doesn’t make it acceptable to create unnecessarily large applications which consume large amounts of memory.
4) Some are blinded by their hate; someone raised the issue of Firefox; it has a plethora of memory leaks, but there seems to be this dualistic notion that if you criticise Firefox, it apparently puts you in the Microsoft camp.
Have you used linux at all? Memory consumption on a typical desktop Linux install is still far lower than anything on windows. By a large margin and with far more apps and processes running in the background. This has a lot to do with the nature of the way applications are linked in linux.
1). I thought Windows 1.0-3.0 ran with that configuration, if not less. Am I wrong? You may not be able to run most of the stuff in Linux on that configuration now, but there have been instances, including embedded devices, where Linux runs with a full-blown GUI, with RAM to spare.
2). This wasn't directed at me but I feel I must reply. The point of the article is that instead of Windows/Office getting faster with each new hardware upgrade, it instead gets slower since MS becomes more lax with their resource management. Are you saying that is not the case? Because then I think the real fanboy has appeared.
3). Due to the nature of the libs in Linux, most apps share libs and thus use less RAM than the equivalent in Windows, where most apps are self-contained, statically-linked apps (even if they are dynamically linked, they usually use their own version of the libs, which are not shared throughout the system). You can check the shared-vs-private split yourself; see the sketch after this list. I have yet to see a Linux application that takes more RAM or more resources than a Windows app. If that were even the case, chances are the application would get faster, instead of slower, throughout its development cycle. You can't say the same for Office. No, Firefox and OO.o are NOT Linux apps, they are OSS apps; there is a difference. They can run on multiple platforms. So those apps may be bloated and slow, but that has very little to do with the Linux community.
4). No, everyone knows that Firefox is a dog. There are much better alternatives, including Opera and WebKit. The issue with Firefox is documented and is seen regardless of what platform you use. I personally use Firefox because I happen to like it. I think IE7 looks garish, but I definitely don't have any issue with it just because it came from MS.
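On Linux, one quick (and admittedly rough) way to check that shared-vs-private split is to parse /proc/&lt;pid&gt;/smaps; the sketch below just sums the shared and private figures for whatever PID you pass it.

# Linux-only sketch: how much of a process's resident memory is shared
# (e.g. mapped libraries) vs. private, from /proc/<pid>/smaps.
import sys

def shared_vs_private(pid):
    shared = private = 0
    with open(f"/proc/{pid}/smaps") as f:
        for line in f:
            field, _, rest = line.partition(":")
            if field in ("Shared_Clean", "Shared_Dirty"):
                shared += int(rest.split()[0])    # values are in kB
            elif field in ("Private_Clean", "Private_Dirty"):
                private += int(rest.split()[0])
    return shared, private

if __name__ == "__main__":
    shared_kb, private_kb = shared_vs_private(int(sys.argv[1]))
    print(f"shared: {shared_kb} kB, private: {private_kb} kB")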
Yes, I have used Linux, and still do. It is a nice theory but it isn't true.
I wish it was that simple, but it isn't. I guess you aren't a programmer. Even a beginner knows that code does not eat much memory space. The runtime footprint of code is very small, so saving on something small doesn't lead to massive gains.
For example, all the code of "bloated" Firefox is only 10-11 MB in size. So by your reasoning, even in the doomsday scenario of a completely statically-linked-in-memory Firefox, it should eat no more than ca. 10 MB of space. If Firefox's memory consumption were this small, nobody would dare to call it "bloated".
Anyway, the point is: it doesn't work that way. Compared to the size of memory on a modern computer, purely dynamic linking saves very little. I would say no more than 20-30 MB for a fully loaded desktop environment.
The real memory eaters are data structures. And their size has almost nothing to do with the underlying OS or hardware. The same 100 kB JPEG will be decompressed to a 1 MB in-memory bitmap on any system you choose: Windows, Linux, PlayStation, Game Boy Advance or a cell phone.
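To put rough numbers on that: the decompressed size depends only on the pixel dimensions and format, so a back-of-the-envelope like this (the 800x600 figure is just an assumed example) gets you to roughly 1 MB regardless of platform.

# The in-memory size of a decoded image is set by its dimensions and pixel
# format, not by the OS or how the code was linked.
width, height = 800, 600
bytes_per_pixel = 3                       # 24-bit RGB, no alpha

compressed_on_disk = 100 * 1024           # a typical ~100 kB JPEG
decompressed_in_ram = width * height * bytes_per_pixel

print(f"on disk:   {compressed_on_disk / 1024:.0f} kB")
print(f"in memory: {decompressed_in_ram / 1024 / 1024:.2f} MB")   # ~1.37 MB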
And sharing lots of libs between unrelated programs is so '90s. When main memory was 4 or 32 MB it was a good optimization. But these days it brings nothing but problems like dependencies, troublesome distributions, hard (de)installations, etc. There isn't any compelling reason why, in the year 2007, OO.o should share the same libs with unrelated software like Konqueror, K3B, Blender and Gimp (I don't know if these apps really share the same libs, it is just an example).
And truth be told, Windows is, to some extent, capable of sharing runtime code. MS chose the "don't do that, it is evil" path. They had good reasons to justify that decision.
An even better example would be Java. When it comes to the memory footprint of code, nothing beats Java. And I mean that in a positive, very-small-code way. Java's runtime code is extremely small; it is not called bytecode without reason. And yet Java desktop programs are heavyweight memory consumers. Sharing runtime code != small memory consumption.
> Real memory eaters are data structures.
Correct. That’s part of the reason why MS Office is so piggy with memory, their OLE document structure has heaps of ‘wasted’ space within it since they seem to use some kind of block structure and don’t care about padding at the end of the block (which is one way people can fish info out of sensitive documents written in Word).
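As a rough illustration of that slack-space idea (simplified on purpose; the real compound file format also keeps small streams in a mini-stream), streams in the legacy OLE container are padded out to whole fixed-size sectors, commonly 512 bytes, so every stream can waste up to one sector's worth of space at its end.

# Toy calculation of per-stream slack in a sector-based container.
SECTOR = 512   # assumed sector size for illustration

def slack_bytes(stream_length):
    remainder = stream_length % SECTOR
    return 0 if remainder == 0 else SECTOR - remainder

for length in (1_000, 10_000, 123_456):
    print(f"stream of {length} bytes wastes {slack_bytes(length)} bytes "
          f"in its last sector")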
Java programs are huge, since the Virtual Machine is essentially implementing an entire operating system over your other O/S. Work is underway to ensure that the VM overhead is amortized over all your programs (instead of the same overhead cost for each program). Just as well Java is now very fast (provided the programmer is smart enough to do a bit of profiled optimisation to adjust their source), otherwise Java would have been a goner (instead, it’s now the most used language in the corporate environment, refer to tiobe.com).
I agree, MS is keeping the software world in the stone age.
The hardware today behaves as slow as 5 years ago because the OS eats all the performance gained.
So yes, I hope the MS monopoly dies, because it is keeping us all behind the technology.
"I agree, MS is keeping the software world in the stone age. The hardware today behaves as slow as 5 years ago because the OS eats all the performance gained. So yes, I hope the MS monopoly dies, because it is keeping us all behind the technology."
Let me get this straight:
Microsoft is keeping the software world in the stone age by actually making use of the increased hardware power by adding software features? The corollary being that if Microsoft had just stuck with Win95, which would run at lightspeed today, then that would be the opposite of keeping the software world in the stone age, right? So stagnating an OS advances the software world but adding features to an OS keeps it in the stone age. Okey-Dokey.
Even though I’m responding to your post, I’m actually responding to all posters that are agreeing with you, such as TechGeek, who asked “Where’s the payoff [for more powerful hardware if the software eats up all of that power]?” (I guess the extra power should just lie idle?)
I’ll start with business productivity software.
Word processors are as fast as they need to be. As long as the cursor keeps up with my typing, that’s fine with me, and I’ve never had any Windows version of any word processor have trouble with that. (I can’t say the same for my Mac, however, where merely typing in Safari forms frequently spins beachballs or I have to wait whole seconds for the cursor to catch up with my typing.)
Wordprocessors and spreadsheets run at the same “speed” as yesterday’s wordprocessors and spreadsheets on yesterday’s hardware, but they do way more stuff. For example, today’s office apps can save files in zipfile’ed XML rather than binary, and still be able to load those XML documents reasonably quickly. This allows for open file formats to be more feasible and even allows a human to read through the XML if necessary. So the hardware increases in power, the software takes advantage of the power to store data in zipped XML rather than binary while retaining the same speed when loading the zipped XML document. And that’s just one example. Productivity apps do way more than yesterday’s apps do, and they do them fast enough for the user.
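To make the zipped-XML point concrete: an Office Open XML .docx file really is just a ZIP archive with XML parts inside, so a few lines of Python can crack one open (the filename below is a placeholder).

# List the parts of a .docx and peek at the human-readable XML body.
import zipfile

with zipfile.ZipFile("example.docx") as docx:
    for info in docx.infolist():
        ratio = info.compress_size / max(info.file_size, 1)
        print(f"{info.filename:40s} {info.file_size:8d} -> "
              f"{info.compress_size:8d} bytes ({ratio:.0%})")
    # The main body text lives in word/document.xml as plain XML.
    body = docx.read("word/document.xml")
    print(body[:200])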
And there are cases where productivity apps take explicit advantage of today’s hardware. For example, Excel 2k7’s recalc engine was redesigned to take advantage of multi-processor and multi-core systems, which are common place today. When running on a multicore system, Excel 2k7’s recalc engine will run in parallel if possible.
http://blogs.msdn.com/excel/archive/2005/11/03/488822.aspx
As for things where raw speed does matter (such as spreadsheet recalcs), the speed is there. An OS has almost no impact on a program doing some scientific calculation, since the program makes few OS calls during the calculation. For example, code that calculates pi to the millionth place is CPU-bound, not OS-bound. Such code runs faster on more powerful hardware regardless of the OS. Or, for a real-world example taken from my own experience as a programmer (formerly professional, now hobbyist), source code compilation speed is much greater today than before. A project I worked on in 1997 took two hours to build. A few years later, it only took 10 minutes, despite being a much larger code base. Today, it would build in about 2 minutes or less.
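A rough sketch of the parallel-recalc idea (not Excel's actual engine, just an analogy): a batch of independent, CPU-bound "cells" spread across however many cores the machine has, with essentially no OS calls in the hot loop.

# Independent CPU-bound work parallelized across cores.
from multiprocessing import Pool

def recalc_cell(seed):
    # Stand-in for an expensive, purely CPU-bound formula.
    total = 0.0
    for i in range(1, 200_000):
        total += (seed * i) % 7 / i
    return total

if __name__ == "__main__":
    cells = list(range(64))          # 64 independent "cells"
    with Pool() as pool:             # one worker per core by default
        results = pool.map(recalc_cell, cells)
    print(f"recalculated {len(results)} cells")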
Multimedia is a big payoff. No way could I record TV on a 1997 computer. No way could I watch hi-def video on a 1997 computer. Even low-resolution 320×240 mpeg1 was a struggle. Now we can watch HD video on computers, and that’s because of both hardware and software (hardware providing the raw speed, software providing more efficient codecs). Similarly for audio, where back in the day, audio would sputter and stutter while playing. Today, Vista’s and Leopard’s audio systems blow away everything preceding them.
http://www.avsforum.com/avs-vb/showthread.php?t=713073&highlight=vi…
Then there's games. DOOM 3 might run at the same framerate on today's computers as DOOM did back in 1996, and DOOM could run at 10000 frames per second today, but look at the difference in graphical quality. To say there's been no "payoff" in that area or that gaming has been stuck in the "stone age" is absurd.
Then we have things like speech recognition, where Vista's and Dragon's offerings blow away what we had before.
http://pogue.blogs.nytimes.com/2007/03/01/telling-your-computer-wha…
http://inetsynch.podbean.com/2007/09/26/windows-2-apples-episode-13…
And ink, which is used in Tablet PCs and OneNote.
And Virtualization.
And on and on.
One last thing: Yes bling is part of the payoff. People like to work in pleasant environments, and bling adds to that. And people are more productive as a result.
The notion that software is stuck in the stone age or that there’s been no “payoff” to more powerful hardware is laughable.
Since when does advancing an OS mean it needs to get exponentially slower? Other vendors have managed to advance their OS without slowing down substantially. And there are advantages to this. Good luck ever getting Vista to run on something like an EeePC (even XP is a bit sluggish by most reports). Whereas you can run modern Linux just fine on it.
Since when does advancing an OS mean it needs to get exponentially slower? Other vendors have managed to advance their OS without slowing down substantially.
Exactly. Mac OS X releases have even been known to get faster due to more optimizations.
Ho ho ho. Yes, keep telling yourself that you really need a 3000+ CPU to run the latest Office and graphical desktop environment from Microsoft. Hardware vendors and Microsoft love people like you.
See you in 10 years, when the looming energy crisis will make using needlessly powerful computers for menial tasks a crime.
Don’t think there’ll be an energy crisis? Think again. Global warming will eventually reach a point where we’ll either have to cut down hard on heat emission or watch the planet go to hell in a handbasket. It will happen during our generation’s lives. Enjoy the supercomputers on your desktop while you still can, the PC’s of tomorrow will need to be as power efficient as possible.
“Don’t think there’ll be an energy crisis? Think again. Global warming will eventually reach a point where we’ll either have to cut down hard on heat emission or watch the planet go to hell in a handbasket. It will happen during our generation’s lives. Enjoy the supercomputers on your desktop while you still can, the PC’s of tomorrow will need to be as power efficient as possible.”
Wow, you really think Global warming is caused by direct heat emissions?
They’re not exactly helping either, are they? You must have an idea about how much power we’re wasting every day? What does that power turn into? Heat. Where does it go?
+1 and agreed 100%.
I don't see why some people keep buying into the marketing hype. I get greater pleasure using older hardware and seeing what performance I can get out of it. Maybe it's the geek in me. Plus I get more satisfaction knowing I didn't have to shell out any money unless something is really broken and needs to be replaced.
I dream of the day solar energy becomes the number one energy source. To quote a professor I once knew,
“The sun will still shine when the oil runs out”
Nobody is denying anything; the fact is, processors are now using less power more efficiently; the processor in this MacBook is a 2.16 GHz part that uses less power than my flatmate's 3.2 GHz P4, and yet my laptop outperforms it.
I've run Windows Vista Business on an HP laptop (came preloaded) and sure, I've migrated back to Mac, but I can assure you the perceived "bloat" wasn't what compelled me. My flatmate's mate bought a new HP laptop, 1 GB of memory, latest graphics card (nVidia 8400) and a speedy processor, no problems.
Sure, memory requirements have gone up, and sure, I don’t like to do ‘comparing bad with bad’ but at the same time, one has to consider what has been added, what has been changed; and ultimately we as consumers ask for it. Microsoft and other companies merely react to what they’re told by the marketplace.
“Wordprocessors and spreadsheets run at the same “speed” as yesterday’s wordprocessors and spreadsheets on yesterday’s hardware, […]”
That’s where we agree.
“[…] but they do way more stuff.”
That's where I'd like to comment that it may be a fact that they can do more stuff, but from my observations, users do the same things with today's word processors and spreadsheets as they did with yesterday's. So where's the advantage in providing more functionality (which requires more hardware power) when it is not used? Look at where productivity (strange terminology) takes place: here, secretaries use their up-to-date, high-power PC as they would use a 286 with Geoworks, or even as they would use a simple typewriter.
“Productivity apps do way more than yesterday’s apps do, and they do them fast enough for the user.”
In most cases, the user does not notice the internals you mentioned, nor does he know (or want to know) about them. So if you provided the old-fashioned memory-dump save when hitting Ctrl-S, it would be much faster, and that the user would notice.
"Or, for a real-world example taken from my own experience as a programmer (formerly professional, now hobbyist), source code compilation speed is much greater today than before. A project I worked on in 1997 took two hours to build. A few years later, it only took 10 minutes, despite being a much larger code base. Today, it would build in about 2 minutes or less."
This is where I definitely can agree: make buildworld took almost 24 hours on a P1 at 150 MHz and takes less than 1 hour on a Celeron at 2 GHz. If it were the other way round, I'd have to wonder…
“Multimedia is a big payoff. No way could I record TV on a 1997 computer.”
It's just a matter of quality. And please keep in mind that there are more computers around than just PCs; just think about the machines SGI built in those days. Oh, memories… =^_^=
“No way could I watch hi-def video on a 1997 computer.”
Was there hi-def video in 1997? It's hard to watch something that did not exist yet.
“Similarly for audio, where back in the day, audio would sputter and stutter while playing.”
It does the same today on the mobile phones kids are so proud of; reminds me of 8-bit 11 kHz audio blown out of a crappy little speaker.
"Then there's games. DOOM 3 might run at the same framerate on today's computers as DOOM did back in 1996, and DOOM could run at 10000 frames per second today, but look at the difference in graphical quality. To say there's been no 'payoff' in that area or that gaming has been stuck in the 'stone age' is absurd."
I agree, games are a major group of software that profits from computing power.
“And Virtualization.”
Nothing new; it existed in the '60s on IBM's mainframes.
“One last thing: Yes bling is part of the payoff. People like to work in pleasant environments, and bling adds to that. And people are more productive as a result.”
This is very individual. Maybe you can make this claim for a majority of home users, but there are users (esp. professional ones) that feel bothered by distracting stuff that disturbs them from working.
The vast majority of the so-called improvements in gaming in the same timeframe this article addresses (7 years) have only been in the eye candy area. Delivering good gameplay, a good storyline, or an immersive world seems to be just as elusive as ever.
There's a small crop of good games still being made, of course, just like there always has been. But whether a game is really good is independent of the available hardware of the time. The only exception to this is the all too common "casual gamer" who only wants to play a game for a few hours before being lured away by the next game of the week.
“Delivering good gameplay, a good storyline, or an immersive world seems to be just as elusive as ever.”
You are right; I think you're talking about the fine (but important) difference between pure graphics quality and overall gameplay. Immersive effects are not a simple product of the most realistic image floods. Storyline etc. are more important than that, I think. It's the only way to explain why "old-fashioned games" (classics, if you want to call them that) like DooM or Quake are still fun to play (if you don't know them too well, of course). It's the same as assuming you can make a good movie just by using the latest special effects.
"There's a small crop of good games still being made, of course, just like there always has been. But whether a game is really good is independent of the available hardware of the time. The only exception to this is the all too common 'casual gamer' who only wants to play a game for a few hours before being lured away by the next game of the week."
These are among the important groups that update their hardware very fast. In order to play the latest games, you need up-to-date hardware, of course. On older hardware, you just need to wait: maybe one or two years later it's no problem to play a formerly-newest game on then-common hardware.
“I’ll start with business productivity software.
Word processors are as fast as they need to be. As long as the cursor keeps up with my typing, that’s fine with me, and I’ve never had any Windows version of any word processor have trouble with that. (I can’t say the same for my Mac, however, where merely typing in Safari forms frequently spins beachballs or I have to wait whole seconds for the cursor to catch up with my typing.) ”
And multitasking has to suffer for that on windows.
“Wordprocessors and spreadsheets run at the same “speed” as yesterday’s wordprocessors and spreadsheets on yesterday’s hardware, but they do way more stuff. For example, today’s office apps can save files in zipfile’ed XML rather than binary, and still be able to load those XML documents reasonably quickly. This allows for open file formats to be more feasible and even allows a human to read through the XML if necessary. So the hardware increases in power, the software takes advantage of the power to store data in zipped XML rather than binary while retaining the same speed when loading the zipped XML document. And that’s just one example. Productivity apps do way more than yesterday’s apps do, and they do them fast enough for the user. ”
In other words, you get a bunch of features you don't want, or that other software can process much faster/better, on top of an already sluggish OS.
“Multimedia is a big payoff. No way could I record TV on a 1997 computer. No way could I watch hi-def video on a 1997 computer. Even low-resolution 320×240 mpeg1 was a struggle. Now we can watch HD video on computers, and that’s because of both hardware and software (hardware providing the raw speed, software providing more efficient codecs). Similarly for audio, where back in the day, audio would sputter and stutter while playing. Today, Vista’s and Leopard’s audio systems blow away everything preceding them.
http://www.avsforum.com/avs-vb/showthread.php?t=713073&highligh… ”
How do you explain that Vista's audio moved to software-only processing? Musicians suffer severely from that.
All one has to do is look at the things achieved in embedded systems and game consoles. These platforms literally demand efficient programming otherwise it just won’t fit.
Perhaps Microsoft (and other brands) need to apply these “restrictions” to their design documents. Leave the push for newer and better hardware to game development and graphics design, and give us an efficient base to work with.
Not to be an advocate of anything, but the hardware improvements did drive down the costs of IT in general.
Back in the day you had to pay a dev for years to develop something in assembler and C, while nowadays you can use C#, Java, Python, whatever.
If anything at all, maybe the user experience didn't get any faster, but development costs went down.
It's not only because of lazy or bad programmers that bloat went up; it's often because of a reasoned choice of framework or an easier language.
"It's not only because of lazy or bad programmers that bloat went up; it's often because of a reasoned choice of framework or an easier language."
To a certain extent that’s true: A general-purpose application written in Python takes more resources to do a GUI output task than one coded directly in C.
But I don’t know enough to say whether most of MS Office is written in high-level languages, or if it still uses venerable C++.
From what I understand, the GUI of Office 2007 now uses WinForms 2.x rather than the native Windows controls.
“””
To a certain extent that’s true: A general-purpose application written in Python takes more resources to do a GUI output task than one coded directly in C.
“””
Keep in mind that the Python library doing the GUI output likely *is* written in C. Feel the power of the snake!
Many of one’s common calls to Python libraries are calls to optimized C.
Keep in mind that the Python library doing the GUI output likely *is* written in C. Feel the power of the snake!
Many of one’s common calls to Python libraries are calls to optimized C.
True, and maybe true to a certain extent for all high-level languages, but I’m not sure that it’s relevant. Otherwise there would be no performance difference between Python and pure C. But Python needs to make many more C calls to do mathematical operations within a nested loop, for example, than hand-coded C would.
So to the extent that applications are now written in Python (or Ruby, or Perl) rather than C, you use more computer resources to run them.
My only point in this is that something along these lines might explain some of the larger size and lower relative performance of more recent applications, including Office, IF Office is using more high-level abstractions rather than simpler low-level calls.
It’s not intended as a criticism, nor a knowledgeable pontification, only proposed as a possible explanation.
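A small, hedged illustration of that trade-off: the same arithmetic done step by step in interpreted Python versus handed off in one call to C-implemented code (here just the built-in sum; the absolute times are machine-dependent, only the ratio matters).

# Compare a pure-Python loop with a single call into C-implemented code.
import time

N = 5_000_000

start = time.perf_counter()
total = 0
for i in range(N):          # every iteration goes through the interpreter
    total += i
python_loop = time.perf_counter() - start

start = time.perf_counter()
total = sum(range(N))       # one call into the C implementation
c_backed = time.perf_counter() - start

print(f"pure-Python loop: {python_loop:.3f} s")
print(f"C-backed sum():   {c_backed:.3f} s")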
I agree. I’d like to know how much bloat is from managed code, non-integration, etc., as opposed to sloppy programming and quality control.
If performance actually improved with each version of Windows, Office, and various other programs then we’d see people more keen on upgrading to get more out of their current hardware.
This is one of the reasons why people are avoiding upgrades. The Steam survey of all gamers who use the Steam client shows that people just aren't taking this anymore. Gamers are usually the first to upgrade if they'd get better performance from their systems, or extra eye candy, but instead Vista requires more memory, more graphics power, more CPU, etc.
I’m not sure it’s wise to trust someone who can’t even calculate percentages and mentions “bloat” in every paragraph…
If the memory consumption of Office 2000 on Windows 2000 started at 9 MB and became 70% bigger every year (i.e., assuming that's what the author means when he says "170% increase in memory consumption per year", otherwise his calculations would be wrong in more than one way), then Office 2007 on Vista would be consuming roughly 369.3 MB of memory. If Office 2007 is only taking up 109 MB, then the memory consumption of the suite grew about 43% per year over the last 7 years.
The same mistake can be found all over the article.
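Redoing that arithmetic with the same assumed figures (9 MB then, 109 MB now, 7 years apart):

# Check the growth-rate arithmetic from the comment above.
start_mb, end_mb, years = 9, 109, 7

# If memory use really grew 70% per year:
projected = start_mb * 1.7 ** years
print(f"9 MB growing 70%/year for 7 years -> {projected:.1f} MB")   # ~369 MB

# Annual growth rate actually implied by 9 MB -> 109 MB over 7 years:
rate = (end_mb / start_mb) ** (1 / years) - 1
print(f"implied annual growth: {rate:.0%}")                         # ~43%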
The payoff is more social (and economic) than anything else, I imagine. The PC has become more mainstream because of this sort of thing, and many of the geeks now wouldn't have gotten into computers if it hadn't happened, because we wouldn't have known about it.
Right?
They build, the masses consume, and we geeks tweak. Companies make money, we service the masses, and it gives us geeks something interesting to do to get more from more, and get paid.
IMO things have changed for the better. Maybe not for the elitists.
(Headnote: Compile times are accurate within a factor of about 2. For those who don’t know, Q6600 == 2 E6600s. This post has nothing to do with Microsoft and is mostly rambling.)
I was amazed when my Q6600 compiled my kernel in 8 minutes; so imagine my surprise when I talked to somebody who was compiling it in half that time on an E6600. The difference? He was removing all the stuff he didn’t use. And he said that it was as low as 2 minutes a few years back with two 500 mhz processors. Today’s default kernel would take 4 hours on that.
Is anybody advocating that we go back to using 2.2? Probably not. With that size increase comes a vast amount of functionality; my CD drive’s eject button started working at kernel 2.6.12; my mobo sensors started working at 2.6.20. And let’s not forget all the drivers and performance improvements.
OS 6 was nice (and the Mac Plus was a sexy beast), but it isn’t usable today. On the other hand, has anybody tried to optimize a modern operating system for speed? I know the answer is yes, but I haven’t heard about any distros (suggestions, anybody? And no, I don’t mean xubuntu) – just the XP-lite thing.
Ubuntu server edition can boot (GRUB to command prompt) in 12 seconds, and I'll bet that can be reduced. Add a few seconds to start X, Metacity and xfce-panel or something, and put efficient applications in the menu. Linux is fast on today's hardware; a minimal environment should be able to fly without compromising usability. Since this would be easy for a power user to set up, I don't see any need to be complaining about bloat.
The point is that if you pick your software carefully, hardware speed is outpacing bloat in every area except boot time. Bloat exists, but unless it’s for work it is not like anybody’s forcing you to use it. There are no rules against using software from seven or ten years ago, and today it will be fast and relatively resource efficient.
As for firefox, it’s satan for memory use but I would (and have) happily add another stick for it; there aren’t any other options, at least until opera can be configured to look and work exactly like firefox.
With the CPU manufacturers mindlessly lumping goo-gobs of cores in one package for Joe Blow the computer user….and memory being ridiculously cheap I fear there is really no check and balance for the software developer anymore. Do software guys even have to optimize anymore? Code bloat? No problem…rate it for a quad core instead of a dual core. Shnazzy graphics getting you down? Tack on another gig of RAM.
The great equalizer has always been the hardware, but now the hardware has far surpassed anything that man can efficiently program for it.
As far as I can tell, Windows and Office are designed with speed as a priority only in scenarios where users would notice it. Functionality goes in first, with targeted goals for how fast various things should be. If those goals are met, then other areas are addressed. If a scenario is too slow to be reasonable, then some effort is made to optimize it. As always, the target hardware is the mainstream of the time when development begins.
If you’re doing something that’s really compute or memory bound, you’ll get all the benefits of your modern hardware. Sure, Windows commits more memory now than it did in the past. But the paging algorithms do a fine job of giving the desired memory to an application… especially if that application tells the OS that it wants a larger working set. It’s pretty hard to accurately interpret the memory usage figures given in task manager or the page fault data given in performance monitor without having detailed knowledge of how Windows manages memory.
From my own quick investigations of Word 2007, it takes 36 MB of VM when running fresh. It uses no preloading and starts up nearly instantly (on the order of 1 second). I wouldn’t worry too much about the memory usage because if you keep Word open while playing your video game, Word will naturally get paged out. You won’t even need a paging write for most of that because all but 8MB are shareable (basically memory mapped dlls).
For us Unix users, this is very good. It drives the evolution of hardware. But for Unix users, this race won't matter. We will still be happy with 1 GHz, let alone an AMD 6000+ or a Penryn quad core. All Unix users win from this rapid evolution.
You know, /everything's/ requirements have gone up substantially, not just Microsoft's products. I mean, "The Great Moore's Law Compensator" has been the same for years and years now. Windows 95 ran like a pig on a 486 with 8 MB RAM, but that's what was required to run it, just like how "stupid" Windows Vista requires 1 GB of RAM to run, which is ungodly, just like how 8 MB of RAM was ungodly in 1995. You know, some distros of Linux can run on far less than what Vista wants, but really, the other big ones want just as much. These kinds of articles get under my skin.
Take a program that could "use" the CPU from back in the day, and a modern setup. I'm pretty sure I can calculate pi faster now, so why can't I use Office faster? I believe that's what the author is saying. He's not "using" those new features; he just wants to quickly, using a script, look up some stuff with IE in a database, etc.
The interesting question is, why is speed not a sales argument for software? (Or perhaps not anymore?)
Microsoft made a business choice: apparently most buyers perceive a fancy GUI and zillions of functions as more important for their Office productivity than a faster and lighter application.
Could it be that most buyers don’t really use their software at the limit anyway? That if you are — just like most people — writing a single letter or making a single spreadsheet a day, and not writing complete sales reports or books, the half hour you can spend with all those wonderful options and functions makes a bigger purchasing incentive than the two or three minutes your letter or spreadsheet is finished earlier?
I noticed that most of these people are using MS tools to tell them how much memory their MS applications are using.
Call me paranoid, but it's like asking the car companies (without regulation) to reliably tell you how much horsepower your engine produces or what REAL WORLD MPG it gets, and we all know they STILL lie.
And you trust MS not to "tweak" things so it won't be called bloated, har har har