Six months ago, after a long gestation period, Microsoft finally released Windows Vista. Vista is a huge release, not only because of its long list of new features, but also because of its sheer size and its number of bugs, oddities, and other downsides. The development process that led to Vista has left many with a very bitter aftertaste; features were cut, codebases were scrapped, release dates postponed. A few days ago, Microsoft released some sparse details on Vista’s successor, internally dubbed ‘Windows 7’, and in order to prevent another Vista-like development cycle, here is what I would advise Microsoft to do. Update: APCMag reports that Julie Larson-Green, who was the driving force behind Office 2007’s new Ribbon user interface, has been transferred to the Windows 7 GUI team.
Many have advised Microsoft to scrap the current Windows codebase and start all over again – or to use something exotic like Microsoft Research’s Singularity. None of that is necessary. The Windows NT kernel, upon which Windows NT, 2000, XP, Server 2003, and Vista have been built, is very much capable of driving a modern operating system. The kernel is portable (it has been ported to MIPS, PPC – not only the Xbox 360, but also workstations – SPARC, Alpha, Clipper, and more), as was also evidenced by how quickly Microsoft released versions of Windows NT for IA-64 (Intel’s Itanium) and AMD’s 64-bit processors (followed later by Intel’s x86 64-bit processors). NT has proven itself ready for whatever platform changes might happen in the desktop computing world.
Apart from being portable, the Windows NT kernel has also had a long time to mature. This is evidenced by the fact that kernel crashes (the infamous blue screens of death) have become an extremely rare occurrence on Windows Server 2003, Vista, and up-to-date installations of Windows XP (assuming you are not running any unstable kernel-mode drivers). Contrary to what some people stuck in 2001 want you to believe, current versions of Windows are rock solid, and the system barely ever crashes. The last time I saw a blue screen of death was when I was using a highly unstable version of the ATI driver for my Radeon 9000. That was over 4 years ago.
While Microsoft got its act together in the stability department later in Windows XP’s life cycle, it failed to do so in the security department. Even with its second service pack, Windows XP was not a very secure operating system. Microsoft made the fundamental error of not enforcing the strict security practices the Windows NT kernel made available to the operating system; as a result, application developers got sloppy.
With Vista, Microsoft finally saw the light and implemented User Account Control – something that should have been standard on Windows NT since its very first version. And now, when running Windows Vista, the sloppiness I just mentioned becomes painfully visible: even installing something as simple as a notepad application requires administrative privileges.
In conclusion, scrapping Windows NT would be a pointless exercise. It is a mature, stable, and, yes, secure system by design. Do not make the mistake of thinking that, simply because Microsoft refused to enforce proper security policies from the get-go, NT is an insecure system by design.
What Microsoft does need to scrap is the userland it has built on top of NT. Anyone who has used Windows Vista on a day-to-day basis (I use it every day on my laptop) knows that Vista’s userland is… Immense. Complex. This complexity stems from the one thing that has kept Microsoft on top of the operating systems game for so many years: backwards compatibility. It might not be perfect, but it sure is a million times better than what the competition has to offer. Microsoft actually keeps copies of a lot of the applications out there, and tests them to ensure compatibility is maintained. This is an extremely important aspect of Windows for many businesses, who may still run old DOS or Windows 3.x-based applications; in fact, the tills at the shop where I work are DOS applications. Sure, they look old and outdated, but I have been using them for 5 years (and some of my colleagues for 10-15 years), and they perform their job just fine.
For programmers, however, this desire to maintain backwards compatibility is a potential hell. It means that if you, as a Microsoft employee, have come up with a new killer feature, or maybe something less significant like a fix for a long-standing minor bug, it needs to pass through a long process of testing to ensure backwards compatibility is not affected by your code. And if it does affect compatibility, your code needs to be rewritten, or new patches need to be made to fix the compatibility breakage caused by your original patch. You can easily see how something like this restrains many developers, and how it can hold back many envisioned improvements.
So, scrap the current Vista userland. Give the developers at Microsoft the breathing room to build an entirely new graphical user interface and other userland features on top of Windows NT. Do not make the common mistake of thinking that the programmers working at Microsoft are somehow magically less qualified than those working at Apple or on open-source projects. Various Microsoft Research projects have shown that there are a lot of bright minds among Microsoft’s 71,100 employees, and giving them the freedom to develop something new, without having to keep almost 20 years of application compatibility in mind, could be a very wise thing to do. It would also be a morale boost for the employees.
This Windows 7 would have all the hardware compatibility the current versions of Windows NT have (since the kernel and base system are still compatible), so it would run on most modern machines. The userland could incorporate many new and fresh ideas, and it would make use of Windows NT’s security features – and it would enforce those strictly upon application developers. In addition, it would incorporate a VM in order to run 200x/XP/Vista applications, similar to what Apple offered on the PowerPC version of Mac OS X (the Classic environment to run Mac OS 7-9 applications).
This would undoubtedly be devastating for Windows’ market share. Removing backwards compatibility means business users would never buy into Windows 7, and that would mean a serious lack of cash flow for Microsoft. Therefore, Microsoft needs to cater to business users and others concerned about backwards compatibility by maintaining a version of Windows based on the ‘old’ Windows NT; call it Windows Legacy, if you will. This version of Windows would be Vista (or, preferably, Windows Server 2003), receiving only security updates and bug fixes.
Would this require more developers than are currently needed? I doubt it. The legacy version would need little development, since all it requires is bug fixes and security fixes – it’s a legacy version, so there is no need for new features. And since the modern version would not require work on legacy code, greatly simplifying development, the end result would be that no more developers are needed than there are now. This is of course a bold statement, and only time can tell if it is actually the case.
There are a few things Microsoft ought to take care of when it comes to Windows 7’s beta/RC stage and launch. Release developer previews early and often, to as many developers as possible. Provide excellent documentation on the new APIs and user interface, to allow the major ISVs to get to know Windows 7. When Windows 7 is released, provide its small userbase with fast updates and bugfixes, and provide service packs for free on a regular basis. Promote participation by creating easily accessible bug tracking systems, allow openness among developers, and foster good interaction between them and the userbase. And, most importantly, do not announce any feature until it is 110% certain it will make it into the final product.
This is what I advise Microsoft to do. Is it likely that a similar course of action will pan out over the following years? No, I do not think so. Microsoft has invested far too much time and money into Vista to let it fizzle out in such a way. In other words, I am afraid we are going to be stuck with another Vista-esque release.
Sadly.
PS: Microsoft, whatever you do with Windows 7, please do not create 12946 different versions. Ship one version, at a fixed price, and be done with it. It will save the world a whole lot of headaches.
If you would like to see your thoughts or experiences with technology published, please consider writing an article for OSAlert.
“The development process that led to Vista has left many with a very bitter aftertaste; features were cut, codebases were scrapped, release dates postponed.”
I thought the bitter aftertaste was from: DRM, Vista Phone Home, User Account Control, it’s slow and bloated, there are about 5 too many versions, repressive licensing/activation, it’s overpriced vaporware!
Overpriced, maybe, but hardly vaporware. Vista was released over 6 months ago… released software is by definition not vaporware.
>not vaporware…
Ummm… I think he was talking about the fact that 99% of the “promised” and “hyped” features that were supposed to be in Vista never materialized, aka WINFS etc.
aka… it’s a vaporware release through and through.
“it’s a vaporware release through and through.”
Uh no, it may be a disappointment but it is not vaporware. Vaporware is overhyped software that has not yet been released.
Vista = unwarranted optimism
Vista = deception
Vista = complete failure on completion date, feature set, feasibility.
In my opinion it’s Vaporware even tho it was released. What other choice did they have? To wait another five years to really, really fix it?
complete failure on completion date
The only official release date EVER set by Microsoft was the one they indeed did ship it on. The other dates were NOT official; they were based on rumours, or on individual employees making remarks.
feature set
Apart from WinFS and Monad, there’s little I can think of that was scrapped.
“Apart from WinFS and Monad, there’s little I can think of that was scrapped.”
I don’t know that I’d consider Monad (aka PowerShell) “scrapped”, since it’s freely downloadable for Vista, XP, and Windows Server 2k3 (I use the XP version (well, I play with it, anyway)).
http://www.microsoft.com/technet/scriptcenter/topics/msh/download.m…
I think Exchange Server 2k7 bundles it (and indeed, requires it, as it’s used to perform administration operations).
I think the reason it’s not bundled with Vista is that PowerShell 1.0 was completed towards the end of (or even after) Vista’s RTM schedule, and they didn’t feel it was important enough to hold up Vista’s release.
I can think of…
PC-to-PC sync
WinFS = Windows Future Storage
Windows PowerShell = Monad
RSA SecurID support
Safe Delete
EFI = Extensible Firmware Interface
NGSCB = Next-Generation Secure Computing Base
Windows Security Center… is overhyped!
Um yes, but it was released.. so… it’s not.. vaporware.
Plenty of other software has been released with reduced functionality and it’s not tagged as “vaporware”.
You’re very much free to be disappointed with Vista, but tagging it as vaporware is entirely wrong, no matter how you spin it.
“In my opinion it’s Vaporware even tho it was released”
Your opinion doesn’t really matter. Vista does not fit the definition of vaporware. End of story.
WINFS has been vaporware since the “Cairo” days. Aside from that, what was Vista missing that was promised?
“Ummm… I think he was talking about the fact that 99% of the “promised” and “hyped” features that were supposed to be in Vista never materialized, aka WINFS etc.
aka… it’s a vaporware release through and through.”
If you would, could you please expand on the “WINFS etc”? I’ve noticed that lots of people say that Vista was stripped of a huge number of features, and then to support that assertion they give “WINFS etc” as their examples. Can you or somebody else provide a complete list of cut features besides WINFS (so we can see just how many features were cut and judge how important those features were)?
This list looks complete to me:
http://en.wikipedia.org/wiki/Criticism_of_Windows_Vista#Removal_of_…
It should be pretty obvious that it’s a short list. The reason WinFS is always used as the example is due to the overall breadth/scope of what it was going to accomplish. It’s worth mentioning that by no means is WinFS dead… it is still being worked on for future incarnations of Windows.
“This list looks complete to me:
http://en.wikipedia.org/wiki/Criticism_of_Windows_Vista#Removal_of_…
It should be pretty obvious that it’s a short list.”
OK, so let’s go through the list. Apart from WinFS, we have:
1. Next-Generation Secure Computing Base aka Palladium, which MS critics despise anyway. Surely MS critics that constantly refer to Vista’s missing features don’t actually wish that Palladium were in Vista, do they?
2. PowerShell, which, as the article states, is available separately.
3. SecurID support, an RSA security mechanism which, according to its Wikipedia article, is flawed anyway.
http://en.wikipedia.org/wiki/SecurID
4. PC-to-PC Synchronization. I know nothing of this feature, so I’ll give the critics the benefit of the doubt and grant that it’s a hugely important feature that was stripped.
So that’s it. I’m disappointed; I expected a humongous list of very desirable features, given the oft-asserted “Vista was stripped of 99% of its features” proclamations. :p
The real problem regarding “stripped features” is not the stripping of features, but that Microsoft has such a huge user and developer base that they are compelled to provide guidance as to what they are working on for future releases, so when features are stripped, they are stripped publicly. Apple, on the other hand, doesn’t announce what it’s working on (until just before shipment), and can get away with it because of its smaller user/developer base. Apple can therefore strip planned features at will without any bad PR. I’m confident that any version 10.N of OSX contains new features that were originally planned for version 10.N-1, and has had features cut and postponed for version 10.N+1. But we just don’t know about it.
Incidentally, I think there is actual (if circumstantial) evidence of Leopard being stripped of key features. Last August (I believe), Jobs did announce at WWDC that Leopard would have fantastic top-secret features that he couldn’t talk about at the time for fear that they’d be copied. Almost a year later, at the recent WWDC, there were no such features revealed. Oh, there were some new things revealed, but nothing that even began to live up to the super-secret feature hype, leaving one to conclude that those features were stripped and postponed for OSX Ocelot (assuming that the features do indeed exist to begin with).
Note that I’m not bashing Apple, I’m just saying that I highly doubt that Microsoft is the only one that cuts features from particular releases of software products, postponing them for future releases (or deciding to cut them for good).
Not at all. Ubuntu gets pretty much the same reception when planned features get deferred. Anyone remember how the deferment of composite-by-default for Edgy went over? Not well, and I happened to be one of those who had my pitchfork and torch in hand. Ubuntu is pretty transparent when things get deferred, and users always know what’s going to make it and what is not. The reason people are so very disappointed with Vista is because they were expecting a lot more than what was given to them. It had nothing to do with features not being implemented or released (cause frankly there weren’t that many to begin with). What WAS implemented isn’t up to par with what people were expecting. People wanted something exciting and new, and all they got was the same stuff with a new coat of paint.
Full resolution independence is the missing feature that disappointed me most.
This list is far from complete. Of course, like almost everything on Wikipedia, it is an article in the making – which is good – but that makes it not exactly a proper source for a statement like that. There is more that is missing. I was exposed to Microsoft’s marketing buzz long before Vista shipped, and I still remember a lot of things that haven’t made it. Take the whole thing about managed code as an example. Where do the core system and the shell in Vista use managed code? .NET Framework 3.0 is on board, but the shell itself still relies heavily on the Win32 API. And look at this inconsistent mess and all these hard-coded owner-drawn forms and windows. It looks definitely unfinished.
And what about the new TCP/IP stack – did this really take five years? EFI support? I could go on, but it’s such a waste of time ranting about Vista. A trainwreck, that’s what it is.
Please provide references to statements made about features that ended up not being in Vista.
Personally, I think that people are asking the wrong question here.
I think the question really should be “What features in Windows Vista that were actually released that make it more appealing than XP?”
It’s not so much the features that were planned and then removed, but the ones that were put in that aren’t all that worthwhile. Not to mention the fact that Vista simply hogs all of your system resources without providing a lot of new features.
Here’s a good example:
https://activision.custhelp.com/cgi-bin/activision.cfg/php/enduser/s…
The Transformers video game.
The minimum system requirements state 256MB of RAM for XP but 1GB for Vista? Seriously, it takes 4 times as much RAM? Then again, I wouldn’t even dream of trying to run Vista with anything less than 1GB of RAM, so it’s kind of redundant to say the game requires that much on Vista. Microsoft should have just made the installer check whether you have less than 1GB and, if so, either not install or pop up a warning telling the user to upgrade. Otherwise their OS will run slow as molasses.
Shadow Copy? Disk I/O prioritization? Aero?
There is a feature which is basically aggressive caching called ReadyBoost. What it does is figure out what apps you use when, and start loading them before you launch them. This (in conjunction with a few other features) makes working on Vista seem a lot more snappy and responsive; however, it also uses huge amounts of RAM. The whole “use a flash disk to improve performance” thing is there pretty much to mitigate it, but that is where your RAM consumption goes.
Transformers recommends at least 1GB of RAM for running the video game; I would say you are crazy to run any modern operating system with less than a gig. I always strongly recommend to people buying either a Mac or a Vista box to shell out the extra few bucks and go two gigs. The fact that they recommend 256MB on XP means they aren’t being too honest with their requirements. I have two gigs on my laptop, and I can run World of Warcraft with all the settings maxed and get no chunking at all, even with a virus scan running (the only way I can tell it has started is my laptop heating up REAL quickly).
I have an HP Pavilion dv9000, and Vista runs on it like a dream. The only problem I have that everyone bitches about is the incredibly long boot time. Other than that it is stable, very responsive, doesn’t thrash my hard drive for hours, copies files quickly, and looks absolutely gorgeous.
google_ninja wrote:
-“There is a feature which is basically aggressive caching called ReadyBoost. What it does is figure out what apps you use when, and start loading them before you launch them. This (in conjunction with a few other features) makes working on Vista seem a lot more snappy and responsive; however, it also uses huge amounts of RAM. The whole ‘use a flash disk to improve performance’ thing is there pretty much to mitigate it, but that is where your RAM consumption goes.”
First of all, the caching feature is called SuperFetch; ReadyBoost is the ability to use USB flash memory as an additional cache. Secondly, SuperFetch isn’t what drives the higher memory requirements for Vista; it’s the actual system, consisting of the kernel and the services running. The SuperFetch cache will be flushed the second a program needs that memory, so if you launch an application that needs 1GB of RAM, it will flush out enough precached applications so that the application requesting 1GB of RAM will get it. And since the SuperFetch cache consists of nothing but preloaded executables, it can be flushed instantly.
So while SuperFetch may give the impression that it is responsible for the high memory requirements of Vista, it really isn’t. Personally I think that, together with the (imo) overall improved looks, SuperFetch and ReadyBoost are some of the (again, in my opinion) very few highlights of Vista.
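To make that flush-on-demand behaviour concrete, here is a toy sketch of a prefetch cache that hands memory back the instant a real allocation needs it. It is an illustration only – all names and policies here are invented, and this is in no way how SuperFetch is actually implemented:

```python
# Toy model of a prefetch cache that yields memory on demand.
# Invented names and policy, for illustration; not the real SuperFetch.

class PrefetchCache:
    def __init__(self, total_ram_mb):
        self.total = total_ram_mb
        self.in_use = 0   # MB held by running applications
        self.cached = {}  # app name -> MB of preloaded executable pages

    def free(self):
        return self.total - self.in_use - sum(self.cached.values())

    def preload(self, app, size_mb):
        # Only prefetch into memory that would otherwise sit idle.
        if self.free() >= size_mb:
            self.cached[app] = size_mb

    def allocate(self, size_mb):
        # A real allocation always wins: drop preloaded executables until
        # the request fits. Dropping is cheap because the cached pages are
        # clean copies of files already on disk, so nothing is written back.
        while self.free() < size_mb and self.cached:
            self.cached.popitem()
        if self.free() < size_mb:
            raise MemoryError("genuinely out of memory")
        self.in_use += size_mb

cache = PrefetchCache(total_ram_mb=2048)
cache.preload("office", 300)
cache.preload("browser", 200)
cache.allocate(1800)  # both preloaded apps are evicted instantly, then this succeeds
```

The point of the toy: memory used this way is never unavailable, it is merely not idle.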
I thought WinFS was dead, but it is sort of set to live on in future versions of SQL Server. At least that was what Quentin Clark of the WinFS team described in his blog. One quote which seemed to summarise things follows:
These changes do mean that we are not pursuing a separate delivery of WinFS, including the previously planned Beta 2 release. With most of our effort now working towards productizing mature aspects of the WinFS project into SQL and ADO.NET, we do not need to deliver a separate WinFS offering.
http://blogs.msdn.com/winfs/archive/2006/06/23/644706.aspx
Has anything changed since then?
More “evolved” than “dead”. The contemporary components of WinFS are indeed being delivered via SQL Server 2008 and ADO.NET v.Next.
Roughly, SQL Server 2008 rather than 2005 provides the storage engine.
Object Spaces -> LINQ
Items Data Model -> Entity Data Model
Sync Adapters -> ADO.NET Synchronization Framework
The ultimate client deliverable will likely be some offshoot of SQL CE or Express 2008.
Perhaps it’s easier to spot the light (no pun intended) on what features are in fact present.
Can’t be DirectX, because there still are not many DX10 games around, and the games written for DX10 perform badly even with a lot of so-called DX10-capable high-end cards.
As I understand it, the main goal of WinFS is to get away from the current partitioning system; it’s been serviceable for ~20 years, which is remarkable in computer science terms, but it’ll eventually have to be overhauled as we continue to bump up against its limits. While they were doing that, they were also trying to make the ‘attribute’ more important than the folder in terms of organizing data (although presumably folders will still exist to segregate the system, the applications, and the different users’ documents), and to make complex searches for files on disk as simple as making a system call (or a .NET call).
I’ve heard it also compared favorably to ZFS, but I don’t know how apt that comparison is.
WinFS was supposed to be a semantic data system on top of NTFS that would bring relational database functionality to a traditional hierarchical filesystem. It turns out that there’s no practical advantage to integrating the metadata with the filesystem at the inode level. It’s more effective to store it in a centralized repository than to distribute it across the filesystem. Think about the performance considerations of modification versus lookup.
The approach they ended up taking for desktop search in Vista is more in line with what Google has been doing and a subset of free software semantic desktop efforts such as Nepomuk. WinFS imagined a design synergy between semantic and hierarchical data organization that doesn’t actually exist. It was a bad idea. In fact, several ongoing projects are reconsidering distributed metadata storage in traditional filesystems.
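To make the modification-versus-lookup tradeoff concrete, here is a minimal sketch of the centralized-repository approach, using Python’s built-in sqlite3 purely as a stand-in (the schema and function names are invented; neither Vista’s indexer nor WinFS actually works like this). All metadata lives in one indexed table, so a query is a single index lookup instead of a filesystem walk – and the price is paid at modification time, when every file change must also update the repository:

```python
import sqlite3

# One central repository for all file metadata, rather than attributes
# scattered across the filesystem's inodes. Invented schema, for illustration.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE metadata (path TEXT, key TEXT, value TEXT)")
db.execute("CREATE INDEX idx_kv ON metadata (key, value)")

def on_file_modified(path, attrs):
    # The cost of centralization: every file change implies a write here
    # too (real indexers batch this work in the background).
    db.execute("DELETE FROM metadata WHERE path = ?", (path,))
    db.executemany("INSERT INTO metadata VALUES (?, ?, ?)",
                   [(path, k, v) for k, v in attrs.items()])

def lookup(key, value):
    # The payoff: one indexed query, no crawl across the disk.
    rows = db.execute("SELECT path FROM metadata WHERE key = ? AND value = ?",
                      (key, value))
    return [r[0] for r in rows]

on_file_modified("/photos/img22.jpg", {"year": "2007", "place": "beach"})
print(lookup("place", "beach"))  # ['/photos/img22.jpg']
```

Had the same attributes been stored inode-by-inode, a lookup would have to visit every file; centralizing flips the cost from read time to write time.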
Good comment, butters. I know that something WinFS-like could be done with reiser4 + plugins, but what other projects are trying to implement it? IMHO this is one of the holy grails of desktop computing, and has the potential to completely revolutionize the way we organize our data, so I have been extremely disappointed in the underwhelming offerings that MS and Apple have given us so far.
Every time I read a comment like that, about something being the “Holy Grail” of desktop computing, I really want to hear some justification for the claim.
How is adding more text in a special attribute tag on every file going to help me? How are these tags populated? Does the user have to populate them every time one is created? What about files that are created by other programs? What makes this more revolutionary than just using a regular search box that already ships with most OSes?
Everyone keeps saying how great this metadata attribute tag is, but really, what does it do that a good directory structure doesn’t do already? Maybe there is room for meta tags on some files of specific types, like JPEGs and such, but perhaps that would be best served by changing the standards a little rather than the file system?
Is adding more overhead to a file system the right way to be going for the “Future of Computing”?
In theory, the proper use of attributes would allow you to have a near-infinite number of ‘directories’, with each file being able to be placed into one or hundreds of those directories.
For example, I am both a steampunk fan and an RPG gamer. If I bought a steampunk RPG online, I could effectively put it into both folders at once, without dealing with clunky links or shortcuts.
Currently, the only method we have of categorizing data is incredibly kludgy. Let’s say I take a picture on vacation with my girlfriend and another couple; how do I store it? Something like
/pictures/vacation/2007/beach/group22.jpg
First off, this provides no flexibility when it comes to retrieving it. I am limited to one method, and am forced to select the one I think will be the most meaningful to me (what happens if I want to find all the pictures with me and the gf on the beach? Or all the shots of that other couple?). What is going on is that I am using directories as a very limited form of metadata.
The other thing to keep in mind is that mazes are difficult for the human mind, which is why they have been used as games for so long. Someone who is technically inclined can learn to be very efficient with directory-based organization, but if you look at the average person, their directories are consistently a mess. They will have a small number of huge folders with really long filenames, and get completely lost if they are required to navigate out of that safe haven.
Now, metadata provides a far more flexible and natural way of storing data. Instead of
pictures/vacation/2007/beach/group22.jpg
I could instead have tagged the picture with vacation, 2007, beach, sunset, girlfriend, couple, summer, relaxed. The path could simply be
pictures/nice evening.jpg
and be far more accessible than the current option.
If you think about it, the filesystem is pretty much the last place that forces these restrictions on you. Photo and music management software both use metadata to great effect. On the web it is even more prevalent, in everything from email to blogs.
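The mechanics behind that flexibility are simple enough to sketch. A tag index is just a mapping from each tag to the set of files carrying it, and a multi-tag query is a set intersection – every combination of tags acts as a ‘directory’, with no single path privileged over the others. This is an illustrative toy only (invented names, no real filesystem involved):

```python
from collections import defaultdict

# Minimal tag index: each tag maps to the set of files carrying it.
index = defaultdict(set)

def tag(path, *tags):
    for t in tags:
        index[t].add(path)

def query(*tags):
    # Files carrying ALL the given tags: plain set intersection.
    sets = [index[t] for t in tags]
    return set.intersection(*sets) if sets else set()

tag("pictures/nice evening.jpg",
    "vacation", "2007", "beach", "sunset", "girlfriend", "couple")
tag("pictures/group22.jpg", "vacation", "2007", "beach", "couple")

print(query("beach", "couple"))       # both photos
print(query("sunset", "girlfriend"))  # only 'nice evening.jpg'
```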
I stated in my post that there was a place for metadata in media files, but not everything needs metadata, and adding it to the file system seems backwards. It would be better to have media files with a wrapper or add-on, like an ID3 tag on MP3s. If the metadata were stored in an addendum to the media files themselves, then the files could be moved from file system to file system while preserving the information.
By making the metadata file-system specific, wouldn’t that create that dreaded “Microsoft lock-in” that everyone always bitches about? Then we’d have all kinds of wasted effort and time spent trying to convert this data so that other file systems could use it, and not restrict the files to a Microsoft-only file system.
No thanks.
Stuff like ID3 only came about because of the lack of the feature on the filesystem.
When you take the feature to the file system, you can start to drastically change the way file managers work, and that’s the prime advantage.
As for vendor lock-in, this is always a problem, and it will take a huge amount of work to get something this complex to work from another platform. However, we are already more or less in this position: Linux filesystems store different data than HFS+, which stores different data than NTFS.
You forgot to also tag the picture as “me”, so it wouldn’t show up in your search.
Obviously you’d be more careful when you’re actually tagging the real pictures, but you were up to 8 tags already and the first search you wanted to do was broken (showing that tagging does require time and effort to be done right). I don’t know how many photos you have, but individually tagging, let’s say, 150 photos after each vacation (not to mention the 10,000 photos you’ve already taken) with 10+ tags each is, I think, more work than most people are ready for. If you’re lucky, you might get them all tagged with “summer, vacation, 2007”, since that applies to all of them, but in that case /2007/summer/vacation works just as well.
Point taken
The thing is, you develop the tags that you want to organize by, just like you develop the folder structure. The only difference is that one is far more flexible than the other. If all you use are three tags, you still gain something. Let’s say I want to view all my 2007 photos, not just from vacation, but also from family and work (all this is fictional btw, I don’t even own a camera ;-). On a tagging system, I can do that very easily; with the directory system you are stuck in a certain path. So even in your 3-tag scenario, you still get advantages.
Of course, all this needs to be married with a good UI that makes the whole process as painless as possible, both with saving and querying the metadata.
Personally, I’d just do a search for all files in /2007.
But I get what you mean. The problem with photo metadata compared to ID3 tags (for example) is that music is more or less fixed, in the sense that I will probably tag my copy of a song the same way as everyone else (even if it’s in a different file format), so you can stick it in a database and recall it. Tagging your Word documents isn’t too bad either, because you can just search the text, the headers, etc. for relevant terms. Photos are (hopefully) more or less all unique, so each has to be uniquely tagged by hand. Be honest: if you had to tag your entire music collection (far easier than a photo collection) by hand, would you do it?
I think something similar to the iTunes interface might work, though. Drag and drop photos from your library into “playlists” which would be your static tags, then use “smart playlists” for your dynamic searches (family: where photo is in playlist “me” OR “mom” OR “dad”, new stuff: view count is 0, etc). I hear (unsurprisingly) that iPhoto is like this, but I’ve never used it.
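A ‘smart playlist’ in that sense is nothing more than a stored predicate evaluated over the metadata, along these lines (a sketch with invented field names; not iTunes’ or iPhoto’s actual mechanism):

```python
# Each photo carries tags plus ordinary fields; a 'smart playlist'
# is a saved predicate over them. Invented data, for illustration.
photos = [
    {"path": "img01.jpg", "tags": {"me", "beach"}, "views": 3},
    {"path": "img02.jpg", "tags": {"mom", "dad"},  "views": 0},
    {"path": "img03.jpg", "tags": {"me", "mom"},   "views": 0},
]

family    = lambda p: p["tags"] & {"me", "mom", "dad"}  # OR over people
new_stuff = lambda p: p["views"] == 0

print([p["path"] for p in photos if family(p)])     # all three
print([p["path"] for p in photos if new_stuff(p)])  # img02, img03
```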
It turns out that there’s no practical advantage to integrating the metadata with the filesystem at the inode level.
Good thing WinFS didn’t do that, or am I misunderstanding your point?
Vaporware is a software or hardware product which is announced by a developer well in advance of release, but which then fails to emerge, either with or without a protracted development cycle. The term implies unwarranted optimism, or sometimes even deception; that is, it may imply that the announcer knows that product development is in too early a stage to support responsible statements about its completion date, feature set, or even feasibility.
The development process that led to Vista has left many with a very bitter aftertaste; features were cut, codebases were scrapped, release dates postponed.
OK… it’s only 98% vaporware; the one thing they did do was release it.
va·por·ware (vā′pər-wâr′)
n.
New software that has been announced or marketed but has not been produced.
—
Software that is not yet in production, but the announced delivery date has long since passed.
—
If you want to define vaporware simply as software being announced way in advance, KDE4 might even fall under that. Heck, it fits the definition on some sites:
Products announced far in advance of any release (which may or may not actually take place).
The most common definition of vaporware is software that is announced but never comes. You knew Vista was coming, one way or another.
Maybe we could say Longhorn was Vaporware and Vista is Evaporated ware.
Dealing with people who are not in the know about computers, I find they don’t really care either way about DRM, Vista Phone Home, or User Account Control. They do complain that it seems slow. But for the most part they feel kind of ripped off, because they waited so long for the new version (so they could upgrade their PCs) and didn’t get much in return. If you read the hype that Microsoft gave it, it would seem that it would do everything that Windows 95 promised us. But still it fell short.
That was simply the best critique of MS’s company culture and their internal development process I’ve ever read. Before, I’d have recommended that they scrap everything and go use Singularity, too, but now I realize the kernel is not so much part of the problem – the whole Win32 API is. Alas, the structure of the kernel is a good bit messed up too, but building new, sane userland APIs on top of it should help out there. But as you said, it is most likely they simply don’t dare say to their customers: ‘look, we sold you shit at a high price for all these years; now we have completely refurbished it. You’ll have to rewrite all your applications, though, but please still buy Windows 7, even if Mac OS will fit your needs for a lower buck and FOSS will do it for free…’
I think their train is gone now. They had their chance to do what you explained so well when XP was released; they would have had a competitive operating system by now which runs on a single chocolate bar, instead of one that sucks and uses one and a half gibibytes doing it… Well, better luck next time.
The problem with Thom’s suggestion is that the kernel is a much smaller part of an OS than many people realize. Essentially the conclusion of this piece is that only a tiny part of Windows Vista is worth keeping, while the rest should be scrapped. I don’t necessarily disagree, but arguing that it isn’t all bad on the grounds that at least the kernel is fundamentally sound is a bit of a stretch.
Let me reduce the OS market down to the fundamental platform models. Windows is modeled on backwards compatibility: release once, runs forever. Free software is modeled on transparency: watch the mailing list, try to keep up. Mac is based on periodic obsolescence: don’t use this library anymore, this new one is better.
This is what fundamentally defines the implicit contract between platform vendor and application vendors. Each platform has a particular arrangement, and while everything else is subject to change, they can’t renege on their deal with the devel(opers).
Linux vendors can’t freeze their userspace ABIs for 5 year periods. Apple won’t let third parties sully their beautiful platform by using antiquated libraries. Microsoft will never, ever make a clean break from backwards compatibility.
Microsoft’s age of technological leadership on the PC is over, and there’s nothing they can do about it. As I’ve argued many times, they will decline over an excruciatingly long period, and the less they struggle, the longer they will continue to be profitable. If they carefully maintain their house of cards, they have a good 20 years of black ink ahead. If they follow Thom’s advice, they’ll be bleeding in less than 10 years.
Their goal is not to produce a compelling, usable, and efficient platform. Their goal is to keep the gravy train rolling down the tracks. As long as they release something that vaguely resembles a modern OS and continues to run apps from the Bush Sr. administration, they’ll be fine. But if they make a clean break, they’re going to have a really hard time convincing anyone not to switch to Mac or Linux.
And nobody, not even Microsoft, can develop a modern operating system from just a kernel in less than 5 years, let alone the 3-year target for Windows 7. This is like putting a man on the moon. Ten years and $25 billion, minimum.
Actually, Windows compatibility is hardly what the article suggests. Everything after Windows 2k is only tested to comply with Windows 2k and later. For software that does not run on 2k and later, there is an emulator. But the API changes are minimal, and this is a huge problem, at least for me. The graphical libraries are ancient, and even file handling is years behind what you get on *nix and Mac OS. But it is what it is. After all, the MS business strategy is minimal improvements, but enough to make you upgrade. I am not so sure Vista qualifies as enough, but time will tell.
As far as 5 years being enough… you greatly underestimate the development power of MS. There are thousands of developers working on Windows, and the reason why so many are needed is because the code base they need to maintain is so vast. If, rather than maintaining, those same developers moved to active development, they could write a whole new OS and a new Office in 5 years. The problem, however, is that from a business perspective this is suicide, and MS will never do it. You cannot scrap compatibility, and while Office compatibility is easy, because there is a clearly defined standard, making sure that software written by various developers for previous versions of the OS still runs is a very hard task. This is why the API stays virtually unchanged and the developers really can’t change much. They could pull an Apple, but the problem is that MS management really doesn’t have the balls for that. Maybe not even the expertise…
Good informative article. Cleared up some issues for me and generally makes a lot of sense.
“PS: Microsoft, whatever you do with Windows 7, please do not create 12946 different versions. Ship one version, at a fixed price, and be done with it. It will save the world a whole lot of headaches.”
I agree wholeheartedly. They should just ship one DVD and let you unlock the version you want to install, desktop or server. Alternatively, if pricing is an issue, just issue two DVDs with each product.
I agree wholeheartedly. They should just ship one DVD and let you unlock the version you want to install, desktop or server.
Well they do ship all their [desktop] versions on one DVD and unlock one.
I respect what you are saying, but what I meant is there is really no need for all these desktop versions. Just have one. Even if you want to make that a barebones DVD per se and let the user add their desired functionality afterwards to trim the bloat down a bit.
I think it’s clear that the reason for the different desktop versions is to provide a set of different price points; some accountant guesstimated the set of price points that would maximize revenue. People talk of OSX having a single OS at a single price point, but Apple makes most of its money on hardware (and they release an upgrade every 12-18 months at that single price point). Apple, being primarily a hardware company, makes different versions of Macs and iPods at different price points calculated to maximize revenue. Microsoft, being primarily a software company, does the same for software (thus the different versions and price points of Windows, Office, Visual Studio, etc).
Then of course, there’s the EU-mandated “N” versions that nobody wants. :p
I agree completely, but you can’t apply traditional marketing strategies to software and other kinds of digital media. Different features at different price points work when the consumer associates the added features with added costs to the producer. For most kinds of goods and services, this holds true. But most software consumers understand that it doesn’t cost Microsoft any more money to press a Vista Ultimate CD than it does for any other version.
Let’s consider two illustrative examples. First, consider a miracle drug. People know that it doesn’t cost the pharmaceutical company $100 to make that pill, but they know that it cost them a lot of money to develop. If the doctor prescribed a new version of the pill that included aspirin and costs $200, the patient would be rightfully pissed. They’re OK with supporting the development costs, but they’re not OK will getting ripped off for added features that don’t cost much money to develop or manufacture.
Now consider buying a car at a dealership. You want the upgraded floor mats, but the dealer says they only come with the leather package, which costs $2000. You thank the dealer and tell him you’re going to the competitor down the street. Wait a minute, now you can get the floor mats for $75. While it’s silly that you have to play these games in order to get the features you want without paying for the ones you don’t need, at least it’s sometimes possible.
These examples are tangible goods where everybody realizes that you get what you pay for. We just don’t want to be nickel and dimed out of our hard-earned money. Not even this much is true for digital media, where a lot of people have no problem with making a copy and not paying for it. Microsoft is in an industry where piracy is rampant, and they respond with seven different versions of Vista at various price points. What are they thinking?
My guess is that Microsoft wants to offer Windows to people that don’t need all the features and don’t want to pay for them, but they don’t want to lower the price of a Windows version with all the features, because that would severely hurt their profit margin if everyone was able to get all the features for that reduced cost.
If they offer only one version at about what Ultimate goes for, then not everyone can afford it or is willing to pay that much.
It’s not about making people pay extra for features, but making Windows more affordable to people who don’t need/want all the features, without pissing off shareholders. Remember the shareholders.
My guess is that Microsoft wants to offer Windows to people that don’t need all the features and don’t want to pay for them, but they don’t want to lower the price of a Windows version with all the features, because that would severely hurt their profit margin if everyone was able to get all the features for that reduced cost.
Bingo.
But I think three desktop versions would have sufficed: Home Premium, Business, and Ultimate. “Enterprise” should have just gone under the “Business” moniker, i.e. “Business Subscription”. The Home Basic and Starter editions are pointless.
My ideal price point for Home Premium full is between what the Home Basic & Home Premium OEM versions actually cost. I believe Microsoft would make around the same amount of profit that way, perhaps even more. The full versions are ridiculously overpriced, and I bet few people actually buy them compared to the OEM and upgrade versions. Just have one price for each edition and be done with it – no OEM, upgrade, or full edition mess. That InformationWeek dude talked about Linux being an open-source “mess”; I don’t see how navigating Microsoft’s pricing schemes and editions is any less confusing, and it all comes from the same company.
Microsoft can give discounted prices to individual vendors as they please, and they can significantly reduce the cost of Windows in third-world countries that would get the Starter Edition. But, please, just cut down on the number of editions and have one pricing scheme for each. ;p They can easily maximize their profit with just three desktop editions and one pricing scheme.
Given the current enormous profit margin of Microsoft, I think it is quite funny to suggest that they offer the ‘truncated’ versions out of good will, if I understood correctly what you said.
Good will? No. It’s a way to expand the user base and maximize profit. It’s to please the shareholders.
But some consumers do benefit from it.
“Even if you want to make that a barebones DVD per se and let the user add their desired functionality afterwards to trim the bloat down a bit.”
If the extra code never executes (i.e. it’s a feature not available on the edition you have installed), how is that bloated? Or are you just saying that because the DVD is a fixed size regardless of the version you install, that it’s bloated?
People say that exact thing about Linux distros constantly, so it would be no wonder really if they said that about that Windows DVD too.
The difference is that on a Linux DVD you get all the software you could just about ever need; on a Windows CD you get what Microsoft thinks you need.
Are you willing to put money on it that the next version of Windows will be smaller? And that it won’t require more RAM for no real reason except to cache it?
Hard disks, flash memory, and RAM are getting bigger, but that’s no reason to make your OS grow to fill them while at the same time having problems with 4GB of RAM.
“If the extra code never executes (i.e. it’s a feature not available on the edition you have installed), how is that bloated? Or are you just saying that because the DVD is a fixed size regardless of the version you install, that it’s bloated?”
I think I just had a brain fart there.
“PS: Microsoft, whatever you do with Windows 7, please do not create 12946 different versions. Ship one version, at a fixed price, and be done with it. It will save the world a whole lot of headaches.”
One could argue that someone who finds the different versions a source of confusion shouldn’t be using computers.
How long until someone makes mention of the “Microsoft Defense Brigade” in this article?
(please mod this down)
Too late, you already mentioned it.
Probably about as long as it takes for someone to say “Linux zealot” or “Apple fanboy” in articles about Linux or Apple.
I like the idea of packing old legacy apps into VMs a lot. Actually, this is what I’m doing right now. But it opens up many more possibilities: you don’t have to run Windows at all, natively. Choose the host OS that you like and that best fits your needs. And something tells me that this is what Microsoft really dreads: freedom of choice for their customers, and the possibility to move away from an OS monoculture. Let’s not forget that far-reaching lock-in contracts are part of Microsoft’s business culture.
While most everyone agrees that stripping backward compatibility would be the best thing to happen to Windows, I’m resigned that it won’t happen.
Simply put, Apple had nothing to lose by scrapping everything and placing the burden of compatibility in a molasses-slow Classic environment. Until MS has everything to lose, as evidenced by significant market share losses, they will not take the same risk. As of now, even with all of Vista’s bad press (much of it unjustified), the competition still can’t shake the pillars.
Besides, there will always be a very vocal group complaining that “Microsoft is forcing me to upgrade my apps/hardware to get Windows x.x.” I never understood this argument. Don’t upgrade. It really is that easy.
Windows 7 will probably be an incremental upgrade on Vista, in the same way that Leopard, Tiger, etc are for OS X. It will clean up problem areas and probably add a new major feature or 2. For about the past 8 months, MS has stated they like that path on more than 1 occasion.
MS has always been pragmatic.
When computers were not powerful enough to run a good GUI in an X-like fashion, they integrated the graphics into the kernel. When most of the world was not networked, they didn’t bother with all the user privileges…
Sure, you can say these decisions have caused MS trouble, especially in the security area. But they made things work at the time. Just like how they adapted to the ‘networked internet’ age with XP/Vista.
Now, with virtualisation becoming more feasible/popular on desktops/x86, MS is most likely going to incorporate that. This way, as the author says, they can bypass the whole compatibility layer and just have you run MS-DOS/Win9x programs in a virtual environment.
We can only wait and see how they implement it. Past performance is not an indicator of future performance.
I generally find articles about Windows to be one extreme or the other. I use XP full time at work and our servers are almost entirely 2000/2003, and it’s a setup that works very well. At the same time, I won’t allow Windows in my house anymore, we’re Mac/Linux only.
Anyway, it’s true that XP has reached a great point where it is really stable, very usable, and great for everyday use for most people. It’s also true that Vista is a massive failure and very few people want or need it. So Thom’s standpoint, that 7 will need to do something dramatic if Microsoft wants to maintain its standing, is pretty dead on for me. I’ve long said that ditching compatibility for a new core design is the way to go.
Although I’m not so convinced that NT is necessarily the way to go. Yes, drivers are all written for it, so it would be nice, but I think using the Solaris kernel would be a nice change for processor and datacenter scalability; it’s really the userland that will matter, as Microsoft has the resources to do this well if it’s managed properly.
Time will tell.
Yes, drivers are all written for it, so it would be nice, but I think using the Solaris kernel would be a nice change for processor and datacenter scalability
The day Microsoft uses an open-source, Sun-developed kernel for their OS is the day hell freezes over.
Yeah, it was just fantasy.
It would be interesting to see MS at its best. It has been a long time. And all the smart people seem to work on stuff that isn’t the core. The committee that designed Vista should be fired.
Such a Windows Legacy environment would require a special type of VM.
What I’m saying is that the VM should work with the host’s memory manager, as well as with “Windows Legacy”, which should relinquish at least its memory management and file caching duties to the host. Like this, the VM can always run with the smallest footprint necessary. Optionally, other subsystems could be replaced (or paravirtualized, if that applies).
The idea is being able to still supply the native APIs, while accelerating and optimizing what’s possible. Apart from memory, caching, disk and probably video, anything else could be virtualized/replaced later on, by means of updates or service packs (like network, audio, etc).
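For what it’s worth, what is described here closely resembles what existing hypervisors call a balloon driver: a component inside the guest that, on request from the host, pins guest memory so the host can reclaim it, and releases it again when legacy applications need room. A purely schematic sketch of that control loop follows (all names invented; real balloon drivers work at the page level inside the guest kernel):

```python
# Schematic balloon driver for a hypothetical 'Windows Legacy' VM.
# Invented names and units, for illustration only.

class BalloonDriver:
    def __init__(self, guest_ram_mb):
        self.guest_ram = guest_ram_mb
        self.ballooned = 0  # MB pinned in the guest and handed back to the host

    def inflate(self, mb):
        # The guest allocates and pins pages it promises not to touch,
        # so the host no longer has to back them with real memory.
        self.ballooned = min(self.guest_ram, self.ballooned + mb)

    def deflate(self, mb):
        # The guest reclaims pages because a legacy app needs them.
        self.ballooned = max(0, self.ballooned - mb)

    def footprint(self):
        # What the host must actually keep resident for this VM.
        return self.guest_ram - self.ballooned

vm = BalloonDriver(guest_ram_mb=1024)
vm.inflate(768)        # legacy environment idle: host reclaims 768 MB
print(vm.footprint())  # 256
vm.deflate(512)        # a legacy app launches and needs its memory back
print(vm.footprint())  # 768
```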
Duke Nukem Forever first person shooter. I believe it was started in 1997. Ten years in the running. Their website says it will be done when it is ready. No release schedule, no dates, no preordering. There are even old sample videos from 2001 kicking around. Remember John Romero’s Daikatana? Same problem, retooling after retooling, great features hyped, but most people hated it when it came out. Some reviewers proclaimed it was “unplayable.” Before I left the software field I remember that the constant retooling pattern in software was usually a symptom of design/organizational problems.
Now that I think about it, if the kernel stays the same, there’s not even a need for a VM. See WOW64, where a barebones but fully functional 32-bit userland sits on the 64-bit kernel, with thunking in between.
Why struggle with maintaining backward compatibility when Microsoft already owns the perfectly backward-compatible software: the older OSes? Microsoft already owns virtualization software.
Hardware that will run the new OS is powerful enough to deal with VMs.
Why not include MS-DOS 6.x, Win 95, Win 98, Win 2k… as an integral part of the OS, running in a VM? Rather than maintaining backward compatibility, focus on developing the best VM that runs your old software and integrates with your new OS. Then, when an application is installed, automatically associate it with a VM and automatically run it in that VM.
What you will end up with is the best OS possible, since you do not have to deal with old baggage, the best virtualization experience out of the box, and the best backward compatibility, since the applications will run in the OS for which they were developed.
I like the article. The bit that worries me though, is:
APCMag reports that Julie Larson-Green, who was the driving force behind Office 2007’s new Ribbon user interface, has been transferred to the Windows 7 GUI team.
I really can understand the benefits of the ribbon UI to new Office users. I find the idea of Windows with a ribbon-everywhere UI rather upsetting, however. Just imagine what evil things they can devise in 5 years’ time…
It’s not logical to use the ribbon UI throughout Windows. Who says they would do that, though? Just because the man and the woman behind the push are on the Windows team now? That’s a poor connection.
“I won’t allow Windows in my house anymore, we’re Mac/Linux only.”
Same here.
But I read the article, and to me Windows Vista was an ‘infrastructure release’, where all the infrastructure relating to those new technologies was merged, and ‘Windows 7’ will be a release that takes advantage of those new frameworks.
One could argue that what we saw in ‘Longhorn’ before the big ‘slash and burn’ was meant to be what we’ve seen being talked about for ‘Windows 7’; they rolled back the features in Vista in favour of having them appear in a later release.
The question is whether Microsoft ends up biting off more than they can chew – the biggest problem is Microsoft’s arrogance: their unwillingness to admit they made mistakes, or to admit that rivals do have some positive aspects to their development model which allow them to be agile and move quickly to meet changing customer demands.
That’s pretty much it: Vista can be compared to OS X 10.0. It is only with Leopard that Apple has said it is where it wants to be, which is why they are switching to a longer release cycle.
As for biting off more than they can chew, you have to keep in mind that MS is experimenting with a more open development cycle, for the purpose of having a community that feels more involved. This is why all the developer blogs appeared, Channel 9 started, and, for the first time, they talked in detail about what they were doing with Vista. Cutting features is a part of life when you have a deadline, just because you can never really predict what will happen. For example, chances are that if MS hadn’t halted production for the XP code review, WinFS would have shipped. What they ended up doing is raising expectations to epic levels, and then completely underwhelming everyone with their actual release.
True, but the difference is that Apple also took advantage of those new features. With each release they would push their applications to use the new APIs which were made available.
A new release of Mac OS X, then a few months later, updates to Apple’s products which take advantage of cool new features – something that is sorely lacking in Windows land.
As I said in a previous post, 6 widget kits were noticed within a few hours of using Windows Vista – why hasn’t Microsoft standardised *ALL* their bundled operating system applications on one widget and API set? Why didn’t Office 2007 use the new Avalon enhancements?
But it is appearance more than reality – the reality is that Microsoft is just as distant as it was before; it’s recycling the same garbage as before. There is no effort to address end users’ requests – getting rid of legacy crap such as win16 and deprecated win32 calls, fixing up their interface, fixing up their bundled applications, etc.
Heck, following one thread on a forum, there was the fonts manager that was still using win16 widgets, for Christ’s sake! It truly is getting that terrible – in the end, Microsoft is simply doing the motherly thing of “yes, yes, I know dear”, but the reality is, they’re hoping that if they say ‘yes, yes, I know dear’, eventually the ‘great unwashed masses’ will shut up and be damn grateful for what is being produced by the ‘great Microsoft’.
But the Windows XP code review hasn’t actually achieved anything – if it were a full code review, they would have fully fixed up all the bundled applications: pushing them to use all the same kit, moving all their applications to the safe versions of standard win32 calls, and removing, rather than ‘working around’, unsafe calls that exist within win32 – removing backwards compatibility that was a security risk.
Has it improved security? No. The fact is that here we are, almost 6 months after the release, and a flurry of security updates have come through. No Internet Explorer 8, still major stability issues, security issues appearing on a regular basis, with some just being pushed to the back in the hope that they’ll be ignored by the media.
If Microsoft truly meant security, Windows Vista would have been a ground-breaking change; it would have removed tonnes of ancient crap from the code base, it would have been clean, pristine – code fully audited. Sure, a few crappy applications would not work, but by and large it would be a stable operating system focused on the future rather than maintaining compatibility with the past just to keep those in the cheap seats with their 10-year-old applications and 20-year-old hardware happy.
I agree with the different UIs. I like most of them for various reasons, but the control panel is COMPLETELY different from IE, which is completely different from Explorer, which is completely different from WMP, which is completely different from Office, which is completely different from VS.
I am 100% behind removing the menu bar on non-Apple operating systems, but the whole ADD approach to interface design that MS is using nowadays is kind of baffling.
As for compatibility, I agree here too. Compatibility is essential for a business platform, and this has traditionally been Microsoft’s greatest strength. However, sometimes you need to clean house, and since Vista adoption will likely take even longer than XP’s (which took around 4 years to get a decent market share), it would have been a good idea to do it here.
The code review did quite a bit. Pre-SP2, XP security was a joke. Post-SP2, it was what one would expect from an operating system expected to connect to the net. The changes in security for Vista run a lot deeper, and are analogous to stuff like SELinux. Like everything else in Vista, it is very new and untried in the real world, but the difference in security is night and day.
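To make the SELinux analogy concrete: the mechanism being alluded to is Vista’s Mandatory Integrity Control, where every process token carries an integrity level that the kernel enforces on top of the usual ACLs. A minimal C sketch (assuming a Vista-era Windows SDK; not Microsoft’s sample code) that queries the current process’s level:

    /* Query the current process's integrity level (Vista and later). */
    #define _WIN32_WINNT 0x0600
    #include <windows.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        HANDLE token;
        DWORD size = 0;

        if (!OpenProcessToken(GetCurrentProcess(), TOKEN_QUERY, &token))
            return 1;

        /* First call fails but tells us the buffer size we need. */
        GetTokenInformation(token, TokenIntegrityLevel, NULL, 0, &size);
        TOKEN_MANDATORY_LABEL *label = malloc(size);

        if (label && GetTokenInformation(token, TokenIntegrityLevel,
                                         label, size, &size)) {
            /* The last sub-authority of the label SID is the level:
               0x1000 = low, 0x2000 = medium, 0x3000 = high, 0x4000 = system. */
            DWORD count = *GetSidSubAuthorityCount(label->Label.Sid);
            DWORD rid = *GetSidSubAuthority(label->Label.Sid, count - 1);
            printf("Integrity level: 0x%04lx\n", rid);
        }

        free(label);
        CloseHandle(token);
        return 0;
    }

This is what keeps, say, Protected Mode IE7 (running at low integrity) from writing to a user’s files even though the user’s ACLs would allow it – which is why the comparison to mandatory access control systems like SELinux is a fair one.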
I am one of the lucky few with hardware that runs Vista with no problems, and my experience has brought me back to using Windows full time. I actually expected something far less than what was delivered.
Just have a look at the bundled applications: Notepad, WordPad, and the Fonts control panel all use different widgets.
What’s worse is this fixation by vendors on their ‘branding’ and ‘skinning’ – I don’t want skinning. I want a stable application that plays my music without hogging a massive amount of memory.
Take Rhythmbox – that’s how a media player should look: simple, straightforward, and it does the job of being a media player as it should.
As for the control panel, I’d love to know who designed it in Windows Vista, because quite frankly it’s the interface from hell.
For me, I don’t care what approach they take, just so long as it is consistent. When Microsoft chooses something, it would be nice for the WHOLE of the Microsoft organisation to use that damn standard rather than each division doing its ‘own thing’ like some sort of out-of-control teenager.
It’s been 4 years for third-party companies to upgrade and update their software – I’m sorry, but 4 years is a damn long time for companies to actually test their applications, remove crufty code and ensure that their products work with the spiffy new Windows.
Heck, the new, safe APIs already existed in Windows XP, so they could have started back in 2001, moved their software to the new safe calls, and by the time Vista was ready to ship, it would all have been compatible.
True, but they never went far enough. They identified a whole list of security-risk calls and problems – why didn’t they draw a line in the sand, say “this is being removed from Windows in the next release”, and send out a tool so that companies could identify code that needed to be changed? Again, it would require some leadership on Microsoft’s part.
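For what it’s worth, the migration being talked about here is very concrete: the ‘safe’ CRT variants that shipped alongside Visual Studio 2005 take an explicit buffer size and report a problem instead of silently overflowing. A minimal sketch, assuming the Microsoft CRT (the buffer and input below are made up for illustration):

    /* Old C runtime call vs. the "safe" variant deprecated in its favour. */
    #include <stdio.h>
    #include <string.h>
    #include <errno.h>
    #include <stdlib.h>

    int main(void)
    {
        char user[16];  /* deliberately small buffer for illustration */
        const char *input = "a-name-longer-than-sixteen-bytes";

        /* The old call would silently overflow the 16-byte buffer:
             strcpy(user, input);
           strncpy_s with _TRUNCATE clips the string and reports it,
           instead of corrupting memory. */
        if (strncpy_s(user, sizeof user, input, _TRUNCATE) == STRUNCATE)
            fprintf(stderr, "input was truncated to fit the buffer\n");

        return 0;
    }

And a tool of sorts does exist: compiling old code with the VS2005 compiler flags the deprecated calls with C4996 warnings. The complaint above is precisely that the deprecation warning was never followed through with actual removal.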
My computer came preloaded with Windows Vista Business Edition, and it was disappointing. When Solaris x86 can boot to the desktop in under half the time, my wireless is more reliable, and I can do more out of the box than I could with Windows Vista Business + the HP kit, it’s a sad indication of just how bad OEM vendors and Microsoft have become in terms of their ability to provide the end user with something that “just works”(tm).
“The biggest problem is Microsoft’s arrogance: their unwillingness to admit they made mistakes.”
How can you claim that Microsoft refuses to admit mistakes when they volunteered to be interviewed for that NY Times article in which they admitted that the development process for Vista was entirely screwed up (too many interdependent things being developed at the same time, resulting in code being checked in with minimal to no testing, because testing wasn’t possible due to dependent modules not being complete), resulting in the “reset”?
And numerous mistakes are admitted to in blogs. I recall there was a bug in the initial release of VS2005 that was apparently introduced late in its development (so it appeared after the beta). The programmer himself acknowledged in his blog that it was he who screwed up.
Same thing happened wrt the Vista voice-recognition demo that went awry. The programmer responsible openly acknowledged that responsibility in his blog.
Microsoft openly admits that “Adaptive Menus” in Office were a mistake (see Jensen Harris’s blog). Microsoft openly admits that MacWord 6 was a mistake wrt UI and performance (read various MacBU blogs). Microsoft admits that letting IE6 languish was a mistake (see various IE blogs).
These are just examples off the top of my head. There are many other such admissions in blogs, Channel 9 interviews, and developer conferences.
MS admits to mistakes much more freely than their “rivals”. I even remember iPods shipping with a virus, and Apple issuing a press release blaming Microsoft for not making Windows secure enough to combat the virus!
As for “arrogance”, can you honestly say that Microsoft is any more arrogant than Apple, IBM, Sun, Red Hat, Oracle, the FSF, etc.? And on a personality level, Gates appears absolutely humble compared to the likes of Jobs, McNealy, and Ellison (whose ego is so huge that, when his net worth exceeded Gates’s for a few weeks due to stock fluctuations, he had Oracle issue a press release trumpeting himself as the world’s richest man. lol). I’d even say that RMS is more arrogant than Gates (certainly RMS and the FSF are more *self-righteous* than Gates and Microsoft), and is the least likely of all those I’ve mentioned to “admit mistakes”.
“[MS also refuses to] admit that rivals do have some positive aspects to their development model which allows them to be agile and move quickly to meet changing customer demands.”
Microsoft has admitted that Apple’s scheme of releasing more modest OS upgrades every 12-18 months has advantages. MS blogs admit the advantages of the agility of the OSS development model.
Microsoft, today, is the most publicly transparent software development company in history, with 4000 developer blogs, hundreds of Channel 9 developer interviews, and more developer conferences than anyone else. Those blogs, interviews, and conferences are a wealth of info, including admissions of mistakes and acknowledgements of the virtues of their competitors. Can their “rivals” even begin to say the same?
Interesting, and yet, as I said previously, 6 widget kits were spotted in the space of 3 hours of using Windows Vista – was that an audit, or more like ‘ram the code through a smart compiler and fix things up as it goes along’?
And he (along with his superior) got to keep his job? It makes me laugh that the decision to ship a product with a known fault was made by those who tick off whether something can be officially released.
Yes, but if nothing occurs because of it, then it’s a wasted effort of electrons. It’s nothing more than, “oh, I stuffed up, but I haven’t learnt anything from it, nor will I change anything as a result of it”.
Only forced to admit it after they were pushed into a corner; it’s like pinning someone up against the wall and making them admit they did something wrong – what response would you expect in a similar situation?
Yes, they admit problems with the registry – but they do nothing to actually address the damn problem!
Heck, I could sit down with you and say that I could probably stand to lose a few kilos, but if I don’t actually get off my fat chuff and do something about it, saying it is wasted oxygen.
That is the problem: admitting a mistake, but doing nothing to resolve it.
Heck, take the latest issue regarding security and the idea of ‘mandating’ that all future Microsoft work use the new safe calls rather than the old ones – great. What about the old stuff? Why not update their existing products and ship the changes in a service pack? Why continue to provide the old unsafe way, when the better approach would be to simply announce that it’ll be pulled in 6 months’ time and that it is up to developers to update their software?
Comparing wrong with wrong doesn’t prove anything. It’s a bit like a Christian getting defensive over Christian terrorists by saying “but there are Islamic terrorists too”, as if that somehow makes a point – and the point is? There is no point – it’s a way of deflecting attention from the subject at hand.
The issue is Windows and the lack of fixing issues within the product. They had the opportunity to break compatibility, make massive changes, but they chose not to.
I don’t care about personalities; whether Ballmer likes attending orgies, whether Bill Gates has a secret longing for a sugar daddy – I don’t care. The issue at hand is organisations, not individuals.
Microsoft has proven that it has no willingness to make changes. Change the curtains, add a lick of paint, but it’s not going to change the fact that the house has rotting walls, dodgy wiring, a leaking roof and an infestation of bugs.
You said Microsoft refuses to admit mistakes, then when shown to be completely incorrect, you move the goalposts. You talk of Microsoft not admitting mistakes and here you are unable to admit that you were wrong. How about admitting your own mistake rather than trying to spin away from it?
I just want to address some points you made (the rest I’ll leave, since I’d be merely repeating what I said before):
“And numerous mistakes are admitted to in blogs. I recall there was a bug in the initial release of VS2005 that was apparently introduced late in its development (so it appeared after the beta). The programmer himself acknowledged in his blog that it was he who screwed up.”
“And he (along with his superior) got to keep his job? It makes me laugh that the decision to ship a product with a known fault was made by those who tick off whether something can be officially released.”
They didn’t discover the problem until after the product shipped. But products always ship with known bugs – those deemed too risky to fix (i.e. the importance of the problem is outweighed by the risk of the fix, when there’s not a lot of time left in the development schedule to test the fix and make sure it didn’t break anything else).
“Same thing happened wrt the Vista voice-recognition demo that went awry. The programmer responsible openly acknowledged that responsibility in his blog.
Yes, but if nothing occurs because of it, then it’s a wasted effort of electrons. It’s nothing more than, ‘oh, I stuffed up, but I haven’t learnt anything from it, nor will I change anything as a result of it’.”
*sigh* They fixed the problem. What more do you want? And why do you assume he learned nothing from it?
“Microsoft openly admits that ‘Adaptive Menus’ in Office were a mistake (see Jensen Harris’s blog). Microsoft openly admits that MacWord 6 was a mistake wrt UI and performance (read various MacBU blogs). Microsoft admits that letting IE6 languish was a mistake (see various IE blogs).
Only forced to admit after they were pushed into a corner; its like pinning someone up against the wall and making them admit they did something wrong – what response would you expect in a similar situation?”
Who cares? They admitted the mistakes and addressed them. Good grief. They don’t pretend to be “perfect” or “holy” like their “rivals” do. These are humans, not Gods.
“The issue is Windows and the lack of fixing issues within the product. They had the opportunity to break compatibility, make massive changes, but they chose not to. “
They tried massive changes and found it too much, so they did the reset and will go with incremental changes now. Besides, you’re wrong anyway: they introduced a new and improved driver model, which did break compatibility with old drivers, and for which they are being ripped now. You want them to break more compatibility and get ripped all the more?
“Microsoft has proven that it has no willingness to make changes.”
Shown no willingness to make changes? Have you seen Office 2007? And there are those ripping them for making Office 2k7 too different from previous versions.
You’re setting unrealistic goals for them to meet, then ripping them when they don’t meet them.
Edit:
One more thing:
“Yes, they admit problems with the registry – but they do nothing to actually address the damn problem!”
I don’t know what problems you’re specifically referring to. But they now encourage app settings to be stored in the app’s own config files in the user’s home folder (within Local Settings, and whatnot). .NET provides the Isolated Storage API for just that purpose, and WinForms builds on that to provide the Settings API, which allows app settings to be stored easily in XML files. The registry still has its purposes (system-wide info), but it’s locked down using the same user permissions as the file system itself.
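As a rough illustration of the file-based approach – here from the Win32 side rather than the .NET APIs mentioned above, and with “MyApp” and “settings.xml” as made-up names – an application resolves the user’s local application data folder and writes its settings there instead of to HKCU:

    /* Store per-user settings in a file under the user's local
       application-data folder instead of the registry.
       "MyApp" and "settings.xml" are hypothetical names.
       Link with shell32.lib. */
    #include <windows.h>
    #include <shlobj.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char path[MAX_PATH];

        /* Resolve the current user's local app-data folder
           (the XP-compatible call; Vista adds SHGetKnownFolderPath). */
        if (SHGetFolderPathA(NULL, CSIDL_LOCAL_APPDATA, NULL, 0, path) != S_OK)
            return 1;

        strcat_s(path, sizeof path, "\\MyApp");
        CreateDirectoryA(path, NULL);          /* no-op if it already exists */
        strcat_s(path, sizeof path, "\\settings.xml");

        FILE *f = NULL;
        if (fopen_s(&f, path, "w") == 0 && f) {
            fputs("<settings><volume>80</volume></settings>\n", f);
            fclose(f);
        }
        return 0;
    }

Because the file lives under the user’s own profile, it gets the same per-user permissions as any other file, which is exactly the point being made about the locked-down registry.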
I have advocated this for years. One of the best things Apple did for Mac OS was to split off the legacy 9.x base and require <= 9.x apps to run in a virtualised environment. Doing the same for Windows would reap huge dividends. The OS is hamstrung because MS’s developers have to cater to all those legacy users out there. The NT kernel-land is wonderful. The user-land is a monstrosity. With an OS version cleared of all that tangling brush and scrub, Microsoft could finally release something that can go toe-to-toe with all the various *ix-family operating systems.