“The difficulties in developing Vista stemmed from its monolithic structure and the need for ‘backwards compatibility’, i.e. ensuring that software used by customers on older versions of Windows will work under Vista. This vast accumulation of legacy applications acts like an anchor on innovation. The Vista trauma has convinced some Microsoft engineers that they will have to adopt a radically different approach.” I said something similar months ago.
To innovate you must be given free rein to do so. If you’re not, you will always be held back, forced to reinvent the wheel.
Well, I’m not entirely sure I agree with that.
Constraints are a vital part of defining what something should do, and who it should do it for. Innovation is finding a way of achieving something *better* and/or with fewer irrelevant constraints.
And there’s no need to reinvent the wheel, just because you share familiar constraints with other car manufacturers! Keep the wheel, if rolling along is your problem, but invent a new low-friction bearing, or a chassis, or an integrated drinks-dispenser powered from the exhaust. Or whatever.
Innovation doesn’t mean throwing everything away and starting again in a different direction.
You also have a responsibility to your existing customers. Innovation in a vacuum is much, much easier than (although rarely as fruitful as) moving forward steadily, and dragging the world with you…
That said, there comes a time when you’ve replaced the head and the handle of the hammer a few times, and it is no longer the tool it was. That’s probably more where we’re heading with Windows over the next decade.
Innovation doesn’t mean throwing everything away and starting again in a different direction.
It does if the product that was originally built was fundamentally flawed from the start. This is one of the essential arguments about the use of Windows; it might have been usable for the masses, but it was built poorly to begin with.
Only now are we seeing the “visible cracks in the foundation”; however, the flaws have been there from the beginning. Had Microsoft built the system correctly and seen the potential the Internet held for its business, we might have seen a better product.
Well – I don’t actually agree with that at all. Very, very rarely is the right decision to throw *everything* away and start again. The risk of losing lessons learned, and the cost of reconstructing stuff that isn’t really broken is too high.
Look at Netscape – it took the best part of a decade to get Firefox up to par, time in which a different engineering approach has brought IE to within spitting distance of it in terms of security, performance and features. And I’ll bet far less effort was expended on IE for more-or-less the same net result.
Don’t like the Win16 API or the Win32 API – fine. Lots of APIs have already been deprecated there – but the decision not to support them any more is taken based on analysis of actual lines-of-code in the wild.
As you say, the .NET APIs have been created with a view to a different software model going forward. Someone else mentioned the Singularity research OS, and a VM approach could also be taken to run a lot of legacy code. All of these are steps on an evolutionary path, not a “junk it and start again” policy.
Why? Because you still need to take your customers with you on that journey. The outcry from users around changes in XPSP2 and Vista that undoubtedly make the product more secure (which always == less user friendly), but break some existing software, indicates that this isn’t just Microsoft’s decision.
Well – I don’t actually agree with that at all. Very, very rarely is the right decision to throw *everything* away and start again. The risk of losing lessons learned, and the cost of reconstructing stuff that isn’t really broken is too high.
Of course the irony here is that MS has already done that and pulled it off: Windows vs. Windows NT. Unfortunately they then decided to merge the 2 branches, no doubt implementing all kinds of workarounds and adding bloat. In the process they also threw away the NT brand, which at least had some credibility, in favour of the Windows brand.
Unfortunately they then decided to merge the 2 branches, no doubt implementing all kinds of workarounds and adding bloat.
“No doubt”? With theming support disabled (which wasn’t present in Win9x anyhow), XP is very much like 2000 in terms of performance, functionality, and stability.
In the process they also threw away the NT brand, which at least had some credibility, in favour of the Windows brand.
Neither Windows 2000 nor Windows 2003 bear the NT name, despite not being home-consumer-oriented operating systems. I’m not quite sure how the NT stamp changes things, anyhow.
> Windows vs WindowsNT. Unfortunately they then decided to merge the 2 branches…
Windows Mistake Edition. But Windows 2000 was much better, especially if you set almost 30 of its unnecessary, resource-hogging, security-decreasing Windows services to Disabled and installed a firewall.
It does if the product that was originally built was fundamentally flawed from the start. This is one of the essential arguments about the use of Windows; it might have been usable for the masses, but it was built poorly to begin with.
Your premise is flawed. Windows has already gone through several rewrites since its inception. Windows NT ain’t Windows 95. Similarly, Windows Vista ain’t Windows XP. Many people don’t realize that Vista has a brand new kernel, a brand new driver model (LDDM), UAC, new graphics stack, etc. I would encourage you to read and learn about the changes.
No it doesn’t; Vista has brand-new kernel *components*. The kernel itself has fundamentally remained the same.
He’s saying there is nothing wrong with the NT kernel; it’s great. It’s how MS implements Windows on top of the NT kernel that’s the problem.
Your premise is flawed. Windows has already gone through several rewrites since its inception.
Exactly my point. They’ve rewritten it; but the essential code for Windows hasn’t fundamentally changed.
Think of it like a house on that ABC makeover show. More often than not, they demolish the house, because the sheer number of issues that might come up during restoration would be too great to handle; it’s much easier to build from scratch so you know the house is more structurally sound than it was before.
Windows NT ain’t Windows 95
No it isn’t…and really it wasn’t even Microsoft’s either. Windows NT was OS/2, with a bit of VMS thrown in.
Many people don’t realize that Vista has a brand new kernel, a brand new driver model (LDDM), UAC, new graphics stack, etc. I would encourage you to read and learn about the changes.
I have. The code is based on Windows Server 2003, so they haven’t really built it from scratch. They’ve added on more modularly with the drivers, UAC, and graphics; however, the underlying code remains the same.
And this is the second iteration of Vista; the first was built upon XP code, and ended up being dumped because the code was too shaky to reasonably release.
No it isn’t…and really it wasn’t even Microsoft’s either. Windows NT was OS/2, with a bit of VMS thrown in.
Microsoft co-developed OS/2. Among other things, they provided the shell and file system.
NT is not OS/2 or VMS. It has influences from both, but has code from neither. It was started from scratch after the split with IBM.
I have. The code is based on Windows Server 2003, so they haven’t really built it from scratch. They’ve added on more modularly with the drivers, UAC, and graphics; however, the underlying code remains the same.
There have been code changes from the loader up. The underlying code can’t be the same if it has changed, and it has changed significantly.
>NT is not OS/2 or VMS. It has influences from both, but has code from
>neither. It was started from scratch after the split with IBM.
Maybe not in the kernel, but higher up, I doubt Microsoft made the OS/2 compatibility subsystem that still exists in Windows XP* without code from OS/2.
Also, so many of NT’s filenames come from OS/2 that I doubt they are completely unrelated.
*The OS/2 subsystem in NT/2000/XP only supports 16-bit, console-mode, OS/2 1.x applications. There are very few of them in existence. I have a copy of OS/2 1.3’s CMD.EXE which I used to confirm that this functionality still exists in XP.
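For the curious, one way to check that a binary really is a 16-bit OS/2 1.x executable (rather than a DOS or Windows one) is to inspect its New Executable header. A minimal Python sketch of that check, assuming the standard NE layout (the helper name `ne_target_os` is mine; the offsets are from the NE format: `e_lfanew` at 0x3C in the MZ stub, target-OS byte at 0x36 in the NE header, 0x01 = OS/2, 0x02 = Windows):

```python
import struct

def ne_target_os(data: bytes):
    """Return 'OS/2', 'Windows', 'other', or None for non-NE input.

    Reads the MZ stub, follows e_lfanew (offset 0x3C) to the secondary
    header, and, if that header is a 16-bit 'NE' (New Executable),
    returns its target-OS byte at NE offset 0x36.
    """
    if len(data) < 0x40 or data[:2] != b"MZ":
        return None  # not even an MZ executable
    (e_lfanew,) = struct.unpack_from("<I", data, 0x3C)
    if len(data) < e_lfanew + 0x40 or data[e_lfanew:e_lfanew + 2] != b"NE":
        return None  # secondary header missing or not NE (e.g. PE, LE)
    target = data[e_lfanew + 0x36]
    return {0x01: "OS/2", 0x02: "Windows"}.get(target, "other")
```

An OS/2 1.x CMD.EXE should come back as `"OS/2"`, while a Win16 binary with the same NE signature would report `"Windows"`.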
So what’s your point, exactly? Microsoft co-developed OS/2 with IBM. At some point (when Windows became MS’s primary focus), MS and IBM parted company. Influences != code. The OS/2 compatibility subsystem in Windows is essentially wholly independent of the rest of Windows — just like the POSIX subsystem for Windows is independent.
Microsoft has serious issues with its Windows operating system. Windows’ roots are more than twenty years old. And while Unix’s roots dig much deeper, the UNIX API has been (more or less) stable for a very long time.
Windows, on the other hand, has had a rough time. Windows Vista still has support for (most) DOS programs. And even the Win16 API is still supported. And don’t forget all those other virtually dead APIs (MFC? OLE?) Windows has to drag along to support all those age-old applications.
Microsoft has to make a clean cut. Have you ever compared the whole API stack to the UNIX stack (And don’t be afraid to include stuff like GNOME)? The Windows API stack is HUGE. No, it’s IMMENSE. Not to say GARGANTUAN.
It’s true that this is/may be one of the reasons why Microsoft introduced its .NET Framework. But if Microsoft really wants to make a clean cut, they’ll have to use the stuff themselves. (Which they didn’t in Windows Vista.)
Another thing Microsoft has to fight is interdependence between their “modules”. My boss once printed out a (very simplified) module interdependence chart of Windows. It’s a fricking maze. No one could ever understand it. Microsoft needs clean interfaces, clean cuts. No exceptions. Not even for their own applications. No hacks. A clean, proper structure, unburdened by all those legacy APIs.
Maybe they’ll deliver this with Vista + 1. But I doubt it.
I think they should go for BSD (well, no. I think they should go for Linux. But given the licence, it’s not likely, is it?)
I used to think MS building a new OS on top of BSD was unlikely, because it “would destroy the credibility that MS has with Windows”. But that’s ignoring the fact that:
1. Apple did it, and it’s done them NO harm. Nothing but good, in fact.
2. I should imagine it’s possible to be a Mac user and not know that OS X is built on BSD, or what that means; how much more possible in the “OS for the masses”.
3. Despite the flaming and down-modding I’ll get for saying this, whilst as an application platform Windows is a monster, amongst those who know, MS and Windows already have zero credibility anyway. And that lack of credibility is beginning to seep into the minds of “Joe User”.
As for .NET and managed code, even if Unix (any flavour you like) weren’t vastly farther ahead than Singularity, it’s just a recipe for even bigger, more bloated apps. Certainly, a level of abstraction can be handy (and is responsible for a lot of the cross-platform success of Linux, Unix in general, and even, to greater or lesser degrees, the BSDs), but building a whole OS out of managed code is just Going Too Far. (Of course, they might just do it: Going Too Far is The One Microsoft Way, all the Way.)
1. Apple did it, and it’s done them NO harm. Nothing but good, in fact.
Old MacOS sucked. It didn’t support even the most basic aspects of what makes a modern operating system (user privileges, pre-emptive multitasking, automatic virtual memory management, etc.). The Windows NT kernel is rock solid and modern (arguably more modern than UNIX/Linux/BSD/Mach). The problems that NT has had have always stemmed from bad drivers, poorly written userspace programs, users running as administrators, and too many open ports. There’s nothing to be gained in any of those areas from switching from the NT kernel to a UNIX/Linux/BSD/Mach kernel.
how much more possible in the “OS for the masses”
2. No one’s denying that it’s technologically possible for Microsoft to take this route. Just because something is possible doesn’t mean it must be done.
The rest of your post is pure troll with plenty of personal attacks and zero technological merit so I must decline any further response.
The rest of your post is pure troll with plenty of personal attacks and zero technological merit so I must decline any further response lest I require smelling salts from this enervating exchange, sirrah.
Fixed that for you.
FWIW your analysis of Classic MacOS is spot on. It was comparable to Windows 95; an OS designed to the conceptions of the late 1980s/early 90s, with more modern concepts bolted on, badly.
There is plenty of evidence to show that “Windows NT is VMS reimplemented”, which was more or less the title of an article in itself. So technologically, VMS has about 5 years (of 40) of technical progress to its credit vs. Unix, plus about 5 years more “youth”, and shows plenty of signs of doing things a different (and brain-damaged) way to the Unix way “just because” Cutler hated the latter.
The technological feasibility of starting again is not at issue; the feasibility of marketing it is.
I couldn’t give a toss how good the NT kernel is. If it’s surrounded by an execrable userland, which it is, it becomes 90-something percent useless.
Build a new kernel/OS that can host legacy Win32 apps in a VM?
(LOL, guess I should have read TFA)
Don’t you think that Microsoft is starting to work on “microkernelization” of its future Windows kernel through its Singularity research OS?
Well, if you mean .NET virtual machines running on any system: does virtual hardware make using virtual machines any more legitimate as a way to take over the industry? I don’t see virtualization as the wave of the future, but more as a side tool to stabilize things. Native code just doesn’t fit into this landscape, yet native code is needed for performance even if they do open-source Java etc.
Regardless, putting Windows under .NET will just make it more complicated than it is. How can I control the hardware when everything is virtualized? It just seems so bloated. In my experience with managed languages, although they work great in minor instances and web services, they bug out a lot at higher levels, which is why Wings 3D is written in Erlang as a smaller modeller.
Looks like they are going to clean up all the APIs, dump legacy stuff in favor of VMs, and get more modular. Sounds like it’s about time, but it won’t be easy. So prepare for a long wait for the next OS again.
-Nex6
Is MS saying here, “WineHQ, here’s everything, enjoy”?
Move over to the Singularity base with a clean cut, then use a custom version of Wine as a sandbox (perhaps stipulate it as part of the deal?)…
MS gets to say “hey folks, we’re OSS friendly” to the OSS folks, and “We aren’t a monopoly any more – look, everyone has compatibility with our products, so it’s an open market” to the EU and DOJ.
And to its customers it can say “we have an ultra-secure, lightweight, scalable solution that actually lives up to the hype”.
Unlikely to happen granted. But it would sure solve an awful lot of MS’s current issues.
Singularity is very cool, but it’s still in the early stages; they need time to mature it.
but who knows….
Singularity, with VM modules for compatibility may be the way to go.
-Nex6
Why would they use WINE?
WINE is a reimplementation of the Win32 API, and an incomplete one at that.
Why not just run a Win32 subsystem “sandboxed” out, or even in a virtual machine? (I’ll bet they could even get the VM to run seamlessly so as not to need to actually boot into classic Windows, as OS X did with OS 9.)
1. In MS Windows’ infancy, as it was evolving (3.x, 9x, the first NTs), most IT people were saying CLIs were bad and they needed GUIs to better administer the growing complexity of the network infrastructure.
2. At the same time, computers were not that fast, memory wasn’t cheap, hard disks were small, and it seemed that a lot of integration was the right path to pick.
3. Also, everybody was crying out for backwards compatibility.
4. Add the development of a huge number of gadgets to do unforeseen things, which needed to be connected to be more useful.
That was the recipe for disaster. To achieve 1, they relaxed security. To make the computer seem fast, they used 2 even where they shouldn’t have. 3 is probably what is dragging them into the worst nightmare any developer ever had. And, finally, 4 gives headaches to all of us on any system. AND ALL THAT COUPLED.
What did they get? Well, lots of money along the way, but also a huge amount of code that is very difficult to maintain and improve.
Many years later it is easy to point out the problems they were heading for, but few of us foresaw them. Now it is clear that code should be well defined and delimited if we want to maintain, improve, or just swap parts of it for years and years to come, but who ever imagined the degree of complexity things were going to reach?
Ironically, Unix had the needed simplicity and isolation to achieve long-term maintainability and the capacity to improve or swap out parts, having learned the precious lessons of the failure of Multics, lived with limited resources, and taken the ever-good “don’t be too ambitious at the beginning” approach.
Should they dump it all? No. Probably they are going to use Vista as an escape route, developing another new OS in parallel and using only the best parts. As a previous post noted, it must be a journey taken hand in hand by company and customers. And don’t be fooled: they certainly have the resources.
Look, Linux is just a badly implemented Unix. Now that Linux has cherry-picked the Unix installs where cheapness is the primary reason to dump Unix, Linux sales are stagnating.
Windows is where the innovation is. Linux is all about stealing others’ ideas and implementing them in 400 different ways so you can’t run the same code confidently on more than one version at a time.
This bitterness directed at Microsoft is pathetic.
This bitterness directed at Microsoft is pathetic
How else can a geek look ‘cool’ online?
VMS is what Unix wishes it was.
Microsoft will really face an imminent end if they stick to their monolithic concept of system development. They must realize early that they have to start adopting some open-source development practices, especially virtualization. I believe the open-source people have better imagination in developing their projects and integrating things consistently. Why doesn’t Microsoft try to understand the philosophy that not every answer has to come from their own box; answers become possible when they pick them from outside the box?
It’s time they hybridized their system with open-source concepts.
This could be the answer to their crucial development of Vista.
Windows on (Trusted) Solaris or VAX/VMS, with Cedega for the kids and CrossOver Office for the office fans.