“Microsoft late last week said it won’t patch Windows XP for a pair of bugs it quashed Sept. 8 in Vista, Windows Server 2003 and Windows Server 2008. The news adds Windows XP [SP2] and SP3 to the no-patch list that previously included only Windows 2000 Server SP4. ‘We’re talking about code that is 12 to 15 years old in its origin, so backporting that level of code is essentially not feasible,’ said security program manager Adrian Stone during Microsoft’s monthly post-patch Webcast, referring to Windows 2000 and XP…. ‘By default, Windows XP SP2, Windows XP SP3 and Windows XP Professional x64 Edition SP2 do not have a listening service configured in the client firewall and are therefore not affected by this vulnerability,’ the company said. ‘Windows XP SP2 and later operating systems include a stateful host firewall that provides protection for computers against incoming traffic from the Internet or from neighboring network devices on a private network.'”
“Won’t fix” can, of course, be a perfectly valid response to a problem. There can be all kinds of legitimate reasons why something can’t be fixed, or why it is simply not economical to fix (from the point of view of a software vendor). Microsoft is without a doubt entirely within its rights here, and arguably isn’t breaching any moral duty or obligation either.
“Can’t be fixed by anyone else either” is unique to proprietary software where the vendor has said “won’t fix”.
Besides, anyone who uses Windows has to realize that they’ve committed themselves to an upgrade treadmill of proprietary software. There ARE alternatives. If you choose to use Windows, then you choose to be subject to the whims of Microsoft.
Also, I don’t blame them for not wanting to fix 15-year-old code.
Finally, if you’re so poor that you can’t afford to upgrade from Windows XP, then that would seem to imply that your computer is so old that you wouldn’t WANT to be using something as heavyweight as Vista or Windows 7. You really might want to seriously consider installing Ubuntu on that dinosaur.
Dinosaur owners face a problem … it is increasingly difficult to get dinosaur care.
There are a lot of people on the planet who have no access to computing other than via a 4DinosaurOS. These people are people too, and they have legitimate needs, and they probably vastly outnumber the people who are in the market for the latest whizbang RipoffOS.
Might I suggest that TinyME, Crunchbang and Lubuntu are perhaps some of the 4DinosaurOS distributions that people might look into?
http://distrowatch.com/table.php?distribution=tinyme
http://distrowatch.com/table.php?distribution=crunchbang
http://en.wikipedia.org/wiki/Lubuntu
For extremely limited, very ancient computers, perhaps TinyCore?
http://distrowatch.com/table.php?distribution=tinycore
They’re all too slow. All of them. Even Slitaz, easily the fastest Linux distro I’ve ever used, suffers from constant cursor skipping and horribly slow screen redraws on the old Dell I tried it on.
(Dell Dimension XPS 350, FWIW, upgraded with a 450 MHz Pentium III Katmai processor, 384 MB of RAM, and an ATI Rage 128 video card.)
Granted, Windows XP is pretty piggish on that machine too, even with the gratuitous eyecandy turned off. But XP is actually usable. In order to use Linux – even light distros – you *need* fast hardware.
As for Xubuntu on a dinosaur box… Ouch man. Just… ouch.
It has been my (seemingly contradictory) experience that any machine that could once run XP tolerably well can easily run any of the Linux distributions mentioned to date in this thread.
It’s the same for any OS: the perceived speed depends on the crap you put on it.
Well, it’s a little bit wrong for Mac OS X, as you have little choice of crap to begin with.
Pfft. DSL. I’ve run it reliably and quickly on a Pentium 233 with 128 MB of RAM. Rage 128.
Aaaahh… The peeing contest.
I have run DSL on a Pentium 75MHz with 48MB RAM and 550 MB HDD.
I was just saying it’s possible to run Linux with a GUI on lower-end hardware.
But, if you really want to have a contest …
I got an obscure distro of Linux to run on a 286!! No GUI, but still, that really shouldn’t be possible. Although it was so heavily modified, I’m not sure you could still call it Linux.
Weird. I’m running CentOS 5.3 on a PII 333/256MB Dell Inspiron 7000 laptop w/ IceWM and it works just fine. Yum is a bit sluggish and Firefox 3.0 with JavaScript-laden sites is slow, but as a work laptop (gvim for code, OO for documents, a number of open XFCE terminals and Thunar) it works like a champ.
– Gilboa
I run Slackware 11 with 2.4 kernel on a Dell Latitude xpi CD (Pentium 1 150 MHz, no MMX, 32 MB RAM, 1.4GB HDD). X with Windowmaker works just fine (OK, it’s not “blazing fast” but still perfectly usable).
Try to put XP on that. Your point is moot.
> (Dell Dimension XPS 350, FWIW, upgraded with a 450 MHz Pentium III Katmai processor, 384 MB of RAM, and an ATI Rage 128 video card.)
That’s a freakin’ screamer, dude! I remember using XP on a 200 MHz PII with 128 MB RAM and it ran just fine (turning down all unnecessary services, XP uses only 71 MB RAM). Of course, no antivirus, as that kills the machine instantly.
At the same time I was running Slackware Linux with KDE 1, I remember, and that was fine also.
So a P3 with 384 MB RAM is a screamer and can run lots of Linux distros! Perhaps the latest and greatest desktop environments are out of reach, but the system should be nicely usable with something like XFCE, Blackbox, Fluxbox, or even a correctly set up FVWM (the setup being mainly for usability).
And if anybody has something to say about viruses: today’s viruses are so heavyweight (damn script kiddies) that the virus is barely able to install itself before the machine is dead, and therefore the virus cannot propagate itself. You as the user have nothing left to do but reinstall Windows and enjoy a freshly installed Windows.
Actually, back in ’98 or so, when the P2 was an OK system, the virus writers were kind enough to optimize their exploits so that there were some resources left after the infection had happened. But anyway, any infection was immediately visible, and that meant that a reinstall was in order.
That’s something people do not understand. Why should Microsoft be forced to support out-of-production systems?
Do you still get updates for Red Hat 2? Nope. And the only updates for Red Hat 3 are really high-profile fixes (or did they stop those too?)
As others will surely point out, open source is a completely different story, as nothing stops you from hiring someone to back-port the fixes to your favorite Linux 2.2-based Red Hat 6.2, and/or doing it yourself. (You’d be amazed at how many people are still supporting RH9.)
As for the point at hand, AFAIK, if you bought a 10-year extended contract for RHEL 2.1, it’s still supported.
(Then again, unlike XP, nothing stops you from supporting it yourself.)
– Gilboa
Mind you, I laugh at this.
Been there, done that. The cost of supporting in-house Linux solutions is terrifying.
There are currently only a handful of options for long-term support: Red Hat, CentOS, Novell, and in some respects Ubuntu LTS.
Keep your fanboy comments to yourself when the majority of Linux distributions are not able to support anything beyond six months. If even that.
Was there a point to your comment, or were you just trolling? *
– Gilboa
* Especially given the fact that I worked on a project that used RH9 long after it was no longer supported by Red Hat; we only had to maintain the small number of packages that we needed (we used a highly customized installation) and keep an eye on major kernel exploits – making it *far* cheaper for us to continue using RH9 compared to moving everything to RHEL or CentOS (if it works, don’t break it). As far as I know, it’s still being used, 4 years later.
But hey, I’m a fanboy, right? *Sigh* (I’d imagine that an apology is in order, once you take your foot out of your mouth.)
Both. But there was a point too.
Needless to say, my experiences have been exactly the opposite. The cost-benefit ratio just does not cut it when it comes to anything more than supporting a “small number of packages”.
Just trying to track the kernel exploits you mention is a full-time job for at least one person. If you read, say, @oss-security, there are about five kernel vulnerabilities in a week.
When you indicate that “open source is a completely different story” because “you can just do it yourself”, you sure sound like one. Next time someone has problems with Linux, I’ll just point out “RTFM and DIY”. Reality check: there is a very, very good reason why businesses pay for Red Hat or Novell or Sun or Microsoft.
My apologies if I offended you.
… I never said it was easy.
I said it was possible. Two different stories.
The amount of effort greatly depends on the number of packages you require and on the risk assessment (e.g. if you don’t have local users beyond a single network-facing service [owned by you], you most likely don’t really care about local privilege escalation exploits) – all of which can have a huge impact on the number of man-hours required to maintain a “dead” distribution.
On the other hand, if you’re using a proprietary OS, you don’t get to choose. (I sadly had to rewrite one of my projects once Windows 2K was EOL’ed, as we couldn’t do a clean port to XP.)
– Gilboa
I think in the case of open source, people are much more likely to upgrade, because most open source is freely available, or at least doesn’t have an upgrade price beyond the effort of the upgrade itself.
Again, for desktop users, yes. But in most cases an upgrade from Windows XP to Linux/Vista/W7/Mac will not kill them.
I was referring to far bigger players.
– Gilboa
There are lots of barriers to supporting yourself, even with Linux
The target markets for RH 2.1 and WinXP are entirely different. Most normal users are unable to support anything. Normal desktop users don’t have the time, inclination, knowledge or wherewithal to patch any OS manually. They aren’t coders, they are just people, and view the computer as a tool.
I’ve used Linux and FreeBSD for years, and while I can compile a kernel, I would have no idea how to apply a patchset or merge changes from a new kernel. That’s advanced stuff, much more advanced than most medium and small businesses can afford.
Your utopia sounds nice, but it is not very practical in the real world.
I fully agree.
… But with all respect to home desktop users, who can simply suck it up and migrate to Vista/Linux/Mac, if you have millions of $$$ invested in some embedded system, single-application desktops and/or server farms running RH9, Windows 2K or Windows XP, migration becomes a -huge- problem.
In most cases (as in, unless we are talking about Internet-facing desktop computers), hiring one or two Linux devs to keep the kernel and ~20 other packages patched up is -far- cheaper.
– Gilboa
I think you are still missing the point. Take this as a general theoretical suggestion.
Lemma 1. It is always cheaper to rely on a vendor to do the customization and maintenance than it is to become a software vendor.
That is a lemma that many fail to confute, and thus rely on Red Hat or Microsoft or some other vendor.
… I think you’re missing my point.
Whether or not it’s economically wise to self-maintain a dead distribution comes down to three questions:
1. Size: What is the size of your deployment? 1 machine? 100 machines? 1000 machines? What will be the cost of migration, license-wise?
2. Usage: What do you do with your Linux distribution? Is it being used for general-purpose desktop computing? Are you using it to run a certain application that will require full re-licensing once the OS gets upgraded? Are you using your OS to power a server farm? Maybe some type of embedded application?
What will be the added cost of migration, application- and training-wise? How many distribution-supplied packages are being actively used?
3. Security: What are the risks involved? Are the machines being used to access the Internet, or are they being used in a closed, secure environment? How many packages within the OS must be patched and maintained? What will be the cost of maintaining said packages once support dies?
If you’re a home desktop user, then migration is the best solution. If your embedded system uses 20 packages and will never see the Internet, self-maintenance might be a good option.
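Just to make the trade-off concrete, here’s a toy back-of-the-envelope sketch. The helper names, scenarios and every number in it are made up purely for illustration; plug in your own answers to the three questions above.

```python
# Toy comparison: keep patching a "dead" distro in-house vs. migrating everything.
# All figures below are hypothetical placeholders, not real quotes from anyone.

def self_maintain_cost(devs: int, cost_per_dev_year: float, years: int) -> float:
    """Cost of paying developers to back-port fixes for the remaining lifetime."""
    return devs * cost_per_dev_year * years

def migration_cost(machines: int, relicensing: float, labour: float) -> float:
    """One-off cost of moving every machine: licences, re-testing, re-training."""
    return machines * (relicensing + labour)

# Scenario A: a small Internet-facing desktop fleet -- migration usually wins.
print(self_maintain_cost(1, 80_000, 4), "vs", migration_cost(50, 150, 200))   # 320000 vs 17500

# Scenario B: a large, closed embedded deployment where upgrading triggers
# expensive application re-licensing/re-certification -- self-maintenance can win.
print(self_maintain_cost(2, 80_000, 4), "vs", migration_cost(2000, 0, 450))   # 640000 vs 900000
```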
– Gilboa
XP is not an “out-of-production system”. It comes with netbooks, you can downgrade some Vista versions to XP, and even once Windows 7 is released you will be able to downgrade to XP. You can still buy XP CDs in stores, and Microsoft said that they will support it until 2014.
Not releasing a patch for the most popular version of the OS is just a lame excuse to make people buy a newer version of Windows.
No. It is called business.
When you develop software for a living, you start appreciating upgrade cycles.
Nobody “appreciates” upgrade cycles unless they are masochists who enjoy pain. They’re a necessary evil, often forced on us by self-serving software or hardware vendors, and only sometimes genuinely useful. Look where it’s led us: we all use computers that a decade ago would have been called supercomputers, we spend a fortune on them, we barely finish one upgrade before we start looking for another – and for what? We end up doing the same old things for which a computer with a fraction of the power would suffice.
I certainly did not spend a fortune on any of the computers I own, and every time I buy one, it’s cheaper.
A massive quad-core with 8 GB of RAM for 800 bucks? That’s a lot more power and a lot less money than the equivalent of 5 years ago, and over the last 20 years, computers have fallen in price by an order of magnitude.
I am a software developer. For over a decade now. And if I released software to my clients that compared functionally to its older version the way Windows 7 compares to Windows XP, I would just be kicked out of their office.
Whenever I have to release upgrades, I have to present a list of whys. Win7 does not provide any “we must have it” points that open clients’ wallets.
Because they committed to XP still having security updates for another 5 years. That’s why this is ‘interesting’.
Yes, I do get free updates for Red Hat 2, in the form of the latest release of Fedora. Microsoft will only offer a paid upgrade path from their older systems.
And yet, Red Hat no longer sells RH 2, while Microsoft does sell WinXP on netbooks at the moment.
My dinosaurs have Xubuntu.
Besides, anyone who uses Windows has to realize that they’ve committed themselves to an upgrade treadmill of proprietary software. There ARE alternatives.
Don’t be so arrogant. The alternatives oftentimes just don’t cut it. Not all applications run under Wine, and there are no (suitable?) alternatives to some of them, so one just has to stick to Windows. Or, as in my case, I play World of Warcraft and it just doesn’t work under Linux on my very poorly supported ATI card.
Just saying that some people might use something out of necessity, not by choice.
You just supported my argument, though. If you can afford the subscription fees to WoW, then you can afford to upgrade to a faster PC that can run a newer Windows.
WoW subscription fee ~13e a month.
New PC ~400e.
So, that’s a no.
Hey, all you gotta do is stop playing WoW for 2.5 years.
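(Rough math, taking those figures at face value: 400e ÷ 13e per month ≈ 31 months, or about two and a half years’ worth of subscription fees.)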
Actually, that’s not a bad idea. I’ve wasted far too much of my life playing on-line games. I get more work done now that I’ve stopped.
How does that make any sense? That’s like saying, if you can afford to pay rent, you can afford to buy a new car. The part where I’d have to live in a cardboard box for the next 5 years doesn’t cut it.
I’m not sure I’d equate not playing WoW with living in a cardboard box.
Anyhow, all of this is moot. You can get a cheap Linksys router to act as a firewall and get a lot of protection relatively cheaply.
Problem is that they are still selling Windows XP-based machines! These are called netbooks.
I have one, it has XP on it, even though I use Ubuntu instead of WinXP on it.
And I would be OK with it if they had stopped selling it 2 years ago, but this thing is “brand new”.
Huh. Ok, I totally concede on that point. If they’re still selling it, they should support it properly for a number of years.
The article said that the patch for Vista and such came out on Sept 8. It also said that Windows XP SP2 wasn’t affected. Does that mean that the firewall in Vista, Server 2003 and Server 2008 is the same piece of crappy software that was in the original Windows XP release? Makes you wonder.
As for fixing XP, 2/3 of Windows users still use XP. But this is really just them abandoning Win 2000. Which is old. Red Hat supports their OSes for 7 years, so yes, you can still get support for RHEL 2.x.
Microsoft is really saying what Linux users have known all along (and what Windows users are still to learn): You shouldn’t be running XP; it’s too old. There’s a perfectly good drop-in replacement operating system available if you’re concerned about the bugs.
Drop-in? You tell me how I can just drop it into my system without upgrading it from 256 MB of RAM to at least one gig (a system from around 2001 that is 32-bit and only supports up to two gigs of RD-RAM). A typical Linux variant (assuming it is not KDE 4-based or Sabayon or one of the other ultra-heavyweights) is more of a “drop-in” replacement than Windows 7 will ever be. Hell, if I knew the BSDs better, even they would probably run pretty smoothly.
I’ve seen some people cry poverty on this forum; anyone find it funny that those who cry poverty are quite happy to spend a couple of grand on a big-screen television, or several grand over a year going to sports or entertainment events? It’s all about priorities – you prioritise other things ahead of upgrading your software. I’m not making a value judgement or saying what you did is right or wrong, I’m just pointing out that you’ve made a lifestyle choice and you should take responsibility for the results of it.