Today, we’re announcing the end of Chrome’s support for Windows XP, as well as Windows Vista, and Mac OS X 10.6, 10.7, and 10.8, since these platforms are no longer actively supported by Microsoft and Apple. Starting April 2016, Chrome will continue to function on these platforms but will no longer receive updates and security fixes.
Yet another reason for those few stragglers to finally dump that silly excuse for an operating system called Windows XP, and move towards something newer. Windows XP was dreadful the day it got released, and has only become more so over the years. Really – there’s no excuse.
XP runs a whole shedload of games that I paid for. Windows 10 does not. I create graphics and plan projects; on Win10 the software is shaky or does not work. XP is good: sandboxed and with the correct tools for securing it, it is still a good OS.
Chrome not supported, who cares?
Microsoft should open source Windows XP.
What you are looking for is ReactOS.
What we are all looking for is ReactOS, even the devs. It will come one day.
What, in your opinion, are the right tools to secure it? Are those updated? If so, what happens when they go away? If not, how can they protect against new threats?
If you:
1. are behind a firewall, with no incoming ports open
2. run only trusted software
3. visit only trusted websites with a locked-down browser
then there is not much to worry about.
Can you recommend a good firewall that works on lower-spec’d machines? I have an old Pentium 4 that could still be used for some stuff, but I don’t want to spend the money to upgrade it to Windows 10. I have to run some Windows-only software on it (mainly patch editors for hardware synths), so Linux won’t do.
Any firewall running inside Windows XP is fundamentally useless. Get an RPi2 and a second USB NIC to use as a hardware firewall.
Disagree strongly.
3) There isn't any such thing as a locked-down browser. There are always new attacks on them, so the vulnerabilities get patched and the browser gets updated. But that new browser won't exist on XP anymore.
Websites with good intentions can and have been hacked to infect desktops.
You can run Windows XP, but you have to realize it's less secure than almost any other OS at this point.
I'm still spending 8 hours a day using the first 24″ iMac (Late 2006) at work. OS X 10.7.5 is the max OS, as its EFI is 32-bit only. :\
I hope they still at least compile Chromium for these older OSes?
Guess I can always fall back to Firefox if necessary, but it's noticeably slower and its UI does not match OS X as well as Chrome's does. Sigh.
Your computer is almost 10 years old. Maybe you should buy a new one.
Maybe we should stop thinking that replacing perfectly working hardware just because of software is the only way.
Exactly. But, seriously you should look at getting a supported OS on that, like a linux distro. Attacks always get better, old code just gets more vulnerable with time.
XP-era software would probably work well in Wine, if one still relies on some, except for programs that directly interact with old hardware. I'd recommend using an up-to-date OS and running those aforementioned apps in Wine on top of it instead of just sticking with XP. It does require some fiddling to get going, though :/
https://www.codeweavers.com/compatibility
I know it's just Wine repackaged with some additional compatibility bits, but it does work very well, particularly for some old games I still like to play.
I don't know; Wine almost never worked for the programs I wanted to use. Same goes for ReactOS.
That's what I did with my iMac G3 DV+: I bumped it to 1 GB of RAM and installed Xubuntu PPC. I have a better cheap PC I use these days, but I can still use that old iMac with Linux, whereas it's pretty much dead under OS X. I keep it triple-booting, with Linux for most use, and OS 9 and OS X 10.1 for certain old programs and games.
Let me turn that around and ask why developers should spend their already limited resources on people too cheap to buy new hardware? It is not like it is free to maintain support for old OSes and hardware.
You do know this is the OSAlert site, right?
Where at least some people like to talk about and run old hardware/software.
Oh I’m all for people running old hardware for fun, hacking or even to save money.
What I'm complaining about is when they then expect developers to keep supporting such old hardware and software. If Windows XP should be supported, should Windows 2000 be too? What about Windows Me? Windows 98? Windows 95?
Why?!?!?!
If the old one is working perfectly fine and is fast enough, why should anyone waste their money buying a new one just because it's shiny?
I believe you can upgrade using Chameleon, but not Clover as a bootloader for your situation.
Why? Because Apple, being the douchebags they are, dropped all support for my perfectly fine Mac Pro 1,1 and won't let you install anything past 10.7.
There's no technical reason for this: it has a 64-bit processor, and I write 64-bit code all the time.
Now with XP there's no excuse, because at least Microsoft knows that people are not going to throw out perfectly good hardware.
This is really disgusting of Apple. I mean, how many different hardware platforms do they have to support relative to Windows or Linux, and they can't be bothered to support a machine that I paid a ton of money for?
Even worse, we have no replacement for the Mac Pro other than that ridiculous joke of a trash can they call a Mac Pro, with no room for any extra drives. Sorry Apple, I've got four 3 TB drives that just won't fit in the trash can. I mean, WTF were they thinking, selling a workstation where you can't add any storage and with only one processor?
So, I guess I’ll have to start the process of trying to migrate off the Mac platform that I’ve used for 30 years.
They sell you the Mac Pro with only one processor because OS X is not NUMA-aware…
And economically, it doesn’t make sense to make tremendous changes to OS X just to support NUMA for 0.05% of Mac sold.
WHAT????
I'm sorry, but you have no idea what you're talking about.
Intel’s bus / QPI takes care of the memory access for multiple processors, and Mac Pros used to come with 2 processor sockets and 12 physical cores.
I write both threaded single-process code and multi-process MPI code that I run on OS X with 8 physical cores (2 sockets), and it works perfectly fine. Then it's just a recompile on Linux to deploy the code on our 1000-core cluster.
The only reason for going to the trash can is I guess they wanted something trendy that hipsters would buy.
He’s actually right. While QPI handles memory accesses just fine, if your thread is running on one processor, but the data is in the RAM attached to the other processor, there can be significant performance losses.
Windows, since 7 (and, since 2008R2 on the server I think), has tried to assign threads to the processor closest to the memory they’re accessing, as well as provided an API to allow software to do it on their own.
Though I doubt the lack of NUMA support in OS X is the reason for not having a dual-socket system. I have a feeling it was more about wanting to push OpenCL for compute-heavy stuff.
That means it’s not NUMA optimized rather than it’s not NUMA aware. Not NUMA aware would mean it would crash any time the CPU and memory didn’t mesh.
Not necessarily. Some older NUMA systems required software to be explicitly aware, but AMD's and Intel's implementations don't require this: memory accesses that have to go over HT/QPI will still get fulfilled without the software having to do anything different.
If the motherboard hardware handles memory attached to other processors, it’s NUMA aware. The CPU might not be, but SOMETHING is or the fetch/store would fail. NUMA optimized would be where either the CPU also recognized this and handled it OR the operating system made sure it didn’t happen. Not NUMA aware means it must be handled entirely in software or it fails.
Looks like a misunderstanding of how memory works.
Memory is not "attached" to a processor; it is a single, large address space, and all the processors access the memory via a bus.
The only memory that is “attached” to a processor is the cache, which is a few MB.
Threads get moved from processor to processor all the time; how do you think hundreds or thousands of threads can run on a single-processor system? This is called task switching.
When a thread gets moved from one processor to another, the cache gets re-loaded on the new processor.
Any processor on a shared memory system can access any physical location in memory.
Now, some OSes are better at task switching than others; Linux, for example, does a better job under heavy workloads than OS X or Windows, because it uses an algorithm that is better at preventing cache misses when task switching occurs.
So, let me just reiterate: memory does not "belong" to and is not "attached" to a processor on shared-memory systems, i.e. multi-core or multi-processor machines. On distributed-memory machines, like clusters, this is different: there, memory can only be accessed directly from the physical box it's located in.
No, it wasn't a misunderstanding of how memory works. When I referred to memory being attached to a processor, I meant physically attached.
As in, in a dual processor Xeon system, three memory channels are physically attached to one processor, and three channels are physically attached to the second processor. I’m aware that it all appears as one large flat memory space, but that wasn’t what I was referring to.
On a Xeon 5500 series, a QPI link has a max bandwidth of 25 GB/s, which is less than the 32 GB/s its three memory channels provide. In addition, having to go through QPI to access memory physically attached to the other processor increases latency.
This is what I was referring to: if a thread is running on CPU #0 but its data is stored in RAM connected to CPU #1, latency is increased and total bandwidth is reduced, compared to accessing data in RAM connected to CPU #0. Windows provides an API to manage this, so threads can make sure they are running on the CPU nearest to the memory they are using.
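For the curious, here's a rough sketch of what using that API looks like. The calls (GetNumaHighestNodeNumber, GetNumaNodeProcessorMask, SetThreadAffinityMask, VirtualAllocExNuma) are real Win32 functions, available on Vista/Server 2008 and later; the toy program around them is purely illustrative. It pins the current thread to node 0's processors and asks for memory physically attached to that same node, so the thread never has to reach across QPI for its data:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        ULONG highest_node = 0;
        ULONGLONG node0_mask = 0;

        /* How many NUMA nodes does this machine have? 0 means a single node. */
        GetNumaHighestNodeNumber(&highest_node);
        printf("Highest NUMA node: %lu\n", highest_node);

        /* Which logical processors belong to node 0? */
        if (!GetNumaNodeProcessorMask(0, &node0_mask) || node0_mask == 0)
            return 1;

        /* Keep this thread on node 0's processors... */
        SetThreadAffinityMask(GetCurrentThread(), (DWORD_PTR)node0_mask);

        /* ...and ask for memory that is physically attached to node 0. */
        void *buf = VirtualAllocExNuma(GetCurrentProcess(), NULL, 1 << 20,
                                       MEM_RESERVE | MEM_COMMIT, PAGE_READWRITE,
                                       0 /* preferred node */);
        if (buf) {
            /* ... work on buf here ... */
            VirtualFree(buf, 0, MEM_RELEASE);
        }
        return 0;
    }

Without hints like these, the scheduler and allocator decide on their own, which is where the remote-node latency described above comes from.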
You can install Yosemite (and probably El Capitan, but I haven’t tried it yet) on your Mac Pro, provided that you have a compatible GPU or are willing to buy one. Here are the instructions. It’s a bit cumbersome, but it’s absolutely worth it.
http://forums.macrumors.com/threads/2006-2007-mac-pro-1-1-2-1-and-o…
Somebody got a spare clue they can lend to Google? Much as I suspect Microsoft would like to drop it, Vista is supported for another 17 months, until 2017-04-11.
XP Embedded (and thus, in a way, all other XP versions as well with a small registry edit) is supported until "January 12, 2016".
Pale Moon has a specific build for those on Atom CPUs / XP users:
http://www.palemoon.org/palemoon-atom.shtml
As for Windows "All your data belong to us" 10? Not no but HELL NO. I'm not having a big-brother OS that I literally have to sandbox like one of the spyware-ridden PCs at the shop. If YOU want to give everything you type, everything you say, and everything that goes on in front of your camera to MSFT? That is YOUR choice, but for me, when Win 7 reaches EOL I'll be just like those XP users are right now: sticking with what works!
Why would you recommend that over Firefox? Can you show anything?
That is easy:
1. They have refused to accept the "Australis" refresh that FF has done, and in fact have made it clear they will NEVER go down that road.
2. The team runs its own sync servers, so there's no having to stick with FF's schedule (and get a UI I hate dumped on me).
3. They are keeping the extension framework and have been reaching out to extension devs to get support specifically for Pale Moon (which now has its own user agent string); for the ones that decline, they are compiling their own versions.
So I would personally argue that PM is frankly better in every way, not least because if I update the thing I'm not gonna have my browser UI completely crapped upon. I'm sure many are like me and want a browser to… ya know, BROWSE, not to be some hipster's art project, and the PM guys have been damned good about keeping the UI consistent and the browser stable, which is what I look for in a browser.
BTW, for those that want the same but have to have a Chromium-based browser? Comodo Dragon; their UI has been consistent for years and they support XP through 10.
https://www.comodo.com/home/browsers-toolbars/browser.php
OK, good reasons. I personally think both GUIs have their good points, and I like extensions that can be enabled and disabled without restarting. I take it you mean the six-week update cycle?
Yes, the ONLY things that get ported from FF are security-related patches; everything else is done as needed by the PM team. Now that Mozilla is killing off the extension framework (frankly, IMHO, the only thing that made FF better than the competition), having Pale Moon reach out to the extension devs and compile their own means that I can tell Moz that getting rid of my extensions is a DO NOT WANT and stick with what works, without ending up with a browser whose security holes aren't being fixed.
If you haven't tried it yet? Give PM a spin; I bet you'll like it. Solid, reliable, a GUI that actually stays consistent, and support for an ever-growing list of extensions make it a really good choice for those of us who do not like the direction Moz is going.
Pale Moon is Firefox as it was while it was still sane. Basically, Pale Moon is incomparably better and easier to use than the latest version of Firefox. That is the reason Pale Moon exists and is liked by so many users.
Strange decision not to support Vista. Microsoft supports it until 2017.
Yes, especially when Vista SP2 is Win7.
Also, the end of Vista support probably means that Windows Server 2008 (non-R2) will no longer be supported either, while Microsoft supports it until at least 2020.
“Windows XP was dreadful the day it got released, and has only become more so over the years”
Really, some low-quality writing here.
On release day, XP was an awful piece of crap: unstable, slow and resource-consuming on good hardware, especially when compared to Windows 2000, which was smooth and stable with 512 MB of RAM. Then, with SP1, it became bearable but slow. With SP2 it became even more bearable, but still worse than Windows 2000. Then most games were no longer compatible with Windows 2000 because of the lack of DirectX updates, and XP was forced down my throat.
SP3 never made XP as fine as Windows 2000, just less bad.
I could never figure out how XP ended up in so many minds as the finest Windows.
You mean W2K was stable with 32 MB of RAM; Windows XP — upon launch — required 128 MB of RAM, which was not the most common at the time. Many early XP machines shipped with as little as 96 MB of RAM. It was a squeeze, but it worked mainly due to smaller drivers at the time.
By the time SP2 came around, the minimum RAM usage had ballooned to 200+ MB.
Well, my VIA C7 runs Windows 2000 SP4 and XP SP3 just fine. But sure, 2000 is fabulous, and so is XP Pro when nicely tuned down toward something as lightweight as 2000. Those two OSes are wonderful; it's sad Thom doesn't know them the way we do.
Vanilla XP sucked, but XP x64 (which was in reality 2K3 x64) was a damned nice OS and a great migration for myself and many Win2K users.
I ended up skipping XP outside the office completely, going from 2K to XP x64 to Win 7, which actually felt like having my OS upgraded instead of forced upon me. Win 10 is simply too filled with spyware to be of use to me, but luckily Win 8 can be had for cheap, and with 8.1 supported until 2023, by the time it leaves support either MSFT will have gotten its head out of its behind and released a non-spying OS, or some other company will come in to fill the void. At this point I honestly do not care which.
The problem with XP64 is that AMD quit supporting it years back, well before MS stopped support. For example, my current PC is an AMD A6-5400K. AMD stopped XP64 driver support just TWO versions before it supported the A6. So when I install XP64 on my system, I just get a dumb frame buffer and no 2D or 3D acceleration. Luckily, 32-bit XP got drivers for the A6-A10; those were the last drivers AMD did for XP. Anything newer from AMD requires at least Vista.
You should be able to use the vanilla GPU drivers; I've found those "APU" drivers are really nothing more than the vanilla GPU drivers with a few extra presets.
Also, I had good luck using Vista 64 drivers; have you tried them? I stayed on XP x64 until Win 7 x64, and all I had to do was extract the drivers and install them manually, and it was all cotton candy and puppies.
Looking around a bit for you: you should be able to use 13-4_xp64_dd_ccc_whql.exe with that particular chip. Download it, give it a try, and let me know how it goes.
Dear Thom,
please hold back a touch on downing XP (or ANY OS) too much.
Sure, if you're a "home" or "consumer" user (with the caveat that you have the funds/minimal know-how to upgrade) and you're on a network-connected machine, THEN, sure, you have no excuse not to upgrade, update, and secure your machine (or your new machine).
But if you're running an XP install to run software (especially if offline) that operates scientific or technical equipment, then it's entirely arguable that you have little or no reason TO upgrade; if it works, and all that..!
(Sure, make enquiries with your vendor whether they have a newer version / one on a newer OS, but if not… and even IF…)
These things are said again and again by open source zealots and fanatics: "Windows XP was horrible." It was great, and the proof is that it is still used by millions of users; more users prefer this 16-year-old OS than use Linux… Google just looks for any chance to bash Microsoft. It's silly.
Currently I use Windows XP 64-bit; I have used it for, I don't know, 10 years? On a Dell server: dual quad-core Xeons, 32 gigs of RAM, SATA 3 RAID controller, SAS drives, GeForce 8800 Ultra video card. I don't have antivirus; don't need it, never got infected by a virus. If you know what you are doing, you don't need it.
Ok, so what version of Windows am I supposed to run on my (nearly) perfectly-functioning Acer Aspire One ZG5 (Atom N270, 1 GB RAM, 120 GB HDD)? It came with XP, which works as well as XP ever worked. There are times when I have to run Windows. Fortunately, it has no problem running other modern operating systems (Fedora 23 with the MATE DE being the most recent).
Unfortunately it looks like most Linux distros will also soon follow suit and kick you (and me) to the curb by eliminating their 32-bit branches. The newest openSUSE Leap has already eliminated 32-bit support, and Fedora and Ubuntu are also talking about doing the same. Real bummer to see that not even the open source people care about supporting legacy hardware that still works just fine, as you mention.
Never mind. There's always Puppy Linux and other lightweight distros that will support 32-bit for a good few years yet. Apart from that, there are Haiku and AROS, which do and will support 32-bit for the foreseeable future.
There is a significant difference: you can still manually update your present Linux system in the case of your distro dropping 32-bit x86.
You have to draw the line somewhere.
When I have to support your antique machine it means:
– I can't use SSE/AVX instructions on my new CPU without significantly more work (see the sketch below).
– I can’t utilize all the registers on my newer CPU easily
– My graphics code cannot use compute shaders, or maybe not even fragment shaders if your computer is that old (assuming an Intel GPU).
– I have to watch out for memory fragmentation because I’m stuck with an old 32-bit flat memory model.
– Your CPU is so old and slow that I can’t just brute-force some problems and now have to spend twice the time coding something to have it run fast on your old junk.
– For Windows XP, I can’t use newer APIs like Direct3D 11, DirectWrite, Direct2D and the newer audio subsystem.
Why should I suffer so much just so you can save money? Open source developers are already very generous in how much old hardware they support!
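To make that SSE/AVX point concrete, here's a rough sketch of the runtime dispatch old CPUs force on you. __builtin_cpu_supports and the target attribute are real GCC/Clang features; the sum functions are just a made-up example, not anyone's actual code:

    #include <stdio.h>

    /* Plain scalar fallback that even a Pentium 4 or Atom N270 can run. */
    static void sum_scalar(const float *a, const float *b, float *out, int n)
    {
        for (int i = 0; i < n; i++)
            out[i] = a[i] + b[i];
    }

    /* Same loop, but this copy alone is compiled with AVX2 enabled, so the
     * rest of the binary stays runnable on baseline x86. */
    __attribute__((target("avx2")))
    static void sum_avx2(const float *a, const float *b, float *out, int n)
    {
        for (int i = 0; i < n; i++)
            out[i] = a[i] + b[i];
    }

    /* Every hot path needs a dispatcher like this once old hardware must
     * keep working: check the CPU at runtime, pick the right copy. */
    static void sum(const float *a, const float *b, float *out, int n)
    {
        if (__builtin_cpu_supports("avx2"))
            sum_avx2(a, b, out, n);
        else
            sum_scalar(a, b, out, n);
    }

    int main(void)
    {
        float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40}, out[4];
        sum(a, b, out, 4);
        printf("%.0f %.0f %.0f %.0f\n", out[0], out[1], out[2], out[3]);
        return 0;
    }

Maintaining two (or more) copies of every hot loop is exactly the "significantly more work" mentioned above.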
…and now you just accidentally revealed that you are, in fact, one of those retarded code monkeys, far from deserving the name of a developer. At this point, I would stop coding and look for a career in farming or forestry if I were you… The world will be a better place for all of us this way.
My god… “Brute-force some problems instead of writing proper code”… I don’t want to live on this planet any more…
I have absolutely no problem with Linux distros going 64-bit only – I’ve been using 64-bit Linux since 2005 and new hardware has been 64-bit capable for many years now.
Yes, include legacy 32-bit libraries to keep old closed-source programs running on 64-bit distros, of course, but I see no need for new (or recent) machines to run 32-bit distros any more. Users can always stay on long-term distros for 32-bit support (e.g. CentOS 6 is supported until Nov 2020, and there's even just been a 32-bit CentOS 7 build released too, though this is utterly pointless for an enterprise-level distro, IMHO; workstation/server kit will have been 64-bit for years).
We have a computer running Windows 95 here controlling a spectrometer. It is not connected to anything, and we use a USB floppy drive on the desktop PCs to transfer files. The interface card needed for the spectrometer is not supported by anything newer; we tried last year. End of life of an OS is no good reason to spend €100k on a new spectrometer.
If it’s not connected to anything you probably don’t need Chrome on it anyway so I don’t see how your case is relevant for the topic at hand.
To the chap who couldn't get his software working on ReactOS, I have one thing to say: DOH!
Not sure what you mean. Are you really suggesting that ReactOS is completely compatible with any version of Windows?
It means that you should expect NOTHING to work in ReactOS – it is in ALPHA – so you should expect that it definitely won’t work…
If anything does work, that is just cream on the cake at the moment. When it enters beta you can start to expect it to be stable, but software may still not work.
That is the idea of Alpha and Beta.
Would you be able to help me run Adobe Lightroom on ReactOS?
o Download and install Oracle VM VirtualBox.
o Download a bootable ReactOS daily build (not 0.3.17).
o Boot ReactOS from the bootable daily and test that it boots. Shut down.
o Create an ISO with the Adobe installer and mount it in VirtualBox.
o Boot ReactOS from the bootable daily.
o Install Adobe Lightroom from the ISO or CD.
o If it fails to install: raise a bug on the forum and in the ReactOS bug system (JIRA), with screenshots &c.
o If it fails to run: raise a bug on the forum and in the ReactOS bug system (JIRA), with screenshots &c.
Expect it to fail. The ReactOS kernel is currently only 50% compatible and still in alpha.
That sounds like quite an interesting adventure. I think I will give it a shot on my next vacation (no sarcasm here).
And yet, I have a feeling that at the current development pace, when ReactOS finally reaches RTM, or at least the RC stage, even the NT kernel will be long forgotten in the Windows world, so the entire effort of building an alternative binary-compatible OS will be an exercise in futility. I mean, unless an OS has reached a stable state, no one will ever use it in production or for any serious work. So, basically a "just because we can" attitude, nothing more.
No, not at all!
Windows internals are largely undocumented. ReactOS determines what those internals are and documents them during reverse engineering, then re-engineers them from scratch. It discloses the hidden functionality of Windows utilised only by MS and other devs "in the know". An open source, Windows-API-compatible OS means that we have an open alternative that is not at the whim of MS, who can change/drop support at a moment's notice (they do this all the time). It provides a potentially stable platform for legacy applications (essential; hopefully you know why). It teaches others the internals of Windows. It provides skills for those wishing to learn how to code a modern OS.
ReactOS boots in 10 seconds on older kit, and it will be blisteringly fast when it is released, as the coding is very slick and compact. It will allow support for legacy or older peripherals (essential; hopefully you kno…). ReactOS will provide competition, a free Windows alternative that may help to force MS to make their own offering free and open source, eventually perhaps? Reducing Microsoft's hegemony over desktop OSes is a good thing in itself, &c &c.
Do I need to go on? This stuff is blindingly obvious to me, it worries me that it needs to be explained.
I am well aware about reverse engineering process, I know the situation with Windows internals, and all the other things you stated here.
I am simply saying that at the pace ReactOS is going, it will be completely obsolete and irrelevant once it finally reaches a stable state. Being able to emulate the NT kernel and provide a binary-compatible alternative roughly equivalent to Windows XP at that point will be the same as providing a binary-compatible alternative to Windows 3.11 today: basically useless because of its obsolescence. Microsoft will probably have moved on from the NT kernel to something else by then, just as they did with the 9x kernel before, and systems (both hardware and software) requiring Windows XP will be mostly non-existent.
Being able to run a win32 app on an OS is nothing new, and the requirement is not going to go away in a hurry. Who cares about the other stuff that Windows says it does? Vista, 7, 8 and 10 are all the same OS, derived directly from NT5, running Windows apps. Bugger the current GUI, the only real differentiator from one version of Windows to another…
As long as ReactOS can run 'your' app, it can be considered usable. For many it will start to be usable in a year's time, say December 2016, when it reaches beta. As it improves, compatibility will improve and more apps will run.
Also check the "epic win" thread on the ReactOS forum, which lists the apps and programs that people have managed to get working completely or partially.
http://www.reactos.com/forum/viewtopic.php?f=2&t=10972&p=114355&sid…
There is NO legitimate reason that a well-written win32 program shouldn't still work on Win2K, much less on modern Windows, other than reliance on external libraries you shouldn't be counting on existing, or on "undocumented" calls Microsoft told you NOT to use for a reason.
All those "Undocumented Windows" programming books have been one of the leading causes of broken, stupid code and Windows headaches for the past 25 years!!!
I mean, apart from it being a bonehead maneuver given the number of people (specifically businesses) who are STILL refusing to upgrade past XP, or that it's anywhere from 9% to 20% of the current OS user base depending on whose numbers you use… Set that aside and you still have to ask the question:
WHAT THE **** ARE THEY DOING that means their codebase works on Win7 but not Vista? That right there is the biggest “tell” of something being horribly and terrifyingly WRONG with how they have written their software!
Next thing you know they'll announce dropping 7 support as they turn it into one of those god-awful, utterly useless, painfully slow and bloated Metro crapplets.
As of ~2K, a win32 executable should be a win32 executable; there are ZERO major OS changes that should break anything if you just write to that… so what possible excuse do they have for this?
I mean, if they went 64-bit only, then fine, I could see that… but even that wouldn't excuse dropping support for XP x64 (aka Server 2K3) or Vista x64, since really, from an application's point of view they should all be the same damned thing!
That's why we have APIs in the first damned place!!!
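To spell out what "writing to the API" buys you: the usual trick for keeping one win32 binary working from 2K/XP all the way up to 10 is to link only against the old calls and probe for newer ones at runtime. A rough sketch (GetModuleHandle, GetProcAddress and GetTickCount64 are real Win32 functions; the little uptime program around them is just an illustration, not anything Chrome actually does):

    #include <windows.h>
    #include <stdio.h>

    /* GetTickCount64 only exists on Vista and later, so look it up at runtime
     * instead of importing it. On XP/2000 the lookup fails and we fall back
     * to the old 32-bit GetTickCount, which wraps after ~49 days. */
    typedef ULONGLONG (WINAPI *GetTickCount64Fn)(void);

    static ULONGLONG uptime_ms(void)
    {
        HMODULE kernel32 = GetModuleHandleA("kernel32.dll");
        GetTickCount64Fn pGetTickCount64 =
            (GetTickCount64Fn)GetProcAddress(kernel32, "GetTickCount64");

        if (pGetTickCount64)
            return pGetTickCount64();   /* Vista and later */
        return GetTickCount();          /* XP/2000 fallback */
    }

    int main(void)
    {
        printf("Uptime: %.1f hours\n", uptime_ms() / 3600000.0);
        return 0;
    }

The same .exe runs everywhere; newer OSes just get the better code path. That's the level of effort we're talking about.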
— edit — P.S. Nice job on the noodle-doodle fantasyland ranting and raving about how bad XP was. Didn't reek at all of fanboyism or lack of objectivity whatsoever.