And so the smartphonification of the general purpose computer continues. This time around, though, it might actually be for the better. Microsoft has detailed two new features in Windows 8: refreshing and resetting your computer. Reinstalls will be a thing of the past.
To this day, it remains the number one piece of advice for any computer operating system if you don’t know how to fix a problem or if your computer has become slow: reinstall the operating system. Especially in the days of Windows XP, reinstallation was kind of a yearly ritual. Windows 7, Mac OS X, and, to a lesser degree, Linux also need the occasional reinstall to freshen everything up, but not as often as Windows XP did (just another reason why I never liked XP).
I’m pretty sure that, especially with the new Metro applications, Windows 8 will have far fewer issues with getting slower over time than even Windows 7 did. Still, as the nonsense of people advising users to close applications on iOS and Android shows, old habits die hard, and I’m pretty sure that reinstalling the operating system has become so entrenched among users (thanks, Microsoft) that Microsoft felt the need to provide a very easy way to do this.
And thus, refresh and reset. Refresh will “keep all personal data, Metro style apps, and important settings from the PC, and reinstall Windows”. Reset takes the more rigorous approach and “remove[s] all personal data, apps, and settings from the PC, and reinstall[s] Windows”. Both tasks can be performed through the Settings application within Windows, from Windows RE on the hard drive, or from Windows RE on the install disk.
Since the reset feature in particular is also meant for PCs you’re about to sell on or give away, the reset option will include a checkbox to enable random writes to sectors to make sure none of your data can be recovered. It’s only single-pass though, so it probably won’t satisfy regulatory requirements in case you’re trying to get rid of sensitive government or company data – but it is a whole lot faster, and probably sufficient for normal people.
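For the curious, a single-pass random overwrite is conceptually very simple. Here is a minimal Python sketch of the idea (the function name and file-level approach are mine for illustration; the actual Windows 8 feature presumably works at the raw sector level, below the filesystem):

```python
import os

def single_pass_wipe(path):
    """Overwrite a file's contents once with random bytes.

    One pass is enough to defeat ordinary undelete tools, but it is
    not the multi-pass wipe some data-sanitization policies demand --
    the same trade-off the Windows 8 reset checkbox makes.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        remaining = size
        while remaining > 0:
            chunk = min(remaining, 1024 * 1024)  # 1 MiB at a time
            f.write(os.urandom(chunk))           # random data, single pass
            remaining -= chunk
        f.flush()
        os.fsync(f.fileno())  # make sure the new bytes hit the device
```

Since each sector is touched exactly once, the wipe runs at roughly the drive’s sequential write speed, which is why it is so much faster than a multi-pass scheme.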
The refresh option is still an actual reinstall of Windows, but retains all your data and Metro applications. You don’t even need a backup for this; Windows will remember all your stuff and applications by itself. Not all settings are preserved; some are purposefully set to factory state because they are a common source of problems. These include:
- File type associations
- Display settings
- Windows Firewall settings
So, what about non-Metro applications? Well, Microsoft basically admits current desktop applications are a bit of a mess when it comes to installers, making it impossible to take these into account – they are removed during the refresh process.
“First, in many cases there is a single desktop app that is causing the problems that lead to a need to perform this sort of maintenance, but identifying this root cause is not usually possible. And second, we do not want to inadvertently reinstall ‘bad’ apps that were installed unintentionally or that hitched a ride on something good but left no trace of how they were installed,” Microsoft details, “It is also important to understand that we cannot deterministically replace desktop apps, as there are many installer technologies as well as custom setup and configuration logic, of which Windows has little direct knowledge.”
Still, Microsoft provides a tool to make an image of your preferred Windows 8 setup – including legacy applications – and then use that image for the refresh procedure. In other words, you can set up your PC just the way you like it, including legacy applications, image it, and tell refresh to use this instead of the factory image. This makes me very happy.
Not that I’ll be switching to Windows 8, but at least there are some features in there that I like. All Microsoft has to do now is make Metro usable for more than just Twitter and weather applications. Sigh.
.. that are so easy to fuck up beyond repair that the only way to fix them is to reinstall or reset.
Making them harder to fuck up obviously is not a solution MS is contemplating.
Yet Linux bakes it into the user experience, to the extent that people avoid updating Linux servers. An update can still kill your wireless or video.
TBH, while WinXP was prone to breaking itself (it needed care), Windows Vista and 7 really don’t have any problems.
Yes, you have to make sure you aren’t running every update manager on the planet, and that everything is not starting on boot, but this is the same for MacOSX and Linux as well.
What planet are you living on?
Distribution upgrades on Linux – yes, Ubuntu manages to break a lot. Software package updates on Linux – no; RedHat (6 years of updates), SUSE (3 years of updates) and Debian’s (Ubuntu 8.04 Server hasn’t had any unplanned downtime due to botched updates) work perfectly well.
To my knowledge Windows Server 2003 and newer handle updates without issues.
If you don’t count rebooting every Tuesday to patch holes in IIS and .NET as “issues”, sure… it works frikkin wonderful!
Only crap sys admins allow their servers to run out of date. Just like how crap intranet developers have forced some organisations to keep their users on XP + IE6.
Crap technicians exist on both platforms and hardly prove a point.
There’s been plenty of horror stories about Windows driver updates and service packs breaking previously working systems.
Actually it’s not an issue for Linux, as all the updates are managed centrally, which makes a massive difference (particularly as the majority of Windows start-up bloat is the plethora of third-party update managers).
I will say that, in all fairness, an experienced user can keep even XP running stable for years without a re-install, and the state of things has definitely got better with Win7. However, I still think there is lots of room for improvement – and I mean that on both Windows and Linux.
I beg to differ. When you want to update an application for some important new feature or bugfix then the updated packages are most probably not available on the “main” or “stable” repository but either some testing repo or third party repository. And then you scout the internet for possible repositories, each hosting their selection of duplicates and the app from that repo absolutely requires that repo’s version of library X which of course conflicts with your installed stable (==old as hell) version of library X and there you have it – DLL hell all over again, this time with a different dressing.
And of course no Linux install comes with startup bloat, no sir! I absolutely need the plethora of servers and services that get started on a default install. SSH for when I want to remotely log in to my laptop (which is always travelling with me), cupsd (although I have never owned a printer) or even the full LAMP stack that some distros install.
To paraphrase our resident DJ Thom: “Pot, meet kettle!”
Uhm, it sounds like you are talking about a Debian-based distro. Which is a point, but not the whole enchilada.
Obviously, Arch does not have any of the problems you describe. Ubuntu does, which is why I avoid it. Fedora usually does not. The only time I ran into an issue was when I was still on Fedora 14 and needed a newer version of Firefox (4+). The extra repos I found were authored by a main Fedora dev, and I had no issues installing alongside Firefox 3.6. I actually added another when Firefox 5 came out, before the normal upgrade to Fedora 16 (I always skip the odds).
SSH is on the order of kilobytes. It’s not meaningful bloat. LAMP by default is not a configuration I’ve seen on a mainstream distro, although there are distros that do LAMP by default because they are designed to be easy LAMP servers.
Linux distros are so varied that your comments can’t really apply to them all. So it’s more like:
“Pot meet a collection of things made out of metal, some of which are black, others are used for cooking stuff, some of which you could put water in and hold over a flame to boil water, but everyone would look at you cross-eyed if you did”
<nitpick>So a friendly little 5KB Blaster/Sasser/Sobig that starts up when an infected WinXP box boots is no big deal?</nitpick>
What I also find interesting is that MS got a lot of flak for so many (!) versions of Win7, but when it comes to Linux/BSD, the overwhelming choice is actually a good thing and it’s usually the user’s fault for choosing the wrong distro. (Jesus f***, did I just defend Microsoft here?? That felt weird… dirty even.) BTW, for what it’s worth, most of my “DLL hell for Linux v0.3” occurred with SUSE (granted, with *buntu too).
They get flak for that because while in Linux every distro gives you the same features, Windows 7 editions have varied feature sets.
There’s a difference between ONE 5KB file that sits idle with an open socket, and half a dozen files that all actively and simultaneously download content from the web while the PC is busy performing one of its most intensive operations.
Anyway, you’re missing the whole point. We’re not arguing that Linux is more or less bloated than Windows. I was simply stating that you don’t need to worry about rogue update managers and startup processes in Linux in the same way you do in Windows, as third-party installers don’t install crap everywhere in Linux when you’re pulling binaries from apt-get / yum / pacman / et al.
I don’t know what sites you frequent but one of the biggest regular criticisms against Linux is the number of incompatible distros.
However, the reason you got challenged on your point was because you were comparing a server distro to a desktop distro – which is akin to complaining that your netbook install of Windows has Active Directory and IIS when you opted for Win Server 2008 instead of Win7.
So all we are asking is you compare like for like rather than muddling desktop and server installs as one desktop OS.
Yeah, SuSE can get its knickers in a twist if you try to pull RPMs from outside the official repositories. It’s one of my biggest gripes with SuSE (we run SLES at work and I’d quite happily replace them if the IT director would let me).
At the risk of sounding “preachy”, this is one of the habits that Windows users have to drop when they switch to Linux. I used to be the same way, having the latest and greatest of every app, like I had in the Windows world. You are correct, on Linux this can lead to the same problems.
As I’ve aged (!), I’ve moved away from that habit. In fact, I’ve even started to live with some of the LTS versions of Ubuntu, Mint, etc. In the rare instance where I just HAVE to have a more recent app, such as Firefox, I just download the .tgz binary, unzip it in my home directory and run it from there.
It’s a hard adjustment to make, but since I have, I now get to live with a very stable system, and spend a lot more time using the computer instead of maintaining it.
But the package management is still handled centrally – one update manager as opposed to a plethora of independent update managers which all compete against each other upon system start up (which was the point we were discussing).
That’s a separate issue entirely. If you have a problem with the availability of packages for your distro, then perhaps you should be looking towards other distros
Of course some do. This isn’t something I’ve ever stated otherwise so I’m really not sure why you’re expressing it that way.
So disable them, do a minimal install, or pick a different distro that has better defaults out of the box (eg Ubuntu comes without openssh-server, while a Debian minimal install would come with sshd but without CUPS or even X).
Your argument is a little like complaining that Windows Server 2008 is bloated for the desktop.
I sincerely hope you meant that ironically.
Have you seen the number of services running on an Ubuntu or Fedora install? Quite a few, some I don’t even know why they are there … a Bluetooth device manager even when there are no Bluetooth devices, etc.
I have one update manager (for Adobe Reader) running on my system. Most things are done via Windows Update … that is, drivers, Office, SQL Server, VS 2010 … Everything else checks for updates when the app starts up.
Yeah there is, but the OP said that Windows had the problem … I contest that by saying they all have their problems.
Actually my nettop XP install is from 2004, my Win 7 gaming PC install is from 2009, and neither has ever needed a reinstall thanks to my secret, which is….. don’t install crappy software! I know, it’s a concept, and I’m sure heartbroken over those kitty screensavers I missed out on, but that’s the price of having a rock-solid Windows, I suppose.
For those that would like a butt-simple and cheap (or free, depending on which Windows version you are on) way to set up Windows and never have to worry about it, here is the way. 1. On first install, go to ninite.com, pick any freeware you need and grab Avast free while you are there. 2. Go download Comodo Dragon for a browser (you can use Chromium if you prefer, but I like the extra security features of the Dragon). 3. Install FileHippo Update Checker, which will take care of third-party software updates. 4. Finally, the part that may cost money, but is free if you’re on XP as there is an old version they give away free: TuneUp Utilities. It cleans the junk that builds up in the registry during uninstalls, and much more. You can get Glary at ninite if you don’t mind doing it manually, but TuneUp has better features and does a better job IMHO, and by default is fully automated.
Tada! You now have a Windows PC that is pretty much as easy to use as a toaster. It’ll clean and take care of itself, Avast and Comodo keep out any nasties, and it’s all easy-peasy simple. If you are paranoid and want your machine to be unbreakable short of hardware failure, you can add Comodo Time Machine for free, and that way if little Suzy manages to bork the OS so badly it won’t even boot, you just push the Home key on startup and in 20 minutes you’re back like nothing ever happened. See how easy that is? Certainly easier than playing forum hunt when Linux borks your wireless, and with this even the most clueless are safe as houses while power users have a good running machine they never have to fiddle with; it all “just works” and keeps on working, year after year. Oh, and with TuneUp there is no “WinRot”, which I’ve found is caused by badly written third-party uninstallers leaving bad pointers in the registry. Enjoy!
I used to run a stripped XP build as well; now I just run vanilla 7, which is much better than it used to be. However, people seem to think that Windows is exactly the same as it was 10 years ago.
Clearly it doesn’t “just work” if you had to list off a dozen third party apps required to keep your OS running. :p
Plus, how are new users supposed to know all that? Usually they would end up finding it via the same methods Linux users turn to when troubleshooting. So don’t start spouting garbage about how Linux requires a learning curve to be a stable OS when you just listed off a page of prerequisites which Windows users need to know.
Any OS requires a degree of education. As hard as MS, Apple and Canonical try, computers are a complicated beast. It’s akin to giving someone a car and expecting them to drive it without any lessons.
In fact, this is the problem I have with fanboys in any camp (be that Linux, Apple or Windows): they assume their method is the best and no other OS works. Clearly that’s not true, otherwise Linux (for example) wouldn’t be used on millions of desktops worldwide.
Now I think I’ve been more than fair, because I’ve tried not to let my personal preference colour my comments on here. So please show me the same respect and don’t fob me off with false pretenses.
Here, try my little challenge and see my words are true! Take ANY distro from 3 years ago, install it, get all the drivers working. You may use the CLI now, as you are the builder, not the consumer. Now that it’s working, use whatever GUI method you prefer to upgrade it to current and see how much is broken. Now go to the forum and ask for a GUI way to fix what went wrong… guess what? It most likely doesn’t exist! When someone poo-pooed what I said, I said “Okay, here is the problem I’m having with wireless. Now pretend I’m not a nerd; I’m Suzy the checkout girl who knows nothing about your OS. Walk me through fixing it with the GUI, because buttons I understand.” They tried and they tried until finally one said “You can’t do that in a GUI”.
And THAT, that right there, is why Linux has market share lower than the margin for error. Windows drivers almost never break, and if they do there is a handy ‘roll back driver’ button that is “clicky clicky reboot”.
Your failing is that you think people will sit down and read man pages and learn Bash, and the fact is NOBODY wants to learn that mess, okay? NOBODY. They want GUI, they want simple, they want clicky-clicky easy. You don’t give them that, they go somewhere else. They do NOT care about “free as in freedom”, they do NOT care about “the right to tinker”, they do NOT care about “the power of CLI”; they want simple and easy. Windows and OSX gives them that in spades, Linux don’t. Heck, look at ANY “mainstream” distro and look at the apps: some do things the Windows way, some the Mac way, some old-school UNIX, no consistency at all anywhere. Oh, and the latest numbers show Linux stalled at 1.3%; that’s nearly lower than the margin for error. For an OS that has been out 20 years already? That’s horrible numbers.
Me, I’m having to scramble to find a Win 7 Starter supplier, because after all my tests not a SINGLE Linux survived my little three-year test, which is less than half the length Windows provides support, BTW. That’s just sad, man, that’s just sad. Oh, and my little “trick” is simply that, a trick I use to make sure I never have to mess with a machine again. The simple fact is I can slap an AV on a Windows box along with Dragon or Firefox or anything other than IE and it’ll be going a decade from now with NO tinkering. Can you say the same?
It has nothing to do with being a fanboy of anything; it’s about knowing the market, which it’s obvious the Linux community doesn’t, or the numbers wouldn’t be so lousy. If I were to sell Linux boxes I’d be out of business in a year, because I would have to provide free lifetime support AND have spares for when the 6-month upgrade broke something that a “fix” isn’t out for yet AND have to deal with customers getting burned, because the only way to buy devices for Linux is to play hardware roulette, since all the forums have device lists that are horribly out of date. Sorry, but I’d like to not go out of business, and the only ones Linux “works” for is geeks who don’t mind fiddling; since I don’t sell to geeks, no sale.
Simple and easy, Windows? Hah!
You have to choose your side, sir. Either you invoke that Windows can work correctly when a knowledgeable person tunes it up, or you invoke that it can be easily used by non-knowledgeable persons. Because maintenance tasks are made anything but easy on this OS.
And before you feel threatened in your intrinsic argumentative superiority and start to spit kilometric paragraphs about how great Windows is with respect to Linux, I’m not saying that Linux is necessarily excellent. It’s only less bad. What I’m saying, on the other hand, is that the only reason why you think that Windows is less quirky than Linux is that you are more familiar with its quirks.
You keep invoking the fact that Windows has a GUI, but this OS is pretty much the experimental proof that GUI tools can be made less discoverable and understandable for non-technical people than a CLI. From the point of view of support, I’m sorry to say that blindly running a few bash commands is much, much easier than blindly using MSconfig, the Registry editor, the Services manager, or most tools which are designed as a prettier frontend to those such as CCleaner.
In a sense, I prefer what Apple did in OS X: when an OS designer can’t figure out how to design an advanced GUI, it’s best to leave the job to command-line tools. Their simpler nature makes them harder to mess up.
Not exactly. How hard a maintenance task is depends on two main factors: the type of maintenance required, and the tool(s) accessible to the user to address it. These types of tasks and software have made the user experience pretty painless, regardless of the user’s level of computer knowledge.
While I tend to lean in agreement with your point about being familiar with quirks, I cannot agree that Linux is less bad without specifying in what regard you’re referring to. In some cases, Linux slaps Windows around. In some cases, Windows slaps Linux around. In the remainder, it’s a draw. But as a blanket statement that Linux is less bad? No way.
The time Windows spends keeping your hard drive defragged is so insignificant that it’s not even worth mentioning. Grossly over-exaggerating it tells me you either a) don’t actually know much about its defrag process, or b) are intentionally being misleading & FUD’ing. Be better than that.
I was implying that Linux is less bad than Windows as an OS for advanced users who like to fine-tune their user-experience, sorry if it was not clear enough.
As for examples of this…
-The CLI system management tools and text config files offered by most Linux distributions are easily scriptable, whereas most of Windows’ configuration is stored in binary blobs that are pretty hard to access by automatic means (although it’s not impossible).
-This development of CLI tools also means that you can use the GUI environment that suits your needs best, no matter how obscure it is, and yet still be able to get support from users of other GUI environments.
-If you have lots of system administration tasks to do at once on Linux, you can just su root and do whatever you want without being annoyed. On Windows, unless I’m missing something, you have the choice between a/enduring constant UAC warnings even when you log into an admin account and b/giving up on privilege elevation dialogs when you run as an unprivileged user. I know of no way to selectively disable UAC for admin accounts.
-On Linux, when you have messed up an install and want to start over from a cleaner state, it is possible to transfer only part of your user configuration to the new install, simply by taking care of which content you copy from the old /home to the new /home. On Windows, it’s pretty much all or nothing.
-Package managers are a very, very sweet feature to have on a new config when OS manufacturers can afford the server maintenance cost. I’ve heard that Microsoft plan to catch up on that on Windows 8, though.
-It is possible to make quite lean Linux installs that boot quickly and are very snappy, if you are ready to bother with choosing the right distribution or playing around with system management tools. While Windows used to have the wonderful nLite and vLite, which allowed one to get lots of bloat out of the installation CD/DVD and achieve performance that does justice to modern hardware, the author has not kept up with the pace of new Windows releases since he got a job at Microsoft.
-Most Linux distros come on LiveCDs, which are an extremely nice maintenance tool. The equivalent functionality on Windows, BartPE, is nowhere near the same amount of comfort and vendor support, and it takes quite a lot of skill and patience to make a usable Windows LiveCD.
-It would be pretty hard to have Windows install and boot on something other than NTFS, if possible at all. Making it *read* extX volumes is already a challenge. Whereas on Linux, you get support for a large range of filesystems, depending on whether you value performance, fault tolerance, or another killer feature that only filesystem X has.
I could go on and on, but I think you see my point.
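To make the /home point above concrete: a selective migration is little more than copying a chosen subset of dotfiles to the new install. A minimal Python sketch (the KEEP list and function name here are hypothetical examples, not a standard tool):

```python
import shutil
from pathlib import Path

# Hypothetical selection: keep the browser profile and shell config,
# deliberately leaving behind the desktop environment's settings
# (a common source of a broken session).
KEEP = [".mozilla", ".bashrc", ".ssh"]

def migrate_home(old_home, new_home, keep=KEEP):
    """Copy only the chosen files/dirs from the old home to the new one."""
    old_home, new_home = Path(old_home), Path(new_home)
    copied = []
    for name in keep:
        src = old_home / name
        if not src.exists():
            continue  # skip settings the old install never had
        dst = new_home / name
        if src.is_dir():
            shutil.copytree(src, dst, dirs_exist_ok=True)
        else:
            shutil.copy2(src, dst)  # preserve timestamps and permissions
        copied.append(name)
    return copied
```

Everything not on the list (a flaky window manager config, say) simply stays behind on the old install, which is exactly the partial transfer described above.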
As for more regular computer usage, it’s harder to differentiate different OSs as they have become pretty close to each other.
-Windows is quite a bit worse on out-of-the-box hardware support, but generally gets better drivers for nonstandard hardware (GPUs, some network cards…), although the general quality remains awful (Seriously, Nvidia, 100MB+ for an effing DRIVER? And what’s with the screen turning black on every boot on XP?).
-The performance of Windows installs is generally quite awful compared to that of Linux installs, even when the latter are running heavy desktop environments such as GNOME or KDE. For a simple example, one can mention the ridiculous time it takes for a USB pen drive to be recognized. To be fair, this may be related to the need to use antivirus software on that OS, though.
-Windows Update (or is it Microsoft Update now?) is a pain. It’s slow, it doesn’t work with third-party software (which leads to such abominations as the Java updater), and it seems to assume that you will never reboot your computer if it doesn’t keep nagging you on that matter. The related mess on Linux would be dist-upgrades, which are way too frequently needed on some distros and tend to break stuff.
-Installers on Windows are full of rubbish too (unsafe, annoying, regularly used to sneak bloatware in…). I have to give it to Microsoft though: on Linux, out-of-repository software is even more of a pain due to the multiple competing binary package standards.
-As discussed somewhere else, Windows developers have no sense of measure and tend to make their software start on boot for no good reason. Unless regular maintenance (MSconfig, etc.) is performed to avoid this outcome, Windows installs end up loading tons of crap on boot and becoming even slower. Linux software, for some reason, mostly doesn’t have this problem.
In the end, Windows’ main advantage remains that every average techie knows how to fix it, and that it has overall better third party software (games, pro applications…). I wouldn’t call it a good OS for its intrinsic merits, though, it’s mostly the community around it that’s interesting.
As a whole, the time a computer spends doing HDD writes is negligible compared to the time one spends using software that’s loaded in RAM. And yet these little things still break…
AFAIK, if it weren’t for HDDs, computers would easily last 20 years without major hardware issues. Hard drives, cheap ones especially, are incredibly fragile compared to the rest of the machine, and to the best of my knowledge current SSDs are no better. Operating systems should probably really do their best to put minimal load on mass storage devices, instead of inflicting on them a 30-minute intense defragmentation process every week.
I’ve been using Linux for the past 10 years and I can say that for at least 5 of those years everything could be done via a GUI.
Most of the time when users drop down to the CLI, it’s to edit a config file or delete system files. Guess what: that can be done via Kwrite, Gedit or any number of other text editors, and files can be deleted via any file manager. Granted, you’d need to run your app as root (which, again, can be done via the GUI), but that’s no different to running regedit as administrator on Windows. Furthermore, editing config files is no more complicated than editing the registry (in fact, arguably less so).
If you want to know the real reason users get directed to the CLI – it’s not because the GUI in Linux doesn’t work, it’s because:
a/ different users will have different GUIs they’re familiar with – and thus directing them around can be a complete nightmare
b/ CLI commands can be copied and pasted from a forum / wiki and will almost always work, provided the user can copy / paste (which is far less demanding for n00bs than asking them to learn the fundamentals of regedit and msconfig).
Granted, Linux might be shooting itself in the foot by giving CLI solutions to problems, as it gives the illusion of greater complication, but these people give up their free time to help support n00bs, so it’s hardly their fault if they choose the easy solution (from a support perspective) rather than trying to talk someone through a graphical user interface that the n00b is unfamiliar with (have you ever tried to do this? I have, and it’s a bloody painful affair!!)
Bullshit – by your logic PowerShell would be proof that Windows 7 has a lower margin for error than XP.
What you’re doing is taking two personal opinions and trying to correlate an unrelated argument from them; the CLI vs GUI debate has absolutely nothing to do with an OS’s resilience against fucking up.
There are Linux equivalents. To assume there aren’t is just ignorant.
Cut the crap. I never once suggested people would want to or even need to learn bash. Far from it in fact.
If you are going to falsify my arguments then this discussion is pointless.
I take it you’ve never used KDE nor GNOME then :p
You’re the one raising these points, not me. If you really thought I felt that was the case then why are you the one raising those points when I’ve kept quiet about it?
“Doesn’t”
Grammar aside, you’re still wrong. Linux does have “clicky clicky easy” interfaces. I will grant you that many Linux apps are less pretty than Windows apps, though. But that tide is turning, with many desktop environments dragging the GUI into the 21st century.
True, but where is the consistency in Windows? Even Microsoft breaks their own usability and toolkit guidelines (each new version of Office is significantly different in graphical design from the rest of the OS). I hear people (and rightly so) arguing against the inconsistency in Linux, but maintaining such consistency is pretty much impossible for a major OS. Even OS X is starting to lose that fight, and Apple are the strictest for UI consistency.
This argument is just another example of how Windows fanboys can’t look past their own bias to see the problems in their own back yard.
Oh, that old argument: “Few people use it, so it must be shit”.
Let me educate you a little about basic mathematics: 1.3% is a relative figure. Given the vast number of computers in the world (we’re talking billions), 1.3% is a monumentally high number of Linux installs.
Now let me educate you a little about how these figures are compiled: nearly every laptop and pre-built PC sold counts as a Windows sale. Even my laptop (which doesn’t run Windows) counts as a Windows sale, because MS have managed to tie their OS to nearly every pre-install. In fact, trying to buy a computer without Windows pre-installed is a fucking nightmare (trust me, I’ve tried). Furthermore, many PCs dual boot, which, again, would register as a Windows install rather than Linux or both. So we simply don’t know how many Linux users there really are. There’s no accurate way we could possibly know this (and this is proved by all the wildly contradicting estimates you see).
Now let me educate you on public trends: having a large market share does not prove a product is any good. Here in the UK, BT have a massive lead in supplying telecoms solutions, yet they’re one of the least reliable IT corporations around. However, many people buy from BT because they either don’t know better or simply do not have any choice (eg when BT have a literal monopoly in their area). Lynx (the deodorant) is the biggest-selling deodorant in Britain, yet it’s one of the worst smelling and rubbish as an antiperspirant. And finally, many people like the worst music for no reason other than it’s constantly hammered on TV and the radio, so they end up liking it through repetition. If the best quality product always prevailed, then BT would have gone bust, Lynx wouldn’t sell, and pop music would be creative independent artists. However, sometimes people just buy what they’re familiar with, as it’s more preferable to trying something new.
Now let’s take your example and shift it to the mobile market: Windows has about 2% market share on smart phones and even less on tablets, whereas Linux is enjoying ~40% on smart phones alone. Therefore, using your logic, Windows has a lower margin for error the moment your hardware becomes portable.
Now clearly I’m not suggesting this to be the case – however, it does show how absurd your original argument was.
Once again, you’re not comparing like for like:
With Linux, many distros don’t see OS upgrades (eg going from version 1 to version 2) as a new OS but more like a “Windows service pack” for Linux. With Windows, OS upgrades are, in essence, a whole new OS. Thus MS have no option but to keep their support alive.
….continued….
Yup, indeed you can. However the OS wouldn’t be up to date, but then neither would your Windows box if you did what you were suggesting.
So why the unbalanced opinion? At least I’ve admitted the negative aspects of Linux, you seem completely blind to the negative aspects of Windows.
And which market, specifically, is it that the “Linux community” (which, please bear in mind, is so massive that there are different facets for different niches) does not understand?
Your comment is such a sweeping generalisation that I could be here all day debunking such nonsense.
All of my hardware works out of the box and I never check for Linux compatibility. I will admit that my first laptop had issues with its graphics chip and wireless chipset (largely because ASUS, in their infinite wisdom, re-branded the chipsets, so the generic drivers, which should have worked, didn’t recognise the hardware. In fact, Windows wouldn’t recognise the same hardware out of the box either. But that’s neither here nor there, as the end result was the same). That’s the only hardware I’ve ever struggled with in 10 years of Linux.
In all honesty, I have genuinely had just as much hardware go undetected in Windows as I have in Linux: graphics chips that go undetected, soundcards, and so on (just have a read through nerd forums and see the number of “Windows is not detecting xyz” threads to see my point). Windows is far from perfect on the driver front.
In fact, while we’re talking about the n00bs, installing drivers can be a complete nightmare for them (particularly if they lose the driver CD supplied with their hardware). At least with Linux, virtually everything is detected and installed out of the box (bar closed binaries, but many of the n00b-focused distros include simple GUI tools for switching between proprietary and open drivers – tools which download the drivers for you for added simplicity).
Again, I’m not trying to boast that Linux has better or easier-to-manage hardware support than Windows does. They both just work differently, so I acknowledge the differences and the pros and cons they bring. Whereas people like yourself can’t see past the Windows ecosystem and thus assume that MS has developed the only working solution (which, quite frankly, is a narrow-minded attitude to have).
Expanding my point a little, many of the problems I’ve seen with Windows users condemning Linux is the complete ignorance towards it. They want Linux to behave like Windows and then complain when it doesn’t. It’s akin to buying an Android handset and demanding it function like Win Phone 7 or buying a PS3 and expecting it to play Xbox games.
Well said sir!
But those services you listed on Linux are not update managers. They’re just normal system services and you’d expect to see them on Windows as well (have you ever opened the system services control panel applet and seen the number of windows services that run even on a minimal install?).
Any sufficiently advanced OS will have a multitude of daemons / services running – that applies to both Windows and Linux and thus not something I was ever arguing against. However you specifically referred to update managers and stated that Linux suffers from multiple 3rd party update managers – which it does not. I’m not levelling criticism against Windows nor praising Linux, I’m just pointing out that your original statement was incorrect.
That’s not the default action, though (you have to enable non-OS support to get things like Office to update automatically via Windows Update), plus that only works with Microsoft software.
Indeed they do. This we agree on.
Update break your video or wifi? Maybe try a more stable distribution. While there are crap distributions and crap hardware manufacturers, my current distro of choice has been fine across several major versions and GPU/wifi combinations. Granted, I may have a bias towards hardware manufacturers that support a choice of OS platforms; one doesn’t buy an HP and complain about how it doesn’t run OS X well, after all.
jabbotts,
“Update break your video or wifi? Maybe try a more stable distribution.”
Yes of course you can blame the distro, but the very same distro may be very good for some users and not work for others.
Ubuntu had a regression for me where the video turns black after bootup with my nvidia chipset. The live CD would work (I guess in vesa modes), but the installed version just won’t work out of the box.
I tried Mint Debian edition once, but of all things the PS/2 mouse wouldn’t work. I tried in vain to get it working, I ruled out hardware, and I finally gave up and installed Mint Ubuntu edition.
We don’t always appreciate that incompatibilities do exist, and they can be very difficult to resolve.
I usually find Linux easier than Windows to install because it doesn’t have the catch-22 situation Windows has when drivers aren’t installed out of the box (I always bring a Linux live CD with me when someone asks me to install Windows, so I can download Windows drivers).
Anyways if ms is improving this, that’s a welcome change.
Those breakages do suck rocks. PS/2 support can’t be old enough to drop yet, and there is no reason for nvidia breakage in Ubuntu when Debian (the parent distro) has been fine. In Mint’s case, they are building out of Debian stable/testing packages, I believe, so I’d expect some flaky behavior.
In my case, Debian Stable has been a rock. No GPU breakage due to an updated package yet with my Nvidia or Intel. (I haven’t had an ATI in a machine for a long while now, so no idea how AMD’s doing with the open drivers on that one.)
I’ve used other distributions in the past but lacking drivers, some other feature or poor QA on updates eventually drove me to a distribution that better supported my needs.
In general, all products are great for some people and not others. Some people love Mint, others love Fedora; I am not a target customer of either. This was actually my reason for not being specific about my preferred distro initially and for asking what distribution gave you grief.
Windows is much harder to fuck up than XP and below used to be. Much harder. Reinstalls are never really necessary anymore, but for computer-illiterate users, making reinstalls easier is good for everyone.
My sentiments exactly… they keep adding tools to clean up after their crap solutions instead of fixing their terrible mess… but that would break backwards compatibility! Imagine if their customers couldn’t run their Windows 3.1 software on Windows 8… oh, the horror!
There is a tool; it is called MSCONFIG, and it has been there for at least the last 10 years.
Don’t make uninformed statements about Windows.
At least Windows has backwards compatibility (and forward), try installing the latest Firefox on Ubuntu 4.04 … good luck.
Firefox should even work on Win2000 (which has a higher install base for desktop than Linux).
I don’t think any Debian developer/expert user ever advised people to reinstall the whole OS to fix something or make it faster.
If you know your way around the package manager you can fix just about anything and it has been that way for more than 10 years.
I’ve had some pretty bad experiences with Debian ‘apt-get dist-upgrade’, where I ended up with a partial upgrade and dependency problems. Admittedly, I’m not a Debian developer/expert, although I have managed to get certain dependency problems fixed after a lot of man-page reading and head scratching.
Don’t get me wrong, I love the deb package system and I only use distributions with deb packages. It’s my favorite Linux technology (after Linux itself). The only thing it’s missing is a rollback. I hear that may be possible with Btrfs snapshotting, so it seems we’re getting there soon.
But not everyone is an expert. And he/she shouldn’t have to be. Precisely because most times it works flawlessly, you take it for granted, and when something goes wrong you are in a WTF situation.
Still, I see nothing wrong with reinstalling every once in a while, if only to repartition your disks more to your liking.
I could do with a better way to reinstall all currently installed packages with a forced overwrite. My build-script approach works, but a more dynamic “what packages are installed | aptitude reinstall with extreme prejudice” short command could be very handy. Might be that I just don’t know dpkg/aptitude well enough.
dpkg -i /var/cache/apt/archives/*.deb
Nice, I’ll keep that one in my notes. If cached packages are not removed or corrupt, then done.
I guess what I was thinking originally was more like this:
aptitude reinstall --all-installed-packages --from-repository --overwrite-what-you-gotta
Or something closer to the Windows “already installed, want for I should replace any missing or corrupt files?”
I like this though. I hadn’t thought to just go look for the local .deb cache.
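For what it’s worth, something close to that wished-for command can be approximated by piping the installed-package list into apt-get. A minimal sketch, with one assumption flagged up front: the helper name is mine, and it reads package names from stdin so the command-building step can be tried without touching a live dpkg database:

```shell
#!/bin/sh
# Sketch of a bulk "reinstall everything" command for Debian/Ubuntu.
# In real use the package list would come from the dpkg database, e.g.:
#   dpkg-query -f '${binary:Package}\n' -W | build_reinstall_cmd
# Reading names from stdin keeps this testable offline.
build_reinstall_cmd() {
    # Fold all package names into one apt-get invocation.
    # --reinstall makes apt-get fetch and unpack packages that are
    # already installed, rewriting their files on disk.
    xargs echo apt-get install --reinstall -y
}
```

Drop the `echo` to actually run it (as root). aptitude can reportedly do much the same in one step with its search patterns (`aptitude reinstall '~i'`), though I haven’t verified that across versions.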
This is true for any OS.
Whatever OS you use, if the user or users have free rein to install or configure whatever they feel like, from the Internet, free CDs from magazines, and friends, after a while the system is just unusable.
You can add as many security and integrity mechanisms as you want, but if the users have root/admin access they will eventually kill the system.
This is bound to happen to any system. Heck, I am even aware of some friends’ smartphones that could use a reinstall when they start messing with the settings.
TBH this isn’t necessary. Using CCleaner or something similar on Windows will do most of the heavy lifting for you, though it is pretty much a nicer front end to MSConfig.exe
“Know your way around” requires a bit more know-how than most users are going to have. My favorite WTF was when my console text editor of choice stopped working. The program in /usr/bin was fine, but I tracked it down to the filesystem getting full, which caused one of the links in /etc/alternatives (to my text editor) to show up as a 2GB file instead of a link.
Of course that’s precisely what should happen when I slip and let my filesystem get too full!
You aren’t wrong though, that Debian install lasted through years, couple major kernel versions, several hard drives and two completely different systems. It’s still running in fact, though I haven’t touched it in a year or so.
You’re delusional. One can find buggy software to mess up Debian.
I have not been able to find one that can mess up Debian past repair. debsums is a great tool to track down what has been damaged, so you can reinstall the damaged packages.
Pity that blood relation Ubuntu lacks it.
Mind you, I am looking forward to btrfs snapshots, which will make Linux systems way more resistant. MS here is partly trying to get in a first strike against what is coming.
My concern is what that first strike is going to do. With MS, the good normally comes with some bad; the question is where the bad is.
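The debsums-then-reinstall repair loop described above can be sketched as below. `debsums -c` and `dpkg -S` are real commands, but the helper function and its offline shape are my own assumptions, so the text-munging step can be exercised without a live package database:

```shell
#!/bin/sh
# Sketch: reduce "dpkg -S" style output ("package: /path/to/file") to a
# unique list of owning packages, ready for reinstall. A full repair run
# might look like:
#   debsums -c 2>/dev/null | xargs -r dpkg -S \
#     | owners_to_packages | xargs -r apt-get install --reinstall -y
# This helper only performs the "package: path" -> package-name step.
owners_to_packages() {
    cut -d: -f1 | sort -u
}
```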
An OS shouldn’t slow down, no matter how much “crap” is installed along side it.
The user model of Windows doesn’t help with breaking things… it isn’t as bad as it used to be thanks to UAC… but meh…
My Linux install is filled to the brim with rubbish, to the point where my main root partition is almost full. The install is from 2005 and it runs as fast as it did then…
Windows should be re-designed from the ground up to sort out the user privileges and get rid of that mess called the registry… and design a better filesystem while you’re at it. Then maybe you won’t need to reinstall because of it “slowing down”.
I bought a newly installed WinXP laptop for a friend, went through the painful process of installing all the latest Windows updates (*3* reboots IIRC), and found that although the disk was only 10% full, the data was all clustered in the first 20% of the disk and was badly fragmented. It’s like MS learned nothing from the BSD filesystem research on UNIX from the early 80’s; thirty years later, NTFS still performs like a UNIX V7 filesystem.
Once the filesystem fills up, there’s no hope of even defragging properly, though I did find some of the defrag utilities at Sysinternals (http://technet.microsoft.com/en-gb/sysinternals/bb545027) very useful (page file defrag!)
Windows XP is 11 years old.
OK, so they were only 20 years behind filesystem layout research upon release.
ext3 is about the same age as the NTFS version used in XP, and I’d take it any day over NTFS in terms of how its performance degrades over time (all anecdotal, of course; no numbers to back that up, etc.)
Thom, NTFS is also in Windows 7…
Since I have never messed with Windows 7 (I only use it at work because I am forced to), is there an alternative filesystem to NTFS in Windows 7?
Nope. NTFS defragmenters improve over time, though, so maybe at some point NTFS with scheduled defragmentation will start to reach the performance of ext3, save for the occasional slowdowns.
The problem is you’re thinking like Linux, and Windows is, if anything, more like OS X. You don’t use the default utilities in Windows, thanks to antitrust making MSFT hobble them to keep from getting busted. Just look at how much they had to fight to get Windows Defender in there, and it barely does anything!
You want to keep Windows nice and defragged? There is Defraggler if you are cheap; it’s okay but not excellent. Or there is either TuneUp Utilities (my preferred method) or Diskeeper, but neither of those is free. What I like about TuneUp is that it builds a profile on first run and “learns” what is best for YOUR machine. If you have a 200GB drive, its findings might be completely different from mine, with 3TB spread over 2 drives. With mine, it determined that on the 1TB drive, 18% fragmentation was where it affected performance; on the 2TB, it was 24%. I ran benches on the drives and found their numbers right on the money. Diskeeper takes a “never allow ANY fragmentation” approach, which for some, like hardcore gamers, might be the correct approach, but I personally like the “one-stop shop” of TuneUp: it cleans, defrags, and has a ton of other features. It really makes Windows “set and forget”, which is nice when you work 6 days a week like me.
So don’t blame NTFS, as there isn’t anything wrong with it; it has symlinks and junctions and all the other features you’d expect from a FS. It’s just that you can’t use the crap utilities that come with it if you want it running tip-top, as MSFT would get their butts sued. I mean, how long have they owned MSE now, and they are STILL not allowed to include an AV by default?
bassbeast,
“So don’t blame NTFS, as there isn’t anything wrong with it; it has symlinks and junctions and all the other features you’d expect from a FS. It’s just that you can’t use the crap utilities that come with it if you want it running tip-top, as MSFT would get their butts sued. I mean, how long have they owned MSE now, and they are STILL not allowed to include an AV by default?”
So Microsoft is required to make crap products by law? That explains a lot!
I’m kidding. Although, I don’t think Microsoft was ever prevented from developing whatever it wanted; the antitrust case was about bundling and giving its own products preferential treatment in Windows. As a monopoly it wasn’t allowed to do that.
“…neither has ever needed a reinstall, thanks to my secret, which is… don’t install crappy software! I know, it’s a concept, and I’m sure heartbroken over those kitty screensavers I missed out on, but that’s the price of having a rock-solid Windows, I suppose.”
You can mitigate risks by not installing anything new on the machine, and it’s true there’s a lot of crapware out there. However, I find it disappointing that such strict no-playing-around policies are needed to keep the OS running well. Isn’t it reasonable to judge an OS by how easy it is to get rid of unwanted junk? For Windows, oftentimes it’s much easier to reinstall than to fix. More likely than not, the problem lies in the registry, but there’s no systematic approach to solving registry problems. Even if I can compare it side by side with a working system, it’s a can of worms.
All platforms have pros and cons, choose what works best for you and to each their own!
I personally use MyDefrag (formerly JkDefrag), and it does its job well enough although I miss its predecessor’s extremely straightforward UI. But that’s not what I was referring to in the parent post.
My ext3 and ext4-based Linux installs have never reached a level of fragmentation that forces me to manually defragment them. They stay constantly at something like 1% fragmentation, and apparently it’s not a 1% that matters much for overall OS performance.
Yet they don’t seem to have to perform any scheduled defragmentation/maintenance tasks, save for the occasional fsck, which has become extremely quick in ext4. So I have to wonder what it is about Windows that requires regular defragmentation for good HDD performance to be achieved.
Either there is something fundamentally wrong in the NTFS spec, or Windows’ file management routines are very badly written. I don’t know enough about either to conclude. However, since implementation mistakes are relatively easy to fix in software, I’d spontaneously believe that after so many NT releases, if something could be fixed in Windows’ code without a breach of software compatibility, it would be fixed by now. So I assume that NTFS is the problem.
I have been using Windows 7 for over a year now, without reinstallations and I actually tried running defrag a few weeks ago out of curiosity. Well, I had 1% fragmentation, no need to run it, even though I’ve been installing loads of games and applications and whatnot throughout the year.
As such I do not believe that either applies to current Windows.
In their default configuration, Windows Vista and up are set up to run defrag automatically in the background as a scheduled task : http://blogs.msdn.com/b/e7/archive/2009/01/25/disk-defragmentation-…
I too have a ton of games on my Win 7 box, which has been running since Oct ’09 (I have replaced just about every part but the HDDs, and only needed a single reactivation when I switched boards). Counting Steam games, you’re looking at over 100GB of games, and the highest I’ve had it fragment was 4% – big whoop. And I just checked TuneUp, and while it has been cleaning out some dead reg entries where I beat games and tossed them, it hasn’t needed to defrag yet, so I’d say Windows has the frag problem pretty much licked.
Of course, the sad part to me is how many won’t believe it; it’s the same reason I quit hanging around any sites where Linux users may congregate – I got tired of hearing “Windows constantly BSODs ZOMG!” like it’s still 1993. That would be like saying Linux doesn’t have anything but a kernel since Torvalds only recently put it up on FTP!
So let’s resolve to bury some of the old FUD in 2012, ’kay? Windows doesn’t BSOD daily, you don’t run as an admin, it doesn’t get infected just by turning it on, you don’t have to hunt for driver discs (Windows Update now takes care of drivers), and it doesn’t take a supercomputer to run Win 7; in fact, my oldest is running it quite well on a 6-year-old Pentium D with 1.5GB of RAM.
As replied to the previous post, you can consider that Windows Vista and up have fragmentation sorted out if and only if you are ready to ignore the fact that they keep running defrag in the background every week, needlessly eating up power and cutting into your fragile HDD’s lifetime in order to compensate for terrible everyday file management performance.
bassbeast,
“you don’t have to hunt for driver discs (Windows Update now takes care of drivers)”
I wouldn’t have responded to any of your other claims, but here I beg to differ. Windows 7 driver installations can still be particularly problematic before web access is available. Of course the problem is generally only witnessed by people who build their own windows systems and don’t use the OEM recovery partition to install windows.
In all honesty, I find Linux wired ethernet compatibility out of the box to be undeniably superior to any version of Windows. Wireless, on the other hand, is a different story; I’ve spent numerous hours cursing Linux distros for WLAN problems that the Windows drivers don’t share.
Weird. I haven’t defragged in a year. I just checked, and I’m 1% fragmented.
I also frequently install/reinstall software, and add/erase large amounts of files, big and small.
I know my experience isn’t universal, but my usage pattern should be the type that causes much fragmentation, yet it doesn’t.
Drumhellar,
“I know my experience isn’t universal, but my usage pattern should be the type that causes much fragmentation, yet it doesn’t.”
It depends on how full the drive is. Consider a disk at 50% capacity: there are probably still plenty of unfragmented clusters available to place new files in, and so those files won’t become fragmented, even if files are deleted.
It’s not until the disk approaches capacity that the file system has to start making compromises about placing bits of files at less ideal locations and fragmentation begins to take place.
I don’t know… I have a Windows computer at home whose drive has never been more than 70-80% full for some time, yet playing around with large files on it (installing, playing, and uninstalling games) can lead to some pretty bad fragmentation. I couldn’t believe myself how much defragmenting it the other day improved performance.
I suspect that NTFS is designed to put as much data on the outer HDD tracks as possible, because those have higher data throughput. Never mind the fragmentation and reliability problems that this kind of algorithm may cause.
Neolander,
“I don’t know… I have a Windows computer at home whose drive has never been more than 70-80% full for some time, yet playing around with large files on it (installing, playing, and uninstalling games) can lead to some pretty bad fragmentation. I couldn’t believe myself how much defragmenting it the other day improved performance.”
Yea, it’d be challenging to come up with a quick and easy test to compare the fragmentation levels of different file systems under various conditions. I don’t even know if the linux NTFS driver would show the same patterns as the windows NTFS driver.
I think part of the issue is one of developer attitude. On Linux, software doesn’t tend to put itself in the boot sequence, reinvent GUI widgets for rubbish reasons, feature speech and blinking ads, etc… nearly as much as on Windows, yet nothing in current package managers prevents it from doing so.
This is actually the main problem with Windows: all the crap that starts at the same time as the OS for no good reason. Usually, when people call me to fix their slow computer, I just remove these apps from the boot sequence (NOT from the computer, just the startup at boot), restart the PC, and the problem is solved.
Sometimes the PC is so f*cked up that I just go for a reinstall, but this is rare.
If Windows would prevent applications from starting automatically at boot (at least without the user’s consent), this would solve a lot of problems.
A central configuration database actually has quite a few benefits. I consider it one of the last places that need to be redesigned.
mightshade,
“A central configuration database actually has quite a few benefits. I consider it one of the last places that need to be redesigned.”
You’ve got some nerve saying that out loud, man
Especially considering that non-Windows OSs have had central configuration databases for a long time without needing anything like the Windows Registry.
Consider /etc and .xxx home subdirectories on Unices, as an example.
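A quick illustration of why the flat-file approach appeals to people: any text tool can read or change a setting, no registry API needed. This is a made-up sketch; the file layout and key names are hypothetical, not from any real /etc file:

```shell
#!/bin/sh
# Hedged sketch of Unix-style "key=value" configuration. Because the
# store is plain text, reading a setting is one sed expression away.
get_setting() {
    # get_setting FILE KEY -> prints the value of KEY in FILE
    sed -n "s/^$2=//p" "$1"
}
```

`get_setting ~/.myapprc editor` would print whatever follows `editor=` in that (hypothetical) file.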
I know. I think I’m just tired of reading “… and let’s put everything in various text files in various directories instead!” (which usually follows in the next sentence), like this was some inherently better approach. Both have pros and cons, and neither is a “mess”.
That’s just bullshit. I have hundreds of applications and games installed and I am not getting any slowdown on my Windows installation. It was true back in XP, but it hasn’t been true for Vista or 7.
That’s just worse bullshit… they haven’t replaced the registry yet, so nothing has changed there. Windows WILL kill itself slowly. App developers may be a bit wiser in the registry universe, so the bad impact on performance is probably less than it used to be… no credit to Microsoft, I’d say.
Yes, there have been changes. It is a lot faster and more optimized now compared to what it was on XP.
In my experience maintaining a domain of 600 machines, all running Windows 7… Nah, I don’t agree. It may be good on a quad core with 8GB RAM, but at the end of the day it will still slow down and suffer from shitty problems that just shouldn’t be there…
But… that’s my opinion – no need to be nasty on your first post.
Re-install a Linux operating system?! Well, I suppose that depends on what you mean. Like going from LMint 11 to 12? I run Arch, and as it’s a rolling release, are you counting every time I update my kernel? Not everyone needs a version jump for new features… Don’t tell Microsoft this, as that’s their whole business model.
Yes, I know things get messed up; however, I haven’t run across a need to in 10 years, version upgrades notwithstanding. Well, I did have to deal with broken GRUB once – what a PITA. I guess it also helps that I mostly run older systems; by the time I get them, they’re stable.
Just for perspective, Microsoft is not showing how their OS is much worse than Linux or whatever by bundling reset/reinstall functionality. Quite the opposite, the variants of Linux that most consumers use have this feature, and tons of people happily use it!
When someone brings out a needle or whatever to poke the little reset hole on the back of their router, they are restoring a messed-up Linux system to its pristine state. It is a good feature that solves a lot of technical issues for many people in an understandable and acceptable way. The same goes when people do a factory reset of their Android phone – also a very common way to handle issues with phones.
In both the case of the router and the phone, I have little doubt that it would be perfectly possible to get a shell up on it and then go through the system to find out what is actually wrong, but few people have the time and skills necessary to do so (or really care to; I am sure most readers have reset their router at one time or another without even trying to access the Linux system in it). It is especially a no-brainer if it can be done without losing your data and settings, which is exactly what Microsoft is designing this system to do.
It is true that Windows has also had a special need for this type of system, which makes it a bit sweeter still. That has improved greatly, though: where I would reinstall XP every other year or so on pretty much all systems, I have not reinstalled a single Windows 7 system yet, and the 7 installs are well past 2 years old already.
And, since you can only get Metro apps from the Windows store, to use this new feature, you have to get all your apps from the Windows store to really benefit.
EDIT: I would like to add that I don’t think this is intentional, but it does exist.
….doubtful. Just like scheduled server restarts are a thing of the past with Server 2008 R2.
“it remains the number one piece of advice for any computer operating system ”
This is what is called comment bait. I see a bunch of Linux enthusiasts are countering this assertion.
Debian “stable” tends to be just that. However, it’s painfully slow when it comes to being updated. Debian unstable is often stable as well. But to say that Debian or even Linux in general doesn’t suffer from constant breakage via updates is absurd. Most Linux distros are in fact not rock solid. Linux is a 24/7 work-in-progress. That’s why there’s a steady flow of bug reports, fixes and/or regressions, more bug reports, etc etc etc. And speaking of regressions, …there are plenty of those. Why? Because (mostly) untested code is merged all the time. Why? Because devs are in a rush to merge their code and/or simply don’t bother considering test scenarios beyond their local setup — and that’s if in fact they even have the hardware locally.
A Linux install, as with Windows, can last a very long time when properly maintained. Neither is immune to reckless user behavior and crap package management.
Commonly given by those who know little to those who know even less.
That is the question people are not asking.
MS already allows computers to be sold without any media, just images on the hard drive.
Now this could be the next level – great, this is cool for malware writers: get past MS’s repair system and basically infect the system forever.
Really, more questions need answering before we know if this idea from MS is good – or if everyone should be looking at Linux, just from a security point of view.
oiaohm,
“MS already allows computers to be sold without any media, just images on the hard drive.”
“Now this could be the next level – great, this is cool for malware writers: get past MS’s repair system and basically infect the system forever.”
I’ve already found malware on the windows recovery partitions which allowed it to survive a format & reinstall of the main partition.
I still think a “factory reset” feature is a good feature to have, but it’s very lame that manufacturers require us to purchase CD/DVD media separately. It ought to be included, even if only because the hard drive could fail.