David Williams over at iTWire has done a comparison of Windows vs Linux, performed by doing functionally identical tasks in both OSes. This comparison is not a fair one by any measure. The laptops running Windows and Linux differed in hardware configuration, and the software used for the tests was comparable but clearly different (MS Office vs OpenOffice; IE vs Firefox 3).
From the article:
It’s a shame really. I would think that a comparison of the best Windows apps vs the best Linux/open source apps would’ve made for a much more interesting article.
Well, I would think that anybody who would actually care about an article such as this probably has their settings tuned for best results. On Vista, this means turning off Aero, indexing, UAC, and most of the other crap that slows the system down. Similarly, I would assume that a finely tuned Gentoo or Slackware setup would do much better than Fedora with Compiz loaded. Linux is just hard to test in this regard, because there are so many different configurations, distros, and desktop environments out there.
Additionally, the author says that Symantec Antivirus was installed, which basically means the system was crippled before it even had a chance. You might as well load the Vista box down with malware before you run the tests. NOD32 would’ve been a much better choice.
Disclaimer: I didn’t even read the results of this test so I dunno who came out on top. So when I say the test is borked from the beginning, it’s not because I didn’t like the results.
As much as I dislike how Vista’s accelerated GUI looks, I have to say that turning it off hurts GUI performance more than it helps. Sure, on a laptop it chews through battery since it keeps the GPU going all the time, but when you disable it, you are left with a really slow GUI. After using NEXTSTEP, OS X, and Vista + Aero Glass, I’ve come to REALLY hate the tearing and redraws you get on most other GUIs.
Are you sure?
http://weblog.infoworld.com/enterprisedesktop/archives/2007/03/vist…
OK, “perceived” performance. I’d rather it be a bit slower and smoother than fast and horrid.
OK, I’m apparently an idiot. I haven’t used Vista enough to notice the performance hits that article talks about. But the interface was smoother than in classic mode.
anyways…
Why do you expect us to read to the end of your post, then, before we hit the -1 button?
Seriously, folks, how can you even *begin* to compare the OSes when you’re not even running on the same hardware? The results are meaningless. Also, while comparing different apps is anecdotally interesting (if only because it highlights the differences in the apps), a more meaningful comparison would be to see how Windows and Linux handle RAM and disk usage for the same apps (e.g. Firefox, OpenOffice, etc.). It isn’t clear to me why the author decided to go down the path of introducing so many unquantifiable differences into his analysis. It isn’t that difficult to image a machine, install a new OS, image the new install, and then flip back and forth between the configs. Unless this was simply about being “different”. But, quite frankly, I would prefer “intelligent” to “different”.
Agreed, for the stats to mean anything the testing should have been done on the same machine.
Install the OS, do the timings, clear the machine, install the other OS, do the timings again.
Anything else is a waste of space. I am sorry I read the article, it was just one big Linux advert.
We Linux users neither need nor like sites like this.
“Install the OS, do the timings, clear the machine, install the other OS, do the timings again. “
You don’t even need to wipe out one OS to install another. You can simply install both in the same time and dual boot.
Edit: on the same machine I mean (just to clarify things).
You can’t do that either: the OS installed at the start of the drive will enjoy faster access times than the OS installed on the later parts of the platters.
Not necessarily true…
NTFS tries to put everything in order initially (from the beginning of the drive). Hence accessing something at the beginning of the partition and then something at the end is going to incur latency.
ext3 (I don’t know about the others) writes data on either side of the middle of the partition, so that at any given time data is closer to the drive head. This is also the main reason ext3 does not fragment as easily as NTFS.
Stuff at the beginning will only be faster for things like bootups and when the drive heads are parked.
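The mid-platter intuition is easy to sanity-check with a toy simulation (my own sketch, not from the article; it ignores rotational latency, zoned recording and elevator scheduling, and just measures straight-line head travel):

```python
import random

def mean_seek(head_start, trials=100_000, seed=42):
    """Average head travel from a fixed starting position to a
    uniformly random block, on a disk normalized to [0, 1]."""
    rng = random.Random(seed)
    return sum(abs(rng.random() - head_start) for _ in range(trials)) / trials

# Head parked at the start of the disk vs. at the middle:
print(round(mean_seek(0.0), 2))  # ~0.50 of a full stroke
print(round(mean_seek(0.5), 2))  # ~0.25 of a full stroke
```

For uniformly random accesses, the expected travel from the middle is exactly half that from the edge, which is the whole argument for clustering data around the centre of the partition.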
Neither Linux nor Windows lets you specify the physical location on the disc. Dual boot or not therefore doesn’t matter.
ANY Linux installation utility will open up a partitioner where you can work on disc sectors rather than size. Clearly you have never used one.
Which means that the two os’s will be at different parts of the disk with different performance characteristics…
Better to get 2 identical machines, or at least 2 identical drives.
The sad part is that it isn’t even fair to Linux: stating that Fedora uses 1 GB of RAM at boot is just BS, or he really fscked up his installation. I’m not a regular Fedora user, but that number is just nonsense.
I think the guy probably doesn’t even know how to measure PROPER memory usage. In Linux the majority of RAM is used to cache stuff. The more you have, the more it likes to cache.
For example, my friend had 256 MB of RAM on an older computer, yet after boot Ubuntu used only about half of it, and the rest was cache, with no swap at all.
The same distro on my 1 GB machine used up about the same amount of RAM but MUCH more cached stuff. No swap either, of course.
I bet his real usage was, say, 256 MB, but he also counted the cached stuff.
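For what it’s worth, the “used minus cache” figure can be read straight out of /proc/meminfo. A rough sketch (the field names follow the Linux procfs format; the sample numbers are invented for illustration, not taken from the article):

```python
SAMPLE = """\
MemTotal:        1048576 kB
MemFree:          102400 kB
Buffers:           51200 kB
Cached:           655360 kB
"""

def parse_meminfo(text):
    """Parse /proc/meminfo-style 'Key: value kB' lines into a dict (kB)."""
    info = {}
    for line in text.strip().splitlines():
        key, _, rest = line.partition(":")
        info[key] = int(rest.split()[0])
    return info

mem = parse_meminfo(SAMPLE)
naive_used = mem["MemTotal"] - mem["MemFree"]            # counts cache as "used"
real_used = naive_used - mem["Buffers"] - mem["Cached"]  # what apps actually hold
print(naive_used // 1024, "MB naive vs", real_used // 1024, "MB real")
```

On the invented numbers above, the naive figure is 924 MB while applications actually hold about 234 MB, which is exactly the kind of gap that turns “256 MB used” into “1 GB used” in a careless review.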
I stopped reading when he said that after closing the browsers he found that memory usage was higher than before and his first explanation was that they have memory leaks.
I doubt he knows how to measure memory usage properly.
But just look at how much discussion it’s stirred up already. The real hot zones must be ablaze with Fanboi versus Zealot armageddon.
Think of the page hits; ooh… won’t someone think of the page hits (er… children… think of the children).
(Keehee… I had to find something amusing to do with the discussion. The article pretty much cancelled itself at “did not use same hardware or software”.)
This test is the worst I’ve ever seen, because Symantec Antivirus was running on the Windows machine.
He is just comparing a workable Windows machine with a workable Linux machine. Can you assure me that the Windows box can be operated without antivirus?
Mine run without a realtime virus scanner going… they get scanned once a month and never have anything.
It’s not needed more on Windows than on any other operating system. Why would you think otherwise? Forums? Have you ever seen Linux support forums? Having run both operating systems since 1998, I have.
It could be run with a proper AV program, and not that POS. Symantec Antivirus is the worst product in that category of crap bar none.
I see your point, but arguably it is fair to run a piece of antivirus software on Windows, since you kind of need to have it running.
1.12 GB of RAM used on Windows vs. 1.06 GB on Fedora? Almost a negligible difference. Both systems are using far too much RAM, in my opinion.
Fedora is not fat. I performed a default Fedora 9 + GNOME install and I got around 150 MB of RAM used at boot. I have no idea how you can get 1 GB of RAM usage at boot. I’ve never ever seen more than 200 MB of RAM usage at boot on the most bloated Linux distributions. I currently use a custom-installed Mandriva that shows 70 MB of RAM usage at boot.
Edit: Mandriva runs with KDE
They probably miscounted the RAM…
It’s not uncommon to take the memory usage figure from top, which includes memory used by buffers and cache… After a full boot I wouldn’t be surprised to see 1 GB of RAM being used as disk cache on a system with so much RAM to spare.
*nix has always been heavy on RAM and disk throughput, based on how it handles memory. Fast read/write disks and nice big fast RAM sticks for caching.
I understand that Vista is actually doing precaching now too, which is why it seems to use a huge amount of RAM. One program open and 80% used? No worries, it’s supposed to use RAM like that now too.
You’d have to strip out the caching from both platforms and look at memory dedicated to programs for a real idea of which is bloated in that regard.
Linux uses RAM in a very different manner than Windows. Most RAM is used as disk cache to speed up file operations. Linux is designed to use as much RAM as possible to keep the amount of disk access operations as low as possible, thus improving efficiency and speed.
You can’t compare Windows memory usage with Linux memory usage. Most memory used by Windows is just to store static blocks of memory like complete programs. Linux uses most memory for dynamic purposes like changing data and files.
As I see it, the point was, as the title says, to perform real-life benchmarks. Virtually no one runs Vista with the crappy Windows 95 theme and with indexing and such disabled. Also, IE is still the most used Windows browser and MS Office the most popular office package out there, while Linux people use Firefox and OpenOffice.org.
I don’t get how Fedora can eat over a gigabyte of memory right away, since my current distro, Ubuntu 8.04, needs only ~700 MB max, and I’m running a 64-bit system here! My old 32-bit Arch Linux settled down to ~400 MB with Compiz Fusion.
Anyway, I didn’t read the article, since these so-called benchmarks were run on different hardware and no graphical evidence (charts) was available.
Unused memory is useless memory.
And I thought I was the only one who doesn’t read articles to the end, so I don’t know the results either, if that matters.
And 1 GB of RAM at the start? I don’t know what eats up so much memory; I was running a machine with Arch + Xfce + Compiz + Firefox in something around 150 MB of RAM.
How someone could even consider using the words “face off” for wildly different computers is mind-boggling.
It makes me want to donate money to mental institutions and medical companies. We sorely need a cure.
At least both machines had 15″ screens.
Haha, you made my day! But seriously – how can you even think about benchmarking two operating systems on DIFFERENT HARDWARE? This. Is. Totally. Stupid.
And waiting for Singularity. Hopefully it will come preinstalled on a quantum computer, which will presumably replace the laptop at roughly the same time.
Hmmn, I’m running 130 processes in around 640 MB of RAM on Debian Testing, with GNOME/Compiz, Evolution and Firefox, but also with Apache and MySQL running, and with Exim4, Dovecot and Spamassassin as well.
What the article fails to make clear is the creamy smoothness of it all on Linux. I was once mail-bombed on a similar setup – 5000 emails in a couple of hours – and didn’t even notice till later. Everything just worked – Exim4 and Spamassassin just did the business. You also get superb logging on Linux, not something one can take for granted on Windows.
I expect a server chewing through 5000 emails would have worked fine on Windows Vista too, but I really wonder what those web and mail servers would do to the RAM usage and smoothness of the Windows machine while the user gets on with desktop tasks.
So yes, the article’s comparison is something of a balls-up, with different machines and, so far as one can see, pretty questionable configurations. How he calculates the RAM usage on Linux is something of a mystery. However, his eventual conclusion is surely correct: when it comes to memory usage, Linux pwns Windows.
If readers thinking of trying Linux take nothing more away than that, the article will have earned its keep.
Please. If you mail-bomb an Exchange server, it starts to do weird things. Event sinks don’t fire, API calls stop working (randomly fail), all sorts of things begin to break down.
Both Firefox 3 and OpenOffice are available for Windows. I can forgive the fact that the hardware is different. Is he really stupid enough not to use the same applications?
It doesn’t matter since the hardware is different. Nothing matters.
That’s not the comparison. There are a stack of comparisons all over the net, and Microsoft again comes across badly in the area of memory footprint for these applications… but that is not the point of the article. The point is that, as far as memory footprint goes for a functional OS + office suite + web browser, open source is superior to the Microsoft proprietary products.
Exactly. The point is that on a lesser machine Linux delivers more functionality and performs better than Vista on a higher-spec more expensive machine, running the “typical” desktop applications that one would presumably want to run on those systems.
After all… what is the point of using an expensive machine bogged down with Vista, DRM and encumbrances such as anti-malware, Aero, WGA, etc., to run applications such as OpenOffice.org and Firefox, which run faster on a far cheaper machine?
The fact that Vista won’t even run on the machines in the “hot” new notebook category, but Linux runs fine and better than XP on those same machines, should make this fact self-evident, but it doesn’t hurt to bring it to people’s attention again.
What is interesting is the dearth of articles featuring real, direct, head-to-head-on-the-exact-same-hardware comparisons being published.
If I were the paranoid sort, I might infer from such a lack of direct comparisons that one influential party might not want such articles published for public scrutiny. People might after all begin to wonder why they are not being offered the best value-for-money option in the stores.
This article is superfluous. I’ve read all the comments but I will save you a little time, because I’ve done these arguments to death.
Ignoring the different hardware: memory footprint is being tested, not performance, application support, features, usability, maintenance, security, or look and feel. All of which, as we all should know, are important, especially with hardware prices dropping, and memory prices in particular due to Vista. Vista is even being used as a selling point by memory manufacturers; have you not seen the advertisements? And as at least semi-technical people, we should be aware that memory can be used for all kinds of clever things, from parity to a whole host of caching that might have side benefits, and that’s ignoring things like features.
However you paint it, Vista + Office 2007 + Internet Explorer 7 + any necessary third-party antivirus program is going to look pathetic on memory footprint when compared to any Linux distribution + OpenOffice 2.4.1 + Firefox 3… but then, it would compared to Windows XP + Office 2003 + Internet Explorer 6 as well. That’s why Vista is losing out in the new disposable computer market. Have you not been paying attention?
This total blind denial is tedious. It’s why I post less. The arguments used to be that Windows had application support, de facto standards, default installation, familiarity, those Adobe products, commercial gaming, and a host of other reasons going for it.
Linux-based distributions now have enough application support, with good enough office, Internet, and small-tool offerings. They support the proprietary standards as well as legitimate open standards; one desktop, it turns out, looks enough like another; Adobe products are heavily supported on those ever-popular Apple products; and open source software is not only improving but is at least good enough for all but the elite, if even that is still true. Commercial gaming is being crippled by Microsoft and by consoles simply being a better platform, with casual gaming being good enough on Linux (in reality I suspect social networking is killing desktop gaming).
…but the bottom line is that if your response is not “Microsoft proprietary products have larger memory footprints than their most popular open source rivals, but they <insert advantage here>”, then you are a <insert noun for reason for denial>.
I miss the days before Vista was RTM; at least Microsoft was a contender then.
The (deeply flawed) comparison is about systems that are functionally equivalent. You can’t honestly tell me that you think IE6 is the equivalent of Firefox 3. You may be able to make a case for Linux vs. XP (thousands have tried and we still haven’t come to a decisive conclusion about which is better). And on the other end of the spectrum, Office 2003 is generally a better office suite (more features?) than OpenOffice (though that is debatable too, depending on one’s needs).
The point is that attempting a fair comparison between functionally equivalent software is not really possible using either your recommendations or the article’s.
I guess the one conclusion I can come to is Windows is better at some things than Linux distros and Linux distros are better at some things than Windows. But trying to make an apples to apples comparison is damn near umpossible.
I disagree. Twice.
I can think of some areas where Linux is better than Windows (any version, even XP) … for example, Linux comes with a powerful compiler, XP doesn’t.
I can’t think of any area where Windows on its own is “better” than Linux … where you define “better” in terms of “what can I get done for my money”?
So take that further… define a set of “things I want to get done on my desktop”, such as: browse the web; write a letter; read an e-mail and reply; make a flat spreadsheet; IM; etc. Make it a list covering what most of the PCs sold in stores would be used for.
Now compare the least expensive and best-performing “solution” to get those things done using a Linux machine. Ubuntu probably, since it is the best known. The lowest spec and cheapest of the low-end Celerons would probably suffice.
Now match that performance with a Vista machine (a tough one, I know, but there are some really high-spec machines available now, so you might find something that, despite running Vista, can still match the performance).
OK, now tally up the costs for both solutions … Oh dear.
So the question is … why aren’t people being offered in stores the far and away better option in terms of value-for-their-money-for-their-needs? Why is their only option the far more expensive Vista?
So we should compare Microsoft Visual C++ to GCC. Which gives the best code?
Which one gives the best value for money?
Which one can best deliver code for other platforms (other than the platform on which it is running)?
Which one is part of the normal distribution of the OS, and which one is a bolt-on extra cost item?
Neither, they are both free
Since when were we talking about cross-platform? Don’t complicate the argument.
Actually, neither really. GCC is not installed on a basic Desktop install of most Distros. The fact that Linux distros tend to use package managers is irrelevant here.
Neither. They both can be obtained for free.
Only the “express” version of Visual C++ is free.
http://en.wikipedia.org/wiki/Microsoft_Visual_Studio_Express
“Microsoft Visual Studio Express is a set of freeware integrated development environments (IDE) developed by Microsoft that are lightweight versions of the Microsoft Visual Studio 2008 (codenamed Orcas) product line. ”
The GNU Compiler Collection (gcc) is anything but lightweight.
http://en.wikipedia.org/wiki/GNU_Compiler_Collection
“The GNU Compiler Collection (usually shortened to GCC) is a set of compilers produced for various programming languages by the GNU Project. GCC is a key component of the GNU toolchain. As well as being the official compiler of the GNU system, GCC has been adopted as the standard compiler by most other modern Unix-like computer operating systems, including Linux, the BSD family and Mac OS X. GCC has been ported to a wide variety of computer architectures, and is widely deployed as a tool in commercial, proprietary and closed source software development environments. GCC is also used in popular embedded platforms like Symbian, Playstation and Sega Dreamcast.”
Visual C++ express has nothing like the capability of gcc.
http://en.wikipedia.org/wiki/Microsoft_Visual_Studio_Express
“The idea of express editions, according to Microsoft, is to provide a streamlined, easy-to-use and easy-to-learn IDEs for less serious users, such as hobbyists and students.”
Visual C++ express is not meant for real, “serious” use.
gcc is the premier compiler in the world. It is the basis for more software on more platforms than any other product, by a huge margin.
http://gcc.gnu.org/
“The GNU Compiler Collection includes front ends for C, C++, Objective-C, Fortran, Java, and Ada”
As for backends, or targets if you will:
http://gcc.gnu.org/backends.html
If you write your code starting with Visual C++, and your project grows and shows signs of becoming useful as time progresses, then you will quickly find that you run into the limitations of the “express” compiler features, and that you must purchase an expensive Microsoft proprietary compiler to take your project further … and even then, your project will forever be constrained to be practical only for an x86 architecture running Microsoft Windows as a target.
Using gcc to begin your project is equally viable, but it imposes no such limitations. You can use it to write a “hello world” in a number of languages as a learning exercise… all the way through to using it as the entire basis for a multi-billion-line (20,000+ packages) complete software distribution.
Yes it is included.
$ uname -a
Linux ********* 2.6.24-19-generic #1 SMP Fri Jul 11 21:01:46 UTC 2008 x86_64 GNU/Linux
$ which gcc
/usr/bin/gcc
$
Well, the full compiler is free; just the extra tools aren’t included in the express edition. Getting equivalent tools (in functionality and ease of use) for GCC for free is difficult.
Quoting Wikipedia in this case is really not clever.
And that is relevant here, why?
Well, they certainly have different capabilities. Certainly the Microsoft compiler has far better managed-code compilation support, and IMO the built-in debugger is easier to use than, say, GDB or DDD.
This is just not true. The limitations of the express edition plainly do not work like that. Sure, there *are* features in the Visual Studio Team edition that make project management easier, but it’s not that the express edition is crippled.
Well done, you found a way to install your distro *including* GCC; however, many distros DO NOT include GCC by default. If I were really pedantic, I could create a Windows install CD that provided Visual Studio by default too, if I wished.
Au contraire
http://en.wikipedia.org/wiki/Eclipse_(software)
http://www.eclipse.org/screenshots/
http://en.wikipedia.org/wiki/Kdevelop
http://www.kdevelop.org/index.html?filename=3.5/screenshots.html
http://en.wikipedia.org/wiki/Qt_(toolkit)
http://www.anjuta.org/
http://www.anjuta.org/screen-shots
http://en.wikipedia.org/wiki/Lazarus_(software)
http://wiki.lazarus.freepascal.org/Screenshots
… it is just a session with a package manager away.
Most of these are way, way better at cross-platform application development than any Microsoft product is.
It was a standard Kubuntu 64 install from a LiveCD. That is about as typical as you can get.
Granted, you would have to get IDEs… they don’t come standard. But the compiler and text editors do.
Here are some extras for GUIs, if you don’t like Qt:
http://www.fox-toolkit.org/
http://www.fltk.org/
http://www.fltk.org/shots.php
http://www.fltk.org/applications/shots.php
Enjoy!
c:\>ver
Microsoft Windows [Version 6.0.6001]
c:\>cl
Microsoft (R) 32-bit C/C++ Optimizing Compiler Version 15.00.21022.08 for 80x86
Copyright (C) Microsoft Corporation. All rights reserved.
usage: cl [ option… ] filename… [ /link linkoption… ]
c:\>
You are comparing apples and oranges. VS is an IDE; gcc is a compiler. VS Express lacks things like integrated unit testing and the ability to use plugins, but the compiler is full-featured.
The VS express compiler/IDE lacks support for application development on all platforms but one, and all languages but one.
Miles away from “full-featured”.
The VC++ compiler is ANSI compliant; as long as you aren’t doing platform-specific development with it, there is no reason you can’t write code for another platform in VS. Cross compilation is nice to have, but hardly something one would use with any regularity except in very specific situations.
Au contraire … most of the world’s software applications don’t run on x86 Windows platforms.
Mobile phones alone would probably beat x86 Windows platforms, let alone the myriad of other embedded applications.
Why? Because every normal user uses Visual C++ and is designing and developing programs? A bit strange, don’t you think? And who is really interested (besides some geeks) in the efficiency of Visual C++ versus cc when you can compare the results in a test like the one we are responding to?
The test was intended to cover the normal applications every normal (standard) new user would use on the two operating systems. For Windows this was Vista + MS Office + IE. That’s the combination the majority of common Windows users are using, whether you like it or not. For Linux this was Ubuntu + OpenOffice + Firefox, a combination most common (new) users will use. Most Windows users do NOT use OpenOffice (as some Windows fanboys keep arguing; where are they now?), and NOT Firefox (same remark). And let’s face it: most Linux users would not think about using IE or MS Office on their machine. So, what’s the fuss all about?
Like it or not, even on older hardware the open source combination gives better results than the closed source combination. This is undeniable.
The only big flaw in the test (as far as I am concerned) was the memory footprint. Linux is designed to use as much memory as possible just to improve performance. Why have a lot of hardware (memory) when it is not used? That makes no sense. So programs, and even an OS, using a lot of memory is not that bad. What IS bad is that Vista uses the swap file even when there is enough memory left. That is a sign of bad design (even though they used Visual C++)…
Some people here are arguing that it is not fair to use Vista for the comparison and that he should have used XP. Let’s all be real here: no new user can even buy XP easily at this moment, so the overwhelming majority of new users will be left with Vista. The author is also using the latest version of Linux (even newer than Vista), so this is a fair test in this case.
I think the point was that the two systems represent the most common setups for the average user (though I guess Ubuntu would be more common than Fedora), or the OS vendor’s “preferred” setup of applications.
Whoa, my system has only 512 MB of RAM and has had for the last 4-5 years.
I’ve run a few different flavours of Linux, but mostly just openSUSE (currently 11.0). I used KDE up until recently and now use GNOME a lot.
Memory usage on boot is around 100 MB or less. The most I’ve seen it use is around 350 MB, and that’s when I’m busy working on something. The swap has never been more than, say, 10 MB, and is not used at all 99% of the time.
Still, RAM is so cheap nowadays that as long as I have enough, I couldn’t care less how much of it the OS uses.
What gripes me is when Windows uses swap BEFORE I’m anywhere near running out of available RAM?! Go figure!
Linux and Windows people unite! This article is pure BS!
I can find some excuse for the hardware/software not being equal. But the author clearly has no understanding of several things:
– Memory usage
– Page faults
– Caching
– …
Where is the usability factor here? Does he think we all keep task managers / process viewers / top running all the time just to see how much memory is used? Fedora definitely won’t need 1 GB of RAM at startup. And Windows Vista uses most of the RAM for caching.
This whole thing is pointless.
Well then, you don’t have to post every crap story only because it’s there, do you..?
Yes, yes we do, it is an addiction.
I’d like to point out one thing.
Some of you are wondering why Vista is touching the swap file even when nothing is using all the RAM.
Please open Task Manager, look under the Performance tab and then look at the Physical Memory section.
Mine currently shows:
Total = 4028 MB
Cached = 2768 MB
Free = 9 MB
And if I look at the CPU/Mem gadget at the sidebar, it shows that I have used 44% of my RAM.
As you have probably noticed, the 44% used does not match the 9 MB shown free under Task Manager. And I guess the author of the article made the same mistake: thinking that he had ~50% free memory left and then wondering why the swap file was being used.
My guess is that Vista actually uses all the memory in your machine.
Some 44% is currently used (in my example) by all the applications and the OS itself, and all the rest goes to caching.
Since I have pretty much no free memory left, opening an application will touch the swap.
Whether this is a serious design flaw or not is another discussion altogether…
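Working through the parent’s Task Manager numbers makes the mismatch concrete (plain arithmetic; the assumption that the Cached and Free counters don’t overlap is mine, and the gadget’s 44% presumably tracks yet another metric, perhaps commit charge):

```python
total_mb = 4028   # Physical Memory: Total
cached_mb = 2768  # Physical Memory: Cached (standby pages, reclaimable)
free_mb = 9       # Physical Memory: Free (truly idle pages)

in_use_mb = total_mb - cached_mb - free_mb  # resident working sets
reclaimable_mb = cached_mb + free_mb        # available on demand

print(round(100 * in_use_mb / total_mb), "% genuinely in use")
print(round(100 * reclaimable_mb / total_mb), "% reclaimable")
```

So only about 31% of the RAM is genuinely pinned and roughly 69% is reclaimable, even though Free reads 9 MB; the Free counter on its own tells you almost nothing.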
Not necessarily. The OS cache should return memory to applications when they need it. On my Linux box I can easily make it use all the RAM (2 GB) by listing directories all over the disk (I have a huge MP3 collection). I don’t have any swap, and yet I can still start any application I want after the RAM is almost full. Try it on your Vista machine and your experience should be similar.
I guess it depends which application you are going to open. If the application is cached then swap should not be touched. But if you open a new app for the first time then some of the in-memory cache has to be put into swap in order to make room for the new app.
I think, perhaps, you misunderstand what the cache actually does. It stores data that’s already present on the disk, such as recently accessed files, or data that’s about to be written to the disc. It doesn’t need to swap any of this out or stick it in a swapfile. Ever.
“I guess it depends which application you are going to open. If the application is cached then swap should not be touched. But if you open a new app for the first time then some of the in-memory cache has to be put into swap in order to make room for the new app.”
I thought I’d clear this up a little bit… Cache is just a collection of redundant data that is not _needed_ for anything; it lies in memory just in case something happens to need it. In that regard, if some app wants to, for example, read a file that is already in the cache, the system doesn’t need to access the disk at all. This boosts system performance quite a lot.
But, as the data in cache memory is redundant and is there for the “just in case” situations, it can all just be thrown away and discarded if you are launching a new app or some pre-existing app suddenly needs more memory. The data in cache will not be written to swap; it’s thrown away completely. Just don’t mix this cache up with the filesystem and I/O read and write cache; that is a different thing.
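That behaviour can be sketched with a toy model (all names hypothetical and my own; a real kernel reclaims page by page and writes back dirty data first, but the principle that clean cache is discarded rather than swapped holds):

```python
class ToyMemory:
    """Toy RAM model: app pages are pinned; clean cache pages are
    redundant copies of on-disk data and can simply be dropped."""

    def __init__(self, total_pages):
        self.total = total_pages
        self.app = 0    # pages held by running applications
        self.cache = 0  # clean, disposable cache pages

    def read_files(self, pages):
        # The cache grows to fill whatever RAM is otherwise idle.
        self.cache = min(self.cache + pages, self.total - self.app)

    def launch_app(self, pages):
        if pages > self.total - self.app:
            raise MemoryError("would need swap (or the OOM killer)")
        shortfall = pages - (self.total - self.app - self.cache)
        if shortfall > 0:
            self.cache -= shortfall  # cache is thrown away, not swapped
        self.app += pages

ram = ToyMemory(total_pages=100)
ram.read_files(90)   # cache balloons: 90 of 100 pages now hold file data
ram.launch_app(50)   # no swap needed; the cache just shrinks
print(ram.app, ram.cache)  # 50 50
```

Note that launching the app never touches swap in this model: the cache silently shrinks from 90 pages to 50 to make room.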
So someone please explain: when is it EVER a good idea to prioritize the OS’s own cache over the USER’s applications?! It should do as Linux does (I believe) and simply reduce the size of the cache to accommodate user apps. The cache is simply a luxury for when you have free RAM. No RAM? No cache. It’s simple, and it’s the reason why many people (including myself) find Linux 100 times more responsive than Windows under normal daily usage.