“As we saw in part 1 of this series, large applications and games under Windows are getting incredibly close to hitting the 2GB barrier, the amount of virtual address space a traditional Win32 (32-bit) application can access. Once applications begin to hit these barriers, many of them will start acting up and/or crashing in unpredictable ways which makes resolving the problem even harder. Furthermore, as we saw in part 2, games are consuming greater amounts of address space under Windows Vista than Windows XP. This makes Vista less suitable for use with games when at the same time it will be the version of Windows that will see the computing industry through the transition to 64-bit operating systems becoming the new standard. Microsoft knew about the problem, but up until now we were unable to get further details on what was going on and why. As of today that has changed.”
Large applications and games under Windows are getting incredibly close to hitting the 2GB barrier, the amount of virtual address space a traditional Win32 (32-bit) application can access.
On Linux we at least have PAE kernels, with which you can access up to 4GB of RAM and more on 32-bit.
On Linux we at least have PAE kernels, with which you can access up to 4GB of RAM and more on 32-bit.
Windows also addresses a full 4GB per 32-bit process; it just reserves half of that virtual address space for the kernel. There’s a boot parameter you can pass to the NT kernel to shrink the kernel’s share to 1GB, but that’s a bit hacky.
Oh, and Windows supports PAE too.
http://www.microsoft.com/whdc/system/platform/server/PAE/PAEdrv.msp…
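For anyone who wants to try it, this is roughly what that switch looks like. I’m going from memory, so treat the exact entry and the EXE name as placeholders and check your own boot.ini / BCD first:

    boot.ini on XP / Server 2003, append /3GB to the OS entry:
        multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows XP" /fastdetect /3GB

    Vista equivalent, from an elevated command prompt:
        bcdedit /set IncreaseUserVa 3072

    The application only sees the extra space if its EXE is flagged large-address-aware:
        editbin /LARGEADDRESSAWARE game.exe

Without that last flag the process still gets only 2GB, which is why the switch alone rarely helps off-the-shelf games.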
Oh, and Windows supports PAE too.
That I didn’t know.
I hope trying to push their 64-bit Vista systems comes back and bites them on the bum… only time will tell.
2GB should be enough for everyone!
Seriously though, how can an app or a game actually try to use up to or more than 2GB? I can understand trying to cache as many textures, video clips, or sounds in memory as possible to speed things up, but that doesn’t mean it needs that much memory. Current-gen consoles are far from 2GB and they manage to run great games… well, if they don’t overheat and melt, of course…
But then consoles are single-task machines (excluding the latest, and even then I think the UI goes on standby while the game plays), versus a computer that’s juggling multiple tasks.
“the 2GB barrier, the amount of virtual address space a traditional Win32 (32-bit) application can access”
Each app has a 2GB limit, not the OS itself. Although I will admit I don’t know XP’s memory limit, I’d expect it to be higher, or artificially limited to avoid competing with the Windows Server series…
I’m pretty sure it is the standard 32-bit 4GB limit, and you have to go to one of the 2003 server versions to get more through tricks like PAE.
Each app is limited to its own 2GB of virtual address space on pretty much every 32-bit OS out now, which notably does not equal the amount of memory actually being used. You can tweak some to get more, of course, but that often leads to broken drivers on Windows.
Well, in that case, my bad.
Try to use ZBrush.
I think I’ve said enough with that one sentence
That’s not really your typical Windows app either…
I’ve never worked on games, but I have in many industries where we hit this limit (medical imaging was the biggest one). It’s not something more efficient coding can help with…the datasets are just that big.
Define ‘need that much memory’. The amount of memory needed is a function of what you want to do. I want as smooth an experience as possible. The more caching you do, the better, so give me more memory
Don’t worry. The more memory developers have available, the more memory they will use. So it all works out
You’re talking about highly specialized domains. The kind of stuff that should be using more reliable systems than Windows anyways unless you want to give a whole new meaning to the blue screen of death…
The problem here is that the right hand doesn’t know what the left hand is up to. Graphics memory is effectively a shared resource. But until Aero and various media accelerators came along, this chunk of memory was predominantly dedicated to a single running application.
Therefore, previous versions of Windows required the application to manage graphics memory directly. That involves reserving virtual address space as a sort of swap area for graphics memory. Some applications maintain a mirror of their graphics memory in this space. Others save their graphics memory when they get a signal from DirectX that they are about to yield the graphics device.
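To make that concrete, here is a rough Direct3D 9 sketch of where the application-side copy comes from. This is just my own illustration (the CreateSceneTextures helper and the ‘dev’ device are made up, and error handling is omitted), not anything from the article or the hotfix itself:

    #include <d3d9.h>

    // 'dev' is assumed to be a device the game has already created.
    void CreateSceneTextures(IDirect3DDevice9* dev)
    {
        // MANAGED pool: D3D itself keeps a system-memory copy so it can
        // restore video memory after a device loss. That copy lives inside
        // the process's 2GB of virtual address space, on top of whatever
        // sits in video memory.
        IDirect3DTexture9* managedTex = NULL;
        dev->CreateTexture(2048, 2048, 1, 0, D3DFMT_A8R8G8B8,
                           D3DPOOL_MANAGED, &managedTex, NULL);

        // DEFAULT pool: lives only in video memory. An app that wants the
        // contents to survive a device reset has to keep and re-upload its
        // own copy, which is the "mirror" described above.
        IDirect3DTexture9* targetTex = NULL;
        dev->CreateTexture(2048, 2048, 1, D3DUSAGE_RENDERTARGET,
                           D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &targetTex, NULL);
    }

Either way, a chunk of the process’s own address space ends up shadowing video memory.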
In Vista, the WDDM virtualizes graphics memory, allowing applications to think that they have full ownership of graphics memory regardless of whatever other applications are sharing it. Unsurprisingly, this involves reserving virtual address space in each process according to its graphics memory usage. Only this time, Vista takes care of saving and restoring graphics memory as it transfers ownership of the graphics device.
The trouble is that the applications are still supporting the old graphics memory model. So both the application and Vista are reserving address space in the process according to its graphics memory usage: twice as much address space is being reserved as is actually needed. With 512MB and 1GB graphics cards, it’s extremely easy to reach the 2GB limit just because of this miscommunication.
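Rough back-of-the-envelope numbers (mine, not the article’s): say a game actively touches about 700MB of a 1GB card. Then:

    application-side mirror of video memory:  ~700MB of address space
    WDDM reservation for the same surfaces:   ~700MB of address space
    total:                                    ~1.4GB of the 2GB budget

and that is before the game’s own code, heaps and memory-mapped files are even counted.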
The hotfix modifies WDDM such that only certain graphics allocations (“lockable” ones) result in reserved address space. The applications apparently still reserve address space for no reason. But this workaround reduces address space consumption on the WDDM side enough so that current games don’t hit the 2GB limit. Hopefully game developers will update their graphics memory management code to eliminate these unnecessary allocations when running on Vista.
But ultimately, the 64-bit transition has been complete for some time in the hardware space. Software and driver developers are seriously dragging their feet, and users are paying the price. The 64-bit and Vista transitions occurred in rapid succession, and the results demonstrate a troubling inability for the Windows ecosystem to respond to change.
That’s what this fix is all about. Microsoft made what can only be seen as an improvement to graphics memory management, and third parties have not responded appropriately, even when their products began failing. NVIDIA was unable to cope with the change, and neither were game developers. We don’t realize how stagnant and complacent the Windows platform really is until something changes.
One could argue that, due to the ugly nature of the operating system and its APIs, things are made twice as difficult as they need to be.
Case in point, look at Nvidia on Linux, FreeBSD and Solaris – a smooth 32-bit to 64-bit transition. 32-bit and 64-bit applications sitting side by side in perfect harmony.
What it speaks to is less ‘lazy third-party developers’ than the poor design Microsoft baked into Windows NT, compounded further by the failure to make Windows Vista the ‘big break’ that actually fixed those structural issues.
Basically, the more games strive for photorealistic graphics, lifelike physics and AI, the more processing power and memory they will require, and they will use everything you can give them.
A few years ago, 512MB was enough for most people, then it was 1GB, now it is 2GB, tomorrow it will be 4GB, and five years from now, people will wonder how on earth people did without 8GB or 16GB, yet will be ridiculing people calling for systems with 32GB of RAM or more. 10 – 15 years from now, people might be talking about terabytes of RAM, not gigabytes.
It is called progress, and it tends to happen whether you like it or not.
As far as my personal computing needs go, nothing less than a 64 bit quad core system with 4GB of RAM will do, and my software will swallow every byte of RAM I feed it.
Now, granted, for most business computer needs, a 500MHz PIII with 512MB RAM and a lightweight OS with a browser, email client and office suite would do the job, but people want to be able to do clever things with photo collections that require large amounts of memory and address space, and faster processors don’t hurt either. People’s expectations of what computers can do have shifted as hardware becomes more powerful.
My mobile phone has more memory and processing power than my computer did 10 years ago, and current next-gen consoles pale in comparison to modern computers. You can’t run games like next-gen MMORPGs on a console; the poor thing would shit itself – even games like Oblivion will bring the most powerful computer you can currently buy to its knees if you turn all of the quality settings up. The Xbox 360 version is a shadow of the game on a decent PC.
I wish MS had only released Vista as a 64-bit OS; it would have made it easier for developers and hardware manufacturers to support, and those of us who have a use for 64-bit operating systems wouldn’t have to hunt for compatible software and hardware so much. I guess that is what you get for being an early adopter, though.
Yeah, Microsoft went version crazy. The way they released Vista, they clearly intended it for new hardware only, not for upgrades. They should have just required a 64-bit processor.
Especially since nearly every processor released in the last few years has supported the 64-bit instruction set…
PAE is a *physical* address space extension and doesn’t help much with the virtual memory space…
I usually try to avoid being a Linux zealot and fanatic, but, bloody hell, see the computing industry through the transition? And the fact that you’ve got “Replica” written down the side of your gun…
It’s the same old story: Vista uses up more virtual address space, and you end up having to buy a new card with more onboard RAM. So you do need more RAM, CPU and video memory for Vista. For what?
It’s supposedly a future OS that takes advantage of future hardware. Well, guess what: XP is faster on new hardware now. I could have told you that back in 2001 when XP was released.
Back in 2001, everyone was complaining about the colossal amount of resources XP used (almost double that of 2k). A lot of generic hardware stopped working due to the shift from 9x to NT, and people HATED the Teletubbies theme. (I remember talking about it with a professor, who swore nothing that looked like THAT would ever touch his computer.) Most hardcore Windows users DESPISED it, and adoption took almost 4 whole years before it became standard.
It seems like people have a really short memory nowadays. XP was DOA, just like every other version of Windows since 95, and with pretty much all the same reasons being given.
Actually, that isn’t my experience. I, and people that I know, switched to XP as early as possible and found speed improvements over 2000.
Well, 2k ran in about 200MB of RAM, XP ran in about 400. Vista almost doubled the amount of computer sales that XP drove in its opening week. That discussion with the prof wasn’t the only one like it I had; I switched a few weeks before it actually went gold (got my hands on a leaked copy), and spent a lot of time explaining how you can easily go back to the old-school grey theme. A friend of mine never got her scanner working with XP, and dual booted for quite a while with ME just for that. Several friends had webcams that stopped working, although it wasn’t such a big deal, as they were the junky $20 kind anyway. Most serious gamers I know stayed on 98 SE for a few years, as the 9x video drivers were FAR superior to the NT ones for quite a while. And if you look at the stats, for 4 years after the XP launch, 98 was the most used version of Windows.
XP was a flop on launch by any standards; Vista has actually been doing a lot better in terms of sales figures.
Heh. I found that with 256MB of RAM, XP was less resource hungry than 2000 (especially after trimming the services), and wasn’t affected by the general decay that 9x/ME suffered.
I can’t say; I had 500 megs at the time.
I can’t for the life of me remember what the technology was called, but XP introduced something to combat the fact that a significant number of apps would overwrite system DLLs to add functionality. It greatly improved stability, but instead of getting more crashy over time (like 9x), I found it got way slower. After a few months on XP, a wipe/reinstall and your system would seem to be flying.
google_ninja wrote:
–“Back in 2001, everyone was complaining about the colossal amount of resources XP used (almost double that of 2k).”
I can’t remember those arguments at all. XP uses ~20MB more RAM than my Win2k Pro freshly installed. I remember people thinking the default GUI was like something ugly from Fisher-Price (can’t disagree there), and also that people were disappointed that although XP wasn’t slower than Win2k, it wasn’t noticeably faster either, apart from booting and some small speed improvements in the memory management functions.
I don’t know where you got the ‘almost double that of 2k’ from; it sounds like something you made up. Vista, on the other hand, REALLY is a memory hog in comparison to previous versions of Windows, and it’s also slower than XP.
http://www.tomshardware.com/2007/01/29/xp-vs-vista/
edit:
google_ninja wrote:
“Well, 2k ran in about 200MB of RAM, XP ran in about 400.”
Where do you get these numbers from? They are grossly inflated.
Remember a couple of things here:
First, when XP was released Windows 2000 was only at SP2, which used far less memory than SP4 does.
Second, Windows 2000 uses around 64-160MB of memory on a fresh install. I am not talking “active memory”, I’m talking about total memory usage, page file included.
Now, when I do a fresh install of Windows XP SP2 or Windows 2003, I’m lucky to have total memory usage be less than 400MB. No, really. That’s still over twice as much. That’s over four times as much in many cases.
Now, if I strip most of the useless crud out of Windows 2000, I can make it run on a computer with 16MB of memory. It will use a total of 48MB of memory, all used pages accounted for.
Meanwhile removing every possible thing from XP that doesn’t kill program/hardware compatibility still leaves the system using around 160MB of address space. Sacrificing several functions like network browsing, printer support, etc, you can get it down to around 128MB, but it’s a pain.
And it’s not just turning off services to get it that low; you have to actually manually rip those parts of Windows out of the operating system so that they can’t be used any longer.
Best case scenario, Windows XP uses 2.5x the memory of Windows 2000. On an average computer it will be 3-4x the memory usage. Meanwhile, I haven’t seen anything that only works in XP. Yes, some people make “XP only” games, but you can change that using Orca, and the programs, surprisingly enough, still work just fine in Windows 2000.
I actually have quite a few things that work in 2000 but won’t run in XP…
The problem that people have with Vista is that it tends to leave crap in active memory that XP would have paged out. It really only uses about 1.5x the overall memory of a full XP install; they just have bad memory management, so it looks worse. (Note: I haven’t used Vista from release onward, I’m going by RC1 usage.)
google_ninja wrote:
“Well, 2k ran in about 200MB of RAM, XP ran in about 400.”
Where do you get these numbers from? They are grossly inflated.
Most likely he got them from, I dunno, the little spot in the task manager that tells you the memory usage. RAM usage and memory usage are two completely different things.
Check out your Task Manager, under Performance: Total Commit Charge is what you are looking for.
Last time I installed 2000, after tweaking, it was at 80MB usage just sitting there (drivers, etc., all set up). I just checked out my XP box; it registers about 412MB usage… just sitting there.
Vista, if memory serves, used around 700-800MB… which is a far smaller jump… 175-200% usage, vs 515%…
How nice of you to compare 2000 to XP (not 98 to 2000, which had more trouble in the beginning).
Windows 2000 could only be compared to Windows NT 4.0. Win2k was a business OS never meant for the home market; it was a replacement for NT 4.0. I know Microsoft wanted Win2k to work for both markets, however I think mainly DirectX and some other issues stopped this. For the majority of home users the upgrade path went Win98/ME to WinXP.
Personally I think Microsoft’s last great desktop OS was Win2k: well built, very fast. WinXP I never really thought that much of, as they could have done so much more with the Win2k baseline.
I really do think that Microsoft should have bitten the bullet and released Vista as x64 only, or failing that should have put much more emphasis on x64 than on the standard edition. I know that consumers don’t really need x64, however I think it’s a push worth starting now. When the next release of Windows is upon us, x64 will be really important, and starting the transition then will be a lot harder.
Nail on the head there
Vista actually will go way past 800, basically filling as much RAM as you have after enough time because of the aggressive caching. Basically, it analyzes what you use and when, and then starts loading it before you do. However, that memory will all be freed as soon as other apps require it. The first time I looked in the Task Manager and saw a gig and a half being used I almost flipped, though.
steven wrote:
–” Check out your Task Manager, under Performance: Total Commit Charge is what you are looking for.
Last time I installed 2000, after tweaking, it was at 80MB usage just sitting there (drivers, etc., all set up). I just checked out my XP box; it registers about 412MB usage… just sitting there. “
Of course you look at Commit Charge -> Total, and your numbers don’t add up. In order to see what the system actually uses, you check right after you have installed the operating system, when it is only running the system and its default services. I’d like to know what version of XP uses 400MB out of the box, freshly installed?
And I’m NOT talking about tweaked systems where you shut down a lot of unnecessary services.
Likewise, checking your system after having it running for a while is pointless, since programs load DLLs into memory which won’t be flushed until the memory is needed, and many of these DLLs/programs will also start additional services which further use up resources.
Some fluctuations exist due to differences in the memory usage of the hardware drivers installed with the system, but in no way would they translate from ~120-130MB to ~400MB.
I got the numbers from my experience at the time; Steven went into far more detail than I could in the other reply to this post.
As for the benchmarks, anything from January is pretty much useless now for Vista benchmarks. In fact, anything older than last week is pretty outdated, considering that MS just released patches which improve memory management, disk performance by about 140%, and graphical performance.
I agree with some of what you said. It took a long time for XP to be adopted (though tech mags like Maximum PC were pushing it). I remember many stores had no idea what XP was on the date (or even week) of the release. There were a lot of issues with compatibility in both hardware and software.
XP did have higher system requirements than 2k. You could run Windows 2k on about a 133MHz machine with at least 64MB of memory; it could run on less. XP, however, required at least 64MB (again, you could use less, but it was more of a pain). Worse, XP required at least a 266MHz Pentium (though I don’t believe it was a II).
A realistic everyday system running Windows 2000 was between 350MHz and 700MHz with about 128MB of memory. Windows XP really needed at least 1GHz to run well, with between 128MB and 256MB of memory.
It is also important to stress that Win2k was released in ’99 and XP in late ’01. Vista, however, was released in November of ’06. That is five years and about a month difference. The hardware requirements to run a system are going to change.
I do take issue with XP being despised. The real reason was that most Win98 users had weaker hardware than even Win2k users and saw no need to upgrade. They did not see a difference in kernels. This began to change in early 2002, though it did not really take off until 2004. A real issue with many XP users may be that they only recently purchased it. I know of countless individuals who were hardcore Win2k users and only within the last year or so moved over to XP.
I think the difference this time is that the more elite users are not upgrading to Vista like they did to XP. This is mainly because the benefits are not as significant for gaming (other than DirectX 10) and multimedia users (other than the new API additions). XP, on the other hand, was a move to a more stable, modern and efficient operating system. Windows Vista does have a lot to offer, but not everyone wants it. Though you are right that in the future select technologies will push people to the platform.
“A realistic everyday system running Windows 2000 was between 350MHz and 700MHz with about 128MB of memory. Windows XP really needed at least 1GHz to run well, with between 128MB and 256MB of memory.”
XP never needed a 1GHz processor to run well. I used to do VB6 and VS.NET 2002 development on a 533MHz K6+ with 256MB, usually with ICQ, MSN Messenger, and 5 or 6 browser windows open. If I was doing web development, I usually had Photoshop running too. It ran perfectly fine. When I upgraded that machine to 512MB, it was a great machine; I had it for 5 years. I don’t know what you were running on that machine, but perhaps it was your apps eating resources, and not XP.
I can’t say that I ever tried to do the whole stripped-down thing, as I had an OK computer at the time. But I remember 2k sitting at around 100-150MB, and XP sitting at about 400MB, give or take, both on default installs.
Not sure if you take issue with my description of the situation (I more or less said the same thing) or with how it was despised back in the day.
I took issue with it too, as I found the stability and responsiveness of XP to be well worth the hit in compatibility and performance. What I am pointing out is that Windows users have a long history of crucifying the new Windows release, and then slowly migrating to it over a long period of time. I have been using the platform since the 98 days (before that I was on Mac Classic), so I can’t talk about anything before that, but release after release I see the same thing, and yet it seems like there is this localized amnesia where nobody else recognizes it.
Exactly. I will tell you this, though: I have had UI issues with Windows ever since I began using it (when I became a programmer), and many of those frustrations are gone with Vista. MS seems to be on a crusade recently to revamp all of its user interfaces, which I find to be a good thing, as they have been the whipping boys in virtually every usability paper I have ever read. Big things like the removal of menu bars are going to alienate long-time users of the platform, and the unfamiliarity, combined with the overall bugginess of the gold release (most of which was fixed last week, but that is too soon to change overall opinions), is what got the usual hatred to such unprecedented levels. I was pointing out that people hated XP when it came out, but not even I will claim that it comes close to the vitriolic anger that Vista seems to be provoking.
That’s extremely odd. I’ve got an old computer right now still running XP, and at boot it only uses 288MB (according to task manager commit charge). If anything, that number should be higher than a default install because it’s got several services running on it like vmware, sqlserver, and a virus scanner. I don’t think it is a case of extensive optimizations over the years either, because I distinctly remember that it seemed to run OK back then with only 256MB of memory. Not great, obviously you couldn’t have 50 tabs open in Firefox like I have right now, but if all you had was an IE window or two and an editor open or something it would run decently enough.
I immediately loved XP, but then I was using Windows Mistake Edition before that. If I had been on 2000 I probably wouldn’t have seen much reason to upgrade either.
That’s basically been my experience with XP. It would run decently enough on 256MB of RAM, but would run like crap on 128MB. Seriously, on the exact same machine running Ubuntu and Windows XP, it would take probably 10 minutes for OpenOffice to load on XP and about 2 minutes for Ubuntu to load it up. Once you upgrade the RAM past 128MB, they both run pretty decently.
Memory management in Windows has always sucked. Why is it that it hits the page file before using up the physical RAM? Why oh why?
Physical RAM being a hell of a lot faster, I always see on my system that the page file has been hit and yet I still have 2.3GB of free RAM (I have a total of 3GB).
Under Linux, I never even hit the swap space anymore. In fact, I currently have 12 tabs open in Firefox, plus Synaptic, all of GNOME, Compiz, Pidgin, etc., and I’m still only using 24.6% of physical RAM and 0.0% of swap. This isn’t bad at all. Under Windows XP, I usually have about 500MB of page file used just loading up the desktop with no apps (except of course the anti-virus, firewall, messenger, etc.).
You can request it from here:
https://support.microsoft.com/contactus2/emailcontact.aspx?scid=sw;e…
The hotfix KB number is 940105, in case you didn’t bother to RTFA.
cheers
anyweb
Yes, this appears to dramatically decrease VM utilization, but all that the author can say about the hotfix is that it doesn’t decrease fps – well Einstein, it also doesn’t appear to increase performance either.
So why does the author highly recommend that everyone run out and apply this hotfix? All of the games tested still had approx 200MB of headroom before they hit the 2GB barrier without the hotfix, and the hotfix doesn’t improve performance one iota. Huh… I think I can wait for SP1.
The comparisons in part 3 were just to previous results without the patch, they weren’t trying to max anything out.
Take a look at part 2:
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3044&p=2
Company of Heroes goes over 2GB just loading the single-player Cherbourg mission if you have a certain video card, and presumably all of these are worse if you have SLI or Crossfire since the problem in Vista got worse the more video memory you threw at it. Several other games would easily go over 2GB if you gave them a busy multiplayer game.
So if you aren’t getting any mysterious crashes, then just wait for SP1. But if you’re doing something and the app suddenly crashes back to the desktop without any warning, you might want to look to see if this is the problem and fix it if necessary.
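If you suspect that’s what is happening, check address space rather than RAM. A minimal sketch of my own (not the article’s method) that you could drop into a test build of the process:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        MEMORYSTATUSEX ms;
        ms.dwLength = sizeof(ms);
        GlobalMemoryStatusEx(&ms);

        // ullTotalVirtual is the user-mode address space for this process
        // (2GB normally, more if the EXE is large-address-aware on a /3GB
        // or 64-bit system); ullAvailVirtual is how much is still unreserved.
        printf("virtual address space: %lu MB total, %lu MB free\n",
               (unsigned long)(ms.ullTotalVirtual >> 20),
               (unsigned long)(ms.ullAvailVirtual >> 20));
        return 0;
    }

From outside the process, the Virtual Size column in Process Explorer shows roughly the same thing, if memory serves.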
There is little point saying ‘Win2k used 256 MB’, ‘XP used 512 MB’…
Free memory is wasted memory. Who wants waste? I’d rather the OS/applications make use of my free memory through caching or preallocating memory than waste it. Of course, it should deallocate that if a new application needs it.
This is why judging memory usage by what you see in the windows task manager is not the most useful thing to do. It can serve as a guide, but it is more than likely to be misinterpreted.
Rather, the effectiveness of memory usage should be judged by how many applications you can run, how snappy it feels, how large a dataset you can load…
Free memory is wasted memory. Who wants waste? I’d rather the OS/applications make use of my free memory through caching or preallocating memory than waste it. Of course, it should deallocate that if a new application needs it.
I wouldn’t; I’d always like some free memory to be available to the system for when it’s required, without it having to scan for stuff to free up. That results in poorer performance… All systems I have seen with high scan rates, freeing pages like there’s no tomorrow, run like dogs… Free memory is good…
That’s not how virtual memory works… at least not on Windows. The superfetch pages are backed by read-only data, so there’s no cost to “freeing” the pages: they are just filled with the new on-disk data and the old data vanishes just like the 0’s would have if the page was unused. And you don’t have to scan for a page to free: the superfetch pages are kept on linked lists, just like free pages, and they are treated exactly like their free counterparts.
The only place where there could be an issue is servicing a user application’s demand to materialize a new page. The OS must give a zeroed page to prevent information disclosure between different processes, but there may not be a large number of pre-zeroed pages. On the other hand, the Windows Memory Manager auto-tunes the number of zeroed pages to leave around and this isn’t really a problem in practice.
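You can actually see the zero-fill guarantee from user mode, for what it’s worth. A trivial sketch of my own:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        // Committed pages are handed out demand-zeroed: the first touch is
        // guaranteed to read 0, never another process's leftover data.
        unsigned char* p = (unsigned char*)VirtualAlloc(
            NULL, 4096, MEM_RESERVE | MEM_COMMIT, PAGE_READWRITE);
        if (p) {
            printf("first byte of a freshly committed page: %u\n", p[0]);
            VirtualFree(p, 0, MEM_RELEASE);
        }
        return 0;
    }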
So in Windows all free memory is allocated to caching/SuperFetch? I guess if it is, and there is no hit when freeing a page to make room for more, then the VM manager under Windows is excellent…
My understanding in Unix land (Solaris) is that caching will be done using memory, and when it hits the limit (lotsfree?) of what needs to stay free in order to service the system normally, the system has to scan and then page to disk, inducing a performance hit… Sure, some caching/scanning and paging is normal, but too much and you will incur a penalty… Maybe I should go read some Windows docs to get some contrast…
You never need to write a page to disk if it’s read-only… x86 processors (and surely SPARCs too) have a bit indicating whether or not the page is dirty, so the memory manager can know if it needs to write the page to disk. Finding memory to dedicate to new pages is pretty easy: Windows, and surely other OSes, maintain lists of pages that haven’t been used recently (the way it’s done on Windows is called second-chance FIFO… it’s an approximation of the “clock algorithm” that you could look up). Pulling elements off these lists isn’t that expensive. You don’t really have to do extensive scanning to decide which page to drop.
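For the curious, the clock / second-chance idea boils down to something like this toy sketch (my own, and nothing like the real Windows or Solaris code):

    #include <cstddef>
    #include <vector>

    // Each frame has a "referenced" bit the hardware sets on access; the
    // clock hand clears it and only evicts a frame whose bit is already clear.
    struct Frame { int page; bool referenced; };

    int pickVictim(std::vector<Frame>& frames, std::size_t& hand)
    {
        for (;;) {
            Frame& f = frames[hand];
            hand = (hand + 1) % frames.size();
            if (!f.referenced)
                return f.page;      // bit already clear: evict this page
            f.referenced = false;   // otherwise give it a second chance
        }
    }

Worst case the hand sweeps the whole list once, which is still cheap compared to writing anything to disk.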
You are correct regarding Solaris as well; not all pages are paged to disk, apologies if it seemed I was implying that. With the VMM on Solaris, pages unused for a term are added to the freelist: if a page is being used, a flag is set for a time, and if it’s not referenced after that time it can be added to the freelist, ready to be used again. If a page has been modified and we are short of memory (desfree or minfree), then it’s paged to disk, and this is the step that has a cost. I suppose my original beef with the comment was that I object to all memory being used for cache; it must have some cost to performance if a lot of scanning occurs and we have to look for pages to free and then put them on disk, especially on a busy system with multiple users. For a single-user system this will be less of an issue… I will look into the Windows way of doing things as well, it sounds a bit interesting. Cheers
This is why judging memory usage by what you see in the windows task manager is not the most useful thing to do. It can serve as a guide, but it is more than likely to be misinterpreted.
There’s nothing wrong with using the Task Manager to determine how much memory is in use. You just have to be smarter than the screen you are looking at, so that you have some idea what it means.
If you check under “Physical Memory”, the place where it lists your physical memory size… you know, where you would think you are supposed to look… that usage number means nothing. It’s a waste of space.
Commit Charge, however, tells you exactly what is going on.
XP says “Total Commit Charge 412MB”… that means that 412MB of memory are being used by “stuff”, not caches.
Meanwhile, my stupid “Physical Memory” thing, the one on the top right, says “System Cache usage 728MB”… which means there’s obviously a lot of caching going on for something… but the cached files don’t show up in the commit charge, only memory that’s doing something shows up there.
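If you’d rather read those two numbers programmatically than eyeball Task Manager, psapi exposes them. A quick sketch of my own (the values come back in pages):

    #include <windows.h>
    #include <psapi.h>    // link with psapi.lib
    #include <stdio.h>

    int main(void)
    {
        PERFORMANCE_INFORMATION pi;
        if (GetPerformanceInfo(&pi, sizeof(pi))) {
            // CommitTotal is Task Manager's "Total Commit Charge";
            // SystemCache is the file cache, which is not part of it.
            printf("commit charge: %lu MB\n",
                   (unsigned long)(pi.CommitTotal * (pi.PageSize / 1024) / 1024));
            printf("system cache : %lu MB\n",
                   (unsigned long)(pi.SystemCache * (pi.PageSize / 1024) / 1024));
        }
        return 0;
    }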
Now, yes, arguing over “Active memory usage” is pointless, since there are so many possible caching methods, Linux/BSD/Windows all do it completely differently, resulting in wildly different usage patterns for exactly the same thing.
Arguing over the total commit charge in various versions of Windows is justifiable, however. If that increases, it is not caching; it is plain and simple program bloat.
Yes, Free Memory may be wasted memory, from a caching perspective, but program bloat is wasted in a far worse way. You can’t ever retrieve that waste.