Why don't you take a single-processor CPU with 1 gigabyte of RAM, download a few thousand emails into Eudora (which works its mail through an Inbox that is in memory), open a few dozen Firefox tabs and a few other browsers, keep a couple of dozen programs open, and then report back. If on your system you only have a well-behaved game or graphics or web-dev program .. what difference does it make?
I used to run with an AMD64 3500+ (2.2GHz) and 1 gig of RAM - before that, I had a P4 2.53GHz w/512 meg RAM (later upgraded to 1 gig, iirc). I've put those systems under a lot of stress, lots of uptime, and never ever would a program like CleanMem have helped anything. A thing that did matter for the 512 meg system was specifying sane pagefile sizes, so the system didn't have to expand the pagefile... that was a very costly operation. I don't think I've run XP with less than 512 meg of RAM, but I used Win2000 comfortably with 256 meg - and still no "memory optimizers".
What surprises me in your approach is how you don't even address the timing issue. For Windows XP to do "stuff" (and likely the wrong stuff) late .. after your keystroke creates a crisis .. is doofus memory management. XP should be prepared for the next need, with CPU and memory attuned and ready to go. This idea that you wait a long time while XP tries to clear out space is simply an operating-system weakness. And one that CleanMem helps address.
Why should process working sets be trimmed before it's necessary? I'd be frustrated by the possibly unnecessary disk I/O this would cause.
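For reference, the "trimming" under discussion is a single documented Win32 call - to my understanding, this is what utilities like CleanMem wrap. A minimal sketch via ctypes; the non-Windows no-op fallback is my own addition for illustration:

```python
# Sketch of working-set trimming on Windows, via ctypes.
# Assumption: this is the call "memory optimizer" tools wrap; the
# no-op branch for other platforms is added just so the sketch runs anywhere.
import ctypes
import sys

def trim_working_set():
    """Ask the OS to page out as much of this process as it can.

    Passing (SIZE_T)-1 for both the minimum and maximum working-set
    sizes tells the Windows kernel to trim the working set. The pages
    aren't freed for good: they must be faulted back in (from the
    pagefile or their backing file) the next time they're touched,
    which is exactly the disk I/O cost being debated here.
    """
    if sys.platform != "win32":
        return False  # the concept is Windows-specific; do nothing elsewhere
    kernel32 = ctypes.windll.kernel32
    handle = kernel32.GetCurrentProcess()
    neg_one = ctypes.c_size_t(-1).value  # (SIZE_T)-1
    return bool(kernel32.SetProcessWorkingSetSize(handle, neg_one, neg_one))

trimmed = trim_working_set()
```

Note that the call only evicts pages from RAM - it doesn't release any allocations, which is why the savings shown in Task Manager are largely cosmetic.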
There is an irony in your mentioning Visual Studio as the major memory/CPU consumer on your earlier system. I would assume that VS uses the .Net function that encouraged CleanMem, which is largely ignored elsewhere - thus keeping a light footprint.
What is the ".Net function that encouraged CleanMem"? If you mean garbage collection, then that isn't anything at all like process working-set trimming. I mention VS since it's a relatively resource-heavy program (sitting at 112MB private bytes with a relatively small solution open). Eclipse (Java) sits at 130MB with my schoolstuff workspace open. Those are two of the heavier often-in-use applications I keep running... Firefox is, by far, the biggest sinner - it's not unusual for it to sit at 500-800 megabytes private bytes if I haven't restarted the browser all day.
And you say you disabled the pagefile and ran with 1 gigabyte. I am not sure how that works; I read a bit about that way of running and decided against it - I think I remember warnings that it would not work well, if at all. Perhaps you have different ideas to share. Clearly, the moment you disable the pagefile you have a radically different system, making any comparison one of apples and kumquats.
Back when I had 1GB in my system, there'd be an occasional hiccup (read: application crash) if I tried to run recent games without the PF enabled, but as a whole things worked well (this was before FF). With 2GB I never ran into problems, and in my current system I have 8GB - no pagefile, permanent ramdisk running, and everything flies.
Yes, a disabled pagefile does make a difference - namely, that I won't suffer disk-write I/O. My laptop has 2GB and a pagefile, though - I haven't bothered profiling memory usage to see whether it's safe to disable or not.
Note specifically the point about a lot of memory being released that does not go to the pagefile - from Ian Griffith, beginning "I'm unconvinced by the points regarding the way Windows pages out applications that are idle." It seems that this bears directly on the issues involved with CleanMem as well.
What he's referring to is page discard, and that (as he says) only happens with pages that aren't dirty (i.e., haven't been written to). This is a relatively tiny amount of memory for most programs, compared to the writable data allocated. Discarding does mean that you don't need to write pages to the pagefile, but it isn't free - once the code/data is needed again, it will be re-read from disk. And disk is slow compared to memory.
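You can see the mechanism from user space - a sketch assuming a Linux-like system (Python 3.8+ for `mmap.madvise`), with `MADV_DONTNEED` standing in for the kernel dropping clean, file-backed pages under memory pressure:

```python
# Sketch of page discard: clean file-backed pages can be dropped without
# touching the pagefile, because they can always be re-read from the file.
# Assumes a Linux-like OS; the file name and contents are made up.
import mmap
import os
import tempfile

# Create a small file so we have clean (never-written) file-backed pages,
# analogous to a program's read-only code/data sections.
fd, path = tempfile.mkstemp()
os.write(fd, b"read-only code/data" * 1024)
os.close(fd)

with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    first = bytes(mm[:8])   # faults the pages into memory
    # Discard the resident pages. Nothing is written to swap/pagefile:
    # the pages are clean, so the kernel just drops them.
    if hasattr(mmap, "MADV_DONTNEED"):
        mm.madvise(mmap.MADV_DONTNEED)
    again = bytes(mm[:8])   # touching them again costs a re-read from disk
    mm.close()
os.unlink(path)
```

The data comes back intact - only the I/O is repeated, which is exactly the hidden cost of discarding.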
Keep in mind that read-only pages are sharable across processes. If you launch two instances of Firefox, physical memory will only be allocated for one set of the read-only pages. The same thing happens with shared DLL files (as long as they can be mapped at their preferred base address and don't need relocating).
I'd like to run programs/apps without having to restart them for any reason. I shouldn't have to. XP's management does not work for every situation or computer.
Complain to the programmers who write buggy, memory-leaking software.
But you have to restart firefox sometimes, with or without cleanmem or any other help.
Yes, obviously cleanmem (or anything else) won't help about programs that have memory leaks - the only thing that works is restarting the program.
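To illustrate why: memory that is still referenced is, from the allocator's point of view, in use - no amount of trimming or garbage collection can reclaim it. A hypothetical leaky cache in Python (all names invented for illustration):

```python
# A hypothetical "leaky" program: entries are cached but never evicted,
# so memory grows without bound until the process is restarted.
import gc

leaky_cache = {}

def handle_request(i):
    # Bug: the result is kept forever; nothing ever removes old entries.
    leaky_cache[i] = bytes(1024)

for i in range(10_000):
    handle_request(i)

# A garbage collector (or a working-set trimmer) cannot free this memory:
# every entry is still reachable through leaky_cache, so it counts as live.
gc.collect()
```

Trimming the working set would only push those pages out to disk temporarily - the leak itself survives until the program exits.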
Cleaning temp files periodically as well as clearing the cache helps with mem and cpu.
How does cleaning temp files help wrt. cpu/memory?
Agreed. Management that performs, preemptively and casually in the background, a helpful function that XP does only very late in the day - in the moment of crisis, after slowdowns, while you are waiting - and not fully.
It trims the working set when minimizing windows, if the application hasn't overridden the default behavior.
I'm not a fan of trimming the working set before it's necessary, since it's a pessimization for the trimmed process(es). An exception would be a program or service that's going to sit idle for long periods of time; it can make sense for it to trim its working set after initial startup is finished... but the largest effect is pleasing people obsessing over Task Manager memory stats.