Author Topic: Windows XP Myths  (Read 32503 times)
zridling
« on: July 16, 2006, 02:00:48 AM »

For those of us not switching to Vista the minute it hits the shelves, take a couple of minutes to skim XP Myths. Nice tips, and well worth knowing before applying random tweaks to XP found around the web.

http://mywebpages.comcast.net/SupportCD/XPMyths.html


Sample:

/Prefetch:1 Switch Tweak

MYTH - "Adding the /Prefetch:1 Switch to the startup path of a program's shortcut will decrease the program's startup time."

REALITY - All it does is change your hash number - the OS is doing exactly the same thing it did before, just saving the prefetch pages to a different file. It does not improve performance in any way. Ryan Myers of Microsoft's Windows Client Performance Team writes:

"The /prefetch:# flag is looked at by the OS when we create the process - however, it has one (and only one) purpose. We add the passed number to the hash. Why? WMP is a multipurpose application and may do many different things. The DLLs and code that it touches will be very different when playing a WMV than when playing a DVD, or when ripping a CD, or when listening to a Shoutcast stream, or any of the other things that WMP can do. If we only had one hash for WMP, then the prefetch would only be correct for one such use. Having incorrect prefetch data would not be a fatal error - it'd just load pages into memory that'd never get used, and then get swapped back out to disk as soon as possible. Still, it's counterproductive. By specifying a /prefetch:# flag with a different number for each 'mode' that WMP can do, each mode gets its own separate hash file, and thus we properly prefetch. (This behavior isn't specific to WMP - it does the same for any app.)

"This flag is looked at when we create the first thread in the process, but it is not removed by CreateProcess from the command line, so any app that chokes on unrecognized command-line parameters will not work with it. This is why so many people notice that Kazaa and other apps crash or otherwise refuse to start when it's added. Of course, WMP knows that it may be there, and just silently ignores its existence. I suspect that the 'add /prefetch:1 to make rocket go now' urban legend will never die, though."
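To make the "separate hash file per mode" idea concrete, here's a toy Python sketch. This is illustrative only: the real Windows prefetcher uses its own internal path hash (not SHA-1), and the paths and names below are made up.

```python
import hashlib

def prefetch_filename(exe_path, prefetch_flag=0):
    """Derive a per-mode prefetch scenario file name by mixing the
    /prefetch:# value into the hash - a toy model of what the quote
    describes, not the actual Windows algorithm."""
    data = f"{exe_path.upper()}|{prefetch_flag}".encode()
    digest = hashlib.sha1(data).hexdigest()[:8].upper()
    exe_name = exe_path.rsplit("\\", 1)[-1].upper()
    return f"{exe_name}-{digest}.pf"

# Same program, different /prefetch:# values -> different scenario
# files, but the OS does the same amount of prefetching work either way.
play_wmv = prefetch_filename(r"C:\Program Files\WMP\wmplayer.exe", 1)
rip_cd   = prefetch_filename(r"C:\Program Files\WMP\wmplayer.exe", 5)
```

The point the quote makes falls out directly: changing the number changes which file the prefetch data lands in, nothing more.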
- zaine (on Google+)

mouser
« Reply #1 on: July 16, 2006, 02:51:23 AM »

cool find.
nudone
« Reply #2 on: July 16, 2006, 02:59:29 AM »

fantastic. myth busting should be part of the everyday news reports - might reduce the stupidity of the planet. great to see a retort to all the needless xp tweaks (i think i can hear the cry of someone already declaring "tweak **** made my system quicker, so there" ).
jgpaiva
« Reply #3 on: July 16, 2006, 06:20:19 AM »

I especially like this one:

Myth - "The Windows Platform has more Security Vulnerabilities than the Linux/Unix Platform"

Reality - "Between January 2005 and December 2005 there were 5198 reported vulnerabilities: 812 Windows operating system vulnerabilities; 2328 Unix/Linux operating vulnerabilities; and 2058 Multiple operating system vulnerabilities" - Source
Josh
« Reply #4 on: July 16, 2006, 09:37:48 AM »

I especially like this one:

Myth - "The Windows Platform has more Security Vulnerabilities than the Linux/Unix Platform"

Reality - "Between January 2005 and December 2005 there were 5198 reported vulnerabilities: 812 Windows operating system vulnerabilities; 2328 Unix/Linux operating vulnerabilities; and 2058 Multiple operating system vulnerabilities" - Source

I love that. People always assume that just because Windows vulnerabilities get more publicity (because it is a much more widely used platform), there must be more of them for Windows. This is not the case. It's just that we never hear "the Linux kernel had a stack overflow or DDoS vulnerability today," because the Linux platform is not in the majority of use. I subscribe to the Secunia, SecurityFocus, Panda Oxygen, and several other mailing lists, and I get to see just how many vulnerabilities come across the pipe. You'd be surprised how many more exploits exist for Linux and its included software than for Windows.
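For what it's worth, the quoted numbers are at least internally consistent; a quick arithmetic check:

```python
# Reported vulnerabilities, Jan-Dec 2005, per the quoted source
windows    = 812
unix_linux = 2328
multiple   = 2058   # vulnerabilities affecting more than one OS family
total = windows + unix_linux + multiple   # the quoted 5198

# Shares of the reported total (note the "multiple OS" bucket keeps
# either single-platform share well under half)
windows_share    = windows / total
unix_linux_share = unix_linux / total
```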
Strength in Knowledge

app103
« Reply #5 on: July 16, 2006, 11:26:49 AM »

It's a good list, for the most part, but I think he's a bit clueless when it comes to blocking malware with a hosts file.

Quote from: Hosts File
Special AntiSpyware Hosts Files attempt to associate a known safe, numeric address with the names of sites you want to block. When the user or any process on the PC then tries to access a blocked site, it is instead directed to the safe location. This works as long as the site's numeric IP address never changes. But IP addresses do change and they're supposed to be able to. The Web operates via "dynamic" naming, where a human friendly name (www.google.com) is actually an alias for the real address, which is numeric. The numeric address can and will change from time to time as a site or server is moved or reconfigured. 

You are supposed to use the localhost IP of 127.0.0.1 as the safe location. I have never known that to change when some other site changes their IP. It doesn't point google.com to Google's IP. It points BadMalwareSite.com to your own pc, where you are not likely to pick up a malware infection from.
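For reference, blocking entries of this kind look like the following in %SystemRoot%\system32\drivers\etc\hosts (the blocked domains below are hypothetical examples):

```text
# Comment lines start with '#'
127.0.0.1       localhost
# Blocking entries point unwanted domains back at your own machine:
127.0.0.1       BadMalwareSite.com
127.0.0.1       www.BadMalwareSite.com
```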

Quote
The Hosts entry will permanently point them to a dead location!

That's the whole point to it! That's how it works!

Quote
People with out-of-date addresses hardwired into their Hosts File will no longer be able to connect to any site whose numeric address has changed.

How can localhost be out-of-date? It doesn't change. And how can pointing the domain name of a bad site to yourself block a good site with a different domain name that wouldn't be in your hosts file to begin with?

Quote
It's almost impossible to update a Hosts file frequently enough to guard against all threats and even if you did, you'd probably also run into problems in accidentally blocking good sites that happened to move to new numeric addresses.

He is only partially right there...you can't add entries fast enough to block all malware, nor can you ever know all of the possible ones you should block.

But since you are only redirecting the bad ones to yourself, the good ones are not affected by an IP change....they were never in your hosts file to begin with.

Quote
When cleaning Malware/Spyware from a PC, it is much easier to check a clean Hosts File then one filled with thousands of lines of addresses.

How hard is it to open the Hosts file in Word (or a small free proggie like my AlphaSort) and alphabetize the lines?

All the malware entries will be the lines beginning with a different IP than 127.0.0.1 ...and they will either rise to the top, listed after the #comment lines, or drop to the bottom, when you alphabetize the whole list.
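The alphabetize-and-inspect trick described above can be sketched in a few lines of Python (a sketch only; the sample entries are made up):

```python
def suspicious_hosts_lines(hosts_text):
    """Sort a hosts file's lines and flag any entry whose address is
    not 127.0.0.1 - after sorting, comments group at the top and
    foreign-IP lines (possible malware additions) group together."""
    lines = sorted(l.strip() for l in hosts_text.splitlines() if l.strip())
    flagged = []
    for line in lines:
        if line.startswith("#"):
            continue  # comment lines, skip them
        ip = line.split()[0]
        if ip != "127.0.0.1":
            flagged.append(line)
    return flagged

sample = """# my hosts file
127.0.0.1 localhost
127.0.0.1 badsite.example
66.66.66.66 update.antivirus.example
"""
flagged = suspicious_hosts_lines(sample)  # -> the single foreign-IP line
```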

Quote
Notes - There is a much better solution for bad site blocking using SpywareBlaster which more intelligently use's Internet Explorer's built-in Zone Security settings and the registry.

That only works for IE and IE-based browsers. Even though those end up being the cause/victim of spyware most of the time, it is theoretically possible to get an infection while using Firefox, Opera, or something else... and sooner or later you will start hearing of it happening.

ActiveX isn't the only way malware gets onto a PC through a browser...Java & Flash are also exploitable paths to your PC.

Using a hosts file to block the same domains that would be entered into your registry by SpywareBlaster will accomplish the same thing that software does...only it will protect all users of any browser or any software on that pc. The domains will be unreachable with anything you could possibly run...not just IE.

And the InformationWeek article he references has nothing to do with using the hosts file for prevention of malware. It was referring to using the hosts file for speeding up your connection by including the IP's of sites you visit often.

There is one thing about the hosts file he didn't mention... and it's a case where his SpywareBlaster solution would fail miserably too: scripts that reference an IP directly and don't use a domain name at all.

You can't redirect an IP to yourself with a hosts file... only a domain name can be redirected.
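This limitation is easy to demonstrate: hostname resolution (and therefore the hosts file) is only consulted for names, never for raw IP literals. A small Python sketch using the standard ipaddress module:

```python
import ipaddress

def hosts_file_can_block(host):
    """Return True if 'host' is a name that a hosts-file entry could
    redirect, False if it is a raw IP literal - which skips name
    resolution (and the hosts file) entirely."""
    try:
        ipaddress.ip_address(host)
        return False  # parsed as an IP: the hosts file never sees it
    except ValueError:
        return True   # a name: resolution consults the hosts file
```

So a script hard-coded to connect to, say, 198.51.100.7 sails straight past any hosts-file blocklist.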

And if you start adding IP's to your security zones, you will eventually end up in a similar hell to one he was warning you about, where websites you want to use end up not working right because their IP's may have changed to ones you added. And finding the IP in your registry that is the cause of a problem is tougher than you could imagine when you have a whole bunch in there. You would have to remove them all and add them back 1 at a time till you discovered the one that breaks the good site.

In a case such as this, I would add IP's to my firewall if I wanted to block them. And if a good site is somehow blocked, it would be easy to figure out which IP to remove from your list by checking the firewall log and see what was just blocked when trying to access the good site...that's the one that needs to be removed.

so in summary...

the hosts file is used for blocking domains you want no contact with, ever

firewall to block ip's you want no contact with, ever

and zones for sites & ip's you want contact with, but you want them to be broken.
Rover
« Reply #6 on: July 16, 2006, 12:35:46 PM »

Myth - "The Windows Platform has more Security Vulnerabilities than the Linux/Unix Platform"

Reality - "Between January 2005 and December 2005 there were 5198 reported vulnerabilities: 812 Windows operating system vulnerabilities; 2328 Unix/Linux operating vulnerabilities; and 2058 Multiple operating system vulnerabilities" - Source
I hate when people quote stats like this.  (The author, not here on DC.)
Two points:
1. Those are Unix AND Linux OS vulnerabilities combined.  So I claim that Windows and Mac together have more than OS/2.  What the hell does that mean?
2. What the hell does "Unix/Linux vulnerabilities" mean?  All of the software that runs on *nix included?  Just the kernel?  What?

That's like saying IE has fewer vulnerabilities than FF.  On what OS?!  FF supports several; IE only one.  Is that any kind of real comparison?

99% of all stats are made up on the fly....

Sorry for the rant.
Insert Brilliant Sig line here

mukestar
« Reply #7 on: July 16, 2006, 08:17:17 PM »

Myth - "The Windows Platform has more Security Vulnerabilities than the Linux/Unix Platform"

Reality - "Between January 2005 and December 2005 there were 5198 reported vulnerabilities: 812 Windows operating system vulnerabilities; 2328 Unix/Linux operating vulnerabilities; and 2058 Multiple operating system vulnerabilities" - Source
I hate when people quote stats like this.  (The author, not here on DC.)
Two points:
1. Those are Unix AND Linux OS vulnerabilities combined.  So I claim that Windows and Mac together have more than OS/2.  What the hell does that mean?
2. What the hell does "Unix/Linux vulnerabilities" mean?  All of the software that runs on *nix included?  Just the kernel?  What?

That's like saying IE has fewer vulnerabilities than FF.  On what OS?!  FF supports several; IE only one.  Is that any kind of real comparison?

99% of all stats are made up on the fly....

Sorry for the rant.

Hear, hear. It's a very ambiguous statement (System V, BSD, Solaris, HP-UX, AIX, Mac OS X, Linux...).

I think the key thing missing from that myth is "exploit": a vulnerability needs to be exploited. Windows may have had fewer vulnerabilities than every other *nix variant under the sun, but they're a damn sight easier to exploit.

But hey, I'm an MS user; I like to be kept on my toes.
zridling
« Reply #8 on: July 16, 2006, 11:05:07 PM »

[tangent]: mukestar, great avatar dude!
app103
« Reply #9 on: July 17, 2006, 05:41:12 AM »

I think the key thing missing from that myth is "exploit": a vulnerability needs to be exploited. Windows may have had fewer vulnerabilities than every other *nix variant under the sun, but they're a damn sight easier to exploit.

But hey, I'm an MS user; I like to be kept on my toes.

The more popular an operating system is, the more that the exploiters will want to exploit it.

Windows is the most popular, so most of the effort to find & exploit vulnerabilities is focused on it. Linux is less popular, so not as much effort is put into exploiting it.

If you want to be even 'safer', then get yourself an OS that hardly anybody uses. The exploiters won't bother with it because it's not worth the trouble...too rare to be much fun.

Amiga OS4 could have a million+ vulnerabilities, but since hardly anybody uses it, they haven't been discovered, nor exploited.

Creating a web page that will do some nasty thing to a visitor isn't very practical if you are targeting something that may never see that page, like Amiga OS4. You could wait years before someone running OS4 with Ibrowse shows up (if ever).

It's more likely you will catch more and do the most damage if you target IE on Windows, which is the most popular combination.

That's why Windows isn't as 'safe' as Linux...and why running Amiga OS4 is 'safer' than both.
JavaJones
« Reply #10 on: July 17, 2006, 11:50:23 PM »

Not a bad little article. I learned a bit, anyway. Definitely some issues, as pointed out by others, but worth a read nonetheless.

- Oshyan
The New Adventures of Oshyan Greene - A life in pictures...

chrisa52
« Reply #11 on: July 19, 2006, 11:35:35 AM »

Just wanted to pass along a useful link from mvps.org, Hosts File FAQ: http://www.mvps.org/winhelp2002/hostsfaq.htm
It's Later Than You Think.

Carol Haynes
« Reply #12 on: July 19, 2006, 12:50:48 PM »

I have to say I have used the HOSTS file from mvps.org and it broke quite a lot of legitimate websites as well as the annoying ones. In the end I had to remove it and I now manage my own.
f0dder
« Reply #13 on: July 19, 2006, 02:14:12 PM »

Humm, that article isn't entirely correct.

For instance...

Disable the Pagefile
They claim that there's no performance benefit from disabling the paging file... well, that's not true. Windows tends to page out to disk in a lot of situations where it doesn't really make sense (partially because of badly designed usermode programs, though). So if you have plenty of RAM (at least a gig; two preferred), you can avoid some needless paging by turning off the paging file. It's not a big improvement in most situations, but it does help a little.

Of course if you run anything from Adobe, forget about turning off your paging file... Adobe apps are badly designed monsters that don't have decent memory management. And of course, some games have extreme memory requirements as well (funny enough a relatively simple game like PainKiller uses enough memory that a gig of ram + no paging file crashes on some levels).

Also note that this only works for XP; Windows 2000 and below require at least a minimal (~20 MB) paging file, and will create one at boot time if you've disabled it.
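For reference, XP keeps the paging-file configuration in the registry value sketched below (normally changed via System Properties > Advanced > Performance Settings > Advanced > Virtual Memory rather than by hand); an empty PagingFiles value means no paging file. A .reg-style sketch, not something to apply blindly:

```text
; Paging-file configuration lives here as a REG_MULTI_SZ value, e.g.
; "C:\pagefile.sys 1536 1536" (file, initial MB, maximum MB).
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management]
"PagingFiles"=hex(7):00,00   ; empty multi-string = paging file disabled
```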

The article also seems to confuse 80386 protected mode "virtual memory"/"paging" with the process of paging in/out from disk - just because 80386 paging is enabled doesn't mean you have to page (or swap) to disk.

Large System Cache Tweak
This tweak can be nice on Desktop machines, not just servers, depending on the way you use your system. For me this is a VERY nice tweak. It means that, for instance, when I've used nLite to create a slipstreamed install CD, once the files are prepared and I go to the ISO creation stage, almost all files will be in the filesystem cache, so almost all reads will go from memory instead of drive... lots faster.

But "the changed pages occupy memory that might otherwise be used by applications" is a moot point, since the filesystem cache is dropped whenever applications request memory and there isn't enough free. And "on workstations this increases paging and causes longer delays whenever you start a new app" is plainly wrong: there's actually a better chance of your .exe and .dll files being in memory (which means shorter loading times) if you have LargeSystemCache=1.

HOWEVER, never use LargeSystemCache=1 on machines with ATI graphics cards and drivers. ATI drivers are nasty... see this post.
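For reference, this tweak is the registry value behind the "System cache" option under System Properties > Advanced > Performance Settings > Advanced > Memory usage; a .reg-style fragment:

```text
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management]
"LargeSystemCache"=dword:00000001   ; 1 = favor the filesystem cache
```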
- carpe noctem

nudone
« Reply #14 on: July 19, 2006, 02:35:52 PM »

just when i thought it was time to stop believing in the myths. oh well, back to being confused and unsure of what to believe - just as well i can't really be bothered with all this tinkering - my pc appears to be working at the moment so i'll leave it well alone.
mrainey
« Reply #15 on: July 19, 2006, 02:38:04 PM »

Quote
my pc appears to be working at the moment so i'll leave it well alone.


Good advice.
Software For Metalworking
http://closetolerancesoftware.com

f0dder
« Reply #16 on: July 19, 2006, 03:07:06 PM »

Quote
just when i thought it was time to stop believing in the myths. oh well, back to being confused and unsure of what to believe - just as well i can't really be bothered with all this tinkering - my pc appears to be working at the moment so i'll leave it well alone.

Well, those were the only two items I really disagreed with - and leaving it alone if it works is probably a good idea. I've built up my collection of tweaks over several years and merged them into my unattended XP setup CD. Otherwise I probably couldn't be bothered (but would swear at XP often).
Gothi[c]
« Reply #17 on: July 19, 2006, 05:05:41 PM »

Rover makes a good point: they're probably comparing Windows OS vulnerabilities with Linux-plus-every-app-that-runs-on-it vulnerabilities, and that's just wrong.

And the only reason there are more reported vulnerabilities for Linux applications in the first place is that they are easier to spot, since it's all open source. On Linux they usually get spotted, reported, and fixed quite fast, which adds to the security.
On Windows they aren't usually spotted all that easily, unless you want to dig through a bunch of assembly and do random penetration tests. And when they are spotted, you're at the mercy of the original developers to wait for a fix, while on Linux you can apply a patch to the original source code and recompile.

That comparison is so wrong for so many reasons, and I probably haven't covered even half of it :p
f0dder
« Reply #18 on: July 20, 2006, 05:18:04 AM »

Quote
And the only reason why there are more reported vulnerabilities for linux applications in the first place, is because they are easier to spot, since it's all open source.
That's not necessarily true... first of all, there are (private) tools for finding exploits, and apparently some of them are pretty efficient. But even without such a tool and without source, it's not necessarily hard to find an exploit. I was chatting with a grey-hat friend of mine while he looked for holes in either AIM or Yahoo chat (can't remember which). It took him between 30 and 60 minutes to find a 0-day exploit.

Quote
On linux they usually get spotted, reported, and fixed quite fast, which adds to the security.
How long did the chunked-encoding exploit exist in Apache before it was discovered? (Discovered by full-disclosure people, anyway.)
Robert Carnegie
« Reply #19 on: July 20, 2006, 05:43:31 AM »

I'm a committed sceptic.  I don't trust people who tell me I can speed up my computer by some mysterious fiddle that Microsoft apparently forgot to do, or by buying a tune-up product, and I also don't trust Microsoft.  This document trusts Microsoft too much, I think.

Running as a limited user, specifically, is indispensable but not sufficient in a security recipe.  Someday something nasty is going to leap out of Internet Explorer at me.  I do encounter applications that don't work as limited user, and if I have to then I'll run them on my desktop as Administrator sessions.  Apparently this will be easier in Vista.  But, developers, I want to hear how you justify demanding full control of my PC.  I don't have full control of my PC.  I don't really understand what a Registry is.  And it's -my- PC.  So why should -you- get control of it??  

It's like you're a guest in my home and you want the keys to the safe and the gun locker...

Specific root-only applications that I use include the Fitaly on-screen keyboard (I think it doesn't address the registry in a proper multi-user way - and its market is too small to demand a fix), and the software for the Hauppauge DEC-2000T PC-DTV receiver, which is broken in a lot of other ways (video doesn't work on my Tablet PC; sound randomly cuts out or plays half an hour late [!?]; timed recording consists of using Task Scheduler to open and close the application, each time rebooting the hardware twice).

I use "ExplorerXP" as a file manager which I can run as Administrator to handle files for stoopid applications - amongst the things I can't see a way to run as Administrator from a limited desktop are Windows Explorer and Internet Explorer.
moerl
« Reply #20 on: October 09, 2006, 08:27:09 PM »

Well.. damn. Just when I thought we had a nice source of debunked XP performance-enhancement myths, you people come along and destroy that new-found trust.
The only thing I found there that applied to my system was that I had set my paging file to a different partition on the same drive. I changed that so the paging file is now simply on C:, with a system-managed size.
Carol Haynes
« Reply #21 on: October 10, 2006, 07:02:59 PM »

Quote
Disk Defragmenter  Partial

Myth - "The built-in Disk Defragmenter is good enough."

Reality - "This statement would be true if the built-in defragmenter was fast, automatic, and customizable. Unfortunately, the built-in defragmenter does not have any of these features. The built-in defragmenter takes many minutes to hours to run. It requires that you keep track of fragmentation levels, you determine when performance has gotten so bad you have to do something about it, and then you manually defragment each drive using the built-in defragmentation tool." - Source - Comparison Chart

Disk Defragmenter Limitations - "The Disk Defragmenter tool is based on the full retail version of Diskeeper by Executive Software International, Inc. The version that is included with Microsoft Windows 2000 and later provides limited functionality in maintaining disk performance by defragmenting volumes that use the FAT, the FAT32, or the NTFS file system. The XP version has the following limitations." - Source

- It can defragment only local volumes.
- It can defragment only one volume at a time.
- It cannot defragment one volume while scanning another.
- It cannot be easily scheduled without scripts or third party utilities
- It can run only one Microsoft Management Console (MMC) snap-in at a time.

Another error: the Windows built-in defragmenter can be scheduled easily with Windows' own Scheduled Tasks applet in Control Panel.
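As a sketch, scheduling the built-in defragmenter can also be done from the command line with schtasks.exe, which ships with XP (the task name and schedule below are examples; exact syntax details may vary between Windows versions):

```text
REM Run the built-in defragmenter on C: every Sunday at 3 AM
schtasks /create /tn "Weekly Defrag C" /tr "defrag.exe C:" /sc weekly /d SUN /st 03:00:00 /ru SYSTEM
```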
Mastertech
« Reply #22 on: October 10, 2006, 08:56:49 PM »

It's a good list, for the most part, but I think he's a bit clueless when it comes to blocking malware with a hosts file.
Not at all; it is not necessary to use Hosts files to stop malware. I support thousands of clients, ranging from home users to businesses, and I never use Hosts files and never have any problems.

You are supposed to use the localhost IP of 127.0.0.1 as the safe location. I have never known that to change when some other site changes their IP. It doesn't point google.com to Google's IP. It points BadMalwareSite.com to your own pc, where you are not likely to pick up a malware infection from.

That's the whole point to it! That's how it works!

How can localhost be out-of-date? It doesn't change. And how can pointing the domain name of a bad site to yourself block a good site with a different domain name that wouldn't be in your hosts file to begin with?

But since you are only redirecting the bad ones to yourself, the good ones are not affected by an IP change....they were never in your hosts file to begin with.
Not all Hosts files work this way. The ones you are referring to do not apply to these specific issues.

He is only partially right there...you can't add entries fast enough to block all malware, nor can you ever know all of the possible ones you should block.
That is not partially right that is RIGHT.

How hard is it to open the Hosts file in Word (or a small free proggie like my AlphaSort) and alphabetize the lines?
Huh? So alphabetizing something with ten thousand entries makes it easy to check? What is easy to check is NOT having a Hosts File filled with ten thousand entries.

All the malware entries will be the lines beginning with a different IP than 127.0.0.1 ...and they will either rise to the top, listed after the #comment lines, or drop to the bottom, when you alphabetize the whole list.
What if malware adds good sites? This is very common with malware that tries to stop you from cleaning it. Malware can do whatever it is programmed to do; it does not follow any rules. Malware can also simply delete entries just as easily as it can add them.

That only works for IE and IE based browsers, which even though they are the ones that end up being the cause/victim of spyware most of the time, it is theoretically possible to get an infection while using Firefox, Opera, or something else....and sooner or later you will start hearing of it happening.
It also works for Firefox and there are no known instances of Malware in Opera.

ActiveX isn't the only way malware gets onto a PC through a browser...Java & Flash are also exploitable paths to your PC.
This simply requires having the latest version of Java and Flash. That is a more intelligent solution than trying to block all sites that have one of these exploits, which is impossible.

There is one thing I have to say about a hosts file he didn't mention...and his SpywareBlaster solution would also fail miserably too. And that is in the case of scripts that reference an IP directly and not use a domain name at all.
Fail miserably? How so, when neither I nor any of my clients get infected? The key is to provide practical security advice, not ridiculous things like Hosts files, which are the equivalent of trying to kill all the ants in the world by stepping on them as fast as possible.

And if you start adding IP's to your security zones, you will eventually end up in a similar hell to one he was warning you about, where websites you want to use end up not working right because their IP's may have changed to ones you added. And finding the IP in your registry that is the cause of a problem is tougher than you could imagine when you have a whole bunch in there. You would have to remove them all and add them back 1 at a time till you discovered the one that breaks the good site.
It is domains, not IPs, that are added to the security zones.

Carol clearly points out what problems you can have:

Quote from: Carol
I have to say I have used the HOSTS file from mvps.org and it broke quite a lot of legitimate websites as well as the annoying ones. In the end I had to remove it and I now manage my own.
Mastertech
« Reply #23 on: October 10, 2006, 10:15:48 PM »

Humm, that article isn't entirely correct.
No, it is completely correct.

Disable the Pagefile
They claim that there's no performance benefit from disabling the paging file... well, that's not true. Windows tends to page out to disk in a lot of situations where it doesn't really make sense (partially because of badly designed usermode programs, though). So if you have plenty of RAM (at least a gig; two preferred), you can avoid some needless paging by turning off the paging file. It's not a big improvement in most situations, but it does help a little.
No, it doesn't. Windows only pages to disk when necessary. Turning off the pagefile only disables paging to disk; Windows simply creates a page file in RAM, which takes RAM away from your applications. The Windows Memory Manager is very efficient. I have yet to see any documented, reproducible proof on a clean install of XP confirming these claims.

Also note that this only works for XP, Windows 2000 and below require at least a minimal (~20meg) paging file, and will create one at boottime if you've disabled it.
Yeah, in RAM, and multitasking performance will suffer.

The article also seems to confuse 80386 protected mode "virtual memory"/"paging" with the process of paging in/out from disk - just because 80386 paging is enabled doesn't mean you have to page (or swap) to disk.
People confuse paging to disk with Virtual Memory, thus they think disabling paging to disk is disabling virtual memory.

Large System Cache Tweak
This tweak can be nice on Desktop machines, not just servers, depending on the way you use your system. For me this is a VERY nice tweak. It means that, for instance, when I've used nLite to create a slipstreamed install CD, once the files are prepared and I go to the ISO creation stage, almost all files will be in the filesystem cache, so almost all reads will go from memory instead of drive... lots faster.

But "the changed pages occupy memory that might otherwise be used by applications" is a moot point, since the filesystem cache is dropped whenever applications request memory and there isn't enough free. And "on workstations this increases paging and causes longer delays whenever you start a new app" is plainly wrong: there's actually a better chance of your .exe and .dll files being in memory (which means shorter loading times) if you have LargeSystemCache=1.
This is not wrong and is plainly stated by Microsoft. So you are essentially calling Microsoft liars? I don't think so: http://support.microsoft.com/kb/895932
tslim
« Reply #24 on: October 10, 2006, 10:32:51 PM »

so in summary...

the hosts file is used for blocking domains you want no contact with, ever

firewall to block ip's you want no contact with, ever

and zones for sites & ip's you want contact with, but you want them to be broken.

I think using the Hosts file to prevent threats is not a good idea. Here is how I feel about it, by analogy with avoiding car accidents:
Driving slower and more carefully is one of the good/right measures, but
"don't ever drive in these or those blacklisted areas at all" and "keep adding this and that area to the blacklist" is not.
To me, the trouble of avoiding threats the latter way is as great as cleaning up an infection... especially if I am the one who maintains the list...

I remember a case where a trojan/worm (I can't recall its name) infected a PC and wrote to the Hosts file to block most of the popular antivirus sites (Symantec was one of them) that provide a cure/cleaner. I was the one who did the cleaning; since then, I'd rather the Hosts file were permanently locked, with no changes allowed, ever.
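One partial mitigation along those lines is marking the Hosts file read-only (a sketch only; malware running with admin rights can simply clear the attribute again):

```text
REM Make the Hosts file read-only so casual tampering fails
attrib +R %SystemRoot%\system32\drivers\etc\hosts
```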
DonationCoder.com Forum | Powered by SMF