Hello:
There is no 'Pause' button on a manually-started scan, only 'Cancel'. Is that right?
How does a user pause a currently-executing scan? WRSA.exe is using 99% CPU. Attempts to lower the task priority from 'Normal' with Task Manager are blocked with "Access is denied". I was 25 hours into this scan, and was not happy to have to cancel it to recover the use of my workstation.
-- Roy Zider
Windows XP SP3
Latest version of WRSA (but no 'Help | About' tab to identify version)
Solved
No 'Pause' button on WRSA
Best answer by Kit
Hi!
Let's start at the beginning, and when we get to the end, we'll stop. :)
Item: No pause button
The lack of a Pause button is intentional. On an average, properly-running computer using default settings and deep scans, a scan takes two minutes or less. The use of custom or full scans is explicitly not recommended, and the agent does, in fact, advise you of this.
Item: CPU Use of 95%
See: Settings -> Basic Configuration -> "Operate background functions using fewer CPU resources" (applies to realtime operations) and Settings -> Scan Settings -> "Favor low CPU usage over fast scanning" for additional options to reduce CPU use. That said, the system is running Windows XP on an AMD Athlon XP 2000+, a single-core 1.66 GHz 32-bit processor originally released in January of 2002. That's processor technology over a decade old. The program will run on it, but like any application, it will take more resources and more time on substantially older, lower-end machines. Unfortunately, I can't offer a diagnosis of the CPU use during the scans, as the logs you submitted include only the main program logs, not the actual scan logs.
A deep scan cannot be run on an arbitrary location, mind you, so a deep scan of U: is not possible in the agent unless a file on U: is referenced in startup or auto-run locations. A quick examination of the logs indicates that the initial scan took 7m 42s to perform its deep scan, a second deep scan thereafter took 3m 28s, and a third 3m 26s. A Quick Scan finished in 3s thereafter, followed by a single scan covering C:, D:, E:, U:, and V: all at once.
This monolithic scan promptly began to detect the malware sitting dormant on pretty much every drive, and the moment it detects one thing, it goes into a more thorough mode, which slows the scan down as it begins extra investigation of files. Even without this extra investigation, a Full or Custom scan invariably reads every byte of every file scanned from the disk in order to generate the file's MD5 hash. Disk performance will impact this, of course, as will the age of the physical CPU. Combine that with extracting files from numerous archives and the impact is even greater: the agent extracts the archives to the temporary folder after the initial hashing, Diskeeper then evaluates the just-extracted files for fragmentation, and the extracted files are then hashed in turn.
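To make that hashing cost concrete, here is a minimal Python sketch of the general technique (an illustration only, not WRSA's actual code; the file path is a hypothetical example): producing a file's MD5 means streaming every byte of the file off the disk and through the CPU.

```python
# Generic illustration only, not WRSA's actual code: an MD5 of a file
# can only be produced by streaming every byte of it off the disk.
import hashlib

def md5_of_file(path, chunk_size=1 << 20):
    digest = hashlib.md5()
    with open(path, "rb") as f:
        # 1 MB chunks keep memory flat, but every byte still travels
        # from the disk through the CPU before the hash is known.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

print(md5_of_file("C:/Windows/notepad.exe"))  # hypothetical example path
```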
For even more fun, many of the files were double-archived. So you have the overhead of first extracting the .zip from the .rar, followed by extracting the .exe from the .zip. Older computers are not quite as fond of that. I'm even boggling at a case of a file inside a .rar inside a .zip inside a .rar. Bit overkill, maybe.
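As a rough illustration of why each layer of nesting multiplies the work, here is a small Python sketch that walks nested .zip archives. It is a generic sketch, not WRSA's logic, and real .rar handling would additionally need a third-party library such as rarfile; the archive name is hypothetical.

```python
# Generic sketch, not WRSA's logic: each archive layer forces another
# full extract-and-inspect pass. Real .rar support would need a
# third-party package such as 'rarfile'.
import os
import tempfile
import zipfile

def extract_nested(path, depth=0, max_depth=4):
    if depth > max_depth:  # guard against absurd or malicious nesting
        return
    print("  " * depth + "extracting " + os.path.basename(path))
    out_dir = tempfile.mkdtemp(prefix="scan_")
    with zipfile.ZipFile(path) as zf:
        zf.extractall(out_dir)
    for root, _dirs, files in os.walk(out_dir):
        for name in files:
            inner = os.path.join(root, name)
            # A scanner would hash/inspect each file here; if the file
            # is itself an archive, the whole cycle repeats a level down.
            if zipfile.is_zipfile(inner):
                extract_nested(inner, depth + 1, max_depth)

extract_nested("sample.zip")  # hypothetical archive name
```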
Regardless, nearly 800,000 files and a day later, the scan was cancelled. For scale, roughly 800,000 files in about 25 hours works out to under ten files per second, each one read and hashed. Yes, indeed: extracting that many files from nested archives and processing that sheer volume of data is going to take a long time and quite a bit of CPU on an older computer.
Item: The first scan, run on installation on 6/4/2012, was a deep scan, the first on the K7N system. It was aborted at 25 hours, only 43% done.
See above. The first scan run was a deep scan that took 7m 42s. The scan you are referencing was a custom or full scan, which the agent specifically recommends against unless you are, for example, a network administrator, or are explicitly attempting to inventory the system.
Item: Checking the integrity of a drive with CHKDSK is generally a bad idea, a rookie mistake. The first check to run is for readability of all files – using CDCheck. Ran CDCheck last night, read the drive OK:
Validating the media is only a small part (though the longest-running part) of what chkdsk does. CDCheck will indeed generally validate the readability of the media; however, it will not validate the integrity of the file system, which is also important. The second thing chkdsk does that CDCheck does not is communicate with the drive controller at a low level to detect recoverable errors that indicate health issues on the media. If the drive takes two seconds to read a sector but eventually teases the correct data out of it, the drive itself considers that read successful, and so does CDCheck. By comparison, chkdsk will flag that as a failing sector and move the data off it while it still can be read. (I'll refrain from pointing out the errors indicated in the CDCheck paste because they may not be related, whereas a chkdsk result set is standardized.)
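For what it's worth, chkdsk can be run read-only: without the /f or /r switches it only reports problems and changes nothing. Here is a minimal Python sketch of invoking it that way; the drive letter is just an example, and the exit-code interpretation (0 means no errors were found) is standard Windows chkdsk behavior, nothing Webroot-specific.

```python
# Windows-only sketch: run chkdsk read-only (no /f, no /r, so nothing
# is changed) and interpret its exit code; 0 means no errors found.
import subprocess

def readonly_chkdsk(drive):
    result = subprocess.run(["chkdsk", drive], capture_output=True, text=True)
    print(result.stdout)
    return result.returncode == 0

if __name__ == "__main__":
    clean = readonly_chkdsk("U:")  # example drive letter
    print("file system clean" if clean else "chkdsk reported problems")
```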
Item: False Positives
Without full scan logs to investigate, I can't comment accurately on this. I can only surmise that the command.com in question either sat in a non-standard location and really was a threat, or was hit by a file infector, of which a plethora were detected in the aborted 25-hour scan that I do have logs for.
Anyway, the primary issue I see in this case is use of the agent in a manner inconsistent with best practices and the agent's own advice. While you can run a full or custom scan, it generally shouldn't be run and isn't necessary in common consumer scenarios.
The recommendation for normal use is to leave it running deep scans (about 3-4 minutes in your case) and to use the settings described above to reduce CPU load on the older hardware.
The request for a pause button is still considered an extreme edge case, better solved with information.
I can't comment much on the alleged CPU usage without seeing scan logs from standard operation; however, given that it was extracting four-layer nested archives and hashing everything, as its operating parameters require, on a decade-old CPU that is highly sub-optimal for hashing, I would expect it. That is why the settings to reduce CPU usage in those cases are available.
I will address the Deep vs Full/Custom in the other thread.
The post you made was very long, so I apologize if I missed anything.