Is it possible to exclude certain folders from scanning?
Thanks!
There is a distinct difference though between "Disable Protection", which makes everything "Red! Warning! Ohnoes! Do not do! Bad user!" as a constant versus excluding a folder, which creates no warning or advisory at all in anything and certainly if it did have an ongoing "Ohnoes!" people would get annoyed. So no, you can't really legitimately compare "Sorry, we won't let you exclude folders" with "Sorry, we won't let you turn us off or uninstall us".
One might then counter with "Well, then by your logic power drivers should have the right to go 150 MPH on the freeway if they want to and it's bad for the police to not give us the choice to drive on the sidewalks if nobody is walking on them at the time."
Many AV products ignore the printer spool by default, and many locations that use print management software specifically tell the AV to ignore the printer spool. Would you believe how many infections Webroot catches in the printer spool directory? It's a pretty severe number, because malware authors know that AV is often designed or told to ignore that directory.
In all honesty, it really does come down to a balance. But in this case, the balance does go against allowing directory exclusion. Fewer people are negatively impacted by the lack of it than are positively impacted by it, and though the only thing you can do is "vote with your wallet" by not buying the software, the cost you impose on the company (loss of your purchase) is much smaller than the cost of allowing directory exclusions.
Main reason that others have directory exclusions:
@ wrote:
Why shall we convince Webroot about the reasonability of what some of us want, if what we want is available already by other developers? Most others have this feature - there must be a reason. What we think is reasonable has already been clearly rejected by Webroot as not reasonable. It's a clear case.
If malware gets even a tiny foothold past them, they lose. Thus, they must exclusively lock files while scanning them, otherwise the threat code can enter memory. Ergo, things that rely on uninhibited file access will have problems. Combine that with their inability to pick and choose what they scan or make intelligent decisions about what a file is and call off a scan of that file based on the first few hundred bytes, and they could lock down a large print job for quite some time.
So others having that feature is kind of like why a city had to make a law against shooting rabbits from the tops of streetcars when the streets were occupied by horses who would spook at the gunshots. Now that there isn't a plethora of horses walking down the street and pulling carriages and streetcars, the law seems silly.
Webroot doesn't need to exclusively lock files. Webroot can make decisions about things in a few hundred bytes. So it doesn't have to exclude directories for the reasons that others need to. The others have that feature because it is the easiest solution for their old and decrepit technology. Webroot doesn't have old and decrepit technology, so doesn't need to make it possible for the user to break something in that way and call it a feature.
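To make the contrast concrete, here is a minimal sketch (Python, purely illustrative; the path, peek size, and function name are invented, and this is not Webroot's actual code) of the "read the first few hundred bytes and call off the scan" approach, as opposed to holding an exclusive lock for the length of a full scan:

```python
# Minimal sketch of the "decide in a few hundred bytes" idea, not Webroot's real code.
import os

PE_MAGIC = b"MZ"  # every Windows PE executable starts with these two bytes

def worth_scanning(path, peek_size=512):
    """Open the file without an exclusive lock, peek at the first bytes,
    and bail out immediately if the content clearly isn't executable code."""
    try:
        with open(path, "rb") as f:      # shared read; other processes keep working
            header = f.read(peek_size)
    except OSError:
        return False                     # unreadable, nothing to scan
    return header.startswith(PE_MAGIC)   # non-PE data files are ignored outright

if __name__ == "__main__":
    folder = r"C:\PrintSpoolOrDataFolder"   # example folder only
    for name in os.listdir(folder):
        full = os.path.join(folder, name)
        if os.path.isfile(full) and worth_scanning(full):
            print("would hand off to the full scanner:", full)
```

A scanner that has to read and hash every byte of every file before releasing it has no choice but to get in the way of a big print job; one that can discard non-code in a single small read does not.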
As to FPs... I really wonder what the bajeebers you guys are pulling down or what your settings are at to get any. When I was still working for Webroot, I used a basic, standard install and tried desperately to find a true FP. Thousands of legitimate programs on my computer are still only seen by a few dozen Webroot users in the Enzo database. I have gotten no true FPs, though I have gotten apparent FPs that upon further examination were legitimately threats based on the direct actions they took that nobody bothered to pay attention to.
Honestly I did not read through the whole 13 pages.
However, the only thing I would like to add is that excluding a whole folder works great if one is using more than one security solution. Webroot SA was initially marketed as a companion security solution, so that is how I am using it.
I am using SA along with ESET AV and Online Armor Premium FW. Now I would like to exclude the ESET and OA folders from SA's scanning, the same as I excluded Webroot from the ESET and OA scans, etc.
I'm not necessarily looking to exclude folders, but realtime scanning is interfering with games that stream content from the HDD. I'm currently tinkering with Diablo 3, and disabling realtime scanning has a noticeable effect on in-game stuttering. Excluding the game folder would be a better solution than disabling realtime scanning altogether during gaming.
I tried to add the game executable to the allowed list but it's not helping. Maybe just hitting your trap is enough to cause the stutter, even if the trap determines the executable to be allowed.
Anyhow, I like your product for its speed and light weight. It would be nice if you could be gaming friendly also.
Hello eyeofac and Welcome to the Webroot Community Forums!
Can you right-click on the Webroot tray icon, click Save a Scan Log, and look to see how many [u] files related to Diablo 3 are in the log? Just an approximate count, and don't post from the log!
Also have a look at this thread: https://community.webroot.com/t5/Webroot-SecureAnywhere-Antivirus/Antivirus-Software-and-PC-Gaming/m-p/52976#M2559
Thanks,
TH
According to the log, it doesn't admit to doing much of anything while running Diablo 3. There are some vague notions of passive scans lasting a couple of minutes, but I don't think it's that.
I made a couple of Fraps runs and a graph.
It's not the kind of night-and-day difference I wanted to show, but you can see how the asset loads are swift spikes with the AV off, while with the AV on they break up more. The graph is framerate vertically and frame number horizontally.
Since the log doesn't reveal anything, it's difficult to say what's going on, but I have observed this behaviour twice tonight. As soon as the realtime scan is switched off, the game runs smoother.
We would love to be documenting all of this in the support system. Our engineers are particularly interested in investigating any potential conflicts or slow-downs involved with Webroot and PC gaming.
Have you already Opened a Support Ticket (I don't see a ticket under your registered email address)?
Some interesting information. I use my PC exclusively for gaming and I have completed D3 a number of times and didn't notice any performance issues. I'll have a look at Diablo when I get back home and see if I can find anything. In the meantime, if you can save some scan logs and post them here it would be great, or alternatively create a support ticket.
Let's say you wrote a program in FoxPro and built a 32-bit .exe. The program accesses data tables and fetches stuff from the web. You have to run this program at least nightly to keep everything up to date. More often than not the program is interrupted and crashes because of Webroot. It's not 100% of the time and it is not consistent... it just happens when it happens. So, you need to run this simple, non-intimidating little program but you can't unless... you turn Webroot off. Now, what is the bigger security issue: excluding the folder that I run this program in (and the program itself), or having Webroot turned off for at least 2 hours a day (if I don't forget to turn it back on)?
Path exclusions is a common request, especially among our Enterprise customers, and developers as well. You can add a feature request to the Ideas Exchange here: https://community.webroot.com/t5/Ideas-Exchange/idb-p/Ideas
I can send you a PM to get some information on your files and see what we can do from a whitelisting aspect on our end, although I personally agree that path exclusions would be ideal in this situation.
-Dan
I was under the impression that this feature was requested for years and that Webroot's stance was that they know better what's best for the user.
Is there a point to request it again?
In the pipeline: https://community.webroot.com/t5/Ideas-Exchange/WSA-all-versions-Exclusion-of-specific-files-folders-from-scans/idi-p/3300
Status: Coming Soon
This one is in the works and is waiting on QA testing currently.
Oh, that's great to hear. That would induce me to come back to Webroot.
How did I miss that one? (I may have mentioned the need for that feature a time or two around here.) This will make a lot of people happy.
-Dan
So what is the difference between path exclusion and folder exclusion?
New here, as I discovered today that Webroot SecureAnywhere AV causes several clients' main inventory system to crash. It is also the first time I have EVER encountered an antivirus product that does not offer FOLDER & FILE exclusions for REALTIME (i.e. CONSTANT/LIVE scanning).
It is my understanding that specific files may be excluded (some desirable ones would be c:\windows\system32\spoolsv.exe, for example...) by MD5 hash. Makes perfect sense, and IN FACT this sort of file-level exclusion is typically one of the first things done in enterprise. It is recommended to exclude the print spooler, and things like your SQL database, etc., for example. I like the idea of doing it by MD5 in case the file were not genuine or infected.
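For anyone unfamiliar with hash-based exclusions, the mechanics are roughly the following (a hedged sketch only; the placeholder hash and the exclusion list are invented and say nothing about how Webroot actually stores its overrides):

```python
# Rough sketch of an MD5-based exclusion check; the hash list is a placeholder.
import hashlib

def md5_of_file(path):
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Keyed by hash rather than by name: if the binary changes (update, tampering,
# infection), the hash changes and the exclusion silently stops applying.
EXCLUDED_HASHES = {
    "0123456789abcdef0123456789abcdef",   # placeholder hash for a known-good spoolsv.exe
}

def is_excluded(path):
    return md5_of_file(path) in EXCLUDED_HASHES

print(is_excluded(r"C:\Windows\System32\spoolsv.exe"))
```

That hash-keyed behaviour is exactly the safety property the poster likes: a replaced or infected copy of the file no longer matches the stored hash, so the exclusion does not carry over to it.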
That said, when you DO NOT exclude program folders that contain dynamic data, such as your accounting database locations, etc., you are almost always going to have (to say the least) SOME type of issue... I've seen where NOT excluding them caused only minor issues such as an inability to run certain reports, or LCK files being generated, things locking up, etc.
Of course it would be impossible to exclude such folders by file since the files are dynamic and growing/shrinking and/or changing all day long....
WE ARE NOT TALKING ABOUT EXCLUDING FOLDERS FROM SCHEDULED SCANS. ONLY meaning excluding certain folders from realtime/live/continuous scanning.
MANY SOFTWARE vendors will say something like "If your antivirus software allows folder exclusions, exclude this folder:..........if only file exclusions, then exclude the following LIST OF FILES <insert long list of individual files here>"....sadly this isn't possible when you are trying to exclude your DATA !!!! Whether it's a financial institution or a large corporation, I've never had this problem (being unable to exclude a folder full of mission-critical data from AV) before. I've been using 100% CORPORATE/ENTERPRISE AV products for decades and this is the first time I've been unable to do this.
That said, I do find Webroot to install very quickly, and I find it to have one of the lightest footprints of any AV product. I have over 100 machines using Webroot and only the server machines have had problems. These problems have been traced to the realtime scanning of folders as described above. I hope this gets implemented.
I saw posts where it sounded as if the OP didn't know the difference between realtime scanning and scheduled/manual/quick/full/etc scanning.
Hi pcrecovery
Welcome to the Community Forums.
A point well made but already considered and in hand.
For more information please see the following feature request that is currently in the Ideas Exchange:
https://community.webroot.com/t5/Ideas-Exchange/WSA-all-versions-Exclusion-of-specific-files-folders-from-scans/idi-p/3300
and feel free to support and/or add your comments...as that is what the Ideas Exchange is there for.
Regards
Baldrick
Baldrick,
I was excited when you said "in hand" and "already considered"... To tell you the truth, I'd be using it almost exclusively on all my servers as an additional LAYER of protection if this was implemented. Thanks for the link. From some of the posts, it seemed as if this was years old and never going to materialize.
So, I clicked the link and read thru...and at the end there was a suggested link to the topic in the biz section...followed it to the BUSINESS forum... and their FOLDER EXCLUSIONS topic...read the posts on page 1 from 2012 where they said "PAST the SUGGESTION stage...no date yet...." and then skipped to Page 4 (2014) only to see people still struggling with various software not working due to the simple feature not being implemented yet....status appears to be:
"You can also find the Business forum topic under TOP KUDOED...
Status: On Hold
Webroot is working hard on a number of improvements to Webroot SecureAnywhere Business Endpoint Protection, and although this idea has merit, other issues that apply to a larger percentage of our customer base are going to need to take priority in the near future. As such, we are going to need to move this idea to On Hold so we can focus our resources on other more-commonly requested features. While we will revisit this idea again in the future, this request will need to be deferred at this time."
Hi pcrecovery
No problem...I believe that if Webroot say that it will be implemented then it will be. Having said that, you may want to hang out over in the Business Forums and pursue the point over there. @ is our resident expert on the business side of things and may be able to advise further.
Regards
Baldrick
Hello pcrecovery,
Please Submit a support ticket and we will work with you to do what we can to help the issue until path/file exclusions are available.
Thanks,
-Dan
Thank you Dan. I'll have one of our lead techs do that tomorrow. We really like Webroot otherwise.
Even a year later, I'll still peek back to point out the following:
Data files are never scanned, not even realtime. Excluding data file folders would achieve nothing and would allow a virus a great place to hide, but it would not resolve the issues you are encountering. Also note that excluding the spooler by MD5 in Webroot is also Generally Useless, since it's not monitored to begin with anyway.
I will describe the details if requested; however, suffice it to say that with three states (Bad, Good, and Unknown), plus Monitoring, what you will want to be concerned with in the case of enterprise applications is situations where the applications in question are being monitored or are listed as unknown. If the applications undergo extremely frequent updates (weekly, daily, etc.), this can be annoying, since each new version is unknown at first; however, the TR team can remedy that.
For normal programs that are not updated often, a quick note to Threat Research with data from an affected computer will remedy it in short order. That's also only necessary if manually adding the items from the console is ineffective.
In summary, the cause of crashes is going to be a program, or a DLL it loads, being monitored due to being listed as Unknown. Technically this is caused, in all cases I have ever encountered, by what can only be called faulty programming in the application that crashes. However, that's only partially descriptive, since most programmers can't be expected to catch unlikely cases like race conditions created in tight timing situations that normally would be unaffected by anything external. There are also intentional cases of self-protecting code that objects to DLL injection as a whole.
Data file folders and the data files themselves don't need to be excluded at all though, since they were never scanned to begin with. Known [g]ood files (like the print spooler) also don't need to be overridden or excluded in Webroot either, since there is zero impact on them.
Kit, hello, this is the best news and explanation of folder scanning that I've seen posted... or maybe I'm understanding more now. Appreciate the time and effort with your assistance. This all makes more sense to me now. Thank You,
Best Regards,
How is it that WR is already excluding my clients' data files and apps? They are dynamic and ever-changing. Besides, many high-end apps are customized, often extensively, to cater to the particular client's needs.
I'd rather SCHEDULE a scan to scan this application folder 12 times per day than to have the "monitoring" or realtime scanner interfering with it.
In the biz forum, see here
there are plenty of people with my exact problem with no resolution. I am not making a big deal over some cheap app or insignificant finding. Some of the apps that Webroot causes issues with are 20k plus. In addition, no other product has EVER caused any such issues in the past decade that I've serviced these clients.
It sounds very nice that Webroot has already taken the liberty of intelligently programming their product to know the REAL print spooler process, as well as others, and to therefore not "monitor" it.
At any rate, for the past few years, whenever I suspect a virus, I will submit the file to jotti.org and virustotal.com. It never ceases to amaze me how not a single product will detect the virus....wait a day or so....then 1 or 2 obscure products will detect the attachment as a virus.......several days later....finally a dozen or so now detect this new threat, often a very nasty trojan or worm. The point being that I really don't expect much from most products...heck, I'm often scanning with 50 to 100 products and they are ALL failing miserably.... what DOES help seems to be web filtering and PREVENTING users from going all over the web through gateway appliances where we attempt to limit our users' internet access to begin with, DNS, and scope and privileges. I therefore do not have unrealistic expectations of AV products....quite the opposite... I expect MOST...nearly ALL...to NOT DETECT the latest threats until they've been all over the world for a few days.... What I DO expect is that the product will (hopefully) NOT kill my servers and/or commercial apps. From what you've described, the fault is the app or its programming, as WR is intelligently designed to know good from bad and already knows my clients' "data files" somehow (though these might be a proprietary/custom format and/or app). No doubt "faulty programming" is the cause then... I am referring to folders where DATA is stored: dynamic, ever-changing data. Note that I am NOT referring to a typical file share or "data" location such as a shared folder full of Word docs and Excel spreadsheets.
I suppose then that a more detailed explanation is in order.
Webroot doesn't care about a file unless it is code. So the data files are ignored. Yes, I understand this is speaking of things such as multi-terabyte non-relational data sets and other such wonderful things.
All of those data files are looked at for slightly less than a very small amount of time. Their names are read and they are determined to be "Probably not code". The first few bytes are read from the kernel level, non-blocking, and determined to not be PE, so after that they are ignored. Even if they change, unless the first sector is modified, they are ignored. The only thing that would cause them to stop being ignored otherwise is if they are read into memory and an execution pointer is set to them.
The Applications, on the other hand, are code. They are examined in brief and determined to be code. Then they are hashed and reported to the cloud. If the cloud has them defined as [g]ood, they are watched for changes and rarely hashed non-blockingly, but otherwise left alone. Obviously [b]ad files are blocked from executing and removed. [u]nknown files are where the complexity starts.
Unknown files are subject to:
- Monitoring
- Journalling
- Pseudo-sandboxing
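Put very roughly (and only as an illustration; the function names, the verdict lookup, and the handling stubs are invented, not Webroot's API), the dispatch described above amounts to:

```python
# Illustrative-only sketch of the good/bad/unknown dispatch described above.
from enum import Enum

class Verdict(Enum):
    GOOD = "good"
    BAD = "bad"
    UNKNOWN = "unknown"

# Stand-in for the cloud database keyed by file hash.
CLOUD_VERDICTS = {"d41d8cd98f00b204e9800998ecf8427e": Verdict.GOOD}

def cloud_verdict(md5):
    return CLOUD_VERDICTS.get(md5, Verdict.UNKNOWN)

def on_execute(path, md5):
    verdict = cloud_verdict(md5)
    if verdict is Verdict.BAD:
        print("block and remove:", path)                      # never allowed to run
    elif verdict is Verdict.GOOD:
        print("leave alone, re-hash occasionally:", path)     # zero ongoing impact
    else:
        print("run under monitoring/journalling/sandboxing:", path)

on_execute(r"C:\Apps\inventory.exe", "0123456789abcdef0123456789abcdef")
```

The practical takeaway for the enterprise case is the last branch: everything painful that follows only applies to files that land in the Unknown bucket.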
Just because something is expensive doesn't mean it was written well and is perfect. In fact, I've often found the opposite to be true. The more expensive, the less adaptive it often is, and definitely the less common it is, so the less likely it is to be known. One-off custom apps obviously can't be known at all until they are scanned for the first time. This is where overriding to [g]ood status on the Webroot console is helpful, and the folks at Threat Research can set the determination to good on the Webroot side if needed.
Anyway, monitoring can be intrusive. The process may have a DLL injected for deeper inspection, which definitely has the potential to cause problems if the process's code is not flexible. For example, if the process forks a subroutine while sleeping another thread for enough time for that subroutine to finish, then wakes up and expects the result to be present, the monitoring could have delayed the forked routine just long enough to cause the result to NOT be present. It reads the start of the result for its length, gets unprepped memory, then starts reading far too much, faults, and crashes.
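A toy version of that timing assumption looks like this (deliberately fragile and entirely hypothetical; the "monitoring overhead" is simulated with an extra sleep, which is all an injected monitor effectively adds):

```python
# Toy reproduction of the fragile fork/sleep timing pattern described above.
import threading, time

result = {}

def worker(monitoring_overhead):
    time.sleep(0.10 + monitoring_overhead)   # the forked subroutine's real work
    result["value"] = 42

def fragile_caller(monitoring_overhead=0.0):
    result.clear()
    threading.Thread(target=worker, args=(monitoring_overhead,)).start()
    time.sleep(0.15)                          # "should be plenty of time" assumption
    print(result["value"])                    # crashes (KeyError) if the worker ran late

fragile_caller(0.0)    # works: the worker finished inside the sleep window
fragile_caller(0.10)   # KeyError: a small added delay pushed the worker past the window
```

Nothing in the application changed between the two calls; only the timing did, which is why the crash looks random and intermittent to the people running the software.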
Journalling normally won't cause an issue like crashing, but with I/O-intensive applications it can cause a slowdown in performance. Also, some applications are very timing-sensitive and give up on I/O and fault if they can't complete it in time. They may work at a kernel level to prioritize their operations, but Webroot works at a kernel level too and sits deeper, so it forces its way in anyway.
And of course sandboxing just throws all sanity out the window for the application. Access attempts to sensitive system functions are silently dropped and just never return in many cases, or return "generic" results. Also consider such things as the identity shield and its impact. So an application may want to use ieframe.dll to render rich content and because it's not trusted yet, it is blocked from interacting with ieframe.dll successfully. It sends "Show XYZ", then sends "Tell me what you are showing" as a very common programming step. It expects to receive "OK" effectively from the first command, but gets nothing. Perhaps then it stays in an endless loop waiting for an OK or Error! signal. Or perhaps it continues on. Then gets NOTHING...not even null...from the second command... and tries to read the result pointer... Then "Process tried to access memory at address - memory could not be 'read'." Yup. Dead application again.
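The "silently dropped call" failure mode looks roughly like this in miniature (again purely hypothetical, with an invented function standing in for the shielded API):

```python
# Miniature, invented example of the silently dropped call described above.
def shielded_call(command):
    # Stand-in for a call into a shielded component: for an untrusted process
    # the request is dropped and the caller gets nothing back at all.
    return None

shielded_call("Show XYZ")                       # the app assumes this succeeded
status = shielded_call("Tell me what you are showing")
print(status["content"])                        # TypeError: nobody checked for None
```

An application written against the assumption that these calls always return something will fall over in exactly the way described, and from the user's chair it just looks like the program died for no reason.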
But once it's overridden and set to [g]ood, or determined by TR as [g]ood, all those problems go away.
So ignoring the data folder would do nothing, because it's already intelligently ignored on account of not being PE. But protection is not reduced, since starting execution from code hidden in data would trigger a scan of what was loaded.
And no, not even Webroot will detect everything on day one. BUT... the aforementioned journalling, pseudo-sandboxing, and monitoring mean that it has a much better chance of being detected before, or as, it starts to try to do something malicious. Plus the sandboxing prevents data from being released, and the journalling means that once it is detected, it is precisely cleaned for that exact instance of the malware and every action it took is reversed in order.
The fact that Webroot knows GOOD (whitelisted) programs, and the fact that it only cares about executable code means that it can be exceptionally intelligent about what it watches. So it's just a matter of getting the executable code that interacts with the data in question to be marked as Good.