Interesting use case question: bandwidth for "out there" network?

Userlevel 3
Badge +8
We are a non-profit serving nonprofits... and switching to Webroot Business.
A potential user has a good question. They're running 300 computers (in Papua New Guinea) over a VSAT line: 3 Mbps down, 1 Mbps up. Not a whole lot of bandwidth :)

Naturally, they'd like to know how much bandwidth will be used by Webroot!
My additional question: what settings can help minimize bandwidth, and minimize impact on link performance, for low-bandwidth and/or slow links?
Obviously the initial install file is quite small and could be a single download. But what then...

Anybody have any data or insight?

Best answer by jhartnerd123 5 July 2017, 00:41



5 replies

Userlevel 7
Badge +33
Hey @
Welcome to the forums. 
Webroot is an ideal solution for cases where bandwidth is limited and connections are slow. My company managed 5,000+ endpoints across multiple clients, with connections ranging from bandwidth-capped 10 Mb links to 3 Mb satellite connections. 
We've been seeing, on average, with a 15-minute polling time (the interval at which the agent checks in to the cloud), that endpoints use a little under 1 MB each, usually 750 KB to 1 MB. 

Since you don't have to worry about downloading multi-megabyte definition updates, you're immediately saving on bandwidth and not slowing down the connections for other systems on the network. 

The agent only contacts the Webroot cloud infrastructure during its polling interval (to check for agent updates, policy changes, or agent commands) or when it has to ask the cloud to make a determination on a file/process etc. Other than that, it'll lie dormant. 

If you are very concerned about bandwidth usage, I'd recommend adjusting the polling interval to maybe 30 minutes (please don't leave it on 24 hours). At the 30-minute compromise you can still have agents receive commands relatively quickly, and agent update checks are taken care of. Leaving it at 24 hours leaves too much lead time: if you issue a command, or Webroot needs to push something out in response to a malware attack, it will take too long to reach the endpoints. 
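To make the trade-off concrete, here's a tiny sketch comparing polling intervals: how many cloud check-ins per day each one produces, and the worst-case wait before an endpoint sees a new console command. (Illustrative only; the actual agent adds jitter so endpoints don't all check in at once.)

```python
# Trade-off of polling interval: check-ins per day vs. the
# worst-case delay before an endpoint picks up a new command.
for interval_min in (15, 30, 24 * 60):
    checkins_per_day = 24 * 60 // interval_min
    print(f"{interval_min:>5} min interval: "
          f"{checkins_per_day:>2} check-ins/day, "
          f"up to {interval_min} min command delay")
```

At 15 minutes that's 96 check-ins a day; at 30 minutes, 48; at 24 hours, just one, which is why commands can take so long to land at that setting.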

Should ya need any further help, feel free to keep posting here bud. 

Have a great week
Webroot Champion/Ambassador
Nerds On Site
Userlevel 3
Badge +8
Hmmm... When you say "1MB each" is that 1MB **per update**?
If so, that is a quite different answer from the only other estimate I've seen in the community... their estimate was 60MB per month for 3 workstations, i.e. 20MB a month per workstation.
If it is 1MB per update, then at one update every 30 minutes, for 300 workstations that works out to:
300 endpoints × 1 MB per 1800 s ≈ 0.167 MB/s ≈ 1.67 Mbps (figuring roughly 10 bits on the wire per byte of payload), which is more than half of their total available bandwidth.
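For anyone who wants to plug in their own numbers, here's that back-of-the-envelope calculation as a small script. The per-poll size, interval, and the 10-bits-per-byte overhead factor are all assumptions to adjust:

```python
# Rough average link load from Webroot polling traffic.
# All inputs are assumptions -- adjust to your own environment.

def polling_bandwidth_mbps(endpoints, mb_per_poll, interval_min, bits_per_byte=10):
    """Average load in Mbps if every endpoint moves mb_per_poll
    megabytes once per polling interval. bits_per_byte=10 pads
    the usual 8 bits to allow for protocol overhead."""
    megabytes_per_sec = endpoints * mb_per_poll / (interval_min * 60)
    return megabytes_per_sec * bits_per_byte

# 300 endpoints, a pessimistic 1 MB per 30-minute poll:
print(round(polling_bandwidth_mbps(300, 1.0, 30), 2))    # ~1.67 Mbps

# Same fleet at a few KB (0.005 MB) per poll:
print(round(polling_bandwidth_mbps(300, 0.005, 30), 4))  # ~0.0083 Mbps
```

The second figure shows why the "few KB per check-in" answer below changes the picture so dramatically.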
Userlevel 7
Badge +33
Oh sorry @ I'll clarify a little more.
In my case, and for most of my clients, it's waaaay less than a MB per update. Those are only a few KB, more or less a ping to the cloud to see if there's anything new to report. 

My work notebook (which I use heavily online) only used 11.73 MB of data in 30 days. 
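As a sanity check on that 11.73 MB figure, you can work backwards to an average per-poll size. This assumes a 15-minute interval and 24/7 uptime, which overstates a notebook's uptime, so treat the result as a lower bound:

```python
# Back out the average data per check-in from 30 days of observed
# usage, assuming 15-minute polling and the machine always online.
total_mb = 11.73                 # observed over 30 days
polls = 30 * 24 * (60 // 15)     # 2880 check-ins in 30 days
kb_per_poll = total_mb * 1024 / polls
print(round(kb_per_poll, 1))     # ~4.2 KB per check-in
```

That lands right in the "few KB per check-in" range described above.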

Plus, you also gotta realize that the systems aren't always all going to be on all the time. They also aren't going to always be encountering new files to "check on." They also, by design, won't all check in at the exact same time. This is to avoid network storms that other AV products get when they dispense their multi-megabyte updates. 
Believe me, I've had clients on Sat connections using ESET previously. The stub installer alone is several MB; once you execute it, it has to download 60-90 MB of the actual program, and then after install it has to download a further 40 MB+ of definition updates. That's just to get it installed initially. Then once or twice a day, ESET issues a definition update that can range from 5 MB up to another 20 MB+. 

No-brainer for me and my clients once I switched them all over. 

Where some clients on Sat and Fixed Wireless connections used to experience slowdowns due to several endpoints updating at any given time, once we switched them over to Webroot, all that just wasn't an issue anymore. 

Just because the Webroot agent relies on the cloud doesn't mean it's a bandwidth hog. You'll never even know it's running once installed. 
Hope this helps
Nerds On Site
Webroot Champion & Ambassador
Userlevel 3
Badge +8
That's kinda what I thought. Can you explain what you were thinking when you wrote that, with 15-minute updates, clients were using a little less than 1 MB?
Userlevel 7
Badge +33
Sure. I simply meant that even at the shortest interval (15 min), they were still only using very little bandwidth.