Our CEO went to a website the company pays to use and was blocked. After spending a couple of hours going back and forth with their administrators, it turned out that the IP our requests were coming from (a Webroot IP) was what was being blocked. They opened access from that IP, but said in no uncertain terms that Webroot may get blocked again in the future.
Now, before you ask about support: I went there, and all I got for my trouble of trying to create a ticket was a "There seems to be a problem" window. There's a problem, all right.
Are you talking about WSS? If so, the problem is that some sites block the IP ranges belonging to Amazon, since our proxy traffic goes through AWS. You can create an exception for their site in that case, and then it won't go through the proxy.
I don't see that as an acceptable solution... punching holes through the security we're paying for. If I wanted to maintain a whitelist, I would just get an appliance and put it in my rack.
Well, there's not much we can do if a website decides to block the entire IP range coming from AWS servers. We do contact sites to request that they allow our IPs, but it's up to them who they allow to reach their website. I believe they do this because a lot of spammers and hackers spin up AWS servers, so they take the "nuke it from orbit" approach. I know Craigslist does this to cut down on spam, and they've refused to whitelist those ranges.
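For anyone curious how a site ends up blocking an entire cloud provider like this: AWS publishes its IP ranges in a machine-readable feed (ip-ranges.json), and a site operator can simply match every visitor against those CIDR blocks, with no distinction between a managed security proxy and a compromised instance. A minimal sketch of that check, using a couple of hard-coded example prefixes rather than the live feed:

```python
import ipaddress

# Example AWS prefixes only; a real operator would load the full, current
# list from AWS's published ip-ranges.json feed.
AWS_PREFIXES = [
    ipaddress.ip_network("3.0.0.0/15"),
    ipaddress.ip_network("13.52.0.0/16"),
    ipaddress.ip_network("54.144.0.0/14"),
]

def is_aws_ip(addr: str) -> bool:
    """Return True if addr falls inside any of the listed AWS prefixes."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in AWS_PREFIXES)

print(is_aws_ip("54.145.10.20"))  # inside 54.144.0.0/14 -> True
print(is_aws_ip("8.8.8.8"))       # not in any listed range -> False
```

A check like this is cheap and catches all AWS-hosted bots at once, which is exactly why legitimate proxy traffic coming from the same ranges gets caught in the blast radius.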
Well, this is the fourth one I have run into in the last couple of months. I see this as just the beginning before everyone jumps on the bandwagon. Effectively, the damage is being done to Amazon and its users by creating a barrier between companies trying to protect themselves and the destination URLs they either want to reach or are paying to use. Bad news for everyone.
Hm, it sounds unusual to have that many all at once. There are a couple of regular ones we know about, such as Craigslist and Papa Johns, but I didn't think it was quite so common. Would you be willing to open a ticket with support so that we can get these documented? That way we can both reach out to the sites to see if they'll allow AWS IPs, and also have a record of just how frequently this happens.
Fundamentally, Webroot is sending web traffic from IP ranges that should not be sending web traffic. Compromised Amazon EC2 instances are commonly used for bots. I'm actually pretty surprised; I thought Webroot would have rolled its own managed infrastructure for this on top of something like SoftLayer.
The economies of scale just make AWS so much cheaper than running your own data center. But yes, bot traffic from shared cloud IP ranges is a problem I don't know of a good solution to yet.
I got an email from the HR director at 8 p.m. last night complaining that he cannot access MARRIOTT.COM to schedule a reservation. It's another blocked site. I have decided to dispense with control, add any site users complain they can't reach to the bypass list, and review the company's needs next May.