20097 Views 5 Replies Latest reply: Jan 30, 2009 11:03 AM by chinicuin
go the other way around and, at least for ssh and other services that respect tcpwrappers directives, only allow certain hosts with the /etc/hosts.allow file and deny all others.
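a minimal tcpwrappers sketch of that setup (the addresses are examples only -- substitute your own trusted networks):

```
# /etc/hosts.allow -- allow sshd only from trusted sources (example addresses)
sshd: 192.168.1.0/255.255.255.0, 203.0.113.10

# /etc/hosts.deny -- deny sshd to everyone else
sshd: ALL
```

order matters: hosts.allow is consulted first, so anything matched there gets in even with the catch-all deny.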
better yet, couple such measures with a decent firewall in front of your server to handle such filtering and blocking. that will avoid dealing with the "bad" traffic at the server level.
check around the forums for more details on how to secure sshd and other services.
That's good advice for some situations, but the few ports I have open in the firewall need to stay relatively open (I'm using the OS X 10.4.11 firewall facing the public Internet directly, with no router or hardware firewall in between), including 80, 443 and 22. I may need to ssh to the server from elsewhere...
I guess my question is, is adding to or editing hosts.deny and hosts.allow good practice on 10.4.11 when using Server Admin? Server Admin seems to like to take care of a lot of configuration itself (httpd.conf, for instance), and manually editing certain files can create problems. But maybe hosts.deny and hosts.allow are OK to edit?
Another question is, by using a fairly large hosts.deny file to deal with the majority of hacking attempts, will that significantly slow down the server?
Ideally, I would like it if OS X would automatically block an IP after it makes 5 failed attempts at SSH authentication. That would be enough for me, since I can see the same IPs making dozens of attempts, cycling through common user names and using dictionary attacks.
the built in "adaptive firewall" on 10.5 server does what you describe.
i'd still recommend having a separate physical firewall handling security, though.
at least for ssh, you can port forward or run the service on another high port to eliminate most, if not all, of your ssh dictionary attack attempts. doing either shouldn't have any effect on server admin usage.
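for the alternate-port option, a sketch of the relevant sshd_config directive (port 2222 is just an example; note that on 10.4, launchd's ssh.plist may govern the listening port instead, so verify how sshd is started on your box):

```
# /etc/sshd_config (sketch -- check whether launchd overrides this on 10.4)
Port 2222
```

clients then connect with `ssh -p 2222 user@server`. this doesn't stop a determined attacker, but it cuts out nearly all of the automated dictionary-attack noise, which only probes port 22.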
i don't believe making any changes to the tcpwrappers config files will affect httpd configuration.
also, adding denied hosts manually is like fighting against the tide. i don't recommend it. for servers i administer, i usually deal with this at the firewall level and then supplement rules with a deny all for sshd, only allowing ssh access from certain hosts (like my home network or the office, etc.) that should be safe. at the worst, i can always hop through one of the safe hosts to the client server if needed. a large hosts.deny file won't slow anything, but i would advise against manually editing it to the point it gets large, anyway.
another option is a hardware firewall (standalone appliance or something like iptables or pf running on linux, freebsd, openbsd, etc.) that also offers vpn access. with vpn, leave only 80 and 443 open, if those are the only publicly available ports you need, and use the vpn to access everything else.
if that's not an option, check out denyhosts here: http://denyhosts.sourceforge.net
Going to 10.5 isn't an option right now.
We were using a hardware firewall, but it really slowed things down under load. We kind of have a unique application, where each Xserve maintains upwards of 1000 network socket connections (the firewall reported around 5000 connections per machine -- perhaps it was counting connection-tracking states rather than established sockets). Removing the hardware firewall and switching on OS X Server's firewall instead has really improved performance. And since we only have basic ports 80, 443 and 22 open, I don't see a huge risk. However, I hate the dictionary attacks.
I would like a simple way, if possible, to block any IP after it makes three failed SSH attempts...
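a rough way to approximate this on 10.4 is to parse the ssh log for repeat offenders. a sketch, assuming the usual secure.log line format ("Failed password for USER from IP port N ssh2") -- the function name and threshold are illustrative:

```shell
# failed_ssh_ips: read sshd log lines on stdin, print each source IP
# that has THRESHOLD or more failed password attempts (default 3).
failed_ssh_ips() {
    threshold="${1:-3}"
    awk -v t="$threshold" '/sshd.*Failed password/ {
        # the source address is the field right after the word "from"
        for (i = 1; i <= NF; i++)
            if ($i == "from") count[$(i+1)]++
    }
    END {
        for (ip in count)
            if (count[ip] >= t) print ip
    }'
}
```

you could run something like `failed_ssh_ips 3 < /var/log/secure.log` from cron and feed the output into firewall deny rules, but treat this as a starting point, not a hardened solution -- denyhosts (mentioned above) does the same job with more care.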
You can block specific IP addresses with Server Admin. Go to
Firewall --> Settings --> Advanced and click the "+" button.
In the pop-up, select:
protocol = IP (type it in manually)
source = the IP address (like this: xxx.xxx.xxx.xxx)
destination = any (or the name of your network group)
When the rule is listed, just check it.
Done, you have blocked that IP.
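If you prefer the command line, the same kind of block can be added on 10.4 with ipfw (a sketch -- 192.0.2.1 is an example address, and rules added this way don't survive a reboot unless you persist them):

```
# deny all traffic from one source address (run as root)
sudo ipfw add deny ip from 192.0.2.1 to any

# list the active rules to confirm
sudo ipfw list
```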
Hope this helps.