I host web services that receive essentially zero attention from the public (which is fine). Consequently, most of the traffic to my host comes from bots. I have limited upstream bandwidth, so I would like to understand and, where necessary, filter that traffic.
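A first step in understanding the traffic is simply tallying user agents from the server's access log. The sketch below is a minimal example, assuming the nginx/Apache "combined" log format; the sample log lines and the bot names in them are hypothetical stand-ins for whatever your own logs contain.

```python
import re
from collections import Counter

# A few sample lines in the common "combined" log format (hypothetical traffic);
# in practice you would read these from your server's access log file.
SAMPLE_LOG = '''\
1.2.3.4 - - [10/Oct/2024:13:55:36 +0000] "GET / HTTP/1.1" 200 612 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
5.6.7.8 - - [10/Oct/2024:13:55:37 +0000] "GET /feed HTTP/1.1" 200 1024 "-" "Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)"
9.9.9.9 - - [10/Oct/2024:13:55:38 +0000] "GET /about HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (X11; Linux x86_64) Firefox/128.0"
'''

# In the combined format, the user agent is the last double-quoted field.
UA_RE = re.compile(r'"([^"]*)"\s*$')

def top_user_agents(log_text):
    """Return (user_agent, count) pairs, most frequent first."""
    counts = Counter()
    for line in log_text.splitlines():
        m = UA_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts.most_common()

if __name__ == "__main__":
    for n, ua in ((n, ua) for ua, n in top_user_agents(SAMPLE_LOG)):
        print(n, ua)
```

Running something like this over a day of logs makes it obvious how much of the traffic is self-identified crawlers versus browsers, which informs the filtering decisions below.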
The focus here is not comprehensive security for a web host, but rather how to manage the traffic that reaches your site once the host itself has been securely configured.
Levels of Protection
Types of Bad Actors
wip
scrapers and SEO crawlers
spam
known-vulnerability probes
DDoS
ATO (account takeover)
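Several of the categories above can be recognized from the request itself: SEO crawlers self-identify in the user agent, and vulnerability probes hit characteristic paths. The sketch below is a toy classifier along those lines; the specific bot names, paths, and category labels are assumptions for illustration, not a vetted ruleset, and real rules would be tuned against your own logs.

```python
import re

# Hypothetical heuristics mapping a request's user agent or path to one of
# the bad-actor categories above. Each rule: (label, pattern, which field).
RULES = [
    ("scraper/SEO", re.compile(r"AhrefsBot|SemrushBot|MJ12bot", re.I), "ua"),
    ("vulnerability probe", re.compile(r"wp-login\.php|\.env|phpmyadmin", re.I), "path"),
    ("spam", re.compile(r"/comment|/guestbook", re.I), "path"),
]

def classify(user_agent, path):
    """Return the first matching bad-actor label, or 'unclassified'."""
    for label, pattern, field in RULES:
        target = user_agent if field == "ua" else path
        if pattern.search(target):
            return label
    return "unclassified"
```

Note that DDoS and account takeover generally cannot be spotted from a single request; they show up as rate patterns and repeated login failures, which need stateful tracking rather than per-request rules like these.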
Implementation
wip, follow levels but add detail