eydun Posted March 1, 2023 There are some hosts that are continuously scraping my website. They use static IP addresses, so they are easy to identify. What is the easiest method to ban certain IP addresses from accessing my website? Is there perhaps a PW module for this?
Gideon So Posted March 1, 2023 Hi @eydun No, I don't think there is such a module. You can add `Deny from 123.123.123.123 222.223.224.225` to the .htaccess file to block specific IPs. Gideon
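A minimal sketch of what that .htaccess rule could look like (the IP addresses are placeholders). Note that on Apache 2.4+ the newer `Require` syntax replaces the older `Order`/`Deny from` directives:

```apache
# Apache 2.2 syntax
<IfModule !mod_authz_core.c>
    Order Allow,Deny
    Allow from all
    Deny from 123.123.123.123 222.223.224.225
</IfModule>

# Apache 2.4+ syntax
<IfModule mod_authz_core.c>
    <RequireAll>
        Require all granted
        Require not ip 123.123.123.123 222.223.224.225
    </RequireAll>
</IfModule>
```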
eydun Posted March 5, 2023 Author Thanks for the tip. But I was hoping not to modify the .htaccess file.
teppo Posted March 5, 2023 Technically speaking, the best approach would be to block the IP as early as possible: preferably via a firewall (on the local machine or before it); if that's not possible, then in the Apache config; and if that's not possible either, then in .htaccess. Blocking access early means that more unwanted traffic gets blocked, and it is also better for performance. That being said, I just updated https://processwire.com/modules/page-render-iprestriction/ to support a list of blocked IP addresses. This module can block access to your site, but note that it won't attempt to protect your static files, so other alternatives are generally preferable.
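Blocking at the firewall, as suggested above, could look something like this on a typical Linux server (the IP address is a placeholder; these commands need root, and which tool is available depends on the distribution):

```shell
# UFW (common on Ubuntu/Debian)
sudo ufw deny from 123.123.123.123

# plain iptables: drop all packets from the address
sudo iptables -A INPUT -s 123.123.123.123 -j DROP
```

Rules like these stop the traffic before it ever reaches Apache or PHP, which is why they are preferable when you have that level of server access.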
netcarver Posted March 5, 2023 I was going to suggest looking at the blackhole module by @flydev as well, but I think Teppo's update to page-render-iprestriction might be a better fit for your issue, @eydun.
eydun Posted March 6, 2023 Author Thank you both for the advice. The "page-render-iprestriction" module looks like a perfect solution for my needs.