cdb: How do I keep folks from ripping all the files from my web site?
My web page keeps getting shut down because some visitors aren't content to load one file at a time; they use ripper programs that grab thousands of files at a whack, over 12 gigabytes at a time. How do I limit downloads? I'm not inclined to use BitTorrent, and I don't know anything about Perl or PHP. Will it work if I put small groups of files in hundreds of separate directories?
Answers and Views:
Answer by Jeremy T
Putting small groups of files in hundreds of separate directories will probably not work; web crawlers are pretty clever. FYI, it's probably not humans downloading lots of files at the same time. It's more likely sites like Google indexing your images so people can search them.
You could try banning IP ranges.
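If you don't want to touch Perl or PHP, a small Python script in front of the files can do crude per-IP throttling. The following is only a rough sketch, not a drop-in fix: the 30-requests-per-minute limit and port 8000 are arbitrary values I'm assuming for illustration, and it only covers the simple case of one server process serving a folder of files.

    # Minimal sketch: serve the current directory, but refuse clients that
    # request too many files too quickly. Python 3 standard library only.
    import time
    from collections import defaultdict, deque
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    MAX_REQUESTS = 30   # requests allowed per client IP ... (assumed value)
    WINDOW = 60         # ... within this many seconds       (assumed value)

    hits = defaultdict(deque)  # client IP -> timestamps of its recent requests

    class ThrottlingHandler(SimpleHTTPRequestHandler):
        def do_GET(self):
            ip = self.client_address[0]
            now = time.time()
            recent = hits[ip]
            # Forget requests that have fallen outside the time window.
            while recent and now - recent[0] > WINDOW:
                recent.popleft()
            if len(recent) >= MAX_REQUESTS:
                # This IP is ripping files in bulk: refuse instead of serving.
                self.send_error(429, "Too Many Requests")
                return
            recent.append(now)
            super().do_GET()

    if __name__ == "__main__":
        HTTPServer(("", 8000), ThrottlingHandler).serve_forever()

Run it from the directory that holds your files and it serves them on port 8000, slowing any single IP that hammers the site. Permanently banning whole IP ranges, as suggested above, is usually easier to do in your web server's configuration or your host's control panel than in a script like this.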