After completing some of the realistic missions, I realized that robots.txt and search engines can be a security flaw in websites. With robots.txt you spell out all of your files, and without robots.txt search engines will spell them out for you. If you're confident enough in your coding it shouldn't be a major flaw, but it is harder to control access to some files, like .txt or .html. That is why I decided to write this.
---- Protecting One Page At A Time (PHP) ----
If you only have one page to protect, like an administrator page, you should probably use this method.
CODE :
<?php
if (1 == 2) { // whatever your security mechanism is
    // your admin page content goes here
} else {
    // user is not an admin
    header("Location: index.php");
    exit; // stop the script so nothing after the redirect is sent
}
?>
For the security mechanism, check for anything that the user's circumstances may provide.
If you're willing to go the extra mile, you could download an IP-to-country database, look up the user's country, and only allow people from the US, or any country of your choice.
If the user is not an admin, redirect them to a page such as 404.php or index.php. This keeps unauthorized users from accessing the page and keeps spiders from indexing it.
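For example, here is a minimal sketch of such a check, assuming your login script sets a hypothetical $_SESSION['is_admin'] flag when an administrator signs in:
CODE :
<?php
session_start();

// Hypothetical check: assumes the login script sets $_SESSION['is_admin']
// to true for administrators.
if (isset($_SESSION['is_admin']) && $_SESSION['is_admin'] === true) {
    // admin page content goes here
} else {
    // not an admin: send them away so the page is never shown or indexed
    header("Location: index.php");
    exit;
}
?>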
---- Protecting An Entire Directory (.htaccess) ----
Protecting an entire directory with .htaccess is fairly simple and requires only a few lines in your .htaccess file. A plus to using .htaccess is that it provides access-control features that may be harder to code with PHP alone.
The first option is to set a password on a folder, resulting in a dialog box requesting username and password.
To protect a directory, you put a .htaccess file in the directory you want to protect.
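A minimal entry, assuming Basic authentication and a .htpasswd file stored outside the web root (the realm text and the path are only examples, adjust them to your setup):
CODE :
AuthName "Admin Area"
AuthType Basic
AuthUserFile /full/path/to/.htpasswd
Require valid-user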
AuthName is the text the dialog box displays when it asks for the password.
AuthUserFile is the path to the password file.
In .htpasswd you would put this:
CODE :
username:(encrypted password)
To generate the encrypted password, use a .htpasswd generator (easily found by googling it).
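If you have PHP handy, you can also generate the hash yourself with crypt(); a quick sketch, where the username, password, and salt are only placeholders:
CODE :
<?php
// Prints a username:hash line you can paste into .htpasswd.
$user = "admin";
$pass = "secret";
echo $user . ":" . crypt($pass, "ab") . "\n"; // "ab" is the two-character salt
?>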
The second option is to use .htaccess to only allow access from certain browsers.
Please note this uses the mod_setenvif and mod_access modules for Apache, so it may not work for everyone.
.htaccess
CODE :
SetEnvIf User-Agent ^KnockKnock/2\.0 let_me_in

# No <Directory> block is needed here: a .htaccess file already applies to its own directory.
Order Deny,Allow
Deny from all
Allow from env=let_me_in
With this in place, anyone whose user agent starts with KnockKnock/2.0 will be allowed access, and no one else will.
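To check that the rule works, you could request a file in the directory with a client that sends the matching user agent; a quick PHP sketch, where the URL is only a placeholder:
CODE :
<?php
// Request a protected URL while presenting the allowed User-Agent.
$context = stream_context_create(array(
    "http" => array("user_agent" => "KnockKnock/2.0")
));
echo file_get_contents("http://example.com/documents/secret.html", false, $context);
?>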
---- Other Tips & Tricks ----
Most major search engines have webmaster tools; use them to your advantage.
Try using combinations of .htaccess and PHP.
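For example, you could block direct web access to a sensitive file with .htaccess and have a PHP script serve it only to logged-in admins. A rough sketch, reusing the hypothetical $_SESSION['is_admin'] flag and a placeholder file name:
CODE :
# .htaccess in the same directory as secret.txt
<Files "secret.txt">
Order Deny,Allow
Deny from all
</Files>
CODE :
<?php
// serve-secret.php: only admins get the file, everyone else is redirected.
session_start();
if (empty($_SESSION['is_admin'])) {
    header("Location: index.php");
    exit;
}
header("Content-Type: text/plain");
readfile("secret.txt"); // PHP reads from disk, so the .htaccess rule does not block it
?>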
Use this code to confuse spiders that look at HTTP status:
CODE :
header("HTTP/1.0 404 Not Found");
---- Footnotes ----
Thank you for reading, I hope this has helped you somehow!
1. You can find some of these methods in place on my website www.bren2010.com
2. These methods were not meant to completely replace robots.txt.
6. Thank you to thedotmaster for helping me with the article!