Published in: Apache

Before taking a site "live", I password protect it except for robots.txt (so search engines can still read it) and the custom error pages. You can adapt this to your own needs and add other public pages to the FilesMatch directive.
# These pages are public
<FilesMatch "^(robots\.txt|errorpage\.php)$">
    Order allow,deny
    Allow from all
</FilesMatch>

# The rest of the site is private
AuthUserFile /path/to/your/.htpasswd
AuthType Basic
AuthName "Login Required"
Require valid-user
Order allow,deny
Satisfy any
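The snippet above uses the Apache 2.2 access directives (Order/Allow/Satisfy), which are deprecated in Apache 2.4. Below is a rough sketch of how the same idea could look with the 2.4 Require syntax; the filenames and .htpasswd path are just placeholders carried over from the example above, so adjust them for your own setup:

# These pages stay public (Apache 2.4 style)
<FilesMatch "^(robots\.txt|errorpage\.php)$">
    Require all granted
</FilesMatch>

# Everything else needs a valid login
AuthUserFile /path/to/your/.htpasswd
AuthType Basic
AuthName "Login Required"
Require valid-user

To create the password file referenced by AuthUserFile, you can run something like `htpasswd -c /path/to/your/.htpasswd username` once, then add further users without the -c flag.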
URL: http://www.webmasterworld.com/apache/4051548.htm