pwired

.htaccess and .htpasswd throw 404 error


Hi,

I don't want Google to index a website that I am building at the moment, for obvious reasons.

The site should only be indexed once it is completed.

I don't like to use <meta name='robots' content='noindex,follow' /> because that only stops legitimate engines and bots.

In the past I could always use a .htpasswd file and put this in the root .htaccess file:

AuthUserFile   /path/to/.htpasswd
AuthType Basic
AuthName "Identify"
Require valid-user
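
(For reference: on a host without shell access to the htpasswd tool, the .htpasswd entry itself can be written with a throwaway PHP script like the sketch below. The user name and password are placeholders, the script is assumed to sit in the site root next to .htaccess, and the bcrypt hash it writes is only accepted by Apache 2.4 or newer.)

<?php
// One-off helper: writes a single "user:hash" line to .htpasswd in the
// same directory as this script. Delete the script again afterwards.
$user = 'preview';    // example user name only
$pass = 'changeme';   // example password only
$line = $user . ':' . password_hash($pass, PASSWORD_BCRYPT) . "\n";
file_put_contents(__DIR__ . '/.htpasswd', $line);
echo "wrote .htpasswd for user $user";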

For some reason this has stopped working, and ProcessWire multi-language 2.6.1 throws a 404. I checked both .htaccess and .htpasswd for typos etc. but cannot find anything. If I remove the four lines from the .htaccess file again, there are no errors.

Now I have been looking in index.php and checking for any limitations on the host, but it is going to take too much time to fix this. What about Adriaan's Page Protector module? Does it only work for "pages" in the backend, or also for root access? What I need is to block the whole site with a password so that no search engine indexes it until the site is ready.
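
(To sketch what I mean by "blocking the whole site": with plain ProcessWire API code, something roughly like the snippet below in /site/templates/_init.php — assuming _init.php is configured as the prependTemplateFile — would send every front-end visitor to the admin login until the site goes live. This is only a rough sketch, not the Page Protector module itself.)

<?php
// Rough sketch: refuse anonymous front-end access while the site is built.
// Logged-in users still see the pages; everyone else is sent to the
// ProcessWire admin login. _init.php is not used for admin pages, so
// there is no redirect loop.
if (!$user->isLoggedin()) {
    $session->redirect($config->urls->admin, false); // false = 302, temporary
}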


Maybe the path to the .htpasswd file is not correct. You can use this PHP function to see the correct path:

echo getcwd();
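
For example, a throwaway check script uploaded next to the .htaccess file in the site root could print both values and confirm that the file really is where the AuthUserFile directive says it is (the file name and location are just the ones used in this thread):

<?php
// Temporary diagnostic script: open it in the browser, note the paths,
// then delete it again.
echo getcwd() . '<br>';                     // current working directory
echo __DIR__ . '/.htpasswd' . '<br>';       // absolute path for AuthUserFile
echo file_exists(__DIR__ . '/.htpasswd') ? 'found' : 'missing';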

Besides that, you could use a robots.txt file to prevent search engines from crawling your site. A subdomain would also help, I think, as would hiding the root page in ProcessWire.


Hi AndZyk,

I double-checked the full path to the root.

I have put the .htpasswd file in the root next to .htaccess to simplify the path, and it is still not working.


Yeah, both of those modules block everything and redirect either to the login or to a special page that you define. I use maintenance mode when I have to take my sites offline. One button and you are set.


I just tested this locally on my laptop with the same setup, and there everything is working.

It must have something to do with the host. Maybe the host doesn't like something in the index.php.


OK MuchDev, thanks for that confirmation. I will install Adriaan's Page Protector module

and stop wasting more time trying to find some unknown server host limitation.


This is working perfectly, and it is independent of all the different host limitations out there.

Big thanks fly out to Adriaan.


If you only need to keep Google (search engines) away, just use a robots.txt (Google and others are legitimate bots/crawlers):

User-agent: *
Disallow: /

