37. Before publishing websites

by Cover Tower - Updated March 14, 2021

When your websites are ready for publishing, don't forget to remove the line

add_header X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";

from the server block of every website that you want search engines to index.
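For context, the header typically sits inside the server block like this (a minimal sketch; the server_name and root values are placeholders), and it's the add_header line that must be deleted:

```nginx
server {
    listen 80;
    server_name example.com www.example.com;
    root /var/www/example.com;

    # Delete this line before going live, or search engines
    # will neither index nor archive any page of the site:
    add_header X-Robots-Tag "noindex, nofollow, nosnippet, noarchive";
}
```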

Also change the content of the robots.txt file for every website that you want to allow public access to. First, open the file:

nano /var/www/example.com/robots.txt

Change its content to make it look like this:

User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap.xml

The Sitemap directive is optional, but including it is recommended for SEO purposes, since it tells crawlers where to find your sitemap.
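A quick way to sanity-check the file before going live is to confirm that the Disallow value is empty, which means crawlers may fetch every URL. A minimal sketch (the /tmp path is just for illustration; point the check at your real robots.txt):

```shell
# Write the permissive robots.txt shown above to a scratch location.
cat > /tmp/robots.txt <<'EOF'
User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap.xml
EOF

# Count lines where "Disallow:" has an empty value; expect 1.
grep -c '^Disallow: *$' /tmp/robots.txt
```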

If you configured Nginx to serve an 'under construction' page to outside visitors, you now have to reverse those changes so that the website becomes accessible to everyone. Comment out the lines that restricted access, like this:

# location = /underconstruction.html {}
# error_page 403 =200 /underconstruction.html;

location / {
    # allow 123.123.123.123;
    # allow 124.124.124.124;
    # deny all;
    try_files $uri $uri/ /index.php?$args;
}
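Once you are confident you won't need the restrictions again, you can delete the commented-out lines entirely. A minimal sketch of the resulting server block (the listen, server_name, and root values are placeholders for your own settings):

```nginx
server {
    listen 80;
    server_name example.com www.example.com;
    root /var/www/example.com;
    index index.php index.html;

    location / {
        try_files $uri $uri/ /index.php?$args;
    }
}
```

After any change to the configuration, test it with nginx -t and, if the test passes, reload Nginx so the changes take effect.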