For each site, you can configure whether search engines may freely crawl and index it, or whether it should be hidden from search engines so it doesn’t appear in their results.
By default, each site we create is delivered with a robots.txt file that tells search engines they are allowed to visit (and therefore crawl) all pages of your site (unless the site is protected by a login). You can disable this if you prefer to hide your public content from search engines.
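An allow-all robots.txt of the kind described above typically looks like the following (the exact file your site serves may differ):

```txt
# Allow all crawlers to visit every page
User-agent: *
Allow: /
```

`User-agent: *` applies the rule to all crawlers, and `Allow: /` permits access to every path on the site.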
Hide A Site From Search Engines
To hide a site from search engines:
- From the My Sites screen, click the card with your site’s name.
- From the left sidebar, click Site settings → SEO.
- Under the Hide site from search engines setting, enable the toggle.
- In the top right of the Site settings screen, click Publish changes or Save changes.
Your change will automatically trigger a new site update (for sites set to live updates) or be applied with your next site update (for sites set to manual updates).
The robots.txt file we deliver with your site will now instruct search engines not to visit any pages of your site. If respected, your site will not be crawled or indexed by search engines or other crawling agents.
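A disallow-all robots.txt of the kind described above typically looks like the following (the exact file your site serves may differ):

```txt
# Ask all crawlers not to visit any page
User-agent: *
Disallow: /
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it does not technically prevent access the way a login does.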