There is a WordPress feature that, used wrongly, will seriously slow down the initial crawling of your new blog by Google.
Make sure you get it right.
Getting your site crawled and indexed by search engines is important.
You want Googlebot crawling and indexing your site as soon as possible. The sooner a page is indexed, the sooner you can start improving its rankings. You won’t show up anywhere if you haven’t been crawled by the search engine bots.
You also do not want to be crawled until the blog is ready.
When you first install WordPress, there are some test pages that you do not want crawled. If they are indexed, Google will not have a correct record of your blog. If your blog is about Pez dispensers, you do not want Google thinking the subject of your post is “Hello World” and your blog is “Just another WordPress blog.” You need to change those posts and settings before Googlebot crawls.
When you first install WordPress and log in, you are given the option to block search engines.
You will still allow other visitors, just not the search engine bots. This is probably a good idea until you have fixed the blog title and tagline and published a real post. Once the site is ready, you can let the bots back in by changing the privacy settings.
The mistake that could cost you indexing time is not changing the privacy settings back soon enough.
One of the first steps to help indexing is submitting a sitemap to Google through Google Webmaster Central. Doing this will usually speed up indexing.
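A sitemap is just an XML file listing the URLs you want crawled. A minimal sketch of one is below; the domain and path are placeholders, and a sitemap plugin for WordPress will generate the real thing for you.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- one <url> entry per page you want indexed -->
    <loc>http://example.com/pez-dispensers/</loc>
  </url>
</urlset>
```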
If you do not allow search engine bots to crawl first, Googlebot will not download your sitemap.
The “block search engine bots” option creates a robots.txt file that blocks the bots. Google will see this and tell you it cannot find the sitemap.
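While the privacy setting is on, the robots.txt WordPress serves looks something like the following (the exact text may vary by WordPress version). The single `Disallow: /` line tells every bot to stay away from the entire site:

```text
User-agent: *
Disallow: /
```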
It could take an extra day before Google tries to read your sitemap a second time, and additional time after that before Googlebot comes back to crawl your blog.
The same thing can happen if you get links to your blog before you unblock the search engine bots.
The bot will follow the link to your site, see the robots.txt file, and neither crawl nor index it. It may also take longer to return and recrawl your site.
Should you use the option at all?
I think so. It does not help you if Google crawls and gets the wrong information about your blog; it will take time for Google to update and fix that information as well. Additionally, if you're like me, you will probably be changing many posts and pages as you work on the initial design and posts of your blog. This causes excessive pinging, which can get your blog blocked as spam. The privacy setting stops the pinging while you make these changes.
It is important to change the setting before you start to get links or submit your site to Google Webmaster Central.
You would be better off waiting a day or two than making the submission before you change the privacy setting; that is about how much time the mistake will cost you.
Make sure you also get links to your blog.
This is one of the best ways to speed up crawling and indexing. Blogging Zoom is one aid that you can use. Getting links from and commenting on other blogs is another.

*Update: Blogging Zoom is dead, though the domain is still in use.*
Just make sure Googlebot and the other search engine bots can crawl your blog once they see the link to it first.
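One quick way to verify this is to fetch your site's robots.txt and check whether it still shuts Googlebot out. A minimal sketch using Python's standard `urllib.robotparser`, with the two rule sets written inline instead of fetched from a live site:

```python
from urllib.robotparser import RobotFileParser

# Rules WordPress generates while the privacy setting is blocking bots
blocked = RobotFileParser()
blocked.parse(["User-agent: *", "Disallow: /"])

# Rules once the privacy setting is switched back to allow crawling
allowed = RobotFileParser()
allowed.parse(["User-agent: *", "Disallow: "])

print(blocked.can_fetch("Googlebot", "/"))  # False: Googlebot is shut out
print(allowed.can_fetch("Googlebot", "/"))  # True: crawling is allowed
```

To test your own blog, point `RobotFileParser` at your live file with `set_url("http://yourblog.example/robots.txt")` and `read()` before calling `can_fetch`.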