Must-Have Googlebot Settings for WordPress Optimization

Blogging on WordPress takes more than good, useful articles. You also have to look after things beyond writing, such as blog security and controlling Googlebot's access to your website. If you do not control Googlebot, you will run into the following problems.

Duplicate content issue due to indexing of duplicate blog pages

If you do not define a robots meta tag for a page, Google will index it by default, so every page ends up in the index. Keep in mind that many pages carry duplicate content, such as tag archive pages, category pages and month archive pages.
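For reference, the indexing meta in question is the robots meta tag in a page's <head>. A page you want crawled but kept out of the index carries a tag like this (the exact markup depends on your theme or SEO plugin):

<meta name="robots" content="noindex,follow" />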

Crawling of unwanted directories of your website

If you do not specify proper rules in your robots.txt file, Googlebot will crawl every existing directory in your blog root and may index many unwanted files other than HTML.

Below is an ideal robots.txt template; upload this file to your blog's root directory.

Best robots.txt file for WordPress

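# Rules for all crawlers: keep WordPress core folders, feeds, trackbacks and query-string URLs out of the crawl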
User-agent: *
Disallow: /wp-content/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-*
Disallow: /feed/*
Disallow: /trackback/
Disallow: /cgi-bin/
Disallow: /comments/feed/
Disallow: /*?*


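# Extra rules for Googlebot: also block scripts, stylesheets and other non-HTML files by extension, but allow uploaded media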
User-agent: Googlebot
Disallow: /*.php$
Disallow: /*.js$
Disallow: /*.cgi$
Disallow: /*.xhtml$
Disallow: /*.php*
Disallow: /comments/feed/
Disallow: /trackback/
Disallow: /*?*
Disallow: /z/
Disallow: /wp-*
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.txt$
Allow: /wp-content/uploads/

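# Let Google Image Search crawl everything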
User-agent: Googlebot-Image
Allow: /* 


Set the following robots meta tags:

Category archives - noindex,follow
Date archives - noindex,follow
Tag archives - noindex,follow
Homepage subpages - noindex,follow

(use the Robots Meta plugin to handle all of this)
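If you prefer not to use a plugin, a small snippet in your theme's functions.php can achieve the same result. This is only a minimal sketch using standard WordPress conditional tags; the function name is an example, not a requirement.

<?php
// Minimal sketch: print "noindex,follow" on category, date and tag archives
// and on paginated homepage subpages, so Google follows links but does not index them.
function my_robots_meta() {
    if ( is_category() || is_date() || is_tag() || ( is_home() && is_paged() ) ) {
        echo '<meta name="robots" content="noindex,follow" />' . "\n";
    }
}
add_action( 'wp_head', 'my_robots_meta' );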