Robots.txt ("robots dot txt") is a text file that tells search engines such as Google and Bing which parts of a site should be indexed and which should be skipped.
Robots.txt is a critical file for the success of any store.
Why Do You Need a Robots.txt File?
The robots.txt file is placed in the root directory of your website and tells search engines which pages to skip and which to index. With it, webmasters can instruct search engine spiders not to crawl pages they consider unimportant, such as PDF files or printable versions of pages. This gives the important pages a better chance of being featured in search engine result pages. In other words, robots.txt is a simple way to steer spiders toward the content that matters, producing more relevant search results.
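As a minimal sketch, a robots.txt that keeps crawlers away from a PDF and a printable page might look like this (the two paths are hypothetical examples, not part of a default OpenCart install):

```
User-agent: *
Disallow: /downloads/manual.pdf
Disallow: /index.php?route=product/product/print
```

The `User-agent: *` line means the rules apply to all crawlers; each `Disallow` line names a path prefix that should not be crawled.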
Improving Performance Using Robots.txt in OpenCart
There are several areas where a robots.txt file can help; the two primary reasons for using one are:
- Robots.txt helps prevent duplicate-content issues, one of the key factors for SEO success.
- Robots.txt also lets you hide technical details of your site, such as error logs, SVN files, and other unwanted directories. With these blocked by robots.txt, you are left with clean URLs to be indexed by search engines.
Set Up Robots.txt in OpenCart
Before you set up a robots.txt file, you should know that it covers only one domain at a time, so for multiple stores you have to create a separate robots.txt file for each store. Creating one is simple: it is just a text file, so you can use any text editor such as Notepad, Vim, Dreamweaver, or your favorite code editor.
Once created, the robots.txt file must reside at the root of your site. For example, if your store domain is www.mystore.com, put the file in the domain root (the same directory that contains your OpenCart index.php) so it can be accessed at www.mystore.com/robots.txt. Note that search engines look for robots.txt directly under your store root, not inside a directory, so placing the file in any directory or sub-directory is not wise.
Robots.txt for OpenCart
The following is a well-tested robots.txt file you can use; just edit out the lines that do not apply to your store's setup.
User-agent: *
Disallow: /*&limit
Disallow: /*?limit
Disallow: /*?sort
Disallow: /*&sort
Disallow: /*?order
Disallow: /*&order
Disallow: /*?price
Disallow: /*&price
Disallow: /*?brand_tabletpc
Disallow: /*&brand_tabletpc
Disallow: /*?color_default
Disallow: /*&color_default
Disallow: /*?filter_tag
Disallow: /*&filter_tag
Disallow: /*?mode
Disallow: /*&mode
Disallow: /*?cat
Disallow: /*&cat
Disallow: /*?dir
Disallow: /*&dir
Disallow: /*?color
Disallow: /*&color
Disallow: /*?product_id
Disallow: /*&product_id
Disallow: /*?minprice
Disallow: /*&minprice
Disallow: /*?maxprice
Disallow: /*&maxprice
Disallow: /*?route=checkout/
Disallow: /*?route=account/
Disallow: /*?route=product/search
Disallow: /*?page=1
Disallow: /*&create=1
Disallow: /?route=information/contact
Disallow: /*?route=affiliate/
Disallow: /*?keyword
Disallow: /*?av
Disallow: /admin/
Disallow: /system/
Disallow: /catalog/
Sitemap: http://www.mystore.com/index.php?route=feed/google_sitemap
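Before uploading, you can sanity-check your rules with Python's standard urllib.robotparser module. This is only a quick sketch: the stdlib parser follows the original robots.txt spec and does not understand the `*` wildcard patterns used above, so only plain path prefixes such as /admin/ are checked here.

```python
from urllib import robotparser

# A small subset of the rules above; the stdlib parser only
# understands plain path prefixes, not "*" wildcard patterns.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /system/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# /admin/ is disallowed for all crawlers; the home page is not.
print(rp.can_fetch("*", "http://www.mystore.com/admin/"))     # False
print(rp.can_fetch("*", "http://www.mystore.com/index.php"))  # True
```

To validate the wildcard rules as Google actually interprets them, use Google Search Console's robots.txt report instead of the stdlib parser.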
I hope the above tutorial helps you define what should be indexed and what should be kept out of Google search results. Note that listing the sitemap path in robots.txt is also a good idea. Although this robots.txt file is fairly generic, you can still fine-tune it and drop any rules that do not apply to your store.
Let us know if you have any questions or run into difficulty setting up the robots.txt file in your OpenCart store.