GuidePedia



A robots.txt file is useful if:


  1. You want search engines to ignore duplicate pages on your website.
  2. You don’t want search engines to index your internal search results pages.
  3. You don’t want search engines to index certain areas of your website, or the whole website.
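For example, points 1 and 2 above can each be handled with a Disallow rule. This is only a sketch, assuming your internal search results live under /search/ and your printer-friendly duplicate pages under /print/ (both paths are hypothetical; substitute your site's real paths):

User-agent: *
Disallow: /search/
Disallow: /print/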

Create a robots.txt file:

You can check whether your website already has a robots.txt file by adding /robots.txt immediately after your domain name in your browser's address bar.


How to check:

Type your site's URL and append /robots.txt to it.

Like this: http://dkssd.blogspot.com/robots.txt

This will display your site's robots.txt file. If your site has no robots.txt file, you can copy the example below.
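This manual check can also be scripted. Here is a minimal Python sketch using only the standard library (the function name robots_txt_url is just for illustration); the key point is that robots.txt always lives at the root of the domain:

```python
from urllib.parse import urljoin

def robots_txt_url(page_url: str) -> str:
    # robots.txt always lives at the root of the host,
    # no matter which page of the site you start from.
    return urljoin(page_url, "/robots.txt")

print(robots_txt_url("http://dkssd.blogspot.com/"))
print(robots_txt_url("http://dkssd.blogspot.com/2013/05/some-post.html"))
```

Both calls print http://dkssd.blogspot.com/robots.txt; you could then fetch that URL with urllib.request to see the file's contents.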

User-agent: *
Allow: /

# Disallowed sub-directories
Disallow: /checkout/
Disallow: /website-images/
Disallow: /forum/off-topic/
Disallow: /adult-chat/

Sitemap: http://ihkanwal.blogspot.com/sitemap.xml
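You can sanity-check rules like these with Python's standard urllib.robotparser module. A minimal sketch, using a trimmed version of the rules above (the broad Allow: / line is left out here because Python's parser honours the first matching rule in file order, which would otherwise allow everything):

```python
from urllib.robotparser import RobotFileParser

# Trimmed version of the example rules above (Allow: / omitted,
# since Python's robotparser applies rules in file order).
rules = """\
User-agent: *
Disallow: /checkout/
Disallow: /website-images/
Disallow: /forum/off-topic/
Disallow: /adult-chat/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "/checkout/basket"))  # False: matches Disallow: /checkout/
print(rp.can_fetch("*", "/blog/my-post"))     # True: no rule matches, so allowed
```

Major crawlers resolve conflicts slightly differently (Google uses the most specific matching rule rather than file order), but this is a quick way to confirm a path is blocked as intended.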

Please provide your feedback. Thanks!
