Before installing Magic SEO, a robots.txt was installed with the contents below, limiting the number of indexed pages. Is it a good idea to keep this limitation, or should we let the robots crawl everything?
Old Robots.txt
=============
User-agent: *
# disallow all files in this directory
Disallow: /cgi-bin/

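# Googlebot-specific rules: block caches, account/utility pages, and duplicate parameter or sorted URLs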
User-agent: Googlebot
Disallow: /cache/
Disallow: /contrib/
Disallow: /docs/
Disallow: /extras/
Disallow: /htmlarea/
Disallow: /stats/
Disallow: /sitemap/
Disallow: /graphics/
Disallow: /login.html
Disallow: /privacy.html
Disallow: /conditions.html
Disallow: /contact_us.html
Disallow: /gv_faq.html
Disallow: /discount_coupon.html
Disallow: /unsubscribe.html
Disallow: /shopping_cart.html
Disallow: /ask_a_question.html
Disallow: /popup_image.html
Disallow: /popup_image_additional.html
Disallow: /product_reviews_write.html
Disallow: /tell_a_friend.html
Disallow: /index.php?main_page=create_account
Disallow: /index.php?main_page=popup_image
Disallow: /index.php?main_page=privacy
Disallow: /index.php?main_page=shippinginfo
Disallow: /index.php?main_page=products_new
Disallow: /index.php?main_page=product_reviews
Disallow: /index.php?main_page=conditions
Disallow: /index.php?main_page=contact_us
Disallow: /index.php?main_page=site_map
Disallow: /index.php?main_page=login
Disallow: /index.php?main_page=product_info&cPath=*
Disallow: /*sort*
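For comparison, if the choice were to let the robots crawl everything, the file could shrink to something like the minimal sketch below. This is only an illustration: it keeps the /cgi-bin/ block from the old file, and the Sitemap URL is a placeholder to replace with the shop's real sitemap address.

User-agent: *
# keep only the script directory off-limits
Disallow: /cgi-bin/

# optional: point crawlers at the XML sitemap (placeholder URL, adjust to the real one)
Sitemap: https://www.example.com/sitemap.xml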