Wednesday, September 7, 2011

Robots.txt and Its Use


Every website owner wants search engines to visit their site frequently and index its content, but there are cases when indexing parts of your content is not what you want, or when you don't want search engine bots to crawl certain pages at all. If you have private, sensitive, or confidential information on your website that you do not want the world to see, you will prefer that search engines not index those pages. Moreover, if you want to save some bandwidth by excluding images, style sheets, and JavaScript from indexing, you also need a way to tell spiders to stay away from these objects.


One way out of this problem is the robots meta tag, but it has limitations. With the robots meta tag you can tell search engines which pages on your website to avoid, but the drawback is that not all search engines read meta tags; the robots meta tag can simply go unnoticed by some crawlers. The more complete solution to this problem is robots.txt.
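As a sketch, a robots meta tag sits in the `<head>` of an individual page; the tag values below are standard, though the page it would sit in is of course hypothetical:

```html
<!-- Placed in the <head> of a page you do not want indexed or followed -->
<meta name="robots" content="noindex, nofollow">
```

Because the tag lives inside each page, it can only be applied page by page, and only takes effect for crawlers that fetch the page and honor the tag.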


Robots.txt implements what is known as the Robots Exclusion Standard, also termed the Robots Exclusion Protocol or robots.txt protocol. It is a convention used to keep cooperating web spiders and other web robots from crawling and indexing all, or any specified part, of a publicly viewable website. Robots.txt uses simple directives to tell search engines which pages or content should not be crawled, and you can combine different directives for different requirements.
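As an illustration of those directives, here is a minimal sketch of a robots.txt file; the directory names and the bot name are hypothetical examples, not fixed values:

```
# Block all crawlers from a private area and from bandwidth-heavy assets
User-agent: *
Disallow: /private/
Disallow: /images/
Disallow: /js/

# Block one specific (hypothetical) crawler from the entire site
User-agent: BadBot
Disallow: /
```

Each `User-agent` line starts a group of rules for the named crawler (`*` means all crawlers), and each `Disallow` line names a path prefix that group should not fetch.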

Robots.txt is simply a text file (not an HTML file) that you put in the root of your site to tell search robots which pages you would like them not to visit. It is by no means mandatory for search engines, but well-behaved crawlers generally obey what they are asked not to do, which makes a robots.txt file the better way to tell search engines your wishes.
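To see the convention from the crawler's side, here is a minimal sketch in Python using the standard library's `urllib.robotparser`; the rules and URLs are hypothetical examples:

```python
from urllib import robotparser

# Hypothetical robots.txt content, parsed from a string for illustration.
# A real crawler would fetch https://example.com/robots.txt instead.
rules = """\
User-agent: *
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A polite crawler checks each URL against the rules before fetching it.
print(rp.can_fetch("*", "https://example.com/index.html"))      # True
print(rp.can_fetch("*", "https://example.com/private/a.html"))  # False
```

This is exactly why robots.txt only works for cooperating robots: the check happens in the crawler's own code, so a robot that skips it can still fetch the disallowed pages.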




Click on SEO Services for expert services

Tuesday, September 6, 2011

How to block Sitelinks for your own site



Sitelinks provide helpful, constructive information about the inner pages of a website. Creating links to those inner pages helps both SEO and the user experience.
Sitelinks should be created in a constructive manner; otherwise Google's bots will not crawl those links and your effort will be wasted. So, if the structure of your site doesn't allow Google's algorithms to find good sitelinks, or if they don't think the sitelinks for your site are relevant to the user's query, they won't show them.
If you find that the sitelinks displayed for your site are badly chosen or wrong, you can block them so that they no longer appear in search results.
To block a sitelink, first ensure that you have verified ownership of the site. Then:
  1. On the Webmaster Tools Home page, click the site you want.
  2. Under Site configuration, click Sitelinks. If Google has sitelinks information for your site, it will display a list of sitelinks.
  3. Click Block next to the sitelink you want to remove.
Once you've blocked or unblocked a sitelink, it can take some time for your changes to be visible.
Note: Google only displays sitelinks when a site has a minimum of three sitelinks available. If you have two or fewer unblocked sitelinks, no sitelinks at all will appear in Google search results for your site.

Above information is provided by SEO Dublin