QUOTE(idharam @ Jan 2 2012, 11:00 AM)
What are robots, and what are they used for?
When a search engine bot crawls your website, it first looks for a robots.txt file in the site root. In that file you specify which pages the bot may crawl and which it should skip, and the bot crawls your site accordingly.
You tell Google to crawl all pages like this:

User-agent: *
Disallow:
You tell Google not to crawl the folders below like this:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /private
So if you want Google to crawl all of your website, you can give it full access, and if you want to hide certain folders from search engines, that can be done with robots.txt as well.
Keep in mind: it's robots.txt, not robot.txt.
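If you want to see how a well-behaved bot actually uses this file, here is a minimal sketch using Python's standard urllib.robotparser module. The www.example.com URL and the /private/page path are just placeholders, not a real site:

from urllib.robotparser import RobotFileParser

# Point the parser at the site's robots.txt and download it.
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# Ask whether a given user agent may fetch a given URL.
print(rp.can_fetch("*", "https://www.example.com/index.html"))    # True if the page is not disallowed
print(rp.can_fetch("*", "https://www.example.com/private/page"))  # False if /private is disallowed

This is the same check crawlers like Googlebot perform before fetching each page, which is why the filename matters: if the file is misnamed (robot.txt), bots won't find it and will treat everything as allowed.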