robots.txt is an on-page SEO technique used to give instructions to web robots, also known as web wanderers, crawlers, or spiders. A crawler is a program that traverses a website automatically, which is how popular search engines like Google discover and index a site and its content; robots.txt tells those crawlers which parts of the site they may visit.
Just create a file named robots.txt, put it in the root directory of your website (the same directory as index.htm/index.php), open it with a text editor, and add these lines:
User-agent: * (means the rules apply to all crawlers)
Disallow: /something/ (blocks crawlers from everything under that path; replace /something/ with the directory or page you want hidden)
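
For example, a minimal robots.txt that keeps all crawlers out of a /private/ directory while leaving the rest of the site crawlable could look like this (the /private/ path is just a placeholder, not a required name):

User-agent: *
Disallow: /private/

An empty Disallow: line, or no Disallow rule at all, means the crawler may fetch everything.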
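
If you want to check how a compliant crawler would interpret your rules, Python's standard-library urllib.robotparser can read the file. This is a minimal sketch, assuming your site is reachable at the placeholder domain https://example.com:

    from urllib import robotparser

    # Load and parse the live robots.txt (example.com is a placeholder domain)
    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    # Ask whether the generic rules (User-agent: *) allow these URLs
    print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False if /private/ is disallowed
    print(rp.can_fetch("*", "https://example.com/index.html"))         # True if nothing blocks it

Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access control mechanism.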