Your site's content matters, and crawling is how search engines discover it. A robots.txt file tells search engine robots how they should visit your site: if you want to block a specific search engine, or keep crawlers out of certain directories, robots.txt does the job easily. The robots.txt file is a plain text file placed on your web server. It must live in the root folder, for example: www.yourwebsite.com/robots.txt
If your website does not have a robots.txt file, this is what happens:
A robot comes to visit. It looks for the robots.txt file. It does not find it, because it isn't there. The robot then feels free to visit all your web pages and content, because that is what it is programmed to do in this situation.
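The behavior above can be sketched with Python's standard-library robots.txt parser, `urllib.robotparser`. This is a minimal illustration, not how any particular search engine is implemented; the URLs and the `allowed` helper are made up for the example:

```python
from urllib import robotparser

def allowed(robots_txt: str, url: str, agent: str = "*") -> bool:
    """Check whether `agent` may fetch `url` under the given robots.txt text."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

# No rules at all (an empty or missing robots.txt): everything is allowed.
print(allowed("", "https://www.yourwebsite.com/page.html"))  # True

# Blocking one directory for all robots:
rules = "User-agent: *\nDisallow: /private/\n"
print(allowed(rules, "https://www.yourwebsite.com/private/a.html"))  # False
print(allowed(rules, "https://www.yourwebsite.com/public/a.html"))   # True
```

Well-behaved crawlers do exactly this check before fetching each page; a missing file means nothing is disallowed.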
A minimal robots.txt looks like this:

User-agent: *
Disallow:
In the Disallow line you list the folder you want to block, and in the User-agent line you name the robot the rules apply to. User-agent: * means the rules apply to all robots, and an empty Disallow line blocks nothing.
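Putting both directives together, a robots.txt with more than one rule group might look like the sketch below. The /admin/ directory and the choice of Googlebot are just illustrative examples, not recommendations:

```
# Rules for all robots: keep them out of a hypothetical /admin/ directory
User-agent: *
Disallow: /admin/

# Rules for one specific robot (Googlebot here): block the whole site
User-agent: Googlebot
Disallow: /
```

Each User-agent line starts a new group, and a robot follows the most specific group that matches its name.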