The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the web robot about which areas of the website should not be processed or scanned. Robots are often used by search engines to categorize websites. Not all robots cooperate with the standard; email harvesters, spambots, and malware robots that scan for security vulnerabilities may even start with the portions of the website where they have been told to stay out. The standard is different from, but can be used in conjunction with, Sitemaps, a robot inclusion standard for websites.
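As a minimal sketch of how a cooperating crawler consults these rules, the following uses Python's standard-library `urllib.robotparser` module; the rules and URLs shown are illustrative examples, not taken from any real site.

```python
# Minimal sketch: checking robots.txt rules with Python's standard-library
# urllib.robotparser. The rules and URLs below are hypothetical examples.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# A cooperating crawler calls can_fetch() before requesting a page.
print(parser.can_fetch("ExampleBot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("ExampleBot", "https://example.com/public/page.html"))   # True
```

Since no entry names "ExampleBot" specifically, the wildcard `User-agent: *` group applies, so the `/private/` path is disallowed while everything else remains fetchable.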