What is Robots.txt?
The robots.txt convention, also known as the Robots Exclusion Standard, is a protocol for controlling the access of web robots and crawlers to the publicly visible parts of a website. It is a plain text file, not HTML, that tells search bots which pages of your site they may and may not visit. Strictly speaking, robots.txt is not binding on search engines; well-behaved crawlers simply honor the restrictions it declares. Because crawler robots are automated, they check for the presence of a robots.txt file before entering any page of a site, and that file may keep them out of certain pages. If you want your entire site to be indexed by search engines, you do not need a robots.txt file at all. If you do use one, however, it is critical to place it in the proper spot.
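For instance, a minimal sketch of a file that allows everything (an empty Disallow value blocks nothing, which is equivalent to having no robots.txt at all) would look like this:

User-agent: *
Disallow: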
Location of robots.txt
As far as location is concerned, you must place robots.txt in the root directory; otherwise, search engines cannot find it. A search engine requests mydomain.com/robots.txt from the root directory rather than hunting for a robots.txt file across the whole site. If it finds nothing at the root, it assumes the site has no robots.txt file and begins indexing the entire site. Therefore, always place the robots.txt file in the right spot. The conception and structure of this file were outlined quite a long time ago, but we will discuss them briefly.
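To illustrate, with mydomain.com standing in as a placeholder for your own domain, crawlers consult only the copy at the root:

https://mydomain.com/robots.txt (found and obeyed)
https://mydomain.com/subfolder/robots.txt (ignored; crawlers never look here)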
Structure of a robots.txt file
The structure is simply a list of user agents along with the files and directories they are disallowed from accessing, and it is very straightforward. Let us have a look at the syntax of a robots.txt file:
User-agent:
Disallow:
Search engine crawlers are identified by the "User-agent" line, and the directories and files that should be excluded from indexing are listed under "Disallow." If you want to write a comment line, start it with a # sign, like this:
# All user agents are disallowed from seeing the /temp directory
User-agent: *
Disallow: /temp/
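Building on this, here is a slightly fuller sketch; the crawler name BadBot and the directory paths are hypothetical placeholders:

# Block one specific (hypothetical) crawler from the entire site
User-agent: BadBot
Disallow: /

# Every other crawler may visit everything except the two directories below
User-agent: *
Disallow: /temp/
Disallow: /private/

Each block applies to the user agent named just above it, and blocks are separated by blank lines; a crawler follows the block that matches its own name and otherwise falls back to the * block.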