Robots.txt
Robots.txt is a plain-text file, placed at the root of a website, that tells search engine crawlers which parts of the site they may or may not crawl, following the Robots Exclusion Protocol. By disallowing certain paths, robots.txt helps control which pages crawlers visit and, indirectly, which pages appear in search results. It is often used to keep crawlers away from sensitive or low-value areas, such as admin pages or duplicate content. Note that robots.txt restricts crawling rather than indexing: a disallowed URL can still be indexed if other sites link to it, so it is not a substitute for authentication or a noindex directive.
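As a minimal sketch, a robots.txt file served at the site root (e.g. https://example.com/robots.txt) might look like the following; the paths and sitemap URL are illustrative placeholders, not recommendations:

```
# Rules for all crawlers (illustrative paths)
User-agent: *
Disallow: /admin/        # keep crawlers out of the admin area
Disallow: /search        # avoid crawling internal search result pages
Allow: /admin/help/      # re-allow a public subdirectory of a disallowed path

# Point crawlers to the sitemap (hypothetical URL)
Sitemap: https://example.com/sitemap.xml
```

Compliant crawlers read this file before fetching other URLs and skip the disallowed paths; the rules are advisory, so they offer no protection against crawlers that ignore the protocol.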