Robots.txt is a file placed in the root directory of a website that tells search engines which directories and files of the site they may crawl and include in their index. A sitemap is a simple file that lists all URLs on https://news.rafeeg.ae
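For illustration only, a minimal robots.txt could look like the sketch below; the blocked path (/admin/) and the sitemap location (sitemap.xml) are assumed example values, not paths confirmed to exist on the site.

    # Apply these rules to all crawlers
    User-agent: *
    # Example of a directory excluded from crawling (hypothetical path)
    Disallow: /admin/
    # Everything else may be crawled
    Allow: /
    # Point crawlers to the sitemap (assumed location)
    Sitemap: https://news.rafeeg.ae/sitemap.xml

Listing the sitemap in robots.txt is a common way to help search engines discover all of the site's URLs, alongside submitting it directly in their webmaster tools.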