The robots.txt file tells Googlebot and the other robots that crawl the web which pages and files on our website should (or should not) be crawled and indexed. Although it is not essential, a robots.txt file is a great help to Google and other crawlers when indexing our site, so it is important that it is configured correctly. A minimal example is sketched after the contents list below.

Contents
1 Robots.txt file location
2 Types of robots that can visit our website
3 Editing the robots.txt file
3.1 Blocking a page and lower-level pages
3.2 Blocking a page while maintaining access to lower-level pages
3.3 Blocking a page and all lower-level pages except those we define
3.4 Blocking all…
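As a quick illustration (the directory path and sitemap URL below are hypothetical, chosen only for the example), a minimal robots.txt could look like this:

# Rules for every crawler
User-agent: *
# Block /private/ and everything beneath it
Disallow: /private/
# Everything else remains crawlable
Allow: /

# Optional hint pointing crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml

The User-agent line states which robot the rules apply to (* matches all of them); each Disallow blocks a path, and Allow explicitly permits one. The sections below walk through the blocking patterns in more detail.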