This function creates a robots.txt file that asks web crawlers not to index or crawl the site. This helps keep internal pages out of Google and other search engines, although compliance with robots.txt is voluntary, so it discourages rather than guarantees indexing. By default the file is written to the docs folder of the active project.

make_robots_file(location = here::here("docs"))

Arguments

location

The directory where the robots.txt file will be written. Defaults to the docs folder of the active project.
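
Details

The package determines the exact file contents, but the effect can be illustrated with a minimal sketch. The function body below is an assumption for illustration, not the package's actual implementation; it writes a robots.txt that disallows all user agents.

make_robots_file <- function(location = here::here("docs")) {
  # "User-agent: *" addresses every crawler; "Disallow: /" asks
  # crawlers to skip every path on the site (assumed contents)
  robots <- c("User-agent: *", "Disallow: /")
  writeLines(robots, file.path(location, "robots.txt"))
}

Calling make_robots_file() with no arguments would then produce docs/robots.txt containing those two lines; passing a different location writes the file to that directory instead.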