Now, create a 'robots.txt' file in your root directory. Copy the text above and paste it into that file.
A Robots.txt Generator produces a file that works in the opposite way to a sitemap: where a sitemap lists the pages to be included, robots.txt tells crawlers which pages and directories to exclude. Correct robots.txt syntax is therefore important for any website. Whenever a search engine crawls a site, it first looks for the robots.txt file located at the domain root. Once found, the crawler reads the file and identifies which files and directories are blocked from crawling.
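As a sketch of what such a generated file might contain, here is a minimal robots.txt (the domain, sitemap URL, and paths are hypothetical examples, not values produced by any particular generator):

```
# robots.txt served at https://example.com/robots.txt
# Applies to all crawlers
User-agent: *
# Block these directories from being crawled
Disallow: /admin/
Disallow: /tmp/
# Everything else remains crawlable
Allow: /

# Point crawlers at the sitemap of pages to include
Sitemap: https://example.com/sitemap.xml
```

Note that `Disallow` rules discourage crawling but are advisory; they are not an access-control mechanism.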