Published By: Admin
One way to tell search engines which files and folders on your web site to avoid is with the Robots meta tag. However, since not all search engines read meta tags, the Robots meta tag can simply go unnoticed. A better way to inform search engines about your wishes is to use a robots.txt file.
Robots.txt is a text (not HTML) file you put on your web site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall, or a form of password protection). Placing a robots.txt file is something like putting a note "Please, do not enter" on an unlocked door – you cannot stop thieves from coming in, but the good guys will not open the door and enter. That is why we say that if you have really sensitive data, it is too naive to rely on robots.txt to protect it from being indexed and displayed in search results.
The location of robots.txt is very important. It must be in the main directory, because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory, and if they do not find it there, they simply assume that the site does not have a robots.txt file and therefore index everything they find along the way. So, if you do not put robots.txt in the right place, do not be surprised that search engines index your whole site.
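As an illustration (using the placeholder domain www.example.com), crawlers check only the root of the site:

```text
https://www.example.com/robots.txt          # this is the only location crawlers check
https://www.example.com/pages/robots.txt    # never checked - a file here is simply ignored
```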
The structure of a robots.txt file is pretty simple: it is an open-ended list of user agents and disallowed files and directories. Basically, the syntax is as follows:
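A minimal robots.txt following this pattern might look like the sketch below (the directory names are hypothetical examples, not recommendations):

```text
User-agent: *
Disallow: /cgi-bin/
Disallow: /temp/
```

Here `User-agent: *` addresses all crawlers, and each `Disallow:` line names one path they are asked to skip.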
"User-agent:" names the search engines' crawlers, and "Disallow:" lists the files and directories to be excluded from indexing. In addition to "User-agent:" and "Disallow:" entries, you can include comment lines – just put the # sign at the beginning of the line:
# All user agents are disallowed to see the /temp directory.
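Completing that comment with its directives, the full entry would read as follows (the /temp directory is purely illustrative):

```text
# All user agents are disallowed to see the /temp directory.
User-agent: *
Disallow: /temp/
```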