Robots.txt file (C#, VB and ASP.NET) | Free code
Here you can download free code to create a robots.txt file that you can upload to your web server to allow or restrict search engine access to your pages and files at a general level.
A robots.txt file is uploaded to the root folder of a website on a web server and is used to permit or limit search engine access to different files and web pages. For example, a robots.txt file might be used to restrict search engine access to PDF files on the website so that they do not get indexed in search engines. If there is no robots.txt file on the website, all pages and files on the website will be indexed by search engines, provided that no web page contains a robots meta tag indicating that the page should not be indexed.
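As an illustration of the basic syntax (this is a sketch, not the downloadable template itself), a minimal robots.txt that asks all crawlers to stay out of a hypothetical /pdf/ folder looks like this; the first line names which crawlers the rules apply to, and each Disallow line names a path they should not fetch:

```
# Apply the rules below to all crawlers
User-agent: *
# Do not crawl anything under the /pdf/ folder
Disallow: /pdf/
```

An empty `Disallow:` line would instead mean "nothing is restricted", i.e. full access for that crawler.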
In this template with free code for a robots.txt file, we allow search engines to access all web pages on the website while restricting access to files with specific extensions such as xls, pdf and zip. The purpose of restricting access to these files is that we do not want them indexed in search engines; we want to ensure that users find and visit our web pages instead.
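A robots.txt with the behavior described above could be sketched as follows. Note that the `*` wildcard and the `$` end-of-URL anchor are extensions to the original robots.txt convention; they are honored by the major search engines such as Google and Bing, but not necessarily by every crawler:

```
# All crawlers: pages are allowed, but do not crawl
# files with these extensions anywhere on the site
User-agent: *
Disallow: /*.xls$
Disallow: /*.pdf$
Disallow: /*.zip$
```

Keep in mind that robots.txt only asks well-behaved crawlers not to fetch these files; it is not an access control mechanism and does not hide the files from visitors who have a direct link.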
Remember that your web hosting account must be set to the correct version of ASP.NET for your website (the version listed in the Web.config file). We are using Visual Web Developer 2005 (2010) Express Edition for web programming.
01/01/2015 | Created by All-templates.biz
Download Robots.txt file (C#, VB and ASP.NET) | Free code »
Tags: web code