Webmasters should not ignore the use of Robots

I have always stressed the optimization of details, and what Baidu now asks of a website is exactly how well those details are done: code, tags and so on are all details. Robots is part of a website's detail work as well, and doing it well is a great help to our site. Many new webmasters may not know what Robots is, so below I will walk through some common operations with Robots.

one, the origin of Robots.txt

First of all we must understand that Robots is not a command or an instruction. Robots is an agreement between a website and the search engines, a third-party agreement whose content is exactly what is written in Robots.txt. In the early days it was used to protect privacy on websites, and it is a plain TXT file that sits in the root directory of our site.
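
As a rough sketch of what that written agreement looks like (the spider names are real, but every path here is only a placeholder, not a recommendation for any particular site):

    # Rules for every spider
    User-agent: *
    # Ask spiders not to crawl anything under this placeholder directory
    Disallow: /private/

    # Rules that apply only to Baidu's spider
    User-agent: Baiduspider
    Disallow: /test/

Each record names the spider it applies to with User-agent and lists the paths that spider is asked not to crawl with Disallow; a Disallow: line with nothing after it means nothing is blocked for that spider.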

two, the establishment of Robots.txt

Create a Notepad file locally, name it Robots.txt, and then put that file into the root directory of our website; with that, our Robots.txt is established. Some open-source programs, such as DedeCMS, already come with a Robots file, and we only need to modify it.
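
A quick way to check the file is in the right place, using a purely hypothetical domain: spiders always request the lowercase name robots.txt directly under the root, so after uploading, the file should open in a browser at

    http://www.example.com/robots.txt

If it only exists inside a subdirectory, or only under a different name, the spiders will not find it.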

three, the effect of Robots.txt

When our site goes online there are many unavoidable factors that get released into the search engines, and they lower the overall quality of our pages in the search engines and leave a poor impression of the site. The role of Robots is to shield these unavoidable factors so that the spider does not release them. So which pages should we shield? Four common cases are listed below, and a combined Robots.txt sketch follows after the last one.

1. Shield the no-content pages: an example will make this clear. Pages such as the registration page, the login page, the shopping-cart page, the posting page, the message-board page and the search page have no content of their own, and if you have made a 404 error page it should be shielded as well.

2. Shield duplicate pages: if we find that our website presents the same page in two different ways, that is, two URLs for the same content, we use Robots to shield one of them. The spider will still crawl it but will not release it into the results, and we can check the number of blocked pages directly in Google Webmaster Tools.

3. Shield some dead-link pages: we only need to shield the dead links that share a common feature. The spider failing to crawl an address does not mean it cannot pick that address up; whether an address can be picked up and whether it can be crawled are two different concepts. Of course, dead links that we can deal with do not need to be shielded; the ones we cannot deal with in that way are the dead links we need to shield.

4. Shield overly long paths: paths that run past the length of the URL input box can be shielded with Robots.
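
To pull the four cases above together, here is a minimal sketch of what such a Robots.txt might look like. Every path and pattern below is a made-up placeholder that has to be swapped for the real paths of your own site, and the * wildcard is an extension supported by the major spiders such as Baidu and Google rather than part of the original agreement.

    User-agent: *
    # 1. No-content pages: registration, login, cart, search, 404 (placeholder paths)
    Disallow: /register/
    Disallow: /login/
    Disallow: /cart/
    Disallow: /search/
    Disallow: /404.html
    # 2. One of the two versions of a duplicated page, e.g. a print version (placeholder)
    Disallow: /*?print=
    # 3. Dead links that share a common feature, e.g. a deleted column (placeholder)
    Disallow: /old-column/
    # 4. Overly long dynamic paths carrying many parameters (placeholder pattern)
    Disallow: /*&*&*&

Once the file is live, the blocked-page count in Google Webmaster Tools (mentioned in point 2) is an easy way to confirm the rules hit the pages you intended and nothing more.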


