Fetch as Googlebot
Are you tired of waiting for Google to come back and crawl your site after an update? Enter Fetch as Googlebot. This little gem lets you ask Google to re-crawl a page on demand. It also lets you see any web page exactly as Google sees it, which makes it a great tool for finding and identifying problematic pages. For example, if your site has been hacked and is being used to display content from another website (a form of cloaking, not to be confused with iframes), Fetch as Googlebot allows you to see this. The tool lets you fetch up to 500 URLs per week.
Advanced Data in Index Status
While the basic tab displays the total number of pages in Google's index, that information alone won't help you identify all crawl issues. To dig deeper, go to the Advanced tab in Crawl > Index Status. Click Advanced, select the data you want, and then click Update to refresh the page. You will see all the URLs on your site that Google has ever crawled, as well as the URLs Google is unable to crawl because of robots.txt. If you are experiencing ranking and indexation problems, these three data points (total indexed, ever crawled, and blocked by robots.txt) can help you pinpoint problem pages on your site.
When using this data, pay close attention to your indexed-page count. If you know your site has 1,343 pages (including your blog) but the pages-indexed number reads 4,533, Google may be indexing the search results pages generated by your on-site search plug-in. That creates duplicate content, which in turn can cause serious ranking problems.
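If internal search results turn out to be the culprit, one common fix is to keep crawlers out of those URLs with robots.txt. Here is a minimal sketch; the /search/ path and the ?s= parameter are assumptions for illustration, so check the URL pattern your own plug-in actually generates:

    # Hypothetical robots.txt rules to keep internal search result pages out of the crawl
    # (/search/ and ?s= are placeholder patterns; match them to your plug-in's URLs)
    User-agent: *
    Disallow: /search/
    Disallow: /*?s=

Keep in mind that robots.txt only stops crawling; pages that are already indexed may also need a noindex tag or a removal request before the indexed count drops.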
Get Back Your Keyword Data With Google Custom Search
If you are still upset by the loss of keyword referral data in Google Analytics due to "(not provided)", there is a simple workaround (although it won't exactly mirror Google's own search data): install Google Custom Search on your site. Using the Custom Search section under Other Resources in Webmaster Tools, you can build a completely tailored search experience for your site. This lets you see what visitors are searching for on your site, and you can use that data to inform decisions about your search campaigns. It's a great way to recover some of that keyword data, because it shows what your users are really looking for on your site.
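For reference, embedding the search box takes only a couple of lines once you have created a search engine in the Custom Search control panel. The lines below are a sketch of the standard embed code, not necessarily the exact snippet Google hands you, and YOUR_SEARCH_ENGINE_ID is a placeholder for the ID from your own control panel:

    <!-- Google Custom Search embed; YOUR_SEARCH_ENGINE_ID is a placeholder -->
    <script async src="https://cse.google.com/cse.js?cx=YOUR_SEARCH_ENGINE_ID"></script>
    <div class="gcse-search"></div>

Once it's in place, you can review the queries visitors run through it and feed that back into your keyword research.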
Blocked URLs
This section is especially useful for uncovering crawl issues caused by errors in the website's robots.txt file. For example, this tool will help you discover whether someone created a robots.txt file that unintentionally blocks pages because of typos in the directives. By using this hidden gem, you can uncover simple (and sometimes more complex) crawl issues that arise from improper robots.txt directives.
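To make the kind of mistake this report catches more concrete, here is a hypothetical before-and-after. The intent was to block a single /private/ directory, but a truncated rule ends up blocking the whole site:

    # Intended rule: block only the /private/ directory
    User-agent: *
    Disallow: /private/

    # Typo'd rule: the path was cut off, so every URL on the site is now blocked
    User-agent: *
    Disallow: /

Because the Blocked URLs report lets you test specific URLs against your robots.txt, a mistake like this shows up before it does lasting damage.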
Context: The Last Hidden Gem of GWT
One last hidden gem in Google Webmaster Tools, in your quest to identify and correct website errors, is understanding the real-world context of the problems the tool reports. While indexation data can reveal issues such as too many indexed pages, be careful that another site or design issue isn't what is actually interfering.
Find out about these issues by doing your due diligence and investigating problems you aren't sure about. Ask questions, and follow up with your web developer or IT director when there are major issues. Context can mean the difference between getting issues fixed on time and within budget, and simply creating more problems.
As you learn the intricacies of GWT, it will become easier to use and you will be able to identify issues more quickly. These suggestions are just the tip of the iceberg, but they can help you identify some serious site issues without having to spend more money on additional tools.