Site Errors Breakdown
Webmaster level: All
Today we’re announcing more detailed Site Error information in Webmaster Tools. This information is useful when looking for the source of your Site Errors. For example, if your site suffers from server connectivity problems, your server may simply be misconfigured; then again, it could also be completely unavailable! Since each Site Error category (DNS, Server Connectivity, and Robots.txt Fetch) comprises several unique issues, we’ve broken down each category into more specific errors to provide you with a better analysis of your site’s health.
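To make the three categories concrete, here is a rough sketch of what each one checks, written as a small Python health probe. This is only an illustration of the failure modes the categories describe, not Googlebot's actual crawl logic, and the function name is hypothetical:

```python
import socket
import urllib.error
import urllib.request

def check_site_health(hostname):
    """Illustrative probe for the three Site Error categories:
    DNS, Server Connectivity, and Robots.txt Fetch.
    (A sketch only; Googlebot's real checks are more involved.)"""
    report = {}

    # DNS: can the hostname be resolved at all?
    try:
        socket.gethostbyname(hostname)
        report["dns"] = "ok"
    except socket.gaierror as e:
        report["dns"] = f"error: {e}"

    # Server Connectivity: does the server accept a TCP connection on port 80?
    try:
        with socket.create_connection((hostname, 80), timeout=5):
            report["connectivity"] = "ok"
    except OSError as e:
        report["connectivity"] = f"error: {e}"

    # Robots.txt Fetch: is robots.txt reachable? Any HTTP response
    # (even 404) means the fetch itself succeeded.
    try:
        urllib.request.urlopen(f"http://{hostname}/robots.txt", timeout=5)
        report["robots_txt"] = "ok"
    except urllib.error.HTTPError as e:
        report["robots_txt"] = f"http {e.code}"
    except (urllib.error.URLError, OSError) as e:
        report["robots_txt"] = f"error: {e}"

    return report
```

A misconfigured server would typically pass the DNS check but fail the connectivity or robots.txt check, which is exactly the kind of distinction the new breakdown surfaces.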
Site Errors will display statistics for each of your site-wide crawl errors from the past 90 days. In addition, they will show the failure rates for any category-specific errors that have been affecting your site.
If you’re not sure what a particular error means, you can read a short description of it by hovering over its entry in the legend. You can find more detailed information by following the “More info” link in the tooltip.
We hope that these changes will make Site Errors even more informative and helpful in keeping your site in tip-top shape. If you have any questions or suggestions, please let us know through the Webmaster Tools Help Forum.
Written by Cesar Cuenca and Tiffany Wang, Webmaster Tools Interns
Search Queries Alerts in Webmaster Tools
We know many of you check Webmaster Tools daily (thank you!), but not everybody has the time to monitor the health of their site 24/7. It can be time-consuming to analyze all the data and identify the most important issues. To make this a little easier, we’ve been incorporating alerts into Webmaster Tools. We process the data for your site and try to detect the events that could be most interesting for you. Recently we rolled out alerts for Crawl Errors, and today we’re introducing alerts for Search Queries data.
The Search Queries feature in Webmaster Tools shows, among other things, impressions and clicks for your top pages over time. For most sites, these numbers follow regular patterns, so when sudden spikes or drops occur, it can make sense to look into what caused them. Some changes are due to differing demand for your content; others may be due to technical issues that need to be resolved, such as broken redirects. For example, a steady stream of clicks which suddenly drops to zero is probably worth investigating.
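The "steady stream that suddenly drops" pattern can be expressed as a simple heuristic: compare each day's clicks against a trailing average. The sketch below is one plausible way to flag such days; it is not the alerting logic Webmaster Tools actually uses, and the threshold values are illustrative assumptions:

```python
def flag_click_anomalies(daily_clicks, window=7, drop_ratio=0.5):
    """Flag days whose clicks fall below a fraction of the trailing average.

    daily_clicks: list of (date, clicks) pairs in chronological order.
    window: number of preceding days to average over.
    drop_ratio: flag a day when clicks < drop_ratio * trailing average.
    (An illustrative heuristic only, not the product's alert logic.)
    """
    alerts = []
    for i in range(window, len(daily_clicks)):
        recent = [clicks for _, clicks in daily_clicks[i - window:i]]
        avg = sum(recent) / window
        date, clicks = daily_clicks[i]
        if avg > 0 and clicks < avg * drop_ratio:
            alerts.append((date, clicks, avg))
    return alerts

# A steady week of clicks followed by a sudden drop to zero:
data = [(f"day{i}", 100) for i in range(7)] + [("day7", 0)]
print(flag_click_anomalies(data))  # [('day7', 0, 100.0)]
```

A real system would also need to handle weekly seasonality and gradual declines, which is part of why tuning the sensitivity threshold (mentioned below) is hard.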
The alerts look like this:
We’re still working on the sensitivity threshold of the messages and welcome your feedback in our help forums. We hope the new alerts will be useful. Don’t forget to sign up for email forwarding to receive them in your inbox.
Posted by Javier Tordable, Tech Lead, Webmaster Tools
Configuring URL Parameters in Webmaster Tools
Webmaster Level: Intermediate to Advanced
We recently filmed a video (with slides available) to provide more information about the URL Parameters feature in Webmaster Tools. The URL Parameters feature is designed for webmasters who want to help Google crawl their site more efficiently, and who manage a site with — you guessed it — URL parameters! To be eligible for this feature, the URL parameters must be configured in key/value pairs like item=swedish-fish or category=gummy-candy in the URL http://www.example.com/product.php?item=swedish-fish&category=gummy-candy.
Guidance for common cases when configuring URL Parameters. Music in the background masks the ongoing pounding of my neighbor’s construction!
URL Parameter settings are powerful. By telling us how your parameters behave and the recommended action for Googlebot, you can improve your site’s crawl efficiency. On the other hand, if configured incorrectly, you may accidentally recommend that Google ignore important pages, resulting in those pages no longer being available in search results. (There’s an example provided in our improved Help Center article.) So please take care when adjusting URL Parameters settings, and be sure that the actions you recommend for Googlebot make sense across your entire site.
Written by Maile Ohye, Developer Programs Tech Lead