The most important section we should pay attention to is the "Crawl
Errors" one, under the "Diagnostics" link. This basically will tell
you if Google is facing any problems while crawling your site. If it
is, you should try to fix the issue immediately, or your search
rankings could suffer.
Not all errors are equally serious. The "Not Found" ones,
for example, might just be other site owners linking to a wrong
URL on your site. Similarly, "Restricted by robots.txt" errors can
be perfectly legitimate if you deliberately disallowed crawling of
certain areas of your site. Be careful with the "Timed out" and
"Unreachable" errors, though, because they mean that the Google bot
is not reaching your pages correctly. If you get too many of these
errors, your whole site might get de-indexed.
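If "Restricted by robots.txt" errors show up, you can double-check locally that each blocked URL is one you actually meant to disallow. Here is a minimal sketch using Python's standard urllib.robotparser; the rules and URLs are made-up examples, not real report data:

```python
from urllib.robotparser import RobotFileParser

# A copy of your site's robots.txt rules (example rules only).
rules = [
    "User-agent: *",
    "Disallow: /admin/",
    "Disallow: /drafts/",
]

rp = RobotFileParser()
rp.parse(rules)

# URLs reported under "Restricted by robots.txt" (hypothetical).
blocked = [
    "http://example.com/admin/login",
    "http://example.com/blog/my-post",
]

for url in blocked:
    if rp.can_fetch("Googlebot", url):
        # Allowed by the current rules: the report entry may be
        # stale, or the rule that blocked it was since removed.
        print("unexpected:", url)
    else:
        print("intentional:", url)
```

Any URL printed as "unexpected" deserves a closer look, since your current rules no longer block it.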
After exploring the "Crawl Errors" section you should take a look
at "Crawl stats" and "HTML suggestions", both under the
"Crawl stats" will show you how often the Google bot is visiting
your site. The graph should be stable, if not growing. If you see
a downward trend, something is wrong. Remember that the more often
the Google bot visits your site, the higher your authority in
Google's eyes.
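To make "downward trend" concrete, one simple check is to compare the average daily crawl count of the most recent week against the week before. A quick sketch with made-up numbers (the 10% threshold is an arbitrary choice, not anything Google publishes):

```python
# Daily Googlebot crawl counts, oldest first (made-up numbers).
crawls = [120, 118, 125, 130, 122, 119, 127,   # prior week
          110, 105, 98, 102, 95, 99, 92]       # most recent week

prior = sum(crawls[:7]) / 7    # average crawls/day, prior week
recent = sum(crawls[7:]) / 7   # average crawls/day, recent week

# Flag a drop of more than 10% between the two weeks.
if recent < prior * 0.9:
    print("downward trend: %.0f -> %.0f crawls/day" % (prior, recent))
```

In practice you would paste in the numbers from the "Crawl stats" graph; the point is just to compare week over week rather than reacting to a single bad day.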
The "HTML suggestions" part will report problems with your meta
tags. A very common problem webmasters have is a large number of
duplicate title tags. If you have, take a look at your blog
structure to solve the problem.
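You can spot duplicate titles yourself by extracting the title tag of each page and counting repeats. A minimal sketch using only the standard library; the pages here are hypothetical HTML strings, where in practice you would fetch your own URLs:

```python
from collections import Counter
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Collects the text inside the first <title> tag of a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_data(self, data):
        if self.in_title:
            self.title += data

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

def extract_title(html):
    parser = TitleExtractor()
    parser.feed(html)
    return parser.title.strip()

# Hypothetical pages keyed by path (example data only).
pages = {
    "/post-1": "<html><head><title>My Blog</title></head></html>",
    "/post-2": "<html><head><title>My Blog</title></head></html>",
    "/about":  "<html><head><title>About Me</title></head></html>",
}

counts = Counter(extract_title(html) for html in pages.values())
duplicates = [t for t, n in counts.items() if n > 1]
print(duplicates)  # titles shared by more than one page
```

Every title that shows up in the duplicates list is one that "HTML suggestions" would likely flag, so fixing these usually means giving each page a unique, descriptive title.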
That is about it. There are other interesting sections inside
the Google Webmaster Tools, but the ones I mentioned above are
directly related to the performance of your site inside Google,
so you should keep an eye on them regularly.