Google Search Console
I monitor the Impressions and Clicks that my pages receive daily. Pages that I consider relevant to the current situation get upgraded to the new design format.
What I am trying to do here is to maintain pages that relate to the various reports generated by GSC. These pages are not meant to be a comprehensive guide to using GSC or to fixing every error you will see in it; they are an analysis of the reports that I have seen for this website.
Pages that have been converted to the new design have been marked with a
An analysis of some of the reports:
- Canonical - duplicate content and URLs
- Indexed, not submitted in sitemap - possibly the first issue to clear up
- Redirect Error - Google Search Console - this comes from a number of reports (see below)
- Discovered - currently not indexed - a similar GSC report category
- Crawled - currently not indexed
- Excluded - Page with redirect - different from a Redirect Error
- Submitted URL not found (404) - seems to be prevalent when converting a page to the new layout
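The canonical report in the list above usually comes down to telling Google which of several duplicate URLs is the preferred one. A minimal sketch of how that is commonly declared (the domain and path here are placeholders, not real pages on this site):

```html
<!-- In the <head> of every duplicate or variant of the page -->
<link rel="canonical" href="https://www.example.com/google-search-console.html">
```

Google treats the href URL as the preferred version when consolidating duplicates, which can clear "duplicate" canonical entries over time.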
Some other pages on GSC:
- Clicks in Search Console but no Queries
- How to use the Google Search Console "Performance" reports
- Mobile Usability - How to fix "Text too small to read" and "Clickable elements too close together"
- The Validation Cycle
- Submitting a new page to Google
- Using your re-indexing quota
- When Impressions become Clicks
- Using Google Analytics with Search Console
- Using the Removal Tool
A re-review - February 2021
Surprisingly, looking at the Google Search Console interface again, I see that I can actually see search terms once more. This was something that Google had removed from the Analytics reports.
Actually, I don't really care whether Google indexes my pages as I am not running a business. However, it is good that I have a continued interest in what Google are up to. It is pretty unlikely that my pages will be found by anyone making a search, largely due to the non-uniqueness of my content. I was starting to revisit sitemaps and the analysis of my server logs. I added a text-based sitemap and a robots.txt file, as I could see that they were being requested by bots such as Googlebot, Bing and Ahrefs.
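The text-based sitemap and robots.txt mentioned above can be very simple. A minimal sketch, assuming both files sit in the site root (the domain is a placeholder):

```
# robots.txt - allow all crawlers and point them at the sitemap
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.txt
```

A text sitemap is just one absolute URL per line, UTF-8 encoded, with nothing else in the file.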
Although I administer two local websites, the Parish Council and the Village Hall, I am not paid to do so and it is not a business. I know that the traffic to both sites is minimal, but they can both be found if an Internet user searches for either by name. This is partly because the sites have domains that contain the name of the Parish, as do the names of the sites. They both rank pretty high on Google if you make a search for them. However, the residents of the community are far more likely to use Facebook. Apart from a sitemap and a robots.txt file for the Parish Council site, I have performed no more SEO on them. The Village Hall is a WordPress.com site and ranks on Google because it is hosted on an Automattic server; the SEO is taken care of by the system and I have little control over such matters.
Do I still need the Analytics Tracking code?
The pages where I see the search terms may still have the tracking code; I will have to check. I know that my page on my arch-nemesis had some hits, either from him or from someone looking for him and what he may be up to! I will have to look at this in more depth.
I also saw a report for search terms where another term was for FU toolkit (or something like that); the page was Software Backdoors and it DID have the tracking code.
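When checking whether a page still carries the tracking code, the thing to look for is Google's global site tag in the page head. A sketch of its usual shape (the UA-XXXXXXX-1 measurement ID is a placeholder, not this site's real ID):

```html
<!-- Global site tag (gtag.js) - Google Analytics -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXX-1"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'UA-XXXXXXX-1');
</script>
```

A quick search of each page's source for "gtag" or "googletagmanager" shows which pages are still being tracked.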
Some of my initial thoughts on using the Google Search Console
The Search Console is a much improved and more interactive interface. Once you have registered ownership of your website with Google, you have many more reports than you did with Analytics. It is early days yet and I will probably make new pages on specific issues that I find as I go along. It is too early to make a pronouncement on whether the console has helped in getting my pages seen.
There are ways of inspecting specific URLs: you can get feedback on individual pages, request that they be indexed and see if they are included in a sitemap. I must admit that I have seen a few confusing issues, such as a page that I know was "mobile friendly" being reported as not being so. On resubmission the page was "mobile friendly".
Although Google tell you that you can make a request to reindex a URL, there is a limit to the number you can request in a day. This limit applies across all of the websites you have control of - I have three websites: this site, the Parish Council and the Village Hall (bwvh.uk).