Google has just made some changes to Sitemaps. A few things there are worth noting:
1. When you log in, if you’ve already been logged into another Google service you use, it asks whether you want to use a separate account for Sitemaps – and warns that you can only be logged into one account at a time. Nice enough of G to tell us 😉
2. Now you don’t even need to create an XML sitemap to use the service – all you do is upload to your server a file named with a unique code Google gives you; this is how they verify you are the true owner of the domain. OK, this is all fine and pretty secure – but it shows how badly they want to know who owns what!
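Mechanically, the verification step is trivial. Here's a rough sketch of it in Python – the verification code, filename format, and document root below are all made up for illustration; Google's actual naming scheme may differ:

```python
import urllib.request
from pathlib import Path

# Hypothetical values: Google hands you the real code in the Sitemaps
# interface; this filename is invented just to show the mechanics.
VERIFICATION_CODE = "GOOGLE1a2b3c4d5e6f"
DOCROOT = Path("/var/www/orange-revolution.netfirms.com")
SITE = "http://orange-revolution.netfirms.com"

# Step 1: drop the uniquely named (empty) file into the web root.
verification_file = DOCROOT / f"{VERIFICATION_CODE}.html"
verification_file.write_text("")

# Step 2: confirm it is publicly reachable – i.e. what Googlebot will
# check when you click "Verify" in the Sitemaps interface.
url = f"{SITE}/{verification_file.name}"
status = urllib.request.urlopen(url).status
print(f"{url} -> HTTP {status}")  # expect 200 before asking Google to verify
```

If the file fetches cleanly, only someone with write access to the server could have put it there – which is the whole point of the check.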
The new information provided for domains submitted to Google Sitemaps includes page-crawling problems and other Googlebot stats. That’s not so new really, as all these stats are available through the old queries; only now they are listed like this:
- Indexed pages in your site: site:orange-revolution.netfirms.com/
- Pages that refer to your site’s URL: allinurl:orange-revolution.netfirms.com/
- Pages that link to your site: link:orange-revolution.netfirms.com/
- The current cache of your site: cache:orange-revolution.netfirms.com/
- Information we have about your site: info:orange-revolution.netfirms.com/
- Pages that are similar to your site: related:orange-revolution.netfirms.com/
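Since these are all ordinary searches, you can rebuild the links yourself from Google's standard search URL pattern. A minimal sketch, using the operators from the list above (the descriptions are just labels; nothing here calls any special API):

```python
from urllib.parse import quote_plus

DOMAIN = "orange-revolution.netfirms.com/"

# The query operators from the list above, paired with what they return.
QUERIES = {
    "site": "indexed pages in your site",
    "allinurl": "pages that refer to your site's URL",
    "link": "pages that link to your site",
    "cache": "the current cache of your site",
    "info": "information Google has about your site",
    "related": "pages that are similar to your site",
}

for operator, description in QUERIES.items():
    query = f"{operator}:{DOMAIN}"
    url = f"http://www.google.com/search?q={quote_plus(query)}"
    print(f"{description}: {url}")
```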
Considering the bad reputation the site: query has had lately… Hmm…
Another question that pops up in my head is whether adding the Google-coded file to the server will be as lethal for black-hat sites as using Google Sitemaps proved to be. In any case, I’m not in a hurry to try it out until I hear people’s reports on this.