
Tuesday, November 04, 2014

Blogs To Have Automatically Generated Sitemaps

Last week, Blogger gave us a feature that blog owners have been asking about for many years.

The previous sitemap, based on the blog posts feed, has been replaced by an automatically generated, dedicated sitemap. You can see one, for this blog, as an example.

Accompanying the new sitemap will be an updated "robots.txt" file.

The new sitemap will be very simple.

The sitemap will include 2 data elements / post.

  • Post URL
  • Last updated date / time (UTC).
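As an illustration (not copied from an actual Blogger sitemap - the URL and timestamp here are made up), a single post entry in a standard sitemap carries just those two elements:

```xml
<url>
  <loc>https://example.blogspot.com/2014/11/sample-post.html</loc>
  <lastmod>2014-11-04T12:00:00Z</lastmod>
</url>
```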

The new sitemaps offer interesting diagnostic possibilities, for various blog problems.

With the posts newsfeed out of the picture, sitemap access becomes much cleaner.

Since these data elements are now available directly, without searching through the post content in the newsfeed, any process that indexes or searches using them will be much simpler - and more stable, when run.

My suspicion is that several Blogger / Google features, no longer requiring the blog feed for indexing, will become much more usable. Blogs which use dynamic templates, the Reading List, and search engine indexing will eventually benefit.
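To see how much simpler such a process becomes, here is a minimal sketch (not Blogger's actual code) of reading the two data elements - post URL and last-updated time - from one sitemap page. The sample XML and its URL are made up; the namespace is the standard sitemaps.org one.

```python
# Sketch: extract (URL, last-updated) pairs from one sitemap page.
import xml.etree.ElementTree as ET

# Hypothetical sample of one sitemap page.
SAMPLE = """<?xml version='1.0' encoding='UTF-8'?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.blogspot.com/2014/11/sample-post.html</loc>
    <lastmod>2014-11-04T12:00:00Z</lastmod>
  </url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def entries(xml_text):
    """Return (url, lastmod) pairs from one sitemap page."""
    root = ET.fromstring(xml_text)
    return [(u.findtext("sm:loc", namespaces=NS),
             u.findtext("sm:lastmod", namespaces=NS))
            for u in root.findall("sm:url", NS)]

print(entries(SAMPLE))
```

No scraping of post bodies, no feed pagination - two fields per post, straight out of the XML.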

Accompanying the new sitemap, which will index posts, will be a sitemap for static pages. You can see a pages sitemap, for this blog, as an example.

The pages sitemap appears to have 2 data elements / static page.

  • Page URL
  • Published date / time (UTC).

You will see the new sitemap specified in the "robots.txt" file.

Check the "robots.txt" file on your blog. When the sitemap is installed on your blog, you will see the change.
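For illustration, a Blogger "robots.txt" file referencing the new sitemap looks roughly like this (the blog address is a placeholder, and your file may differ in detail):

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://example.blogspot.com/sitemap.xml
```

The "Sitemap:" line at the bottom is the change to look for.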

If you're unfamiliar with the concept, you may read my other posts in this blog - or the Webmaster Tools Help: Learn about sitemaps.

Now, we can do other things with the blog feed, without impeding indexing. Possibly, even private blogs can now be indexed.

Large sitemaps will be broken into pages.

Any sitemap with over 150 entries (pages or posts) will be broken into pages - 150 entries / sitemap page, automatically.
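The paging rule above can be sketched in a few lines (this is my arithmetic, not Blogger's code):

```python
# Sketch of the paging rule: entries are split into sitemap pages
# of 150 each.
import math

PAGE_SIZE = 150  # entries per sitemap page

def sitemap_page_count(total_entries):
    """How many sitemap pages a blog with this many entries needs."""
    return max(1, math.ceil(total_entries / PAGE_SIZE))

def page_of_entry(entry_index):
    """Which sitemap page (1-based) holds the entry at 0-based entry_index."""
    return entry_index // PAGE_SIZE + 1
```

So a blog with 150 posts needs 1 sitemap page, and a blog with 151 posts needs 2.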

Examine the posts sitemap, for this blog - as of 5 June, 2016.

<?xml version='1.0' encoding='UTF-8'?>
<sitemapindex xmlns="">
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
  <sitemap><loc></loc></sitemap>
</sitemapindex>

The most current posts will be listed on Page 1.
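A sitemap index like the one shown above can be read the same way as a sitemap page. This sketch assumes the standard sitemaps.org namespace and made-up page URLs (Blogger's per-page URLs may be formed differently):

```python
# Sketch: list the per-page sitemap URLs from a sitemap index.
import xml.etree.ElementTree as ET

# Hypothetical two-page sitemap index.
INDEX = """<?xml version='1.0' encoding='UTF-8'?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.blogspot.com/sitemap.xml?page=1</loc></sitemap>
  <sitemap><loc>https://example.blogspot.com/sitemap.xml?page=2</loc></sitemap>
</sitemapindex>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(INDEX)
pages = [s.findtext("sm:loc", namespaces=NS)
         for s in root.findall("sm:sitemap", NS)]
print(pages)
```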

The sitemap will have a limited size.

The sitemap will provide a maximum of 3,000 entries - 20 pages at 150 posts / page. As new posts are published to the blog, they will be added, automatically. Hopefully, not too many blogs will have 3,000 posts published, before the blog is indexed.

Since the announcement was made, I have added maybe a dozen posts to this blog. I just looked at Page 1 of the sitemap for this blog, and this post is now there - 5 minutes after it was published. You may, or may not, see the same update promptness on your blog.

The old sitemap is now not needed.

Both the old and new sitemaps index the same post complement - the old "sitemap" (the posts feed) simply contains irrelevant content: the post material itself.

Let's compare the old sitemap with the new, using either the content itself or an HTTP trace pair. Click on two of the links below, and compare the results.

The old sitemap:

The old sitemap URL:

The old sitemap HTTP trace:

The new sitemap:

The new sitemap URL:

The new sitemap HTTP trace:

The old and new sitemaps index the same content - the old sitemap simply includes all of the post content, while the new sitemap includes only data useful to search engines. Some of the processes that read sitemaps will simply be able to digest the new sitemap more easily.

Dude, hit me with a comment!