Google has announced a new protocol that it is now using to better index web sites: the Sitemap Protocol.
“The Sitemap Protocol allows you to inform search engine crawlers about URLs on your Web sites that are available for crawling. A Sitemap consists of a list of URLs and may also contain additional information about those URLs, such as when they were last modified, how frequently they change, etc.”
I imagine it won’t be long before most of the major CMSs out there have the ability to create one of these sitemap files. The primary benefit is to reveal pages to search engine crawlers that they would not find via their normal crawling of your site.
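To make that concrete, here’s a rough sketch of how a CMS might generate one of these files in Python, assuming the XML format Google describes (a urlset of url entries with loc, lastmod, and changefreq elements). The URLs and page data below are made up for illustration, and the schema namespace should be checked against Google’s own documentation.

```python
# Rough sketch of generating a Sitemap file in the format Google describes:
# a <urlset> of <url> entries with <loc>, <lastmod>, and <changefreq>.
# The page data here is hypothetical; a real CMS would pull it from its database.
from datetime import date
from xml.sax.saxutils import escape

# Hypothetical pages a CMS might already track.
pages = [
    {"loc": "http://www.example.com/", "lastmod": date(2005, 6, 3), "changefreq": "daily"},
    {"loc": "http://www.example.com/about", "lastmod": date(2005, 5, 12), "changefreq": "monthly"},
]

def build_sitemap(pages):
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        # Namespace from Google's Sitemap 0.84 schema at the time of the
        # announcement; verify the current value in Google's documentation.
        '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">',
    ]
    for page in pages:
        lines.append("  <url>")
        lines.append(f"    <loc>{escape(page['loc'])}</loc>")
        lines.append(f"    <lastmod>{page['lastmod'].isoformat()}</lastmod>")
        lines.append(f"    <changefreq>{page['changefreq']}</changefreq>")
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

if __name__ == "__main__":
    # Write the file to the site root so crawlers can fetch it.
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(build_sitemap(pages))
```

A CMS would simply regenerate this file whenever content changes, which is why it seems like a natural feature for them to add.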