
Google Site Map Crawler

13 Dec 2007
A console application that checks all URLs listed in a sitemap.xml file

Introduction

Have you ever thought of trying to validate each URL listed in your sitemap file?

Background

I have a site with dynamically generated page links. Those links are generated from a page title, which can be any combination of letters, numbers, and symbols. Of course, the site removes all forbidden characters from the page title before generating its URL, and trims and shortens it a bit; however, errors still occur from time to time. For example, due to the specifics of my URL conversion, a page titled "...IS_BROKEN" will end up with the URL /.IS_BROKEN+. There are thousands of pages, so it's clear that I cannot verify each separate page in the site's database by hand.
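To illustrate the kind of conversion involved (the exact rules are my site's own; the following is only a hypothetical sketch with made-up rules), a naive title-to-URL conversion might look like this:

```csharp
using System;
using System.Text.RegularExpressions;

static class Slugger
{
    // Hypothetical title-to-URL conversion: strip forbidden characters,
    // trim, collapse whitespace to '+', and shorten to a fixed length.
    // Real sites have subtler rules, which is exactly where errors creep in.
    public static string ToSlug(string title, int maxLength = 50)
    {
        // Keep only letters, digits, spaces, dots and underscores.
        string cleaned = Regex.Replace(title, @"[^A-Za-z0-9 ._]", "");
        cleaned = cleaned.Trim();
        // Replace runs of whitespace with '+'.
        cleaned = Regex.Replace(cleaned, @"\s+", "+");
        if (cleaned.Length > maxLength)
            cleaned = cleaned.Substring(0, maxLength);
        return "/" + cleaned;
    }
}
```

Even a simple scheme like this can produce surprising URLs for titles made mostly of punctuation, which is why verifying the generated links matters.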

Based on the list of dynamically generated URLs, I generate a sitemap.xml file containing all of the site's pages. So each time the map file is generated, I need to ensure that there are no repeated items (which can happen when different pages have the same title) and that each URL is accessible, i.e. does not produce a bad request, a 404, or anything like that.
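The duplicate check can be sketched as follows, assuming the URLs have already been read into a collection (the method name and case-insensitive comparison are my choices, not necessarily those of the attached download):

```csharp
using System;
using System.Collections.Generic;

static class SitemapChecks
{
    // Returns the URLs that appear more than once in the sitemap.
    // HashSet.Add returns false when the item is already present,
    // so a single pass over the list finds every repeat.
    public static List<string> FindDuplicates(IEnumerable<string> urls)
    {
        var seen = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
        var duplicates = new List<string>();
        foreach (string url in urls)
        {
            if (!seen.Add(url))
                duplicates.Add(url);
        }
        return duplicates;
    }
}
```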

So I created a C# program that walks through each URL listed in the sitemap.xml file and tries to access it. It logs all errors that occur to an output file, so it's easy to track down problem pages.

I use the XmlDocument class to load sitemap.xml, and the WebRequest and WebResponse classes to determine whether a URL exists.
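Putting those pieces together, a minimal sketch of the crawler looks like this (the method names, the HEAD request, and the tab-separated log format are my assumptions, not necessarily how the attached download does it):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;
using System.Xml;

static class SitemapCrawler
{
    // Extracts every <loc> value from a loaded sitemap.xml document.
    public static List<string> LoadUrls(XmlDocument sitemap)
    {
        var urls = new List<string>();
        foreach (XmlNode node in sitemap.GetElementsByTagName("loc"))
            urls.Add(node.InnerText.Trim());
        return urls;
    }

    // Issues a HEAD request; returns null on success, or a description
    // of the failure (404, bad request, timeout, etc.) otherwise.
    public static string CheckUrl(string url)
    {
        try
        {
            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Method = "HEAD";
            using (var response = (HttpWebResponse)request.GetResponse())
                return null; // any 2xx status lands here
        }
        catch (WebException ex)
        {
            return ex.Message;
        }
    }

    // Walks the sitemap and writes one line per broken URL.
    public static void Run(string sitemapPath, string logPath)
    {
        var doc = new XmlDocument();
        doc.Load(sitemapPath);
        using (var log = new StreamWriter(logPath))
        {
            foreach (string url in LoadUrls(doc))
            {
                string error = CheckUrl(url);
                if (error != null)
                    log.WriteLine("{0}\t{1}", url, error);
            }
        }
    }
}
```

A HEAD request avoids downloading each page body; servers that reject HEAD could be retried with GET, at the cost of more bandwidth.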

License

This article has no explicit license attached to it but may contain usage terms in the article text or the download files themselves. If in doubt please contact the author via the discussion board below.
