Create a Sitemap Using ASP.NET Core API & Blazor


December 17, 2023

This article documents the process of generating a sitemap for a website built with Blazor WebAssembly Hybrid.

What Is a Sitemap?

A sitemap is essentially a directory of a website: it lists all the accessible pages, articles, images, and so on, making it easier for search engines and web crawlers (such as Googlebot and Bingbot) to understand and explore how the site is organized. It helps search engines discover and index your content more quickly and accurately, so users can more easily find what they need when they search.

In short, a sitemap lets search engines better understand and index the structure of a site's content, so that users can more easily find the information they want.

Foreword

I have experience building a website with Hugo (a static site generator written in Go). After compilation, that framework automatically generates a sitemap.xml, and because every page is a static page, web crawlers can easily obtain the site's information from it.

However, websites built with Blazor WebAssembly Hybrid, or with front-end frameworks such as Vue and React, are SPAs (Single Page Applications), and they run into many problems with site exposure, SEO, and the like.

This article mainly records how to create the sitemap that SPA-type websites need, so that web crawlers and pre-rendering services such as Prerender.io have something to work with.

For how an SPA-type website can use pre-rendering to solve the problem of Google's crawler being unable to crawl the site's actual data, you can refer to this article: Blazor WASM SEO - Use pre-rendering to solve SPA web SEO problems (Prerender.io & Cloudflare Workers tutorial).

Solution

Basically there is no quick way; you can only generate a sitemap according to your own website's structure. For Blazor WebAssembly Hybrid, I originally considered having the server generate it automatically, so that the sitemap would be updated the moment the site's data changed. However, true real-time generation means hooking into every API that modifies data, which gets complicated. So I use an independent API instead, and generate the sitemap on demand.
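Since the sitemap is generated on demand, regeneration can be triggered with a plain HTTP call whenever the site's data changes. Below is a minimal sketch; the route api/sitemap and the domain are assumptions, so match them to your actual controller:

C#
// Hypothetical sketch: trigger sitemap regeneration on demand.
// "https://your-domain/api/sitemap" is a placeholder; use your real route.
using System.Net.Http;

var client = new HttpClient();
HttpResponseMessage response = await client.GetAsync("https://your-domain/api/sitemap");
Console.WriteLine(response.IsSuccessStatusCode ? "Sitemap regenerated" : "Regeneration failed");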

As for the Hybrid-type framework, the generated file ends up in the root directory, i.e. on the server side. Currently, deploying the website involves an extra step: moving the server-side sitemap.xml to the client side. That way, once the website is launched, the file can be fetched at https://your domain/sitemap.xml.
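As a minimal sketch of that extra step (assuming the client's static files are served from a wwwroot folder; both paths are illustrative and depend on your deployment layout), the move can also be done in code right after the sitemap is saved:

C#
// Minimal sketch, assuming the generated sitemap.xml sits next to the server
// executable and the client's static files are served from "wwwroot".
using System.IO;

string source = Path.Combine(AppContext.BaseDirectory, "sitemap.xml");
string destination = Path.Combine(AppContext.BaseDirectory, "wwwroot", "sitemap.xml");
File.Copy(source, destination, overwrite: true);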

Code

First, refer to the example format provided by Google:

xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/foo.html</loc>
    <lastmod>2022-06-04</lastmod>
  </url>
</urlset>

Each entry contains <loc> (the URL) and <lastmod> (the last modification time), and you can also add <priority> (page weight) and <changefreq> (update frequency). So we need an API that assembles a file in the same format. The following is an example:

  1. Basic model for sitemap entries
C#
public class Page
{
    public string Url { get; set; }              // Absolute URL of the page (<loc>)
    public DateTime LastModified { get; set; }   // Last modification time (<lastmod>)
    public string ChangeFrequency { get; set; }  // Update frequency hint (<changefreq>)
    public double Priority { get; set; }         // Relative weight, 0.0-1.0 (<priority>)
}
  2. API
C#
// Requires using System.Xml.Linq; for XDocument, XElement, and XNamespace
[HttpGet]
public async Task<bool> GetSiteMap()
{
    XNamespace xmlns = "http://www.sitemaps.org/schemas/sitemap/0.9"; // Define namespace

    // Create XML document
    XDocument sitemap = new XDocument(
        new XDeclaration("1.0", "utf-8", "yes"),
        new XElement(xmlns + "urlset" // Using XNamespace
        )
    );

    string baseUrl = "https://your domain/";

    // Create List<Page> object 
    var pages = new List<Page>
    {
        new Page { Url = baseUrl, LastModified = DateTime.Now, ChangeFrequency = "daily", Priority = 1.0 }
    };
    //Add pages according to your website structure
    ServiceResponse<List<Category>> result = await _categoryService.GetCategories();
    List<Product> products = await _productService.GetAvailableProducts();
    foreach (Category category in result.Data)
    {
        // Only keep the products that belong to this category
        category.Product = products.Where(o => o.CategoryId == category.Id && o.IsImage == true).OrderBy(o => o.Order).ToList();
        pages.Add(new Page { Url = baseUrl + "category/" + category.Url, LastModified = DateTime.Now, ChangeFrequency = "daily", Priority = 1.0 });

        foreach (Product product in category.Product) // iterate this category's products, not the whole list
        {
            pages.Add(new Page { Url = baseUrl + "product/" + product.Url, LastModified = DateTime.Now, ChangeFrequency = "daily", Priority = 1.0 });
        }
    }

    // Add each page to the Sitemap
    foreach (var page in pages)
    {
        XElement urlElement = new XElement(xmlns + "url", 
            new XElement(xmlns + "loc", page.Url), 
            new XElement(xmlns + "lastmod", page.LastModified.ToString("yyyy-MM-dd")), 
            new XElement(xmlns + "changefreq", page.ChangeFrequency), 
            new XElement(xmlns + "priority", page.Priority.ToString("0.0", System.Globalization.CultureInfo.InvariantCulture)) 
        );
        sitemap.Root.Add(urlElement);
    }

    // Save the XML file to the server's root (working) directory
    string filePath = "sitemap.xml"; 
    sitemap.Save(filePath);
    return true;
}
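
As an alternative to saving the file and moving it during deployment, the API could return the XML directly, so https://your domain/sitemap.xml is always fresh. A minimal sketch, where BuildSitemapAsync is a hypothetical helper wrapping the same assembly logic as above:

C#
// Hypothetical alternative: serve the sitemap straight from the API.
// BuildSitemapAsync() is assumed to contain the same XDocument logic as above.
[HttpGet("/sitemap.xml")]
public async Task<IActionResult> GetSiteMapXml()
{
    XDocument sitemap = await BuildSitemapAsync();
    // XDocument.ToString() omits the XML declaration, so prepend it explicitly
    string xml = sitemap.Declaration + Environment.NewLine + sitemap;
    return Content(xml, "application/xml", System.Text.Encoding.UTF8);
}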

Conclusion

The above is the method for creating sitemap.xml. Any website that wants to improve SEO, or to analyze traffic sources through GA4, needs to provide a sitemap. The next article will explain how SPA-type websites solve the problem of web crawlers being unable to crawl the site's data.

Reference

BlazorSEOWeb




Alvin

Software engineer interested in finance, health, psychology, independent travel, and system design.
