XML Sitemap Generator Tool Online


Crawl your website to discover pages automatically, or build a sitemap manually. Set priority, change frequency, and last modified dates. Download a valid XML sitemap file instantly.

How the XML Sitemap Generator Works

This tool creates a valid XML sitemap file for your website. Pick one of two input modes, then follow the remaining steps:

  1. Crawl mode — enter your website URL and the tool crawls your site, following internal links to discover up to 500 pages. It automatically skips noindex pages, non-canonical URLs, and non-HTML resources like images and PDFs.
  2. Manual mode — paste a list of URLs you want in the sitemap. Useful when you know exactly which pages to include, or when your site requires authentication or has JavaScript-rendered content.
  3. Set metadata — choose default priority (0.0–1.0), change frequency, and whether to include today's date as lastmod. These optional fields give search engines hints about your content.
  4. Review and edit — after generation, review the URL list. Uncheck any pages you want to exclude. The XML preview updates in real time.
  5. Download — download the sitemap.xml file and place it in your website's root directory. Then submit it via Google Search Console and reference it in your robots.txt file.
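The generation steps above can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual implementation; the URLs and the default priority/changefreq values are made up for the example.

```python
# Minimal sketch: build a sitemap.xml string with lastmod, changefreq, priority.
import xml.etree.ElementTree as ET
from datetime import date

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"  # sitemaps.org namespace

def build_sitemap(urls, changefreq="weekly", priority="0.8"):
    """Return a sitemap XML string for the given list of page URLs."""
    ET.register_namespace("", NS)  # serialize without a namespace prefix
    urlset = ET.Element(f"{{{NS}}}urlset")
    today = date.today().isoformat()
    for url in urls:
        entry = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(entry, f"{{{NS}}}loc").text = url
        ET.SubElement(entry, f"{{{NS}}}lastmod").text = today
        ET.SubElement(entry, f"{{{NS}}}changefreq").text = changefreq
        ET.SubElement(entry, f"{{{NS}}}priority").text = priority
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

xml = build_sitemap(["https://example.com/", "https://example.com/about"])
```

Save the returned string as sitemap.xml in the site root; the `<lastmod>` value here mirrors the tool's "include today's date" option.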

Why XML Sitemaps Matter for SEO

An XML sitemap is one of the most fundamental files in technical SEO. Here's why it matters:

  • Faster discovery — new pages can take days or weeks to be found through link crawling alone. A sitemap tells search engines exactly where to look, reducing discovery time significantly.
  • Better crawl efficiency — search engines allocate a crawl budget to each site. A sitemap helps them prioritize your important pages instead of wasting budget on low-value URLs.
  • Essential for large sites — websites with thousands of pages, complex navigation, or thin internal linking benefit the most from sitemaps. Google explicitly recommends sitemaps for sites over 500 pages.
  • Lastmod signals recrawling — when you update a page and reflect it in your sitemap's lastmod date, search engines know to recrawl that page sooner. This is especially useful for time-sensitive content.
  • GEO and AI discovery — AI search engines like ChatGPT Search, Perplexity, and Google AI Overviews use sitemaps to discover content they may reference in AI-generated answers. Having a sitemap improves your visibility in generative search results.

After creating your sitemap, validate it with our Sitemap Checker & Validator to make sure everything is technically correct. Also consider configuring your robots.txt to reference the sitemap with a Sitemap: directive. Use the Internal Link Analyzer before generating your sitemap — it identifies orphan pages that have no internal links, which means they'll only be discoverable via your sitemap. For hreflang-based multilingual sitemaps, the Hreflang Tag Generator can produce a complete XML sitemap with hreflang annotations.

XML Sitemap Best Practices

Follow these guidelines to get the most out of your XML sitemap:

  • Only include canonical, indexable pages — exclude noindex pages, redirects, 404s, paginated archives, and duplicate URLs. Every URL in your sitemap should return a 200 status and be the canonical version.
  • Keep lastmod accurate — only update the lastmod date when the page content actually changes. Fake or auto-updating lastmod dates erode trust with search engines and may cause them to ignore the field entirely.
  • Use a sitemap index for large sites — a single sitemap file is limited to 50,000 URLs and 50MB uncompressed. If you approach that limit, or simply want to keep thousands of pages organized, split your sitemap into multiple files (by content type, category, or date) and reference them from a sitemap index file.
  • Reference it in robots.txt — add Sitemap: https://yourdomain.com/sitemap.xml to your robots.txt file. This helps crawlers find your sitemap even if they have never seen your site before.
  • Submit to Search Console — after uploading, submit your sitemap through Google Search Console and Bing Webmaster Tools. Monitor the coverage report for errors like "Submitted URL marked 'noindex'" or "Submitted URL not found (404)."
  • Automate when possible — most CMS platforms (WordPress with Yoast/Rank Math, Shopify, Astro, Next.js) can generate and update sitemaps automatically. Use manual sitemap generation primarily for static sites or custom setups.
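The sitemap-index practice above amounts to chunking a long URL list and emitting an index file that points at each chunk. The sketch below shows the idea; the domain, file names, and URL counts are illustrative.

```python
# Sketch: split URLs into per-file chunks and build a sitemap index (illustrative names).
from datetime import date

LIMIT = 50_000  # per-file URL limit from the sitemaps.org protocol

def chunk(urls, size=LIMIT):
    """Split a flat URL list into sitemap-sized chunks."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def sitemap_index(base, n_files):
    """Return a sitemapindex XML string referencing n_files child sitemaps."""
    today = date.today().isoformat()
    entries = "".join(
        f"<sitemap><loc>{base}/sitemap-{i}.xml</loc><lastmod>{today}</lastmod></sitemap>"
        for i in range(1, n_files + 1)
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            f"{entries}</sitemapindex>")

urls = [f"https://example.com/p/{i}" for i in range(120_000)]
parts = chunk(urls)  # 120,000 URLs -> three files: 50k + 50k + 20k
index_xml = sitemap_index("https://example.com", len(parts))
```

Each chunk would then be written out with the same per-URL format as a regular sitemap, and only the index file is referenced from robots.txt and Search Console.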

For a deeper understanding of how search engines and AI systems discover and evaluate your content, explore our complete guide to GEO and our SEO services.

Sitemap Generator: FAQ

What is an XML sitemap?
An XML sitemap is a structured file that lists the important URLs on your website. It tells search engines like Google, Bing, and others which pages exist, when they were last updated, how often they change, and their relative importance. It is one of the foundational files of technical SEO.
Why do I need a sitemap?
A sitemap helps search engines discover and crawl your pages more efficiently, especially new pages, deep pages with few internal links, or pages on large websites. Without a sitemap, search engines rely solely on following links, which means some pages may never be found or crawled.
What is the difference between Crawl and Manual mode?
Crawl mode automatically visits your website, follows internal links, and discovers pages for you — up to 500 URLs. Manual mode lets you type or paste specific URLs and set priority, change frequency, and last modified date for each one. Use Crawl for discovery, Manual for precision.
What do priority and changefreq mean?
Priority (0.0 to 1.0) hints at a page's importance relative to other pages on your site. The homepage is typically 1.0, main sections 0.8, blog posts 0.6-0.7. Change frequency (always, hourly, daily, weekly, monthly, yearly, never) tells crawlers how often you expect a page to change. Note: Google has stated it largely ignores these fields, but Bing and other engines may still use them.
Does Google use priority and changefreq?
Google has publicly stated it ignores the priority and changefreq fields in sitemaps. However, other search engines like Bing may still use them as hints. The lastmod date is the most useful metadata — Google does pay attention to accurate lastmod values to decide when to recrawl pages.
How many URLs can a sitemap contain?
A single XML sitemap file can contain up to 50,000 URLs and must not exceed 50MB uncompressed. If your site has more than 50,000 pages, you need a sitemap index file that references multiple individual sitemap files. This tool generates sitemaps up to 500 URLs via crawl.
Where should I place my sitemap?
Place your sitemap.xml file in the root directory of your website so it is accessible at https://yourdomain.com/sitemap.xml. Then reference it in your robots.txt file with a Sitemap: directive, and submit it through Google Search Console and Bing Webmaster Tools.
What is a sitemap index?
A sitemap index is an XML file that references multiple individual sitemap files. It is used when your website has more URLs than a single sitemap can hold (50,000 URL limit). Most CMS platforms like WordPress with Yoast or Rank Math automatically create sitemap indexes that separate posts, pages, categories, and other content types.
Does the crawl mode skip noindex pages?
Yes. The crawler checks for meta robots noindex tags and skips those pages from the generated sitemap. It also skips pages where the canonical URL points to a different page, ensuring your sitemap only contains indexable, canonical URLs.
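The noindex and canonical checks described above can be approximated with Python's standard-library HTML parser. This is a simplified sketch of the logic, not the tool's actual crawler; real pages also need handling for X-Robots-Tag headers and relative canonical URLs.

```python
# Sketch: decide whether a crawled page belongs in the sitemap (simplified).
from html.parser import HTMLParser

class RobotsCanonicalParser(HTMLParser):
    """Collects meta-robots noindex and rel=canonical from a page's HTML."""
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

def include_in_sitemap(page_url, html):
    p = RobotsCanonicalParser()
    p.feed(html)
    if p.noindex:
        return False  # skip noindex pages
    if p.canonical and p.canonical != page_url:
        return False  # skip pages whose canonical points elsewhere
    return True
```

For example, a page with `<meta name="robots" content="noindex,follow">` would be excluded, as would a page whose canonical tag points to a different URL.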
Is this sitemap generator free?
Yes. Completely free with no signup, no ads, and no usage limits. The crawl mode runs server-side to discover your pages, and the manual mode runs entirely in your browser. Download your sitemap as an .xml file instantly.

Need Help with Technical SEO?

We help businesses set up sitemaps, robots.txt, crawl directives, and full technical SEO foundations.