Support & Help Center

Need help with Sitemap.ai? Browse our frequently asked questions below or send us a message using the contact form.

Contact Us

Frequently Asked Questions

How do I create a sitemap?

Creating a sitemap with Sitemap.ai is simple:

  1. Enter your website URL in the input field on the homepage
  2. Click "Start Crawling" to begin the automatic crawl process
  3. Wait for the crawler to discover up to 500 URLs (1,500 with a free account)
  4. Download your sitemap in XML, HTML, CSV, or JSON format

No signup required for basic usage (up to 500 URLs per crawl)!

Do I need an account to use Sitemap.ai?

No account is required for basic sitemap generation! You can crawl any website and download up to 500 URLs without signing up.

However, creating a free account gives you:

  • Higher limits: 1,500 URLs per crawl (three times the 500-URL guest limit)
  • Monthly allowance: 25,000 URLs per month
  • Save and manage up to 5 crawls at a time
  • One-click re-crawling of saved sites
  • Faster crawling rates
  • Access to crawl history

How many URLs can I crawl?

Without an account:

You can crawl any website and download up to 500 URLs per crawl. You won't be able to save crawls to your account, but you can download the results immediately. This is perfect for smaller websites and trying out the service.

With a free account:

Creating a free account unlocks higher limits:

  • Up to 1,500 URLs per crawl (three times the 500-URL guest limit)
  • Up to 25,000 URLs per month total
  • Save and manage up to 5 crawls at a time
  • One-click re-crawls of saved sites
  • Faster crawling rates

For larger enterprise needs beyond these limits, please use the contact form above to get in touch.

What formats can I export my sitemap in?

You can export your sitemap in four different formats:

  • XML - Standard format for search engines (Google, Bing, etc.)
  • HTML - Human-readable format for visitors
  • CSV - Spreadsheet format for analysis
  • JSON - Developer-friendly format for integrations
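
For reference, here is a minimal XML sitemap in the standard sitemaps.org format that search engines expect (example.com stands in for your own domain):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/about</loc>
  </url>
</urlset>
```

Each URL goes in its own <url> entry; only <loc> is required, while fields like <lastmod> are optional hints for search engines.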

How do I submit my sitemap to Google?

After generating your XML sitemap:

  1. Upload the sitemap.xml file to your website's root directory (e.g., https://example.com/sitemap.xml)
  2. Go to Google Search Console
  3. Select your property (website)
  4. Navigate to "Sitemaps" in the left sidebar
  5. Enter your sitemap URL and click "Submit"

You should also submit your sitemap to Bing Webmaster Tools for broader coverage.
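
In addition to submitting in Search Console, you can advertise your sitemap in your site's robots.txt file so that any crawler can discover it (this example assumes your sitemap lives at the site root):

```
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```

The Sitemap directive is part of the sitemaps.org protocol and may be listed multiple times if you have more than one sitemap.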

Is my data secure and private?

Yes, we take your privacy seriously:

  • We only extract URL links from pages you crawl
  • We do NOT store full page content or personal data from crawled sites
  • For guests, results are temporarily stored in your browser for your session only
  • For logged-in users, saved crawls are stored securely and only accessible to you
  • We use encryption for all data transmission

See our Privacy Policy for full details.

What's the difference between Generator, Finder, and Validator?

Each tool serves a different purpose:

  • Generator - Creates new XML sitemaps by crawling your website and discovering all URLs automatically
  • Finder - Discovers existing sitemaps on any website by checking common locations and robots.txt files
  • Validator - Validates XML sitemaps to ensure they meet search engine standards and identifies any errors
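
As an illustration of the kind of checks a validator performs (this is a simplified sketch, not our production code), the following Python snippet parses a sitemap with the standard library and enforces two basic rules from the sitemaps.org spec:

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemaps.org protocol
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def check_sitemap(xml_text: str) -> list[str]:
    """Return the URLs in a sitemap, raising ValueError on basic spec violations."""
    root = ET.fromstring(xml_text)
    if root.tag != f"{{{NS['sm']}}}urlset":
        raise ValueError("root element must be <urlset> in the sitemap namespace")
    urls = []
    for url in root.findall("sm:url", NS):
        loc = url.find("sm:loc", NS)
        if loc is None or not (loc.text or "").strip():
            raise ValueError("every <url> entry needs a non-empty <loc>")
        urls.append(loc.text.strip())
    if len(urls) > 50_000:  # per-file cap from the sitemaps.org protocol
        raise ValueError("a single sitemap file is limited to 50,000 URLs")
    return urls

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

print(check_sitemap(sample))  # ['https://example.com/', 'https://example.com/about']
```

A full validator also checks things like file size, URL escaping, and <lastmod> date formats, but the structure above is the core of it.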

Can I use Sitemap.ai for commercial websites?

Yes! Sitemap.ai can be used for both personal and commercial websites.

Free tier includes:

  • Without account: Up to 500 URLs per crawl (download only, cannot save)
  • With free account: Up to 1,500 URLs per crawl, 25,000 URLs per month, and save up to 5 crawls

For larger commercial projects that need to crawl more than 1,500 URLs at once or exceed monthly limits, please use the contact form above to discuss enterprise solutions.

How long does crawling take?

Crawling typically takes 5 to 10 minutes per site, though this varies with the site's size and how readily the server allows our crawler access. Some websites enforce strict crawling policies or rate limits that slow the process. Our crawler is designed to be respectful of server resources and follows robots.txt directives.

What if I get a server error or "cannot crawl" message?

Sometimes servers block our crawler from accessing their content. This can happen for several reasons:

  • The website has strict bot protection or firewall rules
  • The server is configured to block automated crawlers
  • The website's robots.txt file disallows crawling
  • Temporary server issues or downtime

If you encounter this issue, please use the contact form above to send us the website URL. We'll investigate and see if we can help resolve the crawling issue.

I found a bug or have a feature request

We'd love to hear from you! Please use the contact form above with details about the bug or your feature suggestion. We review all feedback and use it to improve our service.

Can you crawl JavaScript-based websites?

Yes! With a free account, you can enable JavaScript Rendering to crawl dynamic websites built with modern frameworks like React, Vue, Angular, Next.js, and others.

How it works:

  • Our crawler uses a real browser (Puppeteer) to render pages
  • All dynamically loaded content and links are discovered
  • Perfect for Single Page Applications (SPAs) and dynamic sites

Limitations:

  • JS rendering is slower than standard crawling (each page takes longer to process)
  • Some sites with aggressive bot protection may still block the crawler
  • Requires a free account to access this feature

Contact Information

ASUN Digital, LLC
Support: Use the contact form above
Website: https://www.sitemap.ai

Sitemap.ai is part of the Sitemap.io family of products.