Ki-Ki

Web foundations for SMEs

Why this tool exists

A sitemap is still one of the cleanest ways to tell search engines what you actually want indexed, but most people only get it “for free” because they’re on WordPress and they’ve installed something like Yoast. If you’re on a static site, you don’t get plugins, and you don’t get a database, and you definitely don’t get an admin panel that politely generates files for you.

This tool exists so you can generate sitemap.xml and robots.txt from a folder or a pasted list, without crawling the web, without installing anything, and without turning “basic SEO hygiene” into a dev project.

  • Static hosting friendly: no CMS needed, no installs
  • Migration friendly: cleanly map old and new URLs
  • No crawling: works from your own file list

The actual problem it solves

Static sites are simple, until you need the boring files

Static hosting is fast and stable, and it’s usually cheaper too, so it’s a smart move for a lot of organisations. But once you leave WordPress, you lose the “someone else does it” layer.

That’s why sitemaps and robots.txt get forgotten, and then people end up wondering why new pages take ages to appear in Google, or why old URLs keep hanging around.

Most sitemap generation options are annoying

The alternatives are often a crawler that hits your live site, a build plugin you don’t want to add, or a toolchain that expects you to be running Node scripts and command line jobs. That’s fine for dev teams, but it’s overkill for the majority of static sites.

This tool takes the simple route: you tell it what exists, it outputs the files, and you upload them to your site root.

What it does, and what it doesn’t

  • Builds a sitemap you actually meant

    It detects folder pages via index.html, it prefers clean trailing-slash URLs, and it lets you tick what belongs in the sitemap (there’s a sample sitemap.xml after this list). If you list everything, you’ll index everything, and that’s how people accidentally publish their mess. This encourages intent.

  • Generates robots.txt with the correct sitemap line

    robots.txt is mostly about crawl behaviour, and it’s useful when you want to avoid wasting crawl budget on junk. The tool includes the Sitemap: directive so search engines know exactly where to look (there’s a sample robots.txt after this list).

  • Doesn’t crawl your website

    Crawlers can be useful, but they also introduce weird edge cases, and they can be slow, and they sometimes find things you didn’t even mean to expose. This tool doesn’t do any of that: it works from your folder or your pasted list.

  • Doesn’t pretend robots.txt is a privacy feature

    robots.txt is not secrecy, and it never was. If something must not be public, protect it properly, because disallow rules are basically a public hint list for curious people.
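
For reference, here’s a minimal sketch of the kind of sitemap.xml this produces. The domain and page paths are placeholders, not output copied from the tool:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>https://example.com/</loc></url>
      <url><loc>https://example.com/about/</loc></url>
      <url><loc>https://example.com/services/</loc></url>
    </urlset>

The /about/ and /services/ entries assume folders containing an index.html, listed in the clean trailing-slash form the tool prefers.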
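
And a matching robots.txt, again with example.com standing in for your base URL. The exact lines the tool writes may differ, but the part that matters is the Sitemap: directive pointing at the file above:

    User-agent: *
    Disallow:

    Sitemap: https://example.com/sitemap.xml

An empty Disallow line means “crawl everything”. It’s the open-by-default starting point, not a privacy setting.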

When you should use it

You’ve moved off WordPress

If you’ve migrated away from WordPress, you’ve probably lost the Yoast-generated sitemap without even noticing. That’s normal, and it’s also fixable in under five minutes.

You’re publishing new pages and Google is slow

A sitemap won’t guarantee rankings, but it does improve discovery and it reduces the “why hasn’t it even been found” problem. It’s one of those boring things that stops being a problem once it’s done properly.

You’ve got junk sitting in your build output

If your static output folder contains assets, backups, exports, tool outputs, or random files you don’t want indexed, this makes it easy to ignore them and keep your sitemap intentional.

You’re cleaning up after a messy migration

If old and new URLs both exist, you need to pick winners, redirect losers, and stop listing the losers. A good sitemap helps you enforce that, and it keeps your structure consistent over time.

How to use it

  1. Import your static folder (Chrome or Edge), or paste a file list or URL paths (Firefox friendly).
  2. Use ignore rules to strip assets and junk.
  3. Tick only what you want indexed.
  4. Set your base URL, then download sitemap.xml and robots.txt.
  5. Upload both to your site root, then submit the sitemap once in Search Console.

After that, you just keep uploading to the same /sitemap.xml URL when you add pages. Search engines re-fetch it over time, so you don’t have to “re-submit” forever.
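
As a rough sketch of what steps 1 to 4 look like in practice, with the file names, ignore rules, and example.com base URL all invented for illustration:

    index.html           ->  https://example.com/
    about/index.html     ->  https://example.com/about/
    contact.html         ->  https://example.com/contact.html
    assets/logo.png      ->  caught by an assets ignore rule, never listed
    backup-old-site.zip  ->  caught by a junk ignore rule, never listed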

FAQ

Will this improve SEO?

It won’t magically rank you, but it will reduce stupid friction. A sitemap improves discovery, and an intentional URL list helps you avoid indexing junk. That’s boring, and boring is good.

Should I disallow things in robots.txt?

Only if you actually mean it, and only for crawl control. If something must be private, protect it with auth or remove it entirely, because robots rules are public and they’re often used as a treasure map.
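
If you do mean it, a crawl-control rule is only a couple of lines. The paths below are invented examples, and the rule only asks well-behaved crawlers to stay out; it hides nothing:

    User-agent: *
    Disallow: /search/
    Disallow: /cart/

    Sitemap: https://example.com/sitemap.xml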

Why does it ask me to tick pages?

Because sitemaps are meant to be intentional. If you blindly include everything, you’ll end up indexing endpoints, exports, random files, and whatever else landed in your build folder.

Does it upload my folder?

No. It runs in your browser, so your file list stays local. You generate the files and you upload them yourself.

Generate yours, once, properly

If you’ve got a static site and you want sitemap.xml and robots.txt done without ceremony, this is the tool. You’ll spend more time reading this sentence than you will generating the files.