Ki-Ki

Web foundations for SMEs

Ki-Ki Tools

Generate sitemap.xml and robots.txt from a folder, a file list, or pasted URL paths. Built for static sites and migrations, and for anyone who wants SEO hygiene without a toolchain.

Static hosting friendly · Migration friendly · No crawling · SEO hygiene

IMPORT

Fast path, Chrome or Edge folder pick

Pick your downloaded site folder, or paste a list. Nothing uploads anywhere. This runs in your browser.

No folder picked yet
Firefox doesn't support the folder picker, so if nothing happens here, use paste import below.
Quick ways to make a file list: find . -type f (Mac/Linux), dir /s /b (Windows). If you paste full URLs, the base URL will auto-fill.
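If you're curious what the folder pick is doing, here's a minimal sketch using the File System Access API. window.showDirectoryPicker is a real Chrome/Edge API (and the reason Firefox needs the paste path); importFolder and the walk helper are illustrative names, not this tool's actual source:

```ts
// Minimal sketch of the import step using the File System Access API.
// Everything stays in the browser; nothing is uploaded.
async function importFolder(): Promise<string[]> {
  const paths: string[] = [];
  // Feature-detect: Firefox and Safari don't ship showDirectoryPicker.
  if (!("showDirectoryPicker" in window)) {
    throw new Error("No folder picker here - paste a file list instead");
  }
  const root = await (window as any).showDirectoryPicker();
  // Recursively walk the directory handle, collecting relative file paths.
  async function walk(dir: any, prefix: string): Promise<void> {
    for await (const entry of dir.values()) {
      if (entry.kind === "file") {
        paths.push(prefix + entry.name);
      } else if (entry.kind === "directory") {
        await walk(entry, prefix + entry.name + "/");
      }
    }
  }
  await walk(root, "/");
  return paths;
}
```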

URL cleaning (recommended defaults)

  • Lowercase URLs: stops duplicate indexing caused by casing differences.
  • Strip index.html: most sites want /, not /index.html.
  • Prefer folder URLs: for example, prefer /about/ over /about.html if both exist.
  • Keep .html pages: enable only if you genuinely use pages like /privacy.html. If you are folder-first, leave it off.
  • Ignore rules: this is where you avoid junk in your sitemap. If your build drops backups, staging, or tool outputs into the same folder, ignore them here.
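To make the ignore idea concrete, here's a rough illustration of how such rules could be applied. The patterns are examples, not this tool's actual defaults:

```ts
// Illustrative ignore filter. IGNORE_PATTERNS is an example set, not the
// tool's real defaults: assets, junk folders, and tool leftovers.
const IGNORE_PATTERNS: RegExp[] = [
  /\.(css|js|png|jpe?g|gif|svg|ico|webp|woff2?)$/i, // assets
  /(^|\/)(\.git|node_modules|staging|backups?)\//i, // junk folders
  /\.(bak|tmp|log)$/i,                              // tool outputs
];

// Keep a path only if no ignore pattern matches it.
function keepForSitemap(path: string): boolean {
  return !IGNORE_PATTERNS.some((re) => re.test(path));
}
```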

GUIDE AND PAGES

Read once, then generate

Why this exists

A sitemap is the cleanest way to tell search engines what you actually want indexed. It speeds up discovery, helps migrations, and reduces "why hasn't Google found this page" nonsense. robots.txt is useful too, but mostly for crawl control. It is not a secrecy feature, and it never was.

How it works

Import a folder or paste a list. The tool detects folder pages via index.html, converts them into trailing slash URLs, and lets you tick what should appear in sitemap.xml. You can also paste URL paths like /offers/, or full URLs like https://example.com/offers/. The conversion rules are below, with a code sketch after the list.

  • Folder with index.html becomes a trailing slash URL
  • Pasted URL paths like offers or /offers/ become /offers/
  • Ignore rules strip assets and junk
  • Untick pages you do not want indexed, then download and upload
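As promised, a compact sketch of those conversion rules, assuming the tool behaves roughly like this (toSitemapPath is a hypothetical name, not the tool's source):

```ts
// A sketch of the conversion rules above.
function toSitemapPath(entry: string): string {
  let p = entry.trim().toLowerCase();                  // casing cleanup
  if (/^https?:\/\//.test(p)) p = new URL(p).pathname; // full URL -> path
  if (!p.startsWith("/")) p = "/" + p;                 // "offers" -> "/offers"
  if (p.endsWith("/index.html")) {
    p = p.slice(0, -"index.html".length);              // folder page -> trailing slash
  }
  if (!p.includes(".") && !p.endsWith("/")) p += "/";  // "/offers" -> "/offers/"
  return p;
}

// toSitemapPath("offers")                      -> "/offers/"
// toSitemapPath("/Offers/index.html")          -> "/offers/"
// toSitemapPath("https://example.com/offers/") -> "/offers/"
```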
Tick what should be in the sitemap

No pages yet. Import a folder or paste a list.

Tip: if you're pasting URL paths, use /like/this/ or offers

Sitemap health tips
Keep it intentional. A sitemap is basically your site saying "index these, please". If you list everything, you get everything indexed, including the stuff you didn't mean to publish. I check sitemaps on random sites all the time, and the amount of accidental chaos is impressive in the same way a car fire is impressive.

For migrations: if old and new URLs both exist, pick a long-term winner. Redirect the loser, then stop listing it. Your sitemap should reflect the version you want to survive.
robots.txt tips
robots.txt is for crawl behaviour, not secrecy. If something must not be public, protect it properly. Also, many sites disallow things for no reason and then wonder why pages don't rank.

A fun bonus: bad robots rules often reveal what people are trying to hide. You don't have to be Sherlock, you just have to read the file they published to everyone.
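For what it's worth, generating a sane robots.txt is only a few lines. This sketch (buildRobotsTxt is a hypothetical name) mirrors the behaviour described here, including the "leave blank for Allow all" rule from the output form:

```ts
// Hypothetical robots.txt builder. Crawl control only - anything you
// Disallow here is still published in plain text for everyone to read.
function buildRobotsTxt(baseUrl: string, disallowPaths: string[]): string {
  const rules = disallowPaths.length
    ? disallowPaths.map((p) => `Disallow: ${p}`).join("\n")
    : "Allow: /"; // blank list means allow all
  return `User-agent: *\n${rules}\n\nSitemap: ${baseUrl}/sitemap.xml\n`;
}
```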

OUTPUT

Download, upload to site root

Set your base URL, then generate files from your selected pages. If you pasted full URLs, the base URL auto-fills.

Base URL: domain only, default scheme is https://. You can paste a full URL; it will be cleaned.
Disallow paths: one path per line. Leave blank for Allow all.
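Under the hood, output generation doesn't need to be fancier than this sketch. cleanBaseUrl and buildSitemap are illustrative names, assuming selected pages arrive as path strings:

```ts
// cleanBaseUrl shows what "it will be cleaned" means: keep only the origin.
function cleanBaseUrl(input: string): string {
  const withScheme = /^https?:\/\//.test(input) ? input : `https://${input}`;
  return new URL(withScheme).origin; // "example.com/foo?x=1" -> "https://example.com"
}

// Wrap each selected path in a <url><loc> entry per the sitemap protocol.
function buildSitemap(baseUrl: string, paths: string[]): string {
  const urls = paths
    .map((p) => `  <url><loc>${baseUrl}${p}</loc></url>`)
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${urls}\n</urlset>\n`;
}
```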
Preview: first 20 URLs
Upload checklist
1) Upload /sitemap.xml to your site root
2) Upload /robots.txt to your site root
3) In Search Console, submit the sitemap once. On future uploads, Google will keep re-reading the same URL.