Ki-Ki Tools
Generate sitemap.xml and robots.txt from a folder, a file list, or pasted URL paths. Built for static sites and migrations, and for anyone who wants SEO hygiene without a toolchain.
IMPORT
Fast path, Chrome or Edge folder pick
Pick your downloaded site folder, or paste a list. Nothing uploads anywhere. This runs in your browser.
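For the curious, a minimal sketch of what a browser-only folder pick can look like, assuming the File System Access API (Chrome/Edge only); the names here are illustrative, not the tool's actual internals:

```ts
// Sketch: walk a locally picked folder and collect index.html pages.
// showDirectoryPicker is Chrome/Edge only; nothing leaves the browser.
async function collectPages(): Promise<string[]> {
  const pages: string[] = [];
  const root = await (window as any).showDirectoryPicker();

  async function walk(dir: any, prefix: string): Promise<void> {
    for await (const entry of dir.values()) {
      if (entry.kind === "directory") {
        await walk(entry, `${prefix}${entry.name}/`);
      } else if (entry.name === "index.html") {
        pages.push(`/${prefix}`); // folder with index.html -> trailing slash URL
      }
    }
  }

  await walk(root, "");
  return pages; // e.g. ["/", "/offers/"]
}
```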
URL cleaning (recommended defaults)
GUIDE AND PAGES
Read once, then generate
Why this exists
A sitemap is the cleanest way to tell search engines what you actually want indexed. It speeds up discovery, helps migrations, and reduces "why hasn't Google found this page" nonsense. robots.txt is useful too, but mostly for crawl control. It is not a secrecy feature, and it never was.
How it works
Import a folder or paste a list. The tool detects folder pages via index.html, converts them into trailing slash URLs, and lets you tick what should appear in sitemap.xml. You can also paste URL paths like /offers/, or full URLs like https://example.com/offers/. A code sketch of these rules follows the list below.
- Folder with index.html becomes a trailing slash URL
- Pasted URL paths like offers or /offers/ become /offers/
- Ignore rules strip assets and junk
- Untick pages you do not want indexed, then download and upload
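A sketch of those rules in code, using hypothetical helper names rather than the tool's real internals:

```ts
// Illustrative ignore list: common asset extensions that never belong in a sitemap.
const IGNORE = /\.(css|js|png|jpe?g|gif|svg|ico|woff2?|map)$/i;

function normalize(entry: string): string | null {
  let p = entry.trim();
  if (p === "") return null;
  if (/^https?:\/\//i.test(p)) p = new URL(p).pathname; // full URL -> path only
  if (!p.startsWith("/")) p = `/${p}`;                  // "offers" -> "/offers"
  if (p.endsWith("/index.html")) p = p.slice(0, -"index.html".length);
  if (IGNORE.test(p)) return null;                      // strip assets and junk
  if (!/\/$|\.[a-z0-9]+$/i.test(p)) p += "/";           // "/offers" -> "/offers/"
  return p;
}

// normalize("offers")                      -> "/offers/"
// normalize("https://example.com/offers/") -> "/offers/"
// normalize("blog/index.html")             -> "/blog/"
// normalize("img/logo.png")                -> null
```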
No pages yet. Import a folder or paste a list.
Tip: if you're pasting URL paths, use /like/this/ or offers
Sitemap health tips
For migrations: if old and new URLs both exist, pick a long-term winner. Redirect the loser, then stop listing it. Your sitemap should reflect the version you want to survive.
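As a sketch of that rule, assuming a hypothetical redirect map from retired URLs to their replacements:

```ts
// Hypothetical: each retired URL points at its long-term winner.
const redirects = new Map<string, string>([
  ["/old-offers/", "/offers/"],
]);

// The sitemap lists only URLs that do not redirect away.
function sitemapUrls(selected: string[]): string[] {
  return selected.filter((url) => !redirects.has(url));
}

// sitemapUrls(["/offers/", "/old-offers/"]) -> ["/offers/"]
```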
robots.txt tips
A fun bonus: careless robots.txt rules often reveal exactly what a site is trying to hide. You don't have to be Sherlock; you just have to read the file they published to everyone.
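To make the crawl-control point concrete, a minimal robots.txt builder sketch; the Disallow paths are placeholders, and every line in the output is public:

```ts
// Sketch only. Disallow controls crawling; it hides nothing, since anyone
// can read /robots.txt. The Sitemap line is a standard convention.
function robotsTxt(baseUrl: string, disallow: string[]): string {
  return [
    "User-agent: *",
    ...disallow.map((path) => `Disallow: ${path}`),
    `Sitemap: ${baseUrl}/sitemap.xml`,
  ].join("\n") + "\n";
}

// robotsTxt("https://example.com", ["/drafts/", "/staging/"])
```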
OUTPUT
Download, upload to site root
Set your base URL, then generate files from your selected pages. If you pasted full URLs, the base URL auto-fills.
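Both steps are simple enough to sketch, assuming illustrative names; the sitemap uses the standard sitemaps.org markup:

```ts
// Auto-fill: the origin of a pasted full URL becomes the base URL.
function baseUrlFrom(fullUrl: string): string {
  return new URL(fullUrl).origin; // "https://example.com/offers/" -> "https://example.com"
}

// One <url> entry per selected page, in the standard sitemaps.org namespace.
function sitemapXml(baseUrl: string, paths: string[]): string {
  const entries = paths
    .map((p) => `  <url><loc>${baseUrl}${p}</loc></url>`)
    .join("\n");
  return (
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${entries}\n</urlset>\n`
  );
}
```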
Upload checklist
1) Upload /sitemap.xml to your site root
2) Upload /robots.txt to your site root
3) In Search Console, submit the sitemap once. On future uploads, Google will keep re-reading the same URL.
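If you want a quick post-upload sanity check, a sketch that fetches both files and confirms they respond; run it from a same-origin page or Node 18+ so CORS doesn't get in the way:

```ts
// Sketch: confirm both files are live at the site root after upload.
async function verifyUpload(baseUrl: string): Promise<void> {
  for (const file of ["/sitemap.xml", "/robots.txt"]) {
    const res = await fetch(baseUrl + file);
    console.log(file, res.ok ? "live" : `not found (HTTP ${res.status})`);
  }
}
```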