Robots.txt and Sitemap.xml

This page explains how Kiwa handles robots.txt and sitemap.xml files and how to generate them automatically.

Introduction

Kiwa stores the robots.txt and all sitemap.xml files inside the crawler folder. All requests for those files will be redirected into this directory.
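For illustration, a project with two languages might end up with a layout like the following. The per-language file names shown here are hypothetical examples; the actual names depend on your configuration:

crawler/
├── robots.txt
├── sitemap.en.xml
└── sitemap.de.xml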

It is possible to add files to this directory manually, but the better way is to use the Kiwa Sitemap module, which creates and updates those files automatically.
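If you do maintain the files by hand, a minimal robots.txt could look like this. The sitemap URL is a placeholder; replace it with your own domain:

# Example values: allow all crawlers and point them to the sitemap
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml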

The Kiwa Sitemap module

The Kiwa Sitemap module automates the generation of the robots.txt and sitemap.xml files.

To add it to your project, run

$ composer require kiwa/sitemap

If you are using the Kiwa Console, a new command is now available: sitemap:create.

Creating a sitemap

Let's use the Kiwa Console to create a new sitemap by calling

$ bin/console sitemap:create

The crawler requests the main URL and follows all available links. It then creates multiple sitemap files, one for each language, and stores them inside the crawler folder.
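Sitemap files follow the standard sitemaps.org protocol, so an entry in a generated file might look roughly like this. The URL and date are placeholders, and Kiwa's exact output may differ:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative example following the sitemaps.org schema -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2024-01-01</lastmod>
    </url>
</urlset>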