Robots.txt and Sitemap.xml

This page explains how to create and manage robots.txt and sitemap.xml files in a Kiwa project.

Introduction

The robots.txt file and all sitemap.xml files can be stored in the public folder.

You can add these files manually, but the recommended way is to use the Kiwa Sitemap module, which creates and updates them automatically.
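For orientation, a minimal hand-written robots.txt in the public folder could look like the following. The domain and the sitemap URL are placeholders and do not reflect what the Kiwa Sitemap module generates.

User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml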

The Kiwa Sitemap module

The Kiwa Sitemap module generates the robots.txt and sitemap.xml files automatically.

To add it to your project, run

$ composer require kiwa/sitemap

If you are using the Kiwa Console, a new command is now available: sitemap:create.
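Assuming the Kiwa Console follows the usual console conventions, you can check that the command is registered by listing all available commands; the list command shown here is an assumption, not documented Kiwa behavior.

$ php bin/console list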

Creating a sitemap

Let's use the Kiwa Console to create a new sitemap by calling

$ php bin/console sitemap:create

The crawler now requests the main URL and follows all available links. Afterwards, it creates multiple sitemap files, one for each language, and stores them in the public folder.
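As an illustration, a generated per-language sitemap could contain entries like the following. The file name and URLs are hypothetical and depend on your project; only the XML structure follows the standard sitemap protocol.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
        <loc>https://www.example.com/en/</loc>
    </url>
    <url>
        <loc>https://www.example.com/en/contact</loc>
    </url>
</urlset>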