Adding robots.txt to a Django site



Yesterday we added sitemaps as part of SEO. Today, we need a file called robots.txt to tell bots where they can't go on our website.

Not all bots are going to use this file, but most of the good ones will.

We can start by creating a new file called robots.txt in your templates folder.

Add the following content to it:

User-agent: * 
Disallow: /admin/
Disallow: /api/

Here I say that all robots ("*") should not go into /admin/ or /api/. You can add more paths here if you want to.
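
Since we added sitemaps yesterday, you can also point crawlers to the sitemap from this file with a Sitemap line. This is optional, and the absolute URL below is just a placeholder for your own domain:

User-agent: *
Disallow: /admin/
Disallow: /api/

Sitemap: https://example.com/sitemap.xml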

Then, the only thing we need to do is add this to the URL patterns in urls.py:

# Import the generic TemplateView from Django
from django.urls import path
from django.views.generic.base import TemplateView

# Append this to your url patterns
urlpatterns = [
    path('robots.txt', TemplateView.as_view(template_name="robots.txt", content_type="text/plain")),
]
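
If you would rather not keep the rules in a template at all, the same result can be had with a tiny view. This is just an alternative sketch, not what we use in this tutorial:

# A hypothetical alternative: build the response in a view instead of a template
from django.http import HttpResponse

def robots_txt(request):
    lines = [
        "User-agent: *",
        "Disallow: /admin/",
        "Disallow: /api/",
    ]
    return HttpResponse("\n".join(lines), content_type="text/plain")

# And in urls.py: path('robots.txt', robots_txt)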

And that's it! You can now go to http://127.0.0.1:8000/robots.txt to see the file.
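
If you want to check the route without opening a browser, you can also write a quick test with Django's test client. A minimal sketch, assuming the template and URL pattern above are in place:

# tests.py
from django.test import TestCase

class RobotsTxtTest(TestCase):
    def test_robots_txt(self):
        # Fetch the file and make sure it is served as plain text with our rules
        response = self.client.get("/robots.txt")
        self.assertEqual(response.status_code, 200)
        self.assertIn("text/plain", response["Content-Type"])
        self.assertIn("Disallow: /admin/", response.content.decode())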

Comments

Joseph Abu | Mar 03, 23 11:56

What is User-agent, and is it compulsory to add it?


Stein Ove Helset | Mar 04, 23 07:21

It's to specify which bots you are targeting. In my case, I just target them all :-)
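
If you only want to target a specific crawler, you can name it instead of using the wildcard. A hypothetical example (Googlebot is just for illustration):

User-agent: Googlebot
Disallow: /api/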

