How to create the perfect robots.txt for WordPress

To help search engines index your blog correctly, you need to create a correct robots.txt file for WordPress. Let’s look at how to create it and what to put in it.

What does Robots.txt give?

The file is needed for search engines to index a website correctly. Its contents “tell” the search robot which pages may be shown in search results and which should stay hidden. This lets you manage how your content appears in the SERP.

You should fill in robots.txt as early as the website development stage. Changes to it do not take effect immediately: it may take search engines from a week to several months to pick them up.
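
As a minimal sketch of how this works (the directory name here is purely illustrative, not a recommendation), a file like this hides one section from all robots while everything else stays open by default:

User-agent: *
Disallow: /private-drafts/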

Where is Robots located?

This is a plain text file located in the root directory of the site. It is available at

https://site.ru/robots.txt 

The engine does not create a physical robots.txt out of the box. You need to add it manually or use tools that generate it automatically.

Can’t find this file?

If the file’s contents are displayed at the address above but the file itself is not on the server, it is being generated virtually (WordPress and some SEO plugins work this way). Search engines do not distinguish between the two; all that matters is that the file is reachable at that URL.

What does it consist of?

Of four main directives (a short example follows the list):

  • User-agent – specifies which search robots the rules below apply to.
  • Disallow – denies access to the specified paths.
  • Allow – explicitly allows access to paths.
  • Sitemap – the full URL of the XML sitemap.
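
For instance, a group addressed only to Google’s main crawler might look like the sketch below (the paths and the sitemap URL are purely illustrative):

User-agent: Googlebot
Disallow: /drafts/
Allow: /drafts/published-page.html
Sitemap: https://site.ru/sitemap.xml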

Correct robots.txt for WordPress

There are many variants: the right set of rules differs from site to site.

Here is an example of a correct robots.txt that takes into account all the standard sections of a WordPress site. Let’s take a quick look at the directives.

User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-content/cache
Disallow: /wp-json/
Disallow: /wp-login.php
Disallow: /wp-register.php
Disallow: /xmlrpc.php
Disallow: /license.txt
Disallow: /readme.html
Disallow: /trackback/
Disallow: /comments/feed/
Disallow: /*?replytocom
Disallow: */feed
Disallow: */rss
Disallow: /author/
Disallow: /?
Disallow: /*?
Disallow: /?s=
Disallow: *&s=
Disallow: /search
Disallow: *?attachment_id=
Allow: /*.css
Allow: /*.js
Allow: /wp-content/uploads/
Allow: /wp-content/themes/
Allow: /wp-content/plugins/
Sitemap: https://site.ru/sitemap_index.xml

The first line indicates that the rules apply to all search robots (crawlers).

The Disallow directives keep service directories and files, cached pages, the login and registration pages, RSS feeds, author archives, internal search results and attachment pages out of the search results.

Allow permits crawling of scripts, styles, uploads, themes and plugins, so these resources can be indexed and pages render correctly for the robot.

The last line is the address of the XML sitemap.
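
Note how Allow and Disallow interact: when both match the same URL, Google applies the more specific (longer) rule. A common illustration of this, separate from the example above, is opening WordPress’s admin-ajax.php while keeping the rest of /wp-admin closed:

Disallow: /wp-admin
Allow: /wp-admin/admin-ajax.php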

How to create robots.txt for a website

Let’s take a look at several methods.

Manually

This can be done, for example, in Notepad (on a local server) or via an FTP client (on hosting).
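
If you write the file by hand, a minimal starting point might look like the sketch below (replace the sitemap URL with your own); you can then extend it with the rules from the full example above:

User-agent: *
Disallow: /wp-admin
Allow: /wp-admin/admin-ajax.php
Sitemap: https://site.ru/sitemap_index.xml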

You can also do this with plugins. Let’s take a look at the best ones.

Clearfy Pro

Clearfy Pro creates a virtual robots.txt file. To set it up:

  1. Go to the Clearfy Pro admin menu.
  2. On the SEO tab, enable the Generate correct robots.txt option.
  3. Fill in the contents of the file.
  4. Save your changes.

Yoast SEO

This powerful SEO plugin for WordPress will do the trick too.

  1. Navigate to SEO > Tools.
  2. Click File editor.
  3. If the file does not yet exist in the root directory, click Create robots.txt file.
  4. If it already exists, an editor will open where you can make changes.
  5. Click Save changes to robots.txt.

All in One SEO Pack

This solution also “knows how” to work with robots.txt. To use it:

  1. Go to All in One SEO > Modules.
  2. Select the Robots.txt module and click Activate.
  3. Navigate to All in One SEO > Robots.txt.
  4. Add directives in the fields.

Setting up for online stores (WooCommerce)

For WordPress sites running this extension, just add these rules (a combined fragment showing where they go follows):

Disallow: /cart/
Disallow: /checkout/
Disallow: /*add-to-cart=*
Disallow: /my-account/
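
These lines go into the same User-agent group as the general rules, so the relevant fragment of the final file would look roughly like this (the general rules are abbreviated with a comment):

User-agent: *
# ... the general Disallow/Allow rules from the example above ...
Disallow: /cart/
Disallow: /checkout/
Disallow: /*add-to-cart=*
Disallow: /my-account/
Sitemap: https://site.ru/sitemap_index.xml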
