How to Access and Edit Your WordPress Robots.txt File: The UK Guide

Master your website’s SEO. Learn how to access, create, and edit the WordPress robots.txt file with our definitive, step-by-step guide for UK users.

[Featured image: a person in a bright home office pointing at a laptop screen showing the WordPress dashboard with the robots.txt file editor open.]

This post may contain affiliate links. If you make a purchase through these links, we may earn a commission at no additional cost to you.

Imagine your website is a huge, bustling museum, like the British Museum in London. It’s packed with amazing things you want everyone to see—your blog posts, your products, your stunning gallery. But it also has staff-only areas: the stockroom, the offices, the cleaning cupboards. You want visitors to see the exhibits, but you don’t want them wandering into the boiler room.

In the digital world, search engines like Google send out little automated visitors called “robots” or “crawlers” to explore your website. And just like a real museum, you need a way to give them directions. You need to tell them, “Welcome! Please check out the Egyptian mummies and the Rosetta Stone, but please ignore the staff canteen.”

That’s exactly what a robots.txt file does. It’s the friendly, unseen traffic warden for your website.

It might sound terribly technical, but it’s one of the most important files for your site’s health and its visibility in search results, which is what Search Engine Optimisation (SEO) is all about. Getting it right can help you rank higher. Getting it wrong can make your site invisible.

But don’t you worry. This guide will walk you through everything, step by step. We’ll ditch the jargon and explain it all in plain English. Whether you’re a small business owner in Manchester, a blogger in Cornwall, or just curious about how your WordPress site ticks, you’ll be a robots.txt boss by the end of this.

What on Earth is a Robots.txt File? A Plain English Explanation

At its heart, a robots.txt file is just a simple text file. It lives in the main directory of your website and contains a set of instructions for web robots. That’s it. No fancy code, no complicated programming. Just a few lines of text.

Think of It as a ‘Do Not Disturb’ Sign for Google

The best way to think about robots.txt is as a set of polite suggestions. It’s like putting a “Do Not Disturb” sign on a hotel room door. Most well-behaved visitors (like Googlebot, Bingbot, and other major search engine crawlers) will see the sign and respectfully move on.

These instructions are part of something called the Robots Exclusion Protocol, a web standard that most search engines have agreed to follow. They’ll check for this file before they start exploring your site to see if there are any rules they need to obey.

What your robots.txt file does:

  • Tells bots which pages to ignore: It stops them from visiting and “crawling” certain pages or sections of your website.
  • Tells bots which pages they can visit: By not blocking them, you’re giving them the green light to explore.
  • Points them to your sitemap: It can include a link to your sitemap, which is a map of all the important pages you do want them to find.

What your robots.txt file doesn’t do:

  • It’s not a security guard: This is a crucial point. The robots.txt file is based on trust. It only works for well-behaved bots. Malicious bots, designed to scrape content or find security holes, will completely ignore it. Never use robots.txt to hide private information.
  • It doesn’t guarantee pages won’t be in Google: This sounds odd, but it’s true. If you block a page in robots.txt, Google won’t visit it. However, if another website links to that page, Google might still find out it exists and show it in search results (usually without a description). It’s like hearing about a party you weren’t invited to. You know it’s happening, but you don’t know what’s going on inside. If you want to be certain a page stays out of search results, you need a different tool called a noindex tag (more on that later!).

Why Should You Care? SEO and Your Website’s Health

So, why bother with this little text file? Because it has a surprisingly big impact on your site’s performance in search engines.

1. It Manages Your “Crawl Budget”

Search engines don’t have unlimited time and resources. They allocate a “crawl budget” to every website, which is roughly the number of pages they will crawl during a certain period.

You want them to spend this precious budget wisely, focusing on your most important content: your blog posts, product pages, and your homepage. You don’t want them wasting time crawling the backend admin pages of WordPress, your shopping cart checkout process, or thousands of search result pages on your site.

By using robots.txt to block these unimportant areas, you guide Google to spend its budget on the pages that actually matter—the ones you want to rank in search results.

2. It Prevents Indexing of Unwanted Content

Your WordPress site creates a lot of pages you don’t necessarily want people to find on Google. Think about internal search result pages, tag pages with only one post, or user profile pages. A well-configured robots.txt file helps keep this low-value content out of the search results, making your site look cleaner and more authoritative to Google.
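For example, WordPress internal search results live at addresses like yoursite.co.uk/?s=biscuits. If you’d rather bots left those alone, a couple of lines like these will do it (an illustrative snippet, assuming your site uses the standard /?s= search URLs; the /search/ line covers themes that use pretty search permalinks):

User-agent: *
Disallow: /?s=
Disallow: /search/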

3. It Helps Prevent Server Overload

When a bot crawls your site, it makes requests to your server. A very aggressive bot could make thousands of requests in a short time, potentially slowing down your website for real human visitors. This is less of a concern for small sites on modern UK hosting, but robots.txt can include a “crawl-delay” directive to tell bots to take it easy. Be aware that Google ignores this particular rule, although some other search engines, such as Bing, do respect it.
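If you ever want to try it, the directive slots into a user-agent group like this (the ten seconds is purely illustrative, and remember Google will ignore it):

User-agent: *
Crawl-delay: 10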

Physical vs. Virtual: The Two Faces of WordPress Robots.txt

Here’s where things get a little bit clever with WordPress. There isn’t always a file called robots.txt sitting on your server. WordPress has two ways of handling it.

The Default: WordPress’s Magical ‘Virtual’ File

Out of the box, a fresh WordPress installation creates a virtual robots.txt file. This means the file doesn’t actually exist in your website’s file structure. You won’t find it if you look for it via FTP.

Instead, WordPress generates it automatically whenever a search engine (or you) requests it. It’s created on-the-fly.

A default virtual robots.txt in WordPress looks something like this:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yourwebsite.co.uk/wp-sitemap.xml

This is a sensible starting point. It tells all bots (User-agent: *) to stay out of the admin area (Disallow: /wp-admin/) but makes an exception for one important file (Allow: /wp-admin/admin-ajax.php) that helps your site’s interactive features work properly. It also tells them where to find your sitemap.

Taking Control: Creating a ‘Physical’ File

The virtual file is fine, but it’s very basic. To take full control and add your own custom rules, you need to create a physical robots.txt file.

When you create a physical file in your site’s root directory (we’ll show you how), it completely overrides the virtual one. From that point on, WordPress will stop generating the virtual file, and all bots will read your new physical file instead. This is what you’ll be doing when you use an SEO plugin or upload the file yourself.

First Things First: Does Your Website Have a Robots.txt File?

Before you start editing, let’s find out what you’re working with. Luckily, this is the easiest check in the world.

Simply open a new tab in your web browser and type your full website address, followed by /robots.txt.

For example: https://yourgreatbritishwebsite.co.uk/robots.txt

Press Enter, and you’ll see one of three things:

  1. The WordPress default file: If you see the basic rules we mentioned above, you’re looking at the virtual file.
  2. A custom file: You might see a longer file with more rules. This means a physical file already exists, likely created by you, your web developer, or an SEO plugin.
  3. A 404 Not Found error: This is rare, but it means you don’t have a virtual or physical file. Don’t panic! It just means search engines will assume they can crawl everything. We can easily fix this.

Now that you know what you’ve got, let’s look at how to take control.

How to Find, Create, and Edit Your Robots.txt File: Three Simple Methods

There are a few ways to get the job done. We’ll start with the easiest and safest method for most people and then move on to the more technical options.

Method 1: The Easy Way – Using an SEO Plugin

If you have a WordPress site, you should absolutely be using an SEO plugin. They make optimising your site a doddle. Top plugins like Yoast SEO, Rank Math, and All in One SEO (AIOSEO) all come with a built-in tool to manage your robots.txt file.

This is the recommended method because it’s safe, simple, and you don’t have to leave your WordPress dashboard.

Using Yoast SEO

Yoast is one of the most popular WordPress plugins on the planet. If you have it installed, editing your robots.txt is a breeze.

  1. From your WordPress dashboard, go to Yoast SEO > Tools.
  2. On the Tools page, click on File editor.
  3. If you don’t have a physical robots.txt file yet, Yoast will give you a button that says “Create robots.txt file”. Click it.
  4. You will now see a text box containing your robots.txt rules. You can edit them directly here.
  5. Once you’re happy with your changes, click the “Save changes to robots.txt” button. That’s it!

Using Rank Math

Rank Math is another brilliant and very popular SEO plugin with a massive feature set.

  1. From your WordPress dashboard, navigate to Rank Math > General Settings.
  2. In the menu on the left of this screen, click on Edit robots.txt.
  3. You’ll see a text editor where you can add or change your rules. The editor will already be populated with some default rules, which are a great starting point.
  4. Make your edits and click “Save Changes”. Job done.

Using All in One SEO (AIOSEO)

AIOSEO is another fantastic choice for managing your site’s SEO.

  1. In your WordPress dashboard, go to All in One SEO > Tools.
  2. You’ll see a tab for Robots.txt Editor. Click on it.
  3. By default, AIOSEO will show you the rules from the WordPress virtual file. To add your own, click the toggle to “Enable Custom Robots.txt”.
  4. A text editor will appear. You can add your custom rules here. There are also handy buttons to add new rules for different user-agents.
  5. When you’ve finished, click “Save Changes”.

Method 2: The Tech-Savvy Way – Using FTP or SFTP

If you don’t use an SEO plugin with a file editor, or if you’re just more comfortable working with files directly, you can use FTP. FTP stands for File Transfer Protocol, and it’s a way of connecting to your website’s server to manage files. SFTP is simply the secure, encrypted version of the same thing, and you should use it if your host offers it.

What You’ll Need

  • An FTP Client: This is a program you install on your computer. FileZilla is a popular, free, and excellent choice for both Windows and Mac.
  • Your FTP Credentials: You’ll get these from your web hosting provider. You’ll need a hostname (e.g., ftp.yourwebsite.co.uk), a username, and a password.

Step-by-Step Guide

  1. Connect to Your Server: Open FileZilla and enter your host, username, and password at the top, then click “Quickconnect”.
  2. Navigate to the Root Directory: Once connected, you’ll see your computer’s files on the left and your website’s files on the right. Your website’s main folder is usually called public_html, www, or your site’s name. Double-click to open it. This is your root directory.
  3. Find or Create the File: Look for a file named robots.txt in the list of files on the right.
    • If it exists: Right-click on it, choose “View/Edit”. The file will open in your computer’s default text editor. Make your changes, save the file, and close it. FileZilla will then ask if you want to upload the modified file back to the server. Say yes.
    • If it doesn’t exist: Right-click in the empty space in the file list on the right, and choose “Create new file”. Name the file robots.txt. Then, right-click on your new file and choose “View/Edit” to add your rules. Save it, and upload the changes when prompted.

Method 3: The Hosting Control Panel Way – Using cPanel’s File Manager

Most UK web hosts provide a control panel like cPanel or Plesk. These have a built-in File Manager tool, which is like a web-based version of an FTP client. It’s another great way to edit your file without installing any software.

Step-by-Step Guide

  1. Log in to your hosting account and open cPanel.
  2. Look for an icon called “File Manager” and click on it.
  3. In the File Manager, navigate to your root directory (again, usually public_html or www).
  4. Look for robots.txt in the file list.
    • If it’s there: Select it and click the “Edit” icon in the top menu. A text editor will open in your browser where you can make your changes. Click “Save Changes” when you’re done.
    • If it’s not there: Click the “+ File” icon in the top menu. Name the new file robots.txt and click “Create New File”. Then select your new file and click “Edit” to add your rules.

Crafting the Perfect Robots.txt File for WordPress: Rules and Examples

Right, you know how to access the file. But what should you actually put in it? Let’s break down the language of robots.txt. It’s much simpler than it looks.

Understanding the Lingo: A Simple Syntax Guide

A robots.txt file is made up of groups of rules. Each group starts by specifying a user-agent, followed by the rules for that agent.

  • User-agent: This specifies which robot the rules apply to. You can use a wildcard * to apply the rules to all bots (User-agent: *), or you can target specific bots like Google’s (User-agent: Googlebot). For most sites, * is all you need.
  • Disallow: This is the “Do Not Disturb” instruction. Anything after Disallow: is a URL path you don’t want the bot to crawl. The path always starts with a /, which represents your root directory.
  • Allow: This is an exception rule. It’s used to let bots access a specific file or subfolder inside a directory that you have otherwise disallowed.
  • Sitemap: This isn’t a rule, but it’s incredibly useful. You use it to tell bots the full URL of your sitemap file. This helps them discover all your important pages much faster.

A Brilliant, Basic Robots.txt for Most WordPress Sites

For 99% of WordPress websites, the following rules are a fantastic and safe starting point. This is a more robust version than the WordPress default.

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Disallow: /readme.html
Disallow: /license.txt
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yourgreatbritishwebsite.co.uk/sitemap_index.xml

Let’s break that down:

  • User-agent: * This rule applies to all search engine robots.
  • Disallow: /wp-admin/ and Disallow: /wp-includes/ These block the core WordPress admin and backend folders. There’s no reason for Google to be in there.
  • Disallow: /wp-content/plugins/ and Disallow: /wp-content/themes/ These block the folders containing your plugin and theme files. Search engines don’t need to crawl the raw code, just the final pages they produce. One caveat: these folders also hold CSS and JavaScript files, which Google fetches to render your pages properly, so if Search Console ever reports rendering problems, remove these two lines.
  • Disallow: /readme.html and /license.txt These block standard WordPress info files that are of no value to search engines.
  • Allow: /wp-admin/admin-ajax.php This is the important exception. It allows bots to access a file needed for some of your site’s interactive features, even though the rest of the /wp-admin/ folder is blocked.
  • Sitemap: ... Remember to replace the example URL with the actual URL of your sitemap. SEO plugins like Yoast and Rank Math create this for you automatically.

Common Blunders: How to Avoid Messing Up Your Robots.txt

With great power comes great responsibility. A tiny mistake in your robots.txt file can have a massive negative impact on your SEO. Here are the most common pitfalls to avoid.

The “Empty Disallow” Catastrophe

This one is terrifyingly easy to do. Look at these two lines:

  • Disallow: (with nothing after it)
  • Disallow: /

They look similar, but they do completely opposite things. Disallow: means “don’t disallow anything”. It allows bots to crawl your entire site. Disallow: / means “disallow everything starting from the root directory”—in other words, it blocks your entire website. One single slash can make your site invisible to Google. Always double-check your slashes!
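To make that concrete, this innocent-looking pair of lines is the combination that hides your entire site. If you ever spot it in your file and didn’t put it there deliberately, remove that slash straight away:

User-agent: *
Disallow: /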

Forgetting Case Sensitivity

The paths in your robots.txt file are case-sensitive. If you have a folder called /Photos/, blocking /photos/ will do nothing. Make sure the casing in your rules matches your actual folder and file names.
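A quick illustration, using a made-up /Photos/ folder (the # marks a comment, which robots.txt happily ignores):

# Blocks /Photos/ but NOT /photos/ (the casing must match the real folder name)
Disallow: /Photos/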

Using Robots.txt for Security

We’ve said it before, but it’s worth repeating. Robots.txt is not a security tool. Blocking a folder like /secret-documents/ doesn’t secure it. It just stops well-behaved bots from crawling it. Worse, it publicly advertises the location of your secret folder to anyone who reads your robots.txt file. It’s like leaving a note on your front door saying, “The family jewels are not in the upstairs bedroom.”

Forgetting the Sitemap

Don’t forget to include the link to your sitemap. It’s one of the best ways to help Google find and index your content efficiently. Check your SEO plugin’s settings to find your sitemap URL if you’re not sure where it is.

Testing, Testing, 1, 2, 3: How to Check Your Robots.txt is Working

After you’ve saved your changes, it’s vital to check that your file is doing what you expect and doesn’t contain any errors. The best tool for the job is provided by Google itself.

Using Google Search Console’s Robots.txt Report

Google Search Console is a free service that helps you monitor your site’s performance in Google Search. If you haven’t set it up, you absolutely should. Inside, there’s a handy robots.txt report that fetches and checks your file for you.

  1. Sign in to Google Search Console.
  2. Choose your website property.
  3. Go to Settings > Crawling > robots.txt and click “Open report”. (Note: Google retired its old standalone robots.txt Tester, and the interface does change from time to time, so the exact location may shift.)
  4. The report shows the robots.txt files Google has fetched for your site, when it last fetched them, and any syntax errors or warnings it found.
  5. To check whether a specific page is blocked, pop its full URL into the URL Inspection tool at the top of Search Console. If robots.txt is blocking it, the inspection result will tell you that crawling isn’t allowed.

This gives you immediate peace of mind that your rules are being interpreted correctly by Google.
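If you fancy a second opinion away from Google’s tools, Python’s standard library includes a robots.txt parser you can run on your own computer. This is only a quick sketch, assuming your file lives at the example address shown; swap in your own domain and any paths you want to check:

from urllib.robotparser import RobotFileParser

# Point the parser at your live robots.txt file
parser = RobotFileParser()
parser.set_url("https://yourgreatbritishwebsite.co.uk/robots.txt")
parser.read()

# Check a few paths the way a well-behaved bot would
for path in ["/", "/wp-admin/", "/wp-admin/admin-ajax.php"]:
    url = "https://yourgreatbritishwebsite.co.uk" + path
    allowed = parser.can_fetch("*", url)
    print(path, "->", "Allowed" if allowed else "Blocked")

It simply prints “Allowed” or “Blocked” for each path, mirroring the checks the old tester used to do.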

Beyond the Basics: Robots.txt vs. Noindex – What’s the Difference?

This is a common point of confusion. Both can be used to control how Google interacts with your pages, but they work in fundamentally different ways.

Here’s how the robots.txt Disallow rule and the noindex meta tag compare:

  • What it does: A Disallow rule tells bots not to crawl a page or folder. A noindex tag tells bots not to show a page in search results.
  • How it works: Disallow is a rule in the /robots.txt file. noindex is a piece of code (<meta name="robots" content="noindex">) placed in the HTML <head> of a specific page.
  • Can Google still index the page? With Disallow, yes, possibly: if other sites link to it, Google may still index the URL without visiting it. With noindex, no: it’s a direct command to keep the page out of the search index.
  • Best for: Disallow is best for blocking whole sections (e.g. /wp-admin/), managing crawl budget, and stopping bots crawling low-value pages. noindex is best for keeping individual, sensitive pages out of search results (e.g. a “thank you” page after a purchase).

The Golden Rule:

  • If you don’t want Google to waste time crawling something, use robots.txt.
  • If you absolutely, positively don’t want a page to appear in search results, use the noindex tag. (You can set this easily in the settings for any page or post using your SEO plugin).
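For reference, here’s roughly what that tag looks like once your SEO plugin has added it to a page, shown as a minimal, illustrative HTML head (your plugin will add its own extra attributes and tags around it):

<head>
  <meta name="robots" content="noindex">
  <title>Thank You for Your Order</title>
</head>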

Conclusion: You’re Now a Robots.txt Boss!

And there you have it. The robots.txt file, demystified. It’s not a dark art or a complex piece of code. It’s simply a set of instructions, a helpful guide for the search engine robots that visit your site every day.

You’ve learned what it is, why it’s so vital for your WordPress SEO, and the three main ways to take control of it. You know what rules to put in it and, just as importantly, what mistakes to avoid.

By taking ten minutes to check and configure your robots.txt file, you are telling search engines that you are a serious website owner who cares about quality. You’re helping them do their job better, and in return, they’ll reward you with a more efficiently crawled site, which is the foundation of great SEO.

So go on, have a look at your site’s robots.txt file. You now have the knowledge and confidence to make sure it’s working perfectly for you.
