Search engine optimization, in its most basic sense, relies upon one thing above all others: search engine spiders crawling and indexing your site.
But almost every website is going to have pages that you don’t want to include in this exploration.
In a best-case scenario, these are doing nothing to actively drive traffic to your site, and in a worst-case, they could be diverting traffic away from more important pages.
Luckily, Google allows webmasters to tell search engine bots what pages and content to crawl and what to ignore. There are several ways to do this, the most common being the use of a robots.txt file or the meta robots tag.
We have an excellent and detailed explanation of the ins and outs of robots.txt, which you should definitely read.
But in high-level terms, it’s a plain text file that lives in your site’s root and follows the Robots Exclusion Protocol (REP).
Robots.txt provides crawlers with instructions about the site as a whole, while meta robots tags contain instructions for specific pages.
Some meta robots tags you might employ include:

- index – tells search engines to add the page to their index.
- noindex – tells them not to add a page to the index or include it in search results.
- follow – instructs a search engine to follow the links on a page.
- nofollow – tells it not to follow links on the page.

And there is a whole host of others.
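For illustration, a meta robots tag is placed in the `<head>` of an HTML page. A minimal example using two of the directives above:

```html
<!-- Do not index this page, but do follow the links on it -->
<meta name="robots" content="noindex, follow">
```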
Both robots.txt and meta robots tags are useful tools to keep in your toolbox, but there’s also another way to instruct search engine bots to noindex or nofollow: the X-Robots-Tag.
What Is The X-Robots-Tag?
The X-Robots-Tag is another way for you to control how your webpages are crawled and indexed by spiders. Sent as part of the HTTP header response for a URL, it controls indexing for an entire page, as well as for specific elements on that page.
And whereas using meta robots tags is fairly straightforward, the X-Robots-Tag is a bit more complicated.
But this, naturally, raises the question:
When Should You Use The X-Robots-Tag?
According to Google, “Any directive that can be used in a robots meta tag can also be specified as an X-Robots-Tag.”
While you can set the same directives with both the meta robots tag and the X-Robots-Tag in an HTTP response header, there are certain situations where you would want to use the X-Robots-Tag – the two most common being when:
- You want to control how your non-HTML files are being crawled and indexed.
- You want to serve directives site-wide instead of on a page level.
For example, if you want to block a specific image or video from being crawled – the HTTP response method makes this easy.
The X-Robots-Tag header is also useful because it allows you to combine multiple tags within an HTTP response, or use a comma-separated list of directives to specify them.
Maybe you don’t want a certain page to be cached and also want it to be unavailable after a certain date. You can use a combination of “noarchive” and “unavailable_after” tags to instruct search engine bots to follow these instructions.
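As an illustrative sketch (the URL and date are hypothetical), the HTTP response for such a page might include headers like:

```
HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
X-Robots-Tag: noarchive
X-Robots-Tag: unavailable_after: 25 Jun 2025 15:00:00 PST
```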
Essentially, the power of the X-Robots-Tag is that it is much more flexible than the meta robots tag.
The advantage of using an X-Robots-Tag with HTTP responses is that it allows you to use regular expressions to apply crawl directives to non-HTML files, as well as apply parameters on a larger, global level.
To help you understand the difference between these directives, it’s helpful to categorize them by type. That is, are they crawler directives or indexer directives?
Here’s a handy cheat sheet to explain:
| Crawler Directives | Indexer Directives |
|---|---|
| Robots.txt – uses the user-agent, allow, disallow, and sitemap directives to specify where on-site search engine bots are allowed to crawl and not allowed to crawl. | Meta robots tag – allows you to specify and prevent search engines from showing particular pages on a site in search results.<br><br>Nofollow – allows you to specify links that should not pass on authority or PageRank.<br><br>X-Robots-Tag – allows you to control how specified file types are indexed. |
Where Do You Put The X-Robots-Tag?
Let’s say you want to block specific file types. An ideal approach would be to add the X-Robots-Tag to an Apache configuration or a .htaccess file.
The X-Robots-Tag can be added to a site’s HTTP responses in an Apache server configuration via the .htaccess file.
Real-World Examples And Uses Of The X-Robots-Tag
So that sounds great in theory, but what does it look like in the real world? Let’s take a look.
Let’s say we wanted search engines not to index .pdf file types. This configuration on Apache servers would look something like the below:
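As a sketch (assuming Apache’s mod_headers module is enabled), a .htaccess rule like the following attaches the header to every .pdf response:

```apache
# Send a noindex, nofollow X-Robots-Tag header with all PDF responses
<Files ~ "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</Files>
```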
In Nginx, it would look like the below:

```nginx
location ~* \.pdf$ {
  add_header X-Robots-Tag "noindex, nofollow";
}
```
Now, let’s look at a different scenario. Let’s say we want to use the X-Robots-Tag to block image files, such as .jpg, .gif, .png, etc., from being indexed. You could do this with an X-Robots-Tag that would look like the below:
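One way to sketch this in Apache (again assuming mod_headers is enabled) is a FilesMatch rule covering the image extensions:

```apache
# Keep common image formats out of the index
<FilesMatch "\.(png|jpe?g|gif)$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```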
Please note that understanding how these directives work and the impact they have on one another is crucial.
For example, what happens if both the X-Robots-Tag and a meta robots tag are present when crawler bots discover a URL?
If that URL is blocked by robots.txt, then certain indexing and serving directives cannot be discovered and will not be followed.
If directives are to be followed, then the URLs containing them cannot be disallowed from crawling.
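To illustrate the conflict with a hypothetical robots.txt: crawlers will never fetch anything under /private-files/, so any X-Robots-Tag noindex header on those URLs is never seen, and the pages can still end up indexed via external links:

```
User-agent: *
Disallow: /private-files/
```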
How To Check For An X-Robots-Tag
There are a few different methods you can use to check for an X-Robots-Tag on a site.
The easiest way to check is to install a browser extension that will show you X-Robots-Tag information about a URL.
Screenshot of Robots Exclusion Checker, December 2022
Another plugin you can use to determine whether an X-Robots-Tag is being used is the Web Developer plugin.
By clicking on the plugin in your browser and navigating to “View Response Headers,” you can see the various HTTP headers being used.
Another method that can be used at scale to identify issues on websites with a million pages is Screaming Frog.
After running a site through Screaming Frog, you can navigate to the “X-Robots-Tag” column.
This will show you which sections of the site are using the tag, along with which specific directives.
Screenshot of Screaming Frog Report. X-Robots-Tag, December 2022

Using X-Robots-Tags On Your Website

Understanding and controlling how search engines interact with your website is the cornerstone of search engine optimization. And the X-Robots-Tag is a powerful tool you can use to do just that.

Just be aware: It’s not without its risks. It is very easy to make a mistake and deindex your entire site.

That said, if you’re reading this piece, you’re probably not an SEO newbie. So long as you use it wisely, take your time, and check your work, you’ll find the X-Robots-Tag to be a useful addition to your toolbox.

Featured Image: Song_about_summer