How to block part of a page from being indexed by search engines

Every web page is made up of text segments, and not all of them are meaningful to readers. As a website owner, you may find that certain segments of your content are irrelevant, duplicated, or simply not SEO-friendly. Often a particular part of a page adds no value from an SEO perspective; worse, it can drag the page down in the SERP rankings. Should the webmaster therefore exclude the entire page from Google’s index?

That is rarely a good idea, because search engine crawlers would then also miss the important, SEO-optimized segments of that page.

So what is the solution? You can block just the segment of a web page that you don’t want Google to index. That way you keep Google away from the unnecessary parts of your pages while preserving, and even improving, your site’s search ranking.

How can you exclude parts of pages from the indexes of Google and other search engines?

Robots.txt files solve part of the problem

Typically, robots.txt comes into play when an entire URL or directory needs to be kept away from Google and other search engine crawlers. Its main purpose is to protect a website from being overloaded with requests. A robots.txt file cannot hide an individual section of a page directly, but it can block the URL that a section is loaded from, which is the basis of the iframe technique described later.

Are you familiar with robots.txt files? Not sure how to create one for your site? The following snippet shows the standard structure of a robots.txt file:

# Group 1
User-agent: Googlebot
Disallow: /nogooglebot/

# Group 2
User-agent: *
Allow: /

Sitemap: http://www.example.com/sitemap.xml

A robots.txt file is not placed inside a page’s HTML; it is a single plain-text file uploaded to the root of your domain (for example, www.example.com/robots.txt). Each Disallow rule tells crawlers not to fetch the matching paths. Keep in mind that blocking crawling does not guarantee removal from search results; to de-index a page that Google already knows about, use a noindex meta tag instead.
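Before uploading a robots.txt file, it helps to verify that the rules block exactly what you intend. The following sketch, which is not part of the article’s workflow, checks the example rules above with Python’s standard-library urllib.robotparser; the bot name OtherBot and the example URLs are placeholders.

```python
# Quick local check of robots.txt rules using Python's standard library.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /nogooglebot/

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot is barred from /nogooglebot/ but may fetch everything else.
print(parser.can_fetch("Googlebot", "http://www.example.com/nogooglebot/page.html"))  # False
print(parser.can_fetch("Googlebot", "http://www.example.com/index.html"))             # True
# Crawlers not named in a group fall back to the "*" rules.
print(parser.can_fetch("OtherBot", "http://www.example.com/nogooglebot/page.html"))   # True
```

Running a check like this catches typos in Disallow paths before a crawler ever sees them.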

Meta tags can help a lot

Robots meta tags come in handy for excluding individual pages and articles from search engine indexes. They are placed in a page’s <head> section and apply to the whole page.

  • If you want Google to drop a particular page from its index after a given date, add the following robots meta tag to that page’s <head> section:

<meta name="robots" content="unavailable_after: 25-Jun-2025 15:00:00 EST">

Modify the date part of the meta tag to your liking. After that date, the page becomes unindexable for Google bots. To keep a page out of the index from the start, use <meta name="robots" content="noindex"> instead; other search engine crawlers honor this directive as well.

Apart from these two, several other robots directives can help you keep certain parts of your site out of search engine indexes.
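One such option is the X-Robots-Tag HTTP header, which behaves like a robots meta tag but also works for files that have no HTML <head>, such as PDFs and images. A minimal sketch for an Apache server, assuming mod_headers is enabled (the .pdf pattern is just an example):

```apache
# Hypothetical Apache config: send "X-Robots-Tag: noindex" for every PDF,
# keeping those files out of search engine indexes (requires mod_headers).
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```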

Putting excludable segments in iframes is a simple solution

Although this technique takes a little technical skill, it is a handy way to solve the problem. An iframe is not a meta tag; it is an HTML element that embeds one document inside another. The webmaster puts the content to be hidden in a separate HTML file, blocks that file’s URL in robots.txt (or gives it a noindex meta tag), and then embeds the file in the host page with an iframe. Search engines index the host page but do not fold the blocked iframe content into it. The following excerpt makes this clearer:

This text can be crawled, but the following text is not visible to search engines:
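The host page for this technique might look like the sketch below; the /private/ path and the file name are placeholders, and the corresponding robots.txt would need a matching "Disallow: /private/" rule.

```html
<!-- Host page: everything here is crawlable except the iframe contents. -->
<p>This text can be crawled.</p>

<!-- hidden-section.html lives under /private/, a path blocked in
     robots.txt, so search engines never fetch or index its contents. -->
<iframe src="/private/hidden-section.html" title="Excluded section"></iframe>
```

The embedded file still renders for visitors, so the page looks unchanged to readers while the blocked segment stays out of the index.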