SEMrush: The Definitive Guide

Errors:

1. Pages have multiple canonical URLs:

POSSIBLE ISSUE:

“Multiple rel=canonical tags” means a URL has more than one rel=canonical tag on the page. This confuses search engines and makes it difficult for them to identify which URL is the actual canonical page. As a result, they may ignore all of the canonical tags or choose the wrong one.

RECOMMENDATION: 

Remove all rel=canonical tags except the one that points to the page you'd like to serve as the actual canonical version.
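
For reference, a page should carry a single canonical reference in its <head>. A minimal sketch, with a placeholder URL:

<head>
  <link rel="canonical" href="https://www.example.com/preferred-page/">
</head>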

2. Pages have duplicate Meta descriptions:

POSSIBLE ISSUE: 

A meta description tag is a short summary of a web page's content that helps search engines understand what the page is about and can be shown to users in search results. Duplicate meta descriptions make it difficult for search engines and users to differentiate between webpages.

RECOMMENDATION:

Write a unique, relevant meta description for each of your webpages.
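
A minimal sketch of a page-specific description tag (the wording is illustrative):

<meta name="description" content="A short, unique summary of what this particular page covers.">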

3. Incorrect pages found in sitemap.xml:

POSSIBLE ISSUE: 

A sitemap.xml file makes it easier for crawlers to discover the pages on your website. Only good pages intended for your visitors should be included in it. Listing incorrect URLs (for example, duplicates or redirected pages) can confuse search engine robots about which URL they should index and prioritize in search results.

RECOMMENDATION: 

Check your sitemap.xml for URLs that point to copies of original webpages. Remove those URLs, keeping only the version you'd like to serve as the preferred one, or regenerate the sitemap.

4. Issues with mixed content:

POSSIBLE ISSUE:

This issue is triggered when the initial HTML is loaded over a secure HTTPS connection, but resources such as images, videos, stylesheets, or scripts are loaded over an insecure HTTP connection. This is known as mixed content. HTTPS is important for protecting both your site and your users from attack.

RECOMMENDATION:

Only embed HTTPS content on HTTPS pages.
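
For example, a script referenced from an HTTPS page should itself be requested over HTTPS (the URL is a placeholder):

<!-- Insecure: triggers a mixed content warning on an HTTPS page -->
<script src="http://www.example.com/assets/app.js"></script>

<!-- Secure: the same resource loaded over HTTPS -->
<script src="https://www.example.com/assets/app.js"></script>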

5. AMP pages have no canonical tag:

POSSIBLE ISSUE:

This issue occurs when an AMP page has no canonical tag. When you create both an AMP and a non-AMP version of the same page, each should carry a canonical tag to prevent duplicate content issues.

RECOMMENDATION:

Add a rel="canonical" tag in the <head> section of each AMP page.
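
A sketch of the pairing, with placeholder URLs: the AMP page points to its non-AMP counterpart with rel="canonical", and the non-AMP page points back with rel="amphtml":

<!-- In the <head> of the AMP page -->
<link rel="canonical" href="https://www.example.com/article/">

<!-- In the <head> of the non-AMP page -->
<link rel="amphtml" href="https://www.example.com/article/amp/">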

6. Internal links are broken:

POSSIBLE ISSUE: 

Broken internal links point to pages that no longer exist or to improperly formed URLs on your site. This issue can reduce your website's traffic and may discourage users from visiting your site. Broken internal links may also prevent crawlers from indexing your site, which can hurt its ranking.

RECOMMENDATION:

Please follow all the links reported as broken. If a webpage returns an error, remove the link leading to the error page or replace it with another resource. If the links reported as broken do work when accessed with a browser, you may try either of the following:

– Contact your web hosting support team.

– Instruct search engine robots not to crawl your website too frequently by specifying the “crawl-delay” directive in your robots.txt (as shown below).
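
If you do decide to throttle crawling, the directive looks like this in robots.txt (the value is illustrative; note that Googlebot ignores crawl-delay, while crawlers such as Bingbot respect it):

User-agent: *
Crawl-delay: 10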

7. Pages couldn’t be crawled:

POSSIBLE ISSUE:

This issue indicates that the crawler couldn't access the webpage because the server either timed out or refused/closed the connection before a response could be received.

RECOMMENDATION:

Please contact your web hosting technical support team and ask them to fix the issue.

8. Internal images are broken:

POSSIBLE ISSUE: 

This error may occur if an image no longer exists, its URL is misspelled, or its file path is invalid. Broken images may affect your search ranking.

RECOMMENDATION:

Replace all broken images or delete them.

9. Pages returned 5XX status code:

POSSIBLE ISSUE: 

5XX status codes (for example, 500 Internal Server Error) indicate server errors. These errors prevent users and search engine robots from accessing your webpages, which hurts user experience and search engines' ability to crawl your site.

RECOMMENDATION:

Investigate the causes of these errors and try to fix them.

10. Robots.txt file has format errors:

POSSIBLE ISSUE: 

A robots.txt file tells crawlers which pages and areas of your site they are allowed to visit. If your robots.txt file is poorly configured, it can lead to crawlability issues that affect your ranking.

RECOMMENDATION: 

Review your robots.txt file and fix all format errors.
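
For reference, a minimal well-formed robots.txt might look like this (the paths and sitemap URL are placeholders):

User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml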

11. Sitemap.xml files have format errors:

POSSIBLE ISSUE:

If your sitemap.xml file has any errors, search engines will not be able to process the data it contains, and they will ignore it.

RECOMMENDATION:

Review your sitemap.xml file and fix all errors.
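
For comparison, a minimal well-formed sitemap.xml looks like this (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>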

12. Pages don’t have title tags:

POSSIBLE ISSUE:

Your <title> tag appears in browsers and search results, and it helps both search engines and users understand what your page is about. Leaving the title tag blank or missing might affect your search ranking.

RECOMMENDATION: 

To rank high in search results and gain a higher click-through rate, you should ensure that each of your website’s pages has a unique and concise title containing your most important keywords.

13. Pages have a WWW resolve issue:

POSSIBLE ISSUE:

Normally, a webpage can be accessed with or without adding www to its domain name. If you haven’t specified which version should be prioritized, search engines will crawl both versions, and the link juice will be split between them. Therefore, none of your page versions will get high positions in search results.

RECOMMENDATION:

Set your preferred version in Google Search Console.
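
If you also want to enforce one version at the server level, a common approach on Apache is a 301 redirect in .htaccess. A sketch assuming the www version is preferred and mod_rewrite is enabled (hostnames are placeholders):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]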

14. Pages have no viewport tag:

ABOUT THIS ISSUE:

The viewport meta tag is an HTML tag that allows you to control a page's viewport size and scale on mobile devices. This issue might affect your mobile search rankings.

HOW TO FIX IT: 

Set the viewport meta tag for each page.
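
The standard responsive declaration goes in the <head> of each page:

<meta name="viewport" content="width=device-width, initial-scale=1">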

15. Pages returned 4XX status code:

ABOUT THIS ISSUE:

4XX client errors hurt your website's crawlability because search engine bots come across links to dead or broken pages on your website. This is usually the result of broken links. These errors prevent users and search engine robots from accessing your webpages, which will, in turn, lead to a drop in traffic.

RECOMMENDATION:

Please follow all links reported as 4xx. If a webpage returns an error, remove the link leading to the error page or replace it with another resource. If the links reported as 4xx do work when accessed with a browser, instruct search engine robots not to crawl your website too frequently by specifying the “crawl-delay” directive in your robots.txt.

16. Issues with hreflang values:

ABOUT THIS ISSUE:

An hreflang (rel=”alternate” hreflang=”x”) attribute helps search engines understand which page should be shown to visitors based on their location. You can target different audiences with hreflang according to their language and/or their country. It is very important to properly implement hreflang attributes, otherwise, search engines will not be able to show the correct language version of your page to the relevant audience.

RECOMMENDATION: 

Make sure that your hreflang attributes are used correctly (see the example after this list). Here are a few ways to avoid hreflang implementation issues:

– Specify the correct language code

– Specify the correct country code

– Use hyphens to separate language and country values

– Precede a country code with a language code

– Do not use a country code alone
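
A sketch of a correct hreflang block, using placeholder URLs and example language-country codes, placed in the <head> of every language version:

<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/page/">
<link rel="alternate" hreflang="de-de" href="https://www.example.com/de-de/page/">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page/">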

17. Non-secure pages:

ABOUT THIS ISSUE:

This issue is triggered if the crawler detects an HTTP page with an <input type="password"> field. Using a password field on an HTTP page is harmful to user security, as there is a high risk that user login credentials can be stolen.

RECOMMENDATION:

Move your HTTP webpages that contain a password field to HTTPS.

18. Issues with expiring or expired certificate:

POSSIBLE ISSUE:

If you allow your certificate to expire, users accessing your website will be presented with a warning message, which usually stops them from going further and may lead to a drop in your organic search traffic.

HOW TO FIX IT:

Ask your website administrator to renew the certificate and run periodic checks to avoid any future issues.

19. Issues with old security protocol:

POSSIBLE ISSUE:

Running SSL or an old TLS protocol (version 1.0) is a security risk, which is why it is strongly recommended that you implement the newest protocol versions.

RECOMMENDATION:

Update your security protocol to the latest version.

20. Issues with duplicate title tags:

POSSIBLE ISSUE:

 Duplicate <title> tags make it difficult for search engines to determine which of a website’s pages is relevant for a specific search query, and which one should be prioritized in search results. This might affect ranking.

RECOMMENDATION:

Provide a unique, concise title containing your most important keywords for each of your pages.

21. No redirect or canonical to HTTPS homepage from HTTP version:

POSSIBLE ISSUE:

Running both HTTP and HTTPS versions of your homepage can create problems: search engines may not be able to figure out which page to index and which one to prioritize. It is very important to make sure that their coexistence doesn't affect your SEO, as this may lead to traffic loss and poor rankings in search results.

RECOMMENDATION:

– Redirect your HTTP page to the HTTPS version via a 301 redirect

– Mark up your HTTPS version as the preferred one by adding a rel="canonical" link to your HTTP pages (see the sketch below)
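
A sketch of both options, assuming an Apache server and placeholder URLs:

# Option 1: force HTTPS via a 301 redirect in .htaccess (requires mod_rewrite)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]

<!-- Option 2: canonical link on the HTTP page pointing to the HTTPS version -->
<link rel="canonical" href="https://www.example.com/">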

22. Redirect chains and loops:

POSSIBLE ISSUE:

A redirect chain is a series of redirects that go from one URL to another, forcing people and search engines to wait until there are no more redirects to step through. This wastes crawl budget, affects how well your webpages are indexed, slows down your site's load speed, and, as a result, may hurt your ranking.

RECOMMENDATION:

If you are already experiencing issues with long redirect chains or loops, we recommend that you redirect each URL in the chain to your final destination page.

23. Pages with a broken canonical link:

POSSIBLE ISSUE:

By setting a rel="canonical" element on your page, you can inform search engines which version of a page you want to show up in search results. Canonical links that lead to non-existent webpages complicate crawling and indexing and may waste crawl budget.

RECOMMENDATION:

Review all broken canonical links. If a canonical URL applies to a non-existent webpage, remove it or replace it with another resource.

24. Pages have duplicate content issues:

ABOUT THIS ISSUE:

Pages with identical or nearly identical content are considered duplicate content. Duplicate content may confuse search engines as to which page to index and which one to prioritize in search results. Duplicated content across multiple pages may lead to traffic loss and poor rankings in search results.

HOW TO FIX IT:

Here are a few ways to fix duplicate content:

– Provide some unique content on the webpage.

– Remove duplicate content.

– Add a rel=”canonical” link to one of your duplicate pages to inform search engines which page to show in search results.

25. Issues with broken internal JavaScript and CSS files:

POSSIBLE ISSUE:

Any script that has stopped running on your website may affect your rankings, since search engines will not be able to properly render and index your webpages. Broken JS and CSS files may also cause website errors.

RECOMMENDATION:

Review all broken JavaScript and CSS files hosted on your website and fix any issues.

26. Subdomains don't support secure encryption algorithms:

ABOUT THIS ISSUE:

This issue is triggered when the crawler connects to your web server and detects that it uses old or deprecated encryption algorithms. Using outdated encryption algorithms is a security risk that can have a negative impact on your user experience and search traffic. Some web browsers may warn users accessing your website about loading insecure content, which undermines their confidence in your website and may lead to a drop in your organic search traffic.

HOW TO FIX IT:

Contact your website administrator and ask them to update encryption algorithms.

28. Sitemap.xml files are too large:

ABOUT THIS ISSUE:

This issue is triggered if the size of your sitemap.xml file (uncompressed) exceeds 50 MB or it contains more than 50,000 URLs. Sitemap files that are too large will put your site at risk of being ineffectively crawled or even ignored by search engines.

HOW TO FIX IT:

Break up your sitemap into smaller files. You will also need to create a sitemap index file to list all your sitemaps and submit it to Google.
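
A sitemap index that lists the split sitemap files looks roughly like this (file names are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-2.xml</loc>
  </sitemap>
</sitemapindex>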

29. Pages have slow load speed:

POSSIBLE ISSUE:

The crawler measures the time it takes to load a webpage's HTML code; load times for images, JavaScript and CSS are not factored in. Fast-loading pages positively affect user experience and may increase your conversion rates, and the quicker your page loads, the better its chances of ranking well.

RECOMMENDATION:

The main factors that negatively affect your HTML page generation time are your server’s performance and the density of your webpage’s HTML code.

So, try to clean up your webpage's HTML code. If the problem is with your web server, consider moving to a hosting service with more resources.

Warnings:

1. Issues with unminified JavaScript and CSS files:

POSSIBLE ISSUE:

Minification is the process of removing unnecessary lines, white space, and comments from source code. Minifying JS and CSS files makes them smaller, which decreases your page load time and can improve your search engine rankings.

RECOMMENDATION:

 Minify your JavaScript and CSS files.

If your webpage uses CSS and JS files that are hosted on an external site, contact the website owner and ask them to minify their files.

If this issue doesn’t affect your page load time, simply ignore it.

2. Links on HTTPS pages lead to HTTP pages:

POSSIBLE ISSUE:

If any link on your website points to the old HTTP version of a page, search engines can become confused as to which version they should rank.

RECOMMENDATION: 

Replace all HTTP links with the new HTTPS versions.

3. Issues with blocked internal resources in robots.txt:

POSSIBLE ISSUE: 

Blocked resources are resources (e.g., CSS, JavaScript, image files, etc.) that are blocked from crawling by a “Disallow” directive in your robots.txt file. By disallowing these files, you’re preventing search engines from accessing them and, as a result, properly rendering and indexing your webpages. This may lead to lower rankings.

RECOMMENDATION:

To unblock a resource, simply update your robots.txt file.
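
For example, if your CSS and JavaScript directories are currently disallowed, you can allow them explicitly (paths are placeholders):

User-agent: *
Allow: /assets/css/
Allow: /assets/js/
Disallow: /admin/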

4. Issues with uncompressed JavaScript and CSS files:

POSSIBLE ISSUE:

This issue is triggered if compression is not enabled in the HTTP response.

Compressing JavaScript and CSS files significantly reduces their size as well as the overall size of your webpage, thus improving your page load time.

Uncompressed JavaScript and CSS files make your page load slower, which negatively affects user experience and may worsen your search engine rankings.

If your webpage uses uncompressed CSS and JS files that are hosted on an external site, you should make sure they do not affect your page’s load time.

RECOMMENDATION:

Enable compression for your JavaScript and CSS files on your server.

If your webpage uses uncompressed CSS and JS files that are hosted on an external site, contact the website owner and ask them to enable compression on their server.

If this issue doesn’t affect your page load time, simply ignore it.
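
A sketch of enabling gzip compression on an Apache server via .htaccess, assuming mod_deflate is available (other servers have equivalent settings):

<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>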

5. Pages have low text-HTML ratio:

POSSIBLE ISSUE:

Your text to HTML ratio indicates the amount of actual text you have on your webpage compared to the amount of code. This warning is triggered when your text to HTML ratio is 10% or less.

Search engines have begun focusing on pages that contain more content. That’s why a higher text to HTML ratio means your page has a better chance of getting a good position in search results.

Less code increases your page’s load speed and also helps your rankings. It also helps search engine robots crawl your website faster.

RECOMMENDATION:

Split your webpage’s text content and code into separate files and compare their size. If the size of your code file exceeds the size of the text file, review your page’s HTML code and consider optimizing its structure and removing embedded scripts and styles.

6. Pages have too many parameters in their URLs:

POSSIBLE ISSUE:

Using too many URL parameters is not an SEO-friendly approach. Multiple parameters make URLs less enticing for users to click and may cause search engines to fail to index some of your most important pages.

RECOMMENDATION:

Try to use no more than four parameters in your URLs.

7. Outgoing internal links contain nofollow attributes:

POSSIBLE ISSUE:

The rel="nofollow" attribute is an element in an <a> tag that tells crawlers not to follow the link (e.g., <a href="http://example.com/link" rel="nofollow">Nofollow link example</a>). "Nofollow" links don't pass any link juice to referred webpages, which is why using nofollow attributes in internal links is not recommended; you should let link juice flow freely throughout your website. Moreover, unintentional use of nofollow attributes may result in your webpage being ignored by search engine crawlers even if it contains valuable content.

RECOMMENDATION:

Make sure not to use nofollow attributes by mistake. Remove them from <a> tags, if necessary.

8. Pages have too much text within the title tags:

POSSIBLE ISSUE:

Titles containing more than 75 characters can trigger this warning.

RECOMMENDATION:

Try to rewrite your page titles to be 75 characters or less.

9. Images don’t have alt attributes:

POSSIBLE ISSUE:

Alt attributes within <img> tags are used by search engines to understand the contents of your images, and they allow your images to rank in image search results.

 RECOMMENDATION:

Specify a relevant alt attribute inside the <img> tag for each image on your website, e.g., <img src="mylogo.png" alt="This is my company logo">.

10. Pages use Flash:

POSSIBLE ISSUE: 

It is not recommended that you use Flash content for several reasons.

Most importantly, Flash content negatively impacts your website’s visibility because it cannot be properly indexed and crawled by search engines.

Secondly, using Flash content negatively affects your website’s performance. Search engines may consider it as a signal that your website isn’t worth ranking.

And finally, Flash content doesn’t work well on mobile devices.

RECOMMENDATION:

Try to avoid Flash content as much as possible.

11. Page doesn’t have doctype declared:

POSSIBLE ISSUE:

A webpage’s doctype instructs web browsers which version of HTML or XHTML is being used. Declaring a doctype is extremely important in order for a page’s content to load properly. If no doctype is specified, this may lead to various problems, such as messed up page content or slow page load speed, and, as a result, negatively affect user experience.

RECOMMENDATION:

Specify a doctype for each of your pages by adding a doctype declaration (e.g., <!DOCTYPE html> for HTML5) at the very top of the webpage's source, right before the <html> tag.

12. External links are broken:

POSSIBLE ISSUE:

Broken external links lead users from one website to another and bring them to non-existent webpages. Multiple broken links negatively affect user experience and may worsen your search engine rankings because crawlers may think that your website is poorly maintained or coded.

RECOMMENDATION:

Please follow all links reported as broken. If a target webpage returns an error, remove the link leading to the error page or replace it with another resource. If the links reported as broken do work when accessed with a browser, you should contact the website’s owner and inform them about the issue.

13. Pages don’t have an h1 heading:

POSSIBLE ISSUE:

While less important than <title> tags, h1 headings still help define your page’s topic for search engines and users. If a <h1> tag is empty or missing, search engines may place your page lower than they would otherwise.

RECOMMENDATION:

Provide a relevant <h1> heading for each of your pages.

14. Pages don’t have character encoding declared:

POSSIBLE ISSUE:

Providing a character encoding tells web browsers which set of characters must be used to display a web page’s content. If a character encoding is not specified, browsers may not render the page content properly, which may result in bad user experience.

RECOMMENDATION:

Declare a character encoding either by specifying one in the charset parameter of the HTTP Content-Type header (Content-Type: text/html; charset=utf-8) or by using a meta charset attribute in your webpage's HTML (<meta charset="utf-8"/>).

15. Pages have duplicate H1 and title tags:

POSSIBLE ISSUE:

 If your page’s <title> and <h1> tags match, the latter may appear over-optimized to search engines. Also, using the same content in titles and headers means a lost opportunity to incorporate other relevant keywords for your page.

 RECOMMENDATION:

Try to create different content for your <title> and <h1> tags.

16. Pages don’t have meta descriptions:

POSSIBLE ISSUE:

A good description helps users know what your page is about and encourages them to click on it. If your page’s meta description tag is missing, search engines will usually display its first sentence, which may be irrelevant and unappealing to users.

RECOMMENDATION:

In order to gain a higher click-through rate, you should ensure that all of your webpages have meta descriptions that contain relevant keywords.

17. Pages contain frames:

POSSIBLE ISSUE:

<frame> tags are considered one of the most significant search engine optimization issues. Not only is it difficult for search engines to crawl and index content within <frame> and <iframe> tags, which may lead to your page being excluded from search results, but using these tags also negatively affects user experience.

RECOMMENDATION:

Try to avoid using <frame> tags whenever possible.

18. Pages have underscores in the URL:

POSSIBLE ISSUE:

Using underscores as word separators is not recommended because search engines may not interpret them correctly and may consider them to be part of a word. Using hyphens instead of underscores makes it easier for search engines to understand what your page is about.

RECOMMENDATION:

Replace underscores with hyphens. However, if your page ranks well, we do not recommend that you do this.

19. Pages have too many on-page links:

POSSIBLE ISSUE:

As a rule, the crawler doesn't check more than three thousand on-page links: search engines crawl the first 3,000 links on a page and ignore any links beyond that limit. Placing tons of links on a webpage can also make your page look low quality, which may cause it to drop in rankings.

RECOMMENDATION:

Try to keep the number of on-page links to under 3000.

20. Sitemap.xml not indicated in robots.txt:

POSSIBLE ISSUE:

If you have both a sitemap.xml and a robots.txt file on your website, it is a good practice to place a link to your sitemap.xml in your robots.txt, which will allow search engines to better understand what content they should crawl.

 RECOMMENDATION:

Specify the location of your sitemap.xml in your robots.txt.
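
The directive is a single line that can be placed anywhere in robots.txt (the URL is a placeholder):

Sitemap: https://www.example.com/sitemap.xml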

21. Sitemap.xml not found:

POSSIBLE ISSUE:

A sitemap.xml file lists all the URLs available for crawling, which helps your website be crawled faster and more intelligently. It also quickly informs search engines about any new or updated content on your website.

RECOMMENDATION:

Consider generating a sitemap.xml file if you don’t already have one.

22. Homepage does not use HTTPS encryption:

POSSIBLE ISSUE:

Google considers a website's security as a ranking factor. Websites that do not support HTTPS connections may be less prominent in Google's search results, while HTTPS-protected sites tend to rank higher in its search algorithms.

RECOMMENDATION:

Switch your site to HTTPS.

23. Subdomains don’t support SNI:

POSSIBLE ISSUE:

One of the common issues you may face when using HTTPS is when your web server doesn’t support Server Name Indication (SNI). Using SNI allows you to support multiple servers and host multiple certificates at the same IP address, which may improve security and trust.

RECOMMENDATION:

Make sure that your web server supports SNI. Keep in mind that SNI is not supported by some older browsers, which is why you need to ensure that your audience uses browsers supporting SNI.

24. HTTP URLs in sitemap.xml for HTTPS site:

POSSIBLE ISSUE:

Using different URL versions in your sitemap could be misleading to search engines and may result in an incomplete crawling of your website.

RECOMMENDATION:

Replace all HTTP URLs in your sitemap.xml with HTTPS URLs.

25. Uncompressed pages:

POSSIBLE ISSUE:

This issue occurs if the Content-Encoding header is not present in the response. Page compression is important for optimizing your website: uncompressed pages lead to slower page load times, which can result in lower search engine rankings.

RECOMMENDATION:

Enable compression on your webpages for faster load time.
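
To check whether a page is served compressed, you can request it with a client that advertises gzip support and inspect the response headers; a sketch using curl with a placeholder URL:

curl -s -o /dev/null -D - -H "Accept-Encoding: gzip" https://www.example.com/

A compressed response includes a header such as "Content-Encoding: gzip".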

26. Pages have temporary redirects:

POSSIBLE ISSUE:

Temporary redirects mean that a page has been temporarily moved to a new location. Search engines will continue to index the redirected page, and no link juice or traffic is passed to the new page, which is why temporary redirects can damage your search rankings if used by mistake.

RECOMMENDATION:

Review all pages to make sure the use of 302 and 307 redirects is justified. If so, don’t forget to remove them when they are no longer needed. However, if you permanently move any page, replace a 302/307 redirect with a 301/308 one.

27. Pages don’t have enough text within the title tags:

POSSIBLE ISSUE:

Using short titles on webpages is a recommended practice. However, keep in mind that titles containing 10 characters or less do not provide enough information about what your webpage is about.

RECOMMENDATION:

Add more descriptive text inside your page’s <title> tag.

28. Issues with uncached JavaScript and CSS files:

POSSIBLE ISSUE:

This issue occurs if browser caching is not specified in the response header.

Enabling browser caching for JavaScript and CSS files allows browsers to store and reuse these resources without having to download them again when requesting your page. That way the browser will download less data, which will decrease your page load time.

RECOMMENDATION:

If JavaScript and CSS files are hosted on your website, enable browser caching for them.

If JavaScript and CSS files are hosted on a website that you don’t own, contact the website owner and ask them to enable browser caching for them.

If this issue doesn’t affect your page load time, simply ignore it.
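
One common way to enable browser caching is to send a Cache-Control header for static assets. A sketch for Apache via .htaccess, assuming mod_headers is available (the max-age value is illustrative):

<IfModule mod_headers.c>
  <FilesMatch "\.(js|css)$">
    Header set Cache-Control "max-age=31536000, public"
  </FilesMatch>
</IfModule>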

29. Pages have a JavaScript and CSS total size that is too large:

POSSIBLE ISSUE:

This issue occurs if the total transfer size of the JavaScript and CSS files used on your page exceeds 2 MB.

The size of the JavaScript and CSS files used on a webpage is one of the important factors in a page's load time. Having lots of JavaScript and CSS makes your webpage "heavier," which increases its load time and can lower its search engine rankings.

RECOMMENDATION:

Review your pages to make sure that they only contain the necessary JavaScript and CSS files. If all resources are important for your page, consider reducing their transfer size.

30. Pages use too many JavaScript and CSS files:

POSSIBLE ISSUE:

This issue occurs if a webpage uses more than a hundred JavaScript and CSS files.

For each file used by your webpage, a browser will send a separate HTTP request. Each request increases your page load time and affects its rendering, which in turn impacts bounce rate and search engine rankings.

RECOMMENDATION:

Review your pages to make sure that they only contain the necessary JavaScript and CSS files.

If all resources are important for your page, we recommend that you combine them.

31. Pages have a low word count:

POSSIBLE ISSUE:

This issue occurs if the number of words on your webpage is less than 200.

Search engines prefer to provide as much information to users as possible, so pages with longer content tend to be placed higher in search results, as opposed to those with lower word counts.

RECOMMENDATION:

Improve your on-page content and be sure to include more than 200 meaningful words.

Notices:

1. Pages have only one incoming internal link:

POSSIBLE ISSUE:

Having very few incoming internal links means a page gets few visits or even none, and pages with few internal links tend to rank lower in search results.

RECOMMENDATION:

Add more incoming internal links to pages with important content.

2. Pages need more than 3 clicks to be reached:

POSSIBLE ISSUE:

A page’s crawl depth is the number of clicks required for users and search engine crawlers to reach it via its corresponding homepage. From an SEO perspective, an excessive crawl depth may pose a great threat to your optimization efforts, as both crawlers and users are less likely to reach deep pages.

For this reason, pages that contain important content should be no more than 3 clicks away from your homepage.

RECOMMENDATION:

Make sure that pages with important content can be reached within a few clicks.

If any of them are buried too deep in your site, consider changing your internal link architecture.

3. Robots.txt not found:

POSSIBLE ISSUE:

A robots.txt file helps search engines determine what content on your website they should crawl. Utilizing a robots.txt file can cut the time search engine robots spend crawling and indexing your website.

 RECOMMENDATION:

If you don’t want specific content on your website to be crawled, creating a robots.txt file is recommended.

4. Orphaned pages in sitemaps:

POSSIBLE ISSUE:

An orphaned page is a webpage that is not linked to internally.

Including orphaned pages in your sitemap.xml files is considered to be a bad practice. Crawling outdated orphaned pages will waste your crawl budget.

RECOMMENDATION:

If an orphaned page in your sitemap.xml file has valuable content, we recommend that you link to it internally.

Review all orphaned pages in your sitemap.xml files and do one of the following:

 – If a page is no longer needed, remove it

 – If a page has valuable content and brings traffic to your website, link to it from another page on your website

 – If a page serves a specific need and requires no internal linking, leave it as is

5. URLs on pages are too long:

POSSIBLE ISSUE:

According to Google, URLs longer than 100 characters are not SEO friendly. Also, some browsers may have difficulties parsing extremely long URLs.

RECOMMENDATION:

Rewrite your URLs to be fewer than 100 characters.

6. Pages are blocked from crawling:

POSSIBLE ISSUE:

If a page cannot be accessed by search engines, it will never appear in search results. A page can be blocked from crawling by a robots.txt file or excluded from indexing by a noindex meta tag.

RECOMMENDATION:

Make sure that pages with valuable content are not blocked from crawling by mistake.

7. Pages blocked by X-Robots-Tag: noindex HTTP header:

POSSIBLE ISSUE:

The x-robots-tag is an HTTP header that can be used to instruct search engines whether or not they can index or crawl a webpage. If a page is blocked from crawling with x-robots-tag, it will never appear in search results.

RECOMMENDATION:

Make sure that pages with valuable content are not blocked from crawling by mistake.
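
For reference, the header appears in the HTTP response like this:

HTTP/1.1 200 OK
X-Robots-Tag: noindex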

8. Issues with blocked external resources in robots.txt:

POSSIBLE ISSUE:

Blocked external resources are resources (e.g., CSS, JavaScript, image files, etc.) that are hosted on an external website and blocked from crawling by a "Disallow" directive in an external robots.txt file. Disallowing these files may prevent search engines from accessing them and, as a result, properly rendering and indexing your webpages. This may lead to lower rankings.

RECOMMENDATION:

If blocked resources that are hosted on an external website have a strong impact on your website, contact the website owner and ask them to edit their robots.txt file.

If blocked resources are not necessary for your site, simply ignore them.

9. Outgoing external links contain nofollow attributes:

POSSIBLE ISSUE:

A nofollow attribute is an element in an <a> tag that tells crawlers not to follow the link. "Nofollow" links don't pass any link juice or anchor text to referred webpages.

Unnecessary use of the nofollow attribute might hamper your site's ranking.

RECOMMENDATION:

Make sure you haven’t used nofollow attributes by mistake. Remove them from <a> tags, if needed.

10. Pages have more than one H1 tag:

POSSIBLE ISSUE:

Although multiple <h1> tags are allowed in HTML5, we still do not recommend that you use more than one <h1> tag per page. Including multiple <h1> tags may confuse users.

RECOMMENDATION:

Use <h2>–<h6> tags for additional headings instead of multiple <h1> tags.

11. URLs with a permanent redirect:

POSSIBLE ISSUE:

Although using permanent redirects (a 301 or 308 redirect) is appropriate in many situations (for example, when you move a website to a new domain, redirect users from a deleted page to a new one, or handle duplicate content issues), we recommend that you keep them to a reasonable minimum. Every time you redirect one of your website’s pages, it decreases your crawl budget, which may run out before search engines can crawl the page you want to be indexed. Moreover, too many permanent redirects can be confusing to users.

A 301 redirect is a permanent redirect which passes between 90-99% of link equity (ranking power) to the redirected page. 301 refers to the HTTP status code for this type of redirect. In most instances, the 301 redirect is the best method for implementing redirects on a website.

RECOMMENDATION:

Review all URLs with a permanent redirect. Where possible, update links so they point directly to the target page URL.
