
What SEO errors does Netpeak Spider detect?

by Xozepew Usaz
H1 duplicates
Shows indexed pages with duplicate <h1> headings. In this report, all URLs are grouped by the "H1 content" parameter.

What threatens
The H1 heading is an important element of search engine optimization: it helps users understand what a page is about as soon as they land on it.

H1 headings on different pages count as duplicates when their text is identical. Most often this happens when pages have not yet been optimized and H1 text is generated automatically from low-quality templates.

A duplicated H1 heading often fails to convey the essence of its page or match its content. In that case, users and search engines may consider the site low quality, and as a result it may lose search traffic.

How to fix
Create a unique (within the site) H1 heading for each page that concisely describes its content and contains target keywords. Keep it short and informative: the optimal length is 3 to 7 words.

It is also recommended to have only one H1 heading per page and not to duplicate the contents of the <title> tag. To achieve this, simply remove the extra <h1> tags.
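For illustration, here is a minimal Python sketch of how such a check could be run outside of Netpeak Spider: it fetches a list of pages, groups them by H1 text, and flags pages that share a heading or do not have exactly one <h1>. It assumes the third-party requests and beautifulsoup4 packages; the URLs are placeholders.

```python
# Minimal sketch: group pages by H1 text to spot duplicates.
# Assumes the requests and beautifulsoup4 packages; URLs are placeholders.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

urls = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/contact",
]

h1_groups = defaultdict(list)  # H1 text -> list of URLs using it

for url in urls:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    h1_tags = soup.find_all("h1")
    if len(h1_tags) != 1:
        print(f"{url}: expected exactly one <h1>, found {len(h1_tags)}")
    for tag in h1_tags:
        h1_groups[tag.get_text(strip=True)].append(url)

# Any H1 text shared by two or more pages is a duplicate.
for text, pages in h1_groups.items():
    if len(pages) > 1:
        print(f'Duplicate H1 "{text}" on: {", ".join(pages)}')
```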

Broken pages
Shows URLs that are unavailable (for example, due to a dropped connection or a response timeout) or that return a server response code of 4xx or higher. To view a special report specifically for broken links, click the "Error report" button above the main table.

What threatens
Broken pages are URLs that are inaccessible to users and search engines (for example, they have been deleted, the server cannot process the request, etc.).

Users who land on such an address see an error page instead of useful content; they may conclude that the site is low quality and leave it.

When a site has many links to broken pages, search engines may also consider it low quality and lower its position in search results. In addition, search robots waste crawl resources on broken pages, leaving fewer resources for the pages that matter for promotion, which may never make it into the search index. As a result, the site may lose search traffic.

How to fix
Remove links to broken pages or replace them with links to working addresses. To see the links pointing to broken pages, click the "Error report" button above the main table.

If many URLs with a 429 or 5xx response code, or with a timeout, appear during the crawl, the pages may have become unavailable because of heavy load on the site. In this case, stop the scan, reduce the number of threads in the settings, set an interval between requests, or use a proxy list, and then continue scanning. When the scan is complete, rescan the unavailable URLs: select them in the table and press Ctrl+R.
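As an illustration of the "interval between requests" idea, here is a small Python sketch that re-checks previously failed addresses one at a time with a pause between requests. It assumes the requests package; the URLs and the delay value are examples.

```python
# Sketch of a throttled re-check of previously failed URLs: one request
# at a time with a pause in between, mirroring the "interval between
# requests" setting. Assumes the requests package; values are examples.
import time

import requests

failed_urls = [
    "https://example.com/page-1",
    "https://example.com/page-2",
]
DELAY_SECONDS = 2  # pause between requests to keep server load low

for url in failed_urls:
    try:
        response = requests.get(url, timeout=10)
        print(f"{url} -> {response.status_code}")
    except requests.RequestException as exc:
        print(f"{url} -> still unreachable ({exc})")
    time.sleep(DELAY_SECONDS)
```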

4xx errors: Client Error
Shows URLs that return a 4xx server response code.

What threatens
URLs with a 4xx response code are included in the "Broken Pages" report and are also broken out into a separate "4xx Errors: Client Error" report because they occur so often. A 4xx response code means something was wrong with the request itself (for example, the page does not exist on the site, has been deleted, or the user lacks permission to view it).

Users who land on such an address see an error page instead of useful content; they may conclude that the site is low quality and leave it.

When a site has many links to broken pages, search engines may also consider it low quality and lower its position in search results. In addition, search robots waste crawl resources on broken pages, leaving fewer resources for the pages that matter for promotion, which may never make it into the search index. As a result, the site may lose search traffic.

How to fix
Remove links to URLs with 4xx errors or replace them with links to accessible pages. To see the incoming links to such URLs, press Shift+F1.

If many URLs with a 429 response code appear during the crawl, the pages may have become unavailable because of heavy load on the site. In this case, stop the scan, reduce the number of threads in the settings, set an interval between requests, or use a proxy list, and then continue scanning. When the scan is complete, rescan the unavailable URLs: select them in the table and press Ctrl+R.
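A 429 response specifically means the server is receiving too many requests, and it may include a Retry-After header suggesting how long to wait. The Python sketch below shows one way to honor that header when re-checking such pages; it assumes the requests package, and the URL and defaults are placeholders.

```python
# Sketch: when a URL returns 429 (Too Many Requests), wait for the
# period the server suggests in its Retry-After header (if present)
# before retrying. Assumes the requests package; URL is a placeholder.
import time

import requests

def fetch_with_retry(url, max_attempts=3, default_wait=5):
    response = None
    for attempt in range(max_attempts):
        response = requests.get(url, timeout=10)
        if response.status_code != 429:
            return response
        # Retry-After may be a number of seconds; fall back to a
        # default if it is missing or given as an HTTP date instead.
        retry_after = response.headers.get("Retry-After", "")
        wait = int(retry_after) if retry_after.isdigit() else default_wait
        time.sleep(wait)
    return response

response = fetch_with_retry("https://example.com/busy-page")
print(response.status_code)
```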

5xx errors: Server Error
Shows URLs that return a 5xx server response code.

What threatens
URLs with a 5xx response code are included in the "Broken Pages" report and are also broken out into a separate "5xx Errors: Server Error" report because they occur so often. A 5xx response code means the server could not process the request.

Users who land on such an address see an error page instead of useful content; they may conclude that the site is low quality and leave it.

When a site has many links to broken pages, search engines may also consider it low quality and lower its position in search results. Moreover, if a search robot encounters URLs with 5xx response codes while visiting the site, it can drastically reduce its crawl rate, and pages important for promotion may never make it into the search index. As a result, the site may lose search traffic.

How to fix
Determine why the URLs are unavailable: for example, the server may be configured to return the wrong response code for them. In that case, change the configuration so that working pages return a 200 OK response code.

If many URLs with 5xx response codes appear during the crawl, the pages may have become unavailable because of heavy load on the site. In this case, reduce the load: stop the scan, reduce the number of threads in the settings and/or set an interval between requests, and then continue scanning. When the scan is complete, rescan the unavailable URLs: select them in the table and press Ctrl+R.
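One way to rescan URLs that returned 5xx without adding load is to retry with exponential backoff, giving a struggling server time to recover between attempts. Below is a minimal Python sketch of this idea; the URLs and timings are examples, and the requests package is assumed.

```python
# Sketch: re-check URLs that returned 5xx, backing off exponentially
# between attempts so a struggling server gets time to recover.
# Assumes the requests package; URLs and timings are examples.
import time

import requests

def recheck(url, attempts=4, base_delay=1.0):
    for attempt in range(attempts):
        try:
            response = requests.get(url, timeout=10)
            if response.status_code < 500:
                return response.status_code  # recovered (or a 4xx issue)
        except requests.RequestException:
            pass  # treat network failures like a server error and retry
        time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, 8s
    return None  # still failing after all attempts

for url in ["https://example.com/page-a", "https://example.com/page-b"]:
    status = recheck(url)
    print(f"{url}: {'still down' if status is None else status}")
```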

Links with bad URL format
Shows pages containing links with an invalid URL format. To view a special report on this error, click the "Error Report" button above the main table.

What threatens
A malformed URL cannot be opened by users or crawlers because the address itself is invalid.

Users who click such a link see an error page instead of useful content; they may conclude that the site is low quality and leave it.

When a site has many links to broken pages, search engines may also consider it low quality and lower its position in search results. As a result, the site may lose search traffic.

How to fix
Most often, this error is caused by typos (a misspelled protocol, a backslash "\" used instead of the "/" character, etc.) or by extra characters in the link address.

To find the links that use an invalid URL format, click the "Error Report" button above the main table. Fix these links so that they lead to accessible addresses, or remove them from the page code.
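To illustrate what "bad URL format" means in practice, here is a small Python sketch using only the standard library. Its checks (a known scheme, a non-empty host, no backslashes or spaces) are illustrative examples of common typos, not a complete validator.

```python
# Sketch: flag links whose URL format looks invalid. The checks below
# cover common typos only and are not a complete validator.
from urllib.parse import urlparse

def is_well_formed(url):
    if "\\" in url or " " in url:
        return False
    parsed = urlparse(url)
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)

links = [
    "https://example.com/page",      # OK
    "htps://example.com/page",       # misspelled protocol
    "https:\\example.com\\page",     # backslashes instead of slashes
    "https://example.com/bad page",  # stray space
]

for link in links:
    print(f"{link}: {'OK' if is_well_formed(link) else 'bad URL format'}")
```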
