SEO Onpage Errors

Checking Onpage SEO Errors and Warnings: Definition of Terms

One of the first things you can do to quickly optimize a website is to check it for onpage SEO errors. An audit gives an SEO specialist or a web developer a list of things that can be fixed to make the site friendlier to search engines.

There are plenty of online SEO tools that can give you a list of errors you can start fixing right away, provided that you have access to the website’s content management system (CMS). Some errors are easy to fix, while others can take a while to track down.

4XX Status Code

The 4XX Status Code error appears when a visitor cannot access a certain page on your website. This may be because the page does not exist, or because the URL is simply wrong. Another possible reason is that the page is restricted from public view. If this error appears in the SEO audit tools you use, you can pinpoint which pages visitors cannot access.

It can hurt your optimization because crawlers cannot access those pages either. This can lead to a drop in the overall traffic coming to your site and a decrease in the number of indexed pages.

How to Fix It: Check the page permissions and verify the URL. Changing the permalink structure, or the permalink of an individual page, can cause 4XX errors if proper redirects are not in place.
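
If you want to spot-check pages yourself rather than wait for an audit tool, a short script can do it. The sketch below assumes Python with the requests package installed; the URLs are placeholders for pages on your own site.

    import requests

    # Placeholder URLs; replace with pages from your own site.
    urls = [
        "https://example.com/",
        "https://example.com/old-page/",
    ]

    for url in urls:
        # Follow redirects so we see the final status code a visitor would get.
        response = requests.get(url, allow_redirects=True, timeout=10)
        if 400 <= response.status_code < 500:
            print(f"{url} -> {response.status_code} (client error)")
        else:
            print(f"{url} -> {response.status_code}")

Any URL that prints a 404, 403, or other 4XX code is one that visitors and crawlers cannot reach, and it needs either a fix or a redirect.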

Broken Internal Links

As the name suggests, some of your internal links may end up returning errors. These can be caused by mistyped URLs or by pages that no longer exist. 5XX errors may also indicate that something is wrong with the page itself that prevents it from loading altogether. Broken internal links can hurt your site optimization because they stop crawlers from going through the rest of your site; they act as dead ends.

How to Fix It: Check the page you are linking to. If it still exists, find out why it is not loading properly. Verify that the URL is correct and set up a redirect if needed.
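
To find broken internal links on a page without a paid audit tool, you can crawl the page’s links yourself. This sketch assumes Python with the requests and beautifulsoup4 packages installed; the starting URL is only an example.

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin, urlparse

    page = "https://example.com/blog/"  # placeholder; use a page from your own site
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    domain = urlparse(page).netloc

    for a in soup.find_all("a", href=True):
        link = urljoin(page, a["href"])
        if urlparse(link).netloc != domain:
            continue  # only check internal links
        # HEAD is lighter than GET, though a few servers answer it differently.
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
        if status >= 400:
            print(f"Broken internal link on {page}: {link} ({status})")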

Broken JavaScript and CSS Files

This error means your site has malfunctioning JavaScript and CSS files, which often come from theme customizations and plugins. Pages may not render properly, spoiling the experience for your users. Crawlers may also fail to render the page correctly, which can hurt your rankings and indexation status.

How to Fix It: Read the documentation for your plugins and customized theme, and update them if necessary to fix any broken files. In the worst case, you may have to disable the offending plugin or change the theme to stop these errors from appearing.

Duplicate Meta Descriptions

As the name of the error suggests, some of your webpages may have duplicate meta descriptions, meaning two or more pages carry exactly the same description. This wastes the chance to write a unique, relevant description for each page, which can hurt your clickthrough rates and your chances of ranking for additional keywords.

How to Fix It: Rewrite the meta descriptions on pages that have duplicates. Make each one descriptive and persuasive so it entices users to click your result, and work in relevant keywords without stuffing them.
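
One way to find duplicates without an audit tool is to collect the meta descriptions from a list of pages and group the identical ones. A rough sketch, assuming Python with requests and beautifulsoup4 installed and placeholder URLs:

    import requests
    from bs4 import BeautifulSoup
    from collections import defaultdict

    pages = [  # placeholder URLs; replace with your own
        "https://example.com/",
        "https://example.com/about/",
        "https://example.com/services/",
    ]

    seen = defaultdict(list)
    for url in pages:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        tag = soup.find("meta", attrs={"name": "description"})
        description = tag.get("content", "").strip() if tag else ""
        seen[description].append(url)

    for description, shared in seen.items():
        if len(shared) > 1:
            print(f"Shared description '{description[:40]}...' on: {shared}")

Swapping the meta lookup for soup.title turns this into a duplicate title tag check, which covers the next error on this list as well.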

Duplicate Title Tags

Pages with duplicate title tags can confuse search engines about which one to rank, and they deprive you of the chance to target other keywords. Title tags are one of the many ranking factors, since search engines check them to understand what a page is about.

How to Fix It: Change duplicate title tags so they reflect the actual content of each page. If the contents themselves are too similar, consider editing the articles instead and use related (latent semantic indexing, or LSI) keywords to broaden the coverage.

HTTP/HTTPS Mixed Links

Migrating to HTTPS is no easy task. Even if the site seems to have migrated successfully, some pages may not have migrated properly, leaving internal links that still point to the HTTP version. This can create security issues and disrupt your visitors’ experience, since insecure pages and elements trigger browser warnings.

How to Fix It: Make sure that every page and element (including photos and other media) is now served from the HTTPS version of your site. If you are not sure, ask your web developer to double-check everything.
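
A quick way to hunt down insecure elements on a page is to scan its HTML for resources still loaded over http://. A minimal sketch, assuming Python with requests and beautifulsoup4 installed and a placeholder URL:

    import requests
    from bs4 import BeautifulSoup

    page = "https://example.com/"  # placeholder; use a page from your own site
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

    # Tags that commonly load sub-resources such as images, scripts, and styles.
    for tag in soup.find_all(["img", "script", "link", "iframe", "source"]):
        ref = tag.get("src") or tag.get("href") or ""
        if ref.startswith("http://"):
            print(f"Insecure resource on {page}: <{tag.name}> loads {ref}")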

Slow Load Speed

A slow-loading page is not only bad for crawlers; it can also cost you a huge chunk of your visitors. People are generally not patient, and nobody likes waiting more than a few seconds for a page to load. If your webpages take more than two seconds to load, you may have a problem with your server or with heavy page elements.

How to Fix It: Check for media files that may be dragging the page load speed down; oversized files are the most common culprit. Other factors, such as the hosting provider or scripts that take too long to execute, can also play a part. Check out Google Chrome’s Lighthouse audit for a more detailed performance evaluation.
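
For a very rough first check before running Lighthouse, you can time the server’s response and see how heavy the raw HTML is. This sketch assumes Python with requests installed and a placeholder URL; note that it measures server response time only, not the full render time a browser reports.

    import requests

    url = "https://example.com/"  # placeholder; use a page from your own site
    response = requests.get(url, timeout=30)

    # elapsed covers the request/response round trip, not rendering or sub-resources.
    print(f"{url} responded in {response.elapsed.total_seconds():.2f}s, "
          f"HTML size {len(response.content) / 1024:.0f} KB")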

Duplicate Content Issues

This problem is most often found on e-commerce sites, but that does not mean your website cannot have it too. Duplicate content errors are pretty self-explanatory: you should not have several pages with essentially the same content, because it confuses both users and search engines.

How to Fix It: If you genuinely need multiple pages with the same content, for different items or products for example, point them to a single canonical URL. An even better solution is to add unique copy to each page so they stand apart from one another.
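
The canonical URL mentioned above is declared with a <link rel="canonical" href="..."> tag in the page’s head. To verify that your near-duplicate pages actually declare one, you can check them with a short script; this sketch assumes Python with requests and beautifulsoup4 installed, and the product URLs are placeholders.

    import requests
    from bs4 import BeautifulSoup

    variants = [  # placeholder URLs for near-duplicate pages; replace with your own
        "https://example.com/shirt?color=red",
        "https://example.com/shirt?color=blue",
    ]

    for url in variants:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        canonical = soup.find("link", attrs={"rel": "canonical"})
        print(url, "->", canonical["href"] if canonical else "no canonical tag found")

Both variants should point to the same canonical URL so search engines know which version to index.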

Unminified JavaScript and CSS Files

Minification can dramatically reduce the size of JavaScript and CSS files, which in turn leads to faster-loading pages. Be careful with it, though, because a poorly configured minifier can leave scripts broken.

How to Fix It: If you are running WordPress, you can use plugins to minify the files automatically. If you want to be sure, hand the task over to your web developer instead.
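
To make the idea concrete, minification simply strips comments and unnecessary whitespace so the same rules arrive in fewer bytes. The toy Python sketch below illustrates the effect on a CSS snippet; it is not a production minifier, and real tools (or a WordPress caching and minification plugin) handle many edge cases this does not.

    import re

    def naive_css_minify(css: str) -> str:
        css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # drop /* comments */
        css = re.sub(r"\s+", " ", css)                    # collapse whitespace
        css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)      # drop spaces around punctuation
        return css.strip()

    style = """
    body {
        color: #333;  /* default text colour */
        margin: 0;
    }
    """
    print(naive_css_minify(style))  # body{color:#333;margin:0;}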

Missing Alt Attributes

Photos can be a source of organic traffic too, and not using alt text on your photos is a lost opportunity. This is especially true if you publish original photos on your website.

How to Fix It: Use your content management system to fill in the missing alt attributes for all of your photos. If the same photo appears in more than one place, still write different, context-appropriate alt text rather than reusing the same wording every time.
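
A short script can list every image on a page that lacks alt text. The sketch assumes Python with requests and beautifulsoup4 installed; the URL is a placeholder.

    import requests
    from bs4 import BeautifulSoup

    page = "https://example.com/gallery/"  # placeholder; use a page from your own site
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

    for img in soup.find_all("img"):
        if not img.get("alt", "").strip():
            print(f"Missing alt text: {img.get('src')}")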

Low Text-to-HTML Ratio

One of the most common problems for websites is a low text-to-HTML ratio, which means the page contains far more code than actual content.

How to Fix It: Adding more content evens out the ratio and is the preferred fix, but it is not always the right one. If most of your pages have a low text-to-HTML ratio, the problem is likely your theme rather than the content length. Have your web developer look at the code and see what needs to be cleaned up; doing so will also improve your page load speed.
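
If you want to measure the ratio yourself, compare the length of the visible text with the length of the full HTML source. A rough sketch, assuming Python with requests and beautifulsoup4 installed and a placeholder URL (audit tools may compute the ratio slightly differently):

    import requests
    from bs4 import BeautifulSoup

    page = "https://example.com/"  # placeholder; use a page from your own site
    html = requests.get(page, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)

    ratio = len(text) / len(html) if html else 0
    print(f"Text-to-HTML ratio: {ratio:.1%} "
          f"({len(text)} characters of text in {len(html)} characters of HTML)")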

Low Word Count

If your page contains only a small number of words, Google may not consider it useful content. The difference between this error and a low text-to-HTML ratio is that the page can be thin on content even when it does not carry much code at all.

How to Fix It: Add more useful, factual content to the page, and use the opportunity to work in related LSI keywords.

Too Much Text in Title Tags

Google shows only around 60 characters of a page title on the Search Engine Results Page (the cutoff is actually based on pixel width, roughly 600 pixels). Anything longer gets truncated, which can negatively affect your clickthrough rates.

How to Fix It: Make the most of your title tag while keeping it under roughly 60 characters. Delete unnecessary words so the important keywords fit, but make sure the title still reads naturally.
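
You can flag overly long titles across a handful of pages with a quick check. The sketch assumes Python with requests and beautifulsoup4 installed; the URLs are placeholders, and the 60-character limit is only an approximation of Google’s pixel-based cutoff.

    import requests
    from bs4 import BeautifulSoup

    LIMIT = 60  # rough character equivalent of the ~600 px display limit
    pages = [  # placeholder URLs; replace with your own
        "https://example.com/",
        "https://example.com/blog/a-very-long-post-title/",
    ]

    for url in pages:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        title = soup.title.string.strip() if soup.title and soup.title.string else ""
        if len(title) > LIMIT:
            print(f"{url}: {len(title)}-character title will likely be truncated: {title}")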

Broken External Links

Broken external links can ruin the user experience for your visitors. Search engines may also take them as a sign that the page cites outdated or unreliable resources.

How to Fix It: Find the external links that are broken and replace them with working ones. If a link points to time-sensitive content that is likely to disappear, link to the site’s homepage instead.

No Sitemap

Sitemaps let crawlers easily find all of the pages on your website. You can also submit a sitemap to Google directly through Search Console to request that the listed pages be crawled. The sitemap serves as the crawlers’ reference for your site; without one, they may struggle to discover all of your pages.

How to Fix It: Set up a proper sitemap containing all of the pages you want included. In WordPress, the Yoast SEO plugin can do this for you: it updates the sitemap automatically every time you publish a new post and lets you choose which pages to exclude.
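
Outside of WordPress, a basic XML sitemap is simple enough to generate yourself. A minimal sketch in Python, assuming you already have the list of URLs you want indexed (the addresses below are placeholders):

    urls = [  # placeholder URLs; replace with your own pages
        "https://example.com/",
        "https://example.com/about/",
        "https://example.com/blog/first-post/",
    ]

    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    sitemap = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

    # Upload the resulting file to your site root (e.g. https://example.com/sitemap.xml),
    # then submit that address in Google Search Console.
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(sitemap)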

Multiple H1 Tags

H1 tags are the largest headings available and sit at the top of the content hierarchy. Having multiple H1 headings on a page can muddy that hierarchy and confuse users, and some say a clean heading structure is also a minor ranking factor for SEO.

How to Fix It: Use only one H1 tag on each page. Try to use your best keywords in it, but make sure the heading still makes sense.

Page Can’t Be Crawled

If a page cannot be crawled, it will not be indexed, and it will not appear in search results.

How to Fix It: Contact your host or web administrator to find out what is causing the error. It could be that the page is deliberately excluded from indexing (e.g. customer account or checkout pages). If not, there could be a problem with the server itself.

No Robots.txt

Search engine crawlers use the robots.txt file as their guide when crawling a website. It lists the rules for where they may and may not crawl. Without it, crawlers simply fall back on their default behaviour and crawl whatever they can reach.

How to Fix It: Most dynamic content management systems already generate a robots.txt file for you. If yours does not, create one manually in a text editor and place it in the root directory of your site.
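
To confirm that your robots.txt exists and says what you think it says, Python’s standard library can fetch and interpret it for you. A small sketch, with a placeholder domain:

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")  # placeholder; use your own domain
    rp.read()  # fetches the file; a missing robots.txt is treated as "allow everything"

    for path in ["https://example.com/", "https://example.com/wp-admin/"]:
        print(path, "crawlable by any bot:", rp.can_fetch("*", path))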

Pages with Too Much Crawl Depth

Pages that are hard for crawlers to find are often difficult for your users to reach as well. If a page takes three or more clicks to reach, it is considered to have a deep crawl depth. This can hurt its rankings, especially if crawlers never reach it at all.

How to Fix It: Make sure all pages are reachable and easy to navigate to, and keep your site structure as flat as you reasonably can, since Google’s crawlers have only a limited crawl budget for your site.
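
Crawl depth is just the number of clicks from the homepage, so a small breadth-first crawl can reveal which pages sit too deep. The sketch below assumes Python with requests and beautifulsoup4 installed, uses a placeholder start URL, and caps the crawl so it does not hammer the server.

    import requests
    from bs4 import BeautifulSoup
    from collections import deque
    from urllib.parse import urljoin, urlparse

    start = "https://example.com/"  # placeholder; use your own homepage
    max_pages = 50                  # keep the crawl small and polite

    domain = urlparse(start).netloc
    depths = {start: 0}
    queue = deque([start])

    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        except requests.RequestException:
            continue
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in depths:
                depths[link] = depths[url] + 1  # one click deeper than the current page
                queue.append(link)

    for link, depth in sorted(depths.items(), key=lambda item: item[1]):
        if depth >= 3:
            print(f"Depth {depth}: {link}")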

Let us Help

If all of this is overwhelming to you, we can provide you with a free consultation along with the audit. Send us an email with your website URL to get started.
