Technical SEO Fixes:

 

Most businesses have a list of things that they need to implement, and low-effort/high-reward tasks should be at the top of that list. That includes tackling technical search engine optimization (SEO) issues on your site.

In this issue, we are going to focus on straightforward, easy-to-fix problems. Many of the issues covered can be fixed quickly and resolve several months’ worth of organic traffic woes.

While there is no magic button that will resolve SEO once and for all, there are things you can easily check right now.

Technical SEO covers the more technical aspects of a site: problems the average marketer would not identify and that take a bit more experience to uncover. More often than not, these technical SEO problems are site-wide, so fixing them improves your whole site rather than one isolated page. There are many steps to a fully optimized site, but starting with this list will improve your search engine results page (SERP) performance.

1. Check indexation


Have you ever sat there looking at your analytics and wondered why you are not ranking for your brand name? Don’t let this leave you stumped. You cannot get organic traffic to your site if it doesn’t show up in Google search.

Sergey Stefoglo at Distilled wrote an article that broke down the complex process of a technical eCommerce SEO audit into two buckets: indexing and ranking.

Do not go crazy with a several-hundred-item checklist that has no prioritization. Throw that list out the window (figuratively, don’t litter) and start by checking whether your site is indexed.

An index is a technical name for the database used by a search engine. Indexes contain the information on all the websites that Google was able to find. If a website is not in a search engine’s index, users will not be able to find it.

With a quick site search directly in Google, you can find out if your site is indexed.

Type site:{yoursitename.com} into Google search and you’ll immediately see how many of your site’s pages are in Google’s index.
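
The manual site: search above is the quickest check. If you want to script it, scraping Google’s result pages directly is brittle and against its terms, so the sketch below leans on the Custom Search JSON API instead; the API key, search engine ID, and domain are placeholders you would supply, and the count it returns is only an estimate.

    # Rough indexation check via Google's Custom Search JSON API.
    # API_KEY, SEARCH_ENGINE_ID, and DOMAIN are placeholders to replace.
    import requests

    API_KEY = "YOUR_API_KEY"          # from Google Cloud Console
    SEARCH_ENGINE_ID = "YOUR_CX_ID"   # a Programmable Search Engine configured to search the web
    DOMAIN = "example.com"

    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": SEARCH_ENGINE_ID, "q": f"site:{DOMAIN}"},
        timeout=10,
    )
    data = resp.json()
    total = data.get("searchInformation", {}).get("totalResults", "0")
    print(f"Approximate indexed pages for {DOMAIN}: {total}")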

2. Robots.txt


An easy fix, but possibly the biggest offender ruining your organic traffic, is a blocking robots.txt file that was not removed after the site was redeveloped.

You can check whether your site is blocked by going to yoursitename.com/robots.txt and making sure it doesn’t show “User-agent: * Disallow: /”.

If you have the above issue, you need to talk to your developer immediately and have them resolve it promptly.
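
If you would rather script the check, here is a minimal sketch using Python’s standard urllib.robotparser; example.com is a placeholder for your own domain.

    # Minimal robots.txt check: is the whole site disallowed for crawlers?
    # "example.com" is a placeholder; swap in your own domain.
    from urllib.robotparser import RobotFileParser

    domain = "https://example.com"
    parser = RobotFileParser()
    parser.set_url(f"{domain}/robots.txt")
    parser.read()

    # If a generic crawler or Googlebot can't fetch the homepage, you almost
    # certainly have a blanket "Disallow: /" left over from development.
    for agent in ("*", "Googlebot"):
        allowed = parser.can_fetch(agent, domain + "/")
        print(f"{agent}: {'allowed' if allowed else 'BLOCKED'} from crawling {domain}/")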

3. NOINDEX


Even more damaging than robots.txt at times is NOINDEX. Using robots.txt will not pull pages out of Google’s index if they have already been indexed, but a NOINDEX directive will remove every page that carries it from the index.

NOINDEX is most commonly used while a website is in its development phase, and the damage happens when it is left in place after launch. It’s best to use a tool like Screaming Frog to scan all the pages on your site at once.
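
Screaming Frog does this at scale, but a minimal sketch like the one below shows the idea: fetch a few pages (the URLs are placeholders) and look for a noindex directive in the robots meta tag or the X-Robots-Tag header. The regex is a rough heuristic, not a full HTML parse.

    # Spot-check pages for a NOINDEX directive in the meta robots tag
    # or the X-Robots-Tag HTTP header. The URLs below are placeholders.
    import re
    import requests

    urls = ["https://example.com/", "https://example.com/about/"]

    for url in urls:
        resp = requests.get(url, timeout=10)
        header = resp.headers.get("X-Robots-Tag", "")
        # Rough regex check for <meta name="robots" content="...noindex...">
        meta_noindex = re.search(
            r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
            resp.text,
            re.IGNORECASE,
        )
        flagged = "noindex" in header.lower() or bool(meta_noindex)
        print(f"{url}: {'NOINDEX found' if flagged else 'indexable'}")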

4. URL Canonicalization

The average user doesn’t really care, and more likely doesn’t even know the difference, if your home page shows up as all of these separately:

  • example.com
  • www.example.com
  • example.com/home.html
  • www.example.com/home.html

But search engines are aware of the different configurations, and they may index a mixed assortment of your URL versions. This causes confusion and dilutes link equity. Pick one preferred version and 301-redirect the rest to it.
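
A quick way to see how your site currently behaves is a sketch like this one, which requests each homepage variant (the domain is a placeholder) and reports where it ends up. Ideally every variant except the canonical one redirects to the same final URL.

    # Check which homepage variants resolve, and where they end up.
    # Ideally every variant 301-redirects to a single canonical version.
    # "example.com" is a placeholder domain.
    import requests

    variants = [
        "http://example.com",
        "http://www.example.com",
        "http://example.com/home.html",
        "http://www.example.com/home.html",
    ]

    for url in variants:
        try:
            resp = requests.get(url, allow_redirects=True, timeout=10)
        except requests.RequestException as exc:
            print(f"{url} -> error: {exc}")
            continue
        hops = [f"{r.status_code} {r.url}" for r in resp.history]
        print(f"{url} -> {resp.url} ({resp.status_code}) via {hops or 'no redirects'}")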

5. Rel=canonical

The rel=canonical tag is closely related to the canonicalization we previously mentioned, but it is different because it’s used for more than resolving slightly different versions of the same URL.

Rel=canonical is used to prevent duplicate-content issues when you have similar content across different pages, which is often a problem for e-commerce sites.
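
To verify that a group of near-duplicate pages all point at one canonical URL, a sketch like this pulls the rel=canonical link element out of each page. The URLs are placeholders and the regex is a rough heuristic; a real crawl would use a proper HTML parser or a crawler.

    # Extract the rel=canonical URL from a few pages and compare them.
    # A group of near-duplicate pages should all point at one canonical URL.
    # The URLs are placeholders.
    import re
    import requests

    pages = [
        "https://example.com/product?color=red",
        "https://example.com/product?color=blue",
    ]

    for url in pages:
        html = requests.get(url, timeout=10).text
        # Rough regex for <link rel="canonical" href="...">
        match = re.search(
            r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
            html,
            re.IGNORECASE,
        )
        print(f"{url} -> canonical: {match.group(1) if match else 'none found'}")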

6. Broken backlinks

A website migration or relaunch project can spew out countless broken backlinks from other websites, if not properly overseen by a professional SEO.

Unfortunately, some of the top pages on your site may have fallen victim to 404 error pages after a migration. The backlinks pointing back to these 404 pages are effectively broken.

Two types of tools are great for finding broken backlinks: Google Search Console, and a backlink checker such as Moz, Majestic, or Ahrefs.
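
Once you export the linked URLs from Search Console or your backlink tool, a sketch like this flags the targets that no longer resolve. The filename and one-URL-per-line format are just assumptions for the example.

    # Check backlink target URLs for 404s.
    # Assumes a "backlink_targets.csv" export with one URL per line
    # (the filename and format are assumptions).
    import csv
    import requests

    with open("backlink_targets.csv", newline="") as f:
        targets = [row[0] for row in csv.reader(f) if row]

    for url in targets:
        try:
            resp = requests.head(url, allow_redirects=True, timeout=10)
            status = resp.status_code
        except requests.RequestException:
            status = "connection error"
        if status == 404 or status == "connection error":
            print(f"BROKEN: {url} ({status})")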

7. HTTPS


Everyone is aware that HTTPS is for secure sites; however, what was once only necessary for e-commerce sites is now becoming a necessity for all sites.

Google just recently announced that they would start marking any non-HTTPS site as non-secure if the site accepts passwords or credit cards:

“To help users browse the web safely, Chrome indicates connection security with an icon in the address bar. Historically, Chrome has not explicitly labeled HTTP connections as non-secure. Beginning in January 2017 (Chrome 56), we’ll mark HTTP pages that collect passwords or credit cards as non-secure, as part of a long-term plan to mark all HTTP sites as non-secure.”

What’s even more shocking is Google’s plan to label all HTTP URLs as non-secure:

“Eventually, we plan to label all HTTP pages as non-secure, and change the HTTP security indicator to the red triangle that we use for broken HTTPS.”

Google has also confirmed that it uses HTTPS as a lightweight ranking signal, so secure sites can get a slight edge in the SERPs over HTTP sites.
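
A quick sketch like the one below (with a placeholder domain) confirms that requests to the HTTP version of your site are redirected to HTTPS.

    # Confirm that http:// requests are redirected to https://.
    # "example.com" is a placeholder domain.
    import requests

    resp = requests.get("http://example.com/", allow_redirects=True, timeout=10)

    if resp.url.startswith("https://"):
        print(f"OK: http:// redirects to {resp.url} ({resp.status_code})")
    else:
        print(f"WARNING: site is still being served over {resp.url}")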

8. 301 & 302 redirects

Redirects are an amazing tool for managing and controlling dead pages, consolidating multiple pages, and making website migrations work without a hitch.

301 redirects are permanent and 302 redirects are temporary. The best practice is to always use 301 redirects when permanently redirecting a page.

It is extremely important to have someone on your team who understands how to properly plan and implement 301 redirects across your whole site.
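
A small sketch like this walks the redirect chain for a URL you have redirected (a placeholder here) and flags hops that are not permanent, which is how a 302 used in place of a 301 shows up.

    # Walk the redirect chain for a URL and flag non-permanent hops.
    # The URL below is a placeholder for a page you have redirected.
    import requests

    old_url = "https://example.com/old-page/"
    resp = requests.get(old_url, allow_redirects=True, timeout=10)

    for hop in resp.history:
        label = "permanent" if hop.status_code in (301, 308) else "temporary - check this"
        print(f"{hop.status_code} ({label}): {hop.url}")
    print(f"Final destination: {resp.status_code} {resp.url}")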

9. Meta refresh


Meta refreshes are client-side redirects executed in the user’s browser. They are not recommended by Google or by professional SEOs; it is better to use 301 redirects instead.

You can manually spot-check individual pages using the Redirect Path Checker Chrome Extension, Screaming Frog or another site crawler.
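
For a scripted spot-check, a sketch like this detects a meta refresh tag in a page’s HTML. The URL is a placeholder and the regex is a rough heuristic; a crawler will report this site-wide.

    # Detect a meta refresh redirect in a page's HTML.
    # The URL is a placeholder; the regex is a rough heuristic.
    import re
    import requests

    url = "https://example.com/some-page/"
    html = requests.get(url, timeout=10).text

    match = re.search(
        r'<meta[^>]+http-equiv=["\']refresh["\'][^>]*>',
        html,
        re.IGNORECASE,
    )
    if match:
        print(f"Meta refresh found on {url}: {match.group(0)}")
        print("Replace it with a server-side 301 redirect.")
    else:
        print(f"No meta refresh on {url}")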

10. XML sitemaps


Google’s Search Console Help Guide is quite clear on the purpose and helpfulness of XML sitemaps:

“If your site’s pages are properly linked, our web crawlers can usually discover most of your site. Even so, a sitemap can improve the crawling of your site, particularly if your site meets one of the following criteria: 

  • Your site is really large. 
  • Your site has a large archive of content pages that are isolated or not well linked to each other.
  • Your site is new and has few external links to it.”

XML sitemaps help Google and other search engine crawlers find and understand your site. Most often they have the biggest impact for large and complex sites that need to give extra direction to the crawlers.

A few of the biggest problems I’ve seen with XML sitemaps while working on clients’ sites:

  • Not creating one at all
  • Not including the location of the sitemap in the robots.txt
  • Having multiple versions of the same sitemap
  • Having old versions of the sitemap 
  • Not updating Search Console with the newest copy
  • Not using sitemap indexes for large sites
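
As a starting point for auditing the last few items, a sketch like this fetches a sitemap (assuming it lives at /sitemap.xml, which may not match your setup, and that it is a urlset rather than a sitemap index), lists the URLs it contains, and spot-checks that they still return 200s.

    # Fetch an XML sitemap and spot-check that its URLs still resolve.
    # Assumes the sitemap lives at /sitemap.xml and is a <urlset>; a sitemap
    # index would list child sitemaps instead of pages.
    import xml.etree.ElementTree as ET
    import requests

    SITEMAP_URL = "https://example.com/sitemap.xml"
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
    urls = [loc.text for loc in root.findall(".//sm:loc", NS)]
    print(f"{len(urls)} URLs listed in {SITEMAP_URL}")

    for url in urls[:20]:  # spot-check the first 20
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
        if status != 200:
            print(f"  {status}: {url}")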

11. Unnatural word count & page size

Having an inflated word count can slow down the load speed of your page, and it could possibly trigger penalty issues if it is seen as intentional cloaking. You should scan your site and compare the calculated word count and page size with what you expect.

In addition to an inflated and unnatural word count, there can be other code bloat on the page, such as inline JavaScript and CSS. Fixing these problems falls under the scope of the development team, but you shouldn’t rely on the developers to be proactive in identifying these types of issues.
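
A crawler like Screaming Frog reports these numbers properly, but a quick sketch like this gives you the raw HTML size and a rough visible word count for one page. The URL is a placeholder, and the tag-stripping regexes are a crude heuristic.

    # Rough page-size and word-count check for a single URL.
    # The URL is a placeholder; the text extraction is a crude heuristic
    # (it strips scripts, styles, and remaining tags with regexes).
    import re
    import requests

    url = "https://example.com/"
    resp = requests.get(url, timeout=10)

    size_kb = len(resp.content) / 1024
    text = re.sub(r"<(script|style)[^>]*>.*?</\1>", " ", resp.text, flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", text)  # drop remaining tags
    words = len(text.split())

    print(f"{url}: {size_kb:.1f} KB of HTML, ~{words} visible words")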

12. Speed

We’ve said it before and we will probably say it many more times: speed is key, and it falls under the purview of technical SEO.

Google has clearly stated that speed is a small part of the algorithm: “Like us, our users place a lot of value in speed — that’s why we’ve decided to take site speed into account in our search rankings. We use a variety of sources to determine the speed of a site relative to other sites.”

Speed sits at the bottom of the priority list for many site managers, even with this clear SEO directive and obvious UX and CRO benefits. With mobile search clearly cemented as just as important as, if not more important than, desktop search, speed matters even more and can no longer be ignored.
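
PageSpeed Insights and Lighthouse are the right tools for real analysis, but a minimal sketch like this gives you a crude server-response-time baseline to track over time; the URL is a placeholder.

    # Crude speed baseline: average server response time for one page.
    # For real analysis use PageSpeed Insights or Lighthouse; the URL is a placeholder.
    import requests

    url = "https://example.com/"
    timings = []
    for _ in range(3):
        resp = requests.get(url, timeout=30)
        timings.append(resp.elapsed.total_seconds())

    print(f"{url}: avg response time {sum(timings) / len(timings):.2f}s over {len(timings)} requests")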

13. Internal linking structure

The crawlability of your site is hugely impacted by your internal linking structure. If you are optimizing a massive site with isolated pages and no clean site structure, fixing it can take a huge amount of effort. If you are managing a smaller site on a standard platform, it most likely will not be high on the priority list.

When you are thinking about building out your internal linking plan, think about:

  • Scalable internal linking
  • Optimizing anchor text without over-optimizing
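
To get a first look at how link equity flows internally, a toy crawler sketch like this follows internal links from the homepage up to a small page budget and counts how many internal links point at each discovered URL, so thinly linked pages stand out. The domain, the 50-page budget, and the regex-based link extraction are all assumptions for the sketch.

    # Tiny internal-link crawl: count inbound internal links per page.
    # Pages with very few inbound links are candidates for better internal linking.
    # The domain and the 50-page budget are assumptions for this sketch.
    import re
    from collections import Counter
    from urllib.parse import urljoin, urlparse
    import requests

    start = "https://example.com/"
    domain = urlparse(start).netloc
    to_visit, seen, inbound = [start], set(), Counter()

    while to_visit and len(seen) < 50:
        page = to_visit.pop()
        if page in seen:
            continue
        seen.add(page)
        try:
            html = requests.get(page, timeout=10).text
        except requests.RequestException:
            continue
        for href in re.findall(r'href=["\']([^"\'#]+)["\']', html):
            link = urljoin(page, href)
            if urlparse(link).netloc == domain:
                inbound[link] += 1
                to_visit.append(link)

    # Show the ten least-linked internal URLs found during the crawl.
    for url, count in sorted(inbound.items(), key=lambda x: x[1])[:10]:
        print(f"{count:3d} internal links -> {url}")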

Developers are not responsible for monitoring and fixing your traffic. Technical SEO problems do not fall into developers’ scope of responsibilities, which is why you need a dedicated SEO team. Whether it is an in-house team or an outside digital marketing company that specializes in SEO, make sure you are not the only one handling these issues.

Blue Water Marketing has teams dedicated to improving both your on-page and technical SEO. Give us a call and we will help you achieve your digital marketing goals.
