Magento Technical SEO Issues You Shouldn’t Be Ignoring

What’s the problem with my store? What are the most common technical SEO issues?

 

Many eCommerce websites don’t take SEO seriously and focus mostly on PPC and social media ads to promote their sites. eCommerce SEO is not easy, and it may take a lot of time and effort to set up. But if set up properly, it will bring you quality traffic that converts. 

 

To audit your site, you should look at on-page SEO, off-page SEO, and technical SEO.

 

Technical SEO issues


But don’t start your efforts with the first two.
Technical SEO is one of the most crucial parts of your strategy, and you have to focus on it first: it is the foundation of a better search experience. Other SEO projects should come only after you’ve ensured your site has proper usability.


  • NOTE: Technical SEO includes components like page titles, title tags, HTTP header responses, XML sitemaps, 301 redirects, and metadata. Technical SEO does NOT include analytics, keyword research, backlink profile development, or social media strategies.

 



At MageCloud, having performed hundreds of Magento site audits over the years, we come across the same common technical SEO issues again and again.

Here we share a few tips on where to start. Hopefully, our technical SEO checklist will help you find the SEO errors that hurt your Magento online store.

 

Be honest: when’s the last time you checked that everything was working right and that you were getting the right results in terms of traffic and online authority?

 

I believe it’s definitely high time to take another look.


Technical SEO Checklist:

 

#1 HTTPS Security

#2 Site Isn’t Indexed Correctly

#3 No XML Sitemaps

#4 Missing or Incorrect Robots.txt

#5 Meta Robots NOINDEX

#6 Multiple Home URL Versions

#7 Incorrect Rel=Canonical

#8 Duplicate Content

#9 Not Enough Use of Structured Data

#10 Broken Links

#11 301 & 302 Redirects

#12 Slow Page Speed

#13 Mobile Device Optimization

#14 Missing Alt tags

#15 Missing or Non-Optimized Meta Descriptions and H1 tag

 

#1  HTTPS Security

Security has always been “a top priority” for Google. In 2014, the search engine giant announced HTTPS as a ranking signal. Since October 2017, Google has been showing a “Not secure” warning in Chrome whenever a user lands on an HTTP site.

 

The first step for this quick fix is to check if your site is HTTPS.
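
For the command-line inclined, a minimal Python sketch can confirm the redirect as well (it assumes the third-party requests package and uses example.com as a stand-in for your domain):

    import requests

    def check_https(domain: str) -> None:
        """Check that the plain-HTTP version of a site redirects to HTTPS."""
        resp = requests.get(f"http://{domain}", allow_redirects=True, timeout=10)
        if resp.url.startswith("https://"):
            print(f"OK: http://{domain} redirects to {resp.url}")
        else:
            print(f"WARNING: {domain} is still served over plain HTTP ({resp.url})")

    check_https("example.com")  # replace with your store's domain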


How to Check

  • ✔️   Type your domain name into Google Chrome. If you see the “secure” padlock indicator in the address bar, your site is secure.
  • ✔️   If your site is not secure, when you type your domain name into Google Chrome, it will display a gray background — or even worse, a red background with a “not secure” warning. This warning in your address bar is going to kill your conversions.

 


How to Fix

  • ✔️   To convert your site to HTTPS, you need an SSL certificate from a Certificate Authority. For more detailed info, check this step-by-step guide.
  • ✔️   Once you purchase and install your certificate, your site will be secure.


#2  Site Isn’t Indexed Correctly

When you search for your brand name in Google, does your website show up in the search results? If the answer is no, there might be an issue with your indexation. As far as Google is concerned, if your pages aren’t indexed, they don’t exist — and they certainly won’t be found on the search engines.

 


How to Check

  • ✔️   Type the following into Google’s search bar: “site:yoursitename.com” and instantly view the count of indexed pages for your site.

 


How to Fix

  • ✔️   If your site isn’t indexed at all, you can begin by adding your URL to Google.
  • ✔️   If your site is indexed but there are many MORE results than expected, look deeper for either site-hacking spam or old versions of the site that are indexed instead of having appropriate redirects in place to point to your updated site.
  • ✔️   If your site is indexed but you see quite a bit LESS than expected, perform an audit of the indexed content and compare it against the pages you want to rank. If you’re not sure why the content isn’t ranking, check Google’s Webmaster Guidelines to ensure your site content is compliant.
  • ✔️   If the results are different than you expected in any way, verify that your important website pages are not blocked by your robots.txt file (see #4 on this list) and that you haven’t mistakenly implemented a NOINDEX meta tag (see #5 on this list). A combined spot-check is sketched below.
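
For those last two verifications, a rough Python sketch can test robots.txt blocking and scan for NOINDEX in one pass (requests is a third-party package, and the URLs below are hypothetical placeholders for your own important pages):

    import requests
    from urllib.robotparser import RobotFileParser

    # Placeholder list of pages you expect to rank; replace with your own URLs.
    IMPORTANT_URLS = ["https://example.com/", "https://example.com/category/shoes"]

    robots = RobotFileParser()
    robots.set_url("https://example.com/robots.txt")
    robots.read()

    for url in IMPORTANT_URLS:
        blocked = not robots.can_fetch("*", url)
        # Crude heuristic: look for a noindex meta tag anywhere in the page source.
        html = requests.get(url, timeout=10).text.lower()
        noindex = 'name="robots"' in html and "noindex" in html
        print(f"{url}: blocked by robots.txt={blocked}, noindex tag={noindex}")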


#3  No XML Sitemaps

XML sitemaps help Google search bots understand more about your site pages, so they can effectively and intelligently crawl your site.
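
A quick scripted way to confirm a sitemap exists and see how many URLs it lists (a sketch assuming the standard /sitemap.xml location and the third-party requests package; example.com is a placeholder):

    import requests
    import xml.etree.ElementTree as ET

    resp = requests.get("https://example.com/sitemap.xml", timeout=10)
    if resp.status_code != 200:
        print(f"No sitemap found (HTTP {resp.status_code})")
    else:
        # Standard sitemap namespace; <loc> holds each listed URL.
        ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
        locs = [el.text for el in ET.fromstring(resp.content).iter(f"{ns}loc")]
        print(f"Sitemap lists {len(locs)} URLs; first entries: {locs[:3]}")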

 


How to Check

  • ✔️   Type your domain name into your browser’s address bar and add “/sitemap.xml” to the end (e.g. yourwebsite.com/sitemap.xml).

  • If your website has a sitemap, you will see an XML list of your site’s URLs.

 


How to Fix

  • ✔️   If your website doesn’t have a sitemap (and you end up on a 404 page), you can create one yourself or hire a web developer to create one for you. The easiest option is to use an XML sitemap generating tool. 


#4  Missing or Incorrect Robots.txt

A missing robots.txt file is a big red flag — but did you also know that an improperly configured robots.txt file destroys your organic site traffic?
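
Python’s built-in robotparser module gives you a quick way to test your live robots.txt against the pages you care about. A minimal sketch (the domain and paths are hypothetical placeholders):

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # fetches and parses the live file

    # Replace with paths you want search engines to crawl.
    for path in ["/", "/category/shoes", "/product/blue-sneaker"]:
        url = f"https://example.com{path}"
        verdict = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
        print(f"{verdict}: {url}")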

 


How to Check

  • ✔️   To determine if your robots.txt file is incorrect, type your website URL into your browser with a “/robots.txt” suffix, as pictured below.
  • ✔️   If you get a result that reads “User-agent: * Disallow: /” then you have an issue, because this directive tells all robots to stay out of your website:

  • [Image: an example of a robots.txt file that blocks all crawlers (Source: robotstxt.org)]

 


How to Fix

  • ✔️   If you see “Disallow: /”, immediately talk to your developer. There could be a good reason it’s set up that way, or it may be an oversight.
  • ✔️   If you have a complex robots.txt file, like many e-commerce sites, you should review it line-by-line with your developer to make sure it’s correct.


#5  Meta Robots NOINDEX

NOINDEX can be even more damaging than a misconfigured robots.txt at times.

 

Most commonly, the NOINDEX is set up when a website is in its development phase, but once the website goes live, it’s imperative to remove the NOINDEX tag.

 

Since so many web development projects run behind schedule and get pushed live at the last minute, this is where the mistake often happens. Do not blindly trust that the tag was removed, as the consequences will destroy your site’s search engine visibility.

 

When the NOINDEX tag is appropriately configured, it signifies certain pages are of lesser importance to search bots. However, when configured incorrectly, NOINDEX can immensely damage your search visibility by removing all pages with a specific configuration from Google’s index.
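
Rather than trusting that the tag was removed, you can spot-check a batch of pages with a short script. A rough sketch using the third-party requests and beautifulsoup4 packages (swap in your own URLs):

    import requests
    from bs4 import BeautifulSoup  # pip install beautifulsoup4

    def robots_meta(url: str) -> str:
        """Return the content of a page's meta robots tag, if any."""
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        tag = soup.find("meta", attrs={"name": "robots"})
        return tag.get("content", "") if tag else "(no meta robots tag)"

    for url in ["https://example.com/", "https://example.com/category/shoes"]:
        content = robots_meta(url)
        flag = "  <-- take action!" if "noindex" in content.lower() else ""
        print(f"{url}: {content}{flag}")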


How to Check

  • ✔️  Manually do a spot-check by viewing the source code of your page (right-click on your site’s main pages and select “View Page Source”) and looking for lines in the source code that read “NOINDEX” or “NOFOLLOW”, such as:

  • [Image: an example of a meta robots NOINDEX tag (Source: robotstxt.org)]
  • 90% of the time you’ll want it to be either “INDEX, FOLLOW” or nothing at all. If you see one of the above, you need to take action.

 


How to Fix

  • ✔️   If you see any “NOINDEX” or “NOFOLLOW” in your source code, check with your web developer as they may have included it for specific reasons.
  • ✔️   If there’s no known reason, have your developer change it to read <meta name="robots" content="INDEX, FOLLOW"> or remove the tag altogether.


#6  Multiple Home URL Versions

Remember when you discovered that “yourwebsite.com” and “www.yourwebsite.com” go to the same place? While this is convenient, it also means Google may be indexing multiple URL versions, diluting your site’s visibility in search.

 

Every site should have the following two properties set up in Search Console:

  1. http://www.yourwebsite.com
  2. http://yourwebsite.com

 

If your website has an SSL certificate set up (as any eCommerce website should), you would also add two additional properties:

  1. https://www.yourwebsite.com
  2. https://yourwebsite.com

 

That brings our total to four site versions, all being treated differently! The average user doesn’t really care if your home page shows up as all of these separately, but the search engines do.

 

If Search Console is successfully set up for your website (preferred site version is set), you should begin to see only one of the site versions getting all the attention.

 

Setting the preferred site version in Google Search Console is a good first step in addressing the numerous versions of your site, but it’s more than just selecting how you want Google to index your website. Your preferred site version affects every reference of your website’s URL – both on and off the site.

 


How to Check

  • ✔️  Check manually that the different versions of your URL all resolve to one standard URL. This includes the HTTPS and HTTP versions, as well as variants like “www.yourwebsite.com/home.html”. Check each possible combination.
  • ✔️  Another way is to use the “site:” operator in Google search (“site:yoursitename.com”) to find out which pages are indexed and whether they stem from multiple URL versions. A scripted version of this check is sketched below.
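
That scripted check might look something like this in Python (third-party requests package; example.com stands in for your domain):

    import requests

    # The four common home page variants; all should resolve to one URL.
    variants = [
        "http://example.com",
        "http://www.example.com",
        "https://example.com",
        "https://www.example.com",
    ]

    final_urls = set()
    for v in variants:
        resp = requests.get(v, allow_redirects=True, timeout=10)
        final_urls.add(resp.url)
        print(f"{v} -> {resp.url} ({len(resp.history)} redirect(s))")

    print("OK: one canonical home URL" if len(final_urls) == 1
          else "WARNING: multiple live versions")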

 


How to Fix

  • ✔️  If you discover multiple indexed versions, you’ll need to set up 301 redirects or have your developer set them up for you.
  • ✔️   Set up a schedule to monitor your URL canonicalization on a weekly or monthly basis.


#7  Incorrect Rel=Canonical

Rel=canonical is particularly important for all sites with duplicate or very similar content (especially e-commerce sites).

 

Dynamically rendered pages (like a category page of blog posts or products) can look like duplicate content to Google search bots. The rel=canonical tag tells search engines which “original” page is of primary importance (hence: canonical) — similar to URL canonicalization.
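
To spot-check a page’s canonical tag programmatically, something like this works (third-party requests and beautifulsoup4 packages; the filtered category URL is a hypothetical example of where duplicates usually hide):

    import requests
    from bs4 import BeautifulSoup

    def canonical_of(url: str) -> str:
        """Return the href of a page's rel=canonical link, if present."""
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        link = soup.find("link", rel="canonical")
        return link.get("href", "(empty)") if link else "(no rel=canonical tag)"

    for url in ["https://example.com/shoes", "https://example.com/shoes?color=blue"]:
        print(f"{url} -> canonical: {canonical_of(url)}")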


How to Fix

  • ✔️  Spot-check important pages to see if they’re using the rel=canonical tag.
  • ✔️  Use site-scanning software to list all the URLs on your site and determine whether there are duplicate page problems that can be solved with a rel=canonical tag.


#8  Duplicate Content

With more and more brands using dynamically created websites, content management systems, and practicing global SEO, the problem of duplicate content plagues many websites.

 

The problem with duplicate content is that it may “confuse” search engine crawlers and prevent the correct content from being served to your target audience.

 

Duplicate content can occur for many reasons:

  •  E-commerce store items appear on multiple versions of the same URL.
  •  Printer-only web pages repeat content from the main page.
  •  The same content appears in multiple languages on an international site.


How to Check

  • ✔️  There are a lot of tools for finding duplicate content. The best-known duplicate content checkers are probably Copyscape and Siteliner.

 


How to Fix

Each of the issues mentioned above can be resolved, respectively, with:

  • ✔️   Proper rel=canonical tags (see #7 on this list)
  • ✔️   Proper meta robots configuration (see #5 on this list)
  • ✔️   Correct implementation of hreflang tags (a quick check is sketched below)
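
For the hreflang item, a few lines of Python will list a page’s declared alternates and confirm each target URL resolves (requests and beautifulsoup4 again; the URL is a placeholder):

    import requests
    from bs4 import BeautifulSoup

    url = "https://example.com/"  # replace with a localized page on your site
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    for link in soup.find_all("link", rel="alternate", hreflang=True):
        target = link["href"]
        status = requests.head(target, allow_redirects=True, timeout=10).status_code
        print(f'hreflang="{link["hreflang"]}" -> {target} (HTTP {status})')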


Google’s support page offers other ideas to help limit duplicate content, including using 301 redirects, top-level domains, and limiting boilerplate content.


#9  Not Enough Use of Structured Data

Google defines structured data as: “a standardized format for providing information about a page and classifying the page content…”

 

Structured data is a simple way to help Google search crawlers understand the content and data on a page. For example, if your page contains a recipe, an ingredient list would be an ideal type of content to feature in a structured data format. Address information like this example from Google is another type of data perfect for a structured data format.
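
To see which structured data a page already carries, you can pull out its JSON-LD blocks with a few lines of Python (requests and beautifulsoup4 assumed; the product URL is hypothetical):

    import json
    import requests
    from bs4 import BeautifulSoup

    url = "https://example.com/product/blue-sneaker"  # hypothetical product page
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    for script in soup.find_all("script", type="application/ld+json"):
        try:
            data = json.loads(script.string or "")
        except json.JSONDecodeError as exc:
            print("Malformed JSON-LD block:", exc)
            continue
        kind = data.get("@type", "(unknown)") if isinstance(data, dict) else "(array)"
        print("Found structured data of type:", kind)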



How to Check

  • ✔️   Use the Rich Results Test, created by Google, to validate your schema markup and check its syntactic correctness and alignment with Google’s recommendations.

 


How to Fix

  • ✔️   As you roll out new content, identify opportunities to include structured data on the page, and coordinate the process between content creators and your SEO team. Better use of structured data may help improve CTR and possibly your rank position in the SERP. Once you implement structured data, review your Google Search Console reports regularly to make sure that Google is not reporting any issues with your markup.


#10  Broken Links

Good internal and external links show both users and search crawlers that you have high-quality content. Over time, content changes, and once-great links break. Broken links create a poor user experience and reflect lower-quality content, a factor that can affect page ranking.
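
A single page’s links can be checked with a short script like the one below; treat it as a rough sketch (requests and beautifulsoup4, placeholder URL), not a replacement for a full crawler:

    import requests
    from urllib.parse import urljoin
    from bs4 import BeautifulSoup

    page = "https://example.com/"  # replace with the page to audit
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

    for a in soup.find_all("a", href=True):
        url = urljoin(page, a["href"])
        if not url.startswith("http"):
            continue  # skip mailto:, tel:, javascript: links
        try:
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = 0  # unreachable
        if status == 0 or status >= 400:
            print(f"BROKEN ({status or 'unreachable'}): {url}")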

 

A website migration or relaunch project can spew out countless broken backlinks from other websites. Some of the top pages on your site may have become 404 pages after a migration.

 


How to Check

    • ✔️  Two types of tools are great for finding broken external links — Google Search Console, and a backlink checker such as Moz, Majestic, or Ahrefs. From there, you can reach out to the sites linking to broken URLs and provide them with the correct link or new page.

 


How to Fix

  • ✔️  After identifying your top pages with backlinks that are dead, 301 redirect these to the most relevant live pages.
  • ✔️  While internal links should be confirmed every time a page is removed, changed, or redirected, external links require regular monitoring. The best and most scalable way to address broken links is to run regular site audits.


#11  301 & 302 Redirects

Redirects are an amazing tool in an SEO’s arsenal for managing and controlling dead pages, consolidating multiple pages, and making website migrations work without a hitch.

301 redirects are permanent and 302 redirects are temporary. The best practice is to always use 301 redirects when permanently redirecting a page.

301 redirects can be confusing for those new to SEO. They’re a lifesaver when used properly, but a pain when you have no idea what to do with them.
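
To see exactly what a URL returns along its redirect chain (and whether each hop is a 301 or a 302), a small sketch like this helps (third-party requests package; the old-page URL is hypothetical):

    import requests

    def show_redirect_chain(url: str) -> None:
        """Print each hop in a redirect chain with its status code."""
        resp = requests.get(url, allow_redirects=True, timeout=10)
        for hop in resp.history:
            label = "permanent" if hop.status_code == 301 else "temporary/other"
            print(f"{hop.status_code} ({label}): {hop.url}")
        print(f"Final: {resp.status_code} {resp.url}")

    show_redirect_chain("http://example.com/old-page")  # a moved URL on your site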



How to Check

  • ✔️   Crawl your site with an SEO auditing tool to list every redirect, then review whether each 302 should really be temporary (the sketch above shows how to inspect a single redirect chain).


How to Fix

  • ✔️   If you’re using 302 redirects for permanent moves, change them to 301 redirects (but first, discuss the use of those 302s with your development team).
  • ✔️   Include redirect checking in your monthly or weekly site scan process.


  • NOTE:
    1. Never redirect all the pages from an old site to the home page unless there’s a really good reason.
    2. Don’t go redirect-crazy on all 404 errors — use redirects only for pages receiving links or traffic, to keep your redirect list manageable.


#12  Slow Page Speed

If your site doesn’t load quickly, your users will go elsewhere. Site speed, along with other usability factors, matters to the user experience and to Google.


  • NOTE: There’s a new ranking factor in town: Core Web Vitals, quality signals that are key to delivering a great user experience on the web. Your site should pass its assessment.



How to Check

  • ✔️   Use Google PageSpeed Insights to detect specific speed problems with your site. (Be sure to check desktop as well as mobile performance.) The same data can also be pulled programmatically, as sketched below.
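
For scripted or scheduled monitoring, PageSpeed Insights also exposes a public API. A minimal sketch (third-party requests package; Google recommends an API key for regular use, which this example omits):

    import requests

    API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    params = {"url": "https://example.com/", "strategy": "mobile"}  # also try "desktop"

    data = requests.get(API, params=params, timeout=60).json()
    score = data["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"Lighthouse performance score (mobile): {score * 100:.0f}/100")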

 


How to Fix

    • ✔️   The solution to site speed issues can vary from simple to complex. Common fixes include image optimization/compression, browser caching improvements, server response time improvements, and JavaScript minification.
    • ✔️   Speak with your web developer or contact us to ensure the best possible solution for your site’s particular page speed issues.


#13  Mobile Device Optimization

In December 2018, Google announced that mobile-first indexing covered more than half of the pages appearing in its search results. Google would have sent you an email when (or if) your site was transitioned. If you’re not sure whether your site has undergone the transition, you can also use Google’s URL Inspection tool.

 

Whether Google has transitioned you to mobile-first indexing yet or not, you need to make sure your site is mobile-friendly to deliver an exceptional mobile user experience. Anyone using responsive website design is probably in good shape. If you run an “m.” mobile site, you need to make sure it is implemented correctly so you don’t lose your search visibility in a mobile-first world.

 


How to Fix

As your mobile site will be the one indexed, you’ll need to do the following for all “m.” web pages:

  • ✔️   Verify that hreflang annotations and links are correct.
  • ✔️   Update all metadata on your mobile site. Meta descriptions should be equivalent on both the mobile and desktop versions (a parity check is sketched below).
  • ✔️   Add structured data to your mobile pages and make sure the URLs are updated to mobile URLs.
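
The meta description parity check mentioned above can be sketched in a few lines of Python (requests and beautifulsoup4; both URLs are placeholders for your own desktop and m-dot pages):

    import requests
    from bs4 import BeautifulSoup

    def meta_description(url: str) -> str:
        """Return a page's meta description content, or '' if missing."""
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        tag = soup.find("meta", attrs={"name": "description"})
        return (tag.get("content") or "").strip() if tag else ""

    desktop = meta_description("https://www.example.com/product/blue-sneaker")
    mobile = meta_description("https://m.example.com/product/blue-sneaker")
    print("Descriptions match" if desktop == mobile
          else "MISMATCH between desktop and m-dot versions")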


#14  Missing Alt tags

Broken images and images missing alt tags are a missed SEO opportunity. The image alt attribute helps search engines index a page by telling the bot what the image is about. It’s a simple way to boost the SEO value of your page via the image content that enhances your user experience.
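
Finding images with missing or empty alt attributes on a given page takes only a few lines (requests and beautifulsoup4; the category URL is a placeholder):

    import requests
    from bs4 import BeautifulSoup

    page = "https://example.com/category/shoes"  # hypothetical category page
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

    for img in soup.find_all("img"):
        if not (img.get("alt") or "").strip():
            print("Missing alt text:", img.get("src", "(no src)"))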

 


How to Fix

  • ✔️  Most SEO site audits will identify images that are broken and missing alt tags. Running regular site audits to monitor your image content as part of your SEO standard operating procedures makes it easier to manage and stay current with image alt tags across your website.


#15 Missing or Non-Optimized Meta Descriptions and H1 tag

Meta descriptions are those short, up to 160-character content blurbs that describe what the web page is about. These little snippets help the search engines index your page, and a well-written meta description can stimulate audience interest in the page.

 

It’s a simple SEO feature, but a lot of pages are missing this important content. You might not see this content on your page, but it’s an important feature that helps the user know if they want to click on your result or not. Like your page content, meta descriptions should be optimized to match what the user will read on the page, so try to include relevant keywords in the copy.

 

And although H1 tags are not as important as they once were, it’s still an on-site SEO best practice to display one prominently.

This matters most for large sites with many, many pages, such as massive eCommerce stores, because those sites can realistically rank product or category pages with just a keyword-targeted main headline and a short string of text.
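
A quick way to spot-check a single page for both issues, sketched in Python (requests and beautifulsoup4; replace the URL with the page you want to audit):

    import requests
    from bs4 import BeautifulSoup

    url = "https://example.com/"  # replace with the page to audit
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    meta = soup.find("meta", attrs={"name": "description"})
    desc = (meta.get("content") or "").strip() if meta else ""
    h1s = soup.find_all("h1")

    print(f"Meta description: {len(desc)} chars" if desc else "Meta description: MISSING")
    if desc and not 50 <= len(desc) <= 160:
        print("  -> consider rewriting to roughly 50-160 characters")
    print(f"H1 tags found: {len(h1s)}" + ("" if len(h1s) == 1 else " (expected exactly one)"))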

 


How to Fix

  • ✔️   For pages missing meta descriptions: run an SEO site audit to find all pages without them. Determine the value of each page and prioritize accordingly.
  • ✔️   For pages with meta descriptions: evaluate pages based on performance and value to the organization. An audit can identify any pages with meta description errors. It’s important to make sure that meta descriptions are unique to a page.
  • ✔️   Check whether H1 and H2 tags are present on pages across your site; if not, fix them.


Conclusion

An SEO audit reveals a complete overview of site health and optimization efforts. 

 

A website SEO audit is like a health check for your site. It allows you to check how your web pages appear in search results, so you can find and fix any weaknesses.

 

Have we piqued your interest?

Let us prove our skills and provide you with a free video review of your eCommerce site. Schedule a meeting with our team.