Magento 2 post-launch SEO guide

Launching a new eCommerce website marks the successful completion of a major project for your organization. However, it’s important to see it as the beginning of another: the need to invest time in ongoing Search Engine Optimization (SEO) is often overlooked.

SEO is not something that you can set and forget. Google’s algorithms change constantly, which means best-practice techniques are always in flux. If competitors are on top of their game with on-page optimization, behind-the-scenes site speed tweaks, and a strong content strategy, you’ll find your hard-earned rankings on search engine results pages (SERPs) start to slip.

We’ve outlined a few optimization tips to keep your new Magento 2 site ranking strong in search results, long after the post-launch dust has settled.

Set Up Position Tracking for Key Competitors and Search Terms

It’s important to keep an eye on how your competitors are faring and how you rank against them for the keywords that matter most to your business.

Tools such as SEMrush and Ahrefs have a range of features that will help you with keyword research for both organic and paid traffic. They also provide reports that give you a broad overview of any current SEO issues on your website.

One of their best features, however, is their position tracking reports. These allow you to track a range of keyword rankings on your own website versus competitors, generating handy reports that you can refer to month on month.

You’ll want to make sure that the keywords you’re tracking are relevant to your business and have a high volume of traffic. It’s also wise to exclude your own brand terms, as they will skew the results.

Tracking keywords in this way makes it easy to see how your rankings are responding to the new site and to determine which keywords competitors are outranking you on and which they are falling behind on. A new launch often causes some fluctuation, so this is also a good way to keep an eye on how your rankings settle afterwards.
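If your tracking tool lets you export keyword positions as a CSV, a short script can flag where a competitor has pulled ahead month on month. The sketch below is a minimal example under assumed conditions: it expects a hypothetical export with keyword, our_position, and competitor_position columns and a made-up file name, so map these to whatever your SEMrush or Ahrefs export actually provides.

```python
import csv

# Minimal sketch: flag keywords where a competitor outranks us, based on a
# hypothetical CSV export with "keyword", "our_position" and
# "competitor_position" columns. Adjust the column names to your real export.
def keywords_losing_to_competitor(path: str) -> list[str]:
    losing = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            ours = int(row["our_position"])
            theirs = int(row["competitor_position"])
            if theirs < ours:  # a lower position number means a higher ranking
                losing.append(row["keyword"])
    return losing

if __name__ == "__main__":
    for keyword in keywords_losing_to_competitor("position_tracking_export.csv"):
        print(f"Competitor outranks us on: {keyword}")
```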

Make Sure Your Sitemap is in Order

Sitemap generation has seen a lot of improvements in Magento 2. There is now a range of options, including the ability to set the update frequency and priority for CMS pages, categories, and products individually.

When you launch a new site, it can take Google, Bing, and the other major search engines a while to “crawl” through the new structure and begin to rank the pages appropriately.

You can help them with this process by submitting your sitemap to Google Search Console and Bing Webmaster Tools when your Magento 2 site launches. A properly configured sitemap.xml will reduce crawl waste and help your new site get indexed.

The process is still slow, and you can expect a few of your rankings to dip before they rise, but a well-formed sitemap can only help. Google Search Console will also report any errors or warnings it finds in your sitemap that may prevent your site from being crawled properly.
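Once the sitemap has been submitted, it’s worth spot-checking that the URLs it lists actually resolve. The sketch below is a rough example rather than a definitive tool: it assumes a placeholder example.com sitemap and uses the third-party requests library, and note that Magento may generate a sitemap index pointing at child sitemaps, which you would need to follow in the same way.

```python
import xml.etree.ElementTree as ET

import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder: use your store's sitemap
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def check_sitemap(sitemap_url: str, limit: int = 20) -> None:
    """Fetch the sitemap and spot-check that the first few listed URLs return 200."""
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    urls = [loc.text for loc in root.findall(".//sm:loc", NS)]
    print(f"{len(urls)} URLs listed in {sitemap_url}")
    for url in urls[:limit]:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
        if status != 200:
            print(f"  {status} -> {url}")

if __name__ == "__main__":
    check_sitemap(SITEMAP_URL)
```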

Check for Broken Links & Site Errors

If the new Magento 2 site is a redesign of a previous site, then a list of 301 redirects should have been created and implemented before launch. However, with eCommerce sites, broken links are often missed, and new issues can appear after launch.

SEO tools such as SEMrush and Ahrefs have a site crawl feature that can help find any broken links on the new site, as well as a range of other issues that might need to be addressed.

Google Search Console is another invaluable tool for this process. Shortly after the launch of the new site, Search Console will begin to pick up any “Crawl Errors”, which could include pages returning a 404 error code, pages with server issues, pages with mobile usability issues, and pages that are blocked from crawling, among a number of other problems.

It’s important to keep an eye on Google Search Console and your SEO tool of choice on a regular basis after the launch of a site. This monitoring ensures new issues aren’t appearing that could affect SERP rankings or overall site performance.
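A full crawl is best left to the tools above, but a lightweight script can keep an eye on a hand-picked list of important URLs between crawls. The sketch below is illustrative only: the URLs are placeholders, and it simply reports error status codes and unexpected redirects using the requests library.

```python
import requests

# Placeholder list of URLs to monitor, e.g. top landing pages from analytics
# or key URLs from the old site that should now 301 to their new homes.
URLS_TO_CHECK = [
    "https://www.example.com/",
    "https://www.example.com/old-category.html",
    "https://www.example.com/popular-product.html",
]

def report_broken_urls(urls: list[str]) -> None:
    for url in urls:
        try:
            response = requests.get(url, allow_redirects=True, timeout=10)
        except requests.RequestException as exc:
            print(f"ERROR {url}: {exc}")
            continue
        if response.status_code >= 400:
            print(f"{response.status_code} {url}")
        elif response.history:  # the request was redirected at least once
            print(f"{response.history[0].status_code} redirect: {url} -> {response.url}")

if __name__ == "__main__":
    report_broken_urls(URLS_TO_CHECK)
```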

Prevent Duplicate Content

Google doesn’t penalize sites for duplicate content unless “the intent of the duplicate content is to be deceptive and manipulate search engine results”. Even if a site is not penalized in this way, duplicate content can still impact search engine rankings.

It presents three issues for search engines:

  • They don’t know which version of the page to include in (or exclude from) their index.
  • They don’t know which version of the page to direct the link metrics to: do they split them equally between each duplicate, or assign them all to one page, which might be the wrong one?
  • They don’t know which version to rank in their SERPs.

What Causes Duplicate Content?

eCommerce sites are more likely to have duplicate content issues. These can be caused by “boilerplate” content, such as product descriptions or shipping information that is repeated on every product page.

It can also be caused by how the site handles products or the nature of the products that a website sells.

Issues could be caused by:

  • Product filtering
  • Product sorting
  • Pagination
  • Same product in different categories
  • Variations of the same product

Preventing Duplicate Content

There are a number of ways to deal with duplicate content issues:

  • Rel=canonical tags
  • 301 redirects
  • Setting URL parameters and preferred domains in Google Search Console
  • Noindexing pages
  • Disallowing pages in robots.txt

301 Redirects

Implementing 301 redirects is often the best and easiest way of eliminating duplicate content issues, but it depends on the nature of the issue. A product may have five color variations with different URLs leading to identical content, save for the color of the product and its name. Redirecting them all to one main product page wouldn’t work, as it would prevent customers from accessing the other colors. In situations like these, it’s better to contain the colors within a single configurable product rather than giving each its own URL.

If duplicate content is caused by filters, URL variations, HTTP versus HTTPS pages, or www versus non-www URLs, then 301 redirecting these pages is likely the best approach.
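It’s easy to verify that these protocol and hostname variants really do return a 301 to the preferred URL. The sketch below assumes a hypothetical store whose preferred domain is https://www.example.com and simply inspects the status code and Location header of each variant without following the redirect.

```python
import requests

# Assumed preferred domain for illustration; adjust to your own setup.
PREFERRED = "https://www.example.com"
VARIANTS = [
    "http://www.example.com/",
    "http://example.com/",
    "https://example.com/",
]

def check_redirects(variants: list[str], preferred: str) -> None:
    for url in variants:
        response = requests.head(url, allow_redirects=False, timeout=10)
        location = response.headers.get("Location", "")
        if response.status_code == 301 and location.startswith(preferred):
            print(f"OK    {url} -> {location}")
        else:
            print(f"CHECK {url}: status {response.status_code}, Location {location or 'missing'}")

if __name__ == "__main__":
    check_redirects(VARIANTS, PREFERRED)
```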

Rel=Canonical Tags

Canonical tags tell Google which is the 'preferred' version of a page. If there are multiple URLs with exactly the same content, the tag lets Google know which of those pages is the 'main' version and passes the link metrics along to it.

Setting the canonical tag in Magento 2 is straightforward. The settings can be found under Stores > Settings > Configuration > Catalog > Catalog > Search Engine Optimization. Here, there is a range of options that control how canonical tags are used to help search engines index the site.

The last two fields, Use Canonical Link Meta Tag for Categories and Use Canonical Link Meta Tag for Products, are the most important here. Setting these to either “Yes” or “No” changes how the site canonicalizes categories and products.

Canonicals can also be added manually by applying the rel=canonical tag to a page and setting the canonical URL. A canonical tag passes roughly the same amount of link equity as a 301 does, but it can be easier to implement because it requires less development time.
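A quick way to confirm that category and product pages are emitting the canonical you expect is to fetch a page and read its rel=canonical link. The sketch below is a simple check using Python’s standard html.parser plus the requests library; the product URL is a made-up example.

```python
from html.parser import HTMLParser

import requests

class CanonicalParser(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag on a page."""

    def __init__(self) -> None:
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

def get_canonical(url: str) -> str | None:
    parser = CanonicalParser()
    parser.feed(requests.get(url, timeout=10).text)
    return parser.canonical

if __name__ == "__main__":
    # Hypothetical product URL with a color filter applied.
    url = "https://www.example.com/product.html?color=blue"
    print(f"{url} canonicalizes to {get_canonical(url)}")
```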

Set URL Parameters & Preferred Domains to Help Google Crawl Your Site

Google Search Console allows webmasters to set preferred domains and choose how Google handles URL Parameters on a site. This can help eliminate many duplicate content issues without development work.

Setting a “Preferred Domain” lets you decide how your URLs are displayed, either with www or without, which prevents duplicate issues between the www and non-www versions of the site.

Setting URL Parameters allows a site to dictate to Google how parameters contained within URLs should be handled. This is particularly useful for pagination, where you might have duplicate URLs on product pages or blog pages - for example, website.com/blog/p1, website.com/blog/p2, and so on.

For these pages, the parameter can be set to “Paginates”, which lets Google know these pages are paginated from the original and should not be ranked in their own right.

The main drawback of setting URL parameters and a preferred domain is that they only apply to Google. To achieve the same effect in other search engines, the equivalent settings would need to be configured in their own webmaster tools.
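Before deciding how each parameter should be handled, it helps to see which parameters are actually generating duplicate URLs. The sketch below groups a list of crawled URLs after stripping out parameters assumed to be non-significant; the parameter names (p, product_list_order, and so on) and the URLs are illustrative, not a definitive list for every Magento store.

```python
from collections import defaultdict
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

# Parameters assumed to create duplicate versions of the same content.
# The exact names depend on your theme and installed extensions.
IGNORED_PARAMS = {"p", "product_list_order", "product_list_dir", "utm_source"}

def canonical_form(url: str) -> str:
    """Strip ignored query parameters so duplicate variants collapse to one key."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

def duplicate_groups(urls: list[str]) -> dict[str, list[str]]:
    groups = defaultdict(list)
    for url in urls:
        groups[canonical_form(url)].append(url)
    return {key: members for key, members in groups.items() if len(members) > 1}

if __name__ == "__main__":
    crawled = [
        "https://www.example.com/blog?p=1",
        "https://www.example.com/blog?p=2",
        "https://www.example.com/shoes.html?product_list_order=price",
        "https://www.example.com/shoes.html",
    ]
    for key, members in duplicate_groups(crawled).items():
        print(key, "<-", members)
```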

Noindexing Pages

Adding a meta noindex, follow tag to pages that should be kept out of the index is another approach. This allows a search engine bot to crawl the page but tells it not to index it.

It’s essential that Google can still crawl these pages even if they aren’t to be indexed. Google doesn’t recommend blocking crawler access to duplicate pages on your site, but it does recommend any of the above approaches for solving duplicate content issues.
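A page that should be noindexed can be checked directly, since the directive appears either in a meta robots tag or an X-Robots-Tag response header. The sketch below is a simple, assumption-heavy check using requests and the standard html.parser; the filter URL is a made-up example.

```python
from html.parser import HTMLParser

import requests

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag on a page."""

    def __init__(self) -> None:
        super().__init__()
        self.directives: list[str] = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append(attrs.get("content", ""))

def is_noindexed(url: str) -> bool:
    response = requests.get(url, timeout=10)
    parser = RobotsMetaParser()
    parser.feed(response.text)
    # The directive can also arrive as an HTTP header rather than a meta tag.
    header = response.headers.get("X-Robots-Tag", "")
    return any("noindex" in directive.lower() for directive in parser.directives + [header])

if __name__ == "__main__":
    # Hypothetical filtered category page that should carry noindex, follow.
    print(is_noindexed("https://www.example.com/shoes.html?color=blue&size=10"))
```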

Disallowing URLs in robots.txt

Disallowing URLs in your robots.txt will prevent Google (and other search engines) from crawling these pages, which in most cases keeps them out of search results and effectively eliminates any duplicate content issues. The content of your robots.txt will vary depending on the structure and size of your site, but eCommerce websites tend to need a more involved robots.txt due to the range of potential duplicate pages.

A recommended default robots.txt for Magento 2 is a great starting point, eliminating a lot of duplication caused by /catalog, /wishlist, /sendfriend URLs, and much more. It’s worth revisiting your robots.txt after the launch of your site to see if any unanticipated URLs are being crawled and causing duplicate issues.

Robots.txt is one of the best ways to exclude pages because it works across all search engines, as opposed to Google Search Console’s parameter settings, which only apply to Google. However, special care should be taken when adding or removing URLs in the robots.txt file, as incorrect use may cause pages you want to rank to be excluded from SERPs.
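The Python standard library’s robots.txt parser makes it easy to confirm that the rules behave as intended: paths you want blocked are blocked, and real category and product pages are still crawlable. The sketch below uses placeholder example.com URLs; the disallowed paths shown are ones commonly blocked in Magento 2 robots.txt files, but verify them against your own file.

```python
from urllib.robotparser import RobotFileParser

ROBOTS_URL = "https://www.example.com/robots.txt"  # placeholder domain
URLS = [
    "https://www.example.com/wishlist/",                     # commonly disallowed
    "https://www.example.com/catalogsearch/result/?q=shoes", # commonly disallowed
    "https://www.example.com/shoes.html",                    # a real product page should stay allowed
]

def check_robots(robots_url: str, urls: list[str]) -> None:
    parser = RobotFileParser()
    parser.set_url(robots_url)
    parser.read()
    for url in urls:
        allowed = parser.can_fetch("*", url)
        print(f"{'ALLOWED' if allowed else 'BLOCKED'}  {url}")

if __name__ == "__main__":
    check_robots(ROBOTS_URL, URLS)
```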

Use Improved Layered Navigation to Create SEO Optimized Landing Pages

The Improved Layered Navigation extension for Magento 2 has a range of benefits for user experience, allowing for custom sliders, excellent product filters and more. It’s a key extension to use as part of the initial site build, but it also has a range of SEO benefits that can improve performance after the launch of a site.

It allows a site to set SEO-friendly URLs and content on filter pages and brand pages, with the ability to create a “landing page” for any combination of product parameters and optimize it for best-practice SEO.

Its key SEO capabilities include:

  • Providing SEO-friendly, optimized custom URLs for filter pages
  • Setting canonical URLs for these pages
  • Redirecting the standard Magento category URLs to the new SEO-friendly URLs
  • Applying NoIndex and NoFollow options to particular categories (for example, when more than one filter is applied)

Kick-start your digital strategy

Do you need your website to be visible to your target audience in search engines but don't know where to start? An SEO audit is the first step to creating a digital strategy that works. Get in touch to kick-start your digital strategy.
