Best Practices for 301 Redirects

NEW PAGE IN NEW CONFLUENCE FOR GROWTH MARKETING - https://relias.atlassian.net/wiki/spaces/GM1/pages/3799220296

 

Some of the old URL's link value IS passed to the new URL with a 301 redirect IF the redirect is done properly and the other considerations below are ADDRESSED.

If you need to change the URL of a page as it is shown in search engine results, Google recommends using a 301 (permanent) server-side redirect when possible. This is the BEST way to ensure that Google search crawlers and users are directed to the correct page.

The Googlebot crawler follows the redirect, and the indexing pipeline uses the redirect as a STRONG SIGNAL that the redirect target URL should be canonical.

Serving a 301 indicates to both browsers and search engine bots that the page has moved permanently. Search engines interpret this to mean that not only has the page changed location but that the content (or an updated version of it) can be found at the new URL. The engines will pass some link juice/weighting/SEO value from the original page to the new URL.
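
In practice the redirect is configured in the web server or CMS, but at the HTTP level "serving a 301" just means returning status code 301 with a Location header pointing at the new URL. A minimal illustration using only the Python standard library, with hypothetical paths:

    # Minimal sketch of a permanent server-side redirect (hypothetical paths).
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical mapping of retired URLs to their replacements.
    REDIRECTS = {
        "/old-landing-page": "/new-landing-page",
    }

    class RedirectHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            target = REDIRECTS.get(self.path)
            if target:
                # 301 = moved permanently; the Location header carries the new URL.
                self.send_response(301)
                self.send_header("Location", target)
                self.end_headers()
            else:
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.end_headers()
                self.wfile.write(b"current page")

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()

A crawler or browser requesting /old-landing-page receives the 301 and follows the Location header to /new-landing-page.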

 

BUT 301 redirects can cause PLENTY of other SEO-related issues that don’t often get talked about.

It pays off to make sure there are no existing problems with 301 redirects on the website, as they could hinder current and future SEO efforts.

Google will see your 301 redirect, but it will also review other factors, so you need to MAKE SURE you:

  • Update ALL internal links on the website to point to the new page

  • XML sitemap files should NOT contain the old page that has been redirected - only status 200 pages should be included

  • Any other external links or references to the old URL NEED to be updated to the new URL

    • Social media profiles, PPC ads, directories, affiliate campaigns, email signatures 

  • ELIMINATE redirect chains

    • Search engines stop following redirects after 3 “hops” (URL A redirects to URL B, which redirects to URL C). With each additional redirect in a chain, the original SEO value of the URL is diminished (see the redirect-chain checker sketched after this list). 

  • Make sure there are NO conflicts between XML sitemap and robots directives

    • If robots.txt tells Google not to crawl a page that’s within the XML sitemap, we’re sending mixed messages 

    • The same applies to page-specific robots directives, e.g. a page in the XML sitemap should not be set to noindex (see the sitemap audit sketched after this list) 
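
The redirect-chain check referenced above can be scripted for a list of old URLs. A rough sketch, assuming the third-party requests package is installed; the URL and hop limit below are illustrative only:

    # Rough sketch: follow redirects manually so each hop is visible.
    from urllib.parse import urljoin
    import requests

    MAX_HOPS = 3  # the "hops" guideline discussed above

    def trace_redirects(url, max_hops=MAX_HOPS):
        """Return the list of redirect hops and the final status code."""
        hops = []
        current = url
        for _ in range(max_hops + 1):
            resp = requests.get(current, allow_redirects=False, timeout=10)
            if resp.status_code in (301, 302, 307, 308):
                # Location may be relative, so resolve it against the current URL.
                target = urljoin(current, resp.headers["Location"])
                hops.append((current, resp.status_code, target))
                current = target
            else:
                return hops, resp.status_code
        return hops, None  # still redirecting after max_hops

    if __name__ == "__main__":
        for old_url in ["https://www.example.com/old-page"]:  # hypothetical old URLs
            hops, final_status = trace_redirects(old_url)
            flag = "CHAIN" if len(hops) > 1 else "ok"
            print(f"{flag}: {old_url} -> {final_status} after {len(hops)} hop(s)")

Each old URL should show exactly one 301 hop ending in a 200 status; anything longer is a chain worth flattening by pointing the first redirect straight at the final destination.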
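
Similarly, the sitemap audit referenced above can be scripted: every URL listed in the XML sitemap should return 200 with no redirect and must not be disallowed in robots.txt. A rough sketch, again assuming requests, with a hypothetical sitemap location (it does not handle sitemap index files or page-level noindex directives):

    # Rough sketch: audit an XML sitemap against live status codes and robots.txt.
    import xml.etree.ElementTree as ET
    from urllib import robotparser
    import requests

    SITEMAP_URL = "https://www.example.com/sitemap.xml"  # assumed location
    ROBOTS_URL = "https://www.example.com/robots.txt"
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    def sitemap_urls(sitemap_url):
        """Return the <loc> values from a standard XML sitemap."""
        root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
        return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

    def audit():
        robots = robotparser.RobotFileParser(ROBOTS_URL)
        robots.read()
        for url in sitemap_urls(SITEMAP_URL):
            status = requests.get(url, allow_redirects=False, timeout=10).status_code
            if status != 200:
                print(f"NOT 200 ({status}) but listed in sitemap: {url}")
            if not robots.can_fetch("Googlebot", url):
                print(f"Disallowed by robots.txt but listed in sitemap: {url}")

    if __name__ == "__main__":
        audit()

Page-level noindex conflicts would need a separate check of the meta robots tag or X-Robots-Tag header on each sitemap URL.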

Once redirects are in place, you can check the URL in the URL Inspection tool in Google Search Console (GSC) to see which URL is marked as canonical - this is what will rank. If it shows a URL that is not the one you want to rank, then you need to look at the signals (items above) pointing to the old URL and fix those. 

WITHOUT doing the above items, you RISK confusing search engines about which URL should be ranked. That could result in neither URL ranking, or in rankings being pushed further down in search results (i.e. less traffic, lower CTR, and ultimately less revenue), and in losing significant SEO value that we have worked to gain for these URLs.

 

ALSO be aware that when moving a page from one URL to another, the search engines WILL TAKE TIME to discover the 301, recognize it, and credit the new page with the rankings and trust of its predecessor. This process can be LENGTHIER if search engine spiders rarely visit the given page or if the new URL doesn't resolve properly (e.g. redirect chains).

 


Non-trailing slash vs. trailing slash

Browsers will render a page whether or not the URL has a trailing slash and regardless of letter case, but search engines can treat each variation as a separate URL, so a single canonical version should be chosen and the other variants 301-redirected to it.
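
A minimal sketch of that normalization, using only the Python standard library and assuming the canonical form is lowercase with no trailing slash (adjust the rules to whatever convention the site actually uses):

    # Minimal sketch: 301 non-canonical URL variants (uppercase letters,
    # trailing slash) to one assumed canonical form.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def canonicalize(path):
        """Lowercase the path and drop any trailing slash (except the root)."""
        canonical = path.lower()
        if len(canonical) > 1:
            canonical = canonical.rstrip("/") or "/"
        return canonical

    class CanonicalHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            path, sep, query = self.path.partition("?")  # leave query strings untouched
            canonical = canonicalize(path)
            if canonical != path:
                self.send_response(301)
                self.send_header("Location", canonical + sep + query)
                self.end_headers()
            else:
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.end_headers()
                self.wfile.write(b"canonical page")

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), CanonicalHandler).serve_forever()

In practice this normalization is usually handled in the web server or CMS configuration rather than in application code; the point is that every variant answers with a single 301 straight to the one canonical URL.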