Common issues with URL parameters

25.Oct.2021

1. Keyword cannibalisation is an SEO problem in which multiple pages on the same site target the same keyword, often by repeating it across several URL slugs. The pages end up competing against each other in search results and diluting their ranking signals.

URL: http://example.com/keyword1/keyword2/
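
A quick way to spot cannibalisation candidates is to tokenise each URL slug and flag keywords that appear in more than one slug. Here is a minimal Python sketch; the URL list is hypothetical and would normally come from your own crawl or sitemap.

    from collections import defaultdict
    from urllib.parse import urlparse

    # Hypothetical list of site URLs to audit.
    urls = [
        "http://example.com/keyword1/keyword2/",
        "http://example.com/blog/keyword1-guide",
        "http://example.com/keyword2",
    ]

    pages_by_keyword = defaultdict(set)
    for url in urls:
        path = urlparse(url).path
        # Split the slug into tokens on "/" and "-".
        for token in path.replace("-", "/").split("/"):
            if token:
                pages_by_keyword[token].add(url)

    # A keyword appearing in more than one slug is a cannibalisation candidate.
    for keyword, pages in pages_by_keyword.items():
        if len(pages) > 1:
            print(f"'{keyword}' is targeted by {len(pages)} URLs: {sorted(pages)}")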

 

2. Duplicate content is one of the most serious issues in SEO and website management, and URL parameters are a frequent cause: every parameter combination is a distinct URL, so the same content can be reachable at several addresses. A high number of internal duplicates is detrimental to indexing, because Google filters the variants against each other and splits ranking signals between them instead of consolidating them on one URL.

URL: http://example.com/page1

URL: http://example.com/page2

URL: http://example.com/page3

Resulting content: identical on all three URLs, so only http://example.com/page1 should be indexed
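
One way to surface internal duplicates is to fetch each URL and hash the response body: URLs whose hashes collide are serving identical content. A minimal sketch using only Python's standard library (the URL list is hypothetical):

    import hashlib
    import urllib.request
    from collections import defaultdict

    urls = [
        "http://example.com/page1",
        "http://example.com/page2",
        "http://example.com/page3",
    ]

    urls_by_hash = defaultdict(list)
    for url in urls:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read()
        urls_by_hash[hashlib.sha256(body).hexdigest()].append(url)

    # Any hash shared by two or more URLs marks a duplicate group.
    for digest, group in urls_by_hash.items():
        if len(group) > 1:
            print(f"Duplicate content served at: {group}")

Exact hashing only catches byte-identical duplicates; near-duplicates need fuzzier comparison, but this is enough to catch parameter variants of the same page.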

 

3. A high number of 404s can mean that a large portion of your site's URLs are broken, often because a malformed or outdated parameter has turned a valid address into a dead one. Google has stated that 404s themselves are a normal part of the web, but a site full of broken internal links wastes crawl budget and loses the link equity pointed at the dead URLs.

URL: http://www.example.com/help

Result: 404 Not Found

Effect on SEO: the link authority, quality content and internal links that other pages on the same site point at the dead URL are all wasted. A large volume of 404s also makes crawling inefficient, because Googlebot spends time on dead URLs instead of on pages that can actually be indexed.
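
Auditing for this is straightforward: request each known URL and record its status code. A minimal sketch using Python's standard library (the URL list is hypothetical):

    import urllib.request
    from urllib.error import HTTPError, URLError

    urls = [
        "http://www.example.com/help",
        "http://www.example.com/contact",
    ]

    for url in urls:
        try:
            # HEAD is enough to check the status without downloading the body.
            req = urllib.request.Request(url, method="HEAD")
            with urllib.request.urlopen(req, timeout=10) as resp:
                status = resp.status
        except HTTPError as err:
            status = err.code  # e.g. 404
        except URLError as err:
            status = f"unreachable ({err.reason})"
        if status != 200:
            print(f"{url} -> {status}")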

 

4. Related content is an important SEO factor, so ensuring related articles are linked together with the right anchor text is crucial.

URL: http://example.com/keyword1

Result: 404 Not Found

Effect on SEO: the broken link leads to a 404 page, so the PageRank it would have passed is lost and the internal-linking signals between the related pages are weakened. It also confuses users: if articles on the same topic are not linked together, readers may assume the "related" pieces are independent of each other, click the wrong link, and run into further 404s.
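
To catch broken related-article links before Googlebot does, you can extract the anchors from a page and test each target. A minimal sketch using Python's standard library (the page URL is hypothetical):

    import urllib.request
    from html.parser import HTMLParser
    from urllib.error import HTTPError
    from urllib.parse import urljoin

    class LinkCollector(HTMLParser):
        """Collects the href of every <a> tag on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    page = "http://example.com/article"
    with urllib.request.urlopen(page, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")

    collector = LinkCollector()
    collector.feed(html)

    for href in collector.links:
        target = urljoin(page, href)  # resolve relative links
        try:
            req = urllib.request.Request(target, method="HEAD")
            urllib.request.urlopen(req, timeout=10)
        except HTTPError as err:
            if err.code == 404:
                print(f"Broken related link: {target}")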

 

5. An unclear URL structure, such as tracking parameters appended to every internal link, can cause Googlebot to crawl endless variants of the same page or to skip some pages entirely, which may lead to significant SEO issues over time.

URL: http://example.com/page1?track=true&utm_source=google&utm_medium=organic

Effect on SEO: to Googlebot, every distinct combination of parameters is a separate URL. Without a canonical pointing back at the clean address, Google may not index the page you intended, and links to the parameterised variants split PageRank between them instead of consolidating it. Used at scale, dynamic parameters like these also force Google to crawl each combination as a separate page, which wastes the site's crawl budget.
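
Canonicalising such URLs is largely a matter of stripping pure tracking parameters before the address is linked, logged or compared. A minimal Python sketch; the parameter blocklist is an assumption and should be adjusted to your own tagging scheme:

    from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

    # Assumed blocklist of parameters that never change the page content.
    TRACKING_PARAMS = {"track", "utm_source", "utm_medium", "utm_campaign"}

    def canonicalise(url: str) -> str:
        parts = urlparse(url)
        # Keep only the parameters that actually select different content.
        kept = [(k, v) for k, v in parse_qsl(parts.query)
                if k not in TRACKING_PARAMS]
        return urlunparse(parts._replace(query=urlencode(kept)))

    print(canonicalise(
        "http://example.com/page1?track=true&utm_source=google&utm_medium=organic"
    ))
    # -> http://example.com/page1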

 

6. Incorrectly configured URL parameters can lead to an unnecessary waste of Googlebot's crawl budget.

URL: http://example.com/page1?utm_source=google&utm_medium=organic

Effect on SEO: Googlebot treats every parameter combination it discovers as another page to crawl and recrawl. If internal links, sitemaps and redirects are inconsistent about which version of a URL they reference, the crawler spends its budget revisiting near-identical variants instead of fetching the content you actually want indexed.
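
A crawl log makes the waste visible: group the requested URLs by path and count how many parameter variants Googlebot fetched for each. A minimal Python sketch (the crawled URLs are hypothetical):

    from collections import Counter
    from urllib.parse import urlparse

    # Hypothetical URLs extracted from a server log of Googlebot requests.
    crawled = [
        "http://example.com/page1?utm_source=google&utm_medium=organic",
        "http://example.com/page1?utm_source=newsletter",
        "http://example.com/page1",
        "http://example.com/page2",
    ]

    variants_per_path = Counter(urlparse(url).path for url in crawled)

    # Paths fetched more than once indicate budget spent on parameter variants.
    for path, hits in variants_per_path.most_common():
        if hits > 1:
            print(f"{path}: {hits} crawled variants")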

 

7. Using URL parameters for tracking purposes can cause significant issues when it comes to identifying individual visitors and their activities on your website. Finding relevant information in the logs relies on consistent, correctly configured parameter naming; if campaigns are tagged inconsistently, visits either go missing or cannot be traced back to their source.
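
The usual remedy is to normalise the tracking parameters as you read the logs, so that every visit resolves to a source. A minimal Python sketch (the logged URLs are hypothetical):

    from collections import Counter
    from urllib.parse import parse_qs, urlparse

    # Hypothetical request URLs pulled from an access log.
    logged = [
        "http://example.com/page1?utm_source=google&utm_medium=organic",
        "http://example.com/page1?utm_source=Google",  # inconsistent casing
        "http://example.com/page1",                    # untagged visit
    ]

    visits_by_source = Counter()
    for url in logged:
        params = parse_qs(urlparse(url).query)
        # Lower-case the value so "Google" and "google" count as one source.
        source = params.get("utm_source", ["(untagged)"])[0].lower()
        visits_by_source[source] += 1

    print(visits_by_source)  # Counter({'google': 2, '(untagged)': 1})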

 

8. Over-optimised or poorly optimised URLs can cause a range of problems, from lower CTRs in SERPs to weaker conversion and user-retention rates, all of which eventually erode the site's organic traffic and search visibility.

 
