SEO Solutions to Tame URL Parameters

You have six tools in your SEO arsenal to deal with URL parameters on a strategic level.

1) Google Search Console's URL Parameters tool (to tell Google how to treat www vs non-www, static vs dynamic, and variable query-string URLs)

2) Canonical URLs (to deal with multiple versions of the same page)

3) Rel=canonical tags (for URL parameter handling without touching .htaccess)

4) 301 redirects (to send variable URLs to their canonical URLs; this works for different file types and directories as well)

5) Robots.txt directives (to stop search engine spiders from crawling certain directories at all)

6) Rel="nofollow" attributes on links pointing to known spammy or questionable pages that contain one or more URL parameters
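The canonicalization idea behind tools 2-4 can be sketched in a few lines of code. The sketch below is illustrative only: the list of tracking parameters is an assumption, and you would adapt it to whichever parameters never change page content on your own site.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that only track visits and never change the page content.
# (Hypothetical list - adjust it for your own site.)
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "ref"}

def canonical_url(url: str) -> str:
    """Return a canonical form of a URL: lowercase host, tracking
    parameters stripped, remaining parameters sorted into a stable order."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
             if k not in TRACKING_PARAMS]
    query.sort()
    return urlunsplit((parts.scheme, parts.netloc.lower(),
                       parts.path, urlencode(query), ""))
```

The value this function returns is what you would put into the rel=canonical tag, or use as the target of a 301 redirect, so that every parameter ordering of the same page resolves to one URL.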

There are no shortcuts to get the job done. You will need to invest time, testing and effort in making your URL structure user friendly for search engine crawlers (spiders).

The one thing you should not do is block all URLs with variables outright, for example with a blanket robots.txt rule like "Disallow: /*?*" or a catch-all RewriteCond in .htaccess. Such a setting has a severe impact on the usability of dynamic websites for humans and can easily harm your business with no SEO benefit at all. On the other hand, it is easy to end up with a technically poor URL structure that works well within your system boundaries but fails outside of them because search engines cannot handle its parameters properly, and that type of failure will hurt your rankings far more than having a few dynamic parameters in place.
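To see why a blanket wildcard Disallow rule is so dangerous, it helps to trace what it matches. The snippet below is a simplified sketch of robots.txt path matching (the real standard has more rules; here only "*" and the "$" end anchor are handled, and "?" is a literal character):

```python
import re

def robots_match(pattern: str, path: str) -> bool:
    """Minimal robots.txt path matcher: '*' matches any sequence,
    '$' anchors the end of the path; everything else, including '?',
    is matched literally, and rules match as prefixes."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.match(regex, path) is not None
```

A rule like "/*?*" matches every path containing a query string, so every dynamic page on the site disappears from compliant crawlers at once, including the ones you want indexed.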

This article aims to answer the question: "How should I deal with URL parameters on a technical level to get it right?"

The outlook for "on-page" SEO is mixed. On the one hand, Google has stated that URLs are taken into consideration when indexing websites, so you may want meaningful URLs that include your primary keywords. On the other hand, Google itself recommends not obsessing over secondary keywords in the URLs of main content pages. And if you have hundreds of products, each typically reached via product ID, category ID and site ID plus parameters, that scenario cannot be optimised well with the six tools above due to technical limitations.

If you have hundreds of pages indexed with exact-match keywords in URL parameters, that alone can cause problems at SERP level. Google uses a process called parameter handling to deal with URLs that include parameters, and heavily parameterised URLs are also hard to surface through natural language queries, so their visibility may suffer as voice search becomes more relevant. As a rule of thumb: use at most 3-5 URL parameters on your main content pages, and only if they are meaningful and helpful for SEO.
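An audit script can flag pages that break this rule of thumb. A minimal sketch, where the limit of five is simply the upper end of the 3-5 guideline above:

```python
from urllib.parse import urlsplit, parse_qsl

MAX_PARAMS = 5  # upper bound taken from the 3-5 rule of thumb

def too_many_params(url: str) -> bool:
    """Flag URLs whose query string carries more parameters
    than the rule of thumb allows."""
    params = parse_qsl(urlsplit(url).query, keep_blank_values=True)
    return len(params) > MAX_PARAMS
```

Run it over an export of your indexed URLs (for example from a crawler or your server logs) and review whichever pages it flags.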

Useful Tips & Tools:

1) Screaming Frog for crawling your site, plus Google Search Console's "Fetch as Google" feature for URL parameter debugging

2) Use analytics to find out what users do with your dynamic URLs, especially on a large website. Then adapt the naming scheme of your URLs accordingly to improve your rankings - and, obviously, implement 301 redirects from the old URLs to the new ones.

3) Use the free Moz check tool to see how well a given page ranks for a list of keywords, i.e. without URL parameters in place.

4) Rank checker tools like Monitor Backlinks provide interesting insight into multiple versions of the same URL and their corresponding ranking positions.

5) SEO for Firefox - Highlight URL Parameters is a useful plugin for highlighting parameters in search results.

6) PageSpeed Insights provides insight into site-wide issues that affect your website's performance, including parameter handling.

7) You can use Google's Structured Data Testing Tool to check whether Google understands the meaning of the URL parameters used on your website.

8) If you are technically inclined, you can set up a test site with different templates to debug URL parameter behaviour using the Apache module mod_rewrite.
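Tips 2 and 8 come together in practice when you migrate old parameterised URLs to new keyword-friendly ones: you need a 301 rule per legacy URL. Here is a hypothetical sketch that renders an old-to-new mapping as mod_rewrite rules (the product URLs and target paths are invented for illustration). Note that RewriteRule alone never sees the query string, which is why each parameterised URL needs a RewriteCond:

```python
from urllib.parse import urlsplit
import re

# Hypothetical mapping from legacy parameterised URLs to new paths.
REDIRECTS = {
    "/product.php?id=42": "/shoes/red-runner",
    "/product.php?id=43": "/shoes/blue-walker",
}

def htaccess_rules(redirects):
    """Render old URL -> new URL pairs as mod_rewrite 301 rules."""
    lines = ["RewriteEngine On"]
    for old, new in sorted(redirects.items()):
        parts = urlsplit(old)
        if parts.query:
            # Match the exact query string of the legacy URL.
            lines.append(f"RewriteCond %{{QUERY_STRING}} ^{re.escape(parts.query)}$")
        # The trailing '?' on the target drops the old query string.
        lines.append(f"RewriteRule ^{re.escape(parts.path.lstrip('/'))}$ {new}? [R=301,L]")
    return "\n".join(lines)

print(htaccess_rules(REDIRECTS))
```

Generating the rules from a mapping keeps the redirect table in one reviewable place instead of hand-editing .htaccess for every product.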

(This article was written with Karen Henson - thanks Karen!)

Please feel free to share this article via social media using the share buttons below & please contact us if you have questions about implementing these guidelines on your website.

By Christian Henschel & Marc Stober - @henschmarc - Google+

"SEO copywriting is often seen as an 'add-on' function or service, but it should be considered a core part of any online marketing activity. I am one of many who consider SEO copywriting an artform in itself; that is why MozCon was such a highlight for me this year - I love to learn from the best and look for new ways to move up in the SERPs."

SEO specialist Marc Stober reveals his favourite parts of the recent MozCon event


MozCon 2018 has kicked off in Seattle. The three-day conference began yesterday with a keynote by Rand Fishkin and featured presentations from other web celebrities such as Wil Reynolds, Dr Pete Meyers and Barry Schwartz, to name a few. Day 1 ended with a bang when Roger Daltrey took the stage shortly after 8 pm to sing hits from The Who and other rock classics. Today, day 2 promises even more speakers, and I'm quite sure we will see some new trends in SEO and online marketing emerge over the coming days.

My personal highlight of day 1 was Karen Henson from Google, who gave a presentation on using voice search to drive traffic to your website. Google's Search Liaison Howard Shimmel gave a keynote earlier today discussing how machine learning is impacting search results, but Karen's session shed more light on what marketers can do to get their content found by users seeking answers via voice commands on Google Assistant or Siri.
