An SEO Guide to URL Parameter Handling

URL parameters create duplicate content, waste crawl budget, and dilute ranking signals. Learn six ways to avoid potential SEO issues with URL parameters.


What is a URL parameter?

A URL parameter (also called a query string) is the portion of a URL that follows a question mark, made up of key=value pairs separated by ampersands, for example https://example.com/widgets?color=blue&sort=price. Some parameters change the content a page serves (filtering, sorting, pagination), while others exist only to pass data along. Tracking parameters are the most common example of the latter: they carry campaign data to analytics tools (for example, ?utm_source=newsletter&utm_campaign=spring_sale) and have no effect on the page content itself.

As an SEO professional, I recommend using Google's Campaign URL Builder to generate properly formatted tracking links for Google Analytics rather than constructing them by hand.
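You can also assemble tracking links programmatically. The sketch below uses Python's standard urllib.parse module; the base URL and campaign values are illustrative:

```python
from urllib.parse import urlencode

# Illustrative base URL and UTM campaign values
base = "https://example.com/widgets"
params = {
    "utm_source": "newsletter",    # where the traffic comes from
    "utm_medium": "email",         # the marketing medium
    "utm_campaign": "spring_sale", # the campaign name
}

# urlencode joins key=value pairs with "&" and percent-encodes as needed
tracking_url = base + "?" + urlencode(params)
print(tracking_url)
```

Generating links this way keeps the parameter order and encoding consistent, which makes the resulting URLs easier to deduplicate later.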


Why are URL parameters bad?

As an SEO professional, I can't emphasize enough how important it is to keep URLs free of clutter if you want Google (and other search engines) to crawl and index them efficiently. Parameterized URLs don't just clutter your link profile; because they often serve identical page content, they can compete with each other in search results. For example, say you own a site about widgets and one of your pages targets the keyword "blue widgets." With an unoptimized internal linking structure, all four of these URLs (illustrative examples) could serve the same content and compete for that keyword:

https://example.com/widgets?color=blue
https://example.com/widgets?color=blue&sessionid=123
https://example.com/widgets?color=blue&utm_source=twitter
https://example.com/widgets?color=blue&sort=price

A clean internal linking structure fixes most of this, but there are several pitfalls to be aware of when creating or monitoring parameterized URLs:

Duplicate content - When several parameterized URLs serve the same page, search engines may index all of them, splitting ranking signals across duplicates that then compete with one another in search results. Duplicate content isn't limited to tracking parameters, either: internal search results pages, filtered category listings, and sorted views (for example, ?keyword=widgets) can all generate large numbers of near-identical crawlable URLs. I recommend checking Google Search Console (formerly Google Webmaster Tools) for signs of duplicate content caused by parameterized URLs.
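A standard fix for duplicate parameterized URLs is a canonical tag: every variant declares the clean URL as the preferred version, so search engines consolidate signals there. A minimal sketch, with an illustrative URL:

```html
<!-- Served in the <head> of every parameterized variant of the widgets page -->
<link rel="canonical" href="https://example.com/widgets" />
```

Note that rel="canonical" is a hint rather than a directive; search engines usually honor it, but it does not stop the variants from being crawled.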

Crawl budget waste - Google allocates a finite crawl budget to each site, and every parameterized duplicate it crawls spends budget that could have gone to pages you actually want indexed. If ten tracking variants of the same widget page all get crawled, search engines will typically consolidate on the clean, unparameterized URL and discount the variants, so the extra crawling buys you nothing. On large sites, this waste can delay the discovery and indexing of genuinely new content, and links pointing at the parameterized variants pass less value than links pointing at the clean URL.
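To see how tracking parameters multiply crawlable URLs, here is a sketch in Python (standard library only) that strips a set of assumed tracking-parameter names and shows several variants collapsing to one clean URL:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameter names treated as tracking noise -- an illustrative list, not exhaustive
TRACKING_KEYS = {"utm_source", "utm_medium", "utm_campaign", "gclid", "sessionid"}

def canonicalize(url):
    """Return the URL with tracking parameters and fragment removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_KEYS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

urls = [
    "https://example.com/widgets?utm_source=twitter",
    "https://example.com/widgets?sessionid=abc123",
    "https://example.com/widgets",
]

# All three variants collapse to a single canonical URL
unique = {canonicalize(u) for u in urls}
print(unique)
```

A crawler sees three distinct URLs here; after normalization there is only one page worth crawling, which is exactly the consolidation you want search engines to perform.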

Diluted ranking signals - Search engines treat each distinct URL, parameters included, as a separate page, whether or not those parameters matter to the content. For example, a URL tagged with a utm_source tracking parameter can get crawled and indexed on its own, assuming it is linked from an appropriate place in your site architecture and there are no other signals (like thin content) suggesting it shouldn't rank. It will then compete with the clean version of the same page, splitting links and ranking signals between the two.

There are several ways to keep parameterized URLs under control. Use rel="canonical" tags so duplicate variants point search engines at the preferred URL and ranking signals consolidate there. Avoid session IDs in URLs entirely; Google recommends using cookies for session state instead. URL structures also tend to change during website development or redesigns, so use 301 redirects whenever URLs move to ensure the right content stays indexed. For parameters that serve no purpose to search engines, such as tracking codes, you can block crawling with Disallow rules in your robots.txt file. Be careful, though, not to block parameters that are essential to your site's function, such as pagination or filters that surface unique content. The full range of what can go into a parameterized URL is extensive, but these elements are common across most implementations; for more detail, see Google's documentation on crawling and URL structure.
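As a sketch, a robots.txt rule set blocking tracking and session-ID URLs might look like the following. The parameter names are illustrative, and the * wildcard is an extension honored by Google and most major crawlers rather than part of the original robots.txt standard:

```
User-agent: *
# Block crawling of session-ID and tracking-parameter URLs (names illustrative)
Disallow: /*?sessionid=
Disallow: /*?*utm_
```

Remember that robots.txt prevents crawling, not indexing: a blocked URL can still appear in results if other sites link to it, which is why canonical tags remain the primary consolidation tool.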

URLs seem simple enough on their own, so why go to the trouble of managing custom URLs with multiple parameters, let alone hundreds or thousands of them? Because, used carefully, URL parameters provide real flexibility in content publishing: they support dynamic, customizable pages, and with proper canonicalization and redirects in place, they need not create duplicate content problems at all.
