URL parameters can cause a number of SEO problems. For example, they can create duplicate content, waste crawl budget, and dilute ranking signals. We’ll tell you several ways to avoid potential SEO issues with URL parameters. If you need help adjusting your URL parameters so they don’t hurt your SEO, call the experts at SEO Design Chicago today!
Definition of URL Parameters
URL parameters, which are also sometimes called query strings or URL variables, are a way to structure additional information for a given URL. They are the portion of the URL that follows a question mark. They consist of a key and a value pair, separated by an equal sign. You can add multiple parameters to a single page by using an ampersand.
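As an illustrative sketch, the key/value structure can be seen with Python's standard `urllib.parse` module (the example.com URL below is hypothetical):

```python
from urllib.parse import urlparse, parse_qs

# An illustrative product-listing URL with two parameters joined by "&"
url = "https://example.com/dresses?color=blue&sort=price-asc"

parsed = urlparse(url)
params = parse_qs(parsed.query)  # the query string is everything after the "?"

print(parsed.path)   # /dresses
print(params)        # {'color': ['blue'], 'sort': ['price-asc']}
```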
Web developers and analytics experts love URL parameters. And they do play an important part in your website’s user experience, or UX. However, URL parameters can create thousands of URL variations out of the same content.
Most Common Uses for URL Parameters
Here are the most common use cases for URL parameters.
Tracking
URL parameters are commonly used to track traffic that came via a specific advertising campaign, button click, or social media post. For example, the Google URL Builder allows you to track your marketing campaigns.
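A tracked campaign URL in the style of Google's URL Builder can be sketched like this; the campaign values and domain are hypothetical:

```python
from urllib.parse import urlencode

# Hypothetical campaign values, in the style of Google's URL Builder (UTM tags)
base = "https://example.com/spring-sale"
utm = {
    "utm_source": "newsletter",
    "utm_medium": "email",
    "utm_campaign": "spring_sale",
}
tracked_url = f"{base}?{urlencode(utm)}"
print(tracked_url)
# https://example.com/spring-sale?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale
```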
Sorting and Filtering
Parameters are frequently used on large e-commerce websites in order to dynamically generate a page with the desired sorting or filtering applied. For example, you can sort dresses or hotels.
Paginating
Parameters are used for identifying multiple pages of archive or search results.
Searching
Parameters pass through the search queries that someone used in a site search.
Translating
Parameters can set language options.
Identifying
Parameters pass through details of a product.
How URL Parameters Cause SEO Issues
There are a few different ways that using URL parameters can cause SEO issues.
Create Duplicate Content
Typically, URL parameters do not change the content of a web page in a significant way. A re-ordered version of the page is not that different from the original, and a page URL with tracking tags or a session ID is identical to the original. For example, each of the following illustrative URLs would return basically the same collection of widgets:

- Static URL: https://example.com/widgets
- Tracking parameter: https://example.com/widgets?sessionid=123
- Reordering parameter: https://example.com/widgets?sort=price
- Identifying parameter: https://example.com/products?category=widgets
- Searching parameter: https://example.com/search?query=widgets

That is five URLs for basically the same content. Now, if you do this for every category on your website, they really add up.
However, search engines treat every parameter-based URL as a new web page. So they see multiple variations of the same page, which all serve duplicate content and target the same keyword or phrase.
This duplicate content can lead to keyword cannibalization and, in a worst-case scenario, can even cause your web pages to be completely filtered out of the search results. Google may also downgrade its view of your overall site quality.
Waste Crawl Budget
Google and other search engines have a “crawl budget” that limits how much of your website they will crawl. Redundant parameter pages drain that budget, which reduces your website’s ability to get important pages indexed. Google itself suggests that you keep a simple URL structure for your website.
Split Page Ranking Signals
If you have multiple versions of the same web page content, links and social shares will be coming in on various versions. This splits and dilutes your ranking signals. When you confuse a crawler, it will not know which of the competing pages to index for the search query.
Make URLs Less Clickable
Let’s be real: parameter URLs don’t look good. They’re not easy to read. They don’t appear as trustworthy links. So, they are much less likely to garner a click on social media, in emails, when copy-pasted into forums, and anywhere else the full URL ends up being displayed. Fewer clicks hurt your click-through rate and page performance, which can decrease your page ranking. Plus, every click you get matters, and poor URL readability can lead to a decrease in brand engagement.
How to Know if URL Parameters are Affecting Your SEO
If you’re concerned about URL parameters negatively affecting your SEO, there are a few ways you can check if it’s a real problem for your website.
Run a Crawler
You can use a crawler tool like Screaming Frog and search for “?” in the URL.
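The same check can be sketched in Python against a list of crawled URLs (for example, one exported from a crawler tool); the URLs below are illustrative:

```python
# A minimal sketch: given a list of crawled URLs (e.g. exported from a
# crawler such as Screaming Frog), flag the parameter-based ones.
crawled = [
    "https://example.com/dresses",
    "https://example.com/dresses?sort=price-asc",
    "https://example.com/hotels?sessionid=abc123",
]

parameter_urls = [u for u in crawled if "?" in u]
print(parameter_urls)
```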
Check the Google Search Console
One of the many ways you can use the Google Search Console is to use the URL Parameters Tool. Google auto-adds the query strings it finds here.
Review Your Log Files
Check if Googlebot is crawling parameter-based URLs.
Search with site: inurl: advanced operators
Check how Google is indexing the parameters you found by searching site:yourdomain.com inurl:key for each parameter key.
Look in Google Analytics
Check the Google Analytics All Pages report and search for “?” to see how each of the parameters you found are used by users. Make sure that URL query parameters have not been excluded in the view setting.
Hire the Experts to Help
If you prefer, you can hire the expert web developers at SEO Design Chicago to find any SEO issues related to URL parameters and help you fix them.
How to Avoid SEO Issues with URL Parameters
There are a few ways you can deal with URL parameters and manage your SEO. We will tell you about each strategy and the pros and cons of each one.
Limit Parameter-Based URLs
First, conduct a review of how and why your parameters generate. This will help you find ways to reduce the number of parameter URLs and minimize the negative impact on your SEO. Here are four common issues to check for.
Eliminate Unnecessary Parameters
Check with your developer and get a list of every website parameter and its function. You will most likely find parameters that no longer perform a valuable or necessary function. For example, you can identify users with cookies instead of sessionIDs. However, you might still have the sessionID parameter on your website. Or, you might find out that a filter in your faceted navigation is rarely, if ever, applied by your users. Any unnecessary parameters should be eliminated right away.
Prevent Empty Values
You should only add URL parameters to a URL when they have a function. Do not add parameter keys if the value is blank.
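A minimal Python sketch of stripping blank-valued keys before a URL is emitted, using only the standard library (the URL and parameter names are hypothetical):

```python
from urllib.parse import urlencode, parse_qsl, urlsplit, urlunsplit

def strip_empty_params(url):
    """Drop parameter keys whose value is blank (e.g. ?color=)."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query, keep_blank_values=True) if v]
    return urlunsplit((scheme, netloc, path, urlencode(kept), fragment))

print(strip_empty_params("https://example.com/dresses?color=&sort=price-asc"))
# https://example.com/dresses?sort=price-asc
```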
Use Keys Only Once
Make sure to avoid applying multiple parameters with the same parameter name and a different value. For multi-select options, it is advised that you combine the values together after a single key.
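For example, rather than repeating the key, the selected values can be joined after a single key; the comma separator shown here is one common convention, not the only one:

```python
# Instead of repeating a key (?color=blue&color=red), combine the
# selected values after a single key.
selections = ["blue", "red"]
query = "color=" + ",".join(selections)
print(query)  # color=blue,red
```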
Order URL Parameters
Search engines treat pages as equal if the same URL parameters are simply rearranged, so parameter order doesn’t matter as far as duplicate content goes. However, each of those combinations still burns crawl budget and splits ranking signals. Avoid these problems by having your web developer write a script to always place parameters in a consistent order, no matter how the user selected them. It is best to start with translating parameters, then identifying, then pagination, then layering on filtering, reordering, or search parameters, and finally tracking.
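Such an ordering script can be sketched in Python with the standard library; the parameter names and their precedence below are hypothetical and would need to match your own site:

```python
from urllib.parse import urlencode, parse_qsl, urlsplit, urlunsplit

# Hypothetical precedence: translate, identify, paginate,
# filter/sort/search, and finally tracking parameters.
PARAM_ORDER = ["lang", "category", "page", "color", "sort", "query", "utm_source"]

def normalize_param_order(url):
    """Rewrite the query string so parameters always appear in one fixed order."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    params = sorted(
        parse_qsl(query),
        key=lambda kv: PARAM_ORDER.index(kv[0]) if kv[0] in PARAM_ORDER else len(PARAM_ORDER),
    )
    return urlunsplit((scheme, netloc, path, urlencode(params), fragment))

print(normalize_param_order("https://example.com/dresses?sort=price-asc&lang=en&color=blue"))
# https://example.com/dresses?lang=en&color=blue&sort=price-asc
```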
This strategy allows for a more efficient use of the crawl budget, reduces duplicate content issues, consolidates ranking signals to fewer pages and is suitable for all parameter types. The only con is that it can take a little time to implement.
Rel=”Canonical” Link Attribute
The rel=”canonical” link attribute calls out that a page has identical or similar content to another page. This tells search engines to consolidate the ranking signals to the URL that is specified as canonical. You can rel=canonical your parameter-based URLs to your SEO-friendly URL for tracking, identifying, or re-ordering parameters. However, this is not the right strategy for when the content on the parameter page is not close enough to the canonical, like for pagination, searching, translation, or some filtering parameters.
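As an illustrative fragment (the example.com URLs are hypothetical), the parameter page would carry this in its head section:

```html
<!-- On https://example.com/dresses?sessionid=abc123 -->
<link rel="canonical" href="https://example.com/dresses" />
```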
This is a helpful strategy to use because it is relatively easy to implement, helps safeguard against duplicate content issues, and consolidates ranking signals to the canonical URL. However, it wastes crawl budget, isn’t suitable for all parameter types, and is usually taken by search engines as a hint, rather than a directive.
Meta Robots Noindex Tag
You can set a “noindex” directive for any parameter-based web page that doesn’t add any SEO value. This tag stops search engines from indexing the page. URLs with a noindex tag will also be crawled less often, and if the tag stays in place long enough, it will eventually lead Google to nofollow the page’s links.
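The directive is a one-line fragment in the head of the parameter page:

```html
<!-- In the <head> of a parameter page that should not be indexed -->
<meta name="robots" content="noindex" />
```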
This strategy is relatively easy to implement, safeguards against duplicate content problems, and removes existing parameter-based URLs from the index. However, it doesn’t entirely prevent search engines from crawling URLs, doesn’t consolidate ranking signals, and is typically interpreted by search engines as a hint rather than a directive.
Robots.txt Disallow
The robots.txt file is what search engines check before crawling your website. The search engine will not visit disallowed pages. You can use this file to block crawler access to every parameter-based URL, or only to specific query strings you don’t want to be indexed.
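For example, a robots.txt file along these lines blocks parameter URLs via wildcard patterns (the sessionid key is illustrative):

```
User-agent: *
# Block every parameter-based URL
Disallow: /*?*
# Or, instead, block only a specific query string, e.g. session IDs
Disallow: /*?*sessionid=
```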
This strategy is fairly easy to implement, allows more efficient use of your crawl budget, avoids duplicate content issues, and is suitable for all parameter types you don’t wish to be crawled by the search engine. However, it doesn’t consolidate ranking signals and doesn’t remove existing URLs from the index.
URL Parameter Tool in the Google Search Console
You can configure Google’s URL parameter tool to tell crawlers the purpose of your parameters and how you would like them to be handled. Google Search Console will warn you that using the tool “could result in many pages disappearing from a search.” While this might sound bad, it is much worse to have a large number of duplicate pages hurting your website’s ability to rank high. That is why we recommend learning how to configure URL parameters in the Google Search Console. Ask yourself how the parameter impacts the page’s content.
How to Configure Parameters in the GSC
- Tracking parameters do not affect the page’s content, so configure them as “representative URLs.”
- Configure parameters that reorder page content as “sorts.” If this is added by the user optionally, set crawl to “No URLs.” If a sort parameter is applied by default, use “Only URLs with value” and enter the default value.
- Next, configure parameters that filter the page down to a subset of content as “narrows.” If these filters are not relevant for search engines, set crawl to “No URLs.” If they are relevant for SEO, set to “Every URL.”
- Configure parameters that show a certain piece or group of content as “specifies.” Ideally, this should be a static URL. If not, you will want to set it to “Every URL.”
- Configure parameters that display a translated version of the content as “translates.” Ideally, you should use subfolders for translation. If that’s not possible, you will want to set it to “Every URL.”
- Configure parameters that display a component page of a longer sequence as “paginates.” If you have achieved efficient indexation with XML sitemaps, you can save your crawl budget and set crawl to “No URLs.” If not, set it to “Every URL” to help crawlers reach all of the items.
Google will automatically add parameters to the list under the default “Let Googlebot decide.” However, you can’t remove these, even if the parameter no longer exists. So whenever you can, try to proactively add parameters yourself. That way, if a parameter no longer exists, you can delete it from the Google Search Console.
Another tip: for any parameter you set in the Google Search Console to “No URL,” you should also add it in Bing’s ignore URL parameters tool.
This strategy allows for a more efficient use of your crawl budget, will most likely safeguard against duplicate content issues, and is suitable for all parameter types. However, it doesn’t consolidate ranking signals and is interpreted by Google as a hint rather than a directive.
Switch from Dynamic to Static URLs
Many people decide that the best way to avoid URL parameters hurting their SEO is to simply not use them. Subfolders are better than parameters at helping Google understand your site structure, and keyword-based URLs are an important aspect of on-page SEO.
To switch from dynamic to static URLs, use server-side URL rewrites to convert parameters into subfolder URLs. This strategy works for descriptive keyword-based parameters, like those that identify categories, products, or filters for search engine relevant attributes. It also works for translated content.
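As an illustrative sketch of such a rewrite, assuming an Apache server with mod_rewrite (the paths and parameter names are hypothetical):

```apache
# Serve the static path /dresses/blue/ from the underlying
# dynamic URL /products?category=dresses&color=blue
RewriteEngine On
RewriteRule ^dresses/([a-z]+)/?$ /products?category=dresses&color=$1 [L]
```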
However, it is not helpful for non-keyword relevant elements of faceted navigation, like price. It is also problematic for searching parameters, pagination, reordering, and tracking. Google Analytics will not acknowledge a static version of a URL parameter.
Furthermore, replacing dynamic parameters with static URLs for pagination, onsite search box results, or sorting does not help with duplicate content, crawl budget, or internal link equity dilution.
For many websites, it is impossible to avoid using URL parameters, especially if you want to provide an excellent user experience. So, for parameters that you don’t want to be indexed in search results, implement them as query strings. For parameters that you do want to be indexed, use static URL paths.
Best Practices for URL Parameter Handling for SEO
Obviously, you can’t utilize all six of these SEO strategies for your URL parameters at once. They conflict with each other, and using them all would be unnecessarily complex. There is no universal perfect solution because every website is different and has unique needs. If you need help deciding which strategy or combination of strategies is best for your website to implement, contact SEO Design Chicago today.
Avoid SEO Issues with URL Parameters with SEO Design Chicago
If you need help implementing strategies on your website to ensure that URL parameters don’t negatively affect your SEO, contact SEO Design Chicago today! Our web developers are knowledgeable in SEO best practices and can help you make sure your site is ready to rank high.
- What are URL parameters?
- What are the most common uses for URL parameters?
- How can I avoid SEO issues with URL parameters?
- How do I know if URL parameters are affecting my SEO?
- Should I not use URL parameters?