Common Causes of Complex URLs - Best Practices for SEO Friendly URLs

Posted On: Sep 14, 2018

Categories: Prestashop SEO

Each page or file on your website has a URL that represents it on the internet. Different types of URLs exist, for example:

Ugly/Complex URLs

These are URLs that contain different parameters, such as session IDs, tags, search operators, and sorting parameters. Let us discuss some examples and causes of ugly/complex web page addresses:


You can get rid of ugly/complex URLs by installing the Prestashop Friendly URL module.

Common Causes of Ugly URLs


Some websites offer search filters that help users narrow down their results; for example, you may search for terms like ‘hotels on the beach’ or ‘hotels on the beach with a fitness centre’. If a hotel on the beach also has a fitness centre, both of these searches may show it in the results. The real problem is that the hotel page will then have more than one URL, making it difficult for search engines to identify the correct version. Here are some examples:

SEO URL Filters
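As a hypothetical illustration (example.com and the parameter names are placeholders, not the original examples), the two filter URLs might look like:

```
https://example.com/hotels?filter=beach
https://example.com/hotels?filter=beach&amenity=fitness-centre
```

Both could resolve to the very same hotel listing page.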

These two URLs actually point to a single web page. Similarly, combinations of search filters can explode one page into many redundant URLs, causing a severe problem for search engines, which need to choose a small number of results to show.


Parameters like session IDs, search operators, and tags may produce a massive amount of duplication. Here are a few examples:
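A hypothetical illustration (the domain and session IDs are placeholders): the same product page reached under two different session IDs becomes two distinct URLs to a crawler:

```
https://example.com/product/123?sessionid=8f2a9c
https://example.com/product/123?sessionid=77d1b4
```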

Sorting Parameters

eCommerce websites allow users to sort products in different ways, which can produce a greater number of duplicate URLs. Here is an example:
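A hypothetical illustration (placeholder domain and parameter names): each sort order creates a new URL for what is essentially the same category page:

```
https://example.com/category/shoes?sort=price-asc
https://example.com/category/shoes?sort=name-desc
```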

Invalid Parameters

These are unnecessary URL parameters, e.g., referral parameters, which produce multiple URLs pointing to the same page. Here is an example:
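A hypothetical illustration (placeholder domain and tracking parameters): referral and campaign parameters multiply the URLs for a single product page:

```
https://example.com/product/123
https://example.com/product/123?ref=newsletter
https://example.com/product/123?utm_source=facebook
```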

Calendar Issues

A dynamically generated calendar may produce links to future and past dates, again creating a massive number of duplicate links to the same page. Here is an example:
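A hypothetical illustration (placeholder domain and date parameter): a calendar with endless "next month" links lets crawlers walk into dates that will never have content:

```
https://example.com/events?date=2018-09
https://example.com/events?date=2018-10
https://example.com/events?date=2098-01   (a far-future month with no events)
```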

Broken Links & Dynamic Content

Dynamic content may produce different URLs; e.g., different types of ads and counters may generate multiple links. Similarly, broken relative links can create infinitely deep paths. Here is an example:
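A hypothetical illustration (placeholder domain and paths): a relative link that is missing its leading slash resolves against the current directory, so each click appends another path segment:

```
<!-- broken relative link on the page /category/shoes -->
<a href="category/shoes">Shoes</a>

<!-- crawlers then discover an endless chain of URLs: -->
https://example.com/category/category/shoes
https://example.com/category/category/category/shoes
```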

Disadvantages of Ugly/Complex URLs

  • Duplicate links create duplicate-content issues and may trigger severe Google penalties such as Panda and Penguin.
  • Complex URLs are unfriendly for users as well as for search engines, so search engines give them the lowest value.
  • Complex web addresses can cause problems for crawlers by creating too many URLs that point to similar pages on your website.
  • Too many URLs mean crawlers have to spend too much time downloading pages from your site. This can slow down your server response time, and crawlers will be less likely to revisit your website to download pages.
  • An irrelevant page may be ranked: search engines try to guess the correct version of a URL, and they may choose a version of your page that you do not want to rank.

Steps to Solve This Problem – Google Best Practices

To avoid the problems that complex URLs can cause, follow the practices below, recommended by Google.

  • Implement pretty URLs on your website and set canonical links, especially for eCommerce websites. The canonical link tag tells search engines the preferred version of your page.
  • Use the robots.txt file to block search engines’ access to unwanted URLs. You should block dynamic URLs and the other types of complex addresses mentioned above using wildcard patterns (robots.txt supports * and $ wildcards, not full regular expressions).
  • Use cookies rather than session IDs if you need to track sessions.
  • Make your URLs as short as you can; do not add unnecessary parameters.
  • Use the nofollow attribute for infinite calendar links.
  • Always check for broken relative links.
  • Use punctuation in URLs, e.g., hyphens (-) rather than underscores (_) to separate words.
  • Organize content in such a way that URLs are constructed logically and represent content intelligently.
  • If you are updating links, make sure you apply proper redirection from old to new links (this feature is available in the Pretty URLs module).
  • Do not change URLs without a genuine cause; ideally, URLs should only be updated if you have ugly ones and want to make them clean.
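The robots.txt advice above can be sketched as follows (the paths and parameter names are hypothetical placeholders; adapt the patterns to the parameters your own site produces):

```
User-agent: *
# Block session-ID and sorting-parameter URLs with wildcards
Disallow: /*?sessionid=
Disallow: /*?sort=
Disallow: /*&sort=
```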
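The canonical and nofollow recommendations above can be illustrated with minimal HTML fragments (the URLs are placeholders): the canonical link goes in the page’s head and names the preferred URL, and nofollow is added to links crawlers should not follow, such as infinite calendar links:

```
<!-- in <head>: the preferred version of this category page -->
<link rel="canonical" href="https://example.com/category/shoes" />

<!-- a calendar link crawlers should not follow -->
<a href="/events?date=2018-10" rel="nofollow">Next month</a>
```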