Did you know that with all your filters, and maybe even combinations of them, you create separate URLs? Filtering on price, size, shape or all kinds of other logical filters can easily create 15x as many URLs per original category page.

The problem is that those filter URL variants cause you to create a lot of 'thin content' within your webshop, which Google doesn't like very much. That's why you should start with one of these solutions:

Robots.txt

By excluding those filter URLs from crawling, you make sure that Google knows not to go there and can better spend that "time" (crawl budget) on more important URLs. Example: [Screenshot: enter the exact page path of the filter.]
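A minimal sketch of what such an exclusion could look like, assuming the filters are appended as query parameters (the parameter names color, size and price are hypothetical; if your shop generates filter paths instead, list those paths):

```
# robots.txt: block crawling of filter URL variants
# (hypothetical parameter names, replace with your own)
User-agent: *
Disallow: /*?color=
Disallow: /*?size=
Disallow: /*?price=
```

Keep in mind that robots.txt only blocks crawling, not indexing: a blocked filter URL can still end up in the index if other sites link to it.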
This is the solution I use most often.

Noindex tag

Are you running into the problem that filter pages are already competing with the main page you want to rank? Then a noindex tag can come in handy, so that you rank the right page anyway (see the snippet below).

Change canonical

By setting a canonical you can also indicate which page you want to be indexed (also sketched below). However, this does not solve the problem that you still let Google crawl all those URL variants.
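A hedged sketch of both options (the URLs are made-up examples). The noindex tag goes in the <head> of every filter variant you want to keep out of the index:

```
<!-- On each filter URL variant: stay out of the index,
     but still let Google follow the links on the page -->
<meta name="robots" content="noindex, follow">
```

And a canonical on each filter variant, pointing back to the main category page:

```
<!-- Consolidate ranking signals to the main category page -->
<link rel="canonical" href="https://www.example.com/category/shoes/">
```

One caveat worth knowing: both tags only work if Google can actually crawl the filter URL, so they don't combine well with a robots.txt block on the same URLs.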
Tip: also check whether those filter pages are not accidentally included in your XML sitemap, because by listing them there you are telling Google that you do want them indexed (see the example at the end of this article).

The Future of Ecommerce SEO

We're under fire. I was recently faced with a click-through rate of only 3% on my main position 1 listing. Despite the fact that there is a lot to be gained organically within e-commerce, Google is making our lives increasingly difficult: search ads, shopping, often a local pack, and only then the rankings you work so hard for. And it's not getting better anytime soon, is it?
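Circling back to the sitemap tip above: entries like the following in sitemap.xml are a red flag, because they submit filter variants for indexing (the URL is a made-up example):

```
<!-- A filter URL variant that should NOT be in the sitemap -->
<url>
  <loc>https://www.example.com/category/shoes/?color=red&amp;size=42</loc>
</url>
```

Only the clean category and product URLs belong in the sitemap.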