Have Your Thoughts Changed Regarding Canonical Tag Best Practice for Pagination? - Google Ignoring rel=next/prev Tagging
-
Hi there,
We have a good-sized eCommerce client that is gearing up for a relaunch. At this point, the staging site follows the previous best practice for pagination (self-referencing canonical tags on each page; rel=next and rel=prev tags referencing the next and previous pages within the category).
Knowing that Google does not support rel=next/prev tags, does that change your thinking on how to set up canonical tags within a paginated product category? Some of our categories have 500-600 products, so creating and canonicalizing to a 'view all' page is not ideal for us. That leaves us with the following options (I feel it is worth noting that we are leaving the rel=next/prev tags in place):
- Leave canonical tags as-is: page 2 of the product category would have a canonical tag referencing its own ?page=2 URL.
- Canonicalize to page 1 of the product category on all pages within the series: page 2 would have a canonical tag referencing page 1 (/category/). This is admittedly what I am leaning toward (both options are sketched below).
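For illustration only - the domain below is a placeholder and the markup is a sketch, not copied from the staging site - here is roughly how page 2's head would look under each option, with the rel=next/prev tags left in place as noted above:

```html
<!-- Option 1: self-referencing canonical on each paginated URL -->
<!-- On https://example.com/category/?page=2 (placeholder URL) -->
<link rel="canonical" href="https://example.com/category/?page=2">
<link rel="prev" href="https://example.com/category/">
<link rel="next" href="https://example.com/category/?page=3">

<!-- Option 2: every page in the series canonicalizes to page 1 -->
<!-- On https://example.com/category/?page=2 (placeholder URL) -->
<link rel="canonical" href="https://example.com/category/">
<link rel="prev" href="https://example.com/category/">
<link rel="next" href="https://example.com/category/?page=3">
```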
Any and all thoughts are appreciated! If this were an existing website with no indexing issues, I wouldn't worry about it, but since we are launching a new site, now is the time to make such a change.
Thank you!
Joe
-
An old question, but I thought I'd weigh in to report that Google seems to be ignoring self-referencing pagination canonicals on a news site I'm working on.
Pages such as /news/page/36/ have themselves as declared canonicals, but Search Console reports that Google is selecting the base page /news/ as the canonical instead.
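For clarity, the /news/ paths are the ones mentioned above, but the domain and markup are a reconstruction: each paginated page declares itself as canonical, yet Search Console reports a different Google-selected canonical.

```html
<!-- Declared on https://example.com/news/page/36/ (example.com is a placeholder domain) -->
<link rel="canonical" href="https://example.com/news/page/36/">
<!-- Search Console (URL Inspection) reports the Google-selected canonical as https://example.com/news/ -->
```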
Would be interested to know if anyone else is seeing that.
-
Hi,
I'm also very interested in what the new best approach for pagination would be.
A lot of webshops use option 2. However, there is an article that describes the possible negative outcome of this option (search it for 'Canonicalize to the first page'). In my opinion, that concern applies mostly to paginated blog articles and less to paginated product results per category in webshops. The root page is the one you want to rank in the end anyway.
What you certainly don't want is to create duplicate content. Yes, your products (and of course the links to their product pages) are different on each page. And yes, there will also be more internal links pointing to the root category page than to the second or third results page. But if you have invested time in writing content for your category and in all the other on-page optimizations, those elements will be the same across all your results pages.
So in the end, we leave it to Google and hope it recognizes the pagination. Is this the best option? Maybe, maybe not. After all, we didn't know for several years that Google wasn't using rel=next/prev, and mostly things worked fine.
So I think EffectDigital is right: just do nothing. If you see problems, I would try option 2 and use your first results page as the canonical.
-
The only thing that changes, IMO, is that you can delete the rel=prev/next tags to save on code bloat. Other than that, nothing changes. It's still best to allow Google to rank paginated URLs if it chooses to do so - when that happens, it usually happens for a reason!
I might also lift the self-referencing canonicals. Just leave the pages without directives of any kind and let Google determine what to do with them via the URL structure ('?p=', '/page/', '?page=', etc.). If Google is so confident it doesn't need these tags now, maybe adding any directives at all is just polluting the signals and will unnecessarily interfere.
In the end, I think I'd just strip it all off, monitor it, and see what happens.
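As a minimal sketch of what "strip it all off" could look like - placeholder URLs, and only describing the approach above, not a recommendation - page 2 would go from carrying every pagination hint to carrying none, leaving only the URL pattern as a signal:

```html
<!-- Before: page 2 with canonical plus rel=prev/next (placeholder URLs) -->
<link rel="canonical" href="https://example.com/category/?page=2">
<link rel="prev" href="https://example.com/category/">
<link rel="next" href="https://example.com/category/?page=3">

<!-- After: no pagination directives at all in the head; Google is left to
     infer pagination from the URL structure (?p=, /page/, ?page=) -->
```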