Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Sitemaps for landing pages
-
Good morning Moz Community,
We've been doing some revamping recently on our primary sitemap, and it's currently being reindexed by the search engines.
We have also been developing landing pages, both for SEO and SEM. The SEO pages are focused on specific, long-tail search terms in a number of our niche areas of focus. Should I be considering a separate sitemap for these? Everything I have read about sitemaps simply indicates that you need to split a sitemap once a site has more than 50,000 pages or so.
Do I need to worry about a separate sitemap for landing pages, or should I simply add them to our primary sitemap? Thanks in advance for your insights and advice.
-
Yes, any site with more than 50,000 URLs should have a sitemap index; within that XML sitemap index you list the individual category-specific sitemaps. These are best organized to follow the hierarchy of the website, which reinforces your URL structure.
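For illustration, a minimal sitemap index might look something like this; the domain and file names here are hypothetical:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <sitemap> entry per child sitemap, e.g. one per section of the site -->
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-landing-pages.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
</sitemapindex>

Each child sitemap listed in the index is then a standard urlset file, subject to the protocol's usual limits of 50,000 URLs and 50 MB uncompressed per file.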
-
John,
Good to know. At this point I only have our primary sitemap submitted to Search Console, but I will create and add a secondary sitemap. I don't see us adding a ton of additional sitemaps; would you still suggest making a sitemap index of sorts?
-
Absolutely no harm at all. Do you have an index sitemap that lists all the sub-sitemaps? If not, you should set one up as well, just for the sanity of sitemap management.
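As an aside, it's also common to reference the index sitemap from robots.txt so crawlers can discover it without a manual submission; a single line such as the following (hypothetical URL) is all it takes:

Sitemap: https://www.example.com/sitemap_index.xml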
-
John,
Thanks so much for the reply. So there's no harm in submitting a secondary sitemap, specifically for landing pages? Great to hear, and yes, many of the landing pages overlap between SEO and PPC.
Thanks!
Brendan
-
Hi there! Good question.
First, each individual XML sitemap should contain a maximum of 50,000 URLs. At the scale of millions of pages, I always recommend splitting your sitemaps by page type so that you can monitor indexation section by section.
If I were you, I'd create a separate sitemap for the landing pages and exclude the PPC landing pages, unless they are the same pages you've created for SEO.
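As a sketch, the landing-page sitemap itself is just an ordinary urlset file (this would be the sitemap-landing-pages.xml from the earlier example; all URLs are invented):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- SEO landing pages only; PPC-only pages are deliberately left out -->
  <url>
    <loc>https://www.example.com/landing/long-tail-term-one</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/landing/long-tail-term-two</loc>
  </url>
</urlset>

Submitting this file separately in Search Console then gives you indexation numbers for the landing-page section on its own, which is the whole point of splitting by type.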
Cheers!
Related Questions
-
Log-in page ranking instead of homepage due to high traffic on login page! How to avoid?
Hi all, our log-in page is ranking in the SERP instead of our homepage, and sometimes both pages rank for the primary keyword we targeted; our rankings have even dropped. I am looking for a solution. Three points to consider:
1. Our log-in page is the most visited page and the top landing page on the website.
2. The same scenario continues whether or not the primary keyword is on that page.
3. The log-in page is the first link bots touch when crawling any page of our website, as it is linked in the top navigation menu.
If we move the log-in page to a sub-domain, will that work? I am worried that we would lose a lot of traffic, which would be taken away by the log-in page sub-domain. Please guide us with your valuable suggestions. Thanks
Algorithm Updates | vtmoz
-
Is it okay to have "No Response" pages?
Hi all, I can see some "No Response" pages, which either give the error message "Site cannot be reached" or keep loading and never finish. I got this list from the Screaming Frog spider tool. Do we need to fix these, or can we ignore them? Thanks
Algorithm Updates | vtmoz
-
How long for Google to de-index old pages on my site?
I launched my redesigned website 4 days ago. I submitted a new sitemap, and also submitted the site for indexing in Search Console (Google Webmasters). I see that when I google my site, my new Open Graph settings come up correctly. Still, a lot of my old site pages are definitely still indexed in Google. How long will it take for Google to drop or "de-index" my old pages? Due to the way I restructured my website, a lot of the items are no longer available on the site; this is on purpose. I'm a graphic designer, and with the new change I removed many old portfolio items, as well as any references to web design, since I will no longer be offering that service. My site is: http://studio35design.com
Algorithm Updates | rubennunez
-
US domain pages showing up in Google UK SERP
Hi, our website, which was predominantly for the UK market, was set up with a .com extension, and only two years ago other domains were added: US (.us), IE (.ie), EU (.eu) and AU (.com.au). Last year in July, we noticed that a few .us domain URLs were showing up in UK SERPs, and we realized the sitemap for the .us site was incorrectly referring to the UK (.com) site, so we corrected that and the .us domain URLs stopped appearing in the SERP. I'm not sure whether this actually fixed the issue or whether it was coincidental. However, in the last couple of weeks, more than three .us domain URLs have been showing for each brand search made on Google UK, and sometimes they replace the .com results altogether. I have double-checked the PA for the US pages; they are far below the UK ones. Has anyone noticed similar behaviour, or could anyone please help me troubleshoot this issue? Thanks in advance, R
Algorithm Updates | RaksG
-
Should my canonical tags point to the category page or the filter result page?
Hi Moz, I'm working on an ecommerce site with categories, filter options, and sort options (teacherexpress.scholastic.com). Should I have canonical tags from all filter and sort options point to the category page, as gap.com and llbean.com do, or have all sort options point to the filtered page URL, as kohls.com does? I was under the impression that to use a canonical tag the pages have to have the same content, meaning that Gap and L.L. Bean would be using canonical tags incorrectly: using a filter changes the content, whereas using a sort option just changes the order. What would be the best way to deal with duplicate content for this site? Thanks for reading!
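For reference (not from the thread; the URLs are invented), the kohls.com-style setup described above would put a single link element in the <head> of each sorted variant, pointing back at the filtered page:

<link rel="canonical" href="https://www.example.com/category/widgets?filter=red" />

whereas the gap.com/llbean.com approach would point the same element at https://www.example.com/category/widgets instead.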
Algorithm Updates | DA2013
-
How to find which keywords bring traffic to a particular page on my website?
I have been using Google Analytics and the SEOmoz tools for a while now. I know which are my top landing pages and some of the keywords that bring me traffic, but I don't know the top searched keywords for my website, as these are "(not provided)" by Google Analytics. More importantly, I want to know which keywords are directing traffic to a particular page on my website. Can anyone help?
Algorithm Updates | EricMoore
-
Does Google index non-public pages, i.e. members' logged-in pages?
Hi, I was trying to locate resources on how much the Google bot indexes in order to qualify a 'good' site on their engine. For example, our site has many pages that are associated with logged-in users and are not available to the public until a user acquires a login username and password. Although those pages show up in Google Analytics, they should not be made public in the Google index, which is what happens. In light of Google trying to qualify a site according to how 'engaged' a user is on the site, I would think that the activity on those member pages is very important. Can anyone offer suggestions on how Google treats those pages, since we are planning to do further SEO optimization on them? Thanks
Algorithm Updates | jumpdates
-
Stop Google indexing CDN pages
Just when I thought I'd seen it all, Google hits me with another nasty surprise! I have a CDN to deliver images, JS, and CSS to visitors around the world. I have no links to static HTML pages on the site, as far as I can tell, but someone else may have; perhaps a scraper site? Google has decided the static pages it was able to access through the CDN have more value than my real pages, and it seems to be slowly replacing my pages in the index with the static ones. Anyone got an idea on how to stop that? Obviously, I have no access to the static area, because it is in the CDN, so there is no way I know of to have a robots file there. It could be that I have to trash the CDN and change it to only allow the image directory, and maybe set up a separate CDN subdomain for content that contains only the JS and CSS. Have you seen this problem and beaten it? (Of course, the next thing is Roger might look at Google results and start crawling them too, LOL.) P.S. The reason I am not asking this question in the Google forums is that others have asked it many times and nobody at Google has bothered to answer over the past 5 months, and nobody who did try gave an answer that was remotely useful. So I'm not really hopeful of anyone here having a solution either, but I expect this is my best bet, because you guys are always willing to try.
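One possible angle, offered as a hedged sketch rather than a confirmed fix: since a pull CDN serves byte-for-byte copies of the origin's HTML, an absolute, self-referencing canonical tag on every page travels with those copies and tells Google which hostname is the real one (the URL below is hypothetical):

<link rel="canonical" href="https://www.example.com/some-page.html" />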
Algorithm Updates | loopyal