Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Should I submit a sitemap for a site with dynamic pages?
-
I have a coupon website (http://couponeasy.com)
Being a coupon website, my content is always changing automatically as new coupons are added and expired deals are removed. I wish to create a sitemap, but I realised that there is not much point in creating a sitemap for all pages, as they will be removed sooner or later and/or are canonicalised to other pages.
I have about 8-9 pages which are static, and hence I can include them in the sitemap.
Now the question is....
If I create a sitemap for these 9 pages and submit it to Google Webmaster Tools, will Google's crawlers stop indexing the other pages?
NOTE: I need to create the sitemap for getting expanded sitelinks.
-
Hi Anuj -
I think you are operating from a false assumption that is going to hurt your organic traffic (I suspect it already has).
The XML sitemap is one of the very best ways to tell the search engines about new content on your website. Therefore, by not putting your new coupons in the sitemap, you are withholding one of the strongest signals possible that new content is there.
Of course, you have to automate your sitemap and have it update as often as possible. Depending on the size of your site and therefore the processing time, you could do it hourly, every 4 hours, something like that. If you need recommendations for automated sitemap tools, let me know. I should also point out that you should specify how frequently each URL is updated (and you should keep static URLs even for your coupons if possible). This will be a big win for you.
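To illustrate the kind of automated generation John describes, here is a minimal sketch using only Python's standard library. The URLs, dates, and change frequencies below are placeholders; a real version would pull the list of live coupon URLs from your database on a schedule:

```python
from datetime import date
from xml.etree import ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a sitemap from (loc, lastmod_date, changefreq) tuples."""
    ET.register_namespace("", NS)  # emit the sitemap namespace as the default
    urlset = ET.Element(f"{{{NS}}}urlset")
    for loc, lastmod, changefreq in urls:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = loc
        ET.SubElement(url, f"{{{NS}}}lastmod").text = lastmod.isoformat()
        ET.SubElement(url, f"{{{NS}}}changefreq").text = changefreq
    # Prepend the XML declaration the sitemap protocol expects.
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(
        urlset, encoding="unicode"
    )

# Placeholder entries; in practice these come from the coupon database.
xml = build_sitemap([
    ("http://couponeasy.com/", date(2015, 8, 1), "hourly"),
    ("http://couponeasy.com/store/example-store", date(2015, 8, 1), "daily"),
])
print(xml)
```

Regenerating this file on a cron job (hourly or every few hours, as John suggests) keeps the `lastmod` values honest, which matters more to crawlers than the `changefreq` hint.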
Finally, if you want to make sure your static pages are always indexed, or want to keep an eye on different types of coupons, you can create separate sitemaps under your main sitemap.xml and segment by type. So static-pages-sitemap.xml, type-1-sitemap.xml, etc. This way you can monitor indexation by type.
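To make the segmented setup concrete, here is a minimal sketch of the main sitemap index that points at the per-type sitemaps; the child file names are illustrative, not required names:

```python
from xml.etree import ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(sitemap_urls):
    """Build a sitemap index referencing the per-type child sitemaps."""
    ET.register_namespace("", NS)  # emit the sitemap namespace as the default
    index = ET.Element(f"{{{NS}}}sitemapindex")
    for loc in sitemap_urls:
        entry = ET.SubElement(index, f"{{{NS}}}sitemap")
        ET.SubElement(entry, f"{{{NS}}}loc").text = loc
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(
        index, encoding="unicode"
    )

# Illustrative segmentation: static pages vs. coupon types.
index_xml = build_sitemap_index([
    "http://couponeasy.com/static-pages-sitemap.xml",
    "http://couponeasy.com/coupons-sitemap.xml",
    "http://couponeasy.com/stores-sitemap.xml",
])
print(index_xml)
```

Submitting the index once means each child sitemap reports its own indexation numbers in Webmaster Tools, which is what makes the per-type monitoring possible.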
Hope this helps! Let me know if you need an audit or something like that. Sounds like there are some easy wins!
John
-
Hello Anuj,
To answer your final question first:
Crawlers will not stop until they encounter something they cannot read or are told not to continue beyond a certain point. So your site will be updated in the index upon each crawl.
I did some quick browsing and it sounds like an automated sitemap might be your best option. Check out this link on Moz Q&A:
https://azwa.1clkaccess.in/community/q/best-practices-for-adding-dynamic-url-s-to-xml-sitemap
There are tools out there that will help with the automation process, which will update hourly/daily to help crawlers find your dynamic pages. The tool suggested on this particular blog can be found at:
http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html
I have never used it, but it is worth looking into as a solution to your problem. Another good suggestion I saw was to place all removed deals on an archive page and make them unavailable for purchase/collection. This sounds like a solution that would minimize future issues surrounding 404s, etc.
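The archive idea can be sketched roughly like this (the `Coupon` model and page text are hypothetical, just to show the pattern of keeping expired-deal URLs resolving instead of returning 404s):

```python
from dataclasses import dataclass

@dataclass
class Coupon:
    slug: str
    expired: bool

def render_coupon_page(coupon):
    """Return (HTTP status, body) for a coupon URL.

    Expired deals stay live as archive pages rather than 404ing,
    so URLs that search engines have already indexed keep resolving.
    """
    if coupon is None:
        return 404, "Not found"
    if coupon.expired:
        # Archived: the page still resolves, but the deal can't be redeemed.
        return 200, f"[ARCHIVED] Deal '{coupon.slug}' has expired."
    return 200, f"Deal '{coupon.slug}' - click to redeem."

status, body = render_coupon_page(Coupon("pizza-50-off", expired=True))
```

An archive page can also link out to similar live deals, which recovers some of the traffic an expired URL would otherwise lose.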
Hope this helps!
Rob