Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Best XML Sitemap generator
-
Do you guys have any suggestions on a good XML sitemap generator? Hopefully free, but if it's good I'd consider paying.
I am using a Mac, so I would prefer an online or Mac version.
-
Hi James - I saw your reply on this thread and have a quick question. I was running GSiteCrawler; after selecting all the suitable options, it opens up a "Crawl watch" page. While I assume it is crawling the site, the online instructions say to select the "Generate" tab at the main application window (I did not opt for auto FTP).
When should I select the Generate option - immediately, or should I wait for the crawl to complete?
suparno
-
The only way to find out is to shoot them an e-mail. Either way, you will discover the answer.

-
I am wondering if they are talking about the paid version, because I run it on my site, www.psbspeakers.com, and it comes up with all kinds of duplicate content:
<loc>http://www.psbspeakers.com/products/image/Image-B6-Bookshelf</loc>
<loc>http://www.psbspeakers.com/products/bookshelf-speakers/Image-B6-Bookshelf</loc>
with this code sitting on both pages:
<link rel="canonical" href="http://www.psbspeakers.com/products/image/Image-B6-Bookshelf"/> -
I e-mailed their support and they confirmed it does support canonical tags. Below is the response I received:
Hi,
The script will detect canonical tags. If you can provide a live example, we can look into it for you.
Regards,
Philip
XML-Sitemaps.com
-----------------------------
I would suggest ensuring your tags are valid. If they are, contact the site support and they can provide specific feedback.
-
Thanks Ryan.
That's the one I already use, but it does not take canonicals into account, so I end up with 2-3 links for the same page.
-
A popular sitemap generator: http://www.xml-sitemaps.com/
I cannot say it is the best, but it works fine. The free online version will scan up to 500 pages. For $20, you can have an unlimited number of pages.
-
Sorry, I should have said... I am on a Mac ;(
Are there any online ones around that don't have a cap of 500 pages? -
GSiteCrawler every time. It's free and it's an awesome tool: http://gsitecrawler.com/
Related Questions
-
If I'm using a compressed sitemap (sitemap.xml.gz) that's the URL that gets submitted to webmaster tools, correct?
I just want to verify that if a compressed sitemap file is being used, then the URL that gets submitted to Google, Bing, etc and the URL that's used in the robots.txt indicates that it's a compressed file. For example, "sitemap.xml.gz" -- thanks!
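For reference, the convention is indeed to reference the compressed file exactly as it is served, both when submitting to the search engines and in robots.txt. A hypothetical robots.txt entry (the domain is a placeholder):

```
Sitemap: https://www.example.com/sitemap.xml.gz
```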
Technical SEO | | jgresalfi0 -
Resubmit sitemaps on every change?
Hello Mozers, Our sitemaps were submitted to Google and Bing, and are successfully indexed. Every time pages are added to our store (ecommerce), we re-generate the xml sitemap. My question is: should we be resubmitting the sitemaps every time their content change, or since they were submitted once can we assume that the crawlers will re-download the sitemaps by themselves (I don't like to assume). What are best practices here? Thanks!
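A related practice worth noting: keeping an accurate <lastmod> on each entry gives crawlers a signal about what changed whenever they re-download the sitemap. A sketch of one entry (URL and date are illustrative placeholders):

```xml
<url>
  <loc>https://www.example.com/products/new-widget</loc>
  <lastmod>2014-05-01</lastmod>
</url>
```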
Technical SEO | | yacpro131 -
Should I Focus on Video Schema or a Video Sitemap First
Hey all, I'm working on a website that is soon going to launch a video hub that contains over 500 videos. I'm interested in ensuring that the videos show up on the SERPs page in the highest position possible. I know Google recommends that you have on-page schema for your videos as well as an XML sitemap so they can be indexed for SERP. When I look at schema and the XML video sitemap they seem to communicate very similar kinds of information (Title, Description, Thumbnail, Duration). I'm not sure which one to start with; is it more important to have video schema or a video sitemap? Additionally, if anyone knows of any good video sitemap generators (free is best, but cheap is okay too) then please let me know. Cursory google searching has just churned up a number of tools that look sketchy.
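For reference, a video sitemap entry carries roughly the same fields listed above (title, description, thumbnail, duration). A minimal sketch of one entry, with placeholder URLs and values:

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/videos/intro</loc>
    <video:video>
      <video:thumbnail_loc>https://www.example.com/thumbs/intro.jpg</video:thumbnail_loc>
      <video:title>Intro video</video:title>
      <video:description>A short introduction.</video:description>
      <video:content_loc>https://www.example.com/media/intro.mp4</video:content_loc>
      <video:duration>120</video:duration>
    </video:video>
  </url>
</urlset>
```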
Technical SEO | | perfectsearch710 -
2 sitemaps on my robots.txt?
Hi, I thought I could link only one sitemap from my site's robots.txt, but... I may be wrong. So I need to confirm whether this kind of implementation is right or wrong (robots.txt for Magento Community and Enterprise): ...
Technical SEO | | Webicultors
Sitemap: http://www.mysite.es/media/sitemap/es.xml
Sitemap: http://www.mysite.pt/media/sitemap/pt.xml
Thanks in advance,0 -
Best way to noindex long dynamic urls?
I just got a Mozcrawl back and see lots of errors for overly dynamic URLs. The site is a villa rental site that gives users the ability to search by bedroom, amenities, price, etc., so I'm wondering about the best way to keep these types of dynamically generated pages, with URLs like /property-search-page/?location=any&status=any&type=any&bedrooms=9&bathrooms=any&min-price=any&max-price=any, from being indexed. Any assistance will be greatly appreciated : )
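Two common approaches for pages like this: add a robots meta noindex to the search-result template (which lets crawlers see the directive), or block the parameterized path in robots.txt (which stops crawling, but does not by itself guarantee de-indexing of already-known URLs). Hypothetical snippets, using the path pattern from the question:

```
# robots.txt - stop crawling of parameterized search pages
User-agent: *
Disallow: /property-search-page/?
```

Or in the page template itself:

```html
<meta name="robots" content="noindex, follow">
```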
Technical SEO | | wcbuckner0 -
302 redirect used, submit old sitemap?
The website of a partner of mine was recently migrated to a new platform. Even though the content on the pages mostly stayed the same, both the HTML source (divs, meta data, headers, etc.) and URLs (removed index.php, removed capitalization, etc) changed heavily. Unfortunately, the URLs of ALL forum posts (150K+) were redirected using a 302 redirect, which was only recently discovered and swiftly changed to a 301 after the discovery. Several other important content pages (150+) weren't redirected at all at first, but most now have a 301 redirect as well. The 302 redirects and 404 content pages had been live for over 2 weeks at that point, and judging by the consistent day/day drop in organic traffic, I'm guessing Google didn't like the way this migration went. My best guess would be that Google is currently treating all these content pages as 'new' (after all, the source code changed 50%+, most of the meta data changed, the URL changed, and a 302 redirect was used). On top of that, the large number of 404's they've encountered (40K+) probably also fueled their belief of a now non-worthy-of-traffic website. Given that some of these pages had been online for almost a decade, I would love Google to see that these pages are actually new versions of the old page, and therefore pass on any link juice & authority. I had the idea of submitting a sitemap containing the most important URLs of the old website (as harvested from the Top Visited Pages from Google Analytics, because no old sitemap was ever generated...), thereby re-pointing Google to all these old pages, but presenting them with a nice 301 redirect this time instead, hopefully causing them to regain their rankings. To your best knowledge, would that help the problems I've outlined above? Could it hurt? Any other tips are welcome as well.
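For a sitemap of old URLs to help, every listed URL has to answer with a 301. A hypothetical Apache rewrite for one of the patterns described (removing index.php), purely as a sketch - the actual rules depend on the platform:

```
# .htaccess sketch: permanently redirect old index.php URLs to the clean path
RewriteEngine On
RewriteRule ^index\.php/(.*)$ /$1 [R=301,L]
```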
Technical SEO | | Theo-NL0 -
Adding multi-language sitemaps to robots.txt
I am working on a revamped multi-language site that has moved to Magento. Each language runs off the core coding so there are no sub-directories per language. The developer has created sitemaps which have been uploaded to their respective GWT accounts. They have placed the sitemaps in new directories such as: /sitemap/uk/sitemap.xml /sitemap/de/sitemap.xml I want to add the sitemaps to the robots.txt but can't figure out how to do it. Also should they have placed the sitemaps in a single location with the file identifying each language: /sitemap/uk-sitemap.xml /sitemap/de-sitemap.xml What is the cleanest way of handling these sitemaps and can/should I get them on robots.txt?
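On the robots.txt part of the question: the file accepts multiple Sitemap directives, each with an absolute URL, so the developer's current layout can be declared as-is. A hypothetical example using the paths from the question (the domain is a placeholder):

```
Sitemap: https://www.example.com/sitemap/uk/sitemap.xml
Sitemap: https://www.example.com/sitemap/de/sitemap.xml
```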
Technical SEO | | MickEdwards0 -
Is it bad to have same page listed twice in sitemap?
Hello, I have found that from an HTML (not xml) sitemap of a website, a page has been listed twice. Is it okay or will it be considered duplicate content? Both the links use same anchor text, but different urls that redirect to another (final) page. I thought ideal way is to use final page in sitemap (and in all internal linking), not the intermediate pages. Am I right?
Technical SEO | | StickyRiceSEO1