Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Best Practice for www and non www
-
What is the best way to handle all the different variations of a website: www | non-www | http | https?
In Google Search Console, I have all 4 versions and I have selected a preference.
In Open Site Explorer I can see that the www and non-www versions are treated differently, with a separate group of links pointing to each version of the same page. This gives each version a different PA score.
e.g.
- http://mydomain.com DA 25 PA 35
- http://www.mydomain.com DA 19 PA 21
Each version of the home page has its own set of links and scores.
Should I try and "consolidate" all the scores into one page?
Should I set up redirects to my preferred version of the website?
Thanks in advance
-
Thanks for your answer, that was helpful.
-
Thanks for taking the time to put together such a wonderfully detailed answer.
-
Hi Samantha,
What you have are what are called "canonical issues." By leaving multiple versions of your domain open and crawlable to search engines, you "split" your ranking authority, which results in the issues you are seeing right now.
The best practice is to choose one version of your domain as the "true canonical" and then 301 redirect the others at the server level by means of mod_rewrite code. Doing so will consolidate your content, incoming links and PageRank and greatly increase the root domain authority of your site.
To search engines, if your site hasn't instituted 301 redirects at the server level, all of these versions of your home page are treated as "separate pages," and each accumulates authority individually:
http://yoursite.com/
http://www.yoursite.com/
http://yoursite.com/index.php
http://www.yoursite.com/index.php
https://yoursite.com
https://www.yoursite.com/
You get the idea.
Most websites are run on one of three different types of servers...
- Unix-based servers running Apache.
- Unix-based servers running Nginx.
- Microsoft Windows-based servers running IIS or similar.
If you're unsure of what kind of server runs your site, ask your hosting company. Most sites are run on Unix-based servers with Apache. In that case, the server's behavior is configured using something called the .htaccess file.
If your site's root domain already contains a .htaccess file, you can simply scroll to the end of whatever code is already there and append your 301 redirect code at the bottom of the file, starting on a new line. While this may sound complicated, it's actually very simple to do. If you can upload files to and from your Web server, then chances are you'll have no trouble managing (i.e. altering, or creating and uploading) your .htaccess file(s).
But yes, bottom line, you ALWAYS want to consolidate URLs and present one uniform "preferred" URL format to search engines and users. In your case, that would appear to be the non-www domain, which has the higher Domain Authority.
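To illustrate, here is a minimal sketch of the kind of mod_rewrite code that gets appended to .htaccess. This assumes an Apache server with mod_rewrite enabled, and assumes https plus the non-www host is your preferred canonical version, so adjust the details and test on a staging copy before relying on it:

RewriteEngine On
# Send any www request to the non-www host, keeping the rest of the URL
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ https://%1/$1 [R=301,L]
# Send any remaining http request to https
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

On Nginx or IIS the same consolidation is done in the server configuration rather than in a .htaccess file, but the principle is identical: one 301 per variant, all pointing at the single preferred URL.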
You can learn all about redirection best practices at the Moz resource here: https://moz.com/learn/seo/redirection
Related Questions
-
Solved: Should I consolidate my "www" and "non-www" pages?
My page rank for www and non-www is the same. In one keyword instance, my www version performs SO much better. Wanting to consolidate to one or the other. My question is whether all these issues would ultimately resolve to my chosen consolidated domain (i.e. www or non-www) regardless of which one I choose. OR, would it be smart to choose the one where I am already ranking high for this significant keyword phrase? Thank you in advance for your help.
Technical SEO | meditationbunny
-
Best practices for controlling link juice with site structure
I'm trying to do my best to control the link juice from my home page to the most important category landing pages on my client's e-commerce site. I have a couple of questions regarding how to NOT pass link juice to insignificant pages and how best to pass juice to my most important pages. INSIGNIFICANT PAGES: How do you tag links so they don't pass juice to unimportant pages? For example, my client has a "Contact" page off of their home page. Now we aren't trying to drive traffic to the contact page, so I'm worried about the link juice from the home page being passed to it. Would you tag the Contact link with a "nofollow" tag, so it doesn't pass the juice, but then include it in a sitemap so it gets indexed? Are there best practices for this sort of stuff?
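For reference, "tagging" a link this way just means adding the rel attribute to the anchor tag; a hypothetical Contact link (the href is for illustration only) would look like:

<a href="/contact" rel="nofollow">Contact</a>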
Technical SEO | Santaur
-
Best URL-structure for ecommerce store?
What structure would you recommend for the product pages? Let's make an example with the keyword "Luxim FZ200".
With category in the URL:
www.myelectronicshop.com/digital-cameras/luxim-FZ200.html
With a /product prefix:
www.myelectronicshop.com/product/luxim-FZ200.html
Without category in the URL:
www.myelectronicshop.com/luxim-FZ200.html
I have read in a blog post that Paddy Moogan recommends /luxim-FZ200.html, and I think I prefer this version too. But I can see that many of the bigger ecommerce stores are using a /product prefix before the product name. What is the reason for this? And what is best practice?
Technical SEO | gojesper
-
Best Practices for adding Dynamic URL's to XML Sitemap
Hi Guys, I'm working on an ecommerce website with all the product pages using dynamic URLs (we also have a few static pages, but there is no issue with them). The products are updated on the site every couple of hours (because we sell out or the special offer expires), and as a result I keep seeing heaps of 404 errors in Google Webmaster Tools and am trying to avoid this (if possible). I have already created an XML sitemap for the static pages and am now looking at incorporating the dynamic product pages, but am not sure what the best approach is. The URL structure for the products is as follows:
http://www.xyz.com/products/product1-is-really-cool
http://www.xyz.com/products/product2-is-even-cooler
http://www.xyz.com/products/product3-is-the-coolest
Here are 2 approaches I was considering:
1. Just include the dynamic product URLs within the same sitemap as the static URLs, using only http://www.xyz.com/products/ - this is so spiders have access to the folder the products are in and I don't have to create an automated sitemap for all products. OR
2. Create a separate automated sitemap that updates whenever a product is updated, and set the change frequency to hourly - this is so spiders always have as close to an up-to-date sitemap as possible when they crawl.
I look forward to hearing your thoughts, opinions, suggestions and/or previous experiences with this. Thanks heaps, LW
Technical SEO | seekjobs
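For reference, an automated sitemap like the one described in option 2 simply outputs one <url> entry per product. A minimal sketch using one of the example URLs from the question might look like:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.xyz.com/products/product1-is-really-cool</loc>
    <changefreq>hourly</changefreq>
  </url>
</urlset>
-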
ECommerce: Best Practice for expired product pages
I'm optimizing a pet supplies site (http://www.qualipet.ch/) and have a question about the best practice for expired product pages. We have thousands of products, and hundreds of our offers only exist for a few months. Currently, when a product is no longer available, the site just returns a 404. Now I'm wondering what a better solution could be: 1. When a product disappears, a 301 redirect is established to the category page it is in (i.e. a leash would redirect to dog accessories). 2. After a product disappears, a customized 404 page appears, listing similar products (but the server returns a 404). I prefer solution 1, but am afraid that having hundreds of new redirects each month might look strange. But then again, returning lots of 404s to search engines is also not the best option. Do you know the best practice for large ecommerce sites where they have hundreds or even thousands of products that appear/disappear on a frequent basis? What should be done with those obsolete URLs?
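For reference, on an Apache server option 1 typically means adding one redirect rule per retired product. A minimal sketch with hypothetical paths (not the site's real URLs) would be:

# Hypothetical product and category paths, for illustration only
Redirect 301 /products/retired-dog-leash http://www.qualipet.ch/dog-accessories/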
Technical SEO | zeepartner
-
Merging several sites into one - best practice
I had 2 sites on the web (www.physicseditor.de, www.texturepacker.com) and decided to move them all under one single domain (www.codeandweb.com). Both sites were ranking very well for several keywords. I have now redirected the most important pages from the old domains with a 301 redirect to the new subpages (www.texturepacker.com => www.codeandweb.com/texturepacker). Google still delivers the old domains, but the redirects take people directly to the new content. I've already submitted the new sitemap to Google Webmaster Tools. Pages are already in the index but do not really show up in the search results. How long does it take until Google accepts the new domain and delivers the new content in the search results? Was what I did OK? Or is there some room for improvement? SEOmoz will of course not find any information about the new page since it is not yet directly linked in Google. But I can't get ranking information for the "old" pages since SEOmoz tells me that it can't crawl the old domains...
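For what it's worth, a domain move like this is often handled with a catch-all rule on the old domain. A minimal .htaccess sketch, assuming Apache on the old domain and assuming you want every old path mapped into the new subfolder (the question only mentions redirecting the most important pages, so this is just one possible approach), would be:

RewriteEngine On
# Map every request on the old domain into the matching path on the new one
RewriteCond %{HTTP_HOST} ^(www\.)?texturepacker\.com$ [NC]
RewriteRule ^(.*)$ https://www.codeandweb.com/texturepacker/$1 [R=301,L]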
Technical SEO | gossi74
-
404 errors on non-existent URLs
Hey guys and gals, First Moz Q&A question for me and I'm really looking forward to being part of the community. I hope this isn't a stupid first question, but I was struggling to find any resource that dealt with the issue and am just looking for some general advice. Basically, a client has raised a problem with 404 error pages - or the lack thereof - on non-existent URLs on their site; let's say for example: 'greatbeachtowels.com/beach-towels/asdfas'. Obviously content never existed on this page, so it's not like you're saying 'hey, sorry this isn't here anymore'; it's more like 'there was never anything here in the first place'. Currently, in this fictitious example, typing in 'greatbeachtowels.com/beach-towels/asdfas' returns the same content as the 'greatbeachtowels.com/beach-towels' page, which I appreciate isn't ideal. What I was wondering is how far do you take this issue - I've seen examples here on the SEOmoz site where you can edit the URI in a similar manner and it returns the same content as the parent page, but with the alternate address. Should 404s be returned across all folders on a site in a similar way? How often would this scenario be an issue, particularly for internal pages two or three clicks down? I suppose unless someone linked to a page with a misspelled URL... Also, would it be worth placing 301 redirects on a small number of common misspellings or typos, e.g. 'greatbeachtowels.com/beach-towles', to the correct URLs as opposed to just 404s? Many thanks in advance.
Technical SEO | AJ234
-
Best Dynamic Sitemap Generator
Hello Mozers, Could you please share the best dynamic sitemap generator you are using? I have found this place: http://www.seotools.kreationstudio.com/xml-sitemap-generator/free_dynamic_xml_sitemap_generator.php Thanks in advance for your help.
Technical SEO | SEOPractices