Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies. More details here.
Old subdomains - what to do SEO-wise?
-
Hello,
I wanted the community's advice on how to handle old subdomains.
We have https://www.yoursite.org. We also have two subdomains directly related to the main website: https://www.archive.yoursite.org and https://www.blog.yoursite.org.
As these pages are not actively updated, they are triggering lots of errors in the site crawl (missing meta descriptions, among many others). We have no particular intention of keeping them up to date in terms of SEO. What do you think is the best way to handle them?
I considered de-indexing, but the content of these pages is still relevant and may be useful - yet it is out of date and will never be updated again.
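(Editor's note on the de-indexing option: keeping a page live for visitors while removing it from search results is typically done with a robots noindex directive. A minimal sketch - the directive itself is standard, though where you add it depends on your setup:)

```html
<!-- In the <head> of each archived page: ask search engines to
     drop the page from their index while visitors can still view it -->
<meta name="robots" content="noindex">
```

For non-HTML resources, an `X-Robots-Tag: noindex` HTTP response header achieves the same effect.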
Many thanks in advance.
-
Thanks for replying Will.
You have mentioned a few ways to deal with this - and they all seem to point to the fact that this should not really be a high-priority issue for us at the moment, especially if sub-domains do not have a major effect on the main site. (I don't even think it's worth de-indexing, to be honest, as the content may be relevant to some people, and we can just allow Google to continue indexing it as it is.)
All of this points to the same conclusion: we won't be doing any SEO-related work on these pages.
So, how do I set up Moz to ignore these two sub-domains and only show crawl errors related to the main site? We just don't want these pages to be crawled by Moz at all, given that we won't do any work on them.
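(Editor's note: one way to do this, assuming Moz's Site Crawl still identifies itself as the `rogerbot` user-agent and honours robots.txt, is to block that crawler on each subdomain while leaving Googlebot and other crawlers unaffected:)

```text
# robots.txt served at archive.yoursite.org/robots.txt
# (and likewise on the blog subdomain)
User-agent: rogerbot
Disallow: /
```

Alternatively, Moz campaign settings may let you scope crawling to a single subdomain rather than the root domain plus all subdomains - worth checking in the campaign configuration.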
Thanks
-
Hi there. Sorry for the slow follow-up on this - there was an issue that meant I didn't get the email alert when it was assigned to me.
There is increasing evidence that culling old / poor performing content from your site can have a positive effect, though I wouldn't be particularly confident that this would transfer across sub-domains to benefit the main site.
In general, I suspect that most effort expended here would be better spent elsewhere, and so I would lean towards the least-effort option.
I think that the "rightest" long-term answer though would be to move the best content to the main domain (with accompanying 301 redirects) and remove the remainder with 410 status codes. This should enable you to focus on the most valuable content and get the most benefit from the stuff that is valuable, while avoiding having to continue expending effort on the stuff that is no longer useful. The harder this is, though, the less I'd be inclined to do it - and would be more likely to consider just deindexing the lowest quality stuff and getting whatever benefit remains from the better content for as long as it is a net positive, with an eye to eventually removing it all.
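(Editor's note: as a sketch of that suggested end state, assuming an nginx front end - the domain names and paths below are illustrative, not from the thread:)

```nginx
# On the archive subdomain: permanently redirect the pages worth
# keeping to their new home on the main domain...
server {
    server_name archive.yoursite.org;

    location = /best-guide {
        return 301 https://www.yoursite.org/best-guide;
    }

    # ...and answer everything else with 410 Gone, which signals
    # permanent removal more clearly than a 404
    location / {
        return 410;
    }
}
```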
Hope that helps - I don't think it's a super clear-cut situation unfortunately.