Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Is it a good idea to remove old blogs?
-
So I have a site right now that isn't ranking well, and we are trying everything to help it out. One of my areas of concern is that we have A LOT of old blog posts that were not well written and, honestly, are not overly relevant. None of them rank for anything, and they could be causing a lot of duplicate content issues. Our newer posts, written in more of a Q&A format, seem to be doing better.
So my thought is basically wipe out all the blogs from 2010-2012 -- probably 450+ blog posts.
What do you guys think?
-
You may find this case study helpful, about a blog that decided to do exactly that:
http://www.koozai.com/blog/search-marketing/deleted-900-blog-posts-happened-next/
-
It depends on what you mean by "remove."
If the content of all those old blogs truly is poor, I'd strongly consider going through 1 by 1 and seeing how you can re-write, expand upon, and improve the overall blog post. Can you tackle the subject from another angle? Are there images, videos, or even visual assets you can add to the post to make it more intriguing and sharable?
Then, you can seek out some credible places to strategically place your blog content for additional exposure and maybe even a link. Be careful here, however. I'm not talking about forum and comment spam, but there may be some active communities that are open to unique and valuable content. Do your research first.
When going through each post 1 by 1, you'll undoubtedly find blog posts that are simply "too far gone" or not relevant enough to keep. Essentially, it wouldn't even be worth your time to re-write them. In this case, find another page on your website that's MOST SIMILAR to the blog post. This may be in topic, but also could be an author's page, another blog post that is valuable, a contact page, etc. Then perform 301 redirects of the crap blog posts to those pages.
Not only are you salvaging any little value those blog posts may have had, but you're also preventing crawl and index issues by telling the search engine bots where that content is now (assuming it was indexed in the first place).
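As a rough sketch, if the site runs on Apache, the 301 redirects described above might look like this in an `.htaccess` file. The URLs here are made-up examples, not ones from this thread; each retired post points at the most similar live page:

```apache
# 301-redirect retired posts to the most similar remaining page (example URLs)
Redirect 301 /blog/2010/old-thin-post/ /blog/better-post-on-same-topic/
Redirect 301 /blog/2011/outdated-announcement/ /about/authors/jane-doe/
```

With 450+ posts, a rewrite rule or a generated list of `Redirect 301` lines beats editing them by hand, but the one-old-URL-to-one-relevant-page mapping is the part that matters for salvaging link value.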
This is an incredibly long content process and could take you months, especially if there's a lot of content that's good enough to be re-written, expanded upon, and added to. However, making that content relevant and useful is the best thing you can do. It's a long process, but if your best content writers need a project, this would be it.
To recap: 1) Go through each blog post one by one and determine what's good enough to edit and what's "too far gone." 2) Re-write, edit, add to them (content and images/videos), and re-promote them socially and to appropriate audiences and communities. 3) For the posts that were "too far gone," 301 redirect them to the most relevant posts and pages that remain live.
Again, I can say firsthand that this is a LONG process. I've done it for a client in the past. However, the return was well worth the work. And by doing it this way and not just deleting posts, you're preventing yourself a lot of crawl/index headaches with the search engines.
-
we have A LOT of old blogs that were not well written and honestly are not overly relevant.
Wow... it is great to hear of someone looking at their content and deciding they can kick it up a notch. I have seen a lot of people who would never, ever pull the kill switch on an old blog post. In fact, they are still out there hiring people to write stuff that is really crappy.
If this was my site, I would first check to be sure that I don't have a Penguin or unnatural links problem. If you think you are OK there, here is what I would do.
-
I would look at those blog posts to see if any of them have traffic, link, or revenue value. Value is defined as... A) traffic from any search engine or other quality source, B) valuable links, C) viewing by current website visitors, D) visitors who enter through those pages and generate income through ads or purchases.
-
If any of them pass the value test above then I would improve that page. I would put a nice amount of work into that page.
-
Next I would look at each of those blog posts and see if any have content value. That means an idea that could be developed into valuable content... or valuable content that could be simply rewritten to a higher standard. Valuable content is defined as a topic that might pull traffic from search or be consumed by current site visitors.
-
If any pass the valuable content test then I would improve them. I would make them kickass.
-
After you have done the above, I would pull the plug on everything else.... or if I was feeling charitable I would offer them to a competitor.
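The triage above could be sketched as a simple decision function. This is a hypothetical illustration: the field names and the "any value at all" thresholds are assumptions, and in practice the numbers would come from your analytics and link-tool exports, not from code like this.

```python
# Hypothetical sketch of the value test described above.
# Field names and thresholds are illustrative assumptions.

def triage_post(post):
    """Return an action for one old blog post based on its metrics."""
    has_value = (
        post.get("organic_visits", 0) > 0      # A) search/quality-source traffic
        or post.get("linking_domains", 0) > 0  # B) valuable links
        or post.get("internal_views", 0) > 0   # C) viewed by current visitors
        or post.get("revenue", 0.0) > 0.0      # D) entrances that earn income
    )
    if has_value:
        return "improve"      # passes the value test: put real work into it
    if post.get("topic_has_potential", False):
        return "rewrite"      # content value: rewrite it to a higher standard
    return "remove"           # pull the plug (redirect or retire the URL)

posts = [
    {"url": "/blog/old-post-1", "organic_visits": 12},
    {"url": "/blog/old-post-2", "topic_has_potential": True},
    {"url": "/blog/old-post-3"},
]
plan = {p["url"]: triage_post(p) for p in posts}
```

The point of writing it out this way is that the value test runs first: a post with any traffic, link, or revenue value gets improved even if its topic seems weak, and only the posts that fail both tests get removed.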

Salutes to you for having the courage to clean some slates.
-
-
I would run them through Copyscape to check for plagiarism/duplicate content issues. After that, I would check for referral traffic. If there are some pages that draw enough traffic, you might not want to remove them. Finally, round it off with a page level link audit. Majestic can give you a pretty good idea of where they stand.
The pages that don't make the cut should be set to throw 410 status codes. If you still don't like the content on pages with good links and/or referral traffic, 301 those to better content on the same subject.
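On Apache, both outcomes described above can be expressed with `mod_alias` directives. The URLs below are made-up examples; a 410 tells search engines the page is gone for good, while the 301 passes the old page's link equity to a better page:

```apache
# 410 Gone for posts with no traffic or link value (example URL)
Redirect gone /blog/2010/no-value-post/

# 301 for posts with good links/referrals but weak content (example URLs)
Redirect 301 /blog/2011/linked-but-thin/ /blog/improved-guide-same-topic/
```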