Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Should I delete 100s of weak posts from my website?
-
I run this website: http://knowledgeweighsnothing.com/
It was initially built to get traffic from Facebook. The vast majority of the 1300+ posts are shorter curation-style posts: basically, I would find excellent sources of information, do a short post highlighting the information, link to the original source, and then post to FB - and hey presto, 1000s of visitors coming through my website. Traffic from FB was so amazing at the time that, really stupidly, these posts were written with no regard for search engine rankings.
When Facebook reach etc dropped right off, I started writing full original content posts to gain more traffic from search engines. I am starting to get more and more traffic now from Google etc, but there's still lots to improve.
I am concerned that the shortest/weakest posts on the website are holding things back to some degree. I am considering going through the website and deleting the very weakest older posts based on their quality, backlinks, and PA. This will probably run into 100s of posts. Is it detrimental to delete so many weak posts from a website?
Any and all advice on how to proceed would be greatly received.
-
This is a very valid question, in my opinion, and one that I have thought about a lot. I have even done it before, on a UGC section of a site where there were about 30k empty questions, many of which were a reputation nightmare for the site. We used these parameters:
- Over a year old
- Has not received an organic visit in the past year
We 410'd all of them, as they had no inbound links and we just wanted them out of the index. I believe they were later 301'd, and that section of the site has since been killed off.
Directly after the pages were removed, we saw a lift of ~20% in organic traffic to that section of the site. That maintained, and over time that section of the site started getting more visits from organic as well.
I saw it as a win and went through with it because:
- They were low quality
- They already didn't receive traffic
- By removing them, we'd get more of the pages we actually wanted crawled, crawled.
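The two criteria we used can be sketched as a simple filter. This is purely illustrative - the field names and sample data are assumptions, not our actual tooling, which pulled from analytics exports:

```python
from datetime import date, timedelta

def prune_candidates(posts, today):
    """Return posts that are over a year old AND had no organic
    visit in the past year -- candidates for a 410."""
    cutoff = today - timedelta(days=365)
    return [
        p for p in posts
        if p["published"] < cutoff and p["last_organic_visit"] < cutoff
    ]

# Hypothetical sample data for illustration.
posts = [
    {"url": "/old-empty-question/", "published": date(2013, 1, 5),
     "last_organic_visit": date(2013, 2, 1)},
    {"url": "/popular-guide/", "published": date(2012, 6, 1),
     "last_organic_visit": date(2015, 7, 1)},
]

print([p["url"] for p in prune_candidates(posts, today=date(2015, 7, 15))])
# → ['/old-empty-question/']
```

The second post is old but still earning organic visits, so it survives the filter - matching the point below about testing improvements on pages that still get traffic.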
I think Gary's answer of "create more high quality content" is too simplistic. Yes, keep moving forward in the direction you are, but if you have the time or can hire someone else to do it, and those pages are not getting traffic, then I'd say remove them. If they are getting traffic, maybe do a test of going back and making them high quality to see if they drive more traffic.
Good luck!
-
Too many people are going to gloss over the "In general" part of what Gary is saying.
Things not addressed in that thread:
- If a URL isn't performing for you but has a few good backlinks, you're probably still better off 301ing the page to better content to lend it additional strength.
- The value of consistency across the site; wildly uneven content can undermine your brand.
- Consolidating information to provide a single authoritative page rather than multiple thin and weak pages.
- The pointlessness of keeping non-performing pages when you don't have the resources to maintain them.
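For instance, the first point versus a plain removal might translate into server rules like this hypothetical Apache sketch (the paths are invented for illustration):

```apache
# Hypothetical .htaccess sketch (Apache mod_alias); paths are invented.

# Thin page WITH good backlinks: 301 it to the stronger,
# consolidated page so the link equity isn't wasted.
Redirect 301 /old-thin-post/ /consolidated-guide/

# Thin page with no links and no traffic: "gone" returns a 410,
# telling crawlers the page is removed permanently.
Redirect gone /abandoned-post/
```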
-
Haha I read this question earlier, saw the post come across feedly and knew what I needed to do with it. Just a matter of minutes.
You're right though - I probably would've said remove earlier as well. It's a toss-up, but usually when they clarify, I try to follow. (Sometimes they talk nonsense of course, but you just have to filter that out.)
-
Just pipped me to it

-
Hi experts.
I was reading a very timely article on this same issue today from Barry Schwartz over at Search Engine Roundtable. He has been following a conversation with Gary Illyes at Google, who apparently does not recommend removing content from a site to help you recover from a Panda issue, but rather recommends increasing the number of higher-quality pages.
If you are continuing to get more traffic by adding your new larger higher quality articles, I would simply continue in the same vein. There is no reason why you cannot still continue to share your content on social platforms too.
In the past I may have suggested removing some thin/outdated content and repointing it to a newer, more relevant piece, but in light of this article I may now start to think a tad differently. Hopefully some of the other Mozzers might have more thoughts on Barry's post too.
Here is the article fresh off the press today - https://www.seroundtable.com/google-panda-fix-content-21006.html
-
Google's Gary Illyes basically just answered this on Twitter: https://www.seroundtable.com/google-panda-fix-content-21006.html
"We don't recommend removing content in general for Panda, rather add more highQ stuff"
So rather than spend a lot of time on old work, move forward and improve. If there's terrible stuff, I'd of course remove it. But if it's just not super-high quality, I would do as Gary says in this instance and work on new things.
Truthfully, getting Google to recrawl content that's a year, or two, or five old can be tough. If they don't recrawl it, you don't even get the benefit until they do - if there is a benefit at all. Moving forward seems to make more sense to me.
Related Questions
-
Spammy page with canonical reference to my website
A potentially spammy website http://www.rofof.com/ has included a rel canonical tag pointing to my website. They've included the tag on thousands of pages on their website. Furthermore, http://www.rofof.com/ appears to have backlinks from thousands of other low-value domains. For example, www.kazamiza.com/vb/kazamiza242122/, along with thousands of other pages on thousands of other domains, all link to pages on rofof.com, and the pages they link to on rofof.com are all canonicalized to a page on my site. If Google does respect the canonical tag on rofof.com and treats it as part of my website, then the thousands of spammy links that point to rofof.com could be counted as pointing to my website. I'm trying to contact the owner of www.rofof.com hoping to have the canonical tag removed from their website. In the meantime, I've disavowed www.rofof.com, the site that has the canonical tag. Will that have any effect, though? Will disavowing eliminate the effect of a rel canonical tag on the disavowed domain, or does it only affect links on the disavowed website? If it only affects links, then should I attempt to disavow all the pages that link to rofof.com? Thanks for reading. I really appreciate any insight you folks can offer.
Intermediate & Advanced SEO | | brucepomeroy2 -
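For reference, a disavow file covering the situation described above could look like this sketch (syntax per Google's disavow tool documentation: `#` comments, `domain:` entries, or full URLs). Note that disavow is documented to apply to links; whether it neutralizes a cross-domain canonical is exactly the open question here:

```text
# Disavow file sketch for the scenario above.
# Disavow every link from the domain carrying the rogue canonical:
domain:rofof.com
# Individual spammy pages can also be listed as full URLs:
http://www.kazamiza.com/vb/kazamiza242122/
```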
Website Snippet Update in Search Console?
I have a company that I started working with that has an outdated and inaccurate snippet coming up. See the link below. They changed their name from DK on Pittsburgh Sports to just DK Pittsburgh Sports several years ago, but the snippet is still putting the old info, including outdated and incorrect description. I'm not seeing that title or description anywhere on the site or a schema plugin. How can we get it updated? I have updated titles, etc. for the home page, and done a Fetch to get re-indexed. Does Snippet have a different type of refresh that I can submit or edit? Thanks in advance https://g.co/kgs/qZAnAC
Intermediate & Advanced SEO | | jeremyskillings0 -
What is the difference between Multilingual and multiregional websites?
Hi all, I have been studying multilingual and multiregional websites. Soon we will expand the website languages to English and Spanish. The URLs will be like this:
http://example.com/pt-br
http://example.com/en-us
http://example.com/es-ar
Thereby, the tags will follow. Great! But my doubt is: for /es-ar/, will the indexing only target Spanish speakers in Argentina? What about the other countries that speak the same language, like Spain, Mexico, etc.? I don't know if it will be possible to develop a Spanish version especially for each region. Should I build a multiregional website or only a multilingual one? How does Google see this case? Thanks for any advice!!
Intermediate & Advanced SEO | | mobic1 -
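For a URL structure like the one described, hreflang annotations (the tags appear to have been stripped from the post above) would typically look something like this sketch - the `x-default` line is an optional fallback, and the exact values are assumptions:

```html
<!-- Hreflang sketch for the URL structure above; place in the <head>
     of every version, each page listing all alternates plus itself. -->
<link rel="alternate" hreflang="pt-br" href="http://example.com/pt-br/" />
<link rel="alternate" hreflang="en-us" href="http://example.com/en-us/" />
<link rel="alternate" hreflang="es-ar" href="http://example.com/es-ar/" />
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```

A bare language code like `hreflang="es"` (no region) is also valid and would target all Spanish speakers rather than Argentina specifically.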
Why does Moz recommend subdomains for language-specific websites?
In Moz's domain recommendations, they recommend subdirectories instead of subdomains (which agrees with my experience), but make an exception for language-specific websites: Since search engines keep different metrics for domains than they do subdomains, it is recommended that webmasters place link-worthy content like blogs in subfolders rather than subdomains. (i.e. www.example.com/blog/ rather than blog.example.com) The notable exceptions to this are language-specific websites. (i.e., en.example.com for the English version of the website). Why are language-specific websites excepted from this advice? Why are subdomains preferable for language-specific websites? Google's advice says subdirectories are fine for language-specific websites, and GSC allows geographic settings at the subdirectory level (which may or may not even be needed, since language-specific sites may not be geographic-specific), so I'm unsure why Moz would suggest using subdirectories in this case.
Intermediate & Advanced SEO | | AdamThompson0 -
Google indexed wrong pages of my website.
When I google site:www.ayurjeewan.com, after 8 pages, Google shows slider and shop pages, which I don't want to be indexed. How can I get rid of these pages?
Intermediate & Advanced SEO | | bondhoward0 -
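One common approach for pages like this (assuming they serve no search purpose) is a robots meta tag on those page templates; a sketch:

```html
<!-- On each slider/shop page template. The page must stay crawlable
     (i.e. NOT blocked in robots.txt) for Google to see this tag
     and drop the page from the index. -->
<meta name="robots" content="noindex, follow" />
```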
Credit Links on Client Websites
I know several people have asked this, but a lot of those threads were back in 2012, before many of the Google changes. My question is the same though: with all the changes to Google's algorithm, is it okay to put your link at the bottom of your clients' websites, like "Web Design by...", etc.? Part of the reason is to drive traffic, but also, if someone is actually interested in who designed the website, they will click it. But now, reading about how bad links can hurt you tremendously, it makes me second-guess whether this is okay. My gut feeling says no.
Intermediate & Advanced SEO | | blackrino0 -
Delete or not delete old/unanswered forum threads?
Hello everyone, here is another question for you: I have several forum postings on my websites that are pretty old, and so they are sort of "dead discussion" threads. Some of those old discussion threads are still getting good views (but no new postings), and so I presume they may be valuable for some users. But most of them are just answers to personal questions that I doubt anyone else could be interested in. Besides that, many postings are just single, unanswered questions still waiting for an answer, forgotten; they are just sitting there, and will probably stay unanswered for years... I don't think this can be good for SEO, am I right? How do you suggest approaching this kind of issue on forums or discussion sections of a website? I am eager to know your thoughts on all this. Thank you in advance! All the best, Fab.
Intermediate & Advanced SEO | | fablau0 -
SeoMoz Crawler Shuts Down The Website Completely
Recently I switched servers and was very happy with the outcome. However, every Friday my site shuts down (not very cool if you are getting 700 unique visitors per day). Naturally I was very worried and dug deep to see what was causing it. Unfortunately, the direct answer was that it was coming from "rogerbot". (See sample below.) Today (Aug 5) the same thing happened, but this time the site was down for about 7 hours, which did a lot of damage in terms of SEO. I am inclined to shut down the SeoMoz service if I can't resolve this immediately. I guess my question is: would there be a way to make sure the site doesn't go down or time out like that because of rogerbot? Please let me know if anyone has an answer for this. I use your service a lot and I really need it. Here is what caused it, from these error lines: 216.244.72.12 - - [29/Jul/2011:09:10:39 -0700] "GET /pregnancy/14-weeks-pregnant/ HTTP/1.1" 200 354 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)" 216.244.72.11 - - [29/Jul/2011:09:10:37 -0700] "GET /pregnancy/17-weeks-pregnant/ HTTP/1.1" 200 51582 "-" "Mozilla/5.0 (compatible; rogerBot/1.0; UrlCrawler; http://www.seomoz.org/dp/rogerbot)"
Intermediate & Advanced SEO | | Jury0
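Rogerbot obeys robots.txt, so one mitigation while the server issue is diagnosed is to throttle or block it there. A sketch (the delay value is an arbitrary example; check Moz's current crawler documentation for what rogerbot honors):

```text
# robots.txt sketch: slow down Moz's crawler.
User-agent: rogerbot
Crawl-delay: 10

# Or, to stop it entirely while investigating:
# User-agent: rogerbot
# Disallow: /
```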