Using the disavow tool for 404s
-
Hey Community,
Got a question about the disavow tool for you. My site is getting thousands of 404 errors from old blog, coupon, and other long-forgotten sites linking to our old URL structure (which used underscores and ended in .jsp).
The webmasters of these sites aren't responding, or haven't updated their sites in ages, so their links still point to URLs that return 404 errors. If I disavow these domains and/or links, will that clear the 404 errors out of Google? I read the GWT help page on the disavow tool, but it didn't seem to answer this question.
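For reference, I was assuming the disavow file would look something like the sketch below (the domains are made up purely for illustration):
```text
# disavow.txt sketch -- example domains are made up for illustration
# One entry per line: either a single URL or a whole domain: directive
domain:old-coupon-site.example
domain:stale-blog.example
http://another-old-blog.example/our_old_page.jsp
```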
Feel free to ask any questions that may help you understand the issue more.
Thanks for your help,
-Reed
-
Hey Doug, had another question for you. The vast majority (90% of our 18,000+ errors) of our 404 errors are coming from .jsp URLs on our old website.
Of course, it's not ideal to update or redirect these manually, but we could possibly write a script to handle them automatically. Would it be beneficial to add a rule for these .jsp URLs to our robots.txt file?
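Something like this is what I had in mind (just a sketch; I believe the * and $ wildcards are supported by Googlebot and Bingbot, but not necessarily by every crawler):
```text
# robots.txt sketch -- block crawling of the legacy .jsp URLs
User-agent: *
Disallow: /*.jsp$
```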
-
Thanks Doug, really helpful answer.
I am getting thousands of 404s, but when I dig into them, the majority of the 404 URLs can't be found in any of the "linked from" examples GWT gives me.
I think 301 redirects are the best option, like you said, along with a good 404 page.
Thanks,
-Reed
-
The disavow tool isn't going to "fix" these 404s.
404s aren't always a bad thing. The warnings in GWT are just there to make you aware that there's potentially a problem with your site. It doesn't mean there IS a problem.
Is there content on your site that visitors clicking on these links should be arriving at? If so, you want to implement 301 redirects so that those visitors arrive on the most appropriate page.
If there's nothing relevant on the site any more, a 404 error is perfectly acceptable.
Of course, you want to make sure that your 404 page gives visitors the best chance/incentive to dig into the content on your site. Adding a nice obvious search box and/or links to your most popular content may be a good idea. If you're getting lots of visitors from a particular site, you can even tailor your 404 message depending on the referrer.
The drawback here is that links pointing at 404 error pages won't pass link equity. If there is value in the links, and you're happy that they're going to be seen as natural/authentic as far as Google is concerned, then you can always 301 redirect them.
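For example, if the new URLs happen to be the old paths minus the .jsp extension, a single pattern rule on an Apache server could handle them all. This is only a sketch: if the new URLs also swap the underscores for hyphens, you'll need a one-to-one redirect map instead of a single pattern.
```apache
# .htaccess sketch (Apache, mod_alias) -- 301 any legacy .jsp URL to the
# same path without the extension. Adjust the target to your real new URLs.
RedirectMatch 301 ^(.*)\.jsp$ $1
```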
Where you really should pay attention is where you have internal links on your site that are reporting 404s. These are under your control, and you really don't want to give your visitors a poor experience with lots of broken links on your site.
-
I wouldn't recommend using the disavow tool for this. The disavow tool is used to clean up spammy links that were not gained naturally.
A better solution is to use 301 redirects and redirect the 404'd pages to the new pages that work on your website. That way users will land where they should if they click the links, and Google will still give you juice from those links.
Here's a place to get started on how to do that: https://support.google.com/webmasters/answer/93633?hl=en
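And since your old URLs follow a predictable pattern (underscores, .jsp extension), you could generate the redirect rules in bulk with a short script rather than handling 18,000+ URLs by hand. Here's a rough sketch in Python; it assumes the new URL just swaps underscores for hyphens and drops .jsp, so adjust the mapping to whatever your new structure actually is:
```python
# Rough sketch: bulk-generate 301 redirect rules for legacy .jsp URLs.
# Assumes the new URL swaps underscores for hyphens and drops ".jsp" --
# change new_path() to match your real URL mapping.

def new_path(old_path: str) -> str:
    """Map an old underscore .jsp path to its assumed new equivalent."""
    if old_path.endswith(".jsp"):
        old_path = old_path[:-len(".jsp")]   # drop the legacy extension
    return old_path.replace("_", "-")        # swap underscores for hyphens

def redirect_rule(old_path: str) -> str:
    """Emit an Apache Redirect line for one legacy URL."""
    return f"Redirect 301 {old_path} {new_path(old_path)}"

if __name__ == "__main__":
    # Example input: the 404 URLs exported from GWT, one path per line.
    legacy_paths = [
        "/blog/old_coupon_post.jsp",
        "/deals/spring_sale_2013.jsp",
    ]
    for path in legacy_paths:
        print(redirect_rule(path))
```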