Subdomain 403 error
-
Hi Everyone,
A crawler from our SEO tool detects a 403 error on links from our main domain to a couple of subdomains. However, these subdomains are perfectly accessible. What could be the problem? Is this error caused by the server, the crawl bot, or something else?
I would love to hear your thoughts.
Jens
-
No, not at all.
-
Hi Roman,
Thanks for your answer!
It's a commercial tool.
I checked the robots.txt file and .htaccess, but didn't see any problems.
As you say, the problem may simply be caused by the user-agent. If so, it won't affect my SEO efforts, right?
-
Which tool are you using? Is it a custom tool, or a commercial tool such as Screaming Frog?
-
4xx codes are client errors: something is wrong with the request itself, so whatever is happening, the issue is typically on the client side.
403 means Forbidden. In your case, the first place to check is your .htaccess and robots.txt files; make sure they are not blocking any crawler, or at least not the crawler your tools use.
For example, some hosting providers block every crawler that isn't Google or Bing to save resources, so it's common for Roger (the Moz crawler) to have trouble crawling pages that are blocked on the server side. Moz, Ahrefs, and Semrush all run into this kind of problem. In summary:
- Make sure your .htaccess and robots.txt are not blocking your crawler (an example of the kind of rule to look for follows this list)
- Make sure your hosting provider is not blocking your crawler
- If none of the above works, try modifying the user-agent of your tool
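For illustration, a server-side rule like the following in an Apache .htaccess file would produce exactly this symptom: the pages open fine in a browser but return 403 Forbidden to SEO crawlers. This is a hedged sketch rather than a diagnosis; it assumes Apache with mod_rewrite, and the bot names are only examples. If you find something similar, removing it or whitelisting your tool's user-agent should clear the error.

# Hypothetical .htaccess rule that returns 403 to SEO crawlers by user-agent
# (bot names are illustrative; assumes Apache with mod_rewrite enabled)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (rogerbot|AhrefsBot|SemrushBot) [NC]
RewriteRule .* - [F,L]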
Hope this info helps you with your problem
Related Questions
-
Unsolved Google Search Console Still Reporting Errors After Fixes
Hello, I'm working on a website that was too bloated with content. We deleted many pages and set up redirects to newer pages. We also resolved an unreasonable number of 400 errors on the site, and I removed several ancient sitemaps that listed content deleted years ago, which Google was still crawling. According to Moz and Screaming Frog, these errors have been resolved. We've submitted the fixes for validation in GSC, but the validation repeatedly fails. What could be going on here? How can we resolve these errors in GSC?
Technical SEO | tif-swedensky
-
Migrating to new subdomain with new site and new content.
Our marketing department has decided that a new site with new content is needed to launch new products and support our existing ones. We cannot use the same subdomain (www = old subdomain, ww1 = new subdomain), as there is a technical clash between the Windows server currently in use and the LAMP stack required to run the new WordPress-based CMS and site. We also have an aging piece of SaaS software on the www domain, which makes moving it to its own subdomain far too risky. 301s have been floated as a way of managing the transition. I'm not too keen on that idea, due to the combined effect of a new subdomain and new content and the SEO impact it might have. I've suggested uploading the new site to the new subdomain while leaving the old site in place, then gradually migrating sections over before turning parts of the old site off and using a 301 at that point to finalise the move. The old site would inform users that there is a new version and would then convert them to the new site (along with a cookie to auto-redirect them in future), while still leaving the old content in place for existing search traffic, bookmarks and visitors via static URLs. Before turning off sections on the old site we would create rel canonicals pointing to the new pages based on a mapped set of URLs (this in itself concerns me, as the rel canonical would essentially be linking to different content). I would be grateful for any advice on whether this strategy is flawed or whether another strategy might be more suitable.
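A minimal sketch of the section-by-section 301 described above. This is hypothetical: website.com and the /products path are placeholders, and the rule is written in Apache mod_rewrite syntax for illustration, even though the existing www host is a Windows server, where IIS URL Rewrite would express the same rule.

# Hypothetical: once one section is live on the new subdomain,
# 301 only that section and leave everything else on www
RewriteEngine On
RewriteRule ^products/(.*)$ https://ww1.website.com/products/$1 [R=301,L]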
Technical SEO | Rezza
-
Are subdomains a good SEO strategy for a multistore e-commerce?
Hi there, I'm wondering what the best strategy is for working with multi-stores on Magento: to use or not to use subdomains? Suppose we have www.website.com and configure it as a multistore. The URL base will not have the store ID in it, so it will not be www.website.com/store1 and www.website.com/store2; it will simply rely on the user session. So if we have two categories, one for each store, they will be accessed via:
www.website.com/category1 (for store 1)
www.website.com/category2 (for store 2)
The homepage will always be set on www.website.com, so we would have a single page serving several "home pages" (depending on the user session / the store being accessed). I guess this is not a good option if we want to rank for different keywords for each store. So I was wondering if it is a good solution to set up:
store1.website.com
store2.website.com
This way we have two "home pages", each one able to rank. Does that make sense? Is it good or bad for SEO? Another option I was considering was:
www.website.com (for store 1)
store2.website.com (for store 2)
store3.website.com (for store 3)
www.website.com/blog (for the blog)
Can this work? Good or bad for SEO? Best regards
Technical SEO | qgairsoft
-
:443 - 404 error
I get strange :443 errors in my 404 monitor in WordPress:
https://www.compleetverkleed.nl:443/hoed-al-capone-panter-8713647758068-2/
https://www.compleetverkleed.nl:443/cart/www.compleetverkleed.nl/feestkleding
https://www.compleetverkleed.nl:443/maskers/
I have no idea where these come from :S
Technical SEO | Happy-SEO
-
Is there any benefit in using a subdomain redirected to a single page?
For example, if we have the domain www.bobshardware.com.au and we set up the subdomains sydneysupplies.bobshardware.com.au and brisbanescrewdrivers.bobshardware.com.au and used those in ad campaigns, with each subdomain redirected back to a single page such as bobshardware.com.au/brisbane-screw-drivers, is there a benefit? Cheers
Technical SEO | techdesign
-
429 Errors?
I have over 500,000 429 errors in webmaster tools. Do I need to be concerned about these errors?
Technical SEO | TheKrazyCouponLady
-
404 crawl errors from "tel:" link?
I am seeing thousands of 404 errors. Each of the URLs looks like this: abc.com/abc123/tel:1231231234. Everything is normal about the URL except the "/tel:1231231234" part: the URLs are bad with the tel: extension and fine without it. The only place I can find this character string is in the following code, which appears on each page and is used for iPhones and the like. What are we doing wrong? Code: Phone: <a href="tel:1231231234">(123) 123-1234</a>
Technical SEO | EugeneF
-
Subdomain Removal in Robots.txt with Conditional Logic??
I would like to see if there is a way to add conditional logic to the robots.txt file, so that when we push from DEV to PRODUCTION and the robots.txt file is pushed along with it, we don't have to remember to NOT push the robots.txt file OR edit it when it goes live. My specific situation is this: I have www.website.com, dev.website.com and new.website.com, and somehow Google has indexed DEV.website.com and NEW.website.com. I'd like these removed from Google's index, as they are causing duplicate content. Should I:
a) add two new GWT entries for DEV.website.com and NEW.website.com and VERIFY ownership? If I do this, then when the files are pushed to LIVE, won't they contain the VERIFY META CODE for the DEV version even though it's now LIVE? (hope that makes sense)
b) write a robots.txt file that specifies "DISALLOW: DEV.website.com/"? Is that possible? I have only seen examples of DISALLOW with a "/" at the beginning...
Hope this makes sense; I could really use the help! I'm on a Windows Server 2008 box running ColdFusion websites.
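A note on option b: robots.txt rules are relative to the host that serves the file, so a Disallow line cannot name another subdomain; DEV.website.com and NEW.website.com would each need to serve their own robots.txt. A minimal sketch of what those hosts would return at /robots.txt to block everything on that host:

# Hypothetical robots.txt served only on the DEV/NEW hosts;
# rules are host-relative, so this file must not be pushed to www
User-agent: *
Disallow: /

One way to get the conditional behaviour the question asks for is to serve robots.txt dynamically and branch on the incoming hostname, so the same code can be pushed to every environment.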
Technical SEO | ErnieB