Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
What countries does Google crawl from? Is it only US or do they crawl from Europe and Asia, etc.?
-
Where does Google crawl the web from? Is it in the US only, or do they do it from a European base too? The reason for asking is for GeoIP redirection. For example, if a website is using GeoIP redirection to redirect all US traffic to a .com site and all EU traffic to a .co.uk site, will Google ever see the .co.uk site?
-
Hi Keith,
In my experience Google mainly crawls from the US.
You're quite right: GeoIP redirection can cause major indexation issues. If you're redirecting everything from the US to the .com, Googlebot (which crawls from US IPs) can't see the .co.uk site.
As such, I'm not a fan. Rather than implementing a hard redirect, I prefer Amazon's approach: if you visit the amazon.com site from a UK IP, you get a JavaScript overlay that invites you to visit the .co.uk version of the site instead. They let the user decide which site to view rather than actually redirecting them.
This is a nice solution as it ensures that the search bots can crawl both versions of the site, and rankings aren't endangered.
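The "suggest, don't redirect" logic Hannah describes can be sketched roughly as follows. This is a minimal illustration, not Amazon's actual implementation; the country-to-site mapping and URLs are placeholders:

```python
# Hypothetical country-to-preferred-site mapping (illustrative only).
COUNTRY_SITES = {
    "US": "https://www.example.com/",
    "GB": "https://www.example.co.uk/",
}

def suggestion_for(visitor_country, current_site):
    """Return a URL to suggest in an overlay, or None if the visitor is
    already on the best-matching site (or we have no match for them).
    Crucially, we never redirect: every visitor, bot or human, still
    receives the full page content of whichever site they requested."""
    preferred = COUNTRY_SITES.get(visitor_country)
    if preferred is None or preferred == current_site:
        return None  # nothing to suggest; serve the page as-is
    return preferred  # front-end shows an overlay linking here

# A UK visitor on the .com gets a suggestion; a US visitor does not.
print(suggestion_for("GB", "https://www.example.com/"))
print(suggestion_for("US", "https://www.example.com/"))
```

Because the decision only ever produces a suggestion (rendered client-side) and never a redirect response, search bots crawling from any country can still fetch both versions of the site.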
I hope this helps,
Hannah
-
Keith, I am having the same issue and I agree with you. The fact that Google has data centres in Europe does not necessarily mean it crawls from there. I also want to set up Europe/US GeoIP redirection. It would be great to get Mozers' opinions on this; hopefully this post gets freshly reviewed.
-
Interesting question - I'd quite like to know what happens here too.
Matt Cutts recently posted a video on cloaking (http://youtu.be/QHtnfOgp65Q) saying that as long as you don't do anything 'special' for Googlebot you're OK. But if you are redirecting IPs based on location and you don't want to prevent Googlebot from accessing your site, then effectively you have to do something 'special' for Googlebot: you're doing one thing for everyone else and a different thing for Googlebot.
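One common way to exempt the crawler without relying on the user-agent string (which anyone can fake) is to verify Googlebot the way Google documents it: a reverse DNS lookup on the requesting IP, then a forward DNS lookup on the resulting hostname to confirm it resolves back to the same IP. A minimal sketch; the helper names are mine, and the DNS functions are injectable so the logic can be tested offline:

```python
import socket

def is_verified_googlebot(ip,
                          reverse_dns=socket.gethostbyaddr,
                          forward_dns=socket.gethostbyname):
    """Googlebot verification as Google documents it: the reverse-DNS
    hostname must end in googlebot.com or google.com, and forward DNS
    of that hostname must resolve back to the original IP."""
    try:
        host = reverse_dns(ip)[0]  # gethostbyaddr -> (hostname, aliases, ips)
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return forward_dns(host) == ip
    except OSError:
        return False
```

A GeoIP redirect handler could then skip the redirect for verified crawler IPs. Whether that counts as acceptable "special-casing" is exactly the grey area this post raises, so treat this as a technical sketch, not a policy endorsement.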
-
Hi,
The data centre location info is interesting, but it isn't what I was looking for. I need to know whether Google crawls the web from any IPs other than US ones.
To clear up the second question, let me be more specific:
Let's say Google is crawling a .co.uk site from a US IP address. The site is using GeoIP redirection to redirect all US traffic to the .com site. Therefore, when Google attempts to crawl the .co.uk site from a US IP address, it will be redirected to the .com site and never see the .co.uk site. Can anyone confirm that this is what happens?
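The scenario in the question can be made concrete with a tiny sketch of such a GeoIP rule (the function and URLs are illustrative, not any specific site's code):

```python
def handle_request(url, visitor_country):
    """Hypothetical GeoIP rule: all US traffic to .co.uk URLs is
    301-redirected to the equivalent .com URL; everyone else is served
    the page they asked for."""
    if visitor_country == "US" and ".co.uk" in url:
        return ("301", url.replace(".co.uk", ".com"))  # redirect response
    return ("200", url)  # serve the requested page

# A crawler on a US IP requesting the .co.uk page is bounced to the .com,
# so it never downloads the .co.uk content; a UK visitor gets it normally.
print(handle_request("https://www.example.co.uk/page", "US"))
print(handle_request("https://www.example.co.uk/page", "GB"))
```

If Googlebot only ever requests from US IPs, the first branch is the only one it ever sees, which is exactly the indexation problem being described.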
-
I found an article from 2008 that shows Google data centre locations: http://bit.ly/mONhf9
Your other question is a bit confusing. Why wouldn't Google see the UK site?
Related Questions
-
Advice on the right way to block country-specific users without blocking Googlebot, and without being seen to be cloaking. Help please!
Hi, I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in certain countries, where the games and content are legally allowed.
International SEO | MarkCanning
Example: the games are not allowed in the USA, but they are allowed in Canada.
Present situation: when a user from the USA visits the site, they are directed to a restricted-location page with the following message:
RESTRICTED LOCATION
Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!
Because USA visitors are blocked, Google, which primarily (but not always) crawls from the USA, is also blocked, so the company's webpages are not being crawled and indexed.
Objective / what we want to achieve: the website will have multiple region and language locations. Some of these will exist as standalone websites and others will exist as folders on the domain. Examples below:
domain.com/en-ca [English Canada]
domain.com/fr-ca [French Canada]
domain.com/es-mx [Spanish Mexico]
domain.com/pt-br [Portuguese Brazil]
domain.co.in/hi [Hindi India]
If a user from the USA or another restricted location tries to access our site, they should not have access and should instead see a restricted-access message.
However, we still want Google to be able to access, crawl and index our pages. Can I ask how we do this without getting penalised for cloaking, etc.? Would this approach be OK? (Please see below.) We continue as in the present situation, showing visitors from the USA a restricted message.
However, rather than redirecting these visitors to a restricted-location page, we just black out the page and show them a floating message as if it were a modal window, while Googlebot would be allowed to visit and crawl the website.
I have also read that it would be good to put paywall schema on each webpage to let Google know that we are not cloaking and that it's a restricted paid page. All public pages are accessible, but only if the visitor is from a location that is not restricted.
Any feedback and direction that can be given would be greatly appreciated, as I am new to this angle of SEO. Sincere thanks.
How to avoid duplication across multiple country domains
Here's the scenario: I have a client currently running one Shopify site (AU). They want to launch three more country domains (US, UK and EU), each as a standalone site, primarily so that customers can purchase in their local currency, which is not possible from a single Shopify site. The inventory is all from the same source, and the product descriptions will all be the same as well. Question: how do we avoid content duplication (i.e. how will canonical tags work in this scenario)?
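For a scenario like this, hreflang annotations (rather than cross-domain canonicals) are the usual tool: each country site keeps a self-referencing canonical and declares the others as alternates, which tells Google the duplication is deliberate localisation rather than copied content. A sketch with placeholder URLs; note the "EU" site would need one or more concrete language-country codes (e.g. en-ie, de-de), since hreflang has no pan-EU value:

```html
<!-- On every version of a given page, list all alternates plus itself: -->
<link rel="alternate" hreflang="en-au" href="https://example-shop.com.au/product" />
<link rel="alternate" hreflang="en-us" href="https://example-shop.com/product" />
<link rel="alternate" hreflang="en-gb" href="https://example-shop.co.uk/product" />
<link rel="alternate" hreflang="x-default" href="https://example-shop.com/product" />
<!-- Plus a self-referencing canonical on each version, e.g. on the AU site: -->
<link rel="canonical" href="https://example-shop.com.au/product" />
```

Each domain canonicalises to itself, so no version is suppressed; the hreflang cluster maps each market to its own site.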
International SEO | muzzmoz
-
Targeting Countries in the Middle East
Hi guys, I have a client based in the Middle East using a generic top-level domain (.com), and they want to target multiple countries in the GCC (UAE, Saudi Arabia, Kuwait, Qatar, etc.). I'm thinking that using the hreflang tag would be the best solution here; however, the pages will mostly have exactly the same content. There will only be slight changes on some pages in terms of using localised title tags [client service] followed by [targeted country], h1s and meta descriptions. Is this the correct approach? And if so, should it be implemented site-wide, or can it be implemented on selected pages only? The site will be in English only.
International SEO | Jbeetle
-
Country subfolders showing as sitelinks in Google, country targeting for home page no longer working
Hi there, just wondering if you can help. Our site has three regional versions (the general .com, /ie/ for Ireland and /gb/ for the UK), each submitted to Google Webmaster Tools as a separate site, with hreflang tags in the head section of all pages. Google was showing the correct results for a few weeks, but I resubmitted the home pages with slight text changes last week and something strange happened, though it may have been coincidental timing. When we search for the brand name in google.ie or google.co.uk, the .com now shows as the main site, while the sitelinks still show the correct country versions. However, the country subdirectories are now appearing as sitelinks, which is likely causing the problem. I have demoted these in GWT, but I'm unsure whether that will work, and sitelink demotion seems to take a while. Has anyone had anything similar happen? I thought perhaps it was a markup issue breaking the head section, so that Google can no longer see the hreflangs pointing to each other as alternates, but I checked the source code in the W3 validator and it doesn't show any errors. Anyway, any help would be much appreciated, and thanks to anyone who gets back; it's a tricky type of issue to troubleshoot. Thanks, Ro
International SEO | romh
-
How can I change the currency Google lists my products with in the SERP?
E.g. this product, http://www.absoluteautomation.ca/fgd400-sensaphone400-p/fgd400.htm, shows up as USD in the SERP. (In the US it just won't show a currency; in Canada it will show USD in the SERP.) My pricing is all in CAD; how can I tell Google this? (It knows the pricing is CAD in my Google Product Listings / Merchant Center.) Thanks!
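One way to state the currency explicitly on the page itself is schema.org Product markup with an Offer whose `priceCurrency` is CAD. The name and price below are illustrative, not taken from the actual product page:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Sensaphone 400 Monitoring System",
  "offers": {
    "@type": "Offer",
    "price": "449.00",
    "priceCurrency": "CAD",
    "availability": "https://schema.org/InStock"
  }
}
```

Embedded in a `<script type="application/ld+json">` block, this gives Google an unambiguous currency for the on-page price, independent of the Merchant Center feed.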
International SEO | absoauto
-
E-Commerce - Country Domains versus 1 Domain?
Hi, I just wanted to get some feedback and opinions on the idea of segmenting our ecommerce site's languages under various country domains: .jp for Japanese, .it for Italian, etc. I understand the geolocation benefits this could bring, but on the flip side it would mean we would need to grow our domain authority and link building per country domain, which is quite a bit of work. Has anyone ever considered or implemented this, and any thoughts? Thanks!
International SEO | bjs2010
-
Blocking domestic Googles in robots.txt
Hey, I want to block Google.co.uk from crawling a site but want Google.de to crawl it. I know how to configure robots.txt to block Google and other engines; is there a fix to block certain domestic crawlers? Any ideas? Thanks, B
International SEO | Bush_JSM
-
Non US site pages indexed in US Google search
Hi, we are having a global, site-wide issue with non-US pages being indexed by Google and served up in US search results. Conversely, we have US English pages showing in the Japan Google search results. We currently use IP detection to direct users to the correct regional site, but it isn't effective if users enter through an incorrect regional page. At the top of each of our pages we have a drop-down menu to allow users to manually select their preferred region. Is it possible that Googlebot is crawling these links and indexing these other regional pages as US pages, without detecting the region, due to our URL structure? Below are examples of two of our URLs for reference, one from Canada, the other from the US:
/ca/en/prod4130078/2500058/catalog50008/
/us/en/prod4130078/2500058/catalog20038/
If that is, in fact, what is happening, would setting the links within the drop-down to nofollow address the problem? Thank you. Angie
International SEO | Corel