Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Best and easiest Google Depersonalization method
-
Hello,
Moz hasn't written anything about depersonalization for years. This article has methods, but I don't know if they are valid anymore.
What's an easy, effective way to depersonalize Google search these days? I would just log out of Google, but that shows different ranking results from Moz's rank tracker for one of our main keywords, so I don't know whether that method is reliable.
Thanks
-
Thanks Rand, really appreciate it!
-
Hi Rand,
Thanks for jumping in and helping us all out. Your response is much appreciated.
Regards,
Vijay
-
I'm surprised at how well this still works, but it does:
1. Use an incognito browser window to remove account personalization.
2. Use a query string like this: https://google.co.nz/search?q=your+keyword+terms&gl=us
With 2) above, you're removing the geographic bias of any particular region/IP address by searching in Google New Zealand, then re-geo-locating the search to the US. This will give you non-geo-biased results.
If you want to see how specific results look from a particular region, there are two semi-decent options:
A) Use Google's Ad Preview Tool: https://adwords.google.com/apt/anon/AdPreview?__u=1000000000&__c=1000000000
B) Use the &near parameter, e.g. https://google.co.nz/search?q=your+keyword+terms&gl=us&near=seattle+wa
-
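As a rough sketch (not an official tool), the query-string trick above can be generated programmatically with the standard library. The gl, near, and pws parameters are the ones discussed in this thread; the function name is made up here, and Google may change how it honors these parameters:

```python
from urllib.parse import urlencode

def depersonalized_serp_url(query, gl="us", near=None,
                            base="https://google.co.nz/search"):
    """Build a search URL that queries a non-local Google property
    (removing regional IP bias) and re-geo-locates the search with gl."""
    params = {"q": query, "gl": gl, "pws": 0}  # pws=0 turns off personalized results
    if near:
        params["near"] = near  # e.g. "seattle wa" for a city-level view
    return f"{base}?{urlencode(params)}"

print(depersonalized_serp_url("your keyword terms"))
# -> https://google.co.nz/search?q=your+keyword+terms&gl=us&pws=0
```

Pasting the printed URL into an incognito window combines both steps: no account personalization from the browser, no geo bias from the query string.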
Yes, this is one of many factors in depersonalization, and there may be more hidden factors we have yet to discover.
I have done a lot of research on this. I use a dedicated PC with a VPN to check keyword SERP ranks for my clients; since they are from many different countries and have different target audiences, we try to replicate the results for each scenario.
I hope this helps.
-
So am I correct that logging out and adding &pws=0 is not enough?
-
Hi There,
In addition to the methods suggested for depersonalizing results, there are a few more factors. You may also like to read a blog post I wrote on my website, Impact of Personalized Search Results.
An incognito window doesn't mean the history of your previous browsing is gone; you should also clear your browsing history and cookies.
Use a VPN or proxy to get results from different locations and countries. This gives you the best idea of your SERP status in each country.
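As a minimal sketch of the proxy idea, assuming hypothetical proxy endpoints (substitute your own VPN or proxy exits per country), Python's standard library can build a cookie-free opener for each location:

```python
import urllib.request

# Hypothetical proxy endpoints - replace with your own per-country exits.
PROXIES_BY_COUNTRY = {
    "de": "http://de.proxy.example:8080",
    "br": "http://br.proxy.example:8080",
}

def opener_for_country(country):
    """Return a fresh opener with no cookie jar and a country-specific
    proxy, so each rank check starts clean and appears local."""
    proxy = PROXIES_BY_COUNTRY[country]
    handler = urllib.request.ProxyHandler({"http": proxy, "https": proxy})
    return urllib.request.build_opener(handler)

opener = opener_for_country("de")
# opener.open("https://google.de/search?q=your+keyword&pws=0")  # would route via the German proxy
```

Because a new opener is built per check and no cookies are shared, each query starts with no browsing history, which is the same effect as manually clearing cookies between checks.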
I hope this helps, please feel free to ask more questions by responding.
Regards,
Vijay