Sudden Drop in Mobile Core Web Vitals
-

For some reason, after all URLs had previously been classified as Good, our mobile Core Web Vitals report suddenly shifted as shown above, and the change doesn't correspond with any site changes on our end.
Has anyone else experienced something similar, or have any idea what might have caused such a shift?
Curiously, I'm not seeing a drop in session duration, conversion rate, etc. for mobile traffic despite the seemingly sudden change.
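Since the Core Web Vitals report in Search Console is built from Chrome UX Report (CrUX) field data, a shift with no deploy on your end often reflects movement in the real-user data itself rather than anything you changed. As a first diagnostic step you can pull the current mobile field data straight from the CrUX API; here is a minimal sketch, assuming you have a CrUX API key (the key and origin below are placeholders):

```python
# Minimal sketch: pull mobile field data from the Chrome UX Report API,
# the same data the Search Console report is built from. The API key
# and origin are placeholders.
import json
import urllib.request

API_KEY = "YOUR_CRUX_API_KEY"       # placeholder
ORIGIN = "https://www.example.com"  # placeholder origin

endpoint = ("https://chromeuxreport.googleapis.com/v1/records:queryRecord"
            f"?key={API_KEY}")
payload = json.dumps({"origin": ORIGIN, "formFactor": "PHONE"}).encode()

req = urllib.request.Request(
    endpoint, data=payload, headers={"Content-Type": "application/json"})
with urllib.request.urlopen(req) as resp:
    metrics = json.load(resp)["record"]["metrics"]

# The report's Good/Needs Improvement/Poor buckets key off the 75th percentile.
for name, data in metrics.items():
    print(f"{name}: p75 = {data.get('percentiles', {}).get('p75')}")
```

If a p75 value sits close to a threshold (for example 2.5 s for LCP), a small real-world fluctuation is enough to reclassify a whole group of URLs at once.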
-
I can't understand their algorithm for Core Web Vitals. I have made some technical updates to our website for speed optimization, but what happened in Search Console is very confusing for my site: for desktop, pages are indexed as Good URLs, while mobile-indexed URLs are displayed as Poor URLs.

Our website is a collective resource for people looking into Canadian immigration (PAIC), and about 70% of it is text only. We are using WebP images for optimization, but it is still not passing Core Web Vitals. I am looking forward to expert suggestions on how to overcome this problem.
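Mobile commonly fails while desktop passes because mobile is scored from its own bucket of field data, collected on slower devices and networks, so image format alone may not fix a failing metric. To see exactly which metric fails, you can query the PageSpeed Insights API, which returns both field and lab results for a URL. A rough sketch (the tested URL is a placeholder; no API key is needed for occasional use):

```python
# Rough sketch: ask the PageSpeed Insights API which Core Web Vitals
# metric is failing on mobile. The tested URL is a placeholder.
import json
import urllib.parse
import urllib.request

page = "https://www.example.com/"  # placeholder: a page rated Poor on mobile
api = ("https://www.googleapis.com/pagespeedonline/v5/runPagespeed?"
       + urllib.parse.urlencode({"url": page, "strategy": "mobile"}))

with urllib.request.urlopen(api) as resp:
    data = json.load(resp)

# Field data: this is what the Search Console classification reflects.
for name, metric in data.get("loadingExperience", {}).get("metrics", {}).items():
    print(f"{name}: p75={metric.get('percentile')} ({metric.get('category')})")

# Lab score from Lighthouse, for comparison with the field numbers.
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Lighthouse mobile performance score: {score}")
```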
-
@rwat Hi, did you find a solution?
-
Yes, I am also experiencing the same thing on one of my websites. Most of its pages are blog posts and I am using a lot of images without proper optimization, so that could be the reason, but I'm not sure.
It is also quite possible that Google may be adding more parameters to its Core Web Vitals scoring.
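If unoptimized images are the suspect, oversized dimensions usually hurt LCP more than file format does, so re-encoding alone may not be enough. Here is a minimal sketch of batch-resizing and converting images with the third-party Pillow library; the directory names, width cap, and quality setting are all placeholder choices, not recommendations from this thread:

```python
# Minimal sketch: cap image width and re-encode as WebP with Pillow.
# The directories and the 1200 px / quality=80 settings are placeholders.
from pathlib import Path

from PIL import Image

SRC = Path("images")        # placeholder input directory
DST = Path("images_webp")   # placeholder output directory
MAX_WIDTH = 1200            # wider than most mobile viewports need

DST.mkdir(exist_ok=True)
for path in list(SRC.glob("*.jpg")) + list(SRC.glob("*.png")):
    img = Image.open(path)
    if img.width > MAX_WIDTH:
        ratio = MAX_WIDTH / img.width
        img = img.resize((MAX_WIDTH, int(img.height * ratio)))
    img.save(DST / (path.stem + ".webp"), "WEBP", quality=80)
    print(f"{path.name}: saved at {img.size}")
```

Serving the resized files with explicit width and height attributes also helps CLS, since the browser can reserve layout space before each image loads.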
Related Questions
-
Unsolved Why are my site's pages getting a video viewport indexing issue?
Hello, I have been publishing a good number of blogs on my site, Flooring Flow, but there's been a video viewport error on some of my articles. I have tried fixing it, but the error is still showing in Google Search Console. Can anyone help me fix it?
Technical SEO | mitty270
-
Unsolved IP Address Indexed on Google along with Domain
My website is showing up in Google search results twice, indexed under both the IP address and the domain name. I'm using AWS Lightsail, and I don't know how to fix this; I'm afraid the duplication can harm my website.
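When the same content answers on both the raw IP and the domain, the usual fix is a catch-all server rule that 301-redirects anything other than the canonical hostname. As a quick check that such a redirect is actually in place, here is a small sketch; the IP and domain are placeholders:

```python
# Sketch: confirm that a request to the raw IP gets a 301 to the canonical
# domain instead of serving a duplicate copy. IP and domain are placeholders.
import http.client

SERVER_IP = "203.0.113.10"              # placeholder Lightsail IP
CANONICAL = "https://www.example.com/"  # placeholder canonical URL

conn = http.client.HTTPConnection(SERVER_IP, timeout=10)
conn.request("GET", "/")
resp = conn.getresponse()

if resp.status in (301, 308):
    print(f"OK: http://{SERVER_IP}/ redirects to {resp.getheader('Location')}")
else:
    # A 200 here means the duplicate is still live; add a server-level
    # catch-all 301 pointing at the canonical domain.
    print(f"http://{SERVER_IP}/ answered {resp.status}; "
          f"it should 301 to {CANONICAL}")
```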
Other SEO Tools | mupetra
-
Unsolved Google URL inspection live test rendering issue.
Hi everyone, this is my first post on Moz. I have been trying to get this sorted, I have read everywhere, and everyone just says don't worry about it, but I would really like some advice or suggestions on this. When I use the URL Inspection tool in Google Search Console, the page rendering is completely broken: the tool refuses to load resources each time, and at the end of the day that's how the website is rendered in Google's cache. I have already tried disabling cache plugins and Cloudflare, but nothing works. Site: nationalcarparts.co.nz

This is how it renders when using the URL Inspection tool: https://prnt.sc/7XKHtEU01gEl, and if you check cache:https://nationalcarparts.co.nz that is how Google is caching it. Plugins I am using: Elementor 3.6.1, Elementor Pro 3.6.4, Exclusive Addons Elementor 2.5.4, Exclusive Addons Elementor Pro 1.4.6, WP Rocket, and the Cloudflare plugin (Pro plan). Please, if someone has fixed this issue and has a possible solution, share it. Thanks.
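Broken rendering in the URL Inspection tool usually means Googlebot cannot fetch some CSS or JavaScript, whether due to robots.txt rules, firewall or CDN bot rules, or timeouts. As a rough first check, the sketch below fetches a page's stylesheet and script URLs with a Googlebot user-agent string and reports anything that does not return 200. The page URL is a placeholder, and this only approximates Googlebot (no JavaScript execution, and real Googlebot is verified by reverse DNS, which a firewall rule may key on):

```python
# Rough sketch: request a page's linked CSS/JS with a Googlebot UA and
# report anything that doesn't return 200. The page URL is a placeholder.
import re
import urllib.request

PAGE = "https://www.example.com/"  # placeholder page that renders broken
UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(url):
    req = urllib.request.Request(url, headers={"User-Agent": UA})
    return urllib.request.urlopen(req, timeout=15)

html = fetch(PAGE).read().decode("utf-8", errors="replace")

# Crude extraction of stylesheet/script URLs; a real crawl would parse HTML.
assets = re.findall(r'(?:href|src)="(https?://[^"]+\.(?:css|js)[^"]*)"', html)

for url in sorted(set(assets)):
    try:
        status = fetch(url).status
    except Exception as exc:
        status = exc
    print(f"{status}  {url}")
```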
Support | caitlinrolex789
-
Solved How to solve orphan pages on a job board
Working on a website that has a job board with over 4,000 active job ads. All of these ads are listed on a single "job board" page and obviously don't all load at the same time. They are not linked to from anywhere else, so all tools list these job-ad pages as orphans.

How much of a red flag are these orphan pages? Do sites like Indeed have this same issue? Their job ads are completely dynamic; how are those pages indexed? We use Google's Search API to handle any expired jobs, so they are not the issue; it's the active but orphaned pages we are looking to solve.

The site is hosted on WordPress. What is the best way to solve this issue? Just create a job category page and link to each individual job ad from there? Any simpler and perhaps more obvious solutions? What does the website structure need to be like for the problem to be solved? Would appreciate any advice you can share!
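Orphaned but legitimate pages are usually handled in two complementary ways: listing or category pages that actually link to each ad, and an XML sitemap so crawlers can discover the URLs regardless of internal linking (large dynamic job boards typically lean on sitemaps too). A minimal sketch of generating a jobs sitemap, assuming the job URLs can be queried from WordPress (the URLs and filename below are placeholders):

```python
# Minimal sketch: write an XML sitemap for job-ad URLs so crawlers can
# discover pages that have no internal links yet. URLs are placeholders.
import xml.etree.ElementTree as ET

job_urls = [  # placeholder: in practice, query these from WordPress
    "https://www.example.com/jobs/frontend-developer-123",
    "https://www.example.com/jobs/warehouse-manager-124",
]

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for url in job_urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url
    ET.SubElement(entry, "changefreq").text = "daily"  # job ads churn quickly

ET.ElementTree(urlset).write(
    "sitemap-jobs.xml", encoding="utf-8", xml_declaration=True)
print(f"Wrote {len(job_urls)} URLs to sitemap-jobs.xml")
```

Submitting the file in Search Console then gives you per-sitemap indexing stats for just the job ads.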
Reporting & Analytics | Michael_M2
-
Crawl rate dropped to zero
Hello, I recently moved my GoDaddy site from cPanel to Managed WordPress; I bought this transfer directly from GoDaddy customer service. In the process they accidentally changed my domain from www to non-www. I changed it back after the migration, but as a result the site's crawl rate in Search Console fell to zero and has not risen at all since then. Beyond this, the website does not display any other errors: I can ask Google to manually fetch my pages and it works as before; only the crawl rate seems to have dropped permanently.

GoDaddy customer service claims they do not see any errors, but I think they caused this during the migration when the URL changed, since the timing matches perfectly. Also, when they accidentally removed the www, the crawl rate of the non-www version of my site went up, but it fell back to zero when I changed it back to the www version. Now the crawl rate of both the www and non-www versions is zero. How do I get it to rise again? Customer service also said the problem may be related to the FTP data in Search Console, but they were not able to help any further. Would someone here be able to help me with this in any way, please?
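A www/non-www flip during a migration often leaves behind an inconsistent or conflicting redirect between the two hostnames, which can stall crawling even when manual fetches succeed. Here is a small sketch that walks the redirect chain from both variants so you can confirm they converge on one canonical URL; the domain is a placeholder:

```python
# Small sketch: follow the redirect chain from both hostname variants to
# check they converge on one canonical URL. The domain is a placeholder.
import http.client
import urllib.parse

def redirect_chain(url, max_hops=10):
    hops = [url]
    for _ in range(max_hops):
        parts = urllib.parse.urlsplit(url)
        conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                    else http.client.HTTPConnection)
        conn = conn_cls(parts.netloc, timeout=10)
        conn.request("GET", parts.path or "/")  # query strings ignored here
        resp = conn.getresponse()
        if resp.status not in (301, 302, 307, 308):
            break
        location = resp.getheader("Location")
        if not location:
            break
        url = urllib.parse.urljoin(url, location)
        hops.append(url)
    return hops

for start in ("https://example.com/", "https://www.example.com/"):
    print(" -> ".join(redirect_chain(start)))
```

Both chains should end on the same www URL in a single hop; a loop or a mismatch here is worth fixing before anything else.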
Technical SEO | pok3rplay3r0
-
Why has my search traffic suddenly tanked?
On 6 June, Google search traffic to my WordPress travel blog http://www.travelnasia.com tanked completely. There are no warnings or indicators in Webmaster Tools that suggest why this happened. Traffic from search has remained at zero since 6 June and shows no sign of recovering.

Two things happened on or around 6 June: (1) I dropped my premium theme, which was proving to be not mobile-friendly, and replaced it with the ColorMag theme, which is responsive; and (2) I relocated off my previous hosting service, which was showing long server lag times, to a faster host. Both of these should have improved my search performance, not tanked it. There were some problems with the relocation which resulted in a lot of "out of memory" errors on the website for 3-4 days; the allowed memory was simply not enough for the complexity of the site and the volume of traffic. After a few days of trying to resolve these problems, I moved the site to another web host which allows more PHP memory, and the site now appears reliably accessible for both desktop and mobile. But my search traffic has not recovered.

I am wondering if, in all of this, I've done something that Google considers a cardinal sin and I can't see it. The clues I'm seeing include the following.

First, Moz Pro was unable to crawl my site last Friday. It seems like every URL it tried to crawl was of the form http://www.travelnasia.com/wp-login.php?action=jetpack-sso&redirect_to=http://www.travelnasia.com/blog/bangkok-skytrain-bts-mrt-lines, which resulted in a 500 status error. I don't know why this happened, but I have disabled the Jetpack login function completely, just in case it's the problem.

Second, GWT tells me that some of my resource files are not accessible to GoogleBot because my robots.txt file denied access to /wp-content/plugins/. I have removed this restriction after reading the latest advice from Yoast, but I still can't get GWT to fetch and render my posts without some resource errors (the sketch after this post shows a quick way to re-test this).

Third, on 6 June I see in Structured Data in GWT that "items" went from 319 to 1,478 and "items with errors" went from 5 to 214. There seems to be a problem with both the hatom and hcard microformats, but when I look at the source code they seem to be OK. What I can see in GWT is that each hcard has a node called "n [n]" which is empty, and Google is generating a warning about this. I see that this is because the author vcard URL class now says "url fn n", but I don't see why it says this or how to fix it. I also don't see how this would cause my search traffic to tank completely.

I wonder if anyone can see something I'm missing on the site. Why would Google completely deny search traffic to my site all of a sudden without notifying any kind of penalty? Note that I have NOT changed the content of the site in any significant way, and even if I had, it's unlikely to result in a complete denial of traffic without some kind of warning.
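For the robots.txt clue specifically, a quick test is to run the exact resource URLs GWT complains about against the live robots.txt using Python's urllib.robotparser. A minimal sketch follows; the two resource URLs are placeholders to swap for the ones GWT actually reports:

```python
# Minimal sketch: test whether specific resource URLs are blocked for
# Googlebot by the live robots.txt. The resource URLs are placeholders.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.travelnasia.com/robots.txt")
rp.read()

resources = [  # placeholders: paste the URLs GWT reports as blocked
    "http://www.travelnasia.com/wp-content/plugins/jetpack/css/jetpack.css",
    "http://www.travelnasia.com/wp-includes/js/jquery/jquery.js",
]

for url in resources:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}: {url}")
```

If everything prints ALLOWED but GWT still shows resource errors, the block is more likely at the host or plugin level than in robots.txt.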
Technical SEO | Gavin.Atkinson1
-
Drop Down Menu - Link Juice Depletion
Hi, we have a site with 7 top-level sections, all of which contain a large number of subsections, which may in turn contain further subsections. To try to ensure the best user experience, we have a top navigation with the 7 top-level sections and, when hovered, a selection of the key subsections. Although I like this format for the user, as it makes it easier for them to find the most important sections and subsections, it does lead to a lot of links within every page on the site: in general, each top section has a drop-down with approximately 10-15 subsections. This has therefore led to SEOmoz's tools issuing the "too many internal links" warning.

Alongside this, I am left wondering whether I have too many links to my subsections and whether I would be better off being more selective about when I link to them. For instance, I could choose the top 5 subsections and place a link to them from our homepage, and by doing so I would be passing a greater amount of link juice down the line. So I guess my dilemma is between ensuring the user has as easy a time traversing the site as possible, whilst keeping a close watch on where, and how, our link juice is distributed.

One solution I am considering is whether nofollow links could be utilised within the drop-down menus. This way I could have the desired user navigation while staying in greater control of which pages link to which subsections. Would that even work? Any advice would be greatly appreciated. Regards, Guy
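For scale, the "too many internal links" warning historically keyed off a guideline of roughly 100 links per page, so it is worth measuring exactly how many anchors the drop-down contributes. Here is a small sketch that counts links on a page using only the standard library; the URL is a placeholder, and the internal/external split is deliberately crude:

```python
# Small sketch: count anchor tags on a page to see how far past the
# ~100-links guideline the navigation pushes it. The URL is a placeholder.
import urllib.request
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

url = "https://www.example.com/"  # placeholder page
html = urllib.request.urlopen(url, timeout=15).read().decode(
    "utf-8", errors="replace")

counter = LinkCounter()
counter.feed(html)
# Treat relative hrefs and absolute hrefs on the same host as internal.
internal = [h for h in counter.links if not h.startswith("http") or url in h]
print(f"{len(counter.links)} total links, ~{len(internal)} internal")
```

Running it against a page with and without the menu markup tells you how much of the total the navigation alone accounts for.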
Technical SEO | guycampbell1
-
Is there a great tool for URL mapping old to new web site?
We are implementing a new design, removing some pages, and adding new content. The task is to correctly map and redirect old pages that no longer exist.
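Beyond off-the-shelf crawlers, a first pass at the mapping can be scripted: export the old and new URL lists, then fuzzy-match each old path to its closest new path and review the result by hand. A rough sketch with the standard library's difflib, where the example paths are placeholders:

```python
# Rough sketch: fuzzy-match old URLs to new ones as a starting point for a
# 301 redirect map. The URL lists are placeholders; review matches by hand.
import difflib

old_paths = ["/products/blue-widget.html", "/about-us.html", "/news/2014-recap"]
new_paths = ["/shop/blue-widget", "/about", "/blog"]

for old in old_paths:
    match = difflib.get_close_matches(old, new_paths, n=1, cutoff=0.4)
    target = match[0] if match else "(no close match - map manually)"
    print(f"Redirect 301 {old} {target}")
```

Treat every match as a suggestion: close string distance does not guarantee the right landing page, so review the map before turning it into live 301 rules.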
Technical SEO | KnutDSvendsen0