Schema Markup Errors - Priority or Not?
-
Greetings All...
I've been digging through Search Console on a few of my sites and noticing quite a few structured data errors. Most of the errors relate to hcard, hentry, and hatom markup: the hentry items are missing author and entry-title, while the hcard is missing fn.
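For anyone unfamiliar with these microformats, a minimal hatom entry containing the fields being flagged would look something like this (the class names come from the hAtom and hCard specs; the title, author, and date here are placeholders, not our actual markup):

<article class="hentry">
  <h1 class="entry-title">Post Title Here</h1>
  <div class="entry-content">Post body...</div>
  <!-- the author property is an embedded hCard; fn is its "formatted name" field -->
  <span class="author vcard"><span class="fn">Author Name</span></span>
  <time class="published" datetime="2015-01-15">January 15, 2015</time>
</article>

The errors suggest our templates output the hentry wrapper without the entry-title, author, and fn elements inside it.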
I recently saw an article on SEL about Google's focus on spammy markup. The sites I work on are built and managed by vendors, so I would have to impress upon them the impact of these errors and have them prioritize and fix them.
My question is whether this should be prioritized. Should I have them correct these errors sooner rather than later, or can I take a phased approach? I haven't noticed any loss in traffic or anything like that; I'm more focused on what negative impact a "phased approach" could have.
Any thoughts?
-
In that case I would say a phased approach would be fine. It is an issue that should be addressed, but I wouldn't classify it as "critical".
-
Thanks for the responses.
These changes will be managed by the developers, so they won't impact any other priorities that I have in the queue. My concern was whether this was pressing enough that I would have to push the developers toward a hard timeline versus a more phased one. I took a look at other clients on this platform and saw the same errors, so this may be a platform-wide markup error.
These are automotive clients, so their sites are OEM mandated, and I don't have any say in the types of markup they use. But my goal is to impress upon the vendors that these errors could negatively impact my clients (if not now, then in the future).
-
I think it depends on what else you have in the queue for your clients. Without knowing what other priorities you have, I can't say whether this is more or less important. However, as Patrick said, correct markup helps search engines understand more about the context of what's on the page. It probably won't make a big ranking impact, and I doubt you're going to get penalized for some markup errors as long as you're not trying to spam using markup.
I definitely prefer Schema.org markup over hcard, but rather than implementing it as inline microdata, you may want to consider JSON-LD: https://developers.google.com/schemas/formats/json-ld
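As a rough sketch, a blog post marked up with JSON-LD might look like this (BlogPosting and its properties are from the Schema.org vocabulary; the values are placeholders):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "BlogPosting",
  "headline": "Post Title Here",
  "author": {
    "@type": "Person",
    "name": "Author Name"
  },
  "datePublished": "2015-01-15"
}
</script>

Because everything lives in one script block instead of being woven into the page's HTML, JSON-LD tends to be easier to retrofit onto vendor-built templates than inline microdata or microformats.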
-
Hi there
Yes, I would consider this a priority. You want the most up-to-date and relevant markup on your site so that crawlers can better understand your important information and attribute the right properties to your site and content.
I would also focus on Schema.org, as it's recognized by the major search engines (Google, Bing, Yahoo, and Yandex), so you are pleasing multiple crawlers at once. Studies have also found that pages with this markup tend to rank around four positions higher than pages without it, though that's a correlation rather than a guaranteed boost. It is definitely worth the investment of time.
Hope this helps! Good luck!
Related Questions
-
Google Search Console Showing 404 errors for product pages not in sitemap?
We have some products whose URLs changed over the past several months. Google is showing these as having 404 errors even though they are not in the sitemap (the sitemap shows the correct new URL). Is this expected? Will these errors eventually go away and stop being monitored by Google?
Technical SEO | woshea
-
How to get rid of bot verification errors
I have a client who sells highly technical products and has lots and lots (a couple of hundred) of PDF datasheets that can be downloaded from their website. But in order to download a datasheet, a user has to register on the site. Once they are registered, they can download whatever they want (I know this isn't a good idea, but it wasn't set up by us and is historical). On doing a Moz crawl of the site, it came up with a couple of hundred 401 errors. When I investigated, they are all pages where there is a button to click through to get one of these downloads. The Moz error report calls the error "Bot verification". My questions are:
Are these really errors?
If so, what can I do to fix them?
If not, can I just tell Moz to ignore them, or will this cause bigger problems?
Technical SEO | mfrgolfgti
-
Search Console Errors 400 and 405
Hi, does anyone know if Search Console errors like the following are damaging to SERPs?
/xmlrpc.php is returning a 405 error
/wp-admin/admin-ajax.php is returning a 400 error
These errors seem to have coincided almost to the day with a ranking drop for the primary keyword from mid page 1 to the bottom of page 2. No matter what I do, I cannot seem to correct these errors. Any advice would be greatly appreciated. Thanks
Technical SEO | DaleZon
-
Schema for blogs
When I run a WordPress blog through the Structured Data Testing Tool, I see that there is an @type of hentry. Is this enough for blogs? Is this a result of WordPress adding in this markup? Do you recommend adding the BlogPosting type, and if so, why? What is the benefit of adding a specific type of schema? How does it help in blogging? Thanks
Technical SEO | AL123al
-
Duplicate content and 404 errors
I apologize in advance, but I am an SEO novice and my understanding of code is very limited. Moz has issued a lot (several hundred) of duplicate content and 404 error flags on the ecommerce site my company takes care of. For the duplicate content, some of the pages it says are duplicates don't even seem similar to me. Additionally, a lot of them are static pages where we embed images of size charts that we use as popups on item pages. It says these issues are high priority, but how bad is this? Is this just an issue because, if two pages have similar content, the spider won't know which one to index? Also, what is the best way to handle the URLs returning 404 errors? I should probably have a developer look at these issues, but I wanted to ask the extremely knowledgeable Moz community before I do 🙂
Technical SEO | AliMac26
-
Some URLs were not accessible to Googlebot due to an HTTP status error.
Hello, I'm an SEO newbie, and some help from the community here would be greatly appreciated. I have submitted the sitemap of my website in Google Webmaster Tools and now I get this warning: "When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted." How do I fix this? What should I do? Many thanks in advance.
Technical SEO | GoldenRanking14
-
Increase 404 errors or 301 redirects?
Hi all, I'm working on an e-commerce site that sells products that may only be available for a certain period of time. E.g., a product may sell for one year and then be permanently out of stock. When a product goes out of stock, the page is removed from the site regardless of any links it may have attracted over time. I am trying to figure out the best way to handle these permanently out-of-stock pages. At the moment, the site returns a 404 page for each of these products, and there are currently 600 (and increasing) instances of this appearing in Google Webmaster Tools. I have read that too many 404 errors may have a negative impact on your site, so I thought I might 301 redirect these URLs to a more appropriate page. However, I've also read that too many 301 redirects may have a negative impact on your site. I foresee this becoming an issue several years down the road, when the site has thousands of expired products resulting in thousands of 404 errors or 301 redirects, depending on which route I take. Which would be the better route? Is there a better solution?
Technical SEO | Oxfordcomma
-
Schema for Price Comparison Services - Good or Bad?
Hey guys, I was just wondering what the schema.org markup means for people who run search engines (i.e., for a niche or certain products) or price comparison engines in general. The intent behind schema.org is to help the engines better understand a page's content. Well, I guess such services don't necessarily want Google to understand that they're just another search engine (and thus might get thrown out of the index for polluting it with search result pages). I see two possible scenarios: either don't implement the markup, or implement it in a way that makes the site not look like an aggregator, i.e., by only marking up certain products with unique text. Any thoughts? Does the SEOmoz team have any advice on that? Best,
schuon
Technical SEO | derderko