Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies. More details here.
Unsolved Duplicate LocalBusiness Schema Markup
Hello! I've been having a hard time finding an answer to this specific question, so I figured I'd drop it here. I always add custom LocalBusiness markup to clients' homepages, but sometimes the client's website provider includes its own automated LocalBusiness markup. The markup I write often includes more information.
Assuming the website provider is unwilling to remove their markup, is it a bad idea to include my code as well? It seems like it could potentially be read as spammy by Google.
Do the pros of having more detailed markup outweigh that potential negative impact?
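If both blocks do end up on the page, the usual advice is to make sure they at least agree on name, address, and phone. For illustration only (the business details below are hypothetical placeholders), here is a minimal sketch of the kind of fuller LocalBusiness JSON-LD block described above, built in Python so the JSON stays valid:

```python
import json

# Hypothetical example: a hand-written LocalBusiness block carrying more
# detail than a typical auto-generated one (hours, geo, sameAs).
# All values are placeholders, not a real business.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Plumbing Co.",
    "url": "https://www.example.com/",
    "telephone": "+1-555-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
        "postalCode": "62701",
        "addressCountry": "US",
    },
    "geo": {"@type": "GeoCoordinates", "latitude": 39.7817, "longitude": -89.6501},
    "openingHours": "Mo-Fr 08:00-17:00",
    "sameAs": ["https://www.facebook.com/exampleplumbing"],
}

# Emit the <script> tag you would paste into the page <head>.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(local_business, indent=2)
    + "\n</script>"
)
print(snippet)
```

The key point the question raises still stands: two conflicting blocks (different phone numbers, different hours) is the real risk, more so than simple duplication.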
Related Questions
What Service Page Strategy Should We Use to Target City-Specific Local Intent Service Keywords?
Hey guys! We are targeting a number of cities in the Nassau and Suffolk County areas for foundation repair, insulation, and mold remediation keywords, and we are debating between creating city-specific pages for each location and service, or creating one service page per service category that covers all of the services and solutions within that category for each city.

Example:

City-specific pages for each service: one page for, say, foundation repair, one page for foundation crack repair, one page for foundation problems, etc. (for each target city).
Service category pages for each city: one page for foundation contractors that lists all services on one page in sections.

Which one do you think is better for local SEO and rankings? Both seem to have their advantages and disadvantages to me. To throw a couple out there: the category pages may not rank as high as the city pages for each individual service if our competitors have a whole page designed for that service and we only have part of a page covering the topic. At the same time, they would save labor hours, reduce technical issues, keep things condensed, and leave WAY less mess on the backend. I appreciate your expert opinion on this one. The site is www.zavzaseal.com in case you want to check us out.
Local SEO | | everysecond
Good to use disallow or noindex for these?
Hello everyone, I am reaching out to seek your expert advice on a few technical SEO aspects related to my website.

a. Double and triple filter pages: I have identified certain URLs on my website that have a canonical tag pointing to the main /quick-ship page. These URLs are as follows:
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black+fabric
Considering the need to optimize my crawl budget, would it be advisable to disallow or noindex these pages? My understanding is that by doing so, search engines can avoid wasting resources on crawling and indexing duplicate or filtered content.

b. Page URLs with parameters: I have noticed that some of my page URLs include parameters such as ?variant and ?limit. Although these URLs already have canonical tags in place, is it still recommended to disallow or noindex them to further conserve crawl budget?

Additionally, I would welcome any suggestions regarding internal linking strategies tailored to my website's structure and content. Thank you in advance for your time and expertise. Cheers!
Technical SEO | | williamhuynh
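One caveat worth flagging before choosing a robots.txt disallow: a disallowed URL is never crawled, so search engines cannot see the canonical tag (or a noindex) on it. A minimal sketch of the disallow approach using Python's standard-library robot parser (note that, unlike Googlebot, urllib.robotparser only does literal prefix matching, so the rule below uses the "/quick-ship+" prefix rather than the * wildcards Googlebot would also understand):

```python
from urllib import robotparser

# Hypothetical robots.txt: block the "+filter" variants of the collection
# page while leaving the canonical /quick-ship page itself crawlable.
robots_txt = """\
User-agent: *
Disallow: /collections/lounge-chairs/quick-ship+
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

filtered = "https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black"
base = "https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship"
print(rp.can_fetch("*", filtered))  # False: the filtered variant is blocked
print(rp.can_fetch("*", base))      # True: the canonical target stays crawlable
```

Because of that crawl-versus-canonical trade-off, parameter URLs (?variant, ?limit) that already carry correct canonicals are often left crawlable unless crawl budget is demonstrably being wasted.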
Unsolved Question about a Screaming Frog crawling issue
Hello, I have a very peculiar question about an issue I'm having when working on a website. It's a WordPress site and I'm using a generic plugin for title and meta updates. When I crawl the site through Screaming Frog, however, there seems to be a hard-coded title tag that I can't find anywhere, and the plugin updates don't get crawled. If anyone has any suggestions, that'd be great. Thanks!
Technical SEO | | KyleSennikoff
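One way to narrow this down is to inspect the raw (unrendered) page source: a theme with a hard-coded title plus an SEO plugin writing its own often leaves two <title> tags in the source, and crawlers typically report only the first. A small sketch of that check (the sample HTML here is made up so the snippet runs offline):

```python
import re

def extract_titles(html: str) -> list[str]:
    """Return every <title> value found in raw (unrendered) HTML."""
    return [
        m.strip()
        for m in re.findall(r"<title[^>]*>(.*?)</title>", html,
                            re.IGNORECASE | re.DOTALL)
    ]

# Hypothetical page source with a duplicated, hard-coded title:
sample = """
<html><head>
<title>Hard-coded Theme Title</title>
<title>Plugin Title | Brand</title>
</head><body></body></html>
"""
print(extract_titles(sample))  # ['Hard-coded Theme Title', 'Plugin Title | Brand']
```

If two titles show up, the first one usually lives in the theme's header template (e.g. header.php) rather than in the plugin's settings.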
Should I avoid duplicate URL keywords?
I'm curious to know: can having a keyword repeat in the URL cause any penalties? For example: xyzroofing.com, xyzroofing.com/commercial-roofing, xyzroofing.com/roofing-repairs. My competitors with the highest rankings seem to be doing it without any trouble, but I'm wondering if there is a better way. Also, one of the problems I've noticed is that my /commercial-roofing page outranks my homepage for both residential and commercial search queries. How can this be straightened out?
Local Website Optimization | | Lyontups
Schema medical speciality error
I'm having an issue correctly formatting a medical specialty for a gastroenterologist. The Google structured data tool is giving me the error "The property specialty is not recognized by Google for an object of type Physician". Any suggestions on how to correctly update the schema code for a physician's specialty? Thanks, Keith
Local Website Optimization | | Keith_Kaiser
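For reference, schema.org defines this property on Physician as medicalSpecialty rather than specialty, with enumeration values such as Gastroenterologic. A sketch of the corrected block (placeholder name; verify the enumeration value against schema.org before shipping):

```python
import json

# Sketch of a corrected Physician block: the bare "specialty" property is
# what triggers the tool error; schema.org's property is "medicalSpecialty".
# "Gastroenterologic" is believed to be the matching enumeration value --
# double-check it on schema.org. The name is a placeholder.
physician = {
    "@context": "https://schema.org",
    "@type": "Physician",
    "name": "Dr. Jane Example",
    "medicalSpecialty": "https://schema.org/Gastroenterologic",
}
print(json.dumps(physician, indent=2))
```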
Can I use Schema zip code markup that includes multiple zip codes but no actual address?
The company doesn't have physical locations but offers services in multiple cities and states across the US. We want to develop a better hyperlocal SEO strategy and implement schema, but the only address information available is zip codes and the names of cities and states. Can we omit the actual street address in the formatting and add multiple zip codes?
Local Website Optimization | | hristina-m
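schema.org permits plain-text values for areaServed, so one commonly suggested pattern for a business with no street address is to omit the address property entirely and list cities and zip codes under areaServed. A sketch with placeholder values (whether Google rewards postal-code-only coverage in local results is a separate question; this only shows a valid format):

```python
import json

# Hypothetical service-area business: no street address, coverage
# expressed as plain-text areaServed entries (cities and zip codes).
# All values are placeholders.
business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Services LLC",
    "url": "https://www.example.com/",
    "telephone": "+1-555-555-0123",
    "areaServed": ["Austin, TX", "78701", "78702", "78704"],
}
print(json.dumps(business, indent=2))
```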
Advice on applying Service Area Schema
So I have a client that delivers goods to residential addresses and commercial businesses. They have 60+ distribution centers but want to target surrounding counties, cities, and territories. Our development team was considering using virtual location pages (thousands) for these service areas. I have lobbied against this out of concern that Google would label these "doorway" pages. These pages would not have full addresses. I want to develop a strategy to gain coverage in these surrounding delivery areas. I was told that applying https://schema.org/serviceArea might help. However, will this truly bring in the necessary visibility? Would having only a few key select virtual locations suffice (along with serviceArea schema)? Any advice on applying https://schema.org/serviceArea attributes would be much appreciated. Thanks!
Local Website Optimization | | RosemaryB
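For reference, a common way to express coverage around each distribution center without per-city pages is one GeoCircle per center. A sketch with placeholder coordinates and radii (note that schema.org lists serviceArea as superseded by areaServed, which accepts the same values):

```python
import json

# Sketch: a GeoCircle per distribution center instead of thousands of
# virtual location pages. Coordinates, radii, and the business name are
# placeholders, not real data.
def geo_circle(lat: float, lng: float, radius_m: int) -> dict:
    return {
        "@type": "GeoCircle",
        "geoMidpoint": {"@type": "GeoCoordinates",
                        "latitude": lat, "longitude": lng},
        "geoRadius": radius_m,  # radius in metres
    }

org = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Distribution Co.",
    "areaServed": [
        geo_circle(40.7128, -74.0060, 50000),  # placeholder: NYC-area center
        geo_circle(41.8781, -87.6298, 50000),  # placeholder: Chicago-area center
    ],
}
print(json.dumps(org, indent=2))
```

Markup like this describes coverage but does not by itself create ranking pages for each county, so it complements rather than replaces a (restrained) set of real location pages.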
Location Pages and Duplicate Content and Doorway Pages, Oh My!
Google has this page on location pages. It's very useful, but it doesn't say anything about handling the duplicate content a location page might have, seeing as the locations may have very similar services. Let's say they have example.com/location/boston, example.com/location/chicago, or maybe boston.example.com, chicago.example.com, etc. They are landing pages for each location, housing that location's contact information and showing the same services/products as every other location. This information may also live on the main domain's homepage or services page.

My initial reaction agrees with this article: http://azwa.1clkaccess.in/blog/local-landing-pages-guide - but I'm really asking: what does Google expect? Does this location pages guide from Google tell us we don't really have to make sure each of those location pages is unique? Sometimes creating "unique" location pages feels like you're creating doorway pages.

In a nutshell, Google's guidelines seem to conflict on this topic:

Location Pages: "Have each location's or branch's information accessible on separate webpages"
Doorway Pages: "Multiple pages on your site with similar content designed to rank for specific queries like city or state names"
Duplicate Content: "If you have many pages that are similar, consider expanding each page or consolidating the pages into one."

Now you could avoid making it a doorway page or a duplicate content page if you just put the location information on a page. Each page would then have a unique address, phone number, email, contact name, etc. But then the page would technically be in violation of this one:

Thin Pages: "One of the most important steps in improving your site's ranking in Google search results is to ensure that it contains plenty of rich information that includes relevant keywords, used appropriately, that indicate the subject matter of your content."

...starting to feel like I'm in a Google Guidelines paradox! Do you think this guide from Google means that duplicate content on these pages is acceptable as long as you use that markup? Or do you have another opinion?
Local Website Optimization | | eyeflow