Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How to Structure URLs for Multiple Locations
-
We are currently undergoing a site redesign and are trying to figure out the best way to structure the URLs and breadcrumbs for our many locations.
We currently have 60 locations nationwide and our URL structure is as follows:
www.mydomain.com/locations/{location}
Where {location} is the specific street the location is on or the neighborhood the location is in (e.g. www.mydomain.com/locations/waterford-lakes).
The issue is that {location} is usually too specific and not a broad enough keyword. The location "Waterford Lakes" is in Orlando, and "Orlando" is the important keyword, not "Waterford Lakes".
To address this, we want to introduce state and city pages. Each state and city page would link to each location within that state or city (e.g. an Orlando page with links to "Waterford Lakes", "Lake Nona", "South Orlando", etc.). The question is how to structure this.
Option 1
Use our existing URL and breadcrumb structure (www.mydomain.com/locations/{location}) and add state and city pages outside the URL path:
Option 2
Build the city and state pages into the URL and breadcrumb path:
www.mydomain.com/locations/{state}/{area}/{location}
(e.g. www.mydomain.com/locations/fl/orlando/waterford-lakes)
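For illustration, here is a minimal sketch of how the nested Option 2 paths could be generated from location records. The `slugify` and `location_url` helpers are hypothetical, not part of any existing site code:

```python
import re

def slugify(text: str) -> str:
    """Lowercase the text and collapse runs of non-alphanumerics into hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def location_url(state: str, area: str, location: str) -> str:
    """Build a nested /locations/{state}/{area}/{location} path."""
    parts = [slugify(p) for p in (state, area, location)]
    return "/locations/" + "/".join(parts)

print(location_url("FL", "Orlando", "Waterford Lakes"))
# -> /locations/fl/orlando/waterford-lakes
```

Keeping the slug rules in one place like this also makes it easier to generate matching breadcrumb trails from the same segments.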
Any insight is much appreciated. Thanks!
-
Hi David,
Typically, your main landing pages are going to be those that represent the city of the location, as in:
etc.
What I'm trying to understand is if you are saying you have more than one office within a single city (as in orlando office A, orlando office B, orlando office C) and are trying to hash out how to distinguish these same-city offices from one another. Is this the scenario, or am I not getting it? Please feel free to provide further details.
-
David -
It looks like there are two main options for you:
Keep the same URL structure (Option 1) and create state- or area-based category pages, each with a short description of every location in that geographic area and a link to its location page.
This is typically how it might be done on an eCommerce site, where you'd have a parent category (e.g. shoes) and then a sub-category (e.g. running shoes).
The downside to this is that you risk having duplicate content on these category pages.
Option #2 would be my recommendation, because you are including the area and state information in the URL.
One company that does not do this well is Noodles & Company. Their location URL looks like this:
http://www.noodles.com/locations/150/
... where "150" is a store ID in a database. Easy to pull out of a database table. Less helpful to the end user who doesn't know that store ID 150 = the one closest to them.
It would be much better to have it listed like:
http://www.noodles.com/locations/Colorado/Boulder/2602-Baseline/
You don't want to go much beyond 4 layers, but it's a better way of indicating the location tree to Google and other search engines.
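That location tree can also be spelled out explicitly for search engines with breadcrumb markup. As a sketch (using the schema.org BreadcrumbList vocabulary in JSON-LD, which is not something the original post shows), the Boulder example above might be marked up like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {"@type": "ListItem", "position": 1, "name": "Locations",
     "item": "http://www.noodles.com/locations/"},
    {"@type": "ListItem", "position": 2, "name": "Colorado",
     "item": "http://www.noodles.com/locations/Colorado/"},
    {"@type": "ListItem", "position": 3, "name": "Boulder",
     "item": "http://www.noodles.com/locations/Colorado/Boulder/"},
    {"@type": "ListItem", "position": 4, "name": "2602 Baseline"}
  ]
}
</script>
```

Each breadcrumb item mirrors one segment of the URL path, so the markup and the URL structure reinforce each other.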
Also, I'd highly recommend using a rich-data format for displaying the location information.
For example, on the Customer Paradigm site, we use the RDFa system for tagging the location properly:
Customer Paradigm
5353 Manhattan Circle
Suite 103
Boulder, CO 80303
303.473.4400
... and then Google doesn't have to guess what the location's address and phone number actually are.
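The rendered address above could be tagged along these lines. This is a sketch using schema.org vocabulary with RDFa, since the original post's exact attributes aren't shown:

```html
<div vocab="https://schema.org/" typeof="LocalBusiness">
  <span property="name">Customer Paradigm</span>
  <div property="address" typeof="PostalAddress">
    <span property="streetAddress">5353 Manhattan Circle, Suite 103</span>
    <span property="addressLocality">Boulder</span>,
    <span property="addressRegion">CO</span>
    <span property="postalCode">80303</span>
  </div>
  <span property="telephone">303.473.4400</span>
</div>
```

With the `property` attributes in place, a crawler can read the street, city, state, ZIP, and phone number unambiguously instead of parsing free text.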