Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
-
Hi all,
So far I have found this one: http://unused-css.com/ It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages, so we really need something that can handle a site larger than that.
I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS?
Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they only work on one page at a time?
Thanks!
-
I read your post, Mstoic Hemant, and noticed your comment about Firefox 10. Since I couldn't get Dust-Me Spider to work in my current version of Firefox, I tried downloading and installing the older version 10 as you suggested. When I did, I received a message that Dust-Me Spider was not compatible with that version of Firefox, and it was disabled.
We are considering purchasing the paid version of Unused CSS (http://unused-css.com/pricing). Do you have any experience with the upgraded version? Does it deliver what it promises?
Thanks!
-
Hi Hemant,
I tried using Dust-Me in Firefox, but for some reason it won't work on this sitemap: http://www.ccisolutions.com/rssfeeds/CCISolutions.xml
Could it be that this sitemap is too large? I even tried setting up a local folder to store the data, but every time I run the spider I get the message "The sitemap has no links."
I am using Firefox 27.0.1
-
Hi Dana, did either of these responses help? What did you end up settling on? We'd love an update! Thanks.
Christy
-
I have an article on that here. A Firefox extension called Dust-Me Selectors can help you identify unused CSS across multiple pages. It tracks every page you visit on a website and records the classes and IDs that are never used. You can also give it a sitemap and it will work out which CSS is never used.
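As an aside for the sitemap trouble mentioned above: before blaming the extension, it can be worth confirming that the sitemap parses as a standard sitemap.org urlset and pulling the URL list out yourself. Here is a minimal Python sketch of that check; it assumes a standard XML sitemap, and the URL in it is only a placeholder.

```python
# A minimal sketch (not part of Dust-Me Selectors): pull the page URLs out of a
# standard XML sitemap so they can be fed to whatever unused-CSS checker you use.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder URL, swap in your own
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

# Each <url><loc>...</loc></url> entry holds one page URL. If the file is a
# sitemap index, the <loc> values point at child sitemaps instead of pages.
urls = [loc.text.strip() for loc in tree.findall(".//sm:loc", NS)]
print(f"Found {len(urls)} URLs")
```

If this prints zero URLs for a file that clearly contains entries, the sitemap is probably not using the standard namespace, which might also explain an extension reporting that it "has no links."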
-
This sounds like it might just do the trick. You'll need Ruby installed for it to work. If you're on a Mac, it's already there; if you're on Windows, you'll need this. It's pretty easy, I installed Ruby on my Windows gaming rig. If you're running a Linux flavor, try this.
Just take the URLs from your site crawl, put them in a txt file, and compare that against your CSS file (a rough sketch of the idea follows below). I've never tried it on a large site, so let me know how it goes for you.
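For anyone who would rather not set up Ruby, here is a rough Python sketch of the same comparison using only the standard library. It is not the tool linked above: the file names urls.txt and styles.css are placeholders, and the selector matching is deliberately naive (simple class and id selectors only), so treat the output as candidates to review rather than a definitive list of unused CSS.

```python
# Rough sketch: read urls.txt from your crawl, collect every class and id
# actually used on those pages, then report simple .class and #id selectors
# from one stylesheet that never appear. File names are placeholders.
import re
import urllib.request
from html.parser import HTMLParser

class AttrCollector(HTMLParser):
    """Collects every class name and id seen in the fetched pages."""
    def __init__(self):
        super().__init__()
        self.classes, self.ids = set(), set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "class" and value:
                self.classes.update(value.split())
            elif name == "id" and value:
                self.ids.add(value.strip())

collector = AttrCollector()
with open("urls.txt") as f:                      # one URL per line, from your crawl
    for url in (line.strip() for line in f):
        if not url:
            continue
        with urllib.request.urlopen(url) as resp:
            collector.feed(resp.read().decode("utf-8", errors="ignore"))

css = open("styles.css").read()                  # the stylesheet to audit
# Naive selector scrape: grabs .class and #id tokens; it will also pick up
# false positives such as hex colors or file extensions inside url() values.
css_classes = set(re.findall(r"\.([A-Za-z_-][\w-]*)", css))
css_ids = set(re.findall(r"#([A-Za-z_-][\w-]*)", css))

print("Possibly unused classes:", sorted(css_classes - collector.classes))
print("Possibly unused ids:", sorted(css_ids - collector.ids))
```

The urls.txt file can come straight from a site crawl export (for example, the URLs Screaming Frog has already collected), one URL per line.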