Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
What Tools Should I Use To Investigate Damage to my website
-
I would like to know what tools I should use and how to investigate damage to my website, in2town.co.uk. I hired a person to do some work on my website, but they damaged it. That person was on a freelance platform and was removed because of all the complaints made about them. They also put backdoors on websites, including mine, and added content.
I also had a second problem where my content was being stolen.
My site always did well and had lots of keywords in the top five and top ten, but now they are not even in the top 200. This happened in January and February.
When I write unique articles, they are not showing in Google, and I need to find out what the problem is and how to fix it. Can anyone please help?
-
Repairing website damage requires a structured approach. Start by assessing any issues caused by the freelancer, using tools like Wordfence for WordPress to detect backdoors or malicious changes. It’s possible Google penalized you for the work the freelancer did: you might need to disavow toxic links they built, for example, or address other issues. Tools such as Google Alerts can help identify duplicates of your content so you can take action.
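As a rough illustration of that first step, here is a minimal sketch in Python, assuming a standard WordPress install at a hypothetical path; it only flags recently modified PHP files and common obfuscation calls for manual review, and it is not a substitute for a dedicated scanner like Wordfence:

import os
import re
import time

WP_ROOT = "/var/www/html"  # assumption: path to the WordPress install
# Common obfuscation calls often seen in injected backdoors
SUSPICIOUS = re.compile(rb"eval\s*\(|base64_decode\s*\(|gzinflate\s*\(|assert\s*\(")
CUTOFF = time.time() - 90 * 24 * 3600  # only look at files touched in the last 90 days

for dirpath, _, filenames in os.walk(WP_ROOT):
    for name in filenames:
        if not name.endswith(".php"):
            continue
        path = os.path.join(dirpath, name)
        recent = os.path.getmtime(path) > CUTOFF
        with open(path, "rb") as fh:
            flagged = bool(SUSPICIOUS.search(fh.read()))
        if recent or flagged:
            reasons = []
            if recent:
                reasons.append("recently modified")
            if flagged:
                reasons.append("suspicious pattern")
            print(f"{', '.join(reasons)}: {path}")

Anything this prints still needs a human (or a proper scanner) to confirm whether it is actually malicious.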
For future reference, regular monitoring helps prevent unauthorized changes and keeps you informed about industry trends. If you monitor a web page for changes, you can make sure no unauthorized adjustments are made to your site. There are several ways to approach this and several tools you can use; some will notify you when your web page changes. Monitoring competitors for content trends can also guide your strategy and reveal potential areas for improvement.
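One way to do this yourself is a simple change monitor. The sketch below is a minimal example, assuming Python with the requests library installed; the URL and check interval are placeholders, and in practice you would strip out dynamic parts of the page (timestamps, nonces) before hashing or you will get false positives:

import hashlib
import time
import requests

URL = "https://www.example.com/"  # placeholder: the page to watch
CHECK_EVERY = 3600                # seconds between checks

last_hash = None
while True:
    html = requests.get(URL, timeout=30).text
    current = hashlib.sha256(html.encode("utf-8")).hexdigest()
    if last_hash is not None and current != last_hash:
        print(f"Change detected on {URL} at {time.ctime()}")
    last_hash = current
    time.sleep(CHECK_EVERY)

Hosted monitoring services do the same thing with e-mail or Slack alerts built in, which is usually more practical than running your own loop.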
-
Use Google Search Console and Screaming Frog.
-
The best tool that I used to diagnose problems on my office interior design services website is Google Search Console (GSC). You can also use Screaming Frog or Moz to analyse and solve issues.
-
Page Freezer: Instantly preserve web pages and social media profiles to capture evidence of website damage
-
To investigate damage to your website, consider using tools like Google Search Console to monitor search performance and detect issues, a website security scanner like Wordfence to check for malware or vulnerabilities, and website auditing tools such as SEMrush or Screaming Frog for a comprehensive analysis of technical SEO issues and overall site health.
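For a quick first pass before running a full crawl, a spot check like the sketch below can be useful. It is only an illustration, assuming Python with requests installed and a placeholder list of URLs; it reports the status code of each page and does a very rough check for a noindex robots directive:

import requests

PAGES = [  # placeholders: replace with your own key URLs
    "https://www.example.com/",
    "https://www.example.com/blog/",
]

for url in PAGES:
    resp = requests.get(url, timeout=30, allow_redirects=True)
    body = resp.text.lower()
    # Very rough check: flags pages whose HTML mentions a robots noindex directive
    noindex = "noindex" in body and "robots" in body
    print(f"{resp.status_code}\t{'NOINDEX?' if noindex else 'ok'}\t{url}")

A proper crawler such as Screaming Frog will catch far more (redirect chains, canonical issues, broken internal links), but a check like this quickly confirms whether key pages are even reachable and indexable.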
Related Questions
-
Google keeps marking different pages as duplicates
My website has many pages like this:
mywebsite/company1/valuation
mywebsite/company2/valuation
mywebsite/company3/valuation
mywebsite/company4/valuation
...
These pages describe the valuation of each company. They were never identical, but initially I included a few generic paragraphs (what is valuation, what is a valuation model, etc.) on all of them, so some parts of their content were identical. Google marked many of these pages as duplicates (in Google Search Console), so I modified the content: I removed those generic paragraphs and added other information that is unique to each company. As a result, the pages are now extremely different from each other and have little in common. Although it has been more than a month since I made the modification, Google still marks the majority of these pages as duplicates, even though it has already crawled the new, modified versions. Is there anything else I can do in this situation? Thanks
Technical SEO | | TuanDo96270 -
Ranking going south
Hi - I have a site, Simply Stairlifts, and I don't understand it: I've followed all the SEO processes of cleaning the site and building links, but the ranking just keeps falling. Any advice would be very gratefully received 👍.
SEO Tactics | | Naju2310 -
Appending a code at the end of a URL
Hi All, Some real estate/news companies have a code appended to the end of a URL:
https://www.realestate.com.au/property-house-qld-ormiston-141747584
https://www.brisbanetimes.com.au/national/queensland/childcare-centre-could-face-prosecution-for-leaving-child-on-hot-bus-20230320-p5ctqs.html
Can I ask if there are any negative SEO implications of doing this? Cheers Dave
Technical SEO | | Redooo0 -
How to rank a website in different countries
I have a website which I want to rank in the UK, NZ, and AU, and I want to keep my domain as .com in all the countries. I have specified lang=en; now what needs to be done to rank one website in three different English-speaking countries without changing the domain extension, i.e. to .com.au or .com.nz?
SEO Tactics | | Ravi_Rana0 -
Why did my website's DA fall?
Hello, Could you please let me know why my website's DA might have fallen in merely a week? What might be the reason? I also noticed traffic from Google dropped in the very same week. I will be very thankful for any advice!
Technical SEO | | kirupa0 -
Is there a pinging tool to ping all sites at once
Hi, I am just wondering if there is a tool that you can put on your toolbar that allows you to ping all the sites at once. The last thing I want to keep doing is going through every single one and pinging my article. I would like to find a tool that does it all for me; can anyone let me know if there is one out there? Many thanks
Technical SEO | | ClaireH-1848860 -
403 forbidden error website
Hi Mozzers, I got a question about a new website from a new customer, http://www.eindexamensite.nl/. There is a 403 Forbidden error on it, and I can't find what the problem is. I have checked on http://gsitecrawler.com/tools/Server-Status.aspx with this result:
URL=http://www.eindexamensite.nl/ - Result code: 403 (Forbidden / Forbidden)
When I delete the .htaccess from the server I get a 200 OK :-), so the problem is in the .htaccess. The .htaccess code:
ErrorDocument 404 /error.html
RewriteEngine On
RewriteRule ^home$ / [L]
RewriteRule ^typo3$ - [L]
RewriteRule ^typo3/.*$ - [L]
RewriteRule ^uploads/.*$ - [L]
RewriteRule ^fileadmin/.*$ - [L]
RewriteRule ^typo3conf/.*$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
RewriteRule .* index.php
# Start rewrites for static file caching
RewriteRule ^(typo3|typo3temp|typo3conf|t3lib|tslib|fileadmin|uploads|screens|showpic.php)/ - [L]
RewriteRule ^home$ / [L]
# Don't pull *.xml, *.css etc. from the cache
RewriteCond %{REQUEST_FILENAME} !^.*\.xml$
RewriteCond %{REQUEST_FILENAME} !^.*\.css$
RewriteCond %{REQUEST_FILENAME} !^.*\.php$
# Check for Ctrl-Shift reload
RewriteCond %{HTTP:Pragma} !no-cache
RewriteCond %{HTTP:Cache-Control} !no-cache
# NO backend user is logged in
RewriteCond %{HTTP_COOKIE} !be_typo_user [NC]
# NO frontend user is logged in
RewriteCond %{HTTP_COOKIE} !nc_staticfilecache [NC]
# We only redirect GET requests
RewriteCond %{REQUEST_METHOD} GET
# We only redirect URIs without query strings
RewriteCond %{QUERY_STRING} ^$
# We only redirect if a cache file actually exists
RewriteCond %{DOCUMENT_ROOT}/typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html -f
RewriteRule .* typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html [L]
# End static file caching
DirectoryIndex index.html
The CMS is TYPO3. Any ideas? Thanks!
Maarten
Technical SEO | | MaartenvandenBos0