
How I Successfully Removed a Google Penalty

Google penalties can leave website owners feeling stranded and losing a lot of money. One website owner came to me for help after learning that their site wasn't even ranking for its own brand name. The site, an industry leader with a 10-year-old domain, was under an algorithmic penalty.

The site was ranking well until 2015, when traffic started to decline.

You know the story: they hired multiple SEO agencies over the years, those agencies exported SEMrush reports, and the site's traffic remained low. When they hired an SEO agency, a so-called Google penalty removal agency, they hoped to get the penalty removed. That agency worked on the site for six months with no success.

A company this big relies on a healthy stream of traffic and leads to its site. Six months is a lot of lost:

  • Traffic
  • Leads
  • Revenue

So, they came to me for help. It took eight weeks of work, but luckily I was able to remove the penalty and get the site's traffic back to previous levels.

The steps I took are steps that every site owner can take to recover from a Google penalty.


Tools I Used During the Penalty Recovery

Tools help automate some of the tasks and keep me organized during the recovery process. Since the site had over 50,000 pages at the start, I definitely needed a few tools to help me make sense of the site and its backlink portfolio, and to provide a good foundation for a penalty recovery.

The tools I relied on the most were:

  • Screaming Frog
  • Google Analytics and Google Search Console
  • Excel
  • Ahrefs
  • LinkResearchTools and Link Detox
  • PageSpeed Insights and Lighthouse

Week 1: Learning the Site’s History and Tracking Changes

Every site has a past, and with a ten-year-old site, it's easy for common issues to go overlooked. A broken link or redirect issue may be noted by one team member and never be corrected. I decided to build a foundation for the site and make sure that the site's main issues were corrected first.

I took seven main steps during the first week.

1. Crawl the Site with Screaming Frog

Screaming Frog takes a lot of the manual work out of the equation. I decided to run the program to:

  • Conduct a link audit
  • Find broken links
  • Find redirect issues
  • Find site errors
  • Review page titles
  • Review meta data
  • Find duplicate pages

The free version works fine for sites with fewer than 500 URLs, but there is a paid version with a lot of extras, including crawl customization, AMP crawling and validation, and other options that are not available in the free version.

You can find most of the main issues that I'll be talking about with the free version if your site is small.
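If you want to see what a link check is doing under the hood, here is a minimal Python sketch that flags non-200 responses for a handful of URLs. The URL list is only an illustration; Screaming Frog does the same kind of check at scale.

```python
# A minimal broken-link check over a handful of URLs (illustrative list).
# Uses the third-party "requests" library.
import requests

pages_to_check = [
    "https://example.com/",
    "https://example.com/blog/",
    "https://example.com/contact/",
]

for url in pages_to_check:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as exc:
        status = f"error: {exc}"
    if status != 200:
        # Anything that is not a clean 200 goes on the review list.
        print(f"{url} -> {status}")
```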

Analytics and annotations can be your best friend

2. Added Search Engine Updates as Annotations in Google Analytics

Google pushes a lot of updates and algorithmic changes to hold its position as the top search engine. What I wanted to do was find a way to add all of these updates to Google Analytics so that I had a clearer picture of what happened to the site's traffic and when.

Search Engine Watch has a great article on how to add annotations and all of the search engine updates to Google Analytics.

The reasons I did this were:

  • To quickly see which major updates correlated with the site's drop in traffic.
  • To determine which of the site's issues arose around major updates.

By being able to visualize the site's traffic, it was a lot easier to make update-specific changes to the site in an effort to remove the site's penalty.

3. Track My Own Actions in Annotations

It's easy to forget every step that you take to correct a penalty. I wanted a clear way to know which actions I took and when, so I decided to track all of my own actions through annotations to have a strong history of the steps taken to remove the Google penalty.

4. Identified Broken Site Code

When looking through Screaming Frog, I noticed that there were a lot of pagination issues across the site's URLs, especially in the blog section. I discovered that each new page had issues with:

  • Canonical URLs
  • Duplicate content of the main page

For example, /page/ and /page/12 would have the same content. There was broken code that was producing this issue, so I tracked it down.
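If you suspect a similar pagination problem, a quick scripted spot-check of canonical tags can confirm it before you dig into the code. This is a rough sketch; the URL pattern and page range are assumptions, so adapt them to your own site.

```python
# Rough check of canonical tags across paginated URLs (illustrative pattern).
import re
import requests

def find_canonical(html: str):
    # Grab <link ...> tags that declare rel="canonical", then pull out href.
    for tag in re.findall(r"<link[^>]+>", html, re.IGNORECASE):
        if re.search(r'rel=["\']canonical["\']', tag, re.IGNORECASE):
            href = re.search(r'href=["\']([^"\']+)["\']', tag, re.IGNORECASE)
            return href.group(1) if href else None
    return None

base = "https://example.com/blog/page/"  # assumed pagination pattern
for page_num in range(1, 6):
    html = requests.get(f"{base}{page_num}", timeout=10).text
    # Pages that all point their canonical at the same URL (or serve identical
    # content) are the kind of issue described above.
    print(f"page {page_num}: canonical -> {find_canonical(html) or 'MISSING'}")
```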

5. Corrected Broken Site Code

I was able to correct the issue by fixing the broken code, but you may need to hire someone to do this part for you. Even after I fixed the broken code, I found that there were thousands of these pages still indexed in Google search.

These pages should drop off eventually.

If these URLs have a common prefix, you can use Google Search Console's "Remove all URLs with this prefix" feature:

Google Search Console: Remove all URLs with this prefix feature

6. Updated All Broken Links

With the pagination issue corrected, I decided to open up Screaming Frog again and update all of the broken links that were found. Some links were removed, and some were updated.

Every time I ran a crawl, I used Excel to load in all the data and compare it to the previous crawl of the site; a scripted version of this comparison is sketched just after the list below. My report page logged and showed me a number of metrics, including but not limited to:

  • Number of pages crawled
  • Number of 301s
  • Number of 404s
  • Number of other errors
  • Number of pages for each section of the site (blog, main pages, pillar content, etc.)
  • Number of pages without meta details
  • Number of index/noindex pages
  • Etc.
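If you prefer scripting to Excel, here is a minimal sketch of that crawl-to-crawl comparison using pandas. The file names and the column names (Address, Status Code) are assumptions based on a typical Screaming Frog internal export, so adjust them to match your own files.

```python
# Compare two crawl exports and summarize the metrics listed above.
import pandas as pd

before = pd.read_csv("crawl_before.csv")
after = pd.read_csv("crawl_after.csv")

def summarize(crawl: pd.DataFrame) -> dict:
    return {
        "pages_crawled": len(crawl),
        "301s": int((crawl["Status Code"] == 301).sum()),
        "404s": int((crawl["Status Code"] == 404).sum()),
        "other_errors": int((crawl["Status Code"] >= 500).sum()),
        "blog_pages": int(crawl["Address"].str.contains("/blog/", na=False).sum()),
    }

summary = pd.DataFrame([summarize(before), summarize(after)], index=["before", "after"])
print(summary)
```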

7. Updated All 301 Redirects

The final step that I took was to update all of the 301 redirects on the site to make sure that they worked properly and pointed to the appropriate pages. You'll have to look through your .htaccess files or use a plugin (depending on your CMS) to make all of these updates.
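To confirm that the redirects actually land where they should, a short script can spot-check a redirect map. The mapping below is illustrative; load your own old-to-new URL list instead.

```python
# Spot-check 301 redirects: each old URL should resolve to the expected page.
import requests

redirect_map = {
    "http://example.com/old-page": "https://www.example.com/new-page",
    "http://example.com/old-category/": "https://www.example.com/category/",
}

for old_url, expected in redirect_map.items():
    response = requests.get(old_url, allow_redirects=True, timeout=10)
    verdict = "OK" if response.url == expected else "CHECK"
    print(f"{verdict}: {old_url} -> {response.url} (expected {expected})")
```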

After I performed all of these tasks, I decided to run Screaming Frog again to make sure that there were no issues that I had missed. If you find any issues that you missed, we'll go through an additional stage in week two to correct them.

Week 2: Major Site Cleanup, External Link Removal

In week two, I sat down and went through the site even further. Each step in this week's process is rather straightforward.

1. Removed Unused Development Subdomains

Developers often create subdomains to test a site and any updates to the site or its code before pushing the changes to the live server. I found a few subdomains that were left behind by the developer.

The subdomains were all indexed by Google, potentially causing duplicate content issues.

I decided to remove all of these subdomains and started the deindexation process on search engines.
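To verify that retired development subdomains stay out of the index, you can periodically check each host's response. Here is a small sketch with hypothetical subdomain names; ideally each one returns a 404/410 or a noindex directive in the X-Robots-Tag header.

```python
# Check that retired development subdomains are gone or marked noindex.
import requests

dev_subdomains = ["dev.example.com", "staging.example.com", "test.example.com"]

for host in dev_subdomains:
    try:
        response = requests.get(f"https://{host}/", timeout=10)
    except requests.RequestException as exc:
        print(f"{host}: unreachable ({exc})")
        continue
    robots = response.headers.get("X-Robots-Tag", "none")
    # A 404/410 status, or a noindex directive, is what we want to see here.
    print(f"{host}: status={response.status_code}, X-Robots-Tag={robots}")
```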

2. Cleaned Up More Broken Links and 301 Redirects

The remaining broken links and redirects that needed to be corrected after the first week's work were corrected this week. When you have a major site with over 50,000 pages, you may not be able to fix all broken links or redirects in a week.

3. Action Tracking and Screaming Frog Progress

I ran Screaming Frog every day. I would also:

  • Create a summary page
  • Pull data from each export
  • Track progress

I wanted to be able to track my progress, so I made sure to run Screaming Frog to confirm I had fixed any major errors along the way. I also used Excel to track each action I was taking so that I had a clear record of everything I did.

My initial crawl of the site had 50,000 pages, and during week two, we were down to 2,000 pages (remember the pagination issue). I still had quite a few pages to remove at this point, so I continued correcting any duplicate or unnecessary pages on the site.

4. Cleaned Up the Site’s External Links

The site had a lot of external links, and many of them needed to be removed. I went through each page on the site to remove any of the following external links that didn't make sense:

  • Broken third-party links
  • Branded homepage links (many overused)
  • Fake security badges

I left all of the external links on the site that made sense, but I made sure to remove any links that no longer belonged there.

5. Conducted a Full Content Audit

I conducted a full content audit that I would use in subsequent weeks to further clean up the site. Screaming Frog already provides you with a list of all of your content, so you can use this list for your audit.

Export the results of the list to CSV or XLS, and then start to analyze your pages.

You'll want to make note of pages that have very thin content or pages that no longer make sense for the domain to publish. You can easily do this in Excel and use it in week three.
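If the audit is too large to comfortably work through in Excel, the same flagging can be scripted. This sketch assumes a crawl export and an analytics export with Address, Word Count and Visitors columns, plus arbitrary thresholds, so treat it as a starting point rather than a rule.

```python
# Flag thin or low-traffic pages from a crawl export joined with analytics data.
import pandas as pd

crawl = pd.read_csv("content_audit.csv")          # assumed columns: Address, Word Count
traffic = pd.read_csv("analytics_pageviews.csv")  # assumed columns: Address, Visitors

audit = crawl.merge(traffic, on="Address", how="left").fillna({"Visitors": 0})

thin = audit["Word Count"] < 300        # illustrative thin-content threshold
low_traffic = audit["Visitors"] < 10    # illustrative traffic threshold
candidates = audit[thin | low_traffic]

candidates.to_csv("removal_candidates.csv", index=False)
print(f"{len(candidates)} pages flagged for review")
```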

6. Site Speed and Lighthouse Reports

Google wants sites to be snappy, and if they're not, it will impact rankings and user experience. I ran the site through PageSpeed Insights and created Lighthouse reports so that I could better understand the site's overall speed.

You'll find that both of these tools provide insights and suggestions to improve your site's speed. You can also integrate Google's PageSpeed Insights API with Screaming Frog.
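For reference, this is roughly what a direct call to the PageSpeed Insights v5 API looks like in Python; confirm the endpoint and response fields against Google's current documentation before relying on it.

```python
# Fetch a PageSpeed Insights report via the public v5 API (endpoint and response
# structure as I understand them; verify against Google's docs).
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

report = requests.get(API, params=params, timeout=60).json()
score = report["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")
```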

Work on your site's speed until your reports look healthy.

7. Removed Firewall Blocks on Googlebot IPs

While going through the site, I noticed that WordPress and Cloudflare had both manually blocked some of Googlebot's IP addresses. I removed all of these firewall blocks to ensure that all of Google's crawlers could properly crawl the site.

You can run a reverse DNS lookup on the IPs, or you can follow a general Googlebot IP address list to find any IPs you may be blocking that belong to Google.
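Google's recommended way to confirm a Googlebot IP is a reverse DNS lookup followed by a forward-confirming lookup. Here is a small Python sketch of that check; the example IP is only illustrative.

```python
# Verify a suspected Googlebot IP: reverse DNS, then forward-confirm the result.
import socket

def is_googlebot(ip: str) -> bool:
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
    except socket.herror:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward-confirm: the hostname should resolve back to the same IP.
        forward_ips = {info[4][0] for info in socket.getaddrinfo(hostname, None)}
    except socket.gaierror:
        return False
    return ip in forward_ips

print(is_googlebot("66.249.66.1"))  # example IP for illustration
```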

Week 3: Corrected Issues Found in Week 2 and Started to Tackle Backlinks

I did a lot of preliminary work in the first two weeks and started to get a feel for the site I was working on. A lot of key issues were corrected, and I laid much of the groundwork for the weeks to come.

The third week is where I started to correct a lot of the issues I already knew existed on the site from the previous week's research.

1. Removed Low-Quality Pages and Pages with Little Traffic

The content audit from the previous week allowed me to remove all of the thin, low-quality pages on the site. I removed all of these pages and also removed any pages that:

  • Were created more than two months ago
  • Had fewer than 10 visitors in the past year

I went through each page on the site and removed all of these low-quality pages one by one.

But I also looked at all of the pages I was going to remove to find any pages that had valuable backlinks. I used Ahrefs for this so that I could leverage some of this link value in the next step.
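One way to do this cross-check is to join your removal list against an Ahrefs backlink export. The file and column names below are assumptions, so match them to whatever your export actually contains.

```python
# Cross-reference pages slated for removal against a backlink export so that
# pages with valuable links get a 301 instead of a plain delete.
import pandas as pd

to_remove = pd.read_csv("removal_candidates.csv")    # assumed column: Address
backlinks = pd.read_csv("ahrefs_best_by_links.csv")  # assumed: Target URL, Referring Domains

merged = to_remove.merge(
    backlinks, left_on="Address", right_on="Target URL", how="left"
).fillna({"Referring Domains": 0})

needs_redirect = merged[merged["Referring Domains"] > 0]
needs_redirect.to_csv("pages_to_redirect.csv", index=False)
print(f"{len(needs_redirect)} removed pages should be 301 redirected")
```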

2. Removed Additional Broken Links

Removing pages on the site caused a few broken links to pop up. I ran Screaming Frog to correct all of the broken links pointing to the pages removed in the first step of week three.

3. 301 Redirected Removed Pages That Had Valuable Links

I removed a lot of pages from the site, but a lot of these pages still had considerable backlinks to them. I decided to 301 redirect these pages to maintain some of these valuable backlinks.

Remember, don't just 301 everything to the homepage; always redirect to related content or related pages.

4. CMS Cleanup

The site was using WordPress, but the site you may be working on could use any other CMS. I looked at the theme and found that it was:

  • Generating unnecessary links
  • Generating unnecessary code

I noted all of these generation issues and also removed any plugins that:

  • Generated massive amounts of code
  • Served little purpose on the site

Plugins can slow down a site drastically, and removing them helped clean up a lot of code and speed issues.

5. Corrected Site Speed Issues

Lighthouse provides in-depth information that I passed along to the development team to correct. The team followed all of the recommendations from Lighthouse to speed up the site considerably.

6. Time to Tackle Backlinks

Backlinks are always going to be a major focal point when trying to recover from a penalty. I decided that it was time to dig into the site's backlinks because I was going to be working on them extensively in the coming weeks.

I used two main tools to help me:

  1. LinkResearchTools
  2. Link Detox

Both of these tools make it quick and easy to find backlinks that are toxic to your site.

I downloaded the existing disavow file, which had over 2,000 links listed. I started to look into all of the URLs individually to learn more about the links that were already disavowed and the potential links that I would add to the list in the coming weeks.
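Because the disavow file grows over several rounds of cleanup, it can help to script the merge so entries stay deduplicated and in Google's expected format (one domain: entry or full URL per line, # for comments). The domain names below are illustrative.

```python
# Merge newly identified toxic domains into an existing disavow file.
new_toxic_domains = ["spammy-directory.example", "link-farm.example"]  # illustrative

with open("disavow.txt") as handle:
    entries = {line.strip() for line in handle if line.strip() and not line.startswith("#")}

entries.update(f"domain:{domain}" for domain in new_toxic_domains)

with open("disavow.txt", "w") as handle:
    handle.write("# Updated disavow file\n")
    handle.write("\n".join(sorted(entries)) + "\n")

print(f"{len(entries)} entries in the disavow file")
```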

7. Deleted Pages That May Get the Site Considered YMYL

Google's E-A-T and YMYL guidelines really allow us to better understand how Google distinguishes between high-quality and low-quality content. YMYL stands for Your Money or Your Life.

YMYL was a major issue when I decided to look through historical pages on the site.

I noticed that many of these pages mentioned specific drugs, and I was very worried that these pages might be considered YMYL.

The backlinks to the site also included a lot of links from controlled-substances pages, so I asked the company if they still dealt with these substances. Since they no longer did, it was safe to delete all of the pages that might fall in the scope of YMYL.

Essentially, some of these pages and the links to them may have been seen as:

  • Deceptive
  • Inaccurate
  • Untruthful

I knew that the coming weeks would include a lot less groundwork and a lot more steps to really try to lift the penalty off of this client's site.

Weeks 4-6: Backlink Cleanup, Structured Data Issues and Citation Work

Over the next few weeks, I decided it was time to work on backlink cleanup. I had been adding URLs to the disavow file, but I noted that there wasn't any recovery as a result. I was trying to recover anything at this point, but the site didn't even budge for branded keywords, such as the company name.

Facebook, Glassdoor and Yelp, along with a few other popular sites, started to dominate the search results, pushing the website's rankings down even further.

A lot of the work done during these few weeks included:

1. Tedious Backlink Cleanup and Disavow File Updates

I initially removed the disavow file, but I found that the site's rankings dropped a few positions. When the site's rankings started to fall, I uploaded the disavow file again and started the time-intensive backlink cleanup.

I went through all of the linking sites, added any that were toxic and updated the disavow file.

Every few days, I would upload an updated disavow file. Combing through all of the backlinks took up a lot of time, but it was an important part of the entire process. A lot of sites that were once authority sites were now toxic.

2. Fixed Structured Data Issues

Structured data helps Google display your website with search appearance elements. Google has a really good introduction to structured data if you've never heard of it before.

When you use structured data properly, it helps organize and optimize your website. Search engines will read this data and display it accordingly. For example, a lot of recipe sites use structured data to get richer snippets on Google that include ingredients, pictures, ratings and other useful information.
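As a concrete example, here is a minimal JSON-LD block for a recipe page, generated with Python's json module. The property names follow schema.org's Recipe type as I understand it, and the values are illustrative; the output would be embedded in a script tag of type application/ld+json on the page.

```python
# Generate a JSON-LD structured data block for a recipe page (illustrative values).
import json

recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Classic Pancakes",
    "image": "https://www.example.com/images/pancakes.jpg",
    "aggregateRating": {"@type": "AggregateRating", "ratingValue": "4.7", "reviewCount": "120"},
    "recipeIngredient": ["2 cups flour", "2 eggs", "1 cup milk"],
}

# Embed the printed JSON inside a <script type="application/ld+json"> tag.
print(json.dumps(recipe, indent=2))
```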

I found that the site had a lot of issues with structured data.

I went through the site to correct these issues. With the structured data cleaned up, I was confident that the site would have richer displays in the search results.

3. Citation and Social Profile Updates

The site had a lot of citation links and social profiles that linked to it. Some of these links pointed to the non-www version of the site, and some linked to the http version rather than the https version.

I wanted to eliminate the unnecessary load on the server from the 301 redirects that these links caused.

What I did was make sure that all of these citations and social profile links pointed to the https://www version of the site.

Weeks 7-8: Penalty Recovery and Rankings Restored

Google has a lot more resources and processing power than I have at my disposal. When I was going through my tedious backlink-checking tasks, I realized that Google already knows which of these sites are toxic.

I decided to delete all of the disavow files and trust that Google already knew which backlinks were toxic.

The site had four main properties:

  1. http-non-www
  2. http-www
  3. https-non-www
  4. https-www

I removed the disavow file from all four of these properties and let Google take care of the rest.

When I first started to clean up the website, it had over 50,000 pages. A lot of these pages were created by erroneous pagination code. Some pages had thin content or content that simply didn't make sense for the site to have any longer.

The page total after the entire site cleanup was 400-500 pages, with:

  • All on-site issues corrected
  • Loading speeds increased by at least 50%
  • Structured data cleaned up
  • Redirects in place

Link Detox is a tool I relied on heavily to find good and bad links in my disavow files and in the site's backlink portfolio. When I removed the disavow files, I needed Google to check all of these backlinks again and really allow Google to evaluate the backlinks on its own.

Link Detox Boost® really helped in this case because it helps sites recover from Google penalties faster. The disavow only works when Googlebot recrawls the links to a site. The Boost tool nudges Googlebot to check out these links again and value the links that actually make sense.

After a few days, Google re-indexed pages and started its own evaluation of the backlinks.

It took about a week before the site finally reappeared in the search results for its own brand name. The site was in position two in the search results, and a day later, it was in position one for its own brand name.

The penalty was finally lifted.

Mobile results took a few additional days to catch up to the regular search results, but they eventually did catch up.

Google was happy. I was happy. The client was happy, and the site started to recover to traffic levels it hadn't experienced in years.

What to Check When Doing a Penalty Audit

When performing my penalty audit, I had a long list of items that needed to be checked. If you're performing a penalty audit, I recommend following my penalty audit checklist below:

  • Backlink profile
  • Broken links
  • Broken site code
  • Citations
  • CMS plugins
  • Content audit
  • Duplicate pages
  • External links
  • Firewall settings
  • Redirect issues
  • Site errors
  • Site speed
  • Social profile links
  • Structured data
  • Unused subdomains
  • Code Audit

Manual Penalties

If you know your website received a manual penalty, then you should check the reason in Google Search Console, fix the issues and communicate the updates back to Google's Search Quality team for further review. Submit a reconsideration request when you have fixed the issues.

