I’m not going to lie to you: ranking sites in Google was very easy a few years ago. In fact, it was so easy that you could guarantee top 3 rankings in Google for almost any keyword. If your competitor had 50 spammy, PageRank 5 backlinks pointing to his website, it was easy to outrank him by pointing 70 spammy, PR5 backlinks to your site. And if that wasn’t enough, a few extra PR6 backlinks would have fixed the problem for good!

Everything changed with the Google Penguin update; many SEO consultants all over the world went from being online heroes to being broke, with clients refusing to pay for search engine optimization that wasn’t working anymore. Not only that, but many client websites were penalized by Google, and sometimes they were completely removed from their search results! Ouch!

[Image: Google penalties]

Several years have passed since the mighty Penguin reconfigured the search engine optimization world. And the clever SEOs have adapted, learning to promote their clients’ sites through clean, 100% white hat methods that stand the test of time.

One of these methods that has always worked and will always work, regardless of Google’s future search algorithm updates, is broken link building. “How do you have the nerve to make this bold claim?”, I hear you asking.

First of all, broken link building helps make the web a better place by eliminating the broken links that plague millions of websites, and Google simply loves that. Then, it is a website traffic generation method that will continue to work even if Google decides to shut down its business tomorrow morning. Who needs search engines when you can get plenty of visitors from industry-related, highly trafficked websites that link to your site?

[Image: referral website traffic]

Take a look at the picture above; it’s from one of my clients’ accounts, and his website is getting 90.1% referral traffic. If Google disappeared tomorrow, he’d still get his 10,000+ highly targeted, monthly website visitors. This is the ideal traffic generation method for your business.

A link from a popular website can send you dozens of visitors per day, each day. Multiply that by 100 links and you will understand the huge power of broken link building.

Here’s how the process works in a nutshell:

a) Find high quality resource pages that have one or more broken links on them. In plain English, these are links that point to resources that no longer exist;

b) Create a resource that’s similar to, but better than, the defunct resource;

c) Contact the webmaster, telling him about the broken link and suggesting an alternative – your much better resource.

Now let’s break down the entire process into detailed, easy-to-follow steps.

1. Find industry-related resource pages

The Internet is a very dynamic medium, with many sites being created and disappearing on a daily basis. If a website links to a defunct web property, that link is broken. This is exactly what we want to fix, by creating a similar resource and asking the webmaster to update the page, replacing the broken link with a link to our website.
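Before we get to the search strings, here’s a minimal sketch of what “finding broken links on a page” means in practice. It uses only Python’s standard library; the resource page URL is a placeholder, and the helper names (`LinkExtractor`, `check_link`, `is_broken`) are mine, not part of any particular tool.

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collects the href value of every absolute <a> link on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and value.startswith("http"):
                    self.links.append(value)

def is_broken(status):
    """Treat 4xx/5xx responses (and no response at all) as broken."""
    return status is None or status >= 400

def check_link(url, timeout=10):
    """Return the HTTP status code for url, or None if the request fails."""
    try:
        req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code
    except (URLError, OSError):
        return None

if __name__ == "__main__":
    page = "https://example.com/resources.html"  # placeholder resource page
    req = Request(page, headers={"User-Agent": "Mozilla/5.0"})
    parser = LinkExtractor()
    parser.feed(urlopen(req, timeout=10).read().decode("utf-8", "ignore"))
    for link in parser.links:
        if is_broken(check_link(link)):
            print("BROKEN:", link)
```

In a real campaign you’d also want polite crawling (delays, robots.txt), but the core idea really is this simple: fetch the resource page, extract its outbound links, and flag anything that no longer responds with a 2xx/3xx status.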

So how do you find high quality, industry related websites?

Everything starts with a few Google searches. We want to find reputable websites that have already created good resource pages. Here’s a good list of search strings that will help you get started.

“keyword links”

“keyword” + inurl:links

“keyword” + “helpful links”

“keyword” + “useful links”

“keyword” + “resources”

“keyword” + inurl:resources

“keyword” + “helpful resources”

“keyword” + “other resources”

“keyword” + “useful resources”

“keyword” + “resource” + “email us”
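If you’re checking many keywords, it’s tedious to retype these by hand. Here’s a small Python sketch that expands any keyword into the ten footprint queries above; the function name `search_strings` is just my own label for it.

```python
# The ten footprint searches above, expressed as templates.
FOOTPRINTS = [
    '"{kw} links"',
    '"{kw}" + inurl:links',
    '"{kw}" + "helpful links"',
    '"{kw}" + "useful links"',
    '"{kw}" + "resources"',
    '"{kw}" + inurl:resources',
    '"{kw}" + "helpful resources"',
    '"{kw}" + "other resources"',
    '"{kw}" + "useful resources"',
    '"{kw}" + "resource" + "email us"',
]

def search_strings(keyword):
    """Expand a keyword into the full list of Google footprint queries."""
    return [fp.format(kw=keyword) for fp in FOOTPRINTS]

for query in search_strings("travel"):
    print(query)
```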

As an example, if you work in the travel industry, your Google search strings will look like this:

“travel links”

“travel” + inurl:links


Yes, keep the quotes, because they will help filter many unrelated, irrelevant sites.

[Image: finding resource pages]

How many results should you gather for your list? Try to limit your searches to the top three Google pages for each search string; otherwise, you will never get the job done.

In addition to this, if a website ranks in Google’s top 30 for a particular keyword, it should be a high quality website. This means that a link coming from it will be beneficial, also helping your website move closer to the top.

It is true that there are also a few sites that use spammy tactics and manage to crawl to the first Google page. We don’t want to get links from them, of course! Don’t worry, I’ll show you how to determine if a site is good or bad. But first, let me show you a tiny piece of code that will greatly simplify your work: Google Results Bookmarklet.

Open the link above in Firefox, and then drag the big green “Simple Google Results” button to the browser toolbar. The result should look like this:

[Image: Google Results Bookmarklet]

Run a Google search, and then press “Simple Google Results”; you will notice that the returned URLs are nicely formatted, so you can easily copy them to the clipboard.

Scroll down a bit; the 100% clean results, without any extraneous html, can be found in the “Plain Listing” section.

[Image: plain listing of Google results]

Copy the plain text results, and then paste them in a spreadsheet.

[Image: Google search results pasted into a spreadsheet]

Once you are done with the first keyword, running it through the 10 search strings and grabbing the URLs, move on to the second one and repeat the process.

Make sure to remove the duplicates at the end. You don’t want to evaluate the quality of a site several times just because slightly different search strings surfaced it more than once and you forgot that you’d already checked it.
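Deduplication can be done in Excel, but a few lines of Python handle it just as well. This sketch keeps the first occurrence of each URL and treats trailing slashes and letter case as equivalent; the `dedupe` helper name is my own.

```python
def dedupe(urls):
    """Remove duplicate URLs while keeping first-occurrence order.

    Trailing slashes and letter case are ignored when comparing,
    so http://Example.com/ and http://example.com count as one entry.
    """
    seen = set()
    unique = []
    for url in urls:
        key = url.strip().rstrip("/").lower()
        if key not in seen:
            seen.add(key)
            unique.append(url)
    return unique

harvested = [
    "http://Example.com/resources/",
    "http://example.com/resources",
    "http://another-site.com/links",
]
print(dedupe(harvested))
# → ['http://Example.com/resources/', 'http://another-site.com/links']
```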

If this looks like a lot of work, well… it is! That’s another reason why you should limit your research to the top 30 results.

Make sure to use relevant keywords for your research, including industry slang, etc. I have written an article that teaches advanced keyword research and analysis techniques; it’s a useful resource if you want to build a high quality, relevant keyword list.

As always, there are tools that can simplify the URL gathering process, and one of the best is… Scrapebox!

It’s too bad that this fantastic SEO tool has gotten such a bad reputation in the past, because many people have used it as an automated blog comment spammer. And while Scrapebox continues to incorporate its infamous blog comment poster, it is also jam-packed with tools that will simplify your white hat SEO life.

You can use Scrapebox to automatically gather URLs for resource pages not only from Google, but from dozens of other search engines around the world, simply by ticking their check boxes. That’s exactly what I did, and I got a list of over 35,000 travel resource pages.

[Image: Scrapebox resource harvesting]

Scrapebox has a built-in data deduplication tool as well, so you won’t need Excel to get rid of duplicates.

Of course, not all these resource pages will have the desired quality. Some of them may not be actual resource pages, others may belong to spammy sites, and so on. It’s time to filter the bad ones!

2. Evaluate the quality of your target websites

So how do you determine if a website is going to help your SEO efforts, rather than harm them?

PageRank has been a key quality indicator for many years, but now that Google has stopped showing us its updated values, there are a few other metrics that need to be taken into consideration as well.

Don’t get me wrong, I’m not throwing PR out of the window; it’s Google’s own way of measuring website quality, after all. It’s just that we can’t always make the best decisions by using its value alone.

So let’s start by evaluating PageRank; it’s easy to measure it, and then get rid of the low quality websites.

Simply Google “bulk pagerank checker” (without using the quotes) and you’ll get plenty of sites that allow you to paste several URLs at a time, and then check their PageRank value.

As an example, bulkpagerank.com allows you to paste up to 500 URLs at once. Copy the URLs to a new sheet, and then trim them, keeping only the domains. This can be done manually, or (much easier) by making use of one of the free services below:



You could also use an Excel formula to do the same thing, but in my experience it won’t work for every URL – some of these web addresses can have a really weird structure 😉
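For those weirdly structured URLs, Python’s standard `urllib.parse` module is more forgiving than an Excel formula. Here’s a sketch that reduces any harvested URL to its bare domain; the `to_domain` helper name is mine.

```python
from urllib.parse import urlparse

def to_domain(url):
    """Reduce a URL to its bare host name (scheme, path, port and www. dropped)."""
    if "://" not in url:
        url = "http://" + url  # urlparse needs a scheme to locate the host
    host = urlparse(url).netloc.split(":")[0]
    return host[4:] if host.startswith("www.") else host

urls = [
    "http://www.example.com/travel/links.html?page=2",
    "https://example.com:8080/resources",
    "example.com/useful-links",
]
print([to_domain(u) for u in urls])
# → ['example.com', 'example.com', 'example.com']
```

Run your whole spreadsheet column through this, dedupe the result, and you have a clean list of domains ready to paste into a bulk checker.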