
17 Killer Link Building Strategies for 2018 (with Examples and Scripts)

Note: “The SEO Playbook”, my SEO training course, will re-open to 100 more students on 10/29. See what other students are saying here.

Link building is one of the most challenging aspects of SEO.

But it also has a HUGE impact on where your site ranks in the SERPs.

(this has been shown time and time again in pretty much every ranking factors study)

Some people, however, still argue that Google’s reliance on backlinks is diminishing, but let’s be real here:

Backlinks aren’t going anywhere any time soon.

After analyzing 1 million SERP results, this study found that the number of domains linking to a page correlated with rankings more than any other factor:

Number of referring domains


Don’t take my (or the many, many studies) word for it. Type any keyword into Ahrefs Keyword Explorer and look at the top 10 ranking pages:

Ahrefs Keyword Explorer showing link count of top ranking pages

I guarantee the top ranking pages will have a ton of links (assuming the keywords have significant search volume, of course).

In this post, I’ll show you how to use 17 killer link building strategies to build a boatload of high-quality links to your site…FAST 🙂

I’ve used each of these tactics to build white hat backlinks to client websites, and steadily grow the link profile of this blog:

Growth in backlink profile

Here’s a taste of what you’ll learn:
  • Actionable tactics to quickly build white hat backlinks
  • How to hack links to “money” product/service pages 
  • How to build channels you can tap for links over and over again
  • How to build links that actually drive targeted referral traffic

And much more. 

Because this post is a 12,000+ word beast, I’ve included a table of contents below for quick reference (feel free to jump around):

Let’s get started.

17 Actionable Strategies to Build Quality Backlinks and Boost Rankings (Fast)

#1. Build High-Probability Link Channels with
Custom Search Engines

Let’s say I wanted to build links to my recently published on-page SEO checklist.

My (hypothetical) list of prospects includes Point Blank SEO (by Jon Cooper) and Matthew Barby’s blog.

Both of these are well-known SEO blogs, but who is more likely to link to me?


It’s Matthew Barby.

Why? Because Matthew Barby has linked to me in the past whereas Jon Cooper hasn’t (c’mon Jon…gimme that link :D)

Backlink from Matthew Barby

In general, websites that have linked to you in the past are more likely to do so again.

Why? Because they’re already fans of your content and, therefore, are more likely to be interested in any new content you produce (perhaps even enough to link to it).

(pretty obvious, right?)

Now, imagine for a second that you had a fully searchable database of all the websites who’ve already linked to you and, therefore, are likely to be interested in your new content.

Wouldn’t that be crazy powerful!?


And here’s the good news: you can build this database using just two tools: Ahrefs Site Explorer and Google’s CSE.

Here’s how to do it:

  1. Find all websites already linking to you (with ‘natural’ editorial links).
  2. Load them into a custom search engine.
  3. Use the search engine to find likely link prospects for future content.

Let me walk you through it from start-to-finish (note: you can read my complete Ahrefs review here).

First, use the “backlinks” tab in Ahrefs Site Explorer to export a list of all websites currently linking to you:

Use Ahrefs Site Explorer to find all sites linking to you

NOTE: I also recommend adding these filter settings before exporting (this will weed out the junk and make sure most of the remaining links are “natural” editorial links from blogs):

Ahrefs backlink filters

Export this file as a csv.
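If you prefer a script to the spreadsheet, here's a sketch that does the same prep work: it reads the Ahrefs export and emits unique "domain/*" patterns ready to paste into the CSE bulk-upload box. The "Referring Page URL" column name is an assumption, so check it against your export's header:

```python
import csv
from urllib.parse import urlparse

def cse_site_patterns(csv_path, url_column="Referring Page URL"):
    """Reduce an Ahrefs backlink export to unique 'domain/*' patterns
    suitable for bulk upload into a Google Custom Search Engine.

    Note: the url_column name is an assumption; verify it against
    your export's actual header row.
    """
    domains = set()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            host = urlparse(row[url_column]).netloc
            if host.startswith("www."):
                host = host[4:]  # treat www/non-www as one site
            if host:
                domains.add(host)
    return sorted(f"{d}/*" for d in domains)
```

Run it against the export, then paste the resulting list straight into the "include sites in bulk" box described below.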

Now, before you do anything else, make a copy of this spreadsheet and import the Ahrefs .csv export into the first tab (i.e. the tab appropriately labelled “IMPORT AHREFS EXPORT HERE”), by selecting cell A1 then going to:

File > Import > Upload > Select a file from your computer (note: choose the Ahrefs export csv file)

NOTE: Make sure your import settings are as follows:

Backlink export settings

Click “Import” and you should see all of the data from the csv imported into the tab.

IMPORTANT: Don’t close this spreadsheet; you’ll need it again in a sec.

Next, sign up for Google’s CSE tool and create a new search engine.

Create a Google Custom Search Engine

On this next screen, it’s going to ask you to enter at least one website you want to search, and to give your custom search engine a name.

Go back to the spreadsheet and navigate to the next tab (note: this is labelled “1. First website”); you should see a value in cell A1 (hint: this should be formatted as “*”).

Copy/paste this into the “Sites to search” box on your custom search engine; also give your search engine a name.

Set up Custom Search Engine

Click “create”.

You now have a custom search engine that will allow you to search one domain from your backlink profile. But that’s not much use, so we need to add the rest of the domains.

To do this, click “edit search engine” from the left-hand menu and select the search engine you just created.

Under the “sites to search” section, click the “add” button to add more sites.

Add sites to Custom Search Engine

Select the option to “include sites in bulk” then copy the big list of domains from the next tab of your spreadsheet into the box (note: this is the tab labelled “2. More websites”):

Bulk adding domains to Custom Search Engine

Click “save”.

That’s it – your custom search engine is now set up and ready to go 🙂

Now, whenever you have a new piece of content to promote (i.e. build links to), you can use your custom search engine to search for prospects who have written about that topic before.

For example, if I wanted to build links to my on-page SEO checklist, I could enter the phrase “on page seo” into my custom search engine to find sites that have written about on-page SEO before:

Custom Search Engine results

Tip: This link building tactic becomes more powerful as your link profile grows because you have a larger database of websites to search across.

If you only have a small number of sites linking to you, use semantic keywords to widen the pool of potential link prospects.

For example:

When I perform a keyword search for “seo checklist” there are several results showing for on-page SEO checklist, local SEO checklist, technical SEO checklist, and WordPress SEO checklist:

On page SEO semantic variations

Google is telling us it sees those variations as semantically similar to the main query, “on-page SEO checklist”.

Try entering some of those other keyword variations into your Google Custom Search Engine to see if any new link prospects surface:


There are a lot of different ways you can find semantic and related keywords. Here are a few:

  1. Google autosuggest
  2. Google related searches
  3. LSI Graph
LSI Graph

Make a list of any new related keywords and search across your custom search engine with those terms to uncover additional link prospects.
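You can also pull autosuggest variations programmatically. This is a hedged sketch using the unofficial suggest endpoint browsers use (client=firefox), which returns JSON shaped like ["query", ["suggestion", ...]]. It's not a supported Google API, so treat it as a convenience, not a guarantee:

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

# Unofficial endpoint (assumption: it keeps working; not a supported API).
SUGGEST_URL = "https://suggestqueries.google.com/complete/search?client=firefox&q={}"

def parse_suggestions(raw_json):
    """The endpoint responds with JSON shaped like ["query", ["s1", ...]];
    return just the list of suggestions."""
    return json.loads(raw_json)[1]

def autosuggest(keyword):
    """Fetch autosuggest variations for a seed keyword like "seo checklist"."""
    with urlopen(SUGGEST_URL.format(quote(keyword))) as resp:
        return parse_suggestions(resp.read().decode("utf-8"))
```

Feed each returned variation back into your custom search engine to surface extra prospects.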

NOTE: All of these results are solely from the pool of sites entered into your custom search engine (i.e. those already linking to you). This will not search the entire web.

Now you know exactly who to promote your post to.

Prefer video? Watch the video walk-through below:


#2. Steal Competitor Backlinks with the RLR Framework

I’ll let you in on a secret…

You’re not the only person in your niche actively trying to build backlinks.

In fact, I’d be willing to bet that most of your competitors are actively investing in link building, too. They may even have built a ton of links already.

And, believe it or not, this is actually a good thing.

Why? Because they’ve done all the hard work for you; all you need to do is to replicate their link profiles.

I use the RLR technique for this (learn why it’s called this and get access to the competitor link tracking template shown below, here).

Competitor link tracking template

Here’s the general process:

  1. Research the competition (i.e. find similar websites and/or pages in your niche with a bunch of links)
  2. Figure out how competitors are building links
  3. Replicate/steal their tactics

Simple, right!?

Let’s walk through the basics.

First, go to Google, enter a term you’re trying to rank for (e.g. “link building strategies”), then copy the URL of the no.1 ranking page:

Copy URL of top ranking article

Paste this into Ahrefs Site Explorer and go to the “backlinks” report (on the left-hand menu):

Ahrefs backlinks report

It’s now a case of sifting through these backlinks and trying to figure out how your competitor is attracting these links.

Here are a couple of common link acquisition strategies to look out for:

Guest posts — these can be found quite easily by checking out the page itself as it’ll probably state that it’s a guest post. Most guest posts will also have links with branded anchor text (e.g. “Robbie Richards”):

Competitor guest post placements

Simply hit Details >> Referring Domains to verify the source of the links:

Source of branded anchor links in Ahrefs

Make a note of the strategic guest post placements in this competitor link tracking template: 


Recurring backlink sources — if you see the same domain popping up time and time again, it’s likely that your competitor has built a relationship with that website. It’s also likely they’re happy to link to good content, so it may be worth forging a relationship and reaching out when you publish a quality content asset:

Finding recurring backlink sources in Ahrefs

NOTE: I dedicated an entire post to this link building strategy, where I go into much more detail regarding a number of tactics for finding and stealing competitor backlinks. I recommend giving it a read!

In essence, once you’ve identified the methods your competitors are using to acquire backlinks, it’s then a case of replicating their tactics.

For example, let’s assume your competitor was acquiring a lot of links from guest posts.

You could simply round up a list of prospects (using Ahrefs) then ask if they’d be interested in a guest post from you, too.

Here’s a sample outreach email:

Hey [NAME],

It’s Robbie here from

I was just reading a couple of posts on your website and noticed [COMPETITOR NAME] wrote a guest post for you a while back.


Are you still accepting guest posts?

If so, I’d love to write something for you, too. I have a ton of ideas, so just let me know and I’ll send them over.


It’s as simple as that! 🙂

I used this exact strategy to find that one of my competitors was guest posting on Digital Marketer. So, I wrote one for them too:

Guest post conversions


866 referral visits and 68 new subscribers (7.85% conversion rate). 

I’ve since scaled this strategy, driving thousands of new visitors to the site. But, more on this later…


#3. Build Hundreds of Links with DEEP Broken Link Building (at Scale)

Broken link building is nothing new.

It’s been around for years. And it’s a tried and tested link building strategy.

Here’s the basic process:

  1. Find broken links
  2. Replicate the content (that used to exist at that link) on your website
  3. Reach out to the person/website linking to the broken resource and suggest that they change the broken link to the working link (i.e. to the replicated content on your website)
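Before recreating anything, it's worth re-verifying that the link really is dead. A minimal check using only the Python standard library might look like this (the User-Agent header is just an assumption to avoid being blocked by some servers):

```python
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

def status_of(url, timeout=10):
    """Return the HTTP status code for a URL, or None if it's unreachable."""
    try:
        req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code
    except URLError:
        return None

def is_broken(status):
    """Treat unreachable URLs and 4xx/5xx responses as broken."""
    return status is None or status >= 400
```

For example, `is_broken(status_of("https://example.com/dead-page"))` before you invest time recreating the content.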


CitationLabs has a huge list of link building resources here.

But, resource #19 on the list no longer exists—it’s a broken link.

Finding broken link opportunities

Note: You can use LinkMiner (free Chrome extension) to instantly find and highlight all broken links on any webpage. This is how I found this broken link.

If you plug the broken link into the Wayback Machine, you can see this used to be a resource about broken link building…

Wayback Machine

(yep, totally meta!)

So, how could you capitalize on this?

Simple…create a similar piece of content about broken link building, publish it on your website, then reach out to the folks at Citation Labs with an email like this:

“Hey Garrett,

It’s Robbie Richards here. How are things?

I’m reaching out because I just spotted a dead link on your list of 53 link building resources. It’s #19 on the list—basically, it just takes me to a 404 page 🙁

Just thought I’d give you a quick heads-up as you may want to fix it?

PS. If you’re not sure what to replace the link with, I have a huge list of link building strategies (which covers most of the stuff the broken resource used to cover, and more) here:


BOOM…you just landed a link (probably).

Now, I know what you’re thinking:

“Robbie, this seems like a CRAZY amount of work for just one measly link…”

I hear you.

So, here’s a simple trick you can use to (potentially) turn a single broken link into hundreds of link building opportunities:

Go to Ahrefs Site Explorer and paste in the broken link:

Ahrefs Site Explorer

Then, look at the number of backlinks and referring domains pointing to the page:

Number of Referring Domains in Ahrefs


Because this broken link has 9 referring domains, you can also contact those websites and suggest that they replace the broken link with a link to your website.

Note: Some broken links will have hundreds, sometimes thousands, of referring domains.

While this process does work, it is pretty damn slow and inefficient.

So here’s a scalable process:

  1. Gather a BIG list of resource/links pages in your industry (using Google advanced search operators)
  2. Find all broken links on those pages in bulk (using Screaming Frog)
  3. Find the broken links with the most inbound links (using Ahrefs Batch Analysis tool)
  4. Recreate the content (or create something similar) on your website
  5. Reach out to all the sites linking to the broken link


Let’s say you were trying to build links to my website, which is about SEO (obviously).

You would start by gathering a BIG list of resources/links pages in the SEO industry.

This can be done using Google search operators, such as:

  • SEO intitle:"resources"
  • SEO inurl:"resources"
  • SEO intitle:"links" inurl:"links"
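If you have several seed keywords, a tiny script can expand them into the full set of operator queries so you can work through them methodically. The patterns below mirror the operators above; everything else is illustrative:

```python
def operator_queries(niche_keywords, patterns=None):
    """Expand seed keywords into ready-to-paste Google operator queries."""
    if patterns is None:
        # Default patterns mirror the resource/links-page operators above.
        patterns = [
            '{kw} intitle:"resources"',
            '{kw} inurl:"resources"',
            '{kw} intitle:"links" inurl:"links"',
        ]
    return [p.format(kw=kw) for kw in niche_keywords for p in patterns]
```

For example, `operator_queries(["SEO", "link building"])` gives you six queries to run and scrape.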

Here’s an example of the types of results you should see:

You can then scrape these results using LinkClump or a similar extension.

Just remember to change the settings for LinkClump so that it copies the links to your clipboard, instead of opening all the links in new tabs (which can be annoying!)

This can be changed in the extension settings, like so:

Setting up Link Clump

I also recommend selecting “URLs only” under the advanced settings, too. Otherwise, it will copy both the URLs and titles.

Linkclump URL settings

Now, using LinkClump is as simple as dragging with your left mouse button + Z (or whatever shortcut you chose in the settings) over the links you want to copy.

Using Link Clump

This will get you a nice neat list of pages.

You can then simply paste them into a spreadsheet.

Pasting Link Clump results into spreadsheet

Note: I also recommend Citation Labs Link Prospector. It isn’t free (it costs around $5 for a scrape) but it is a great tool that makes scraping thousands of results from Google super easy.

Why are we looking for “resource” pages?

Because these types of pages usually link out to tons of external resources.

And, unless people make an effort to keep them up to date, they’re usually full of broken links about a particular topic!

Example of finding broken links

Next, you want to run all of the web pages (extracted from Google) through Screaming Frog, to find any external broken links on the pages.

Just remember to tick the “Check external links” box under Spider Configuration:

Screaming Frog setup

And set the crawl limit to 1:

Setting crawl depth

And tick the “Always follow Redirects” box, too!

Setting Screaming Frog crawl depth

Then, you can copy/paste your entire list of scraped URLs into Screaming Frog and it’ll work its magic…

Running screaming frog

Let it run until complete.

Next, go to Bulk Export > Response Codes > Client Error (4xx) Inlinks:

Finding 404 errors

This will export a CSV with all the broken links.

Export broken links

But, you still have no idea how many inbound links are pointing to these pages.

This is where Ahrefs comes in.

Copy/paste the broken links from your CSV file (200 at a time) into Ahrefs’ Batch Analysis tool. Sort by number of referring domains.

Number of referring domains in Batch Analysis tool
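To take the manual copy/paste out of this step, here's a sketch that dedupes the broken URLs from the Screaming Frog export and slices them into 200-URL batches sized for the Batch Analysis tool. The "Destination" column name is an assumption based on a typical Screaming Frog inlinks export, so verify it against your file:

```python
import csv

def broken_url_batches(inlinks_csv, batch_size=200, dest_column="Destination"):
    """Dedupe broken URLs from a Screaming Frog 4xx inlinks export and
    split them into batches for Ahrefs' Batch Analysis tool.

    Note: dest_column is an assumption; check your export's header.
    """
    seen, unique = set(), []
    with open(inlinks_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row[dest_column].strip()
            if url and url not in seen:
                seen.add(url)
                unique.append(url)
    return [unique[i:i + batch_size] for i in range(0, len(unique), batch_size)]
```

Each batch can then be pasted into the Batch Analysis tool in one go.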

Now, it’s simply a case of looking through the list and finding a page that is relevant to your website (i.e. covers a similar topic).

Hint: Wayback Machine is your friend, here.

Once you’ve found a broken page that looks promising, recreate the content on your website.

IMPORTANT: For this technique to succeed, you NEED to be asking people to replace the link with a similar link. If you ask them to replace the link with a completely different link, it ain’t going to work!

When you have the content, export the list of backlinks pointing at the dead page from Ahrefs:

Export backlinks from Ahrefs

Work your way through the list, find the contact information for each website (here’s a great video on how to do this), and reach out to each site to suggest your broken link replacement.

Bonus tip:

You can also sometimes find broken links with a TON of inbound links using Ahrefs.

Just enter a competitor’s website (ideally quite a well-known/big website) into Ahrefs Site Explorer, then go to the “best by links” report and filter for 404s:

Ahrefs broken link report

Very simple, but very effective!


#4: Create Free Industry-Related Tools to Attract LOADS of Quality Backlinks (in 3 Different Ways)

Most people stick to infographics, “ultimate guides”, and other similar kinds of content when trying to build backlinks.


Did you know that free tools are just as effective, if not more effective, at attracting links?

And I’m not talking about anything particularly costly or magnificent here…

It could be something as simple as a visualisation of popular keyboard shortcuts:

Adobe Keyboard Shortcut Visualizer

Or a simple calorie calculator:

Calorie calculator example

Don’t believe me? Try throwing that calorie calculator into Ahrefs.

Links to calorie calculator

That’s 5K+ backlinks from 1,270 referring domains!

But, the question is, how do you actually build backlinks to free tools like these?

Here are 3 methods:

  1. Target resource/links pages (like this one)
  2. Target how-to guides (i.e. guides for which your tool provides a shortcut—more on this later!)
  3. Run “shotgun skyscraper” outreach for your tool

Let’s take a look at these one-by-one.

Resource pages

Resource/links pages are web pages that list—and link-out to—resources and tools in your niche. 

Koozai SEO resource list

They’re created with the sole purpose of providing links to useful resources around the web.

This makes them a great link target, as they’re usually responsive to pitches.

Here’s my process for doing this at scale:

  1. Use Google search operators to find TONS of resource pages (e.g. SEO intitle:"resources", SEO inurl:"resources", etc.)
  2. Scrape the results (using Linkclump)
  3. Find contact information and pitch your content


Let’s say you were trying to build links to the aforementioned calorie calculator.

It’s a fantastic tool and is well-deserving of a place on any fitness-related resources page.

To start, you would find a bunch of resource/links pages using Google search operators such as:

  • Fitness intitle:"resources" inurl:"resources"
  • Paleo intitle:"resources"
  • Health intitle:"useful links" inurl:"links"

Note: Basically, you just need to add a keyword related to your tool (e.g. health, fitness, paleo, etc.) to the search operators. You can find hundreds more search operators here.

Here’s what the results should look like: 

Results from Google search operators

You can then scrape these using LinkClump.

It’s then simply a case of sifting through the pages, plucking out the ones that you feel should add a link to your resource, then contacting the blogger/webmaster with an email like this:

“Hey [NAME],

It’s Robbie Richards here. How are things?

I’m reaching out because I recently created a free calorie calculator (here’s a link: and I was hoping you might add it to your list of paleo-related resources?

I think it would be super-useful for your visitors, as it’s a useful resource for anyone interested in aligning their caloric intake with personal fitness/diet goals 🙂

Let me know what you think!


How-to guides

Most tools make solving a problem easier.

For example, take a calorie calculator tool—this makes it super-easy to figure out how many calories you need to consume each day:

Without such a tool, you’d have to do some pretty complex math yourself!
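To see why such a tool is handy, here's roughly the math it replaces, sketched with the Mifflin-St Jeor equation (a widely used estimate for daily calorie needs; the activity multipliers are standard ballpark values, not anything specific to the tools mentioned here):

```python
def daily_calories(weight_kg, height_cm, age, sex, activity_factor=1.2):
    """Estimate daily calorie needs via the Mifflin-St Jeor equation.

    activity_factor ranges from ~1.2 (sedentary) to ~1.9 (very active).
    """
    # Basal metabolic rate (BMR)
    bmr = 10 * weight_kg + 6.25 * height_cm - 5 * age
    bmr += 5 if sex == "male" else -161
    # Scale by activity level to get total daily needs
    return round(bmr * activity_factor)
```

For a 30-year-old, 80 kg, 180 cm male at a sedentary 1.2 multiplier, this works out to 2,136 kcal/day, exactly the sort of arithmetic readers would rather click a button for.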

But…there are still plenty of posts talking about “how to count calories” the hard way.

Calorie counting article

These are prime link targets.

Why? Because your tool is genuinely useful for the people reading these posts, as it saves them time and solves their problem.

So, here’s what to do:

  1. Find “how-to” articles related to the issue your tool solves (e.g. if you have a calorie counter tool, find articles about “how to count calories” or “how to lose weight”)
  2. See if they’re linking out to any similar tools
  3. If not, reach out and pitch your tool for inclusion

Ahrefs Content Explorer offers a great way of finding link prospects.

Just enter a “how-to” query in the search bar and select “in title” as the location:

Ahrefs Content Explorer

As you can see, there are over 12,000 articles with the phrase “how to lose weight” in the title!

That’s a LOT of link prospects!

I recommend filtering these down to only the best prospects. I usually do this by adding an organic traffic filter so only pages with 500+ organic visits/month are shown.

Filtering Content Explorer results

You can also filter by domain rating. For example, I like to set the filter to sites between 25-45, as this will return a lot of mid-tier blogs that might be a little more receptive to cold outreach efforts:

Filtering by domain rating

It’s then simply a case of sifting through the results and looking for solid link prospects.

This will usually be pages that talk about calorie counting (or whatever your tool is about) but don’t link to a tool that solves the problem.

If you find a page like this, simply reach out and say something like:

“Hey [NAME],

It’s Robbie Richards here. How are things?

I’m reaching out because I just read your post about how to lose weight and, well, that is definitely one of the most in-depth guides I’ve ever come across. You really nailed the process!

Also, I noticed you talked about the importance of counting calories (which, I agree, is important). But, as you know, this can be quite difficult to do, as it involves some pretty complex math.

That’s why I wanted to quickly reach out and let you know about a free calorie counter tool that I’ve just created. Basically, you enter your details – height, weight, gender, fitness goals – and it spits out EXACTLY how many calories you should consume each day.

I would love to get your feedback on it and, if you think it’s useful, perhaps you could add it to your guide? I think it would be super-useful for anyone reading that post!



Shotgun skyscraper outreach

No matter how great your free tool happens to be, it’s likely that similar tools already exist.

For example, a simple Google search for “calorie calculator tool” returns over 600K results!

That’s a LOT of calorie calculators!

But, luckily, competition is a good thing when it comes to link building, especially if your free tool blows the competition out of the water.

Why? Because this makes it a prime candidate for shotgun skyscraper outreach.

Here’s how this works:

  1. Find tools that are similar to yours, yet not quite as good
  2. Reach out to anyone linking to those tools and explain WHY they should link to your tool instead (i.e. because it’s better!)

As mentioned above, I’ve used this strategy to quickly build hundreds of quality links to my client’s content:

Graph showing growth in referring domains

Here’s the basic process for doing this at scale:

  1. Use Google to find similar tools (this is as simple as searching for “calorie calculator” or whatever you’re looking for)
  2. Scrape the results (again, use LinkClump)
  3. Extract the backlinks for inferior, yet highly linked-to tools (that are similar to yours)
  4. Reach out to those people with a “skyscraper” outreach email


Let’s go back to the 600K+ results for “calorie calculator tool”.

Most of these are, as you would expect, calorie calculators.

So, let’s use Linkclump to gather these into a nice neat list.

Using Linkclump to find link targets

Next, paste this entire list into Ahrefs Batch Analysis tool, then sort by number of referring domains.

This will reveal similar tools that also have a TON of backlinks.

Visit each of the links individually and note down links that are:

  1. Similar tools to yours
  2. Not quite as good as yours (note: you should also make a note of the reason—e.g. poor design).

For example, here’s a calorie calculator that is not only super UGLY, but also looks pretty confusing to use:

Bad calorie counter example

It also has 338 referring domains!

This is a GREAT shotgun skyscraper prospect!

When you have a list of similar tools that fit the bill (i.e. are not quite as good as your tool AND have a ton of backlinks), export the backlinks for each of them using Ahrefs Site Explorer.

Ahrefs referring domains

Reach out to these people with an email like this:

“Hey [NAME],

It’s Robbie Richards here. How are things?

I’m reaching out because I just read your post about [INSERT LINKING POST TOPIC] and noticed you were linking out to this calorie calculator: [INSERT INFERIOR TOOL LINK]

I just tried to use that tool and, honestly, it was pretty confusing to use. Also, it was very ugly and [INSERT REASON WHY IT’S AN INFERIOR TOOL HERE]

I just thought I’d let you know that this calorie calculator is much better: [INSERT LINK TO YOUR TOOL]… it looks a lot nicer and is generally easier to use.

Might be worth swapping the link out for that tool?



#5. Boost Your Blog Posts (and “Money” Pages) By Converting Homepage Links into Deep Content Links

Let’s do a quick experiment.

Go to Ahrefs Site Explorer, paste in your root domain (e.g., then click the “best by links” filter from the left-hand menu.

Which page on your site appears at the top of the list?

Let me guess…


The homepage.

Am I right?

If not, then apologies…my experiment backfired.

However, most websites receive the bulk of their links to their homepage. As you can see below, this is true for my blog:

Table showing majority of links pointing to the home page

But, for most websites, the homepage isn’t really the page you’ll want to attract visitors to.

It’s much more likely that you want people to visit your blog posts, or perhaps your product pages; the homepage is usually just a gateway page for the real meat of the website.

(there are exceptions to this rule, of course)

If this is the case, this “link juice” is wasted pointing at your homepage. I mean, it would be much better if the links went directly to the pages you want to rank, right!?


Well, here’s the good news: it usually isn’t that difficult to get the link changed to a more appropriate page.

How? Just ask.

But first, you need to find out who is actually linking to your homepage.

To do this, go to Ahrefs Site Explorer and paste in your homepage (note: make sure to select “URL” from the drop-down):

Entering home page URL into Site Explorer

Go to the “backlinks” tab and you’ll see a list of all backlinks pointing to your homepage:

List of over 260 backlinks pointing to the home page

It’s only worth converting links to deep pages if they’re dofollow, so add a “dofollow” filter, too.

NOTE: I did this and it left me with 200 links.

It’s now a case of looking through the links and identifying those that make sense to change to deep links.

In general, you’re looking for links that are less than optimal (i.e. they’re linking to your homepage when it would make much more contextual sense to link to an actual blog post or product page).
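You can pre-filter the export before eyeballing anything. Here's a sketch, assuming typical Ahrefs export column names ("Referring Page URL", "Target URL", "Type"), which you should verify against your own CSV:

```python
import csv

def homepage_dofollow_links(backlinks_csv, homepage="https://www.robbierichards.com/"):
    """Pull the dofollow links aimed at the homepage out of an Ahrefs
    backlink export. These are the candidates for deep-link requests.

    Note: column names are assumptions based on a typical Ahrefs export;
    check them against your file's header row.
    """
    prospects = []
    with open(backlinks_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("Type", "").lower() == "dofollow" and \
               row.get("Target URL", "").rstrip("/") == homepage.rstrip("/"):
                prospects.append(row["Referring Page URL"])
    return prospects
```

The returned list is exactly the set of pages worth reviewing for a deep-link swap request.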

For example, here’s a post about blog promotion that links to my homepage:

Example of article link pointing to the home page

However, right below this link, there’s a screenshot of my blog post about increasing website traffic.

I appreciate the link but personally, I think it would make much more sense to link directly to this post rather than my homepage, as this is clearly what the post is talking about.

Let’s ask if they’d be willing to change it, shall we!?

Here’s an example outreach message:

Hey [NAME],

It’s Robbie here from

I was just looking through my backlinks and noticed you mentioned me in this post:

However, I noticed that although you mention a particular blog post of mine (the one about how I increased website traffic by 272%), you actually linked to my homepage rather than the blog post itself.

Any chance you could swap the link out for a link directly to the blog post instead?

No worries if not, I just think it’d make more sense in the overall context of the post (as people may be interested to read that post).

Either way, have a great week!


Not everyone will change the link but hey, even if a few people do change it, it may be enough to give that post/page a significant rankings boost!

For anyone who has ever tried to scale link building efforts to a deep product page of any kind, it’s easy to see the potential upside here.


#6. Establish a Link Velocity Target (and Stay Competitive in the SERPs)

Your competitors are actively building new links to their most valuable content.

Therefore, you need to understand two things:

  1. The number of links needed to get first page rankings at a point in time
  2. The rate at which you need to acquire new links in order to stay on the first page

You can quickly get an answer to the first part of the equation by looking at the Ahrefs (affiliate) SERP Overview report:

Ahrefs SERP overview report

Here’s how to find a monthly Link Velocity target to remain competitive in the SERPs:

Enter your target keyword into the Ahrefs Keyword Explorer:

Ahrefs Keyword Explorer report

Scroll down to the SERP overview report again:

Link metrics in the SERP Overview report

Note: Beardbrand has the highest number of referring domains (96) pointing to it. It also has the highest UR in the top 10 results, which is driving the #1 ranking position.


This is where they are now. We want to understand how quickly they are building new links to create more page-level authority and remain in the #1 position.

Next – click the green drop down arrow next to the competing page URL and select the Overview link:

Selecting the New Links report

This will generate an overview report for that specific URL. Click the “New” link under the Referring Domains header:

This report will show the number of links won and lost over different date ranges:

New links report in Ahrefs

Scroll to the bottom of the report to see the number of new links acquired over the last 7 days:

Report showing new links acquired in last 7 days

Two new links – only one of which looks topically relevant. But, this doesn’t tell us much.

Next – filter the report to the last 30 days:

30 day new link report

We can see 6 new referring domains in the past 30 days. Apart from the low DR link from beardshapeup, the quality and relevancy of the links look pretty low.

Important: Pay closer attention to the QUALITY and RELEVANCE of the referring domains. You can have 100 poor quality links pointing to your page and it won’t move the needle. But, if you have 5-10 med-high DR links from industry-specific sites, it’ll have a big impact.

For example:

Relevant backlink examples

Finally – filter the report to the last 60 days to gauge the consistency of link acquisition:

New links acquired over the last 60 days

Looking at the data broken out over the last 60 days, it looks as if beardbrand is only getting a couple new links a week to its beard oil product category page. And, only a small percentage of these are even relevant.

If I was setting up a link building campaign for a beard oil category page, I’d be trying to build 15-20 quality links to the page in the first 60 days, and then acquire new links at a rate of 2-3/month to remain competitive. 
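If you export the competing page’s new referring domains from Ahrefs, you can calculate the link velocity target programmatically instead of eyeballing the report. Here’s a minimal sketch – the CSV column names (`first_seen`, `domain_rating`) and the sample rows are my own assumptions, so adjust them to match your actual export:

```python
import csv
import io
from datetime import date

def monthly_velocity(rows, start, end):
    """Average number of new referring domains per 30-day window."""
    days = (end - start).days
    wins = sum(1 for r in rows if start <= date.fromisoformat(r["first_seen"]) <= end)
    return round(wins / (days / 30), 1)

# Hand-made sample export; a real Ahrefs CSV will have different column names.
sample = """first_seen,domain_rating,referring_page
2018-08-03,48,https://beardshapeup.example/grooming
2018-08-19,22,https://some-blog.example/post
2018-09-10,31,https://another-site.example/roundup
"""
rows = list(csv.DictReader(io.StringIO(sample)))

# ~60-day window -> new referring domains per month the competitor is acquiring
print(monthly_velocity(rows, date(2018, 8, 1), date(2018, 9, 29)))
```

If your page needs to stay competitive, set your monthly outreach quota a little above whatever number comes out.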


#7. Build Links from MASSIVELY Authoritative Resources by Catering to Multiple Learning Methods

People learn in different ways.

There are visual learners…

Auditory learners…

And so forth!


Nearly all of the content on the web is the same…it’s written blog posts that are made up of mostly text.

This means if you’re an auditory or visual learner, well, you’re flat out of luck!

There’s no video…

No audio version…

No instructographics…

Nope. It’s just written blog posts, and that’s it!

So, how can you get links from the high authority sites publishing these text-only “ultimate guides”?

Simple…convert that content into a different format (aimed at different types of learners) and give it away for free.

Here’s the process:

  1. Identify informational “how-to” articles and “ultimate guides” in your industry with a ton of links
  2. Create a video or audio version of that post (or at least part of it)
  3. Contact the website owner and give them the content for free


Let’s say you had an ecommerce website selling rice cookers.

And you wanted to build links to the rice cooker page.

(pretty difficult task, right?…it’s always SUPER HARD to build links to ecommerce pages!)

Here’s how you could do it:

Take a look at the SERPs for the term “how to cook rice”:

Example of ultimate guide

Straight away, you can see that the SERPs are showing video content—this shows that the people searching for this term really want to see a video.

And, in fact, a lot of the websites in the top 10 have realised this…

That’s why they have a video showing how to cook rice on their page.

Video example

But, some pages don’t have a video…

This means they’re not catering to different learning styles. And in this example, they’re not providing the content in the format that people really want to see!

Here’s how to take advantage of this:

  1. Identify the pages without videos
  2. Create a video showing how to cook rice (hint: bonus points if it’s uniquely created for them and features their branding etc.)
  3. Reach out and offer the video to the websites for free (to include in their content)

Most of them will probably give you a link without you even having to ask.

But, how do you do this at scale?

First, identify some informational terms (e.g. “how to X”, “ultimate guide to X”, “beginner’s guide to X”, etc.) related to the content/website you’re trying to build links to.

So, if it was the rice cooker page, it would be terms like:

  • “How to cook rice”
  • “Curry recipe”
  • “How to cook risotto”

Next, search Google for these terms and scrape the top 10-100 results (with Linkclump).

This will give you a nice neat list of results.

You can then paste them into the Ahrefs Batch Analysis tool and sort them by referring domains to find the most authoritative pages.

Referring domains in Ahrefs Batch Analysis tool

Next, you’ll want to check each of the pages manually to see if they’re catering to different learning methods—if they aren’t, note them down.

For example, this page IS catering to different learning methods, as it has an embedded video:

BUT, this page IS NOT catering to different learning methods, as there is no video or audio aspect to the post.

This is a GREAT prospect.

Note: There are plenty of other posts like this, too.

Keep running through the results from the batch analysis tool and continue to note down any pages that fit the bill.
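The manual check can be scripted, too. Here’s a rough sketch that flags pages with no embedded video or audio – the regex patterns and the `fetch`/`find_prospects` helper names are my own assumptions, so tune them to whichever embed players are common in your niche:

```python
import re
import urllib.request

# Patterns signalling the page already caters to audio/visual learners.
VIDEO_PATTERNS = [
    r"<video[\s>]",                                                  # native HTML5 video
    r"<iframe[^>]+(youtube\.com|youtube-nocookie\.com|vimeo\.com)",  # embedded players
    r"<audio[\s>]",                                                  # audio version
]

def has_embedded_media(html):
    """True if the page already includes a video or audio element."""
    return any(re.search(p, html, re.IGNORECASE) for p in VIDEO_PATTERNS)

def fetch(url):
    """Download a page's HTML (run this on each URL from your Linkclump scrape)."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    return urllib.request.urlopen(req, timeout=10).read().decode("utf-8", "ignore")

def find_prospects(url_html_pairs):
    """URLs whose pages have no video/audio embed – your outreach prospects."""
    return [url for url, html in url_html_pairs if not has_embedded_media(html)]
```

Feed it the pages from your batch analysis and you get a shortlist of prospects in seconds instead of hours.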

Finally, create a video/audio version of each post (or simply create ONE video that would feel perfectly at home embedded in all of the posts) and send it over to the blogger/webmaster.

Here’s a sample outreach email:

“Hey [NAME],

It’s Robbie Richards here. How are things?

I’m reaching out as I was reading your “how to” guide for cooking rice (great guide, btw), but noticed that you didn’t include a video of the process 🙁

Like many people, I’m a visual learner so I have to admit, I did struggle to follow your guide. I did crack it eventually, though! 🙂

So, I decided to create a video version of your post. Here’s a link: [INSERT LINK TO VIDEO]

I’m 100% happy for you to add it to your post (if you would like?). I think it would make a really nice addition and help out folks like myself who are visual learners.

PS. Not asking for anything in return; just wanted to help 🙂


Usually, there’s no need to mention links at all. If they do choose to embed your video in their post, 90% of them will add a link without the need to ask.

Note: I did this recently for a client (in the blogging space) and our conversion rate was roughly 5%.


#8. Reclaim Links from Stolen Images

Do you have a lot of high-quality imagery on your website?

I’m talking infographics, photography (that you own the copyright to), diagrams, screenshots, etc.

“Yes, Robbie…I do!”

In that case, I have bad news: you’re probably a target for image theft.

But here’s the good news: you can leverage image theft to quickly build quality links 🙂

Here’s how to do it (in 3 simple steps!):

  1. Roundup any high-quality images on your website (this will generally be infographics, photographs, diagrams, etc)
  2. Find websites using these images without permission
  3. Make sure these websites are giving you credit for those images (if not, reach out and reclaim the link)

Let’s go through this step-by-step.

Step #1 – Find high-quality images on your website

It’s important to note that you’re not really looking for any ol’ images here; you’re mainly looking for:

  • Infographics
  • Photographs
  • Illustrations
  • Diagrams

IMPORTANT: You MUST own the copyright to these images; this won’t work if you’re using an infographic/photograph/diagram that isn’t yours.

Let’s assume we were doing this on behalf of Brian Dean (i.e. Backlinko).

I know Brian’s content pretty well, so I know he has a really cool infographic about on-page SEO:

Brian Dean on page seo infographic

Most of the time, bloggers will link to the original source of the infographic when they embed it on their own website. But because some bloggers won’t remember to do this, there are probably a fair few links we can reclaim.

I’m going to add this link to a Google Sheet (make a copy here) along with any other images/graphics on Brian’s website that I feel are highly-stealable (note: this is just to keep track of the images for the next stage of the process).

Link tracking spreadsheet

Step #2 – Find websites using these images (without permission)

Next, we need to find websites that are embedding these images without giving us credit (i.e. without linking).

There are a couple of reverse image search tools you can use for this. They all work pretty much the same, so let’s use Google Images for this example.

Go to Google images, click “search by image” and paste in a URL from your spreadsheet:

Google Image Search

Hit “search by image” to get a list of all the sites that have used your image:

List of all the websites using an image

Make a note of these pages (use the second tab of the spreadsheet):

Step #3 – Make sure these websites are giving you credit for those images

Finally, you need to sift through these links and check they’re giving you credit for using your image(s).

This can be done by searching for your domain within the source code of each page.

In this example, you can see that the site is, in fact, linking back to Brian:

Source code showing link attribution

All good!
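If your image has been embedded on dozens of sites, you can automate the source-code search. A minimal sketch – the `credits_domain`/`uncredited` helper names are hypothetical, and the href regex is deliberately simple:

```python
import re

def credits_domain(html, domain):
    """True if the page contains at least one link whose href points at `domain`."""
    hrefs = re.findall(r'href=["\']([^"\']+)["\']', html, re.IGNORECASE)
    return any(domain in h for h in hrefs)

def uncredited(pages, domain):
    """Filter (url, html) pairs down to pages using your image with no link back."""
    return [url for url, html in pages if not credits_domain(html, domain)]
```

Anything that lands in the `uncredited` list goes straight into your outreach queue.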

But if you find a website that isn’t giving you credit, reach out to them and ask them to add a link.

Here’s an email template:

Hey [NAME],

It’s Robbie here from

I’m reaching out because I noticed you used one of my images in this post: [INSERT POST URL].

Unfortunately, however, you didn’t give me credit for the image.

I’d really appreciate it if you were able to add a source link below the image; could you do this?



#9. Strategic Guest Blogging (and Tenant SEO)

Guest blogging has been met with heavy skepticism recently.

However, when done properly, guest blogging can:

  • Generate backlinks from high authority sites in your industry
  • Build exposure and credibility for your brand
  • Deliver targeted traffic to your site
  • Create a powerful relationship-building platform
  • Provide a vehicle to rank for insanely competitive keywords (tenant SEO)

I’ve written several guest posts for high authority sites, including this one on Digital Marketer.

It sent 866 people to my site, and added 68 people to my email list:

Guest post conversions

And, this guest post for Sumo.

It sent 631 people to my site and added 87 subscribers to my email list:

Sumo conversion rate

That’s an insane 13.79% conversion rate!

Guest posting also forms the foundation of most tenant SEO strategies.

What am I talking about?

Basically – you use guest posts on high authority websites to rank for competitive keywords you couldn’t realistically target on your own site. 

This can be an extremely powerful SEO strategy for new websites, or companies trying to compete in insanely competitive industries.

Here are a couple examples from folks in the SEO space:

Steve Webb wrote this SEO audit article on Moz to rank for the highly competitive search term “SEO audit”.

This article has been king of SERP mountain for over 4 years!

SEO audit SERP

Steve used the Moz domain authority to rank for a keyword he would have otherwise had no chance of ranking for on his own site, Web Gnomes.

I bet this guest post is a healthy meal ticket for Steve – it drives a boatload of qualified leads to his agency.

Matt Barby used this strategy to get a client in the app development software niche ranking for a massive keyword “app makers” (22,000 monthly searches) by targeting it on the Business News Daily website.

The results were staggering: 74,000 referral visits and close to 4,300 user registrations!

Tenant SEO strategy

So, we know that guest blogging can be a powerful vehicle to build quality links and drive targeted referral traffic…

But, where do you start?

Basically, we need to find websites that (a) we want to write for, and (b) accept guest posts.

Here are 2 ways we can do this:

  1. Use Google search operators (this is more powerful than you might think!)
  2. Reverse engineer prolific guest bloggers (in your industry)

IMPORTANT: Any sites you target should meet the following criteria:

  • High domain authority
  • Related to your niche
  • Post high quality content
  • Receives lots of traffic (use Alexa)
  • Has an engaged audience
  • Provides contextual links
  • Active social presence

OK, let’s go over each of these tactics one by one.

How to find guest post opportunities using Google Search Operators

Head over to Google and start entering the following search queries (one by one):

  • Keyword "guest post"
  • Keyword intitle:"write for us"
  • Keyword inurl:"write for us"
  • Keyword "submit a guest post"
  • Keyword "submit" AROUND(4) "guest post"
  • Keyword "guest post by"
  • Keyword "accepting guest posts"
  • Keyword "guest post guidelines"
  • Keyword "submit blog post"
  • Keyword "contribute to our site"
  • Keyword "submit article"
  • Keyword "guest author"
  • Keyword inurl:"guest post"
  • inpostauthor:"guest post" Keyword
  • inpostauthor:"guest blog" Keyword

You can also use the wildcard operator (*) to expand your results. The previous search strings included quotation marks which returned phrase matches (all keywords had to appear in exactly that order):

Guest post search operator

The wildcard operator will return slightly different search results, without sacrificing relevancy. For example, if you write “submit * guest post”, search results will include:

  • “submit a guest post”
  • “submit your guest post”
  • “submit a new guest post”

Reverse engineering guest post placements

We can take this a step further using the tilde (~) operator. This will help us return guest blog opportunities for sites using synonyms of our target keywords.

For example, “~SEO” might return the following synonyms, “SEM”, “online marketing”, “link building”:

Search operator

Make a note of any websites that look good in a Google Sheet (here’s a template):

Track guest post opportunities
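To save yourself some typing, you can generate every query permutation for a given keyword with a few lines of Python. The template list below just encodes the operators from this section:

```python
# Search-operator templates from the list above ({kw} is your seed keyword).
OPERATOR_TEMPLATES = [
    '{kw} "guest post"',
    '{kw} intitle:"write for us"',
    '{kw} inurl:"write for us"',
    '{kw} "submit a guest post"',
    '{kw} "submit" AROUND(4) "guest post"',
    '{kw} "guest post by"',
    '{kw} "accepting guest posts"',
    '{kw} "guest post guidelines"',
    '{kw} "submit * guest post"',  # wildcard variant
]

def build_queries(keyword):
    """Expand one keyword into a ready-to-paste list of Google queries."""
    return [t.format(kw=keyword) for t in OPERATOR_TEMPLATES]

for query in build_queries("fitness"):
    print(query)
```

Run it for each of your seed keywords and work through the printed queries one by one.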

Reverse engineer prolific guest bloggers

If you read popular blogs in your industry you will have a good idea who the prolific guest bloggers are.

For online marketing, these names include Brian Dean, Neil Patel, Kristi Hines, Marcus Sheridan, to name a few.

Head over to Google and enter NAME “guest post by”. This will show you all the sites where these influencers have written guest posts:

Finding guest post placements for industry influencers with search operators

Add these sites to your list of guest blog targets.

Another way to uncover where influencers are guest posting is by using this Google search operator: “guest post by COMPETITOR NAME”

This will show you pages that contain the phrase “guest post by” followed by your competitor’s name – these pages are usually places your competitor has submitted a guest contribution.

Using search operators to find Brian Dean’s guest posts

You can also search this term in Ahrefs Content Explorer to supercharge this tactic:



Using Ahrefs Content Explorer to find guest post opportunities

Add any worthwhile sites (like the one below) to your spreadsheet:

Ahrefs content explorer

Note: Pay attention to the keywords and topics your competitors are targeting in their guest posts. Are they using tenant SEO as part of their broader search strategy?

It’s time to pitch your targets

Now that you have an extensive list of guest post targets, it’s time to reach out and pitch to them.

Before you email the site, try to build a relationship with the guest post target. There are a number of ways to do this –

  • Engage them on Twitter
  • Connect on LinkedIn
  • Actively participate in forums and comment on their post
  • Email them and let them know how much you love a particular post they have written

After you have engaged the prospect it’s time to reach out and request a guest post opportunity.

Here’s a template:

Subject: You should blog about [insert your guest post topic]

Hey [NAME].

First, I just want to say I’m a big fan of [INSERT BLOG NAME].

Anyways, I’m writing to you today because I’d love the opportunity to contribute a guest post to [insert blog name].

I’ve been reading through some of the content on the blog and have put together a short list of topics that I think would provide a ton of value to your readers –



I have a personal SEM blog that will give you an idea of the style and quality of my writing. You can view some of my recent posts here [insert blog URL].

Let me know if you’re interested.

Keep up the great work!



Instead of simply listing guest post ideas, you can take it a step further and include a link to the complete article you’ve written. Some bloggers prefer this because they don’t have the time to communicate back and forth with everyone pitching to them.

The definitive guide to guest blogging written by Brian Dean over at Backlinko is one of the best posts I’ve read that covers this link building tactic, and it has heavily influenced my guest posting strategy. 


#10. Compounding Growth with the “Ranking for Links” Technique

According to Worldometers, millions of blog posts are published every single day:

Blog written today

In many of these blog posts, you’ll see bloggers/journalists referencing pieces of data from other sources in order to backup their point(s), like so:

Example of ranking for links

(if you look a couple of sentences back, you’ll see that I did this exact thing in this post when talking about how many blog posts are published each day :D)

But how do bloggers find these sources?

They Google them.

When I wanted to know how many blog posts are published each day, I did this Google search:

I looked at the first 2-3 results and decided that the Worldometers page was the best resource – this is the one I referenced (and linked to) above.

Here’s my point:

If you can rank for these kinds of “reference” terms, chances are that you’ll attract backlinks on a consistent basis from journalists and bloggers seeking references for their own content.

Smart, right!?

Here’s the basic process for doing this:

  1. Build a list of keywords/topics that are likely to get cited a lot
  2. Create a piece of content around that topic/keyword (and, hopefully, rank for it!)

In general, these kinds of topics/keywords will be informational terms and will predominantly be these kinds of queries:

  • “How to” / “What is” queries (e.g. “how to screenshot on a Mac”, “what is yeast”, etc.)
  • Definitions
  • Statistics
  • Lists

So, the first step is to build a list of these types of queries for your niche – I recommend doing this in a spreadsheet (or text doc) for simplicity.

If I was doing this for my website (i.e. an SEO blog), my keywords may look like this:

List of ranking for links terms

Next, throw your list of keywords into Ahrefs’ Keyword Explorer.

In general, you’re looking for terms with high search volumes and a low KD score:

Ahrefs Keyword Explorer returning opportunities for link building

It’s then simply a case of putting together a piece of content around this topic and ranking for it.

Journalists/bloggers will then, hopefully, reference (and link to) your post on a regular basis without the need for any additional work.

Brian Dean’s list of Google ranking factors is the perfect example of this technique in action.

It currently has an insane 14,000+ backlinks from almost 2,750 referring domains!

Article with over 14000 backlinks

This is because bloggers/journalists are constantly referencing this page when writing about SEO:

Ranking for links

Note: This strategy/term was coined by Matthew Barby (he’s a super smart guy; I highly recommend following his blog!)


#11. Reclaim Lost Link Equity from 404 Pages

When it comes to link building, a lot of businesses jump straight into creating new campaigns with the single goal of landing BIG wins (i.e. backlinks from massive sites like the Washington Post and NBC).

While these links are incredibly valuable, they require a lot of resources – time and hard work – and the success rate is very low.


Before you go after the big fish, make sure you’ve first collected all the “quick-win” link opportunities. This will help generate faster results, and build trust with new clients.

One of the fastest ways to do this is recover lost link equity from 404 pages.

Think about it:

Websites change all the time. Products come and go. Information is pruned. URL structures get updated. Content is moved around.

All this movement can have a big impact on all the existing backlinks pointing to your website.

For example:

If you created a piece of content that acquired a bunch of quality links, and then made a minor update to the post/page URL without properly implementing a 301 redirect, you’d waste valuable link equity.

Therefore, one of the best ways to land quick link wins is to ensure you don’t have backlinks pointing to dead pages.

Here’s how to do it:

Head over to the Ahrefs “Best by Links” report. Filter by “404”, and sort referring domains (RD) in descending order:

Finding links to 404 pages in Ahrefs

The first 404 page in the list has 6 referring domains pointing to it.

Not a huge number. But, check out what happens when you click on the Referring Domains link:

List of referring domains pointing to 404 page

The 404 page has two solid links pointing to it:

  • Search Engine Watch (DR 71)
  • ReputationX (DR 48)

High authority backlinks like these are very hard to get even with great content and a dialed in outreach campaign. But, we managed to find them in a matter of seconds.

Important: Scan down the list and only attempt to reclaim links from quality sites relevant to your industry:

Examples of low DR sites

Only the first two opportunities in this report are worth looking at closer. The others are low DR, and look completely irrelevant. Redirecting these types of links into other important assets on your site would do more harm than good.

Action item:

Once you’ve gathered a shortlist of “safe” backlinks to reclaim, you can either:

  1. 301 redirect the 404 page into a relevant asset on your site. Ideally, a page/post with some type of search traffic potential. 
  2. Reach out to the owner/author of the site linking to your 404 page. Ask them to update the link.

Since 301 redirects leak little-to-no link equity, option #1 is my preferred course of action. 
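You can also pre-screen candidates yourself: request every URL from your backlink report and flag the dead ones, then filter the referring domains by quality. A rough sketch – the DR cutoff mirrors the quality rule above, and the helper names are my own:

```python
import urllib.error
import urllib.request

def is_dead(url, timeout=10):
    """True if the URL currently returns a 4xx/5xx – i.e. a reclaim candidate."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "Mozilla/5.0"})
    try:
        urllib.request.urlopen(req, timeout=timeout)
        return False
    except urllib.error.HTTPError as e:
        return e.code >= 400
    except urllib.error.URLError:
        return True  # unreachable host – treat as dead

def worth_reclaiming(referring_domains, min_dr=30):
    """Keep only medium/high-DR referring domains; redirecting junk links
    into important pages can do more harm than good."""
    return [d for d in referring_domains if d["dr"] >= min_dr]
```

Run `is_dead` over your top pages, then `worth_reclaiming` over each dead page’s referring domains to build the shortlist.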


#12. Build Links with Blog Comments (Hint: This Isn’t Your Usual Spammy Blog Commenting Strategy!)

I know what you’re thinking…

“Building links with blog comments, Robbie!? You know it’s not 1995, right!?”

I feel you, but hear me out….

I’m absolutely not talking about spammy mass blog commenting here. In fact, this technique doesn’t involve leaving any blog comments at all, but rather utilising existing blog comments (on your own blog) to create a list of link prospects.

Let me explain…

Most blogging systems ask commenters for their name, email address, and website (if they have one) when submitting a comment. This is even true of my blog:

Website comment blox

And those who choose to enter their website URL in this box will see their name hyperlinked to their website when the comment goes live:

Example of blog comment link

“Where are you going with this, Robbie?”

Well, these people are clearly interested in what you have to say, meaning it’s highly likely that their blog (i.e. the site they linked to when commenting) is in the same niche.

Let’s click through to Jeff’s website to see if this is true…

BOOM. Jeff also runs an SEO-related website.

With this in mind, here’s my 3-step process for building links with blog comments:

  1. Scrape the websites of everyone who left a blog comment in the last 30 days.
  2. Check if they have any content on their website related to your niche (e.g. in my case, this would be SEO/marketing-related content).
  3. If so, reach out, thank them for the comment and ask if they’d consider linking to your post.

I’ll use this blog post of mine (with 220+ comments) to walk you through the process.

To get started, we need to scrape the websites for those who’ve left comments in (roughly) the past 30 days. This can be done manually but life is much easier with this Google Chrome add-on.

Simply right-click on commenter’s name (hint: make sure it’s a linked comment!) and select “scrape similar”:

Scraping blog comment names

This will “automagically” scrape a list of commenter names + URLs from the page:

Results from blog comment scraping

Click “Export to Google Docs…”

You should now have a list of websites + names in a Google Sheet, like this:

Blog comment link prospects
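If you’d rather skip the Chrome add-on, the same scrape is a few lines of Python. The sketch below assumes WordPress-style comment markup, where the author’s link carries a `url` class – other blog platforms will need different selectors:

```python
from html.parser import HTMLParser

class CommenterParser(HTMLParser):
    """Collect (name, url) pairs from <a> tags whose class includes 'url' –
    the class WordPress puts on comment-author links. Other platforms differ."""
    def __init__(self):
        super().__init__()
        self.results = []
        self._href = None
        self._name = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "url" in (attrs.get("class") or "").split():
            self._href = attrs.get("href")
            self._name = []

    def handle_data(self, data):
        if self._href is not None:
            self._name.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.results.append(("".join(self._name).strip(), self._href))
            self._href = None

def scrape_commenters(html):
    """Return [(commenter_name, website_url), ...] for linked comments."""
    parser = CommenterParser()
    parser.feed(html)
    return parser.results
```

Dump the pairs into your Google Sheet and carry on with the `site:` checks below.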

It’s now a case of using the “site:” search operator (combined with a keyword related to the topic of your website/content) to find sites with content related to your niche.

Here’s an example (for the SEO niche):

No results for this site; let’s try another:

BINGO. Looks like these guys have a few SEO/marketing related posts, one of which is this post about generating more blog traffic:

Example of a website owned by a person commenting on the blog

Definitely a great post, but it doesn’t even touch on many of the blog promotion strategies mentioned in my post. I’m, therefore, pretty sure his readers would also get a lot of value from my post.

Stuart clearly enjoyed my post (see his comment below) so let’s reach out and kindly ask if he’d be willing to add a link to my post in that article:

Example of blog comment link

Here’s our message:

Hey Stuart,

It’s Robbie (Richards) from

I was just reading through the comments on my blog and noticed you commented a while back (on this post) – thanks for that! It’s always good to know my posts are of value to other bloggers 🙂

Also, I ended up reading your blog promotion case study on your blog. Really cool stuff…loved the tip about not focusing on vanity metrics. I, too, see so many people doing that!

Don’t mean to sound cheeky, but is there any chance you’d consider adding a link to my post at the end of that article? I think it follows on nicely from what you had to say, so I’m pretty sure your visitors would find it interesting, too.

Either way, have a great weekend. Keep in touch!


Do this for every prospect that fits the bill (note: make sure to personalise the email as much as possible first, of course!)

This is not only a great way to build links but also, a great way to forge relationships with other bloggers.

Pro tip: You can scale this strategy by creating a Custom Google Search engine similar to the one we created in the first link building strategy mentioned in this post. 

The only difference is instead of uploading the domains already linking to your site, you will add the domains of the people commenting on your site:


#13. Quora Hacking (and Scaling Referral Traffic Streams)

Quora links may be nofollow, but that doesn’t mean they’re worthless.

In fact, Quora links can be an amazing source of referral traffic. They’re also great for diversifying your link profile (a link profile consisting solely of dofollow links won’t look natural at all!)

Here’s a three-step process for getting a ton of referral traffic (and links) from Quora:

  1. Plug Quora into Ahrefs (to find the highest traffic threads).
  2. Search for a keyword related to your content (this will filter out relevant threads that have rankings and ongoing passive traffic).
  3. Write a top notch answer on the threads with lots of traffic.

OK, so the first step is super simple; just paste “quora.com” into Ahrefs Site Explorer, then go to the “top pages” tab (under “organic search” on the left-hand menu).

This will list all URLs on the domain in order of search traffic:

Using Ahrefs Site Explorer to find highest organic traffic threads on Quora

Next, enter a keyword related to your niche in the search box (note: the aim here is to search quora for high-traffic threads related to your industry).

Let’s use “fitness” for this example:

Using Top Pages report in Ahrefs

We now see a list of URLs (i.e. threads) on the site related to fitness — some of them have a ton of traffic!

It’s now simply a case of combing through the threads for those with the following criteria:

  1. Niche-related (e.g. fitness related).
  2. Plenty of search traffic (note: the ones near the top of the list in Site Explorer have the most search traffic).
  3. No good answer currently (this is super important!)
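If you export the Top Pages report to CSV, the first two filters are easy to script. A sketch – the `url`/`traffic` column names are assumptions, so match them to your actual export:

```python
def quora_prospects(rows, keyword, min_traffic=300):
    """Keep only niche-relevant, high-traffic Quora threads from a Top Pages export."""
    return [
        row["url"] for row in rows
        if keyword.lower() in row["url"].lower() and int(row["traffic"]) >= min_traffic
    ]

# Hand-made sample rows standing in for a real export.
threads = [
    {"url": "https://www.quora.com/What-are-the-best-fitness-subreddits", "traffic": "520"},
    {"url": "https://www.quora.com/Best-programming-language", "traffic": "900"},
    {"url": "https://www.quora.com/Is-fitness-genetic", "traffic": "40"},
]
print(quora_prospects(threads, "fitness"))
```

The third filter – “no good answer yet” – still has to be checked by hand.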

If you find a thread that fits the bill, answer the question yourself (note: make sure to answer with a well-crafted, useful response – this will increase the chances of your answer being upvoted and, in turn, the chances of you receiving referral traffic from the thread).

Here’s a thread that fits the bill:

Example Quora thread

It has 500+ visitors per month from search (from the US alone), only has 2 answers (none of which are particularly in-depth), and is clearly niche-related.

Because Quora allows you to reference sources when writing your answer, it’s easy to link back to relevant websites when writing.

This means that should we have a page on our website listing some great fitness-related subreddits, we could simply answer with a condensed version of that list and quote our website as the source.

Here’s an example of this logic in action (from Rand Fishkin):

Example Quora thread answered by Rand Fishkin

The question was: “What are some simple things companies can do to create a stronger Internet presence?”

You can see that Rand’s answer is extremely thorough, useful and helpful, yet it isn’t overly promotional. He includes a link to a Moz blog post where appropriate, but also mentions other notable tools/sources, too.

I’d be willing to bet that this drives a couple hundred visits per month to that blog post (if not more).

He also gets a nice juicy link (albeit a nofollow one).

Quora is the second highest source of referral traffic for Wishpond:

Wishpond Quora referral traffic

Use the technique above to identify relevant high-traffic threads, engage in the conversation, and start driving targeted referral traffic to your site. 


#14. Using Google Alerts for Link Reclamation

Google Alerts is one of the most powerful, and often overlooked, link building tools you have at your disposal.

If someone mentions your business name or website online, you likely want them to link to you.

Google Alerts allows you to keep track of brand mention across the web (the idea being that you can reach out to them and request they add a link if they haven’t already done so).

Here’s how to do it:

  1. Set up your Google Alerts
  2. Monitor them (and reach out to reclaim links, where appropriate)

I’ll show you how to do this below.

How to setup your Google Alerts

It’s pretty straightforward to set up Google Alerts; you simply enter phrases you want to be alerted about and, well, that’s it.

If I was doing this for my blog, some example queries might include:

  • robbierichards
  • robbie richards
  • [article name] “100 link building tactics from 50 SEO experts”

Anyway, here’s how to do it…

First, head over to Google Alerts and enter the term(s) you want to be alerted for:

Setting up Google Alerts

In the Result Type box, select to receive alerts for everything.

Next, choose how often you want to receive alerts. For me, the frequency depends on what I am monitoring.

I usually select “Once a day” in the “How often” box.

The Deliver To box lets you choose if you want alerts sent to you via email or RSS feed. For brand mentions, I prefer to have them delivered via email.

NOTE: Ahrefs Alerts can also help you with this – it gives a few extra options compared to Google Alerts, such as specifying alerts where the keyword must be in the title tag, etc.

Ahrefs link alerts

How to reclaim links from brand mentions

Each time your brand or website is mentioned on the web, check the source and make sure the site is linking to you.

If they don’t provide a link, reach out to the webmaster with a friendly email requesting your link.
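If you chose RSS delivery, this check can be semi-automated. Google Alerts delivers its feed in Atom format, with each entry’s link wrapped in a Google redirect URL; the sketch below pulls the real URLs out, and the `links_back` helper (a hypothetical name) checks whether a mentioning page actually links to you rather than just naming you:

```python
import re
import xml.etree.ElementTree as ET
from urllib.parse import parse_qs, urlparse

ATOM = "{http://www.w3.org/2005/Atom}"

def alert_urls(feed_xml):
    """Extract the mentioned-page URLs from a Google Alerts feed (Atom format).
    Alert links are Google redirect URLs carrying the real address in the
    `url` query parameter."""
    urls = []
    for entry in ET.fromstring(feed_xml).iter(ATOM + "entry"):
        link = entry.find(ATOM + "link")
        if link is None:
            continue
        href = link.get("href", "")
        target = parse_qs(urlparse(href).query).get("url", [href])
        urls.append(target[0])
    return urls

def links_back(html, my_domain):
    """True if the mentioning page actually links to you, not just names you."""
    return bool(re.search(r'href=["\'][^"\']*' + re.escape(my_domain), html, re.I))
```

Fetch each alert URL, run `links_back` on the HTML, and only the unlinked mentions go into your outreach queue.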

You can use this email template for your outreach:


I was checking out your site today and noticed that you mentioned my brand/post in [insert post name].

I appreciate the call out and wanted to ask if you wouldn’t mind linking back to my site [insert URL].

Keep up the great work 🙂




#15. Mining Expert Roundups for Quick Link Wins 

Expert roundups are one of the easiest (and quickest) ways to build backlinks and generate serious traffic to your site.

Here’s the basic process:

  1. Think of a question to ask influencers/experts in your niche (e.g. “what are your top 3 keyword research tools?”)
  2. Gather a list of influencers/experts in your niche
  3. Consolidate responses into a blog post
  4. Tell the influencers about the live post (and ask them to link to it)

Because you are featuring insights of influencers in your industry, those people (and many of their followers) are very likely to link to, or at least share your blog post across their social channels.

Let’s walk through the process from start-to-finish.

First, think about a topic related to your industry that people will be interested in. For example, SEOs will likely be interested in the following topics:

  • Best link building tools?
  • Link building tactics to focus on in 2018?
  • Are black hat link building tactics dead?
  • How do you measure the success of an SEO campaign?
  • If you could only use three link building tools, which three would you choose? (Richard Marriott put together a fantastic roundup for this topic)

These are all topics people in the SEO field would be interested in.

Once you have a solid topic, the next step is finding experts to pitch.

The easiest way to build a list of influencers is to identify the roundups already out there in your niche. Influencers who have already taken part in an expert roundup will be more likely to respond to your pitch.

Go to Google and search for roundups in your niche:

“link building experts” + roundup

As you can see, the roundup “55 SEO Experts Reveal 3 Favorite Link Building Tools” attracted 155 backlinks from 77 domains:

Expert roundup backlinks

Now, scroll through the post and add all the featured influencers to a spreadsheet.

Check their Twitter profiles to see if they have a website listed:

Richard Marriott Twitter profile

Go to their website and collect their email address or contact page URL:

Richard Marriott contact page

Tip: I use Voilanorbert to scale the gathering of contact information.

Add the names and domains of your roundup targets to a spreadsheet and save it as a .csv file.

Next, run a bulk upload inside Voilanorbert:


Let the tool run for 5-10 minutes and it’ll go through and scrape the emails for you. Huge time saver!
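The spreadsheet step itself can be scripted. Here’s a minimal Python sketch that writes the .csv for the bulk upload; the names and domains are placeholders, and the two-column layout is just an assumption, so match the columns to whatever format Voilanorbert actually expects:

```python
import csv

# Hypothetical roundup targets gathered from existing roundups
prospects = [
    {"name": "Jane Expert", "domain": "jane-expert.example"},
    {"name": "John Blogger", "domain": "john-blogger.example"},
]

# Write a simple name,domain .csv ready for a bulk contact-finder upload
with open("roundup_targets.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "domain"])
    writer.writeheader()
    writer.writerows(prospects)
```

From here it’s a straight upload into the tool.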

Still short of influencers? Here are a few other tools/ways you can find them:

Pre-curated lists: Head over to Google and do a search for “top [kw] bloggers” and find pre-curated lists of influencers. 

Other bloggers have already done the heavy lifting for you:

Pre curated list

Ahrefs Content Explorer: enter a keyword, then click the “who tweeted” button. It’ll show you everyone who tweeted that post, along with exactly how many followers they have on Twitter.

Buzzsumo: Head over to Buzzsumo, select past year and the “influencers” tab. Enter a broad search term related to your niche or the topic of your question.

You’ll notice many of these people have tens of thousands of followers and a lot of authority. These are the type of people you want sharing and amplifying your content:

Buzzsumo influencers

I have put together a number of expert roundups for clients in different industries here and here.

In a lot of these cases, I’m looking for specialists in very specific occupations. So, I’ll use LinkedIn Sales Navigator to get laser focused with my roundup prospecting:

LinkedIn Sales Navigator

And extract contact information using the SellHack chrome extension:


This process allows me to work faster and build a very specific list of outreach targets.

It’s then simply a case of reaching out to everyone on your list and asking them the question you decided upon.

The key here is to make sure that your outreach email is short, to the point and personal.

Here is a template you can use:

Hey [NAME],

Robbie Richards here. I came across your LinkedIn profile today and thought I’d reach out regarding an expert roundup I’m putting together.

Here’s the question:


Please leave your response on this form: [URL]

I’ll include a link to your website and promote the article to my 35,000+ audience.

Deadline for contribution is [DATE].


P.S. Here is an example of a similar article I published (shared over 10,000 times).

Thanks to Richard Marriott for the advice in his expert roundup post.

In your spreadsheet, keep track of who you have reached out to and when you contacted them.

It’s always a good idea to send a follow up email 1-2 weeks after your initial outreach.

Follow up email

If there is still no response, reach out to them on Twitter.
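If your list is long, a few lines of Python can flag who is due for a follow-up. The names, dates, and the seven-day wait below are made-up examples:

```python
from datetime import date, timedelta

# Hypothetical outreach log: who you contacted, when, and whether they replied
outreach = [
    {"name": "Expert A", "contacted": date(2018, 10, 1), "responded": True},
    {"name": "Expert B", "contacted": date(2018, 10, 3), "responded": False},
    {"name": "Expert C", "contacted": date(2018, 10, 14), "responded": False},
]

def due_for_follow_up(log, today, wait_days=7):
    """Non-responders whose initial email is at least `wait_days` old."""
    return [row["name"] for row in log
            if not row["responded"]
            and (today - row["contacted"]) >= timedelta(days=wait_days)]

print(due_for_follow_up(outreach, today=date(2018, 10, 15)))  # ['Expert B']
```

Swap the hard-coded list for a csv.DictReader over your actual spreadsheet and you have a daily follow-up queue.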

When you have your list of responses, it’s time to create your roundup post.

This should include a short description about the topic being covered with a brief introduction to the featured experts:

Roundup leaderboard

Then, name the expert and include their response with a link back to their site:

IMPORTANT! When the post is published, follow this 1-2 punch strategy…

Part 1: reach out to contributors, let them know the post has gone live, and ask them to share/upvote.

Here’s a template you can use:

NOTE: Also, make sure you mention the experts when you are promoting the roundup on social media.

Part 2: after the initial outreach letting people know the article is live and asking them to share it, follow up again with everyone who meets the following criteria:

  1. They respond to your email and engage with the content
  2. They have written a piece of content related to the topic of your roundup on their site

Say something like this:

Hey [blogger name],

Thanks for sharing the post 🙂

I noticed you also talk about [roundup topic] in this post [insert URL on their site]. Any chance you could drop a link to the roundup?

I’m trying to get it to rank so everyone gets more exposure 🙂



Pro tip:

This is another tactic where you can use Custom Search Engines to quickly build a database of experts who contributed to your roundups, and search across their sites for related articles.

Here’s how to do it:

First, grab the domains of all the people who contributed to your roundup. You should already have a list of these in a spreadsheet:

Roundup emails

Create a Custom Search Engine and upload the domains of the contributors:

Roundup CSE

Now, enter search terms related to the topic of the roundup to find any contributor sites that have already written about the post and reach out to them with the template above. 

This should give you a handful of quick link wins.
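If you’d rather script this step than click around a Custom Search Engine, you can generate the same domain-restricted searches as plain site: queries. The contributor domains below are placeholders:

```python
from urllib.parse import quote_plus

# Contributor domains pulled from your roundup spreadsheet (placeholders)
contributors = ["example-blog.com", "another-seo-site.com"]
topic = "link building tools"

# One site-restricted Google query per contributor domain
queries = [f"site:{domain} {topic}" for domain in contributors]
urls = ["https://www.google.com/search?q=" + quote_plus(q) for q in queries]
for u in urls:
    print(u)
```

Each URL opens a search limited to one contributor’s site, so you can quickly spot their existing articles on the roundup topic.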


#16. Drive Traffic (and Build Relationships) With “Targeted” Blog Commenting

I can already hear the trolls coming out on this one 🙂

Blog commenting is a spam tactic.
Blog comments are no follow.
Blog comments don’t boost rankings.

All of these arguments hold merit if you’re just dropping hundreds of blog comments to boost rankings.

But, that’s not the basis of what we’re doing here.


We’re going to use blog commenting to accomplish the following objectives:

#1: Pillow your link profile to make it look more natural
#2: Get your content (and brand) in front of more people
#3: Drive targeted referral traffic to important content on your site
#4: Build relationships with influential content creators in your industry

The process is simple: Identify high-traffic blog posts, and leave value-add comments that drive people back to your most relevant content.

This is the exact strategy Twoodoo used to grow their user base without spending money on ads:

Example blog comment strategy

After two weeks the startup saw the following results:

Total blog comments: 40
Unique referral visitors: 452
Visitors Per Comment: 11.25
Number of sign ups: 72 (16% conversion rate!)
Total time invested: 6.5hrs
ROI: 11 sign ups/ hour spent
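For the curious, here is the arithmetic behind those figures, computed straight from the totals above (452 visitors over 40 comments actually works out to about 11.3):

```python
comments = 40
visitors = 452
sign_ups = 72
hours = 6.5

print(f"Visitors per comment: {visitors / comments:.1f}")  # 11.3
print(f"Conversion rate:      {sign_ups / visitors:.0%}")  # 16%
print(f"Sign-ups per hour:    {sign_ups / hours:.1f}")     # 11.1
```

Plug in your own totals to benchmark a commenting campaign the same way.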

While this strategy might not open the traffic floodgates, it does provide a low-cost opportunity to get your content (and brand) in front of a very targeted audience.

Here’s how to scale finding relevant blogs with high traffic potential:

Open the Ahrefs Content Explorer (affiliate) and search for a relevant phrase. Filter for pages that get a lot of organic traffic and are written in the same language as your site:

Ahrefs Content Explorer

Note: the volume threshold will depend on the niche you are in. For a larger topic like “content marketing”, we could ramp up the volume threshold to 1,000+ and still get a long list of blogs to look at:

Scan through the list of results and see which ones have an active comment thread. If one does, put together a comment that adds value and insert a link to a relevant piece of content on your site.

Tip: Focus more on the blog comment sections where the moderator is (1) actively responding to commenters, and (2) allows relevant link placements in the comments.

This will not only improve the chances of your link being approved, but will also provide an opportunity to form relationships with industry influencers. This opens the door to guest post opportunities, and an increased likelihood they drop an editorial link to your content in the future.


#17. Transcribe Videos From Industry Influencers (and Outreach)

Video transcription can be used as a scalable link building strategy.

Experts and influencers in virtually every industry use video as a medium to communicate their expertise.

Today, video can be used in a number of formats:

  • Tutorials
  • Presentations
  • Webinars, Hangouts, Q&A’s
  • Vlogs (video blogs)
  • Industry updates

Transcribing the video content of influencers in your industry and publishing the content to your site is a great way to get links from authoritative sites that get a lot of traffic.

Here are some additional benefits of video transcription:

  • It is a fast way to build linkable content, as it comes directly from an influencer or expert in your industry
  • You can get links through attribution (experts link the transcription alongside their video)
  • It helps you get in front of influencers and stand out (you are helping them out). This makes it easier to build relationships.
  • You can leverage their large social audience. If they share your transcription, you get more (targeted) eyeballs on your site.

Here’s how to do it:

First, head over to YouTube and type the name of an influencer in your industry. We’ll use Matt Cutts in this example.

Set the filter to “This month”:

Searching for influencers in YouTube

This looks like a good topic:

Matt Cutts YouTube video

Make sure the video doesn’t already have a full transcription. This one doesn’t have one (bingo!).

Once you have transcribed the video, it’s just a matter of reaching out to the influencer and letting them know.

Here is an example from Ross Hudgens over at Siegemedia:

Ross Hudgens Twitter

Matt responds:

Matt Cutts Twitter response

Matt Cutts has 352,000 followers, so any link promoted in one of his posts has the potential to send a ton of traffic to your site.

Thanks again to Jason Acidre over at Kaiser The Sage for introducing me to this tactic.

Final Thoughts

Link building is definitely one of the most challenging aspects of SEO, but it’s also one of the most crucial if you want your website to rank highly in the SERPs.

Each of the 17 link building techniques mentioned in this article will help you build quality links to your site. Let me know how it goes 🙂

If you have had success with any link building strategies not mentioned in this post, please leave a comment below so other readers can benefit from your feedback. 

The post 17 Killer Link Building Strategies for 2018 (with Examples and Scripts) appeared first on Robbie Richards.

3 Lessons Fractl Learned From Creating and Promoting Holiday Content

What do Halloween, the winter holidays, and the Fourth of July have in common? For PR pros, these times of the year represent not only a festive season, but more importantly a competitive, short, and oversaturated pitching landscape.

There are few things in the PR world worse than going to pitch to a top-tier outlet and finding out that their editorial calendar is full or that similar content has been placed by the publisher before yours. At Fractl, we’ve executed Halloween, Christmas, Thanksgiving, and New Year’s holiday campaigns on behalf of our clients for the past 6 years. What have we learned?

Read on and avoid these three mistakes to ensure that all your holiday campaigns will succeed.

Lesson 1:  Prepare Your Campaign Well in Advance

We learned this lesson loud and clear when promoting one of our clients’ holiday “gadget” gift guides. Even though the writer over at SimpleMost loved our pitch, she knew her editor wasn’t going to have time to read through the pitch and assign it out:

“Unfortunately, it takes a while—sometimes a month or more—for editors to go through story pitches, so I will not be able to write this in time. If you ever have any other less timely stories, however, please feel free to send them my way!”

Holiday campaigns with a minimum of three to four weeks in active outreach before the actual holiday are more likely to succeed. Most major online publications already have a full content calendar during the holidays. If by chance your pitch piques their interest and there is room for your story, journalists still need time to read the pitch, find their angle for it, pitch their own editor the idea, get approval, and then go through multiple drafts before actually publishing the article.

In addition, any extra time you spend beforehand creating the campaign and developing the pitch strategy will ensure the highest-quality content is delivered to potential publications while the holiday is still relevant. The time spent strategizing, list-building, pitch writing, and content writing is crucial to the success of holiday content.

If you’re still unsure when to begin pitching your holiday content, ask a writer or editor how much lead time they need when receiving a pitch. Amanda Cargill, food content director at The Latin Kitchen, told mediabistro that “pitching online only requires about six weeks lead time. Four could work, but the writer has to be able to write it in that time, and promote it.”

You can also consult the publication’s editorial calendar directly. Most print and online publications include an editorial calendar in their media kit. For example, the Men’s Health media kit clearly states that the theme for their December coverage is tech and gear. Have a tech-themed holiday gift guide you’re hoping to get featured on Men’s Health? Late November would be the time to pitch it!

Content Marketers are aware of major holidays on the calendar, yet we’re usually pushing to produce a holiday campaign just weeks before the holiday. Others might miss out on links and coverage by getting a late start. With a clear plan in mind and a focus on preparedness, your content has the potential to be one of the first holiday content campaigns that a publisher sees.

Lesson 2: Appeal to the Masses By Including Regional Data

Regional data that is broken down by state or city is always a great way to help holiday campaigns succeed. Across the board, campaigns with some regional aspect earned more do-follows and total press mentions than campaigns lacking in a regional angle.

So, why did our campaigns perform better when they had some regional appeal to them? Having regional data allows you to send pitches at a highly efficient rate. Once the exclusive has been secured and published on a top-tier website, having a regional strategy in place is a quick and efficient way to blast your campaign to relevant audiences. In one instance, our team secured the exclusive to a Halloween campaign on Mashable at the last minute. Within 48 hours, we were able to send over 500 tailored and targeted pitches to regional publications across the U.S. This resulted in a total of 70 dofollow links and 126 total pickups—all on a time crunch.

We also heard this idea reflected back in some feedback from an editor at a regional newspaper:  

“I took a quick look at the story and would be interested in doing a local story that uses some or all of your graphics and data. The more specific information you have, the better — particularly [if] you have any data specific to the state of Tennessee or the west TN region, [and] the greater Memphis area.”

In situations like these, it pays to be able to comb your dataset for regional data that you can offer to smaller publications when the situation arises. Say the exclusive you placed didn’t earn as many engagements as you hoped. Being able to fall back on the regional aspect of a campaign can take performance to the next level.

Lesson 3:  Highlight Unique and Newsworthy Stories

Content marketers need to realize that there are only so many viral holiday ideas you can produce—you can’t bank on being the first to produce an idea, especially when the success of your campaign is limited to a few weeks. We realized this at Fractl when we received this piece of feedback from a writer at SheKnows:

“We actually ran a very similar story in our food vertical last week, so we’re going to have to pass. But thanks for thinking of us!”

Focus on producing something with a truly unique and noteworthy methodology in future newsjacking or holiday campaigns. This will help it stand out as well as potentially add an evergreen variable.

In our case, miraculously, we were able to spin an over-covered topic to our benefit. A similar Halloween campaign to our client’s went viral about 2 days before our content was ready. Our topic was the same, but their methodology was different. We rewrote our outreach pitch to highlight what made ours unique and sent pitches to all the people who covered the other project. Once we explained that our methodology was different and highlighted our original findings, the performance of the campaign exceeded our client’s expectations. Phew!

How can I come up with unique holiday ideas? Everything has already been done.

Sure, it’s easy to say you will generate unique content, but what does that actually look like? How can a team come up with ideas that are both relevant to the holiday and haven’t been done before? It’s hard, but it’s not impossible.

Here’s an example. Say your client wants to come out with a holiday gift guide. Your team might be thinking, “oh no, not another gift guide. The internet is already saturated in gift guides! How can we stand out?”  

A quick Google search for “best holiday gift guide” churns out 473,000,000 results. When you analyze the keyword in Buzzsumo from only the last two years, there are 42 pages of related articles. Everyone from the New York Times to Buzzfeed to Prevention to BroBible is covering the best holiday gifts.

But, is anyone covering the “worst”?  What happens when you search in Google for “worst holiday gift guide”? There are far fewer results—about a tenth of the results—coming in at 43,800,000.

And results in Buzzsumo? Nonexistent.

See where I’m going here? Take a risk, and put a unique spin on topics that have been covered before, and reap the rewards. Hannah Agran, senior food editor of Midwest Living told mediabistro that for holiday content, the “challenge is to hit those key visual and topical notes without repeating the same stories we did two years ago.”

Build out a list of topics that have been covered by the publisher during last year’s coverage. Is there any way you can update a story they wrote with new data, or put a spin on a recurring topic (i.e. gift guides)? If you’re still struggling to come up with a unique hook for your holiday campaign, check out 98 Ways to Find Inspiration for Content Ideas.


During active outreach for your holiday campaign, it’s important to listen to how journalists and editors reply to your pitch. Publisher feedback is not only valuable for calculating open, click, and response rates; it’s also incredibly useful for optimizing your content creation and promotion process. All three lessons above were highlighted by publisher feedback we received while pitching a campaign. These conversations can also be a starting point for building a relationship, which is what media relations is all about.

When thinking about holiday content marketing strategies, it’s important to consider these three overarching factors. Calendar awareness, mass appeal, and newsworthy content are all equally important when planning and promoting your holiday content. Without proper time, a unique angle, or regional data, your campaign may fail to produce a healthy promotions report, if one at all.

The post 3 Lessons Fractl Learned From Creating and Promoting Holiday Content appeared first on BuzzStream.

How Google Identifies Primary Versions of Duplicate Pages

Identifying Primary Versions of Duplicate Pages

We know that Google doesn’t penalize duplicate pages on the Web, but it may try to identify which version it prefers to other versions of the same page.

I came across this statement on the Web about duplicate pages earlier this week, and wondered about it, and decided to investigate more:

If there are multiple instances of the same document on the web, the highest authority URL becomes the canonical version. The rest are considered duplicates.

~ Link inversion, the least known major ranking factor.

(Image: man in a cave. Photo by Luke Leung on Unsplash.)

I read that article from Dejan SEO about duplicate pages and thought it was worth exploring more. As I was looking around at Google patents that include the word “authority,” I found this patent. It doesn’t quite say the same thing that Dejan does, but it is interesting in that it finds ways to distinguish between duplicate pages on different domains based upon priority rules, which matters when determining which duplicate page might be the highest-authority URL for a document.

The patent is:

Identifying a primary version of a document
Inventors: Alexandre A. Verstak and Anurag Acharya
Assignee: Google Inc.
US Patent: 9,779,072
Granted: October 3, 2017
Filed: July 31, 2013


A system and method identifies a primary version out of different versions of the same document. The system selects a priority of authority for each document version based on a priority rule and information associated with the document version and selects a primary version based on the priority of authority and information associated with the document version.

Since the claims of a patent are what patent examiners at the USPTO look at when prosecuting a patent and deciding whether or not it should be granted, I thought it would be worth looking at the claims contained within the patent to see if they help encapsulate what it covers. The first claim captures some aspects worth thinking about when discussing different document versions of particular duplicate pages, and how the metadata associated with a document might be looked at to determine which is the primary version of a document:

What is claimed is:

1. A method comprising: identifying, by a computer system, a plurality of different document versions of a particular document; identifying, by the computer system, a first type of metadata that is associated with each document version of the plurality of different document versions, wherein the first type of metadata includes data that describes a source that provides each document version of the plurality of different document versions; identifying, by the computer system, a second type of metadata that is associated with each document version of the plurality of different document versions, wherein the second type of metadata describes a feature of each document version of the plurality of different document versions other than the source of the document version; for each document version of the plurality of different document versions, applying, by the computer system, a priority rule to the first type of metadata and the second type of metadata, to generate a priority value; selecting, by the computer system, a particular document version, of the plurality of different document versions, based on the priority values generated for each document version of the plurality of different document versions; and providing, by the computer system, the particular document version for presentation.

This doesn’t advance the claim that the primary version of a document is considered the canonical version of that document, and all links pointed to that document are redirected to the primary version.

There is another patent that shares an inventor with this one that refers to one of the duplicate content URLs being chosen as a representative page, though it doesn’t use the phrase “canonical.” From that patent:

Duplicate documents, sharing the same content, are identified by a web crawler system. Upon receiving a newly crawled document, a set of previously crawled documents, if any, sharing the same content as the newly crawled document is identified. Information identifying the newly crawled document and the selected set of documents is merged into information identifying a new set of documents. Duplicate documents are included and excluded from the new set of documents based on a query-independent metric for each such document. A single representative document for the new set of documents is identified in accordance with a set of predefined conditions.

In some embodiments, a method for selecting a representative document from a set of duplicate documents includes: selecting a first document in a plurality of documents on the basis that the first document is associated with a query independent score, where each respective document in the plurality of documents has a fingerprint that identifies the content of the respective document, the fingerprint of each respective document in the plurality of documents indicating that each respective document in the plurality of documents has substantially identical content to every other document in the plurality of documents, and a first document in the plurality of documents is associated with the query-independent score. The method further includes indexing, in accordance with the query independent score, the first document thereby producing an indexed first document; and with respect to the plurality of documents, including only the indexed first document in a document index.

This other patent is:

Representative document selection for a set of duplicate documents
Inventors: Daniel Dulitz, Alexandre A. Verstak, Sanjay Ghemawat and Jeffrey A. Dean
Assignee: Google Inc.
US Patent: 8,868,559
Granted: October 21, 2014
Filed: August 30, 2012


Systems and methods for indexing a representative document from a set of duplicate documents are disclosed. Disclosed systems and methods comprise selecting a first document in a plurality of documents on the basis that the first document is associated with a query independent score. Each respective document in the plurality of documents has a fingerprint that indicates that the respective document has substantially identical content to every other document in the plurality of documents. Disclosed systems and methods further comprise indexing, in accordance with the query independent score, the first document thereby producing an indexed first document. With respect to the plurality of documents, only the indexed first document is included in a document index.
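As a rough illustration of the selection logic the abstract describes, here is a toy sketch in Python. The URLs, the scores, and the use of MD5 as a stand-in content fingerprint are all my own assumptions for illustration, not details from the patent:

```python
import hashlib
from collections import defaultdict

# Toy crawl: (url, content, query_independent_score) -- scores are invented.
crawl = [
    ("https://site-a.example/page",       "identical content here", 0.90),
    ("https://mirror-b.example/copy",     "identical content here", 0.40),
    ("https://site-c.example/other-page", "entirely different content", 0.70),
]

def fingerprint(content):
    """Stand-in for the patent's content fingerprint."""
    return hashlib.md5(content.encode()).hexdigest()

# Group documents that share substantially identical content.
groups = defaultdict(list)
for url, content, score in crawl:
    groups[fingerprint(content)].append((score, url))

# Index only one representative per duplicate set: the highest-scoring one.
index = sorted(max(docs)[1] for docs in groups.values())
print(index)
```

The mirror copy is dropped from the index entirely; only the highest query-independent-score version of each duplicate set survives.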

Regardless of whether the primary version of a set of duplicate pages is treated as the representative document as suggested in this second patent (whatever that may mean exactly), I think it’s important to get a better understanding of what a primary version of a document might be.

Why One Version Among a Set of Duplicate Pages might be considered a Primary Version

The primary version patent provides some reasons why one of them might be considered a primary version:

(1) Including different versions of the same document does not provide additional useful information, and it does not benefit users.
(2) Search results that include different versions of the same document may crowd out diverse contents that should be included.
(3) Where there are multiple different versions of a document present in the search results, the user may not know which version is most authoritative, complete, or best to access, and thus may waste time accessing the different versions in order to compare them.

Those are the three reasons this duplicate pages patent says it is ideal to identify a primary version from different versions of a document that appears on the Web. The search engine also wants to furnish “the most appropriate and reliable search result.”

How does it work?

The patent tells us that one method of identifying a primary version is as follows.

The different versions of a document are identified from a number of different sources, such as online databases, websites, and library data systems.

For each document version, a priority of authority is selected based on:

(1) The metadata information associated with the document version, such as

  • The source
  • Exclusive right to publish
  • Licensing right
  • Citation information
  • Keywords
  • Page rank
  • The like

(2) As a second step, the document versions are then checked for length qualification using a length measure. The version with a high priority of authority and a qualified length is deemed the primary version of the document.

If none of the document versions has both a high priority and a qualified length, then the primary version is selected based on the totality of information associated with each document version.

The patent tells us that scholarly works tend to work under the process in this patent:

Because works of scholarly literature are subject to rigorous format requirements, documents such as journal articles, conference articles, academic papers and citation records of journal articles, conference articles, and academic papers have metadata information describing the content and source of the document. As a result, works of scholarly literature are good candidates for the identification subsystem.

Meta data that might be looked at during this process could include such things as:

  • Author names
  • Title
  • Publisher
  • Publication date
  • Publication location
  • Keywords
  • Page rank
  • Citation information
  • Article identifiers such as Digital Object Identifier, PubMed Identifier, SICI, ISBN, and the like
  • Network location (e.g., URL)
  • Reference count
  • Citation count
  • Language
  • So forth

The duplicate pages patent goes into more depth about the methodology behind determining the primary version of a document:

The priority rule generates a numeric value (e.g., a score) to reflect the authoritativeness, completeness, or best to access of a document version. In one example, the priority rule determines the priority of authority assigned to a document version by the source of the document version based on a source-priority list. The source-priority list comprises a list of sources, each source having a corresponding priority of authority. The priority of a source can be based on editorial selection, including consideration of extrinsic factors such as reputation of the source, size of source’s publication corpus, recency or frequency of updates, or any other factors. Each document version is thus associated with a priority of authority; this association can be maintained in a table, tree, or other data structures.

The patent includes a table illustrating the source-priority list.

The patent includes some alternative approaches as well. It tells us that “the priority measure for determining whether a document version has a qualified priority can be based on a qualified priority value.”

A qualified priority value is a threshold to determine whether a document version is authoritative, complete, or easy to access, depending on the priority rule. When the assigned priority of a document version is greater than or equal to the qualified priority value, the document is deemed to be authoritative, complete, or easy to access, depending on the priority rule. Alternatively, the qualified priority can be based on a relative measure, such as given the priorities of a set of document versions, only the highest priority is deemed as qualified priority.
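Putting the pieces together, here is a toy sketch of how a primary version might be chosen. Everything in it (the source-priority values, the thresholds, and the field names) is invented for illustration, not taken from the patent itself:

```python
# Toy "source-priority list" and thresholds -- all values invented.
SOURCE_PRIORITY = {
    "publisher-site.org": 3,
    "university-repo.edu": 2,
    "aggregator.example": 1,
}
QUALIFIED_PRIORITY = 2   # threshold for "authoritative"
QUALIFIED_LENGTH = 1000  # threshold (in words) for "complete"

versions = [
    {"url": "https://aggregator.example/a1",  "source": "aggregator.example",  "words": 4200},
    {"url": "https://publisher-site.org/a1",  "source": "publisher-site.org",  "words": 4500},
    {"url": "https://university-repo.edu/a1", "source": "university-repo.edu", "words": 300},
]

def primary_version(versions):
    # Step 1: assign each version a priority of authority from the source list.
    for v in versions:
        v["priority"] = SOURCE_PRIORITY.get(v["source"], 0)
    # Step 2: prefer versions that are both high-priority and length-qualified.
    qualified = [v for v in versions
                 if v["priority"] >= QUALIFIED_PRIORITY and v["words"] >= QUALIFIED_LENGTH]
    if qualified:
        return max(qualified, key=lambda v: v["priority"])
    # Fallback: no version passes both tests, so use the totality of information.
    return max(versions, key=lambda v: (v["priority"], v["words"]))

print(primary_version(versions)["url"])  # the publisher copy wins
```

Here the short citation record passes the priority test but fails the length test, so the full-length publisher copy is picked as the primary version.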

Take aways

I was in a Google Hangout on Air within the last couple of years where a number of SEOs (Ammon Johns, Eric Enge, Jennifer Slegg, and I) asked John Mueller and Andrey Lipattsev some questions about duplicate pages. It seems to be something that still raises questions among SEOs.

The patent goes into more detail about determining which of a set of duplicate pages might be the primary document. We can’t tell whether that primary document is treated as the canonical URL for all of the duplicate documents, as the Dejan SEO article I linked to at the start of this post suggests, but it is interesting to see that Google has a way of deciding which version of a document might be the primary version. I didn’t go into much depth about qualified lengths being used to help identify the primary document, but the patent does spend some time going over that.

Is this a little-known ranking factor? The Google patent on identifying a primary version of duplicate pages does seem to find some importance in identifying what it believes to be the most important version among many duplicate documents. I’m not sure if there is anything here that most site owners can use to help them have their pages rank higher in search results, but it’s good seeing that Google may have explored this topic in more depth.

Another page I wrote about duplicate pages is this one: How Google Might Filter Out Duplicate Pages from Bounce Pad Sites


The post How Google Identifies Primary Versions of Duplicate Pages appeared first on SEO by the Sea ⚓.

Google Ads Parallel Tracking Guide

Things are moving quickly in the Google product ecosystem. One change that’s flown under the radar is the switch to parallel tracking on October 30, 2018. If you’ve been procrastinating on prepping for the change, don’t worry! We’ve got your back.

What Is Parallel Tracking?

Announced way back in 2017, parallel tracking is Google’s next advertising tool to speed up performance in the “mobile-first world,” alongside features such as Accelerated Mobile Pages (AMP) landing pages and the Mobile Speed Scorecard. With a one-second delay in load time decreasing conversions by up to 20%, every moment counts.

Parallel tracking changes the way click measurement happens when using 3rd-party tracking software that relies on redirects (Marin, Kenshoo, etc.) outside the Google ecosystem. If you don’t use one of those, you don’t need to do anything. Go optimize your accounts!

When using a tracking URL with linear or sequential tracking, a user clicks on an ad, is sent to an intermediate landing page, then redirected to the final destination. While this often happens so quickly most people can’t perceive it, Google insists that it can negatively impact conversion rates.

Sequential tracking

With parallel tracking, the user is taken directly to the final URL while the tracking URL is loaded in the background, similar to asynchronously loading scripts. The idea is that the user is taken directly (and quickly) to their ultimate destination, improving the user experience and increasing conversion rates.

Parallel Tracking

This is currently an optional feature for Search and Shopping campaigns, but starting October 30, 2018, advertisers will be automatically switched over to parallel tracking for all campaign types. You can see if you have parallel tracking turned on by going to All Campaigns > Settings > Account Settings and looking under “Tracking.”

What Do I Need to Do to Prepare?

Ultimately, you’ll need to talk with your 3rd-party vendor to ensure your tracking templates are compatible. Google has been working with the major players, so you’re likely covered.

Marin has already announced support for parallel tracking and an upcoming webinar to help users get prepped. And according to Kenshoo’s Twitter account, they’ve been working with Google since the change was announced, although they don’t have public documentation on what that looks like.

Here are some general best practices to keep in mind and get ahead of the switch:


Ensure the tracking server supports HTTPS. Also, make sure to include the HTTPS protocol in the tracking template URL to keep everything consistent and clean.


Make sure the tracking redirects use server-side redirects as opposed to on-page redirection through JavaScript. The tracking sequence will stop otherwise.

Ad Changes

If you’re making changes to the ads themselves, the review process will be initiated and ad delivery will be paused until the process is complete. If the changes are at a higher level (ad group, campaign, account), the review process will be initiated in the background, allowing the ads to still serve unless flagged.

AdWords Editor

If you're using the AdWords Editor tool, double-check that you have the latest version. If you don't, any uploaded changes may delete previously made URL or parallel tracking changes.


Make sure the GCLID (Google Click Identifier) is still being added. We would recommend using auto-tagging to accomplish this.

Testing Campaigns Ahead of the Switch

Test individual campaigns (without opting the whole account into parallel tracking) to make sure the templates are compatible before rolling the change out to the entire account. This setting can be found under Campaigns > Settings > Campaign URL options.

Under the Campaign URL options setting (you may have to expand Additional settings), create a custom parameter called {_beacon} and assign it the value true. You don’t need to use this custom parameter in any of your URLs.

If Google finds the presence of this parameter, all clicks under that campaign will get parallel tracking treatment.

Verify parallel tracking is enabled by looking for &gb=1 in your tracking calls (this indicates a background call from the browser).
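If you'd rather check server logs than the browser's network panel, a quick script can flag which tracking calls carried gb=1 (the log URLs below are hypothetical):

```python
from urllib.parse import urlparse, parse_qs

def is_parallel_tracking_call(url):
    """Return True if a tracking-call URL carries gb=1, which indicates
    a background (parallel) call from the browser."""
    params = parse_qs(urlparse(url).query)
    return params.get("gb") == ["1"]

# Hypothetical tracking-call URLs pulled from a server log:
calls = [
    "https://tracker.example/click?id=123&gb=1",
    "https://tracker.example/click?id=124",
]
for call in calls:
    label = "parallel" if is_parallel_tracking_call(call) else "sequential"
    print(call, "->", label)
```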

Additional Resources

Google has also provided several resources for partners to refer to, including an implementation checklist we recommend looking at before you make the switch. We’ve compiled them for you below:

Link inversion, the least known major ranking factor.


If there are multiple instances of the same document on the web, the highest authority URL becomes the canonical version. The rest are considered duplicates.

  • Inbound links pointing towards duplicates are inverted towards the canonical URL
  • This is called “link inversion”


Let's say you publish a document on your website today, and a few days later a few small scrapers copy your page. You're the higher authority, so yours still counts as the canonical document. Your URL shows up in Google's search results. All the other URLs are considered duplicates, and their links are counted toward your URL.

So far so good.

Now imagine that a week after that, Google picks up the exact same document on a website with higher authority than yours. What happens? You're the duplicate now, and your inbound links count toward that document's new canonical URL.
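Mechanically, you can picture link inversion as re-pointing inbound links from every duplicate to whichever URL is currently canonical. A toy model (not Google's implementation; all names are illustrative):

```python
# Toy model of link inversion: inbound links aimed at duplicate URLs
# are absorbed by the canonical URL.

def invert_links(duplicates, canonical, inbound_links):
    """Re-point every link aimed at a duplicate URL toward the canonical.

    inbound_links maps a target URL to the list of pages linking to it.
    Returns a new mapping in which the canonical has absorbed the
    duplicates' inbound links."""
    merged = {canonical: list(inbound_links.get(canonical, []))}
    for dup in duplicates:
        merged[canonical].extend(inbound_links.get(dup, []))
    return merged

inbound = {
    "https://you.example/post": ["a.example", "b.example"],
    "https://scraper.example/copy": ["c.example"],
}
# While you are the canonical, the scraper's link counts toward you:
print(invert_links(["https://scraper.example/copy"],
                   "https://you.example/post", inbound))
# If a higher-authority site later becomes canonical, the same function
# runs the other way, and *your* links count toward its URL instead.
```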


Almost a decade ago, Google realised they were no longer meeting their users' expectations. Their results were stale, lagging behind what was really happening on the web. So, in 2010, Frank Dabek and Daniel Peng tackled the problem of speed and freshness by retiring the traditional batch-based indexing system powered by MapReduce. They introduced a completely new concept which allowed them to transform large datasets progressively, using numerous small, independent mutations.

They called it Percolator.

Full Paper: Large-scale Incremental Processing Using Distributed Transactions and Notifications.

Percolator & Caffeine

  • Percolator is an incremental processing system which prepares web pages for inclusion in the live index
  • Caffeine is a percolator-based indexing system

The ‘dynamic duo’ has been in use since April 2010:

“We have built and deployed Percolator and it has been used to produce Google’s websearch index since April, 2010.”

Page 13, 5. Conclusion and Future Work

Google officially announced it in June that year.

Ranking Complexity

Of course, things get a bit more complex when dealing with partial content, and there's a multitude of other signals which may influence rankings, including content visibility, personalisation, location, device, timing, search context and intent. That said, Google works at scale, and ultimately their search quality team and engineers care about the end user first. Even if the publisher is somewhat disadvantaged in the process, that's not as bad as if it were the other way around.

Nothing New

Two years after Caffeine was released, I demonstrated this feature in a controlled set of experiments including Rand’s blog (with the permission of all included parties). As a reward, Google penalised me.


Whenever I run an experiment, there will always be people who tell me that it’s impossible to test Google because there’s just too many variables. These are the people who would also have a hard time accepting they’re wet if I poured a bucket of water on their heads.

One such bucket is the fact that “link inversion” isn’t some concept SEO people read in a research paper which Google may or may not use in practice. When triggered, inverted links from other domains actually show up in your Search Console.

Follow-Up Tests

Before publishing this article I ran a few quick tests and successfully took over as the canonical result every time. I used Search Console to submit the new page to the index and take over from the original content publisher.

It took me 30 seconds.

A week after the test the links from the other domain showed up in my Search Console as if they were mine.

Other Factors

I recently ran another test to see whether content behind tabs and accordions ranks as well as content that's visible. Google later stated otherwise and mocked me a little, calling me the "authority". So in the spirit of friendly banter, I took one of their "authority" assets (Google Scholar) that uses tabs and outranked its content while citing the source URL on my page.



Note: One of my colleagues from Europe reports that he doesn't see the DEJAN link; instead it's filtered out as a duplicate. My guess is that geo-location may play a role here.

I’m hoping Google won’t slap me with a penalty for this because they’re literally asking for examples and I just made one to illustrate a situation where an original publisher may not rank as well as the one using their content.

But sure, if they say it's not the tabs on that page that enabled my little demonstration, I believe them.

It would be nice to know exactly what did, though, so I can do my job and advise clients with confidence.

Dan Petrovic, the managing director of DEJAN, is Australia’s best-known name in the field of search engine optimisation. Dan is a web author, innovator and a highly regarded search industry event speaker.


SEO Smoke Tests for Dynamic Rendering


Google just published an article on how to “Get Started With Dynamic Rendering.” If you are working on a site with a “modern framework” (e.g. Angular, React, or other tech with a lot of JavaScript features), you’ll want to bookmark that post. If reading is not your thing, a few weeks ago I put together Server Side Rendering For Dummies (& Non-Technical SEO Decision Makers), which boils down a lot of the Google techno-jargon into a single PowerPoint slide.

While that Google post has most of what you’ll need to get started with server side rendering, I’d like to focus on the Troubleshooting section – talk all you want about answering user questions, relevance, domain authority, etc. – if I had to define 2018 SEO with one word, it would be “troubleshooting.”

Google gives you most of what you need to troubleshoot prerendering problems in the “Verify your configuration” and “Troubleshooting” sections. Here’s what they say to do (edited for brevity):

Verify your configuration

Check a URL with the following tests:

  1. Test your mobile content with the Mobile-Friendly Test to make sure Google can see your content.
  2. Test your desktop content with Fetch as Google to make sure that the desktop content is also visible on the rendered page (the rendered page is how Googlebot sees your page).
  3. If you use structured data, test that your structured data renders properly with the Structured Data Testing Tool.


If your content is showing errors in the Mobile-Friendly Test or if it isn’t appearing in Google Search results, try to resolve the most common issues listed below.

Content is incomplete or looks different

What caused the issue: Your renderer might be misconfigured or your web application might be incompatible with your rendering solution. Sometimes timeouts can also cause content to not be rendered correctly.

High response times

What caused the issue: Using a headless browser to render pages on demand often causes high response times, which can cause crawlers to cancel the request and not index your content. High response times can also result in crawlers reducing their crawl-rate when crawling and indexing your content.

Structured data is missing

What caused the issue: Missing the structured data user agent, or not including JSON-LD script tags in the output can cause structured data errors.

We call these “Smoke Tests.” Here’s a little more nuance to server side rendering troubleshooting based on some real-world situations we’ve encountered.

  1. How To Test Server Side Rendering On A New Site Before It’s Launched
    It often is the case that SEOs get brought into the process well after a site has been built, but only a few days before it will be launched. We will need a way to test the new site in Google without competing in Google with the old site. For a variety of reasons we don’t want the entire new site to get crawled and indexed, but we want to know that Googlebot can index the content on a URL, that it can crawl internal links and that it can rank for relevant queries. Here’s how to do this:
    1. Create test URLs on new site for each template (or use URLs that have already been built) and make sure they are linked from the home page.
    2. Add a robots.txt file that allows only these test URLs to be crawled.
      Here's an example (with the explanation of each directive as a robots.txt comment):
      User-Agent: Googlebot
      Disallow: /                      # don't crawl the entire site
      Allow: /$                        # allow Googlebot to crawl only the home page, even though the rest of the site is blocked above
      Allow: /test-directory/$         # allow crawling of just the /test-directory/ URL
      Allow: /test-directory/test-url  # add as many test URLs as you want; the more you test, the more certain you can be, but a handful is usually fine
    3. Once the robots.txt is set up, verify the test site in Google Search Console.
    4. Use the Fetch as Google tool to fetch and render the home page and request crawling of all linked URLs. We will be testing here that Google can index all of the content on the home page and can crawl the links to find the test URLs. You can view how the content on the home page looks in the Fetch tool, but I wouldn’t necessarily trust it – we sometimes see this tool out of sync with what actually appears in Google.
    5. In a few minutes, at least the test home page should be indexed. Do exact match searches for text that appears in the title tag and in the body of the home page. If the text is generic, you may have to include a site: operator in your query to focus only on the test domain. You are looking for your test URL to show up in the results. This is a signal that at least Google can index and understand the content on your home page. It does not mean the page will rank well, but at least it now has a shot.
    6. If the test links are crawlable, you should soon see the test URLs linked from the home page show up in Google. Do the same tests. If they don't show up within 24 hours, that doesn't necessarily mean the links aren't crawlable, but it's at least a signal in that direction. You can also look at the text-only cache of the indexed test home page. If the links are crawlable, you should see them there.
    7. If you want to get more data, unblock more URLs in robots.txt and request more indexing.
    8. Once you have finished the test, request removal of the test domain in GSC via the Remove URLs tool.
    9. We can often get this process done in 24 hours, but we recommend clients give it a week in case we run into any issues.
    10. Pro-tip: If you are using Chrome and looking at a test URL for the SEO content like title tag text, often SEO extensions and viewing the source will only show the “hooks” (e.g. {metaservice.metaTitle}) and not the actual text. Open Chrome Developer Tools and look in the Elements section. The SEO stuff should be there.
  2. Do Not Block Googlebot on Your PreRender Server
    Believe it or not, we had a client do this. Someone was afraid that Googlebot was going to eat up a lot of bandwidth and cost them $. I guess they were less afraid of not making money to pay for that bandwidth.
  3. Do Not Throttle Googlebot on Your PreRender Server
    We convinced the same client to unblock Googlebot, but noticed in Google Search Console's crawl report that pages crawled per day was very low. Again, someone was trying to save money in a way that was guaranteed to lose them money. There may be some threshold at which you want to limit Googlebot's crawling, but my sense is that Googlebot is pretty good at figuring that out for you.

How To Make Better SEO Reports For Your SEO Campaigns

Reading Time: 7 minutes


This article is about the development and purpose of standard SEO reports in particular. Not to be confused with SEO platforms, such as Conductor and Sitebulb, which are extraordinary at giving you insights to dig through. The output of those platforms would fit nicely with this post.

If you are interested in a custom dashboard or an Analytics audit, and would like our help, please don’t hesitate to contact us.

The SEO report. It's a calling card for some agencies. These reports can be ornate or no-frills (everyone has their own style). Smart companies use APIs to compile reports without spending manual hours. Some rely on automatic SEO reporting tools. For other companies, it's a time-intensive and decidedly low-value exercise.

At the end of the day, the SEO report can be a tool by which you can gain insights and build powerful campaigns for organic search. They say, “teaching is the best way to learn.” We often think of this as a client deliverable, or a monthly expectation to appease your boss, but an SEO report provides the opportunity to dig into your data. It’s a tool to enhance your existing marketing acumen. It can cover everything from organic traffic, external and internal link building, social media, and more.

What is an SEO Report?

For those entering the field, an SEO report is the common name given to any type of document meant to inform the viewer of their SEO status. It can be built by tools, humans, or a combination of both. Most SEO agencies provide a monthly SEO report to their clients. Sometimes, however, it is more of a ticking of the box than a valuable endeavor. It’s one thing to export data from Google Analytics and rank tracking software. It’s another to inquire into what should get exported. There’s no value in creating any kind of website report if the data can’t help you answer questions.

  1. What does the data tell me about our visitors?
  2. What direction should I take based on the data?
  3. Why is “X” happening?
  4. What dubious claims and theories can we correct?
  5. What campaigns should we renounce, and how can I change direction for the better?
  6. What data can I use to sell back the SEO investment?

Prospects often ask me what our SEO reports look like. For some, this client report was a staple of their previous agency relationship (the "calling card" I mentioned above). It's a valid question, to which my answer can be unexpected and welcome. I explain: "We develop reports with a data-first philosophy, in which the KPIs that move your business are primary. Sure, we include the obligatory ranking, traffic, and conversion data, but we want to benchmark against the particulars your business is built on." If the prospect hasn't developed a KPI set or set goals in GA, we will help them. We do everything in our power to make sure rankings aren't the main KPI.

I'm one of the many SEOs who don't live and die by keyword rankings anymore. You won't see me exhaustively tracking keywords before focusing on other performance indicators (like traffic, conversions, time on page, bounce rate, and revenue). Although SEO ranking reports are traditional, rankings are an imperfect metric. Expect flux in search engine rankings. Keywords that typically perform for a business can appear lower in the results on any given day. It's important that a report doesn't capture just a single-day snapshot of rankings (as a monthly report often does). If a keyword is down on the day you compile the report, but high on other days, you're going to get a faulty signal.

Below is a ranking trend for a keyword that has much competition. If this report was made around 5/15/2018, it would look like a victory with a top 10 ranking. In truth, this is a keyword that is not performing well.

Keyword Rank Trend

Instead, you should make sure to reflect the average position. The average position for the keyword above is position 48. This represents organic visibility in a clear and digestible way.

It's important to group keywords by their appropriate landing pages. Instead of thinking of each keyword on its own, I prefer an organic visibility score for a page. This allows both your target keywords and your non-target keywords to represent the traffic to the website. Identify the pages you are working on and average the ranks for all the keywords that are driving organic traffic. Repeat each month for a trend. SEMrush now gives page-level data, so you can easily extract the keywords with a download or API call. That's a very helpful addition.
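As a sketch, the per-page averaging might look like this (the keyword-to-page mapping is assumed to come from your rank tracker or a SEMrush export; the pages and ranks are made up):

```python
from collections import defaultdict

def page_visibility(rank_rows):
    """Average keyword rank per landing page.

    rank_rows: iterable of (landing_page, keyword, rank) tuples, e.g.
    from a rank-tracker or SEMrush export. Lower averages mean better
    organic visibility for that page."""
    ranks = defaultdict(list)
    for page, _keyword, rank in rank_rows:
        ranks[page].append(rank)
    return {page: sum(r) / len(r) for page, r in ranks.items()}

rows = [
    ("/services/", "seo agency", 4),
    ("/services/", "seo company", 12),
    ("/blog/guide/", "what is seo", 48),
]
print(page_visibility(rows))  # {'/services/': 8.0, '/blog/guide/': 48.0}
```

A simple average like this is the crudest form of a visibility score; tools weight in traffic and impressions on top of it.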

Different SEO tools offer visibility scores of their own, using their own preferred formula. More than just averages, sometimes traffic and impression data is calculated. Here is a visibility score from Rank Ranger that’s telling me how well one of my important pages is performing (check out their calculation description). I’d rather learn from this report and drill into each keyword only as needed. It’s a perfect chart for any SEO reporting dashboard.

visibility score

Follow The Numbers

Rankings can help define traffic intent, but qualified traffic is the most important data in any Google Analytics report. After all, it’s why we are all so focused on studying Google’s ranking factors. We are looking to attract traffic that does something – make a purchase, become aware of a brand, inquire about a service, read content, and so on. The key to receiving qualified organic traffic starts with understanding your best visitors’ wants and needs. Every query done in search engines represents a need by a user. Your website data discloses what this need is, and it gives you the ability to update your website accordingly. As long as you take the time to dig into your organic search data.

If you have a website with properly themed pages, seeing the organic growth is telling. High click-through rates and engagement signal two things: you're on the right track with your visitors, and you've convinced Google that you're worth the traffic. Using the keyword groupings we discussed earlier helps paint an even clearer picture. A good SEO report should let you see this, and help you weigh the need for improving the page against moving on to another SEO campaign. Alternatively, if your reports are not showing webpage success, they should report on the reasons why and suggest the efforts that should be made.

Your Time Is Too Important

In a past life, I was part of an agency that spent too much time downloading Omniture reports (remember them?) by hand. I was copying and pasting cells, customizing charts, running formulas, and beautifying spreadsheets. I could make a spreadsheet look like a work of art, but it wasn't the work I should have been doing. This exercise took 10+ hours a month. Clients would receive these reports in the middle of the month. In summary, my clients were paying me to be a report monkey. I was spending far too much time building spreadsheets, and not enough time analyzing. That's a problem.

Improved technology has given SEOs various methods in which data collection can accelerate. APIs from Google Analytics and Search Console can plug into Google Data Studio or Google Sheets (with a plugin like Supermetrics). No longer do we have an excuse for being a report monkey. Instead, we can use this extra time wisely. We can use this available time to find the stories in the data. We can use this time to provide actionable insights pulled from the data.

As far as I’m concerned, there isn’t a need for a duplicated version of something that can easily be exported from an analytics tool. Businesses need our talents and research to raise the ROI. Keep that in mind as you draft SEO reports for clients, customers, or even your boss. It will certainly keep you held in higher regard as you continue your SEO journey together.

SEO Report Example

I’ve talked a bit about quality reporting, but I haven’t shown any examples of what we do at Greenlane. This is a default report designed in Google Data Studio. It’s our most current “out of the box” version (as of this writing). It has not been customized for client KPIs. But it’s clean and clear. If you’d like to download this to use as an SEO report template, you can download it here.


So far, pretty standard stuff. So what happens when a client has their own SEO metric requests? You simply build them in (as you’re about to see).

Additionally, in this next example, you'll see how insights tell the story of the data. The benefit of API-driven reports is that you don't need to spend time pulling the data; they essentially become automated SEO reports. You can use the saved time to really understand what is going on with the site.

There's no limit to how much customization you can do to your report template. As long as the data is relevant and the insights are valuable, you're on your way to creating the best SEO reports you can possibly deliver. Developing the template is certainly a bit of upfront work, but it pays off in the long run.

The post How To Make Better SEO Reports For Your SEO Campaigns appeared first on Greenlane.

How To Calculate Total Addressable Market Online

There are a number of key business reasons you should know how much demand exists within an online niche, or in other words, the size of your total addressable market.

The most common ones are:

  • Assess your business’s current market share
  • Identify opportunities for expansion
  • Analyze investment potential

Calculating the total addressable market (TAM) offline is an age-old practice when it comes to assessing and deploying investment capital, but it has always been a bit of a finicky process: data sources are mediocre at best, and you are almost always left approximating.

While a degree of approximation is still required when calculating online market size, we can at least use more quantitative data sources, like search data.

We are hired with increasing frequency to help companies determine their online TAM, almost exclusively for one of the 3 reasons I’ve mentioned above.

We tend to perform these analyses mostly for ecommerce companies, but we are starting to work with more and more enterprise software and SaaS companies to help them define both their keyword strategies and even product roadmaps informed by TAM data.

Attention: before we get any further into the process, the section you just read is massively important.

Everything you and your team do from here on must be goal-oriented. Whether it's for one of the reasons listed above or any other you may have in mind, you will waste your time collecting data and sorting through thousands and thousands of keywords if you do not have a focused goal in mind.

Watch the video version

Putting Keyword Data to Work

As with almost all quantitative approaches, the more data we have the better – we recommend collecting as much data as possible in the beginning stage because it will decrease the number of times you will need to filter.

We collect data from the following four sources:

1. AdWords/PLA

One of the best initial data sources to start with is historical AdWords, if you have it. This becomes even more powerful if you’re an ecommerce website and you have at least 12 months of historical PLA data.

The advantage of having AdWords data is that we can match up clicks and conversions at the keyword level and infer which head terms and modifiers are driving the most conversions for our client.

2. Google Search Console

If you are not an ecommerce brand and PLA data is not available to you or your team, the next data source (equally as important as historical AdWords data) is Google Search Console (GSC). We use the Google Sheets plugin Supermetrics to pull in the last 16 months of raw search query data from GSC.

This allows us to immediately grow the list of keywords and we can begin to see what users in that particular industry are looking for.

3. Competitors

We use Ahrefs to identify which sites have the largest keyword footprints, downloading all of their terms and then also pulling in all the data from their lists of top competing domains.

At this point, we're only focused on expanding our total list of terms, and not worried so much about all the additional keyword-level data; we'll get to that later.

4. Keyword Tools

Lastly, we collect keywords from search tools (TermExplorer or KeywordKeg or WonderSearch) that scrape Google Suggest, Wikipedia and eBay.

Identifying Keyword Modifier Patterns

First, we need to sanitize this data so we can extract insights we can use to expand our term list.

To do this, our SEO Analysts painstakingly review the term lists (often between 30,000 and 50,000 terms) looking for modifier patterns to then query against the list, and build new lists of included modifiers and excluded modifiers, as well as pulling out core head terms.

You’ll want to score each modifier as Included or Excluded in its own column, and then build this into the formula logic for aggregating all the data for the report.
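As a sketch of that formula logic (in Python, with illustrative word lists rather than real client data), querying a term list against included and excluded modifier lists might look like this:

```python
def filter_terms(terms, included, excluded):
    """Keep terms that contain at least one included modifier and no
    excluded (e.g. branded) modifiers. Word lists are illustrative."""
    kept = []
    for term in terms:
        words = set(term.lower().split())
        if words & set(excluded):
            continue  # drop terms carrying an excluded modifier
        if words & set(included):
            kept.append(term)
    return kept

terms = [
    "disability benefits calculator",
    "acmebrand benefits login",
    "benefits eligibility",
]
included = ["benefits", "disability"]
excluded = ["acmebrand", "login"]
print(filter_terms(terms, included, excluded))
# ['disability benefits calculator', 'benefits eligibility']
```

In a spreadsheet the same logic becomes a pair of lookup columns; the code form just makes it repeatable across 30,000-50,000 terms.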

For ecommerce sites these head terms are typically going to be your category and subcategory terms. We exclude branded terms because they will be extremely difficult to rank for and aren't worth the effort.

Here’s an example of what one of these lists looks like:

Next, we take the list of included modifiers and head terms and multiply it using a keyword multiplier tool, giving us a new base term list.
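The multiplication step itself is just a cross-join of head terms and modifiers. A minimal stand-in for a keyword multiplier tool might look like this (the terms are illustrative):

```python
from itertools import product

def multiply_keywords(head_terms, modifiers):
    """Cross-join head terms with modifiers in both word orders to
    build a new base term list, deduplicated and sorted."""
    combos = set()
    for head, mod in product(head_terms, modifiers):
        combos.add(f"{mod} {head}")
        combos.add(f"{head} {mod}")
    return sorted(combos)

print(multiply_keywords(["disability benefits"], ["apply", "calculator"]))
```

With a few dozen head terms and a few hundred modifiers this blows out to thousands of candidate terms, which is exactly the point at the expansion stage.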

Expanding the List Using Modifier Cohorts

Now that we have identified patterns of terms that include topical keywords and intent or product focused modifiers, we can get to scraping.

Additionally, it’s worth mentioning that keywords can also be grouped into cohorts using keyword-level metrics, such as highest traffic, lowest difficulty, lowest CPC, and so on – like this:

Once we've blown out our total list of terms to a point we feel is representative of the majority of the niche online (which involves pulling the keyword ranking data for the top 10-20 websites within each modifier cohort), we go to work expanding the list even more.

At this stage in the process we use Term Explorer to expand the list to include all related keywords across Wikipedia, Google Suggest, Amazon, and eBay.

Here is what the UI looks like inside of Term Explorer for our Social Security example once we input our entire list:

We can then export the data from Term Explorer (sometimes upwards of 200,000 keywords), add any newly found unique keywords to the list, and query the modifier lists against the new list. Any keywords that are not relevant can be removed from the data set.

Refining The List of Keywords

A lot of the time there are going to be terms in the list that inflate the total monthly search volume (MSV). This is because search engines see terms that are similar in nature, or that include the same words, as having the same MSV, CPC, and competition.

Here is an example:

Google groups ‘veteran disability benefits’ and ‘benefits for disabled veterans’ as the same keyword, so we must remove these or the MSV and total keyword count will be inflated.

This is most easily done using the AdWords API, but it can also be done by copying and pasting all your terms into Keyword Planner, which, as a bonus, will also sanitize your terms list by grouping synonyms that Google sees as so semantically related that it returns the same volume, CPC, and difficulty data for each.
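If you're scripting the cleanup instead, one heuristic (our assumption here, not an official Keyword Planner feature) is to fingerprint keywords by their returned metrics and count each fingerprint's volume only once:

```python
from collections import defaultdict

# Heuristic sketch: keywords reported with identical
# (volume, cpc, competition) are often the same grouped term,
# so count each metric fingerprint once. Data is illustrative.
rows = [
    {"term": "veteran disability benefits",    "volume": 33100, "cpc": 12.0, "comp": 0.41},
    {"term": "benefits for disabled veterans", "volume": 33100, "cpc": 12.0, "comp": 0.41},
    {"term": "ssdi calculator",                "volume": 9900,  "cpc": 2.5,  "comp": 0.18},
]

groups = defaultdict(list)
for row in rows:
    groups[(row["volume"], row["cpc"], row["comp"])].append(row["term"])

deduped_msv = sum(vol for (vol, _, _) in groups)  # 33100 + 9900 = 43000
```

This can produce false positives (two genuinely different keywords with coincidentally identical metrics), so spot-check the groups before trusting the total.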

Understanding Search Intent

Understanding what exactly a user wants out of their search is one of the most important aspects, if not the most important aspect, of doing any sort of keyword research.

If your site or specific page that you are conducting keyword research for is targeting a term that is informational in nature but your actual intention is to target users in the commercial phase of the sales funnel, you are not going to see any results.

Check out our 21-minute video about search intent on our YouTube channel.

Here is the search intent funnel that we use during this process. Notice that it is directly related to the sales funnel:

Mapping Search Intent

Once you have a firm understanding of what search intent is, it’s time to put the funnel to use.

This process can often become quite tedious, but a great place to start is with the included modifier list you created just a few steps ago.

Here’s an example from the Social Security research our team put together:

Informational terms are perhaps the easiest to identify in a data set. You can start by using modifiers like "how", "why", "when", or "where" to begin the mapping process.

These are the terms that show the user is not yet ready to apply, buy, or sign anything, but is looking for the best resource to help them take that next step.

Once you are at the bottom of the funnel you are looking for terms that are transactional in nature. Users here are using modifiers like “apply” or “file” or “application” which shows they are looking for a place to fill out a form or application.

By the way, when we say “transactional” that doesn’t necessarily mean an exchange of anything of monetary value but instead whatever the intent of your site or page is.

Whether it be getting users to enter their email, request a demo, or fill out a contact form, there are transactional intent keywords for any business.
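A starting point for scripting this mapping is a simple modifier lookup; the intent dictionary below is a hypothetical seed list, not an exhaustive taxonomy:

```python
# Minimal intent-mapping sketch. The modifier-to-intent mapping is a
# hypothetical starting point; extend it with your own included
# modifiers from the earlier scoring step.
INTENT_MODIFIERS = {
    "informational": {"how", "why", "when", "where", "what"},
    "transactional": {"apply", "file", "application", "sign"},
}

def classify_intent(term):
    """Return the first intent whose modifiers appear in the term."""
    words = set(term.lower().split())
    for intent, mods in INTENT_MODIFIERS.items():
        if words & mods:
            return intent
    return "unmapped"

print(classify_intent("how to qualify for disability"))     # informational
print(classify_intent("apply for social security online"))  # transactional
```

Anything left "unmapped" goes into the manual review pile, which is where the tedious part of the process lives.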

Creating a Priority List

This is the best part of the process by far. By this time you have collected data, gone through it line by line, identified modifiers, expanded the list using modifier cohorts, and begun to understand search intent for your target audience.

When deciding which terms to prioritize there are a few different things you can do:

  1. Sort the list by monthly search volume, then pick the top 100 terms that your site does not rank in the top 5 for.
  2. Add in a competitor layer of data so you can find the low hanging fruit that competitors are not optimized for or targeting.
  3. Filter for terms with commercial intent and then sort descending by current keyword position, to identify keywords that convert.
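Option 1 above might be sketched like this (field names and data are illustrative):

```python
# Sketch of priority option 1: top terms by volume where our site
# is not already ranking in the top 5. Data is illustrative.
keywords = [
    {"term": "social security benefits", "volume": 135000, "our_rank": 3},
    {"term": "apply for disability",     "volume": 60500,  "our_rank": 14},
    {"term": "ssdi back pay",            "volume": 12100,  "our_rank": None},  # not ranking
]

candidates = [k for k in keywords
              if k["our_rank"] is None or k["our_rank"] > 5]
priority = sorted(candidates, key=lambda k: k["volume"], reverse=True)[:100]

print([k["term"] for k in priority])  # ['apply for disability', 'ssdi back pay']
```

Options 2 and 3 follow the same pattern: swap the filter for a competitor-rank or intent check, and sort on whichever metric defines "low hanging fruit" for you.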

In Conclusion

To design holistic and comprehensive SEO strategies you need to be aware of the key metrics that exist within your online SEO landscape:

  1. The size of your market (both in terms of the number of keywords and the average search volume of those terms).
  2. Who your online competitors are (these often vary significantly from your offline competitors).
  3. Your current market share, and areas where you have little to no visibility (which usually identifies areas where your product or service offering is either weak or doesn’t exist).

We have helped FTF clients identify literally hundreds of millions of dollars in additional SEO revenue opportunity using this data.

Put in the time, aggregate and organize all the data (we’ve given you the process), and identify those large growth opportunities.

A Special Thank You

This process has been refined significantly throughout 2018, and it's thanks directly to the hard (and brilliant) work of two of our FTF SEO team members: Matt DiMenno and Kurtis Nysmith. Follow them… they're doing big things.

Questions? Criticisms? Ideas?

Please drop me a comment below. Thanks.

The post How To Calculate Total Addressable Market Online appeared first on From The Future.

100: Marie Haynes on The August 1st Update, Link Building, EAT – Special Call-In Show!


We did it! One hundred episodes of Experts on the Wire.

Enjoy 🙂


Our Sponsor

BuzzSumo – One of my favorite tools for coming up with content ideas, finding people who share content in an industry, and tons more (like alerts to keep an eye on your competitor’s links). Listen to the show for a special code to get 30% off BuzzSumo for 3 months. Also, check out Brian Dean’s Definitive Guide to BuzzSumo.

Show Agenda and Timestamps

  • Episode Introduction [0:22]
  • Some pre-show notes [1:25]
  • What this episode will be about – 3 listeners ask their own questions for Dan and Marie to answer [2:05]
  • Show Begins [2:46]
  • Marie Haynes’ Podcast Search news You Can Use [4:04]
  • What has Marie learned since the August 1st Google update and have there been more updates since then? [5:24]
    • Local update [5:57]
    • Organic update: the update that hit businesses with trust issues [6:44]
      • Has Marie seen any recoveries for those sites affected? [8:44]
  • 1st caller Steve [10:10]
    • Steve runs a website builder review website and also has a blog for SEO purposes. Steve doesn't know if his blog helps with SEO and is unsure which direction to take the blog in to help with SEO. [11:09]
      • Marie’s thoughts on blogs [12:48]
      • Dan’s Thoughts [16:00]
      • Marie’s thoughts on blog content unrelated to the business site [17:50]
  • More thoughts on the Google update and how EAT and trust plays a factor [20:25]
  • 2nd caller Talmage [24:54]
    • Can you optimize for TF-IDF, Proof Terms, Semantics, LSI? [25:18]
      • Dan’s response [26:14]
      • Marie’s response [27:28]
    • The Google update caused some of Talmage's clients to suffer; what can he do to counteract it? [29:16]
      • Marie’s thoughts [30:31]
      • Dan’s thoughts [33:38]
      • Marie's additional recommendations [34:55]
    • How to best build links and what are Dan and Marie’s thoughts on link building? [36:56]
      • Dan’s thoughts [37:33]
      • Marie's thoughts [38:58]
  • 3rd caller Victor [42:06]
    • How to optimize for an "hours open" listing site? [42:42]
      • Marie’s response [44:17]
      • Dan’s thoughts [46:43]
      • Marie’s additional thoughts on providing value that people can’t find elsewhere [49:02]
      • Dan’s additional thoughts on Victor’s website [51:46]
      • Marie’s final thoughts on link building and user experience [52:32]
  • Dan and Marie’s final thoughts on incorrect business models [54:11]
  • Where to find Marie online [55:10]


The post 100: Marie Haynes on The August 1st Update, Link Building, EAT – Special Call-In Show! appeared first on Evolving SEO.