Archives 2019

Yoast SEO 9.6: Improving our code

We’re still recuperating from an awesome edition of YoastCon last week, but that won’t keep us from releasing a new version of Yoast SEO. Yoast SEO 9.6 is a bug fix release with an additional focus on improving the code base of the plugin to better adhere to coding standards. Find out what else is new in Yoast SEO 9.6!

A reminder: The beta test toggle will be removed

Testing the new SEO analysis — due for release in Yoast SEO 10.0 — has been a great success. More than 100,000 people are helping us test the new version in real-world situations. We are in awe of those numbers — thanks, everyone! All this input will give us enough feedback to improve the new analysis even further before we release it into the wild. Read more on this beta test in the release post of Yoast SEO 9.4 or find out why you should help us test.

In Yoast SEO 9.6, we will remove the toggle to sign up for the beta as we have more than enough participants and data. If you’ve already enrolled, you can continue using it. After the update, it’s no longer possible to sign up or to reactivate it once you’ve switched it off.

Improving Yoast SEO by using better code standards

One of the main improvements in this release of Yoast SEO is not a new feature or some bug fixes, but something less visible: better code through code standards. Together with the awesome Juliette Reinders Folmer, we’ve embarked on a journey to drastically improve the code of our plugins.

We're in the process of discarding old standards and embracing new ones. There are lots of reasons to use modern standards: code that's easier to maintain, to read and to debug. It leads to more consistency and a much more secure code base, hardening it against security risks. At the moment, Yoast SEO is on PHPCS 2.8.1, WPCS 0.10.0, YoastCS 0.4.3, PHPCompatibility 9.1.0 and PHPCompatibilityWP 2.0.0.

This is an ongoing process that will eventually lead to a healthier, more modern code base that is a joy to develop on. All of this will, of course, ultimately benefit users as well!

Other improvements

In this release, among other things, we’ve removed Schema output from 404 pages as that is not necessary. We’ve also improved the accessibility of the Search Console part of the interface, now show a 404 for empty feeds for non-existing pages (thanks Saša Todorović!) and improved our open source content analysis library (thanks Alexander Varwijk!). You can read the full list of changes in the changelog.

Update now!

There you have it. On the outside, this might seem like a rather small release, but there are a lot of improvements under the hood. You might not see it, but adhering to new coding standards streamlines a code base, making it faster, easier to maintain and more secure. We're continuing to improve our plugins in a two-week release cycle and there's a lot of cool stuff down the road.

Thanks for using Yoast SEO!

The post Yoast SEO 9.6: Improving our code appeared first on Yoast.

Using Python to recover SEO site traffic (Part one)

Helping a client recover from a bad redesign or site migration is probably one of the most critical jobs you can face as an SEO.

The traditional approach of conducting a full forensic SEO audit works well most of the time, but what if there was a way to speed things up? You could potentially save your client a lot of money in opportunity cost.

Last November, I spoke at TechSEO Boost and presented a technique my team and I regularly use to analyze traffic drops. It allows us to pinpoint this painful problem quickly and with surgical precision. As far as I know, there are no tools that currently implement this technique. I coded this solution using Python.

This is the first part of a three-part series. In part two, we will manually group the pages using regular expressions, and in part three we will group them automatically using machine learning techniques. Let's walk through part one and have some fun!

Winners vs losers

[Image: SEO traffic after the switch to Shopify takes a hit]

Last June we signed up a client that moved from Ecommerce V3 to Shopify and the SEO traffic took a big hit. The owner set up 301 redirects between the old and new sites but made a number of unwise changes like merging a large number of categories and rewriting titles during the move.

When traffic drops, some parts of the site underperform while others don’t. I like to isolate them in order to 1) focus all efforts on the underperforming parts, and 2) learn from the parts that are doing well.

I call this analysis the “Winners vs Losers” analysis. Here, winners are the parts that do well, and losers the ones that do badly.

[Image: visual analysis of winners and losers to figure out why traffic changed]

A visualization of the analysis looks like the chart above. I was able to narrow down the issue to the category pages (Collection pages) and found that the main issue was caused by the site owner merging and eliminating too many categories during the move.

Let's walk through the steps to put this kind of analysis together in Python.

You can reference my carefully documented Google Colab notebook here.

Getting the data

We want to programmatically compare two separate time frames in Google Analytics (before and after the traffic drop), and we’re going to use the Google Analytics API to do it.

Google Analytics Query Explorer provides the simplest approach to do this in Python.

  1. Head on over to the Google Analytics Query Explorer
  2. Click on the button at the top that says “Click here to Authorize” and follow the steps provided.
  3. Use the dropdown menu to select the website you want to get data from.
  4. Fill in the “metrics” parameter with “ga:newUsers” in order to track new visits.
  5. Complete the “dimensions” parameter with “ga:landingPagePath” in order to get the page URLs.
  6. Fill in the “segment” parameter with “gaid::-5” in order to track organic search visits.
  7. Hit “Run Query” and let it run
  8. Scroll down to the bottom of the page and look for the text box that says “API Query URI.”
    1. Check the box underneath it that says “Include current access_token in the Query URI (will expire in ~60 minutes).”
    2. At the end of the URL in the text box you should now see access_token=string-of-text-here. You will use this string of text in the code snippet below as the variable called token (make sure to paste it inside the quotes).
  9. Now, scroll back up to where we built the query, and look for the parameter that was filled in for you called “ids.” You will use this in the code snippet below as the variable called “gaid.” Again, it should go inside the quotes.
  10. Run the cell once you’ve filled in the gaid and token variables to instantiate them, and we’re good to go!

First, let's define placeholder variables to pass to the API.

metrics = ",".join(["ga:users", "ga:newUsers"])
dimensions = ",".join(["ga:landingPagePath", "ga:date"])
segment = "gaid::-5"

# Required, please fill in with your own GA information example: ga:23322342
gaid = "ga:23322342"

# Example: string-of-text-here from step 8.2
token = ""

# Example:
base_site_url = ""

# You can change the start and end dates as you like
start = "2017-06-01"
end = "2018-06-30"

The first function combines the placeholder variables we filled in above with an API URL to get Google Analytics data. We make additional API requests and merge them in case the results exceed the 10,000 limit.

import requests

def GAData(gaid, start, end, metrics, dimensions,
           segment, token, max_results=10000):
  """Creates a generator that yields GA API data
     in chunks of size `max_results`"""
  # build uri w/ params
  api_uri = "https://www.googleapis.com/analytics/v3/data/ga?ids={gaid}&"\
            "start-date={start}&end-date={end}&metrics={metrics}&"\
            "dimensions={dimensions}&segment={segment}&"\
            "access_token={token}&max-results={max_results}"

  # insert uri params
  api_uri = api_uri.format(
      gaid=gaid,
      start=start,
      end=end,
      metrics=metrics,
      dimensions=dimensions,
      segment=segment,
      token=token,
      max_results=max_results
  )

  # Using yield to make a generator in an
  # attempt to be memory efficient, since data is downloaded in chunks
  r = requests.get(api_uri)
  data = r.json()
  yield data
  if data.get("nextLink", None):
    while data.get("nextLink"):
      new_uri = data.get("nextLink")
      new_uri += "&access_token={token}".format(token=token)
      r = requests.get(new_uri)
      data = r.json()
      yield data

In the second function, we load the Google Analytics Query Explorer API response into a pandas DataFrame to simplify our analysis.

import pandas as pd

def to_df(gadata):
  """Takes in a generator from GAData()
     creates a dataframe from the rows"""
  df = None
  for data in gadata:
    if df is None:
      df = pd.DataFrame(
          data['rows'],
          columns=[x['name'] for x in data['columnHeaders']]
      )
    else:
      newdf = pd.DataFrame(
          data['rows'],
          columns=[x['name'] for x in data['columnHeaders']]
      )
      df = df.append(newdf)
    print("Gathered {} rows".format(len(df)))
  return df

Now, we can call the functions to load the Google Analytics data.

data = GAData(gaid=gaid, metrics=metrics, start=start,
              end=end, dimensions=dimensions, segment=segment,
              token=token)

data = to_df(data)

Analyzing the data

Let’s start by just getting a look at the data. We’ll use the .head() method of DataFrames to take a look at the first few rows. Think of this as glancing at only the top few rows of an Excel spreadsheet.


data.head()

This displays the first five rows of the data frame.

Most of the data is not in the right format for proper analysis, so let’s perform some data transformations.

First, let’s convert the date to a datetime object and the metrics to numeric values.

data['ga:date'] = pd.to_datetime(data['ga:date'])
data['ga:users'] = pd.to_numeric(data['ga:users'])
data['ga:newUsers'] = pd.to_numeric(data['ga:newUsers'])

Next, we will need the landing page URLs, which are relative and include URL parameters, in two additional formats: 1) as absolute URLs, and 2) as relative paths (without the URL parameters).

from urllib.parse import urlparse, urljoin

data['path'] = data['ga:landingPagePath'].apply(lambda x: urlparse(x).path)
data['url'] = data['path'].apply(lambda x: urljoin(base_site_url, x))

Now the fun part begins.

The goal of our analysis is to see which pages lost traffic after a particular date, compared to the period before that date, and which pages gained traffic after it.

The example date chosen below corresponds to the exact midpoint of our start and end variables used above to gather the data, so that the data both before and after the date is similarly sized.
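If you'd rather compute that midpoint than eyeball it, here is a quick sketch using pandas (the start and end values are the same placeholders defined earlier):

```python
import pandas as pd

start = "2017-06-01"
end = "2018-06-30"

# The midpoint is the start date plus half the span between start and end
midpoint = pd.to_datetime(start) + (pd.to_datetime(end) - pd.to_datetime(start)) / 2
print(midpoint.date())  # → 2017-12-15
```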

We begin the analysis by grouping each URL together by their path and adding up the newUsers for each URL. We do this with the built-in pandas method: .groupby(), which takes a column name as an input and groups together each unique value in that column.

The .sum() method then takes the sum of every other column in the data frame within each group.

For more information on these methods please see the Pandas documentation for groupby.

For those who might be familiar with SQL, this is analogous to a GROUP BY clause with a SUM in the SELECT clause.
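To make the .groupby()/.sum() combination (and the SQL analogy) concrete, here is a toy example on made-up data; the column names mirror the GA data but the numbers are invented:

```python
import pandas as pd

# Made-up traffic rows: two visits to /shirts/, one to /hats/
toy = pd.DataFrame({
    "ga:landingPagePath": ["/shirts/", "/hats/", "/shirts/"],
    "ga:newUsers": [10, 5, 3],
})

# Equivalent to: SELECT path, SUM(newUsers) FROM toy GROUP BY path
totals = toy.groupby("ga:landingPagePath").sum()
print(totals)  # /shirts/ sums to 13, /hats/ to 5
```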

# Change this depending on your needs
MIDPOINT_DATE = "2017-12-15"

before = data[data['ga:date'] < pd.to_datetime(MIDPOINT_DATE)]
after = data[data['ga:date'] >= pd.to_datetime(MIDPOINT_DATE)]

# Traffic totals before Shopify switch
totals_before = before[["ga:landingPagePath", "ga:newUsers"]]\
                .groupby("ga:landingPagePath").sum()

totals_before = totals_before.reset_index()\
                .sort_values("ga:newUsers", ascending=False)

# Traffic totals after Shopify switch
totals_after = after[["ga:landingPagePath", "ga:newUsers"]]\
               .groupby("ga:landingPagePath").sum()

totals_after = totals_after.reset_index()\
               .sort_values("ga:newUsers", ascending=False)

You can check the totals before and after with this code and double check with the Google Analytics numbers.

print("Traffic Totals Before: ")
print("Row count: ", len(totals_before))

print("Traffic Totals After: ")
print("Row count: ", len(totals_after))

Next up we merge the two data frames, so that we have a single column corresponding to the URL, and two columns corresponding to the totals before and after the date.

We have different options when merging. Here, we use an "outer" merge, because even if a URL didn't show up in the "before" period, we still want it to be a part of this merged dataframe. We'll fill in the blanks with zeros after the merge.
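As a toy illustration of the "outer" option (the URLs and numbers here are invented, and the variable names are prefixed with demo_ to keep them separate from the real data): a URL present in only one period still gets a row, and fillna(0) turns its missing total into zero.

```python
import pandas as pd

demo_after = pd.DataFrame({"ga:landingPagePath": ["/a/", "/b/"],
                           "ga:newUsers": [7, 2]})
demo_before = pd.DataFrame({"ga:landingPagePath": ["/a/", "/c/"],
                            "ga:newUsers": [5, 9]})

# "/b/" only exists after the switch and "/c/" only before;
# an outer merge keeps both, whereas an inner merge would drop them
demo_change = demo_after.merge(demo_before,
                               on="ga:landingPagePath",
                               suffixes=["_after", "_before"],
                               how="outer")
demo_change = demo_change.fillna(0)
print(demo_change)  # "/c/" gets 0 newUsers_after, "/b/" gets 0 newUsers_before
```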

# Comparing pages from before and after the switch
change = totals_after.merge(totals_before,
                            on="ga:landingPagePath",
                            suffixes=["_after", "_before"],
                            how="outer")

change.fillna(0, inplace=True)

Difference and percentage change

Pandas dataframes make simple calculations on whole columns easy: subtracting one column from another, or dividing them, performs that operation on every row for us. We will take the difference of the two totals columns, and divide by the "before" column to get the percent change before and after our midpoint date.

Using this percent_change column we can then filter our dataframe to get the winners, the losers and those URLs with no change.

change['difference'] = change['ga:newUsers_after'] - change['ga:newUsers_before']
change['percent_change'] = change['difference'] / change['ga:newUsers_before']

winners = change[change['percent_change'] > 0]
losers = change[change['percent_change'] < 0]
no_change = change[change['percent_change'] == 0]

Sanity check

Finally, we do a quick sanity check to make sure that all the traffic from the original data frame is still accounted for after all of our analysis. To do this, we simply take the sum of all traffic for both the original data frame and the two columns of our change dataframe.

# Checking that the total traffic adds up
data['ga:newUsers'].sum() == change[['ga:newUsers_after', 'ga:newUsers_before']].sum().sum()

It should be True.


Sorting by the difference in our losers data frame, and taking the .head(10), we can see the top 10 losers in our analysis. In other words, these pages lost the most total traffic between the two periods before and after the midpoint date.

losers.sort_values("difference").head(10)


You can do the same to review the winners and try to learn from them.

winners.sort_values("difference", ascending=False).head(10)

You can export the losing pages to a CSV or Excel using this.

losers.to_csv("./losing_pages.csv")


This seems like a lot of work to analyze just one site, and it is!

The magic happens when you reuse this code on new clients and simply need to replace the placeholder variables at the top of the script.

In part two, we will make the output more useful by grouping the losing (and winning) pages by their types to get the chart I included above.

The post Using Python to recover SEO site traffic (Part one) appeared first on Search Engine Watch.

Are Web Directories Still Relevant for SEO in 2019?

Web Directories have been the norm for the world wide web and for search engine optimization for a long time. But are they still relevant in this age of advanced AI Google Updates?


Well, as Google's John Mueller said, "in general, no". But the key words in that answer are "in general". That's the usual answer we get from John and, as much as we'd like to say we will dissect it, the truth is that we can only speculate and interpret things based on our own knowledge and experience.




Let’s jump to it and find out whether web directories are still useful for SEO in 2019.


  1. What Are Web Directories Really?
    1. Why Directories Were Created
    2. Why Directories Died
    3. The Web Directory Legacy
  2. Do Web Directories Still Work for SEO in 2019?
    1. The Problem With Today’s Directories
    2. Local SEO & Directory Listings
    3. Directories That Are Still Good
    4. Why Do We Still Have Web Directories?
  3. Should You Disavow Directory Links?

What Are Web Directories Really?


In order to better understand why directories do or don't work in 2019, we have to give them some background, like an introduction.


Most people probably think of directories as just sites that list other sites. While that’s true, the reality is that they’re much more.


Before modern search engines, such as Google, web directories were THE INTERNET.


Why Web Directories Were Created


In the very early days of the internet, only a few people had websites. Internet connections were uncommon and there wasn’t much profit to be made out of it. Still, different websites emerged, in different niches.


However, you had to actually remember the domain names in order to access those websites, as there was no Google.


But, as the internet grew, more websites started showing up. Soon enough, people couldn't remember all those names. Some clever folks at Yahoo! had the brilliant idea to store the best websites in something they called a web directory.


[Image: the Yahoo! web directory]


At first, only a few websites were listed, the most ‘popular’ ones or the ones that they thought were worthy of being listed. This way, you would only have to remember one website (Yahoo!) if you wanted to find all the other good sites.


At first, the sites were listed alphabetically but, as the number grew, they started getting listed under categories and subcategories. This way it was easier for users to find what they needed.


Web directories made money out of selling advertising space or boosting certain results in exchange for a fee.


Why Web Directories Died


Unfortunately, when the number of sites started getting way too high, people soon realized it wasn’t so convenient to browse through all those categories. It would take too much time and, ultimately, the best results were mixed up with the less good ones.


This is how search engines appeared. It was by far more convenient to type in a keyword and get a list of top results than to browse through categories trying to figure out which website is the best.


[Image: why web directories died]


The search engines worked on algorithms, just as they do now (although not as complex). Depending on the keywords in the title and other factors, some sites would rank better than others.


Yahoo! had its own search engine, and it is no secret that Google took the lead because of its improved algorithm. But that isn't the only reason Google eventually won. Yahoo! was a giant back then, in the days of Yahoo! Mail, News, Answers and Messenger, but a number of bad decisions made it fade out.


The Web Directory Legacy


Even though web directories dropped dramatically in popularity, people still used them. But not for what they were supposed to be used for.


As much as people say that web directories are still useful, the truth is that people have been using them merely for link building and online marketing for a long time.


I'm 25 now and I've been using a computer since I was 6. My first contact with the internet was in 2002 and I can't recall ever browsing a web directory. I remember Yahoo! and Google. I remember ODC, DC++ and other popular hubs. I remember floppy disks with drivers. But I don't remember web directories.


In fact, my first contact with web directories was back in 2013, when I first started learning about SEO. I had probably ended up on similar websites before, but not solely for searching a specific website.


Although the truth was that web directories weren’t of interest to the general public, new web directories kept popping up.


There were of course, useful directories, such as business listings (Yellow Pages), which people visited if they wanted to search for a particular business (name, phone, address, legal details, etc.).


There were also niche directories, which only listed websites in a particular field. Those might have got some organic traffic and potential customers when people searched for things like “list of XYZ websites”.


But the truth is that, most of the time, web directories were only there because links brought better rankings. People found ways to get listed on as many directories as they could, therefore destroying the purpose of a directory. Even paid directories were full of SPAM.


Fees were usually low, ranging from $5 to $20 if you wanted to get a listing on the homepage. However, most of the time that wouldn’t bring much more traffic to your site, since most of the users visiting web directories were SEOs trying to build links.


I’m not afraid to say that I only visited web directories for link building purposes. Everybody was doing it and it worked. It was easy and convenient.


But the question is… do they still work now?


Do Web Directories Still Work for SEO in 2019?


In a local SEO social media group somebody tried to prove a point by showing us a #1 result for a keyword, claiming that he had only built 2 directory links to that page and it’s now #1.


He was trying to attack us "white hats" who always state web directories don't work when, in fact, he had proof that they do. The only problem was that the keyword he was ranking for had virtually zero searches, and the competition was basically nonexistent.


Did the web directory links actually help him rank better in Google's search results? Probably… Did he prove any point? Definitely not.


Web Directories probably still have a small impact on your rankings, just like any other site that links to you and passes Page Rank. However, because they’ve been abused as an SEO tactic over the years, Google penalizes people that use web directories. With the rise of Penguin 4.0, many of them are probably ignored.


Or, as Google's John Mueller said:



But that answer can be interpreted in many ways. Let's first see why web directories got such a bad reputation in SEO.


Web Directories’ Main Issues for SEO


I’ve already discussed how web directories don’t really provide much value as users don’t really use them for anything other than SEO.


However, the story doesn’t end there:


Web directories got spammy & tricky: As people always want the quick way, they started automating processes. With more automation, more SPAM came along. These types of things are inconvenient for Google because they waste a lot of resources. Don’t you just hate spam comments? I know I do.


Google eventually figured out that the best way to deal with this was to penalize people that would use directories as a way of improving their rankings. So when important updates such as Penguin started to roll out, web directories had the perfect opportunity to make money.


Although some websites were listed for free, now that Google was penalizing websites for spammy link profiles, some web directories asked for removal fees.


Web directories are not managed anymore: Even if you do find a good web directory, chances are that nobody is going to answer you.


A couple of years ago, I was in the process of building links for someone. Advertorials were too expensive for his budget and all their competitors had the directory links. Without exception. They ranked well, so I thought I’d better try it at least.


After a few days of effort, it was pretty clear that it wouldn't work. Not because Google penalized me or anything, but because the directory sites either had errors in their submission forms or nobody replied to or approved my submissions.


Since they don’t make as much money as they used to, people have stopped putting effort into managing them.


Web directories are not efficient: Even if you manage to get some business directory listings, they will probably have very little impact, if any at all.


Your time is better spent creating content and engaging with your audience. 1-2 links won’t help at all and scaling this is risky and can result in penalties.


Google penalizes you: Of course, the biggest issue with web directories these days is that your website can get penalized if it tries to trick the Google algorithms.


Building links in mass, be it through web directories, blog commenting, article submission, guest posting, you name it, is risky and can get your site in trouble.


Now I’m not saying that you’re going to be penalized if you submit your site to a directory. By all means, if you find an interesting site that also happens to have a list of other interesting sites, go for it!


But it’s a good idea to check their background a little bit. What’s their domain authority? Is their backlink profile spammy? Are they linking to quality websites or a bunch of phishing potholes full of malware?


In other words, does that website bring any value at all to the internet or for any users? Is it in any way relevant to your site? If the answer is no, then you should definitely stay away from it.


Local SEO & Directory Listings


When it comes to local, it’s probably the only field where people actually still visit renowned “directories” to find a good location.


For example, Google My Business is considered to be a web business directory. To some extent, it is. It lists businesses. However, it’s turning more and more into a search engine itself. The rankings are done based on algorithms.


[Image: Local Search Listings | Google My Business]


People do browse and look for reviews, but most of them just stick with the top businesses (the ones that get listed first by Google My Business on local search and Google Maps) because they are, usually, the best results.


When it comes to local SEO, it does help if your NAP details (name, address, phone) are listed on multiple websites. However, title, distance from the user and reviews probably play a more important role.


The key thing with NAP is consistency. Google might have some issues if you keep having a mixture of different names, phones and addresses all over the place.


However, let’s all be honest that citations here and there aren’t the key to local search SEO success. It’s not that they don’t help (they do, you get brand visibility) but whenever you have a citation, you should accompany it with a backlink.


Local business directories are a great start. For example, you might find sites such as “” that have a section called “Great Locations in Our City”. If you can list your local business there, that’s great because the relevancy comes from the local orientation of the website itself. Most news will be about your city, which means your business should be there to rank better in that particular city.


If you can’t find a local business directory, you can also find these types of backlink opportunities with BrandMentions. This tool monitors when your name/brand is mentioned on the web. You can then engage with people and build relationships (and also ask for a backlink in case they haven’t made one).


Directories That Are Still Good


Now obviously you’re probably asking “Hey, where can I still find good web directories if all are bad?”


Well… there are some. This isn't really the purpose of this article, but I'll list a few so that you get an idea of what modern, useful web directories actually look like.


White Hat SEOs will probably jump and say “How can you recommend profile link building or any sort of web directory link?”


[Image: are web directories bad for SEO?]


The key word here is RELEVANCY. If the listing is relevant to your site, it is going to help you. 


I don’t want you to take “web directories are bad” for granted and miss an opportunity that might actually be helpful to you.


Many search engine optimization experts react badly to blog commenting or forum posts, because they know it doesn’t work. However, blog comments are still the fuel of blogs. Forums are still popular in industries such as music, automotive, health and even fashion.


Matthew Woodward built an award-winning blog and much of his direct and organic traffic was thanks to the posts he made in BlackHat forums. Just don't spam. Be relevant. Be useful.


An example of good directories were the local sites that might have listings. I can’t really give a specific example, but I hope you get the point. If they have an active social media account, post from time to time and bring in some traffic, you can try to list your local business there.


It all comes down to relevancy. Is the listing relevant to your scope/niche? If yes, go for it!


Here I can give a specific example. SoundCloud is a directory for musicians, for example. If you’re a musician, you can get a profile backlink from your SoundCloud profile. Another similar example is


Is this spammy? It could be if you just create a profile and leave it like that. But if you’re a musician, maybe it also makes sense to post on your SoundCloud account from time to time, build an audience there and actually send some traffic to your site through that link and maybe even get some potential customers.


Other good local listing sites are Yelp or maybe Yellow Pages. Although Google has heavily bashed them because they’re evil, it’s still a good place to list your business.


Why Do We Still Have Web Directories?


Even the most famous directories, such as DMOZ (the Open Directory Project) and the Yahoo! Directory, have been closed down.


The truth is probably that the web is still full of directories because of low maintenance costs (hosting and domain).


Many directories, if not all, lack moderators (who actually cost money), but probably still make small amounts of revenue from advertising.


This whole article idea started from us observing these weird anchor texts such as “.2523765” in our tool for multiple clients. You might see these anchor texts in your cognitiveSEO dashboard as well.


They’re generated by The Globe network, a sort of directory that crawls and indexes websites by simply linking to them on pages which contain thousands of external links. The network is owned by Ascio Technologies Inc (a domain registrar). Luckily, I’ve looked at their backlink profile and they also have the spammy anchor texts, so at least it’s fair.


Their model is different. They want to sell the entire network of sites for millions of dollars. They present it as some sort of world wide web, the most visited site on the planet (which obviously, it isn’t). I’m not sure how old this thing is or how it started, but there’s a BlackHat Forum thread about the network where you can read more about this.


Should You Disavow Directory Links?


One thing that people rush in to do is disavow links that they think Google might consider spammy or that might get their websites penalized.


This is an entire topic to discuss and I really recommend that you read my article about disavowing links.


However, just to give you a short answer, in general, it’s a good idea not to play with the disavow tool unless you’re sure your site has been penalized and you have isolated the reason to be the link profile, after you’ve checked and excluded everything else.


For example, most of The Globe network, the site I’ve mentioned above, isn’t even indexed in Google, so it doesn’t really make any sense disavowing those links, since Google already doesn’t count those links in any way.




Generally, you should avoid them, but sometimes web directory submissions might be useful, especially when it comes to local businesses and local SEO. Before you submit your links, make sure that the online directory is relevant to your niche and has a good link profile.


What do you think about web directory submissions? Have you ever used them? Do you still use them? Do they still work? Let us know in the comments section (but be careful with the links if you’re a WhiteHat SEO as you might get penalized!)


The post Are Web Directories Still Relevant for SEO in 2019? appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

Voices of Search Podcast: Optimizing Top of Funnel & Awareness Content

Episode 39 Overview

Content Optimization Week continues with a focus on how you can educate prospective clients with mid-funnel content during this critical phase of the consideration and buying cycle. Join Ben Shapiro and Marlon Glover, the Searchmetrics Content Services Team Lead, as they dig into the strategy behind the development and optimization of educational content.


Episode Transcript

Ben:                 Welcome back to content optimization week on the Voices of Search podcast. I’m your host, Benjamin Shapiro, and this week we’re going to publish an episode every day, covering what you need to know to optimize every stage of your content marketing funnel. Joining us for content optimization week is Marlon Glover, who is the content services team lead, here at Searchmetrics. He is responsible for shepherding Searchmetrics’ largest and most strategic clients into content marketing success.

Ben:                 Today, we’re going to continue our conversation with Marlon, and discuss how to optimize your middle of funnel content. But before we get started, I want to remind you that this podcast is brought to you by the marketing team at Searchmetrics. We’re an SEO and content and marketing platform that helps enterprise scale businesses, monitor their online presence, and make data-driven decisions. To support you, our loyal podcast listeners, we’re offering a complimentary digital diagnostic. A member of our digital strategies group will provide you with a consultation that reviews how your website, content, and SEO strategies can be optimized. To schedule your free digital diagnostic, go to

Ben:                 Here is the second installment of Content Optimization Week with Mr. Marlon Glover, Searchmetrics’ Content Services Team Lead. Marlon, welcome back to the Voices of Search Podcast.

Marlon:            Ben, thank you for having me again.

Ben:                 Of course, of course. Glad to have you here. Yesterday we talked about building awareness, casting a wide net, understanding who your customers are, and building content, doing some keyword research, looking at your industry, but generally trying to build something that’s going to get in front of them and be tangentially related to the products or services you’re selling.

Ben:                 Today, we’re going to talk a little bit about the middle of the funnel, which is what I called, educational content. I think you’d call it, consideration. Talk to me a little bit about the dividing line between building awareness, and improving consideration.

Marlon:            Yeah, Ben. I’m rubbing my hands together because this is actually my favorite part of the funnel. This is where the rubber really meets the road, with a company truly teaching their prospective buyers throughout their buying process. The dividing line that you mentioned is when a potential buyer has a need that’s so significant that they want to find a solution for it.

Marlon:            An example may be, let’s say, I’m a content marketer. Let’s say I’m a content marketer and I’m looking to do a content gap analysis, but I don’t necessarily have the tools to do it. My issue is, and my need is, that I need to do a content gap analysis. I may be a new marketer, or maybe I’m not up to speed with the new technology out there. First, I want to understand how to conduct a content audit. Right? That may be an example of the first search I take. Now, we do understand and recent data has shown from Google, that the consumer journey may be different for different types of products and solutions. Maybe someone will start with knowledge of a specific product, or a service provider and they start their search there.

Marlon:            It’s not necessarily linear in all cases, but there still is a need to educate and to provide individuals with these teaching moments in this research phase, so it’s important that we understand the demands for these questions in looking at the data.

Ben:                 I’ll use an example that you brought up in our last podcast, where we were talking about a company that’s selling blenders, an ecommerce product. You mentioned that for building awareness, that company can create recipe pages. They’re trying to understand and build an audience of people that are active and avid chefs. To me, the dividing line between building awareness, just getting your brand in front of someone, and improving their consideration is when you start talking about what makes a good product: comparison pages between all of the different blenders, or content that talks about, what are the most important tools to have in your kitchen? When you’re actually educating someone on not only what your product specifically does, but why it’s important to have a product in that class. It’s basically working your way down to why your product is the best.

Marlon:            Yeah. I think what we’re looking to do with our content strategy is, I come from a school of thought around this idea of challenger selling. Teaching, tailoring, and taking control throughout the buying process is really a sales methodology, but I think there’s overlap with our content strategy. Typically, at this stage of the buying process, buyers are trying to work out how much or how little they need. Right? They’re looking for the important elements that they should be considering when they’re looking to solve that problem.

Marlon:            Now, it’s up to the solution provider, the manufacturer, or whoever it may be that’s looking to solve that problem with their products, to help customers understand what they should be considering, and if they’re doing it well, that should lead to their unique value proposition. This is the moment where they’re educating them not only about the things that make them unique, but about the things that their customers should be considering, and then ideally helping them overcome some potential challenges as they move through the other stages of that purchasing decision, as well.

Ben:                 To me, the questions you’re answering with these types of content is, what is it, and do I need it? Right? Like, hey, I’ve been exposed to this brand. I understand that they have these products. That’s the awareness stage. Getting into consideration is, what is the product? Does it have value in my life? And then down the road, when we get into tomorrow’s episode and we’re talking about actually getting into the purchases, what are the things that I need to understand to feel comfortable making a purchase? Is it right for me?

Marlon:            Yeah. The only thing that I would change there, in your language, is at this stage it’s not so much about the product. It’s about the solution. It’s important for every organization to truly understand what problems they’re solving for their buyers. When you look at different types of products, we can find different needs and solutions within a single product. An example of that, again, is Searchmetrics. Searchmetrics has a product that can solve the needs of both SEOs and content marketers. Maybe an SEO needs to do a technical SEO audit. This solution solves that problem. A content marketer could also use it to develop their content plan.

Marlon:            I think it’s important for us to think about the consideration stage as not necessarily building significant value in our products, and this is where a lot of organizations make that mistake. It’s about teaching them around the things that they should be considering when they’re looking to solve that problem. It’s not until we start getting towards the later stages of consideration, moving into the purchase stage, that we’re really talking about the unique value proposition of our individual products and solutions and peppering in brand differentiators along the way.

Marlon:            I just wanted to make one clarification there. This is where we see a lot of organizations tend to overemphasize product, brand, and feature-specific content: they think they’re truly answering those questions at the earlier stage when, indeed, they’re not effectively teaching and educating their customers at the consideration stage.

Ben:                 It’s not necessarily, do I need this product? It is, what is the solution to my problem? Which to me, when we talk about the format, it seems like a lot of this type of, middle of funnel consideration content is really answering questions, as opposed to, we talked before about blog posts and just tangentially related to your brand, the content. This is answering questions about the overall industry product class and set.

Marlon:            That’s exactly right. When we do that initial keyword research, typically what I’m doing next is, within our research cloud we have a keyword discovery tool that allows me to type in question modifiers. I may take a single keyword and do a phrase match, or attach the what, when, why, how, which modifiers to the keyword to understand the questions that folks, potential buyers, may be asking that are associated with those keywords. That’s what I’m starting to think about: exactly the questions that buyers are asking at this stage.

Marlon:            Consideration to me, doesn’t mean consideration of our products and solutions. Considerations to me means, what should I be considering when I’m looking to solve my unique problem?
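
The mechanical part of the step Marlon describes, attaching question modifiers to a seed keyword to generate candidate queries, can be sketched in a few lines of Python. This is not Searchmetrics’ keyword discovery tool, just an illustration of the idea; the seed keyword and modifier list are made up.

```python
# Sketch of question-modifier keyword expansion, in the spirit of the
# keyword discovery step described above. The seed keyword and the
# modifier list are illustrative, not Searchmetrics data.
MODIFIERS = ["what", "when", "why", "how", "which"]

def question_keywords(seed):
    """Prefix a seed keyword with common question modifiers."""
    return [f"{mod} {seed}" for mod in MODIFIERS]

for query in question_keywords("content gap analysis"):
    print(query)  # first line: "what content gap analysis"
```

In practice you would feed the generated phrases into a keyword tool to check search demand for each question.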

Ben:                 I think one of the things is, if you do this content right, you end up building a lot of credibility. If you’re able to clearly communicate an answer to a customer’s question, they can see a direct path between, hey, I’m aware of this brand. I’m aware that this company makes blenders. Now I’m starting to think about whether I need a blender, and they are giving me the information that helps me make that decision. There is a naturally inferred credibility: if you can communicate and help them with the decision, you understand the problems that they’re having, so your products will be valuable.

Marlon:            That’s exactly right. Again, I think there’s a very close parallel to the sales process. It used to be, you would walk into a car dealership, or maybe even walk into a Best Buy and you’ll quickly get annoyed with the overly aggressive sales rep, telling you about all the features and specs of a specific product. Like, hey guy, I’m just in here browsing, want to learn. But, I think what we find is, the folks that we tend to not get so annoyed with are the ones that aren’t so focused on the features and specs, but they may be helping us think about understanding what it is we’re looking to again … I hate being repetitive and using this term but, solve for. But they’re really teaching us and educating us, and helping us uncover things that we may not have previously considered.

Marlon:            This is what we’re hoping to achieve with content at this stage: we want to focus on building that credibility and, to use a term from the challenger methodology again, commercial teaching. It’s teaching, but you’re not teaching and leading them to a competitor. You’re teaching them and leading them to what it is you uniquely solve.

Ben:                 I think going back to your sales metaphor, when I’m thinking about the sales process, the best salesmen try to understand what problem the customer is trying to solve first, before actually making product suggestions.

Marlon:            Yeah.

Ben:                 To me, that is very much what this middle class of content is about, education, understanding the customer, building relationships, building credibility. In terms of doing the research for understanding what types of questions or what content to build here, we mentioned that a lot of these are going to be questions, so you’re looking for who, where, when, what, why related to specific keywords relative to your brand. Are there any other ways that you suggest companies think about building their middle of funnel, their educational, their consideration content lists?

Marlon:            Yeah, sure. The data can help give us direction in our content strategy. I oftentimes would like to speak to the people that are having these conversations with customers, every day. Even if we’re talking about products that are sold online, if you’re a manufacturer and you have an online store, then I want to understand from the folks that are out there doing this research, they’re talking to customers, product teams, sales teams, anyone that has direct connections with the prospective buyers and current customers, I want to hear from them. I want to understand what’s happening on the ground.

Marlon:            Then, I think it’s important for us to always tap into the trends and the different types of content that are resonating. I use Google’s research. I’m a big fan of Think with Google. I’m a big fan of search in general. I’m a big fan of some of these other resources out there that help me understand what’s happening in trends in the marketplace. But again, I think that our technology also allows us to understand what Google is rewarding in terms of search. Our search integration features and our technology allow us to get really clear direction in terms of the type of content that makes the most sense for a specific topic.

Ben:                 Obvious answer, Searchmetrics can help you with that, but the other takeaway from what you’re saying is that there are other people that are influencing or communicating with your customers and leads on a regular basis. They have a great understanding of what the frequently asked questions are going to be. Whether it’s your sales team, whether it’s your customer service arm, whether it’s your manufacturers, go around the rest of the organization and ask them what questions their customers are asking, and answer those in a format that is question-based, whether it’s your FAQs, whether it’s somewhere on your site, even if it’s in your blog.

Ben:                 Building consideration and building credibility shows that you have an understanding of what the customer’s problems are. If you’re able to build credibility by answering those questions clearly and articulately, you’ll get that inferred credibility and you’ll get more people coming into your purchase cycle, which is what we’re going to talk about tomorrow.

Marlon:            That’s exactly right.

Ben:                 Great. Well, we covered a lot of ground today. I think that’s a great place for us to stop, so that wraps up this episode of the Voices of Search podcast. Thanks for listening to my conversation with Marlon Glover, Searchmetrics’ Content Services Team Lead. We’d love to continue the conversation with you, so if you’re interested in contacting Marlon, you can find a link to his bio on our show notes, or you can send him a tweet. His handle is Marlon_Glover. If you have general marketing questions, or if you’d like to talk about this podcast, you can find my contact information on our show notes, or you can send me a tweet @BenJShap.

Ben:                 If you’re interested in learning more about how to use search data to boost your organic traffic, online visibility or to gain competitive insights, head over to, for your complimentary advisory session with our digital strategies team. If you like this podcast, and you want a regular stream of SEO and content marketing insights in your podcast feed, hit the subscribe button in your podcast app, and we’ll be back in your feed tomorrow morning to discuss Marlon’s tip for optimizing your bottom of funnel purchase related content.

Ben:                 Lastly, if you’ve enjoyed this podcast, and you’re feeling generous, we’d love for you to leave us your review in the Apple iTunes store. That’s it for today, but until next time, remember, the answers are always in the data.

How to Join Many to Many with a Bridge Table in Power BI

One of the greatest values of data visualization tools is being able to connect different types of data tables to calculate results, illustrate trends, or discover outliers. Relationships between your tables are necessary to create these connections.

Relationships in Power BI

Relationships in Power BI have multiple configurations, including cardinality and cross-filter direction, which determine the way that your tables connect and interact with each other.


One to one (1:1)

A one-to-one cardinality means that the related column in both tables has only one instance of each value.

Many to one (*:1)

A many-to-one relationship means that one column in one table has only one instance of each unique value, but the related column in the other table has multiple instances of a particular value.
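
Cardinality is not unique to Power BI; you can assert the same property when joining tables in code. Below is a minimal pandas sketch, with made-up tables, that enforces a many-to-one join and fails loudly if the "one" side ever contains duplicates.

```python
import pandas as pd

# Made-up tables: one row per product, many sales rows per product.
products = pd.DataFrame({"product_id": [1, 2],
                         "name": ["Blender A", "Blender B"]})
sales = pd.DataFrame({"product_id": [1, 1, 2],
                      "units": [3, 5, 2]})

# validate="m:1" asserts many-to-one cardinality: pandas raises a
# MergeError if product_id is not unique on the products side.
merged = sales.merge(products, on="product_id", validate="m:1")
print(merged)
```

A one-to-one check (`validate="1:1"`) on the same pair of tables would raise an error, because `product_id` repeats in the sales table.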

Many to many (*:*)

Power BI released a many-to-many relationship feature as part of the composite models capability in Power BI desktop. You can enable this by navigating to file > options & settings > options > preview features > and checking off composite models. You’ll need to restart Power BI after enabling composite models.

So…wouldn’t the ability to create a many-to-many relationship mean that there’s no need to build a bridge table?

Technically yes…but depending on how many data sources you’re connecting, how complex your data is, and other factors you might want to build a bridge to ensure accuracy in your relationships.

In some initial comparisons I’ve found that Power BI’s many-to-many connections are seemingly accurate. However, I’ve yet to perform any in-depth comparisons of the accuracy of a bridge table vs. many-to-many relationships in more complex analyses.

Cross filter directions

When creating a relationship you can choose between a single cross filter direction or a both cross filter direction. With both cross filter direction, when you filter any values in one table, the same filter will apply to values in the other table if they are connected by a relationship—treating all aspects of connected tables as if they are a single table.

Building a bridge table

A bridge table—also known as an associative entity table—is a way to create a many-to-many join by creating a table with a column that contains a singular instance of each unique value, which creates a bridge to join two or more many columns together.
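
The same idea can be sketched outside Power BI. Here is a hedged pandas approximation of the steps in this tutorial (rename the key columns to a shared header, append, de-duplicate, then join each table to the bridge); the table names and values are hypothetical, not real campaign data.

```python
import pandas as pd

# Hypothetical stand-ins for the organic rankings and AdWords tables.
organic = pd.DataFrame({"keyword": ["blender", "smoothie maker", "blender"],
                        "rank": [3, 7, 4]})
paid = pd.DataFrame({"search term": ["blender", "food processor"],
                     "conversions": [12, 4]})

# Rename to a shared header, append both key columns, then de-duplicate:
# the bridge holds exactly one instance of every value from both tables.
paid_keys = paid.rename(columns={"search term": "keyword"})[["keyword"]]
bridge = (pd.concat([organic[["keyword"]], paid_keys])
            .drop_duplicates()
            .reset_index(drop=True))

# Each original table joins to the bridge many-to-one (*:1),
# mirroring the relationships created in Power BI.
organic_view = organic.merge(bridge, on="keyword", validate="m:1")
```

The `validate="m:1"` check mirrors the *:1 cardinality you will set in Power BI’s manage relationships dialog later in this tutorial.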

Step 1: Clean your data

I can’t stress enough how important cleaning your data is before building a bridge.

If you take the time to clean your original tables, then by the time we create references in the next step, your references will be clean as well. It saves time to clean everything first rather than having to go back and clean multiple tables after you’ve made transformations and references.

Anyway, how do you expect to properly connect your data if you have mixed case URLs in one table and not the other? If some URLs have trailing slashes and some don’t? Maybe your values from one table have whitespace at the end and you’d have no idea!

If you don’t clean your data, then you won’t have a trustworthy output. You’ll spend time building a beautiful dashboard that doesn’t actually tell you anything because the data isn’t connected properly.

So, before we go onto building a bridge, clean your data. See our checklist for cleaning URLs and our post on cleaning and deduping your data in Power BI to make sure your tables are prepped before moving on to Step 2.

Remember that you won’t need to de-duplicate any of your data for this tutorial since we’ll be building a de-duplicated table to bridge your other tables.
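
As a concrete illustration of the mismatches called out above, here is a minimal sketch of the kind of normalization meant: lowercase, trim surrounding whitespace, and drop a trailing slash. In Power BI you would do this in the query editor; the Python function below is just an assumption-free way to show the rule, and the function name is my own.

```python
def normalize_url(url: str) -> str:
    """Normalize a URL for joining: lowercase, trim surrounding
    whitespace, and drop a trailing slash."""
    url = url.strip().lower()
    if url.endswith("/"):
        url = url[:-1]
    return url

print(normalize_url("  https://Example.com/Page/ "))  # https://example.com/page
```

Whatever rule you pick, apply the identical rule to every table you plan to bridge, or the join will silently drop rows.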

In this example, I’m going to compare our conversions from paid search terms vs. rankings for organic keywords to find opportunities to optimize organic pages and augment paid spend.

Step 2: Create references of the tables you’re bridging

Create a reference of each table that you would like to bridge by right clicking the query and selecting reference from the dropdown. (In my example, I’ll be creating a reference of my Adwords table and my organic rankings table).

Why a reference and not a duplicate? A query reference will only refer the result of the query, whereas a duplicate will duplicate the code of the query, creating a new object in the memory.

Step 3: Remove other columns

Now that you’ve created references of both tables, you can remove all other columns in the reference queries except the columns you will be bridging. (In this case, the keyword column in the reference to the organic rankings query and the search term column in the Adwords reference query).

Right click the column you want to keep and select remove other columns from the dropdown menu.

Your reference queries should now only have one column each—the columns that will be combined into a bridge.

Step 4: Update all column headers to the same value

The next step is to ensure that the single column in each reference table has the same header value. Our following step is combining all of the tables together, and if your columns have different headers, Power BI will assume they are completely separate columns. This is case-sensitive!

For example, I’m going to update my “search term” column to “keyword” to match my organic rankings query. You can do this by double clicking the header and changing the value.

Step 5: Append all references to the bridge table

Now you’re going to choose which reference query will become your bridge. It doesn’t matter which one you choose, but I like to rename it to bridge to make things easy.

While your bridge query is selected, you will then select append queries in the top right of the home section in the query editor.

In the append window, select whether you’ll be appending two tables or 3+ tables—in this example we’re only appending two tables—and select the references that you’re appending. References will all have a (2) after the query name unless you renamed them.

If you did everything right you should still only see a single column in your bridge table. If you didn’t properly rename your headers you’ll have multiple columns in your bridge.

Step 6: De-duplicate your bridge

Next we’ll want to deduplicate our bridge of combined keywords and search terms by selecting remove rows > remove errors, remove blank rows, remove duplicates.

You can also right click on the reference query that did not become your bridge and deselect enable load. This removes the reference table from the query results that are available for report builder. (You won’t need this since all your data is in your bridge).

You can close out of the query editor and apply changes.

Step 7: Create a relationship between your bridge and data tables

The next step is to create a relationship between your tables and your bridge. Select manage relationships—depending on how you named your headers, you may already see that Power BI is trying to recognize connections between tables and has already created some relationships.

If you see some already-created relationships from Power BI’s autodetect, make sure you click and edit the cross filter direction if needed since it will automatically be set to single.

To create a new relationship, select new and then highlight the columns in each table that we should be matching. Cardinality should be many to one (*:1), since the search term column has many values and the bridge keyword column should have a single, unique value to join.

You can also view and create relationships in the relationship pane. A double-sided arrow indicates a “both” cross-filter direction for quick QA.

Now you’re ready to build your dashboard with connected data!

When using a bridge, make sure that you use the bridge column value whenever applicable (instead of data from one of the tables), since your bridge table contains one instance of every value from all of the many tables.

For example, when I build out my scatter plot to compare paid conversions to organic rankings, I’ll be pulling the keyword column from my bridge table.

We’re done! We’ve created a relationship between our paid search terms and organic keywords. Now we can analyze which search terms are converting and start to dig into how we’re ranking for the related organic keyword, then find opportunities to optimize or create new content.

Facebook is a local search engine. Are you treating it like one?

As soon as Facebook launched its Graph Search in 2013, it was only a matter of time before it became a big player in the search engine game.

But Graph Search was limited as far as optimization was concerned. The results it returned focused on answering questions about the relationships between people, places, and things on Facebook, rather than returning a link that likely contained the answer to your query, the way Google does.

It was an IFTTT statement on steroids that never missed a workout.

It wasn’t until 2015 that Facebook became a major player in keyword search.

A brief history on Facebook Search

2011, August: Facebook beta tests sponsored results in search feature

Facebook’s search was in its infancy at this time, so testing sponsored results may have been too soon: search functionality was limited to page titles, profiles, games, etc.

So you couldn’t target based on keywords quite yet, nor could you take advantage of a searcher’s affinity, which meant the only way to target was by targeting another page or product.

facebook beta tests sponsored results in its search box, 2011

Sponsored results were short-lived, as the feature turned into competitors relentlessly targeting competitors to steal traffic. In June 2013, less than a year after Sponsored Results’ formal release, they were terminated.

2013, January: Facebook launches Graph Search

Prior to 2013, Facebook Search had only returned results based on existing pages. There wasn’t much logic in the algorithm other than the name of the person or page you’re looking for in your query. That changed with Graph Search.

Graph Search took pages, categories, and people and put them into a revolving wheel of filters that was heavily personalized to each searcher and returned different results for everyone. It even allowed users to use natural language like, “friends near me that like DC Comics” and returned results for just that.

At least that’s what you would have had to search to find friends you should stay away from…

The two major flaws in this were:

  1. You had to already know what and whom you were looking for.
  2. You also had to hope that what you were looking for had a page or tag associated with it.

2014, December: Facebook adds keyword search functionality to Graph Search

This was the obvious next move by Facebook. They took what was already seemingly amazing at the time and added the ability to pull results based on keywords alone from your friends, people your friends have interacted with, or pages you like.

As far as a search engine was concerned, Facebook still had a long way to go to compete with Google.

2015, October: Facebook expands search to all 2 trillion public posts

Less than a year later, Facebook opens the floodgates in their search engine from inner-circle results to global, real-time, public news and posts. You could now search for any public post in their over 2 trillion indexed posts. Facebook now became a search engine that competed with the likes of Google for industry news and trending topics.

Viewing Facebook as a search engine

Prior to any real search functionality in Facebook, social media platforms were merely viewed as potential ranking signals and traffic boosters for your website.

Despite Matt Cutts denouncing the claim that Google uses social signals in its algorithm, Bing has gone in the opposite direction and is open about how it uses engagement metrics from social media.

Facebook has been a solid directory site for local businesses to have a business page since 2007, when the number of businesses listed on the platform was only about 100,000. It allowed businesses to be in front of one of the fastest growing social media networks in the world, but the discoverability of your business was limited to mutual connections and brand awareness.

It wasn’t until 2014, when Facebook launched a new and improved version of the 2011 Places directory, that local business search became a focal point for Facebook to compete with Yelp and Foursquare.

Now, when you search Facebook for a company near you, you’ll get results that are eerily similar to the local 3-pack on Google. If we know anything about local 3-packs, it’s that there’s definitely an algorithm behind them that determines which businesses get to show up and in which order.

Facebook sees over 1.5 billion searches every day, and over 600 million users visit business pages every day. It still has a ways to go to reach Google’s 3.5 billion searches per day. That said, claiming just over 40% of the search engine giant’s query volume — as a social media platform — isn’t anything to scoff at.

Why Facebook Search is important for local businesses

Facebook has provided a different means for customers to engage with brands than a typical search engine. But now the search engine has come to Facebook and the data shows people are using it. Not only to stalk their ex but also to find businesses.

  1. Facebook has a large user base using their search engine
  2. Local businesses can be discovered using search
  3. Local businesses are ranked using search

So I guess that means we should be optimizing our local business pages to rank higher in Facebook’s search for specific queries…

Sounds a lot like local SEO. But this time, it’s not about your website or Google.

The whole reason us SEOs are obsessed with SEO is the value and opportunity it holds when 74% of buying journeys start with search engines during the consideration stage. If no one used search engines, SEO wouldn’t be much of a big deal. But they do, and it is.

graph, "which of the following ways do you typically discover or find out about new products, brands, or services?"

According to a survey by Square, 52% of consumers discovered a brand through Facebook. That’s a pretty significant number to pass up.

And with the launch of Facebook Local in late 2017, the network is getting more discover-friendly.

Optimizing for local Facebook SEO

Facebook has caught up with typical search engines and started implementing keywords in its algorithm and database. Bundle that with the fact that 600 million users visit business pages and that Facebook alone sees a whopping 1.5 billion searches every day, and it doesn’t take a genius to understand that Facebook SEO is valuable for local business discoverability on the platform.

All we have to do is crack the code in optimizing Facebook business pages. Unfortunately, it seems Facebook is a bit more secretive than Google on what are and aren’t ranking signals in their local business Graph search.

It’s a matter of finding out why Facebook chose to rank Lightning Landscape & Irrigation over Dynamic Earth Lawn & Landscape, LLC when Dynamic Earth is both verified and closer.


In most of my research, Facebook tends to heavily weight posts and pages based on user engagement. But that doesn’t mean other ranking factors don’t exist in search. We’re likely looking at around 200 ranking signals, similar in number to Google’s, but also vastly different.

Trying to crack this code has led many to the idea of “optimizing your Facebook business page”, myself included. But most seem focused on optimizing Facebook pages to rank in other search engines rather than in Facebook itself.

While it is definitely a good idea to follow SEO best practices for the former reason, why not do both?

Facebook testing search ads

Coming into 2019, Facebook has started beta-testing search ads. They’re not keyword-based yet. Rather, they serve as extensions of news feed ads, and only on supported ad formats. It’s quite an improvement from the original search ads that were abandoned in 2013.

It’s the type of subtle testing that could definitely produce some useful analytics in pursuing a full-blown search ad feature with keywords.

Related: “Facebook is expanding into Search Ads. What will this mean?”

Facebook knows two things:

1) Ad space on news feeds is decreasing.

2) More and more people are using their search feature.

The fact that Facebook is testing this doesn’t really tell me anything about local SEO signals on the platform. But it does tell me even Facebook sees a real opportunity in their own search engine. And their analytics are probably better than what we have right now.


Without any solid advice from Facebook, I think it’s time for the SEO community to start thinking about organic brand exposure through the social media platform itself. We should start viewing it as an enhanced search engine as it continues to grow and improve its search features.

What’s more, without search analytics from Facebook, there really isn’t a lot we can do with regard to optimizing for placement. At least not right now.

I’d love to see their new search ads beta gain real traction and prompt Zuckerberg to consider a more SEO-friendly approach to search marketing on his own platform.

Of course, this is going to give “social media gurus” another reason to clog up my news feed with ads.

Jake Hundley is the founder of Evergrow Marketing.

The post Facebook is a local search engine. Are you treating it like one? appeared first on Search Engine Watch.

YouTube Expands Advertiser Access to 15-Second Non-Skippable Ads

Advertisers on Google Ads have historically had a few main video format options when launching auction-based YouTube campaigns:

  • Skippable TrueView in-stream ads: These appear before, during, or after YouTube videos. The advertiser pays if the user interacts with the video, watches 30 seconds of it, or watches the entire video ad (if it’s shorter than 30 seconds).
  • TrueView video discovery ads: These appear next to related YouTube videos, on the YouTube mobile homepage, or in YouTube search results. The advertiser pays when the user clicks on the video thumbnail.
  • Bumper ads: These are non-skippable, 6-second ads. The advertiser pays per 1,000 impressions.

Another format we see on YouTube fairly often is the 15-second non-skippable ad. These have only been available to advertisers via a reserved media placement on YouTube, which often involves a minimum spend and more work than the typical YouTube campaign. This format hasn’t been available to advertisers via the auction.

However, Google recently announced that it will expand these 15-second non-skippable ads to all advertisers participating in auctions through Google Ads and Display & Video 360.

Here’s an example of what those look like on YouTube:

What does this mean for advertisers?

With this update, advertisers now have another creative option and aren’t limited to just TrueView ads or non-skippable bumper ads. Google says that they want to make the buying experience more flexible and intuitive and give advertisers access to all video ad formats, regardless of how they buy.

In terms of pricing, advertisers will pay based on impressions (CPM bidding) and may see higher CPMs than for other YouTube ad formats because of the extended, non-skippable experience. The new format should be available to all advertisers in the Google Ads UI: when selecting the Video campaign type, the non-skippable format should appear as one of the options.

When should you use the new format?

Are you currently running a YouTube campaign, or getting ready to launch one in the future? Non-skippable ads are great to use when an advertiser wants to deliver an entire message to the viewer, but consider testing out the new format with bumper and TrueView In-Stream ads as well. Depending on the campaign goal and targeted audience, different creative lengths and viewer experiences can lead to different results.

Additionally, since CPMs will likely be higher for this format, advertisers should make sure that their message is clear and compelling and that their targeting methods are reaching viewers who fit their target audience.
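For readers budgeting the new format, CPM billing is simple arithmetic: you pay a fixed rate per 1,000 impressions served. A minimal sketch, using purely hypothetical figures rather than actual Google rates:

```python
def campaign_cost(impressions: int, cpm: float) -> float:
    """Cost of a CPM-billed campaign: the advertiser pays `cpm` per 1,000 impressions."""
    return impressions / 1000 * cpm

# Hypothetical example: 250,000 impressions at a $12 CPM
print(campaign_cost(250_000, 12.0))  # 3000.0
```

The same formula makes it easy to compare formats: if the non-skippable format's CPM runs higher than a bumper ad's, the impression budget buys proportionally less reach.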

Looking for more tips and tricks on running YouTube campaigns, optimizing videos for search, or tracking YouTube success? Check out our full YouTube Video Marketing Guide!