New AdWords Features: How To Stay Up To Date

Paid Search, and Google AdWords in particular, moves fast. Just when you think your account build is complete now that Customer Match lists and Dynamic Search Ads have been added, you open Twitter and find a new feature you didn’t even know about yet! Keeping up to speed on changes is a big part of the job. Here are the go-to sites to check out so that you’re never in the dark.

Search Engine Land
“When You Need to Know Right Now”

There are tons of news sites out there that cover Paid Search, but I always turn to Search Engine Land. Ginny Marvin does an excellent job of keeping readers abreast of changes in AdWords minutiae, like in her recent piece on changes to the mobile product card unit ads.

Google AdWords Official Blog
“When You Need to Learn More About a Product Announcement”

Inside AdWords, the official blog for Google AdWords, has been around for years on the Blogspot platform. This is the spot to learn about official product announcements and features.

Google AdWords New Features Page
“When You Need a Recap”

AdWords makes updates basically all the time. The running news items can start to feel jumbled in your head. Thankfully, the product team recognized this and launched a New Features section in the Fall of 2016. This helpful page gives you a timeline of expanded features and a link to the support documentation for each item. Invaluable!

LunaMetrics
“When You Need a Deep Dive”

Shameless self promotion here. We have two main ways that we help companies that use AdWords learn more and get value out of the tool. This applies if you manage AdWords yourself or if you lean on an agency to help manage your accounts.

LunaMetrics Blog

We don’t write up-to-the-minute news about AdWords changes on the LunaMetrics blog as there are better outlets around for that. We do have a knack for instructive posts that exhaust a particular topic. Here are a few highlights:

Google AdWords Search Term Report Action Items

Learn how to effectively use the AdWords Search Term Report to cut wasteful spending and optimize your AdWords campaigns by asking yourself some questions.

Increase Revenue with Strategic Audiences in Google Analytics & Google AdWords

Learn how to create & bid on targeted audiences to reduce wasted spend and increase revenue in this definitive guide. Maximize your AdWords spend today!

Digging Deeper with Attribution Reports in Google AdWords

Not sure about the tools available for attribution modeling and analysis in Google AdWords? Check out our handy guide to attribution reports in AdWords!

Google AdWords Cross-Device Reporting & Attribution

Visit the AdWords attribution section more frequently. The device attribution reports there are a valuable resource. Let’s look at three device reports.

LunaMetrics In-Person Training

Need more? We’ve got our two-day Google AdWords training that we hold around the country to both ease people into AdWords and make sure they’ve got a strong footing when they leave. There are many other trainings out there, so you’ll understandably want to shop around and compare reviews and agendas to make sure the course you take is worth the investment. We don’t offer online courses, but we certainly recognize that there are pros and cons to both online and offline courses.

As a trainer, I enjoy taking my years of experience with clients big and small and using that to help people see the potential in AdWords for their unique business. No two courses are exactly the same, and it’s great to watch attendees making business connections and sharing tips during the breaks and workshop segments of the class.

Here are a few reviews we’ve received:

“The course was rich, relevant and clearly up-to-date on the latest Google AdWords topics and settings. I appreciated the examples, discussions and answers provided during the session.”

—Boston – Google AdWords 101

“So much information! I knew that there was stuff that I did not know but now I definitely am well-versed in all that’s available—now I have some practicing to do!”

—Chicago – Google AdWords 201

Google AdWords Best Practices Guide
“When You Need a Refresher”

This one is a favorite of those who have already attended the LunaMetrics AdWords training seminars. This single Best Practices page holds some of the best planning tools out there for your AdWords account.

When you pop open any topic on the page, you get a great list of guides and checklists that can all be downloaded as PDFs for you to work through. Super handy!

Think With Google
“When You Need to Be Inspired”

Going beyond the tool itself and the latest product updates, it’s important to actually see successes with the different features. Think with Google is my personal favorite site on today’s list. Packed with case studies, videos, downloads and ideas, this site is where Google docks its best advertising efforts and biggest client successes. For example, see what big brands are doing with RLSA and get inspired to try your own Remarketing Lists for Search Ads campaigns.


Google Analytics API v4: Histogram Buckets

Back in April of last year, Google released version 4 of their reporting API. One of the new features they’ve added is the ability to request histogram buckets straight from Google, instead of binning the data yourself. Histograms allow you to examine the underlying frequency distribution of a set of data, which can help you make better decisions with your data. They’re perfect for answering questions like:

  • Do most sessions take about the same amount of time to complete, or are there distinct groups?
  • What is the relationship between session count and transactions per user?

How It Really Works

Here’s how to use this new Histogram feature yourself with the API.

Note: we’re assuming you’ve got the technical chops to handle authorizing access to your own data and issuing the requests to the API.

Here’s what a typical query looks like with the new version of the API:

{
  "reportRequests": [
    {
      "viewId": "VIEW_ID",
      "dateRanges": [
        { "startDate": "30daysAgo", "endDate": "yesterday" }
      ],
      "metrics": [
        { "expression": "ga:users" }
      ],
      "dimensions": [
        { "name": "ga:hour" }
      ],
      "orderBys": [
        { "fieldName": "ga:hour", "sortOrder": "ASCENDING" }
      ]
    }
  ]
}
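As a point of reference, here’s a minimal sketch of issuing that request with Node’s built-in fetch. The access token and view ID are placeholders; per the note above, handling authorization is up to you.

```javascript
// Minimal sketch: POST a reportRequests body to the v4 Reporting API.
// 'VIEW_ID' and the access token are placeholders; obtaining the token
// is part of the authorization step that's out of scope here.
const requestBody = {
  reportRequests: [{
    viewId: 'VIEW_ID',
    dateRanges: [{ startDate: '30daysAgo', endDate: 'yesterday' }],
    metrics: [{ expression: 'ga:users' }],
    dimensions: [{ name: 'ga:hour' }],
    orderBys: [{ fieldName: 'ga:hour', sortOrder: 'ASCENDING' }]
  }]
};

async function runReport(accessToken) {
  const res = await fetch(
    'https://analyticsreporting.googleapis.com/v4/reports:batchGet',
    {
      method: 'POST',
      headers: {
        'Authorization': 'Bearer ' + accessToken,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify(requestBody)
    }
  );
  return res.json();
}
```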

This query will return a row for each hour, with the number of users that generated a session during that hour for each row; simplified, it’d be something like this:


[
  ['0', 100],
  ['1', 100],
  ['2', 100],
  ['3', 110],
  ['4', 120],
  ['5', 140],
  ['6', 220],
  ['7', 300],
  ...
]

Wouldn’t this data be more useful if it were dayparted? Let’s use the histogram feature to bucket our data into traditional TV dayparts:

Early Morning 6:00 AM – 10:00 AM
Daytime 10:00 AM – 5:00 PM
Early Fringe 5:00 PM – 8:00 PM
Prime Time 8:00 PM – 11:00 PM
Late News 11:00 PM – 12:00 AM
Late Fringe 12:00 AM – 1:00 AM
Post Late Fringe 1:00 AM – 2:00 AM
Graveyard 2:00 AM – 6:00 AM

To request our data be returned in these new buckets, we’ll need to make two modifications to our earlier query. The first change is to add a histogramBuckets array to the ga:hour object in our dimensions array. We’ll populate this with ["0", "2", "6", "10", "17", "20", "22", "24"]. Each number in this sequence marks the beginning of a new histogram bin.

The end of each bin is inferred from the number that follows it, and if values exist below the first bin’s minimum, an additional bin will be tacked on at the beginning to contain those values. For example, if we had started our histogramBuckets with "2" instead of "0", the API would add a new bucket named "<2" to the beginning, and it would contain the values for matching rows where the ga:hour dimension was 0 or 1. The second change we need to make is to add "orderType": "HISTOGRAM_BUCKET" to the orderBys portion of our request.

{
  "reportRequests": [
    {
      "viewId": "70570703",
      "dateRanges": [
        { "startDate": "30daysAgo", "endDate": "yesterday" }
      ],
      "metrics": [
        { "expression": "ga:users" }
      ],
      "dimensions": [
        {
          "name": "ga:hour",
          "histogramBuckets": [ "0", "2", "6", "10", "17", "20", "22", "24" ]
        }
      ],
      "orderBys": [
        {
          "fieldName": "ga:hour",
          "orderType": "HISTOGRAM_BUCKET",
          "sortOrder": "ASCENDING"
        }
      ]
    }
  ]
}

Here’s what the response for that query looks like for some data from a personal site:

{
  "reports": [
    {
      "columnHeader": {
        "dimensions": [ "ga:hour" ],
        "metricHeader": {
          "metricHeaderEntries": [
            { "name": "ga:users", "type": "INTEGER" }
          ]
        }
      },
      "data": {
        "rows": [
          { "dimensions": [ "0-1" ], "metrics": [ { "values": [ "31" ] } ] },
          { "dimensions": [ "2-5" ], "metrics": [ { "values": [ "113" ] } ] },
          { "dimensions": [ "6-9" ], "metrics": [ { "values": [ "155" ] } ] },
          { "dimensions": [ "10-16" ], "metrics": [ { "values": [ "247" ] } ] },
          { "dimensions": [ "17-19" ], "metrics": [ { "values": [ "52" ] } ] },
          { "dimensions": [ "20-21" ], "metrics": [ { "values": [ "25" ] } ] },
          { "dimensions": [ "22-23" ], "metrics": [ { "values": [ "21" ] } ] }
        ],
        "totals": [ { "values": [ "644" ] } ],
        "rowCount": 7,
        "minimums": [ { "values": [ "21" ] } ],
        "maximums": [ { "values": [ "247" ] } ],
        "isDataGolden": true
      }
    }
  ],
  "queryCost": 1
}
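The row labels (“0-1”, “2-5”, and so on) follow mechanically from the boundaries supplied in the request. Here’s a quick sketch of that labeling rule; the “<2” and open-ended “+” label formats are assumptions based on the behavior described above:

```javascript
// Sketch of the bucketing rule: each boundary starts a bin, a bin's end is
// inferred from the next boundary, and values below the first boundary fall
// into an implicit "<first" bin.
function binLabel(value, boundaries) {
  var nums = boundaries.map(Number);
  if (value < nums[0]) return '<' + nums[0];
  for (var i = nums.length - 1; i >= 0; i--) {
    if (value >= nums[i]) {
      if (i === nums.length - 1) return nums[i] + '+'; // open-ended top bin
      return nums[i] + '-' + (nums[i + 1] - 1);
    }
  }
}

binLabel(1, ['0', '2', '6']); // → "0-1"
binLabel(1, ['2', '6']);      // → "<2"
```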

Some Downsides

As of this writing, the chief advantage of this feature is that it can save you a little logic and time when your application needs histograms of your Google Analytics data. There’s no “give me X buckets” option, though – you have to know the range of your data ahead of time. Additionally, data is coerced into an integer, so floats are out.

That means if you want to generate bins dynamically (like we’re doing in our example), you need to first get the range of the data from Google Analytics, then calculate those buckets and send a second request. You may wish to simply request the raw data and calculate the histogram yourself.

Hopefully Google will add some more functionality to this feature to simplify dynamic binning, too. I’d also welcome the ability to create histograms within the Google Analytics interface! Hopefully this API feature is a sign that something like that is in the works.

There are a limited set of dimensions that can be queried in this manner; here’s a complete list:

Count of Sessions (ga:sessionCount)
Days Since Last Session (ga:daysSinceLastSession)
Session Duration (ga:sessionDurationBucket)
Days to Transaction (ga:daysToTransaction)
Year (ga:year)
Month of the Year (ga:month)
Week of the Year (ga:week)
Day of the Month (ga:day)
Hour (ga:hour)
Minute (ga:minute)
Month Index (ga:nthMonth)
Week Index (ga:nthWeek)
Day Index (ga:nthDay)
Minute Index (ga:nthMinute)
ISO Week of the Year (ga:isoWeek)
ISO Year (ga:isoYear)
Hour Index (ga:nthHour)
Any Custom Dimension (ga:dimensionX, where X is the Custom Dimension index)

Great Example Use Cases

Wondering how you might use this feature? Here are some more examples to get your juices flowing:

  • Use Events to capture more accurate page load times and store the time in the label, then bin the times using the API.
  • Capture blog publish dates and see when blog posts peak in engagement.
  • Look at months and transactions to identify seasonality.
  • Compare Session Count and Revenue to see, in general, the number of sessions required to drive your highest revenue.

Have a clever use case of your own? Let me know about it in the comments.


Email Outreach Tips: +22 Link Building Email Templates

Content-based and relationship-driven links are apparently the ones that make a huge impact in today’s digital marketing (not just SEO).

Outreach accounts for the majority of these link acquisitions – which I can personally attest to as it is evident on most of the campaigns we’ve handled.

Long gone are the days of solely winning through submission-type link building tactics.

The post Email Outreach Tips: +22 Link Building Email Templates appeared first on Kaiserthesage.


A Developer’s Guide To Implementing The Data Layer

This is a post written for developers. If you’re not a developer or you do not have access to make changes to the source code of the site that you’d like to add initial dataLayer values to, forward this post to the appropriate persons. Seriously, this post is for developers only. Get out. Go.

Hello Developer!

I understand you recently received an (email|task|ticket|request|sticky note) asking YOU to implement something called a ‘data layer’ on your site, possibly with some details about what it should include. This guide is meant to flesh out that request so you’ll know what you’re doing.

Before We Talk Data Layer

In order to set this up, you need a Google Tag Manager or Google Optimize snippet (or potentially both, if your team has asked for both products to be deployed) from the team that submitted the request. If you don’t have that, request it from your team and cool your heels until they send them over. Both of these tools can use the same data layer!

The snippets look similar to what’s below, but will have special IDs for your organization instead of FOO, which I’ve used as a placeholder. These snippets must go as high in the head of your page as possible.

Google Optimize Snippet

Google Optimize is a great free/paid A/B testing tool from Google that allows you to create and run experiments on your website. Before you can start using it, however, you need to install it on your website. Need this? Check out the Google Optimize installation instructions for more details.

Here’s generally what it will look like, and you’ll need to update certain values.


Google Tag Manager Snippet

Google Tag Manager is a tool that makes it easier to add tags and tracking to your site for analytics, advertising, SEO fixes, you name it! It’s our preferred method for adding Google Analytics to a page. Learn more here about how Google Tag Manager and Google Analytics work together.

Just like Google Optimize, a developer will need to add this to your website before you can start using the tool. Here are detailed instructions for Google Tag Manager installation.


Note: To keep it simple, the vast majority of websites can safely ignore the GTM iframe snippet. More on what that is here.

Why Do We Need A Data Layer?

The above instructions and links are all that you need to successfully install and start using tools like Google Optimize and Google Tag Manager. These tools will load in the browser for the user viewing your site and begin performing the tasks they were instructed to do, whether that’s tracking clicks on PDFs or showing two different versions of a headline. Great!

Both of these tools use information from the page itself, along with information entered manually into the tool, to make decisions and share information with other tools.

Making decisions comes in the form of deciding when to Trigger certain tags inside of Google Tag Manager, or determining when an experiment is shown in Google Optimize. Sharing data with other tools can be sending information about your page or users to another tool, like Google Analytics, Google AdWords, or third-party tools.

We use the data layer to bring information from your server into that process, so it can help us make decisions or share data with other tools.

Depending on what platform you use to host your website, you likely have a wealth of information on the server about the content of your pages and the users that are accessing it. The data layer is how we make that information available to tools like Google Tag Manager and Google Optimize so they can be used easily. The setup on the backend will be specific to your platform, but the output will be standardized.

The Data Layer Snippet

Immediately before the GTM and/or Optimize snippets, place this code:

var dataLayer = window.dataLayer = window.dataLayer || [];
dataLayer.push({
  key: 'value'
  ...
});

Replace the ellipses with data from your backend. The team that requested the change should be able to tell you what data it is that they need. You will need to extract that data from your system and populate it into the value here. This data will appear as key/value pairs. The keys can be named almost anything you like, and the values should be dynamic. Important rules here:

  1. It must be on the page when the browser receives the initial response from the server. Absolutely no AJAX.
  2. It must not be edited via code after the snippets. No going back and adding stuff – the data must be in place at the time the page is sent to the user.
  3. It must always appear ABOVE the GTM and/or Optimize snippets.
  4. If no data is required for a given page, the Data Layer snippet can be omitted – both snippets will see that it isn’t present and initialize a blank one of their own.

Placing Values on the Data Layer

This part gets a little tricky. As a website developer, you know best how to get the data out of your platform and echo it onto the page. I can’t really help you here. Here’s an example from a post that we wrote about pulling information from WordPress to create date range cohorts for content, but every platform will be unique.
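That said, the general shape is similar everywhere: serialize your backend values and print them into the snippet. As a hypothetical, platform-agnostic sketch (the helper name and its input are made up for illustration):

```javascript
// Hypothetical helper: turn backend values into the on-page data layer block.
// How you gather `values` is specific to your platform (CMS, framework, etc.).
function renderDataLayerScript(values) {
  return '<script>\n' +
    'var dataLayer = window.dataLayer = window.dataLayer || [];\n' +
    'dataLayer.push(' + JSON.stringify(values) + ');\n' +
    '</script>';
}

// e.g. values pulled from your CMS for the current page:
var block = renderDataLayerScript({
  page: { category1: 'Auto', wordCount: 40 }
});
```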

The part I can help with is what it should look like on the page!

Here’s a complete example with a data layer and Google Tag Manager snippet, based on an implementation we used with a client:

var dataLayer = window.dataLayer = window.dataLayer || [];
dataLayer.push({
  page: {
    category1: 'Auto',
    category2: 'Life hacks',
    platform: 'Foo',
    wordCount: 40,
    length: 400
  },
  user: {
    backendId: '20d75b5c-5143-11e7-b114-b2f933d5fe66'
  },
  site: {
    owner: 'bar'
  },
  session: {
    status: 'anonymous',
    checkedOut: false
  }
});

Wrapping It All Up

So there you have it – information is taken from your server and added to the page in the correct format. By following these instructions, your team members using Google Tag Manager and Google Optimize will be able to use the information you’ve given them in a variety of ways.

Want a visual guide? Here’s a handy example of taking Category information from a blog post (this one!), storing it on the data layer, accessing it in Google Tag Manager and Google Optimize, then seeing the final result.

Frequently Asked Questions

There’s a lot of documentation around these items, and a lot of our recommendations have come from our consulting experience with customers.

Q: Can’t I just do dataLayer = [{}]?

A: I don’t have enough fingers and toes to count the number of times that sloppy instantiation like that has led to data loss. Just use my syntax. Why? Not checking for an existing variable (like above) can overwrite the reference that GTM depends on, and it can be a real pain to troubleshoot, and it happens all. The. Time. Please, please just take my advice and don’t get cute. Remember: if it breaks and you changed it, who’s going to get yelled at? You. It’s going to be you. And I can be very loud.
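To see why the safe syntax matters, here’s a small simulation (a plain object stands in for the browser’s window) showing how re-assigning dataLayer orphans the array reference that GTM captured:

```javascript
// Simulate the page: a plain object stands in for window.
var window = {};

// Safe pattern: reuse the existing array if one is already present.
var dataLayer = window.dataLayer = window.dataLayer || [];

// When GTM's snippet runs, it grabs a reference to that same array.
var gtmReference = window.dataLayer;

// Pushes through the safe pattern stay visible to GTM.
dataLayer.push({ event: 'safePush' });
console.log(gtmReference.length); // 1: GTM sees the push

// Sloppy pattern: `dataLayer = [{...}]` creates a brand-new array.
window.dataLayer = [{ event: 'lostPush' }];
console.log(gtmReference.length); // still 1: 'lostPush' never reaches GTM
```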

Q: What kind of things go in the dataLayer part?

A: Here’s a great post from my very talented colleague Dorcas Alexander that lists off a bunch of ideas for inspiration. Again, think about any information that will be helpful for making decisions or that may need to be shared with another tool.

Q: But what’s actually going on under the hood? I want to know how it works.

A: Great! Here’s a laundry list of blog posts to go through; they’ll teach you just about everything you’ll need.

Still have questions? Is this guide incomplete? Sound off in the comments below and I’ll take a look.


How to Plan a Successful PPC Campaign in 2019


We at SEMrush constantly monitor Google’s updates, as the search engine rarely stands still. While testing the updates that are most likely to impact advertisers, we have figured out the most important factors that will help increase the efficiency of your Google Ads campaigns in 2019. And we also explored some ways you can benefit from them.


Get Free Advertising For Your Nonprofit with Google Grants

For the majority of nonprofits, having a significant advertising or marketing budget is not possible. Therefore, these organizations that give back to the community, the world and/or the greater good need to take full advantage of each and every opportunity that comes their way.

Well nonprofits, you may be in luck! With Google Ad Grants, a nonprofit organization can apply for $10,000 per month in free spend from Google! This money – which amounts to more than $100K per year – is used in the Google AdWords platform to run paid search campaigns.

Are you a nonprofit that could use that extra advertising budget? If so, listen up!

Get Started

This is how you can get started.

  1. Check Criteria
  2. Submit Google for Nonprofits Application
  3. Create a Google AdWords account
  4. Enroll in Google Ad Grants

1. Check Criteria

Review Google’s eligibility criteria. Your organization must be based in one of the approved countries, hold valid charity status, and meet several other criteria. Some organizations are excluded, such as government organizations and hospitals. If you are eligible, move on to the next step.

2. Submit Google for Nonprofits Application

Next, submit the Google For Nonprofits application. Get ready to input organization information (address, mission statement, EIN, etc.). I was told by a Google Rep that the application approval process can take up to 20 days. However, the last application I submitted took a fraction of that time.

3. Create a Google AdWords Account

While you are waiting to hear if your application was approved, create a Google AdWords account. When building an account prior to the Google Grants approval, you have two options.

Option #1: Build out your full account. Do keyword research, build campaigns, write ads, etc.

Option #2: Build a shell of an account – the minimum needed to get Google Grants approval. Once you know that you are approved, go back and finish building out the account. To do this, you must have the following in your account:

  • At least one campaign
  • At least one keyword
  • At least one ad copy
  • Daily budget of $329
  • Network Type of “Search Network Only”
  • Not opted into Search Partners
  • Do not input any billing information

4. Enroll in Google Ad Grants

If your application is approved, navigate back to Google For Nonprofits. Sign in using the email address associated with the account you created earlier. You will now have the option to enroll in Google Ad Grants. Submit that application and insert the Customer ID of the AdWords account you just created. It may take up to 10 days to get Grants status for your AdWords account.

Though I’ve outlined the steps above, you can also find instructions in the How To Apply section of the Google Ad Grants website.

Building Your Account

Now onto building your campaigns. Here are some important things to know.

  • Your overall campaign can spend ~$330 per day
  • Google has a bid limit of $2 for Grants accounts. You cannot exceed a $2 max cost per click
  • Your ads can only run on the Search Network. Google Grants accounts are not allowed to serve on the Display network (that includes Display Remarketing)
  • Ads will only run on the Google Search results, not the Google Search Partner network

There is also a minimum level of management activity required to maintain $10K in monthly spend. Advertisers must:

  • Log in at least once a month
  • Make changes at least once every 90 days

Managing Your Account

The management style for a Google Grants account tends to be different than that of a typical AdWords account. One of the major differences is the fact that an advertiser doesn’t have control over their bids above and beyond $2. Now, let’s take a step back and review how clicks, bids and cost work together for this type of account.

  • With $10K in free money, an advertiser is incentivized to spend as much as possible
  • How does an advertiser accrue cost? By getting clicks
  • How do you spend more and drive more traffic to your website? By getting more clicks!
  • What’s one important way to drive more click volume? By being able to control your bids. Bidding up is one way to spend more and drive more clicks to your website (generally speaking)

Without full control over bidding, an advertiser needs to get creative and find other ways to move the needle and drive more click traffic. We know (or if you don’t, you will learn now!) that Google rewards advertisers who have high quality, relevant ads – i.e. strong quality scores. By improving quality score and providing searchers with an optimal user experience, it is possible for an advertiser to pay less for a higher ad position. And hopefully that leads to more clicks for your Google Grants account!

Here are some tips for getting the most out of your Grants account:

Tip #1 – Improve CTR

Make it your job to improve CTR in any way you can. Test and then test some more. Come up with multiple ad copy ideas and rotate them in as you declare “winners” from your older ad copy tests.

Ad extensions are another great way to improve CTR – and it helps to improve the relevancy of your ads, take up more space in the SERP and enhance your message. Think about using sitelinks, callout extensions, structured snippets, review extensions, call extensions, location extensions, etc.

Tip #2 – Think About Your Match Types

In a non-Grants account, I prefer to stick to exact, phrase and modified-broad keyword match types. With a Google Grants account, I tend to include a greater number of broad match terms. Broad match terms can help you match to other terms that you haven’t necessarily thought about. Broad match types can also help to expand your reach, especially with lower bids.

Tip #3 – Remember Your Search Query Reporting

This is a great tip for all AdWords accounts. Make sure you check your search term reporting. This will tell you which search queries are triggering your ads. From there, you can decide if you need to suppress keywords or add new ones.

Tip #4 – Landing Pages

Don’t forget about your landing pages! Your quality score is not only tied to your keywords, but your landing pages as well. Check your landing page quality score and if it’s low, try testing. Remember, by improving the quality score it is possible to pay a lower cost per click for a better ad position.

Continued Learning

Want to learn even more about managing a Google Ad Grants account? Many of our consultants have experience doing just that! To get information on consulting or to sign up for an AdWords training course, visit our services and training pages.


Comparing Google Analytics 360 and Google Analytics

Seamless Integrations

With Google Analytics 360, you can enjoy integrations with other products in the Google Marketing Platform and beyond, including Google Ads, Google Display & Video 360, Google Optimize, and turnkey integrations with Salesforce Marketing Cloud and Sales Cloud, to name a few. Google Analytics allows importing data from custom data sources, including advertising cost data, product or content data, and more. Google Analytics 360 enhances this functionality with a feature called query-time import: we can view newly imported data alongside our historic data in the interface, even if the import was changed or updated after the data was originally collected.

Shared Audiences

In Google Marketing Platform, “audiences” are how we pass collections of users between tools — like sharing a Google Analytics audience with Google Ads, Google Display & Video 360, or Google Optimize. While there are many ways to accomplish the same objective, using simple audience definitions in Google Analytics can improve your flexibility and accuracy when remarketing to users through Google Ads. By keeping each audience definition modular and relying on tool-specific features, you can avoid situations that waste your money and annoy users.

Hit-Level Data

Google Analytics 360 allows hit-level data to be automatically passed from Google Analytics into Google BigQuery, Google’s big data storage and querying tool. This allows for more complex analysis, or can act as a conduit for exporting more granular data into a data warehouse. For companies looking to answer challenging questions with their data, like defining the customer journey, looking at user behavior across sessions, or joining together external data sources, Google BigQuery is the tool you will be most excited about.

Reporting Advancements

There are a number of advanced reports available only to Google Analytics 360 customers including Advanced Analysis, Data-Driven Attribution and more. There are also enhanced custom funnel reporting options available, enabling better flow reporting for on-site actions across users and sessions. This feature is great for visualizing multiple conversion paths through your website.

Full Services with a Certified Partner

When you purchase Google Analytics 360 through a sales partner like us, we work with you to make sure you get the most value out of Google Marketing Platform. Not only do you get our top-of-the-line services and support, you also get our market-leading expertise.


The Value of Google BigQuery and Google Analytics 360

The integration between Google Analytics 360 and BigQuery is perhaps the most empowering feature in all of web analytics. (There, I said it!) Its hit-level data and cloud-based infrastructure give BigQuery analysis capabilities not found in other web analytics platforms, free or paid. BigQuery can be the link between third-party data and marketing analytics data. It is the facilitator of advanced data science techniques, and is our preferred data source for visualization.

At LunaMetrics, we work with organizations in a variety of industries that are looking for ways to find deeper insight in their data and turn to Google Analytics 360 for a variety of reasons. For many, the link to BigQuery is the reason they adopted the platform, and in some cases they use BigQuery more than the interface itself.

The Google Analytics 360-to-BigQuery integration serves three primary purposes: querying raw data, connecting with other data sources, and exporting data for visualization. Each capability builds off of the last, and each feature further extends all that Google Analytics can do.

Querying Raw Data

BigQuery is a database, hosted in the cloud. With your subscription to Google Analytics 360, your Analytics data is exported, hit by hit, into BigQuery for you to query, just as you would query a SQL database. The data that comes into BigQuery is raw, hit-level data.

By comparison, inside the Google Analytics interface the data you see is session-based and aggregated. That’s fine for simple marketing questions we might have. For example, in Google Analytics we can easily count the number of sessions that came from a mobile device. But if we wanted to count the number of video play events by a particular user, across multiple sessions, that would be much more difficult to answer.

The Google Analytics interface is relatively easy to use and has a number of tools to make it easy to perform on-the-fly analysis. In order to keep the interface as fast as possible, there are certain limitations in the ways you can access your data and how much you can customize the interface.

This is where BigQuery really shines. You’re using the same underlying data as Google Analytics, but you don’t have the same limitations. Let’s look at some of the limitations that BigQuery overcomes.

Sampling, What Sampling?

One of the most noticeable limitations within the Google Analytics interface is “sampling,” which will kick in when you try to run a complicated or customized report, or request a large date range. When this happens, Google extrapolates the data you see by counting only some of your data points and modeling the rest. The result is that the data you are looking at is, in some cases, merely an approximation.

For the majority of users and queries within the interface, this translates to a better user experience and more immediate results. Sampling thresholds also improve when you move from the free version to Google Analytics 360: in the free version, sampling typically kicks in on non-standard queries when there are over 500K sessions at the property level. With GA 360, that threshold increases to 100M sessions at the view level.

While there are ways to combat these sampling limitations in the interface, BigQuery lets us circumvent sampling entirely. We have all of the raw data, so we can query it, slice it, and dice it any which way, and still have 100% real data in our results.

Focusing On Users Instead of Sessions

In the Google Analytics interface, goals and goal funnels are session-based: if a person takes multiple visits to complete a task, you won’t see behavior from their earlier sessions in the goal funnel. BigQuery has the data and the capability to let you surface that information. Likewise, if you’d like to see a goal conversion rate by users instead (say, what percentage of my users filled out this form?), you’d need to use BigQuery.
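As a sketch, a user-based conversion rate can be computed directly from the export. This assumes a hypothetical dataset name and a hypothetical thank-you page at `/thank-you`:

```sql
-- What percentage of users reached the form's thank-you page?
SELECT
  COUNT(DISTINCT IF(h.page.pagePath = '/thank-you', fullVisitorId, NULL))
    / COUNT(DISTINCT fullVisitorId) AS user_conversion_rate
FROM
  `my-project.my_dataset.ga_sessions_*`,
  UNNEST(hits) AS h
WHERE
  _TABLE_SUFFIX BETWEEN '20170101' AND '20170131'
```

The numerator counts distinct users who ever hit the page, across any number of sessions; the denominator counts all distinct users in the date range.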

This also has a big impact for ecommerce-focused companies. If you use the Enhanced Ecommerce reports inside Google Analytics’ interface, you know that those reports are session-based. But in many real-world scenarios, a person might add an item to their cart in one visit and wait to complete the purchase in another visit. BigQuery allows you to see purchasing behavior from users who take more than one session to pull the trigger.
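One way to surface that behavior is to self-join ecommerce actions by user. In the export schema, `hits.eCommerceAction.action_type` is '3' for an add-to-cart and '6' for a purchase; the dataset name below is a placeholder:

```sql
-- Users who added to cart in one session and purchased in a later one
WITH actions AS (
  SELECT
    fullVisitorId,
    visitStartTime,
    h.eCommerceAction.action_type AS action_type
  FROM `my-project.my_dataset.ga_sessions_*`, UNNEST(hits) AS h
  WHERE _TABLE_SUFFIX BETWEEN '20170101' AND '20170331'
)
SELECT
  COUNT(DISTINCT a.fullVisitorId) AS cross_session_buyers
FROM actions AS a
JOIN actions AS b
  ON a.fullVisitorId = b.fullVisitorId
WHERE a.action_type = '3'                  -- add to cart
  AND b.action_type = '6'                  -- purchase
  AND b.visitStartTime > a.visitStartTime  -- purchase came later
```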

“But what about using User segments in Google Analytics?” I can hear some of you asking. User-based segments in the interface are restricted to a 90-day lookback window; if the interaction you’re looking for occurred before that, you can’t see it at all. BigQuery’s data never goes away. You can create complex segmentation rules going back in time as far as you like, or even create dynamic segments where one specific behavior is a requirement for another – imagine using variables in segments. You can do that in BigQuery.

Combining Dimensions

Google Analytics’ data model is structured so that session-based dimensions (like source/medium) don’t play well with user-level or page-level dimensions and metrics. And there’s a limit to the number of dimensions we can see side-by-side: two dimensions is usually our limit in the interface, five in custom reports, and the API allows for seven. With BigQuery, there are no such limitations.

For example, ecommerce companies may have trouble pulling exact counts of the users from social media who viewed a product page and then subsequently purchased that same product. BigQuery users can handle that with a single query.
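Here is a sketch of that single query, using the same placeholder dataset convention and treating a medium of 'social' as your social traffic (your channel definitions may differ):

```sql
-- Users from social who viewed a product page and later bought that product
WITH product_views AS (
  SELECT fullVisitorId, p.productSKU
  FROM `my-project.my_dataset.ga_sessions_*`,
    UNNEST(hits) AS h, UNNEST(h.product) AS p
  WHERE _TABLE_SUFFIX BETWEEN '20170101' AND '20170331'
    AND trafficSource.medium = 'social'
    AND h.eCommerceAction.action_type = '2'   -- product detail view
),
purchases AS (
  SELECT fullVisitorId, p.productSKU
  FROM `my-project.my_dataset.ga_sessions_*`,
    UNNEST(hits) AS h, UNNEST(h.product) AS p
  WHERE _TABLE_SUFFIX BETWEEN '20170101' AND '20170331'
    AND h.eCommerceAction.action_type = '6'   -- purchase
)
SELECT COUNT(DISTINCT v.fullVisitorId) AS social_product_buyers
FROM product_views AS v
JOIN purchases AS pu
  ON v.fullVisitorId = pu.fullVisitorId
  AND v.productSKU = pu.productSKU            -- same product, any session
```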

Going Forward By Going Backwards

Google Analytics has a very specific way that it processes data and your configuration settings, like Goals and Filters. As data is collected from your site or app, GA will apply those settings and store the finished, altered data for you to use in your reports later on. Because of this, it’s not possible to go backwards and change your data in Google Analytics.

However, with BigQuery you can essentially rewrite history! If you made a tracking error in the past, and you’d like to filter out or modify data (page paths, events, entire sessions), you can dynamically adjust your queries in BigQuery to account for those issues.

Goals in particular can cause frustration in the GA interface – you might decide after the fact that you wanted a Goal for some key action on your site, or find a mistake in a Goal’s setup. In the GA interface, goals only work moving forward; with BigQuery, we can write a query to calculate goal completions based on past data. Furthermore, we can add complicated funnels or required actions, even extending across sessions, that must occur for a Goal to be considered complete.
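For instance, a retroactive destination goal is just a filter on historical pageviews. This sketch assumes a placeholder dataset and a hypothetical confirmation page at `/signup/complete`:

```sql
-- Retroactive "destination" goal: daily sessions that reached the page
SELECT
  date,
  COUNT(DISTINCT CONCAT(fullVisitorId, '-', CAST(visitId AS STRING)))
    AS goal_completions
FROM `my-project.my_dataset.ga_sessions_*`, UNNEST(hits) AS h
WHERE _TABLE_SUFFIX BETWEEN '20160101' AND '20161231'
  AND h.page.pagePath = '/signup/complete'
GROUP BY date
ORDER BY date
```

Concatenating fullVisitorId and visitId approximates a unique session ID, so each session is counted at most once per day, mirroring how GA counts goal completions.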

The Fun Stuff

Finally, we can pair the raw data from BigQuery with a programming language like R or Python to apply data science techniques. We can derive predictive insights, for example, using statistical models to forecast how many purchases we might expect in the coming month, estimate how many leads might result from a new email blast, or project return on ad spend.
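The modeling itself happens in R or Python, but BigQuery does the heavy lifting of preparing the input. A sketch of a monthly series to feed a forecasting model (placeholder dataset name; note that revenue in the export is stored in micros):

```sql
-- Monthly transactions and revenue as input for a forecasting model
SELECT
  FORMAT_DATE('%Y-%m', PARSE_DATE('%Y%m%d', date)) AS month,
  SUM(totals.transactions) AS transactions,
  SUM(totals.transactionRevenue) / 1e6 AS revenue   -- convert from micros
FROM `my-project.my_dataset.ga_sessions_*`
GROUP BY month
ORDER BY month
```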

With machine learning algorithms, the data can speak for itself. Hidden in the wealth of data in Google Analytics are what marketers call “personas”: common user characteristics or behaviors that, when grouped together, form what data scientists call “clusters” of users. We can employ machine learning algorithms to comb through this data in BigQuery and discover something new – or maybe validate what we already knew.

The final outcome of this machine learning might just be a newfound understanding of our idealized “customer journey”, that elusive most-valuable-path (MVP) from one page to another leading up to a conversion.

If you have data scientists on your team, they can help you leverage BigQuery to perform these exciting data science techniques – or you can lean on us. LunaMetrics is doing just this kind of work every day. Check out our most recent case studies to learn more about the solutions we’ve provided:

As you explore Google Analytics data in Google BigQuery, check out some of our new Google BigQuery Recipes: ready-made queries that you can copy and paste to immediately begin exploring your data in BigQuery.

Connecting with Other Data Sources

If you’re like most companies, the data from Google Analytics is just a small portion of all the data that you rely on to make digital marketing, content, and usability decisions. You probably have data everywhere: in your CRM, your ecommerce platform, DoubleClick, AdWords, third-party advertising platforms, pixel trackers, Facebook, YouTube, and more. The more tools you use, the more value you are able to create… but only if those tools can talk to one another.

When you have such a large amount of data, there’s always a question about where it should live. BigQuery allows us to combine our Google Analytics data with third-party data sources, in one of two ways: you can either import data into BigQuery, or you can export data out of BigQuery.

Bringing Data Into BigQuery

Since BigQuery is a database, it can actually be your single source of truth. For example, imagine that you have a CRM. You can configure your website to pass the Google Analytics client ID (cid) into your CRM when a customer record is created; this identifier is unique to every visitor on your website. Once the client ID is in your CRM, you can export your CRM data into BigQuery, using the client ID as the “key” from one data source to the other. The result is that you can see customer information alongside your web analytics and marketing data in BigQuery.
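A sketch of that join, assuming (hypothetically) that the client ID was also captured in session-scoped custom dimension 1 on the Google Analytics side, and that the CRM export lives in a `crm.customers` table with a `client_id` column:

```sql
-- First-touch source/medium for known CRM customers
SELECT
  c.customer_name,
  ga.trafficSource.source,
  ga.trafficSource.medium
FROM `my-project.my_dataset.ga_sessions_*` AS ga,
  UNNEST(ga.customDimensions) AS cd
JOIN `my-project.crm.customers` AS c
  ON cd.value = c.client_id
WHERE cd.index = 1
  AND ga.totals.newVisits = 1   -- the customer's first session
```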

The process for getting data into BigQuery will change based on your needs: how often the data is updated, how large it is, and what internal processes you already have in place. Google BigQuery natively integrates with Google Drive and Google Sheets, which can make bringing in files or sheets a breeze. We can also use APIs to programmatically update data sources inside of BigQuery, connecting to other cloud-based systems or internal data warehouses.

This is all so important because BigQuery allows you to import personally identifiable information (PII), which is not possible in the Google Analytics interface. Want to understand how your top customers first found your website? Once you’ve integrated your data sets, you can write a BigQuery query that reports which marketing sources produced which customers, combining data from your CRM with the behavioral and acquisition information from Google Analytics. This will also help you get a better sense of your true “customer lifetime value” (CLV).

Connect BigQuery Data With Your System

Perhaps more commonly, you can export data from BigQuery to another system; BigQuery’s own API makes this process straightforward. With the aforementioned client ID import, we can send data from BigQuery into any data warehouse that will accept a CSV import. For example, we might want to summarize web behavior for each customer in Salesforce. We could easily obtain the original source/medium information from BigQuery and export it to Salesforce. With a little programming work, we might even develop a lead-scoring model for every sales prospect. (We actually do this internally at LunaMetrics.)

Beyond just the raw data, bringing in external data can help with one of the most valuable uses of BigQuery: audience generation. Following a detailed data science project, we might identify an audience or “persona” in BigQuery that we’d like to pursue. Connecting external data can either enrich the information used to create those audiences, or help us take those audiences and use them outside of BigQuery.

In addition to identifying the initial audiences, with BigQuery we can also generate dynamic audiences for remarketing, A/B testing and Google Analytics reporting. There are several ways this is possible.

For example, we might choose to combine our Google Analytics data from BigQuery with email addresses or related emails from a 3rd-party system. This integration would enable us to leverage a feature called “Customer Match” in Google AdWords, allowing us to target matched prospects or existing customers. By uploading a list of emails to AdWords, we can create and target audiences of those users in Google Advertising tools like Search, Gmail & YouTube. Or we might want to send this list of email addresses to a mailing service like MailChimp, or a marketing automation platform like HubSpot or Marketo.
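A sketch of building such a list, assuming (hypothetically) that the GA client ID is captured in session-scoped custom dimension 1 and matched to a `crm.customers` table with `client_id` and `email` columns, and using a placeholder behavior of viewing `/pricing` in the last 30 days:

```sql
-- Email list for Customer Match: customers who viewed /pricing recently
SELECT DISTINCT c.email
FROM `my-project.my_dataset.ga_sessions_*` AS ga,
  UNNEST(ga.customDimensions) AS cd,
  UNNEST(ga.hits) AS h
JOIN `my-project.crm.customers` AS c
  ON cd.value = c.client_id
WHERE _TABLE_SUFFIX >= FORMAT_DATE('%Y%m%d',
        DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY))
  AND cd.index = 1
  AND h.page.pagePath = '/pricing'
```

The result can be exported as a CSV and uploaded to AdWords, MailChimp, or a marketing automation platform.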

At the same time, we might choose to see these audiences back inside the Google Analytics interface. The easy way to do this would be to approximate the characteristics we are looking for – creating a segment that closely matches, or mimics, the BigQuery audiences that we created. There are more complex ways to do this as well, for example using the Measurement Protocol to send these client IDs back into Google Analytics and define an audience. Again, there are a number of possible options with varying degrees of complexity.

Once you have built or imported an audience inside Google Analytics, you can conduct advanced reporting in the interface, or use the audience for remarketing in a platform like DoubleClick. And with Optimize 360, another tremendously powerful Google Analytics 360 Suite product, you can actually target these audiences for A/B testing on your website.

Exporting Data for Visualization

At LunaMetrics, we love that BigQuery can integrate with many other tools, as already mentioned: MailChimp, HubSpot, Optimize, DoubleClick, AdWords, etc. These tools form a virtuous cycle that can help you better understand your full customer journey. If the dots are connected, the value of all your data is amplified.

But perhaps the most virtuous cycle of all is the connection between Google Analytics, Google BigQuery and Google Data Studio. There is no better way to visualize analytics data.

If your goal is to create one, and only one, “dashboard of record”, this is it. If you’ve combined your 3rd-party data with BigQuery, as mentioned above, you will reap even greater rewards when all of that data is visualized in one place. You can combine your unsampled, raw, hit-level Google Analytics data with your customer-connected PII data in BigQuery and see it all, dynamically, inside Data Studio.

The complicated, behind-the-scenes magic occurs within BigQuery and connects easily into Data Studio – which allows you to bring in tables from BigQuery or even custom queries. Interested? Check out this great post about BigQuery data in Data Studio.

There are other ways to visualize BigQuery data, of course. Tableau has a native connector to BigQuery, as do many other 3rd-party visualization tools, like Shufflepoint for Excel (a LunaMetrics favorite).

(Did I mention Google Data Studio is free?)

We know that Google has some exciting new tools on the horizon for the 360 Suite, and we are excited to see the benefits continue to grow as additional integrations and features are announced.

One More Thing…

Because BigQuery is hosted on Google’s extremely fast cloud-based server architecture, Google Cloud Platform (GCP), the queries we run can range across billions of rows of data and come back in seconds. There’s nothing to set up, and no virtual machines to manage.

Oh, and with Google Analytics 360 your usage of BigQuery is free, up to $500/month of usage (which equals 25 terabytes of storage or 100 terabytes of queried data). For reference, even our customers with over one-billion hits per month are in the ballpark of $150/month for storage and $50 to $100/month for querying — well within the free quota.

When an organization starts with Google Analytics 360, we can import the last 13 months of Google Analytics data into BigQuery, so you can start writing advanced queries for year-over-year reports immediately.

BigQuery is an extremely powerful tool, perhaps more important to data analysis and reporting than any other feature in Google Analytics 360. We see BigQuery as an intrinsic part of the platform, and a necessary precursor to any mid- to enterprise-level web analytics strategy.