
Do Businesses Really Use Google My Business Posts? A Case Study

Posted by Ben_Fisher

Google My Business (GMB) is one of the most powerful ways to improve a business’ local search engine optimization and online visibility. If you’re a local business, claiming your Google My Business profile is one of the first steps you should take to increase your company’s online presence.

As long as your local business meets Google’s guidelines, your Google My Business profile can help give your company FREE exposure on Google’s search engine. Not only can potential customers quickly see your business’ name, address and phone number, but they can also see photos of your business, read online reviews, find a description about your company, complete a transaction (like book an appointment) and see other information that grabs a searcher’s attention — all without them even visiting your website. That’s pretty powerful stuff!

Google My Business helps with local rankings

Not only is your GMB Profile easily visible to potential customers when they search on Google, but Google My Business is also a key Google local ranking factor. In fact, according to local ranking factor industry research, Google My Business “signals” is the most important ranking factor for local pack rankings. Google My Business signals had a significant increase in ranking importance between 2017 and 2018 — rising from 19% to 25%.

Claiming your Google My Business profile is your first step to local optimization, but many people mistakenly think that claiming it is enough. In reality, you also need to optimize your profile and log into your Google My Business dashboard frequently to make sure no unwanted updates have been made. Both are vital to improving your rankings and protecting the accuracy of your business information.

Google My Business features that make your profile ROCK!

Google offers a variety of ways to optimize and enhance your Google My Business profile. You can add photos, videos, business hours, a description of your company, frequently asked questions and answers, communicate with customers via messages, allow customers to book appointments, respond to online reviews and more.

One of the most powerful ways to grab a searcher’s attention is by creating Google My Business Posts. GMB Posts are almost like mini-ads for your company, products, or services.

Google offers a variety of posts you can create to promote your business:

  • What’s New
  • Event
  • Offer
  • Product

Posts also allow you to include a call to action (CTA) so you can better control what the visitor does after they view your post — creating the ultimate marketing experience. Current CTAs are:

  • Book
  • Order Online
  • Buy
  • Learn More
  • Sign Up
  • Get Offer
  • Call Now

Posts use a combination of images, text and a CTA to creatively show your message to potential customers. A Post shows in your GMB profile when someone searches for your business’ name on Google or views your business’ Google My Business profile on Google Maps.

Once you create a Post, you can even share it on your social media channels to get extra exposure.

Despite the name, Google My Business Posts are not actual social media posts. Typically the first 100 characters of the post are what shows up on screen (the rest is cut off and must be clicked on to be seen), so make sure the most important words are at the beginning of your post. Don’t use hashtags — they’re meaningless. It’s best if you can create new posts every seven days or so.
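Because only the first stretch of a post survives truncation, it’s worth checking your copy before publishing. Here’s a minimal sketch of that check in Python; the 100-character cutoff is an approximation taken from the paragraph above, and Google may trim differently per device:

```python
def visible_preview(post_text, limit=100):
    """Return the portion of a GMB post likely to show before truncation.

    The ~100-character limit is an approximation; Google varies the
    exact cutoff by device and placement.
    """
    if len(post_text) <= limit:
        return post_text
    # Cut at the last complete word inside the limit, then add an ellipsis.
    cut = post_text[:limit].rsplit(" ", 1)[0]
    return cut + "..."

post = ("Spring tune-up special: 20% off all services this week only. "
        "Book your appointment online and skip the wait at our downtown shop.")
print(visible_preview(post))
```

If the words that matter most fall after the cutoff, move them up front.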

Google My Business Posts are a great way to show off your business in a unique way at the exact time when a searcher is looking at your business online.

But there’s a long-standing question: Are businesses actually creating GMB Posts to get their message across to potential customers? Let’s find out…

The big question: Are businesses actively using Google My Business Posts?

There has been a lot of discussion in the SEO industry about Google My Business Posts and their value: Do they help with SEO rankings? How effective are they? Do posts garner engagement? Does where the Posts appear on your GMB profile matter? How often should you post? Should you even create Google My Business Posts at all? Lots of questions, right?

As industry experts look at all of these angles, what do average, everyday business owners actually do when it comes to GMB Posts? Are real businesses creating posts? I set out to find the answer to this question using real data. Here are the details.

Google My Business Post case study: Just the facts

When I set out to discover if businesses were actively using GMB Posts for their companies’ Google My Business profiles, I first wanted to make sure I looked at data in competitive industries and markets. So I looked at a total of 2,000 Google My Business profiles that comprised the top 20 results in the Local Finder. I searched for highly competitive keyword phrases in the top ten cities (based on population density, according to Wikipedia).

For this case study, I also chose to look at service type businesses.

Here are the results.


Cities: New York, Los Angeles, Chicago, Philadelphia, Dallas, San Jose, San Francisco, Washington DC, Houston, and Boston.

Keywords: real estate agent, mortgage, travel agency, insurance or insurance agents, dentist, plastic surgeon, personal injury lawyer, plumber, veterinarian or vet, and locksmith.

Surprise! Out of the industries researched, Personal Injury Lawyers and Locksmiths posted the most often.

For the case study, I looked at the following:

  • How many businesses had an active Google My Business Post (i.e. have posted in the last seven days)
  • How many had previously made at least one post
  • How many have never created a post

Do businesses create Google My Business Posts?

Based on the businesses, cities, and keywords researched, I discovered that more than half of the businesses are actively creating Posts or have created Google My Business Posts in the past.

  • 17.5% of businesses had an active post in the last 7 days
  • 42.1% of businesses had previously made at least one post
  • 40.4% have never created a post

Highlight: A total of 59.60% of businesses have posted a Google My Business Post on their Google My Business profile.
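For the curious, the bucketing behind those percentages is simple to reproduce. A sketch in Python, using invented sample data rather than the study’s raw numbers (the field shape is my own assumption, not the research spreadsheet’s):

```python
def classify_profiles(profiles):
    """Bucket GMB profiles by posting status and return percentages.

    `profiles` maps a business name to the days since its last post,
    or None if it has never posted.
    """
    counts = {"active": 0, "previous": 0, "never": 0}
    for days_since_post in profiles.values():
        if days_since_post is None:
            counts["never"] += 1
        elif days_since_post <= 7:  # "active" = posted in the last 7 days
            counts["active"] += 1
        else:
            counts["previous"] += 1
    total = len(profiles)
    return {k: round(100 * v / total, 1) for k, v in counts.items()}

sample = {"Acme Law": 3, "Fast Lock": 40, "City Vet": None, "Bay Dental": 2}
print(classify_profiles(sample))  # {'active': 50.0, 'previous': 25.0, 'never': 25.0}
```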

NOTE: If you want to look at the raw numbers, you can check out the research document that outlines all the raw data. (Credit for the research spreadsheet template I used, and the inspiration to do this case study, goes to SEO expert Phil Rozek.)

Do searchers engage with Google My Business Posts?

If a business takes the time to create Google My Business Posts, do searchers and potential customers actually take the time to look at your posts? And most importantly, do they take action and engage with your posts?

This chart represents nine random clients, their total post views over a 28-day period, and the corresponding total direct/branded impressions on their Google My Business profiles. When we look at the total number of direct/branded views alongside the number of views posts received, the number of views for posts appears to be higher. This means that a single user is more than likely viewing multiple posts.

This means that if you take the time to create a GMB Post and your marketing message is meaningful, you have a high chance of converting a potential searcher into a customer — or at least someone who is going to take the time to look at your marketing message. (How awesome is that?)

Do searchers click on Google My Business Posts?

So your GMB Posts show up in your Knowledge Panel when someone searches for your business on Google and Google Maps, but do searchers actually click on your post to read more?

When we compared the various industries’ post views to their total direct/branded search views, on average the post is clicked on almost 100% of the time!

Google My Business insights

When you log in to your Google My Business dashboard you can see firsthand how well your Posts are doing. Below is a side-by-side image of a business’ post views and their direct search impressions. By checking your GMB insights, you can find out how well your Google My Business posts are performing for your business!

GMB Posts are worth it

After looking at 2,000 GMB profiles, I discovered a lot of things. One thing is for sure: it’s hard to tell on a week-by-week basis how many companies are using GMB Posts, because posts “go dark” every seven days (unless the Post is an event post with a start and end date).

Also, Google recently moved Posts from the top of the Google My Business profile towards the bottom, so they don’t stand out as much as they did just a few months ago. This may mean that there’s less incentive for businesses to create posts.

However, what this case study does show us is that businesses that are in a competitive location and industry should use Google My Business optimizing strategies and features like posts if they want to get an edge on their competition.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 6 days ago from

The Basics of Building an Intent-based Keyword List

Posted by TheMozTeam

This post was originally published on the STAT blog.

This week, we’re taking a deep dive into search intent.

The STAT whitepaper looked at how SERP features respond to intent, and the bonus blog posts broke things down even further and examined how individual intent modifiers impact SERP features, the kind of content that Google serves at each stage of intent, and how you can set up your very own search intent projects. And look out for Seer’s very own Scott Taft’s upcoming post on how to use STAT and Power BI to create your very own search intent dashboard.

Search intent is the new demographics, so it only made sense to get up close and personal with it. Of course, in order to bag all those juicy search intent tidbits, we needed a great intent-based keyword list. Here’s how you can get your hands on one of those.

Gather your core keywords

First, before you can even think about intent, you need to have a solid foundation of core keywords in place. These are the products, features, and/or services that you’ll build your search intent funnel around.

But goodness knows that keyword list-building is more of an art than a science, and even the greatest writers (hi, Homer) needed to invoke the muses (hey, Calliope) for inspiration, so if staring at your website isn’t getting the creative juices flowing, you can look to a few different places for help.

Snag some good suggestions from keyword research tools

Lots of folks like to use the Google Keyword Planner to help them get started. Ubersuggest and Yoast’s Google Suggest Expander will also help add keywords to your arsenal. And Answer The Public gives you all of that, and beautifully visualized to boot.

Simply plunk in a keyword and watch the suggestions pour in. Just remember to be critical of these auto-generated lists, as odd choices sometimes slip into the mix. For example, apparently we should add [free phones] to our list of [rank tracking] keywords. Huh.

Spot inspiration on the SERPs

Two straight-from-the-SERP resources that we love for keyword research are the “People also ask” box and related searches. These queries are Google-vetted and plentiful, and also give you some insight into how the search engine giant links topics.

If you’re a STAT client, you can generate reports that will give you every question in a PAA box (before it gets infinite), as well as each of the eight related searches at the bottom of a SERP. Run the reports for a couple of days and you’ll get a quick sense of which questions and queries Google favours for your existing keyword set.

A quick note about language & location

When you’re in the UK, you push a pram, not a stroller; you don’t wear a sweater, you wear a jumper. This is all to say that if you’re in the business of global tracking, it’s important to keep different countries’ word choices in mind. Even if you’re not creating content with them, it’s good to see if you’re appearing for the terms your global searchers are using.

Add your intent modifiers

Now it’s time to tackle the intent bit of your keyword list. And this bit is going to require drawing some lines in the sand, because the modifiers that occupy each intent category can be highly subjective — does “best” signal transactional intent instead of commercial?

We’ve put together a loose guideline below, but the bottom line is that intent should be structured and classified in a way that makes sense to your business. And if you’re stuck for modifiers to marry to your core keywords, here’s a list of 50+ to help with the coupling.

Informational intent

The searcher has identified a need and is looking for the best solution. These keywords are the core keywords from your earlier hard work, plus every question you think your searchers might have if they’re unfamiliar with your product or services.

Your informational queries might look something like:

  • [product name]
  • what is [product name]
  • how does [product name] work
  • how do I use [product name]

Commercial intent

At this stage, the searcher has zeroed in on a solution and is looking into all the different options available to them. They’re doing comparative research and are interested in specific requirements and features.

For our research, we used best, compare, deals, new, online, refurbished, reviews, shop, top, and used.

Your commercial queries might look something like:

  • best [product name]
  • [product name] reviews
  • compare [product name]
  • what is the top [product name]
  • [colour/style/size] [product name]

Transactional intent (including local and navigational intent)

Transactional queries are the most likely to convert and generally include terms that revolve around price, brand, and location, which is why navigational and local intent are nestled within this stage of the intent funnel.

For our research, we used affordable, buy, cheap, cost, coupon, free shipping, and price.

Your transactional queries might look something like:

  • how much does [product name] cost
  • [product name] in [location]
  • order [product name] online
  • [product name] near me
  • affordable [brand name] [product name]

A tip if you want to speed things up

A super quick way to add modifiers to your keywords and save your typing fingers is by using a keyword mixer like this one. Just don’t forget that using computer programs for human-speak means you’ll have to give them the ol’ once-over to make sure they still make sense.
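If you’d rather skip the browser tool entirely, the same mixing takes a few lines of Python. This is my own sketch, not the linked mixer; the keywords and modifiers are placeholders:

```python
from itertools import product

core_keywords = ["rank tracker", "serp monitor"]
modifiers = ["best", "compare", "reviews", "cheap"]

# Prepend and append each modifier to each core keyword,
# deduplicating with a set.
mixed = sorted(
    {f"{m} {kw}" for m, kw in product(modifiers, core_keywords)}
    | {f"{kw} {m}" for kw, m in product(core_keywords, modifiers)}
)

for query in mixed:
    print(query)
```

As the paragraph above warns, machine-mixed phrases still need the ol’ once-over before they go into a tracking project.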

Audit your list

Now that you’ve reached for the stars and got yourself a huge list of keywords, it’s time to bring things back down to reality and see which ones you’ll actually want to keep around.

No two audits are going to look the same, but here are a few considerations you’ll want to keep in mind when whittling your keywords down to the best of the bunch.

  1. Relevance. Are your keywords represented on your site? Do they point to optimized pages?
  2. Search volume. Are you after highly searched terms or looking to build an audience? You can get the SV goods from the Google Keyword Planner.
  3. Opportunity. How many clicks and impressions are your keywords raking in? While not comprehensive (thanks, Not Provided), you can gather some of this info by digging into Google Search Console.
  4. Competition. What other websites are ranking for your keywords? Are you up against SERP monsters like Amazon? What about paid advertising like shopping boxes? How much SERP space are they taking up? Your friendly SERP analytics platform with share of voice capabilities (hi!) can help you understand your search landscape.
  5. Difficulty. How easy is your keyword going to be to win? Search volume can give you a rough idea — the higher the search volume, the stiffer the competition is likely to be — but for a different approach, Moz’s Keyword Explorer has a Difficulty score that takes Page Authority, Domain Authority, and projected click-through-rate into account.
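Once your keywords and their metrics live in a table, several of these checks can be applied mechanically. A pandas sketch; the column names, numbers, and thresholds here are all invented for illustration:

```python
import pandas as pd

# Hypothetical audit table; in practice these figures would come from
# Keyword Planner, Search Console, and your rank tracker.
keywords = pd.DataFrame({
    "keyword":       ["best rank tracker", "free phones", "compare rank trackers"],
    "search_volume": [1900, 74000, 320],
    "difficulty":    [45, 88, 30],          # e.g. a Keyword Explorer-style score
    "on_site":       [True, False, True],   # relevance: represented on your site?
})

# Keep relevant keywords with winnable difficulty and worthwhile volume.
shortlist = keywords[
    keywords["on_site"]
    & (keywords["difficulty"] < 60)
    & (keywords["search_volume"] >= 100)
]
print(shortlist["keyword"].tolist())  # ['best rank tracker', 'compare rank trackers']
```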

By now, you should have a pretty solid plan of attack to create an intent-based keyword list of your very own to love, nurture, and cherish.

If, before you jump headlong into it, you’re curious what a good chunk of this is going to look like in practice, give this excellent article by Russ Jones a read, or drop us a line. We’re always keen to show folks why tracking keywords at scale is the best way to uncover intent-based insights.

Read on, readers!




Exclusive: IAB Europe to release updated consent framework later this year, Google to sign on

Google has been working with IAB Europe on the updates, which will incorporate new tech and policy features.

The post Exclusive: IAB Europe to release updated consent framework later this year, Google to sign on appeared first on Marketing Land.

Please visit Marketing Land for the full article.


Marketing Day: IAB Europe to update consent framework, LiveRamp’s latest identity resolution capabilities

Here’s our recap of what happened in online marketing today, as reported on Marketing Land and other places across the web.

The post Marketing Day: IAB Europe to update consent framework, LiveRamp’s latest identity resolution capabilities appeared first on Marketing Land.

Please visit Marketing Land for the full article.


Optimizing content for voice search and virtual assistants for a competitive edge

Learn from SMX West presenters about specific tactics SEOs can use to create content for voice and how to roll out voice search campaigns.

Please visit Search Engine Land for the full article.


Facebook is a local search engine. Are you treating it like one?

When Facebook launched its Graph Search in 2013, it was only a matter of time before the network became a big player in the search engine game.

But Graph Search was limited as far as optimization was concerned. Rather than returning, as Google does, a link that likely contained the answer to your query, it focused on answering questions about the relationships between people, places, and things on Facebook.

It was an IFTTT statement on steroids that never missed a workout.

It wasn’t until 2015 that Facebook became a major player in keyword search.

A brief history on Facebook Search

2011, August: Facebook beta tests sponsored results in search feature

Since Facebook’s search was in its infancy at this time, testing sponsored results may have been too soon since search functionality was limited to page titles, profiles, games, etc…

So you couldn’t target based on keywords quite yet, nor could you take advantage of a searcher’s affinity, which meant the only way to target was by targeting another page or product.

Sponsored results were short-lived, as the feature turned into competitors relentlessly targeting competitors to steal traffic. In June of 2013, less than a year after Sponsored Results’ formal release, they were terminated.

2013, January: Facebook launches Graph Search

Prior to 2013, Facebook Search had only returned results based on existing pages. There wasn’t much logic in the algorithm other than the name of the person or page you’re looking for in your query. That changed with Graph Search.

Graph Search took pages, categories, and people and put them into a revolving wheel of filters that was heavily personalized to each searcher, returning different results for everyone. It even allowed users to use natural language like “friends near me that like DC Comics” and get results for just that.

At least that’s what you would have had to search to find friends you should stay away from…

The two major flaws in this were:

  1. You had to already know what and whom you were looking for.
  2. You also had to hope that what you were looking for had a page or tag associated with it.

2014, December: Facebook adds keyword search functionality to Graph Search

This was the obvious next move by Facebook. They took what was already seemingly amazing at the time and added the ability to pull results based on keywords alone from your friends, people your friends have interacted with, or pages you like.

As far as a search engine was concerned, Facebook still had a long way to go to compete with Google.

2015, October: Facebook expands search to all 2 trillion public posts

Less than a year later, Facebook opened the floodgates, expanding its search engine from inner-circle results to global, real-time, public news and posts. You could now search any of its more than 2 trillion indexed public posts. Facebook had become a search engine competing with the likes of Google for industry news and trending topics.

Viewing Facebook as a search engine

Prior to any real search functionality in Facebook, social media platforms were merely viewed as potential ranking signals and traffic boosters for your website.

Despite Matt Cutts denouncing the claim that Google uses social signals in its algorithm, Bing has gone the opposite direction and is open about how it uses engagement metrics from social media.

Facebook has been a solid directory site for local businesses since 2007, when the number of businesses listed on the platform was only about 100,000. It allowed businesses to be in front of one of the fastest-growing social media networks in the world, but the discoverability of your business was limited to mutual connections and brand awareness.

It wasn’t until 2014 when Facebook launched a new and improved version of the 2011 Places directory that Local Business Search became a focal point for Facebook to compete with Yelp and FourSquare.

Now, when searching for a company in Facebook’s Search that’s near you, you’ll get results that are eerily similar to the local 3-pack on Google. If we know anything about local 3-packs, it’s that there’s definitely an algorithm behind them that determines which businesses get to show up and in which order.

Facebook sees over 1.5 billion searches every day, and over 600 million users visit business pages every day. It still has a ways to go to reach Google’s 3.5 billion searches per day. That said, handling a search volume just over 40% of the search engine giant’s — as a social media platform — isn’t anything to scoff at.

Why Facebook Search is important for local businesses

Facebook has provided a different means for customers to engage with brands than a typical search engine. But now the search engine has come to Facebook, and the data shows people are using it — not only to stalk their exes, but also to find businesses.

  1. Facebook has a large user base using their search engine
  2. Local businesses can be discovered using search
  3. Local businesses are ranked within search results

So I guess that means we should be optimizing our local business pages to rank higher in Facebook’s search for specific queries…

Sounds a lot like local SEO. But this time, it’s not about your website or Google.

The whole reason we SEOs are obsessed with SEO is the value and opportunity it holds when 74% of buying journeys start with search engines during the consideration stage. If no one used search engines, SEO wouldn’t be much of a big deal. But they do and it is.

graph, "which of the following ways do you typically discover or find out about new products, brands, or services?"

According to a survey by Square, 52% of consumers discovered a brand through Facebook. That’s a pretty significant number to pass up.

And with the launch of Facebook Local in late 2017, the network is getting more discovery-friendly.

Optimizing for local Facebook SEO

Facebook has caught up with typical search engines and started implementing keywords in its algorithm and database. Bundle that with the fact that 600 million users visit business pages and that Facebook alone sees a whopping 1.5 billion searches every day, and it doesn’t take a genius to see that Facebook SEO is valuable for local business discoverability on the platform.

All we have to do is crack the code of optimizing Facebook business pages. Unfortunately, it seems Facebook is a bit more secretive than Google about what is and isn’t a ranking signal in its local business Graph Search.

It’s a matter of finding out why Facebook chose to rank Lightning Landscape & Irrigation over Dynamic Earth Lawn & Landscape, LLC when Dynamic Earth is both verified and closer.


In most of my research, Facebook tends to heavily weight posts and pages based on user engagement. But that doesn’t mean other ranking factors don’t exist in search. We’re likely looking at a set of ranking signals similar in scale to Google’s roughly 200, but vastly different in makeup.

Trying to crack this code has led to the idea of “optimizing your Facebook business page,” and I’ve chased it myself. But most advice seems to be focused on optimizing Facebook pages to rank in other search engines rather than in Facebook itself.

While it is definitely a good idea to follow SEO best practices for the former reason, why not do both?

Facebook testing search ads

Coming into 2019, Facebook has started beta-testing search ads. They’re not keyword-based yet; rather, they serve as extensions of news feed ads, and only on supported ad formats. It’s quite an improvement from the original sponsored results, which were abandoned in 2013.

It’s the type of subtle testing that could definitely produce some useful analytics in pursuing a full-blown search ad feature with keywords.

Related: “Facebook is expanding into Search Ads. What will this mean?”

Facebook knows two things:

1) Ad space on news feeds is decreasing.

2) More and more people are using their search feature.

The fact that Facebook is testing this doesn’t really tell me anything about local SEO signals on the platform. But it does tell me even Facebook sees a real opportunity in their own search engine. And their analytics are probably better than what we have right now.


Without any solid advice from Facebook, I think it’s time for the SEO community to start thinking about organic brand exposure through the social media platform itself. We should start viewing it as an enhanced search engine as it continues to grow and improve its search features.

Moreover, without search analytics from Facebook, there really isn’t a lot we can do in regard to optimizing for placement. At least, not right now.

I’d love to see their new search ads beta really get traction and prompt Zuckerberg to consider a more SEO-friendly approach to search marketing on his own platform.

Of course, this is going to give “social media gurus” another reason to clog up my news feed with ads.

Jake Hundley is the founder of Evergrow Marketing.

The post Facebook is a local search engine. Are you treating it like one? appeared first on Search Engine Watch.


Using Python to recover SEO site traffic (Part one)

Helping a client recover from a bad redesign or site migration is probably one of the most critical jobs you can face as an SEO.

The traditional approach of conducting a full forensic SEO audit works well most of the time, but what if there was a way to speed things up? You could potentially save your client a lot of money in opportunity cost.

Last November, I spoke at TechSEO Boost and presented a technique my team and I regularly use to analyze traffic drops. It allows us to pinpoint this painful problem quickly and with surgical precision. As far as I know, there are no tools that currently implement this technique. I coded this solution using Python.

This is the first part of a three-part series. In part two, we will manually group the pages using regular expressions and in part three we will group them automatically using machine learning techniques. Let’s walk over part one and have some fun!

Winners vs losers

Last June we signed up a client that moved from Ecommerce V3 to Shopify and the SEO traffic took a big hit. The owner set up 301 redirects between the old and new sites but made a number of unwise changes like merging a large number of categories and rewriting titles during the move.

When traffic drops, some parts of the site underperform while others don’t. I like to isolate them in order to 1) focus all efforts on the underperforming parts, and 2) learn from the parts that are doing well.

I call this analysis the “Winners vs Losers” analysis. Here, winners are the parts that do well, and losers the ones that do badly.

visual analysis of winners and losers to figure out why traffic changed

A visualization of the analysis looks like the chart above. I was able to narrow down the issue to the category pages (Collection pages) and found that the main issue was caused by the site owner merging and eliminating too many categories during the move.

Let’s walk over the steps to put this kind of analysis together in Python.

You can reference my carefully documented Google Colab notebook here.

Getting the data

We want to programmatically compare two separate time frames in Google Analytics (before and after the traffic drop), and we’re going to use the Google Analytics API to do it.

Google Analytics Query Explorer provides the simplest approach to do this in Python.

  1. Head on over to the Google Analytics Query Explorer
  2. Click on the button at the top that says “Click here to Authorize” and follow the steps provided.
  3. Use the dropdown menu to select the website you want to get data from.
  4. Fill in the “metrics” parameter with “ga:newUsers” in order to track new visits.
  5. Complete the “dimensions” parameter with “ga:landingPagePath” in order to get the page URLs.
  6. Fill in the “segment” parameter with “gaid::-5” in order to track organic search visits.
  7. Hit “Run Query” and let it run
  8. Scroll down to the bottom of the page and look for the text box that says “API Query URI.”
    1. Check the box underneath it that says “Include current access_token in the Query URI (will expire in ~60 minutes).”
    2. At the end of the URL in the text box you should now see access_token=string-of-text-here. You will use this string of text in the code snippet below as the variable called token (make sure to paste it inside the quotes).
  9. Now, scroll back up to where we built the query, and look for the parameter that was filled in for you called “ids.” You will use this in the code snippet below as the variable called “gaid.” Again, it should go inside the quotes.
  10. Run the cell once you’ve filled in the gaid and token variables to instantiate them, and we’re good to go!

First, let’s define placeholder variables to pass to the API

metrics = ",".join(["ga:users", "ga:newUsers"])

dimensions = ",".join(["ga:landingPagePath", "ga:date"])

segment = "gaid::-5"

# Required, please fill in with your own GA information. Example: ga:23322342
gaid = "ga:23322342"

# Example: string-of-text-here from step 8.2
token = ""

# Your site's base URL
base_site_url = ""

# You can change the start and end dates as you like
start = "2017-06-01"
end = "2018-06-30"

The first function combines the placeholder variables we filled in above with an API URL to get Google Analytics data. We make additional API requests and merge them in case the results exceed the 10,000 limit.

import requests

def GAData(gaid, start, end, metrics, dimensions,
           segment, token, max_results=10000):
  """Creates a generator that yields GA API data
     in chunks of size `max_results`"""
  # Build the Core Reporting API v3 uri w/ params
  api_uri = "https://www.googleapis.com/analytics/v3/data/ga?ids={gaid}&"\
            "start-date={start}&end-date={end}&metrics={metrics}&"\
            "dimensions={dimensions}&segment={segment}&"\
            "access_token={token}&max-results={max_results}"

  # insert uri params
  api_uri = api_uri.format(
      gaid=gaid, start=start, end=end,
      metrics=metrics, dimensions=dimensions,
      segment=segment, token=token,
      max_results=max_results
  )

  # Using yield to make a generator in an
  # attempt to be memory efficient, since data is downloaded in chunks
  r = requests.get(api_uri)
  data = r.json()
  yield data
  if data.get("nextLink", None):
    while data.get("nextLink"):
      new_uri = data.get("nextLink")
      new_uri += "&access_token={token}".format(token=token)
      r = requests.get(new_uri)
      data = r.json()
      yield data
In the second function, we load the Google Analytics API response into a pandas DataFrame to simplify our analysis.

import pandas as pd

def to_df(gadata):
  """Takes in a generator from GAData()
     creates a dataframe from the rows"""
  df = None
  for data in gadata:
    if df is None:
      df = pd.DataFrame(
          data['rows'],
          columns=[x['name'] for x in data['columnHeaders']]
      )
    else:
      newdf = pd.DataFrame(
          data['rows'],
          columns=[x['name'] for x in data['columnHeaders']]
      )
      # .append() was removed in pandas 2.0; concat works in all versions
      df = pd.concat([df, newdf], ignore_index=True)
    print("Gathered {} rows".format(len(df)))
  return df

Now, we can call the functions to load the Google Analytics data.

data = GAData(gaid=gaid, metrics=metrics, start=start,
              end=end, dimensions=dimensions, segment=segment,
              token=token)

data = to_df(data)

Analyzing the data

Let’s start by just getting a look at the data. We’ll use the .head() method of DataFrames to take a look at the first few rows. Think of this as glancing at only the top few rows of an Excel spreadsheet.

data.head()

This displays the first five rows of the data frame.

Most of the data is not in the right format for proper analysis, so let’s perform some data transformations.

First, let’s convert the date to a datetime object and the metrics to numeric values.

data['ga:date'] = pd.to_datetime(data['ga:date'])

data['ga:users'] = pd.to_numeric(data['ga:users'])

data['ga:newUsers'] = pd.to_numeric(data['ga:newUsers'])

Next, we will need the landing page URLs, which are relative and include URL parameters, in two additional formats: 1) as absolute URLs, and 2) as relative paths (without the URL parameters).

from urllib.parse import urlparse, urljoin

data['path'] = data['ga:landingPagePath'].apply(lambda x: urlparse(x).path)

# urljoin works on single strings, so apply it row by row
data['url'] = data['path'].apply(lambda x: urljoin(base_site_url, x))

Now the fun part begins.

The goal of our analysis is to see which pages lost traffic after a particular date, compared to the period before that date, and which pages gained traffic after it.

The example date chosen below corresponds to the exact midpoint of our start and end variables used above to gather the data, so that the data both before and after the date is similarly sized.
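As a quick sketch, that midpoint can be computed directly from the start and end variables (using the example dates filled in earlier):

```python
import pandas as pd

start = "2017-06-01"
end = "2018-06-30"

# Halfway point of the date range: both halves are 197 days long
midpoint = pd.to_datetime(start) + (pd.to_datetime(end) - pd.to_datetime(start)) / 2

print(midpoint.date())  # 2017-12-15
```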

We begin the analysis by grouping each URL together by their path and adding up the newUsers for each URL. We do this with the built-in pandas method: .groupby(), which takes a column name as an input and groups together each unique value in that column.

The .sum() method then takes the sum of every other column in the data frame within each group.

For more information on these methods please see the Pandas documentation for groupby.

For those who might be familiar with SQL, this is analogous to a GROUP BY clause with a SUM in the SELECT clause.
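As a toy illustration (the paths and numbers here are made up, not from the case study), grouping and summing works like this:

```python
import pandas as pd

# Hypothetical landing-page rows, mimicking the columns used above
df = pd.DataFrame({
    "ga:landingPagePath": ["/a", "/a", "/b"],
    "ga:newUsers": [10, 5, 3],
})

# Equivalent to: SELECT path, SUM(newUsers) FROM df GROUP BY path
totals = df.groupby("ga:landingPagePath").sum()

print(totals["ga:newUsers"].to_dict())  # {'/a': 15, '/b': 3}
```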

# Change this depending on your needs

MIDPOINT_DATE = "2017-12-15"

before = data[data['ga:date'] < pd.to_datetime(MIDPOINT_DATE)]

after = data[data['ga:date'] >= pd.to_datetime(MIDPOINT_DATE)]

# Traffic totals before Shopify switch
totals_before = before[["ga:landingPagePath", "ga:newUsers"]]\
                .groupby("ga:landingPagePath").sum()

totals_before = totals_before.reset_index()\
                .sort_values("ga:newUsers", ascending=False)

# Traffic totals after Shopify switch
totals_after = after[["ga:landingPagePath", "ga:newUsers"]]\
               .groupby("ga:landingPagePath").sum()

totals_after = totals_after.reset_index()\
               .sort_values("ga:newUsers", ascending=False)

You can check the totals before and after with this code and double check with the Google Analytics numbers.

print("Traffic Totals Before: ")

print("Row count: ", len(totals_before))

print("Traffic Totals After: ")

print("Row count: ", len(totals_after))

Next up we merge the two data frames, so that we have a single column corresponding to the URL, and two columns corresponding to the totals before and after the date.

We have different options for how to merge the two dataframes. Here, we use an “outer” merge, because even if a URL didn’t show up in the “before” period, we still want it to be a part of this merged dataframe. We’ll fill in the blanks with zeros after the merge.
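Here is a small sketch of why the merge type matters (the URLs are hypothetical): with an “outer” merge, a page that only received traffic in one period is still kept, and its missing total is filled with zero.

```python
import pandas as pd

# Hypothetical totals; "/new" only exists in the "after" period
before = pd.DataFrame({"ga:landingPagePath": ["/a"], "ga:newUsers": [10]})
after = pd.DataFrame({"ga:landingPagePath": ["/a", "/new"], "ga:newUsers": [8, 4]})

merged = after.merge(before,
                     on="ga:landingPagePath",
                     suffixes=["_after", "_before"],
                     how="outer").fillna(0)

# "/new" survives the merge with a "before" total of zero
print(merged[merged["ga:landingPagePath"] == "/new"]["ga:newUsers_before"].iloc[0])  # 0.0
```

An “inner” merge would silently drop such pages, hiding exactly the URLs we most want to investigate.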

# Comparing pages from before and after the switch

change = totals_after.merge(totals_before,
                            on="ga:landingPagePath",
                            suffixes=["_after", "_before"],
                            how="outer")

change.fillna(0, inplace=True)

Difference and percentage change

Pandas dataframes make simple calculations on whole columns easy. We can take the difference of two columns or divide one by another, and it will perform that operation on every row for us. We will take the difference of the two totals columns and divide by the “before” column to get the percent change before and after our midpoint date.

Using this percent_change column we can then filter our dataframe to get the winners, the losers and those URLs with no change.

change['difference'] = change['ga:newUsers_after'] - change['ga:newUsers_before']

change['percent_change'] = change['difference'] / change['ga:newUsers_before']

winners = change[change['percent_change'] > 0]

losers = change[change['percent_change'] < 0]

no_change = change[change['percent_change'] == 0]

Sanity check

Finally, we do a quick sanity check to make sure that all the traffic from the original data frame is still accounted for after all of our analysis. To do this, we simply take the sum of all traffic for both the original data frame and the two columns of our change dataframe.

# Checking that the total traffic adds up

data['ga:newUsers'].sum() == change[['ga:newUsers_after', 'ga:newUsers_before']].sum().sum()

It should be True.


Sorting by the difference in our losers data frame and taking the .head(10), we can see the top 10 losers in our analysis. In other words, these pages lost the most total traffic between the two periods before and after the midpoint date.

losers.sort_values("difference").head(10)


You can do the same to review the winners and try to learn from them.

winners.sort_values(“difference”, ascending=False).head(10)

You can export the losing pages to a CSV or Excel file, for example:

losers.to_csv("lost-traffic-pages.csv")


This seems like a lot of work to analyze just one site, and it is!

The magic happens when you reuse this code on new clients and simply need to replace the placeholder variables at the top of the script.

In part two, we will make the output more useful by grouping the losing (and winning) pages by their types to get the chart I included above.

The post Using Python to recover SEO site traffic (Part one) appeared first on Search Engine Watch.

Reblogged 6 days ago from

Back to basics: What does ‘long-tail’ keyword really mean?

Long-tail keywords are defined not by the number of words typed into the search query, but by the search volume they produce. Here’s how it works.

Please visit Search Engine Land for the full article.

Reblogged 6 days ago from

Updated for 2019! ‘Enterprise SEO Platforms: A Marketer’s Guide’

Learn about the latest SEO trends.

Please visit Search Engine Land for the full article.

Reblogged 6 days ago from

Bing Ads multiple language targeting: What you need to know

There are important nuances to ensuring your ads reach users in the languages they understand.

Please visit Search Engine Land for the full article.

Reblogged 6 days ago from