
Google Shares Details About the Technology Behind Googlebot

Posted by goralewicz

Crawling and indexing have been hot topics over the last few years. As soon as Google launched Google Panda, people rushed to their server logs and crawling stats and began fixing their index bloat. None of those problems existed in the “SEO = backlinks” era of a few years ago. With the rapid growth of technical SEO, we need to get more and more technical ourselves. That said, we still don’t know exactly how Google crawls our websites, and many SEOs still can’t tell the difference between crawling and indexing.

The biggest problem, though, is that when we want to troubleshoot indexing issues, the only tools in our arsenal are Google Search Console and its Fetch and Render feature. Once your website includes more than HTML and CSS, there’s a lot of guesswork about how your content will be indexed by Google. This approach is risky, expensive, and can fail multiple times. Even when you discover the parts of your website that weren’t indexed properly, it’s extremely difficult to get to the bottom of the problem and find the fragments of code responsible.

Fortunately, this is about to change. Recently, Ilya Grigorik from Google shared one of the most valuable insights into how crawlers work:

Interestingly, this tweet didn’t get nearly as much attention as I would expect.

So what does Ilya’s revelation in this tweet mean for SEOs?

Knowing that Chrome 41 is the technology behind the Web Rendering Service (WRS) is a game-changer. Before this announcement, our only option was to use Fetch and Render in Google Search Console to see our pages as rendered by the WRS. Now we can troubleshoot technical problems that would otherwise have required experimenting and building staging environments: all you need to do is download and install Chrome 41 to see how your website loads in that browser. That’s it.

You can check the features and capabilities that Chrome 41 supports by visiting Caniuse.com or Chromestatus.com (Googlebot should support similar features). These two websites make a developer’s life much easier.

Even though we don’t know exactly which version Ilya had in mind, we can find the Chrome version used by the WRS by looking at our server logs. It’s Chrome 41.0.2272.118.
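If you want to double-check this against your own traffic, one quick option is to scan your access logs for Google’s crawling and rendering user agents and pull out the Chrome version they advertise. Below is a minimal sketch in Node.js; the log path, the combined log format, and the exact user-agent substrings are assumptions, so adjust them to whatever your server actually records.

// Minimal sketch: list the Chrome versions advertised by Google's crawling
// and rendering user agents in an access log.
// Assumptions: a combined-format log at ./access.log, and user agents that
// contain "Googlebot" or "Google Web Preview" (adjust for your own setup).
const fs = require('fs');
const readline = require('readline');

const rl = readline.createInterface({
  input: fs.createReadStream('./access.log'),
  crlfDelay: Infinity,
});

const versions = new Set();

rl.on('line', (line) => {
  if (!/Googlebot|Google Web Preview/i.test(line)) return;
  const match = line.match(/Chrome\/(\d+(?:\.\d+){3})/);
  if (match) versions.add(match[1]);
});

rl.on('close', () => {
  console.log('Chrome versions seen in Google user agents:', [...versions]);
});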

It will be updated sometime in the future

Chrome 41 was released two years ago (in 2015), so it’s far behind the current version of the browser. However, as Ilya Grigorik said, an update is coming:

I was lucky enough to get Ilya Grigorik to read this article before it was published, and he provided a ton of valuable feedback on this topic. He mentioned that they are hoping to have the WRS updated by 2018. Fingers crossed!

Google uses Chrome 41 for rendering. What does that mean?

We now have some interesting information about how Google renders websites. But what does that mean, practically, for site developers and their clients? Does this mean we can now ignore server-side rendering and deploy client-rendered, JavaScript-rich websites?

Not so fast. Here is what Ilya Grigorik had to say in response to this question:

We now know the WRS’s capabilities for rendering JavaScript and how to debug them, which lets us troubleshoot and diagnose problems far more precisely. However, remember that not all crawlers support JavaScript crawling. As of today, JavaScript crawling is only supported by Google and Ask (and Ask is most likely powered by Google). And even if you don’t care about social media or search engines other than Google, keep in mind that even with Chrome 41, not all JavaScript frameworks can be indexed by Google (read more about JavaScript framework crawling and indexing).

Don’t get your hopes up

All that said, there are a few reasons to keep your excitement at bay.

Remember that version 41 of Chrome is over two years old. It may not work very well with modern JavaScript frameworks. To test it yourself, open http://jsseo.expert/polymer/ using Chrome 41, and then open it in any up-to-date browser you are using.

The page in Chrome 41 looks like this:

The content rendered by Polymer is invisible (meaning it wasn’t processed correctly). This is also a perfect example of how to troubleshoot potential indexing issues: the problem you’re seeing above can be solved if it’s diagnosed properly. Let me quote Ilya:

“If you look at the JavaScript error raised under the hood, the test page is throwing an error due to unsupported (in M41) ES6 syntax. You can test this yourself in M41, or use the debug snippet we provided in the blog post to log the error into the DOM to see it.”
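The snippet below is not Google’s exact debug snippet, just a minimal sketch of the same idea: catch uncaught JavaScript errors and print them into the DOM, so they show up in a Fetch and Render screenshot where no developer console is available. Keep it in ES5 syntax and load it before your other scripts so that Chrome 41 (and, presumably, the WRS) can run it.

// Not Google's exact snippet: a minimal sketch that surfaces uncaught
// JavaScript errors in the DOM so they are visible in a Fetch and Render
// screenshot. ES5-only on purpose; include it before your other scripts.
window.addEventListener('error', function (event) {
  var log = document.getElementById('js-error-log');
  if (!log) {
    log = document.createElement('pre');
    log.id = 'js-error-log';
    (document.body || document.documentElement).appendChild(log);
  }
  log.textContent += event.message + ' (' + event.filename + ':' + event.lineno + ')\n';
});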

I believe this is another powerful tool for web developers who want to make their JavaScript websites indexable. We will definitely expand our experiment and build on Ilya’s feedback.

The Fetch and Render tool is the Chrome v. 41 preview

There’s another interesting thing about Chrome 41. Google Search Console’s Fetch and Render tool is simply a Chrome 41 preview. The right-hand view (“This is how a visitor to your website would have seen the page”) is generated by the Google Search Console bot, which is… Chrome 41.0.2272.118 (see screenshot below).


There’s evidence that both Googlebot and the Google Search Console bot render pages using Chrome 41. Still, we don’t know exactly what the differences between them are. One noticeable difference is that the Google Search Console bot doesn’t respect the robots.txt file. There may be more, but for the time being, we’re not able to point them out.

Chrome 41 vs Fetch as Google: A word of caution

Chrome 41 is a great tool for debugging Googlebot. However, sometimes (not often) there’s a situation in which Chrome 41 renders a page properly, but the screenshots from Google Fetch and Render suggest that Google can’t handle the page. It could be caused by CSS animations and transitions, Googlebot timeouts, or the usage of features that Googlebot doesn’t support. Let me show you an example.

Chrome 41 preview:

Image blurred for privacy

The above page has quite a lot of content and images, but it looks completely different in Google Search Console.

Google Search Console preview for the same URL:

As you can see, Google Search Console’s preview of this URL is completely different from what you saw in the previous screenshot (Chrome 41). All the content is gone and all we can see is the search bar.

From what we’ve noticed, Google Search Console renders CSS a little differently than Chrome 41 does. This doesn’t happen often, but as with most tools, we need to double-check whenever possible.

This leads us to a question…

What features are supported by Googlebot and WRS?

According to the Rendering on Google Search guide:

  • Googlebot doesn’t support IndexedDB, WebSQL, and WebGL.
  • HTTP cookies and local storage, as well as session storage, are cleared between page loads.
  • All features requiring user permissions (like Notifications API, clipboard, push, device-info) are disabled.
  • Google can’t index 3D and VR content.
  • Googlebot only supports HTTP/1.1 crawling.

The last point is really interesting. Despite statements from Google over the last two years, Google still crawls using only HTTP/1.1.
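If you want to see which of these features the renderer actually exposes on your own pages, one low-tech option is to run a few capability checks and write the results into the page, so they appear in the Fetch and Render screenshot. The sketch below is purely illustrative; the element ID and the particular checks are my own, not anything Google publishes.

// Illustrative sketch: run a few capability checks and print the results
// into the page so a Fetch and Render screenshot reveals them.
// Place it near the end of <body>; the checks and element ID are arbitrary.
// Note: a feature can be present in the browser but still disabled or
// non-functional for the crawler, so treat this as a hint, not proof.
(function () {
  var checks = {
    indexedDB: typeof window.indexedDB !== 'undefined',
    webSQL: typeof window.openDatabase !== 'undefined',
    webGL: (function () {
      try {
        var canvas = document.createElement('canvas');
        return !!(canvas.getContext('webgl') || canvas.getContext('experimental-webgl'));
      } catch (e) {
        return false;
      }
    })(),
    localStorage: (function () {
      try { return typeof window.localStorage !== 'undefined'; } catch (e) { return false; }
    })()
  };
  var out = document.createElement('pre');
  out.id = 'capability-report';
  out.textContent = JSON.stringify(checks, null, 2);
  document.body.appendChild(out);
})();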

No HTTP/2 support (still)

We’ve mostly been covering how Googlebot uses Chrome, but there’s another recent discovery to keep in mind.

There is still no support for HTTP/2 for Googlebot.

Since it’s now clear that Googlebot doesn’t support HTTP/2, if your website supports HTTP/2 you can’t drop HTTP/1.1 optimization: Googlebot can only crawl using HTTP/1.1.

There were several announcements recently regarding Google’s HTTP/2 support. To read more about it, check out my HTTP/2 experiment here on the Moz Blog.

Via https://developers.google.com/search/docs/guides/r…

Googlebot’s future

Rumor has it that Chrome 59’s headless mode was created for Googlebot, or at least that it was discussed during the design process. It’s hard to say if any of this chatter is true, but if it is, it means that to some extent, Googlebot will “see” the website in the same way as regular Internet users.

This would definitely make everything simpler for developers who wouldn’t have to worry about Googlebot’s ability to crawl even the most complex websites.

Chrome 41 vs. Googlebot’s crawling efficiency

Chrome 41 is a powerful tool for debugging JavaScript crawling and indexing. However, it’s crucial not to jump on the hype train here and start launching websites that “pass the Chrome 41 test.”

Even if Googlebot can “see” our website, there are many other factors that affect your site’s crawling efficiency. For example, we already have proof that Googlebot can crawl and index JavaScript and many JavaScript frameworks. That doesn’t mean JavaScript is great for SEO, though. I’ve gathered significant evidence showing that JavaScript pages aren’t crawled even half as effectively as HTML-based pages.

In summary

Ilya Grigorik’s tweet sheds more light on how Google crawls pages and, thanks to that, we don’t have to build experiments for every feature we’re testing — we can use Chrome 41 for debugging instead. This simple step will definitely save a lot of websites from indexing problems, like when Hulu.com’s JavaScript SEO backfired.

It’s safe to assume that Chrome 41 will now be a part of every SEO’s toolset.


Reblogged 5 hours ago from feedproxy.google.com

Does Googlebot Support HTTP/2? Challenging Google’s Indexing Claims – An Experiment

Posted by goralewicz

I was recently challenged with a question from a client, Robert, who runs a small PR firm and needed to optimize a client’s website. His question inspired me to run a small experiment in HTTP protocols. So what was Robert’s question? He asked…

Can Googlebot crawl using HTTP/2 protocols?

You may be asking yourself, why should I care about Robert and his HTTP protocols?

As a refresher, HTTP is the basic set of standards that allows the World Wide Web to exchange information; it’s the reason a web browser can display data stored on another server. Work on the first version began back in 1989, which means that, just like everything else, the older HTTP protocols are showing their age. HTTP/2 is the latest version of the protocol, created to replace its aging predecessors.

So, back to our question: why should you, as an SEO, care about HTTP protocols? The short answer is that none of your SEO efforts matter, or can even be carried out, without a basic understanding of HTTP. Robert knew that if his client’s site wasn’t indexing correctly, the client would miss out on valuable traffic from searches.

The hype around HTTP/2

HTTP/1.1 is a 17-year-old protocol (HTTP/1.0 is 21 years old). Both HTTP/1.0 and 1.1 have limitations, mostly related to performance. When HTTP/1.1 was getting too slow and out of date, Google introduced SPDY in 2009, which became the basis for HTTP/2. Side note: starting from Chrome 53, Google decided to stop supporting SPDY in favor of HTTP/2.

HTTP/2 was a long-awaited protocol. Its main goal is to improve website performance. It’s currently used by 17% of websites (as of September 2017), and the adoption rate is growing rapidly: only 10% of websites were using HTTP/2 in January 2017. You can see the adoption rate charts here. HTTP/2 is getting more and more popular, and is widely supported by modern browsers (like Chrome and Firefox) and web servers (including Apache, Nginx, and IIS).

Its key advantages are:

  • Multiplexing: The ability to send multiple requests over a single TCP connection.
  • Server push: When a client requests a resource (say, an HTML document), the server can push the CSS and JS files it will need into the client’s cache, reducing network latency and round-trips.
  • One connection per origin: With HTTP/2, only one connection is needed to load a website.
  • Stream prioritization: Requests (streams) are assigned a priority from 1 to 256 so that higher-priority resources are delivered faster.
  • Binary framing layer: HTTP/2 is easier to parse (for both servers and clients).
  • Header compression: This reduces the overhead of HTTP/1.1’s plain-text headers and improves performance.

For more information, I highly recommend reading “Introduction to HTTP/2” by Surma and Ilya Grigorik.
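To make multiplexing and server push a little more concrete, here is a minimal sketch of an HTTP/2 server using Node.js’s built-in http2 module (available in recent Node versions); the certificate paths and the pushed stylesheet are illustrative. Servers like Apache, Nginx, and IIS expose the same ideas through configuration rather than code.

// Minimal sketch: an HTTP/2 server that pushes a stylesheet alongside the
// HTML response, using Node.js's built-in http2 module.
// Certificate paths and the pushed resource are illustrative.
const http2 = require('http2');
const fs = require('fs');

const server = http2.createSecureServer({
  key: fs.readFileSync('key.pem'),
  cert: fs.readFileSync('cert.pem'),
});

server.on('stream', (stream, headers) => {
  if (headers[':path'] !== '/') {
    stream.respond({ ':status': 404 });
    stream.end();
    return;
  }
  // Server push: offer /style.css before the client has asked for it.
  stream.pushStream({ ':path': '/style.css' }, (err, pushStream) => {
    if (err) return;
    pushStream.respond({ ':status': 200, 'content-type': 'text/css' });
    pushStream.end('body { margin: 0; }');
  });
  stream.respond({ ':status': 200, 'content-type': 'text/html' });
  stream.end('<link rel="stylesheet" href="/style.css"><h1>Hello over HTTP/2</h1>');
});

server.listen(8443);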

All these benefits suggest pushing for HTTP/2 support as soon as possible. However, my experience with technical SEO has taught me to double-check and experiment with solutions that might affect our SEO efforts.

So the question is: Does Googlebot support HTTP/2?

Google’s promises

HTTP/2 represents a promised land, the technical SEO oasis everyone was searching for. By now, many websites have already added HTTP/2 support, and developers don’t want to optimize for HTTP/1.1 anymore. Before I could answer Robert’s question, I needed to know whether or not Googlebot supported HTTP/2-only crawling.

I was not alone in my query. This is a topic which comes up often on Twitter, Google Hangouts, and other such forums. And like Robert, I had clients pressing me for answers. The experiment needed to happen. Below I’ll lay out exactly how we arrived at our answer, but here’s the spoiler: it doesn’t. Google doesn’t crawl using the HTTP/2 protocol. If your website uses HTTP/2, you need to make sure you continue to optimize the HTTP/1.1 version for crawling purposes.

The question

It all started with a Google Hangout in November 2015.

When asked about HTTP/2 support, John Mueller mentioned that HTTP/2-only crawling should be ready by early 2016, and he also mentioned that HTTP/2 would make it easier for Googlebot to crawl pages by bundling requests (images, JS, and CSS could be downloaded with a single bundled request).

“At the moment, Google doesn’t support HTTP/2-only crawling (…) We are working on that, I suspect it will be ready by the end of this year (2015) or early next year (2016) (…) One of the big advantages of HTTP/2 is that you can bundle requests, so if you are looking at a page and it has a bunch of embedded images, CSS, JavaScript files, theoretically you can make one request for all of those files and get everything together. So that would make it a little bit easier to crawl pages while we are rendering them for example.”

Soon after, Twitter user Kai Spriestersbach also asked about HTTP/2 support:

His clients had started dropping HTTP/1.1 connection optimization, just like most developers deploying HTTP/2, which was by then supported by all major browsers.

After a few quiet months, Google Webmasters reignited the conversation, tweeting that Google won’t hold you back if you’re setting up for HTTP/2. At this time, however, we still had no definitive word on HTTP/2-only crawling. Just because it won’t hold you back doesn’t mean it can handle it — which is why I decided to test the hypothesis.

The experiment

For months, as I was following this online debate, I kept receiving questions from clients who no longer wanted to spend money on HTTP/1.1 optimization. Thus, I decided to create a very simple (and bold) experiment.

I decided to disable HTTP/1.1 on my own website (https://goralewicz.com) and make it HTTP/2 only. I disabled HTTP/1.1 from March 7th until March 13th.

If you’re going to get bad news, at the very least it should come quickly. I didn’t have to wait long to see if my experiment “took.” Very shortly after disabling HTTP/1.1, I couldn’t fetch and render my website in Google Search Console; I was getting an error every time.

My website is fairly small, but I could clearly see that the crawling stats decreased after disabling HTTP/1.1. Google was no longer visiting my site.

While I could have kept going, I stopped the experiment after my website was partially de-indexed due to “Access Denied” errors.

The results

I didn’t need any more information; the proof was right there. Googlebot wasn’t supporting HTTP/2-only crawling. Should you choose to duplicate this at home with your own site, you’ll be happy to know that my site recovered very quickly.

I finally had Robert’s answer, but felt others may benefit from it as well. A few weeks after finishing my experiment, I decided to ask John about HTTP/2 crawling on Twitter and see what he had to say.

(I love that he responds.)

Knowing the results of my experiment, I have to agree with John: disabling HTTP/1 was a bad idea. However, I was seeing other developers discontinuing optimization for HTTP/1, which is why I wanted to test HTTP/2 on its own.

For those looking to run their own experiment, there are two ways of negotiating an HTTP/2 connection:

1. Over HTTP (unsecured) – Make an HTTP/1.1 request that includes an Upgrade header. This seems to be the method John Mueller was referring to. However, it doesn’t apply to my website (because it’s served via HTTPS). What’s more, this is an old-fashioned way of negotiating that isn’t supported by modern browsers. Below is a screenshot from Caniuse.com:

2. Over HTTPS (secure) – The connection is negotiated via ALPN (Application-Layer Protocol Negotiation), a TLS extension; HTTP/1.1 is not involved in this process. This method is preferred and widely supported by modern browsers and servers.
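If you want to check the second method yourself, you can open a TLS connection and see which application protocol the server agrees to via ALPN. Below is a minimal sketch using Node.js’s tls module; the hostname is just a placeholder.

// Minimal sketch: see which application protocol a server negotiates via
// ALPN. A result of 'h2' means HTTP/2 was agreed on for this connection.
// The hostname is a placeholder.
const tls = require('tls');

const socket = tls.connect(
  {
    host: 'example.com',
    port: 443,
    servername: 'example.com',
    ALPNProtocols: ['h2', 'http/1.1'],
  },
  () => {
    console.log('Negotiated protocol:', socket.alpnProtocol);
    socket.end();
  }
);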

A recent announcement: The saga continues

Googlebot doesn’t make HTTP/2 requests

Fortunately, Ilya Grigorik, a web performance engineer at Google, let everyone peek behind the curtains at how Googlebot is crawling websites and the technology behind it:

If that wasn’t enough, Googlebot doesn’t support the WebSocket protocol either, so your server can’t send resources to Googlebot before they are requested over an open connection. For Googlebot, supporting it wouldn’t reduce network latency or round-trips; it would simply slow everything down. Modern browsers offer many ways of loading content, including WebRTC, WebSockets, loading local content from a drive, and so on. However, Googlebot supports only HTTP and FTP, with or without Transport Layer Security (TLS).

Googlebot supports SPDY

During my research and after John Mueller’s feedback, I decided to consult an HTTP/2 expert. I contacted Peter Nikolow of Mobilio and asked him whether there was anything we could do to find a definitive answer regarding Googlebot’s HTTP/2 support. Not only did he help, Peter even created an experiment for us to use. Its results are pretty straightforward: Googlebot supports the SPDY protocol and Next Protocol Negotiation (NPN), and therefore doesn’t support HTTP/2.

Below is Peter’s response:


I performed an experiment that shows Googlebot uses the SPDY protocol. Because it supports SPDY + NPN, it cannot support HTTP/2. There are many cons to continued support of SPDY:

    1. This protocol is vulnerable
    2. Google Chrome no longer supports SPDY in favor of HTTP/2
    3. Servers have been dropping support for SPDY. Take NGINX as an example: from version 1.9.5, it no longer supports SPDY.
    4. Apache doesn’t support SPDY out of the box. You need to install mod_spdy, which is provided by Google.

To examine Googlebot and the protocols it uses, I took advantage of s_server, a tool that can debug TLS connections. I used Google Search Console Fetch and Render to send Googlebot to my website.

Here’s a screenshot from this tool showing that Googlebot is using Next Protocol Negotiation (and therefore SPDY):

I’ll briefly explain how you can perform your own test. The first thing you should know is that you can’t use scripting languages (like PHP or Python) for debugging TLS handshakes. The reason for that is simple: these languages see HTTP-level data only. Instead, you should use special tools for debugging TLS handshakes, such as s_server.

Type in the console:

sudo openssl s_server -key key.pem -cert cert.pem -accept 443 -WWW -tlsextdebug -state -msg
sudo openssl s_server -key key.pem -cert cert.pem -accept 443 -www -tlsextdebug -state -msg

Please note the slight (but significant) difference between the “-WWW” and “-www” options in these commands. You can find more about their purpose in the s_server documentation.

Next, invite Googlebot to visit your site by entering the URL in Google Search Console Fetch and Render or in the Google mobile tester.

As I wrote above, there is no logical reason why Googlebot still supports SPDY. The protocol is vulnerable, no modern browser supports it, and servers (including NGINX) are dropping support for it. It’s just a matter of time until Googlebot is able to crawl using HTTP/2. Just implement HTTP/1.1 + HTTP/2 support on your own server (your users will notice the faster loading) and wait until Google is able to send requests using HTTP/2.


Summary

In November 2015, John Mueller said he expected Googlebot to crawl websites by sending HTTP/2 requests starting in early 2016. We don’t know why, as of October 2017, that hasn’t happened yet.

What we do know is that Googlebot doesn’t support HTTP/2. It still crawls by sending HTTP/1.1 requests. Both this experiment and the “Rendering on Google Search” page confirm it. (If you’d like to know more about the technology behind Googlebot, then you should check out what they recently shared.)

For now, it seems we have to accept the status quo. We recommended that Robert (and you, readers, as well) enable HTTP/2 on your websites for better performance, but continue optimizing for HTTP/1.1. Your visitors will notice and thank you.


Reblogged 5 hours ago from feedproxy.google.com

Marketing Day: Instagram redesigns call-to-action bar, mobile-specific fraud & more

Here’s our recap of what happened in online marketing today, as reported on Marketing Land and other places across the web.

Please visit Marketing Land for the full article.

Reblogged 6 hours ago from feeds.marketingland.com

SearchCap: Facebook food ordering, Google Posts automation & machine learning

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Facebook food ordering, Google Posts automation & machine learning appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.

Reblogged 7 hours ago from feeds.searchengineland.com

What Is Whitespace? 9 Websites to Inspire Your Web Design

Empty space is not always wasted space.

In fact, when it comes to web design, it’s a best practice to give your content a little breathing room.

Today’s website visitors are content-scanners. They scroll quickly, skim posts, and get distracted by busy layouts trying to accomplish too much. The key to getting your visitors’ undivided attention is simplicity — and that starts with an effective use of whitespace.

In this article, we’ll take a brief look at why whitespace matters, what it means for conversion-driven web design, and how nine websites are using whitespace to lead their visitors towards the desired action.

What Is Whitespace?

Whitespace is the negative space in any composition: the unmarked distance between different elements that gives viewers visual breaks as they process a design, minimizing distractions and making it easier to focus.

Intentionally blank areas aren’t just aesthetically pleasing — they actually have a big impact on how our brains take in and process new material. Too much information or visual data crammed into a small, busy space can cause cognitive fatigue, and our brains have difficulty absorbing anything at all. It’s information overload at its very worst.

Why We Need Whitespace

To understand the importance of whitespace, think about how difficult it is for your brain to process an entire page from the phone book or white pages. All those columns of teeny tiny text get squished together into one indigestible chunk of information, and it can be a real challenge to find what you’re looking for.

While phone books are designed to display maximum information in minimum space, the majority of print layouts are created to be more easily understood — thanks to whitespace.

To illustrate how effective whitespace is at helping our brains process information in print, check out the example below from Digital Ink:

See the difference? The layout on the left uses the vast majority of available space, but it looks crowded and severe — not exactly something you’d feel comfortable staring at for a long time to read.

In contrast, the layout on the right uses wider columns and more distance between paragraphs. It’s a simple design shift that has a major impact on making the article look more approachable and readable.

In addition to making layouts easier to understand, whitespace can also place emphasis on specific elements, helping the viewer understand what they should focus on. Using whitespace to break up a layout and group particular things together helps create a sense of balance and sophistication.

Take a look at this business card example from Printwand:

The business card on the left does include negative space, but the elements are still crammed into one area, making the whole card look cluttered and unprofessional. The card on the right uses whitespace to a better effect, spacing the individual elements out so the composition is easier to make sense of.

When it comes to designing websites, whitespace is crucial — not only from an aesthetic standpoint, but also from a conversion optimization perspective. Using whitespace effectively can make your website more easily navigable, comprehensible, and conversion-friendly, directing users more smoothly to calls-to-action and encouraging them to convert.

In fact, classic research by Human Factors International found that using whitespace to highlight or emphasize important elements on a website increased visitor comprehension by almost 20%.

Just take a look at these two website layouts:

On the left, the call-to-action button has no room to breathe — it’s wedged between busy dividers and tightly packed text. There’s too much distraction around the button, making it difficult for visitors to focus on what matters.

On the right, the call-to-action has been padded with some much-needed whitespace. The button now appears to be a focal point on the page, encouraging visitors to stop and take notice.

You’ll notice that adding some whitespace around our call-to-action has caused some of the other content on the page to be pushed down — and that’s perfectly okay. Not everything has to be above the fold (the part of the website that appears before the user starts to scroll). In fact, designers shouldn’t try to stuff a ton of content before the fold of the page, since it will end up looking cluttered and overwhelming.

9 Websites Using Whitespace to Their Advantage

1) Shopify

The homepage for ecommerce platform Shopify has a simple objective: Get visitors to sign up for a free trial.

To direct users to this action, they’ve surrounded their one-field sign-up form with plenty of whitespace, minimizing distractions and ensuring visitors can’t miss it. The site’s main navigation is displayed much smaller than the form text, and placed out of the way at the top of the screen to avoid taking attention away from the central form.


2) Everlane

Whitespace doesn’t have to mean the complete absence of color or pictures — it means making sure page elements are generously and strategically spaced to avoid overwhelming or confusing your visitors.

To show off its latest clothing collection, fashion retailer Everlane opts for a minimal setup: the full-page background shows off a photograph of its “GoWeave” blazer, and a small, expertly placed call-to-action appears in the center of the screen, encouraging users to click and “shop now.” It’s a perfect example of leading users towards an action without being pushy or aggressive.


3) Wistia

Using whitespace strategically can be as easy as making sure your forms and call-to-action buttons are noticeably separated from the rest of your content. This simple change makes a huge difference in how your content is perceived. 

Wistia, a video platform, anchors its homepage with a friendly question and a drop-down form. The two central CTA buttons serve as the focal point of the whole page, and they’re given plenty of space to set them apart from the site’s main navigation and imagery.


4) Welikesmall

Digital agency Welikesmall proves that whitespace doesn’t have to be boring, empty, or even static. Their homepage displays a fullscreen demo reel of their recent video projects, cycling through a variety of exciting vignettes to immediately capture the visitor’s attention.

Full-screen video in any other context could seem busy and aggressive, but since the layout is designed with generous whitespace, it looks polished. With all the focus on the video background, the text is kept minimal. The agency’s logo appears in one corner, and a folded hamburger style menu appears in the other. Welikesmall’s slogan — “Belief in the Making” — is fixed in the center of the screen, along with a call-to-action button linking to the agency’s full 2016 demo reel.  


5) Simpla

This homepage from Simpla demonstrates the power that a relatively empty above-the-fold section can have. This simple, decidedly minimal homepage uses whitespace to urge users to keep scrolling.

Beneath the logo and navigation, a large portion of the site has been left unmarked. The top of a photo — along with a short paragraph of text and an arrow — invites visitors to keep reading to learn more about the company and their mission.

This unique use of whitespace not only looks sophisticated, but it strategically draws visitors further into the site. 

6) Harvard Art Museums

The Harvard Art Museums might be known for displaying centuries-old paintings, but their homepage is decidedly modern. The whitespace here provides the perfect backdrop for the featured art, making sure that nothing distracts from the pieces themselves. It’s about as close to a digital art exhibition as you can get.

The masonry-style layout gives the user a reason to keep scrolling, and also ensures that none of the images are crowded together. To maintain the minimal gallery aesthetic, the site’s navigation is completely hidden until the user hovers their mouse towards the top of the page.


7) Burnkit

When working with whitespace on your homepage, you’ll have to make some tough decisions about what’s important enough to display, since there’s less room for a pile of cluttered content. This design agency shows us that you can display a wide variety of content in a minimal layout, without squishing things together and muddying the composition. 

Burnkit‘s homepage features blog content, key excerpts from the agency’s portfolio of client work, and behind-the-scenes looks at the agency’s culture. So how did they manage to fit so much onto one page without overwhelming the visitor? Whitespace. Lots and lots of whitespace. Each article is given generous padding, and the user can keep scrolling to reveal new material. 


8) Medium

Medium cleverly uses whitespace to get readers to keep scrolling further down the page by enticing them with notes showing how many people have “clapped” for a post, how many people have commented on it, and what related content is next on the docket for them to read.

The whitespace pushes the reader to look at the center column of their screen, featuring a compelling title and cover photo — and uses social proof to show readers why they should keep scrolling.


9) Ahrefs

Ahrefs‘ website is another example of whitespace that decidedly isn’t white, and its homepage uses both whitespace and text formatting to focus the visitor’s eyes on the glowing orange button — to start their free trial.

In bold, large font, Ahrefs offers its software’s value proposition, and in smaller, center-justified text, it uses whitespace to guide the viewer to click the CTA button. Smart, right?


Reblogged 14 hours ago from blog.hubspot.com

Facebook officially rolls out food ordering as part of longer-term commerce evolution

Facebook has been building a range of commerce tools and capabilities, many of which are directed at local and offline transactions.

The post Facebook officially rolls out food ordering as part of longer-term commerce evolution appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.

Reblogged 17 hours ago from feeds.searchengineland.com

8 of the Best Social Media Analytics Tools of 2017

What are your top performing Tweets? What time of day do your Instagram posts get the most engagement? Is your Facebook marketing strategy translating to traffic and leads? If you’re not using social media analytics tools, you’ll probably have a hard time answering these questions.

By now, most marketers understand the need for tools to measure their efforts. But as social media marketing has grown, the number of tools available to analyze your efforts has skyrocketed. At one point, you could probably name all the major social media analytics tools out there. Fast forward to today, and it seems like there’s a new tool released every day.

With all the options available, the question is what is the best social media analytics tool for your brand? Whether you want data on the performance of a specific campaign, Instagram and Snapchat stories or an overview of all your profiles, we’ve got you covered. Here are eight social media analytics tools to add to your arsenal:

1. Sprout Social


Obviously, we couldn’t put together a list of the best social media analytics tools without mentioning our own. Every brand should have a dedicated social media management tool. With Sprout’s social media analytics, you can measure performance across Facebook, Twitter, Instagram and LinkedIn, all within a single platform. Having all of your analytics in one place makes it easier to track and compare your efforts across multiple profiles and platforms.

  • Networks: Facebook, Twitter, Instagram, LinkedIn and Google+
  • Price: Starts at $59/month (try a free 30-day trial)
  • Recommended for: Any brand that manages multiple social media profiles across multiple networks. If your brand is active on social media, a tool like Sprout is a must-have.

2. Snaplytics


Of all the major social networks, Snapchat gives brands the least amount of data on performance. One reason for this is the platform itself isn’t as robust as Facebook, Twitter and others. Aside from viewing and leaving comments, there isn’t a lot of data to be collected. But luckily, there are third-party tools you can use to get more in-depth Snapchat analytics. The most popular of which is Snaplytics.

Snaplytics gives you data on the performance of your snaps, audience growth and more. Another unique feature of Snaplytics is that it also gives you insights on your Instagram Stories as well.

  • Networks: Snapchat and Instagram
  • Price: Contact for pricing
  • Recommended for: Brands that want to measure their Snapchat performance.

Snaplytics Alternatives

Snapchat analytic tools are still fairly new, so there aren’t a lot to choose from. But it’s always nice to have options, so here are some Snaplytics alternatives:

  • Delmondo: Delmondo is a social video analytics tool that provides performance data on your snaps, campaign reports and even lets you measure your Snapchat content against other channels.
  • Storyheap: Storyheap is a newer social media analytics tool that provides Snapchat and Instagram Story analytics, as well as the ability to upload stories.

3. Iconosquare


Iconosquare is a social media analytics tool specifically for Instagram. One of the standout features that separates Iconosquare from other tools is that, in addition to analyzing your regular photos and videos, it also gives you insights into Instagram Stories. With higher-level plans, you can get influencer analytics as well.

  • Networks: Instagram
  • Price: Starts at $9/month
  • Recommended for: Brands heavily invested into Instagram marketing.

Iconosquare Alternatives

Iconosquare is one of many Instagram analytics tools on the market. If you want to explore some other options before making your decision, here are some Iconosquare alternatives to consider:

  • Later: Later is another tool that focuses specifically on Instagram. In addition to publishing and organizing your feed, Later also provides analytics on your followers, engagement and more.
  • Instagram Insights: If you want a free social media analytics tool for Instagram, the app’s native analytics for businesses is a good start. You can see your top posts, optimal posting times, audience demographics and more.

4. Buzzsumo


Buzzsumo is different than the other social media analytics tools on our list. Instead of analyzing your brand’s individual social media performance, Buzzsumo looks at how content from your website performs on social media. For instance, if you want to see how many shares your latest blog post received on Facebook and Twitter, Buzzsumo can provide you with that data.

Social media is one of the top ways content gets spread. So it’s important to understand what resonates the most with your audience. Buzzsumo will not only show you the number of shares for each piece of content, but it also shows you which type of content performs best on each network based on length, type, publish date and more.

  • Networks: Facebook, Twitter, LinkedIn, Pinterest and Instagram
  • Price: Starts at $99/month
  • Recommended for: Brands that blog and distribute their content on social media.

Buzzsumo Alternatives

It’s hard to beat the data and features provided by Buzzsumo. But here are some alternative tools with similar functionality:

  • Epicbeat: Epicbeat is a content discovery tool from Epictions. You can use it to discover content trends, find top performing content and more.
  • Ahrefs: Ahrefs is technically an SEO tool, but one feature they’ve added is their content explorer. You can enter a keyword and Ahrefs will show you the most shared content on that topic. While not as robust as Buzzsumo, it’s a good option when paired with the rest of Ahrefs’ analytics.

5. Tailwind


While Instagram and Snapchat are currently the most talked about players in the visual social media landscape, Pinterest is still very active. And just like with any other social network, you need to measure your performance. Tailwind is arguably the most popular third-party Pinterest analytics tool.

Through Tailwind, you can track trends in followers and engagement, analyze your audience, and, at certain plan levels, get some Instagram analytics as well.

  • Networks: Pinterest and Instagram
  • Price: Starts at $9.99/month
  • Recommended for: Brands that use Pinterest as one of their top marketing channels.

Tailwind Alternatives

If you’re looking for other tools that provide Pinterest analytics, here are some alternatives to Tailwind:

  • Pinterest Analytics: If you currently don’t use anything to measure your Pinterest efforts, this is a good place to start. If Pinterest’s native reporting doesn’t provide you with enough data or you want more, then explore other paid options.
  • Viralwoot: Viralwoot is a complete Pinterest marketing tool. You can see how influential your pins are, data on your boards (including group boards) and more.

6. Google Analytics


While it’s not technically a “social media analytics tool,” Google Analytics (GA) is one of the best ways to track social media campaigns and even help you measure social ROI. You likely already have GA set up on your website to monitor and analyze your traffic. But did you know you can access and create reports specifically for social media tracking?

For instance, you can see how much traffic comes to your website from each social network, or use UTM parameters to track specific social media campaigns.
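As a quick illustration, a UTM-tagged link is just your normal URL with a few extra query parameters that Google Analytics reads to attribute the visit. Here is a minimal sketch in JavaScript; the domain and parameter values are made up.

// Minimal sketch: build a UTM-tagged link so visits from a social campaign
// are attributed to it in Google Analytics. All values are illustrative.
const url = new URL('https://www.example.com/landing-page');
url.searchParams.set('utm_source', 'twitter');
url.searchParams.set('utm_medium', 'social');
url.searchParams.set('utm_campaign', 'fall_launch');

console.log(url.toString());
// -> https://www.example.com/landing-page?utm_source=twitter&utm_medium=social&utm_campaign=fall_launch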

  • Networks: All
  • Price: Free
  • Recommended for: Any brand with a website.

Google Analytics Alternatives

While Google Analytics is the standard for analytics (and our personal recommendation), there are some alternatives:

  • Adobe Analytics: Adobe Analytics is an enterprise solution, primarily for brands that need advanced segmentation and very specific sets of data.
  • StatCounter: StatCounter is on the other end of the spectrum. It’s a very simplified website analytics tool that’ll show you information on your traffic including basic info on social media traffic.

7. ShortStack


Have you ever run a social media contest before? Did you stop at picking a winner, or did you take the time to analyze how the contest went? ShortStack is a social media contest app that also provides performance analytics. Social media contests can be great for growing your following quickly, but if you’re not careful you could wind up just giving away free stuff with nothing to show for it.

By analyzing your contest’s performance with a tool like ShortStack, you’ll be able to see engagement metrics and identify which types of contests work best with your audience.

  • Networks: Facebook, Instagram and Twitter
  • Price: Free. Paid plans start at $29/month
  • Recommended for: Brands that regularly run social media contests.

ShortStack Alternatives

Although ShortStack is one of the most popular tools to analyze social media contests, it’s not the only one on the market. Here are some ShortStack alternatives to explore:

  • Gleam: Gleam is a social media campaign platform. It allows you to do lead generation, run campaigns and analyze all your efforts.
  • Woobox: Woobox allows you to run a variety of types of social media contests, giveaways, coupons and more.

8. TapInfluence


With influencer marketing becoming one of the most commonly used social media tactics, there’s a growing need for analytics tools to measure your efforts. TapInfluence is a complete influencer marketing platform that lets you research potential influencers you want to work with, as well as track campaign performance.

For brands that only work with influencers every now and then, a tool this robust might not be necessary. But if influencer marketing is a key part of your social media marketing strategy, then you need a dedicated tool to manage and analyze your campaigns.

  • Networks: Pinterest, LinkedIn, Facebook, Twitter, Instagram and YouTube
  • Price: Contact for pricing
  • Recommended for: Brands running large-scale influencer marketing campaigns.

TapInfluence Alternatives

As we mentioned, TapInfluence is more of an enterprise level option. If you don’t need something of that size, or just want to explore other options, here are some TapInfluence alternatives to consider:

  • Traackr: Traackr is an influencer marketing platform that allows you to manage everything from the influencers you work with to the specific campaigns you’re working on. Its analytics allow you to benchmark your efforts and track all your initiatives.
  • Influencity: Influencity allows you to measure your influencer marketing efforts in real time. The good thing about Influencity is that it focuses on measuring the success of your influencer marketing campaigns against your main KPIs, such as engagement, reach, interactions, and more.

What’s Your Favorite Social Media Analytics Tool?

Like we mentioned earlier, there are several social media analytics tools out there. Our list just scratches the surface. What are some of your favorite social media analytics tools that make reporting and analyzing data easier?

This post 8 of the Best Social Media Analytics Tools of 2017 originally appeared on Sprout Social.

Reblogged 21 hours ago from feedproxy.google.com

215: Simplify Your Business and Make More Money Blogging

Ways You Can Simplify Your Business and Increase Your Blogging Profitability

Today, I want to share two big lessons I learned this year at our Australian ProBlogger events. They were lessons I think apply to many aspects of blogging and online business.

It’s all about simplifying what you do while making more profit.

I’m heading to Dallas for our Success Incubator event and to speak at FinCon in a few days’ time. So I’ll be taking a couple of weeks off the podcast to travel and focus on the event attendees as much as possible.



In the meantime, dig into the archives. There are now 215 to choose from.

Recommended Further Listening for the Next Couple of Weeks:



Full Transcript

Hi there. My name is Darren Rowse. Welcome to Episode 215 of the ProBlogger Podcast. ProBlogger is a blog, a podcast, event, job board, and a series of ebooks all designed to help you as a blogger to grow a profitable blog. You can learn more about ProBlogger over at problogger.com.

Now in today’s lesson, I want to share two big things that I learned at our Australian ProBlogger events this year. They were lessons that really apply to business as a whole, but I think they’re particularly applied to many aspects of blogging and online business. I guess really the theme of today’s show is to think about simplifying what you do whilst also increasing your profit because both of the lessons that I’m going to talk about today do exactly that; simplifying what you do, taking some of the complexity out of what you do, but also increasing profit.

Now before I get into the lessons today, I just want to share I’m heading off to Dallas later this week for two events, the Success Incubator event, the ProBlogger event that we’re running in Dallas, and also to speak at FinCon. I’m doing the keynote there. I’ll be taking off to Dallas in a couple of weeks time. I’m looking forward to meeting many of you at those events. There still are a few tickets left for the Success Incubator event, it’s a one and a half day event with people like Pat Flynn and Kim Garst and Rachel Miller who many of you will be familiar with from previous episodes of this podcast.

You can go to problogger.com/success to get any last tickets that may still be available. There’s also a virtual pass there which is pretty affordable. You get plenty of teaching with that.

I’m heading off to that event in a few days time and while I’m away, I am going to be pressing pause on this podcast. Just wanted to let you know that for the next couple of weeks, there won’t be episodes, highly unlikely that there will be episodes. I may chime in and suggest some previous ones to listen to, but there’s plenty in the archives to dig back into. I will suggest a few episodes at the end of today’s show that you might find useful, particularly practical episodes that we’ve done in the past. Dig around in the archives and I look forward to getting back with you late in October, probably early November.

You can get all of the details of our events and I will link to all the podcasts that I recommend you dig back into over on our show notes today at problogger.com/podcast/215.

Okay, so let’s get into today’s show. The lessons I learned this year were from our event. As I’ve thought about it, I’ve realized that these are lessons that I’ve been learning over the year in other areas as well, and I’ll touch on some of those towards the end. But just to give you a little bit of the backstory, the ProBlogger event, for those of you who haven’t been, we’ve been running it since about 2010. This makes it our seventh year of running the event. Since we ran the first event back in 2010, the event has evolved a lot. And I’ve told the story of that evolution in previous episodes. Back in that first event, it was a very simple event. It was one day, one stream, so we were all in the room all day. I think it was 120 or so people there. We had five or six speakers and really it was very simple. We didn’t add in extra parties, it was just hastily organized and as a result very simple.

Over the years, it evolved from something very simple into something that got quite complicated. We were getting, in our biggest year, I think up towards 700 attendees and speakers at the event, so it was getting quite large. But it also had lots of moving parts. We added in sponsors, we did two days instead of one day, then we added in an extra half a day before it, and some extra stuff at the end. We had five tracks, five different rooms of sessions running multiple at that same time. We had 40 or so speakers one year. It was very complicated.

It was great on many levels. Every year, our attendees told us that they loved it and it was the best event that we’d ever run. As a result of that, we felt driven, or I felt driven, to keep adding more and more to it. I’m a people pleaser. I just wanted to keep making it the best event ever, I wanted to make it more impressive, more valuable to people. So we added more sessions, we added parties, we added workshops, we added more speakers. We added teepees one year, which we had our party in. I drove in on a Segway one year. It got more and more complicated. We had more and more bells and whistles, more and more sparkle.

But all of this extra stuff came at an expense. It was beginning to take over my life, it was beginning to take over my business. The amount of time and energy that we were putting into this event was enormous, it was taking 12 months to plan. In fact, some years we were thinking about the next year’s event before we had even done this year’s event, so it was taking longer than a year.

The other factor was that whilst it was making some profit, the amount of time that we were putting in versus the profit that was coming out, it really didn’t compare. It was profitable on paper but in terms of the amount of effort we were putting in, it wasn’t particularly profitable. And this was partly because we weren’t… well I felt we weren’t able to charge as much as some other conferences. Many of our attendees were new to blogging, or they were mums and dads doing their blogs on the side while they’re looking after kids. And with travel to get to the event, it was a big ask. And so I felt really like I wanted to keep it as affordable as possible.

And so the model for the event, in terms of the business model, was that we actually charged less for the tickets than it cost us to put the event on. And we subsidized the tickets and took our profit out of getting sponsors into the event. Now this worked really well some years where we were able to land some big sponsors and we got some great sponsors who added a lot of value and paid us to access our audience. But other years, it was harder to get those sponsors. And so it was a bit of an up and down rollercoaster ride. And it was a lot of work working with sponsors at that kind of level. That was an area where we’re putting in enormous amounts of work and it was quite stressful as well.

The event was dominating our time, it wasn’t really the most profitable thing that we do, and we realized also that it was only really serving a small segment of our audience being an event for Australians whilst our audience is very global. And we realized that there was so many of you listening to this podcast, it just wasn’t feasible for you to get to our event, even though a few did fly in from overseas. And so after 2016’s event, I did a lot of soul searching, my team did a lot of soul searching, and we really considered carefully how we moved forward with the event. I realized that we just couldn’t keep going in the direction that we were going by adding more and more value in.

To be honest, I very nearly pulled the plug on the events. I almost stopped doing events altogether. But at the same time I had this little nagging feeling that events were also one of the best things that I did. I enjoyed it incredibly and I could see that it was having a big impact upon the people who were coming. So rather than giving up on doing the ProBlogger event, I decided we needed to evolve what we do as an event. And to do that, we really needed to simplify what we were doing and get back to the basics. I guess return to what we did at that very first event.

We began to dream of a simpler event. The simpler event that we came up with, we sat as a team and really wrestled with this, but we came up with let’s go back to a single day event, let’s go back to a single stream event, everyone in the one room. Let’s strip back those 40 or 50 sessions that we had available to attendees, let’s just strip it back to five or six core sessions on the core things that ProBlogger stands for. In those 40 or so sessions that we were running, we were doing really interesting stuff but it wasn’t our core teaching.

Let’s strip back having sponsors, and add in some extra profitability through other means – through decreasing our expenses but also building in a little bit more in terms of what we were charging as well to people. So that’s what we did. We designed this event. It was significantly less expensive to run because we only had six speakers instead of 40. We weren’t flying in 40 or so speakers and putting them up in hotel rooms. We had a smaller venue because we only needed one room rather than a hotel with lots of different rooms. Really, it cut down our cost in terms of things like audio and video and all of that type of thing. No more teepees, no more Segway.

We really pulled back in many regards. We simplified things and we did it for our own benefit, really, in terms of organizing the event. But it had some unexpected benefits which I’ll talk about in a moment.

This new format of event felt right. But it also felt risky. I lost a lot of sleep in the lead-up to putting the tickets on sale and running the event. My worry was that our past attendees might feel like they were missing out on some of what we previously offered because we were pulling things out. I was pretty stressed about doing that. But at the same time I felt it was going to allow us to spend more time on other projects, it was going to be a more sustainable model, and it was something I needed to do.

There were two other things that we tried as part of what we were doing as well, which I’ll briefly touch on. Firstly, we wondered when we saw this simple event whether we’d designed something that could be run and reproduced in different places. We often talk on ProBlogger about repurposing your content, and I began to wonder what can we do with this event. Could we repurpose this event? It was a simple event where we had almost built a product, a formula for an event. ‘Could we do the event more than once?’ was an idea that I came up with.

We began to think about could we do it one on one weekend, one on another weekend in different cities to make it more accessible to our attendees, to reduce some of their expense, which might get more people there. We decided to run it over two consecutive weekends, we did it in Brisbane and in Melbourne here in Australia, and really had the idea that maybe we could even reproduce the event in more places as well, maybe even in other countries in future years.

The last big change that we made this year was to offer masterminds – an extra day for those who wanted a more intimate, higher-level, more personal, more interactive experience. It was a smaller group, and we knew it wouldn’t appeal to a large percentage of our audience, but could we offer this higher-value, premium experience on top for our attendees? This is something we’d actually been asked for for years, ever since the first year I ran the event. It was always something I’d held off on, because I knew I’d have to significantly raise the price and charge a lot more to be able to run that type of event. It would also mean quite a bit more expense, having speakers there to really do that one-on-one stuff.

I decided, ‘Okay, I’ve been asked for this, the demand’s there, maybe we need to give it a go’. And we decided to add the mastermind day into both of the cities. So day one was everyone all in together, that cheaper event, single stream, larger event, less personal but still valuable. Then, the mastermind event for day two, more intimate. This all felt really risky to me. I worried a lot. I lost some sleep in the lead-up to it all. But the results were fantastic, and I really am grateful that I took that leap and that my team went with me with this as well.

The events were a few months ago now, but they were one of the best things that we’ve done over the last year. The planning of the events was so much simpler – we designed the content very quickly, we locked in our speakers very quickly, we booked venues very quickly, and we released the tickets and got it all out there very quickly. Not having the sponsors cut down a massive amount of work. Preparing for the event was a lot less work, and it enabled us to then move on to other things within the business.

Running the event was so much simpler. We came away from the first event nowhere near as tired, and we also felt like we were able to pay a lot more attention to our attendees. It took a much smaller team to run the event and we were more present with the audience.

The only tough part of the event was really on a more personal note. Unfortunately my father-in-law passed away the day before that first event, which was a tough time for the family, and it was, I guess, a bit of an emotional rollercoaster for me personally. I’m not sure how I would’ve gotten through the event if it had been one of the bigger, previous events. Having a simpler event certainly took less of a toll on me. Despite that setback, on a personal level the event was much more of a pleasure to run, if I can say that in the midst of a tough time. Attendees’ feedback was really positive.

We did get some previous attendees who mentioned in their feedback that they definitely missed some of the sparkle of previous years, but over half of our attendees were actually first-timers. They had nothing to compare it to, I guess. I was worried that by stripping back the amount of choice in our sessions – going from 40 or so sessions to six – there might be complaints about that.

Interestingly, even amongst our previous attendees, the overwhelming feedback was that people actually liked having to make fewer choices. This was a massive lesson for me. We simplified the event for our benefit as a business, but it actually benefitted our attendees. What we realized is that in previous years we’d created an event that for some of our audience was quite stressful and overwhelming to attend, and they really enjoyed the stripped-back, simplified event. I think this is a big lesson, and it’s something I’ll talk about in a moment or two as well. I lost count of the number of people who told me they enjoyed the simple event. Whilst it certainly didn’t suit everyone, it worked very well.

I guess the big lesson for me was that for years I felt like I needed to add more and more and more into the event, but in this case I learned that less is more. Whilst we made the event simpler for our own benefit, it really benefitted the audience. They were less stressed out.

On reflection, I think maybe we stripped things back a little bit too much, and we would probably add a little bit more in – a little bit of that sparkle back – over the coming years if we continue to go forward with this event. But I think we are on the right track.

The other two changes that I mentioned went really well too. Creating an event format that could be reproduced or repurposed in different cities worked well. I’m not sure whether we’ll continue to do that in future years, but it certainly taught me that an event can be repurposed. Creating a simple structure that can be repurposed is something that we could do again.

Lastly, the masterminds. They went off. Wow, they were my highlight personally. They sold out, so there was demand there even at that significantly higher price. Secondly, they ran really well. The overwhelming feedback from mastermind attendees was really positive. We saw people taking action at the event that paid for what it cost them to attend. There were people at the event who were creating courses and products, and a week later they had already made more than what they paid to attend. People took action, and that was probably the best thing for me. They loved the intimacy, the access to speakers, the networking, and we’ll definitely be doing more masterminds in future. I personally loved having that more intimate experience with attendees as well. Again, we’ll evolve the masterminds, but it was a big lesson for me.

The two big lessons: these are lessons that apply if you’re running events, but I think they also really apply to blogging, and I’ll tie them back to blogging in each case. The first lesson: simple is good, less is more. Sometimes, as product creators, as bloggers, we feel compelled to add and add and add when it comes to value. And ‘value’ I put in italics, I guess. We feel like we want to add in more value, more features, more bonuses in the products we create and in what we do as bloggers. And we do it because we genuinely want to provide as much value as possible. We think it will benefit our readers to add in more. We think it will also make our products more attractive to people if there are more features, more bells and whistles. Maybe people will be more attracted to what we do.

But in doing that, sometimes by adding in extra, we create complexity. Our products can end up feeling overwhelming. They can also end up feeling unfocused, and this is one of the things I realized about our event. Our first event was about how to make money blogging, but we’d actually built an event that was more about how to take photos, how to do social media, and some of these extraneous things – which are important for bloggers, but we’d lost some of that focus by adding in and adding in and adding in. By adding in the extra, we’d actually created something that was stressing out some of our attendees as well.

Sometimes we end up putting out more and more and more, and we overwhelm people; we create complicated products and complicated blogs. But we’re also putting in more time and expense that really isn’t needed. The big question I came out of this event with was, ‘What else can I strip out of what I do? What other areas in my business have become complicated?’ It’s very easy for a business to evolve and become complicated in many different areas. I’ll talk about some of those in a moment.

What can you strip out of what you do? That, I guess, is the big challenge. We’ve actually been experimenting in a number of ways. I think simplification can relate to blogging in many different ways. Let me just touch on a couple.

Firstly, content. This year on ProBlogger, we’ve really simplified our content. I know some of you have noticed this. A year ago, we were producing upwards of seven, sometimes up to ten, pieces of content every week. I was getting emails from readers saying, ‘That’s too much, I can’t read it all, I can’t consume it all. I’m feeling stressed by the amount of content that you’re producing.’ So we really stripped it back. Instead of ten pieces of content every week, we now do a podcast, two blog posts, a live video, and an email. That’s five pieces of content every week, and the email is really a summary of the other four, so it’s really four main pieces of content every week.

It’s simpler to consume, I hope, for you, but it’s also simpler to produce. In doing that, we’ve reduced our expenses and the amount of time we put in, and we’ve been able to increase the quality of what we do as well, which is always a good thing. It’s led to no dip in traffic, but it’s increased the engagement that we’ve had around each piece of content. Content is one area you can simplify.

Community. This year, again on ProBlogger, we simplified our approach to community. We really focused in our efforts on one area, our Facebook group. Rather than trying to provide community in lots of places, we’re encouraging anyone who’s a part of the ProBlogger audience to join our ProBlogger community Facebook group and to interact in the one place. In that group, we’ve tried to simplify things as well. Those of you who joined that group in the early days knew that it was a pretty noisy place and we’ve simplified it. We’ve pulled it back and we’ve asked you only to share tips and ask questions, not do anything else. We’ve built a rhythm for the week as well, we do different things on different days. Simplifying what is happening within that community has helped as well.

Simplify content, simplify community, simplify monetization – and if you’ve got products, you can simplify those too. You can probably already see some of this in what I’ve said before. Obviously, we did this this year with our event; we pulled things out of that product. But you can do the same thing with other types of products that you offer as well.

I think back to a product we used to offer at ProBlogger, which was our membership site a few years ago. In that membership site, we had weekly calls, I did weekly teaching, we had a forum, we had deals of the week, we offered plugins – we offered a lot of bits and pieces within that community. Again, I wanted to add in as much as possible. I wanted to make it as valuable as possible, I wanted to add in extra features. But in doing so, it created a lot of work for my team and it also became quite overwhelming. As a result, those of you in the audience who were part of it weren’t engaging in that community as much as you could have been. I realized that I’d created this beast that was hard to sustain from my end, but that also wasn’t being fully utilized by others.

My friends who have the most successful membership sites I’ve come across in most cases offer something that is very stripped back. They don’t offer loads of new content every week, they don’t offer forums with hundreds of threads – they offer very simplified things. They offer a little bit of high-quality content, they have very focused areas of community, they offer a little bit of coaching and personal access, they keep things minimal, they keep things focused. Again, you can simplify the products that you create, the monetization that you do, and then the systems that you have as well.

It’s very easy as a blogger for your systems and what you do to evolve and become quite complicated. For example, I know bloggers who have very complicated social media sharing systems. They share 20 times an hour on Twitter. In fact, I’ve got one friend, a podcaster, and recently I was looking at what he does on Twitter. He tweets every two minutes. It’s not him, of course, it’s automation. It’s evolved to the point where he’s just being very noisy, and maybe it’s a little out of control. I think less could be more in that particular case because I, for one, have muted his tweets. I’m not actually engaging with him at all anymore because there’s just too much going on. Less can be more, and it can be ‘less can be more’ in many different areas of your business. ‘What can you simplify?’ I guess is the question that I have for you from that first lesson: less can be more.

The second lesson that I want to talk about – one that I learned at this event and that I think really does apply in many ways to business in general, but particularly to blogging and online business – is that a certain percentage of your audience is going to be willing to pay a higher premium for more. As I mentioned before, I’ve always kept the prices for our events very low – below what it actually cost us to put them on – and we made our profit from sponsors. This was to make our event more accessible. On that front, I’m really proud of what we’ve done. I know that there are people who attend our events because they are so much cheaper. Every year, we hear from people saying, “This event is four times cheaper than other industry events I go to.”

I’m proud of that on some levels, but it has also become an increasingly risky move for my business, and it’s not really sustainable. I know that it’s risky. If my business goes under because of it, then charging our attendees less ends up being a disservice to them. Keeping our prices low was a risky move and something that wasn’t sustainable, but it also didn’t allow us to fully serve our audience. Our audience have been asking for more – they want more personal, more interactive experiences – and we’ve not been able to afford to offer that because we’re not charging as much.

This year, we didn’t actually put our prices up. But because we reduced our expenses and reduced the length of that first day of the event, we were able to increase our profit margin on our tickets as well. In essence, we gave our attendees less but charged them the same – in effect, I guess, putting our prices up a little. Also, by adding in that premium-level product, we offered something at a significantly higher price – I don’t remember exactly how much higher, but I think it was four or five times what they might’ve paid in previous years – to attend that mastermind. Our margin grew in that regard. As I said before, I was really nervous about doing that, about having that premium-level product at that higher price point. But I guess what I learned is that it was well worth doing.

One of our speakers this year was James Schramko. He’s got a business called SuperFastBusiness. He did a video recently on his Facebook page where he said, “Ten percent of your audience will pay ten times more for what you offer.” I’m not suggesting that we all just increase our prices tenfold, but it’s food for thought, isn’t it? If ten percent of your audience is willing to pay ten times more, that means you’re leaving some money on the table – I’m leaving some money on the table. I was really worried about offering that premium type of product, but what I realized is that there was a significant proportion of our audience who wanted more, and they were willing to pay for it.

Over ten percent of our attendees this year ended up coming to the mastermind – in fact, it was closer to 20%. By significantly increasing the price for the masterminds, I learned that a significant proportion of our attendees could afford a higher price and were willing to pay it if I could offer something of extra value.

Really, this for me is the key. What can you add to what you offer? What can you add to your products to make them premium-level products? Not everyone is going to take that offer, and that’s totally fine – they will continue to buy your lower-priced products. But there is a proportion of your audience who would be willing to take the extra step if it’s valuable. Really, that’s the key: it’s got to be valuable. I think our masterminds proved this year that that was the case. As I said before, we saw people taking action at the masterminds, making money at a higher rate, and it paid for them to attend.

I know masterminds are going to be a part of what we offer going forward. In fact, if anything, I think we’ll expand them from one-day events to longer ones, as our attendees actually want more of them – they want a longer, more intense, more immersive experience as well.

How does this particular lesson apply to blogging? I think it can apply in a few different ways. If you are monetizing with a product – an ebook or a course or something else – what could you add to make it a premium-level product? I’m not suggesting you just put your prices up, although maybe you could do that too. But what could you add to make it into a premium-level offering?

If you’re selling an ebook, what could you add? Could you add some bonus videos? Could you add some printables? Could you add some access to you personally in a coaching package? Could you add access to a private Facebook group? You might already have the thing that you could add, or you might need to create it. In most cases, something could be added to make it an upsell I guess, to make it a premium level offering.

If you don’t have products, you could take this same principle and apply it in other areas as well. For example, if you’re doing affiliate promotions, maybe you should consider throwing the occasional higher-price-point product into the mix of things that you promote. We’ve done this on Digital Photography School. We typically promote ebooks or courses that may be $20 to $50 as a price point. That’s a sweet spot for our audience – they like to buy products from around the $20 mark up to $50.

But occasionally, what we’ve done over the last couple of years is promote very comprehensive courses that have sold for over $200 – up to ten times the price of the $20 products. Whilst only a small percentage of our audience buy those products, you don’t have to sell too many at that kind of price point to make a pretty decent profit. So maybe mix up the types of products that you promote and promote different price points as well.

Alternatively, if you’re promoting physical products on Amazon or some other store, when you promote a budget product, maybe put a premium product alongside it as well. On Digital Photography School we quite often review lenses. We might review a budget lens for a camera – maybe a $200 lens, very affordable. But we know there are other, more professional-grade lenses out there, so in the middle of the review we might mention that if you’ve got a higher budget, here’s a professional-grade lens and here are some of the benefits you’ll get from upgrading. Putting products alongside each other in that way may be worthwhile as well.

These are the two big lessons that I learned this year about events, but I think they really do apply across to blogging. Less is more – simplify what you do. You may be adding too much complexity into your content, your community, your monetization, or some of the systems that you have. What fat can you cut out of what you’re doing to simplify, reduce the expenses, and also remove some of the stress and overwhelm amongst your audience?

Secondly, there is a percentage of your audience who are willing to pay more for what you do than you’re currently charging. So what can you add? What extra can you add in to offer a premium-level product or service on top of what you do? I think this applies not just to products but also to services. If you’re a freelancer and you offer your services as a writer, what premium-level package could you add? What could you add on top of the writing for the clients that you have? You can add premium-level offerings in that regard as well.

I would love to hear your feedback on today’s show around these things. How are you going to simplify what you do? What premium-level products could you create? You can let us know over on the show notes at problogger.com/podcast/215, or you can find us on Facebook – just search for the ProBlogger Community and you’ll find our little community – or you can go to problogger.com/group and you’ll be forwarded into that group as well. Let us know what you think of today’s episode.

As I said before, I’m heading away to Dallas in a few days’ time, so I won’t be doing new podcasts over the next couple of weeks. But there are plenty of episodes to dig into. One that I really do recommend you go back and listen to – in fact, it’s just the first of a series we did a year or so ago now – is Episode 137. If you want to give your blog an injection of goodness and greatness, if you want to get your blogging groove back, I would really recommend you go back and listen to Episode 137. It was the start of a series I did over a week, called Seven Days To Getting Your Blogging Groove Back. It actually goes from Episode 137 through to Episode 143, I guess.

It gives you, every day for seven days, a different type of blog post to create. Every day I teach you how to do a different type of blog post. Then, I challenge you to create that blog post. We went through this little challenge as a community over seven days a couple of years ago now. It was amazing to see the feedback as a result of that.

You may choose to do this over seven days, you might want to do it over the next week, or you can spread it out a little. I’m away from this podcast for two and a half weeks, so over the next couple of weeks you might want to choose one every couple of days and create those posts. You can let us know how you go with those over in the Facebook group as well. If you go to problogger.com/podcast/137, you’ll find links to all of those shows. It’s Episode 137. Alternatively, you can find them in iTunes, Stitcher, or any of the other podcast apps that you use. Episode 137, Seven Days To Getting Your Blogging Groove Back.

Hope you enjoy that little series. I look forward to chatting with you in the next episode of the ProBlogger podcast in a few weeks’ time. Thanks for listening.



The post 215: Simplify Your Business and Make More Money Blogging appeared first on ProBlogger.

Reblogged 21 hours ago from feedproxy.google.com

Google Posts can now be automated with new API support

Google My Business API version 4.0 adds new features, including the ability to manage your Google Posts.

The post Google Posts can now be automated with new API support appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.

Reblogged 22 hours ago from feeds.searchengineland.com

How machine learning levels the SERP playing field

Contributor Kristopher Jones explains how SEOs should be changing their practices to keep up with trends in the way Google evaluates web pages.

The post How machine learning levels the SERP playing field appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.

Reblogged 22 hours ago from feeds.searchengineland.com