
SMX 2021 will provide the most comprehensive search marketing education available

6 events, actionable tactics, affordable for all marketers

Please visit Search Engine Land for the full article.

Reblogged 1 year ago from feeds.searchengineland.com

Counting Down To Bundles Of Smashing Joy And Workshops In 2021

This year has been quite a ride — all the more reason to look forward to a new year with new beginnings, right? Well, we’ll never really know what awaits us in the months to come, but what I do know is that everyone on this planet can only do so much and really just their best to pull through. It’s certainly been a year of more downs than ups for so many people around the world, and we hope that everything we’ve been doing at Smashing has helped make life at least a lil’ bit easier.

Plan Your Year Ahead With Online Workshops

Have you attended one of our workshops yet? The Smashing Events team is thrilled each and every time they run a workshop with wonderful attendees from all over the world coming together to learn. So many ideas have been brought to life thanks to the live design and coding sessions, and many folks have found new friends, too!

It gets even better: We now have workshop bundles from which you can choose three, five, or even ten tickets for the workshops of your choice — whether they’re running now or happening later in the year!

Jan. 5 – Jan. 19: Build, Ship and Extend GraphQL APIs from Scratch, with Christian Nwamba (Dev)
Jan. 19 – Jan. 27: Form Design Masterclass, with Adam Silver (Dev)
Jan. 21 – Feb. 5: New Adventures In Front-End, 2021 Edition, with Vitaly Friedman (Design & UX)
Feb. 2 – Feb. 10: Building Modern HTML Emails, with Rémi Parmentier (Dev)
Feb. 11 – Feb. 26: The SVG Animation Masterclass, with Cassie Evans (Dev)
Feb. 16 – Feb. 17: The CSS Layout Masterclass, with Rachel Andrew (Dev)
Feb. 23 – Mar. 9: Successful Design Systems, with Brad Frost (Dev)
Mar. 4 – Mar. 12: Psychology For UX and Product Design, with Joe Leech (Design & UX)
Mar. 16 – Mar. 24: Finding Clients Masterclass, with Paul Boag (Design & UX)
Mar. 18 – Apr. 1: Behavioral Design, with Susan & Guthrie Weinschenk (Design & UX)
Mar. 30 – Mar. 31: Designing The Perfect Navigation, with Vitaly Friedman (Design & UX)

We hope you’ll find at least one workshop in the list above that fits your projects and career path, and if not, please do get in touch with us on Twitter and we promise to do our best to make it happen. Also, feel free to subscribe here if you’d like to be one of the first folks to be notified when new workshops come up, and get access to early-bird prices as well — we’ll have lots of goodies coming your way very soon!

Members Get Access To Videos And More

We’re proud to have a steadily growing Membership family who love good content, appreciate friendly discounts, and are an active part of our lovely web community. If you’re not involved yet, we’d love for you to join in and become a member, too! There are constant discounts on printed books, job postings, conference tickets, and your support really helps us pay the bills. ❤️

Smashing Podcast: Tune In And Get Inspired

This year, we’ve published a new Smashing Podcast episode every two weeks, and the feedback has been awesome! With over 56k downloads (just over a thousand per week, and growing!), we’ve had 34 guests on the podcast with different backgrounds and so much to share!

If you don’t see a topic you’d like to hear and learn more about, please don’t hesitate to reach out to host Drew McLellan or get in touch via Twitter anytime — we’d love to hear from you!

1. What Is Art Direction? 2. What’s So Great About Freelancing?
3. What Are Design Tokens? 4. What Are Inclusive Components?
5. What Are Variable Fonts? 6. What Are Micro-Frontends?
7. What Is A Government Design System? 8. What’s New In Microsoft Edge?
9. How Can I Work With UI Frameworks? 10. What Is Ethical Design?
11. What Is Sourcebit? 12. What Is Conversion Optimization?
13. What Is Online Privacy? 14. How Can I Run Online Workshops?
15. How Can I Build An App In 10 Days? 16. How Can I Optimize My Home Workspace?
17. What’s New In Drupal 9? 18. How Can I Learn React?
19. What Is CUBE CSS? 20. What Is Gatsby?
21. Are Modern Best Practices Bad For The Web? 22. What Is Serverless?
23. What Is Next.js? 24. What Is SVG Animation?
25. What Is RedwoodJS? 26. What’s New In Vue 3.0?
27. What Is TypeScript? 28. What Is Eleventy?
29. How Does Netlify Dogfood The Jamstack? 30. What Is Product Design?
31. What Is GraphQL? 32. Coming up on December 29

Stay tuned for the next episode coming out very soon!

Smashing Newsletter: Best Picks

With our weekly newsletter, we aim to bring you useful content and share all the cool things that folks are working on in the web industry. There are so many talented folks out there working on brilliant projects, and we’d appreciate it if you could help spread the word and give them the credit they deserve!

Also, when you subscribe, there are no third-party mailings or hidden advertising involved, and your support really helps us pay the bills. ❤️

Interested in sponsoring? Feel free to check out our partnership options and get in touch with the team anytime — they’ll be sure to get back to you as soon as they can.

Preventing Layout Shifts With CSS Grid

It’s no news that CSS Grid is a fantastic tool to build complex layouts. But did you know that it can help you prevent layout shifts, too? When Hubert Sablonnière discovered a layout shift problem with a toggling state on a UI component he worked on, he came up with a solution: the “Anti Layout Shift Grid Stacking Technique”.

Compared to solving the layout shift with absolute positioning, Hubert’s Grid-based technique supports complex situations that require more than two panels. Another benefit: You don’t need to assume which panel should guide the size of the whole component. If you want to dive in deeper, Hubert wrote up everything you need to know to prevent both vertical and horizontal shifts in a practical blog post. (cm)
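
As a rough illustration of the idea (the class names here are placeholders, not Hubert’s actual code): stacking all panels in the same grid cell lets the largest one dictate the component’s size, so toggling never shifts the layout.

```css
/* Both panels occupy the same single grid cell, so the component
   is always sized by the largest panel — no shift on toggle. */
.component {
  display: grid;
}

.component > .panel {
  grid-area: 1 / 1;
}

/* Hide the inactive panel while keeping it in the sizing math. */
.component > .panel.is-hidden {
  visibility: hidden;
}
```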

Fixing Headers And Jump Links

Jump links in combination with fixed headers can cause quite some frustration. Maybe you’ve run into the same issue before: When clicked, your jump link takes you to the desired element, but a fixed header is hiding it. In the past, wild hacks were required to solve the issue. Luckily, there’s now a straightforward and well-supported CSS solution.

The trick: scroll-margin-top. Assign it to your headers, and the position: fixed header won’t get into their way anymore when you navigate to them with a jump link. A short line of code that makes a huge difference. (cm)
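
A minimal sketch of the fix (the selector and offset are placeholders — match the value to your own header height):

```css
/* Give jump-link targets an offset equal to the fixed header's
   height so the header no longer covers them after scrolling. */
:target {
  scroll-margin-top: 4rem; /* placeholder: your header's height */
}
```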

Fluid Typography With clamp()

When it comes to fluid scaling, CSS has some exciting new features: clamp(), min(), and max(). They cap and scale values as the browser grows and shrinks. min() and max() return the respective minimum and maximum values at any given time, while clamp() lets you pass in both a minimum and maximum plus a preferred size for the browser to use.

As Trys Mudford points out, clamp() comes in particularly handy when you want broadly fluid typography without being 100% specific about the relationship between the varying sizes. In his in-depth article about the new feature, he shares valuable hands-on tips for using clamp() effectively. (cm)
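
As a small illustration (the values are placeholders, not Trys’ recommendations):

```css
/* Fluid heading: never below 1.5rem, never above 3rem,
   preferred size tracks the viewport width in between. */
h1 {
  font-size: clamp(1.5rem, 4vw + 1rem, 3rem);
}
```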

Open-Source Screen Recorder And Annotation Tool

If you’ve been looking for a free and easy-to-use tool to record your screen, it might be hard to find something more powerful than Alyssa X’s open-source screen recorder Screenity.

No matter if you want to give contextual feedback on a project, provide detailed explanations, or showcase your product to potential customers, Screenity offers a number of practical features to capture, annotate, and edit your recordings — without any time limit. You can draw on the screen and add text and arrows, for example, highlight clicks and focus on the mouse, push to talk, and much more. Screenity is available for Chrome. (cm)

A Human-Friendly Date Picker

Date pickers can be hard to get right. A beautiful example of a human-friendly and fully accessible date picker comes from Tommy Feldt.

Thanks to Chrono.js, it supports natural language inputs, so that a user can type something like “tomorrow”, “December 2”, or “in 5 days” to select a date. Shortcut buttons also help to select the most common dates. The date picker is fully accessible with the keyboard and screen readers (there’s even an on-demand help feature for screen reader and keyboard users) and degrades gracefully when JavaScript or CSS aren’t available. A very inspiring proof of concept. (cm)

Become A Jamstack Explorer

Is the Jamstack still unexplored territory for you? Jamstack Explorers helps change that. Its mission: teaching you about building for the web with modern tools and techniques.

You can choose from three courses, track your progress, and earn rewards as you proceed through the Jamstack universe. Tara Z. Manicsic leads you through the wilds of Angular, Phil Hawksworth teaches you how to serve and track multiple versions of your site with Netlify, and Cassidy Williams guides you through all the essentials of Next.js. Once you’ve completed the three missions, there’s not only a certificate waiting, but you can call yourself a Jamstack Explorer, ready to use the newest tools to build experiences that are robust, performant, and secure. (cm)

Making Remote Design Work

Design reviews, sprints, feedback — design is a collaborative effort that brings along quite some challenges when doing it remotely. The folks at InVision put together a collection of handy resources to help you and your team master these challenges.

The content covers three of the trickiest aspects of working remotely: fostering creativity, aiding collaboration, and staying focused. For more best practices for running a remote design team, InVision also published a free eBook drawing from their own experience of working remotely with 700 employees spread across 30 countries and no single office. (cm)

Full-Screen Countdown Timer To Stay On Track

Sticking to the schedule can be tricky when you are running a long video call or are giving a talk or workshop. To help you make sure the session stays on track, Koos Looijesteijn built Big Timer.

The bold yet minimalist timer counts down the remaining minutes right in your browser window — and even if you accidentally close the browser tab or need to restart your device, it will take the disruption into account. Keyboard shortcuts make it easy to adjust the duration and pause or stop the countdown. One for the bookmarks. (cm)

Sounds And Music To Help You Focus

Are you the type of person who can’t focus when it’s quiet around them? Then one of the following tools might help you become more productive. If you’re missing the familiar office sounds when working from home, I Miss The Office brings some office atmosphere into your home office — with virtual colleagues who produce typical sounds like typing, squeaking chairs, or the occasional bubbling of the watercooler.

Have office sounds always distracted you more than helped you focus? Then Noizio could be for you. The app lets you mix nature and city sounds to create your personal ambient sound. Another approach to increasing focus with sound comes from Brain.fm. Their team of scientists, musicians, and developers designs functional music that affects the brain to achieve the desired mental state. Last but not least, Focus@Will is also based on neuroscience and helps increase focus by changing the characteristics of music at the right time intervals. Promising alternatives to your usual playlist. (cm)

The Web Almanac 2020

Looking back at 2020, what’s the state of the web this year? The yearly Web Almanac gives in-depth answers to this question, combining the raw stats and trends of the HTTP Archive with the expertise of the web community. The results are backed up by real data taken from more than 7.5 million websites and trusted web experts.

22 chapters make up this year’s almanac. They are divided into four parts — content, experience, publishing, and distribution — and each one is explored from different angles. An insightful look into the state of performance is included, too, of course. (cm)

Generate A Request Map Of Your Site

Where do all the transmitted bytes on your site come from? Analyzing third-party components in detail is a time-consuming task, but it’s already a good start to know which third parties are on your site — and how they got there.

Simon Hearne’s request map generator tool visualizes a node map of all the requests on a page for any given URL. The size of the nodes on the map is proportional to the percentage of total bytes, and, when you hover over a node, you’ll get information on its size, response and load times. No more bad surprises. (cm)

Let’s Tweak Our JavaScript Bundles!

Chances are high that if your JavaScript code has been around for a while, your JavaScript bundles are a little outdated. You might have some outdated polyfills, or you might be using slightly outdated JavaScript syntax. But now there is a little tool that helps you identify those bottlenecks and fix them for good.

EStimator calculates the size and performance improvement a site could achieve by switching to modern JavaScript syntax. It shows which bundles could be improved, and what impact this change would have on your overall performance. The source code is also available on GitHub. (vf)

Reblogged 1 year ago from smashingmagazine.com

Building A Stocks Price Notifier App Using React, Apollo GraphQL And Hasura

Getting notified when an event of your choice has occurred has become far more popular than staying glued to a continuous stream of data to spot that occurrence yourself. People prefer to receive relevant emails or messages when their preferred event has occurred, as opposed to being hooked to the screen waiting for it to happen. Events-based terminology is also quite common in the world of software.

How awesome would it be if you could get updates on the price of your favorite stock on your phone?

In this article, we’re going to build a Stocks Price Notifier application using React, Apollo GraphQL, and the Hasura GraphQL engine. We’ll start the project from create-react-app boilerplate code and build everything from the ground up. We’ll learn how to set up the database tables and events on the Hasura console, and how to wire up Hasura’s events to get stock price updates via web-push notifications.

Here’s a quick glance at what we would be building:

Stock Price Notifier Application

Let’s get going!

An Overview Of What This Project Is About

The stocks data (including metrics such as high, low, open, close, and volume) will be stored in a Hasura-backed Postgres database. Users will be able to subscribe to a particular stock based on a price value, or opt to be notified every hour. They’ll get a web-push notification once their subscription criteria are fulfilled.

This looks like a lot, and there are obviously some open questions about how we’ll build out these pieces.

Here’s a plan on how we would accomplish this project in four steps:

  1. Fetching the stocks data using a NodeJs script
    We’ll start by fetching the stock data with a simple NodeJs script from one of the stocks API providers — Alpha Vantage. This script will fetch the data for a particular stock at 5-minute intervals. The API response includes high, low, open, close, and volume. This data will then be inserted into the Postgres database that is integrated with the Hasura back-end.
  2. Setting up The Hasura GraphQL engine
    We’ll then set up some tables on the Postgres database to record data points. Hasura automatically generates the GraphQL schemas, queries, and mutations for these tables.
  3. Front-end using React and Apollo Client
    The next step is to integrate the GraphQL layer using the Apollo client and Apollo Provider (the GraphQL endpoint provided by Hasura). The data-points will be shown as charts on the front-end. We’ll also build the subscription options and will fire corresponding mutations on the GraphQL layer.
  4. Setting up Event/Scheduled triggers
    Hasura provides an excellent tooling around triggers. We’ll be adding event & scheduled triggers on the stocks data table. These triggers will be set if the user is interested in getting a notification when the stock prices reach a particular value (event trigger). The user can also opt for getting a notification of a particular stock every hour (scheduled trigger).

Now that the plan is ready, let’s put it into action!

Here’s the GitHub repository for this project. If you get lost anywhere in the code below, refer to this repository to get back up to speed!

Fetching The Stocks Data Using A NodeJs Script

This is not as complicated as it sounds! We’ll have to write a function that fetches data using the Alpha Vantage endpoint, and this fetch call should be fired at an interval of 5 minutes (you guessed it right — we’ll have to put this function call in setInterval).
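
As a rough sketch of that polling loop (the function names here are placeholders; the actual fetching code follows later in this section):

```javascript
// Hypothetical sketch: run a fetcher immediately, then every 5 minutes.
const FIVE_MINUTES = 5 * 60 * 1000;

function startPolling(fetchStocksData) {
  // Fire once right away so we don't wait 5 minutes for the first data.
  fetchStocksData();
  // Then repeat on every interval tick; return the id so the caller
  // can stop polling with clearInterval if needed.
  return setInterval(fetchStocksData, FIVE_MINUTES);
}
```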

If you’re still wondering what Alpha Vantage is and just want to get this out of your head before hopping onto the coding part, then here it is:

Alpha Vantage Inc. is a leading provider of free APIs for realtime and historical data on stocks, forex (FX), and digital/cryptocurrencies.

We would be using this endpoint to get the required metrics of a particular stock. This API expects an API key as one of the parameters. You can get your free API key from here. We’re now good to get onto the interesting bit — let’s start writing some code!

Installing Dependencies

Create a stocks-app directory and create a server directory inside it. Initialize it as a node project using npm init and then install these dependencies:

npm i isomorphic-fetch pg nodemon --save

These are the only three dependencies we need to write this script for fetching the stock prices and storing them in the Postgres database.

Here’s a brief explanation of these dependencies:

  • isomorphic-fetch
    It makes it easy to use fetch isomorphically (in the same form) on both the client and the server.
  • pg
    It is a non-blocking PostgreSQL client for NodeJs.
  • nodemon
    It automatically restarts the server on any file changes in the directory.

Setting Up The Configuration

Add a config.js file at the root level. Add the below snippet of code in that file for now:

const config = {
  user: '<DATABASE_USER>',
  password: '<DATABASE_PASSWORD>',
  host: '<DATABASE_HOST>',
  port: '<DATABASE_PORT>',
  database: '<DATABASE_NAME>',
  ssl: '<IS_SSL>',
  apiHost: 'https://www.alphavantage.co/',
};

module.exports = config;

The user, password, host, port, database, ssl are related to the Postgres configuration. We’ll come back to edit this while we set up the Hasura engine part!

Initializing The Postgres Connection Pool For Querying The Database

A connection pool is a common term in computer science and you’ll often hear this term while dealing with databases.

While querying data in databases, you’ll have to first establish a connection to the database. This connection takes in the database credentials and gives you a hook to query any of the tables in the database.

Note: Establishing database connections is costly and also wastes significant resources. A connection pool caches the database connections and re-uses them on succeeding queries. If all the open connections are in use, then a new connection is established and is then added to the pool.

Now that it is clear what a connection pool is and what it is used for, let’s start by creating an instance of the pg connection pool for this application:

Add a pool.js file at the root level and create a pool instance as:

const { Pool } = require('pg');
const config = require('./config');

const pool = new Pool({
  user: config.user,
  password: config.password,
  host: config.host,
  port: config.port,
  database: config.database,
  ssl: config.ssl,
});

module.exports = pool;

The above lines of code create an instance of Pool with the configuration options as set in the config file. We’re yet to complete the config file but there won’t be any changes related to the configuration options.

We’ve now set the ground and are ready to start making some API calls to the Alpha Vantage endpoint.

Let’s get onto the interesting bit!

Fetching The Stocks Data

In this section, we’ll be fetching the stock data from the Alpha Vantage endpoint. Here’s the index.js file:

const fetch = require('isomorphic-fetch');
const getConfig = require('./config');
const { insertStocksData } = require('./queries');

const symbols = [
  'NFLX',
  'MSFT',
  'AMZN',
  'W',
  'FB'
];

(function getStocksData () {

  const apiConfig = getConfig('apiHostOptions');
  const { host, timeSeriesFunction, interval, key } = apiConfig;

  symbols.forEach((symbol) => {
    fetch(`${host}query/?function=${timeSeriesFunction}&symbol=${symbol}&interval=${interval}&apikey=${key}`)
    .then((res) => res.json())
    .then((data) => {
      const timeSeries = data['Time Series (5min)'];
      Object.keys(timeSeries).map((key) => {
        const dataPoint = timeSeries[key];
        const payload = [
          symbol,
          dataPoint['2. high'],
          dataPoint['3. low'],
          dataPoint['1. open'],
          dataPoint['4. close'],
          dataPoint['5. volume'],
          key,
        ];
        insertStocksData(payload);
      });
    });
  })
})()

For the purpose of this project, we’re going to query prices only for these stocks — NFLX (Netflix), MSFT (Microsoft), AMZN (Amazon), W (Wayfair), FB (Facebook).

Refer to this file for the config options. The IIFE getStocksData function doesn’t do much! It loops through these symbols and queries the Alpha Vantage endpoint ${host}query/?function=${timeSeriesFunction}&symbol=${symbol}&interval=${interval}&apikey=${key} to get the metrics for these stocks.

The insertStocksData function puts these data points in the Postgres database. Here’s the insertStocksData function:

const pool = require('./pool');

const insertStocksData = async (payload) => {
  const query = 'INSERT INTO stock_data (symbol, high, low, open, close, volume, time) VALUES ($1, $2, $3, $4, $5, $6, $7)';
  pool.query(query, payload, (err, result) => {
    if (err) {
      console.error('error inserting stock data', err);
    }
  });
};

This is it! We have fetched data points for the stocks from the Alpha Vantage API and have written a function to put them in the stock_data table in the Postgres database. There is just one missing piece to make all of this work: we have to populate the correct values in the config file. We’ll get these values after setting up the Hasura engine. Let’s get to that right away!

Please refer to the server directory for the complete code on fetching data points from Alpha Vantage endpoint and populating that to the Hasura Postgres database.

If this approach of setting up connections, configuration options, and inserting data using the raw query looks a bit difficult, please don’t worry about that! We’re going to learn how to do all this the easy way with a GraphQL mutation once the Hasura engine is set up!

Setting Up The Hasura GraphQL Engine

It is really simple to set up the Hasura engine and get up and running with the GraphQL schemas, queries, mutations, subscriptions, event triggers, and much more!

Click on Try Hasura and enter the project name:

I’m using the Postgres database hosted on Heroku. Create a database on Heroku and link it to this project. You should then be all set to experience the power of query-rich Hasura console.

Please copy the Postgres DB URL that you’ll get after creating the project. We’ll have to put this in the config file.

Click on Launch Console and you’ll be redirected to this view:

Let’s start building the table schema that we’d need for this project.

Creating The Table Schema On The Postgres Database

Please go to the Data tab and click on Add Table! Let’s start creating some of the tables:

symbol table

This table would be used for storing the information of the symbols. For now, I’ve kept two fields here — id and company. The field id is a primary key and company is of type varchar. Let’s add some of the symbols in this table:

stock_data table

The stock_data table stores id, symbol, time and the metrics such as high, low, open, close, volume. The NodeJs script that we wrote earlier in this section will be used to populate this particular table.

Here’s what the table looks like:

Neat! Let’s get to the other table in the database schema!

user_subscription table

The user_subscription table stores the subscription object against the user Id. This subscription object is used for sending web-push notifications to the users. We’ll learn later in the article how to generate this subscription object.

There are two fields in this table — id is the primary key of type uuid and subscription field is of type jsonb.
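
For context, a browser PushSubscription serializes to JSON roughly in the shape below (all values here are fabricated placeholders), and that object is what the jsonb column would hold:

```javascript
// Illustrative shape of a stored web-push subscription (dummy values).
const sampleSubscription = {
  endpoint: 'https://fcm.googleapis.com/fcm/send/abc123', // push-service URL
  expirationTime: null,
  keys: {
    p256dh: '<base64-encoded-client-public-key>',
    auth: '<base64-encoded-auth-secret>',
  },
};
```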

events table

This is the important one, used for storing the notification event options. When a user opts in for price updates of a particular stock, we store that event information in this table. It contains these columns:

  • id: is a primary key with the auto-increment property.
  • symbol: is a text field.
  • user_id: is of type uuid.
  • trigger_type: is used for storing the event trigger type — time/event.
  • trigger_value: is used for storing the trigger value. For example, if a user has opted in for a price-based event trigger — they want updates if the price of the stock reaches 1000 — then the trigger_value would be 1000 and the trigger_type would be event.
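
Putting the columns together, a price-based subscription for MSFT at 1000 might be stored as a row like this (all values are illustrative, not real data):

```javascript
// Hypothetical row in the events table.
const sampleEvent = {
  id: 1,                                                  // auto-increment primary key
  symbol: 'MSFT',                                         // text field
  user_id: '9f3c2a1e-0000-0000-0000-000000000000',        // uuid of the subscriber
  trigger_type: 'event',                                  // 'event' = price-based, 'time' = hourly
  trigger_value: '1000',                                  // notify when the price reaches this value
};
```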

These are all the tables that we’d need for this project. We also have to set up relations among these tables to have a smooth data flow and connections. Let’s do that!

Setting Up Relations Among Tables

The events table is used for sending web-push notifications based on the event value. So, it makes sense to connect this table with the user_subscription table to be able to send push notifications on the subscriptions stored in this table.

events.user_id  → user_subscription.id

The stock_data table is related to the symbols table as:

stock_data.symbol  → symbol.id

We also have to construct some relations on the symbol table as:

stock_data.symbol  → symbol.id
events.symbol  → symbol.id

We’ve now created the required tables and also established the relations among them! Let’s switch to the GRAPHIQL tab on the console to see the magic!

Hasura has already set up the GraphQL queries based on these tables:

It is quite simple to query these tables, and you can also apply any of these filters/properties (distinct_on, limit, offset, order_by, where) to get the desired data.
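
As an illustrative sketch (the field and filter names follow Hasura’s conventions for the stock_data table above, but the exact shape in your console may differ):

```graphql
# Hypothetical query: the ten most recent MSFT data points,
# newest first, using the generated where/order_by/limit arguments.
query LatestStockData {
  stock_data(
    where: { symbol: { _eq: "MSFT" } }
    order_by: { time: desc }
    limit: 10
  ) {
    high
    low
    open
    close
    volume
    time
  }
}
```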

This all looks good, but we still haven’t connected our server-side code to the Hasura console. Let’s complete that bit!

Connecting The NodeJs Script To The Postgres Database

Please put the required options in the config.js file in the server directory as:

const config = {
  databaseOptions: {
    user: '<DATABASE_USER>',
    password: '<DATABASE_PASSWORD>',
    host: '<DATABASE_HOST>',
    port: '<DATABASE_PORT>',
    database: '<DATABASE_NAME>',
    ssl: true,
  },
  apiHostOptions: {
    host: 'https://www.alphavantage.co/',
    key: '<API_KEY>',
    timeSeriesFunction: 'TIME_SERIES_INTRADAY',
    interval: '5min'
  },
  graphqlURL: '<GRAPHQL_URL>'
};

const getConfig = (key) => {
  return config[key];
};

module.exports = getConfig;

Please fill in the databaseOptions fields from the database connection string that was generated when we created the Postgres database on Heroku.

The apiHostOptions consists of the API related options such as host, key, timeSeriesFunction and interval.

You’ll get the graphqlURL field in the GRAPHIQL tab on the Hasura console.

The getConfig function is used for returning the requested value from the config object. We’ve already used this in index.js in the server directory.

It’s time to run the server and populate some data in the database. I’ve added one script in package.json as:

"scripts": {
    "start": "nodemon index.js"
}

Run npm start on the terminal and the data points of the symbols array in index.js should be populated in the tables.

Refactoring The Raw Query In The NodeJs Script To GraphQL Mutation

Now that the Hasura engine is set up, let’s see how easy it can be to call a mutation on the stock_data table.

The function insertStocksData in queries.js uses a raw query:

const query = 'INSERT INTO stock_data (symbol, high, low, open, close, volume, time) VALUES ($1, $2, $3, $4, $5, $6, $7)';

Let’s refactor this query and use mutation powered by the Hasura engine. Here’s the refactored queries.js in the server directory:


const { createApolloFetch } = require('apollo-fetch');
const getConfig = require('./config');

const GRAPHQL_URL = getConfig('graphqlURL');
const fetch = createApolloFetch({
  uri: GRAPHQL_URL,
});

const insertStocksData = async (payload) => {
  const insertStockMutation = await fetch({
    query: `mutation insertStockData($objects: [stock_data_insert_input!]!) {
      insert_stock_data (objects: $objects) {
        returning {
          id
        }
      }
    }`,
    variables: {
      objects: payload,
    },
  });
  console.log('insertStockMutation', insertStockMutation);
};

module.exports = {
  insertStocksData
}

Please note: We have to add graphqlURL in the config.js file.

The apollo-fetch module returns a fetch function that can be used to query/mutate the data on the GraphQL endpoint. Easy enough, right?

The only change we have to make in index.js is to return the stocks object in the format required by the insertStocksData function. Please check out index2.js and queries2.js for the complete code with this approach.

Now that we’ve accomplished the data-side of the project, let’s move onto the front-end bit and build some interesting components!

Note: We don’t have to keep the database configuration options with this approach!

Front-end Using React And Apollo Client

The front-end project is in the same repository and is created using the create-react-app package. The service worker generated by this package supports asset caching, but it doesn’t allow further customizations to be added to the service worker file. There are already some open issues to add support for custom service worker options. There are ways to work around this problem and add support for a custom service worker.

Let’s start by looking at the structure for the front-end project:

Please check the src directory! Don’t worry about the service worker related files for now. We’ll learn more about these files later in this section. The rest of the project structure looks simple. The components folder will have the components (Loader, Chart); the services folder contains some of the helper functions/services used for transforming objects in the required structure; styles as the name suggests contains the sass files used for styling the project; views is the main directory and it contains the view layer components.

We’d need just two view components for this project — The Symbol List and the Symbol Timeseries. We’ll build the time-series using the Chart component from the highcharts library. Let’s start adding code in these files to build up the pieces on the front-end!

Installing Dependencies

Here’s the list of dependencies that we’ll need:

  • apollo-boost
    Apollo boost is a zero-config way to start using Apollo Client. It comes bundled with the default configuration options.
  • reactstrap and bootstrap
    The components are built using these two packages.
  • graphql and graphql-type-json
    graphql is a required dependency for using apollo-boost and graphql-type-json is used for supporting the json datatype being used in the GraphQL schema.
  • highcharts and highcharts-react-official
    These two packages will be used for building the chart.

  • node-sass
    This is added for supporting sass files for styling.

  • uuid
    This package is used for generating random UUIDs; we'll use it to generate user IDs.
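All of these can be pulled in with a single command (assuming npm as the package manager):

```shell
npm install apollo-boost reactstrap bootstrap graphql graphql-type-json highcharts highcharts-react-official node-sass uuid
```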

All of these dependencies will make sense once we start using them in the project. Let’s get onto the next bit!

Setting Up Apollo Client

Create an apolloClient.js file inside the src folder:

import ApolloClient from 'apollo-boost';

const apolloClient = new ApolloClient({
  uri: '<HASURA_CONSOLE_URL>'
});

export default apolloClient;

The above code instantiates ApolloClient, which takes uri in its config options. The uri is the URL of your Hasura console; you'll find it in the GraphQL Endpoint section of the GRAPHIQL tab.

The above code looks simple but it takes care of the main part of the project! It connects the GraphQL schema built on Hasura with the current project.

We also have to pass this apollo client object to ApolloProvider and wrap the root component inside ApolloProvider. This will enable all the nested components inside the main component to use client prop and fire queries on this client object.

Let’s modify the index.js file as:

const Wrapper = () => {
/* some service worker logic - ignore for now */
  const [insertSubscription] = useMutation(subscriptionMutation);
  useEffect(() => {
    serviceWorker.register(insertSubscription);
  }, [])
  /* ignore the above snippet */
  return <App />;
}

ReactDOM.render(
  <ApolloProvider client={apolloClient}>
    <Wrapper />
  </ApolloProvider>,
  document.getElementById('root')
);

Please ignore the insertSubscription-related code for now; we'll look at it in detail later. The rest of the code is straightforward: the render function takes the root component and the element ID as parameters. Notice that client (the ApolloClient instance) is passed as a prop to ApolloProvider. You can check the complete index.js file here.

Setting Up The Custom Service Worker

A service worker is a JavaScript file that has the capability to intercept network requests. It is used for querying the cache to check whether a requested asset is already present, instead of making a trip to the server. Service workers are also used for sending web push notifications to subscribed devices.

We have to send web push notifications for the stock price updates to the subscribed users. Let's lay the groundwork and build this service worker file!

The insertSubscription-related snippet in the index.js file registers the service worker and puts the subscription object in the database using subscriptionMutation.

Please refer queries.js for all the queries and mutations being used in the project.

serviceWorker.register(insertSubscription); invokes the register function written in the serviceWorker.js file. Here it is:

export const register = (insertSubscription) => {
  if ('serviceWorker' in navigator) {
    const swUrl = `${process.env.PUBLIC_URL}/serviceWorker.js`
    navigator.serviceWorker.register(swUrl)
      .then(() => {
        console.log('Service Worker registered');
        return navigator.serviceWorker.ready;
      })
      .then((serviceWorkerRegistration) => {
        getSubscription(serviceWorkerRegistration, insertSubscription);
        Notification.requestPermission();
      })
  }
}

The above function first checks whether serviceWorker is supported by the browser and then registers the service worker file hosted at the URL swUrl. We'll look at this file in a moment!

The getSubscription function obtains the subscription object using the subscribe method on the pushManager object. This subscription object is then stored in the user_subscription table against a userId. Note that the userId is generated using the uuid function. Let's check out the getSubscription function:

const getSubscription = (serviceWorkerRegistration, insertSubscription) => {
  serviceWorkerRegistration.pushManager.getSubscription()
    .then ((subscription) => {
      const userId = uuidv4();
      if (!subscription) {
        const applicationServerKey = urlB64ToUint8Array('<APPLICATION_SERVER_KEY>')
        serviceWorkerRegistration.pushManager.subscribe({
          userVisibleOnly: true,
          applicationServerKey
        }).then (subscription => {
          insertSubscription({
            variables: {
              userId,
              subscription
            }
          });
          localStorage.setItem('serviceWorkerRegistration', JSON.stringify({
            userId,
            subscription
          }));
        })
      }
    })
}
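The urlB64ToUint8Array helper used above converts the base64url-encoded application server key into the byte array that pushManager.subscribe expects. A common implementation looks like this (an assumed sketch; the project's serviceWorker.js ships its own copy):

```javascript
// Convert a base64url string (e.g. a VAPID public key) into a Uint8Array.
const urlB64ToUint8Array = (base64String) => {
  // Re-pad the string to a length divisible by 4, as atob expects.
  const padding = '='.repeat((4 - (base64String.length % 4)) % 4);
  // Convert from the base64url alphabet to standard base64.
  const base64 = (base64String + padding).replace(/-/g, '+').replace(/_/g, '/');
  const rawData = atob(base64);
  // Map each character of the decoded string to its byte value.
  return Uint8Array.from(rawData, (char) => char.charCodeAt(0));
};
```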

You can check serviceWorker.js file for the complete code!

Notification.requestPermission() invokes a popup that asks the user for permission to send notifications. Once the user clicks Allow, a subscription object is generated by the push service. We're storing that object in localStorage:
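A push subscription object has roughly the following shape (keys abridged, values hypothetical):

```json
{
  "endpoint": "https://fcm.googleapis.com/fcm/send/example-device-token",
  "expirationTime": null,
  "keys": {
    "p256dh": "BASE64URL_ENCODED_PUBLIC_KEY",
    "auth": "BASE64URL_ENCODED_AUTH_SECRET"
  }
}
```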

The endpoint field in the above object identifies the device, and the server uses this endpoint to send web push notifications to the user.

We have done the work of initializing and registering the service worker, and we also have the user's subscription object. This all works thanks to the serviceWorker.js file present in the public folder. Let's now set up the service worker itself!

This is a slightly tricky topic, but let's get it right! As mentioned earlier, the create-react-app utility doesn't support service worker customizations by default. We can achieve a custom service worker implementation using the workbox-build module.

We also have to make sure that the default behavior of pre-caching files stays intact. We'll modify the part of the build where the service worker gets built, and workbox-build helps in achieving exactly that! Neat stuff! Let's keep it simple and list everything we have to do to make the custom service worker work:

  • Handle the pre-caching of assets using workboxBuild.
  • Create a service worker template for caching assets.
  • Create sw-precache-config.js file to provide custom configuration options.
  • Add the build service worker script in the build step in package.json.

Don't worry if all this sounds confusing! This article doesn't focus on explaining the semantics behind each of these points; we have to focus on the implementation for now. I'll try to cover the reasoning behind building a custom service worker in another article.

Let’s create two files sw-build.js and sw-custom.js in the src directory. Please refer to the links to these files and add the code to your project.

Let’s now create sw-precache-config.js file at the root level and add the following code in that file:

module.exports = {
  staticFileGlobs: [
    'build/static/css/**.css',
    'build/static/js/**.js',
    'build/index.html'
  ],
  swFilePath: './build/serviceWorker.js',
  stripPrefix: 'build/',
  handleFetch: false,
  runtimeCaching: [{
    urlPattern: /this\.is\.a\.regex/,
    handler: 'networkFirst'
  }]
}

Let’s also modify the package.json file to make room for building the custom service worker file:

Add these statements in the scripts section:

"build-sw": "node ./src/sw-build.js",
"clean-cra-sw": "rm -f build/precache-manifest.*.js && rm -f build/service-worker.js",

And modify the build script as:

"build": "react-scripts build && npm run build-sw && npm run clean-cra-sw",

The setup is finally done! We now have to add a custom service worker file inside the public folder:

function showNotification (event) {
  const eventData = event.data.json();
  const { title, body } = eventData
  self.registration.showNotification(title, { body });
}

self.addEventListener('push', (event) => {
  event.waitUntil(showNotification(event));
})

We've added a single push listener to listen for push notifications sent by the server. The showNotification function displays the web push notification to the user.

This is it! We’re done with all the hard work of setting up a custom service worker to handle web push notifications. We’ll see these notifications in action once we build the user interfaces!

We’re getting closer to building the main code pieces. Let’s now start with the first view!

Symbol List View

The App component being used in the previous section looks like this:

import React from 'react';
import SymbolList from './views/symbolList';

const App = () => {
  return <SymbolList />;
};

export default App;

It is a simple component that returns the SymbolList view, and SymbolList does all the heavy lifting of displaying the symbols in a neat user interface.

Let’s look at symbolList.js inside the views folder:

Please refer to the file here!

The component returns the result of the renderSymbols function, and the data is fetched from the database using the useQuery hook:

const { loading, error, data } = useQuery(symbolsQuery, {variables: { userId }});

The symbolsQuery is defined as:

export const symbolsQuery = gql`
  query getSymbols($userId: uuid) {
    symbol {
      id
      company
      symbol_events(where: {user_id: {_eq: $userId}}) {
        id
        symbol
        trigger_type
        trigger_value
        user_id
      }
      stock_symbol_aggregate {
        aggregate {
          max {
            high
            volume
          }
          min {
            low
            volume
          }
        }
      }
    }
  }
`;

It takes in userId and fetches that user's subscribed events in order to display the correct state of the notification icon (the bell icon displayed along with the title). The query also fetches the max and min values of the stock. Notice the use of aggregate in the above query: Hasura's aggregation queries do the work behind the scenes to fetch aggregate values like count, sum, avg, max, and min.

Based on the response from the above GraphQL call, here’s the list of cards that are displayed on the front-end:

The card HTML structure looks something like this:

<div key={id}>
  <div className="card-container">
    <Card>
      <CardBody>
        <CardTitle className="card-title">
          <span className="company-name">{company}  </span>
            <Badge color="dark" pill>{id}</Badge>
            <div className={classNames({'bell': true, 'disabled': isSubscribed})} id={`subscribePopover-${id}`}>
              <FontAwesomeIcon icon={faBell} title="Subscribe" />
            </div>
        </CardTitle>
        <div className="metrics">
          <div className="metrics-row">
            <span className="metrics-row--label">High:</span> 
            <span className="metrics-row--value">{max.high}</span>
            <span className="metrics-row--label">{' '}(Volume: </span> 
            <span className="metrics-row--value">{max.volume}</span>)
          </div>
          <div className="metrics-row">
            <span className="metrics-row--label">Low: </span>
            <span className="metrics-row--value">{min.low}</span>
            <span className="metrics-row--label">{' '}(Volume: </span>
            <span className="metrics-row--value">{min.volume}</span>)
          </div>
        </div>
        <Button className="timeseries-btn" outline onClick={() => toggleTimeseries(id)}>Timeseries</Button>{' '}
      </CardBody>
    </Card>
    <Popover
      className="popover-custom" 
      placement="bottom" 
      target={`subscribePopover-${id}`}
      isOpen={isSubscribePopoverOpen === id}
      toggle={() => setSubscribeValues(id, symbolTriggerData)}
    >
      <PopoverHeader>
        Notification Options
        <span className="popover-close">
          <FontAwesomeIcon 
            icon={faTimes} 
            onClick={() => handlePopoverToggle(null)}
          />
        </span>
      </PopoverHeader>
      {renderSubscribeOptions(id, isSubscribed, symbolTriggerData)}
    </Popover>
  </div>
  <Collapse isOpen={expandedStockId === id}>
    {
      isOpen(id) ? <StockTimeseries symbol={id}/> : null
    }
  </Collapse>
</div>

We're using the Card component of reactstrap to render these cards, and the Popover component for displaying the subscription-based options:

When the user clicks on the bell icon for a particular stock, they can opt in to be notified every hour or when the price of the stock reaches the entered value. We'll see this in action in the Events/Time Triggers section.

Note: We’ll get to the StockTimeseries component in the next section!

Please refer to symbolList.js for the complete code related to the stocks list component.

Stock Timeseries View

The StockTimeseries component uses the query stocksDataQuery:

export const stocksDataQuery = gql`
  query getStocksData($symbol: String) {
    stock_data(order_by: {time: desc}, where: {symbol: {_eq: $symbol}}, limit: 25) {
      high
      low
      open
      close
      volume
      time
    }
  }
`;

The above query fetches the 25 most recent data points of the selected stock. For example, here is the chart for the open metric of the Facebook stock:

This is a straightforward component where we pass some chart options to the HighchartsReact component. Here are the chart options:

const chartOptions = {
  title: {
    text: `${symbol} Timeseries`
  },
  subtitle: {
    text: 'Intraday (5min) open, high, low, close prices & volume'
  },
  yAxis: {
    title: {
      text: '#'
    }
  },
  xAxis: {
    title: {
      text: 'Time'
    },
    categories: getDataPoints('time')
  },
  legend: {
    layout: 'vertical',
    align: 'right',
    verticalAlign: 'middle'
  },
  series: [
    {
      name: 'high',
      data: getDataPoints('high')
    }, {
      name: 'low',
      data: getDataPoints('low')
    }, {
      name: 'open',
      data: getDataPoints('open')
    },
    {
      name: 'close',
      data: getDataPoints('close')
    },
    {
      name: 'volume',
      data: getDataPoints('volume')
    }
  ]
}

The X-axis shows the time and the Y-axis shows the metric value at that time. The getDataPoints function generates a series of points for each of the series.

const getDataPoints = (type) => {
  const values = [];
  data.stock_data.map((dataPoint) => {
    let value = dataPoint[type];
    if (type === 'time') {
      value = new Date(dataPoint['time']).toLocaleString('en-US');
    }
    values.push(value);
  });
  return values;
}
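For instance, given a pair of sample data points (hypothetical values, not real stock data), the function produces one array per metric:

```javascript
// Standalone sketch of getDataPoints with hypothetical sample data.
const data = {
  stock_data: [
    { high: 120.5, low: 118.2, open: 119.0, close: 120.1, volume: 3500, time: '2020-10-01T10:00:00Z' },
    { high: 121.0, low: 119.5, open: 120.1, close: 120.8, volume: 4200, time: '2020-10-01T10:05:00Z' }
  ]
};

const getDataPoints = (type) => {
  const values = [];
  data.stock_data.map((dataPoint) => {
    let value = dataPoint[type];
    if (type === 'time') {
      // Time values are formatted as locale strings for the X-axis categories.
      value = new Date(dataPoint['time']).toLocaleString('en-US');
    }
    values.push(value);
  });
  return values;
};

console.log(getDataPoints('high')); // → [ 120.5, 121 ]
```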

Simple! That's how the Chart component is generated. Please refer to the Chart.js and stockTimeseries.js files for the complete code on the stock time-series.

You should now be ready with the data and user interface parts of the project. Let's now move on to the interesting part: setting up event/time triggers based on the user's input.

Setting Up Event/Scheduled Triggers

In this section, we’ll learn how to set up triggers on the Hasura console and how to send web push notifications to the selected users. Let’s get started!

Events Triggers On Hasura Console

Let's create an event trigger stock_value on the table stock_data, with insert as the trigger operation. The webhook will run every time there is an insert in the stock_data table.

We're going to create a Glitch project for the webhook URL. First, a quick note about webhooks to make them easier to understand:

Webhooks are used for sending data from one application to another on the occurrence of a particular event. When an event is triggered, an HTTP POST call is made to the webhook URL with the event data as the payload.

In this case, when there is an insert operation on the stock_data table, an HTTP POST call will be made to the configured webhook URL (the POST endpoint in the Glitch project).
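The POST body Hasura sends looks roughly like this (abridged, with hypothetical values; the server code below relies on the trigger.name and event.data.new fields):

```json
{
  "trigger": { "name": "stock-value-trigger" },
  "table": { "schema": "public", "name": "stock_data" },
  "event": {
    "op": "INSERT",
    "data": {
      "old": null,
      "new": {
        "symbol": "AMZN",
        "close": 2000,
        "time": "2020-10-01T10:00:00Z"
      }
    }
  }
}
```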

Glitch Project For Sending Web-push Notifications

We have to get the webhook URL to put in the above event trigger interface. Go to glitch.com and create a new project. In this project, we'll set up an Express server with an HTTP POST listener. The POST payload will contain all the details of the stock data point, including open, close, high, low, volume, and time. We'll have to fetch the list of users subscribed to this stock with a trigger value matching the close metric.

These users will then be notified of the stock price via web-push notifications.

That's all we have to do to achieve the desired goal of notifying users when the stock price reaches their expected value!

Let’s break this down into smaller steps and implement them!

Installing Dependencies

We would need the following dependencies:

  • express: used for creating an Express server.
  • apollo-fetch: used for creating a fetch function to get data from the GraphQL endpoint.
  • web-push: used for sending web push notifications.

Please add this script in package.json to run index.js on the npm start command:

"scripts": {
  "start": "node index.js"
}
Setting Up Express Server

Let’s create an index.js file as:

const express = require('express');
const bodyParser = require('body-parser');

const app = express();
app.use(bodyParser.json());

const handleStockValueTrigger = (eventData, res) => {
  /* Code for handling this trigger */
}

app.post('/', (req, res) => {
  const { body } = req
  const eventType = body.trigger.name
  const eventData = body.event

  switch (eventType) {
    case 'stock-value-trigger':
      return handleStockValueTrigger(eventData, res);
  }

});

app.get('/', function (req, res) {
  res.send('Hello World - For Event Triggers, try a POST request?');
});

var server = app.listen(process.env.PORT, function () {
    console.log(`server listening on port ${process.env.PORT}`);
});

In the above code, we've created post and get listeners on the route /. The get handler is simple enough; we're mainly interested in the post call. If the eventType is stock-value-trigger, we'll have to handle this trigger by notifying the subscribed users. Let's add that bit and complete this function!

Fetching Subscribed Users
const fetch = createApolloFetch({
  uri: process.env.GRAPHQL_URL
});

const getSubscribedUsers = (symbol, triggerValue) => {
  return fetch({
    query: `query getSubscribedUsers($symbol: String, $triggerValue: numeric) {
      events(where: {symbol: {_eq: $symbol}, trigger_type: {_eq: "event"}, trigger_value: {_gte: $triggerValue}}) {
        user_id
        user_subscription {
          subscription
        }
      }
    }`,
    variables: {
      symbol,
      triggerValue
    }
  }).then(response => response.data.events)
}


const handleStockValueTrigger = async (eventData, res) => {
  const symbol = eventData.data.new.symbol;
  const triggerValue = eventData.data.new.close;
  const subscribedUsers = await getSubscribedUsers(symbol, triggerValue);
  const webpushPayload = {
    title: `${symbol} - Stock Update`,
    body: `The price of this stock is ${triggerValue}`
  }
  subscribedUsers.map((data) => {
    sendWebpush(data.user_subscription.subscription, JSON.stringify(webpushPayload));
  })
  res.json(eventData.toString());
}

In the above handleStockValueTrigger function, we first fetch the subscribed users using the getSubscribedUsers function, and then send a web push notification to each of them. The sendWebpush function sends the notification; we'll look at the web-push implementation in a moment.

The function getSubscribedUsers uses the query:

query getSubscribedUsers($symbol: String, $triggerValue: numeric) {
  events(where: {symbol: {_eq: $symbol}, trigger_type: {_eq: "event"}, trigger_value: {_gte: $triggerValue}}) {
    user_id
    user_subscription {
      subscription
    }
  }
}

This query takes in the stock symbol and the value, and fetches the user details (including user_id and user_subscription) that match these conditions:

  • symbol equal to the one being passed in the payload.
  • trigger_type is equal to event.
  • trigger_value is greater than or equal to the one being passed to this function (close in this case).
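In plain JavaScript terms, the where clause above keeps exactly the rows for which this predicate holds (a hypothetical in-memory equivalent for illustration, not code from the project):

```javascript
// In-memory equivalent of the where clause in getSubscribedUsers:
// same symbol, an event-type trigger, and a trigger value at or above the close price.
const matches = (event, symbol, triggerValue) =>
  event.symbol === symbol &&
  event.trigger_type === 'event' &&
  event.trigger_value >= triggerValue;

console.log(matches(
  { symbol: 'AMZN', trigger_type: 'event', trigger_value: 2000 },
  'AMZN',
  1998
)); // → true
```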

Once we get the list of users, the only thing that remains is sending web-push notifications to them! Let’s do that right away!

Sending Web-Push Notifications To The Subscribed Users

We first have to get the public and private VAPID keys in order to send web push notifications. Please store these keys in the .env file and set these details in index.js as:

webPush.setVapidDetails(
  'mailto:<YOUR_MAIL_ID>',
  process.env.PUBLIC_VAPID_KEY,
  process.env.PRIVATE_VAPID_KEY
);

const sendWebpush = (subscription, webpushPayload) => {
  webPush.sendNotification(subscription, webpushPayload).catch(err => console.log('error while sending webpush', err))
}

The sendNotification function sends the web push to the subscription endpoint provided as its first parameter.
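If you don't yet have a VAPID key pair, the web-push package ships a small CLI that can generate one (run with npx from the project directory; copy the output into your .env file):

```shell
npx web-push generate-vapid-keys
```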

That's all that is required to successfully send web push notifications to the subscribed users. Here's the complete code defined in index.js:

const express = require('express');
const bodyParser = require('body-parser');
const { createApolloFetch } = require('apollo-fetch');
const webPush = require('web-push');

webPush.setVapidDetails(
  'mailto:<YOUR_MAIL_ID>',
  process.env.PUBLIC_VAPID_KEY,
  process.env.PRIVATE_VAPID_KEY
);

const app = express();
app.use(bodyParser.json());

const fetch = createApolloFetch({
  uri: process.env.GRAPHQL_URL
});

const getSubscribedUsers = (symbol, triggerValue) => {
  return fetch({
    query: `query getSubscribedUsers($symbol: String, $triggerValue: numeric) {
      events(where: {symbol: {_eq: $symbol}, trigger_type: {_eq: "event"}, trigger_value: {_gte: $triggerValue}}) {
        user_id
        user_subscription {
          subscription
        }
      }
    }`,
    variables: {
      symbol,
      triggerValue
    }
  }).then(response => response.data.events)
}

const sendWebpush = (subscription, webpushPayload) => {
  webPush.sendNotification(subscription, webpushPayload).catch(err => console.log('error while sending webpush', err))
}

const handleStockValueTrigger = async (eventData, res) => {
  const symbol = eventData.data.new.symbol;
  const triggerValue = eventData.data.new.close;
  const subscribedUsers = await getSubscribedUsers(symbol, triggerValue);
  const webpushPayload = {
    title: `${symbol} - Stock Update`,
    body: `The price of this stock is ${triggerValue}`
  }
  subscribedUsers.map((data) => {
    sendWebpush(data.user_subscription.subscription, JSON.stringify(webpushPayload));
  })
  res.json(eventData.toString());
}

app.post('/', (req, res) => {
  const { body } = req
  const eventType = body.trigger.name
  const eventData = body.event

  switch (eventType) {
    case 'stock-value-trigger':
      return handleStockValueTrigger(eventData, res);
  }

});

app.get('/', function (req, res) {
  res.send('Hello World - For Event Triggers, try a POST request?');
});

var server = app.listen(process.env.PORT, function () {
    console.log("server listening");
});

Let's test out this flow by subscribing to a stock with some value and manually inserting that value in the table (for testing)!

I subscribed to AMZN with the value 2000 and then inserted a data point in the table with this value. Here's how the stocks notifier app notified me right after the insertion:

Neat! You can also check the event invocation log here:

The webhook is doing the work as expected! We’re all set for the event triggers now!

Scheduled/Cron Triggers

We can achieve a time-based trigger that notifies the subscribed users every hour using the cron event trigger as:

We can use the same webhook URL and handle the subscribed users based on the trigger event type, stock_price_time_based_trigger. The implementation is similar to the event-based trigger.

Conclusion

In this article, we built a stock price notifier application. We learned how to fetch prices using the Alpha Vantage APIs and store the data points in the Hasura backed Postgres database. We also learned how to set up the Hasura GraphQL engine and create event-based and scheduled triggers. We built a glitch project for sending web-push notifications to the subscribed users.

Reblogged 1 year ago from smashingmagazine.com

Video: Joe Beccalori on the diminishing value of organic

We go way back to discuss how SEO has changed over the years.



10 Common Copywriting Templates to Use in Marketing

Ask any marketer who’s responsible for copywriting about their writing process, and you’ll quickly find out that there’s no specific process to follow.

Additionally, copywriting varies depending on your audience, purpose, and format: copywriting for an Instagram post, for instance, is entirely different from copywriting for a press release.

At HubSpot, we know the struggle. Copywriting demands creativity, inspiration, and hard work — and it can be difficult to find all three, day-in and day-out.

To help with writer’s block, we’ve put together 10 copywriting templates you might use for any of your marketing efforts, including blogging, social media, email marketing, and even internal memos.

Let’s dive in.

10 Copywriting Templates to Use in Marketing

1. Email Marketing

First, you’ll need to determine what type of email you’re writing to ensure you’re speaking to the right audience. Coordinate with your team to see if this is a one-off marketing email like a monthly newsletter, or if you’re being asked to write for a series of emails, like a nurture campaign.

As you’re drafting your copy, consider how your email will encourage the reader to take a desired action, like clicking a link to purchase or scheduling a call with a sales rep to learn more about your services.

If you’re not aiming for the reader to take a specific action and instead just want to send a general update, like a company announcement, you’ll want the copy to easily and clearly communicate the core of your message to your reader.

Here’s an example of a template you might use to welcome new subscribers to your newsletter:

Hi [First Name],

Thank you for signing up for [include what someone just signed up for like a blog subscription, newsletter subscription, company services, etc.]

At [Company Name] we’re working to [list a few of your company’s core goals, or include your mission statement]. We highly encourage you to check out [suggest a few recommendations so the reader can continue learning more about your company].

If you ever have any questions please feel free to contact us at [Contact information].

Thank you,

[Company Name, or individual sender’s name]

Featured Resource: 15 Email Templates for Marketing and Sales

We’ve considered the types of emails marketers and salespeople are likely to send on a repeat basis, and crafted templates that can help eliminate that time.

Download these templates

2. Blogging

Blogs give copywriters a chance to dive deeper into topics in a way that isn’t captured through emails, ads, or social media posts. There are so many different types of blogs you might write, so be sure to develop your blog strategy to keep a close pulse on what types of blog posts and clusters perform best for your business.

Since blogs tend to be longer than other types of copy, you want to make sure you’re keeping your audience engaged. Consider what your reader is reading your post for and center your post on answering the topic-related questions readers are most likely to ask.

This blog post template is an example of a product or service review.

Title

Introduction

[Introduce the product/service that you’re reviewing and relevant background information about the company the product/service is from. Clearly state what the reader will gain from reading the post.]

Subheading

[Write a brief introduction using keywords. Use headings throughout the post to break up the key sections of your post.]

Body

[A few paragraphs covering the bulk of the review go here. If there are multiple features of the product/service, review each in its own section. Be detailed and answer as many questions as you think your audience may have about the product or service.]

  • How much did it cost?
  • How is the functionality?
  • How was the customer service?
  • Are you recommending the product/service?
  • Who would benefit from using the product/service?

Conclusion

[Wrap up your post with final thoughts and a CTA if you want readers to check out the product/service.]

Featured Resource: 6 Free Blog Post Templates

We’ve put together six essential blog post templates every marketer needs — from how-to posts to listicles.

image of hubspot's free blog post templates

Download these templates

3. Social Media

Writing copy for social media depends on the social platform. If you’re writing copy for Twitter, you have a strict character count, so the copy has to be brief but still appealing enough to get the attention of someone scrolling.

Similar to Twitter, Instagram is known for catchy captions. Character count isn’t as much of a concern on Instagram. However, since the social media powerhouse is visually oriented, you’ll want to write a caption that echoes the image or video in a post.

Overall, the primary goal when copywriting for social media is to thoroughly understand the different use cases of the social media platform for which you’re writing. Here’s an example of an outreach template you could use for another major social media platform, LinkedIn.

Hi [First Name], I just finished [reading/watching your post, reading/watching a post you shared, reading a comment you left on a post, etc.] I found it interesting that [include a few brief key points you found interesting, or anything that you feel showcases some common ground]. I also noticed that we share a few mutual connections like [list mutual connections].

Let’s connect and keep sharing great content with each other!

Featured Resource: Social Media Templates

social media template

Download these templates

4. Website Copy

Copywriting for websites is about staying true to the business’ overall brand, while making it easy for users to navigate the site. The copy that makes it to a site plays a huge role in setting the tone for a brand’s voice. When writing website copy, then, it’s critical you collaborate with key decision-makers for feedback to ensure your copy is on brand.

There are so many different components of a website, so start by clarifying what type of page you're writing for. This may include, but is not limited to, the following:

  • Home page
  • About Us page
  • Contact page
  • Product or Service category page(s)
  • FAQ page
  • Blog page

Let’s take a look at one of the most necessary pages to include on your site, the About Us page:

[Company name] was founded in [Year] by [Founder's name]. When [Founder's name] began building [Company name], [he/she/they was/were] determined to [help/build/create] a company that offers [include the solution the company provides for its customers' problem].

[Include as much or as little about the founders of your company. Sharing personable stories about how your company was founded is a great way to connect with readers and provide more insight into the people behind your brand.]

[Company name] helps people with [identified pain points of your buyer persona(s)]. To give our customers the best [product or service] we focus on [value proposition #1], [value proposition #2], and [value proposition #3].

[Company name] takes pride in working with people like you to provide quality [product/s or service/s] and exceptional customer service. We look forward to having you as a valued customer.

[Closing Signature]

Featured Resource: About Us Pages Guide + Lookbook

Get inspired by these awesome ‘About Us’ page examples and learn how to make yours great, too.

about-cover-1

Download these templates

5. Ebooks

Ebooks are one of the most common types of content copywriters can create. Since ebooks are meant to contain extensive information, it’s best to take the drafting process one section at a time.

Here’s an example of a general ebook template.

Cover/Title Page

[In addition to including the title of your ebook, you’ll also include your cover image. If this is a company resource also add your company’s logo. If it’s a resource coming directly from an individual contributor, include the author’s name.]

Table of Contents

[The table of contents should clearly include a list of all the chapters or sections in the ebook, with the corresponding page numbers.]

Introduction
[Introduce the ebook topic with relevant background information and clearly state what the reader will gain from reading the ebook.]

Chapter/Section Pages

[This is the best part of your ebook because it’s where the core of your information will be for your readers. Break the writing into digestible paragraphs for better readability, and include relevant images to help break up the copy and fill excessive white space.]

Conclusion Page

[This is the closing of your ebook. Your conclusion should emphasize what the reader has gained and offer any actionable steps they can use to put their new knowledge to good use.]

Optional pages to include:

About the Author page

[This page helps readers learn more about the author. The background information can vary depending on the author’s comfort level, but overall the tone should be personable. This is also an opportunity to speak to the author’s credibility on the ebook topic.]

Interactive pages

[Interactive pages can help keep your readers engaged. These pages may include quizzes, worksheets, checklists, etc. Including an interactive page in each chapter or section can help your readers feel they’re actively learning as they read.]

Resources page

[You’ve most likely referenced tons of sources to help you get to the final version of your ebook. Include the most important resources on this page for readers who may want to do further exploration on their own.]

Featured Resource: Ebook Templates

Let us take care of the design for you. We’ve created six free ebook design templates — available for PowerPoint, Google Slides, and InDesign — for a total of 18 templates.


Download these templates

6. Crisis Communications

If you’ve been tasked with writing for a crisis, you’ll need to be especially attentive since this type of content is usually addressing serious or sensitive matters.

Developing clear messaging for crisis communications requires a special level of detail. You’ll want to convey an empathetic tone that appropriately addresses the crisis. It’s a good idea to collaborate with team members to ensure the overall message is properly aligned with your company’s brand.

You may end up creating several pieces of content for a crisis including blog posts, social media posts, emails, an announcement from the CEO, a newsletter, etc. The following template is an overview of what to address:

An overview of the crisis

[Clearly identify the crisis and share detailed background information on what has occurred. If you’re addressing something that involves individuals, use discretion. Check with your company’s legal team to ensure all documents follow proper protocol.]

Plan of action and timeline

[Create a plan that includes a timeline of how events have developed and how your team will address the issue(s) at hand. Consider the types of questions media outlets could ask, and write prepared statements the company, leadership, and general team members can use to respond.]

Contact information

[Share the best contact information people can use to learn more about what’s happening and ask any additional questions. This could be your company’s PR team or agency, or an internal customer service or support team.]

Featured Resource: Crisis Management and Communication Kit

The templates in this crisis communication kit will help your crisis management team prepare for how to handle a crisis and respond to the media during a difficult time. Having clear lanes allows your team to operate effectively during times of crisis.


Download these templates

7. Customer Communications

Customer service is an essential part of any business. Writing to better understand and better communicate with your customers is necessary to foster stronger connections. One of the best ways to better understand your customers is by creating buyer personas. Buyer personas are semi-fictional representations of your ideal customers based on data and research.

Use this template outline to begin developing your buyer personas.

Background

[Create a background for your persona that best exemplifies the types of customers you have. This can include their job title, career path, and family life.]

Demographics

[Include age, gender, salary range, location, and anything else that best represents your customer persona.]

Identifiers

[Identifiers can include your persona’s general demeanor or communication preferences. This type of information is vital because it helps businesses build a more curated approach for their customers.]

Featured Resource: 17 Templates to Help You Put the Customer First

To help you foster better relationships with delighted customers, we put together this collection of templates — buyer persona templates, email templates, and survey templates — that put the customer first.


Download these templates

8. Case Studies

Potential customers often turn to case studies when they’re researching a product or service they’re interested in buying. Case studies provide evidence of how a product or service has helped customers by identifying a pain point and providing a solution. They’re a great resource for copywriters to show off their interview skills and showcase strong statistics.

The key components of a case study are listed in the following template:

Executive Summary

[Provide a mini headline to grab your reader’s attention. Then, underneath this headline, write 2–4 sentences (under 50 words) summarizing the whole story, making sure to include the most relevant points of the case study.]

About the Client

[Share a brief description of the company you’re featuring in the case study. This should include the name of the company, when the company was founded, what the company does, and any other relevant information you think would be helpful for readers.]

The Challenges

[Write 2–3 short paragraphs describing the pain points your client was experiencing before they bought from you, the challenges those pain points presented, and/or the goals they were trying to achieve.]

The Solution

[Write 2–3 short paragraphs describing how your company worked with your customer to find a solution to their challenges and implement a winning strategy. Use this space to describe how they are now using your product or service to solve their challenges from the previous section.]

Results

[Write a 2–3 paragraph conclusion to prove that your product/service impacted the customer’s business and helped them to achieve their goals, especially if they’ve been able to quantify or speak to the ROI of their investment.]

Call-to-Action

[Use your CTA to lead your prospect to a landing page or a contact form. This will give you more information on who’s reading your case study and who’s interested in your company.]

Featured Resource: Case Study Template

Need help getting your first case study off the ground? Look no further. We’ve put together a comprehensive guide, complete with templates, designed to make the process a whole lot easier.


Download this template

9. Call-to-Action

A call-to-action (CTA) is an image or line of text that’s included in different types of content to encourage leads and/or customers to take action. In short, you want someone to click your CTA to carry out a desired action.

Add CTAs to blogs, emails, ebooks, and anywhere else you want a lead to complete a certain action to push them to the next stage of the buyer’s journey.

Featured Resource: CTA Templates

These resources will empower you to create an impressive call-to-action strategy by helping you understand how CTAs work across different use cases, while also providing you with the means to create them for your own website.


Download these templates

10. Memos

A memorandum, or memo, is used to address internal communications within an organization. Think about the type of message you’re aiming to communicate. If you’re sharing minutes from a meeting, detailing new policies and procedures, or communicating anything that people may need to refer back to in the future, a memo is likely a good idea.

Memos tend to be longer and more formal than emails (although you may attach a memo to an announcement email) and may be formatted according to your company’s style guidelines.

Use this general memo template to get started.

Memo: [Memo Title]

Date: [Date of sending]

Memo To: [Individual(s), Department(s), or Organization(s) the memo is being sent to]

From: [Your Name, or the Name of the Department on whose behalf the memo is being sent]

Subject: [Enter a brief, 5-10 word subject line to describe the memo’s purpose]

Introduction

Provide an executive summary of this memo in one to two paragraphs, highlighting the change that is happening, when it is effective, and what the key takeaways are for the memo recipient.

Background

Explain the background for this organizational change in one to two paragraphs. Some questions to answer in this section might be:

  • Why was this idea pursued in the first place?
  • What data, research, or background information informed this decision?
  • What are the intended results of this organizational change?

Overview and Timeline

Describe the organizational change in clear, direct language. Specify the following:

  • What will be changing.
  • Who will be responsible for driving the change.
  • When the changes will go into effect.

Closing

Close things out with a final note on:

  • Why employees should feel excited and motivated about this change.
  • Where and when employees should submit questions, comments, and/or concerns.

Featured Resource: 4 Free Memo Templates

We’ve drafted up four free memo templates for general, organizational, financial, and problem-solving updates. We’ve also included a best practices checklist for you to review before sending your memo out.


Download these templates

Adding these templates to your marketing arsenal can help you save time during your drafting process. Copywriters are shifting gears from blogs to case studies to emails all the time.

If you’re responsible for writing amazing copy for different types of content on a regular basis, using templates is a great way to get your creative juices flowing.

Reblogged 1 year ago from blog.hubspot.com

SMX Overtime: Eternal testing, the key to Facebook Ads success

SMX speaker Amy Bishop shares insights on Facebook’s learning curve and testing strategies.

Please visit Search Engine Land for the full article.

Reblogged 1 year ago from feeds.searchengineland.com

What’s Changed (and What Hasn’t): The 2020 Moz Blog Reader Survey Results

Posted by morgan.mcmurray


You’re tired of hearing it and I’m tired of saying it, but 2020 really has been a year like no other. SEOs and marketers around the world had to deal with their day-to-day work moving home, alongside a host of natural disasters, civil rights issues, and a pandemic that will alter our industry and global economy for years to come. 

We could have held off on launching this year’s reader survey, but we decided to move forward anyway because we know your work and your interests have been impacted, and we wanted to know how much. 

I’m excited to share with you the results from that survey in this post. We’ll go through what’s changed — and what hasn’t — for our readership since our last survey in 2017, and detail what those insights mean for the Moz Blog in 2021. 

Methodology

We published this survey in July 2020, with questions asking for details on the professional occupations of our readers, how those readers interact with the blog, and what those readers like to see from the blog. We also included COVID-19-specific questions to gauge the pandemic’s impact on our readers. The survey was shared on the blog, through email blasts, and on our social media accounts.

The percentages shared in the sections below are part of a total of 388 responses we received over four months. This is actually our first data point, showing that engagement with surveys has shifted drastically since our 2017 survey, which got nearly 600 responses in just one month. Given the interruptive nature of 2020’s events, we won’t let that difference discourage us from utilizing surveys in the future. Where able, I’ve compared 2020’s results to those of the 2017 survey, to better visualize the differences. 

Answers were not required for all questions, so if something did not apply to a respondent, they could leave the answer blank or choose a variety of “no opinion” or “N/A” options. 

We don’t typically include demographic or geographic questions in our reader surveys, but given the overwhelmingly positive response to the Gender Gap in SEO and Diversity and Inclusion in SEO surveys published this year, we will do so moving forward. Understanding the struggles SEOs and marketers face in the industry due to race, gender, and sexual orientation is imperative to understanding how to best work with and for everyone, and we acknowledge that shortcoming in this year’s survey. 


Who our readers are

Let’s dive in. First up: the questions asking readers to tell us more about themselves. 

What is your job title?

The word cloud below is an amalgamation of the top-used words in response to this question, and the size of the word correlates to the number of mentions that word received. 

No surprises here: number one (by far) was “SEO”. Our readership remains heavily SEO-focused in their occupations, with content marketers coming in close second.

What percentage of your day-to-day work involves SEO?

That said, 2020 saw an increase in respondents in the lower percentage brackets of readers who use SEO strategies in their daily work, specifically the 1-10% and 41-50% ranges. This could be due, in part, to the broadening of tasks assigned to SEOs in the marketing industry, as several respondents also mentioned a need to wear multiple hats in their organization. 

On a scale of 1-5, how advanced would you say your SEO knowledge is?

The majority of our readers remain intermediately knowledgeable about SEO concepts, leaving plenty of room for new learnings across skill levels.

Do you work in-house, or at an agency/consultancy?

While the majority of Moz Blog readers are still in-house SEOs and marketers, an interesting takeaway for us in 2020 is the increase of those who are independent consultants or freelancers from 11% in 2017 to just under 17% in 2020. We’ll make sure to take that into account for our content strategy moving forward. 

What are some of the biggest challenges you face in your work today?

Far and away, the challenge most often mentioned in response to this question was the high volume and rapid cadence of new SEO information, new tools, and algorithm updates. Readers are struggling to determine what to focus on and when, what to prioritize, and what even applies to their work. We can certainly help you with that in 2021.

Other frequently-mentioned struggles were familiar to us from previous surveys, showing us that the SEO industry still needs to address these issues, and that the Moz Blog can continue offering up content in response. These issues included: 

  • Lack of resources and cross-functional collaboration at work.
  • SEO prioritization at work.
  • Lack of consolidation in analytics and reporting tools.
  • Difficulty explaining the value of SEO to bosses/clients/non-SEOs.
  • Difficulty explaining what SEO CAN’T do to bosses/clients/non-SEOs.
  • Attracting new clients and customers.
  • Having to wear multiple hats.

How our readers read

Keeping in mind all that context of who our readers are, we dug into preferences in terms of formats, frequency, and subject matter on the blog.

How often do you read posts on the Moz Blog?

As an increasing number of readers rely on social media channels for their news and content consumption, the shift from frequent readers to “every once in a while” readers is not a surprise, but it is a concern. It also necessitates our incorporation of social media engagement as a top KPI for blog performance. 

Given the multiple off-blog distribution methods and frequency of prompts to take this year’s survey, we saw a sharp increase in “non-reader” responses from 1% in 2017 to 6% in 2020. That said, it’s interesting that Moz email and social media subscribers who weren’t Moz Blog readers felt motivated to take a survey entitled “Moz Blog Reader Survey”. We’ve taken note of the topics requested from those respondents, in the hopes of encouraging more engagement with the blog. 

On which types of devices do you prefer to read blog posts?

While desktop and laptop computers remain the most common way to consume blog content, mobile phone use saw an increase of nearly 10 percentage points. Mobile phones have only improved in the last three years, and it’s no secret that we’re using them more often for actions we’d normally take on a computer. As we move toward blog CMS improvements in 2021, mobile-friendliness will be a priority. 

Which other site(s), if any, do you regularly visit for information or education on SEO?

Across the board, we saw a decrease in the number of respondents listing other SEO news resources, as well as the first instance of a social media platform in the top 10 resources mentioned. This only serves as further evidence that social media is continuing its growth as a news and content medium. 


What our readers think of the blog

Here’s where we get into more specific feedback about the Moz Blog, including whether it’s relevant, how easy it is for readers to consume, and more. 

What percentage of the posts on the Moz Blog would you say are relevant to you and your work?

While the trends regarding readers’ opinions on relevancy remained similar between 2017 and 2020, we saw about a 6% dip in respondents who said 81-90% of posts are relevant to them, and increases in the bottom four percentage brackets. These results, paired with the topic requests we’ll cover later, indicate a need to shift and slightly narrow our content strategy to include more posts specific to core SEO disciplines, like on-page SEO and analytics. 

Do you feel the Moz Blog posts are generally too basic, too advanced, or about right?

Given the breadth of topics on the blog and the wide range of reader skill levels, we’re happy to see that, for the most part, readers find our posts just about right on a scale of too easy to too advanced. 

In general, what do you think about the length of Moz Blog posts?

Similarly, it’s great to see that readers continue to be satisfied with the amount of content served up in each post. 

How often do you comment on blog posts? 

RIP, comment section. A trend we’ve seen over the last several years continues its downward slope: 82% of readers who took part in the survey never comment on posts.

When asked for the reasons why they never comment, we saw some frequent responses: 

  • “I have nothing to add.”
  • “It wouldn’t add value.”
  • “I’m still learning.” 
  • “I never comment anywhere.” 
  • “I don’t have enough time.” 
  • “Follow-up questions go unanswered.” 
  • “I read posts in the RSS feed.”
  • “English isn’t my first language.”
  • “I’m not signed in.” 

Blog comment sections and forums used to be the place for online conversations, so this drop in engagement certainly signals the end of an era. However, these concerns also give us some areas of improvement, like working with our authors to be more responsive and improving comment accessibility. But sorry to those who prefer not to sign in — without that gate, we’d be inundated with spam.

In contrast, here were the reasons for commenting: 

  • “I have a question.”
  • “I have a strong emotional connection to the material.”
  • “I strongly agree or disagree.” 
  • “I want to add my personal experience or advice.” 

We definitely encourage readers who do have questions or concerns to continue commenting! 

What, if anything, would you like to see different about the Moz Blog?

Outside the responses along the lines of “No changes! Keep up the good work!” for which we thank you, these were the top asks from readers: 

  • More thoughtful feedback from and interaction with authors.
  • More variety and diversity in our author pool.
  • More video content.
  • More specific case studies, tests, and experiments.
  • More step-by-step guides with actionable insights showing how to solve problems.
  • Ability to filter or categorize by skill level.
  • Diversity in location (outside the US). 

These are great suggestions, some of which we’ve already begun to address! 

We also received only a few responses along the lines of “keep your politics out of SEO”, specifically referencing our Black Lives Matter support and our posts on diversity. To those concerned, I will reiterate: human rights exist beyond politics. Our understanding of the experiences our co-workers and clients have had is essential to doing good, empathetic work with and for them. The Moz Blog will continue our practice of the Moz TAGFEE code in response to these ongoing issues. 


What our readers want to see

Which of the following topics would you like to learn more about?

Survey respondents could choose multiple topics from the list below in their answers, and the most-requested topics look very similar to 2017. A noticeable shift is in the desire for mobile SEO content, which dropped from being requested in 33% of responses in 2017 to just under 20% in 2020. 

In 2020, we certainly had more content addressing the broader marketing industry and local SEOs impacted by the pandemic. To better address the relevancy issue mentioned earlier, the top four core SEO subjects of on-page SEO, keyword research, link building, and analytics (all included in over 50% of responses) will become blog priorities in 2021.

Which of the following types of posts would you most like to see on the Moz Blog?

The way readers want to consume those topics hasn’t changed much at all in the last three years — the desire for actionable, tactical insights is as strong as ever, with the request for tools, tips, and techniques remaining at 80% of respondents. These types of posts have been and will remain our go-to moving forward. 


COVID-19 

Moving into our last and newest section for the survey, we asked readers questions regarding the way in which they consume SEO-related content during the COVID-19 era. 

Has your consumption of SEO-related content changed due to COVID-19?

Only 34% of respondents said that their consumption of SEO-related content had changed as a result of the pandemic, a number we expected to be higher. It’s encouraging to see that so many readers were able to maintain a sense of normalcy in this area.

Of those who did see a shift, these were the most common reasons why: 

  • Job loss and job hunting
  • Shift to work from home and being online 24-7
  • E-Commerce industry shifts
  • Online engagement shifts and ranking and traffic drops
  • Loss of clients and constricting budgets
  • More time to read paired with less time or opportunity to implement learnings

Would any of the following topics be helpful for you as a result of COVID-19 impacts? 

Along those same lines, the most popular topic requested as a result of COVID-19 impacts with 27% of responses was tracking/reporting on traffic and ranking drops. Content and marketing strategies during a crisis came in close second and third, with 24% and 21%, respectively. 



The answers to these questions show us that pivoting our content strategy in spring 2020 to address areas of concern was helpful for about a third of our readers, and probably contributed to the relevancy issue for the other two-thirds. We’ll continue to include these topics (on a smaller scale) until we see the other side of this crisis.


What happens next?

Primary takeaways

You asked, and we hear you. Moving into 2021, we’ll be writing on more technical, core SEO topics along with issues on the business side of SEO. We’ll also be building out our Whiteboard Friday series to provide more fresh video content. And as always, we’ll strive to provide you with actionable insights to apply to your daily work.

Given the steep decline in comment section engagement, we’ll be encouraging our authors to be more responsive to questions, and to interact with you on social media. Make sure to follow Moz on Twitter, Instagram, Facebook, and LinkedIn to stay up-to-date with the blog and our guest authors. 

Finally, stay tuned, as next year we’re planning UX improvements to our blog CMS to address usability and accessibility concerns.

My genuine thanks goes out to those readers who took the time to give us their feedback. It is immeasurably valuable to us, and we’re looking forward to applying it to all the amazing content we have coming your way in 2021.

Have a safe and healthy holiday season, Moz fans, and happy reading! 

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Reblogged 1 year ago from feedproxy.google.com

1-800 Contacts doubles down on the digital customer experience

How 1-800 Contacts responded to record stay-at-home demand.

Please visit Marketing Land for the full article.

Reblogged 1 year ago from feeds.marketingland.com

Sex workers fear targeting under Instagram's terms of service

Sex workers fear targeting under Instagram's terms of service

Normally when a social network updates its terms of service, barely anyone notices. But that’s not the case with Instagram’s newest Terms of Use, which have caused quite an uproar among sex workers online.

The issue that’s drawing notice is the platform’s Community Guidelines. They require adherence to Facebook’s Sexual Solicitation rules, which is not a new addition to this version of the terms — but its continued inclusion is a big concern for sex workers who reach their audience on the platform.

Facebook, which owns Instagram, states in its policy that users can’t post sexually explicit and implicit content, including suggestive emojis or references to “wetness” or an erection. This, as you can imagine, makes it difficult for people to promote sex work. Read more…

More about Instagram, Social Media, Sex Workers, Culture, and Social Media Companies

Reblogged 1 year ago from feeds.mashable.com

Google Maps Marketing Strategy: The Ultimate Cheat Sheet

When I’m looking for a business that will have what I need, whether it’s food or office supplies, I use Google Maps. I know that the search results will give me enough information to pick the best option, like reviews, ratings, photos, and location information.

I’m not alone in my preference for Google Maps, as it is six times more popular than other navigation apps. Given its popularity, it’s critical you optimize your business to appear in Google Maps search results.

In this piece we’ll cover the basics of Google Maps marketing, best practices for optimizing your Google My Business for search results, and marketing tactics you can employ to improve your local SEO.  

Google Maps marketing is beneficial for large and small businesses alike, especially because of the Google Maps Local 3-Pack: the first three businesses shown in search results that Google considers to be most relevant based on a user’s current location or the location where they’re searching.

The image below is an example of a Local 3-Pack for a restaurant query in Cambridge, MA. These three businesses are what Google considers to be most relevant based on the search query and the location where the search is being conducted.

You can think of it like this: if your Google Maps marketing strategy is focused on optimizing your Google My Business profile for search results, Google will take note of this and show your business in the Local 3-Pack for relevant queries relating to your company. This increases visibility and the likelihood of consumers visiting your business and making a purchase.

What is Google My Business?

Google My Business is a free tool for business owners to manage and optimize their business’ profiles on Google Maps.

A Google My Business profile typically includes your business’ name, location, and hours of operation. The image below is an example of a typical Google My Business profile from one of the top restaurants from the image above.

[Image: an optimized Google My Business profile for a restaurant]

When you claim your business using a Google My Business profile, you can optimize your Google Maps presence by ensuring that customers only have access to relevant, up-to-date information.

You can also use your profile to interact with customers when they leave reviews and track Google Maps Ads insights to understand how customers interact with your profile.
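The same core listing fields a Google My Business profile holds — name, address, hours, and phone number — can also be published on a business’s own site as schema.org LocalBusiness structured data, which Google reads for local results. Below is a minimal sketch of generating that JSON-LD in Python; the business details are entirely hypothetical:

```python
import json

# Assemble schema.org LocalBusiness structured data for a hypothetical
# example business, mirroring the core listing fields: name, phone,
# address, and hours of operation.
listing = {
    "@context": "https://schema.org",
    "@type": "Restaurant",  # a specific LocalBusiness subtype
    "name": "Example Pizza Co.",
    "telephone": "+1-617-555-0123",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Main St, Suite 200",
        "addressLocality": "Cambridge",
        "addressRegion": "MA",
        "postalCode": "02139-4301",  # nine-digit ZIP, per Google's advice above
    },
    "openingHours": ["Mo-Fr 11:00-22:00", "Sa-Su 12:00-23:00"],
}

# Embed the data as a JSON-LD script tag for the page's <head>.
snippet = '<script type="application/ld+json">%s</script>' % json.dumps(listing, indent=2)
print(snippet)
```

Keeping this markup in sync with your Google My Business profile reinforces the consistency Google looks for when matching a listing to a website.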

Google Maps Ads

Although marketing on Google Maps is free, local search ads can boost your business’ visibility in Google Maps search results.

Using these ads will bring your business listing to the top of search queries for keywords relevant to your business. Google Maps ads use location targeting, so they’ll only show up for location-relevant searches. These ads appear in desktop and mobile searches (shown below).  

[Image: a Google Maps local search ad appearing at the top of search results]

When you create ads for your business, you can track their performance to see how customers respond to your ads and whether they take further actions, like using your address to get Google Maps directions or clicking the listed phone number to call you.

How To Optimize Your Google My Business Account

To appear in Google Maps results, you need to optimize your Google My Business profile to include up-to-date information. Let’s go over the profile elements that marketers should focus on to perfect their Google Maps marketing strategy.

Verify Your Business

The first thing you’ll want to do is claim and verify your business. This means that you claim a business as your own within Google Maps, so you’re the only person with the ability to change information in your business profile.

This verification process is done through Google, and most businesses can do so through postal mail.

Mark Relevant Categories and Attributes

Categories describe your business and the services you provide. Customers often come across businesses in Google Maps by searching for categories, like “pizza restaurant” or “coffee shop.”

Google recommends that you choose a primary category that describes your overall business, like “Pet Groomer.” A further best practice is using additional categories that let customers know more about what your business offers. So, continuing with the previous example, if you’re a pet groomer that also sells pet supplies, you’d want to include an additional category for “Pet Supplies.”

Attributes are the amenities a business offers, like accessibility functions or outdoor seating. Google has recently added attributes to mark when businesses are black-owned, women-owned, and LGBTQ friendly. Google recommends that businesses select the most-relevant attributes to display in their profile.

Location and Service Areas

If your business has a store-front location, using the right address is crucial for optimizing your profile. Google recommends using a USPS or postal service approved address that includes all relevant information like room or suite number and a nine-digit zip code.

If your business services customers at their homes or businesses, like a cleaning service, you’d want to include an address for your primary location and add specific service areas where you do business.

Optimized Introduction

One of the most essential elements of your Google My Business profile is an optimized introduction that gives customers a summary of your business. The introduction should be brief and include relevant keywords that users may search for. A short description of your business appears in the search-result tile, as shown below.

google maps marketing optimized introduction example

Longer descriptions appear when a customer clicks on your specific business, as shown below.

optimized introduction in google my business profile

Reviews

Google recommends responding to online reviews of your business; doing so helps new people find you online and shows existing customers that you value their feedback. You might consider asking happy customers to leave reviews, or even incentivizing them with a coupon in exchange.

Reviews on your profile also build social proof, the psychological principle that people judge what is good by observing what others think is correct. Consumers value enthusiasm from other consumers: they spend 31% more when a business has positive reviews, and they trust user-generated content (UGC), like reviews, 9.8x more than paid advertisements.

Even if someone leaves a negative review, it’s still essential that you respond. See Google’s guidelines for how to respond to a negative review.

In search result overviews, reviews appear as star ratings in a business profile, as shown below.

google maps search results business ratings example

When a user clicks on the business they’re interested in, reviews appear toward the bottom of the business’s profile. Anyone can read these reviews, view ratings, and leave their own. This is demonstrated in the GIF below.

google maps business profile reviews example
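If you want to keep an eye on your ratings programmatically rather than checking the profile by hand, Google’s Places API can return a place’s `rating`, `user_ratings_total`, and recent `reviews` via its Place Details endpoint. Below is a minimal, hypothetical sketch: the field names follow the public Places API response format, but the `summarize_reviews` helper and the sample payload are illustrative, not part of any official library.

```python
def summarize_reviews(details: dict) -> dict:
    """Summarize rating data from a parsed Place Details JSON response.

    The 'result' object may contain 'rating', 'user_ratings_total',
    and up to five recent 'reviews'.
    """
    result = details.get("result", {})
    reviews = result.get("reviews", [])
    return {
        "rating": result.get("rating"),
        "total_ratings": result.get("user_ratings_total"),
        # Flag low-rated reviews (3 stars or fewer) so you can respond first
        "needs_response": [r["author_name"] for r in reviews
                           if r.get("rating", 5) <= 3],
    }

# Illustrative payload shaped like a Place Details response
sample = {
    "result": {
        "rating": 4.6,
        "user_ratings_total": 812,
        "reviews": [
            {"author_name": "Alex", "rating": 5, "text": "Great pastries"},
            {"author_name": "Sam", "rating": 2, "text": "Long wait"},
        ],
    }
}

print(summarize_reviews(sample))
```

In practice you would fetch the payload from the Place Details endpoint with your API key and your business’s place ID; the summary makes it easy to spot negative reviews that still need a reply.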

Photos

According to Google, businesses with photos receive 42% more requests for driving directions to their location and 35% more click-throughs to their website. To optimize your account for search, it’s vital that you provide pictures of your physical store and images of your products or services.

Customers can also upload photos, another form of UGC that benefits your profile alongside reviews.

Google Maps Marketing Examples

Let’s look at two examples of businesses in the Boston area that have successfully optimized their Google My Business profile for effective Google Maps marketing.

Flour Bakery

When I searched for “bakery” in my area, Flour Bakery was one of the first results. Flour Bakery has taken advantage of all Google My Business optimization options to build their Google Maps presence. They’ve included a brief summary of their services, accurate location, hours, and links. They also used the Google Attributes feature to call out relevant aspects of their business, like being women-led and LGBTQ friendly.

example of optimized google maps marketing profile for a restaurant

They’ve also included relevant photos and allow customers to leave reviews.

example google maps business profile reviews and photos

Flour Bakery also responds to reviews, showing existing and future customers alike that they care about consumer experience and are willing to rectify problems, should they arise (shown below).

business replying to google maps customer reviews example

Blue Nile Restaurant

Say you’re looking for a Black-owned business to support during your next meal. When I searched “black-owned,” Blue Nile Restaurant came up.

They’ve successfully optimized their Google My Business profile and selected an attribute that makes them easily identifiable as Black-owned. They’ve included an accurate address, hours, and phone number, and they allow customers to leave reviews and add their own photos (shown in the GIF below).

example of optimized google my business profile using attributes

Marketing Strategies to Improve Local SEO

Google Maps is by far the most popular navigation app, used by 67% of navigation app users; the runner-up, Waze, claims only 12%.

This means it’s a critical tool for ensuring local consumers can find your business.

Google Maps can help bolster your online presence in local search results, which demonstrates your legitimacy and relevancy as a company — you wouldn’t be able to rank #1 for “nail salons near me” if you’d closed down months ago. Additionally, ranking high in search will ideally convince consumers to choose your business over a competitor’s.

To improve local SEO, try these three tactics:

Get links back to your site from local businesses or bloggers.

To improve your local SEO, consider crafting a strategy to get backlinks from other local businesses. You could strike a deal in which you link to another business’s site in exchange for a link from them.

Alternatively, depending on your industry, try reaching out to local bloggers to ask for reviews — for instance, if you run a Bed & Breakfast, you might ask travel bloggers to stay for free in exchange for an online review.

Submit your site to local business directories.

Try submitting your site to local business directories like Yelp, Local.com, and Superpages, and ensure all information across each of the sites is accurate — ultimately, the more websites that reference your business, the better.

To submit your site to Yelp, for instance, go to https://biz.yelp.com/signup_business/new and fill in the form. Once the site is verified and approved by moderators, you’ll receive an email on how to claim your business page.
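Keeping your name, address, and phone number (NAP) consistent across directories is easy to get wrong, because small formatting differences (“St.” vs. “Street”, punctuation in phone numbers) slip through manual checks. A rough sketch of a normalizer you could use to compare your listings, with hypothetical example data:

```python
import re

def normalize_phone(phone: str) -> str:
    """Reduce a US phone number to its last 10 digits for comparison."""
    digits = re.sub(r"\D", "", phone)
    return digits[-10:]  # drops a leading country code if present

def normalize_address(addr: str) -> str:
    """Lowercase, expand a few common abbreviations, strip punctuation."""
    addr = addr.lower()
    for abbr, full in {"st.": "street", "ave.": "avenue", "rd.": "road"}.items():
        addr = addr.replace(abbr, full)
    return re.sub(r"[^\w\s]", "", addr).strip()

# Two directory listings for the same (fictional) business
yelp_listing = {"phone": "(617) 555-0142", "address": "12 Main St., Boston, MA 02118"}
own_website = {"phone": "+1 617-555-0142", "address": "12 Main Street, Boston, MA 02118"}

print(normalize_phone(yelp_listing["phone"]) == normalize_phone(own_website["phone"]))
print(normalize_address(yelp_listing["address"]) == normalize_address(own_website["address"]))
```

A real audit would want a fuller abbreviation table (USPS publishes a standard list), but even this level of normalization catches the most common mismatches between directory entries.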

Keep track of performance via Google’s Insights tool.

Once you’ve created a Google My Business profile and optimized it for Google Maps, you’ll want to track analytics using Google’s Insights tool.

To access Insights on a desktop, sign in to your Google My Business account, open the location you’d like to manage (if you have multiple locations), and click “Insights” from the menu. On mobile, open the Google My Business app, click “More,” and then tap “Insights.”

Insights shows you how many times people have seen your business information, how many times they’ve requested driving directions to your location, and how customers found your listing.

Optimize Your Google Maps Presence

Ultimately, an optimized Google My Business profile is another addition to your digital marketing strategy that gives your business more visibility, especially in Google Maps search results.

A business profile with an accurate address and phone number and a substantial number of customer reviews will likely attract new customers, helping you grow your reach and drive revenue.

Reblogged 1 year ago from blog.hubspot.com